Advances in Computer Technology



ChunkyC
02-06-2007, 12:56 AM
Holographic storage disks (http://www.itworldcanada.com/Pages/Docbase/ViewArticle.aspx?ID=idgml-7e5826bf-b0ca-4042-9063-6f3dc0173b94&Portal=2e5351f3-4ab9-4c24-a496-6b265ffaa88c&ParaStart=11&ParaEnd=22&direction=prev&Previous=Previous).

Being a computer geek, I'm always amazed at the advances being made in computing hardware. One comparison I love to repeat is the one I heard this past year: that before the end of this decade, a desktop computer will have more processing power than a 1980s-era, million-dollar Cray supercomputer.

MidnightMuse
02-06-2007, 10:06 PM
What blows my mind is that, when I was in high school, no one had PCs. I still remember when a tape drive was the coolest thing ever - and could back up your entire hard drive in under an hour!

I even remember when a gigabyte hard drive was a big WOW factor; now it's terabytes and beyond.

I'm not sure if I should feel old, or privileged to live in times that change so quickly.

Jamesaritchie
02-06-2007, 10:20 PM
Holographic storage disks (http://www.itworldcanada.com/Pages/Docbase/ViewArticle.aspx?ID=idgml-7e5826bf-b0ca-4042-9063-6f3dc0173b94&Portal=2e5351f3-4ab9-4c24-a496-6b265ffaa88c&ParaStart=11&ParaEnd=22&direction=prev&Previous=Previous).

Being a computer geek, I'm always amazed at the advances being made in computing hardware. One comparison I love to repeat is the one I heard this past year: that before the end of this decade, a desktop computer will have more processing power than a 1980s-era, million-dollar Cray supercomputer.

My worries about such things as holographic disks are many. How easily do they break? Fifty years seems like a long time, but it isn't. When do we get a really long-lasting digital storage system?

And how do they know it will last fifty years? I remember when it was said that CDs would last from 20-50 years. I remember similar claims made about DVDs. It's darned near impossible to say how long something will last until it's actually been tested.

When you place as much information on a disk as these holographic disks will hold, that is going to be a very valuable disk, and you need to know, not guess, how long it will last.

ChunkyC
02-06-2007, 10:31 PM
I still remember when a tape drive was the coolest thing ever - and could back up your entire hard drive in under an hour!
When I was in high school, the first electronic calculator hit the market. It was around $300 and if anyone got caught trying to bring one into class, they were in trouble. Not that anyone but the richest kids' parents could afford one.

I also remember punch cards for data entry, and these bad boys:

ChunkyC
02-06-2007, 10:37 PM
My worries about such things as holographic disks are many. How easily do they break? Fifty years seems like a long time, but it isn't. When do we get a really long-lasting digital storage system?

And how do they know it will last fifty years? I remember when it was said that CDs would last from 20-50 years. I remember similar claims made about DVDs. It's darned near impossible to say how long something will last until it's actually been tested.

When you place as much information on a disk as these holographic disks will hold, that is going to be a very valuable disk, and you need to know, not guess, how long it will last.
In many ways, we've regressed while we've progressed. Sure, we can cram astonishing amounts of data onto such disks, but it's tough to beat something like the Rosetta Stone for long-term data storage. Can't imagine a Palm Pilot utilizing granite as a storage medium, though. ;)

Judg
02-06-2007, 11:58 PM
One of the smartest things I ever did was to repeatedly transfer out of computer science classes in university before the semester even started. Everything I would have learned would be useless now. They were still teaching punch cards... Mind you, calculators were becoming affordable, and very rapidly at that, so I'm a tad younger than you.

We got our first PC fourteen years ago. My daughter was only three, and when we went to check on her one night, she rolled over in her sleep and said very distinctly, "Press escape." We had to hightail it out of there so our laughter didn't wake her up.

I bought one of my sons The Science of Star Trek a few years back, and one of the biggest sticking points preventing teleportation was computing power. That one may be overcome faster than we expected.

Jamesaritchie
02-07-2007, 12:48 AM
In many ways, we've regressed while we've progressed. Sure, we can cram astonishing amounts of data onto such disks, but it's tough to beat something like the Rosetta Stone for long-term data storage. Can't imagine a Palm Pilot utilizing granite as a storage medium, though. ;)


True enough. It's easy enough to back up text on a more permanent medium; really important text can always be backed up on acid-free paper. But digital photos, movies, etc., are much more of a problem.

I have no doubt that long-term storage will eventually be resolved, but I have stuff I'd like to store right now.

The two advancements I've been following are these: http://www-03.ibm.com/press/us/en/pressrelease/20744.wss
http://www.nytimes.com/2006/09/18/technology/18chip.html?ex=1316232000&en=28a5a164c7e8df41&ei=5088&partner=rssnyt&emc=rss

ChunkyC
02-07-2007, 01:38 AM
Cool links, James. :)

I'd heard about research into laser-emitting chips. Previous demonstrations of optical computing devices always required, physically speaking, a relatively huge laser source. This is really neat. Switches operating at the speed of light would mean stunningly fast computers.

Of course, the first release of Microsoft Windows designed to run on them will make them crawl. ;)

Pthom
02-07-2007, 06:40 AM
The nice thing about holograms is that if you do smash one, the information is still intact and readable. Only when the pieces become very small is there a noticeable amount of information loss.

Of course, a shattered disk wouldn't be readable by the device mentioned. You'd need a well-equipped optical laboratory.

In a similar vein, whatever happened to the idea of storing data in the atomic matrix of crystals?

MidnightMuse
02-07-2007, 08:19 PM
I was wondering that myself - it was just a few years ago (wasn't it?) that the idea of data storage and retrieval on crystals was going to be THE next big advance in computer technology. Now I can't find any new research or information on it.

Jamesaritchie
02-07-2007, 09:18 PM
The nice thing about holograms is that if you do smash one, the information is still intact and readable. Only when the pieces become very small is there a noticeable amount of information loss.

Of course, a shattered disk wouldn't be readable by the device mentioned. You'd need a well-equipped optical laboratory.

In a similar vein, whatever happened to the idea of storing data in the atomic matrix of crystals?

Good point. But there's still that fifty-year limitation.

Jamesaritchie
02-07-2007, 09:21 PM
Cool links, James. :)



Of course, the first release of Microsoft Windows designed to run on them will make them crawl. ;)

Or maybe, for the first time in history, a first release of Windows will actually be fast enough to use without taking a nap between mouse clicks.

Pthom
02-08-2007, 01:30 AM
Or maybe, for the first time in history, a first release of Windows will actually be fast enough to use without taking a nap between mouse clicks.
That will happen when we decide that computers (and by extension, operating systems) shouldn't be confused with entertainment systems.

I couldn't care less about a fancy interface with cute animated buttons and an almost impossible-to-read "skin" that looks like a 1960s set for a space opera.

I did hear (off the edge of my consciousness; I wasn't paying close attention) that Vista® has restored the command line for some functions. What was that Charlie was saying about the Rosetta Stone? Oh, thank the gods for reducing dependence on the mouse... and easing up on my carpal tunnel!

ChunkyC
02-08-2007, 03:20 AM
Yeah, but Vista also includes the loverly Aero graphics, which will bring anything less than a Pixar animation workstation to its knees. ;)

greglondon
02-08-2007, 04:51 AM
I'm an electrical engineer during my day job. A couple of points:

Electricity in a computer chip travels near the speed of light, so switching to optronics doesn't speed up compute power. Currently, the speed limitation is more a matter of interconnect delays than of the gates in between that create the logic. Imagine fast-acting pneumatic switches separated by ten feet of garden hose: a lot of the delay is in propagating through the hose.

And electricity propagates near the speed of light, so lasers won't speed things up much. In one nanosecond, light travels one foot. A nanosecond is the period of a one-gigahertz clock. The amount of interconnect wiring in a chip might total a kilometer or more. There's just no way to avoid the delays that the interconnect will inject.
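
To put rough numbers on that (a back-of-the-envelope Python sketch; the 3 cm wire length and the half-light-speed propagation figure are illustrative assumptions, not measured values):

```python
# Back-of-the-envelope interconnect arithmetic. Illustrative only:
# real on-chip signals are RC-limited and travel well below light speed.
C = 3.0e8                  # speed of light, m/s
NS = 1.0e-9                # one nanosecond, s

print(C * NS)              # 0.3 m: light covers roughly one foot per ns
print(1.0 / 1.0e9)         # 1e-09 s: the period of a 1 GHz clock

# Assume a signal crosses 3 cm of wiring per cycle at half light speed
# (generous for on-chip wire). Propagation alone then eats a fifth of
# the 1 ns cycle budget before any gate even switches.
delay = 0.03 / (0.5 * C)
print(delay / NS)          # 0.2 (ns)
```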

The big limitation of ASICs is that they're basically a two-dimensional layout. Gates that act like switches are on the first level, and then there are many levels of wiring that interconnect the gates. It sort of looks like a city planning grid, but instead of skyscrapers, you've got single-story buildings, and the roads are stacked several layers high. If they can ever figure out a way to build three-dimensional chips, then you can pack a lot more gates a lot closer together, and interconnect delays will drop.

On a separate topic, burnable compact discs may only be good for a few years, depending on how you use and abuse them. If you burn a CD and put it in a case and never touch it, then you'll probably get the longest life out of it. Personally, I have an external RAID drive (4 hard drives) with an Ethernet connection that I use for both day-to-day data storage and my long-term storage.

benbradley
02-08-2007, 07:26 AM
I'm an electrical engineer during my day job. A couple of points:

Me too, though I'm "semi-retired."


Electricity in a computer chip travels near the speed of light, so switching to optronics doesn't speed up compute power. Currently, the speed limitation is more a matter of interconnect delays than of the gates in between that create the logic. Imagine fast-acting pneumatic switches separated by ten feet of garden hose: a lot of the delay is in propagating through the hose.

And electricity propagates near the speed of light, so lasers won't speed things up much. In one nanosecond, light travels one foot. A nanosecond is the period of a one-gigahertz clock. The amount of interconnect wiring in a chip might total a kilometer or more. There's just no way to avoid the delays that the interconnect will inject.

Optical interconnect replaces switching a voltage along a very small wire (which has significant capacitance to ground and, being so thin, significant series resistance; both slow down the switching speed) with a tiny light source shining on tiny light detectors, which also need to switch on and off (and detect light turning on and off) in an incredibly small time frame to be as fast.

I heard about lasers on chips many years ago - one goal for using them in computer chips is to make them as small as transistors are currently made, and transistors are being made really darn small now.

Many things have come and gone. I recall such great "future technologies" as Josephson junctions using superconductors, thus needing to be supercooled to function, and bubble memory, which was actually manufactured and sold for a short while until continually improving hard drive technology surpassed it in price and density. But someday a fancy new technology probably will replace the current photoreduction methods (if you're using X-rays instead of light, is it still "photoreduction"?) of making lots of tiny transistors on silicon - perhaps it will be molecular nanotechnology.


The big limitation of ASICs is that they're basically a two-dimensional layout. Gates that act like switches are on the first level, and then there are many levels of wiring that interconnect the gates. It sort of looks like a city planning grid, but instead of skyscrapers, you've got single-story buildings, and the roads are stacked several layers high. If they can ever figure out a way to build three-dimensional chips, then you can pack a lot more gates a lot closer together, and interconnect delays will drop.

The problem will then be (actually, this has always BEEN a problem - CPUs have had their own dedicated FANS since, I forget, the earlier Pentiums or even 486s?) removing heat. I can see where higher-density "3D" chips will be liquid cooled (actual holes in the silicon that liquid flows through) with a radiator near the air intake of the computer box, much like a car engine's cooling system. The better you can remove heat, the faster you can run the chip without it getting too hot. This is of course what "overclockers" do to their CPUs: provide more cooling for the CPU than a standard system does so they can run the CPU faster than the manufacturer's guaranteed spec.


On a separate topic, burnable compact discs may only be good for a few years, depending on how you use and abuse them. If you burn a CD and put it in a case and never touch it, then you'll probably get the longest life out of it. Personally, I have an external RAID drive (4 hard drives) with an Ethernet connection that I use for both day-to-day data storage and my long-term storage.

This is another point I think most people don't appreciate: CD-Rs are a different technology from the mass-pressed CDs that commercial software and music come on, and CD-Rs are much less durable. High humidity for a few hours, or a few minutes of direct sunlight, are alleged to have damaged CD-Rs but not mass-produced CDs. DVDs are apparently yet again different technologies, and I haven't read much about them, but I wouldn't bet my life (either literally or figuratively) on data stored solely on them either.

For those who want to learn more (if you burn discs for any reason, you probably should), here's a CD-R FAQ:
http://cdrfaq.org/
It also has a pointer to this DVD FAQ:
http://www.dvddemystified.com/dvdfaq.html

For those wanting to preserve personal digital data long-term, I suggest making and keeping copies on several different types of media (hard disk drives, CD-Rs, DVDs, USB flash "thumb" drives), refreshing, updating, or making new copies at least once a year, and of course storing copies in different locations in case of fire, theft, or other disasters. When new media become popular, copy to them as well, and remember that some of the current media (we don't know which ones!) may go out of style and not be easily readable in a few years (remember 3-inch floppy drives?). You can't guarantee anything, but with enough work you can hedge your bets.
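
As a concrete illustration of that refresh-and-verify routine, here's a minimal Python sketch that records checksums when you archive and flags anything that has silently changed when you re-verify later. The paths and manifest format are hypothetical examples, not a recommendation of any particular tool:

```python
# Record SHA-256 checksums at archive time; re-verify periodically so
# silent bit rot on one medium is caught while other copies still exist.
import hashlib
import json
from pathlib import Path

def checksum(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MiB chunks
            h.update(chunk)
    return h.hexdigest()

def write_manifest(root: Path, manifest: Path) -> None:
    sums = {str(p.relative_to(root)): checksum(p)
            for p in sorted(root.rglob("*")) if p.is_file()}
    manifest.write_text(json.dumps(sums, indent=2))

def verify(root: Path, manifest: Path) -> list[str]:
    """Return the names of files whose contents no longer match."""
    sums = json.loads(manifest.read_text())
    return [name for name, digest in sums.items()
            if checksum(root / name) != digest]

# Usage (hypothetical paths):
#   write_manifest(Path("/archive/photos"), Path("photos.json"))
#   ...a year later, on each copy:
#   bad = verify(Path("/archive/photos"), Path("photos.json"))
```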

Shadow_Ferret
02-08-2007, 06:48 PM
I learned all about computers in high school. We had a week-long class about punch cards.

When personal computers first started coming out (anyone remember the build-it-yourself one from Heath?), I thought they were pretty cool. I used to get caught up in the 8088, the 286, 386, 486, etc., on and on. Then I realized that every time I spent $2,500 on the latest and greatest, it was outdated in 6 months by something newer and faster, and the one I currently had could be purchased for a third of the cost.

So now I don't care any more. I buy whatever's cheapest (around $500), live with that for several years until all the Windows updates and service packs bog it down to unusability, and then I get a new one.

latoya
02-08-2007, 07:59 PM
I'm waiting for the day that you can speak commands to the computer! I'm tired of sore knuckles!

ChunkyC
02-08-2007, 08:08 PM
Regarding CPU fans: they started showing up in 486s. My first PC was a 386SX20, and the only fan was the one in the power supply.

We have a server at my day job with a pair of 1 GHz Pentium IIIs, and there are SIX fans in the case: 1 for each CPU, another for the hard drive enclosure, 2 for the case itself to draw air out of the box, and the aforementioned power supply fan.

We just purchased a desktop system, and the CPU fan has this huge cone-shaped thing on top of it that reaches right to the holes in the side panel of the case. It looks like one of those cones you put on your pet's head to keep them from licking a surgery incision or whatever. I gather the idea is to funnel the heat from the CPU straight out of the case, not letting it circulate around inside.

So yeah, heat is a huge problem in today's computers. I do believe these laser-light chips would address a portion of that issue, but all I can go on is what I've read in techy newsletters etc., so correct me if I'm mistaken.

ChunkyC
02-08-2007, 08:12 PM
I'm waiting for the day that you can speak commands to the computer! I'm tired of sore knuckles!
It's here. I have speech recognition installed on my work machine and my laptop. I can tell Microsoft Outlook to open an email or delete one, and I can dictate an email or dictate into a word processor. It's pretty darn cool.

You need to have a speech recognition engine installed, and software that can accept input through it. The engine comes with certain versions of MS Office, but I installed the Microsoft engine manually since I don't use Office.

Check it out here (http://www.microsoft.com/windowsxp/using/setup/expert/moskowitz_02september23.mspx).

Download the Microsoft engine here (http://www.microsoft.com/downloads/details.aspx?familyid=5E86EC97-40A7-453F-B0EE-6583171B4530&displaylang=en).

More info here (http://support.microsoft.com/default.aspx?scid=kb;en-us;306537) on installing and configuring it.

I'm still trying to figure out how to get the OS itself to accept commands, but so far no go. The technology is in its early stages yet, but it's a step towards being able to interact verbally with your computer à la Star Trek.

Shadow_Ferret
02-08-2007, 08:17 PM
I'm waiting for the day that you can speak commands to the computer! I'm tired of sore knuckles!
I wouldn't bother, since my brain isn't geared toward fiction and speech. I found that out trying to transcribe thoughts into a tape recorder. I have this great idea in my head, but somehow the mechanics of producing speech get in the way of my creativity, whereas things flow much more smoothly when I type.

shawkins
02-08-2007, 08:29 PM
<Mounts hobbyhorse>

To me, the scariest things about the march of technology are 1) facial recognition, 2) voice transcription, and 3) natural language processing.

At the moment there are no real privacy concerns. Although we're videotaping lots of public spaces, in general there isn't anyone watching the camera. Nonetheless, the data is out there (more and more so every day) just waiting for someone to figure out a way to start snooping with it.

There are two practical problems with large scale electro-snooping:

1. It's computationally intensive. (That's a comp.sci euphemism for slow.)

2. Even given unlimited access to computing power, the algorithms for addressing the three problems are still in their infancy.

In regards to the first hurdle, the EE types pointed out that machines are getting faster all the time. A problem that is prohibitively difficult today is not likely to be so in 10 years.

For instance, in 1976 the U.S. adopted an algorithm called DES (Data Encryption Standard) for secure communications. It was handed out to civilian industries that had a legitimate need for secure communications: mostly banks, if memory serves. About 10 years later, DES was useless. The problem was that in the intervening decade, computers had gotten so much faster that it was now possible to just do an exhaustive search of all the possible keys for a given message.
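
A quick bit of arithmetic shows why a 56-bit key stopped being enough; the trial-decryption rates below are made-up round numbers for illustration, not historical benchmarks:

```python
# DES has a 56-bit key, so the key space is 2**56 keys; on average an
# exhaustive search finds the right key after trying half of them.
keyspace = 2 ** 56
print(keyspace)                     # 72057594037927936 (~7.2e16)

rate = 1_000_000                    # hypothetical: 1M trials/s on one machine
seconds = (keyspace / 2) / rate
print(seconds / (3600 * 24 * 365))  # ~1142 years: looks safe in 1976 terms

# But the search parallelizes perfectly. Make the hardware 1000x faster
# and use 1000 machines, and the same average search takes ~10 hours.
fast = (keyspace / 2) / (rate * 1000 * 1000)
print(fast / 3600)                  # ~10 hours
```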

Anyway, given Moore's law (http://www.webopedia.com/TERM/M/Moores_Law.html), hardware will sooner or later be fast enough and cheap enough to devote lots of processing power to every bit of video, voice, and text data everywhere.

Which brings me to the second hurdle.

Regardless of how fast and cheap your hardware is, there is currently no good way to, say, have a machine monitor conversations. Natural language processing (machine 'understanding' of human speech) is a tough nut to crack.

However, there are a lot of really bright comp. sci. graduates tilting at this particular windmill. I'm reasonably confident that in the next 50 years or so we'll have something approaching true understanding of natural language by machines.

So, here's the kicker:
1. The data is out there waiting to be sifted.
2. It's more or less inevitable that the computing power to do large-scale sifting of the data will be available shortly.
3. It's less inevitable that effective sifting techniques will be developed, but I personally am convinced that they too are on the way.

So why should you care?

First, let's assume for the sake of argument that you're one of those from the "if you're not doing anything wrong you don't have anything to worry about" camp. Even so, it's scary.

Let's say there's an election coming up, and polls show that it's going to be a tight one. Let's further say that political strategists have identified two or three districts that are likely to be pivotal. If the party in power has access to an adequately sophisticated surveillance system, is it not possible that they might sift through the data looking for felonies committed by opposition voters? (Felony conviction == disenfranchisement.)

If that's too far-fetched for your taste, what's wrong with the following: In 2020, Bob is running against the incumbent candidate for senate, and he's winning. The incumbent candidate, by virtue of being on some senate committee or other, has access to our surveillance system. He asks his buddy in the FBI to sift through all the ATM, grocery store, and whatever video footage from Bob's college days and see if they can't find pics of Bob smoking something illegal, kissing a member of the same sex, or whatever. Is that really so unlikely?

(At this point the reader is encouraged to construct their own paranoid fantasies.)

shawkins
02-08-2007, 08:35 PM
Further supporting evidence for my paranoid fantasy can be found in Charlie's post on speech recognition. It is, in fact, here. When I was in school, it was considered a difficult-to-unsolvable problem...on the hardware of the day.

ChunkyC
02-08-2007, 08:36 PM
find pics of Bob smoking something illegal, kissing a member of the same sex, or whatever
Hopefully society will evolve along with the technology to the point where John Q. Public won't give a flying fart about Bob kissing Billy ten years earlier.

But that's probably less likely than your scenario. ;)

Shadow_Ferret
02-08-2007, 09:23 PM
Simply having the technology to do those things doesn't make it legal. And it won't make it legal in the future unless we, as voters, simply roll over and give in.

Your scenario doesn't wash with me because I don't believe we'd allow it.

finch
02-08-2007, 09:55 PM
I know this isn't TIO and I don't want to stray there, but I have two relevant beliefs: one, I don't think the bulk of modern voters are active enough in politics to either notice or care when those decisions are made for them, and two, I don't think the law keeps up adequately with new technology to govern its use effectively.

shawkins is pointing at a 1984 scenario without the need for something as obvious as a telescreen. While I'm no Hari Seldon, the potential for us to end up there, based on consistently rapid tech advances and a legal system reluctant to move into the mid-20th century, is at least measurable. Whether we can recover if it happens, and whether we have the intestinal fortitude to confront this stuff when it happens, is probably more a TIO topic, but tech outpacing law is absolutely a concern.

shawkins
02-08-2007, 09:59 PM
Well, IANAL, but I'm pretty sure that anything done in a public place is fair game for visual and audio surveillance. But legal or not, it's definitely being done.

In 2001, the faces of all Superbowl attendees were scanned (http://www.wired.com/news/politics/0,1283,41571,00.html) and matched against a database of wanted criminals.

Cell phone records aren't quite so wide open, but they're routinely used. If you own a new cell phone, there's a GPS chip in it. Unless they dumbed the GPS technology down for commercial use, that means that to whatever degree you carry a cell phone around, your whereabouts can be determined to within about 2 meters. The change happened a couple of years ago, was widely reported, and sparked basically zero public outcry.

Most cars now do black-box recordings of your driving behavior. Literally the only possible use for such records is in a criminal prosecution of the driver. This was widely reported, and there was basically zero public outcry.

My firm opinion is that we, as voters, are dumb enough to approve absolutely anything at all.

shawkins
02-08-2007, 10:01 PM
I know this isn't TIO and I don't want to stray there

Oh, sorry--should I not be doing this on this board? I'm seriously not trying to start an argument.

finch
02-08-2007, 10:15 PM
Oh, sorry--should I not be doing this on this board? I'm seriously not trying to start an argument.

I didn't mean to imply you were -- more along the lines that I will if I don't remind myself to play nice ;) The social ramifications of tech advances are absolutely of interest to me, and in theory this should be a great place to discuss it, I just wanted to make it absolutely clear I'm not trying to stir the pot.

Might not be a bad idea to split the thread at this point, actually, as we've strayed a bit from the OP.

Shadow_Ferret
02-08-2007, 10:15 PM
My firm opinion is that we, as voters, are dumb enough to approve absolutely anything at all.
I guess I have more faith in people than you do. I'm of the Star Trek optimistic future; you seem to be of the bleak, pessimistic future.

latoya
02-08-2007, 10:46 PM
It's here. I have speech recognition installed on my work machine and my laptop. I can tell Microsoft Outlook to open an email or delete one, and I can dictate an email or dictate into a word processor. It's pretty darn cool.

You need to have a speech recognition engine installed, and software that can accept input through it. The engine comes with certain versions of MS Office, but I installed the Microsoft engine manually since I don't use Office.

Check it out here (http://www.microsoft.com/windowsxp/using/setup/expert/moskowitz_02september23.mspx).

Download the Microsoft engine here (http://www.microsoft.com/downloads/details.aspx?familyid=5E86EC97-40A7-453F-B0EE-6583171B4530&displaylang=en).

More info here (http://support.microsoft.com/default.aspx?scid=kb;en-us;306537) on installing and configuring it.

I'm still trying to figure out how to get the OS itself to accept commands, but so far no go. The technology is in its early stages yet, but it's a step towards being able to interact verbally with your computer à la Star Trek.
How cool! I have to check that out.

benbradley
02-08-2007, 11:01 PM
<Mounts hobbyhorse>

To me, the scariest things about the march of technology are 1) facial recognition, 2) voice transcription, and 3) natural language processing.

At the moment there are no real privacy concerns. Although we're videotaping lots of public spaces, in general there isn't anyone watching the camera.
I'm not sure of the current technology, but my guess is the "tape" being used is hard disk. Actual videotape is pretty much as obsolete as audio cassettes. The video being digitized and "online" makes it hugely easier for a computer to go through, as opposed to thousands of videotapes sitting on shelves.

Nonetheless, the data is out there (more and more so every day) just waiting for someone to figure out a way to start snooping with it.

There are two practical problems with large scale electro-snooping:

1. It's computationally intensive. (That's a comp.sci euphemism for slow.)

2. Even given unlimited access to computing power, the algorithms for addressing the three problems are still in their infancy.

There's one "saving grace" that makes large-scale snooping possible, or at least saves the data for future years when getting interesting things out of huge amounts of data becomes more practical. I recall thinking shortly after 9/11 that hard disk drives then cost a few (two or three?) dollars per gigabyte. That's hugely cheap. Current prices (http://www.pricewatch.com/hard_drives/) are under 25 cents per gigabyte. Hard drives are not necessarily good long-term storage devices (though perhaps they are better in RAID arrays, like the people who would do this would use), but they're certainly cheap enough. Info can be stored as it's scooped up, for analysis years later when not only the technology is better, but when there's a "good reason" (like another 9/11-type attack) to go through it.
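
To put a number on how cheap that is (the 25-cents-per-gigabyte price is from the link above; the 64 kbit/s telephone-audio rate is an assumed round figure for illustration):

```python
# Illustrative cost of archiving phone audio at commodity disk prices.
price_per_gb = 0.25                      # dollars per gigabyte of disk
bytes_per_sec = 8_000                    # 64 kbit/s uncompressed phone audio

gb_per_hour = bytes_per_sec * 3600 / 1e9
print(gb_per_hour)                       # ~0.029 GB per hour of calls
print(gb_per_hour * price_per_gb * 100)  # ~0.72 cents to store that hour
```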

I recall lots of talk about Echelon in the late '90s, an alleged secret international snoop organization that records virtually all Internet and email traffic, faxes, and many telephone (land-line and cell) calls. Who knows how much, if any, of it is real, but the point is that it's POSSIBLE, so someone is very likely doing some of it.


In regards to the first hurdle, the EE types pointed out that machines are getting faster all the time. A problem that is prohibitively difficult today is not likely to be so in 10 years.

Circa 1986 I opined on a computer bulletin board (remember those? I had a 1200 baud modem!) that a computer was eventually going to beat the best human player at chess. They (including another programmer) chided me, saying you can't make a computer play a better game than you can play. But I knew Moore's Law, and I saw that computer chess ratings were going up roughly linearly, and that winning chess programming could be done by fast enough computing with brute-force look-ahead and appropriate "alpha-beta" pruning, as opposed to deep knowledge of the game. My prediction came true in (IIRC) 1997.
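
For the curious, brute-force look-ahead with "alpha-beta" pruning looks roughly like this toy negamax sketch; the Game interface (push/pop/legal_moves/evaluate/is_over) is a hypothetical placeholder, nothing like a real chess engine:

```python
# Toy negamax with alpha-beta pruning: search to a fixed depth, score
# leaves with a static evaluation, and skip branches the opponent would
# never allow. Real 1997-era engines were vastly more refined than this.
def alphabeta(game, depth, alpha=float("-inf"), beta=float("inf")):
    """Best achievable score for the side to move."""
    if depth == 0 or game.is_over():
        return game.evaluate()        # static evaluation at the horizon
    best = float("-inf")
    for move in game.legal_moves():
        game.push(move)
        score = -alphabeta(game, depth - 1, -beta, -alpha)
        game.pop()
        best = max(best, score)
        alpha = max(alpha, score)
        if alpha >= beta:             # opponent already has a better option:
            break                     # prune the remaining siblings
    return best
```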


For instance, in 1976 the U.S. adopted an algorithm called DES (Data Encryption Standard) for secure communications. It was handed out to civilian industries that had a legitimate need for secure communications--mostly banks, if memory serves. About 10 years later the DES was useless. The problem was that in the intervening decade, computers had gotten so much faster it was now possible to just do an exhaustive search of all the possible keys for a given message.

Was it JUST that computers got faster? I vaguely remember reading about this; I thought it had more to do with the discovery of better (by many orders of magnitude) algorithms than with Moore's Law. If it was solely the increase in speed, then they should have known better - that could have been easily predicted. The discovery of new algorithms is much harder to predict.


Anyway, given Moore's law (http://www.webopedia.com/TERM/M/Moores_Law.html), hardware will sooner or later be fast enough and cheap enough to devote lots of processing power to every bit of video, voice, and text data everywhere.

Which brings me to the second hurdle.

Regardless of how fast and cheap your hardware is, there is currently no good way to, say, have a machine monitor conversations. Natural language processing (machine 'understanding' of human speech) is a tough nut to crack.

However, there are a lot of really bright comp. sci. graduates tilting at this particular windmill. I'm reasonably confident that in the next 50 years or so we'll have something approaching true understanding of natural language by machines.

You think it will take that long? You're being conservative. :) Have you read Vernor Vinge's "Singularity" paper? His and others' timeline for "machine understanding" is more like within 30 years (starting 1993!), and they believe the machines will be understanding a lot more than just "natural language," they will be intellectual equivalents of humans.
http://www-rohan.sdsu.edu/faculty/vinge/misc/singularity.html
But I admit that's getting beyond the scope of this thread.


So, here's the kicker:
1. The data is out there waiting to be sifted.
2. It's more or less inevitable that the computing power to do large-scale sifting of the data will be available shortly.
3. It's less inevitable that effective sifting techniques will be developed, but I personally am convinced that they too are on the way.

So why should you care?

First, let's assume for the sake of argument that you're one of those from the "if you're not doing anything wrong you don't have anything to worry about" camp. Even so, it's scary.

Let's say there's an election coming up, and polls show that it's going to be a tight one. Let's further say that political strategists have identified two or three districts that are likely to be pivotal. If the party in power has access to an adequately sophisticated surveillance system, is it not possible that they might sift through the data looking for felonies committed by opposition voters? (Felony conviction == disenfranchisement.)

If that's too far-fetched for your taste, what's wrong with the following: In 2020, Bob is running against the incumbent candidate for senate, and he's winning. The incumbent candidate, by virtue of being on some senate committee or other, has access to our surveillance system. He asks his buddy in the FBI to sift through all the ATM, grocery store, and whatever video footage from Bob's college days and see if they can't find pics of Bob smoking something illegal, kissing a member of the same sex, or whatever. Is that really so unlikely?

(At this point the reader is encouraged to construct their own paranoid fantasies.)

I need no such encouragement...heck, I need to hurry up and write a bunch of Tom Clancy type novels (I would have said Crichton, but not after what he wrote in his latest novel about that reviewer) and get them published before they become obsolete!

ChunkyC
02-08-2007, 11:23 PM
Oh, sorry--should I not be doing this on this board? I'm seriously not trying to start an argument.
I'm sure Pthom won't mind us having a civil discussion of the social ramifications of advancements in computer technology. Nothing exists in a vacuum, after all.

Except perhaps for whatever might be in the one between Michael Crichton's ears. (see Ben's post above ;) )

Jamesaritchie
02-09-2007, 01:05 AM
I'm waiting for the day that you can speak commands to the computer! I'm tired of sore knuckles!

I've been doing that for a couple of years. I can't control everything this way, but I can control all MS software, including Word and Outlook, plus a few others.

Jamesaritchie
02-09-2007, 01:07 AM
I guess I have more faith in people than you do. I'm of the Star Trek optimistic future; you seem to be of the bleak, pessimistic future.

I don't think I'm unduly pessimistic, but if you look at where technology is going, wearable computers, body and brain implants, etc., it seems to me we're on a path to becoming the Borg, not the Federation.

Jamesaritchie
02-09-2007, 01:08 AM
Hopefully society will evolve along with the technology to the point where John Q. Public won't give a flying fart about Bob kissing Billy ten years earlier.

But that's probably less likely than your scenario. ;)

I'd place the odds on this at about ten million to one.

Jamesaritchie
02-09-2007, 01:12 AM
I wouldn't bother, since my brain isn't geared toward fiction and speech. I found that out trying to transcribe thoughts into a tape recorder. I have this great idea in my head, but somehow the mechanics of producing speech get in the way of my creativity, whereas things flow much more smoothly when I type.

I have the same problem. I do use voice commands to control a great many programs on my computer, but I can't write fiction this way, even though the latest Dragon is darned near perfect at voice recognition.

But I can write e-mails this way, and it is nice to be able to control my computer from across the room, or just when I'm pacing and thinking, using only my voice.

ChunkyC
02-09-2007, 03:58 AM
it is nice to be able to control my computer from across the room, or just when I'm pacing and thinking, using only my voice.
That's where I think it's going to get pretty cool. I envision a home with microphones placed around the house and a server that controls all sorts of stuff. For example: though universal remotes today reduce the number of remotes lying around, the remote you do have is so blasted complicated. I would much rather be able to just say, "Jeeves ... television on ... channel 243." Think of the scene in Back to the Future II where Marty McFly's kid tells the TV to display all his favourite channels on the wall screen.

That day is getting close, methinks.

greglondon
02-09-2007, 06:43 AM
Well, since this thread has moved on to discuss vaporware, I really have nothing to add, other than I'm not worried about it.

Jamesaritchie
02-09-2007, 05:52 PM
That's where I think it's going to get pretty cool. I envision a home with microphones placed around the house and a server that controls all sorts of stuff. For example: though universal remotes today reduce the number of remotes lying around, the remote you do have is so blasted complicated. I would much rather be able to just say, "Jeeves ... television on ... channel 243." Think of the scene in Back to the Future II where Marty McFly's kid tells the TV to display all his favourite channels on the wall screen.

That day is getting close, methinks.

Closer than you think. I've seen model homes that are already this way. It's easy to do this with a TV. The trick is making it cheap enough that people don't mind paying for it.

shawkins
02-09-2007, 05:55 PM
I'm not sure of the current technology, but my guess is the "tape" being used is hard disk. Actual videotape is pretty much as obsolete as audio cassettes. The video being digitized and "online" makes it hugely easier for a computer to go through, as opposed to thousands of videotapes sitting on shelves.


True that. My '80s are showing, sorry.



I recall lots of talk about Echelon in the late '90s, an alleged secret international snoop organization that records virtually all Internet and email traffic, faxes, and many telephone (land-line and cell) calls. Who knows how much, if any, of it is real, but the point is that it's POSSIBLE, so someone is very likely doing some of it.


Remember in 1983 (!) when the Russians shot down that Korean airliner? That night the news played a recording of the Soviet pilot talking back and forth with his commander before the shooting. I doubt that CBS got that recording from Pravda.


Was it JUST that computers got faster? [that made DES obsolete] I vaguely remember reading about this; I thought it had more to do with the discovery of better (by many orders of magnitude) algorithms than with Moore's Law. If it was solely the increase in speed, then they should have known better - that could have been easily predicted. The discovery of new algorithms is much harder to predict.

In the case of DES, I think it was just Moore's law in action. DES had a 56-bit key. When it became clear that was no longer cutting it, they moved to "Triple DES," which was basically 3 iterations of encryption via DES. Triple DES is still in use in OpenSSL, though I don't think it's trendy anymore.
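
For anyone curious what "3 iterations" means in practice, Triple DES is the encrypt-decrypt-encrypt (EDE) construction composed from single DES. Here's a sketch assuming the third-party pycryptodome package (pip install pycryptodome), with made-up example keys:

```python
# EDE: encrypt with k1, decrypt with k2, encrypt with k3. Setting all
# three keys equal collapses it to single DES, which is why EDE (rather
# than encrypt-encrypt-encrypt) was chosen: backward compatibility.
from Crypto.Cipher import DES

k1, k2, k3 = b"key1abcd", b"key2abcd", b"key3abcd"  # 8 bytes each
block = b"8bytedat"                                 # one 64-bit block

def ede_encrypt(b: bytes) -> bytes:
    step1 = DES.new(k1, DES.MODE_ECB).encrypt(b)
    step2 = DES.new(k2, DES.MODE_ECB).decrypt(step1)  # note: decrypt
    return DES.new(k3, DES.MODE_ECB).encrypt(step2)

def ede_decrypt(b: bytes) -> bytes:
    step1 = DES.new(k3, DES.MODE_ECB).decrypt(b)
    step2 = DES.new(k2, DES.MODE_ECB).encrypt(step1)
    return DES.new(k1, DES.MODE_ECB).decrypt(step2)

assert ede_decrypt(ede_encrypt(block)) == block
```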



I need no such encouragement...heck, I need to hurry up and write a bunch of Tom Clancy type novels (I would have said Crichton, but not after what he wrote in his latest novel about that reviewer) and get them published before they become obsolete!

My man. Race you to the heavily fortified cabin in Idaho!

Mustangpilot
02-09-2007, 08:46 PM
The problem will then be (actually, this has always BEEN a problem - CPUs have had their own dedicated FANS since, I forget, the earlier Pentiums or even 486s?) removing heat. I can see where higher-density "3D" chips will be liquid cooled (actual holes in the silicon that liquid flows through) with a radiator near the air intake of the computer box, much like a car engine's cooling system. The better you can remove heat, the faster you can run the chip without it getting too hot.


My first experience with a "real" computer was in 1958, when I worked in the aircraft industry in California. It was on the second floor of the engineering building, enclosed in a 40-foot-square room with glass walls and served by two refrigeration units. Inside the glass room were racks of vacuum tubes and spinning tape drives. White-coated high priests moved among the racks, changing out individual racks because of some failure. Outside in the main room were rows of punch card machines and women, vestal virgins (at least as far as I was concerned), transcribing info to the cards. There was an input keyboard; I can't remember what it looked like now, but I think it was sort of like a typewriter keyboard, not the electronic ones of today, located on an outside wall where you could send in commands. We used to play Tic Tac Toe on it. The 'puter would always win unless you pushed a key that supposedly shorted out part of its logic. Then you could win now and then.

What impressed me the most at the time was the refrigeration required to keep the thing from melting down, and how the slightest power surge could mess up days of effort.

greglondon
02-09-2007, 09:08 PM
Triple DES is still in use in OpenSSL, though I don't think it's trendy anymore.

For the best info on all things encrypted, see Bruce Schneier.

ChunkyC
02-09-2007, 10:56 PM
So -- if not optical chips, how about chemical?

Microfluidic Bubbles (http://www.informationweek.com/news/showArticle.jhtml?articleID=197004560)

MidnightMuse
02-09-2007, 11:00 PM
Chemical computer chips?

ChunkyC
02-09-2007, 11:38 PM
Weird, eh? I had to read the article twice to make sure I was reading it right. ;)

MidnightMuse
02-09-2007, 11:59 PM
Fascinating!

Pthom
02-10-2007, 12:18 AM
Nope, there is nothing in this thread that causes me concern. Just make sure we're discussing actual science, not conjecture. There are two places for conjecture on AW: SF/F for the non-confrontational kind, and TIO for... well, you know. Keep going.
Re: chemical computers. Remember fluidics? ;) Seriously, I seem to recall reading about someone investigating bionic computation, i.e., bacteria and nano-machines. :Shrug: The problem I can see is how to manage the required liquids. Dick Tracy and his wrist radio/TV/computer connected to a hydration pack on his back?
Seriously, I believe the days of owning a stand-alone computer (the kind that does all the things you want done) are numbered. The first computers (not counting the Jacquard loom) were analog devices, so huge and cumbersome that there were but a few. These were replaced by digital machines, also huge, expensive, and not available to the average individual. Then came the personal computer. We all joke that our desktop machines are more powerful than those which calculated orbits for the Apollo missions, and they are.

But the greatest power in computing is achieved by distributing tasks over many machines. In other words, networking. How many of you have home networks? How long ago was it that you didn't? I don't want a stand-alone computer in my microwave, or to burden the machine where I do word processing with timing my cooking, but I do want all those things done.

So I can go watch the lions in the holo-suite down the hall. :D

ChunkyC
02-10-2007, 03:03 AM
I want everything in my home networked. That way if I forget and I'm halfway to work, I can pull out my PDA, log into my home network and flush my toilet. :D

Dave.C.Robinson
04-08-2007, 03:44 AM
As to voice-activated computing, Vista comes with it built in.