
View Full Version : Data Storage in the Future ::ATTN GEEKS::



indiriverflow
02-27-2009, 11:54 AM
My WIP takes place about forty years in the future, to allow for the development of the computer technology at the heart of my premise.

Storage devices are rapidly evolving. This may not always be the case, but the trend ought to continue.

What will storage technology be like in 40 years? Will RAM still be a term in use? What will hard drives be called, and how might they be designed?

How about transfer storage devices? We've had the floppy, the hard floppy, CD, flash, optical and others I'm forgetting. What will this technology look like and what might it be called?

Finally, the computer in question is based on quantum processors, and fortunately, this is already in the embryonic stage. The basic unit of quantum storage is the qu-byte. According to my research (thanks Wiki!), the largest prefix for expressing orders of bytes is the yottabyte (http://en.wikipedia.org/wiki/Yottabyte), 10^24 bytes. How would I say this? Yottaqubyte...ugh!

Any idea what would be larger than this if the pattern is followed?

I have more questions, but this will do for now. Thanks for all your answers.

dpaterso
02-27-2009, 12:09 PM
It's going to be wireless, and it's going to be part of you, like maybe a fingernail, or an ear stud, or an injected micro-capsule. You'll walk into a room or a building and be auto-connected, able to upload/download/browse at will. No one will even think in terms of storage or RAM any more because unlimited capacity will be available to everyone, everywhere. Even now, you can get free mailboxes that hold 7Gb or more. That's just going to go up and up until numbers mean nothing except to the uber-techies who maintain the infinite server ocean (ISO).

-Derek

indiriverflow
02-27-2009, 12:25 PM
It's going to be wireless, and it's going to be part of you, like maybe a fingernail, or an ear stud, or an injected micro-capsule. You'll walk into a room or a building and be auto-connected, able to upload/download/browse at will. No one will even think in terms of storage or RAM any more because unlimited capacity will be available to everyone, everywhere. Even now, you can get free mailboxes that hold 7Gb or more. That's just going to go up and up until numbers mean nothing except to the uber-techies who maintain the infinite server ocean (ISO).

-Derek
Okay, but the trend has been for the complexity and operating bytes of software to rise, if anything, more quickly than the hardware. The more capacity, the more you'll need.

I'm not talking about personal computing so much as servers. As we all know, there is no such thing as unlimited storage, processor use, or bandwidth on a server. These won't be serving the net, but what I will somewhat inaccurately call AI's for the purpose of brevity.

The functions of storage, processing, and temporary memory seem basic to computing to me. I can't really imagine how it could be built without it.

I mean, that's been constant for the past forty years of computing. What kind of paradigm shift would override these functions?

Fullback
02-27-2009, 12:36 PM
A three-dimensional memory with Z axis depth more than current cd/dvd pit depths. Perhaps some form of enclosed gaseous or liquid medium with extraordinary capacity.

We're just scratching the surface to store and retrieve data, literally and figuratively. :D

indiriverflow
02-27-2009, 12:38 PM
Fullback, tell me more!

dpaterso
02-27-2009, 12:41 PM
Okay, but the trend has been for the complexity and operating bytes of software to rise, if anything, more quickly than the hardware. The more capacity, the more you'll need.

I'm not talking about personal computing so much as servers. As we all know, there is no such thing as unlimited storage, processor use, or bandwidth on a server. These won't be serving the net, but what I will somewhat inaccurately call AI's for the purpose of brevity.

The functions of storage, processing, and temporary memory seem basic to computing to me. I can't really imagine how it could be built without it.

I mean, that's been constant for the past forty years of computing. What kind of paradigm shift would override these functions?
I disagree. Going back 40 years, 1 kay of RAM wouldn't fit in your purse. Right now I'm sitting with a 1 Terabyte external drive alongside my laptop. That's your paradigm shift right there.

We're riding the crest of an exponential tech development wave that just isn't going to stop. 40 years from now there will be, to all intents and purposes, unlimited storage, unlimited bandwidth and unlimited networking. Bet you 5 bucks, with compound interest.

-Derek

indiriverflow
02-27-2009, 12:44 PM
I disagree. Going back 40 years, 1 kay of RAM wouldn't fit in your purse. Right now I'm sitting with a 1 Terabyte external drive alongside my laptop. That's your paradigm shift right there.

We're riding the crest of an exponential tech development wave that just isn't going to stop. 40 years from now there will be, to all intents and purposes, unlimited storage, unlimited bandwidth and unlimited networking. Bet you 5 bucks, with compound interest.

-Derek
I might take that bet. Five dollars, even with compound interest, will be toilet paper. I bet the dollar doesn't even last that long.
Um...what's the rate of interest?

BTW, I'm setting it that far because that's how long I think it will take for this technology to be invented under ideal conditions. I could slide the timeline, but I'd have to be convinced by some uber-geeks.

I think it's safe to speculate four decades ahead on this.

Fullback
02-27-2009, 03:22 PM
Fullback, tell me more!

I'm just brainstorming with you now. :D

What if we moved ahead by moving backward to a form of contextual analog data system, with far more data embedded than recompiling zeros and ones? The medium is a man-made compound with a structure where each molecule retains eight data points in eight atoms to form a more cohesive and connected information point.

As data is learned (accessed in unison with associated data), the molecules drift closer in 3-D space for faster recovery, and also so they can be accessed if they are related to the original data search. It's like lateral thinking. We can use the close molecules or discard them if not necessary. The data is "thinking" by associating close data. It's like our data is saying "Oh, by the way, you should think about B and C if you're thinking about A." The data itself is smart.

The smart data is embedded and recovered by 3 triangulated constant light sources that use diffused light instead of lasers and address each molecule by another embedded "locator atom."
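
If it helps to see the "smart data" idea in software terms, here's a toy Python sketch of the drift (purely illustrative, every name in it is my own invention): records that get accessed together build up an affinity score, so asking for one of them also hands back its closest neighbours.

from collections import defaultdict

class AssociativeStore:
    """Toy model of 'smart data': items accessed together drift closer."""
    def __init__(self):
        self.data = {}                      # key -> value
        self.affinity = defaultdict(float)  # (key_a, key_b) -> closeness score

    def put(self, key, value):
        self.data[key] = value

    def get(self, keys, suggest=2):
        keys = list(keys)
        # Strengthen the bond between every pair retrieved in the same query.
        for i, a in enumerate(keys):
            for b in keys[i + 1:]:
                self.affinity[tuple(sorted((a, b)))] += 1.0
        # Then hand back whatever has drifted closest to the query set.
        related = defaultdict(float)
        for (a, b), score in self.affinity.items():
            if a in keys and b not in keys:
                related[b] += score
            elif b in keys and a not in keys:
                related[a] += score
        neighbours = sorted(related, key=related.get, reverse=True)[:suggest]
        return {k: self.data[k] for k in keys + neighbours if k in self.data}

store = AssociativeStore()
for k, v in [("A", 1), ("B", 2), ("C", 3)]:
    store.put(k, v)
store.get(["A", "B"])     # A and B drift together
print(store.get(["A"]))   # asking for A now also suggests B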

My head hurts now, but I can see it in my tiny, little brain. :D

indiriverflow
02-27-2009, 03:25 PM
I'm just brainstorming with you now. :D

What if we moved ahead by moving backward to a form of contextual analog data system, with far more data embedded than recompiling zeros and ones? The medium is a man-made compound with a structure where each molecule retains eight data points in eight atoms to form a more cohesive and connected information point.

As data is learned (accessed in unison with associated data), the molecules drift closer in 3-D space for faster recovery, and also so they can be accessed if they are related to the original data search. It's like lateral thinking. We can use the close molecules or discard them if not necessary. The data is "thinking" by associating close data. It's like our data is saying "Oh, by the way, you should think about B and C if you're thinking about A." The data itself is smart.

The smart data is embedded and recovered by 3 triangulated constant light sources that use diffused light instead of lasers and address each molecule by another embedded "locator atom."

My head hurts now, but I can see it in my tiny, little brain. :D

Hot! I was just talking about this with my friend and hoping you'd pipe back in.
This is exactly what I have in mind. What if the code is actually quinary, to represent DNA (four bases and a stop)?
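
Just to pin down the quinary idea for myself, here's a minimal Python sketch: the four bases plus a stop symbol map onto the digits 0-4 and pack into one base-5 number (the symbols and names are just my placeholders, nothing standard).

# Quinary (base-5) encoding: four DNA bases plus a stop marker.
DIGITS = {"A": 0, "C": 1, "G": 2, "T": 3, "*": 4}   # "*" = stop, placeholder choice
SYMBOLS = {v: k for k, v in DIGITS.items()}

def encode(sequence):
    """Pack a base/stop sequence into a single base-5 integer."""
    value = 0
    for symbol in sequence:
        value = value * 5 + DIGITS[symbol]
    return value

def decode(value, length):
    """Unpack a base-5 integer back into its sequence of symbols."""
    out = []
    for _ in range(length):
        value, digit = divmod(value, 5)
        out.append(SYMBOLS[digit])
    return "".join(reversed(out))

seq = "GATTACA*"
packed = encode(seq)
print(packed, decode(packed, len(seq)))   # round-trips back to GATTACA*

Each quinary symbol costs about log2(5), roughly 2.32 bits, so a whole genome stored this way stays around a gigabyte.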

Fullback
02-27-2009, 03:32 PM
Why not? I think that's a great idea.

I feel like I understand what you're after, and we haven't even included any "plus alpha" into this yet. We're still talking about things within the realm of our knowledge and imagination. We haven't included some magic bullet discovery of fundamental physics that we have no clue of right now.

I think you have the seed of an idea now.

backslashbaby
02-27-2009, 03:47 PM
Fun stuff!

I agree with Derek - no worries on capacity, definitely. You are correct about software complexity, but also remember that algorithms and software design are changing, too.

I say no implants, etc., though, due to the FDA, seriously (40 years). Incredible portability? Yes. Visor-type things for screens and mind-controlled inputs are the rage in research, but I question the everyday wearability of the visors, seriously. Will Blackberry-hand turn into visor addiction? Eh, maybe?

There will be entirely new names for transfer devices... very fast and small and wireless (new technology there, for sure). There will still be transfer devices, because people like buying their own brand of phone/media player/pda/portable PC that needs to coordinate with the one at work, their friends', etc.

There will be no hard drives or RAM as we know them. Sorry, but they've been working on new hardware theories for computers since computers were invented (quantum, biological, etc.). New operating systems will handle new hardware, so while there will be analogous ideas for temporary processing, etc., throw away your notions of how things are separated/grouped/named now.

I'd love to get you some tidbits of what is actually being invented, if you like :) But are you going for what has been begun in actuality, or just what kinds of ideas there are?

backslashbaby
02-27-2009, 04:11 PM
Brontobyte, geopbyte

indiriverflow
02-27-2009, 04:24 PM
Fun stuff!

I agree with Derek - no worries on capacity, definitely. You are correct about software complexity, but also remember that algorithms and software design are changing, too.

I say no implants, etc., though, due to the FDA, seriously (40 years). Incredible portability? Yes. Visor-type things for screens and mind-controlled inputs are the rage in research, but I question the everyday wearability of the visors, seriously. Will Blackberry-hand turn into visor addiction? Eh, maybe?

There will be entirely new names for transfer devices... very fast and small and wireless (new technology there, for sure). There will still be transfer devices, because people like buying their own brand of phone/media player/pda/portable PC that needs to coordinate with the one at work, their friends', etc.

There will be no hard drives or RAM as we know them. Sorry, but they've been working on new hardware theories for computers since computers were invented (quantum, biological, etc.). New operating systems will handle new hardware, so while there will be analogous ideas for temporary processing, etc., throw away your notions of how things are separated/grouped/named now.

I'd love to get you some tidbits of what is actually being invented, if you like :) But are you going for what has been begun in actuality, or just what kinds of ideas there are?

You may think of it as a virtual environment for AI's. That's only partially true, but it will cover it for now.

I am interested in making plausible projections about the evolution of this technology, so long as it serves the needs of my story.

How much data in a brontobyte or geopbyte? Any etymology you'd care to supply there?

backslashbaby
02-27-2009, 04:36 PM
So not so much consumer focus then, I see. What you and Fullback were discussing sounds like a great start at the underlying architecture. I'll readily admit, other than what I've read in research, my personal knowledge starts more at the machine language stage and up.

AI for a general technology, or more specialized? Don't worry, I'm way geekier than I sound (well, I'm posh-degreed in AI/computing, in any case ;) )

Edit: I'm only more curious on the use of the AI because different technologies are better suited to AI than others....

indiriverflow
02-27-2009, 04:42 PM
So not so much consumer focus then, I see. What you and Fullback were discussing sounds like a great start at the underlying architecture. I'll readily admit, other than what I've read in research, my personal knowledge starts more at the machine language stage and up.

AI for a general technology, or more specialized? Don't worry, I'm way geekier than I sound (well, I'm posh-degreed in AI/computing, in any case ;) )

Edit: I'm only more curious on the use of the AI because different technologies are better suited to AI than others....

Let's just say for now that the primary purpose of these AI's is to live and work as we do. Think of the server as a city.

And to avoid confusion, the premise is that these entities are every bit as self-aware as we are. Also, most are not true AI's, but we can deal with them as if they were.

Your geek credentials are accepted. Anyone who uses a computer and thinks is very welcome to contribute.

I don't believe that leaves anyone out. :)

backslashbaby
02-27-2009, 04:46 PM
Yeah, very cool! So the links I'm thinking of on new research might be relevant. I'll gather them up...

indiriverflow
02-27-2009, 06:02 PM
Something I need to settle, right this minute, is whether data transfer will truly be wireless, or if we might still see a cable for downloads of huge files.

I sort of agree with the wireless comments above, except that these are huge files being downloaded, so I wondered if anyone thought that might still require a cable. Perhaps for security reasons?

backslashbaby
02-27-2009, 06:22 PM
It would always be more secure via a cable, IMHO, but encryption can make any data transfer pretty darned strong.

Here is an interesting, albeit older, press release for wireless technology: http://www.gatech.edu/newsroom/release.html?id=1431

Reading these abstracts should be interesting, too:
http://www.mtt-tpms.org/cgi-bin/symposia_v4/technicalprogram.cgi?Symposium_Name=IMS2008

indiriverflow
02-27-2009, 06:33 PM
I think I'm going with the cable, because it adds a layer of security for the file transfer, which is extremely sensitive.

Seems like there is no way to hack a box that is never online, just hooked physically to the source...except physically, which is what is going on now. Better for the narrative if this is as tough as conceivably possible.

Next issue: redundancy.

This creates complications, because I'm not sure about backups. Things get messy if there are duplicates of my cyber-people. I sort of want to dispense with these by means of a legal caveat: that these files may not legally be duplicated. This makes more sense than it might seem.

This seems the best way to deal with the issue...make the files unique by proprietary copyright law, and then have someone break the law if I for some reason want a dupe.

So how can the integrity of the files be guaranteed without a backup?

Maybe I'll just swallow it and allow inactive archives only on the backup.

Sorry for thinking aloud in the thread, I'm in process right now and very absorbed in the story.

backslashbaby
02-27-2009, 06:42 PM
Would it suit your purposes to have the files split apart and encrypted in separate parts/locations, or is that still too much of a duplicate out there, I wonder...
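
One very simple way to picture the splitting, if it helps: XOR-style secret splitting, where any single piece looks like random noise and the original only comes back when every piece is brought together. A minimal Python sketch, not any particular real-world product:

import os
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_secret(data: bytes, pieces: int = 3):
    """Split data so that ALL pieces are needed to reconstruct it."""
    shares = [os.urandom(len(data)) for _ in range(pieces - 1)]
    final = reduce(xor_bytes, shares, data)   # data XOR all the random shares
    return shares + [final]

def recombine(shares):
    """XOR every piece back together to recover the original data."""
    return reduce(xor_bytes, shares)

pieces = split_secret(b"cyber-person #42", pieces=3)
print(recombine(pieces))   # b'cyber-person #42'

The nice narrative property is that losing any one piece loses the whole file, so there is never a usable duplicate sitting anywhere.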

indiriverflow
02-27-2009, 07:01 PM
Would it suit your purposes to have the files split apart and encrypted in separate parts/locations, or is that still too much of a duplicate out there, I wonder...

Maybe...that might get interesting. Then I can have a subplot where all the different pieces have to be put together, say three in physically separated DC's. Like a treasure hunt to put Humpty Dumpty together again.

Hmmm...you give me possibilities to gnaw on.

Lovin' it.

50 Foot Ant
02-27-2009, 08:24 PM
Well, let me drag out my old sci-fi cyberpunk novel, and maybe this will help you...

Packet: Undisclosed amount of data, a singular "packet" sent over the high-speed data stream on room-temperature superconductor hard lines.
Wave: Wireless systems. Can handle mid-function web immersion. A Wave is an undisclosed amount of data.

5 P/1 W

Standard transmission rate: 1P/ms (1 packet per millisecond) 1W/ms (1 Wave per millisecond) with transmission lengths varying.

Attack Program, Wireless: 2.5 Waves in size, extremely small, meant to be transmitted in approximately 2.5 milliseconds, used for a fast, light attack to test/crack the intrusion countermeasure coding (ICC). The same program is 0.5 Packets in size, meaning that someone fighting via Waves versus someone fighting via Packets is going to be at a disadvantage.

Portable Memory:

Packet-Stick: 1-5 Mp (1-5 million Packets). About the size of a small chemlight.
Wave-Chip: 1-3 KW chip used in portable computers, about the size of a nickel, but often carried in a 4-8 pack "wave-card".


OK, DON'T try to use modern terms. You WILL lose. I reread a science fiction piece where the author was talking about a totally new "CPU" that was up to 10x the speed of the common 256MHz chip. This was 100 years in the future.

Any time you try to make your book seem more advanced, real technology will catch up, kick you in the face, and blow right by you, leaving your book dated.

Make up new terms, but WRITE THEM DOWN so that they are solid through the whole book/series. Introduce new technology as logical progressions. Some of these may be a big deal, others will be grabbed without a second thought.

Trying to figure out data-storage in the future is a losing game.

Just look at the advertisement for the hard drive I have pinned to the wall...

"15 Mb!!! All the storage you will EVER need in the smallest package ever!"

I have a smaller, faster, higher-capacity thumb drive that's bullet-proof and heavily encrypted (IronKey).

indiriverflow
02-27-2009, 09:26 PM
Well, let me drag out my old sci-fi cyberpunk novel, and maybe this will help you...

Packet: Undisclosed amount of data, a singular "packet" sent over the high-speed data stream on room-temperature superconductor hard lines.
Wave: Wireless systems. Can handle mid-function web immersion. A Wave is an undisclosed amount of data.

5 P/1 W

Standard transmission rate: 1P/ms (1 packet per millisecond) 1W/ms (1 Wave per millisecond) with transmission lengths varying.

Attack Program, Wireless: 2.5 Waves in size, extremely small, meant to be transmitted in approximately 2.5 milliseconds, used for a fast, light attack to test/crack the intrusion countermeasure coding (ICC). The same program is 0.5 Packets in size, meaning that someone fighting via Waves versus someone fighting via Packets is going to be at a disadvantage.

Portable Memory:

Packet-Stick: 1-5 Mp (1-5 million Packets). About the size of a small chemlight.
Wave-Chip: 1-3 KW chip used in portable computers, about the size of a nickel, but often carried in a 4-8 pack "wave-card".


OK, DON'T try to use modern terms. You WILL lose. I reread a science fiction piece where the author was talking about a totally new "CPU" that was up to 10x the speed of the common 256MHz chip. This was 100 years in the future.

Any time you try to make your book seem more advanced, real technology will catch up, kick you in the face, and blow right by you, leaving your book dated.

Make up new terms, but WRITE THEM DOWN so that they are solid through the whole book/series. Introduce new technology as logical progressions. Some of these may be a big deal, others will be grabbed without a second thought.

Trying to figure out data-storage in the future is a losing game.

Just look at the advertisement for the hard drive I have pinned to the wall...

"15 Mb!!! All the storage you will EVER need in the smallest package ever!"

I have a smaller, faster, higher-capacity thumb drive that's bullet-proof and heavily encrypted (IronKey).

Good info and good points.
You're right about not being able to peg the future...that would be hopeless. Personally, I'm not sure my premise will become a reality at all, although scientists are certainly laying the groundwork for it.

So what I'm shooting for is plausible projection rather than predictive accuracy. Trying to be a storyteller, not a prophet.

Already decided CPU is on the way out, am using "core" or "processor" instead.

You think I should invent my own data-metric terms instead of trying to project from today's terms? I dig what you are saying, but it seems like there ought to be a logic rooted in the way we speak today. Besides, I'd like the readers to recognize the basic function of hardware and software in their own terms.

In some ways, that might be a higher production value than hard science that only engineers can appreciate anyway.

Plausibility...that delicate balance of sounding sincere while spewing bullshit in the reader's face.

50 Foot Ant
02-27-2009, 09:35 PM
You think I should invent my own data-metric terms instead of trying to project from today's terms? I dig what you are saying, but it seems like there ought to be a logic rooted in the way we speak today. Besides, I'd like the readers to recognize the basic function of hardware and software in their own terms.
Crap, I forgot my cheat. LOL

OK, yeah, I used wave and packet, but they pretty much corresponded to the (then) state of the art tech at the time, only taken back two steps.

Right now 1TB drives are available, SATA for high speed data transmission.

You have it be a 1Gp (gigapacket) or 1Mp (Mega-Packet) drive. Use the common knowledge, but just replace it with your new terms. It's fairly easy for the reader to understand.

If you have the video card able to render "11 multi-facet objects per milli-second" you can explain why it is SOTA or junk. Say the main character snorts and realizes that the "chat rooms" he hangs out in have at least 20 mfO's in them, and that everyone would flicker and it would annoy him. Or have him think to himself that even the heavy-duty databases only have 7 mfO's in them, and that the card is definitely SOTA, maybe even out of Pakistan or Israel.

Most people can recognise the lingo, and compare it. They'll realize that a packet is larger than a byte (on my cheat-sheet, I decided a packet was about 100Mb in size), as long as you explain it.
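
To make the cheat concrete, here's the kind of throwaway Python conversion table I'd keep next to the cheat-sheet (the 100Mb-per-packet figure is the one above; the rest is just arithmetic, so treat the exact numbers as placeholders):

# Invented units pegged to real ones so the story math stays consistent.
MEGABITS_PER_PACKET = 100            # cheat-sheet figure: 1 Packet ~ 100Mb
PACKETS_PER_MS = 1                   # "standard transmission rate" above

def transfer_time_ms(size_in_packets, rate=PACKETS_PER_MS):
    """How long a hard-line transfer takes, in the story's own units."""
    return size_in_packets / rate

def packets_from_gigabytes(gigabytes):
    """Translate a real-world size into Packets for the cheat-sheet."""
    megabits = gigabytes * 8 * 1024  # GB -> megabits (binary-ish, close enough)
    return megabits / MEGABITS_PER_PACKET

drive = packets_from_gigabytes(1024)  # a 1TB drive, expressed in Packets
print(round(drive), "Packets, about", round(transfer_time_ms(drive) / 1000), "seconds to copy")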

:-) Hope this helps.

backslashbaby
02-28-2009, 03:24 AM
:) Good stuff there!

The IBM prefixes are 1024 Yottabytes = 1 Brontobyte and 1024 Brontobytes = 1 Geopbyte. To the best of my knowledge, no standardized prefix goes above Yotta/Yobi yet :)
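
The pattern is just another factor of 1024 per step, so a quick Python table covers it (with brontobyte and geopbyte flagged as unofficial, per the above):

# Each step up is a factor of 1024; bronto/geop are unofficial extensions.
PREFIXES = ["kilobyte", "megabyte", "gigabyte", "terabyte", "petabyte", "exabyte",
            "zettabyte", "yottabyte", "brontobyte (unofficial)", "geopbyte (unofficial)"]

for power, name in enumerate(PREFIXES, start=1):
    print(f"1 {name} = 2^{power * 10} bytes = {2 ** (power * 10):.3e} bytes")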

WriteKnight
02-28-2009, 03:38 AM
I describe a 'memory gel' technology in one of my older screenplays, that roughly translates to the 3D memory storage described above. And I wrote that ten years ago.

I'm going to disagree on the 'unlimited' use of 'unlimited' in this conjecture. One thing will NOT change, and that's the penchant for making money. You can't make money off of something that is UNLIMITED. So you should give some thought to the social aspects as to HOW to restrict this amazing flow and storage of information, and WHO would restrict it as well.

The internet is becoming less and less 'free' - even as it is becoming more and more 'unlimited'.

indiriverflow
02-28-2009, 06:48 AM
I describe a 'memory gel' technology in one of my older screenplays, that roughly translates to the 3D memory storage described above. And I wrote that ten years ago.

I'm going to disagree on the 'unlimited' use of 'unlimited' in this conjecture. One thing will NOT change, and that's the penchant for making money. You can't make money off of something that is UNLIMITED. So you should give some thought to the social aspects as to HOW to restrict this amazing flow and storage of information, and WHO would restrict it as well.

The internet is becoming less and less 'free' - even as it is becoming more and more 'unlimited'.

Your point is relevant to the non-technical aspects of the plot...and one of the reasons I want to get the tech straight. Part of what I want to show by this is why it is all so expensive.

backslashbaby
03-04-2009, 09:30 PM
OK, let's get quantum here :) My dad was a nuclear physicist in the Army for many years, and we've been chewing this subject over (thank God we are off political talks :) )

If you'd rather take it to email, lmk :)

This article http://www.cbc.ca/technology/story/2009/02/09/f-quantum-computing.html is a good overview of the general subject.

Usually the math used to describe the quantum states falls naturally from how the states are detected. Qubits hold a great deal more information than current binary systems, and that's the big deal with quantum computing. The research today has identified what sorts of logic gates (think circuits) can be used with all of these states. Experimental chips have been created that demonstrate that quantum states can be observed and used to represent data (instead of on/off for each gate, there are more possibilities).
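
If a picture helps: n classical bits sit in exactly one of 2^n states, while an n-qubit register is described by 2^n complex amplitudes at once. A tiny, generic state-vector sketch in Python/NumPy (nothing hardware-specific, just the bookkeeping):

import numpy as np

# State vector of an n-qubit register: 2**n complex amplitudes.
n = 3
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                          # start in |000>

# Put every qubit into superposition with a Hadamard gate.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
op = H
for _ in range(n - 1):
    op = np.kron(op, H)                 # tensor product builds the full-register gate
state = op @ state

print(state)                            # 8 equal amplitudes: all 3-bit values at once
print("amplitudes tracked:", 2 ** n, "vs", n, "classical bits")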

Problems? Temperature can be a problem. How cold is likely for your servers? Many of the ways to observe quantum states are incredibly cold!! Some are just really cold, so that will matter. I found a couple that can be done without low temperatures, but that limits your choices of HOW they are observing the quantum states, and possibly how badly that mode of attack can handle errors.

The problem of errors: huge in something so sensitive to environment. Many approaches to reduce errors in quantum computers are algorithmic, currently. There is a fascinating new science called topological quantum computing that significantly reduces errors physically, but it involves really theoretical issues more than current quantum computing architecture. Do you like the idea of the newest kid on the block? The vocabulary rocks, that's for sure :)

I guess the first question before going into any particular research is How cold can it be? And how out-there can the theory be? The topological theory erases the need to know about a lot of earlier designs :)

indiriverflow
03-05-2009, 03:07 AM
OK, let's get quantum here :) My dad was a nuclear physicist in the Army for many years, and we've been chewing this subject over (thank God we are off political talks :) )

If you'd rather take it to email, lmk :)

This article http://www.cbc.ca/technology/story/2009/02/09/f-quantum-computing.html is a good overview of the general subject.

Usually the math used to describe the quantum states falls naturally from how the states are detected. Qubits hold a great deal more information than current binary systems, and that's the big deal with quantum computing. The research today has identified what sorts of logic gates (think circuits) can be used with all of these states. Experimental chips have been created that demonstrate that quantum states can be observed and used to represent data (instead of on/off for each gate, there are more possibilities).

Problems? Temperature can be a problem. How cold is likely for your servers? Many of the ways to observe quantum states are incredibly cold!! Some are just really cold, so that will matter. I found a couple that can be done without low temperatures, but that limits your choices of HOW they are observing the quantum states, and possibly how badly that mode of attack can handle errors.

The problem of errors: huge in something so sensitive to environment. Many approaches to reduce errors in quantum computers are algorithmic, currently. There is a fascinating new science called topological quantum computing that significantly reduces errors physically, but it involves really theoretical issues more than current quantum computing architecture. Do you like the idea of the newest kid on the block? The vocabulary rocks, that's for sure :)

I guess the first question before going into any particular research is How cold can it be? And how out-there can the theory be? The topological theory erases the need to know about a lot of earlier designs :)

Well, these boxes can be as cold as I want...in fact it adds an interesting layer to the story. The power drain sounds enormous, but it serves my purposes as well that it be power consumptive.

Do you suppose it would be possible for the supercooled state to be localized to the quantum core? Or would the entire works need to be in cryogenic storage? (Yes! I can make that happen. Witness the godlike powers I wield over this world. Bwahahahahah!)

The story does tend to assume that quantum computers will have the same basic components as servers of today. The quantum computer is there so that a) today's embryonic tech can be telegraphed, b) AI tech can become practical, and c) it sounds cool.

I don't mind being out on a limb with the tech, and I hope to put a credible vision of it together. My MC "may or may not" be a former programmer, but as a user, he doesn't need to be more than superficially aware of the details.

50 Foot Ant
03-05-2009, 03:20 AM
If you have superconductor, I seem to recall something about it being of use as a cooling array.

Since superconductor likes to be the same temperature all over, you heat-sink it in roughly 8 points, put it in a vacuum container (the heat sinks connect to the casing), and presto-chango, you have a serious cooling array.

I wish I could remember the exact technical data on it.

Think of a superconductor Faraday cage. Plus, it would help shield against EM interference.

Prawn
03-05-2009, 02:36 PM
Cryptonomicon, a great book by Neal Stephenson, has as its plot a secure data storage site which would act as an electronic currency for the world. Something you might want to check out when you get a chance.

indiriverflow
03-05-2009, 02:57 PM
Cryptonomicon, a great book by Neal Stephenson, has as its plot a secure data storage site which would act as an electronic currency for the world. Something you might want to check out when you get a chance.

I have read and been awed by it, thanks. This is a considerably different realm of technology, however; we're shooting into the future here.

I do love Neal Stephenson and that book in particular. My favorite line is the one about Waterhouse's private code. I still laugh when I think about what "I'm going to church" means when decrypted.

Are we agreed that some sort of supercooled liquid memory is likely, then? Any idea what sort of substance that would be?
Any other suggestions? Reasons this might not be plausible?

backslashbaby
03-05-2009, 04:55 PM
It would probably be easiest to have the whole thing in supercold conditions to take best advantage of the amazing powers of the processor. You could have the 'results' sent out to RAM (yes there is quantum RAM, I found!) that doesn't need to be so cold, but why not use the 'core' for the bulk of everything? It's not like you'd run out of memory, bwahaha :)

OK, so if you do the amazing topological QC:
Yes, supercold! To get the electrons to play nice.

The most studied approach (http://arxiv.org/PS_cache/arxiv/pdf/0707/0707.1889v2.pdf) uses materials that are like crystals, grown in the lab by something called "molecular beam epitaxy (MBE)". The crystals are called "(modulation-doped) 2D GaAs-AlGaAs heterostructures".

This structure allows for 2-dimensional electron gases to be observed for the quantum states. There would be layers of the heterostructures, then buffers, and then top layers of them (http://nanoscale.micro.uiuc.edu/Publications/publications/JVSTB11_p2254_1993.pdf).

These are more advanced than your basic QC because they have "non-abelian" properties (hard to find!) that are topological QC's answer to the error problem. There's a whole braiding structure, fascinating stuff, that accounts for the low error rates (very mathematical! :)

I'll chat with Dad about what I'm missing there, but that's the general idea :)

indiriverflow
03-05-2009, 05:19 PM
It would probably be easiest to have the whole thing in supercold conditions to take best advantage of the amazing powers of the processor. You could have the 'results' sent out to RAM (yes there is quantum RAM, I found!) that doesn't need to be so cold, but why not use the 'core' for the bulk of everything? It's not like you'd run out of memory, bwahaha :)

OK, so if you do the amazing topological QC:
Yes, supercold! To get the electrons to play nice.

The most studied approach (http://arxiv.org/PS_cache/arxiv/pdf/0707/0707.1889v2.pdf) uses materials that are like crystals, grown in the lab by something called "molecular beam epitaxy (MBE)". The crystals are called "(modulation-doped) 2D GaAs-AlGaAs heterostructures".

This structure allows for 2-dimensional electron gases to be observed for the quantum states. There would be layers of the heterostructures, then buffers, and then top layers of them (http://nanoscale.micro.uiuc.edu/Publications/publications/JVSTB11_p2254_1993.pdf).

These are more advanced than your basic QC because they have "non-abelian" properties (hard to find!) that are topological QC's answer to the error problem. There's a whole braiding structure, fascinating stuff, that accounts for the low error rates (very mathematical! :)

I'll chat with Dad about what I'm missing there, but that's the general idea :)

You are bringing in the hardcore science, BSB! Definitely want to hear more about the braiding structure...reminds me of something else with a braiding structure.

While we're on the subject, I'd be interested in his thoughts on software to render:
a complete human genome (base-five quinary code?)
a human neural net
a dynamic environment program serving realistic sensory input to thousands of these internally hosted AI's

What kind of math would best express these functions in code form? We'll be representing every axon and dendrite, at least in extrapolated modeling, but I was thinking the DNA file would be separate from the neuronic code, even though they would interact very closely.

Any guesses on how many qubits we would be talking about there? Just a rough estimate would be really helpful. What sort of processor speed would be needed to churn all this data? I imagine sort of a cluster arrangement.

Remember, this is set four decades in the future to allow this to mature into commercial technology. So we're speculating about the finished evolved product of these experimental efforts.

Please thank your Dad on my behalf; I'll make room for both of you on the acknowledgments page, if and when it comes to that.

benbradley
03-05-2009, 09:36 PM
Okay, but the trend has been for the complexity and operating bytes of software to rise, if anything, more quickly than the hardware. The more capacity, the more you'll need.

I'm not talking about personal computing so much as servers. As we all know, there is no such thing as unlimited storage, processor use, or bandwidth on a server. These won't be serving the net, but what I will somewhat inaccurately call AI's for the purpose of brevity.

The functions of storage, processing, and temporary memory seem basic to computing to me. I can't really imagine how it could be built without it.

I mean, that's been constant for the past forty years of computing. What kind of paradigm shift would override these functions?
These things are "basic" because those are the tradeoffs in current technology - temporary memory is fast in both writing and reading, but it needs power connected to keep its contents. That is current RAM technology. FLASH is permanent, but slower in writing. DIsk drives are slower in both reading and writing, and is also physically fragile - dropping a computer a couple feet can damage the drive.

I disagree. Going back 40 years, 1 kay of RAM wouldn't fit in your purse. Right now I'm sitting with a 1 Terabyte external drive alongside my laptop. That's your paradigm shift right there.
(ignoring the fact that you're comparing RAM from back then to disk storage now..)

I agree that such a huge QUANTITATIVE change makes for a QUALITATIVE change (we've got fancy windowing interfaces instead of cryptic-looking command lines), but in addition there are other significant changes. Forty years ago most computers' main memory was magnetic cores - little donuts, 1 bit per core. This was RAM, but unlike the semiconductor RAM that became denser and more cost-effective, core memory does not lose its data when the circuitry is powered off. I don't know if that feature was heavily relied on back then, but it's definitely a useful feature that semiconductor RAM doesn't have (thus we have "hibernate" mode, where Windows writes an image of RAM to disk before powering off, and on powering up reloads the RAM image so it doesn't have to go through the contortions of a full bootup and restarting your apps).

Other semiconductor memory such as EPROM, EEPROM, and most recently FLASH does save its contents when you power it off, but it takes about a thousand times longer to write each word of data to it than it does with RAM.

There is a newish invention called a memristor that's been in the electronics press in the last few years. It was conceived of in the 1970's or so as the "fourth basic electrical component" after resistors, capacitors and inductors. I think that's overstating its fundamental importance, but even so, it might become as important as the transistor. One of these was made and demonstrated in the lab in recent years. If it can be made small and millions/billions put on a chip the way transistors are now, it could be a revolutionary new type of memory, that could replace both a computer's main RAM and long-term non-volatile (hard disk and FLASH) storage, similar to what core memory could have done if it weren't so large and expensive (each core has three wires that go through it, and they were hand-wired).

It would always be more secure via a cable, IMHO, but encryption can make any data transfer pretty darned strong.
You might want to make the encryption keys have maybe 100 or 1,000 times as many bits as are currently used. This shouldn't slow things down too much, but quantum computers will have huge parallelism and break current keys in milliseconds. Having huge keys (by today's standards) might at least slow it down to hours or (you would hope) years.
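
The rough arithmetic behind the longer-keys advice, assuming the standard result that a quantum search (Grover's algorithm) roughly halves the effective length of a symmetric key (illustrative Python only; public-key algorithms are a different story):

# Grover-style quantum search cuts a symmetric key's effective length about in half:
# brute force drops from roughly 2**k tries to roughly 2**(k/2).
for bits in (128, 256, 1024, 4096):
    print(f"{bits:>5}-bit key: ~2^{bits} classical tries, ~2^{bits // 2} quantum tries")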

As far as 40 years in the future, I imagine data would be stored in electron and atomic spins (there's even something CALLED spintronics), making storage density basically equal to atomic density.

Of course everything will be so ubiquitous that only specialists will know anything about the internals, such as how much storage is in the chip under your fingernail.

And of course everything you do in life will be recorded, not just audio and video, but in 3-D...

benbradley
03-05-2009, 09:54 PM
You are bringing in the hardcore science, BSB! Definitely want to hear more about the braiding structure...reminds me of something else with a braiding structure.

While we're on the subject, I'd be interested in his thoughts on software to render:
a complete human genome (base-five quinary code?)
a human neural net
a dynamic environment program serving realistic sensory input to thousands of these internally hosted AI's

What kind of math would best express these functions in code form? We'll be representing every axon and dendrite, at least in extrapolated modeling, but I was thinking the DNA file would be separate from the neuronic code, even though they would interact very closely.

Any guesses on how many qubits we would be talking about there? Just a rough estimate would be really helpful. What sort of processor speed would be needed to churn all this data? I imagine sort of a cluster arrangement.

Remember, this is set four decades in the future to allow this to mature into commercial technology. So we're speculating about the finished evolved product of these experimental efforts.

Please thank your Dad on my behalf; I'll make room for both of you on the acknowledgments page, if and when it comes to that.
Hans Moravec has done this sort of calculation, specifically using neurons in the retina and extrapolating from there, in his book "Robot: Mere Machine to Transcendent Mind," and he was quoted in Ray Kurzweil's "The Singularity Is Near." You can get an overview by reading Ray's book, or you can go more in-depth by reading (ETA: the books listed in) the bibliography in Ray's book. They don't give values in qubits but in standard bits and bytes (multiplied by 10 to the power of large exponents), and speeds in comparison to modern computers. And they say right out these are approximations, extrapolations, etc., and can be off a couple of magnitudes either way.
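
If you want a placeholder until you read them, the usual back-of-the-envelope shape of that estimate looks something like the Python below. The figures are order-of-magnitude ballparks I'm plugging in myself, not Moravec's or Kurzweil's exact numbers.

import math

# Commonly cited ballpark figures (order-of-magnitude only).
SYNAPSES   = 1e14        # synapses (connections) in a human brain
FIRE_RATE  = 1e2         # rough signalling events per synapse per second
BASE_PAIRS = 3.2e9       # base pairs in the human genome

ops_per_second = SYNAPSES * FIRE_RATE        # ~1e16 "synaptic events" per second
memory_bits    = SYNAPSES * 10               # assume ~10 bits of state per synapse
genome_bits    = BASE_PAIRS * math.log2(5)   # quinary symbols -> bits (~2.32 each)

print(f"brain, per second : ~{ops_per_second:.0e} operations")
print(f"brain, state      : ~{memory_bits:.0e} bits (~{memory_bits / 8e12:.0f} TB)")
print(f"genome (quinary)  : ~{genome_bits:.0e} bits (~{genome_bits / 8e9:.1f} GB)")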

For some thoughts on software/algorithms to simulate neurons, I'm currently reading "Neural Computing Theory and Practice" by Philip D. Wasserman. It's a 20-year-old book, but still interesting. It may be more detail than you really want to know: it describes the basic mathematical modeling (then) currently done for artificial neural nets (with an admission that the modeling is simplified from actual neurons, but these nets DO learn and do useful things), but it also gives a short history of 1960's neural network experiments with "perceptrons" and such.

kuwisdelu
03-05-2009, 09:56 PM
Meanwhile, I'm still listening to vinyl LPs....

/derail

benbradley
03-05-2009, 10:35 PM
Meanwhile, I'm still listening to vinyl LPs....

/derail
Not that there's anything wrong with that. ;)

Mr.H.
05-01-2009, 05:19 PM
Wow! I'm glad I stumbled across this thread. I happen to be writing a novel I started a few years ago on this very topic! I've been writing furiously on it lately (I don't know what's motivating me so much lately, but whatever!), and I am thrilled to see so many parallels in my book and your thinking. However, just in reading through this thread, I have a few more ideas/directions to go that are helping me with writer's block I didn't even know I had!

Thanks!!

dgiharris
05-02-2009, 10:43 AM
I don't think I've heard it mentioned, but data storage via transistor technology follows Moore's Law

http://en.wikipedia.org/wiki/Moore's_law

Experts feel we will hit the limit of Moore's Law in about 10 - 20 yrs. You can extend that equation to forty years and it will be fairly accurate about data storage capabilities.
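
If you want to run that extrapolation yourself, here's a quick Python sketch assuming the common "capacity doubles roughly every two years" reading of Moore's Law (the starting size and doubling period are placeholders to tune, not hard numbers):

# Moore's-Law-style extrapolation: capacity doubles every DOUBLE_EVERY years.
START_TB = 1.0        # a roughly state-of-the-art consumer drive today (placeholder)
DOUBLE_EVERY = 2.0    # years per doubling (the usual rough reading)

for years in (10, 20, 40):
    capacity_tb = START_TB * 2 ** (years / DOUBLE_EVERY)
    print(f"in {years} years: ~{capacity_tb:,.0f} TB per drive")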

My understanding of Quantum computing is limited. However, I think all discussions have always been on processing and NOT data storage.

Mel...