Science Forums

Data storage limits - a new take on it?


Recommended Posts

So common wisdom holds that the ultimate limit to the data storage capacity of any particular device (be it a disk platter or solid-state memory) is determined by how many atoms there are in a given surface area. We are still a long way from this theoretical limit, but we are getting there.

 

For instance, if your data is stored as the magnetic orientation of some structure, you can make that structure smaller and smaller until you are reading the magnetic state of a single atom. That atom represents one bit of information; the atom right next to it, the second bit; and so on. But once you reach the scale of individual atoms for data storage, you're pretty much stuck. You've reached a conceptual limit - but only because our approach to the whole idea is a bit myopic.

 

Consider the following as a handy analogy:

 

I have eight iron nails, lined up next to each other. They are all unmagnetized. If I were to use the magnetic properties of these eight nails as data, they would represent 00000000. Let's say I magnetize the first, fifth, seventh and eighth nails. They would now represent 10001011 in binary, or 139 in decimal. That's exactly what happens in a hard drive. Now if each nail represented an atom, I'd be stuffed - there is nothing smaller (let's forget about subatomic particles for a second - although the argument would hold for whatever smallest limit you'd care to use).
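The nail encoding above can be checked in a couple of lines (a throwaway sketch; the nail numbering 1..8 matches the text):

```python
# Eight nails: a nail in this set is magnetized (1), the rest are unmagnetized (0).
magnetized = {1, 5, 7, 8}  # the first, fifth, seventh and eighth nails
bits = "".join("1" if i in magnetized else "0" for i in range(1, 9))

print(bits)          # the binary reading of the nails: 10001011
print(int(bits, 2))  # the same value in decimal: 139
```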

 

But what if I were to measure another, independent property of these nails?

 

Let's leave the magnetic properties as we did, having 10001011 as data in that particular dimension. Let's paint the first, second, fourth and seventh nails blue, and the rest yellow. Let's say yellow represents 0, and blue represents 1. Then we'd have 11010010 as data in this dimension, without altering or disabling the usability of the magnetic property as a data carrier at all.

 

We can even paint the nails' heads in a different scheme - let's say red is 0 and green is 1. Then the body of the nail is one data carrier, being blue or yellow, while the heads, being red or green, are a completely different data carrier. Let's say we paint the heads of nails five and seven green, and the rest red. Then we would have 00001010 as data using the heads' colours.

 

We can also turn the nails upside down: head up is 1, head down is 0. Say they all start head down; let's turn nails one, three, five and eight head up. That gives us 10101001 as a totally separate, independent data stream.

 

Thus, using only the examples above, eight nails (or fundamental particles, if you will) give us four completely independent data streams. Reading or changing any one property, as determined by how the observer tests for a 1 or a 0, does not impede any of the others, leaving four perfectly usable data streams.

 

Using the magnetic property, eight nails gives us:

10001011

 

By looking a little further and using other independent properties of these same nails, eight nails gives us:

10001011

11010010

00001010

10101001

 

That's four completely independent data streams, using different properties as the carrier. Not one of them will influence any other. You can basically "format" the magnetic property of your "nail-drive", leaving the other three data streams intact.
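As a toy model of the "nail-drive" (purely illustrative; the property names are made up for this sketch), each nail carries one bit per independent property, and wiping one property leaves the others untouched:

```python
# Each of the eight nails carries one bit per independent property.
properties = {
    "magnetized": [1, 0, 0, 0, 1, 0, 1, 1],
    "body_blue":  [1, 1, 0, 1, 0, 0, 1, 0],
    "head_green": [0, 0, 0, 0, 1, 0, 1, 0],
    "head_up":    [1, 0, 1, 0, 1, 0, 0, 1],
}

def read_stream(name):
    """Read one property across all eight nails as a bit string."""
    return "".join(str(b) for b in properties[name])

print(read_stream("magnetized"))  # 10001011

# "Format" the magnetic stream; the other three streams are unaffected.
properties["magnetized"] = [0] * 8
print(read_stream("magnetized"))  # 00000000
print(read_stream("body_blue"))   # 11010010
```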

 

Sure - it's a bit of a rough analogy, but it serves to illustrate that the fundamental limits to storage space, as presented to us by the computer industry, are not necessarily determined by scale alone. If we use our heads, we should be able to conjure up ways of testing for properties other than the magnetic/electrically charged one. That would leave us with a limit determined not by the size of the atoms and how much data can be stored per cm², but by how many individually changeable properties an atom might have.
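The scaling argument here is simple arithmetic: with N storage sites and k independent one-bit properties per site, capacity grows as N × k bits (the numbers below are just the nail example, not real hardware figures):

```python
# Capacity with k independent one-bit properties per storage site.
def capacity_bits(sites, properties_per_site):
    return sites * properties_per_site

n = 8  # eight nails (or atoms)
print(capacity_bits(n, 1))  # 8 bits using magnetism alone
print(capacity_bits(n, 4))  # 32 bits using all four properties
```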

 

Any ideas?


Well, how fast can alternative storage methods read data? Most hard drives spin at a few thousand RPM, and newer ones at upwards of five thousand RPM. Hard drives also store data in sectors, and they usually have three or four platters. So how long would it take to read and write data?

 

There is another method of data storage: solid-state drives. These are in their early stages and cost a lot of money for small amounts of storage space. They read and write data differently, in blocks. These are better for laptops and such, because if they are dropped there is no fear of the platters being scratched - there aren't any.


There are a lot of data storage schemes in use, with different data densities, access speeds, and other characteristics, using many different underlying physical mechanisms, not just magnetic polarity (like a hard drive) or electrical charge (like flash memory). Some are pretty amazing – I recall a storage method that IBM was working on around the turn of the millennium that used a microscopic array of precisely movable "needle" styluses, like those on a scanning tunneling microscope, to store data as pits and bumps no larger than an atom or two on various kinds of material. Another I thought wondrous involved using molecular sequencing and assembly devices to store digital information in long molecular strands, such as DNA. Most of these schemes haven't come to market or gotten a lot of attention because, I think, methods like magnetic hard drives and flash memory have done so well at meeting present-day practical needs.

 

Nothing I’ve heard of could exceed a 1 bit/atom storage density, and most are many times less dense.

 

Boerseun’s idea of using many measurable characteristics of some material, such as magnetic polarity, physical shape, and color of reflected light, doesn’t seem to me to have much potential, because these characteristics require collections of atoms many times greater than present day storage devices need to store the same number of bits.

 

There is an obvious way to break the roughly 1 bit/atom data density limit: use something much, much denser than atoms, or than fermions of any kind. Non-fermions (that is, bosons) follow completely different physical laws concerning density; the greatest consequence is that there is essentially no limit on how many can occupy the same volume at the same time. So it stands to reason that a boson-based storage device could have a much, much greater density than any present-day, fermion-based one. Everybody's favorite boson is the photon, so a good name for this approach, and the one most widely used, is photonic computing.

 

That said, current technology is far, far from being able to produce better than a rudimentary photonic storage device. Most interest currently lies in developing very high-speed computing and data-transmission devices (more than 1,000 times faster than the best electronic approaches), not in using photons for storage, which electronic and magnetic approaches handle well now.

 

I find it interesting to muse that, according to some of the most popular theories, the whole universe is now, and will ultimately be nothing but, a big photonic storage device. Nearly every datum worth knowing is presently represented somewhere in space as an orderly pattern of photons – the light from my computer screen, for instance, speeding away from earth through space. Far, far in the future, it's predicted, all fermionic matter will have been transformed into photons, and the whole universe will be nothing but that data, with no possible way to read it, or write anything new.


Very interesting post, Craig (must spread the love around some more, apparently :)).

 

How would a photonic storage device work, theoretically?

If any number of photons can occupy any given space, then how would one differentiate the photons in that space? Perhaps the number of photons occupying a space could be calculated and assigned a value, kind of like a photon assembly code?


Quote: "How would a photonic storage device work, theoretically? If any number of photons can occupy any given space, then how would one differentiate the photons in that space? Perhaps the number of photons occupying a space could be calculated and assigned a value, kind of like a photon assembly code?"

Presumably one way for them to work would be the same way "photonic communication devices" do, of which we have plenty these days, more commonly known as fiber-optic communication devices. Digital data (binary is easiest, though other bases are possible) is encoded as short pulses of many photons. Alternatively, offering much more data capacity and speed at lower power, the data can be encoded in the polarization of each photon, using, for example, up-down vs. left-right polarization. Both kinds of devices are on the market now, though the polarization-encoding kind are more exotic, and used sometimes for quantum encryption, a cool but off-topic subject. :)
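The simplest pulse scheme described above (on-off keying: a pulse of photons in a time slot is a 1, darkness is a 0) can be sketched abstractly; the slot labels below are invented for illustration, not real optics:

```python
# On-off keying: a pulse of light in a time slot is a 1, no pulse is a 0.
def encode(bits, pulse="PULSE", dark="dark"):
    """Map each bit of a string to a light pulse or a dark time slot."""
    return [pulse if b == "1" else dark for b in bits]

def decode(slots, pulse="PULSE"):
    """Recover the bit string by checking each time slot for light."""
    return "".join("1" if s == pulse else "0" for s in slots)

signal = encode("10001011")
print(decode(signal))  # round-trips to the original bits: 10001011
```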

 

All mainstream physics suggests that you can reliably encode one of a huge number of states in a single photon – the wikipedia article states, unreferenced, "The visible light spectrum alone could enable 123 billion bit positions". So the potential data density appears to be pretty mind-boggling.
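If a single photon could reliably carry one of S distinguishable states, it would encode log2(S) bits; taking that unreferenced 123 billion figure at face value:

```python
import math

# Bits encodable in one symbol that can take S distinguishable states.
def bits_per_symbol(states):
    return math.log2(states)

print(bits_per_symbol(2))                # plain binary: 1.0 bit per symbol
print(round(bits_per_symbol(123e9), 1))  # ~36.8 bits if 123 billion states were distinguishable
```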

 

Encoding the data as photons is the easy part. Actually keeping the data around long enough to be useful is the hard part, as unlike electrons and atoms, photons don’t characteristically stay nearby very long. Though one could in principle keep a lot of data reflecting between a pair of mirrors, in practice maintaining this for even a brief duration is a daunting feat of optical engineering without some sort of “photonic amplifier/repeater”. Such devices appear to me to be in their technological infancy. I’ve read a few references to “trapping light in crystals” with regards to photonic storage, similar to some well-reported “making light stand still” experiments, but this too seems very early research.
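The difficulty of keeping photons around can be made concrete with a back-of-the-envelope estimate (the mirror spacing and reflectivity below are made-up illustrative numbers, not a real cavity design): with mirrors a distance d apart, a photon hits a mirror every d/c seconds, and with reflectivity R it survives on the order of 1/(1-R) bounces.

```python
# Rough photon lifetime between two mirrors (toy numbers only).
c = 299_792_458.0  # speed of light, m/s

def storage_time(distance_m, reflectivity):
    """Approximate survival time: bounce interval times expected bounce count."""
    bounce_interval = distance_m / c              # time between mirror hits
    expected_bounces = 1.0 / (1.0 - reflectivity)  # ~1/(1-R) hits before absorption
    return bounce_interval * expected_bounces

# Even with 99.999% reflective mirrors 1 m apart, light survives ~0.3 ms.
print(storage_time(1.0, 0.99999))
```

Which is why some sort of amplifier/repeater, or a way to trap the light, seems unavoidable for storage.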

 

Photonic computers are pretty well represented in science fiction, at least the space opera branch of it. Lots of 1990s Star Trek made passing references to it.


Quite interesting!

 

It makes sense now that photons could encode an extremely large number of bits by utilizing all of their properties (wavelength, polarization, etc.). It is, as you and the wiki suggest, boggling.

 

I'm intrigued by the "trapping light in crystals" remark. Can you provide a good reference for such a study? My curiosity is proving insatiable right now. :)


Here's a blurb of a thought I had while reading this thread:

 

Qubits are awesome; I've been looking at those for quite a bit, along with some sort of quantum data storage. And asymmetric multiprocessing: not really even quantum, just many simplistic cores. I'm talking a many-million-core processor and software written to utilize them (threading is really appealing all of a sudden :) ), perhaps together with vectorized APUs of a similar structure, not unlike the original Cell architecture. Then we may have the first machines that won't operate in binary: even though internally it will still be signal/no signal, they will not use single-core logic; logic can consist of multiple states/results of multiple cores...

