
Hard drive capacities are getting surprisingly large these days, but the density of data written to traditional materials can only take us so far. Researchers from Delft University of Technology in the Netherlands have devised a way to store information at the atomic scale by manipulating single atoms. The density of the proof-of-concept atomic storage created for the study is 500 terabits per square inch. That's 500 times denser than the highest-capacity hard drives.
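For a sense of scale, here's a quick back-of-the-envelope Python check of what that density means per bit. The roughly 1 terabit per square inch figure used for the hard drive comparison is an assumption for illustration, not a number from the study.

```python
# Back-of-the-envelope check of the quoted density figure.
density_bits_per_sq_inch = 500e12      # 500 terabits per square inch
nm2_per_sq_inch = (25.4e6) ** 2        # 1 inch = 25.4e6 nanometers

area_per_bit = nm2_per_sq_inch / density_bits_per_sq_inch
print(f"Area per bit: {area_per_bit:.2f} nm^2")   # roughly 1.3 nm^2

# Assumed ~1 terabit per square inch for a modern hard drive platter
hdd_density = 1e12
print(f"Ratio vs. hard drives: {density_bits_per_sq_inch / hdd_density:.0f}x")
```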

The atomic storage developed at Delft University of Technology is based on the scanning tunneling microscope (STM). The same basic technique has been used to move individual atoms around for decades. In 1990, physicist Don Eigler made news when he managed to spell out the letters "IBM" with 35 xenon atoms. Scientists have been experimenting with storing data in arrangements of atoms ever since, but the Delft study avoids many of the pitfalls of such methods by using a storage grid based on atomic vacancies rather than on the positions of the atoms themselves.

While it's possible to move atoms around, they don't necessarily like to stay put. The team from Delft realized that chlorine atoms deposited on a copper substrate form a regular grid. Information can be stored much more reliably in this grid by sliding atoms around with the STM tip, like tiles in a sliding puzzle. The pattern of vacancies left behind actually encodes the bits of information. This is much more stable than attempting to arrange individual atoms; the team reports better than 99% reliability in the data.
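As a rough illustration of the idea (not the Delft team's actual data format), the toy sketch below treats each bit as a pair of lattice sites sharing one vacancy: which site is empty encodes 0 or 1, and a "write" amounts to one puzzle-style slide of the atom into the vacancy.

```python
# Toy model of vacancy-coded bits (an illustrative sketch, not the team's
# actual encoding). Each bit is a pair of lattice sites holding one chlorine
# atom and one vacancy; which site is empty encodes 0 or 1, and flipping a
# bit slides the atom into the vacancy, like moving a tile in a sliding puzzle.
ATOM, HOLE = "Cl", "_"

def make_cell(bit: int) -> list:
    # Vacancy on the first site encodes 0; on the second site encodes 1.
    return [HOLE, ATOM] if bit == 0 else [ATOM, HOLE]

def read_cell(cell: list) -> int:
    return 0 if cell[0] == HOLE else 1

def write_cell(cell: list, bit: int) -> None:
    # One STM "slide": swap the atom and the vacancy if the bit must change.
    if read_cell(cell) != bit:
        cell[0], cell[1] = cell[1], cell[0]

# Round-trip a single byte through eight two-site cells.
byte = 0b01000110
cells = [make_cell((byte >> i) & 1) for i in reversed(range(8))]
write_cell(cells[0], 1)   # flip the most significant bit, then flip it back
write_cell(cells[0], 0)
recovered = 0
for cell in cells:
    recovered = (recovered << 1) | read_cell(cell)
assert recovered == byte
```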


1KB of atomic storage

The storage matrix shown above is an example of what the researchers were able to achieve. The array is roughly 100 nanometers across and contains one kilobyte of data. There are 144 blocks, each one with atomic vacancies that encode eight bytes of information. In this case, it's storing an excerpt from Richard Feynman's "There's Plenty of Room at the Bottom" lecture.
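The sketch below shows, in simplified form, how a short ASCII excerpt could be chopped into 144 blocks of eight bytes each. The stand-in text and the zero-padding are illustrative assumptions, not the memory's actual on-surface layout.

```python
# Simplified mapping of an ASCII excerpt onto 144 blocks of 8 bytes each.
# The quoted line and the zero-padding are placeholders for illustration.
BLOCKS, BLOCK_BYTES = 144, 8
CAPACITY = BLOCKS * BLOCK_BYTES            # 1,152 bytes, roughly one kilobyte

excerpt = "There's Plenty of Room at the Bottom"   # placeholder stand-in text
data = excerpt.encode("ascii")[:CAPACITY].ljust(CAPACITY, b"\x00")

blocks = [data[i:i + BLOCK_BYTES] for i in range(0, CAPACITY, BLOCK_BYTES)]
print(len(blocks), "blocks of", BLOCK_BYTES, "bytes each")
print("First block:", blocks[0])
```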

Unfortunately, this is the largest example of atomic storage the team has produced so far. The limited capacity isn't the only hurdle to overcome, either. Even with the improved stability of the vacancy approach, the storage medium still needs to be cooled with liquid nitrogen to -196 degrees Celsius (-320 F). The pathfinding algorithm used to read data from the atoms is slow by storage standards as well; it currently takes 10 minutes to read the 1KB block. Existing STM technology should be able to boost that to 1Mbps eventually. The other problems will take more work and the development of new technologies, but the team believes this approach has the potential to change the nature of data centers.
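To put that read speed in context, a quick calculation (treating 1KB as 1,024 bytes) shows the gap between today's rate and the projected 1Mbps:

```python
# Gap between the current read speed and the projected 1Mbps.
bits_read = 1024 * 8                 # the 1KB demonstration block
read_time_s = 10 * 60                # ten minutes
current_rate = bits_read / read_time_s
target_rate = 1e6                    # 1Mbps projection
print(f"Current read rate: {current_rate:.1f} bit/s")
print(f"Speed-up needed for 1Mbps: {target_rate / current_rate:,.0f}x")
```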

Now read: How DNA data storage works