Rather than trying to shrink current data storage technologies further, IBM took the opposite approach and designed a new system from the ground up, building it from individual atoms. The new storage could lead to a 100-fold increase in chip density. Take that, Moore's Law.
Researchers at IBM, working with the German Centre for Free-Electron Laser Science, used a scanning tunneling microscope to line up the iron atoms that comprise the magnetic storage system. They found that twelve iron atoms, assembled in two rows of six, were the minimum needed to stably hold a single bit of information. Eight of those twelve-atom groups, or 96 atoms, hold a byte. Conventional hard drives, by contrast, require more than a million atoms per bit, or roughly eight million per byte.
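As a back-of-the-envelope comparison (the conventional-drive figure is the article's rough approximation, not an exact spec):

```python
# Rough comparison of atoms needed per byte, using the figures above:
# 12 iron atoms per bit in IBM's prototype vs. about a million atoms
# per bit in a conventional hard drive (approximate).
BITS_PER_BYTE = 8

prototype_atoms_per_bit = 12             # two rows of six iron atoms
conventional_atoms_per_bit = 1_000_000   # rough figure for current drives

prototype_atoms_per_byte = prototype_atoms_per_bit * BITS_PER_BYTE
conventional_atoms_per_byte = conventional_atoms_per_bit * BITS_PER_BYTE

print(prototype_atoms_per_byte)      # 96 atoms per byte
print(conventional_atoms_per_byte // prototype_atoms_per_byte)  # 83333
```

In other words, even taking the million-atoms-per-bit estimate at face value, the atomic-scale bit is tens of thousands of times denser.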
Unfortunately, researchers aren't holding their breath waiting for this technology to come to market. Producing it would require an entirely new fabrication process, along with the huge capital outlay for equipment and facilities that comes with one. The prototype itself only worked in an extremely low-temperature environment, and data had to be written with a scanning tunneling microscope.
The situation wasn't a complete wash, however. IBM identified a type of magnetism that could prove extremely useful in future data storage products. Known as antiferromagnetism, it's the inverse of the force that keeps your kid's art on the fridge. Conventional magnetism doesn't work well at the atomic scale, as neighboring magnets would interfere with one another. Antiferromagnetism, on the other hand, inhibits this interaction and allows researchers to build much smaller structures. [Popular Science - GigaOm - MSNBC - Image: Sebastian Loth/CFEL]