Computers a million times faster than the ones we have now? Who wouldn’t want that?
Scientists at Georgia State University in the US and the Max Planck Institute for Quantum Optics in Germany have come up with a new way of producing the “on/off” states that form the basis of electronic computing. Using materials called dielectrics, the researchers discovered they could switch signals in the breakneck femtosecond range – far beyond today’s supercomputers.
The process involves flashing pulses of light onto a new type of substrate, producing a binary switch 10,000 times faster than modern transistors can manage. Turn that result into a working chip and computers could operate in the petahertz range – a million times faster than today’s machines, which run in the gigahertz range.
But before you get your stockbroker on the phone, the experiments were conducted with mere chip components, not a complete chip or working processor.
“There’s quite a gap between observing an effect and its implementation in a computer,” says postdoctoral researcher Nicholas Karpowicz, who worked on the project at the Max Planck Institute. “In the near term we have to do much more.”
John Gustafson, chief product architect at chip manufacturer AMD, points out that a computer is only as fast as its slowest part. “If you make every transistor ‘infinitely’ fast it’d still be so limited by the speed of its wires you’d only see a few per cent increase in performance,” he says.
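Gustafson’s point is essentially Amdahl’s law: accelerating only one part of a system caps the overall gain. A minimal sketch, assuming (hypothetically) that transistor switching accounts for just 5% of total cycle delay, with wire delay making up the rest:

```python
def amdahl_speedup(accelerated_fraction, factor=float("inf")):
    """Overall speedup when only a fraction of the delay is accelerated (Amdahl's law)."""
    serial = 1 - accelerated_fraction
    if factor == float("inf"):
        # The accelerated portion's contribution vanishes entirely.
        return 1 / serial
    return 1 / (serial + accelerated_fraction / factor)

# Infinitely fast transistors, but wires (95% of the delay) unchanged:
gain = amdahl_speedup(0.05)
print(f"Overall speedup: {gain:.3f}x")  # only ~5% faster overall
```

Under that assumed split, making every transistor infinitely fast yields roughly a 1.05x overall speedup – the “few per cent” Gustafson describes.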
The other changes in chip design needed to bring dielectrics to their full potential are further miniaturisation and managing the physics of heat and power consumption. “We can print billions of transistors on a single chip, but if we try to operate them all at once it’ll get hotter than a stovetop,” says Gustafson.
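The heat problem follows from the standard dynamic-power relation for switching circuits, P = a·C·V²·f, in which power grows linearly with clock frequency. A sketch with illustrative figures (the activity factor, capacitance and voltage below are assumptions, not measurements from the article):

```python
def dynamic_power_watts(activity, capacitance_f, voltage_v, freq_hz):
    """Dynamic switching power: P = a * C * V^2 * f."""
    return activity * capacitance_f * voltage_v**2 * freq_hz

# A chip-like load clocked at 3 GHz (illustrative parameters)...
base = dynamic_power_watts(activity=0.1, capacitance_f=1e-9,
                           voltage_v=1.0, freq_hz=3e9)
# ...and the same load clocked a million times faster.
fast = dynamic_power_watts(activity=0.1, capacitance_f=1e-9,
                           voltage_v=1.0, freq_hz=3e15)
print(base, fast)  # power scales linearly with frequency
```

With everything else held equal, a million-fold clock increase means a million-fold power increase – which is why frequency gains alone, without new approaches to heat and power, hit a wall.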
Assuming scientists can develop the new technology into chips and business can mass market it after a time-consuming and expensive process of building or retooling factories, recalibrating tools and then waiting four years (the usual hardware product cycle), a brave new world certainly awaits.
Mike Ford, head of physics and advanced materials at the University of Technology, Sydney, agrees the applications are a long way off but says they are meaningful, especially if dielectric experiments lead to optical chips. Fibre optics are fast (light speed, in fact), but traditional, slower processors at each end still have to process signals.
“Optical chips are potentially very fast, and it’s the bandwidth advantage that may be more significant,” he says.
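Ford’s bandwidth point can be sketched as a pipeline whose throughput is capped by its slowest stage – the fibre link or the electronic processors terminating it. The rates below are illustrative assumptions, not figures from the article:

```python
def end_to_end_gbps(fibre_gbps, endpoint_gbps):
    """Effective throughput of a fibre link terminated by processing chips."""
    # The slower stage sets the ceiling for the whole path.
    return min(fibre_gbps, endpoint_gbps)

# A fast fibre link bottlenecked by conventional electronics at each end...
print(end_to_end_gbps(fibre_gbps=400, endpoint_gbps=100))
# ...versus the same link terminated by (hypothetical) optical chips.
print(end_to_end_gbps(fibre_gbps=400, endpoint_gbps=400))
```

Only when the endpoints keep pace with the link does the fibre’s full bandwidth become usable – which is why optical chips matter even though the fibre itself is already fast.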
Also, such quantum leaps in performance will affect only specific data management fields – they won’t vastly change your laptop or smartphone experience. While quantum physics calculations or social media data mining will be transformed, “faster CPUs won’t make Internet Explorer suck less,” as one technologist, Soren Harner, vice-president of engineering at e-commerce platform Bigcommerce, put it.
The sectors faster computing will expand (and create) are those like real-time analytics, which let us act on insights more quickly – from global Twitter conversations to datasets from entire satellite fleets used to predict dangerous weather events.
Karpowicz mentions biological molecule simulation and meteorological models as two examples currently held back by the limits of available computing power.
Harner also raises the option of artificial intelligence agents doing our online bidding for us with the help of these faster processors.
“We could create intelligent systems to manage resources dynamically,” he says. “Imagine if a human specified high-level goals and the software agent solved these goals in a semi-autonomous fashion, whether bidding for electricity on a dynamic market or scouting local farms for seasonal produce delivery.”
Given the often surprising ways people have found to use everything from the Xbox Kinect to the NBN, Karpowicz says the uses of a technology are never a given.
“Often we don’t realise how useful a new technology is until we have the chance to use it,” he says.