With computers evolving at such a pace, can anyone say what they won’t be able to do in 50 years?
From robot servants to small handheld communicators, biological clones to a computer network connecting you to the sum total of human knowledge, some of the most iconic ideas from a century of science fiction have come to pass.
Computing is like travelling. It used to be an isolated, specialised exercise — trudging overland with teams of donkeys. Now it’s like being able to fly anywhere on Earth within 24 hours. The power to pluck knowledge, engage in commerce, seek inspiration or commit crime at the four corners of the globe is on a desk in virtually every home in the western world.
It’s even more like travel when you consider that after those intrepid explorers conquered the known world, where else was there to go? Now that everything’s connected to everything else, has the computer reached its zenith? Is Earth now one giant supercomputer, as eerily prophesied in Douglas Adams’ Hitchhiker’s Guide to the Galaxy series?
Marian Salzman is part of the New York-based team of futurists behind the bestselling books Next, The Future of Men and Next Now. She thinks that, precisely because of ubiquitous connectivity, we’ve barely scratched the surface of computing. “Actually there are numerous new worlds to conquer because of the Internet,” she says. “An infinite number of nuanced experiences and opportunities to pop into and out of bytes of life.”
In other words, where mere websites might not have brought people in their millions together, the so-called ‘Web 2.0’ applications like YouTube and Flickr have, and in doing so, they’ve created a new world within the Internet. The Net itself, it seems, is just the construct — waiting for us to overlay our nature upon it.
The Sum of Us
But one thing will never change, and that’s the fact that you need a machine to process and retrieve data. Whether it’s the 1,800-square-foot ENIAC — the world’s first general-purpose electronic digital computer, unveiled in 1946 — or a nanoparticle we’ll one day inject into our cerebellum, there are certain constants.
“You can think of a computer like a car,” says Jeff Morris, client strategist, Dell Asia Pacific. Dell is one of the more obliging computer manufacturers when it comes to talking about their future product designs. Most vendors are secretive simply because of the money involved in the manufacture and provision of computer equipment, so we’re left with a curious world in technology where virtually no future exists until we’re on the cusp of the PR junket to launch it.
“There’s an engine, brakes and gears,” Morris continues. “Computers are similar — the principles have been the same since the beginning and have set down the components we use today, no matter how different. What changes is the evolution of how they look and feel. For example, moving from LCD to LED will make notebooks thinner and lighter, and technologies like Windows Vista are driving the increase in graphics processing, so notebooks have to be rebuilt to adapt to it, by cooling more efficiently, for example.”
To operate a computer, you need a processor chip, short-term memory in which applications run (RAM), a physical medium where information is stored digitally (the hard drive), and an interface to both input your requests and receive the results (the monitor, keyboard and mouse). In Jeff Morris’ words, ‘there’ll always be a piece of silicon processing information.’
Caught in the Net
What will change — the groundwork for which we’ve already seen — is where and how those components work. Slowly but surely, they’re all moving online.
It started with free web hosting back in the day, then Google threw down the gauntlet with the 2.5GB of space it offers Gmail users. Free or low-cost online storage services are springing up all over the web; could they be the future, when all data storage is distributed across several enormous server farms instead of sitting on billions of comparatively tiny (and mostly empty) standalone hard disks across the world?
Serving applications over the Internet has already become a reality. Again on the cutting edge, Google launched its online suite of office applications during 2006. While it hasn’t caused the wholesale dumping of Microsoft Office, every revolution has its small first steps.
On the back of keeping your data and applications online, can online processing be far off? Instead of your request to send an email or update the shopping list in Excel going through a processor in your PC, why not send it over the Internet to a massive bank of supercomputers that can process it (and trillions like it) every second — particularly if everything else is online to begin with?
You may be surprised, but that’s already here too — since the late ’90s, SETI@home has operated on just that model. An application that lives on your computer receives data from radio telescopes, crunches it to look for anomalies, and reports the findings back to the Search for Extraterrestrial Intelligence. Instead of buying an enormous and expensive machine to process its own data, SETI created one of the world’s first distributed processing systems. With so many net surfers tapping into the novelty of what they were doing, SETI turned PCs the world over into one single virtual processor.
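To make the idea concrete, here’s a minimal sketch in Python of the work-unit pattern SETI@home made famous: split one big job into small chunks, crunch each chunk independently (potentially on a different volunteer’s PC), then gather the results. Every name and the ‘anomaly’ test below are invented for the illustration; the real client does far more sophisticated signal analysis and reports back to SETI’s servers over the Internet.

import random

def make_work_units(signal, chunk_size):
    """Split one long stream of 'telescope readings' into independent chunks."""
    return [signal[i:i + chunk_size] for i in range(0, len(signal), chunk_size)]

def crunch(unit, threshold=4.0):
    """The job each volunteer PC does: flag readings that stand out from the
    background noise (a toy stand-in for SETI's real signal analysis)."""
    return [x for x in unit if abs(x) > threshold]

# A fake signal: mostly Gaussian noise, with a handful of planted spikes.
signal = [random.gauss(0.0, 1.0) for _ in range(100_000)]
for i in random.sample(range(len(signal)), 5):
    signal[i] = 8.0

# Each work unit could be crunched on a different machine; here we just loop.
findings = []
for unit in make_work_units(signal, chunk_size=1_000):
    findings.extend(crunch(unit))  # 'reporting back' is simply collecting results

print(len(findings), "candidate anomalies found across all work units")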
As Dell’s Jeff Morris describes the future of online computing: “You could just have a little box on your desk that your keyboard, mouse and an Internet connection plug into, and your computer’s sitting somewhere over the other side of the world.”
Even if your data, processor and programs aren’t flung to the four winds one day, it’s unlikely you’ll have a host of computers all doing different things — one to play your music and movies, another for the kids to do their homework, and so on. If you’ve bought a Windows PC in the last six months, chances are you have Windows Media Center pre-installed. And computers that look like DVD players or stereo decks are invading lounge rooms everywhere.
So the infrastructure for every PC in the home to merge is here. Coupled with virtualisation — where the same computer runs several completely independent operating systems — the concept of a ‘wired home’ is only a short step away. If you buy a house in 2015, it’ll undoubtedly come with a small, airtight cool room in the roof cavity which houses your server, wireless interfaces in every room accessing everything from the web to your digital photo album, your household budget, movies on demand, the air conditioning and the home security.
The More Things Change…
There’s just one chink in the armour of the perfectly realised computing universe where everything digital is distributed and served across a super-network, and that’s us.
“People growing up with computers nowadays are used to a paperless work environment,” says Dell’s Jeff Morris. “For Gen X and the boomers, having something physically in the hand is something they’ve always needed culturally.”
So it appears we can take a leaf out of the book of the defenders of print who laugh off the advent of e-books at every turn. If you pay money for something — especially something like a computer that can cost thousands of dollars — not having something in your hands to show for it just doesn’t feel right.
Genevieve Bell has what must be one of the coolest jobs in the computing industry. She’s an anthropologist for Intel, literally studying human culture to help steer the company in the most profitable direction based on consumer behaviour. She makes a point that reminds us why we still have the automobile (a 100-year-old technology), TV (60 years old) and farming (now roughly 10,000 years old) — they work and we like them.
“Those who work in technology are often seduced by the rate of technology change,” Bell says. “We forget that people, cultures and societies don’t change quite that quickly. In 100 years we’ll still care about things we do now like communication, who we are and how we’re perceived. We’ll still have financial transactions, we’ll still want to preserve and celebrate our histories.”
And as long as using a computer is, as Bell puts it, a ‘compelling experience’ — we’ll still want to do it with a machine on our desk, even if it’s just a box connected to the Internet serving our data everywhere from Bangalore to Brisbane and Seattle to Seoul.
At this point it’s also worth mentioning the commercial imperative that drives almost everything in our world. Strange though it would have seemed to our egalitarian hunter-gatherer ancestors, very little gets done in human endeavour in the capitalist age unless there’s money in it.
When there is and a market is created, human achievement through technology moves forward a few steps. That’s why DVD players have plunged from $1,200 to as low as $50 in about five years and there’s no line around the block for Richard Branson’s 2009 commercial spaceflight ($250,000 a ticket).
The biggest step forward in PCs recently has been the rise of the notebook, and that’s affecting the way people design computers and therefore the way we use them. “People are buying more notebooks than desktops,” Dell’s Jeff Morris says. “That’s what’s driven the change in form factors. The demand in computing around the world has been because of the commodification of technology and that’s why we’ve seen such huge uptake.”
The Future of Sages Past
So how do we predict what PCs of the future will look like, how they’ll behave, how they’ll work? Charles Duell, US Commissioner of Patents, reputedly said that everything that could be invented already had been. That was in 1899, and considering we’ve seen more technical innovation in the last hundred years than in all of history up until then, apparently Americans weren’t any better at choosing their public officials in those days than they are now.
You’d think the Chairman of IBM would be more enlightened. When the first, caravan-sized digital computers were in development, IBM’s flagship product was the punch-card tabulating machine. Although there’s dispute over whether the quote can really be attributed to him, then-Chairman Thomas J Watson is supposed to have said there was a world market for ‘maybe five computers’.
If you want to make a prediction, history may judge you a philistine or an idiot. But if the PC of 100 years hence is still a machine that processes information, it might be more recognisable than we think. As futurist Marian Salzman puts it: “At this stage, it’s hard to envision a life where place is an anchor. Computing will be anywhere and everywhere.”
Now that notebook sales outstrip those of desktop PCs, we’ll be gradually unshackled from the wall-mounted power point. But with current lithium battery technology giving little more than 2-3 hours of laptop power, we’ll never be truly mobile until we tap an alternative. Living cells are bursting with energy; how about dangling two electrodes into a glass of water for unlimited laptop power, or plugging them into two ports on your forearm?
Will touchy and expensive metal hard discs be replaced by throwaway DVD technology as the etching lasers get smaller (allowing more data per surface)? More intriguing — and the dream of science fiction writers throughout the 20th century — when will we interface with computers totally by voice, by walking through a VR simulation or by dealing with an artificially intelligent agent?
Beige No More
In the mid 1990s, computers were easily recognisable even to people who didn’t use them — they were square, plastic and beige.
One struggling vendor brought back its high-profile CEO, put a lower-case ‘i’ in front of a product and changed our relationship with technology forever. Although other vendors such as Nokia have jumped on the design bandwagon and made themselves obscenely rich in doing so, Apple is generally credited with making technology human.
The 1998 iMac and 2001 iPod kicked off twin revolutions by transforming technology into objets d’art — devices we wanted to hold and play with, machinery that expressed who we were, tools that we loved. Just think of the Vertu mobile phone, a $10,000 device that by all accounts has the same features as those you can get on a $30-a-month plan.
Partly because technology vendors are trying to cut themselves a slice of the dazzling iPod pie, we’ve seen an explosion in computer design over the last 10 years, and that as much as any other factor is going to influence what computers will look like in the next 10, 50 and 100 years.
And despite a history of laughable predictions and the cultural landscape changing at an ever-increasing pace, PC makers are toiling away in secret laboratories designing the computer you’ll buy in years to come…
Computers in Sci-Fi
The most dramatic application of computer technology in movies and literature has always been the Promethean myth — our creation turning against us. But whether it’s for good or evil, computers on the silver screen have always had one thing in common: personality.
Name: HAL
Film: 2001: A Space Odyssey (1968)
Application: Programmed to lip-read, HAL realises the crew of the Discovery intend to switch him off after an incorrect malfunction report. He decides to kill them, but when astronaut David Bowman outwits and disconnects him, he sings Daisy Bell as his logic circuits are shut down one by one.
Name: Skynet
Film: The Terminator (1984)
Application: Designed as an Internet-style military network, Skynet became self-aware shortly after the US government flicked the switch on August 29, 1997, and immediately launched the entire US nuclear arsenal, intending to wipe humanity out.
Name: Holly
Show: Red Dwarf (1988)
Application: The AI system aboard the Red Dwarf, this blonde, deadpan cockney routinely reports on the ship’s functions and the environment while keeping a cool head.
Name: The Matrix
Film: The Matrix (1999, 2003)
Application: The virtual reality simulation used to amuse and stimulate the human race while the electronic life forms of the far future harvest the biological electricity from our bodies to power their society.
Name: R2-D2
Film: Star Wars (1977, 1980, 1983, 1999, 2002, 2005)
Application: Ostensibly a mechanic, frequently the comic relief and a classic Hardy to C-3PO’s camp Laurel. First in a long line of lovable robots.
Name: ED-209
Film: RoboCop (1987)
Application: The Enforcement Droid series 209 is the ultimate cop, with ‘superior firepower and the reflexes to use it’ (according to its creator). It malfunctions and blows a senior executive away during the launch shindig, but that’s just an operational glitch – they already have a military contract.
The Keyboard 2.0
Many of us remember when you could use a PC without even needing a mouse. Now, you can’t survive without one. But thanks to work being done at Sydney’s Macquarie University, the humble point-and-click action that’s second nature to most of us might soon be an antique.
Dr Manolya Kavakli, Head of the Virtual Reality Systems Lab at Macquarie’s Department of Computing, explains how VR will allow us all — including the disabled — to use tomorrow’s computers with less effort.
Speech and Emotion Recognition
Using pidgin languages, the team has developed a way to talk to characters in a computer game. Threshold markers detect the emotional level in a human voice, so the computer not only hears your command, it perceives your emotional state.
Measuring Emotional Engagement using Biometrics
A VR training simulation on a six-metre-wide wraparound screen (like IMAX), where a user’s engagement with the task at hand can be measured with biometrics such as ECG and EEG readings.
Robotic Wheelchair
An electronic wheelchair quadriplegics can operate using redundant body signals generated in the muscles, transmitted through a sensor jacket.
Gesture Recognition
A system used to communicate with computer-generated characters in a training simulation, allowing them to interpret your gestures and act accordingly.
Face Reconstruction via Virtual Sculpting
Aids forensic artists by letting them carve the face of a suspect. The interface presents a generic 3D face model that the artist can manipulate according to an eyewitness’s comments, ready for matching against a database.
AI: The Ultimate Computer
It can be said that human brains are mere data-processing machines. But as we all know from first-hand experience, there’s a thinking, feeling ‘someone’ inside the machine.
AI is the pursuit of engineering such a ‘someone’ inside a digital computer. It’s the mother lode of computing: the day when computers can think for themselves.
It’s also a lot harder than the movies and TV would have us believe. Without really knowing what consciousness is made of, we don’t know where to start in replicating it. If we replicate a human brain out of transistors and tubes, will a ‘self’ spontaneously arise in it?
We’re also limited in that we can never know how anyone else in our life feels at every minute, only ourselves. So if an AI machine is created, whose word do we take that it’s really a sentient being?
That’s where we need to avail ourselves of the Turing Test. Alan Turing, considered the father of computer science, kicked off the AI debate in 1950.
He proposed that simple, everyday language requires such a convoluted set of rigid syntactic rules, together with on-the-fly abstractions and adjustments, that you could never completely program it in advance. If a machine can successfully hold a conversation, making it up as it goes along just as we humans must when we communicate, you can be reasonably sure it’s sentient.
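To show the shape of the test (and nothing more), here is a toy sketch in Python of Turing’s imitation game: a judge converses with a hidden respondent and must decide whether it’s the person or the program. Both respondents and the judge are scripted so the example runs on its own, and every name is invented for the illustration; it gestures at the structure of the test, not at anything that could pass it.

import random

def machine_reply(question):
    # A stand-in 'machine': canned answers only. A genuine contender would
    # have to improvise plausibly for any question thrown at it.
    canned = {
        "how are you?": "Fine, thanks. And you?",
        "what is 2 + 2?": "Four.",
    }
    return canned.get(question.lower(), "Could you rephrase that?")

def human_reply(question):
    # A stand-in human, scripted here only so the sketch is self-contained.
    return "Hmm, " + question.rstrip("?.").lower() + "? Give me a moment."

def imitation_game(questions):
    # The judge converses with a hidden respondent, then guesses which it is.
    respondent = random.choice([machine_reply, human_reply])
    transcript = [(q, respondent(q)) for q in questions]
    for q, a in transcript:
        print("Judge:", q)
        print("Hidden respondent:", a)
    # A real judge weighs the whole conversation; this naive one just guesses
    # 'machine' whenever an answer dodges the question.
    guess = "machine" if any("rephrase" in a for _, a in transcript) else "human"
    actual = "machine" if respondent is machine_reply else "human"
    print("Judge's guess:", guess, "(actually the", actual + ")")

imitation_game(["How are you?", "What is 2 + 2?", "Describe your childhood."])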