Already by 2000, we’d lost our innocence. No more tinkering with toys and turning them into inventions: we’ve been pushing aggressively ahead, looking at the basic laws of physics and asking, “Why not?”
“Miniaturisation” is not restricted to gadgets: we’re also getting to the nanoscale. A great example is the transition from CD to DVD to blue-laser discs: we packed more onto a DVD than we could onto a CD, and then wanted still more. So find new materials! Invent better microscopes! Dig out more of nature’s secrets! Do whatever, and figure out just how densely those bits can be packed! It was in this spirit of scientific aggressiveness that HD-DVD and Blu-ray were born, as with almost everything else that’s happening in tech today: processors, RAM, and all the rest.
Early this decade was born the Pentium 4. AMD came into its own this decade, becoming a true competitor after long years of lagging behind. The gigahertz races had begun. (An aside: AMD reached the 1 GHz finish line first with its 1 GHz Athlon, just days before Intel released the 1 GHz Pentium III.) Then someone got the not-so-original idea that if one processor could do this much work, why not employ two or more, and thus came dual-core and multi-core: the future of desktop computing.
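That multi-core idea, one big job carved up across several processors, can be sketched in a few lines of Python. (A toy illustration of the principle only; the prime-counting workload, chunk sizes, and function names here are our own inventions, nothing tied to any particular chip.)

```python
# Toy illustration of the multi-core idea: split one CPU-bound job
# across worker processes, one per core.
from concurrent.futures import ProcessPoolExecutor

def count_primes(rng):
    """Count primes in a half-open (start, stop) range -- deliberately CPU-heavy."""
    start, stop = rng
    total = 0
    for n in range(max(start, 2), stop):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            total += 1
    return total

if __name__ == "__main__":
    # One chunk per core; each worker grinds through its own slice.
    chunks = [(0, 25_000), (25_000, 50_000), (50_000, 75_000), (75_000, 100_000)]
    with ProcessPoolExecutor() as pool:  # defaults to one worker per CPU
        total = sum(pool.map(count_primes, chunks))
    print(total)  # 9592 primes below 100,000
```

On a single-core machine the four chunks simply queue up; on a dual- or quad-core box they genuinely run side by side, which is the whole point of the exercise.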
Meanwhile, the graphics processor was becoming less specialised in its capabilities and more flexible in its nature. Thus was born the idea of the Graphics Processing Unit, or GPU. The first GPU was the NVIDIA GeForce 256, announced in August 1999.
If the GPU enhanced the visual side of the gaming experience, sound had to keep pace, and starting off from the simple SoundBlaster cards of yore, we’re moving on to immersive audio. The x.x speaker thing, where the figure after the point denotes the subwoofer, is for the most part a phenomenon of this decade: 4.1, 5.1, now 7.1 and even 9.1. (Maybe there’s an 11.1 set out there, only we don’t know!) Similarly with displays: with so many visual bytes ready to be delivered, we needed better things to display them on. So CRTs became better, LCD technology went mainstream, and OLED and plasma displays came in. Sure, LCDs are still weak, response-time-wise, when it comes to games, but do you really believe they won’t be on par by, say, 2012?
Games have thus far been driving much of the PC consumer industry. We now have physics processors for games. The PS3 uses the Cell processor, dubbed a supercomputer in its own right.
While the Internet in the ’90s was all about expansion, it’s now about broadband: people are clamouring for more. More speed, more content, more services. They want all their data to come in through one pipe, too. That’s why IPTV and VoIP are taking off: one line, many services (WWW, TV, phone, movies). And all of it high-definition, high-fidelity.
(Pause for breath.) Think about how much broadband has proliferated: in the developed world it’s so nearly universal that it almost makes no sense to call it by a special name. From 2 to 5 to 10 Mbps, to 50 and even 100 Mbps in some countries. (Off the top of our heads, there might even be 1 Gbps in some places!)
Not only do we want more, we want it with us, wherever we go. Yes, the mobile revolution: it all happened so fast, the timeline is a series of months rather than years. Sometime between 1995 and 2000, it happened in India; by 2003 or so, everybody had a cell phone. In the developed world, kids as young as 8 and 10 have cell phones. (Even younger kids are getting RFID-tagged in countries such as Japan.) And once you’ve got a gadget, why not pack more into it? Naturally, we don’t need to tell you about when cell phones became phones cum PDAs cum MP3 players and radio tuners and what not.
It doesn’t seem to make sense to call it a “phone” anymore; it’s just “the Gadget.” These Gadgets have processing power, and they’re growing hard disks and all, so why not store and/or watch movies and TV on them? That’s where 3G and 4G come in: they enable everything on the Gadget, just as the Internet connection will (and in some places, already does) enable everything on the PC. The Gadget is surpassing the laptop.
Getting back to the Internet, what we’ve been seeing is expanded use. Blogging comes to mind first: as the world increasingly deals with information in byte-sized chunks, people turn to byte-sized write-ups called blog posts instead of long, boring Web pages to navigate through. Then there are MMORPGs: that’s real virtual reality. People in countries with the bandwidth live, sleep, work, play, and even mate online. They can’t eat online yet, although at least one MMORPG lets you order pizza through its interface. When will that promised food-pill arrive? No, we’re not kidding: nanotech might well produce that staple of science fiction.
While on the subject of the Internet, we must talk about the real problem of our age: the clichéd “information overload.” There are two problems here: storing it all, and retrieving it all. As for retrieval, trust all the search companies out there, and, going a level beyond, the data mining companies. Yes, data mining and text mining will become more important. We want only what we need: data mining will provide precisely that, patterns gleaned from heaps of information. Text mining will let us read summaries of articles, the main points of a news item, and so on. Like we said, we like everything in byte-sized chunks these days.
Coming to storage: just make the hard disks larger! Did we say “hard disks”? No, no, it won’t be hard disks. Flash memory is already starting to replace them, initially for faster bootups; then there will be holographic storage (due out commercially this year, by the way), and ultimately carbon nanotube storage. A pinhead of nanotubes will hold terabytes.
Everything else is moving on, but when it comes to interfacing with our Gadgets and our PCs, we’re stuck with mouse, keyboard, and stylus. Will AI please come to the rescue? How soon will we be able to just tell our display device to play a certain movie?
And when you talk about interaction, think of pervasive computing: already, researchers are talking about it. Check out www.pervasive.dk: “Pervasive computing is the next generation computing environments with information and communication technology everywhere, for everyone, at all times. Information and communication technology will be an integrated part of our environments: from toys, milk cartons and desktops to cars, factories and whole city areas, with integrated processors, sensors, and actuators connected via high-speed networks, and combined with new visualisation devices, ranging from projections directly into the eye to large panorama displays.”
“Projections directly into the eye”? Now that’s an idea, come to think of it!
The decade so far has been, and the years to come will be, not so much about new technologies such as perpendicular recording for hard disks. It’s more about people generally having come to believe, perhaps rightly, that technology can give us whatever we want.
On-the-fly. Instant. Anyplace. Anytime. More. Even more. We want. It’s moving fast. It’s getting faster. It’s getting smaller. Into our palms. Under our skin. Everywhere around us. Possibly inside us. God is now well and truly the Great Geek in the Sky-and what we want, as of this decade and in the decades to come, is for all our material desires to be fulfillable anytime, anyplace, right next to us, on demand.
The Gadget
It’s the definitive development of this decade thus far: the portable Gadget. It was a cell phone at first, and at some point, when both cell phones and PDAs were being used, it became obvious that they could be combined. Some called the new gadget the smartphone. Then the big idea: if everyone’s got this in their pockets, why not make it more capable? They added a digital music player. They added FM radio. They added faster processors and more memory. They Web-enabled it. They added a camera, so you could click snaps and instantly send them off to someone via e-mail. (“Look! Here’s where I am!”) And the cameras are piling on megapixels, the network is gaining in speed, the processors are getting better, and all the rest of it.
The Gadget, as we’re calling it, represents convergence. But it’s not just about convergence, or even about convenience. It goes beyond: we want to break free of the shackles of time and space, as it were.