A little more than 20 years ago, my family won a giant TV at some kind of local charity raffle. It was the newest model and it was enormous. Between the box and the screen, it was more than six feet tall, seemingly weighed 900 pounds, and took an awkward gaggle of teenagers to maneuver it into the basement. It was the best TV anyone had ever seen, and it served as a monument to technology.
And then, little by little, it became old-fashioned, outdated, obsolete. We still watched football games there every Sunday until my parents moved, but that was mostly due to tradition and because no one wanted to lug that thing out of there.
I bring this up because I think it is tempting to believe that we have reached a high-water mark with our mobile tech – that at the very least, the outlines of permanence are there. After all, our smartphones and tablets can do things we barely dreamed possible even ten years ago. And while we know that wearables and the “Internet of Things” are going to be woven into our daily lives and make everything more convenient, that seems like a change of degree, not of kind.
But that’s probably not the case. It’s always tempting to think that you live in the best of all possible worlds, and it’s true that the times we live in are amazing (we have close-ups of Pluto!), but we are also just at the leading edge of technology. Ten or 20 years from now, our mobile tech will be unrecognizable, and the change will start with artificial intelligence.
The Amazon Echo and How Our Mobile Tech Learns
This week, Amazon made its Echo device broadly available (it had previously been limited to some Prime members). The Amazon Echo is, essentially, a stand-alone, free-floating Siri (or Galaxy) that you can put in any room. You address it as “Alexa” (the name of the software, which Amazon hopes to use elsewhere), and you can ask it questions or give it simple commands. You can ask it what the weather is like, what you have scheduled for today, who won the 2005 World Series, and a variety of other questions, ranging from useful to trivial.
The Echo is also handy for listening to music: you can request a song, an album, or an audiobook (assuming you have already purchased it). One of the cooler features, according to a Walt Mossberg review in Re/Code, is an on-demand newscast, a “Flash Briefing” from sources like NPR, the BBC, or ESPN. It’s an extension of the instant, tailored news that Facebook is bringing to its site – a new way to personalize information.
To me, that stuff is great, but it is really just an extension of what we already have. However, there are a few other applications that really hint at the future. With voice commands, Echo can control light switches or light bulbs in the house, provided they are part of the Belkin WeMo system, which seems to be the only compatible system thus far.
Now we’re talking. This is beginning to enter the realm of sci-fi, and it is due to the connectivity of the Internet of Things. Walking into a room and saying, “Alexa, turn the lights on, heat up the coffee, and get me the news from NPR” is a futurist dream. The question is, what comes after that?
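To make that futurist dream a little more concrete, here is a minimal sketch of how a voice assistant might route a parsed command to connected devices. Every name here is hypothetical – this is not the real Alexa or WeMo API, just an illustration of the intent-to-action idea:

```python
# Hypothetical sketch: dispatching a parsed voice intent to smart-home
# actions. Not a real assistant API; names are invented for illustration.

def handle_command(intent, devices):
    """Map a parsed intent to a list of device actions."""
    actions = []
    if intent == "lights_on":
        # Turn on every registered light in the home.
        for light in devices.get("lights", []):
            actions.append(f"turn on {light}")
    elif intent == "news_briefing":
        actions.append("play flash briefing")
    return actions

home = {"lights": ["kitchen", "hallway"]}
print(handle_command("lights_on", home))
# → ['turn on kitchen', 'turn on hallway']
```

The point is that today’s assistants are essentially lookup tables like this one: a fixed command triggers a fixed response. What comes next is the interesting part.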
Artificial Intelligence and the Future of Mobile Tech
That’s where Artificial Intelligence (AI) comes in. Computer scientists have been working on this – on so-called “deep learning” – for years. And while deep learning fuels the classic sci-fi nightmare, it also has some very useful, practical, just-plain-cool applications.
Right now, your Amazon Echo (or any similar device that comes out) can respond to your questions or simple commands. But deep learning promises something more sophisticated: it is meant to mimic our own thought patterns and the neural connections that distinguish thought from mere instinct. It’s the difference between recognizing that someone is crying and understanding that they’re sad.
This has huge implications for science, medicine, and work (not to mention deep moral questions), but this blog deals with mobile tech, at least in theory. What does it mean for our tech? It means anticipation – walking into a room and having the lights turn on not because you triggered a sensor, but because the tech knows you want them on. Or having them stay off because it is late and you only need the one light near the fridge. It means your phone deciding what kind of music you are in the mood for based on other cues. It means learning.
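As a toy illustration of what “learning” your habits might look like, here is a deliberately simple sketch. It just counts what you did at each hour in the past – nothing like a real deep-learning system, and every name in it is made up – but it shows the shift from reacting to commands toward anticipating them:

```python
from collections import Counter

# Hypothetical sketch: predicting whether you want the lights on at a
# given hour from past behavior, instead of waiting for a command.
# Real systems would use far richer models; this only counts history.

def learn_habits(history):
    """history: list of (hour, lights_were_on) observations."""
    times_on = Counter(hour for hour, lit in history if lit)
    totals = Counter(hour for hour, _ in history)
    # Fraction of past visits at each hour when the lights were on.
    return {hour: times_on[hour] / totals[hour] for hour in totals}

def should_light(habits, hour, threshold=0.5):
    """Turn on the lights if you usually wanted them at this hour."""
    return habits.get(hour, 0.0) >= threshold

history = [(19, True), (19, True), (19, False), (23, False), (23, False)]
habits = learn_habits(history)
print(should_light(habits, 19))  # → True: lights on most evenings at 7
print(should_light(habits, 23))  # → False: usually dark at 11
```

Swap the frequency count for a neural network trained on many more signals (location, calendar, weather) and you have the rough shape of the anticipatory tech the paragraph above describes.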
This kind of learning is more than just remembering. It’s about imagining what you will need based on patterns, and based on who you are. Think of someone who demands to know why you’re so happy, versus someone who just enjoys it. The amazing present is walking into a room and being able to command a small device to start your day. The onrushing future is not having to say anything at all.