October 26, 2012
Yesterday I needed to see something in a darkened corner of my office. I reached into my pocket, pulled out my iPhone 5, held the button that summons Siri, and said, “Flashlight.”
Suddenly the corner lit up. Holy frickin’ Apple genius! Siri found my flashlight app and turned it on.
My phone and I experienced… a… moment. I swear it felt like love. Literally. There was just something so accommodating and surprising about the flashlight working exactly as I hoped it would. I use Siri often, but usually just for setting an alarm or sending a quick text message. Those situations are cool and often convenient, but they do not feel like love. I expect Siri to do voice-to-text, and I expect it to set my alarm clock on command. What I didn’t expect is that my command “Let there be light” would be obeyed. It made me feel godlike. In all seriousness, it was a rush.
I realize that this sounds like an Apple commercial, but I remind you of my history of despising most Apple products. Every Mac I’ve owned has been a lemon. My first iPhone was a total disaster. The iTunes interface is a mess that looks like a Microsoft product from the nineties. My original-version iPad pisses me off every day. Prior to the flashlight moment I had resisted Steve Jobs’ Rasputin-like powers of seduction that now extend beyond the grave.
I own Apple stock because it seems underpriced, but I’ve never been a fan of the products despite buying them far too often. The iPhone 5 changed that. It is an extraordinary feat of engineering. I am not kidding when I say I feel emotion for it. The designers and engineers at Apple have crossed some sort of psychological barrier that will someday be recognized as one of the great transitions in civilization. They literally engineered love. And by that I mean they created a device that stimulates my body chemistry in a way that feels somewhat similar to love. And I think that accomplishment will someday be seen as a bigger deal than we recognize today.
Suppose someday an industry standard is created to promote this sort of machine-generated love. The standard would simply allow anything in your environment – from your automobile to the rooms in your home – to respond to you individually, immediately, and sometimes surprisingly, the way an iPhone 5 does. And perhaps the environment could interact with the smartphone in your pocket to make some of those actions a reality.
Suppose you had an industry standard for light bulbs and light switches that allowed any room to sense who is in it and convey that information to the electronics and other appliances in the home. Wifi-enabled light bulbs already exist, so this isn’t a stretch. Let’s say your light switch can detect motion and heat, so it knows when a room is occupied. It can also do facial recognition via its Internet connection. It knows who belongs in the house, including friends. It can pick up Bluetooth signals from phones that come near. Your phone also uses its GPS to tell the room that you are nearby. The cloud holds my list of personal preferences, so as I move from room to room, my environment conforms to me.
The lighting adjusts to my preferences when I enter and shuts off when I leave. If more than one person is in the room, the system intelligently negotiates priorities. For example, if one person is in bed, the room light stays off when a new person enters.
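To make the negotiation idea concrete, here is a minimal sketch of how a room controller might arbitrate between occupants. Everything in it is invented for illustration: the `Occupant` class, the "sleeper wins" rule, and the lowest-preference tiebreaker are my assumptions about how such a standard could behave, not an existing spec.

```python
# Hypothetical sketch of multi-occupant light negotiation.
# All names and rules here are illustrative, not part of any real standard.

from dataclasses import dataclass

@dataclass
class Occupant:
    name: str
    preferred_brightness: int  # 0-100, pulled from the person's cloud profile
    asleep: bool = False       # e.g., the room sensed this person in bed

def negotiate_brightness(occupants):
    """Pick one brightness for the room from competing preferences.

    Rule from the scenario above: if anyone is in bed, the light stays
    off when someone else enters. Otherwise, use the lowest preference
    so nobody gets more light than they asked for.
    """
    if not occupants:
        return 0  # empty room: lights off
    if any(o.asleep for o in occupants):
        return 0  # a sleeper outranks everyone: keep the room dark
    return min(o.preferred_brightness for o in occupants)

# Example: one person is asleep when a second person walks in.
room = [Occupant("Shelly", 70, asleep=True), Occupant("Scott", 90)]
print(negotiate_brightness(room))  # -> 0: the light stays off
```

The interesting design question isn’t the code; it’s who writes the rules. A real standard would need a shared vocabulary for states like “asleep” so that a light switch from one vendor can defer to a bed sensor from another.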
My heating and cooling adjust according to who is in the house and what time it is. Even the curtains are automated.
According to my profile in the cloud, my television turns on if I am near it in my house between 9 pm and 11 pm. The screen goes to the DVR recording page and shows only the shows I liked enough to record. As I walk from room to room, the show follows me to each TV and pauses while I’m in hallways or the bathroom.
I walk to my computer and it knows who I am before I even touch it. No password needed. The screen pops to attention as I approach it.
Someone rings the doorbell and both my phone and TV present a picture of who is at the door. No cameras needed. The doorbell sensor identifies the visitor, either by facial recognition or by his phone’s signal, and his profile picture is sent to the TV and my phone. His phone and mine are automatically connected through the cloud. I just say, “Come on in, Bob. The door is unlocked.” It’s not actually unlocked until I say it. The home listens to me, understands the context, and unlocks the door electronically.
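The doorbell scenario is really a small protocol: identify the visitor, push a profile picture to my screens, and keep the door locked until I explicitly say otherwise. Here is a toy sketch of that flow, assuming a phone-signal lookup, a cloud profile store, and a voice-command check; every identifier and function name is hypothetical.

```python
# Hypothetical sketch of the doorbell flow described above.
# The phone registry, profile store, and unlock rule are all invented
# for illustration; no such standard or API exists.

KNOWN_PHONES = {"bt:bob-phone": "Bob"}        # Bluetooth IDs on file
PROFILES = {"Bob": {"picture": "bob.jpg"}}    # cloud profile store

def identify_visitor(bluetooth_id=None, face_match=None):
    """Identify whoever rang the bell: phone signal first, face second."""
    if bluetooth_id in KNOWN_PHONES:
        return KNOWN_PHONES[bluetooth_id]
    return face_match  # may be None if the visitor is unknown

def handle_doorbell(bluetooth_id=None, face_match=None):
    """Push the visitor's profile picture to the TV and the owner's phone."""
    visitor = identify_visitor(bluetooth_id, face_match)
    if visitor in PROFILES:
        return f"Showing {PROFILES[visitor]['picture']} on TV and phone"
    return "Visitor unknown; showing doorbell audio only"

def on_owner_command(command, visitor):
    """The door stays locked until the owner explicitly says it is unlocked."""
    if visitor and "unlocked" in command.lower():
        return f"unlock door for {visitor}"
    return "door stays locked"

print(handle_doorbell(bluetooth_id="bt:bob-phone"))
print(on_owner_command("Come on in, Bob. The door is unlocked.", "Bob"))
```

Note that the unlock step deliberately keys off my spoken sentence rather than the visitor’s identity alone; recognizing Bob gets his picture on my TV, but only my words open the door.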
When Skype-like functions are on every television, and there’s a flat screen on every wall, all you need is your Bluetooth earpiece and the walls will seem to respond to you. Say, “Call Shelly” and the nearest TV fires up a Skype call. Whatever room Shelly is in, anywhere in the world, fires up the nearest TV screen and connects my call. If she’s walking down a public street, the street cams show on my TV, switching from one to another as she walks and talks.
My dog’s collar also has a location sensor and a speaker. I say aloud, “Where is the dog?” and the house says, “The dog is in the kitchen.”
Most of what I described is unsurprising to any sci-fi fan. It’s the sort of thing we’ve seen predicted for decades. All I’m adding to the conversation is two notions:
1. Done right, the user will feel something closer to love than simple convenience. Apple has shown that to be possible.
2. To get to that awesome future, the world probably needs some sort of an industry standard for sensing human locations, identifying people, accessing each person’s profile in the cloud, and negotiating preferences when there is more than one person in the room. And you probably need some standards for user interfaces that are common across all devices.
This is one case in which I’d like to see an activist government organizing industry players to create such a standard. Imagine the economic growth that could happen as the world transitions from our current heartless environment to one in which every room and every device shows you love the way an iPhone 5 does.
I also think this future world of machine loving will be a partial cure for loneliness. This will seem like a stretch, but hear me out. When I lost my ability to speak for over three years, I felt lonely even in a room full of people. It turns out that you can only cure loneliness by feeling heard, not by hearing others. My iPhone 5 hears me and does its best to understand and respond. When my entire environment starts acting the same way, I think I’ll feel less lonely even when no other human is in the room. I’ll feel heard, even if only by a set of connected machines.
I think the future for senior citizens will be bleak until this sort of technology arrives. Every elderly person I know is severely bored and lonely. It is human nature for young people to prefer spending time with other young people and to limit their time with the old. I think it will be a huge boon to the elderly to live in a machine-love world in which their environment responds to them, and they can connect to any living person with just a verbal command.
Machine love: It’s the future.