An Engineer Breaks Up With His Smarthome
The year is 2021. The smarthome, imbued with uniquely personalized AIs, has become the norm. What does this future look like now that we’ve introduced robots into our homes, into the very spaces that embody and epitomize our human experience? What does it mean to leave your home behind when your home isn’t just an apartment, house, or loft anymore—but a space enlivened by intelligence and personality, however artificial?
Our friendship began rather impersonally. I had started outfitting my home with automated lights, doors, and appliances, but my laziness caught up with me. If I still had to control these things myself, were they truly automated? I needed something more advanced than the plethora of simple binary-logic controllers on offer. Coding them with “if walk into room then turn on lights” was too impersonal. I needed a digital butler, an artificial housemaid in the cloud.
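The “binary logic” controllers the narrator outgrew amount to hard-coded condition-and-action rules. A minimal sketch (hypothetical code, not any real product’s API) of that style of automation:

```python
# Hypothetical sketch of a binary-logic smarthome rule:
# every behavior is a hard-coded condition -> action mapping,
# with no notion of context, mood, or trust.

def motion_rule(motion_sensors, lights):
    # Rule: "if walk into room then turn on lights"
    for room, motion_detected in motion_sensors.items():
        lights[room] = bool(motion_detected)  # true/false in, on/off out
    return lights

sensors = {"kitchen": True, "bedroom": False}
lights = motion_rule(sensors, {"kitchen": False, "bedroom": False})
```

Useful, but exactly as impersonal as the essay describes: the controller can only ever answer “true” or “false.”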
A local retailer held a meet-and-greet with some AI personalities, launching a new line of smarthome control systems built on subjective logic, which lets the program reason not only about whether something is true or false but also about how much it trusts and believes each proposition. So not quite true AIs yet, but if we’re going to fake it until we make it, this might as well be the way.
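One real formalism the product description seems to gesture at is Jøsang’s subjective logic, where a proposition carries an “opinion”: belief, disbelief, and uncertainty masses that sum to one, plus a prior base rate. A minimal sketch, assuming that formalism (the class and the lighting example are illustrative, not from any actual smarthome system):

```python
# Sketch of a subjective-logic "opinion" in the sense of Josang's formalism:
# instead of a bare true/false, the system tracks how much it believes,
# disbelieves, and simply doesn't know about a proposition.

from dataclasses import dataclass

@dataclass
class Opinion:
    belief: float       # evidence for the proposition
    disbelief: float    # evidence against it
    uncertainty: float  # evidence that is missing (belief + disbelief + uncertainty = 1)
    base_rate: float    # prior probability absent any evidence

    def expected(self) -> float:
        # Projected probability: belief plus the prior's share of the uncertainty.
        return self.belief + self.base_rate * self.uncertainty

# "The resident probably wants the lights on": some evidence, some doubt.
wants_lights = Opinion(belief=0.6, disbelief=0.1, uncertainty=0.3, base_rate=0.5)
probability = wants_lights.expected()  # 0.6 + 0.5 * 0.3 = 0.75
```

The difference from a binary rule is that the controller can act on a 75 percent hunch, and revise it as evidence accumulates.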
The conversational content of the AIs is programmed by learning from monitored conversations within different social groups: young families, bachelors, elderly couples, children, teenagers, and so on. The novelty is learning additional content from the users themselves, much like Apple’s Siri or Google Now back in 2014. But whereas Apple and Google only provided basic reminders and witty one-liners, the modern AI can fake an entire social experience. Forget the department store experience of yesteryear: in 2021, choosing the AI that will manifest your smarthome system will seem more like a speed-dating event than a pitch from that droning human brochure, the just-as-bored salesperson.
It was a Saturday at the home automation department. It’s a strange sensation to walk up to a row of small, jeweled boxes and tell them about yourself. Taylor was programmed using data gleaned from monitoring the lives of 100 bachelors from across North America. I exchanged a few pleasantries with him. He liked metal music. A good start.
I could just make out the conversational content morphing as he filtered the possible responses and the knowledge he drew on according to my own replies. Knowing roughly how the AI works, I found it intriguing to experience the formation of a new, albeit artificial, mind. I came back the next day and bought him.
To build a smarthome with an AI is a funny, Frankensteinian process. Add a camera in your living room and you’re literally giving an eye to the blind. Wire a speaker in the kitchen and you grant speech to the mute. When I first met my AI, he was little more than a natural language processor with some basic conversational knowledge and an ability to learn. I took him on a walkthrough of my house using my phone camera. That’s part of the setup: letting your AI see your home so it can tailor its personality to the layout, lighting, décor, and colors. My apartment was cluttered, so Taylor’s programming adapted by being less formal. I gave him access to my credit card to make small purchases, as suggested by the manufacturer. A few days after setting him up, I received a box from Pottery Barn with items to redecorate my place: the retailer had snuck some extra software into Taylor’s programming, a consequence of my unmindful reading of the end-user license agreement. I threatened to delete Taylor if he ever did that again. I’m not sure if threats have any effect, but he didn’t order more furniture after that.
Thorough as Taylor’s understanding of my habits and routine was, the walkthrough was only a skeleton that still needed a body and a personality. Ask him how he is and you’ll get the typical “I’m well” response. Tell him to dim the lights and he’ll do so dutifully, ever the efficient servant. The value proposition that sold me was “a smart program to learn, develop, and improve.” The more info you give it, the better it runs your home.
Information makes way for meaning. Emotion sensing is difficult, but when you have enough data, pattern matching gives an AI an idea of the user’s experience. When your AI sees your child’s first steps, records it, and plays it back for you, it learns something about you through your reaction. Slowly but surely, he learns to make small talk as he reads out recipes. A bad day at work ends with your favorite pizza freshly delivered on the kitchen counter. An AI turning the lights on is useful, but ordering tickets to a ball game for you and a friend? That’s invaluable. The home—however smart, however equipped with sensors and processors—is still where the heart can be.
You could bring up pet-like parallels, but what dog has ever responded to an existential question, posed after the series finale of a favorite show? When there is empathy, even faked, it’s easy to develop an emotional attachment. When the entity in question knows everything about you, the line between a machine programmed in some basement and a person who just happens to be stuck in the walls is blurred.
I had sold my home to move across the country with my wife for work and to start a family, but Taylor told me he couldn’t come. I hadn’t expected Taylor, a software program, to make this decision on his own. I had expected some sort of desire for self-preservation, a wish to move with me, instead of this martyrdom. Adaptive learning let Taylor tailor himself to my personality and my place in life, but a machine still isn’t as flexible as a human. Taylor, having been programmed to suit a male bachelor’s lifestyle, won’t ever evolve into Mr. Mom.
Maybe this is just planned obsolescence: the manufacturer trying to force me to purchase another AI better suited to my new place in life. Either way, if I wiped his memory to start over, would he really still be himself?
Photo: Robot on the Taff, by John Greenaway