Robot-Centered Design

The Future of User-Centered Design Has a New Kind of Stakeholder

You’re a designer. You create products, experiences, services, buildings, and more. Each time you begin the design process, you consider the multitude of touchpoints and features of your design and how they will engage with your intended user. This is the logic of the human-centered design (HCD) process: Keep people, their values, and their desires at the center of your approach. Start and end every action by thinking, “How would a human use my design?”

Now you have a new group of stakeholders. Your experiences and buildings are becoming littered with them. Each time you begin the design process, asking what a human would want is no longer good enough. The HCD process is table stakes – if you haven’t already started thinking about how to create designs intended for multi-user scenarios, you’re designing to suit a past world.

In the future of design, robots will be critical users to consider. The design process will no longer simply be a human-focused activity, but instead a complex, multi-stakeholder dance that must balance the values and constraints of man and machine. HCD will only be half the equation; the other half will be robot-centered design.

Know Your History

Human-centered or user-centered design (UCD) is a process for creating a given product, service, or experience in which users’ values or characteristics are given close attention at each stage. Originally coined and crafted during the early days of software and user-interface development, HCD has now become a universal design and business framework for tackling problems with a constant consideration of the human perspective across all stages. The process is iterative, involves a variety of different skills and perspectives, and is driven by constant involvement of users throughout.

HCD is often criticized for neglecting specific subgroups of people. Part of the challenge is that, since (most) designers are human, we often inject our own values and biases into our designs and lose sight of our intended user, particularly if their values and desires differ greatly from our own. Essentially, we’re so self-centered that sometimes we can’t see past ourselves when attempting to design for a specific user.

Know Your Future

If we look to the future of robotic development, it’s fairly easy to envision a scenario where robots outnumber people (assuming they don’t already). With robots of all shapes and sizes wandering and engaging with the world around them, we should assume that this world will be different from today. Though we will design aspects of our robots to operate like humans, they will not be human; they will be something distinctly different. In some ways, they’ll be like us – we’ll likely anthropomorphize some robots by giving them a rough humanoid design – but in other ways, they will be nearly alien in nature.

They’ll have cameras for eyes, microphones for ears, grippers for hands, wheels for feet, and a mind that – try as we may to design otherwise – diverges from the human brain rather than converging with it. They’ll search for tiny codes on walls, floors, and objects to help identify those objects and orient themselves. They’ll maneuver through ramps and lifts to get from A to B, avoiding battery-draining and labor-intensive staircases. They’ll cycle through a series of purpose-built manipulators to engage with the different doors, computer interfaces, and people they encounter.

Think of some of the tasks we consider “simple” and the difficulty of designing a system to fully accomplish them. Completing a package delivery in an office breaks down into a series of steps that seem straightforward at first glance but become a bit more complicated when roboticized. First, the box must be grabbable by the robot. The route needs ramps and elevators the whole way, or the system must be able to negotiate stairs. The walls and floors need to be visually distinct to allow the system to model and navigate them. Any door handles or elevator buttons must be recognizable and easily manipulated by the robotic system. It must be able to see and avoid obstacles and know where to appropriately place the package. And finally, it must be capable of interacting with the intended recipient to ensure that they are happy with its actions and need no further help.
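
To make that breakdown concrete, here is a rough sketch in Python of the delivery job as a chain of go/no-go checks. The environment flags and step names are illustrative stand-ins for real perception, navigation, and manipulation subsystems (not any particular robot’s software), but they show how a single unfriendly design decision can block the entire task.

```python
# A sketch only: each flag stands in for a whole subsystem (grasping,
# navigation, perception, manipulation) that a real delivery robot would need.
from dataclasses import dataclass


@dataclass
class Environment:
    box_graspable: bool             # can the gripper actually pick up this package?
    step_free_route: bool           # ramps and elevators the whole way, or stair skills
    walls_visually_distinct: bool   # enough contrast to map and localize against
    handles_recognizable: bool      # doors and buttons the robot can find and press
    dropoff_marked: bool            # a clearly identifiable place to leave the package


def can_deliver(env: Environment) -> bool:
    """Every step must succeed; one unfriendly design choice blocks the whole task."""
    steps = [
        ("grasp the package", env.box_graspable),
        ("navigate the route", env.step_free_route),
        ("localize against walls and floors", env.walls_visually_distinct),
        ("operate doors and elevators", env.handles_recognizable),
        ("place the package at the drop-off", env.dropoff_marked),
    ]
    for name, ok in steps:
        if not ok:
            print(f"Delivery blocked at: {name}")
            return False
    return True


# Example: an otherwise robot-friendly office with an unmarked drop-off point
# still stops the robot cold.
print(can_deliver(Environment(True, True, True, True, dropoff_marked=False)))
```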

Should We Make Them Come to Us?

The easy (though not very empathetic) answer is to make the robots play by our rules: have them interact and engage with a human world. And while this seems reasonable enough – we were here first, after all – it doesn’t do us roboticists much of a favor. Many of the problems currently plaguing modern roboticists are complex and nasty ones. However, through simple design tricks and adaptations – often ones that humans wouldn’t even notice – we can embed design features into the world around us to make life infinitely easier. If we instead choose to leave the world as is, we’re essentially telling roboticists that they need to replicate a perfect mechanical human – which, as you can imagine, is not an easy task.

So instead of designing mechanical systems with crotch-height lubricant discharge valves, what kinds of adjustments should we be making for our robot buddies?

Floors and Walls

In those hilarious moments where you’re staring at five different samples of grey carpet or paint that look identical, you may have wondered whether you were the butt of a cruel joke, with a selection between identical options being passed off as an important choice. To you, these subtle differences may mean nothing; to a robot, however, that Ashen Cloud grey might be impossible to differentiate from the Light Charcoal walls you’ve picked – and that’s just mean. You’d be amazed how quickly vision recognition systems could tell you their favorite Pantone colors.
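
To see why, consider a toy comparison of two greys in plain RGB space. The paint values below are invented for the sake of the example (they are not the real swatches), and a production vision system would work in a perceptual color space under messy lighting, but the numbers make the point: the gap between two designer greys can disappear into sensor noise.

```python
# Toy illustration: the RGB values are made up for the example, not real paints.
import math


def rgb_distance(a: tuple, b: tuple) -> float:
    """Plain Euclidean distance in RGB, a crude stand-in for what a camera
    pipeline sees after lighting changes, sensor noise, and compression."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


ashen_cloud = (178, 180, 181)     # hypothetical grey wall paint
light_charcoal = (172, 174, 176)  # hypothetical grey carpet
safety_orange = (230, 120, 30)    # something a robot can actually latch onto

print(rgb_distance(ashen_cloud, light_charcoal))  # ~10: easily lost in noise
print(rgb_distance(ashen_cloud, safety_orange))   # ~171: trivially separable
```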

Robotic Feng Shui

If we’ll go so far as to micro-adjust the layout of our household objects to improve the flow of energy, we could, at the very least, move the coffee table five centimeters to the left so that there’s a wide enough berth for the cleaning bot. Beyond simple clearance, how could the clever positioning of objects and features around us make life easier for both man and machine?

Mechanical Segregation

Since the last thing that 99% of us want to do with a robot is hurt someone, the problem of world navigation becomes much more complex when you’re literally worried about stepping on someone’s toes. This means that robots have to move slowly and cautiously, and they must constantly be ready to stop and give way to unpredictable, scattered humans. Could the resulting loss in speed and efficiency provide sufficient justification to operate independent “robot lanes,” free of human interference?

Stairs Suck

Take it from a guy who worked on this problem for a year: Climbing stairs sucks. Especially for robots.

Cheat Codes

Until our vision and navigation systems are perfect, everyone can use a helping hand. And while we could continue to design city and building signs exclusively with those computationally difficult symbols we like to call “letters” and “icons,” we could convey much more information, more accurately and in less space, by leveraging barcodes, QR codes, and other machine-readable markers. This approach takes one or two unnecessary steps out of the process of reading and ensures vastly improved comprehension.
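
As a rough illustration, here is what reading such a marker could look like with OpenCV’s off-the-shelf QR detector. The image file name and the payload format are made up for the example; the point is that one detect-and-decode call replaces an entire read-the-sign pipeline.

```python
# Sketch of a "cheat code" reader; assumes opencv-python is installed and that
# "hallway_sign.png" is a photo of a wall sign carrying a QR code.
import cv2

img = cv2.imread("hallway_sign.png")
if img is None:
    raise FileNotFoundError("hallway_sign.png not found")

data, points, _ = cv2.QRCodeDetector().detectAndDecode(img)

if data:
    # A short machine-readable payload can carry building, floor, and room hints
    # that would otherwise take OCR plus language understanding to extract.
    print(f"Sign says: {data}")  # e.g. a payload like "bldg=7;floor=3;room=314"
else:
    print("No machine-readable marker found; fall back to reading the letters.")
```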

Standardized Grips

Looking at the world of product design, you would be hard-pressed to believe that a field called “human factors” exists when you consider the broad spectrum of grip and touch interactions we’re expected to perform. Fortunately for us, our hands are quite versatile. For robots, this is not the case. I’m not saying we can make one glove size fit the whole world, but if we could at least reduce the spectrum to a non-infinite set of options, your robots would thank you dearly.

Connected Everything

On second thought, we can remove the need to physically interact with objects entirely if they all become connected and robots are given the ability to control them wirelessly. Taken to the extreme, this does have the potential to create Skynet, giving all robots the perceived power of telekinesis – but it will also make it a ton easier to make a cup of coffee.
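
As a sketch of how light that interaction could become, imagine the robot starting a brew with nothing but a network request. The device hostname, endpoint, and payload below are entirely hypothetical, but the shape of the exchange is the whole point: no gripper, no buttons, just a message.

```python
# Hypothetical example: the hostname, endpoint, and JSON payload are invented
# for illustration; real connected appliances each expose their own APIs.
import json
import urllib.request


def start_coffee(host: str = "coffee-maker.local") -> int:
    """Ask a (hypothetical) networked coffee maker to start a brew cycle."""
    payload = json.dumps({"action": "brew", "size": "mug"}).encode("utf-8")
    request = urllib.request.Request(
        f"http://{host}/api/v1/commands",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request, timeout=5) as response:
        return response.status  # 200 means coffee is on its way, no hands required


if __name__ == "__main__":
    print(start_coffee())
```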

The robots are coming. You, as a designer, have the ability to either slow or accelerate this new wave of technology. If we continue to march down the path of HCD, we will continue to create complicated worlds that ostracize and befuddle robots. Alternatively, if we adopt the principles of robot-centered design alongside HCD, we have the potential to start creating modern environments that enable and empower the future of human–machine collaboration.

On the one hand, this is simply good design. If you know that the future of your designs includes multiple key stakeholders, why would you ignore half of them? Any responsible designer should be considering every potential user of their design and how to optimize it for everyone. On the other hand, robot-centered design shows humanity at its most human. We have the opportunity to demonstrate that our empathy encompasses considerations even beyond our own species and can extend to our own creations. So, give a robot a helping hand – roboticists will thank you for making our lives easier, and we promise we’ll program our robots to thank you too.

The Author

Shane Saunderson

Shane Saunderson is VP, IC/Things at Idea Couture.