SXSW 2018: 4 Takeaways for Designers

In a world where we are happy to drink the design-thinking Kool-Aid, the title “designer” is thrown around a lot. But what does it mean to be a designer today? What role do we designers play in shaping technology, business, and society at large?

In March 2018, I listened as dozens of speakers shared their perspectives on these questions at South by Southwest (SXSW). Inspired by their thinking, I came away from SXSW with my own new thoughts on the role of designers. If it’s true that “with great power comes great responsibility,” then I believe that today’s designers are responsible for doing the following four things.

Design Lesson 1: We must be intentional about the outcomes we want. 

“The future shouldn’t be self-driving!” – Josh Clark, Big Medium

As we build increasingly powerful tools equipped with AI and machine-learning capabilities, designers will be responsible for curating their uses. To ensure these tools are used for problems worth solving, we must understand what the true human need is in any given context, define the outcomes we want, and create guardrails accordingly. Only by remaining truly human-centric will we move away from solving for conveniences and toward solving for real needs. In doing so, we will build technology that helps us all lead better lives instead of merely creating powerful “supermachines.”

This level of intention is equally critical to making our businesses more successful. Designers must develop intelligent metrics in order to define and measure business priorities in a way that reflects the needs we’re trying to solve for and the experiences we’re trying to create.

Design Lesson 2: We need to design for humility and openness.

We see it as our job to “delight” our users. But when we’re designing new tools, like AI, it can be equally important to design for failure. The systems we create will often fail – especially initially. With this in mind, we need to design experiences that result in enjoyable outcomes, regardless of whether those outcomes are perfect. We also need to design for uncertainty – that is, for users not knowing whether the response or recommendation they’re getting is truly optimal. One way to do this is by clearly outlining the rationale behind the decisions an AI makes – after all, user understanding is the first step toward user trust. A designer can add a lot of value by working with engineers to visually and intuitively illustrate the inner workings of a given tool.

For example, Robbie Barrat, an AI researcher at Stanford, has trained an AI to paint the human body based on what it perceives humans to look like from its data set. Barrat’s work is a great illustration of the disconnect between what users expect a machine to be thinking and what it is really thinking or doing. By bringing this disconnect to the surface, designers can help users learn to interact with tools more effectively and to interpret their output.

Today’s algorithms have an overconfidence problem – which, at the end of the day, is a data-presentation problem (i.e., a design problem). How do we illustrate the actual level of confidence that goes along with an answer a user receives? One way is to start acknowledging nuances and openly displaying the confidence level associated with an output. This information is often already available in the raw data output from an algorithm, but we don’t typically present it. By designing systems that are smart enough to openly acknowledge what they don’t know – ones that ask for clarification when they need it – we can truly position the machine as a helper for humans rather than a replacement.
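As a minimal sketch of this idea, the snippet below turns a model’s raw per-label scores into a response that states its confidence outright and asks for clarification when it falls below a threshold. The labels, scores, and threshold are illustrative assumptions, not from any specific product.

```python
# Illustrative sketch: surface a model's confidence instead of a bare answer.
# All labels, scores, and the threshold value are hypothetical.

ASK_THRESHOLD = 0.75  # below this, the system admits uncertainty

def present_answer(scores: dict) -> str:
    """Turn raw per-label scores into a user-facing response that
    openly states how confident the system is."""
    label, confidence = max(scores.items(), key=lambda kv: kv[1])
    if confidence < ASK_THRESHOLD:
        # Acknowledge what the system doesn't know: ask for
        # clarification rather than presenting a shaky guess as fact.
        return (f"I'm only {confidence:.0%} sure this is '{label}'. "
                "Could you tell me more?")
    return f"{label} (confidence: {confidence:.0%})"

# High-confidence output is shown with its confidence attached;
# low-confidence output becomes a question back to the user.
print(present_answer({"refund": 0.92, "exchange": 0.05, "other": 0.03}))
print(present_answer({"refund": 0.48, "exchange": 0.41, "other": 0.11}))
```

The design choice worth noticing is that the confidence number was always there in the model’s output; the only change is choosing to present it rather than hide it.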

Finally, we need to be extra transparent when dealing with topics requiring heightened critical thinking, also known as “hostile information zones.” We must be explicit about which topics our tools are not equipped to handle. We also need to audit the logic of our machines in these domains (much like we would audit human logic) by taking an open-source approach to sharing algorithms and data. This will lead to quicker and greater improvement of all tools.

Design Lesson 3: We must be inclusive and take responsibility for data.

Our algorithms are only as good as the data we feed them. When technology shows bias reflective of human prejudice – like, for example, when a Marriott hotel’s automatic faucets failed to recognize dark skin – the biases and flaws in our data sets become starkly apparent. We must ensure that we’re not codifying our past as we design the tools of the future. Failures like this highlight our existing biases and the need to correct them; clearly, we need to get better at building inclusive data sets from the start. Designers are trained to look at the lead and lag users of a given population distribution to inspire our designs. There’s no reason we shouldn’t apply that same logic as we build data sets and algorithms.

As Josh Clark of Big Medium points out, “Data gathering doesn’t seem like the job of a designer at first glance, but you can actually think of it as UX research at a massive scale. It involves thinking through things like:

/ What’s the real problem to solve?

/ What data will help determine the answer?

/ Who holds that data?

/ Who are the people we need to serve?”

By designing our data-gathering systems to make it easy to contribute accurate data for all segments of our population, we can be the drivers of better, more inclusive technology.

Design Lesson 4: We need to be provocative and to design for confrontation.

Most products are designed to make our lives convenient and frictionless, which we assume is a good thing. But some argue that we’ve gone too far. As Steve Selzer, Design Manager at Airbnb, points out, “When we remove all friction, we also remove moments for serendipity and self-reflection.” How might designers make it less daunting for people to engage in confrontation and spark more personal growth?

Regardless of our differences, all people inevitably have some overlapping values or areas of interest. Too often, today’s products focus on consumers’ differences and pull us further apart, creating echo chambers and making us think we are more different than alike. By designing products that root consumers instead in our similarities, designers can make people feel secure enough to have the difficult conversations necessary for understanding and addressing our differences. In other words, designers have the power to help people develop the mindset and skills needed to confront and work through challenges. To do this, we must decide what kinds of confrontations we want to design for.

Designers have the privilege of shaping the products and services of tomorrow – and the power to make sure this future is a more responsible, inclusive, and human one.

About the Author

Anna Roumiantseva

Anna Roumiantseva is an Innovation Strategist at Idea Couture.