Humans+Robots

Why your app doesn’t care who you are

Silicon Valley’s insularity has been a talking point for years. Sometimes, that has real-life consequences.

By Schuyler Velasco

Sara Wachter-Boettcher, an avid runner, tore her ACL last year and couldn’t schedule surgery for several months. In the meantime, she kept running, though not nearly as fast as usual.

But there was no way to tell that to Runkeeper. The popular app for runners had no setting that let her pause her goals toward faster run times without erasing them altogether. “I got a lot of these peppy notifications, like, ‘This is your 79th fastest 5K. Congratulations!’” Wachter-Boettcher says. “What do you mean, ‘Congratulations?’ I’m happy to get out there and do this at all.”

It was a small inconvenience, but a sign of a larger problem as our lives become increasingly intertwined with our phones. We clock hours into sleep trackers; log calories and miles into MyFitnessPal or Fitbit; use phones for basic banking and transportation.

But when digital apps are designed chiefly around simplicity and ease of use, they don't end up fitting everyone's needs. Sometimes, mild annoyance can bleed into insensitivity: Automated forms that don't allow for non-binary gender identification. Chipper Facebook reminders for a dead loved one's birthday.

And sometimes, the inconvenience falls hard on groups that wonder if they’re invisible to Silicon Valley altogether. Kelly O’Leary of Berkeley, California, complains that Planned Parenthood’s Spot On app couldn’t be paused when she got pregnant, and now might miscalculate her ovulation cycle. “It’s storing years of useful health data so I’d like to keep using it,” she says.

In a September article in Engadget, writer Swapna Krishna detailed the ways fitness and health apps frustrated her throughout her pregnancy: She was sent dire warnings about her weight gain, and her Apple Watch pestered her about her decreasing activity level. “Tech has failed the bodies of biological women more than it’s served us,” Krishna wrote.

It’s not hard to draw a line between those problems and the larger controversies swirling around the tech industry. There’s the lack of diversity: Despite some high-profile public initiatives, the Silicon Valley workforce is still disproportionately white, male, and young, and those hiring shortcomings can be reflected in tech products.

Critics also say tech companies hew to narrow ways of thinking when their entire development teams come from a similar background, or have the same educational training — often from a handful of the same colleges.

There’s also the premium Silicon Valley puts on speed and disruption amid tech’s breakneck atmosphere of competition. Wachter-Boettcher, the author of Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech, cites the industry concept of the “minimum viable product”: releasing an app with basic features as quickly as possible, then “not necessarily going back and asking deeper questions about it,” or taking steps to expand upon that basic functionality in later versions.

That’s how the simplest scenarios become default, she says: a single person catching an Uber; a young, healthy, non-pregnant person logging faster and faster miles.

“There are needs that are perceived as ‘normal,’ and other needs that are perceived as ‘abnormal’ or ‘fringe,’” Wachter-Boettcher says. “That makes it harder to serve more people’s needs, because you’re thinking of them as additives.”

For certain types of apps — dating, house-sharing, ride-sharing — a failure to foresee every scenario can put safety at stake, says Juliette Kayyem, a security expert and the CEO of the Boston-based ride-share startup ZemCar.

“All these big companies that view themselves as technology companies, but not as security companies. They’re learning [safety] on the back end,” Kayyem says. “Companies are focused on, ‘Look, we created this amazing platform.’ But it’s not just a platform; we’re bringing people together. And we need to recognize that there’s a vulnerability in that.”

Many Uber and Lyft drivers have come face-to-face with that tension. Under Uber’s terms of use, for instance, riders under 18 aren’t allowed to request rides, or take rides unaccompanied by an adult. But John Hammons, who drives for Uber and Lyft in Florida, says he’s asked to pick teenagers up all the time.

It’s all part of the apps’ design; Hammons has no way of knowing who is requesting a ride beforehand, and if he checks the ID of everyone who looks young, he risks bad reviews and lower ratings — not to mention lost fares.

Younger children put him in another sticky spot. After being pressured a handful of times to transport small children with no proper car seat — which is illegal in most places — he ended up buying his own car seat and booster.

Many drivers complain about the challenges of transporting minors, says Brett Helling, a former Uber driver who runs the website Gigworker.com. “Uber and Lyft should never put drivers in a position where if they obey the law they’ll be penalized, or if they break the law they’ll be rewarded,” he says.

The industry is slowly catching on. Uber is testing an option that would allow passengers to request car seats in four U.S. cities, and the company recently relaxed its stiff penalties for drivers who cancel rides. Lyft has a “car seat mode” in New York. And startups are swooping in to fill the gaps — from New York’s Kid Car to ZemCar, which pairs families with drivers for regularly scheduled rides for their kids. (Kayyem, of ZemCar, also sees a potential market for elderly people who need regular rides.)

Some drivers find those narrower options attractive.

“Being a woman, driving at night, it’s not always the most comfortable situation. You don’t know who’s getting in your car,” says Natalie Kostich, who used to drive for Uber and now has a few regular ZemCar customers. Her work for ZemCar isn’t as flexible: ZemCar’s on-demand requests tend to all come in during the same peak hours. But it’s consistent, and she meets almost all of her clients face-to-face.

Elsewhere, companies are taking steps to ensure their tech works for wider swaths of people. Pinterest, after realizing its search results for beauty, hair, and makeup were dominated by images of blondes with porcelain skin, introduced a tool in April that allows people to refine their recommended “pins” based on skin tone. A handful of health apps now take pregnancy into account, including Weight Watchers and Nokia Health/Withings’ Pregnancy Tracker.

Wachter-Boettcher says companies could do a better job making those changes on the front end. It’s “very feasible to design features and products that do serve a broader range of people and actually cohesively meet needs,” she says, “as opposed to feeling like you’re going to Frankenstein it together later.”

And companies need to be more honest about the potential negative consequences of their technology, she says.

“It’s not very encouraged in a lot of tech teams, because it’s seen as negative,” she says. “They’re focused on the blue sky and positive outcomes and having vision. That’s great. Talk about that stuff. But you also need to be talking about how could this go wrong.”


Schuyler Velasco is Senior Editor of Experience.


Illustration by Franziska Barczyk
