I was tapping on my phone, looking for someone who could send me a photo of the Barnes & Noble on East 86th Street in Manhattan.
I wanted to see if the store was letting customers sit inside — not because I needed to, but because I could. It was one of the most benign uses I could imagine for a new app called Mole — which promises to give you access to a network of spies-for-hire, who will snap a cellphone photo of anything you want.
Mole was launched in July with a publicity campaign touting the app as an “Uber for eyeballs”: a way to pay a stranger to take live pictures or stream real-time video of anything, anywhere, on demand. The operation was billed as a win-win: the picture-takers, called Moles, get a chance to make some extra green, while the users, called Agents, get a peek at something without having to go there. Mole’s publicist told me that early Agents had sent in requests such as checking the length of the lines at Walt Disney World in Florida, and whether people there were wearing masks; seeing if a monument was still up after a protest; asking how busy the dog park was in Prospect Park in Brooklyn; sending a video of the men’s section at Bloomingdale’s; and checking on the exterior of a house when the owners were away.
This last one gave me pause — it seemed a bit invasive — but I was intrigued. Sure, I was curious about the Barnes & Noble, and I wanted to know if there was a long line at the polling place near me. But if we’re being honest, I was more interested in knowing what my ex’s new wife and just-as-new house looked like.
That’s the temptation, and the threat, of the new surveillance tools at our disposal. If we can use them for spying on each other, how long will it take before we do?
Mole’s creator is a 42-year-old serial entrepreneur named Avery Pack. He has also started a company that customizes fleets of bicycles, whose client list includes Google and American Airlines. Pack, who lives in Miami, came up with the idea for Mole five years ago, when he wondered why nothing existed that would let him drop a pin at a specific location and see it through another person’s camera. Two years ago, he returned to the idea, hired designers, got a patent, and rolled out the product in New York. The dense city was perfect for regional testing, he says, since requests could be fulfilled in close proximity.
By the time I talked to Pack by phone in the fall, Mole had 10,000 users, most of them in New York. The app was charging a minimum of $1.50 for the connection between a Mole and an Agent, plus an additional 50 cents per minute for streaming. Under this model, the average assignment earned a Mole $5 to $10, plus money for travel time.
“I’d like to know if there’s a long line at the coffee shop, but I don’t know if that’s a problem that needs to be solved by technology like this.”
Jeff Hancock, founding director of the Stanford Social Media Lab
In the Mole press release, Pack talks breezily about the culture of surveillance. “There’s still a false notion of privacy that we need to just dust away and adjust to behaviorally,” he says. Mole’s early press coverage doubled down on this idea: a New York Post article from September was headlined, “This app lets you spy on your neighbors through your phone.”
But when I followed up with him, Pack downplayed the ethical questions around his creation — or the idea that the app was intended for surveillance at all. “We’re a small startup trying to get attention. Why not confront what everyone was thinking it could be used for headfirst and wear it on our sleeve?” he told me in a phone interview. “So we called the app Mole. The obvious conclusion is to spy on people, but to me, it’s the least interesting utility.”
In a world of intrusive technology, though, ethics experts say it’s far too easy for people to succumb to the temptation of spying. “Apps like Mole push an ethical boundary. This tool is designed for misuse,” says Jeff Hancock, a communications professor at Stanford University and the founding director of the Stanford Social Media Lab, which studies psychological and interpersonal processes in social media. An app like Mole, Hancock says, “doesn’t make us do anything, but it does raise the potential for abuse. And if there is potential, are there people who will want to try it out?”
It’s true that Mole is only laying bare a surveillance ability we’ve had for years, through any number of digital tools and smart devices. A smart garbage assistant can scan the items you throw away and instantly order new ones on Amazon. Smart car features can alert a parent when a teen driver is speeding. The GPS-driven “share my location” feature on iPhones allows parents to track their kids’ movements, or teenagers to track one another. Dating apps like Hinge and Grindr let us know when people we have liked are in our vicinity — if they, too, have switched on their “share my location” preference.
The ability to find a lost cell phone, navigate away from traffic, or make sure children have arrived at their destination — it all feels inoffensive, even helpful. But more and more technologists and ethicists have been sounding alarms about the hidden effects of data sharing. “We are normalizing surveillance,” says Michael Zimmer, a computer science professor at Marquette University who specializes in privacy and internet ethics. “We have lost sight of what the tradeoff is,” he continues. “My Spotify playlist is customized, so I’m benefiting from that. But is that benefit worth the cost of what I’m losing? Maybe I’m losing something I don’t even know I’m losing. It raises all these new social possibilities that aren’t always clear.”
Our judgment, our sense of the clear lines between right and wrong, took years to blur. Some say it started with reality television. Others cite the making of insta-celebrities or the ability of videos to go viral. Zimmer brings up gossip publications like the defunct Gawker, which over a decade ago invited readers to post photos of celebs they spotted in real time in a feature called Gawker Stalker.
“Things changed with the combination of smartphones having a camera and the ease of uploading almost any content, at any given moment,” Zimmer says. “If I saw a celeb eating lunch, I could take a photo, post it to the site, and say where I spotted them. Gawker framed it as celebs doing everyday things, but it was invasive, and they dropped it because there was such backlash.”
Pack insists that Mole fills a niche that has little to do with spying on others and more to do with seeing precisely what you want to see. If you scroll through Instagram for photos of a vacation destination or a public event, he says, you’re dependent on what someone else chooses to photograph. With Mole, you can direct the shot and the angle.
He adds that Mole offers anonymity. An Agent doesn’t know who is snapping a photo or video, and a Mole doesn’t know who is requesting the job. No phone numbers are exchanged, and the two have no way to speak to each other. And you can’t request a specific Mole, even if you were happy with what they captured for you before.
“It’s far less egregious than other invisible intrusions that are around us,” he says. “This is a tool to empower people to see something. It’s not being rebroadcasted.”
But I wonder if that’s altogether true. I’d want to screenshot or save any photo I ordered — especially if I caught my spouse cheating or needed visual proof for a legal battle. A scandalous video, shared on the right platforms, could live on social media forever.
Hancock didn’t discuss the legal ramifications of Mole itself, but he did say that privacy protections often haven’t caught up with technological advances. “Law tends to lag. It’s super slow,” he says. “Technology and our ability to access new kinds of information is fast. That creates a mismatch.”
Our personal notions of privacy and ethics need to evolve as well, he says.
“I don’t think these technologies do stuff to us. We have responsibility and agency,” says Hancock. “People that end up using apps or technology for nefarious purposes bear as much responsibility as the person who created it. And yes, I’d like to know if there’s a long line at the coffee shop, but I don’t know if that’s a problem that needs to be solved by technology like this.”
Hancock and Zimmer want to train the next generation of app developers — and consumers — to use new technology responsibly. Hancock’s Stanford class “Truth, Trust, and Tech” examines how technology both deceives and abets deception, and teaches students to become more resilient to disinformation. And Zimmer says college courses like the ones he teaches at Marquette about the social and ethical implications of technology are becoming more popular, and sometimes even required. In recent years, MIT added an ethics in technology course, taught by a philosophy professor. Northeastern University is integrating an ethics curriculum into all of its computer science programs.
Zimmer says he tries to teach his students that technology should serve not just a user’s immediate needs, but also the collective good. “I try to get my students to focus on dignity and autonomy — not utility, low cost, or efficiency,” he says.
Perhaps I needed a lesson, too — though in the end, it was Mole’s still-small user base that kept me from spying. I tried several times to find a Mole to snap a few photos at the Barnes & Noble on 86th Street, but each time, no one replied. At first, I was disappointed; I wanted to know if the app worked, if the photos would be revealing and helpful. Then I was relieved. Perhaps not that many people in my neighborhood were out scouting locations. Perhaps no one wanted to be a spy. Or maybe, just maybe, New Yorkers were having a collective ethical moment.