Before the smartphone backlash, before apps were likened to cigarettes for kids or Facebook’s founding president Sean Parker mused that “God knows what it’s doing to our children’s brains” or Tim Cook revealed he doesn’t let his nephew touch social media, and before the demands for studies and regulations and shutting down apps, Riddhi Shah was en route to a weekend trip to unplug from tech-ified San Francisco.
It was late 2015, and riding in the car with her husband and another couple, Riddhi, a friend of mine, was many months pregnant, and racked with questions about how she would inhabit her new role as a mom. That makes her the same as every first-time parent in the history of the world. However, the terms of parenthood changed abruptly back in 2007 when Steve Jobs introduced the shiny object that many humans would either spend the next decade staring at or consciously telling themselves not to focus on. Now, parenting has gone, as one pediatrician told me, to “a 3.7 difficulty Olympic dive – up from a 2.8” a decade ago.
Parenting now means having the “smartphone debate” — not just with your kids, constantly, but with other parents. And as Riddhi was about to find out, hell is other people’s screen time policies.
Hell is other people’s screen time policies
Riding shotgun, Riddhi’s artist friend, who was confident in her parenting and not one to mince words, said she and her husband didn’t let their kid ever see a screen in his first two years. Riddhi pointed out that would be impossible: Riddhi was a content strategist at a tech company who had to attend to off-hours Slack and emails. Her husband (riding next to her, silently) was an avid CNN watcher and online news scroller. Riddhi explained that their jobs made it impossible to just give up screens cold turkey once a baby arrived.
Her friend pounced. “That’s bullshit! It’s just a question of prioritizing.”
Discussions like this one — about smartphones and their effect on people’s brains and mental health — are everywhere now, and as Riddhi experienced, they’re particularly intense when it comes to children. Tech industry insiders and children’s media watchdogs have launched a “Truth About Tech” campaign, pushing to create ethical standards for tech companies, lobbying for government regulation and government-funded research on the effects of all these screens, and advocating digital literacy curricula in schools. Groups like Campaign for a Commercial-Free Childhood and scores of allies are demanding that Facebook shut down its recently launched Messenger chat for kids. Shareholders are demanding that Apple study the effects of its phones on children and offer better parental controls. Many of these voices are not your typical professional worrywarts, but former Silicon Valley execs and apostates themselves.
Parents are still the only regulators
But until some massive industry or regulatory reconsideration takes effect — and it may never, given the business interests at play — parents are still the only regulators attempting to set rules for their kids and hashing out best practices with other parents.
Riddhi felt stung by her friend’s response. She’d been sorted into the slacker-parent category while her baby was still in utero. Two years on, her hurt has deepened into envious resentment. She calls the anti-smartphone crowd the “vegans of the parenting world.” “Women who are so ‘anti’ seem like they have all the answers, like they don’t need the technological distraction that mere mortal parents do,” she said. “There’s just morality around this whole technology issue — the equivalent of religiousness. It feels like it has that same fervor, in a way that not many other things do [about] bringing up a kid in San Francisco.”
I wanted to hear from the stricter side of the parental trenches, so I called a friend whom we’ll refer to as Julie. (It perhaps says something about the tenor of the smartphone debate in 2018 that she asked for a pseudonym for fear of rankling friends and clients.) Julie embodies that Northern California species of hyper-organized hippie: she’s a Leadership in Energy and Environmental Design-certified interior designer with a musician husband. Her nearly three-year-old son’s screen time is restricted to viewing family pics, FaceTiming his grandparents, and — only when he’s waiting out his mom’s dance class or together with the family — watching E.T.
In a world awash in screens, Julie’s stickler stance can cause a fair amount of tension. She walked in from work recently to find her nanny looking at a smartphone with her son, disobeying Julie’s ban. Julie didn’t want drama with a woman she considers “like family,” so she kept quiet. Likewise, she is growing increasingly annoyed with the friends who, when she visits with her son for a playdate, turn on what they call their “TV babysitter” so they can chat without having to watch the kids. They preempt her judgment by saying, “I know we’re bad parents.”
“I’m not going to tell them they can’t turn on their own TV,” Julie says, in a tone that says she also kind of wishes she could. (The result of this scenario was chaos, furthering her anti-screen conviction: “He didn’t like that his friend was completely glued to the TV. He wanted her attention, and before we knew it, he was pulling her hair and pulling her down. It makes him a little crazy.”) In a Facebook parenting group, Julie sees other parents posting things like, “I’m not one of those crazy people who don’t let their kids have screen time.” Her response: “I do judge secretly,” she says. “I think they’re trying to make themselves feel better… I’m just like whatever, they’re not informed.”
“I think they’re trying to make themselves feel better.”
The arguments against screens usually center on how they affect brain development and the ability to focus. Julie’s information about phones mostly comes from her husband, who, before their son was born, read Sherry Turkle’s Alone Together: Why We Expect More from Technology and Less from Each Other, a book about the decay of relationships in the digital age. He also tries to keep up with the articles that come out on the subject; he has been especially moved by Jean M. Twenge’s writing about the demise of the first generation raised on smartphones, and Andrew Sullivan’s piece in New York magazine on how he cured his internet addiction with a smartphone-free retreat.
But many of the most alarming pieces about phones rely on anecdotes or surveys with unclear causation: do depressed kids use social media more, or does social media make kids depressed? Solid studies on smartphones are hard to find or yield contradictory results. Smartphones are pushing us to “the brink of the worst mental-health crisis in decades,” writes Twenge, a psychologist. Other experts say that in moderation, social media can help kids build their social skills and resilience. NPR ran the headline “How Smartphones are Making Kids Unhappy.” The Huffington Post’s take: “How Technology Has Made Our Kids Smarter Than Ever.”
Solid studies on smartphones are hard to find
Anya Kamenetz, an NPR reporter and author of the new book The Art of Screen Time: How Your Family Can Balance Digital Media and Real Life, dug into the science, and her takeaway is that the available studies are too limited to give any definitive answers. As she notes, the last major federally funded research on children and media came out in 1982. Since then, Kamenetz writes: “[R]esearchers have struggled to keep up with the onslaught of new devices and technologies. The 2016 American Academy of Pediatrics guidelines cite just a handful of small experiments involving touchscreens and young children, for example; no large-scale studies, meta-analyses, or longitudinal studies either. Nor do researchers have access to inside industry information about how game-makers produce the most effective content.”
“I’d love [screen time] to be a science,” Kamenetz added in a recent radio appearance. “But the fact is the scientists out there are telling us there is a whole lot that they don’t know.”
“We” — as in parents — “are a little bit on our own.”
“We are a little bit on our own.”
It’s no wonder that, with so much uncertainty, parents are self-selecting into smartphone-policy tribes with their own rules.
“I think people bristle at being told how to parent, so it’s a touchy area,” says Sierra Filucci, who oversees the parental advice from the San Francisco-based children’s media watchdog Common Sense Media. (Her staff bio counts “bossing people around” among her interests.) “On one hand, they want really clear rules. But they want to know it’s coming from a trusted authority.”
More specifically: “They don’t want it from the other mom in the classroom.”
At age 11, the smartphone has itself only just entered its tween years, so the research about it is likewise short-term and prepubescent. Given that shortage, experts often reach for peer-reviewed studies of the screen with a much longer history — TV — which discern the effects of watching it in excess: obesity, worse school performance, social and language delays, sleep problems, worse family dynamics. The smartphone backlash crowd articulates logical reasons why smartphones could be like TV, only worse: they’re with kids 24/7, they’re open to all the content and creeps of the internet, and some apps are even programmed to be addictive to our reptilian minds.
Conventions around children watching television took years to evolve — and they continue to do so. Back in the more innocent age of 1999, the American Academy of Pediatrics (AAP) put out a policy that recommended no TV for children younger than two. “The original recommendation of ‘no screens before two’ was so clear, so definitive, that it felt really good,” says Filucci from Common Sense Media. “Parents were like, ‘I know what the rule is, and what I need to do to be a good parent.’ They felt guilty if they didn’t follow it.”
Then, after many subsequent studies showed the right kind of media could actually aid learning — and after critics called the near-prohibition out of step with modern parenting — the AAP revised its policy in 2016. “We were being accused of being net nannies,” says Victor Strasburger, a pediatrician who was a consultant on the 1999 recommendations. “The academy was, I think, concerned that parents weren’t buying into the original recommendation.” In other words: the experts needed to become more realistic.
“The original recommendation of ‘no screens before two’ was so clear, so definitive, that it felt really good.”
The revised maxim wasn’t vastly different in its age recommendations — no screens before 18 months instead of two years — but it became more nuanced, emphasizing a kid’s overall media diet. The AAP said that video chatting shouldn’t count against screen time and could be used even with babies, since studies show it allows children to maintain ties to remote family. It also capped media at one hour a day for kids ages two to five, with caveats: make it educational programming (PBS is a good bet), no fast-paced or violent content, no apps with too many distractions, no screens at meals or for an hour before bedtime, and parents should co-view media with their kid and reteach the lessons. The policy still warns against using tech as a soothing strategy in anything but a pinch, like a long flight. For kids older than five, it set no media time cap at all, instead suggesting a family media plan.
Suddenly, what had been a cut-and-dried age restriction changed into a string of parental judgment calls. (Even the length of the children’s policy went from two pages to four.) “Sometimes parents want very specific rules,” says Filucci. “That [rule] changed, and then it was complicated.”
Strasburger says of course more research is needed, yet he defends the available studies as being enough to justify controlling kids’ screen use. “We do have data on the effect of television and movies and music videos on kids, and these devices are simply being used to access to a great extent those types of media,” he says. “We don’t have good information yet on social networking, but research is evolving and it’s coming. We do have good research on cyberbullying and sexting, which are the two areas that are the biggest potential problems.”
“The filters are parents.”
Bottom line: “So we know a lot, and these new devices put media in the hands of kids at too early an age and are unfiltered — and it needs to be filtered. The filters are parents.”
Many in the control-and-filter camp take great pains to say they’re not anti-technology altogether. Kamenetz, the author, suggests this rule of thumb: “Enjoy screens, not too much, mostly together” (a nod to Michael Pollan’s dictate for healthy eating: “Eat food, not too much, mostly plants”). Common Sense Media, the nonprofit that advises parents on children’s media, led the “Truth About Tech” campaign unveiled with tech leaders last month. Yet it embodies the nuanced approach: “We think technology can be genuinely beneficial to kids if you find good quality content (which is why we rate and review every kind of media), and if they use it in balance with other parts of their lives,” spokeswoman Karen Zuercher said over email.
Still, parents are realizing they can’t filter screens on their own; they need to enlist other parents and schools of the same mind. DeeDee Schroeder, an attorney for Bechtel who lives in San Francisco, gave her two older daughters phones when they were in seventh grade, before she knew the consequences. Now “they will disappear in their rooms with laptops and watch Netflix. I feel like it is too late to reach them.” Not only does she think they no longer listen to her, she’s seen one daughter “disappearing into her phone,” and worries about her careening down a path of anxiety and body image problems, echoing some of the scariest early research on smartphones’ effects on teen mental health.
So Schroeder turned her focus onto her younger two kids, who are instead coming into their tween years as the smartphone backlash gains steam: “I’m going to do everything in my power to not let this happen to me again.”
“This is iPhone celibacy!”
Schroeder found out about the Wait Until 8th pledge, started last year by a mother in Texas, which asks parents to sign up 10 families in their kid’s grade who will postpone giving their children a smartphone until 8th grade. “I’m like, there’s no one doing anything about this. The government isn’t doing anything. The schools aren’t doing it. We need a community effort.” When Schroeder talked about the pledge at a school meeting about tech use, most parents — many of them working at tech companies — were interested and excited. She and another mom convinced 15 families to sign. Not all were sold: one dad, the CEO of a tech company, Schroeder says, snorted, “This is iPhone celibacy!”
Schroeder concedes that “Wait Until 8th has the sound of a sex ban.” Still, she insists she’s no abolitionist: “We’re so frustrated with that response.” Given that their private school already uses an iPad curriculum, part of the pledge is to educate the kids about tech — give them rules of the road before handing over the keys. “We want our kids to be more tech-savvy than the average kid.”
Two years after the car ride, Riddhi really hasn’t changed her mind about the impossibility of her and her husband surrendering their screens. Still, they have cobbled together a workable policy: she defers emailing while her daughter is awake, and he limits his CNN sessions to maybe 20 minutes, whereas before “it was the soundtrack to our lives.” The other night, she had to take a work call to try to convince a job candidate to join her company, while her daughter kept clamoring for the phone, chanting “Picture, baby!” (She wanted Riddhi to take a photo of her that she could immediately view, entranced.)
“Half my brain was feeling guilty about setting a bad example for her. So sometimes it feels like a battle between a career and being a ‘good mom.’”
“Picture, baby!”
Because her daughter seemed to be obsessed with watching videos of herself, Riddhi stopped letting her see them, whittling down her screen viewing to just a half hour of Elmo on weekends. She hopes that’s the right thing to do. “I’d say the potential damage to kids is probably less acute than the damage to parents’ psyches.”
That, she concedes, is based on zero evidence, but the fact it feels true taps into the heart of the parental debate: it’s far from being just about kids; it says just as much, if not more, about parents and their anxieties.
In fact, the AAP policy also includes guidelines for parental screen time, citing studies that indicate that parents on their phones interact less with their kids, may have more conflict with them, and that their own use is a likely predictor of their kids’ habits. There’s nothing like trying to regulate your own phone use to drive home just how compulsive our relationships with our devices have become.
Where are our parents to set the limits?
Parents don’t need scientific research to tell them phones can be dangerous; they can deduce the ills from their own overuse. Julie, the proudly anti-screen mom, knows social media can make a kid depressed because she herself feels anxious when scrolling through “all these amazing lives” on Facebook, and has to remind herself that every moment spent observing others online is a moment she is not working on her own goals. She has had to train herself, as a parent, to put down the phone, too.
Likewise, Riddhi gets that her toddler’s brain can’t handle a multitude of bombastic kid apps, mostly because she’s a grown woman who feels her own attention is frayed by constant notifications. Setting rules for your kids is easier than attacking your own behavior. Where are our checks and balances? Where are our parents to set the limits? Sometimes it would be an utter relief for mom and dad — or another qualified authority — to snatch the phone out of our hands and set some house rules.
Two weeks after the US presidential election, Reverend Rebecca Bryan stepped up to her pulpit in the Boston suburb of Brookline and looked out on her congregation. The liberal Unitarian Universalists at First Parish were already gearing up for political action, a central tenet of the church’s 300-year-old mission. The members regularly protest for immigration and prison reform. Post-election, they were debating turning the church into a sanctuary for undocumented immigrants.
That Sunday, Bryan shared an idea about how to communicate: “Whats-App,” she said, enunciating the two syllables as if trying the name out for the first time. “A group that is entirely encrypted, where we know what we share is completely safe.”
Nathan Freitas, a software developer and privacy activist, was sitting in the pews. “I’m like, did my reverend just use the word ‘encryption’?”
“Did my reverend just use the word ‘encryption’?”
Freitas had mentioned WhatsApp to Reverend Bryan as a tool for the parish’s social action because last year, the world’s most popular messaging app switched to end-to-end encryption by default. That means messages aren’t readable to eavesdroppers, to the servers they pass through, or even to WhatsApp’s parent company, Facebook. Overnight, the words typed by a billion people were whisked from the threat of surveillance, thanks to code developed by a San Francisco open-source project called Open Whisper Systems.
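To make the concept concrete, here is only a minimal sketch, assuming the PyNaCl library, and not the actual Signal Protocol (which layers X3DH key agreement and the Double Ratchet on top of primitives like these): the private keys never leave the two phones, so a relaying server only ever handles ciphertext.

```python
# Toy end-to-end encryption sketch with PyNaCl (libsodium bindings).
# Illustrative only; not how WhatsApp or Signal is actually implemented.
from nacl.public import PrivateKey, Box

alice = PrivateKey.generate()   # secret key stays on Alice's phone
bob = PrivateKey.generate()     # secret key stays on Bob's phone

# Alice encrypts to Bob's public key; PyNaCl attaches a random nonce.
ciphertext = Box(alice, bob.public_key).encrypt(b"meet at the parish hall at 7")

# The server relays `ciphertext` and can learn nothing from it.
# Only Bob, holding his private key, can open it on his own device.
plaintext = Box(bob, alice.public_key).decrypt(ciphertext)
print(plaintext.decode())
```

The real protocol goes further, rotating keys with every message so that even a key stolen later can’t unlock previously intercepted traffic.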
Founded in 2013 by idealistic cryptographer Moxie Marlinspike, Open Whisper developed the protocol adopted by not only WhatsApp, but Facebook Messenger’s “secret conversations” feature, and Google Allo’s “incognito mode.” Open Whisper also has its own flagship messaging app called Signal, and people have been flocking to it since the election. On November 9th Signal vaulted up App Annie’s ranking of the most popular iOS social networking apps, from 98th place to 45th. Worldwide downloads grew 70 percent in the last quarter of 2016, with nearly half a million downloads coming from the US. As activists and wary citizens prepare for a president-elect who seemingly disdains the First Amendment, flirts with the idea of creating a Muslim registry, and will soon sit atop a powerful surveillance apparatus, Signal has become the gateway drug to the encrypted life.
While Trump’s election helped propel Signal toward the mainstream, the Trump era heralds more uncertainty for encryption’s already uncertain future. On one hand, law-and-order Trump has made statements about protecting the country from hacks, announcing in early January that he’d start a committee in his first 90 days in office to combat cyberattacks (“We’re like the hacking capital of the world,” he told reporters). Yet he’s also spoken critically about encryption and privacy protections. On the campaign trail, Trump blasted Apple for refusing to break into the iPhone of one of the San Bernardino shooters and spoke enviously about Russia’s hacking of the Democratic National Committee — “I wish I had that power,” he said at a press conference, “man, that would be power.” Trump has already picked encryption critic Jeff Sessions to lead the Justice Department, and “robust surveillance” advocate Mike Pompeo to head the CIA.
In the meantime, Signal’s post-election popularity appears to be more than a temporary blip. Marlinspike said the app is getting more sustained daily downloads than Open Whisper’s encrypted products did after its second biggest boost: Edward Snowden’s revelations about the surveillance conducted by the NSA.
It was five weeks after the election when I met Moxie Marlinspike at the humble office he shares in San Francisco’s Mission District. Open Whisper’s logo is printed on a sheet of paper and taped to the door, which Marlinspike opened as he showed me into the kitchen. An aerobics class pounded through the ceiling. His blond dreads were gathered into a ponytail, and he wore a black hoodie zipped over another black hoodie — the cliché hacker getup, squared. “I don’t have a lot of clothes,” he explained.
While he now lives in an Oakland apartment, Marlinspike used to have a more itinerant lifestyle; he lived in a warehouse after moving to San Francisco in the late ’90s as a college-shirking coding wunderkind. He took breaks from software jobs to indulge his wanderlust, sailing around the Bahamas and jumping trains across the US. In 2015, he posted a picture on his Instagram standing next to Signal’s most famous endorser, Edward Snowden, in Moscow. But all those biographical facts come from the internet: Marlinspike is cagey on all personal details, reluctant to reveal anything beyond the fact that he’s now housed. He’s equally tight-lipped on the election, not revealing whether he’d even voted.
Marlinspike is a reluctant participant in Silicon Valley’s startup culture. “When I was a kid, ‘entrepreneur’ was not a compliment,” he said. “That wasn’t the kind of thing college kids sat around in their dorm rooms dreaming about.” Marlinspike worked for other software companies for years, and his first startup, Whisper Systems, was put on hold when he was acquihired by Twitter in 2011 to head its security department. The gig didn’t last long. Fourteen months later, Marlinspike walked away from $1 million in stock options, quitting after a near-death sailing incident in San Francisco Bay to get back to the work that most fascinated him. He returned to developing two encrypted apps for Android: TextSecure for messaging, and RedPhone for calls. In 2015, he folded both services into Signal.
Signal’s clean blue-and-white design makes it nearly indistinguishable from any other messenger. Apart from a few missing features like sending voice messages on iOS (development is in progress), there’s little it can’t do. Signal is widely regarded as the best in the field for security, both for its encryption and for retaining less info on its users than other privacy-focused apps. People also tout its ease, devoid of the clunky shenanigans of PGP encryption. The only hint of its underlying security comes when using Signal for phone calls: once both parties are connected, each is given a random pairing of words to confirm the identity of the person on the other end of the line.
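Those word pairs are a short authentication string in the ZRTP tradition: both phones derive the words locally from the call’s shared key, so an attacker sitting in the middle, whose keys would differ, produces words that don’t match when read aloud. A toy sketch of the idea, assuming a made-up word list and helper rather than Signal’s actual code:

```python
# Toy short-authentication-string sketch; not Signal's real code or word list.
import hashlib

# Hypothetical word list (Signal's calls drew words from the PGP word list).
WORDS = ["acid", "bravo", "cement", "drumbeat", "eating", "freedom", "gazelle", "hamlet"]

def call_words(shared_key):
    """Map a hash of the call's shared key to two human-readable words.

    Both ends compute this from the same key material; if an attacker
    intercepts the key exchange, the two phones end up with different
    keys and the spoken words won't match.
    """
    digest = hashlib.sha256(shared_key).digest()
    return WORDS[digest[0] % len(WORDS)], WORDS[digest[1] % len(WORDS)]

print(call_words(b"key material agreed during call setup"))
```

Reading two words aloud is the entire user-facing cost of verifying the call, which is the kind of low-friction security Marlinspike argues is the only kind most people will actually use.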
If encryption is difficult, it self-selects for people willing to jump through those hoops, and bad guys are always willing to jump through the hoops
Three years in, Open Whisper is still running with a skeleton crew: just five employees, plus more than a dozen contributors in the open-source community. Marlinspike wants to grow that team, and he’s hunting for talent willing to sacrifice the usual tech salaries and cushy Silicon Valley perks. (The website states that there’s “no gym membership, but any time you feel like doing push-ups, we’ll do them with you.”) Open Whisper doesn’t patent its open-source protocol, giving it away to the most highly valued companies on the planet for free. Instead, Open Whisper’s work is completely funded by private donors and foundation grants. (Marlinspike declined to name individual donors, and tax forms don’t disclose them.)
Marlinspike’s goal isn’t unicorn riches, but unicorn ubiquity. For that, he wants to make encrypted messaging as easy — as beautiful, as fun, as expressive, as emoji-laden — as your default messaging app. His reason: if encryption is difficult, it self-selects for people willing to jump through those hoops. And bad guys are always willing to jump through the hoops. “ISIS or high-risk criminal activity will be willing to click two extra times,” he told me. “You and I are not.”
“What?!” I asked. “You are, aren’t you?”
“At this point — no,” he said. “I’m unwilling. I don’t use PGP. I don’t do any wing nut stuff. I just want to be a normal person. Unless you have Facebook and Gmail, you won’t understand what people’s expectations are.” Marlinspike thinks the reign of the crypto-nerds has lasted long enough: “There are some people who’ve built an identity around learning how to use this crazy stuff. And if the basis of your identity is you have master skills that are difficult to use, you’re not incentivized to help other people to do it.”
Marlinspike wants to make encryption uncoolly mainstream, in the way Facebook became uncool when everyone’s parents got an account. That’s because Marlinspike said people will find that the classic dismissal of encryption — “I have nothing to hide” — isn’t actually true if the government or nefarious hackers target them. Who doesn’t have an email sitting in their inbox that would embarrass them to death should it be revealed? And Marlinspike wrote that with 27,000 pages of federal statutes and 10,000 administrative regulations, “You probably do have something to hide, you just don’t know it yet.”
With complete computer security now impossible, he said, the only way to ensure privacy is to make the hacked spoils useless — like a thief getting into a lockbox to find only lumps of coal.
“If you have an unencrypted email sitting somewhere, it will become public,” he stated plainly, with no hint of alarm. “That is the inevitable outcome.”
There’s no question Washington learned the value of better encryption during the presidential election, with the hacks of the Democratic National Committee and Hillary Clinton’s campaign chairman John Podesta. After the DNC learned it had been compromised, Clinton’s campaign staffers were instructed to use Signal for any communications regarding Trump. It was, Marlinspike said, a “good decision.”
Though digital security buffs have long preached about Tor and Signal at professional conferences, for many surveillance can seem like a distant threat, one that is logistically annoying and just a little tinfoil hatty to deal with. Certainly, some communities have been vigilant about the safety of their communications. Snowden and Laura Poitras were early Signal adopters and hearty endorsers. Civil rights activists also used the app early on, having dealt with government spying since at least COINTELPRO, continuing on to the FBI’s surveillance flights above Black Lives Matter protests.
“First of all, black communities been knowing that we had to code our communications,” said Malkia Cyril of the Oakland-based Center for Media Justice, who has used Signal to organize Black Lives Matter actions with other activists since 2014. “We been coding our communications since slavery. The knowledge of the need is not what’s missing for us. It’s the skills and the information.”
But for the ranks of the lazy, oblivious, or those privileged enough to assume they wouldn’t be targeted, there was always an excuse to put off switching to encrypted communication. “I’m not doing anything wrong so what do I care about the NSA,” as one documentary filmmaker put it to me recently, or, as summed up in a recent Google chat with a friend, “I’ve debated that… but it seems like so much work!”
Trump’s election has prompted many to reconsider their relationship to surveillance. The week after the election, Deanna Zandt, a Brooklyn-based digital strategist with hot-pink-and-purple hair, sent out an email to her clients advocating Signal. “Hey y’all, As the future looms so pale in front of us, a lot of us tech folks are gearing up for the long haul. I wanted to make sure that everyone feels like they have a handle on how to keep themselves secure online,” she wrote.
For the ranks of the lazy, oblivious, or those privileged enough to assume they wouldn’t be targeted, there was always an excuse
Zandt advises nonprofits like Race Forward, Women, Action & the Media, and Planned Parenthood, organizations that are not just worried about government surveillance, but doxxing from political opponents. “There’s a reason people volunteer to be clinic escorts in the offline world,” Zandt said. “We need the equivalent of that level of security in the online world as well.” That doesn’t mean everyone is psyched about the logistics of it — when Zandt stood before a Race Forward staff meeting in New York recently and instructed them to swap out their legions of saved passwords for more complicated ones, she was met with groans.
“When we talked about the revelations of Snowden, it was hard to convince some folks on the progressive side to care,” said Josh Levy, advocacy director of Access Now, a nonprofit that runs a digital security hotline to counsel activists around the world on how to avoid surveillance. “I would talk to folks about the vulnerability of their data and reliance on large commercial platforms, but because of their trust in the president” — meaning Obama — “they didn’t pay attention to it. I’d say, ‘What if Trump came to have access to that data?’”
Americans had been a minority of callers to Access Now’s digital security hotline. In the weeks post-election, Americans dominated it.
The nascent Trump opposition has also been organizing a slew of digital privacy crash courses. The human rights organization Security First said they’ve gotten 20 post-election requests from US media and community groups for digital security consulting, double the amount they normally get over an entire year. Cyril’s Center for Media Justice co-hosted a forum to discuss police surveillance in Washington, DC, and the Freedom of the Press Foundation — the fiscal sponsor of Open Whisper Systems — spoke at a standing-room-only Brooklyn cryptoparty in early December called “Communicating in Trump’s America.”
That event was a coming out of sorts for the newly crypto-woke, people like Phyllis, a 29-year-old GIF artist in Brooklyn, whose website portrait features her lying on candy in a gummy worm sweatshirt. Phyllis considers herself an “amateur activist.” She had always been interested in online privacy given that her laptop is her primary work tool, but (common story) had never actually gotten around to doing much about it. Jarred by Trump’s election, she’s helping organize a group of graphic designers to do pro bono work for civil rights causes. She came away from the cryptoparty with an action item for the graphic designers’ first meeting: everyone join a Signal group chat. “Friends doing activism stuff are more proactive about it,” Phyllis said. “My friends [who are] not as involved don’t see the need to do it. I texted a friend like are you meeting me at a certain time? And no response. So I gave up on logistical non-activism stuff.”
Some users say they’re constantly arbitraging texts — political info to Signal, banalities to other messengers. “If you’re talking about the Emmys, please get up off my Signal,” said Cyril of the Center for Media Justice. “I don’t need to encrypt a conversation with my wife about what groceries we need to buy.” But JP, a Bay Area documentary filmmaker, got his family on the app, and they’re already using it for not-so-sensitive intel (to wit: “We are gonna study at Brianna’s house but going to Trader Joe’s real quick for snacks.”)
Marlinspike said now is the perfect time for Signal to tip into default status in politically active circles. People don’t generally interrupt a protest to ask, “Wait a second, what app should we be using?” he said. “That’s why making this stuff the new normal and a successful part of everyday life is important.” So for Marlinspike, an arguably bigger moment for encryption than the election came when WhatsApp and Facebook Messenger’s “secret conversations” feature adopted his protocol, inching him closer to his goal: that Signal-style encryption becomes so universal that he works himself out of his job, Mary Poppins-style.
“If you’re talking about the Emmys, please get up off my Signal.”
While the makers of encrypted services say they’ve seen previous spikes in activity around current events, Trump’s rise seems different. Nathan Freitas, the Unitarian Universalist developer, directs an open-source organization called the Guardian Project, which created Orbot, an app that routes Android traffic through the Tor network. When a Brazilian judge blocked WhatsApp for a few days in May, downloads of Orbot shot up 24 percent in the first 24 hours. Another spike came after the Turkish prime minister briefly blocked Twitter during 2014 protests. Freitas said daily Orbot downloads in the United States are up 30 percent since the election, and holding steady. He’s also been fielding demands for more secure options on iOS — including an updated Tor browser and a way to route apps through Tor. And Freitas has witnessed other post-election changes: the activists collected in his phone contacts over 10 years have begun pinging him on Signal.
“We’ll see a time-bound spike for a day or a week and then things go back to normal. What’s different in this case,” Freitas said of Trump’s election, “is things don’t go back to normal.”
Like Signal, the Guardian Project is partially funded by the Open Technology Fund, a federally funded program that has, since 2012, channeled money to technologies that promote transparent societies, circumvent censorship, or skirt surveillance around the globe. “We just never anticipated it was going to become a domestic issue, because we were thinking about tyrannies,” a source close to the OTF said.
Marlinspike said “it would be a shame” if the new administration interfered in future OTF funding for encryption. The money for the fund is earmarked by Congress, so it’s subject to political tinkering and has fluctuated over its five-year life. Still, support for internet freedom has appeared in appropriations bills since 2012, fanned out over several departments. “Why would we [the US] stop supporting something that upsets Russia, Iran, and North Korea?” the source close to OTF said. “If those people hate this work, I would continue to do it.”
It’s unclear how the new administration’s actions could affect the new crypto-awakening. Increasingly aggressive surveillance policies could solidify the interest, prompting more people to make Signal their default messaging app. But there are several ways the administration could make things difficult for the project, and though Trump hasn’t yet said Signal by name, all signs point to him not liking the concept one bit.
Trump badgered Apple last year for refusing to decrypt the iPhone of one of the San Bernardino shooters at the request of the FBI: “Who do they think they are?” he sneered in an interview on Fox & Friends. Meanwhile, Marlinspike literally led a round of applause for Apple’s position onstage at a San Francisco cryptography conference, stating, “I actually think that law enforcement should be difficult. And, in fact, I think it actually should be possible to break the law.”
“I don’t like ISIS. Fuck those people, right? It’s not like I’m happy that they’re using Signal.”
Back at his Mission District headquarters, Marlinspike clarified his statement. He said he’s not agnostic about how people are using Signal. “I have opinions. I don’t like ISIS. Fuck those people, right? It’s not like I’m happy that they’re using Signal or something like that.” But, he said, bad actors will always find a way to encrypt their communications, while the average person will only do so if it’s easy.
Yet that criminal activity — or at least the suspicion thereof — will always provide the major argument against encryption. Signal was legally challenged for the first time last year, when a federal grand jury in Alexandria, Virginia, subpoenaed all Signal data, messages, and subscriber details associated with two phone numbers for an unspecified criminal investigation. One turned out not to be a Signal user. For the other, Signal disclosed the only info it logs: when the account was created, and the last time it connected to the service. Signal also lawyered up with ACLU attorneys and challenged the grand jury’s gag order forbidding it from publicizing the subpoena. Signal won, and wrote a blog post about the ordeal.
The new administration will have a range of policy carrots and sticks at its disposal. “Ultimately, it’s not going to be possible to stop encryption from being used by people,” said Electronic Frontier Foundation executive director Cindy Cohn, “because encryption is applied math, and it turns out you can’t ban math. But the government can do a lot to discourage use of the tools.” For instance, Cohn said, lawmakers could require a backdoor to messages for law enforcement, or the FBI could pressure companies into weakening security. Companies are “susceptible because maybe they want to be a government contractor, or cooperate with investors or business partners that insinuate that they cooperate with law enforcement,” Cohn said. “I don’t have a clear map of all the pressure points, but it’s not hard to think about the various ways this could happen.”
In December, a House working group released a report advocating a middle-ground approach to any future regulations, arguing encryption was necessary for national defense and to safeguard critical infrastructure, yet creates trouble for law enforcement that “cannot be ignored.”
As our interview wrapped up, Marlinspike hopped into the elevator to head out into the dark-too-early December afternoon at the corner of 16th and Mission. The street has hosted its share of both World Series victory riots and political marches. I’d read that Marlinspike used to attend political protests, and when I asked if he had attended an anti-Trump one, he just answered, “I don’t really want to talk about that,” and smiled. He questions why reporters label him an anarchist. Well, is that a misnomer? He paused for a while, then replied, “It’s complicated.” That falls under his personal beliefs, a topic he doesn’t want to talk much about. “I’m not at all used to being a public figure, you know what I mean? I’m a private person.”
“I’m less interested in asking people to stop surveilling us, than doing things so that they can’t surveil us.”
Marlinspike prefers to let his tech speak for itself: a tool with clear political implications and anti-authoritarian uses. That’s where he’s more useful anyway, he said: pushing code, not policy. “I’m happy to support people fighting that fight [for encryption], but simply people like Apple Computer or Facebook are probably better positioned to have those conversations than I am.” Marlinspike instead will force the issue by continuing to roll out his protocol. He likens Signal to the demonstrators at Standing Rock, who stopped the oil pipeline by physically blocking its construction. “I’m less interested in asking people to stop surveilling us, than doing things so that they can’t surveil us,” Marlinspike said. “That direct action is what I want to continue to focus on.”
Lately he’s gotten some experience sparring with governments trying to ban his work. When Signal was blocked in Egypt and the United Arab Emirates in December, Open Whisper fought back successfully with a workaround for its Android app. Reading the news, I texted him on Signal about whether he’d do the same if ever blocked in the United States.
His reply pings:
“Yeah we’d work to make it available.”