In an age when so much of the internet feels bad, Pinterest has carved out a niche as the place you come to feel good. So when the company noticed Pinterest users searching for content related to “self-harm”—not a ton, but enough to catch someone’s attention—its first move was to filter what would show up on the site.
Pinterest had already done this kind of moderation work on terms like “anti-vaccination,” using a combination of humans and machine learning to clean up the search results. Now, after training its algorithms to recognize content that promotes self-harm, the company says reports of those pins are down 88 percent. And when someone does flag a pin that violates its community guidelines, which ban content that encourages suicide or self-injury, the company now removes it three times faster than before.
Removing bad content was only part of the problem. The fact remained that people—often overwhelmed and in pain—were coming to Pinterest to look for things related to self-harm. “Lots and lots of people are using it for that,” says Evan Sharp, Pinterest’s cofounder and chief of design. “So, how do we hold up our responsibility to do our best to help them?”
The company launched an initiative over the summer called “compassionate search,” which aimed to help Pinterest users combat stress and anxiety by channeling negative emotions into exercises like guided breathing or journaling. Now the company has built another series of exercises specifically geared toward the emotional turbulence of self-injury. Developed in collaboration with a group of mental health organizations, the exercises take techniques from dialectical behavior therapy—a type of psychotherapy used to treat mood disorders, self-harm, and suicidal ideation—and reimagine them for the smartphone.
When a pinner enters a related search term, the site will surface a prompt for these exercises. (It also displays the number for the National Suicide Prevention Lifeline, one of Pinterest’s partners.) Pinterest does not store data about who uses these exercises, and participation will never affect advertisements or pin recommendations on the platform.
“We do in fact have evidence-based treatments, based in dialectical behavioral therapy, that work well,” says Nina Vasan, a clinical assistant professor at Stanford’s School of Medicine and the founder and director at Brainstorm, Stanford’s lab for mental health innovation, which is one of the groups that worked with Pinterest. “The problem is that people don’t know about them or don’t have access to the treatments, so as physicians we feel like it’s urgent for us to think about creative ways of educating people about and increasing access to the treatments that we know work.”
One of the new Pinterest exercises, called Redirect Your Energy, offers guided practices in journaling, drawing, scribbling, or making a playlist as ways of releasing intense emotions. Another, called Cool Down, instructs someone to hold an ice cube in their hand or the crook of their arm and then focus on it as it melts.
These strategies borrow concepts from dialectical behavior therapy, like self-soothing or distracting when difficult emotions come up. “There are generally two ways people feel that leads to them having the urge to self-harm—either they have so much emotion that it is overwhelming or it’s the opposite, they feel nothing or numb and want to feel something,” Vasan says. Watching a melting ice cube might seem simplistic, but Vasan says stuff like this is proven to work. It’s immediate, and it can serve as either a distraction or a way to feel a strong sensation, which “goes directly to the core of the emotional experience.”
Pinterest has been mindful to let organizations like Vasan’s design the experiences around compassionate search, combining the platform’s user data with evidence-backed research around what works for improving emotional outcomes. “I’m pretty shocked by how long it’s taken us to get here,” says Sharp, “let alone other companies that have the same user behavior.”
Searches arising from depression, anxiety, and other mental health concerns have always been a part of Pinterest, as on other parts of the internet. (Pinterest declined to share specific numbers on how many people search for “self-harm,” but noted that it is not a new phenomenon.) Rates of documented self-harm are on the rise in kids and young adults, Vasan says, although what role, if any, online platforms themselves play is still up for debate. What is clear is that people who plan to self-harm have some relationship to the internet—whether it’s Googling the term, posting about their feelings, or otherwise looking for help. “There’s an opportunity to meet those pinners where they are,” says Sharp.
Other major platforms have also designed experiences to help users in crisis or redirect them toward reputable mental health organizations. On Facebook, searches for “depression” lead users to a landing page that offers the phone numbers for several crisis hotlines, along with a series of self-care exercises. Because this is Facebook, there’s also a recommendation to reach out to a trusted Facebook friend, with an automated message written by the company: “Hi, I’m going through something difficult and was hoping to talk with you about it. If that’s OK with you, please message me back.” Instagram points users toward a similar page for hashtags like #depression and #suicide. If you search for “depression” on Google, the search returns a box that defines the condition, along with an optional diagnostic survey to check if you are clinically depressed.
Sharp says that Pinterest considered making a similar tool, which would leverage the industry-standard diagnostics, but the company decided against it. “For Pinterest, maybe more important than ‘am I depressed or not’ is what clinical practices are proven to be effective for people who may be depressed,” he says. “Those are beneficial whether you’re clinically depressed or not. I go to Google to find information. I go to Pinterest to find inspiration.”
That can sound like a lot of fluff, especially for a technology platform that went public this year and is trying to reverse a falling stock price. But Sharp, who speaks openly about his experiences in therapy, is one of the few founders in Silicon Valley who come across as genuine and earnest in wanting to make people happier with technology. He sees Pinterest less as a place people go to do things and more as a place people go to feel things. If a platform can cause negative emotional outcomes, he says, then it can also cause positive ones. It’s just a matter of what company leaders choose to create. At Pinterest, Sharp calls it “compassionate design.”
Across the Valley, other platforms are beginning to acknowledge the way they make their users feel. Last week, Instagram announced that it would make “like” counts private, a decision the company made to reduce “social comparison” and improve the emotional well-being of its users. Sharp points out that Pinterest removed its Like button almost five years ago and recently hid its follower counts, “because it felt like the social sharing thing was leading to the wrong emotional outcomes for users.”
Measuring “emotional outcomes” can be difficult, and Sharp thinks that has made it harder for platforms to integrate them into product road maps. It’s easy to quantify engagement or user growth; not so easy to quantify how happy or sad your users feel after spending time on your platform. But Sharp talks about Pinterest as an emotional experience and says the company has a duty to try to make those experiences positive.
“Part of our business is helping people work through emotional situations,” says Sharp. “We’re not going to sit down with them and give them one-on-one therapy, but we can do small things to hopefully help them redirect their energy. A lot of what we’re doing today is actually just us living up to our responsibility.”
If you or someone you know needs help, call 1-800-273-8255 for free, 24-hour support from the National Suicide Prevention Lifeline. You can also text HOME to 741-741 for the Crisis Text Line. Outside the US, visit the International Association for Suicide Prevention for crisis centers around the world.