A comforting message from a friend blinks at the top right of your screen: A tweet coming in from someone who loves you, who wants to make sure you’re feeling okay, and that you’re taking care of yourself. It’s been a rough few weeks, after all — the election was just over a month ago; it felt longer and more grueling than previous elections, and it seems like many have not yet recovered from the shock. “here is a personal reminder for you,” it says, “remember to take time to take a few deep breaths please! ilu!!” You breathe deeply, thankful for the reminder from someone who is clearly concerned about your well-being. The prompt comes from @tinycarebot, a Twitter bot built right after the election by writer Jonny Sun, which now has over 44,000 followers. The bot is an impersonal rotation of self-care tweets that nudge you toward stress-relief and comfort, an antidote to everyone’s digital exhaustion from so much bad news.
“[The election] kind of rocked everyone’s world, and totally destabilized everything. I haven’t been able to find my grounding since then,” Sun recently told me in an interview. “It’s pretty horrible for the collective mental health of the internet.”
The bot, like so many online self-care tools, is intended to help you reconnect with yourself even if you can’t detach from the internet. Twitter has been especially anxiety-ridden for the last year, a perfect storm of election anguish, the deaths of beloved celebrities, and, frankly, a lot more of the usual racism and sexism. But the internet is, ultimately, too enticing to look away from, too necessary to our daily existence to not engage with. Ignoring it is no longer an option as the line between digital and tangible experiences is blurred, so instead of running away from it, people are creating pathways to serenity: Use a known evil for some marginal good.
Self-care, the idea that you should take time to find personal comfort and relaxation, has been around for a while, but online, it’s performed publicly and to varying degrees. There are bots like Sun’s, but also gifs that help you regulate your breathing, an interactive questionnaire to help you determine what you can do to help relieve anxiety or stress, and relaxing videos of substances being soothingly cut or poked or mixed. Digital self-care works in micro moments: 30 seconds with kinetic sand instead of a 90-minute Swedish massage, or a minute with a breathing gif instead of a 20-minute meditation. Oddly enough, these coping tools manage to create a kind of intimacy between user and creator, even though they don’t know each other at all. The viewer can let their guard down because they won’t be judged or criticized.
“A lot of self-care tips are about finding connection to your body and to the real world or the physical world,” Sun said. If the rest of the world is going to feel so bad, so unmanageable, so untouchable, why not get a little love from a machine?
A few weeks ago, a link to a barebones website, Click, started getting passed around online. It’s a site that records and recites your movements back to you, whether you hit a button, or how fast your cursor is moving, or whether you’re about to leave the page. Using that data, an impersonal male voice begins talking at you like you’re a living experiment, positing how smart you might be or what kind of user you are. “Unusual behavior. Hmm,” it says when you hop from tab to tab. “Subject is about to leave. That’s troubling. Don’t go. Don’t go!” (When I use it on Chrome, it calls me “mainstream.”) Machines aren’t supposed to notice us. So, while it’s eerie that the program functions by using data collection and online tracking, it’s also oddly comforting to have a machine talk to you like it knows you. “Looking at subject’s time zone, he probably should be at work,” it says. “Thank you for coming back. You seem so special, so beautiful and smart. Unlike any subject I’ve met before.”
But that might be one potent effect of digital self-care: It affects us where we spend the most time, and tricks us into depending on bots, videos, or gifs to elicit feelings from us as if they were real people. (This might also explain why so many people reply to @tinycarebot with “I love you too.”) “I designed it so that you would feel some level of empathy towards it; it creates some sort of empathetic response,” Sun said. “I think even though you know it’s a bot and even though you know it’s an automated response, for some reason, people still connect to it.”
Anna Lomanowska, a University of Toronto researcher on online intimacy and well-being, looks at how people use digital technologies to relive intimate experiences online. “I think people are realizing the vulnerability that comes with this constant connection. Whether it’s images or tweets coming through constantly bombarding your feed … we're so used to that way of connecting, it’s difficult to get away from it,” she said. “We then reach out to our communities: Well, what can we do to help each other get through this experience?”
On Tumblr, posts tagged with “self-care” produce an endless scroll of ways to check in with your mental health and develop coping skills. “Daily Reminder #2,” writes user bonbonpalace. The post simply says, “Drink plenty of water” surrounded by little pink and white hearts. There are also simply designed images that say things like, “Recovery Isn’t Linear” or photos of the kind of bath bombs that can help with relaxation. The internet, which is so often a place where harassment festers, is also where these communities take care of each other. These posts speak directly to an audience that needs comfort, from creators who know the feelings of anxiety, or loss, or stress, or depression all too well. “People are able to extend beyond their physical self,” Lomanowska said. “[They] feel in ways that are obviously, for some, very much effective in producing these positive responses.”
Digital self-care can also create pleasing physical sensations. Nothing feels as physically intimate (without any actual touching) as digital autonomous sensory meridian response, or ASMR. It’s an experience often characterized by tingling sensations in your brain or at the top of the spine, usually triggered by specific visual or auditory stimulants. Whispering, crinkling paper, and close personal attention all tend to trigger an ASMR response. A recent study found that personal attention has one of the highest response rates, and so there’s a robust list of personal care ASMR YouTube videos with millions of views. For instance, GentleWhispering’s “~Simple Pleasures~ ASMR Soft Spoken Personal Attention” video, which features 38 minutes of a woman waving her hands in front of a camera, playing with a hairbrush, waving steam from an oil diffuser into your face (your “face”), and talking gently to you like you’re really there with her, currently has 7.9 million views. “I sense that something could be troubling you,” she says two minutes in. The comments are filled with people saying yes, indeed, something has been troubling them — grades, family, work, mental health — but this video helped, even if only for a minute. “I’m not totally comfortable with how much I enjoyed this,” writes one. “Yup,” says another. “Always love me some quality 1 on 7.3 million time.”
In fact, there’s an entire industry for ASMR triggers on YouTube and Instagram — videos tagged with “ASMR” or “no talking” or “mouth sounds” or “ear brushing” or “one-on-one attention.” Some of the videos are strictly business, with a woman (usually) stroking or brushing or petting another woman (usually) without talking. Other times, they’re eerie facsimiles of human interaction: A breathy visit with an ear doctor who wants to make sure you’re healthy, a makeup artist standing two inches from your face and telling you your skin looks great, a librarian, a barber, an expert in migraine treatment — all routines that involve the creator pretending to touch you, your hair, your neck, or an inanimate object with incredible care.
“Those behaviors trigger your brain to tell you that the other person is safe, and it’s a kind of hardwiring we have, probably since infancy, that when someone talks to you in a soft, gentle way, with slow hand movements, and they’re grooming you, they’re caring for you, [it’s] a safe, comforting signal,” said Craig Richard, a professor at Shenandoah University who researches ASMR stimuli and responses.
Char, an ASMR YouTuber in the Bay Area, started making personal attention ASMR videos about three years ago. Despite the closeness she seems to build with her viewership, and the empathy she displays on film, she asked that we not publish her last name or her age in case anyone finds her physical location. (Internet sleuths in her comments usually try to figure out as much about her as they can.) In her day-to-day life, she works in the health care industry, so digital care seemed like an obvious next step. For her, the videos promote a kind of mental health well-being that adults rarely get once they become, well, adults. There’s still a significantly greater demand for mental health services than there is available supply, leaving people to fend for themselves.
“It’s like making a family on the internet, making everyone feel connected,” Char said. The comments on Char’s videos show how invested people are in her — and the videos she makes: People write specific requests (“Please make a Halloween inspired ASMR!”). They also thank her for helping them fall asleep after bouts of insomnia and for the tingly sensations they thought they could no longer experience. There are also plenty of comments about Char’s appearance — her makeup, how beautiful she is, how much they wish they knew someone like her in real life. It’s intimate, if one-sided.
A lot of personal attention ASMR videos also tackle traumatic events, taking away some of the stress. “Some people have said they tried to end their lives [but] you just say something and it hits them, it changes things around for them,” Char said. Vulnerability, in this community, breeds comfort. In her video, “Doctor Exams Post Accident Roleplay,” Char simulates calming you down immediately after a car accident — when you’re still in the car and in the middle of the road — and goes through the motions of checking your body during a physical. Videos that soothe triggers also come in the form of checkup ASMR videos, or dentist ASMR videos, or even gynecologist videos: If you usually feel stressed going to a physician, there is absolutely a 40-minute video just for you. Online self-care is judgment-free: It doesn’t care about the source of your anxiety, or even how it’s manifesting, and it certainly won’t criticize you for what you need. You can let a machine soothe you without having to risk telling another human being what would bring you comfort.
Natasha, a 17-year-old living in Quebec, practices an offshoot of traditional ASMR videos on Instagram, where she films herself methodically cutting, squishing, and folding a type of slime. She also sells the product — a glue-based, but non-sticky, viscous dough in different colors and textures — on her Etsy page. She asked me not to use her last name because, “I feel like some of my family or friends or just general people I’m around wouldn’t understand it and would kind of think that I’m weird.” And fair enough, because Natasha’s account, slime.bun, is mostly her stabbing into slime balls with her stiletto nails, reformatting it, and mixing it with floam (essentially styrofoam and borax). It’s hard to explain what’s so comforting and satisfying about watching (or doing) what she does. You either get it or you don’t.
Predictably, there is … not a lot of research on why some people like watching videos of disembodied hands poking slime or cutting kinetic sand, but there are a surprising number of accounts that do it on Instagram. Natasha’s slime account alone has 42,700 followers, with her videos’ views usually hitting five figures. “All these satisfying videos, they’re all pretty hands on,” Natasha said. “When I’m watching slime videos or things like that, I like to imagine myself poking it. It could be related to us being so virtual so this is kind of an escape from that.”
All of these self-care blips feel intimate, but it’s intimacy with an asterisk, intimacy with no real risks. Like most of the things we find on Instagram or Twitter, it looks like real life, it feels like real life, but there’s something missing. Risk is inherent to intimacy: the risk of touch, of emotional openness, of rejection. This intimacy, though it feels good, is ultimately a bit artificial. We’re looking for ways to be taken care of without having to be reciprocal, without having to share parts of ourselves, and without the taxing complications of face-to-face interaction.
Digital self-care doesn’t require intimacy as much as it does trust. Trust in what is, ultimately, not a person caring for you, but the idea that your needs will always be centered, without judgment or any necessary reciprocation. It’s always going to be about you. And nothing is required of you besides a few moments of your time. Those creating self-care content are, of course, invested in healing communities. But users are still anonymous strangers on the other side of a computer screen, while the creators are simply people to project your needs onto.
“There’s a lot of what you project onto the bot and what you are telling yourself that you need,” Sun said. “If I need this thing, I can project that entire persona and character that I need in my head onto an automatic bot. You can anthropomorphize anything.”
Ultimately, though, there is a real person out there in the world making this stuff for you so you can feel better. It might be for profit — ASMR videos sometimes play ads and Instagram users can link their accounts to an online store, which some slime and floam accounts have — but more often than not, it means there are multiple communities online trying to make you feel a little bit better. (Natasha said she sometimes retakes entire videos because a sudden sound might upset one of her viewers.) “There’s something very intriguing and something very human about reaching out and connecting,” Lomanowska said. But, unlike a typical connection, you don’t have the sometimes burdensome responsibility of actually responding. “You can be completely passive.” And maybe people need that.
Being kind to ourselves is increasingly hard to do the busier we get, the more claustrophobic the internet becomes, and the more we rely on it. First we have to convince ourselves we’re worth it, that taking a break from work won’t get us fired, that we can’t comfort others until we, ourselves, get a piece of that comfort, too. It’s like that rule on airplanes that says you should always put on your own oxygen mask first before assisting others. We have to help ourselves before we can truly help others.
But then there’s always the complicated matter of how we go about comforting ourselves. What makes us feel safe? Is it someone talking us down? Is it the feeling of long nails running through our hair? Or maybe it’s the memory of our mother rolling out dough on the countertop, a mundane act done in near silence in the morning. Then there’s the complicated matter of finding something that will replicate the intimacy we’re looking for, something disembodied enough to make it easy for us to feel vulnerable but tangible enough that we can connect through a screen. Whatever it is, it’s the thing we spend most of our adult lives looking for: What’s the thing that’s going to make me feel a little bit better?