We've Got the Screen Time Debate All Wrong. Let's Fix It

The narrative around tech addiction has been driven more by fear than facts. But that's finally starting to change.

In 1995, New York City psychiatrist Ivan Goldberg logged onto PsyCom.net, then a popular message board for shrinks, to describe a new disease he called "internet addiction disorder," symptoms of which, he wrote, included giving up important social activities because of internet use and "voluntary or involuntary typing movements of the fingers."

It was supposed to be a joke.

But to his surprise, many of his colleagues took him seriously. Their response led him to create an online support group for internet addicts—though he quickly downgraded the affliction, renaming it "pathological internet-use disorder." The word addiction "makes it sound as if one were dealing with heroin, a truly addicting substance," Goldberg told the New Yorker in 1997. "To medicalize every behavior by putting it into psychiatric nomenclature is ridiculous."

Today, more than two decades after Goldberg's joke fell flat, mental health professionals find themselves in a similar bind. Public anxiety over the side effects of screen time—the hours we spend staring at our various devices—is the highest it's been in years. That anxiety has manifested in the form of self-help books, social movements, major media outlets foretelling "the worst mental-health crisis in decades," and no shortage of guilt. (You let your kid play with an iPad at restaurants? You spent 30 minutes browsing Instagram when you could have been exercising? Or playing board games with your family? Or learning a second language? You sad/selfish/lonely monster!) And yet, there exists little clear evidence that we are locked in an unambiguously harmful relationship with our devices—let alone addicted to them in any clinical sense. "For the past twelve months, the narrative surrounding technology use and screen time has been consistently negative, but it's been driven more by fear than facts," says UC Irvine psychologist Candice Odgers.

Experts like Odgers say we'll never get good answers about the effects of screen time unless we start asking better questions. And that means being honest with ourselves about what we mean by "screen time" in the first place.

This year, the conversation around digital dependence entered a new phase when Facebook CEO Mark Zuckerberg resolved to spend 2018 fixing Facebook, vowing, among other things, to ensure that time spent on the social network would be "time well spent." (Zuckerberg borrowed the phrase from former Google design ethicist Tristan Harris, who has popularized the term in recent years by characterizing it as the opposite of time surrendered involuntarily to devices, apps, and algorithms designed to "hijack our minds.") A few days after Zuck's post went public, major Apple shareholders urged the company to study its products' effects on children and equip parents with better tools for managing their kids' screen time. The following month, Harris formed the Center for Humane Technology—an alliance of tech-giant turncoats united in opposition to the attention-grabbing products they helped create.

These events helped set the tone of the year to come. "For right or wrong, big tech companies have seen which way the wind is blowing and responded," says Andrew Przybylski, an experimental psychologist at the Oxford Internet Institute. Google led the charge, pledging its commitment to digital well-being and releasing new tools to help Android users monitor their tech habits. Apple followed suit, unveiling features for understanding and managing the time users spend on their iOS devices. Then came Facebook and Instagram, each of which released features that let users track, and set limits on, the time they spend in-app.

None of those companies has shared whether its tools have been effective. It's possible they never will release any data—and even if they do, researchers say, the numbers may be hard to take at face value. "There wasn't a good empirical basis for their creation in the first place, so there probably won't be good evidence for their effectiveness," Przybylski says.

Look what happened earlier this year, when Congress asked the National Institutes of Health what science had to say about tech's addictive potential and the effects of screen time on kids. NIH director Francis Collins' response amounted to a big shrug: What limited research exists has been inconclusive, he wrote. Tech addiction? Scientists don't even agree on how to define it, let alone measure it. As for screen time's impact on developing minds, Collins said researchers are still gathering evidence on how best to balance technology's "obvious benefits" with its "potential harms."

A highly publicized study helps make Collins' point. A year ago, researchers reported an association between screen time and depression rates in young girls. But the effect size the researchers found was tiny, and the relationship was correlational, which made it impossible to say whether more screen time was leading to higher rates of depression in girls or vice versa.

Media outlets covered the results with an alarmist tone and an uneven hand, failing to convey the nuance of the findings. "Less screen time is the secret to teen happiness, new research shows," read the Washington Post. USA Today wrote: "Screen time increases teen depression, thoughts of suicide, research suggests."

News outlets responded similarly in early December, when 60 Minutes reported on preliminary results from the Adolescent Brain Cognitive Development (ABCD) Study, a large, long-term study of child development that is examining, among other things, the effects of different tech habits. Those unpublished results showed that screen time was associated with structural differences in kids' brains—findings that led the Boston Globe to describe the ABCD Study as "the latest in a series of studies that may make parents wish they could crowbar that pesky smartphone and other devices out of their children’s hands."

But the observation that an activity changes the structure or function of an adolescent's gray matter is the scientific equivalent of observing that water is wet. Many childhood activities alter the brain; what matters are the downstream effects of those alterations.

Those effects are notoriously difficult to untangle from the many, many activities that shape a developing mind, literally and figuratively. "It's a very complicated question, so people often oversimplify this kind of research," says neurobiologist Gaya Dowling, who directs the ABCD Study at the NIH. "Like the cortical thinning I mentioned on 60 Minutes: We don't know if it's good or bad—we just know that it is. That's one message that got lost in recent coverage of our study: We're seeing these associations, but we don't yet know what they mean."

Not that researchers can't or shouldn't do the hard work of untangling those associations. The ABCD Study will spend the next decade doing exactly that, and plenty of other researchers are trying as well. Odgers, whose lab at UC Irvine studies how technology affects adolescents' physical, emotional, and cognitive development, has helped shed light on the potential benefits of screen time. "The assumption for a lot of people panicking is: Oh my god, these phones or devices are basically causing depression or anxiety. But if you go in and talk to kids, a lot of them are turning to the internet for social support, information about symptoms, and reported feeling better about themselves when they were online than when they were off. They actually were going online to feel better."

Przybylski, at the Oxford Internet Institute, has made similar observations. In a study examining the digital habits and mental health of more than 120,000 kids, he found that a few hours of device use every day was actually associated with better well-being than none at all. The negative associations didn't crop up until kids were spending six hours or more on their devices per day—and even then, they were small and correlational.

If there's one thing that gets lost most consistently in the conversation over alluring technology, it's that our devices contain multitudes. Time spent playing Fortnite ≠ time spent socializing on Snapchat ≠ time spent responding to your colleague's Slack messages. "The time spent on digital devices is not monolithic," Dowling says. "That might seem obvious, but people tend to lump them all together." That's why researchers participating in the ABCD Study make sure to differentiate between video games, social media, video chatting, and other forms of screen time. "They're still large buckets," Dowling says, "but we're starting to get some granularity there."

The operative word there is "starting." Because that granularity exists not only between apps but within them: The time someone spends actively watching YouTube videos from Khan Academy is different from the time they spend passively consuming the platform's algorithmically generated nexus of conspiracy theories or disturbing kids' videos. That's a subtlety researchers are only just beginning to explore, but Odgers, Dowling, and Przybylski all agree that studying these intra-app differences will be essential to understanding not only the full scope of screen time's impact but also when and whether our relationships with our devices warrant actual concern.

