Does Screen Time Stunt Development?

While some tech enthusiasts believe educational apps can help toddlers hit fine motor milestones earlier, researchers say otherwise.

Written by Richard E. Cytowic
Published on Oct. 1, 2024

Dr. Karmiloff-Smith’s Birkbeck Centre for Brain and Cognitive Development in London published a survey one year after its endorsement of early iPad use, looking for any association (not causation) between touchscreen use and developmental milestones in 366 toddlers aged 19 to 36 months.

It found that “earlier touchscreen use, specifically scrolling the screen, was associated with earlier fine motor achievement (e.g., stacking blocks).” But “no significant relationships were found between touchscreen use and either gross motor (walking) or language milestones (producing two-word utterances).”

The conclusions were largely circular: a child who scrolls a screen early on gets better at performing similar actions later. A subsequent study from South Korea likewise found no evidence that tablets facilitate earlier language acquisition.

How Much Screen Time Do Kids Get Per Day?

  • Children 8 to 10 years old average 6 hours per day.
  • Children 11 to 14 years old average 9 hours per day.
  • Children 15 to 18 years old average 7.5 hours per day.


Do Skills From Apps Transfer to Real Life?

After reading my essay, “Your Brain on Screens,” in The American Interest, a concerned father wrote to say he was about to move to Europe to work for “an educational company” that specialized in preschool apps. During the months before his scheduled move, he had let his young daughter “play these ‘educational’ games from the company.”

Well, let me just say that the only thing these apps did for her was get her better at playing the apps. Not only was there no “carry over” of skills to other parts of her life, as they said there would be, but we observed over weeks a visible degradation in her attention span, as well as a slowing down of her fine and gross motor development. I tried at first to chalk this up to something else. As time went on, this became harder to do. These things are worse than entertainment, much worse. I cannot in good conscience go to work for them now. In fact, I have been spreading the news about your findings to others I know, in the hope that they will make the right choices for their children and their students.

What babies learn to recognize best during their early visual apprenticeship with the world are facial expressions. Without effort they distinguish one look from another, understand what different looks mean and judge smiles of different intensities.

They can differentiate surprise, anger, happiness, sadness, disapproval and fear — an aptitude that watching screens can never instill. Babies learn to read others long before they can speak, making the capacity a key part of emotional intelligence. The ability to read faces is necessary for human empathy and shared subjectivity.

What’s Wrong With Young Kids Using iPads?

Up until now we have been overly optimistic about how transformative iPads would be and have not given enough thought to the price such technology would exact. Newborns effortlessly adapt to all the unfamiliar newness around them.

They have never seen, heard, tasted or felt anything like their new multisensory world — so different from the oceanic oneness felt inside their mother’s womb. Rather than empty vessels waiting passively to be filled, babies are active explorers of their surroundings, other people and what their own bodies are capable of.

Babies take in, organize and interpret the energy flux coming from multiple sources, including proprioception (the awareness of the body’s limb and core position, movements and equilibrium) as they learn to distinguish inner sensations from outer ones.

Through repetition they learn the links between perception and action, which thoughts produce the movements they intend and which looks, expressions and vocalizations bring about a desired result. Infants are superbly motivated to explore, soak up the environment, attend to this and that and, most of all, engage socially.

The retina’s fovea is the matchhead-sized spot responsible for sharp 20/20 vision, and it does not fully mature until age four, which is reason enough not to occlude the visual field with an iPad during this critical window of growth.

A newborn’s visual acuity typically measures 20/400. Acuity sharpens in the first months after birth. Color vision becomes functional around four to six months of age, as do other networks for decoding the perceptual complexity of different kinds of movement. Despite their underdeveloped vision, newborns are astute social creatures: they lock onto an adult’s dilated pupils, a common sign of interest and pleasure, and smile in response.

Visual attention similarly takes time to develop. The requisite eye muscles and brainstem circuits are functional at birth, whereas learning how to attend to and track an object takes practice. Smooth tracking steadies itself around two months. Command of the quick, jerky saccades that shift the visual frame from one point to another, as they do in reading or looking around, is a different skill that takes longer to mature.

Object recognition requires visual attention — learning to understand edges, boundaries, textures and partly occluded objects; learning to calculate shape, size, relative movement, velocity and distance from the way edges intersect; and learning that objects maintain a constant shape and size despite changes in distance and the viewing angle.

It is an enormous undertaking; no wonder babies sleep so much.

While they sleep, pertinent synapses and network circuits are being laid down and consolidated so that these skills gradually become automatic. A curious child learns these things naturally.

Sticking an iPad in the way impedes this process, as well as the evolutionarily ancient neuronal development that leads to a child’s attachment to fellow humans.


Be Careful of the Tech You Introduce to Your Kids

With respect to hearing, infants are partial to high-frequency sounds. The ability to discriminate low-frequency sounds does not reach adult levels until the age of 10. Infants are also more sensitive to frequency differences (changes in pitch) than to differences in intensity (loudness), and because locating a sound in three-dimensional space depends partly on detecting small loudness differences between the two ears, pinpointing where a sound comes from is difficult for them.

Infants also have a hard time separating speech from other sounds even though they are universally sensitive to the individual sound units (phonemes) of all languages. They must surmount the difficulty of categorizing sounds into meaningful groups, a tough cognitive task that is yet another reason to spare them the interference from screen media.


A synesthetic cross-sensory connection occurs naturally during the first year of life, especially around the time speech first emerges. That is, all infants see speech as well as hear it. Normally, sight, sound and movement are tightly coupled, which is why even bad ventriloquists can convince us that the dummy is talking. Cinema likewise persuades us that dialogue comes from the mouths on screen rather than the surrounding speakers.

Without realizing it, we all lip-read, too, and the noisier the environment, the more we are forced to look at the speaker’s face to see what they are saying. Cross-sensory couplings are so automatic and compelling that even two-year-olds succumb to their effects. They fall for the McGurk illusion, in which a listener hears a different sound from the one the speaker actually makes.

For example, listeners hear the sound of “ba” as “da” when looking at the lip movements associated with saying “ga.” (You can find illustrations of the McGurk effect on YouTube.) Visual lip movements and vocal sounds influence one another even before sound and visual perceptions become assigned to a specific phoneme or word category.

While we cannot get inside an infant’s mind, we can draw on more than a century of research about how perception and thought develop in order to prudently weigh the eager but evidence-free claims of overenthusiastic tech advocates.

This extract is from Your Stone Age Brain in the Screen Age: Coping with Digital Distraction and Sensory Overload by Richard E. Cytowic, published on October 1, 2024. Reproduced with permission from the MIT Press.
