
The Neuroscience Behind How Digital Technology Affects Our Brain, or Maybe Just My Brain

  • Laura Bailey
  • Apr 14, 2020
  • 6 min read

Updated: Jun 28, 2020

This year, for the first time ever, I purchased a SIM card while travelling. I'd never done this before, mostly because travelling for me was always a way of escaping the familiar and the mind clutter of messages and notifications; to just generally be off grid. This time, however, I saw it as a tool that would make getting from A to B immensely more efficient. And it did, but it also became more than that, and for the first time I realised that something was subtly different about me. I couldn't help but suspect my smartphone was to blame.


Digital technology has become more than just a convenient tool; it is such an extension of ourselves that it is now being studied by the same scientists who studied the psychological effects of bionic limbs! In the UK, on average, people spend 8 hours and 41 minutes on media devices per day. For Americans, it's 10 hours and 39 minutes just looking at screens. Whether or not we consider ourselves technophobes, we have to acknowledge there has been an indelible change, one that will not only require our brains to adapt but also demand an investigation into whether all these devices could be harmful.


A long history of techno-phobia


Back in 1936, the music magazine Gramophone described how radio programmes were disturbing the balance of young people's excitable minds. Go a little further back, to the 18th century, and you'll find newspapers being accused of socially isolating their readers. Way, way further back, in 1565, the Swiss scientist Conrad Gessner was probably the first to raise the alarm about the overwhelming flood of information from the printing press. Even Socrates said there was something dangerous about children writing down information instead of retaining it in memory.


So it would seem this 21st-century fear of technology is as old as information itself, merely with a different face. We certainly got through our previous ambivalent relationships with technology, so why should this be any different? And what has it got to do with me buying a SIM card?


The fact is, media is a part of human life, and as civilisation progresses we lap up any and all new ways to consume more of it. What studies are showing now, though, is that it's not the information that's the problem; it's the lack of consent we give over what we consume. Instead of voluntarily regulating our news content, our attention is being grabbed; we are being distracted.


The Attention Deficit


Richard Davidson, a leading neuroscientist and director of the Center for Healthy Minds, says that devices are causing a global attention deficit through their ongoing power to distract. Yes, technology increases our ability to multi-task and reach information more quickly, but Davidson points out that the brain becomes less able to delve deep into the content we interact with. The result is not so much an inability to pay attention as an inability to pay attention to one thing for a long time. Instead we are pulled by the reins of our boredom threshold, in one direction and then the next, like a boat in a choppy current.


My friends will tell you that I've always viewed the online social world as a grim place where fakers and fablers go to frolic, and that "I just don't think I gel with the modern way of being constantly connected". However, there was a moment on a local bus in Sri Lanka when I realised I was in fact now constantly connected. Squished up against the window during a four-hour drive, I listened to music while checking my progress on Google Maps, posting on Instagram, replying to friends, looking up accommodation options and reading blogs on the most unique travel experiences in Sri Lanka.


I would have managed without the SIM card, but because I bought it, I couldn't resist the temptation to nibble at little news bites every now and then. My consumption of devices, I noticed, had definitely become a remedy for boredom, yet one that led to more instances of boredom every day.


What the Neuroscientists Say: the Negative Associations


But is there really anything wrong with this? To see whether there is a relationship between increasing technology use and cognitive performance, I decided to look at the empirical research done by neuroscientists, specifically on the smartphone.


It turns out that for all the media attention and bad rap the smartphone gets, the scientific literature is still in its infancy. Given that smartphones have only been in circulation for a short time, data on their long-term effects doesn’t really exist yet. Admittedly, I’ve struggled to discern whether I should be concerned for the future of brain health itself, or if this is just a personal issue I should discuss with my therapist.


However, the studies do generally seem to conclude that there is a negative correlation between using multiple media platforms simultaneously and working memory capacity.


Eyal Ophir and colleagues conducted a study which found that people who multi-tasked more with media were less able to filter out distractions during computer-based tasks. They also showed a tendency towards what the researchers termed bottom-up processing, meaning they were more inclined to let the environment dictate where their thoughts went, as opposed to exercising internal control over those thoughts.


Any neuroscience aficionado can tell you that the left prefrontal cortex is more activated when we engage in top-down processing, the industry term being "cognitive control", while the right is more activated in individuals who have less control over their attention. Those who multi-tasked more with media devices also showed increased activity in the right prefrontal cortex. While the studies are correlational and small in number, this is still pretty compelling evidence.


Maybe it’s Not So Bad After all


Let's hold our horses, though. These studies use a questionnaire to measure people's media multi-tasking habits. Sometimes a questionnaire is all we have, but that by no means guarantees accuracy, especially when this particular questionnaire conflates 'playing video games' with 'listening to music', activities which obviously demand different levels of attention and memory.


And don't forget, correlation doesn't mean causation. There are also studies showing positive associations between video games and selective attention, specifically "action video games". Who would have thought: Mom was wrong all along, and we did tell her! Yes, action video games require what are known as higher-order metacognitive functions; learning to learn. Candy Crush, however? Well, they still have some more studies to do on that one.


So it would seem a growing body of evidence suggests that smartphones result in lower cognitive control. But are these selective correlational studies just another kind of confirmation bias? Perhaps I'm inclined to believe them because of my love for Richard Davidson, who also happens to be best friends with the Dalai Lama.


My Educated Opinion on the Matter


I think it is important always to question sensationalist headlines. There is not enough evidence to prove that my smartphone was causing my increased instances of boredom and waning focus, but the existing evidence is definitely hard to ignore.


What I will say is that I miss the days of being off-grid for longer than a couple of days; of being more present in the natural world. Some of us definitely reap the intellectual and creative benefits of smartphones rather than using them as distraction devices. But my knowledge of the brain and its evolution tells me this is not what we are wired for.


Just as the human body did not evolve with such abundant access to sugar, attention too was once a precious, scarce resource. And much as eating lots of sugar makes us fat, stuffing ourselves with instant-gratification distractions makes our minds chubby with information and dilutes our focus. The problem is that success as a human really relies on delayed gratification and long-term focus. Our ability to do this doesn't come naturally; it is trained through socialisation and controlled by the prefrontal cortex, which scientists now know doesn't fully develop until we are 25 years of age. It is, in other words, a muscle that must be trained.


So there you go. It would seem, intuitively to me, that doing things that make us instantly happy all the time just isn't good for us. We need to exercise some sort of restraint; could this be what it means to be human? I'm probably over-generalising, as per usual.


But perhaps the real question is: is it still possible to limit my use of digital technology and still connect with all my friends and family, who are using these devices for 8 hours and 41 minutes a day? It's a decision I make daily. Perhaps I just won't get a SIM card next time I go travelling.




© Laura Alexandra   A Writer's Blog
