In a world where screens are ubiquitous, both in work hours and leisure time, it was inevitable that the way we read would change.
Digital content ranges from brief social media posts and video subtitles to longer-form material like e-books, reports and academic papers, which require more sustained focus and a higher level of comprehension.
While the basic process of looking at and interpreting the written word remains the same, reading on a screen differs from ‘static’ reading of paper-based material.
Macquarie University researchers Dr Sixin Liao, Dr Lili Yu, Dr Jan-Louis Kruger and Professor Erik Reichle recently published their research in Trends in Cognitive Sciences.
Professor Erik Reichle, from the School of Psychological Sciences, says in terms of human brain development, reading is a recent addition to our communication arsenal.
“We did not evolve to read, and the interplay of cognitive and visual systems that is needed for us to do so is incredibly complex,” he says.
“It requires rapid shifts of attention and higher-level linguistic representation in a series of separate mental processes all happening dynamically, and each taking 60-200 milliseconds.
“Our language developed as speech, so speaking and listening come naturally, but reading only emerged about 5500 years ago.
“We are not hard-wired to read, and it remains something we have to learn, and can only master by practising for 10 to 15 years.”
The screen inferiority effect
If reading is a recent innovation, then reading on screens is brand new, and it has brought further challenges for our brains.
A number of research studies strongly suggest that when we read text on a screen, we understand less than if we read the same text on paper, and this applies across languages and writing systems.
This is the screen inferiority effect, and one of its problems is that we are likely to come away with only the gist of what we’ve read but struggle to recall details.
Dr Lili Yu, from the School of Psychological Sciences, says a number of contributing factors may be at play, including the content of what we are reading.
“When people become immersed in a narrative, like a novel, then comprehension is less likely to be affected by reading on a screen,” she says.
“However, comprehension drops when we are using a screen to read information-dense text, like a textbook for study.
“The amount of time you have available also seems to be a factor, as when readers are put under pressure in studies to read something quickly, their comprehension drops for text on screen compared to paper.
“The effect is more pronounced for less skilled readers, and one study also suggested that reading on a screen can increase readers’ susceptibility to misinformation, as they don’t notice discrepancies in the content so easily.”
Just why this drop in comprehension happens is not well understood and requires further research.
Dr Yu says physical factors such as eye strain, brightness, comfort and fatigue may be adding to it, and habit and association could also have an effect.
With the advent of smartphones, we have come to associate screens with shorter, less serious content that encourages skimming. Trying to read something longer and with more complex language can result in a struggle to focus, particularly on small screens.
Screens also come with built-in distractions like frequent notifications, animated ads, pop-ups, auto‑playing video, and links to take us away to other stories, and all these are competing for our attention.
The only exception to the screen inferiority rule may be e-readers, which are designed to mirror the experience of reading a book as closely as possible.
Dr Yu says that screen inferiority may have implications for learning, both online and in classrooms when screens are used.
“We know people who are less skilled readers are most affected, which means the greatest impact is likely to be on the people who already need more help to succeed,” she says.
“Something we don’t yet understand is what effect it will have on children who are learning to read primarily on screens, and we are not likely to find out for another 10 to 15 years.”
Getting back to paper
As reading on screens is chipping away at our comprehension, some of us are also noticing that our ability to focus on printed material like books has fallen.
Concentration: Professor Erik Reichle, pictured above with research partner Dr Lili Yu, says multitasking affects our ability to focus.
Professor Reichle says constant multitasking has played a part in reducing the amount of time we can sustain our concentration.
“When we are watching TV or talking to someone, we are often using our phones to scroll social media or play a game at the same time,” he says.
“We aren’t giving either activity our full attention, but the content we’re seeing there is very short, it is engaging, and we’ve learned that it can give us frequent hits of dopamine.
“Books are static, there’s nothing moving or flashing, so it has become harder for them to keep our attention.”
And if we are hoping for a quick fix, the news is not promising.
The only way to get used to focusing on books again is to spend more time reading them, Professor Reichle says.
He recommends choosing a book you know will be of interest, sitting in a comfortable place with good lighting, and minimising other distractions like phones and TVs.
“Focus is a skill that you have to rebuild gradually, so don’t expect to get it back immediately,” he says.
“It’s going to take time, but it is worth the effort.”
Erik Reichle is a Professor of Psychology at the Macquarie University School of Psychological Sciences and the Macquarie University Centre for Reading.
Lili Yu is a Lecturer at the Macquarie University School of Psychological Sciences and the Macquarie University Centre for Reading.
Professor Reichle and Dr Yu have also released a new book, published by Cambridge University Press.