It's widely accepted that the internet changes our brains, but whether that's a bad thing is up for debate
Highlighting that the internet makes us lazy has, ironically, become quite a lazy statement, but there's a real point to it. Why remember facts and figures when Google is always in your pocket? Why learn the layout of London when satellite navigation systems can do the heavy lifting for us? Worst of all, why even bother taking part in a pub quiz when the fun will likely be ruined by someone looking up the answers?
More damaging is the temptation to outsource our thinking to the internet. It's easy to understand why this seems appealing: there's an enormous collective intellect out there waiting to be tapped (albeit with a lot of detritus to wade through). But the true extent of this laziness only became apparent this month with a study from the University of Waterloo, which found that participants had a small but significant urge to doubt their own knowledge and confirm facts on the internet when given the opportunity to double-check.
Wanting to fact-check something before you make a fool of yourself is one thing, but there's also some evidence to suggest that we're just less likely to bother making the effort to remember stuff if we know it's all stored for us elsewhere, in the cloud or on our devices. This isn't a conscious choice; on some level, our brains just don't bother committing things to memory in the same way. The security firm Kaspersky did some research into this phenomenon and dubbed the whole thing “digital amnesia”. The theory is that our memories are weakened because we fail to reinforce them, simply looking things up as we need them. As Dr Maria Wimber from the University of Birmingham told the BBC: “Our brain appears to strengthen a memory each time we recall it, and at the same time forget irrelevant memories that are distracting us. In contrast, passively repeating information, such as repeatedly looking it up on the internet, does not create a solid, lasting memory trace in the same way.”
There are other, less mechanical and more inherently optimistic theories for this, though. A 2011 University of Wisconsin study found that participants asked to type 40 facts were more likely to remember the pieces of trivia when told that the document would be deleted at the end of the test. In other words, the brain is actually optimising itself by outsourcing memories, rather than being weakened. Indeed, a second part of the study revealed that participants were more likely to remember the location of the computer folder containing the facts than the facts themselves. Depressing, but efficient.
Of course, there's a school of thought that says this is just an extension of what we've always done – a form of transactive memory, where groups share memories. “I don't need to remember my cousins' birthdays, because my husband knows them” – that kind of thing. The psychologist who came up with the transactive memory hypothesis in 1985, Daniel Wegner, told Harvard Magazine that he believes the internet has become an extended – and particularly knowledgeable – part of this collective social memory: “We become part of the internet in a way. We become part of the system and we end up trusting it.”
That's fine for hard facts that you've submitted yourself – when your cousin's birthday is on a Google calendar, for example – but what about when you're relying on other people's knowledge? In theory, we've got a healthy level of distrust for what the internet tells us: a whopping 98% of people distrust the internet as a source of information, according to one 2012 survey. But we know that even information we instinctively distrust can make us doubt ourselves.
Then there's concentration: plenty has been written about the internet's impact on our ability to avoid distractions and concentrate, but much of it is anecdotal. In a broader sense, other factors could be just as responsible for our collective lack of focus. One particularly fascinating study found that members of the Namibian Himba tribe who had recently moved to urban settlements had far weaker levels of concentration than their contemporaries who had maintained their traditional rural existence.
Nicholas Carr, author of The Shallows: What the Internet Is Doing to Our Brains, reckons that much of this can be undone by spending more time off the internet, and the plasticity of our brains suggests that should have an impact. But in a society that relies so much on being connected, is there really any advantage to fighting the way our brains have adapted to our digital lives, other than flimsy nostalgia? Maybe not, although as with almost everything to do with the brain, an enormous amount is left unknown, even if using the web as additional memory storage seems fine and dandy. “Nobody knows now what the effects are of these tools on logical thinking,” Wegner reminds us.