There is no doubt that the internet has changed our lifestyle in various important ways, especially in everything related to information, knowledge and communication. The immediate availability of much of what humanity has produced and the faster and shorter bites of information that we exchange have certainly changed the way we live — and also possibly the way we think.
There is a widespread perception that the digital lifestyle has made us generally dumber. This viewpoint emphasises, first, the drastic decline in people’s attention spans: We have gone from books, to essays, to blog pieces, to micro-blogging (aka Twitter), to SMSs and quick-fire chat exchanges. Secondly, we now seem unable, or at least unwilling, to memorise anything — even simple facts about the world — since we can instantly google the information. Thirdly, the internet has levelled the playing field of knowledge and expertise: Anyone can post his/her “knowledge” on any topic, and most of us then find it difficult to distinguish the (few) genuine high-quality items from the multitude of junk.
Indeed, books have appeared on that theme, such as The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardises Their Future and The Shallows: What the Internet Is Doing to Our Brains, among others.
More recently, however, the reverse idea has begun to be expressed more and more boldly, including in two books that came out this year: Smarter Than You Think: How Technology Is Changing Our Minds for the Better and Infinite Progress: How the Internet and Technology Will End Ignorance, Disease, Hunger, Poverty and War.
Let me briefly review the arguments of the “making us smarter” camp. The proponents of that view first remind us that every new technology was originally decried as the potential ender of civilisation. Socrates is famous for having abhorred the written text, stating that “knowledge stored is not really knowledge at all”. Television has also long been seen as a dumbing-down device, despite its huge impact in terms of instant information (satellites relaying breaking news and connecting people continents apart) and education (with some channels specialising in teaching, formally or informally, e.g. through excellent documentaries). Indeed, every technology, including text and video, brings with it both the good and the bad, and reading, often seen as a high-brow activity, is in fact rarely synonymous with high culture: Bookstores are filled more with mindless writings than with refined literature.
The digital age, however, it is now argued, brings two new important developments that should make us hopeful and cheerful instead of grim and concerned about the future of humanity. First, we now not only have instant access to the collective intellectual production of humanity, but we now have a collective, networked brain to connect pieces of information and tackle our problems (disease, ignorance, poverty, extremism etc.). Secondly, we now seem to have unconsciously interfaced our brains with all the information that we can tap into.
The first point highlights the extraordinary possibilities that “big data” offers. For instance, if all the medical cases of the past many years could be stored and made accessible to doctors everywhere, someone would be bound to notice trends — what seems to have worked with patients who have a particular disease. Serendipitous discoveries will become less fortuitous, and the systematic exploration of data and information will produce many new discoveries, for the benefit of humanity.
The second point notes how we have now become better at freeing our brains from any storage tasks (nobody commits whole texts to memory anymore), turning them instead into meta-information systems, where one only needs to know how and where to find useful data on each specific topic and how to connect information and concepts.
Critics will counter with surveys showing that we read fewer books and waste too much time on our smartphones and social networks. Some will cite studies to back up one gloomy view or another. But on such topics, one can often find studies to support any claim, and it is too early to conclude that there is any negative effect on our brains. In any case, we humans have proved to be masters at transforming ourselves, our minds and our lives to make the best of each new development.
This realisation of how the internet has changed our way of thinking has significant implications for our educational approaches, in particular. It has been noted that when students are asked to post their essays (or even drafts) on websites for everyone to see, they (expectedly) put in much greater effort. Hence, essays should probably always be posted rather than simply handed in. And schools and universities should focus on teaching how to find, sift through and analyse information, rather than making students “learn” quantities of information.
On another front, the wiki and the open-source movements have (re)ignited our collaborative human spirit. Perhaps that kind of approach should be generalised to other walks of life.
The digital world is now rather fully connected to our brains, individually and collectively. There is little point in bemoaning this. Instead, we need to understand its various effects and impacts and try to make the best of them.
Nidhal Guessoum is associate dean at the American University of Sharjah. He can be followed on Twitter at: www.twitter.com/@NidhalGuessoum