The iBrain generation: Growing up wired, distracted and overknowledge’d
I) INTRODUCTION: MY PERSONAL DILEMMA
II) THE INTERNET: DIGITAL PLAYERS
III) DELVING DEEPER: A CONGRUENCE ANALYSIS
I. Information processing in an ‘instant gratification’ society (the only child syndrome)
II. How much is too much? Multi-tasking vs multi-distracting: boundarylessness, perfectionism and attention management
IV) DRAWING CONCLUSIONS: Seeing leverage points for a mental model switch
I. Introduction: my personal dilemma
2011 was meant to be the year of productivity. That was my New Year’s resolution. As January lurked around the corner, though, thoughts of designing a rigid revision timetable for upcoming exams gave way to mindlessly Googling turkey recipes for the family’s holiday dinner. My mind drifted. Yet again.
The number of parent-reported cases of children1 diagnosed with Attention Deficit Hyperactivity Disorder in the U.S. has reached a whopping 9%, largely due to the surge of social media and gadgets demanding (or luring) our attention on a twenty-four-hour basis. Acquiring a smartphone essentially means one is somehow connected, stimulated or engaged at every minute of every day. Whilst instant technology has made our lives easier and paved the way for a more interactive, interconnected world, recent studies have shown that it has also caused unintended side effects, most notably through its impact on learning. A study by Sparrow et al2 demonstrates that our brains immediately associate solving a difficult task with a computer: when faced with a question to which we do not know the answer, we automatically think of using a computer to find it. Is this the effect Google, Wikipedia and their extensive knowledge databases have on our brains? That we have lost the ability to go through that first step of thinking about the problem critically, analytically and on our own?
The latter question propelled me to write about technology. A perfectionist since birth, I have sought to do everything to the best of my abilities and striven to excel, driven towards an ultimate ideal of perfection. The fallacy in my approach has been failing to recognise that perfection simply does not exist, and that some of the most valuable lessons to be learnt stem not from perfection but from imperfection. Failure.
Yet there are no failures. There are only opportunities to learn.
In this essay, I seek to learn from two personal shortcomings I have recently been made aware of: (1) my inability to find boundaries and (2) my failure to manage my time and focus in an increasingly informative world. Coming across an article entitled “Is Google Making Us Stupid?”3, I realised I wasn’t the only one. I asked myself why. Since childhood, I have sought to find the answer to everything that sparked my curiosity: from the sky’s blueness to a solar-powered calculator to the Milky Way. My favourite pastime was exploring one of the first computer-based games I ever encountered: How Stuff Works. This interactive game aimed to teach children how everything around them worked, in anatomical, physical or other terms.
I mirror that approach and try to put the digital revolution and my short attention span in a broader cultural and social context. We now live in a fast-moving, “wiki” society (“wiki” means “quick” in Hawaiian), which I believe has exacerbated my inability to focus. Hence the term “wikibrain”. I look at whether the same holds on a collective level, what forces underlie the digitalization movement and whether it is desirable to resist it. Here lies my personal (and, I assume, my generation’s) dilemma: although the Internet provides access to an unprecedented breadth of information and has helped shape my personal and professional development, it also impairs my ability to effectively manage my time and attention and, more alarmingly, has affected the way I approach learning.
What is the answer? To go off-line? My first reaction to a study on the cognitive consequences of information technology, and to Carr’s two technology-focused books4, was to think linearly: if engaging with technological gadgets has decreased my capability to retain information and impaired my ability to single-task, the answer is simply to stop. Correct an inequality with an antagonistic action. But does that truly solve the problem, or merely shift the burden elsewhere? Such a drastic reaction does not target the underlying cause, proves counterproductive and is unrealistic in our technology-centred world. Further, it does not address the issue on a macro level, which is something this paper targets: thinking about solutions not only for myself, but for our generation as a whole.
Indeed, reports of ‘digital detoxing’ are becoming ever more frequent as people begin to recognise how alarmingly dependent they have become. Examples include an Australian journalist and her family’s decision to live technology-free for six months, detailed in the humorous “The Winter of Our Disconnect: How Three Totally Wired Teenagers (and a Mother Who Slept with Her iPhone) Pulled the Plug on Their Technology and Lived to Tell the Tale”5, and writer Dan Roberts’ similar abandonment quest6.
In fact, Roberts’ account scarily resonated with me: “…I would reach the end of a 10-hour day and feel I had achieved almost nothing. What had I learnt? Which of those hundreds of bite-size pieces of information had actually lodged in my brain?” Does the fact that we have the answer to our every question at our fingertips not make us more passive than active learners? Moreover, as the base of digital information is limitless, how do we find boundaries when there are no dead ends, and how do we filter what is reliable from what is not?
My journey to answer these questions is divided into three parts:
First, I give a broad systemic overview of how the Internet came to fruition, so I can gain a more in-depth understanding of the social, economic and political issues in question and of who the main stakeholders are. Second, I zoom into the individual stakeholder realm and look at how I have been affected by technology, exploring two main issues through systems model frameworks:
I. Impact on memory, information processing, and analytical reasoning
II. Attention deficit, focus management, and the Internet’s boundarylessness: learning how to prioritise and assimilate information
Third, I dare to think ahead and offer suggestions of potential long-term leverage points (on a macro level) and ways by which I can attempt to switch individual mental models, arguing the answer lies not in an all-or-nothing approach. It lies in technology itself.
II. The Internet: Digital Players
Technological advancement has changed the world on an unprecedented scale. It has been accompanied by incredible economic development and growth. Never before have we had this much access to this much information. However, there are two sides to every coin: the information we have has been tampered with, filtered and selected. It is at times unreliable. More pertinently, our search for convenience has meant we are gradually digitalizing our daily routines, which has brought a number of unintended, knock-on effects. The problem with these issues, and how they have arisen, is that society fails to look through a long-term lens, focusing only on quick-fix solutions to the problems we now face rather than looking beyond the surface to what truly lies beneath.
We have boarded an ever-moving train, which allows us to explore and develop ways of enhancing our lives on a scale and at a speed never before imagined. We have drastically changed the way we experience the world: e-commerce has meant real-life shopping is slowly becoming a thing of the past as groceries can be delivered to our doorsteps; Amazon and eBay have facilitated inter-personal trade of tangible goods; the CD, DVD and portable tape industries are gradually being overshadowed by the development of online media platforms like iTunes, Hulu and BBC iPlayer. Images and films are becoming three-dimensional. No longer do we need to go jogging outside or rent a court for a friendly tennis match: all we need is the click of a button on a Wii console in our living rooms.
How did we get here? Parallels can be drawn with the Industrial Revolution. The industrialization and urbanization witnessed at the close of the 19th century were driven by the need for higher efficiency (with a view to capital gains), in an attempt to create a more automated and mechanistic, Frederick Winslow Taylor-esque workforce. Capitalism and consumerism were the driving forces for change and innovation. People congregated in cities, urban expansion generating market growth. Population size expanded at a faster rate. Our money-oriented and increasingly democratic societies promote competition, innovation and business development. Governments need strong economies, so they push further for economic development.
The fast-paced, ‘fast food’ society.
Today’s problems are yesterday’s solutions. The international political tensions of the twentieth century, of which the Arms Race is but one example, together with decolonization and democratization movements, have propelled a snowball of economic growth and development. As more countries interact on the international political plane and try to address international tensions, they compete against one another through speedy growth, innovative technology and the attainment of sustainable self-sufficiency. How to achieve that? More effective and prosperous military and economic capabilities. And how to achieve that? An increased focus on technology and infrastructure. Thus the emphasis on technological development, culminating in the establishment of computer technology.
I see computerization as filling the gap left by a stark political divide and lack of co-operation. The failure of the League of Nations, and the monstrosities of the two World Wars and the Cold War, underscored humans’ failure to communicate and collaborate. There are obviously other factors at play (nationalism, self-interest etc.) which are beyond the scope of this essay, but, in summary, the political failures of the twentieth century highlighted a very central need to connect. Hence, the Internet: an inter-connected network database.
Computers came to the forefront in the 1960s and 1970s but only began developing into something akin to what we have today in the 1980s and 1990s. Tools of industry at first, they were soon marketed as consumer goods. Moreover, they were soon to be home to the most revolutionary idea of all: the Internet. It epitomized humans’ search for ways to make life easier and simpler, and to make us more productive. Again, the Tayloresque7 idea of efficiency. The invention of the World Wide Web reflects our need for immediate collaboration, humanity having come to the brink of destruction during the tense Cold War era.
1 Summary Health Statistics for U.S. Children: National Health Interview Survey, 2009, CDC.
2 Sparrow, Betsy; Liu, Jenny; Wegner, Daniel M., “Google Effects on Memory: Cognitive Consequences of Having Information at Our Fingertips”, Science Magazine, July 2011.
3 Carr, Nicholas, “Is Google Making Us Stupid? What The Internet is Doing to Our Brains”, The Atlantic, July/August 2008.
4 Carr, Nicholas “The Big Switch: Rewiring the World” and “The Shallows” [...see Bibliography]
5 Maushart, Susan, “The Winter of Our Disconnect”, Tarcher Reprint Edition, 2011.
6 Roberts, Dan, “Digital Detox: a writer abandoning technology”, The Telegraph, June 2010.
7 “In the past the man has been first [...] in the future the system must be first.”, Taylor, Frederick Winslow, “The Principles of Scientific Management”, Dover Publications, 1997.
Carvalho Se, Luciana, 2011, The 21st Century WikiBrain: Overexposure and Addiction to Technological Stimuli, Munich, GRIN Verlag, https://www.grin.com/document/177537