Extremely long, but
1) that's kind of the point, and
2) it's extremely interesting.
http://www.theatlantic.com/doc/200807/google.
Is Google Making Us Stupid?
What the Internet is doing to our brains
by Nicholas Carr
"Dave, stop. Stop, will you? Stop, Dave. Will you stop, Dave?" So the supercomputer HAL pleads with the implacable astronaut Dave Bowman in a famous and weirdly poignant scene toward the end of Stanley Kubrick's 2001: A Space Odyssey. Bowman, having nearly been sent to a deep-space death by the malfunctioning machine, is calmly, coldly disconnecting the memory circuits that control its artificial brain. "Dave, my mind is going," HAL says, forlornly. "I can feel it. I can feel it."
I can feel it, too. Over the past few years I've had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn't going—so far as I can tell—but it’s changing. I'm not thinking the way I used to think. I can feel it most strongly when I'm reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I'd spend hours strolling through long stretches of prose. That's rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I'm always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.
I think I know what’s going on. For more than a decade now, I’ve been spending a lot of time online, searching and surfing and sometimes adding to the great databases of the Internet. The Web has been a godsend to me as a writer. Research that once required days in the stacks or periodical rooms of libraries can now be done in minutes. A few Google searches, some quick clicks on hyperlinks, and I've got the telltale fact or pithy quote I was after. Even when I’m not working, I’m as likely as not to be foraging in the Web’s info-thickets—reading and writing e-mails, scanning headlines and blog posts, watching videos and listening to podcasts, or just tripping from link to link to link. (Unlike footnotes, to which they're sometimes likened, hyperlinks don’t merely point to related works; they propel you toward them.)
For me, as for others, the Net is becoming a universal medium, the conduit for most of the information that flows through my eyes and ears and into my mind. The advantages of having immediate access to such an incredibly rich store of information are many, and they’ve been widely described and duly applauded. "The perfect recall of silicon memory," Wired’s Clive Thompson has written, "can be an enormous boon to thinking." But that boon comes at a price. As the media theorist Marshall McLuhan pointed out in the 1960s, media are not just passive channels of information. They supply the stuff of thought, but they also shape the process of thought. And what the Net seems to be doing is chipping away my capacity for concentration and contemplation. My mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski.
I'm not the only one. When I mention my troubles with reading to friends and acquaintances—literary types, most of them—many say they’re having similar experiences. The more they use the Web, the more they have to fight to stay focused on long pieces of writing. Some of the bloggers I follow have also begun mentioning the phenomenon. Scott Karp, who writes a blog about online media, recently confessed that he has stopped reading books altogether. "I was a lit major in college, and used to be [a] voracious book reader," he wrote. "What happened?" He speculates on the answer: "What if I do all my reading on the web not so much because the way I read has changed, i.e. I'm just seeking convenience, but because the way I THINK has changed?"
Bruce Friedman, who blogs regularly about the use of computers in medicine, also has described how the Internet has altered his mental habits. "I now have almost totally lost the ability to read and absorb a longish article on the web or in print," he wrote earlier this year. A pathologist who has long been on the faculty of the University of Michigan Medical School, Friedman elaborated on his comment in a telephone conversation with me. His thinking, he said, has taken on a "staccato" quality, reflecting the way he quickly scans short passages of text from many sources online. "I can't read War and Peace anymore," he admitted. "I've lost the ability to do that. Even a blog post of more than three or four paragraphs is too much to absorb. I skim it."
Anecdotes alone don't prove much. And we still await the long-term neurological and psychological experiments that will provide a definitive picture of how Internet use affects cognition. But a recently published study of online research habits, conducted by scholars from University College London, suggests that we may well be in the midst of a sea change in the way we read and think. As part of the five-year research program, the scholars examined computer logs documenting the behavior of visitors to two popular research sites, one operated by the British Library and one by a U.K. educational consortium, that provide access to journal articles, e-books, and other sources of written information. They found that people using the sites exhibited "a form of skimming activity," hopping from one source to another and rarely returning to any source they'd already visited. They typically read no more than one or two pages of an article or book before they would "bounce" out to another site. Sometimes they'd save a long article, but there's no evidence that they ever went back and actually read it. The authors of the study report:
It is clear that users are not reading online in the traditional sense; indeed there are signs that new forms of "reading" are emerging as users "power browse" horizontally through titles, contents pages and abstracts going for quick wins. It almost seems that they go online to avoid reading in the traditional sense.
Thanks to the ubiquity of text on the Internet, not to mention the popularity of text-messaging on cell phones, we may well be reading more today than we did in the 1970s or 1980s, when television was our medium of choice. But it's a different kind of reading, and behind it lies a different kind of thinking—perhaps even a new sense of the self. "We are not only what we read," says Maryanne Wolf, a developmental psychologist at Tufts University and the author of Proust and the Squid: The Story and Science of the Reading Brain. "We are how we read." Wolf worries that the style of reading promoted by the Net, a style that puts "efficiency" and "immediacy" above all else, may be weakening our capacity for the kind of deep reading that emerged when an earlier technology, the printing press, made long and complex works of prose commonplace. When we read online, she says, we tend to become "mere decoders of information." Our ability to interpret text, to make the rich mental connections that form when we read deeply and without distraction, remains largely disengaged.
Reading, explains Wolf, is not an instinctive skill for human beings. It's not etched into our genes the way speech is. We have to teach our minds how to translate the symbolic characters we see into the language we understand. And the media or other technologies we use in learning and practicing the craft of reading play an important part in shaping the neural circuits inside our brains. Experiments demonstrate that readers of ideograms, such as the Chinese, develop a mental circuitry for reading that is very different from the circuitry found in those of us whose written language employs an alphabet. The variations extend across many regions of the brain, including those that govern such essential cognitive functions as memory and the interpretation of visual and auditory stimuli. We can expect as well that the circuits woven by our use of the Net will be different from those woven by our reading of books and other printed works.
Sometime in 1882, Friedrich Nietzsche bought a typewriter—a Malling-Hansen Writing Ball, to be precise. His vision was failing, and keeping his eyes focused on a page had become exhausting and painful, often bringing on crushing headaches. He had been forced to curtail his writing, and he feared that he would soon have to give it up. The typewriter rescued him, at least for a time. Once he had mastered touch-typing, he was able to write with his eyes closed, using only the tips of his fingers. Words could once again flow from his mind to the page.
But the machine had a subtler effect on his work. One of Nietzsche’s friends, a composer, noticed a change in the style of his writing. His already terse prose had become even tighter, more telegraphic. "Perhaps you will through this instrument even take to a new idiom," the friend wrote in a letter, noting that, in his own work, his "'thoughts' in music and language often depend on the quality of pen and paper."
"You are right," Nietzsche replied, "our writing equipment takes part in the forming of our thoughts." Under the sway of the machine, writes the German media scholar Friedrich A. Kittler, Nietzsche’s prose "changed from arguments to aphorisms, from thoughts to puns, from rhetoric to telegram style."
The human brain is almost infinitely malleable. People used to think that our mental meshwork, the dense connections formed among the 100 billion or so neurons inside our skulls, was largely fixed by the time we reached adulthood. But brain researchers have discovered that that’s not the case. James Olds, a professor of neuroscience who directs the Krasnow Institute for Advanced Study at George Mason University, says that even the adult mind "is very plastic." Nerve cells routinely break old connections and form new ones. "The brain," according to Olds, "has the ability to reprogram itself on the fly, altering the way it functions."
As we use what the sociologist Daniel Bell has called our "intellectual technologies"—the tools that extend our mental rather than our physical capacities—we inevitably begin to take on the qualities of those technologies. The mechanical clock, which came into common use in the 14th century, provides a compelling example. In Technics and Civilization, the historian and cultural critic Lewis Mumford described how the clock "disassociated time from human events and helped create the belief in an independent world of mathematically measurable sequences." The "abstract framework of divided time" became "the point of reference for both action and thought."
The clock’s methodical ticking helped bring into being the scientific mind and the scientific man. But it also took something away. As the late MIT computer scientist Joseph Weizenbaum observed in his 1976 book, Computer Power and Human Reason: From Judgment to Calculation, the conception of the world that emerged from the widespread use of timekeeping instruments "remains an impoverished version of the older one, for it rests on a rejection of those direct experiences that formed the basis for, and indeed constituted, the old reality." In deciding when to eat, to work, to sleep, to rise, we stopped listening to our senses and started obeying the clock.
The process of adapting to new intellectual technologies is reflected in the changing metaphors we use to explain ourselves to ourselves. When the mechanical clock arrived, people began thinking of their brains as operating "like clockwork." Today, in the age of software, we have come to think of them as operating "like computers." But the changes, neuroscience tells us, go much deeper than metaphor. Thanks to our brain’s plasticity, the adaptation occurs also at a biological level.
The Internet promises to have particularly far-reaching effects on cognition. In a paper published in 1936, the British mathematician Alan Turing proved that a digital computer, which at the time existed only as a theoretical machine, could be programmed to perform the function of any other information-processing device. And that's what we're seeing today. The Internet, an immeasurably powerful computing system, is subsuming most of our other intellectual technologies. It's becoming our map and our clock, our printing press and our typewriter, our calculator and our telephone, and our radio and TV.
When the Net absorbs a medium, that medium is re-created in the Net's image. It injects the medium's content with hyperlinks, blinking ads, and other digital gewgaws, and it surrounds the content with the content of all the other media it has absorbed. A new e-mail message, for instance, may announce its arrival as we're glancing over the latest headlines at a newspaper's site. The result is to scatter our attention and diffuse our concentration.
The Net's influence doesn't end at the edges of a computer screen, either. As people's minds become attuned to the crazy quilt of Internet media, traditional media have to adapt to the audience's new expectations. Television programs add text crawls and pop-up ads, and magazines and newspapers shorten their articles, introduce capsule summaries, and crowd their pages with easy-to-browse info-snippets. When, in March of this year, The New York Times decided to devote the second and third pages of every edition to article abstracts, its design director, Tom Bodkin, explained that the "shortcuts" would give harried readers a quick "taste" of the day's news, sparing them the "less efficient" method of actually turning the pages and reading the articles. Old media have little choice but to play by the new-media rules.
Never has a communications system played so many roles in our lives—or exerted such broad influence over our thoughts—as the Internet does today. Yet, for all that's been written about the Net, there's been little consideration of how, exactly, it's reprogramming us. The Net's intellectual ethic remains obscure.
About the same time that Nietzsche started using his typewriter, an earnest young man named Frederick Winslow Taylor carried a stopwatch into the Midvale Steel plant in Philadelphia and began a historic series of experiments aimed at improving the efficiency of the plant's machinists. With the approval of Midvale's owners, he recruited a group of factory hands, set them to work on various metalworking machines, and recorded and timed their every movement as well as the operations of the machines. By breaking down every job into a sequence of small, discrete steps and then testing different ways of performing each one, Taylor created a set of precise instructions—an "algorithm," we might say today—for how each worker should work. Midvale's employees grumbled about the strict new regime, claiming that it turned them into little more than automatons, but the factory’s productivity soared.
More than a hundred years after the invention of the steam engine, the Industrial Revolution had at last found its philosophy and its philosopher. Taylor's tight industrial choreography—his "system," as he liked to call it—was embraced by manufacturers throughout the country and, in time, around the world. Seeking maximum speed, maximum efficiency, and maximum output, factory owners used time-and-motion studies to organize their work and configure the jobs of their workers. The goal, as Taylor defined it in his celebrated 1911 treatise, The Principles of Scientific Management, was to identify and adopt, for every job, the "one best method" of work and thereby to effect "the gradual substitution of science for rule of thumb throughout the mechanic arts." Once his system was applied to all acts of manual labor, Taylor assured his followers, it would bring about a restructuring not only of industry but of society, creating a utopia of perfect efficiency. "In the past the man has been first," he declared; "in the future the system must be first."
Taylor's system is still very much with us; it remains the ethic of industrial manufacturing. And now, thanks to the growing power that computer engineers and software coders wield over our intellectual lives, Taylor's ethic is beginning to govern the realm of the mind as well. The Internet is a machine designed for the efficient and automated collection, transmission, and manipulation of information, and its legions of programmers are intent on finding the "one best method"—the perfect algorithm—to carry out every mental movement of what we've come to describe as "knowledge work."
Google's headquarters, in Mountain View, California—the Googleplex—is the Internet's high church, and the religion practiced inside its walls is Taylorism. Google, says its chief executive, Eric Schmidt, is "a company that’s founded around the science of measurement," and it is striving to "systematize everything" it does. Drawing on the terabytes of behavioral data it collects through its search engine and other sites, it carries out thousands of experiments a day, according to the Harvard Business Review, and it uses the results to refine the algorithms that increasingly control how people find information and extract meaning from it. What Taylor did for the work of the hand, Google is doing for the work of the mind.
The company has declared that its mission is "to organize the world’s information and make it universally accessible and useful." It seeks to develop "the perfect search engine," which it defines as something that "understands exactly what you mean and gives you back exactly what you want." In Google’s view, information is a kind of commodity, a utilitarian resource that can be mined and processed with industrial efficiency. The more pieces of information we can "access" and the faster we can extract their gist, the more productive we become as thinkers.
Where does it end? Sergey Brin and Larry Page, the gifted young men who founded Google while pursuing doctoral degrees in computer science at Stanford, speak frequently of their desire to turn their search engine into an artificial intelligence, a HAL-like machine that might be connected directly to our brains. "The ultimate search engine is something as smart as people—or smarter," Page said in a speech a few years back. "For us, working on search is a way to work on artificial intelligence." In a 2004 interview with Newsweek, Brin said, "Certainly if you had all the world’s information directly attached to your brain, or an artificial brain that was smarter than your brain, you’d be better off." Last year, Page told a convention of scientists that Google is "really trying to build artificial intelligence and to do it on a large scale."
Such an ambition is a natural one, even an admirable one, for a pair of math whizzes with vast quantities of cash at their disposal and a small army of computer scientists in their employ. A fundamentally scientific enterprise, Google is motivated by a desire to use technology, in Eric Schmidt's words, "to solve problems that have never been solved before," and artificial intelligence is the hardest problem out there. Why wouldn't Brin and Page want to be the ones to crack it?
Still, their easy assumption that we'd all "be better off" if our brains were supplemented, or even replaced, by an artificial intelligence is unsettling. It suggests a belief that intelligence is the output of a mechanical process, a series of discrete steps that can be isolated, measured, and optimized. In Google's world, the world we enter when we go online, there's little place for the fuzziness of contemplation. Ambiguity is not an opening for insight but a bug to be fixed. The human brain is just an outdated computer that needs a faster processor and a bigger hard drive.
The idea that our minds should operate as high-speed data-processing machines is not only built into the workings of the Internet, it is the network's reigning business model as well. The faster we surf across the Web—the more links we click and pages we view—the more opportunities Google and other companies gain to collect information about us and to feed us advertisements. Most of the proprietors of the commercial Internet have a financial stake in collecting the crumbs of data we leave behind as we flit from link to link—the more crumbs, the better. The last thing these companies want is to encourage leisurely reading or slow, concentrated thought. It's in their economic interest to drive us to distraction.
Maybe I'm just a worrywart. Just as there's a tendency to glorify technological progress, there's a countertendency to expect the worst of every new tool or machine. In Plato's Phaedrus, Socrates bemoaned the development of writing. He feared that, as people came to rely on the written word as a substitute for the knowledge they used to carry inside their heads, they would, in the words of one of the dialogue's characters, "cease to exercise their memory and become forgetful." And because they would be able to "receive a quantity of information without proper instruction," they would "be thought very knowledgeable when they are for the most part quite ignorant." They would be "filled with the conceit of wisdom instead of real wisdom." Socrates wasn't wrong—the new technology did often have the effects he feared—but he was shortsighted. He couldn't foresee the many ways that writing and reading would serve to spread information, spur fresh ideas, and expand human knowledge (if not wisdom).
The arrival of Gutenberg's printing press, in the 15th century, set off another round of teeth gnashing. The Italian humanist Hieronimo Squarciafico worried that the easy availability of books would lead to intellectual laziness, making men "less studious" and weakening their minds. Others argued that cheaply printed books and broadsheets would undermine religious authority, demean the work of scholars and scribes, and spread sedition and debauchery. As New York University professor Clay Shirky notes, "Most of the arguments made against the printing press were correct, even prescient." But, again, the doomsayers were unable to imagine the myriad blessings that the printed word would deliver.
So, yes, you should be skeptical of my skepticism. Perhaps those who dismiss critics of the Internet as Luddites or nostalgists will be proved correct, and from our hyperactive, data-stoked minds will spring a golden age of intellectual discovery and universal wisdom. Then again, the Net isn't the alphabet, and although it may replace the printing press, it produces something altogether different. The kind of deep reading that a sequence of printed pages promotes is valuable not just for the knowledge we acquire from the author's words but for the intellectual vibrations those words set off within our own minds. In the quiet spaces opened up by the sustained, undistracted reading of a book, or by any other act of contemplation, for that matter, we make our own associations, draw our own inferences and analogies, foster our own ideas. Deep reading, as Maryanne Wolf argues, is indistinguishable from deep thinking.
If we lose those quiet spaces, or fill them up with "content," we will sacrifice something important not only in our selves but in our culture. In a recent essay, the playwright Richard Foreman eloquently described what's at stake:
I come from a tradition of Western culture, in which the ideal (my ideal) was the complex, dense and "cathedral-like" structure of the highly educated and articulate personality—a man or woman who carried inside themselves a personally constructed and unique version of the entire heritage of the West. [But now] I see within us all (myself included) the replacement of complex inner density with a new kind of self—evolving under the pressure of information overload and the technology of the "instantly available."
As we are drained of our "inner repertory of dense cultural inheritance," Foreman concluded, we risk turning into "'pancake people'—spread wide and thin as we connect with that vast network of information accessed by the mere touch of a button."
I'm haunted by that scene in 2001. What makes it so poignant, and so weird, is the computer's emotional response to the disassembly of its mind: its despair as one circuit after another goes dark, its childlike pleading with the astronaut—"I can feel it. I can feel it. I'm afraid"—and its final reversion to what can only be called a state of innocence. HAL's outpouring of feeling contrasts with the emotionlessness that characterizes the human figures in the film, who go about their business with an almost robotic efficiency. Their thoughts and actions feel scripted, as if they're following the steps of an algorithm. In the world of 2001, people have become so machinelike that the most human character turns out to be a machine. That's the essence of Kubrick's dark prophecy: as we come to rely on computers to mediate our understanding of the world, it is our own intelligence that flattens into artificial intelligence.
Discussion: First of all, I think because I'm so stupidly paranoid sometimes, I love dystopian stories. I would really like to write one someday. I had an idea, it's sitting in a folder on my computer, but the problem is that it's more about the technology being cool than about the dystopian issues. Basically like distilling Star Wars down to lightsabers. Cool, but you don't get any story. Hopefully I'll have a good idea someday.
Now, to the actual article. (I have two post-its full of notes I made about the article, but I'm sure I'm missing a few because I wrote them as I skimmed it a second time.) I've noticed this myself, and in fact I think I may have even blogged about it before, probably talking about internet addiction and stuff. I know that I am a whore for information. Seriously, trivia, knowledge, you name it. I've pretty much always wanted to be in school, just at my own pace. I think, for the rest of my life, I will always be taking some sort of class. I enjoy learning new techniques and gaining new skills and having information stored away at the back of my brain. The problem is, you forget it over time... But that's not my point, I'm getting a bit tangential. The point is, I've noticed I find it harder to read books. Unless a book is extremely compelling, I will stop and think a few pages/chapters in, but my eyes keep moving, which is annoying because then I have to go back and read a whole page over again. I've always had that quality, that I will stop and think almost anywhere. My best friend in grade school used to tease me about it all the time, because sometimes I would stop talking to her in a car ride or something and just be staring out the window, thinking. I used to get people asking me if I was okay; yeah, I'm fine, I'm just staring out the window looking at things and thinking about them... But I have noticed that after a little bit I lose interest in the book, it just becomes words on the page. It's more than just stopping and thinking (or getting easily distracted, like with many books you have to read for school, oh, let's say, Wuthering Heights [BLECH]); it's completely losing interest. Almost as if your eyes lose focus because you've been reading too long, but it's coming from your brain instead. The words on the page just become mush. It's so frustrating.
Sometime last year, I can't remember if it was before drum corps or after fall semester, I told myself that this year would be a big change for me. I was going to Hawaii to get in some relaxation, some adventure, figure out what in my life is not going right. There are a lot of small things, I suppose, but I think the main thing I noticed is that I've lost my self-limitation. I used to be able to just come home, do what I had to do, and then I could do what I wanted. Part of it was that grade school was too easy in some ways, and not interesting in many others, and when I got to high school I cared less about classes because I cared more about friends. Same with college, especially because I told myself I'm doing engineering because I can, not because it's my main interest. I just want to have the skillset. Which only leads to more procrastination. But the issue here is that I needed to sit myself down and say, no, you have to start caring about stupid things again, they aren't going to work themselves out. I need to manage my time online. I need to exercise more. I need to make an effort to get out and meet people. I need to be ME and not lose myself in inane things. I don't even know where I lost myself, but somewhere toward the end of high school and the beginning of college, I did. I focused too much on what other people wanted me to be, so much the adult because I know, and they know, that I can handle the responsibility-- but I can't forget that I have so much of a child's imagination. I daydream, I sketch, I have too many ideas and aspirations. I'm too hopeful in humanity, probably. But I've always felt that way, and I think I tried to crush it. It seems like all throughout grade school I was one of the most mature kids in my class, though I always felt frustrated by the fact that I had no real knowledge/understanding/interest in current events, other than scientific ones. In that sense I think I definitely matured, though it pretty much took me all of high school. But now I can at least better understand some politics and things, or at least I have more of an interest in them. Something about them is still not there for me to grasp. Anyway, the point is that I have a need to create and a need to let my imagination go and a need to be a dork. I can't sacrifice that for anything. Maybe I'm not going to play with all my toys, but it doesn't make it any less fun having them. Laura was looking at me ridiculously for buying so many of the Indiana Jones 3 3/4" figures, but as soon as I pulled them out of the box to show her she started messing with the accessories and the boxes, and we actually spent a little while setting them up and trying to keep Spalko's pistol from getting eaten by Jack.
And now I am extremely far off topic. Back to my post-its.
Artificial intelligence scares the SHIT out of me. Not in the sense that computers will take over and we'll become mere biology to be exploited or killed, but rather in the sense that humans will become nothing more than advanced technology, modifying ourselves until we lose every element of humanity... and yeah, I know that deep down, poetically, it's emotion. But it scares me that people want to genetically CHOOSE what their kids will be, not just male or female but eye color and hair color and intelligence and skills and strength, and that those are things that would eventually become outdated as upgrades arrive, and soon you have outdated children. Who could treat a person like that? Another human being? This is ignorance. (This is Sparta! Oh wait. No, it's just ignorance. Maybe politics. But it's madness, too.)
And maybe everyone decides genetic modification is unethical. (Though there would probably still be someone out there experimenting... the law isn't perfect.) So what if people can put chips into their brains to remember things? Yeah, I remember reading a kids' magazine and wishing you could just do that so you wouldn't have to go to school. Convenient, but what if your computer broke, or was programmed badly? Or you crashed in a forest and a moose trampled on you and broke the circuit? Then where would you be? And it also leads to the ego where you think you know something, but really, you don't. This is another thing I've had to find out about myself recently. Grade school was too easy and not interesting, and it's easy to delude yourself about teachers and books and things if you're obsessive about good grades, but starting senior year and going 'til now, I've been able to admit a few things to myself. Grades aren't as important as they seem, because you should get out and live. That being said, they're very convenient for scholarships and things, and I will probably be kicked out of the honors program. Which upset me, but after a while of being bitter I realized there really aren't any benefits anyway, other than being able to check out 100 books from the library at a time. (If I'm still in, I'm going to do it. If not, I'm stealing Candice's card to do it.) Oh, and I won't graduate Summa Cum Laude (I think?), but... does that really matter? Besides grades, I had to learn that I'm not as much of a genius as I wished I was. There are things I don't understand, and probably my newest revelation is that you can't just learn them for a semester, for a test. You're not really learning anything, and I'm starting to feel those effects. You have discussions with your friends and realize that even though you studied, and at one point knew what they're talking about, you can't recall as much of it as you'd like. (Then again, maybe I'm too obsessive and I just want to remember it all. Who knows.)
Can't remember if I already mentioned this a few paragraphs up, but I'm just going down the post-its now. Staying in school... I think I will always be taking classes, about something. Literature, history, art, you name it. And because classes in school are so focused, I'm always reading. It used to be magazines and books, now it's mostly the internet. I have access to learning so many things I may never need to know directly, but what if one of them one day applies to a creative engineering solution or a writing idea or an art idea? That's how my brain works: I have millions of stupid little things floating around, and somehow they will crash into each other and I'll have an idea. In order to be creative I know I have to feed my brain, but I've also learned that if I'm feeding it, I need to be creative.
Something else I think I've touched on before, and I think about at least once a week if not once a day; age and technology. I wonder how a generation that's grown up understanding computers will adjust to technology as they get older, as older generations fall behind (and older technology falls behind, and the things that go with it: media, politics... sadly). I wonder how technology's going to change...
Now, the malleability of the brain, that's interesting. It's both good and bad. Bad in that my brain is becoming more rapid-fire and I lose focus reading books. Not good at all. Good in that it means you can teach an old dog new tricks... Definitely one of my fears: that I will stop learning, that I will get stupid with age (yeah, not just fear of dying here, AUGH), that I will lose the ability to do things as I get old. Not that I feel old now, though I think people expect me to say that because OHMYGODI'MTWENTY (well, not yet, technically). Like I've said before, I have too much of a child's mind, and I don't want to stop learning things or stop being able to go on adventures. (At least it's self-encouragement to get in shape, so when I'm in my eighties I can still run.)
The internet is not the only thing that messes with my brain. I cannot listen to music when I study, or really when I do anything that is not expressly listening to music. Maybe it's because I was so much raised on music that it's like my sixth sense: tempo and tuning and listening and feeling it. For a while I tried studying to music without words, because I figured language is so important to us that it would be what distracts us, but that's not true for me; it's definitely any sort of music. That being said, I just remembered that a common grade school "science fair" project was to see who did better on a test, those who listened to music while studying or those who didn't, and that those who listened (to classical, I think) did better. I am not sure that would work for me, though I think the science behind it is that the music stimulates more of your brain, getting it active and remembering. The only problem for me is that I guess it gets too active. Maybe it's just that my brain is turning into an internet-brain and I lose focus on studying, but really, music should help... who knows.
I've also noticed my mental to-do list skills breaking down. Whether that's because I have so much more to keep track of now, not just do your homework, do your chores, work on a craft project, and go to bed, but do your homework, go to work, remember what to wear, then go to the bank, then go to Walmart (with a whole separate mental list), then go home to meet Laura and pack (yet another list), then clean the house so people can come over at x time, then go on vacation... remains to be seen. Or maybe it's that I now purposely carry around a notebook and pen at all times, and write down project ideas and things I need to do, and I rely more on that... but I almost always forget to go back and check it. Right now my desk is COVERED in post-it notes and the notepad function on my phone is full. For a while I tried a method of keeping "A, B, C, D" level lists on my desktop in Word, but that was forgotten quickly because it requires upkeep. So... I don't know.
Another topic dealing with age and technology: how people watch movies. When you are younger, everything is entertaining, and when you are older you start watching stuff made more for the story and the thoughtfulness and the emotion. And if you're trained in certain technical areas, I'm noticing more nowadays that you watch the costumes or the acting or the way the camera pans or the editing or the CGI. I'm not convinced it's just because a lot of my friends have interest or training in that, though I'm sure that's part of it. I think part of the issue is that it's popular (popular? I don't know) to be a cynic, to think you're an expert, to watch a movie not for entertainment but on a meta-level, guessing at its predictability and judging qualities that I think ten years ago only a professional (or indie) movie maker would have cared about. Maybe it's partly YouTube, maybe it's the addition of behind-the-scenes special features on almost all DVDs (and I swear to god if they don't start putting more special features on the Indy DVDs I will kill someone), who knows. Maybe it's all of that.
You know what word never ceases to distract me from the quality of the writing? "Gewgaws." PLEASE STOP USING THIS WORD, PEOPLE, I am begging you
Part of me has always wanted simplicity. Not quite being a hermit, but living in a small house, being able to take care of all my stuff, growing my own food (or catching/killing my own food, whatever), and being able to manage everything myself. Part of it is that machines are becoming too complicated; it requires a LOT of skill to be able to fix cars now. Part of it is my concern for the environment, that owning little and having a small footprint is best. Part of it is that man is turning into machine... a corporate machine, maybe, but a machine nonetheless. Buy the clothes, buy the iPod, buy the phone, buy the car, buy the gas, buy the fancy restaurants, buy the (COMPLETELY BULLSHIT MARKET) diamonds. I think, in a small-market system, capitalism is good. Maybe I'm just too much of a gloom-and-doom worrywart, but I'm really afraid of where we, as a society, are headed. Environment, politics, corporations... Ugh. Maybe just another reason to want to get away from it all...
Plus, I've always thought survivalist skills are cool. (AND ENVIRONMENTAL HOUSE DESIGN! Awesome.) I'd love to see anyone from Hollywood survive in a forest for a week. I know my plant recognition skills aren't all that great, but I don't think I'd do too badly. Shelter, water, I am there. Fire, in theory. Trapping animals, a bit more of a challenge.
(WOO, one post-it down!)
Back to my dystopia thing for a second-- you know what would be interesting? A current-day dystopia novel. Something that could happen tomorrow. I think, partially because technology moves at an exponential rate, it would almost have to be current-day, and that would also make it more interesting. (Only thing is that it would probably go out of date very quickly...) Hmm. I'll have to think on this some more.
Another concern about having a mind that moves too quickly: ADD diagnoses. They're already way too common, and way too heavily medicated, and being raised on the internet might only make that worse.
Maybe I am just a bit old-fashioned, wanting to avoid having a brain that moves too quickly, but in the same manner I hate having download-only music and movies. I'd rather own the DVDs and the CDs. I think it's still too easy for files to corrupt, for computers to crash or get broken, and there are all the stupid limits like iTunes' "you can only download this file to five computers!" I know it's so they make more money, but, UGH, I'm not going to get into a political rant about how I think this (almost) depression is good for most Americans... Also, I almost always have music playing in my head. You think I'm kidding... I'm not. The only time I don't consciously hear it is if I'm thinking particularly hard about something interesting.
Also, it's scary that we are losing our ability to focus long-term... I already think it's scary that we used to have much more impressive memory skills but lost them with the invention of the printing press. Maybe it's just that I am too insecure and always afraid I will need to remember something I have forgotten, or that I hate egotism so much, and there are so many people who think they're really smart but really know nothing. I don't know. Also, I think losing the ability to focus on something for a long time will just accentuate that aspect of immaturity where you don't think outside of your immediate actions, the whole monkeysphere theory. Truly fascinating stuff.
(My next thought was, this sounds like something I read the other day that sounded like it could have come from Douglas Adams [I don't remember what it was, but it involved discussion of "it's a small world" and the idea that it's all an act, that there are only 500 people besides you {which isn't true, I have over 500 people friended on facebook, all but a handful of whom I actually know in person}, but that in reality there are just many smaller circles of about 500 people... maybe it was Neil Gaiman?]. And then, British humor always has an interesting take on things, like they sat back to get a bit more perspective than you, and I love that. But that also relates to this discussion: what if it's just their being a bit more old-fashioned? Almost like the metaphorical US is a pre-teen and Britain is in its late 20s, saying there-there, I told you so, in a mocking but not annoying manner. Which makes me wonder, what if the British have had it right all this time...) (I dunno, I do kind of think I could easily live there. Reading Adams, he's said that Brits are much more sensible and that atheism and agnosticism aren't looked down upon as badly as they are in the US. BY THE WAY, WHAT THE HELL IS THIS. I can't see the video, but from the quotes that sounds ATROCIOUS. It just makes me more frustrated about that whole campaign by scientists to get more discussion of technology and global warming and important topics into legislation... ARGH)
Um, and back to the Epic Memory thing real fast (so termed because I remember that people used to memorize the Odyssey and the Iliad): most memorization today is just rote. You do it until it becomes an instinct, not necessarily putting the thinking behind it, and I think that was an issue for me in learning some things. Or rather, in testing on them. You can't teach rote memorization as a method of study and then expect the material to be APPLIED. Argh.
Oh, and I think I may have actually lost a lot of my skill at telling jokes. Laura and I used to be able to go on forever, back and forth from one joke to the next. I think part of that is that as you get older you find other things to entertain yourself with, and that I've stopped reading jokes as much, and I know I haven't listened to the Dr. Laura Thanksgiving Day joke show in a while; I don't even know if she still does it. (Still annoyed that the one year we got through, LAURA STOLE MY JOKE ...yeah, I can hold grudges like nobody's business)
Interesting that the article talks about pancake people. I think that's definitely true, and it kind of goes back to my idea that perhaps capitalism on a more personal level is better. Mom-and-pop stores, and people actually learning trades if they have no interest in school, and a car I can fix myself... But that's back to politics again, and I'd rather not go there. The only thing is, I don't know if I'm a product of society in that way, or if it's just my personality, but I've always been the take-interest-in-everything, want-to-be-a-renaissance-man sort. I tend to think it's me and not society, as this whole internet-raising-people thing is still relatively new, and I am a bit extreme and all-over-the-board in my interests.
Now, back to something off topic but related to my dystopia interest. I have almost always read books for entertainment rather than education (though a lot of educational stuff was entertaining to me...), and my dad and I have always thought that you should be able to look at a book as entertainment and not have to analyze it for themes and symbolism and metaphors. But I also wonder; it can be very interesting to have that secondary level, if you study it. I've always wanted to write a book and just say "there's no lesson here, it's just supposed to be fun," but I don't know if I could anymore. I mean, I think I could, but it wouldn't be as interesting as something with more significance. (But when I try to write significance with metaphors, I go a bit crazy, and that's another issue.) Are there any stories that are pure entertainment with no lesson, though? Star Wars? I mean, I love the hell out of Indiana Jones, but I'm pretty sure there's a basis there of taking responsibility for what is right, and having a sense of mystery and awe. Terminator? I mean, is there some Kubrickesque dystopian message there? I have no idea (I've only seen it once).
Ideally, my writing style is like Hitchhiker's Guide or Neil Gaiman or Diana Wynne Jones, all of which I consider to be primarily entertainment. (However, American Gods has a deeper meaning for me, which is slightly off topic.) Are there any messages in Hitchhiker's Guide? I don't really think so, other than: stop taking it all so seriously, because it doesn't really matter (which, while sometimes good, also worries me). My only problem is that it takes quite a lot of work to get me in that mood, and it's difficult to keep. Like I said, I think I'm too much of a renaissance man. I would write a book in that style and never be able to write another one like it again, but I would easily move on to another idea, another genre, another style. I could act if I had the chance, I would sing if I had the chance, I would be an architect or an Imagineer. Who knows, really...
(Incidentally, 2001 is a movie where I felt they could have cut off the beginning and the end and still had a decent movie. I guess I'll have to watch it again with the right metaperspective. Damn me for expecting entertainment...?)
[EDIT] Also, I wonder if my newfound semi-carsickness (I can no longer read in the car... extremely disheartening for me) relates at all to this. Truly.
1) that's kind of the point, and
2) it's extremely interesting.
http://www.theatlantic.com/doc/200807/google.
What the Internet is doing to our brains
by Nicholas Carr
Is Google Making Us Stupid?
"Dave, stop. Stop, will you? Stop, Dave. Will you stop, Dave?" So the supercomputer HAL pleads with the implacable astronaut Dave Bowman in a famous and weirdly poignant scene toward the end of Stanley Kubrick's 2001: A Space Odyssey. Bowman, having nearly been sent to a deep-space death by the malfunctioning machine, is calmly, coldly disconnecting the memory circuits that control its artificial brain. "Dave, my mind is going," HAL says, forlornly. "I can feel it. I can feel it."
I can feel it, too. Over the past few years I've had an uncomfortable sense that someone, or something, has been tinkering with my brain, remapping the neural circuitry, reprogramming the memory. My mind isn't going—so far as I can tell—but it’s changing. I'm not thinking the way I used to think. I can feel it most strongly when I'm reading. Immersing myself in a book or a lengthy article used to be easy. My mind would get caught up in the narrative or the turns of the argument, and I'd spend hours strolling through long stretches of prose. That's rarely the case anymore. Now my concentration often starts to drift after two or three pages. I get fidgety, lose the thread, begin looking for something else to do. I feel as if I'm always dragging my wayward brain back to the text. The deep reading that used to come naturally has become a struggle.
I think I know what’s going on. For more than a decade now, I’ve been spending a lot of time online, searching and surfing and sometimes adding to the great databases of the Internet. The Web has been a godsend to me as a writer. Research that once required days in the stacks or periodical rooms of libraries can now be done in minutes. A few Google searches, some quick clicks on hyperlinks, and I've got the telltale fact or pithy quote I was after. Even when I’m not working, I’m as likely as not to be foraging in the Web’s info-thickets—reading and writing e-mails, scanning headlines and blog posts, watching videos and listening to podcasts, or just tripping from link to link to link. (Unlike footnotes, to which they're sometimes likened, hyperlinks don’t merely point to related works; they propel you toward them.)
For me, as for others, the Net is becoming a universal medium, the conduit for most of the information that flows through my eyes and ears and into my mind. The advantages of having immediate access to such an incredibly rich store of information are many, and they’ve been widely described and duly applauded. "The perfect recall of silicon memory," Wired’s Clive Thompson has written, "can be an enormous boon to thinking." But that boon comes at a price. As the media theorist Marshall McLuhan pointed out in the 1960s, media are not just passive channels of information. They supply the stuff of thought, but they also shape the process of thought. And what the Net seems to be doing is chipping away my capacity for concentration and contemplation. My mind now expects to take in information the way the Net distributes it: in a swiftly moving stream of particles. Once I was a scuba diver in the sea of words. Now I zip along the surface like a guy on a Jet Ski.
I'm not the only one. When I mention my troubles with reading to friends and acquaintances—literary types, most of them—many say they’re having similar experiences. The more they use the Web, the more they have to fight to stay focused on long pieces of writing. Some of the bloggers I follow have also begun mentioning the phenomenon. Scott Karp, who writes a blog about online media, recently confessed that he has stopped reading books altogether. "I was a lit major in college, and used to be [a] voracious book reader," he wrote. "What happened?" He speculates on the answer: "What if I do all my reading on the web not so much because the way I read has changed, i.e. I'm just seeking convenience, but because the way I THINK has changed?"
Bruce Friedman, who blogs regularly about the use of computers in medicine, also has described how the Internet has altered his mental habits. "I now have almost totally lost the ability to read and absorb a longish article on the web or in print," he wrote earlier this year. A pathologist who has long been on the faculty of the University of Michigan Medical School, Friedman elaborated on his comment in a telephone conversation with me. His thinking, he said, has taken on a "staccato" quality, reflecting the way he quickly scans short passages of text from many sources online. "I can't read War and Peace anymore," he admitted. "I've lost the ability to do that. Even a blog post of more than three or four paragraphs is too much to absorb. I skim it."
Anecdotes alone don't prove much. And we still await the long-term neurological and psychological experiments that will provide a definitive picture of how Internet use affects cognition. But a recently published study of online research habits, conducted by scholars from University College London, suggests that we may well be in the midst of a sea change in the way we read and think. As part of the five-year research program, the scholars examined computer logs documenting the behavior of visitors to two popular research sites, one operated by the British Library and one by a U.K. educational consortium, that provide access to journal articles, e-books, and other sources of written information. They found that people using the sites exhibited "a form of skimming activity," hopping from one source to another and rarely returning to any source they'd already visited. They typically read no more than one or two pages of an article or book before they would "bounce" out to another site. Sometimes they'd save a long article, but there's no evidence that they ever went back and actually read it. The authors of the study report:
It is clear that users are not reading online in the traditional sense; indeed there are signs that new forms of "reading" are emerging as users "power browse" horizontally through titles, contents pages and abstracts going for quick wins. It almost seems that they go online to avoid reading in the traditional sense.
Thanks to the ubiquity of text on the Internet, not to mention the popularity of text-messaging on cell phones, we may well be reading more today than we did in the 1970s or 1980s, when television was our medium of choice. But it's a different kind of reading, and behind it lies a different kind of thinking—perhaps even a new sense of the self. "We are not only what we read," says Maryanne Wolf, a developmental psychologist at Tufts University and the author of Proust and the Squid: The Story and Science of the Reading Brain. "We are how we read." Wolf worries that the style of reading promoted by the Net, a style that puts "efficiency" and "immediacy" above all else, may be weakening our capacity for the kind of deep reading that emerged when an earlier technology, the printing press, made long and complex works of prose commonplace. When we read online, she says, we tend to become "mere decoders of information." Our ability to interpret text, to make the rich mental connections that form when we read deeply and without distraction, remains largely disengaged.
Reading, explains Wolf, is not an instinctive skill for human beings. It's not etched into our genes the way speech is. We have to teach our minds how to translate the symbolic characters we see into the language we understand. And the media or other technologies we use in learning and practicing the craft of reading play an important part in shaping the neural circuits inside our brains. Experiments demonstrate that readers of ideograms, such as the Chinese, develop a mental circuitry for reading that is very different from the circuitry found in those of us whose written language employs an alphabet. The variations extend across many regions of the brain, including those that govern such essential cognitive functions as memory and the interpretation of visual and auditory stimuli. We can expect as well that the circuits woven by our use of the Net will be different from those woven by our reading of books and other printed works.
Sometime in 1882, Friedrich Nietzsche bought a typewriter—a Malling-Hansen Writing Ball, to be precise. His vision was failing, and keeping his eyes focused on a page had become exhausting and painful, often bringing on crushing headaches. He had been forced to curtail his writing, and he feared that he would soon have to give it up. The typewriter rescued him, at least for a time. Once he had mastered touch-typing, he was able to write with his eyes closed, using only the tips of his fingers. Words could once again flow from his mind to the page.
But the machine had a subtler effect on his work. One of Nietzsche’s friends, a composer, noticed a change in the style of his writing. His already terse prose had become even tighter, more telegraphic. "Perhaps you will through this instrument even take to a new idiom," the friend wrote in a letter, noting that, in his own work, his "'thoughts' in music and language often depend on the quality of pen and paper."
"You are right," Nietzsche replied, "our writing equipment takes part in the forming of our thoughts." Under the sway of the machine, writes the German media scholar Friedrich A. Kittler, Nietzsche’s prose "changed from arguments to aphorisms, from thoughts to puns, from rhetoric to telegram style."
The human brain is almost infinitely malleable. People used to think that our mental meshwork, the dense connections formed among the 100 billion or so neurons inside our skulls, was largely fixed by the time we reached adulthood. But brain researchers have discovered that that’s not the case. James Olds, a professor of neuroscience who directs the Krasnow Institute for Advanced Study at George Mason University, says that even the adult mind "is very plastic." Nerve cells routinely break old connections and form new ones. "The brain," according to Olds, "has the ability to reprogram itself on the fly, altering the way it functions."
As we use what the sociologist Daniel Bell has called our "intellectual technologies"—the tools that extend our mental rather than our physical capacities—we inevitably begin to take on the qualities of those technologies. The mechanical clock, which came into common use in the 14th century, provides a compelling example. In Technics and Civilization, the historian and cultural critic Lewis Mumford described how the clock "disassociated time from human events and helped create the belief in an independent world of mathematically measurable sequences." The "abstract framework of divided time" became "the point of reference for both action and thought."
The clock’s methodical ticking helped bring into being the scientific mind and the scientific man. But it also took something away. As the late MIT computer scientist Joseph Weizenbaum observed in his 1976 book, Computer Power and Human Reason: From Judgment to Calculation, the conception of the world that emerged from the widespread use of timekeeping instruments "remains an impoverished version of the older one, for it rests on a rejection of those direct experiences that formed the basis for, and indeed constituted, the old reality." In deciding when to eat, to work, to sleep, to rise, we stopped listening to our senses and started obeying the clock.
The process of adapting to new intellectual technologies is reflected in the changing metaphors we use to explain ourselves to ourselves. When the mechanical clock arrived, people began thinking of their brains as operating "like clockwork." Today, in the age of software, we have come to think of them as operating "like computers." But the changes, neuroscience tells us, go much deeper than metaphor. Thanks to our brain’s plasticity, the adaptation occurs also at a biological level.
The Internet promises to have particularly far-reaching effects on cognition. In a paper published in 1936, the British mathematician Alan Turing proved that a digital computer, which at the time existed only as a theoretical machine, could be programmed to perform the function of any other information-processing device. And that's what we're seeing today. The Internet, an immeasurably powerful computing system, is subsuming most of our other intellectual technologies. It's becoming our map and our clock, our printing press and our typewriter, our calculator and our telephone, and our radio and TV.
When the Net absorbs a medium, that medium is re-created in the Net's image. It injects the medium's content with hyperlinks, blinking ads, and other digital gewgaws, and it surrounds the content with the content of all the other media it has absorbed. A new e-mail message, for instance, may announce its arrival as we're glancing over the latest headlines at a newspaper's site. The result is to scatter our attention and diffuse our concentration.
The Net's influence doesn't end at the edges of a computer screen, either. As people's minds become attuned to the crazy quilt of Internet media, traditional media have to adapt to the audience's new expectations. Television programs add text crawls and pop-up ads, and magazines and newspapers shorten their articles, introduce capsule summaries, and crowd their pages with easy-to-browse info-snippets. When, in March of this year, The New York Times decided to devote the second and third pages of every edition to article abstracts, its design director, Tom Bodkin, explained that the "shortcuts" would give harried readers a quick "taste" of the day's news, sparing them the "less efficient" method of actually turning the pages and reading the articles. Old media have little choice but to play by the new-media rules.
Never has a communications system played so many roles in our lives—or exerted such broad influence over our thoughts—as the Internet does today. Yet, for all that's been written about the Net, there's been little consideration of how, exactly, it's reprogramming us. The Net's intellectual ethic remains obscure.
About the same time that Nietzsche started using his typewriter, an earnest young man named Frederick Winslow Taylor carried a stopwatch into the Midvale Steel plant in Philadelphia and began a historic series of experiments aimed at improving the efficiency of the plant's machinists. With the approval of Midvale's owners, he recruited a group of factory hands, set them to work on various metalworking machines, and recorded and timed their every movement as well as the operations of the machines. By breaking down every job into a sequence of small, discrete steps and then testing different ways of performing each one, Taylor created a set of precise instructions—an "algorithm," we might say today—for how each worker should work. Midvale's employees grumbled about the strict new regime, claiming that it turned them into little more than automatons, but the factory’s productivity soared.
More than a hundred years after the invention of the steam engine, the Industrial Revolution had at last found its philosophy and its philosopher. Taylor's tight industrial choreography—his "system," as he liked to call it—was embraced by manufacturers throughout the country and, in time, around the world. Seeking maximum speed, maximum efficiency, and maximum output, factory owners used time-and-motion studies to organize their work and configure the jobs of their workers. The goal, as Taylor defined it in his celebrated 1911 treatise, The Principles of Scientific Management, was to identify and adopt, for every job, the "one best method" of work and thereby to effect "the gradual substitution of science for rule of thumb throughout the mechanic arts." Once his system was applied to all acts of manual labor, Taylor assured his followers, it would bring about a restructuring not only of industry but of society, creating a utopia of perfect efficiency. "In the past the man has been first," he declared; "in the future the system must be first."
Taylor's system is still very much with us; it remains the ethic of industrial manufacturing. And now, thanks to the growing power that computer engineers and software coders wield over our intellectual lives, Taylor's ethic is beginning to govern the realm of the mind as well. The Internet is a machine designed for the efficient and automated collection, transmission, and manipulation of information, and its legions of programmers are intent on finding the "one best method"—the perfect algorithm—to carry out every mental movement of what we've come to describe as "knowledge work."
Google's headquarters, in Mountain View, California—the Googleplex—is the Internet's high church, and the religion practiced inside its walls is Taylorism. Google, says its chief executive, Eric Schmidt, is "a company that’s founded around the science of measurement," and it is striving to "systematize everything" it does. Drawing on the terabytes of behavioral data it collects through its search engine and other sites, it carries out thousands of experiments a day, according to the Harvard Business Review, and it uses the results to refine the algorithms that increasingly control how people find information and extract meaning from it. What Taylor did for the work of the hand, Google is doing for the work of the mind.
The company has declared that its mission is "to organize the world’s information and make it universally accessible and useful." It seeks to develop "the perfect search engine," which it defines as something that "understands exactly what you mean and gives you back exactly what you want." In Google’s view, information is a kind of commodity, a utilitarian resource that can be mined and processed with industrial efficiency. The more pieces of information we can "access" and the faster we can extract their gist, the more productive we become as thinkers.
Where does it end? Sergey Brin and Larry Page, the gifted young men who founded Google while pursuing doctoral degrees in computer science at Stanford, speak frequently of their desire to turn their search engine into an artificial intelligence, a HAL-like machine that might be connected directly to our brains. "The ultimate search engine is something as smart as people—or smarter," Page said in a speech a few years back. "For us, working on search is a way to work on artificial intelligence." In a 2004 interview with Newsweek, Brin said, "Certainly if you had all the world’s information directly attached to your brain, or an artificial brain that was smarter than your brain, you’d be better off." Last year, Page told a convention of scientists that Google is "really trying to build artificial intelligence and to do it on a large scale."
Such an ambition is a natural one, even an admirable one, for a pair of math whizzes with vast quantities of cash at their disposal and a small army of computer scientists in their employ. A fundamentally scientific enterprise, Google is motivated by a desire to use technology, in Eric Schmidt's words, "to solve problems that have never been solved before," and artificial intelligence is the hardest problem out there. Why wouldn't Brin and Page want to be the ones to crack it?
Still, their easy assumption that we'd all "be better off" if our brains were supplemented, or even replaced, by an artificial intelligence is unsettling. It suggests a belief that intelligence is the output of a mechanical process, a series of discrete steps that can be isolated, measured, and optimized. In Google's world, the world we enter when we go online, there's little place for the fuzziness of contemplation. Ambiguity is not an opening for insight but a bug to be fixed. The human brain is just an outdated computer that needs a faster processor and a bigger hard drive.
The idea that our minds should operate as high-speed data-processing machines is not only built into the workings of the Internet, it is the network's reigning business model as well. The faster we surf across the Web—the more links we click and pages we view—the more opportunities Google and other companies gain to collect information about us and to feed us advertisements. Most of the proprietors of the commercial Internet have a financial stake in collecting the crumbs of data we leave behind as we flit from link to link—the more crumbs, the better. The last thing these companies want is to encourage leisurely reading or slow, concentrated thought. It's in their economic interest to drive us to distraction.
Maybe I'm just a worrywart. Just as there's a tendency to glorify technological progress, there's a countertendency to expect the worst of every new tool or machine. In Plato's Phaedrus, Socrates bemoaned the development of writing. He feared that, as people came to rely on the written word as a substitute for the knowledge they used to carry inside their heads, they would, in the words of one of the dialogue's characters, "cease to exercise their memory and become forgetful." And because they would be able to "receive a quantity of information without proper instruction," they would "be thought very knowledgeable when they are for the most part quite ignorant." They would be "filled with the conceit of wisdom instead of real wisdom." Socrates wasn't wrong—the new technology did often have the effects he feared—but he was shortsighted. He couldn't foresee the many ways that writing and reading would serve to spread information, spur fresh ideas, and expand human knowledge (if not wisdom).
The arrival of Gutenberg's printing press, in the 15th century, set off another round of teeth gnashing. The Italian humanist Hieronimo Squarciafico worried that the easy availability of books would lead to intellectual laziness, making men "less studious" and weakening their minds. Others argued that cheaply printed books and broadsheets would undermine religious authority, demean the work of scholars and scribes, and spread sedition and debauchery. As New York University professor Clay Shirky notes, "Most of the arguments made against the printing press were correct, even prescient." But, again, the doomsayers were unable to imagine the myriad blessings that the printed word would deliver.
So, yes, you should be skeptical of my skepticism. Perhaps those who dismiss critics of the Internet as Luddites or nostalgists will be proved correct, and from our hyperactive, data-stoked minds will spring a golden age of intellectual discovery and universal wisdom. Then again, the Net isn't the alphabet, and although it may replace the printing press, it produces something altogether different. The kind of deep reading that a sequence of printed pages promotes is valuable not just for the knowledge we acquire from the author's words but for the intellectual vibrations those words set off within our own minds. In the quiet spaces opened up by the sustained, undistracted reading of a book, or by any other act of contemplation, for that matter, we make our own associations, draw our own inferences and analogies, foster our own ideas. Deep reading, as Maryanne Wolf argues, is indistinguishable from deep thinking.
If we lose those quiet spaces, or fill them up with "content," we will sacrifice something important not only in our selves but in our culture. In a recent essay, the playwright Richard Foreman eloquently described what's at stake:
I come from a tradition of Western culture, in which the ideal (my ideal) was the complex, dense and "cathedral-like" structure of the highly educated and articulate personality—a man or woman who carried inside themselves a personally constructed and unique version of the entire heritage of the West. [But now] I see within us all (myself included) the replacement of complex inner density with a new kind of self—evolving under the pressure of information overload and the technology of the "instantly available."
As we are drained of our "inner repertory of dense cultural inheritance," Foreman concluded, we risk turning into "'pancake people'—spread wide and thin as we connect with that vast network of information accessed by the mere touch of a button."
I'm haunted by that scene in 2001. What makes it so poignant, and so weird, is the computer's emotional response to the disassembly of its mind: its despair as one circuit after another goes dark, its childlike pleading with the astronaut—"I can feel it. I can feel it. I'm afraid"—and its final reversion to what can only be called a state of innocence. HAL's outpouring of feeling contrasts with the emotionlessness that characterizes the human figures in the film, who go about their business with an almost robotic efficiency. Their thoughts and actions feel scripted, as if they're following the steps of an algorithm. In the world of 2001, people have become so machinelike that the most human character turns out to be a machine. That's the essence of Kubrick's dark prophecy: as we come to rely on computers to mediate our understanding of the world, it is our own intelligence that flattens into artificial intelligence.
Discussion: First of all, I think because I'm so stupidly paranoid sometimes, I love dystopian stories. I would really like to write one one day. I had an idea; it's sitting in a folder on my computer, but the problem is that it's more about the cool technology involved than about the dystopian issues. Basically like distilling Star Wars down to lightsabers. Cool, but you don't get any story. Someday hopefully I'll have a good idea though.
Now, to the actual article. (I have two post-its full of notes I made about the article, but I'm sure I am missing a few because I wrote them as I skimmed it a second time.) I've noticed this myself, and in fact I think I may have even blogged about it before. Probably talking about internet addiction and stuff. I know that I am a whore for information. Seriously, trivia, knowledge, you name it. I've pretty much always wanted to be in school, just at my own pace. I think, for the rest of my life, I will always be taking some sort of class. I enjoy learning new techniques and gaining new skills and having information stored away at the back of my brain. The problem is, you forget it over time... But that's not my point, I'm getting a bit tangential. The point is, I've noticed I find it harder to read books. Unless it's extremely compelling, I will stop and think a few pages/chapters in, but my eyes keep moving, which is annoying because then you have to go and read a whole page over again. I've always had that quality, that I will think, almost anywhere. My best friend in gradeschool used to tease me about it all the time, because sometimes I would stop talking to her in a car ride or something and just be staring out the window, thinking. I used to get people asking me if I was okay, yeah, I'm fine, I'm just staring out the window looking at things and thinking about them... But I have noticed that after a little bit I lose interest in the book, it just becomes words on the page. More than just stopping and thinking (or getting easily distracted, like many books you have to read for school, oh, let's say, Wuthering Heights [BLECH]), but completely losing interest. Almost as if your eyes lose focus because you've been reading too long, but it's coming from your brain instead. The words on the page just become mush. It's so frustrating.
Sometime last year, I can't remember if it was before drum corps or after fall semester, I told myself that this year would be a big change for me. I was going to Hawaii to get in some relaxation, some adventure, figure out what in my life is not going right. There are a lot of small things, I suppose, but I think the main thing I noticed is that I've lost my self-limitation. I used to be able to just come home, do what I had to do, and then I could do what I wanted. Part of it was that gradeschool was too easy in some ways, and not interesting in many others, and when I got to high school I cared less about classes because I cared more about friends. Same with college, and especially because I told myself I'm doing engineering because I can, not because it's my main interest. I just want to have the skillset. Which only leads to more procrastination. But the issue here is that I needed to sit myself down and say, no, you have to start caring about stupid things again, they aren't going to work themselves out. I need to manage my time online. I need to exercise more. I need to make an effort getting out to meet people. I need to be ME and not lose myself in inane things. I don't even know where I lost myself, but somewhere toward the end of high school and the beginning of college, I did. I focused too much on what other people wanted me to be, so much the adult because I know, and they know, that I can handle the responsibility-- but I can't forget that I am so much a child's imagination. I daydream, I sketch, I have too many ideas and aspirations. I'm too hopeful in humanity, probably. But I've always felt that way, and I think I tried to crush it. It seems like all throughout gradeschool I was one of the most mature kids in my class, though I always felt frustrated by the fact that I had no real knowledge/understanding/interest in current events, other than scientific. In that sense I think I definitely matured, though it pretty much took me all throughout high school. But I can definitely at least better understand some politics and things, or at least I have more of an interest in them. Something about them is still not there for me to grasp. Anyway, the point is that I have a need to create and a need to let my imagination go and a need to be a dork. I can't sacrifice that for anything. Maybe I'm not going to play with all my toys but it doesn't make it any less fun having them. Laura was looking at me ridiculously for buying so many of the Indiana Jones 3 3/4" figures, but as soon as I pulled them out of the box to show her she started messing with the accessories and the boxes and we actually spent a little while setting them up and trying to keep Spalko's pistol from getting eaten by Jack.
And now I am extremely far off topic. Back to my post-its.
Artificial intelligence scares the SHIT out of me. Not in the sense that computers will take over and we'll become a biology to be exploited or killed, but rather in the sense that humans will become nothing more than advanced technology, modifying ourselves until we lose every element of humanity... and yeah, I know that deep down poetically it's emotion, but it scares me that people want to genetically CHOOSE what their kids will be, not just male or female but eye color and hair color and intelligence and skills and strength, and that those are things that would eventually become outdated as they are upgraded, and soon you have outdated children. Who could treat a person like that? Another human being? This is ignorance. (This is Sparta! Oh wait. No, it's just ignorance. Maybe politics. But it's madness, too.)
And maybe everyone decides genetic modification is unethical. (Though there would probably still be someone out there experimenting... the law isn't perfect.) So what if people can put chips into their brain to remember things? Yeah, I remember reading a kids' magazine wishing you could just do that so you wouldn't have to go to school. Convenient, but what if your computer broke, or was programmed badly? Or you crashed in a forest and a moose trampled on you and broke the circuit? Then where would you be? And it also leads to the ego that you think you know something, but really, you don't. This is another thing I've had to find out about myself recently. Gradeschool was too easy and not interesting, and it's easy to delude yourself about teachers and books and things if you're obsessive about good grades, but starting senior year and going 'til now I've been able to admit to myself a few things. Grades aren't as important as they seem, because you should get out and live. That being said, they're very convenient for scholarships and things, and I will probably be kicked out of the honors program. Which upset me, but after a while of being bitter I realized there really aren't any benefits anyway, other than being able to check out 100 books from the library at a time. (If I'm still in, I'm going to do it. If not, I'm stealing Candice's card to do it.) Oh, and I won't graduate Summa Cum Laude (I think?), but... does that really matter? Besides grades, I had to learn that I'm not as much of a genius as I wished I was. There are things I don't understand, and probably my newest revelation is that you can't just learn them for a semester, for a test. You're not really learning anything, and I'm starting to feel those effects. You have discussions with your friends and realize that even though you studied and at one point knew what they're talking about, you can't recall as much of it as you'd like. (Then again, maybe I'm too obsessive and I just want to remember it all. Who knows.)
Can't remember if I already mentioned this a few paragraphs up, but I'm just going down the post-its now. Staying in school... I think I will always be taking classes, about something. Literature, history, art, you name it. And because classes in school are so focused, I'm always reading. Used to be magazines and books, now mostly internet. I have access to learning so many things I may never need to know directly, but what if one of them one day applies to a creative engineering solution or a writing idea or an art idea? That's how my brain works, I have millions of stupid little things floating around, and somehow they will crash into each other and I'll have an idea. In order to be creative I know I have to feed my brain, but I've also learned if I'm feeding it I need to be creative.
Something else I think I've touched on before, and I think about at least once a week if not once a day; age and technology. I wonder how a generation that's grown up understanding computers will adjust to technology as they get older, as older generations fall behind (and older technology falls behind, and the things that go with it: media, politics... sadly). I wonder how technology's going to change...
Now, malleability of the brain, that's interesting. It's both good and bad. Bad in that my brain is becoming more rapid-fire and I lose focus reading books. Not good at all. Good in that it means you can teach an old dog new tricks... Definitely one of my fears. That I will stop learning, that I will get stupid with age (yeah, not just fear of dying here AUGH), that I will lose the ability to do things as I get old. Not that I feel old now, though I think people expect me to say that because OHMYGODI'MTWENTY (well, not yet, technically). Like I've said before, I'm too much a child's mind and I don't want to not learn things, not be able to go on adventures. (At least, it's self-encouragement to get in shape. So when I'm in my eighties I can still run.)
The Internet is not the only thing that messes with my brain. I cannot listen to music when I study, or really when I do anything that is not expressly listening to music. Maybe it's because I was so much raised on music that it's like my sixth sense, tempo and tuning and listening and feeling it. For a while I just tried studying to music without words, because I figured language is so important to us that it would be what distracts us, but it's not true for me, it's definitely any sort of music. That being said, I just remembered that a common "science fair" project in grade school was to see who did better on a test: those who listened to music while studying, or those who didn't. Those who listened (I think to classical) did better. I am not sure that would work for me, though I think the science behind it is that the music stimulates more of your brain to get it active and remembering. Only problem for me is that I guess it is too active. Maybe it's just that my brain is turning into an internet-brain and I lose focus on studying, but really music should help... who knows.
I've also noticed my mental to-do list skills breaking down. Now whether that's because I have so much more to keep track of now, not just do your homework do your chores work on a craft project and go to bed, but do your homework go to work remember what to wear and then go to the bank and then go to Walmart (with a whole separate mental list) and then go home to meet Laura and pack (yet another list) and then clean the house so people can come over at x time and then go on vacation... remains to be seen. Or maybe it's that I now purposefully carry around a notebook and pen at all times, and write down project ideas and things I need to do, and I rely more on that... But I almost always forget to go back and check it. Right now my desk is COVERED in post-it notes and the notepad function on my phone is full. For a while I tried a method of keeping "A, B, C, D" level lists on my desktop in Word, but that was forgotten quickly because it requires upkeep. So... I don't know.
Another topic dealing with age and technology: how people watch movies. When you are younger, everything is entertaining, and when you are older you start watching stuff made more for the story and the thoughtfulness and the emotion and stuff. And if you're trained in certain technical areas, I'm noticing more nowadays that you watch the costumes or the acting or the way the camera pans or the editing or the CGI. I'm not convinced it's because a lot of my friends have interest or training in that, though I'm sure that's part of it. I think part of the issue is that it's popular (popular? I don't know) to be a cynic, to think you're an expert, to watch it not for entertainment but on a meta-level, to guess its predictability and judge the qualities that I think ten years ago only a professional (or indie) moviemaker would have cared about. Maybe it's part YouTube, maybe it's the addition of behind-the-scenes special features on almost all DVDs (and I swear to god if they don't start putting more special features on the Indy DVDs I will kill someone), who knows. Maybe it's all of that.
You know what word never ceases to distract me from the quality of the writing? "Gewgaws." PLEASE STOP USING THIS WORD, PEOPLE. I am begging you.
Part of me has always wanted simplicity. Not quite being a hermit, but living in a small house, being able to take care of all my stuff and growing my own food (or catching/killing my own food, whatever) and being able to manage everything myself. Part of it is that machines are becoming too complicated; it requires a LOT of skill to be able to fix cars now. Part of it is my concern for the environment, that owning little and having a small footprint is best. Part of it is that man is turning into machine... a corporate machine, maybe, but a machine nonetheless. Buy the clothes buy the iPod buy the phone buy the car buy the gas buy the fancy restaurants buy the (COMPLETELY BULLSHIT MARKET) diamonds. I think, in a small-market system, capitalism is good. Maybe I'm just too much of a gloom-and-doom worrywart, but I'm really afraid for where we, as a society, are headed. Environment, politics, corporations... Ugh. Maybe just another reason to want to get away from it all...
Plus, I've always thought survivalist skills are cool. (AND ENVIRONMENTAL HOUSE DESIGN! Awesome) I'd love to see anyone from Hollywood survive in a forest for a week. I know my plant-recognition skills aren't all that great, but I don't think I'd do too badly. Shelter, water, I am there. Fire in theory. Trapping animals, a bit more of a challenge.
(WOO, one post-it down!)
Back to my dystopia thing for a second-- you know what would be interesting? A current-day dystopia novel. Something that would happen tomorrow. I think, partially because technology moves at an exponential rate, that it would both have to be current-day and be more interesting for it. (Only thing is that it would probably go out of date very quickly...) Hmm. I'll have to think on this some more.
Another concern of having a mind that moves too quickly: ADD diagnoses. They're already way too common, and the kids way overmedicated, but being raised on the internet might only make that worse.
Maybe I am just a bit old-fashioned, wanting to avoid having a brain that moves too quickly, but in the same manner I hate having download-only music and movies. I'd rather own the DVDs and the CDs. I think it's still too easy for files to corrupt, for computers to crash or get broken, and there are all the stupid limits like iTunes' "you can only download this file to five computers!" I know it's so they make more money, but, UGH, I'm not going to get into a political rant about how I think this (almost) depression is good for most Americans... Also, I almost always have music playing in my head. You think I'm kidding... I'm not. The only time I don't consciously hear it is if I'm thinking particularly hard about something interesting.
Also, scary that we are losing our ability to focus long-term... I already think it's scary that we used to have much more impressive memory skills, but lost them with the invention of the printing press. Maybe it's just that I am too insecure and always afraid I will need to remember something I have forgotten, or that I hate egotism so much and there are so many people who think they're really smart but really know nothing. I don't know. Also, I think losing the ability to focus on something for a long time will just accentuate the kind of immaturity where you don't think outside of your immediate actions: the whole monkeysphere theory. Truly fascinating stuff.
(My next thought was, this sounds like something I read the other day that sounded like it could have come from Douglas Adams [I don't remember what it was, but it involved discussion of "it's a small world" and the idea that it's all an act, that there are only 500 people besides you {which isn't true, I have over 500 people friended on Facebook, all but a handful of whom I actually know in person}, but in reality it's just that there are many smaller circles of about 500 people... maybe it was Neil Gaiman?]. And then, it's interesting how British humor always has an interesting take on something, like they sat back to get a bit more perspective than you, and I love that. But then that also relates to this discussion: what if it's just their being a bit more old-fashioned? Almost like the metaphorical US as a pre-teen and Britain being in its late 20s saying there-there, I told you so, in a mocking but not annoying manner. Which makes me wonder, what if the British have got it right all this time...) (I dunno, I do kind of think I could easily live there. From reading Adams, I know he's said that Brits are much more sensible and that atheism and agnosticism are not looked down upon as badly as they are in the US. BY THE WAY, WHAT THE HELL IS THIS. I can't see the video, but from the quotes it sounds ATROCIOUS. Just makes me more frustrated about that whole campaign by scientists to get more discussion of technology and global warming and important topics into legislature... ARGH)
Um, and back to the Epic Memory thing real fast (so termed because I remember that people used to memorize the Odyssey and the Iliad): most memorization today is just rote. You do it until it becomes an instinct, not necessarily putting the thinking behind it, and I think that was an issue for me in learning some things. Or rather, in testing on them. You can't teach rote memorization as a method of study and then expect the material to be APPLIED. Argh.
Oh, and I think I may have actually lost a lot of my skill at telling jokes. Laura and I used to be able to go on forever, back and forth from one joke to the next. I think part of that is that as you get older you find other things to entertain yourself with, and that I've stopped reading jokes as much, and I know I haven't listened to the Dr. Laura Thanksgiving Day joke show in a while, and I don't even know if she does it anymore. (Still annoyed that the one year we got through, LAURA STOLE MY JOKE ...yeah, I can hold grudges like nobody's business)
Interesting that the author of the article talks about pancake people. I think that's definitely true, and it kind of goes back to my idea that perhaps capitalism on a more personal level is better. Mom-and-pop stores, and people actually learning trades if they have no interest in school, and a car I can fix myself... But that's back to politics again; I'd rather not go there. Only thing is, I don't know if I'm a product of society in that way, or if it's just my personality, but I've always been sort of a take-interest-in-everything want-to-be-a-renaissance-man sort. I tend to think it's me and not society, as the internet raising people is still a relatively new thing, and I am a bit extreme and all-over-the-board in my interests.
Now, back to something off topic but relating to my dystopia interest. I have almost always read books for entertainment rather than education (though a lot of educational stuff was entertaining to me...), and my dad and I have always thought that you should be able to look at a book as entertainment and not have to analyze it for themes and symbolism and metaphors. But I also wonder; it can be very interesting to have that secondary level, if you study it. I've always wanted to write a book and just say "there's no lesson here, it's just supposed to be fun", but I don't know if I could anymore. I mean, I think I could, but it wouldn't be as interesting as something with more significance. (But when I try to write significance with metaphors, I go a bit crazy, and that's another issue.) Are there stories that are just pure entertainment with no lesson, though? Star Wars? I mean, I love the hell out of Indiana Jones, but I'm pretty sure the basis there is to be responsible for what is right, and to have a sense of mystery and awe. Terminator? I mean, is there some Kubrickesque dystopian message there? I have no idea (I've only seen it once).
Ideally, my writing style is like Hitchhiker's Guide or Neil Gaiman or Diana Wynne Jones, all of which I consider to be entertainment primarily. (However, American Gods has a deeper meaning for me, which is slightly off topic.) Are there any messages in Hitchhiker's Guide? I don't really think so, other than: stop taking it all so seriously, because it doesn't really matter (which, while sometimes good, also worries me). My only problem is that it takes quite a lot of work to get me in that mood, and it's difficult to keep. Like I said, I think I'm too much of a renaissance man. I would write a book in that style and never be able to write another one like that again, but I would easily move on to another idea, another genre, another style. I could act if I had the chance, I would sing if I had the chance, I would be an architect or an Imagineer. Who knows, really...
(Incidentally, 2001 was a movie where I felt they could have cut off the beginning and the end and still had a decent movie. I guess I'll have to watch it again with the right metaperspective. Damn me for expecting entertainment...?)
[EDIT] Also, I wonder if my newfound semi-carsickness (I can no longer read in the car... extremely disheartening for me) relates at all to this. Truly.