Daniel Vollaro

The Thinker, an Endangered Species



Of all the physical objects in my house as a child, the one that exerted the most influence on me was a twelve-inch-high replica of Auguste Rodin’s famous sculpture, The Thinker. In its original form, The Thinker was part of Rodin’s elaborate and brilliant “Gates of Hell,” in which the seated figure of the thinker, perched at the top and center of the gate, presides over the entire scene. He appears to be lost in thought while, pardon the pun, all hell breaks loose around him. Separated from the larger sculpture, however, The Thinker is a solitary figure, naked, sitting hunched over with his chin resting on the knuckles of his right hand.


I identified with this strange creature. I shared his brooding sensibility and his desire to mentally retreat from his immediate surroundings. These were also attributes that my parents and teachers had noted in me. “Lost in thought.” “Absent-minded.” “Too much in his head.” I had heard each of these criticisms aimed at me, always with a dose of sympathy, as if to suggest that I was in for a difficult life if I couldn’t plant my feet more firmly on the ground. Nevertheless, I felt a deep affinity for The Thinker. I empathized with the tinge of loneliness I sensed in him. I accepted early on that I was a “cerebral” person, so the statue was a daily reminder that I was of a type. There were many thinkers out there, and together, we constituted a tribe of our own.


When I was in college in the 1980s, I realized that the main struggle for thinkers was how to find the courage and fortitude to act in the world. I was attracted to writers and thinkers who had worked out a formula for deciding when thinking should lead to action. For example, I read Thomas Merton, who observed that “action is the stream, and contemplation is the spring,” and Ralph Waldo Emerson’s 1837 “American Scholar” speech, which extolled the virtues of “Man Thinking,” a kind of ideal Uber-individual who understood the proper balance between words and action, deferring to the latter in most things. I was attracted to thinkers who were also people of action—Socrates sacrificing his life for principle; Henry David Thoreau, the abolitionist Underground Railroad conductor who turned his night in jail for not paying his poll tax into one of the great works of political philosophy; Albert Camus writing articles for Combat, the banned underground newspaper, during World War II. I read Jean-Paul Sartre because of his involvement in the French Resistance, and I read Pablo Neruda because Sartre called the Neruda-reading revolutionary Che Guevara “the most complete human being of our age.”


Twenty-five years later, as I was finishing a doctoral program in 19th Century American Literature, I could see a new challenge emerging. As a young man, I had worried that being a thinker would lead me to spend too much time contemplating at the expense of action, but as I sat in a study cubby on the 6th floor of the Emory University Library reading for my dissertation, I realized that the act of thinking itself was under threat. Sitting there, with my laptop screen lighting up the gloom, my mind was a muddle. I could not concentrate, mainly because the internet was calling out to me from my laptop screen like a siren in the darkness. My ability to sit and read for long stretches of time had been compromised. Some new fog was creeping into my mind, a malady without a name.


But it does have a name. A few years later, I would learn some of the new terminology used to describe this state of mental befuddlement. Information overload. Infoglut. Spaghetti Brain. (I favor the word “saturation” because it best describes the sensation of gorging myself on information and then choking on it.) Something new was happening in my brain and in the brains of many other people I knew, something to do with a basic inability to process information and the feelings of inadequacy and exhaustion that accompanied it.


***


As a child, I would stare at The Thinker with a mix of envy and fascination. I had a sense that he was engaged in an important activity, though I could not actually read his thoughts. He made me believe that there was something special about thinking itself. Two centuries before Emerson’s speech, René Descartes famously grounded the idea of human consciousness in the simple construction “cogito, ergo sum”—I think, therefore I am. If humans are capable of doubting the nature of our existence, Descartes argued, then we must possess a mind capable of miraculous feats of consciousness. Out of this idea is born the Thinking Man—a being with a unitary consciousness, a Self. It would be impossible to overstate the importance of this idea to modern existence. Everything from psychotherapy to the success of advertising depends upon it.


Descartes made thinking synonymous with being, thus elevating its importance as a human activity, but even as I was learning about the cogito in Dr. Barnes’s Intro to Philosophy class, I was also skeptical of it. I was aware that my own thoughts were seldom elevated or special. My mind was swimming in ephemera, silliness, and redundancy. And what about those moments when I escaped my thinking mind, when I was “in the zone”? If you are not aware of thinking, are you still thinking? Also, what is the line, if any, between thinking and remembering? These questions made me at least doubt the importance of the cogito as an idea that could be applied to the lived human experience of thinking. Later, there was Kierkegaard, who rightly observed that “I think, therefore I am” is a tautology that presupposes the existence of the self.


And there was this: If thinking is the ground of being itself, why is it such a fragile and easily disrupted activity? I am reminded of a scene early in the Holocaust film Schindler’s List when one of the residents of the Krakow Ghetto, a former professor, says, in a rare moment of gratitude, “I can’t remember when I was last able to organize a thought.” Beset as he is by the daily horrors of the Nazi regime, he manages something that was once commonplace for him: thinking. From his perspective, thinking requires a pause, an act of mental organization. It occupies time and space—not a rarefied thing or a thing only the educated can do, but definitely a thing that requires a minimal quotient of focus and concentration.


My sense of the fragility of thinking was reinforced, oddly enough, by America’s wars in Iraq and Afghanistan. After 9/11, I became aware of the use of loud, persistent noise as an “enhanced interrogation” (aka torture) technique. This was a strategy used by interrogators to break down the subject, make him more pliable, and motivate him to share information. It is a form of brutality every bit as dehumanizing as stripping a person naked and forcing him to stand on a barrel for hours on end, or any of the other horrors reported from Abu Ghraib prison. To deny a person the ability to think is one of the worst tortures one can inflict upon a human being.


And yet, while 9/11 and its aftermath were unfolding around us, an army of software engineers was quietly building the infrastructure for a new kind of society that does to all of us what the torturers do to the man forced to listen to Marilyn Manson’s song “The Beautiful People” blaring at top volume for days on end. Whether intentionally or not, our new technologies were stripping us of the basic human dignity of being able to “organize a thought.”


***


In a previous age, we were fond of depicting the enemies of thought as grand deceivers, elites who used their power to pacify and numb humanity into a state of lumbering, slumbering compliance. In the John Carpenter film They Live, the protagonist, Nada, played by professional wrestler Roddy Piper, picks up a pair of discarded sunglasses on the street and discovers that they allow him to see the messages actually projected from billboards and signs—the message behind the message. “Marry and Reproduce.” “No Thought.” “Obey.” “Do Not Question Authority.” “Consume.” He is also able to see the creators of this grand illusion—aliens disguised as humans living undetected among us, having already conquered and pacified humanity without our knowledge.


This crude satire plays on a 1960s-era notion of power in which there is a conspiracy among powerful elites to inoculate us against thinking. This is the basic dystopian plot that drives George Orwell’s classic novel 1984: the state manipulates what we see and hear so thoroughly that it creates an alternate reality. We are being plied with disinformation by an enemy who conspires to control us by putting us to sleep. This motif is quite common. We see it again in The Matrix, in which artificial intelligence is the secret cabal pulling off a much more elaborate mass deception.


These plotlines no longer work as well in the 21st Century. For these allegory-of-the-cave storylines to work, the audience must possess a binary conception of reality—true reality vs. faux reality, with a veil separating the two. In a kind of reformulated gnostic conspiracy, the mass of people are deluded while a small group of enlightened souls is able to see the truth. But audiences in the modern era are far too cynical to believe in this kind of dual reality. On a very basic level, we are all in on the deception now. We know it is occurring—not a single veil concealing the truth, but a million veils, each one a different size and opacity. Imagine what happens if a modern cell phone-carrying Instagram user happens to pick up those glasses. She might shrug and say “so what,” discarding them and continuing on her way. She might be briefly entertained, having discovered a new faux reality she has yet to experience. She might wonder if the glasses are a new form of virtual reality goggles, and then, her interest piqued, begin frantically searching the internet to locate the brand. There is no moment of profound revelation because the metaphors that allow us to entertain the dualism of true world vs. false world have themselves been co-opted. The “red pill”/“blue pill” choice has been memed and turned into a hundred GIFs and video clips that exhaust its power of representation. We just live in the maya now. No hope of finding a desert of the real behind it all.


The Hunger Games trilogy is the new cinematic allegory for how power works: everyone knows they are being manipulated and deceived. The question is, can I create a convincing enough performance of my own that will help me survive or prosper? Of course we are saturated in falsehoods and deception. Of course lies are everywhere, big lies and small ones. Of course no one is real on social media, even when they are representing themselves as authentic. In this social construct, there can be no grand deceivers or grand deceptions. There is no Great and Terrible Oz hiding behind a curtain.


Because we have abandoned the grand deception narrative, we cannot as easily blame the powerful and privileged for deceiving us. Our increasingly dystopian society resembles Aldous Huxley’s Brave New World more than 1984 because it is distraction, more than deception, that defines the individual’s relationship with his or her mental environment, and distraction is a thing we participate in. Our ability to think, critically or otherwise, was not undermined by a nefarious political force but by a social and economic one that opened a portal to a new world wherein we can swim in information like tiny fish lost in a vast ocean. There was no conspiracy to suppress thinking, but at the same time, no serious person can argue that the purveyors of Facebook, Twitter, Instagram, and Amazon want more critical thinkers in the mix.

***


I remember the joys of thinking. Long, unadulterated, uninterrupted stretches of it.


In 1999, I lived alone in a house in Frenchtown, NJ, and would spend hours walking along the river, lost in thought. That was the year I stopped watching television just to prove that I could. I did not have a cell phone. I had a landline and a bulky desktop computer with a dialup modem. I had my books, my journal, my conversations with friends, and my active imagination. My mind felt alive and pulsing with possibility.


If you had jogged past me on the river path during this golden age of thinking and happened to meet my gaze, you would have found me almost fully ensconced in my mind. It wasn’t always pretty or “organized”; often it was neither. It was not free of outside influences either; no mind is. But this bout of thinking was mine. I might have been composing dialogue for a novel I would never write or trying to remember the words to a 1970s Big Mac commercial. Highbrow or lowbrow, it didn’t matter. That time, that stretch of mental activity, was mine alone, and it was sacred.


Back then, I cultivated thinking. I made room for it in space and time. Giving up television was part of that cultivation. TV was, for me, the most significant distractor. Since childhood I had been seduced by it, drawn into countless hours of passive consumption of media. Though I had access to the internet, it had not yet achieved that same level of seduction for me, so giving up television produced a clear benefit to my overall mental environment.


The other aspect of my cultivation of thinking was positive: I embraced the daily ritual of long walks by the river. Walking like this was synonymous with thinking. There was no smartphone or smartwatch to pull my attention away. I would walk, and my legs and arms, pumping like pistons, would drive thirty or forty minutes of glorious mental activity.


Jump ahead twenty years, and I now have a more impoverished relationship with thinking. There are television screens in half of the rooms in my house, as well as iPads, desktop computers, iPhones, and a watch that tracks my steps and heart rate. It is still possible for me to take long walks without my phone, but if my wife sees me, she will remind me to bring it with me “in case I need you.” Bringing the phone—having it on your person at all times—is now synonymous with common sense. Those long stretches of quiet retreat I once enjoyed are now further out of reach because that great army of engineers has mostly won the attention war by making hardware and software so seductive that we want it turned on and nearby at all times. Their employers in Big Tech have figured out how to colonize those glorious hours in which I would once retreat into my own mind.


The Thinker. I can still picture him in my mind’s eye, engaged in a sacramental activity. Big Tech, on the other hand, does not consider your mind or my mind to be sacred. In the attention economy, the holy grail is a dopamine response. They see the slack look on The Thinker’s face and see an opportunity to profit by pushing content into an idle mind. Like the Pilgrims arriving in the New World who saw only an unoccupied howling wilderness that would need to be tamed and civilized, the architects of the attention economy see our minds as empty, undeveloped land. And like all good capitalists, they want to exploit the vacancy, to lay track there (or mine the iron ore and coal to make the steel) to build their own version of the transcontinental railroad.


If they could figure out how to mine your dreams while you sleep, have no doubt that they would do it.

