Philosopher Clerihews

Invented by Edmund Clerihew Bentley, the clerihew is a four-line poem composed of two rhyming couplets with the scheme AABB: a famous person is named in the first line, and the remaining three lines relate an accomplishment, failure, biography, anecdote, rumor, or joke about them. Contrived, silly, and fun to read, these humorous poems can actually be quite educational while still being entertaining. After reading some of Jacques Barzun’s clerihews on philosophers, I was inspired to write my own. Following are 16 clerihews on different philosophers. I have tried my best to make them concise summaries of their philosophies!






Henry David Thoreau
Was a very thorough
Observer of nature
Who used botanical nomenclature


Martin Heidegger
Conceived upon his ledger,
That what was once concealed
Would in a new beginning be revealed


Michel Henry
Did French phenomenology
And he into life inquired
Whence he from interiority acquired


Friedrich Wilhelm Nietzsche
Tried to preach the
Death of God, and of the slave morality
Favoring instead: Übermensch mentality


Arthur Schopenhauer
Believed in the instinctive power
Of the blind Will-to-Life,
So his pessimism was rife


Epictetus
Had to accede this:
Some things are outside our control
So with the punches we must roll


Edmund Husserl
Made unfurl
In his phenomenological prolegomena
The bracketing of experienced phenomena


Plato, or Aristocles,
Had found the keys
To the fundamental reality,
Which was actually ideality


Socrates
Did not like Apologies
So he rushed out of the cave
And made dialectic all the rave


John Stuart Mill
Had had his fill
Of individual liberty:
He used it as a Utility


Thomas Kuhn—
Why’d you have to ruin
All of scientific history
By reducing it to anomalistic mystery?


Søren Kierkegaard
Was the first of Existential regard
Whose melancholy made him weep
And whose faith made him take a Leap


Thomas Hobbes
Was moved to sobs
When he found life was short
And served the Leviathan’s royal court


Blaise Pascal
Was a real ras-cal
Who liked to gamble
In his theological preamble


John Locke
Pictured a rock
And said it was qualities, primarily
Conceived on a blank slate, summarily


George Berkeley
Said, “Esse est percipi,”
Meaning he couldn’t find
Anything outside his mind

Should I write more philosophical clerihews? Maybe in other subjects as well, like history, literature, and psychology? Make sure to leave your own in the comments, and I’ll be sure to read them!



Kafka’s “The Trial” in a Poem

Suddenly one morning, Joseph K is arrested at his home
Apartment to apartment, from lawyer to lawyer, whither he roams,
He discovers everything is beneath the Court’s unassailable dome.

The trial wraps itself around K’s neck like a noose;
It looms overhead, ambiguous, following like a cloud,
So that K, argumentative, confident, innocent, cannot hang loose.

On consulting the painter, K decides to drop his domineering lawyer,
With whom he’s dissatisfied, despite the daunting danger,
And of all the women he’s been with, he harangues her (Leni).

Reposed and ready for his final trial, K’s once more ripped from his room;
And dragged through the streets, as if “guilty” of a crime, he finds he can’t fight time,
For “the Law” has spoken, has driven into his heart a knife—yes, the clouds still loom.

Ycleped by a priest, a “door-keeper” of the Court, K is told a story:
A man is kept from the Law by a door-keeper, who closes it off for him.
K cries, “The door-keeper’s deceptions do himself no harm but do infinite harm to the man” (242).

A Very Short History of the Dream Argument

Dreaming is an integral part of our lives, occurring every night when we are asleep. While the body relaxes, the brain stays active, creating a stream of thought, a stream that comes from the unconscious. Recent research into a technique called “lucid dreaming” shows that people can learn to control their dreams, to place themselves within their illusory world, making their dreams a kind of reality; however, lucid dreaming, as fascinating as it is, presents a troubling problem, one that has intrigued humans for millennia: How do we know for certain we are not dreaming right now? How do we distinguish our consciousness, our awareness, from the unconscious, the unaware? Are we actually asleep at this moment, life but a mere string of thoughts and sensations?

Defining dreaming and consciousness will help, as both concepts, simple though they may seem, are highly complex, each with its own requirements, psychological and philosophical. Consciousness refers to “the quality or state of being aware especially of something within oneself”; in other words, consciousness refers to the realization or acknowledgement of the mind and its inner workings.[1] If you acknowledge that you are reading right now, you are conscious of yourself as reading; consciousness is always consciousness of something, be it an activity or a mental state. American psychologist William James thought consciousness was not an existent thing, relating it instead to a stream, a series of experiences, one after the other, each distinct from the last. Neurological studies later identified consciousness, the awareness of the brain, with processes within the brain itself, associated with the thalamus. Dreams, on the other hand, are defined as “a succession of images, thoughts, or emotions passing through the mind during sleep.”[2] Dreams differ from person to person, which makes it difficult to verify a remembered dream, considering it cannot be proven true or false. It is therefore difficult to differentiate the waking state from the dream state, insofar as both are collections of experiences.

Many philosophers, dating from the 5th century B.C. to the modern day, have attempted to tackle the “Dream Argument,” trying to prove that we are in fact living consciously. For example, Plato mentions it in a dialogue: “How can you determine whether at this moment we are sleeping, and all our thoughts are a dream; or whether we are awake, and talking to one another in waking state?”[3] Socrates was interested in finding out whether our senses are reliable, whether what we see, hear, taste, feel, and smell is real or a figment of our active minds. Perhaps when we fall asleep, when our brains switch to R.E.M., when we dream, there is a dreamer dreaming this dream. Another philosopher, René Descartes, writing in the 17th century, famously responded to the Dream Argument with “I think, therefore I am.” Descartes entertained the thought that his whole life was an illusion, a trick played on him by a divine being, that he was misled into believing reality. He started to doubt everything, including his senses; but one thing he could not possibly doubt was his existence, his self, because in order for him to doubt, there had to be a him to doubt in the first place!

Even though some of the greatest thinkers could not deny the Dream Argument irrefutably, at least we know from science that we exist, that dreams are just processes happening in the brain, and that reality is as real as it gets, dreams being a product of our imagination… unless we actually are dreaming, just waiting to be woken.



[1] “Consciousness.” (January 19th, 2017)
[2] “Dreaming.” (January 19th, 2017)
[3] Plato, Theætetus, 158d


If you have a lot of free time:

The Media, Democracy, and the Public Sphere [2 of 2]

Click here to read part 1 if you have not already (and make sure to leave a like)!

Today’s technology-driven world is also system-dominated. A system, Habermas thinks, is any division of labor paired with productive forces and knowledge. Systems operate through instrumental reason, or ends-means rationality: the ends justify the means. Organizations and the state, accordingly, can manipulate the public with publicity, diverting their attention. The government tends to focus on technical problems, replacing democracy with bureaucracy, resulting in a democratic deficit, where principles of equality and consent of the governed lose their importance to Habermas’ “technocratic consciousness,” a state of mind brought forth by increasing specialization handled by authorities, experts, and professionals, each of whom spreads propaganda under technical jargon, claiming to be “fixing” some new problem. These technical problems are those to which social, pragmatic, pressing, and vital problems are subordinated. Technological ideology is not delusional per se, as other ideologies are, such that its believers are under an illusion, misguided and misled, although it is ubiquitous, as other ideologies are, infectious, spreading like wildfire. As such, the technical dominates the practical, thereby displacing personal ethics. When a decision is made, its ethical dimensions are not considered; it is an ends-means instrumentality. Simply put, technology is self-determinative in terms of its values, which makes it a threat to democracy (in excess, of course, as technology is not intrinsically bad).

The commercialization of the press has led to the death of intellectual journalism. Drama takes precedence over detail, personality over policy. During the election, the press notably focused less on the actual issues and more on the candidates themselves. Rational discussion was thereby taken from the people, who were distracted from it. Back in the 18th century, the bourgeois educated middle class read the newspaper daily, then went to the salon to discuss it with their peers. Now, the newspaper is still read daily, although not to the same extent. Consumers watch TV for hours every day, without ever exchanging discourse. Listening to the radio, watching TV, we cannot “disagree” with the media, in a sense, because it “takes away distance,” to use one of Habermas’ phrases, by which he means that we are so close to the media that we cannot engage with it; we cannot talk face-to-face with the television or the interviewer or host who is speaking, but are forced to sit there, inactive, passive, taking it in, unable to respond critically. “The public sphere,” notes Habermas, “becomes the sphere for the publicizing of private biographies.”[1] News, publicity, focuses on celebrities, scandals, and politicians. It dramatizes everything they do, reporting it as news, using names to attract and tempt us, making a story out of anything it can get, in order to profit off of it. Rather than examine the policies and character of a person, the news analyzes their personal life. Habermas reflects ironically on the fact that, in the 19th century, ads in the press were considered dishonest, so they took up only 1/20 (5%) of the page. How things have changed! Take a look at any newspaper, even a respectable one, and behold how the whole page is practically taken over by ads! Editorials are advertised and lose their meaning.
Advertisement gives a sales pitch, clear as day, but PR is more dangerous than advertisement because it exploits the public with attention-grabbing publicity, taking cover beneath the protection of the press.

Moreover, newspapers are dumbed-down. Publishers play around with type and font, adding flashy images and illustrations that distract from the content, Habermas points out. The supervisors, just figureheads for their representative companies, get to control which topics are covered, scrapping any of which they disapprove. They “serv[e] up the material as ready-made convenience, patterned and predigested. Editorial opinions recede behind information from press agencies and reports from correspondents; critical debate disappears behind the veil of internal decisions concerning selection and presentation of the material.”[2] Debate, once a byproduct of the press, is itself commodified, restricted by formalities, aired to be watched without intervention or follow-up discussion. For this reason, debates are reduced to mere “personal incompatibilities,” trifles, minor disagreements, surrendering themselves to the rampant relativism of the 21st century. In newspapers, “delayed-reward news,” valuable and informative, is vanishing, and in its place rises “immediate-reward news,” which is tainted with too many clichés, touched up with drama, and made to sparkle with hyperbole, such that “the rigorous distinction between fact and fiction is ever more frequently abandoned.”[3]

By commercializing the press, the rich manage to hold onto power. They use propaganda to limit democracy. Playing the victim card, they complain that the wealthy minority are under attack from the powerless, uneducated majority. To combat the democratic instinct, they push for the “indoctrination of the youth,” a phrase actually used in official documents, emphasized by American philosopher Noam Chomsky (1928-) in his A Requiem for the American Dream (2016) to critique the abuses of the media. Institutions like schools were told to tighten their requirements, to create criteria for education that would brainwash children. The term “brainwashing” probably conjures up connotations of conspiracy; the fact is, brainwashing is very real, and very common, a technique mastered to influence people. Institutions try to limit free thought, in hopes of making everyone conform to a single cutout. To cite an example, Chomsky refers to the Trilateral Commission, an organization which, responding to the 1960s, attempted to develop a “proper” society. There was purportedly “too much democracy,” so it needed to keep the masses in check, making people conform, passive, unquestioning. In the post-Cambodia U.S. of the ’70s, common spaces like the library and the debate hall were closed off in universities to discourage critical discussion. In other words, the government attempted to shut down the public sphere, to prevent any criticisms of the state. Anyone who critiques the government, usually the educated minority of intellectuals, who impugns the media, is denounced as “anti-American,” a term which Chomsky traces to totalitarian regimes. To reduce criticisms of “concentrated power” (the state plus corporations), the government discourages critical talk, alienating critics, calling them traitors to the state, much as the Soviet Union did. Journalism is stifled; the public sphere cannot engage critically or rationally.

Famously, Chomsky said, “Propaganda is to democracy what violence is to totalitarianism.”[4] PR, then, is a method of cracking down on dissent, be it violent or nonviolent, a means of silencing critics and enforcing strict rules. Propaganda is more dangerous than censorship, he argues, because it, like PR, parades around as the public sphere, but is actually deceptive and misleading. Propaganda is brainwashing. This development of PR and of propaganda stems from Edward Bernays, who coined the phrase “engineering consent,” a concept studied in depth by both Chomsky and Habermas. Bernays created what one official called “consent without consent,” because with the work of Bernays, PR was able to make decisions for people. As Chomsky relates from David Hume, power lies in the hands of the people; but if the people are made to think they have none, they will be powerless, and the government powerful. So the government exploits this. Fabricated consumption, a Veblen-esque term used by Chomsky, refers to the consumer culture of today, a culture in which we are told we need things, rather than want them. The media everywhere shouts, “Look at me!” “Buy this product!” Consumption is both uninformed and irrational, when it is supposed to be informed and rational! Evidently, all this played a role in the 2016 Election. Rather presciently, Habermas writes that, with the decline of the critical public, those who do not ordinarily vote are swayed “by the staged or manipulatively manufactured public sphere of the election campaign”; notice the use of the word “manufactured.”[5] The presidential candidates were portrayed in a certain manner on purpose, because the corporations who owned them leaned in a certain direction.
Because the media was biased and commercially influenced, it created a terrible environment for discussion: not a garden where ideas could grow, but a desert, waterless, where nothing could take root. Discussion was neither informed nor rational. Even if there were rational discussions, they were not factual, for the media reported no facts upon which to base them. This kind of political climate is poisonous and offers no room for critical debate. “[A]n acclimation-prone mood comes to predominate, an opinion climate instead of public opinion,” declares Habermas; i.e., there is no talk about policy or the positions of the candidates; all that remains are empty declarations like, “I’m voting for blah blah,” and “I’m pro so-and-so,” utterly devoid of thoughtfulness or deliberation.[6]

The decline of the public sphere and the commercialization of the media are no new concepts, even here in the U.S. In 1934, the first Communications Act was passed, formally establishing the Federal Communications Commission (FCC). This organization was created to handle media concerns in service of the public interest. Then, in 1949, the controversial Fairness Doctrine was passed, a policy that required the media to cover pertinent, controversial topics and to give equal airtime to opposing viewpoints, so as to allow for fair, balanced reporting based on facts, promoting discussion between parties rather than parochial, sectarian coverage that supported one side and disparaged the other. In instituting this, the FCC wanted to foster rational discussions in which both sides could be heard, so that citizens could make up their minds, instead of listening to one side and forming their decision without a second thought. With the Fairness Doctrine, the pros and cons could be heard and weighed, challenged and defended. There would be less party polarization as a result, a problem we face very much today. The question of the policy’s constitutionality arose, however, and it was challenged for impinging on First Amendment rights, so it was repealed in 1987 and formally eliminated in 2011. In 1975, the Cross-ownership Rules were passed by the FCC to “[set] limits on the number of broadcast stations — radio and TV — an entity can own, as well as limits on the common ownership of broadcast stations and newspapers.”[7] These rules stipulated that a company could not own multiple mediums; regulation of ownership was first defined thus. Giving equal voice to all media, the FCC made these rules to reduce and prevent media consolidation, the process in which big companies, or conglomerates, buy out other media companies and thus hold legal and economic ownership of them. Like Chomsky, the FCC wanted to stop the concentration of power.
This set of rules appears to be a victory for the public sphere; unfortunately, it did not last long, and tragedy struck when the Telecommunications Act of 1996 took effect. Suddenly, the FCC repealed ownership regulations, deregulating the media and allowing more companies to merge and consolidate. From 2003 to 2007, slowly but surely, the media was increasingly deregulated. Eventually, the Cross-ownership Rules of 1975 were null. Private concentration opened up. One of the terms stated that “whether a channel actually contains news is no longer considered in counting the percentage of a medium owned by one owner.” Companies could now hold 45% of the media market, as opposed to the previous 25% in 1985.[8] Thus began the rise of oligopoly. By 1985, 50 companies controlled the media. Since the Telecommunications Act of 1996, over the course of several years, that number dropped infamously to five (or six, depending on the source) companies: Comcast, The Walt Disney Company, 21st Century Fox, Time Warner, and CBS/Viacom. Most recently, many an American has prophesied the “death of the Internet” as a result of a decision that took place on December 14, 2017: the FCC, after a long fight, repealed Net Neutrality. Why is it regarded as the death of a free Internet? Because big corporations, such as Comcast, can now control data as they please. It used to be that data carriers distributed connections equally, but now, with Net Neutrality repealed, just as with the Cross-ownership Rules, oligopoly can thrive, meaning big companies control the market, stamping out smaller competitors, all in the name of money.

And what of fake news? What is it, and what implications has it for democracy and the public sphere? Fake news is defined as “false, often sensational, information disseminated under the guise of news reporting.”[9] Put another way, fake news is erroneous, nonfactual information aimed at getting attention, often with the use of shock to attract people. It conceals its falsehood under the “guise,” or cover, of “news reporting”; it uses the authority of the media to pull off its stunts. This is an existential threat to democracy for several reasons. First, it deceives the public. The public relies on the media to get information, but the press supplies them with none; or rather, it does, but it misinforms them about everything, seeing as it is fake. Second, it besmirches the reputation of the media. Each time we read fake news and catch it, we lose more and more trust in the media, because we know we cannot believe a word it says. Considering there are good, factual, respectable presses out there, this is disadvantageous because the preponderance of fake news drowns out the good news, meaning the media in general loses its credibility. Third, fake news does not make for critical discussion. If it is fake, then it is not factual, and if it has no facts, no logic, then it cannot be rational in any capacity. Fourth, it signals the collapse of the public sphere and the recrudescence of feudalism, devoid of any criticism.

In a study done by Media Matters, Facebook was found to be one of the leading sources behind fake news circulation. Due to its algorithms, Facebook works like this: the more likes or views an article gets, the more it circulates, the more it spreads. The circulation of news is an active engagement; the more we interact with it, the more it interacts with us. Like a viral agent, the more it spreads, the more hosts it enters, which, in turn, spread it further, multiplying exponentially. Just clicking on the article, just coming into contact with it, tells the system to send it to more people. The code says, “Oh! this must be popular, seeing as many people are clicking it; I’m sure everyone else will like it…,” and so sends it to more and more people, who then send it further. Worst of all is the fact that fake news is not ideological but commercial. Fake news is not necessarily for promoting a party or supporting one candidate intrinsically; rather, it is all for money, not surprisingly. One might find this fact hard to believe, as there were countless pro-Trump and pro-Hillary (and, vice versa, anti-) articles. But the fact is, these fake articles that spread rumors or intentionally provocative comments are advertised not to gain support for either candidate, but to pander to their supporters, and so to make money. Yes, the advertisements were sent to the respective supporters, but the point was not to help the campaigns grow; it was to get people to click, thereby making money. It is not unknown that Facebook sells private information about its users. Millions of private accounts have their information sold to companies for large sums of money. Once the companies have our private information, they can manipulate us; they can manufacture our consent.
If I were to put on my account that I supported a particular candidate, and if my information, which is kept private, concealed from public view, were sold to a company, then it could look at my profile, see whom I support, and send me advertisements and articles supporting that candidate, or denouncing the other, and I would not be able to resist: after all, we love to engage our subconscious biases, and any contrary information strengthens our resistance. Large companies, then, do us a disservice, pandering to us, selling us what we already like and know, entrenching us in our beliefs, leading to confirmation bias, and ultimately making them lots and lots of money. Facebook has ads absolutely everywhere. Hence, they make money off of us. Going back to the threat of fake news, the biggest problem is its evolution. Originally, fake news was intentionally false, provocative, and contentious, designed to draw readers in, readers interested in finding out about the latest scandals, believable or not, with the purpose of entertaining. An example would be some kind of conspiracy, like “Hitler still alive in secret bunker in Africa.” This is “sensational” news. Fake news is now a disguised predator, a wolf in sheep’s clothing, preying on us gullible readers, presenting itself as real, authentic news. See, whereas sensational news was meant to be explicitly entertaining and false, fake news is more believable than it used to be, meant to mimic real news, to pull us in with “facts”; it looks real, but is deceptive, too good to be true. Taking up the mask of real, credible news sources, these fake sites adopt media names, like “San Francisco Chronicle Gazette” or “Denver Guardian.” The president of Media Matters, Angelo Carusone, remarks, “These sites actually capitalize on people’s inherent trust in the news media.”[10]

We pride ourselves on our democratic freedoms of speech and press, yet nothing could be further from the truth. Today is the age wherein left becomes right, up down, and right wrong, when everything we have come to know is flipped upside down, every fact we have accepted needing to be checked, then re-checked, just to make sure it is not “fake.” Such is the time we occupy. We cannot trust our media. There is a fundamental lack of discussion. Silent, powerless yet powerful, we have the power to make a change, if we want to. I am sure none of us would like to live in a country where the media purposefully obscures the news, covering up the government’s actions, adding glitter to them, to keep them from appearing as they are. And yet, we live in one. It is not so distant from a totalitarian state as we might think. Chomsky thought Orwell would be impressed, impressed beyond horror, at the extent to which we as a civilization have abandoned truth and honesty in our coverage of the government. The public sphere as we have come to know it has faltered, trampled beneath our feet, like a clerk on Black Friday, as we insatiable consumers burst through the doors, indiscriminate, hungry, willing to feast on whatever is presented before us on a fancy platter. Bibs fastened around our necks, knives and forks tight in our fists, we voluntarily feast on the shiny and tasty-looking desserts placed in front of us, instead of eating our vegetables, salutary, good for us, though not as inviting. We have failed the public sphere. Rational discourse has been abandoned. But if we take the time to talk with one another, engage in discussion, and do our research, reading up on the latest news, attentive, then we can bring back honest, intellectual journalism. We must make our communication authentic.


[1] Habermas, The Structural Transformation of the Public Sphere, p. 171
[2] Id., p. 169
[3] Id., p. 170
[4] Chomsky, The Chomsky Reader, “The Manufacture of Consent (1984),” p. 136
[5] Habermas, op. cit., p. 214
[6] Id., p. 217
[9] (9m10s)
[10] Id., (9m18s)

For further information:
The Structural Transformation of the Public Sphere by Jürgen Habermas (1991)
Introduction to Critical Theory: Horkheimer to Habermas by David Held (1980)
The Penguin Dictionary of Critical Theory by David Macey (2000)

Chomsky on Democracy & Education by Noam Chomsky (2003)
A Requiem for the American Dream by Noam Chomsky (2017)
Dictionary of Sociology by Nicholas Abercrombie (2006)
The Chomsky Reader by Noam Chomsky (1987)

Social Imaginaries by Charles Taylor (2005)
Media Cross-ownership
Consolidation of Media

Facebook and Fake News

The Media, Democracy, and the Public Sphere [1 of 2]

It is hard these days to distinguish “fake” news from “real” news. Browsing the Internet, checking our social media accounts, we come across dozens of advertisements and articles, all of which vie for our attention, one claiming to have found the secret to instant weight loss, another “exposing” a celebrity scandal, yet another reporting “objectively” on the Trump administration, or commenting on foreign affairs. This is a gullible age, in which we believe everything at first sight. If it piques our interest, then we click on it. We depend on the news to get information regarding important affairs in our country and around the globe; without it, we are no different than the early Europeans, who were ignorant of the New World. It is a problem, understandably, when this very source of knowledge from which we get information about the world is no longer to be trusted, when we are forced to be wary, vigilant, and cautious of whether or not it is true. How telling it is, to have a media that requires its facts to be checked! As it is the press’s job to report on what is happening and give us citizens the lowdown, it is not too much to ask that it be objective and tell us what we need to hear, not what we want to hear; because sometimes, what is the case is not comfortable. But as has been evidenced by the 2016 Election, the media has failed miserably in its service, failing to give the facts fairly and impartially, too concerned with presenting opinions, caught up in commercial interests, its duty not to the people but to the rich and influential to whom it panders; it has impoverished many an American, depriving him of the truth, so that it has become a medium through which biases and polarizations are disseminated. The decline of the media’s duty has resulted, and is resulting, in a terrible thing: a crisis of democracy.
Veritably, the media’s loss of power in presenting the truth paves the way for democracy’s dim demise. Furthermore, this loss of the media is a sign of another deterioration, one vital to democracy: The public sphere. With the failing of the media comes the failing of the public sphere, and with it the failing of democracy. What is the public sphere, and what relation has it to democracy? What happened to the media? Why is fake news an existential crisis for democracy? Such are important questions, which have to be answered.

The theory of the public sphere was developed in detail by philosopher Jürgen Habermas (1929-), a German member of the Frankfurt School of Critical Theory, the focus of which was to critique society through a Marxist lens. Historically, the public sphere originated in the 18th century during the Enlightenment, arising in local salons and coffeehouses, where the public would gather. Usually, such locations were frequented by the bourgeois, for they were the educated middle class. Heavily into reading, having been brought up with a fine education, these intellectuals (among whose ranks were philosophers like Voltaire and Rousseau, writers, publicists, playwrights, and scientists) would come together in a central location and talk, sharing ideas from their works. These tight-knit groups of intellectuals, philosophes, and men of letters congregated to discuss politics. The governments of the 18th century were largely monarchical. For this reason, government was representative; that is, it did not communicate directly, but indirectly. It did not present itself candidly, but represented itself, with false semblances, giving appearances but never real insights; government happenings remained mysterious, concealed from the public, covered up so that the ruler’s intentions were never disclosed, leaving the people in a cloud of confusion. Inside coffeehouses and salons, the bourgeois exercised freedom of speech and freedom of the press, both of which they used to their advantage. Outside of political purview, they could speak freely and frankly with each other, without fear of punishment. In order to publicize their ideas, in order to make known what the government tried to make unknown, in an effort to enlighten the people, the bourgeois created public centers and presses so they could help circulate and spread news and ideas.
Newspapers sprang up rapidly throughout Europe, especially in England and France—intellectual hotspots at the time. The public sphere operated under Enlightenment ideals, such as the general will and the public interest. When the public discussed an idea, they made sure to come to an agreement. This means: they listened to what everyone had to say, debated, then came to a decision, although not an arbitrary one, nor one arrived at by the majority, but one made according to the general will, the common interest, a piece whereof was shared by everyone involved. Since “consent of the governed” seemed to fall on deaf ears in government, it fell upon the people to take care of themselves; the public sphere served the people, and it was indebted to them, for they were truly sovereign.

As such, the public sphere was the successful bridging of the public and private. That is to say, the public sphere brought those from the private domain—separate, private individual citizens—together centrally in a local area—the public. Private life is the life lived by oneself, in the comfort of one’s home, in one’s everyday routine. Accordingly, the private sphere was merged with public interaction, birthing the public sphere, where private individuals created public intentions. Where they assembled became “common spaces,” from which comes the basis for the word “sphere” in “public sphere.” Take the Internet: although its users are separated by screens, some in the same town, others miles away, in different states, countries, or continents, collectively they identify as “Internet users,” which is to say that the Internet is not a single location, but an abstraction. The sphere is spread across multiple mediums—what Charles Taylor calls its quality of being “metatopical,” or beyond location. For example, hundreds of people can assemble in a stadium, which we can then call a “common space,” physical and exact; but hundreds of people can assemble on the Internet, which we can then call a “sphere,” in this case the cybersphere, digital and inexact, though extending variously. The discourse within the public sphere goes beyond physical space, not confined to a single spot where many convene, as we have seen, but spreading out. Here, in the sphere of the public, the principle is public opinion. Public opinion, Taylor points out, must be divided into the merely convergent and the common. Common public opinion—public opinion proper—is a singular, focused goal, whereas convergent public opinion is a melange, a coinciding of intentions, interrelated, yet bearing no unity. In other words, common public opinion is a public commitment, convergent a private commitment.
The difference can be illustrated thus: Intellectuals discussing politics in Starbucks is common public opinion, while sports fans coming together in a stadium is convergent because they have no collective goal, rather they are all converging, or coming to a point, through different, private paths. Taylor cites public polls as an example, interestingly, of convergent public opinion; this is because the people responding are doing so privately, committing to something individually, which, when added up, is public, and their answers are diverse, not at all unified. Common opinion is something agreed upon. Essentially, the public sphere is extrapolitical—it is outside the domain of politics. Whether you are a supporter of a sports team or a member of a charitable organization, you are recognized as a part of something official. You are an “official supporter” or “official club member.” However, the public sphere is not an official organization; it is the exact opposite. It is recognized by the people, not the government. Hence, it has no real power; rather, it is a check on the government, a means of balancing out its power. The “public” is not a determinate thing. It is an abstraction. It is not an individual; it is a collective. It is only a sphere when the public makes it one. As has been said, it is not a political association—far from it—but the coming together of the private into the public. The common opinion is focused on critiquing the government, the stress being on the sovereignty of the people, the consent of the governed. Ideally, the public sphere is a statement, one that states, We are the people, and you, the government, should listen to us. The public sphere demands the government’s attention. It demands the principle of supervision, which says a government’s actions ought to be made public to the citizens.
Legislation ought to be made manifest so it is rational (using logic and reason), pressured (to make moral decisions in front of the people), and democratic (in the name of the people, who are involved).

So what is the purpose of the public sphere, and what is its objective? The public sphere, we have said, is to serve public interest. Now that we have the why, we need to know the how. Conceived in the Enlightenment, the public sphere is designed for critical thinking and discourse among the people, separate from the government. Communicative action is the theory which argues that, through language, things can get done. When we tell someone to do something, and they do it, we have created action through communication. In the public sphere, the goal is to debate politics and achieve communicative action by means of discourse, in which everyone takes part, ultimately so that a consensus is reached, upon which a resolution is made, and action follows. Again, this is local, not institutional, so communicative action can be made anywhere, from a supermarket to a modern coffee shop, as long as it is a common space. What comes from debate should be “mutual comprehension” or “compatibility,” according to Habermas. This means people understand one another, understand their views, and their ideas can be related to one another, combined, or subsumed, as in a dialectical synthesis. After all, this is the desired outcome of any debate: Two or more people argue with logic to back up a side, listening to their opponent, then devising a response, with the intent of coming to a conclusion that is agreed to, thus settling the matter. It is important that it be mutual, or two-sided, because Truth can only be achieved through a consensus, a common understanding. The public sphere has a climate of debate where people can argue rationally, defend logically, and challenge politely. Social dialogue is constituted by the public sphere. Discussion is meant to be free, open, impartial, and critical. Merging Freedom of Speech with Freedom of Assembly, the public sphere cultivates what Habermas calls an ideal-speech situation.
Once achieved, an ideal-speech situation is a circumstance in which unimpeded, unfettered free speech is allowed to flow. People can speak their minds freely, ready to engage with others. Discourse ethics is the field of ethics that covers the morals of discussion, so the ideal-speech situation is an ethical doctrine, as it sets up a paragon of critical political debate. In an essay concerning universal pragmatics, written to delineate the proper usage of speech and guidelines on how to communicate effectively, Habermas came up with four aspects of effective communication.

  1. Comprehensibility
  2. Truth
  3. Correctness
  4. Sincerity

In conclusion, the ideal-speech situation generated by the public sphere is a situation wherein private individuals can gather to speak their mind, hear out others, challenge, defend, create arguments, and come to an understanding consensus—one that reflects everyone’s opinion.

Does the public sphere still apply today, and to what extent? What happened to the public sphere, and why is it collapsing? It would be arrogant to assert that the public sphere does not exist today, for political talk is as rife as ever: there are dozens of shows, radio programs, and broadcasting stations covering politics, all offering commentary on current events, collaborative and contentious alike, and the process of globalization has allowed for far greater coverage, so that more people get involved every day, and more tune in, creating a popular and argumentative political environment where debates take place, whether in YouTube comments or in high-school corridors. However, it would be equally presumptuous and ignorant to deny that the public sphere has lost much of its potency, having diminished greatly since its creation in the 18th century. There is undeniably more involvement in politics in this age than in the Enlightenment, yet there is concerningly less critical involvement, much less commonly public involvement. The literary circles of the philosophes, who would eagerly read the newspaper, excited to share their thoughts, ideas raging, have degenerated, and we no longer see this kind of intellectual commitment to political debate. In the centuries dividing these two ages, our media has grown immensely; now billions of people are interconnected globally, able to interact with one another. But newspapers, reporters, and radio shows still fail to incite critical discussion.

Whom or what are we to blame for the failure of the media in the public sphere? Habermas says the answer lies in the structural transformation of the public sphere. This structural transformation, he explains, is the commercialization of the public sphere. What was once a noble critical culture (kulturräsonierend) became an ignoble consumer culture. As soon as the media came under the sway of money, it became commercial, no longer a service for the public but a business for consumers, something which no longer informed but sold, which dealt not in news but in commodities. News turned into ideology. Reporting used to be objective: it wrote down the facts, checked them, reviewed them, then published them. Nowadays, reporting is subjective; reporters can write not about the event itself, but about their personal reaction to it, a reaction colored by their beliefs. It focuses on how a person is portrayed, instead of what they actually did. As such, the editorial records what one feels or thinks, rather than what happened. Habermas writes that the press became the “gate through which privileged private interests invaded the public sphere.”[1] As soon as private corporations began taking over the press, the media became a medium through which to spread their self-interest. The result is the endangerment of the public sphere, because critical public institutions like the press were once protected precisely because they were private, in the sense of being extrapolitical and composed of like-minded individuals; now, however, privateness itself poses the threat, as the government and rich, powerful corporations collaborate to keep themselves safe from censure, colluding to keep their power from the people. An example is newspaper credibility: the authority of a newspaper these days rests in the publisher, not the publicist.
If we come across Time, we immediately impute trustworthiness to it, despite the fact that we do not check who actually wrote the articles themselves. A writer can easily write whatsoever they please, and if it is approved by the publisher, it will be considered trustworthy, since we invest trust in the publisher, disregarding the publicist, to whom we ought to give equal attention. All this, of course, is spearheaded by the big corporations, who gain from this authority, using it to their advantage. While it can be argued contrariwise, Habermas says editorials were once respectable and intellectual, but are no longer. Not only are they subjective, he says, but they, along with the press itself, are commercialized, advertised for money, so that writing is no longer an honest pursuit, but a financially motivated one, for which people vie. Just like the Prætorian guard that swore to protect the Emperor yet ended up selling the throne to bidders, so the press that swore to serve the public honestly and with integrity ended up selling editorials.

Another thing that has weakened the public sphere is opinion management, or public relations (PR). The mission of PR is to bridge the public and private, much as the public sphere does, to communicate between the institution and the people, albeit in a more devious manner. PR appeals to the private under the guise of the public. Howbeit, unlike advertising, which explicitly and openly shows itself as such, PR uses news authority as a façade, a cover, under which to represent private interests as “public opinion” and “news.” In other words, PR disguises commerce as something to which the entire public assents. Opinion management, therefore, leads to the ruination of discussion by means of “sophisticated opinion-molding services under the ægis of a sham public interest.”[2] Grabbing attention with drama, shock-factor, and clichés, and using well-known celebrities as sponsors to inspire conformity and trust, PR advertises its affairs with hyperbole and misleading exaggeration. These “suppliers” of news, as Habermas labels them, recreate the representative government of the 18th century that shrouded its intents from the public. Habermas names this the “refeudalization” of the public sphere, because it takes us back to a feudal hierarchy of society, where we, the public, are reduced to lowly vassals, servants, who are indebted to and dependent upon the powerful nobles, who hide their power. No longer is the public sphere reserved for debate; it is used to represent prestige; it no longer critiques the government extra-politically, but has publicity complicit therein.


[1] Habermas, The Structural Transformation of the Public Sphere, p. 185
[2] Id., p. 195


For further information:
The Structural Transformation of the Public Sphere by Jürgen Habermas (1991)
Introduction to Critical Theory: Horkheimer to Habermas by David Held (1980)
The Penguin Dictionary of Critical Theory by David Macey (2000)
Chomsky on Democracy & Education by Noam Chomsky (2003)
A Requiem for the American Dream by Noam Chomsky (2017)
Dictionary of Sociology by Nicholas Abercrombie (2006)
The Chomsky Reader by Noam Chomsky (1987)
Social Imaginaries by Charles Taylor (2005)
Media Cross-ownership
Consolidation of Media
Facebook and Fake News

Is Water Wet?—A Philosophical Inquiry

Recently, a question has been circulating around both the internet and, as I have experienced firsthand, my school, a question which is truly vital and which concerns mankind at its core. The question: Is water wet? I know what you are thinking. I will admit, the question is absurd, nonsensical, and, one might point out, fairly simple to answer. Yet many are torn up and in knots because of this simple question regarding an element with which we come in contact every day, one of the essential components of life. Some argue water is wet, others that it is not; and both have their reasons. What many of my peers neglect, however, is that this question is much more complex than it appears. Indeed, the question of whether or not water is wet is not a trivial, everyday question; rather, it is something for the armchair philosopher to ponder—yes, the question of whether or not water is wet is, at its core, philosophical. And it is this critical perspective which is missing, which could illuminate the problem. All philosophical problems, declared Wittgenstein, are ultimately reducible to language problems. That is, a philosophical problem is really just a miscommunication, a squabbling over terms that are not properly defined. As such, my approach to the question of whether or not water can be said to be wet requires that it be examined philosophically, and this involves an understanding of what exactly we mean by “water” and “wetness,” and how exactly the one can be related to the other. By the end of this, I hope to provide an aqueous solution to this conundrum with the help of Aristotle (384-322 B.C.) and John Locke (1632-1704).

What exactly is water? Scientifically, I can say that water is the molecule H2O, made up of two hydrogen atoms and one oxygen atom. While I can explain it through chemistry, describing how it is the way it is, its interactions, or its properties, I cannot properly understand water thereby. All we need to know is that water is a liquid, and for this reason its shape is fluid and it cannot be compressed into a solid. But this does not answer the question of what water is. Essentially, water is a substance. Substances, wrote Aristotle, “are the entities which underlie everything else, and… everything else is either predicated of them or present in them.”¹ In other words, substances are bearers of qualities; they are things that can be described. Like the subject of a sentence, a substance is that around which the sentence revolves, and it receives a predicate, or that which is said of the subject, which involves the verbs and adjectives. The substance can be described, as it takes descriptors but cannot itself be one. Just as a noun cannot be an adjective, so a substance cannot be a quality. As that which is being described, the substance can also contain within it qualities. Water, then, is a substance, because it can be described, it has qualities, it is that which bears qualities. One type of quality, the affective quality, modifies a substance by being present in it. Affective qualities produce, or effect, a sensation in the perceiver based on what the quality itself is. For example, the quality of wetness is an affective quality because it produces an effect in its perceiver, and when wetness is present in a substance—say, water—the substance is said to be wet since it has that quality in it. Accordingly, it is wetness which makes water wet. Water as a substance is amorphous. It has no definite shape, but can conform. Is being fluid a quality? No, argued Aristotle.
To be without resistance, to be fluid, is not to have that quality, but to be that shape. Despite its apparent lack of shape, fluidity is the way a substance’s parts are interrelated: they appear formless, though really they constitute a shape. Unlike affective qualities, the fluid, amorphous shape of water is necessary and essential to it. Water is distinctively fluid. Although water can be said to be “distinctively wet,” I will discuss this later. Another important term to know is accident, which is a quality that applies to things contingently, i.e., unnecessarily. Wetness is not essential to water. In its essence, water can be imagined dry, not to mention the fact that it is able to take two other forms of matter. From Aristotle, we have learned that water is a substance, which means it has qualities, whereas wetness is an accidental affective quality, which means it is unnecessary and descriptive, but by no means defining.

Locke was an empiricist of the 17th century. He believed that all human knowledge was derived from the senses and not at all innate. All information is made up of ideas, which are everything from thoughts to perceptions to sensations. Every idea we humans have is made up of simple ideas, individual sensory data that come from the senses. They can come from one sense or from two, though they become a single idea in the end. A specific smell, like honey, for instance, is a simple idea: it is derived from a sensory organ, the nose, and is singular. These simple ideas, once gathered, stay in the mind as if it were a warehouse, and can from there be made into complex ideas, which are aggregates, or combinations, of ideas. It is impossible to experience a complex idea directly, for complex ideas are made up of simpler ones—one cannot get 3 without first adding 1 and 2. Based on this, water is a complex idea to the extent that it is made of many simpler ideas, namely its color (or lack thereof), smell (or lack thereof), texture, etc. When combined, all these sensory experiences add up to the individual idea of water. Concerning qualities, Locke said there were three kinds: primary, secondary, and tertiary. Primary qualities are necessary. They are actually found in external objects themselves. When we look at an object, we form a representation of it in our minds, and its primary qualities carry over, making them unchanging, absolute, non-relative. These primary qualities correspond to the physical makeup of the object—its corpuscles, which were the Early Modern conception of atoms. Each corpuscle has extension, solidity, mobility, and figure, all of which therefore apply to the object itself. No matter where or when they are, objects retain their primary qualities. In contrast, secondary qualities are unnecessary in that they are not found in the objects themselves; instead, they are relative and illusory, mere creations of the mind.
When Locke said they are not real, he meant they were not in the atoms themselves, but were an effect produced by them. Secondary qualities are vested in power, power to produce ideas. In effect, secondary qualities are not qualities in and of themselves, but are capable of producing them. Wetness is wet, water is wet, but water is not wetness. Water does feel wet, but the wetness is not to be found in the water; it is produced by it. Locke said of secondary qualities that they are “nothing in the objects themselves but power to produce various sensations in us by their primary qualities.”² Thus, they are not sensed by themselves; they are caused by the arrangement of primary qualities, to which they are actually reducible. When we look at water, our visual sense organs, our eyes, come in contact with it through waves, and the unique physical configuration of water is such that it produces the sensation of being clear, or lacking color. Color is not real. Water has no color because color is not inherent in the water, but is produced by it. So with wetness. Furthermore, secondary qualities are perceivable in two ways: through immediate and mediate perception. In the first, we the agents come in contact with the object, giving us a subjective experience of it. In the second, we experience the object coming in contact with another object. With wetness, we can experience it for ourselves, as when we touch water, or we can see water wash upon a rock and get it wet. Either way, the quality of wetness is secondary. However, Locke also identified a tertiary quality, one which makes “a change in the bulk, figure, texture, and motion of another body.”³ To summarize, tertiary qualities have the power to change primary qualities. Wetness being the absorption of a liquid, it can affect an object’s mobility, texture, and figure by hindering, smoothing, and eroding.

In conclusion, water is wet, yet water is not wetness. Water is a fluid, amorphous substance, or bearer of qualities, of which one, wetness, is accidental—i.e., inessential—and not inherent to water itself, but rather a sensation perceived by the mind alone, completely illusory and artificial, a result of the physical configuration of water. By this we mean that the perception of wetness is neither within water nor exclusive to it, but one of many ideas which constitute the complex idea of water, which, it must be stressed, is not essentially wet, yet which nonetheless produces the sensation of wetness, albeit contingently: an adjective, an add-on, something predicated or said of the water. Moreover, wetness, qualitative of water, is not intrinsic insofar as it is a particular, not a universal. Wetness, because it is not exclusive to water, because it is widely applicable to other liquids, is not irreducible; in fact, it has already been said that wetness is reducible to primary qualities. When we look for the essence of something, its quiddity, what makes it it, we are looking for something eternal, unchanging, and irreducible—something so simple and essential that it is independent; but as we have learned, wetness cannot be alone, for it requires an object, a noun, a substance, something onto which it can latch, something to be wet, meaning wetness itself is not essential, but accidental in nature. Bluntly, lava can be wet, yet what applies to water applies, too, to lava: lava is wet, but it is not wetness, because wetness is not essential to it. Precisely because wetness is not essential to water—or any other liquid, for that matter—water is not essentially wet. Water has wetness, making it wet, but is not itself wetness. Through an eidetic reduction, whereby the essence of a thing is revealed apart from its accidents, water is discovered to not be essentially wet, for it can exist in its three states by itself.
And as wetness is not a substance per se, but a quality, it cannot stand by itself.

Water is wet in virtue of its wetness, which is not necessarily so, from which we deduce that water is not wet in virtue of its wetness.

Are you convinced? What do you think—is water wet? Leave your arguments below!

¹ Aristotle, Categories, 2b15
² Locke, An Essay Concerning Human Understanding, 2.8.10
³ Id., 2.8.23

For further reading:
An Essay Concerning Human Understanding by John Locke (1990)
A Critical History of Western Philosophy by D.J. O’Connor (1964)
The Encyclopedia of Philosophy Vol. 6 by Paul Edwards (1967)
Philosophy: The Classics, 3rd ed. by Nigel Warburton (2008)
Socrates to Sartre by Samuel Enoch Stumpf (1982)

Dickens and Dasein: A Heideggerian Analysis of “A Christmas Carol”

’Tis the season to be jolly! At last, we come to the end of 2017 to celebrate the holidays with our families, home from school and work, carefree, warm, and surrounded by those we love. And what a great time it is, might I add, to wrap oneself in a cozy blanket and sit in front of the fireplace with a nice, good-ole book with which to keep company and entertain oneself; for nothing is better than snuggling up with a traditional story for the whole family. A classic of Victorian literature set in 19th-century London, Charles Dickens’ novella A Christmas Carol (1843) depicts Christmas through the eyes of the infamous miser Ebenezer Scrooge, who despises the tradition and wants nothing to do with it. It is a loved and cherished story of celebrating and embracing the Christmas spirit as well as of personal transformation. A classic of 20th-century continental philosophy, Martin Heidegger’s magnum opus Being and Time (1927), considered one of the greatest works of its time, analyzes the fundamental structure of human existence through the eyes of Dasein, the only being which can inquire into existence itself. It is a complex and formidable study of what it means for us, as human beings, to be. Together, Charles Dickens, a Victorian novelist, and Martin Heidegger, an existential phenomenologist—an unlikely pair—define the human condition and show how we can best live our lives by being true to and understanding ourselves and others. Enjoy your Christmas and have a great New Year!

The first thing I would like to point out is the symbolism Dickens employs in A Christmas Carol and how it relates to Being and Time. Known for his brilliant characterizations and descriptions of people and things, Dickens emphasizes “light” throughout the novella, especially in the Ghosts of Christmas Past and Present, the former of which has a fiery head that can be extinguished, the latter of which spreads light as he goes forth. For Heidegger, light also plays an important role, not symbolically, but existentially. He says Dasein (the human being) “is itself the clearing [Lichtung]…. Dasein is its disclosedness.”[1] The German word Lichtung translates roughly to “clearing,” as in a “clearing in the woods.” In saying that humans are the clearing, he means that, in existing, we shed light on things, and they are revealed to us from obscurity. Symbolically, light represents wisdom, divine and cosmic purity, and revelation, the last of which is most important here. Heidegger conceives truth to be essentially revelatory: truth reveals that which has hitherto been concealed. He bases this on the Greek word for truth, aletheia (αλήθεια), which translates to un-coveredness. That which is made known is truth. As such, Heidegger goes on to say that human beings are their “disclosedness” [Erschlossenheit]. Human beings illuminate their world; they make sense of it; they uncover and thus disclose the world to themselves. Therefore, when Dickens paints the Ghosts as full of light and uses light elsewhere, it is because they bring truth to Scrooge. By leading him through time, they reveal to him truths he needs to come to terms with; his life is disclosed, and he uncovers things of which he was unaware, things which were once hidden from him.

We begin in the present, with Scrooge working in his office, cranky as ever. Some gentlemen come inside to ask for a donation to a local charity, which Scrooge rudely turns down, saying the poor should either go to work or to prison or die, so as to “decrease the surplus population.” He refuses to get involved in other people’s businesses, declaring, “‘Mine occupies me constantly’” (22). The fundamental essence of man, Heidegger writes, is Care [Sorge]. What he means is that we are always involved, engaged, and concerned about things. We can care about things, and we can feel concern for others. We have a certain engagement with everything in our world. If I say, “I do not care about vegetables,” I care about vegetables in a certain sense, in that I have a feeling towards them, albeit a bad one. Scrooge may be called uncaring, but in truth he cares very much—just not in the right way. He is so absorbed in his work, so involved with his entire being, that he has no concern for others, but only for himself and his business. A workaholic, he cares too much about his business and not enough about other things, such that his life is centered around his work and nothing else. In our everyday language, we say we get “involved in others’ businesses,” by which we mean that we take an interest in them, or we have concern for their affairs, in which sense we care about them. Thus, when Scrooge says his business always makes him busy, he is really saying by “business” two things: first, that it is more important in the sense of money-making; and second, that it is more important in the sense of not getting involved with others. Scrooge is what Heidegger calls inauthentic [uneigentlich] because he lives solely in the present. While this may seem like a good thing—especially with mindfulness being all the rage nowadays—it is not, because by situating himself in the present, using it as a time of activity, he is neglecting the past and especially his future.
Normally, in subjective time, we see the present moment as a time of action; it is in the present that we act and make decisions; therefore, we are busiest in the present. Scrooge exists only in the present and is absorbed therein by his work, meaning he can get nothing else done. He is trapped by his work.

Heidegger likens birth to being “thrown” [geworfen] into the world, insofar as we are, without warning or consent, violently catapulted into life, much like a strong pitch. It is disorientating, unexpected, and outside of our control. Once we are in the world, we find ourselves disposed to a certain mood, or state-of-mind [Befindlichkeit], at every instant. A sad mood makes life appear sad, a happy mood happy. When Scrooge’s nephew Fred visits Scrooge and asks him to celebrate Christmas with his friends, Scrooge replies, “‘What reason have you to be merry? You’re poor enough,’” to which Fred counters, “‘What right have you to be dismal? What reason have you to be morose? You’re rich enough’” (16). Here one sees the effect of moods. Regardless of circumstances, our attitudes are influenced by moods. In this particular scene, one man is rich, the other poor, yet because of their dispositions, they regard the same situation—Christmas—differently. Jovial, amiable, and affable, Fred likes the season despite his lack of wealth. Stingy, biting, and mean, Scrooge despises the season despite his abundance of wealth. Further in the book, Scrooge sees Fred discussing Scrooge’s mode of being-in-the-world (existing). Fred laments that Scrooge is corrupted by his moods, that his unhappiness will be his ruin. His greed, he says, makes him lonely. But were he to be happy, Fred suggests, he could love and be able to be with others. Later, the Ghost of Christmas Past pays Scrooge a visit and whisks him away to the town where he grew up, which Scrooge remembers happily. Everyone has facticity. Facticity is one’s past, the collection of “facts” one has about oneself. Our past is made up of things that cannot be changed, but which are permanent and given. Part of our facticity is the fact that we exist—we acknowledge it, but we cannot change it. Our past is our facticity because we are, as Heidegger says, already-in-the-world.
We cannot come into existence now or in five minutes, because we already find ourselves existing. So, the two then fast-forward to a moment in which Scrooge’s marriage is called off by his fiancée, Belle. Upset that she has been replaced by his love of money, she cries, “‘May you be happy with the life you have chosen!’” (69). Scrooge is shaped by his facticity, namely his decision to forever dispel happiness and instead pursue wealth. As soon as Belle left him, as soon as he committed himself to this course, he could not change it. Because of this moment in the past, his later life is predetermined and foreshadowed by loneliness. This one choice, made in the past, a fact of his existence, affects his whole life. Scrooge is distressed by this scene and demands to go home, but The Ghost of Christmas Past tells him that it is not its fault that the past is the way it is, and that Scrooge should not blame it. The Ghost implies that no one is responsible for how Scrooge’s life turned out except himself. Since facts are given and cannot be changed, Scrooge decides to resign himself to his past, submitting to it, letting it determine him. The loss of his soon-to-be wife and the neglect of his father are facts of Scrooge’s life that he lets determine him. Scrooge’s past is inauthentic.

The Ghost of Christmas Present takes Scrooge to Bob Cratchit’s home so he can see how his clerk lives. Scrooge feels bad for Bob’s disabled son, Tiny Tim, into whose fate he inquires. If Scrooge continues on in his ways, the Ghost responds, then Tiny Tim will not make it to another Christmas. A disheartened Scrooge is mocked by the Ghost, who uses Scrooge’s own words against him. This moment reveals Scrooge in another mode of existence: falling [Verfallenheit]. In a state of fallenness, Scrooge is lost in the world and experiences forfeiture. Being “lost,” Scrooge loses himself in the present, in everydayness [Alltäglichkeit], so he forfeits himself, so to speak. In everyday life, we go about our business, do our job, eat, sleep, and repeat. There is nothing special, it is just average. In this way, we are “lost” in the world, and we lose sight of our real selves. We end up reverting to chatter, or idle talk [Gerede], to pass the time. We reuse phrases we hear from others and repeat them in trivial, frivolous, and uneventful conversations that distract us from reality. The Ghost of Christmas Present, however, points out that Scrooge has never experienced the “surplus population” himself, has never walked among them in person, yet he remarks about them constantly, saying they should die. Hence, Scrooge has fallen to the “they” [Das Man]. The “they” is a vague entity, a collective, at once everyone yet at once no one, the indiscriminate individual, the voice of society. When asked why we do things, we answer, “Because they do it.” Accordingly, Scrooge’s chatter, his repeating what he hears from others, that the population should get rid of unnecessary people, comes from the “they.” He has become lost in them. He has lost himself in them. He is one of them. Fallen, forfeited, determined by social conventions, Scrooge’s present is inauthentic. By partaking in chatter, communicating through assertions, he reveals himself as fallen.
Next, he is taken to Fred’s house, where he plays games with the guests, although invisible to them. One can interpret this metaphorically, as though he is both literally and figuratively invisible. He watches as they play the game “Yes or No,” a trivial game. Entertainment. Gossip. For once, Scrooge sees the “they” from the third person, witnessing their chatter, of which he is the victim, something to be talked about, a subject of ridicule. This objective exposure makes Scrooge aware of how dispersed the “they” is, how they pervade every part of life. He hears chatter about himself, listening to how others portray him as inauthentic. Finally, the Ghost of Christmas Present gives his ultimate warning, revealing two depraved children beneath his robe: “‘This boy is Ignorance. This girl is Want. Beware… most of all… this boy, for on his brow I see that written which is Doom, unless the writing be erased. Deny it!’” (115). After, he requotes Scrooge’s chatter, condemning his fallenness into the “they.” The purpose of this is to show how Scrooge has fallen victim to the vices of Want and Ignorance. He cares for the wrong things, yet cares nonetheless. The former vice is his greed, the latter his lostness in the “they,” of which he is mostly unconscious, being-amidst-others and the world. In the present, humans are essentially fallen, by which they enter forfeiture, becoming inauthentic, losing themselves, ignorance the inevitable Doom which follows. The Ghost advises Scrooge to pull himself away from the “they” and back to himself.

Existentiality is the third mode of being. It is based on projection. Humans are able to plan ahead, to understand things. We think in terms of possibilities. When the Ghost of Marley comes to Scrooge on Christmas Eve, Scrooge is in disbelief. “Though he looked the phantom through and through, and saw it standing before him; though he felt the chilling influence of its death-cold eyes,… he was still incredulous, and fought against his senses” (31). Here, Marley’s phantom is a metaphor in itself—the arrival of Death. Scrooge, despite death being in front of him, flees from it, denies it. The possibility of death is passionately rejected by Scrooge, who is undeniably frightened, fearful for his life, unwilling to acknowledge its presence. Heidegger thinks death is underrated. He examines the human attitude toward death and concludes that, in everyday life, we see the possibility of death as a “not-yet,” something which will come but has not yet come, something in the distant future, something far away from us, something eventual, improbable, and incapable of touching us; in other words, we are, to use Ernest Becker’s phrase, in denial of death. Yes, we will die, just not today. Or tomorrow. Or in the next year. But, eventually, we will! We push back death, unwilling to face it, giving it a deadline, as if it were on our terms, which it is not. Scrooge is not ready to die, so he does not believe in Marley, but says his senses are deluding him. Death itself is a delusion, he tells himself. During the fourth stave, Scrooge sees a dead body and hears people talking about whoever it was who died. As the reader, I do not think it is hard to predict who it is, but Scrooge completely ignores the possibility of his death, ruling it out immediately, thinking he must still be alive—he has to be alive!
In spite of all the evidence, from the business partners to the stolen furniture to the family in debt, he fails to deduce that it is he who is dead. The Ghost of Christmas Present, when at the Cratchit house, cautions Scrooge, “‘If these shadows remain unaltered by the Future, the child will die’” (98). What has this to do with existentiality? Scrooge, like all of us, thinks in terms of possibilities, in the process reducing Tiny Tim to a presence-at-hand; simply put, by thinking about Tiny Tim’s future, he sees him as a thing subject to time, as something that has possibilities, much as a pencil has the possibility of writing. Tiny Tim is considered to be something present, something that is “there.” Scrooge, for this reason, does not think of the future or project possibilities properly. Scrooge’s future is inauthentic. At the graveyard, Scrooge pleads with the Ghost of Christmas Yet to Come,

‘Are these the shadows of things that Will be, or are they the shadows of things that May be only?… Men’s courses will foreshadow certain ends, to which, if persevered in, they must lead…. But if the courses be departed from, the ends will change. Say it is thus with what you show me!’ (141)

Thinking of the future, Scrooge is determining whether it is contingent or necessary: Is his death necessary or unnecessary, a possibility or a certainty, a preordained event or an avoidable one? Has he free will? Is his future determined by his past completely, such that he signed his death warrant as soon as he chose his selfish, greedy path? If he is given a second chance, if he returns to his life, will the foreseen things happen, or can he change himself? Scrooge finally wants to become authentic [Eigentlichkeit].

Each of the Ghosts of Christmas represents something in the novella: Past, present, and future. However, up until now, I have not talked a whole lot about the Ghost of Marley. If he is a Ghost, and he visited Scrooge, of what is he, the first of all Ghosts, representative? What role does he play, for both Dickens and Heidegger? Jacob Marley, the dead co-owner of “Scrooge and Marley” and friend of Scrooge, is Scrooge’s call of conscience. In his famous monologue, Marley declares,

‘I wear the chain I forged in life… I made it link by link, and yard by yard; I girded it on of my own free will, and of my own free will I wore it. Is its pattern strange to you?…. Or would you know… the weight and length of the strong coil you bear yourself?’ (34-5)

The chains are a famous metaphor for the decisions Marley made throughout his life. Every single link, he says, is a choice he has made by himself, for himself. He repeats the phrase “free will,” which is important, because it means he alone made the choices; no one forced him to do them; he made his own life. Then, he asks Scrooge if the pattern is familiar. Like Scrooge, Marley stinted, grudged, and cared only about himself, and this lifestyle led to a fate he regrets and abhors, yet bears because he must. Marley expresses remorse that he never went outside the building to see the people during Christmas time, but stayed locked up in his little cubicle working. This is what Heidegger calls guilt [Schuld]. Guilt is both a debt and a responsibility. Scrooge experiences guilt as a debt, because he has to pay off what he has done. His past actions, mind you, are part of his facticity, so he owes this debt with his very existence. Similarly, this debt is manifest in a responsibility for one’s actions. To be guilty is to look back at one’s past, to acknowledge that, while the past defines who we are, it does not define who we will be. Scrooge is determined by his past insofar as he has trouble forming intimate connections with others and he loves money, but this does not mean he has to be this way forever. He is indebted to his past, and must as a result carry this responsibility. Heidegger explains that when one is guilty, one is “full of not’s”—that is, we see what we are in contrast to what we are not. Since we are constantly making choices, we are simultaneously negating possibilities. By writing this essay, I am negating the possibility of having never written it, which would make me a different person, a person to whom I would be indebted, and for whom I would hold responsibility; thus, looking back, I would be guilty.
Marley continues, complaining how sad it is “‘not to know that any Christian spirit… will find its mortal life too short for its vast means of usefulness! Not to know that no space of regret can make amends for one’s life’s opportunities misused!’” (38). We only have one shot at life; in a word, YOLO. The point of Marley’s jeremiad is to remind Scrooge of his mortality, which has hitherto been neglected. Scrooge lives too absorbedly in the present, disregarding the future, paying no thought to it, as he is wrapped up in his business. How much change, how much good Scrooge could do, implores Marley, if he only realized his “vast means of usefulness”! Marley fears that if Scrooge sticks to his hermit-like existence, then it will be too late, and he will never get a chance to redo his life, just as Marley never did. Notably, he says, “‘Mankind was my business. The common welfare was my business…. The dealings of my trade were but a drop of water in the comprehensive ocean of my business!’” (38). Business has two meanings, of which work, the second, is subsumed under the first—the service of humanity. The business of Marley is the sum of his involvement, his care, in the world. Getting money is but a small portion of his engagement with the world; the other half, namely people, was neglected. Similarly, Scrooge fails to conduct business with his fellow man. Only through the future can the past be changed. Scrooge, too narrow in his approach, cared too much and was concerned too little, inspiring regret. After lamenting that he did not help the poor on Christmas Eve in life, Marley reveals he has come to warn Scrooge of how to avoid his very fate. First, Scrooge must realize that his facticity is inauthentic; to fix it, he must avoid the determinism of the past. Second, he must take up his duty toward man. In this way, the Ghost of Marley is the call of conscience, as Heidegger saw it.
Conscience is itself a calling, a voiceless voice, which calls humans back to themselves. It is the call of the self to come back to itself, away from the “they,” from inauthenticity, from fallenness, from forfeiture. It retrieves us from our absorption in the everyday. Through the call of conscience, we are made aware of our situation: We are alone, and wholly responsible for our choices. Marley beseeches Scrooge to personalize his past; he must make the past his before the past makes him its. Rather than fall victim to the past and let it define him, he must understand his past and how it shapes him. Though Heidegger denied the association in an essay later in his career, he is here anticipating Sartre’s “existence precedes essence.”

We are always in a mood. There is a peculiar mood, however, which leads to authenticity by making us confront our mortality: Anxiety/dread [Angst]. Unlike other moods, anxiety discloses our finitude to us. This necessary though unsettling state-of-mind allows us to realize our essential mode of being: Being-towards-death [Sein-zum-Tode]. This is a scary idea, but Heidegger insists that it is at our core. Essentially, we are always moving towards death slowly. Time passes as it inches closer, year by year, moment by moment. Death is defined as “the possibility which is one’s ownmost, which is non-relational, and which is not to be outstripped.”[2] Put simply, death is the only certainty in life. Everyone has to face death. No one is exempt from dying. It is insightful for Heidegger to propose that death is one’s “ownmost,” through which he communicates that death is my own: no one can die my death for me, I must die it myself. He notes that I can die for others in the sense of a sacrifice, but I am eventually going to die myself, independent of anyone else. We must all die on our own, for death is essentially private, unique to everyone. Death, then, is both unique and unavoidable, a necessity. Heidegger is quick to critique our views of death: According to him, the “they” in everyday life dismisses death, objectifying it as an observable event that will happen. Think about it: When we talk about death, we say it “will happen, just not right now.” The “they” postpones death and convinces us that we are immune to it. Truthfully, death comes to us all, and it is the ending of life: There are no more possibilities after death, for it is “not to be outstripped.” Scrooge, when he sees his grave with the Ghost of Christmas Yet to Come, is filled with anxiety; he is immediately made aware of his mortality and the shortness of life on Earth; all at once, his Being is filled with intense emotions. Scrooge achieves resoluteness [Entschlossenheit].
To be resolute is to realize that one’s possibilities are one’s own. Resoluteness, in everyday language, means autonomy. A resolute Scrooge takes responsibility for each of his actions, considering they are his, and no one else’s. His life is his, so he must evaluate his possibilities for the future by himself, in the face of death. Together, being-towards-death and resoluteness become “anticipatory resoluteness,” which is just a fancy way of saying that one anticipates, or awaits, their death (hence anticipatory), thereby becoming resolute. An illustration: Scrooge sees his tombstone, realizing his mortality (anticipation), and decides thenceforth to become a new person (resoluteness). Achieving anticipatory resoluteness leads to a “moment of vision” [Augenblick], in which one reinterprets the past in relation to the future in the Present. The word “moment” is misleading, as it really refers to the fact that it happens in the Present (with a capital ‘P’), which is distinguished from the present, or the “now.” In the present, one is active: One acts in the present. In the Present, one is passive: Things happen to us in the Present. While you are contemplating your New Year’s resolutions, keep death in mind. Being resolute is like making a resolution—just make sure to anticipate death while you are at it! Heidegger describes authenticity in the following passage:

[A]nticipation reveals to Dasein its lostness in the they-self, and brings it face to face with the possibility of being itself, primarily unsupported by concernful solicitude, but of being itself, rather, in an impassioned freedom towards death, a freedom which has been released from the Illusions of the “they”, and which is factical, certain of itself, and anxious.[3]

To conclude, we get out of inauthenticity by confronting our own deaths, our ultimate possibility. We disclose ourselves through anxiety as beings-toward-death, a death which is certain, unique, and total.

Scrooge swears to the Ghost of Christmas Yet to Come he will change his ways, promising,

‘I will honour Christmas in my heart, and try to keep it all year. I will live in the Past, Present, and Future. The Spirits of all Three shall strive within me. I will not shut out the lessons that they teach. Oh tell me I may sponge away the writing on this stone!’ (142)

When I first read this quote, I almost jumped out of my blanket in joy; for while it is the climax of the story, the point where Scrooge truly resolves to turn his life around, it also could not line up more perfectly with Heidegger’s philosophy! Heideggerian temporality [Zeitlichkeit] is extraordinary: On the one hand, it is extra-ordinary in that it goes beyond and even shatters our everyday conception of time; and it is extraordinary inasmuch as it is a creative, insightful, and existential way of thinking about time. “Reaching out to the future, it [time] turns back to assimilate the past which has made the present.”[4] What does this mean? Authentic temporality is subjective and finite: It is something experienced by us, and it has a beginning and an end. But unlike our view of time, which divides temporality into three separate dimensions—past, present, and future—Heidegger says time is a unity. Time is not broken up into infinite “nows” in the present, arising from the past and becoming the future. Inauthentic temporality is past, present, and future; authentic temporality is past-present-future, all in one. How can one be in the past, the present, and the future simultaneously, all at once? How is this possible, if it even is? According to Heidegger, when one exists authentically in time, one looks ahead to the future, to what they could be, at death, then reinterprets the past in light of this and becomes aware of how the past has shaped them, notices that what they are is influenced by what they were, and acts in accordance with this in the present—all in an instant. The future is predominant, though, since with it one anticipates death. Now, compare the following passage, from Heidegger, to the one quoted above, from Dickens:

In every ecstasis, temporality temporalizes itself as a whole; and this means that in the ecstatic unity with which temporality has fully temporalized itself currently, is grounded the totality of the structural whole of existence, facticity, and falling—that is, the unity of the care structure.[5]

The above passage basically restates what Scrooge promises to the Ghost of Christmas Yet to Come: Truly, Scrooge “will live in the Past, Present, and Future”! It is worth considering that Dickens took to capitalizing each of the “ecstases” of time purposefully because he wanted to emphasize the importance of each structure of time. Conveniently—perfectly, I might chance to say—it fits with Heidegger, forming a union. Also, pay attention to the last part of Heidegger’s passage. He refers to the “care structure,” which is united by—look at that!—the three modes of existence: facticity, falling, and existentiality, each of which lines up with the three modes of time: past, present, and future. The care structure ties in with what was talked about earlier—our involvement in the world. As such, being is essentially linked with time, hence the title of Heidegger’s book, Being and Time [Sein und Zeit]. (Is your mind blown yet?)

Another notion is then introduced by Heidegger: Fate [Schicksal]. But did we not discuss earlier that existence precedes essence, that there is free will, not determinism? Fate is different for Heidegger than it is for us, unsurprisingly. One’s fate is existing in the authentic present. In a process he calls “historizing” [geschehen], we “stretch” ourselves along time. That is, we stretch ourselves between the past and the future, the beginning and end, birth and death. As with anything stretched between two ends, there is a middle ground. In this case, the Present. Our fate is to live authentically in the Present for ourselves, resolutely. It is during this time that we engage in the moment of vision, which, as we said, is not sustained for just a “moment,” but indefinitely, as long as one is authentic.

While planning this, I ran into a perplexing problem with terrible implications: If Christmas is a tradition everyone follows, an event “they” do, and if Scrooge celebrates it, then does that make Christmas inauthentic, something in the realm of the “they”? If this is so, then did Scrooge come all this way and listen to the Ghosts in order to authenticate himself to—what, to become inauthentic again? Does this unravel the entire plot instantly? Lo! Luckily, Heidegger has a solution:

Repeating is handing down explicitly—that is to say, going back into the possibilities of the Dasein that has-been-there. The authentic repetition of a possibility of existence that has been… is grounded existentially in anticipatory resoluteness; for it is in resoluteness that one first chooses the choice which makes one free for the struggle of loyally following in the footsteps of that which can be repeated.[6]

The phenomenon known as repetition [Wiederholung] is reaching back into the past and “inheriting” something for oneself. He calls it “handing down.” Much as siblings give each other hand-me-downs or families hand down heirlooms, so we can interact with the past in a special way. Repetition does not necessarily have to happen out of conformity. As Heidegger writes, it can be authentic when acted on through anticipatory resoluteness. If we consciously make the choice to celebrate an age-old tradition which others celebrate, too, then we are authentic. However, those who celebrate Christmas just because their families and friends do, without knowing why they celebrate, what the importance of it is—they are celebrating Christmas inauthentically. They are not giving it the respect it deserves. To celebrate Christmas, to partake in the Christmas spirit, requires that one truly choose it, and this is precisely what Scrooge does. Heidegger adds that authentic repetition “deprives ‘today’ of its character as present, and weans one from the conventions of the ‘they.’”[7] Not only is an appropriated past event not past at all, but it is completely free from the besmirchment of the “they.” Chosen authentically and intentionally in the face of death, projected in the long run, a tradition one follows is neither past nor present, but Present, because it is something that simply happens: it is not caused, and it is not done in order to progress anything.

Care as solicitude, or protection and concern, is thus enacted by an authentic Scrooge, who, embodying the Christmas spirit, united temporally, having encountered death, in a blissful mood, gives a young boy passing by money to buy a big turkey, which he delivers to the financially struggling Bob Cratchit; donates a large sum of money to charity, recanting his mistaken chatter; and befriends the Cratchits, joining the family, becoming a father figure for Tiny Tim, whose life he saves by saving his own. On his way out of the house, Scrooge stops to look at his door knocker, which once resembled Jacob Marley’s face, and exclaims, “‘I shall love it as long as I live! … I scarcely ever looked at it before. What an honest expression it has in its face! It’s a wonderful knocker!’” (149). This seemingly unimportant moment is probably glanced over by readers, but it holds significance. We encounter things and objects in the world as either present-at-hand [Vorhandenheit] or ready-to-hand [Zuhandenheit]. The former are things that just are; they are factical and given, and their presence indicates their name. The latter are things that can be used—equipment, if you will. From this, one can conclude that objects are looked down upon as merely things, objects of use. Living things are more important than lifeless objects lying around. This is why this moment is worthy of our attention. Heidegger explains, “The moment of vision permits us to encounter for the first time what can be ‘in a time’ as ready-to-hand or present-at-hand.”[8] Taken for granted, seen daily but not considered in itself, used mindlessly through subconscious habit, Scrooge’s door knocker only gains value when he sees Marley’s face in it.
Now, as a being-towards-death, Scrooge sees the door knocker in a new light (symbolism!), disclosing it, revealing what was once hidden to him, finding pleasure in the simple things. One thinks of the common adage, “Live each day as though it were your last.” The night before was almost his very last, so he cherishes being alive, even being happy towards objects. The moment of vision discloses the world and objects, uncovering them as they are; and it is not just for a single instant, but for a lifetime. Scrooge is authentic Dasein.

Scrooge was better than his word. He did it all, and infinitely more…. He became as good a friend, as good a master, and as good a man as the good old City knew, or any other good old city, town, or borough in the good old world…. His own heart laughed, and that was quite enough for him (155).

And so, as Tiny Tim observed, God bless Us, Every One!



*I want to dedicate this blog to my dad, who has himself encountered death in his time; who has, I want to think, remained authentic as a father for as long as I can remember; whose avid and ardent affection, appraisal, and adoration for Charles Dickens inspired me to write this post; and without whose support I would not be writing. May we have many more Christmases together!


[1] Heidegger, Being and Time, H. 133
[2] Id., H. 250-1
[3] Id., H. 266
[4] Edwards, The Encyclopedia of Philosophy, Vol. 3, p. 461
[5] Heidegger, op. cit., H. 350
[6] Id., H. 385
[7] Id., H. 391
[8] Id., H. 338

For further reading: 
Masterpieces of World Philosophy in Summary Form by Frank N. Magill (1961)
The Columbia History of Western Philosophy by Richard H. Popkin (1999)
Existentialist Philosophy: An Introduction by L. Nathan Oaklander (1992)
The Encyclopedia of Philosophy, Vol. 3 by Paul Edwards (1967)
Time, Narrative, and History by David Carr (1991)
A Christmas Carol by Charles Dickens (1994)
Being and Time by Martin Heidegger (1962) 

Why Do We Root for the Good Guys?

Warning: Lord of the Flies and Game of Thrones (Season 6) Spoilers! 

I grew up watching movies. My favorites were action movies, where the good guy shot up his enemies and performed exciting stunts in flaming buildings in order to stop some evil-doer from doing something terrible. Of course, there were also the classics that I adored, such as Star Wars, a classic good vs. evil story. Back then, I liked to think myself quite the devil’s advocate, hopping to the other side, wondering what would happen if the bad guy won this time, then cheering for them. It made me wonder as a young child: Why do the good guys always win? There are always two sides to the story, so why weren’t the villains’ sides considered? No matter whom I rooted for, good or bad, it was always the good who vanquished the bad, who stood victorious in the name of peace and order. This eternal struggle between good and evil, this Manichæan theme, this dualistic battle—it is not just present in cinema, but permeates all of Western culture, from its videogames to its literature to its mythologies to its historiography. This narrative is woven into our daily life. As such, how earth-shattering it is to read Nietzsche: “No one has… expressed the slightest doubt or hesitation in judging the ‘good man’ to be of a higher value than the ‘evil man….’ But! What if we suppose the reverse were true? What then?”—indeed, what then? [1]

Everyone has a Will to Power, believed Nietzsche. Deep down, hidden in the unconscious, there is an unknown, life-preserving, exploitative, driving urge that permeates every living thing. When people act out of this unconscious Will, they are not to be blamed, for this Will is natural. To Nietzsche, it seemed absurd to say that anyone who acted on this Will to Power was blameworthy because, in essence, it is the Will that is intrinsic to them. “A measure of force,” he said, “is just such a measure of impetus, will, action.”[2] Therefore, throughout nature, embedded in all our willed, voluntary actions is the Will to Power. The Will to Power is inherent to all animals, which are always seeking not the most happiness, but the most power, and are always avoiding that which prevents power. By power, Nietzsche meant the ability to triumph, to master one’s surroundings and prevail, to exploit to the best of one’s abilities, such that one lives longer, by whatever means necessary. Hence, “[A]n injurious, oppressive, exploitative or destructive action cannot be intrinsically wrong, inasmuch as life is essentially something which functions by injuring, oppressing, exploiting, and destroying, and it is absolutely inconceivable without such a characteristic.”[3] Basically, all actions we judge today as wrong are, to Nietzsche, natural expressions of the Will to Power. In fact, we should not judge them at all, because, as illustrated in the quote above, Nietzsche saw life rather pessimistically, describing life as a dog-eat-dog, every-man-for-himself competition, where only the strongest survive. One gets the idea from Nietzsche, then, that one can only make it through life if one embraces these qualities, these violent, aggressive, harmful qualities. A philologist and historian, Nietzsche concluded from his studies that ancient man was naturally sadistic: He enjoyed participating in violence and loved inflicting cruelty, deriving a savage pleasure from it.
Punishment was an important part of daily life back then, so, Nietzsche proposed, those who were quick to inflict suffering were seen as good, while those who were hesitant, who were slow to deliver punishment for a forgotten debt, were seen as incompetent. This cruelty, accordingly, was said by Nietzsche to be the direct product of the Will to Power. He went so far as to say that cruelty is “something to which the heart says a hearty yes.”[4] This sounds frightening. Do we really delight in cruelty, even in today’s modern, civilized world, so distant from our barbaric past? While we may be in denial or firm disagreement, thinking such a sentiment disgusting or repugnant, we must concede that we do take pleasure in cruelty, even if it is minimal. After all, we all know that wonderful German word schadenfreude—the joy we get from watching others’ misfortune. Nietzsche remarked that today, although we do not go around gaily slaughtering each other as our ancestors did, we still enjoy cruelty in other, less explicit ways, such as video games and movies and events that have fighting, like wrestling or MMA. In this way, we have not completely gotten rid of cruelty, but have rather channeled it through vicarious means, not directly inflicting it, but still experiencing it. But how many of us would willingly admit that we enjoy watching—or even inflicting—pain? Nietzsche foresaw this, even saw it in his own time: We are more likely to believe in fate or chance or free will than in the Will to Power, the idea of which repulses us and could not possibly be in our psyches. Our unwillingness to accept this exploitative Will, reasoned Nietzsche, leads to what he called “misarchism,” or hatred of rulers and ruling. By this he meant that we hated the idea of power and all its associations. To say that history’s great men were shaped by this Will to Power rather than their cultures or destinies seems to us impossible to accept.
Think of all the brutal, bloodthirsty dictators and authoritarians throughout history! We fear power, to the point of detesting it, and we are worried about its applications everywhere. Nietzsche passionately rejected Darwin’s theory of natural selection, explaining that organisms sought not survival, but flourishing. Organisms are not content with simply surviving. The lion did not survive natural selection only to settle down, feeling himself lucky to have outlived his competitors; he survived to gain more power, to be dominant, and therefore to dominate his environment and prey. Adaptation is more about being proactive than reactive. Adaptation is achieved through internalizing conflicts. Progress is a necessary sacrifice of the weak to the powerful, in Nietzsche’s eyes. He thought the strong could live by themselves. They were autonomous. In following their own morality, they could live on their own terms, unbeholden. The weak hold us back, he wrote. This gives us a picture of Nietzsche’s ideal man. An ideal man affirms, not denies, his Will to Power. Just as the best government has the fewest laws, so the best man has the fewest moral values save his own. He follows his own morality, not society’s. He stands out from the herd. He seeks power, not pleasure; those who seek pleasure avoid pain, but pain is inevitable, leading to “pessimism of sensibility,” or conscience. In what Mencken calls “ideal anarchy,” every man does what pleases him, and him alone. The ideal man concerns himself with himself, and no one else. Spontaneous, instinctive, and unconscious, he acts on his Will, embracing what Nietzsche calls his instinct for freedom. Unlike the weak, who feel responsibility for their actions, the strong feel no guilt or responsibility; they act in the moment, unafraid of the consequences yet wholly accepting of them.

There are two kinds of people in this world: Masters and slaves. According to Nietzsche, all moralities can be divided under these two classes. In tracing the history of the concepts of Good and Evil, Nietzsche found in early societies a primitive form of this duality, finding it to be between not Good and Evil, but instead Good and Bad. He discovered that these two words are linked etymologically to the aristocracy, in which the aristocrats, the rich and powerful, call themselves “Good” and everyone who is not an aristocrat, the poor and powerless, “Bad.” In other words, the idea of Goodness developed from the nobility, from the upper class, which often consisted of the dominant few who had most of the land and owned slaves. They thought themselves the best, superior to everyone else, as they had control over resources, among them, people.[5] Seeing as they were educated and could do whatever they pleased with their property, it was only fitting, Nietzsche thought, that they should differentiate themselves from the masses, whom they considered lowly and base. The nobility possessed what Nietzsche calls the pathos of distance—that feeling of separation between oneself and others, especially of higher from lower, owner from owned. This worldview said that whatever was not aristocratic was bad, so all slaves were bad, in that they lacked everything the nobility had. What distinguishes the master from the slave is power. Thus, anything that goes against power is slavish and therefore bad, meaning the virtues we so often praise, such as temperance and compassion, are bad qualities, to the extent that they are anti-power. A change took place in these societies when religions like Judaism and Christianity began amassing followers, pandering to the masses, particularly the slaves.
Suddenly, the consensus was, “The wretched are alone the good; the poor, the weak, the lowly are alone the good… but you, on the other hand,… you men of power, you are for all eternity the evil, the horrible, the covetous, the insatiate, the godless.”[6] Religion created an inversion of the noble morality, turning Good and Bad into Good vs. Evil. There was, accordingly, a twofold inversion: The Bad became the Evil, and it was no longer a coexistence but a competition of values, and there could only be one victor. Through this inversion, the weak made themselves “stronger” than their oppressors. By painting their enemies as Evil, the manifestation of all things contemptible, the slaves managed to get the upper hand, convincing themselves that they were happier than their masters. They aggrandized suffering rather than domination. Nietzsche named this approach the ascetic ideal, which he defined as “an attempt to seem ‘too good’ for this world, a sort of holy debauchery.”[7] He says “too good for this world” as a way of satirizing this otherworldly approach, which emphasizes the pure and the heavenly, calling for the renunciation of the appetite, a call to a virtuous life, one that will be rewarded in the next life. These ascetics parade their “holy debauchery,” whereby they take pride in their virtuous, saintly life; in their denial of this world; and in their holier-than-thou comportment. Foreshadowing Freud, Nietzsche theorized that the repression of the Will to Power that took place in asceticism led to “bad conscience,” a concept similar to guilt. Simply, Judeo-Christian morality taught that it was wrong to act on the Will to Power, so its followers repressed, or kept in check, their instincts; guilt arises, then, when one’s instincts turn upon oneself. These built-up instincts, having no output, are accordingly relieved by self-inflicted suffering.
This “internalization of man,” Nietzsche diagnosed, is what made the weak appear strong yet remain weak; for the Will cannot be fully renounced after all, but finds its way out in the cleverest of ways. He noted how they paradoxically “use[d] power to dam the sources of power…. [A] baleful eye is cast at physiological well-being, especially against the expression of such well-being,… while a sense of joy is experienced and sought in… wilful privation, in self-denial and flagellation.”[8] It is through the Will itself that the weak futilely try to deny it. They cast away their inner nature, condemning those who are complicit, who partake in it. A minority, they convince themselves that they are right and the others wrong, that they are guided aright while the others are misguided; and they take pride in their apparent pureness, seeking meekly for absolution, as if it were the proper pursuit, a struggle that will, in the end, be rewarded justly in the next life, where those who were tempted suffer eternally in damnation. Psychologically, this results in ressentiment, a feeling of deep-seated animosity or hatred of the oppressed directed toward the oppressor, over whom they have no control. Again prefiguring Freudian theory, Nietzsche developed an early form of displacement; i.e., redirecting one’s feelings onto an object or person. In this case, the oppressed, who in reality can do nothing against their powerful rulers, fabricate their own mythology, in which the oppressors are punished in the name of the weak. Therefore, ressentiment is a form of catharsis, a release, if you will, of anger, which is relieved through imagined retribution. The slaves, who are by nature weak, bearing their suffering thereby, impute this suffering to the strong, whom they blame for their condition. Pleasing oneself, or indulging the Will, consequently, is seen as bad.
All acts exhibited as Will become frowned upon, made into crimes: Those who want something and take it for themselves—a quality admired by the noble—are called covetous, and those who please themselves tirelessly, always taking more—self-preservational, and thus symbolic of a master—are called insatiate. Evidently, noble virtues become slavish vices, and noble vices become slavish virtues. The Will presents itself as weakness, which is interpreted by the slaves as strength, so they convince themselves that they chose it, that it is, as Nietzsche called it, an “achievement.” They are excited to have “tamed” the Will! To summarize, “The strong man’s objective is to take as much as he can from his victim; the weak man’s is to save as much as he can from his conqueror.”[9] Without hesitation, without thought, the strong man takes what he wants; the slave denies his Will and represses it.

All this sounds quite abstract and foreign, admittedly, as if out of place, and it might well seem so to most of us at first. However, I shall proceed to highlight some relevant, modern-day examples that I hope will illustrate that what Nietzsche is describing is entirely applicable and can easily be found in Western culture, and is not some idle speculation about a different, distant time period. A while ago, I wrote a blog post on Lord of the Flies, wherein I discussed the Will to Power. Based on this discussion, I would ask, Who really won in Lord of the Flies? The answer, undoubtedly, is Jack. Although Ralph may have been saved by civilization, the damage was done, and in an alternate ending, he would have ended up dying at the hands of Jack and his merciless tribe. All throughout the novel, we readers are quietly cheering for Ralph and Piggy, the untainted, the pure, the civilized, to survive and triumph over the brutal savages into which the other boys had devolved. How terrible it would be if those brutes, those aggressive, violent, primitive hunters had the island to themselves! What chaos would ensue! Yet, in the end, Ralph and Piggy, the protagonists, were slaves to society’s morality; they unthinkingly followed the herd instinct. They did not question the morality imposed on them by society, which taught them to behave and to control their impulses, to stifle their Will. On the other hand, Jack and his tribe fully embraced their Will to Power. Channelling the primordial hunter within them, they expressed their instincts through aggression, such as when Jack hunts the pig or when Robert terrorizes the smaller boys—in either case, the boys felt not just a great pleasure, but a feeling of power, of power over something: exploitation. Whereas Piggy and Ralph were like small gazelles trying to survive, Jack was like a lion trying to predominate. It was the strongest who won.

A classic example of the battle between Good and Evil is the (currently) seven-film Star Wars saga. Based on Campbell’s The Hero with a Thousand Faces, Star Wars follows the age-old theme of Light and Dark and the cosmic duel between opposing forces. Interwoven into its narrative is the desire for the good guys—the Jedi, in this case—to beat the bad guys—the Sith—so that intergalactic peace can be maintained. So why exactly are the Jedi and Sith at odds? Why are they enemies of each other even though they both harness the same energy—the Force? The Sith, who practice what is called the “Dark side of the Force,” are called Evil by the Jedi because that side is known to be tempting and thence corrupting. The learned masters warn their padawans not to be drawn to the Dark side, lest they gratify their instincts, no matter how natural or easy they are to gratify. In essence, the Jedi are saying to choose virtue over vice. Sound familiar? The Jedi are the slaves, the Sith the masters. If we further examine the two orders, we shall find even better evidence. Both orders adhere to their respective codes, which outline their core beliefs. Here is the Sith Code:

Peace is a lie. There is only Passion.

Through Passion I gain Strength.

Through Strength I gain Power.

Through Power I gain Victory.

Through Victory my chains are Broken.

The Force shall free me.

It can be gathered from this that central to the Sith philosophy is the idea of a blind, erratic chaos which governs all. There is no order in the galaxy, only disorder. The key to the Sith is aggression, which comes from the Will, and is pure, focused anger. It is through the instincts that power is both achieved and channeled, from which comes victory, after which follows freedom. Accordingly, it is the directing of the Will that sets them free; they engage their instinct for freedom, which the slaves deny. Another part of their code “encouraged the strong to destroy the weak, and insisted on the importance of struggling and surviving”; and the master and his student always sized each other up, for “a weak master deserved to be overthrown by their pupil, just as a weak pupil deserved to be replaced by a worthier, more powerful recruit.”[10] Words like “worthier,” “powerful,” and “weak” can all be connected to the master-slave morality, having originated from the aristocracy. From this perspective, the Sith favor the strong, thinking themselves superior to the Jedi, whom they consider, conversely, the slaves. Nietzsche emphasized overcoming one’s struggles through exploitation, sort of like an extreme survival of the fittest, to use Spencer’s term. Therefore, the students of Sith masters, if they were deemed too weak, were replaced to make room for better, stronger, more Willful students. Darth Vader said, “Anger and pain are natural and part of growth…. They make you strong.” Both emotions stem from the unconscious, the self-preservational, and both are biologically necessary, according to Nietzsche. Today’s Western civilization devalues anger, calling it an ugly, unproductive emotion, and discourages it. To the Sith and Nietzsche, however, anger is a necessary emotion through which the individual overcomes himself and becomes something, someone, better. Now let’s examine the Jedi:

There is no emotion, there is peace.

There is no ignorance, there is knowledge.

There is no passion, there is serenity.

There is no chaos, there is harmony.

There is no death, there is the Force.

Looking at the parallel structures of the two codes, you will notice the Jedi Code is an exact inversion of the Sith Code! Compare this to what Nietzsche claims occurred millennia ago, when the Judeo-Christian slaves pulled a complete reversal on their masters, thus establishing the slave morality, which was the opposite of the noble values. The Jedi deny any chaos, instead affirming harmony; the Jedi deny the passions, instead affirming asceticism, a turn away from them. To say someone is emotional is usually not a compliment, as it usually means they are over-dramatic, easily upset, or moody; so when the Jedi say there are no emotions, they are basically denying the Will to Power, banishing it entirely from their worldview, because according to them, emotions lead to chaos, whereas the absence of passion leads to peace. The wisest of the Jedi, Master Yoda—everyone’s favorite backwards-speaking native of Dagobah—has a wealth of quotable adages, among them many attacks on the Sith, one of which goes, “Powerful you have become, the dark side I sense in you.” Automatically, he associates “power” with the dark side, for it denotes exploitation, injury, and all the other volitions Nietzsche named. He also says, “[I]f you choose the quick and easy path… you will become an agent of evil.” Yoda uses the phrase “agent of evil” deliberately here: Make no mistake, his choice of words is intentional. Recall that through ressentiment, the slaves change Bad into Evil so that it looks as though they are being oppressed; similarly, Yoda calls the Sith Evil, whereas the Sith would most likely call Yoda Bad, in accordance with the aristocratic morality. And when he calls the dark side the “quick and easy path,” he does so because it is easier, he knows, to gratify one’s instincts than to repress them, as he does.

Finally, I shall examine the very popular HBO show Game of Thrones, in which I found much food for thought. As with every narrative, we always cheer for the good side and boo the bad side. While watching, I asked myself, Why do we like the Starks and hate the Lannisters? What is it about the two houses that makes one preferable to the other? How do our values shape our attachment to the characters? Eddard “Ned” Stark is the first major character with whom the audience starts to feel an affinity. He is the archetypal “good guy” because he is pure and ascetic, and he denies his Will. Compassionate, considerate, fatherly, and humble, Ned is loved by all because he is so virtuous and caring—we would never expect him to burn down a village of innocents, for example: It is not in his character to do so. His resistance to his Will made him weak and oppressed, though. Why would we be cheering for an oppressed character? It is precisely because of his weakness that we like him: We feel pity for him, and we want him to prevail against evil, we want him to succeed, we want him to stand up against the oppressors, we want retribution, we want a David and Goliath story. The weak, we have learned, always blame their oppressors, so we naturally blame the Lannisters and acquit the Starks, who have suffered at the hands of the former. Unfortunately, it is Ned’s purity, his refraining from the rampant corruption, dishonesty, and moral bankruptcy around him, and his loyalty to a moral code that lead to his downfall. Each time the Starks lose and the Lannisters gain, we love and pity the Starks the more, and hate and abhor the Lannisters, who seem to take everything they want: rapacious, immoral, and exploitative.
We viewers suffer from the pessimism of sensibility: There is so much suffering in the show—too much—that we become disillusioned, feeling as though life is unfair, as though there is no equality, and so we become disheartened every time the Starks suffer a loss; we suffer with them. We want justice for the cruel acts the Lannisters commit against the defenseless. The Lannisters do anything that will get them ahead, even if it means blurring the lines of what is considered moral, using whatever is to their advantage, cheating when they can. Hence, Jaime and Cersei, heads of House Lannister, are masters. Jaime Lannister has a simple, anthropocentric worldview: He and Cersei are the only two people who matter in the world, and nothing else does. In other words, Jaime cares only about himself and Cersei, and he is willing to do whatever he needs to protect her. Instead of compiling a list of ethics, Jaime has a simple goal, with no guidelines. Anything goes. He can do whatever he pleases, as long as it is for his and Cersei’s sake. Even when Jaime is the prisoner of Brienne, supposedly making Brienne the master and Jaime the slave, Jaime remains the master after all. Pretty much every action movie I have seen has a scene where the good guy has a captured enemy who taunts him, encouraging him to strike, to lose his temper and ignite his fury, but the good guy refuses, calms himself, collects his nerves, remembers his values, and does not give in to the volatile words. Just as, in Star Wars, Emperor Sidious tells Luke to act on his anger but Luke refuses to surrender to the dark side, so Jaime tries to enrage Brienne, clearly unnerving her, then tells her to release her anger on him, because he knows she wants to; the fire lights in her eyes and she raises her sword, but then she drops it, remembering her promise, and chooses the “noble path,” the ascetic path. She wants to hurt him, deep down. She wants to be cruel.
But she resists her Will on account of a “higher order.” Jaime, then, has the real advantage over Brienne. While she may be the one with the sword, and while he may be the one tied up, it is he who holds dominance, who is most powerful. Another encounter, this time with Edmure Tully, takes place in a tent; this time, the positions have changed, Edmure being the prisoner, Jaime the keeper. Edmure tells Jaime, “You understand you’re an evil man.” After a discussion that leads to the subject of Catelyn Stark, Edmure’s sister and Jaime’s former captor, Jaime states, “Catelyn Stark hated me like you hate me, but I didn’t hate her. I admired her, far more than I did her husband or her son” (S6:E08). Like Yoda, Edmure Tully calls Jaime “Evil” to demonstrate that Jaime is his opposite. While Edmure is Good, a saint, Jaime is Evil, a sinner. One of the characteristics of the noble master, Nietzsche claimed, is a “love of their enemy”; meanwhile, the slaves despise those they call Evil. The strong respect their enemies because they define themselves in relation to them. Without the Bad, there can be no Good. Nobles, therefore, respect those lower than them, because they have power over them. Jaime’s sister, Cersei, also has a straightforward moral code: “I do things because they feel good” (S6:E10). In that episode, Cersei turns the tables against her zealot-captor Septa Unella. She says Unella made her suffer not out of compassion or a desire to see her purify herself, but out of an inner, biological craving for cruelty that comes from the Will. Unella made her miserable because she loved to inflict pain, a pleasure which, Cersei confides, she, too, experiences. Cersei does not follow a pre-established morality; rather, she makes her own, doing whatsoever she pleases, whensoever she pleases, if it benefits her, even if it means killing thousands—even if, among those thousands, there are innocents.
That is, she does not think before acting; she forms her morality from her actions. Nietzsche explained that pleasure is not what is good for oneself or what makes one feel pleasant. Pleasure is just a byproduct which accompanies an increase in power. Consequently, whenever Cersei does something because it pleases her, it really means she does it because she gains power, and her Will to Power is fulfilled. When she makes a decision, Cersei does not consider what effect it may have on others, especially the slaves; she only does what will further her cause. Another character who values power is Ellaria Sand, widow of Oberyn Martell, who, after killing Doran Martell, proclaims, “Weak men will never rule Dorne again” (S6:E01). Because Doran did nothing, Ellaria decided to take power into her own hands, stabbing him in order to gain control, such that she could rule Dorne, this time with purpose and conviction. Doran did not do anything. He preferred peace and was thus inactive. And weak. He did not take initiative, did not affirm his Will, and so let his country suffer. Instead of a slave, Dorne needed a master to rule. Two other characters—Daenerys and Grey Worm—ought to be evaluated as well. Dany, the so-called liberator of men, is not herself liberated, but enslaved, not in the sense of being indebted to another, but insofar as she is dependent on a higher morality, one that demands quiescence of the Will, and which seeks to eliminate the Will in others, the masters of Slaver’s Bay. She is pitying and merciful, yet at the same time she possesses a certain brutality. As it is, Dany cannot be strictly classified as a master or slave inasmuch as she simultaneously hinders her Will and incites it. Her loyal soldier, Grey Worm, has a talk with Tyrion.
Tyrion asks, “Why don’t either of you ever drink?” to which Grey Worm replies, “Unsullied never drink.” Unconvinced, Tyrion queries, “Why not?” Grey Worm says, “Rules,” to which Tyrion answers, “And who made these rules, your former masters?” (S6:E08). Here, Tyrion remarks that Grey Worm, despite being a freed man, still lives by his old masters’ rules, which thereby enslave him. Morality, to Nietzsche, is a herd instinct; put another way, morality is something to which the weak flock, as though they were herd animals, and in which they invest blind trust, accepting it without questioning it, living by its rules without ever stopping to ask why they live by those rules, slaves to tradition, shackled to its ascetic ethics. Grey Worm does not live by his own, self-invented rules; he does not affirm himself; he denies his power and surrenders it to another.

What Nietzsche painted is a bleak, unaffectionate, uninviting, savage picture, in which the strong dominate the weak, and inequality reigns supreme alongside chaos and anarchy. Do I personally agree with what he said? I agree that our Western values have been and are influenced by, and even derived from, the Judeo-Christian traditions, which valued asceticism and renunciation of the passions in favor of a virtuous, happy, and contented life. It is not hard to see that this morality is ingrained in our culture, even in the 21st century. I agree that we are approaching a time of nihilism, when our traditions are collapsing around us, and we are slowly losing these long-cherished values. I disagree with Nietzsche, however, that it is the strong and powerful who must triumph, and that the slave morality is subversive and self-defeating. It is true that Nietzsche never explicitly expressed contempt for the slave morality; he just disapproved of it. Notwithstanding, today’s values have undergone changes within the last two millennia, and they will inevitably continue to change with the ages. The next time you are watching a movie or TV show, the next time you find yourself cheering for the good guy, remember that there are two sides to every story. Our protagonists all have motivations, but so do our villains. Whether you find yourself lounging on the couch, in bed, or in the theater, watching the cosmic, eternal dance of Good and Evil, consider what you value and why you value it. Was the point of this essay to convince you to start backing the bad guys? Not at all. It is to get you thinking. It is to get you to consider things from a different perspective—something we all ought to do every now and then. “You are aware of my demand upon philosophers,” said Nietzsche—”that they should take up a stand Beyond Good and Evil.”[11]

[1] Nietzsche, On the Genealogy of Morals, p. 9, Preface, §6
[2] Id., p. 32, Essay 1, §13
[3] p. 62, Essay 2, §11
[4] p. 52, Essay 2, §6
[5] Aristocrat derives from the Greek aristos, meaning “best”
[6] Nietzsche, op. cit., p. 22, Essay 1, §7
[7] p. 81, Essay 3, §1
[8] p. 104, Essay 3, §11
[9] Mencken, The Philosophy of Friedrich Nietzsche, p. 61
[11] Nietzsche, Twilight of the Idols, p. 33
For further reading: On the Genealogy of Morals by Friedrich Nietzsche (2013)
The Philosophy of Friedrich Nietzsche by H.L. Mencken (2006)
Twilight of the Idols by Friedrich Nietzsche (2008)

Technology and Social Media: A Polemic


Much gratitude is to be given to our devices—those glorious, wonderful tools at our disposal, which grant us capabilities of which man centuries ago could only have dreamed, the culmination of years of technology, all combined in a single gadget, be it the size of your lap or hand. What a blessing they are, able to connect us to those around the world, to give us access to a wealth of knowledge, and to give longevity to our lives, allowing us to create narratives and tell our stories; and yet, how much of a curse they are, those mechanical parasites that latch onto their hosts and deprive them of their vitality, much as a tick does. That phones and computers are indispensable, and further, that social media acts as a necessary sphere that combines the private and public, creating the cybersphere—such is incontrovertible, although they are abused to such an extent that these advantages have been corrupted and have lost their supremacy in the human condition.


Technology is ubiquitous, inescapable, and hardwired into the 21st century, such that it is a priori, given, a simple fact of being whose facticity is foreign to older generations, who generally disdain it, as opposed to today’s youths, who have been, as Heidegger said, thrown into this world, this technologically dominated world, wherein pocket-sized devices—growing bigger by the year—are everywhere, the defining feature of the age, the zeitgeist, that indomitable force that pervades society, not just concretely but abstractly, not just descriptively but normatively. In being-in-the-world, we Millennials and we of Generation X take technology as it is, and accept it as such. To us, technology is present. It is present insofar as it is both at hand and here, by which I mean it is pervasive, not just in terms of location but in terms of influence. A fellow student once observed that we youths are like fish born in the water, whereas older generations are humans born on land: Born into our circumstances, as fish, we are accustomed to the water, while the humans, accustomed to the land, look upon us, upon the ocean, and think us strange, pondering, “How can they live like that?”


As per the law of inertia, things tend to persist in their given states. As such, people, like objects, tend to resist change. The status quo is a hard thing to change, especially when it was conceived before oneself was. To tell a fellow fish, “We ought to live on the land as our fathers did before us”—what an outlandish remark! Verily, one is likely to be disinclined to change one’s perspective, but will rather cling to it with tenacity, to the extent that it develops into a complacency, a terrible stubbornness that entrenches one further within one’s own deep-rooted ways. Such an individual is a tough one to change indeed. What is the case, we say, is what ought to be, and so it becomes the general principle whereupon we take our stand, and anyone who says otherwise is either wrong or ignorant. Accordingly, following what has been said, the youth of today, the future of humanity, accepts technology as its own unquestioningly. As per the law of inertia, things tend to persist in their given states—that is, until an unbalanced force acts upon them.


What results from deeply held convictions is dogmatism. A theme central to all users of devices, I find, is guilt; a discussion among classmates has led me to believe that this emotion, deeply personal, bitingly venomous, self-inflicted, and acerbic, is a product of our technological addictions. Addiction has the awesome power of distorting one’s acumen, a power comparable to that of drugs, inasmuch as it compromises the mind’s judiciary faculty, preventing it from distilling events, from correctly processing experiences, and thereby corrupting our better senses. The teen who is stopped at dinner for being on their phone while eating with their family, or the student who claims to be doing homework, when, in reality, they are playing a game or watching a video—what have they in common? The vanity of a guilty conscience, which would rather be defensive than apologetic. The man of guilt is by nature disposed to remorse, and thus he is naturally apologetic in order to right his wrong; yet today, children are by nature indisposed thereto, and are conversely defensive, as though they are the ones who have been wronged—yes, we youths take great umbrage at being called out, and instead of feeling remorse, instead of desiring to absolve our conscience of its intrinsic guilt, feel that we have nothing from which to absolve ourselves, imputing the disrespect to those who called us out.


Alas, what backward logic!—think how contrary it would be if the thief were to call out the poor inhabitant who caught him. Technology has led to moral bankruptcy. A transvaluation of morals in this case, to use Nietzsche’s terminology, is to our detriment, I would think. Guilt is a reactionary emotion: It is a reaction formed ex post facto, with the intent of further action. To be guilty is to want to justify oneself, for guilt is by definition self-defeating; guilt seeks to rectify itself; guilt never wants to remain guilty, no; it wants to become something else. But technology has reshaped guilt, turning it into an intransitive feeling, often giving way, if at all, to condemnation, seeking not to vindicate itself but to remonstrate, recriminate, retribute, repugn, and retaliate. Through technology, guilt has gone from being passive and reactive to active and proactive, a negative emotion with the goal of worsening things, not placating them. Digital culture has perpetuated this; now, being guilty and remaining so is seen as normal, even valuable. Guilt is not something to be addressed anymore. Guilt is to be kept as long as possible. But guilt, as I said, is naturally self-rectifying, so without an output, it must be displaced—in this case, into resentment, resentment directed toward the person who made us feel this way.


—You disrupt me from my device? Shame on you!—It is no good, say you? I ought get off it? Nay, you ought get off me!—You are foolish to believe I am doing something less important than what we are doing now, together; foolish to think it is I who am in the wrong, and, consequently, to expect me thus to put it away—You are grossly out of line—You know naught of what I am doing, you sanctimonious tyrant!—


When asked whether they managed their time on devices, one student replied, quite unsurprisingly, that they did not; notwithstanding, this serves as a frightful example of the extent to which our devices play a role in our lives. (Sadly, all the other students claimed they actually managed their time.) They were then asked some of the reasons they had social media, to which they replied: to get insights into others’ lives, to de-stress and clear their minds after studying, and to talk with friends. A follow-up question asked whether using social media made them happy or sad, and the answer was mixed: Some said it made them happier, some said it made them sadder. An absurd statement was made by one of the interviewees who, when asked how they managed their time, said they checked their social media at random intervals throughout their studying in order to “clear their mind off of things” because their brains, understandably, were tired; another stated they measured their usage by the number of video game matches played, which, once it was met, signaled them to move on to something else—not something physical, but some other virtual activity, such as checking their social media account. I need not point out the hypocrisy herein.


I take issue with both statements combined, for they complement each other and reveal a sad, distasteful pattern in today’s culture which I shall presently discuss. Common to all students interviewed was the repeated, woebegone usage of the dreaded word “should”:
—“I should try to be more present”—
—“I should put my phone down and be with my friends”—
—“I should probably manage my time more”—


Lo! for it is one thing to be obliged, another to want. Hidden beneath each of these admissions is an acknowledgment of one’s wrongdoing—in a word, guilt. Guilt is inherent in “shoulds” because they represent a justified course of action: One should have done this, rather than that. Consequently, the repetition of “should” is vain, a mere placeholder for the repressed guilt, a means of shedding some of the weight on one’s conscience; therefore it, too—the conditional—is as frustrated as the guilt harbored therein.


Another thing with which I take issue is how the two students described their means of time management. The first said they liked to play games on their computer and would take breaks intermittently by going elsewhere—either to their social media or to YouTube to watch videos. No less illogical, the other said they would take breaks by checking their social media, as they had just been concentrating hard. How silly it would be for the drug addict to heal himself with the very thing which plagues him! No rehabilitator plies their patients with alcohol; common sense dictates that stopping a problem with the very thing that is the problem in the first place is nonsense! Such is the case with the culture of today, whose drugs are their devices. In the first place, how exactly does stopping a game and checking some other website constitute a “break”? There is no breach of connection between user and device, so it is not in any sense a “break,” but a mere switch from one thing to the next—hardly commendable, but foolish, forasmuch as it encourages further usage, not less; as one defines the one in relation to the next, it follows that it is a cycle, not a regimen, for there is no real resting period, only transition. Real time management would consist of playing a few games, then deciding to get off the computer, get a snack, study, or read; going from one device to another is not management at all. Similarly, regarding the other scenario, studying on one’s computer and taking a break by checking one’s media is no more effective. One is studying for physics, and after reading several long paragraphs, sets upon learning the vocabulary, committing to memory the jargon, then solving a few problems—but one is thus only halfway through: What now?
Tired, drained, yet also proud of what has been accomplished thus far, one decides to check one’s social media—only for 30 minutes, of course: just enough time to forget everything, relax, and get ready to study again. This is not the essence of management; nay, it is the antithesis thereof! No sound state of mind could possibly think this reasonable. If one is tired of studying, which is justifiable and respectable, then one ought to (not should!) take a real break and really manage one’s time! Social media is indeed a distraction, but of a terrible kind, and not the one we ought to be seeking. Checking a friend’s or a stranger’s profile and looking through their photos, yearning for an escape, hoping for better circumstances—this is neither calming nor productive. A good break—good time management—is closing one’s computer and doing something productive. Social media serves only to irritate the brain further after exhaustion and is not healthy; instead, healthy and productive tasks, whose benefits have been proven, ought to be taken up—reading, taking a walk, or exercising, among other things. A simple search will show that any of the aforementioned methods is extremely effective after intense studying, promoting better memory, better focus, and better overall well-being—not to mention the subconscious aspect, by which recently learned information is better processed if put in the back of the mind during some other activity, such as the latter two, which are both physical and so bring with them both physiological and psychological advantages. In sum, time management consists not in transitioning between devices, but in transitioning between mind- and body-states.


The question arises: Why is spending too much time on devices a problem in the world? Wherefore, asks the skeptic, is shutting oneself off from the world and retreating into cyberspace, where there are infinite possibilities, a “bad” thing? Do we really need face-to-face relationships or wisdom or ambitions when we can scroll through our media without interference, getting a window into what is otherwise unattainable? Unfortunately, as with many philosophical problems—including the simulation theory, solipsism, and the mind-body problem—no matter what is argued, the skeptic can always refute it. While I or anyone could give an impassioned speech in defense of life and about what it means to be human, it may never be enough to convince the skeptic that there is any worth in real-world experiences. It is true that one could easily eschew worldly intercourse and live a successful life on one’s device, establishing one’s own online business, finding that special person online and being in love long distance—what need is there for the real world, for the affairs of everyday men? Philosopher Robert Nozick asks us to consider the Experience Machine: Given the choice, we can either hook ourselves up to a machine that simulates a perfect, ideal, desirable world wherein all our dreams come true and everything we want, we get—becoming whatever we always wanted to become, marrying whomever we have always wanted to marry—yet which is artificial and, again, simulated; or we can remain in the real world, where strife and struggle are inevitable, but so are triumphs, and where we experience pleasure and pain, happiness and sadness—but all real, all authentic. There is, of course, nothing stopping one from choosing the machine, and the skeptic will still not be swayed; but I think the sanctity of humanity, that which constitutes our humanity, ought never be violated.


What, then, is the greatest inhibition to a healthy, productive digital citizenship? What can we do to improve things? The way I see it, the answer lies in the how, not the what. Schools can continue to hold events where they warn students of the dangers of technology, advise them on time management, and educate them about proper usage of technology and online presence; but while these can continue ad infinitum, the one thing they will never change is our—the students’—want to change. Teachers, psychologists, and parents can keep teaching, publishing, and lecturing ever more convincingly and authoritatively, but unless the want to change is instilled in us, I am afeard no progress will be made. Today’s generation will continue to dig itself deeper into the technological world. They say the first step in overcoming a bad habit or addiction is to admit you have a problem. As I said earlier, technology just is for us youths, and it always will be henceforth; there will never be a time when there is not technology, meaning it is seen as a given, something essential, something humans have always needed and will continue to need. Technology is a tool, not a plaything. Technology is a utility, not a distraction. Social media is corrupting, not clarifying, nor essential. We have been raised in the 21st century such that we accept technology as a fact, and facts cannot be disproven, so they will remain, planted, their roots reaching deeper into the soil, into the human psyche. Collectively, we have agreed technology is good—but this is “technology” in its broadest sense, thereby clouding our view of it. We believe our phones and computers are indispensable—that were we to live without them, we would rather die. To be without WiFi is comparable to anxiety: an objectless yearning, an emptiness in our souls. How dependent we have become, we “independent” beings! This is the pinnacle of humanity, and it is still rising!
Ortega y Gasset, in the style of Nietzsche, proclaimed, “I see the flood-tide of nihilism rising!”¹ We must recognize technology as a problem before we can reform it and ourselves. A lyric from a song goes, “Your possessions will possess you.” Our devices, having become a part of our everyday lives to the extent that we bring them wheresoever we go, have become more controlling of our lives than we are of ourselves—a saddening prospect. We must check every update, every message, every notification we receive, lest we miss out on anything! We must miss out on those who care about us, who are right in front of us, in order not to miss out on that brand-new, for-a-limited-time sale! But as long as we keep buying into these notifications, for so long as we refuse to acknowledge our addictions and the problem before us, we will continue to miss out on life and waste moments of productivity—even if only a few minutes at a time, which, added up at the end of our lives, will turn out to be days, days we missed out on. As my teacher likes to say, “Discipline equals freedom.” To wrest ourselves from our computers or phones, we must first discipline ourselves to do so; and to discipline ourselves, we must first acknowledge our problem, see it as one, and want to change. As per the law of the vis inertiæ, things tend to persist in their given states until some force wills otherwise; but we are bodies animated with the vis viva: we have the determination and volition to will ourselves, to counter the inertia of being-in-the-world, of being-online, whence we can liberate ourselves and awaken, so to speak. We addicts have no autonomy with our devices—we are slaves to them. Until we break out of our complacency, until we recognize our masters and affirm our self-consciousness thence, and until we take a stand and break from our heteronomy, we will remain prisoners, automata, machines under machines. We must gain our freedom ourselves.
But we cannot free ourselves if we do not want to be freed, if we want to remain slaves, if we want to remain in shackles, if we want to plug into the machine. A slave who disdains freedom even when freed remains a slave. Consequently, we cannot be told to stop spending so much time on our devices, to pay attention to whom or what is in front of us; we must want to ourselves. Yet no matter how many times or by whom they are told, today’s youth will never realize it unless they do so themselves. They must make the decision for themselves, which, again, I must stress, must be of their own volition. Until then, it is merely a velleity, a desire to change, but a desire in-itself—nothing more, a wish with no intent to act. It is one thing to say we should spend less time, another that we ought to.


¹Ortega y Gasset, The Revolt of the Masses, p. 54

Against Prattle: a Philippic

The following treatise—more of a brief polemic, really—is a collection of reflections concerning the ethics of false speech. I refer to it as a Philippic, after the written and verbal attacks first delivered by the Greek rhetor Demosthenes to denounce Philip II, and have written it as part social criticism, part moral essay. If you would rather listen to and watch the diatribe than read it, click here.


The Buddha counts misconduct of the tongue among the vices to be abandoned—right speech being one limb of the Eightfold Path—and this vice, millennia later, still exists as a ubiquitous problem that, like a tick, burrows in the skin of, taints the blood of, and festers in the soul of its victim, unrelenting in its ways, corrupting our society, and inflicting upon our rationality insipidity and vacuousness, which, in turn, strip us of our communicative and contemplative functions; over time it becomes all too comfortable as it finds its way into our everyday interactions. We are powerless against this latent evil, and we know not when we are consumed by it, for we feel the urge to act upon it, failing to consider what we really are doing.


Having not yet come to terms with themselves, having not yet established their place among their peers, and having not yet considered the trifles whereof they speak, the youth of today have failed immeasurably to understand the importance of that to which they contribute their input. Whence this problem has arisen is beyond me, inasmuch as the prevalence of this particular epidemic is by now universal. It is due, possibly, to the increasing globalization of texting, a form of communication that has undermined the fundamentals of language, both socially and digitally, leading to the utter disregard and complete ignorance of proper conversation and the destructive neglect of conventions in grammar, which, in turn, has created countless neologisms, limitless acronyms, and egregious shorthand, all of which has stemmed from our technology, since the culture of today is so influenced by it.


So what exactly is the nature of the misconduct in speech to which I refer? To what extent does misconduct reach in relation to the tongue? The prattle with which I concern myself is that which has no practical value, no constructive merit—or, more specifically, language used not for the betterment of the individual, i.e., for their character or their rationality, but for empty entertainment, i.e., the consumption of time, or for external matters such as diurnal occurrences and social conference. When speaking of the former, I speak of right conduct: speech used to further one’s morals and virtues or to further one’s thoughts and ideas. Such speech can be considered constructive, insofar as it enables the interlocutors to engage in discourse that will have a lasting effect, whereas engaging in the latter provides no such resolution.


Proper usage of speech, then, consists of structured, formal talk, which will benefit not just the talker but also the listener—the useful benefit being the capacity to expand upon ideas, not the capacity to inquire into the happenings of another’s business, found so commonly in the chatter of those not practiced aright in the art of conversation. And what is to dissuade us from said prattle? to inspire us to partake in constructive dialogue? Just as it is the job of the mother to nurse and raise her children, and not the opposite, so too should conversation stimulate and enlighten, and not the opposite; it is a shame when the mother chastises and abuses her young, just as it is when discussion dulls and deteriorates the minds of its users. We must refrain from reducing ourselves to useless talk, evidently, for it, like a car with no engine, will stall, will remain idle, and will get us nowhere, its only success being a waste of our time—precious time.


And what does this look like, exactly? While I hearken to the frivolous matters discussed nowadays, I cannot help but ask why. It is like a burning in my mind—not exactly a physical sensation, but a yearning, a desire for something that will bear fruit, which can be consumed and then digested; and, like the natural desire of hunger, it will continue in a cycle. Of what use is talking about the small matters of your day? of speaking ill of those who have wronged you in the slightest? of colluding, viciously, behind the backs of your friends? of complaining incessantly of that which has no effect on you, or of that which rests outside of your control? To what end does this lead? So, in my moments of velleity, I ask of myself and of my peers: Where is the excitement? the passion? Where is the intense fervor we so frequently seek in life? Since this life is limited, it is in this time—time of conversation, time of being with friends—that we should exchange not playful persiflage but confrontation, debate; for a unilateral conversation bears no seeds, merely fruits, and it is the seed, from which ideas burgeon, that we desire. The fruit will, in time, come; but it is the journey, not the destination, that matters, insofar as the learning, the stimulation, comes directly from the discourse. From conflict comes resolution, not the other way around—just as, for a student, learning comes not from the finished paper but from the composition of it, which is exactly what we are looking for when we converse. Indeed, our time would be better used discussing big ideas, ideas that will inspire the aforesaid debate, for such discussion creates a connection between those involved and will hook them. We should be debating philosophy, history, politics, values, psychology; we should be debating the arts of the free man! for we, after all, are free, and thus we desire a fulfillment of our needs.
How one should act, ethically, should always take precedence over whatever minor misadventure another has committed one fateful day, seeing as the former is practical, whereas the latter is trivial and should be kept to oneself.