Idiocy: A Formal Indication of Technological Alienation

Although the problem of alienation has always plagued man in various ways, where alienation names the condition of being disconnected from what is essential to one’s being, I believe the 21st century has radically transformed the phenomenon of alienation. Specifically, modern technology in the form of personal devices, from phones to computers to headsets, and social media—both of which, to be sure, have benefited us immensely in easing work, connecting us with each other, facilitating communication, storing information, etc.—creates a universal framework under which alienation manifests itself as an objective condition. This distinctive technological alienation of the self from itself, others, and the world I call “idiocy.”


The ancient Greeks believed that man was by nature social, that is, a political animal. As such, to be cut off from the social sphere, to be aloof, to keep to one’s own, to be a private citizen—such a person was an ἰδιώτης, an “idiot,” for one was concerned solely with what was one’s own (ἴδιος). And insofar as our age is a technological age, it paradoxically comes about that the greater our connectivity becomes, the greater our privacy grows proportionally: the diminution of distance, the convenience of contactless contact, the fragmentation and proliferation of tastes and interests, the mediation of self-presentation, and the blocking-out of the non-immediate all contribute to a contemporary alienation that is best summarized as “idiotic.” The negative connotations are impossible to dissociate from the term, and a full reclamation is undesirable; in fact, that negativity is necessary since, just like alienation, the phenomenon is inherently privative: to be idiotic is to be deprived of and cut off from the world at large and even, to an extent, oneself.


The self is prone to self-estrangement when it is involved with technology, particularly on social media. At first glance, this seems untrue, because if anything, social media encourages users to present themselves how they see fit, in accordance with their sense of individuality. Deciding what to post, for example, requires reflexivity: I consult myself, consider what is most appropriate, then choose whether to post or not. Thus, social media brings me closer to myself. However, this argument presupposes that such attentiveness is always good, when it is also the case that the more attentive to myself I become, the less myself I am capable of being. In truth, I alienate myself through the curation of my content. Social media promotes the compartmentalization of the self; it is not uncommon for someone to have a public account, which is open to all followers regardless of intimacy, and a private account, whose audience is much smaller and which gets to see another, typically truer side of the poster. While the idea of the private account would presumably counter the idea of self-alienation, in fact it proves it, for its very existence points to a self-differentiation. It is not just that certain people are cut off, but the user themselves is cut off from themselves, so that, Janus-like, they lead a double life, suffering a split consciousness.


The more control I exert over myself in how I present myself, the farther I drift from myself; it is as if, in splitting myself, in having to manage these separate lives of mine, I have already admitted my loss of control, if only implicitly. Whereas it appears we are hiding certain facets of our lives from others, keeping them out of the spotlight, highlighting only those facets we deem appropriate, it is more accurate to say that we compel ourselves to hide ourselves from ourselves. Again, the self-reflexivity of this idea only seems contradictory: the withholding of parts of ourselves, often the most delicate and deeply held, does involve a recourse to the self in the end, yet it also promotes such a heightened self-consciousness that, like Dostoevsky’s Underground Man, we are forced at every moment to justify our choices or fall into paralysis and indecision. It gets to the point where “we’ll contrive to be born somehow from an idea” (130), because “we feel a sort of loathing for real ‘living life,’” where living is nothing but “service” (129) for our image, and nothing in its own right.


Rather than life being pictured, we picture life to ourselves. The intensity of self-absorption is nothing more than the externalization of the self by itself, because we see ourselves through the eyes of others, whether we imagine a particular person or an anonymous viewer; either way, we estrange ourselves through our self-preoccupation. Accordingly, following Marx, we might say that by laboring on the self and commodifying it, we objectify ourselves into an image which, being external, is no longer really ours; it is “sold” as a product, as it were, on the cybersphere of Facebook, Instagram, Snapchat, etc., from which it derives its value in terms of validation. We are no longer a source of our own value. Consequently, our own, private, inward life becomes privatized; the “self” takes on an other self of its own, and we succumb to idiocy.


The clearest example of idiocy that technology fosters is visible out in public, where privacy now reigns. Society is become idiotic. Tocqueville, in the second and fourth chapters of volume II of Democracy in America, and Marx, in “On the Jewish Question,” both identified and cautioned against an early form of this idiocy in liberal democracy, in the form of individualism. Each faulted liberalism for solidifying the idea of the self-sufficient, independent being, the citizen who was defined not as part of the community but as apart from the community, and whose particular tastes, beliefs, and interests set him on a path of his own, parallel to or divergent from the likes of his fellow men. As a result, people are encouraged to withdraw from the public into the domestic sphere or into their idiosyncratic affairs, unconcerned with anybody else. It is a process of atomization and monadization—in short, the creation of windowless, self-impelled units.


21st-century technology exacerbates this to a profound extent: It is not just the case that the spheres of life are abstractly distinguished; rather, it is entirely appropriate to state that we live in our own worlds. This is a commonsense observation, to be sure, as each of us lives in our own world inasmuch as we see the same world through our own eyes, etc. However, the privatization of the world under technology is of an altogether radical character: Phenomenologically, my world seems no longer to be merely a slice of the unitary, indivisible, intersubjective world; rather, through the mediation of devices, my world is constituted subjectively in toto. Three examples will illustrate this: (1) earphones, (2) streams, and (3) virtual reality (VR).

  1. The mention of walking down any street is sufficient to introduce this. Each of us walks to the town center, to class, or simply for leisure while we listen to music or podcasts or talk to someone on the phone. In this case, the world becomes the background of our engagement, but nothing in itself; it fades back, out, away from view, only the immediacy of our destination—say, the library—being our intention, if it is seen at all. Nature, conversation, gravel underfoot are all bracketed out and silenced beneath the blare of tones. One is prone to accidents, either walking through a crowd or crossing the street, if one is not sufficiently peripherally aware. Existence is characterized here by its marked obliviousness. In public, amidst others, I am private, absorbed in my own echo chamber. On my way to class, for all intents and purposes, only I, immersed in my tunes, exist. I am an idiot.
  2. Live broadcasting services, like YouTube and Twitch, have popularized the simultaneity of experience; no longer is consciousness itself a stream, but rather it is something to be streamed to others in the process of its actualization, albeit vicariously. A decade ago, it struck one as odd, perhaps deranged, to see a person walking down the street, holding a phone or camera in front of themselves at arm’s length, seemingly talking to themselves. Now, however, as it has become mainstream, this is “what one does”; it is, to be sure, still exceptional, but by no means unconventional. I and several thousand others worldwide may be watching and listening to a complete stranger living out their life, all the while feeling a strong, albeit illusory, sense of connectedness to them. I have a window into their life; I am a part of their life. Except that I am not. For the live streamer, their audience of ten thousand watchers who “tune in” regularly is more “real” to them in that moment than are the very enfleshed people who walk beside them on the street, who are more distant in that instant than those who are hundreds and thousands of miles away. The streamer who streams to me, and I who consume the stream—we are idiotic.
  3. Lastly, the rise, spread, and promise of virtual reality threatens us with the real possibility of solipsism, the consummation of idiocy in which the individual becomes a world unto themselves, self-sufficient, not needing anything external; the concrete gives way to the abstract, i.e., the virtual, which is Nothing and Nowhere. At this point, particular products, e.g., VRChat and the Oculus Rift, are marketed as individual commodities, which is fitting enough. Virtual space, like all other space, is unitary and indivisible, but it is this feature which makes it liable to complete privatization, that is, when this phenomenological privatization is commercially privatized. My choice to enter a fantasy realm particularly fitted to myself is sold as something to me myself, as a private consumer. The world, which is become my world, is something to be consumed, integrated into me. Hence, the world is subjectivized, nothing but a projection of myself within which other people—if there are other people—are permitted to exist. Nozick’s experience machine, Huxley’s soma—these formerly virtual ideas are now actualized as virtual. Like Aristotle’s self-contemplating god, we can retreat into our rooms, turn off the lights, tuck ourselves away, and reside within an artificial world, taking leave of all that is around us, all that is real, in favor of the most unreal reality. This is the most idiotic thing one can do.

In each of these phenomena, the following traits are observed: The derealization of the immediate and real, the exclusion of the world, the mediation of living, the inattentiveness of the individual, the abstraction of experience, the centralization of the knowing subject who is elevated to divine spectatorship (the contemplative life!), and the toleration of others. Idiocy, therefore, refers to the contemporary living-out of life in which, to varying degrees, the aforesaid formal qualities disclose themselves. Technology, functioning as an objective reality, that is, as something which, transcending individual devices, comes to structure experience itself, mediates life to such an extent that this mediation, which is always a mediating-for…, is taken to be essential for the subjective.


In other words, technological life is subjective life; all is related through the subjective, to the point of appearing to be a product thereof. The world exists for me, by means of my device, meaning that it is my world in which others can live. But a life that is mine alone is idiotic. Such an existence cannot be anything but alienated. One is estranged from oneself, from others, and from the world. The joy I take in expressing myself, exploring the world, or being with others is mediated through a veil of artificiality and distance. I am far from myself when I am absorbed in myself; I am closed off from the world when I enclose it into a frame or a virtual sphere; I am most apart from others when I am nearest to them. When life is reduced accordingly, then “It is [but] a tale told by an idiot, full of sound and fury, signifying nothing” (Macbeth, 5.5, ll. 25-7).

 

 

 

Source cited:


Dostoevsky, Fyodor. Notes from Underground. Translated by Richard Pevear and Larissa Volokhonsky, Vintage Books, 1994.

Max Weber on the Role of Science in Society (1 of 2)

In 1917, the German sociologist Max Weber delivered a speech titled “Science as a Vocation” (Wissenschaft als Beruf) at the University of Munich. It would be published two years later, a year before his death in 1920. The 19th century had been a productive century for science, which was no longer known as “natural philosophy.” Various branches, like chemistry and biology, were codified and given the systematic rigor by which we know them today, while influential ideas like Charles Darwin’s theory of evolution and Hermann von Helmholtz’s formulas on the conservation of energy drastically revised our conception of the Universe. But the success of science was not just theoretical, for it was also applied to everyday life, leading to the invention of lightbulbs, cars, and cameras. Then, beginning in 1905, when Einstein and his contemporaries revealed the quantum realm and the relativity of space and time, Weber lived through the overturning of the Newtonian, or classical, paradigm of physics, which had held sway for over two centuries, since the late 17th century. Thus, from amidst the First World War, Weber’s reflections on the role of science, and on science as a calling, can perhaps speak to us at a time when science’s hold is even greater.

The State of Modern Science


Having been in academia since the early 1890s, Weber was able to see the development of the scientific profession around him. It is important to note that in German, the term Wissenschaft refers to “knowledge” in general, not necessarily science in the sense of the hard or natural sciences; therefore, the social sciences and the humanities, too, fell under the title. Nonetheless, it is evident throughout his lecture, by references to either individual scientists or else the disciplines themselves, that for the most part, he has in mind the natural sciences. And it is just this feature, namely the fragmentation of the sciences, which he addressed first: “A really definitive and good accomplishment,” he observed, “is today always a specialized accomplishment” (111b-c*; emphasis mine).


Roughly until the 19th century, and more so before the 18th, before science assumed its official role, and when it was still called natural philosophy, the study of natural phenomena was a bit more unified, such that many people who contributed in some way to math or science, like Blaise Pascal, Gottfried Leibniz, and Johann Wolfgang von Goethe, were polymaths who had expertise in several domains, not just one. By the 1900s, though, particularly in Western Europe, science as a unified system rendered this well-roundedness practically impossible with its increasing academic divisions. If one wanted to answer to “the inward calling [Beruf] for science” (111b), then one had necessarily “to put on blinders” (111c), realizing that it was a narrow commitment to which one was devoting oneself. A biologist can do biology very well, since that is what she was trained for, but if she were asked to study a supernova, then she would be close to useless. To be sure, her scientific training would give her a baseline, but as Weber indicated, she would by no means be accomplished.


For Weber, this was a drawback in pursuing science, but he realized something more important—that the investigations and experiments that a scientist carries out are not deadening, emotionless, or stultifying. The image of a sanitized researcher working in a white coat, surrounded by equipment and charts and figures, contains a few kernels of truth, but it is by no means representative; what this stereotype of the scientist conceals is the humanity of science, the fact that “science” is not some monolith but actually a collection of scientists, that is, people, real people, whom an inward drive compels onward in the search for—well, many things: truth, power, success, progress, etc. Although much of modern science relies to a high degree upon mathematics, this does not by any stretch of the imagination mean that scientific work itself is mathematical. On the contrary, the design and execution of experiments require an experimenter who devises them, who sets out with an idea in mind, a hypothesis that he is excited to test out because there are things at stake!


In this respect, Weber interestingly compared the scientist to the artist in that both of them require creativity, intuition, and inspiration for their work. One of the greatest scientists, with whom Weber would have been well acquainted, comes to mind here: the legendary Albert Einstein, for whom science was impossible without intuition. Many of Einstein’s insights, for example, came from vivid thought experiments and revelatory flashes, things that could not be found in the laboratory or at the research table. This is why I think that Weber would not fear the growth of A.I. in science: if he were alive today, he would argue that a program or algorithm could never do the work that a human scientist does because it is not about computations or simple strings of processing; rather, science is driven, as we have said, by the force or passion within us that cannot be reduced to a binary code, the force that gives rise to seemingly outlandish theories and the madness to test them, to think outside the box. Science, like the humanities, cannot be automated—the calculations can be calculated, obviously, as is being done right now, but not the science.


The humanness of science, however, also has a negative tendency, one which is not reserved just for Weber’s time: the tendency for the scientist to become, in Weber’s words, a personality, that is, a celebrity. This happened to Einstein. Often, and for understandable reasons, a scientist who achieves a breakthrough and is regarded as a genius will become revered and sought out by everyone—not just the experts, but the media, the pundits, and the average person. Suddenly, this figure is put into the limelight and expected to give their opinion—not necessarily a scientifically based one—on such diverse matters as global politics, warfare, the existence of God, love, the meaning of life, whether pineapples go on pizza, etc. The 19th century had Thomas Huxley, “Darwin’s bulldog,” and Ernst Haeckel, for example; and more recently, we have had the likes of Richard Feynman, Stephen Hawking, Neil deGrasse Tyson, Bill Nye, Richard Dawkins, Sam Harris, and others.


The problem that Weber was addressing was not that scientists were being asked such questions, nor that they were answering them; instead, it seems to have been the fact that the questions were being asked because they were scientists, with the expectation that, owing to their “genius” and “privileged insight,” their answers, despite falling outside their area of expertise, would thereby retain the character of being “scientific” and hence automatically correct and unassailable—in a word, dogmatic, and taken for granted. Weber preferred that a scientist’s work speak for itself; a humble researcher who dedicates herself to her work without making a fuss about it or herself is a testament to success, according to him. The question of success leads to another insight of Weber’s: “[I]t is the very meaning of scientific work… to be ‘surpassed’ and outdated,” with the “hop[e] that others will advance further than we have” (113b).


Here, Weber returned to his comparison of the scientist and the artist, this time negatively; for whereas the artist’s work is relative—one can judge for oneself whether it is good or not, and no painting or style, e.g., neoclassical or cubist, can be said to have “advanced” beyond the other, except in purely chronological terms—the scientist’s is the exact opposite, having an objective and progressive basis for judgment. It is tempting to describe scientific advancements as “linear” because they move forward, but any look at the history of science will immediately disprove such a straightforward interpretation; therefore, I think Weber would agree more with “progressive,” since even a “wrong turn,” so to speak, like Newtonian physics when viewed in retrospect, is never entirely discarded or made obsolete. Hence, Newton is just as important today as he was between 1700 and 1900 because, in being overridden (at the subatomic level, that is), he provided the necessary grounding for future developments. Having said this, Weber asked an incisive question: What, then, is the point of science, if it seemingly never ends and if, viewed from outside, it consists only of theories that happen to be truer than the previous ones, only to be discarded ad infinitum?

The Disenchantment of the World


The most obvious value of science is its material contribution to society. Thanks to science, we live with unprecedented comfort, health, and convenience. Through the collective research of scientists, others—engineers, technicians, and inventors—can apply the findings of even the most abstract or seemingly ungrounded principles of the natural sciences, like friction, the universal law of gravitation, and the chemical properties of substances, in order to create things of utility. Thus, the “applied sciences.” Besides practicality, science is valuable for another reason: It has given us an intellectualized and naturalized account of the world and its innermost workings.


Weber was quick to qualify this statement, for while it is true that we understand the world better scientifically, this “we” does not actually mean “all of us.” As we have noted, science became highly specialized over a century ago, and it has only become more complex over time. Consequently, it is more accurate to say that science has provided us the possibility, or the capacity, to understand the world. How many of us know the ins and outs of the hardware and/or software of our computers? Or how about the nanochip, touch screen, or GUI of our smartphones? Indeed, how many of us, which is to say those of us who are not professionally educated in the sciences, have forgotten the basic scientific facts that we learned back in our schooling, since elementary school? The facts are all there, available to us, but the sheer extent, complexity, and sophistication of many of them prevent us from achieving the same level of understanding as a Ph.D. student, senior researcher, or engineer. Regardless, we still rely on these people and their findings to keep the world running. Our devices work, and that is enough for us.


Overall, this scientific intelligibility led Weber to assert one of his most well-known theses: “[O]ne can, in principle, master all things by calculation. This means that the world is disenchanted” (114a; emphasis mine). By this, Weber meant that science had basically ruled out any superstitious or religious understanding of the world. In this way, fulfilling the mission of the 18th-century Age of Enlightenment, science has liberated us from the darkness of irrationality. But for some, this liberation is not a positive thing; on the contrary, it could be argued that science’s supposed liberation is just a new form of bondage that goes by a new name. A discontentment with this stifling rationality can be heard in a familiar criticism of modern science: “[T]he intellectual constructions of science constitute an unreal realm of artificial abstractions, which with their bony hands seek to grasp the blood-and-the-sap of true life without ever catching up with it” (114d).


Martin Heidegger, a 20th-century German philosopher, saw modern science as being rooted, counterintuitively, in what he called the “essence of technology.” According to his analysis, science frames the world in terms of objectivity, which really means measurability and calculability. Heidegger’s concern was not that science was false or evil, but that, having become the arbiter of truth and reality, it gets to decide what is real or not, based solely on its own criteria, viz. experimental, measurable demonstrability; anything else, if it is not measurable or if it cannot be conceptualized objectively, is unscientific, and thus unreal. As a result, Heidegger’s view was not quite the same as Weber’s: Whereas we might say that Heidegger built upon Weber in targeting scientism, Weber was more lamenting, or maybe just observing, that science obviates the need for myths, transcendent ideals, and supernatural explanations.


Recently, Weber’s disenchantment thesis has been challenged. Josephson-Storm’s book The Myth of Disenchantment (2017) complicates Weber’s declaration, showing how the perceived threat of disenchantment is actually a means of resurrecting enchantment. He writes about how Weber himself, despite pronouncing the end of enchantment, engaged in the very practices whose deaths he announced, like occultism. I don’t find this a compelling dismantling of Weber’s position, though. Just because there were still lots of people, not just in Weber’s time but also in our own, who were fascinated by magic, cults, and spiritualism does not change the substance of what Weber was saying; I believe the diagnosis of disenchantment is a bit more nuanced, insofar as the respectability and authority of science is generally accepted worldwide, in comparison to which everything else is unscientific. The point is not that science has outright eradicated alternative forms of knowledge; rather, it has (mostly) successfully invalidated, or at least challenged the foothold of, them. Outliers, of course, exist; Shintoists in Japan or Native peoples in North America certainly retain their enchantment. I do not think Weber was naïve enough to overlook this, or to think that science had simply wiped out its competition.


Furthermore, Josephson-Storm characterizes the rise of new forms of enchantment as either “ironic” or “paradoxical,” as though this directly disproved Weber’s point; however, again, I think Josephson-Storm is off the mark, for Weber himself predicted this very phenomenon (115d), i.e., the reactionary resurgence and growth of religiosity and irrationalism. The whole point is that these emerge in response to disenchantment; hence, the fact that people today believe in QAnon, astrology, New Age spirituality, neopaganism, haunted houses, etc., or that Weber was personally interested in such ideas, is precisely evidence for, not against, disenchantment. These are active attempts to fight back against scientific rationality and to preserve or bring back a sense of lost enchantment.


Leaving aside the legitimacy of disenchantment, we arrive at another of Weber’s key propositions. He cited the Russian novelist Leo Tolstoy, who believed that in the modern world, meaning was unattainable. An interesting distinction is brought up between being “satiated with” life versus being “tired of” life, the former belonging to pre-modern life and the latter to modernity. According to Weber, Tolstoy found death meaningless due to science, and by extension, life, too. It is difficult to tell whether Weber discussed Tolstoy because he agreed with him and felt similarly or whether he merely brought him up as an example of such a view. His wording does not seem to betray any partiality, so I’m not sure if Weber himself thought that science rendered death and life meaningless. What exactly it means for death to have meaning or not remains unclear to me; nor does the logical leap from death’s being meaningless to life becoming meaningless seem to me entirely justified. Whatever the case, Weber’s reference to Tolstoy was meant to problematize science’s role in everyday life.

 

 

Notes


* The letters (a-d) after the page numbers designate the quadrant in which the quote appears, as I am relying upon Encyclopædia Britannica’s The Great Books, Vol. 58, which is printed in two columns.

Read a digital copy of the speech here

TikTok and Trends

In the absence of school, work, and other obligations, and in the presence of our devices, which for the time being are our only means of access to our friends and family and the “outside world,” what better way to spend one’s newly acquired leisure time than to lie on one’s bed and entertain oneself by scrolling through one’s TikTok feed and watching the latest trends as they play out on the “For You” page? Whether this is better classified as “using” or “wasting” one’s time, for many it is their only way of staying sane; love it or hate it, TikTok serves as a community in these times, an outlet where people can interact with others and express themselves, get a laugh, or maybe make new friends. During times of crisis, we look for comfort in humor and other people. And when you pair this with the fact that everyone is locked in one place, with nothing better to do, you get a recipe for immense productivity and creativity, everyone looking to outdo each other in their jokes and skits. As a result, we witness dozens of trends on TikTok, some funny and original, others not so much, but all of them united by one thing: time. That is, while the content might differ dramatically, it is the form, or character, of trends that remains universal, namely that they all last for a brief period of time before “dying out,” or becoming unfunny and overused, then abandoned. In this post, inspired by a TikTok live stream, I want to explore what a trend is, what role TikTok plays in trends, and what makes trends problematic.

What are trends?


What is a trend? The answer would appear obvious, seeing as we have all experienced trends. It is, simply, a temporary popular movement; it is when a lot of people like something for a short period of time. However, we can also get technical because, on the sociological level, there are different ways of classifying collective behaviors. For example, we might now ask, “What is the difference between a trend, a fashion, and a fad?” Some will answer that a fashion is more historical, a fad more crazed, and a trend more lasting. Right away, though, we come up against the conflict of the lay and the educated: often, our attempts to classify, that is, to be scientific, are opposed to the way we experience things as they really happen. In other words, language is shared and, for lack of a better word, ordinary; rarely would we stop to consider and debate the merits of a fad versus a fashion. In everyday life, we do not speak so precisely. This ambiguity is evident in the way we speak for the most part: we say that a video “is trending,” or there is a “trending hashtag,” or it is “fashionable to….” It would seem, then, that a classification is not appropriate here. Again, we settle with the common consensus in saying that a trend is a short-lived burst of attention and attraction to a behavior or appearance. All trends tend; each movement is directed toward something, follows a course.


To explain how this collective behavior comes about, we can look to one of the founders of crowd psychology, Gustave Le Bon, who in 1895 published The Crowd, initiating the academic interest in mass movements. According to Le Bon, a crowd is distinguished from an ordinary group by two criteria:

  1. deindividuation
  2. the law of mental unity.

In order to be a crowd, the members of the group must give up their sense of personhood and have a common purpose. Hence, numbers do not matter; a crowd can be three people or it can be 50, just as long as it believes the same thing. Our idea of mob/herd mentality, or of a “hivemind,” originates from Le Bon’s work, in which he writes that the group assumes a collective mind, one that speaks for everyone involved. The collective mind is like the Leviathan in the English philosopher Thomas Hobbes’ political theory, the sovereign who, by representing all individuals, thereby takes away their freedom. Since it is a “collective,” this mob mentality is greater than the sum of its parts, making it an entity of its own. No longer do the members make their own decisions; the mind makes them for them, and they obey it. It is as if each member dissolves into the collective.


Clicking on the sound of a TikTok, one sees everyone else who has used that same sound, and sees, more importantly, the repetition which occurs. It is usually the case that, as one scrolls through the “For You” page, one skips over the videos without much thought; it does not matter to us who made the video, unless, for some reason, it makes an impression on us; but what this shows us is that every single person who contributes to a trend on TikTok is essentially forgotten, overlooked by the bigger figures like Addison Rae, so that it would seem they are but a part in a big machine that rolls on without them. They are mere footnotes in trend history.


Another thing Le Bon observed about crowds is their susceptibility to influence, which is made possible by irrationality. It is very easy, he said, to use specific words in order to bring about action. Words are powerful because they conjure up images, emotions, and connotations. We act “as if [short syllables] contained the solution of all problems,” Le Bon wrote (The Crowd, 96). These “short syllables,” moreover, are more powerful depending on their vagueness. When we think we know what a word means, when it awakens an association within us, we are subject to manipulation. Someone can easily shape a crowd’s perception by abusing language, cloaking or redefining a word—e.g., chivalry devolves into “simpery,” making an otherwise-positive gesture negative—a problem to which I will return later.


The most important implication of the crowd, though, is their attitude toward truth. This is particularly problematic today because we are living in a post-truth era, when objectivity is discarded. Not only do crowds inherently believe anything, but the added skepticism of our age only worsens this tendency. As such, the psychologically and now-historically conditioned disregard of truth endangers our communication. Only ignorance can follow hence.


Of course, Le Bon was writing over 100 years ago and, since his time, we have come up with more updated theories of social behavior, like emergent-norm theory, according to which a crowd will form when we are confronted with a confusing situation and need a strong principle to follow, and value-added theory, which states that crowd formation requires

  1. awareness of a problem
  2. tension
  3. common beliefs
  4. provocation
  5. organization
  6. reactivity.

Both of these sociological theories try to develop Le Bon’s by rationalizing individuals’ behaviors. 

What is TikTok?


Now we can look at the exact role that TikTok plays and how trends work there. To do this, it is important that we understand the function of TikTok. As a social media application, TikTok assumes its role as an extension of the public sphere. The public sphere is where we interact with others. Schools are a form of the public sphere because, in between classes, we get to talk with our peers and socialize. A better example would be any city, as that is clearly “public”; we are able to go to a coffee shop, order a coffee, and immerse ourselves amidst other people. But sociologists see the public sphere as doing more than just allowing us to socialize; fundamentally, the public sphere allows for socialization, “the lifelong process in which people learn the attitudes, values, and behaviors appropriate for members of a particular culture” (Schaefer, Sociology, 9th ed., p. 99). Put another way, the public sphere is where we are educated culturally and socially. Thus, we can see how TikTok might perform this task of socialization, because it brings a bunch of people together in one place to learn and enforce what we should and should not do.


However, it might seem strange to describe TikTok as a public sphere—and rightly so. Earlier, I described it as an “extension of the public sphere,” which is more accurate. In fact, TikTok is unique because it constitutes a new sphere, what we would call the cybersphere. See, unlike a school or a downtown plaza, TikTok cannot be located on a map; I cannot say, “I’m going to TikTok to see a video.” Unlike the public sphere, TikTok’s cybersphere is virtual: it is spaceless. Recently, sociologists have accepted that crowds can now form without being in contact with one another (recall that Le Bon discounted quantity). Crowds are a type of “secondary group,” a gathering of people who do not know each other, are not close, and do not meet up frequently. TikTok users come from all over the world, and TikTok, while being a social media app, is not like Instagram or Facebook, which try to develop connections; rather, it operates on short, impersonal interactions.


One consequence of this is anonymity. Le Bon said that a crowd consists of deindividualized members, people who, in joining the crowd, lose their self-awareness. Likewise, on the Internet, or on TikTok, users (the fact that we call ourselves “users” demonstrates this very impersonality!) can create their own profiles, which means making up a name for oneself, ridding oneself of one’s identity. At school, people know our names, know who we are; online, however, we are a blank slate, so nobody can hold us accountable. This is what makes cyberbullying prevalent: we cannot be held responsible because nobody knows who we are behind a screen. Putting this all together, one comes to a frightening thought: if the cybersphere simultaneously socializes—tells us what to value—and deindividualizes—takes away responsibility and selfhood—then to whom are we listening, and from where are we getting these so-called values? The psychoanalyst Erich Fromm called this “anonymous authority”—when we adopt values from seemingly nobody. After all, we can say that a trend on TikTok is perpetuated by individuals, and perhaps put together a chronology of who said what when, but at the end of the day, the truth is that it is not just one person to blame; on TikTok, values are truly anonymous (the word literally means “without a name”).


Yet we can still add to this because Le Bon noted that a crowd is led; any crowd requires an opinion leader, someone popular or respected whose voice galvanizes. One of the core values of many TikTokkers is originality. People who use TikTok scorn those who copy something without crediting the creator. The original poster, the trendsetter, the one who sets the trend in motion, thus assumes the role of opinion leader. An example should suffice: the use of “Simp Nation” started by polo.boyy quickly spread, with many making their own spin-offs and commenting on others while tagging polo.boyy asking, “Is (s)he valid?”—i.e., do they live up to the original? Let us explore another aspect of TikTok now.


In sociology, gatekeeping is the process of filtering information. Media like CNN and Fox, for example, are gatekeepers because they let in certain information based on their agendas while blocking other information from getting through. CNN is more liberal, Fox more conservative, so they approve of different norms, which influences their output. Gatekeeping exists to protect and perpetuate dominant ideologies in a culture, beliefs that are held by powerful groups and which allow them to hold power over others. This leads to the oppression or silencing of certain minorities in most cases. So is TikTok a gatekeeper? At first glance, it would appear not. The question seems extreme. TikTok is not a news organization, you might say, so there is no need to block things. But is that so?


A look at the algorithms should tell us… only, we cannot look at them because TikTok, run by a Chinese company, does not make its algorithm public. However, efforts have been made to understand at least a little about the algorithm, such that we know it operates according to a process called “collaborative filtering,” which makes predictions based on our past history as well as what other people like. The videos that appear on our “For You” page are therefore tricky at best. Several experiments have been conducted to show that, based on one’s liking tendencies, certain viewpoints become favored. This seems like common sense. What makes this troublesome, however, is the blurred distinction between description and prescription: is TikTok recommending things that we really like or that we should like? Is it just building off our preferences or imposing its own? Does it describe us or prescribe to us?


On the one hand, we users are responsible for what we choose to like and dislike, which influences what we see; though on the other, it is possible for the algorithm to disproportionately impose certain views on us, regardless of our liking for them—it assumes our likes, in other words. Just because I like a video that happens to be conservative, for example, does not mean that I like conservative content. Shouldn’t the algorithm be based on providing us with new, fresh, funny, and original content instead of categorizing us? As a result of collaborative filtering, TikTok creates “filter bubbles,” specialized niches that, in accordance with gatekeeping, block us from certain things, exposing us only to those which have been selected. We can become easily trapped in these bubbles, unable to escape. The app becomes a positive feedback loop where, upon liking one thing, it brings me to a similar one, the liking of which will bring me more and more similar ones, etc. It is easy to see how views can become polarized on TikTok.
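
To make the mechanism concrete, here is a minimal, hypothetical sketch of user-based collaborative filtering in Python. TikTok’s actual recommender is proprietary and certainly far more sophisticated; the users, videos, and scoring below are invented purely to illustrate how “people like you liked this” can keep narrowing what a feed shows.

```python
# A toy sketch of user-based collaborative filtering (not TikTok's real,
# unpublished algorithm). Users, videos, and "likes" are all made up.

ratings = {
    "me":     {"cat_video": 1.0, "dance_trend": 1.0},
    "user_a": {"cat_video": 1.0, "dance_trend": 1.0, "cooking_clip": 1.0},
    "user_b": {"dance_trend": 1.0, "political_rant": 1.0},
    "user_c": {"political_rant": 1.0, "news_clip": 1.0},
}

def similarity(u, v):
    """Crude similarity: overlap of liked videos (Jaccard index)."""
    liked_u = set(ratings[u])
    liked_v = set(ratings[v])
    if not liked_u or not liked_v:
        return 0.0
    return len(liked_u & liked_v) / len(liked_u | liked_v)

def recommend(user, top_n=2):
    """Score unseen videos by how strongly similar users liked them."""
    seen = set(ratings[user])
    scores = {}
    for other in ratings:
        if other == user:
            continue
        weight = similarity(user, other)
        for video, liked in ratings[other].items():
            if video not in seen:
                scores[video] = scores.get(video, 0.0) + weight * liked
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

# Because "me" overlaps most with user_a, the cooking clip outranks everything
# else; each new like tightens that overlap, and the bubble reinforces itself.
print(recommend("me"))  # ['cooking_clip', 'political_rant']
```

The point of the sketch is the feedback loop: every video I like increases my measured similarity to users who already like the same niche, which in turn pushes more of that niche onto my feed.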


Polarization is something that ought to be taken seriously. Many of us, when we hear the word “polarize,” think it means “to break apart,” which is indeed one of its meanings—”to fragment.” However, psychologically, its meaning is much more important. Already, within the word, we see “polar”—think North and South Pole. The phenomenon of group polarization, therefore, refers to radicalization; it is when groups become extreme in their original views. Just as the North and South Poles are opposite each other, never able to meet, so groups that are polarized are at opposite extremes and refuse conciliation. Polarization is a matter of pulling-apart, dividing. Consequently, we form in-groups with which we identify and out-groups which we designate as the enemy. Collaborative filtering creates filter bubbles, which polarize individuals into groups, creating an “us vs. them” mentality. Thus, we see the inevitable introduction of identity politics into TikTok.

Case study: sexism


Now, I wish to demonstrate what has just been said through an illustrative case study based on some of my own observations, in the hopes of providing insight into the collective behavior that takes place over TikTok. Specifically, I want to look at the case of sexism on TikTok. To begin, what is sexism? I shall define it as prejudice, or negative appraisal, toward members of another sex. Sexism is believing one sex is superior to another. (It should be noted that while it can affect men, sexism is primarily directed toward women.) Furthermore, one of the things which distinguishes sexism from other -isms and -phobias is its ambivalent character. Researchers contend that there are two types:

  1. benevolent sexism, based on patronizing and diminution
  2. hostile sexism, based on explicit hatred and discrimination.

Feminist theory seeks to critique society from the viewpoint of women. While it identifies many problems, here are three examples:

  1. women are underrepresented in education, politics, and more
  2. the media propagate stereotypes about women
  3. society enforces roles and male-dominated discourse, or the patriarchy.

My intention in bringing these up is not to evaluate these claims, to say whether they are right or wrong, to challenge their fundamental beliefs as many are wont to do; instead, I present them to be considered further, on the assumption that they say something important about our society. Undeniably, our views of the sexes are shaped by gender roles, the existence of which is incontestable. Here in the U.S., for example, we have notoriously taken varying attitudes toward women since the ‘50s. Gender roles are expectations for how men and women are supposed to behave, and they are kept alive by normative rewards and punishments, usually in social, political, or emotional forms.


What does this have to do with TikTok? Frankly, it is uncontroversial to state that TikTok is a place of tremendous strife with regard to sexism and prejudice in general. In reaction to the ‘50s, and reaching its heights in the ‘70s, the Women’s Liberation Movement made great strides forward in advancing women’s standing in America. When I was younger, having been raised in a small, friendly, and liberal city, I took it for granted that men and women were equal; I did not understand why people claimed women were lesser in any way. This might just be a purely subjective judgment, though maybe some will feel the same, but I feel that, in moving toward liberal progress, the U.S. has become complacent, leading many, including myself, to falsely believe that we live in a post-sexist society—that is to say, as we have become more progressive, we believe we have “moved past/beyond” sexism. What this does is silence the matter, and de-problematize it.


What leads me to say this? Well, one might argue that TikTok, for example, is perfectly democratic because, like the American Dream’s promise of making anyone rich, the TikTok Dream’s promise of making anyone famous (if only for a while) is open to everyone. Creators can be male, female, non-binary, young, old, white, black—it matters not… or does it? Despite our apparent liberalism, sexism is far from gone. Take, for instance, the following remarks that can be found in pretty much any comment section: “If a male made this, it would be funny,” “Waiting for a guy to remake this…,” “The ‘f’ in woman stands for funny,” “You’re actually funny for a female,” “What? A woman who’s funny?,” “We did it boys, we found one that’s actually funny!,” etc. If anything, these myriad comments indicate that sexism—the belief in the superiority of one sex over another—is as strong as it has ever been.


Is it the expression of “the people”? Is it representative of our times? I shall address this later. The fact is, each of the above-cited quotations is evidence of a lingering patriarchy or—if you prefer to deny the existence thereof—male dominance. Is it really indicative of sexism, though? Isn’t it just an observation that, perhaps, this guy happened to be funnier than the average girl? That is to say, couldn’t they just be preferences for humor, not motivated by negative attitudes toward women? No, it is most definitely motivated by sexism: “Men are more likely… to minimize the contributions and ideas of members of the opposite sex,” reports one author (Schaefer, p. 288). The matter at hand is competence, and men are denying it to women. To be sure, if someone were to comment, “Men are stronger than women,” then I would agree insofar as that is a biological, objective truth; however, to extend this claim of superiority to the comedic realm, which, mind you, is subjective, and to declare that women are not as funny as men, is not a matter of fact but a matter of personal beliefs—though not good ones. We men are taught at a young age that we are the more “successful” sex, success being measured by our wealth, our social status, our political standing, etc. It would seem logical that humor would be yet another category that we claim for ourselves; we assume that we are better than women, so we must be funnier, too, a fortiori. To deny a sex’s humor is blatantly sexist; it is a denial of opportunity and an act of degradation.


One of the more interesting, and perhaps nuanced, aspects of this sexism on TikTok is the word “female.” But what’s the issue with “female,” you ask? I, too, was not entirely sure until one night when I was watching a live stream, and the host was expressing her views on it. She said the word, for her, was immature and degrading. Admittedly, I was confused because, after all, the word “female” is a common one, one used in everyday language, so what could be so controversial about it? As she explained, though, how it was “unnatural”—forced—and thus overly formal—a cop might say, for instance, “The suspect is a female”—it made sense to me. It is tempting to play this off as just being “oversensitive” or a “snowflake”—I thought so myself when she first began—but when I really thought about it, I realized what it really meant. To me, the word “female” has an objectifying character. By objectifying, I do not mean sexualizing, however; instead, what I mean is that “female,” drawing on its formality, its unnaturalness, turns women into an object of study, that is, a specimen. One thinks of the phrases “Look at that group of females” or “The females are approaching”—in either case, the utterer treats the women in question as they would an animal in the wild, a variant of Homo sapiens that is either mysterious, dangerous, or even both. There is an air of caution, of wariness, that hangs about the word. The “scientist” finds himself (intentionally not neutral) in the midst of some-thing exotic. Other scholars point out that its provocative nature stems from the distinction between sex (female) and gender (woman).


In short, “female” becomes a formal, scientific, and classificatory term. As “female,” the woman is reduced to a species, an object of study, a foreign or alien specimen, or—to put it in the terms of the existentialist-feminist philosopher Simone de Beauvoir—the woman becomes “other,” in fact, The Other, completely different from man. Essentially, as I interpret it, the use of “female” amounts to an over-rationalization of women in response to their perceived irrationality. What I mean is, a common stereotype of women is that they are overly emotional, and they never say what they mean, making it hopeless for us men to understand them and what they want from us; and in response to this incomprehension on our parts, we decide to impose our “superior rationality” upon them, like the scientist upon an insect, in hopes of figuring them out and discovering what makes them tick. Members of the incel community have also contributed a word of their own: femoid, short for “female humanoid.” Clearly, this is even more dehumanizing and repugnant than the use of “female.”


So, having taken all of this into consideration, the “f” in “female” can stand for funny, if we so wish and open up our minds a little bit. If I had more followers, or if my blog were seen by more people, then I would probably be more hesitant to publish this for fear of being called a “simp,” but fortunately for me, that is not the case.

The question of irony


From what has just been said, it would appear sexism is a big problem on TikTok. But earlier I raised a question that has hitherto gone unanswered. In my opinion, we are faced with an even bigger, more serious problem. We must ask the question earnestly: Do people mean what they say? Are problems like sexism, racism, homophobia, and more caused by people with misguided beliefs—or, at the end of the day, is it all some big joke? Are the trends which indict both women and men* (I know I haven’t addressed sexism toward men, but it is a big problem in itself, perhaps worthy of a separate blog) motivated by actual internal values, or are they just playful contributions? The problem, as I see it, is one of irony, for today is the Age of Irony, as I like to say.


In the 21st century, irony has become incredibly complex, so much so that we can speak of things ironically, by which we do not mean what we say; “unironically,” by which one comes to like a thing after merely pretending to; “post-ironically,” by which one pretends not to mean what one says; and “meta-ironically,” by which what one says is meaningless and fluid. Accordingly, in this yawning abyss that opens before us in the absence of truth, we ask, Why do we say what we say on TikTok? At this point, we must dive into the deeper psychological and philosophical underpinnings of trends and how we participate in them. Psychologists distinguish between three main forms of social influence and their motivators:

  1. Compliance: Do we say what we say in order to gain rewards and avoid punishments? For instance, do we post a video of ourselves making a racist joke because we know that such humor is liked by many, and we expect to get a lot of likes and followers from it? One of the sad things I have observed on TikTok is the self-degradation in which some girls engage, seemingly for this reason; they “go along” with gender roles to elicit the approval of male followers. Of course, I do not want to generalize: some do genuinely believe in such roles, but what I am concerned about is when girls do it solely for attention, even when they know it is false. One is (seemingly) forced to put aside one’s internal convictions in favor of public approval. Another problem associated with this comes up when we consider the number of young TikTok users: What happens to young, impressionable kids who see the divisive comments and offensive videos, and are thus socialized to find it acceptable?

  2. Identification: Do we say what we say in order to belong to a group? As a male, I feel a kinship with my fellow guys; a girl, similarly, will feel a kinship with her fellow “gals.” This is a natural thing for us to do. But another thing which is natural yet should be avoided because it is harmful is what I discussed earlier: group polarization and in-group favoritism. When we are in any conflict, we will usually side with our tribe. At that point, communication between the two camps is fruitless. This, in turn, leads to the in- and out-group homogeneity effects. When this happens, we see the group of which we are a part as being alike in its virtues, and the group to which we are opposed as being alike in its vices. Seen from the perspective of sex, boys might say, “All girls are the same: they’re promiscuous and stupid, whereas we boys have each other’s backs and are funny,” and girls might say, “All boys are the same: they’re cheaters and objectifying, whereas we girls are compassionate and loyal.” Do some say what they say because they have been indoctrinated by the all-encompassing monolith known as “The Boys”?

  3. Internalization: Do we say what we say in order to get our views across? Put in terms of sex: Are people sexist? This explanation does not have to do with influence; internalization means that we have encountered a belief and adopted it for ourselves. One who has internalized the belief that men are funnier than women is not saying sexist remarks because one has social needs, but because one is sexist, plain and simple. This also makes it the hardest explanation to tackle because it is hard to change someone’s mind when it is already made up. In TikTok comments, one can read such things as “The girls are pressed now” or “All the females are silent because they can’t respond.” While these may have merit to them, they are also inherently provocative; the very post on which they appear was created, out of the poster’s own beliefs, to elicit precisely these responses. Both men and women are guilty here, as they often post videos which are controversial because they want to blame the other sex, resulting in conflicts which divide even more.

  4. Probably a little bit of all of them. 


From what I have just sketched, it is apparent that ideas and values are difficult to communicate seriously these days. In our liberal era, it is difficult for many to express themselves if they feel their opinions are not mainstream; conservatives and right-leaning people, finding themselves cornered, unable to openly say what they feel, may fall back upon irony as a defense and shield to deflect criticism, or they will appeal to some conspiracy like that of “postmodern cultural Marxism’s attempt to destroy Western Civilization by means of identity politics.” Thus, when faced with backlash, one can easily say, “I didn’t mean it, it was just a joke”—but was it? That is the difficulty. It is hard to tell what one truly believes in these days.


The German philosopher Martin Heidegger illuminated how this ambiguity results from trend-following in his famous 1927 book Being and Time. His term for this phenomenon was Gerede, which translates from German into “idle talk.” According to Heidegger, idle talk is intrinsically inauthentic because it is mediatory. What did he mean by “inauthentic”? The German word for authentic, eigentlich, derives from the word for “own,” eigen. Therefore, something inauthentic is something that is “not one’s own”; it is insincere, disingenuous, false. When he said that this kind of talk is mediatory, he meant that information gained through idle talk is never gained through oneself, but always through others. As such, I cannot claim it as “my own” knowledge.


To use an example, just a couple of weeks ago, there was a trend—now dead—on TikTok in which people found it funny to post their reactions to a video of a baby with stuff on its mouth, saying things like, “Why does he look middle-aged?”, “I really wanna hit that baby so hard,” “I can tell he smells like ketchup,” and other stupid things. You have to ask: would they really do the things they said if they found themselves face-to-face with that child? Of course not. They said it to be funny, because it was “the trend.” But this is not what is most interesting about the trend, no; what is most interesting about this particular trend is that one did not have to see the original in order to know and follow it. The TikTokker whose live stream I was watching herself said, “I didn’t see it [the video of the baby] before it got popular,” and yet she knew what it was. One hears of it from others.


As Heidegger put it, idle talk is hearsay. The word hearsay is interesting in that it is self-evident: it literally means “hear, then say.” One hears about a trend and, without giving it any thought, without being critical, passes it on. Communication, which for Heidegger functions as uncovering—language reveals a situation—becomes covering-up instead. When we should be looking for the original, we cover up the very origin, thus obscuring its meaning. We misunderstand a trend yet pass it off as understood. “Oh, well everyone knows about the baby video, though,” one says, concealing one’s misunderstanding. Through idle talk, beliefs and opinions and values are picked up and “passed along,” as through a game of telephone. On TikTok, “everything (and at bottom nothing) is happening” (Heidegger, Being and Time, p. 219). Should you take just a moment to break this idle talk down, however, you will discover its baselessness easily. If you were to ask someone why they hated the baby, for example, they would not be able to give you an authentic reason, that is, a reason of their own. There is nothing backing their belief; it is unsubstantiated and unreflective.


Moreover, ambiguity results from a lack of intention, Heidegger said. It is not as if, in spreading the baby video, the person genuinely hated the baby and wanted to deceive others with their opinions; rather, idle talk is pervasive precisely because it is intentionless: it is mindlessly, unthinkingly, and uncritically absorbed information that has not been digested. Idle talk quickly becomes normative and prescriptive when it is mixed with lots of free time needing to be filled with entertainment, for the TikTok filter bubbles created by collaborative filtering, which I discussed earlier, condition “what to watch,” i.e., whatever appears on the “For You” page.


Psychologically, this resembles something known as “pluralistic ignorance.” A social psychologist writes, “[W]e often misperceive what is normative, particularly when others are too afraid or embarrassed to publicly present their true thoughts, feelings, and behaviors” (Kassin, Social Psychology, 8th ed., p. 261). Pluralistic ignorance is when we disagree with something but support it openly because we assume everyone else supports it. If there is some prevailing view, like that of sexism, to which I am opposed, yet I see video after video voicing it, then I might think to myself, “Oh, everyone else supports it, and I can’t be the only one left out, so I guess I’ll hop on the trend”—even when everyone else, deep down, feels the same way I do. Thus, some end up participating unwillingly. It reminds one of dramatic irony; it is as if we are actors in a tragic drama, the way we succumb to a non-existent threat.


However, we must not forget that there are people out there who, through their courage, and despite their minority status, do speak up. As we know too well, though, whoever opposes the dominant ideology or disagrees with the majority is met with ostracism and derision. Heidegger stated, “[I]dle talk discourages any new inquiry and any disputation, and in a particular way suppresses them and holds them back” (Being and Time, p. 213). If a guy speaks up for a girl, he is automatically a “simp” or a “cuck”—notwithstanding the misapplication of these terms: those who throw them about do not even bother to look up what the words mean, merely taking their meaning for granted. Through idle talk, as Heidegger would explain it, opinions become fixed, accepted, and established via repetition, regardless of their original meanings or histories.


Evidently, words like “simp,” derived from “simpleton,” and “incel,” derived from “involuntary celibate,” are overused and, as such, have lost their true meanings. Just as Le Bon explained, meaning and truth do not matter to crowds; as long as a word acquires some kind of normative significance, it can hold influence over people and their actions. As soon as a guy speaks out against mistreatment by girls, say, because he was cheated on by an ex-girlfriend, he is labeled an “incel,” because that word, through careless use, somehow acquired the wide-ranging meaning of “anything remotely anti-feminist.” Yet the word should be reserved for those men who, through inadequacies of their own, among which are their extremely prejudiced views of women, expect privileges and special treatment from women. Now, if a guy merely says, “This one time, a girl ghosted me,” he is instantly branded an incel.


What all this inquiry has shown us, at bottom, is that originality, closely linked to authenticity, to ownness, is an endangered concept. To create things that are uniquely one’s own—this practice is becoming increasingly difficult. “[W]hat is genuinely and newly created,” Heidegger said, “is out of date as soon as it emerges before the public” (Being and Time, p. 218). At the beginning, I said that one of the defining characteristics of a trend is its ephemerality, its temporariness. To be ahead, Heidegger reflected, is to be on time; reflection is already behind, too late to the scene. When one chooses to be authentic, one is left behind. I have neither the space nor the knowledge to engage in the philosophy of humor here, but suffice it to say, the question of what constitutes humor, as well as its fate in this century, becomes important, especially due to the presence of apps like TikTok. Some take the view that whatever is mainstream is unfunny; a good joke is one that belongs to the few and which, for that reason, is appreciated for its comedic value. But once a joke becomes a trend, enters the mainstream, it erodes like a cliff exposed to water, becoming overused, annoying, and predictable—predictability, the death knell of humor. As I like to say, all that is comic is novel.

Conclusion


In conclusion, we have explored what exactly a trend is and how it functions; what TikTok is and does; how trends express themselves through TikTok; and finally, what some of the ramifications of trends are for the collective conscience. As humans, we do crazy things together, and it is in our nature to then stop and ask, Why? This is why psychology and sociology, for example, are so fascinating; they help us to look at how and why we do the things we do. We learn, for example, about what enables a crowd to prosper, as well as the complex, nuanced reasons behind why we side with groups. In turn, this raises ethical questions. Is this a problem? Should we say the things we do? How do we fix it? and so on. I guess I should offer a disclaimer (more of a debrief, seeing as it is coming at the end) by saying that, for the most part, I enjoy TikTok and get a lot out of it. Some of the trends I criticized in this post, for example, are actually among my favorites. It is good to be able to compartmentalize, to enjoy something on the one hand and to be able to step back and criticize it on the other. Life is a pendulum swinging between humor and seriousness (where does irony lie?). It is important that we stop and think before we post or comment, but equally important that we not take jokes too seriously (what’s the line, though?). In the end, it is most important that we make the best of our time in quarantine, whether that means getting a laugh out of a TikTok, spending time with family, going outside, etc. Always remain thoughtful: the unexamined life is not worth living.


* I’m thinking here of the now-dying trend that goes “If girls do ‘x,’ then why is it bad when guys do ‘x’?”, featuring a guy standing, back turned to the camera, looking up dramatically, as if pondering this cosmic question, or a sad girl on her bed wondering, “Why is it okay when guys do ‘y,’ but when girls…,” etc.

Several images taken from Pexels.com.

Print sources:
Sociology, 9th ed., by Richard T. Schaefer (2004)
Social Psychology, 8th ed., by Saul Kassin (2010)
Being and Time by Martin Heidegger (2019)
The Crowd by Gustave Le Bon (1897)
Online sources:

Glick, Peter, and Susan T. Fiske. “Ambivalent Sexism Revisited.” Psychology of Women Quarterly, vol. 35, no. 3, 2011, pp. 530–535., doi:10.1177/0361684311414832.

Haskins, Caroline. “TikTok Can’t Save Us from Algorithmic Content Hell.” Vice, 31 Jan. 2019, www.vice.com/en_us/article/kzdwn9/tiktok-cant-save-us-from-algorithmic-content-hell.

Heilweil, Rebecca. “There’s Something Strange about TikTok Recommendations.” Vox, Vox, 25 Feb. 2020, www.vox.com/recode/2020/2/25/21152585/tiktok-recommendations-profile-look-alike.

Mellor, Maria. “Why Is TikTok Creating Filter Bubbles Based on Your Race?” WIRED, WIRED UK, 2 Mar. 2020, www.wired.co.uk/article/tiktok-filter-bubbles.

Wu-Sharona, Qian. “Is Algorithm Really Isolating People in ‘Filter Bubble’?” meco6936, 9 Apr. 2020, meco6936.wordpress.com/2020/04/09/is-algorithm-really-isolating-people-in-filter-bubble/.

On Bonding: A Polemic

Every child believes itself to be the center of the world. We imagine the world to be a sort of shadow play: Others pass us by, like shadows projected onto the walls by our hands, talking and playing with each other, before disappearing into the nothingness, consumed by the dark, and we fall asleep. It is this magical egocentrism of ours which we never outgrow, a conceit that installs itself at the fore of our understanding of ourselves and others. We, the puppet masters, encounter shadow puppets every day, yet unbeknownst to us, they are not actually our projections, but people in- and of-themselves, at the center of their own worlds. Essentially, life amounts to a collective sleepwalking from which we rarely wake.


Over the past few months, from both personal experience and ideas I’ve been reading in and out of my psychology class, I have been thinking about bonding. When we hear that word, specific memories come to mind, instances in which we find ourselves with friends doing something fun, like going out for food or riding in the car, chatting and learning about one another, and walking away feeling like our loneliness has been shattered, feeling like we have been discovered by someone, finally freed from the captivity of our subjectivity. At the same time, the word seems corny: Being put into a group of classmates or strangers and forced to do activities with them in an attempt to become closer comes to mind. “Bonding,” therefore, is in my mind expressed by a smile in these two senses: It is a happy, pleasant affair, but it can also be disingenuous, plastic, clichéd. And it is precisely this ambiguity, I feel, which has not been taken seriously or treated as a problem by many, especially my peers, to whom this is dedicated, and on which I therefore think it necessary to reflect.


Several months ago, I had over some people from another school, whom I would ultimately befriend. Later that night, I stayed up talking with one of them on the phone, and we evaluated how the night had gone, agreeing that, despite everyone’s having fun, no one left knowing much about one another, at which point my friend said to me, “You can’t get to know people by making them share about personal stuff.” The remark unsettled me because it contradicted one of my core pillars: Creating authentic relationships based on depth. It shook me up so much that I thought about it for the next couple of weeks as I considered how best to get to know someone, and then whether or not we could even know someone in the first place. Eventually, after consulting some friends, I was reassured: What my friend said was, in fact, nonsensical—and I should have known that when I first heard it—considering that the very phrasing, the use of the word “personal,” refutes it. Obviously, the only way to learn about a person is through person-al matters, right? This much should be self-evident.


Later, in my psych class, we were studying motivation, and my teacher had us read an article about “Project Aristotle,” a study on what makes the best team. The researchers reached the following conclusion: A good team fosters psychological safety, meaning the quality of openness and vulnerability between people, the ability to be and express oneself without judgment. As an example, the article cites a moment when, in one of the groups studied, the leader brought his group together, sat them down, and told them all he had cancer, a revelation that established a sense of safety and belonging among them, causing several others to open up in turn, after which the group became one of the most successful. Reading this eased my mind. I was correct after all—bonding is impossible without vulnerability, without deep diving. Even before reading the article, I had, without knowing it, been doing the same thing: trying to attain psychological safety, creating an environment of openness and acceptance, in an attempt to get closer to others.


One of the arguments against bonding is that it should not be forced. According to Harbinger in “Stop Trying to Be ‘Vulnerable.’ Do This Instead,” forcing bonding is a bad idea for several reasons. First, because it makes others feel uncomfortable. When people are forced to open up, there is bound to be resistance; after all, the element of coercion, of making someone do something against their own interest, is extremely repellent. “If she shared, then I must,” we think. Thus, it becomes an expectation, a requirement, instead of a meaningful opportunity. In other words, it creates a distressing obligation where there need not be one. My friend told me a friend of his had such an experience: At their church, they congregated in small groups and shared personal experiences, one of them discussing how, as a high schooler, he had to skip school for a while to go to rehabilitation, having become addicted to drugs. Understandably, my friend’s friend felt overwhelmed, as if he were in the wrong place. One can imagine oneself in his place, thinking frantically, “Oh god, what am I going to say when it’s my turn? What can compare to that?” A second reason forcing bonding is a bad idea is that it may involve ulterior motives. As Harbinger puts it, “The right motivations for opening up are about being: being ourselves, being connected, being authentic. The wrong motivations for opening up are about getting: getting sympathy, getting friendship, getting approval.” This is self-explanatory.


However, while I agree with some of Harbinger’s assessments, I disagree with his conclusion that bonding should not be forced; I aver that it ought to be cultivated. To begin with, self-disclosure is not, by definition, depressing, sad, or negative. For some reason, we have this notion that private matters are necessarily dark. At a team dinner, I suggested we talk about ourselves (not in a narcissistic way), to which one of my teammates replied jokingly, “What, you want us to share our deepest, darkest secrets?” This kind of response, which is common, gets at this misconception. While our “secrets,” as it is put, are often dark, they need not be. Indeed, when we speak of personal matters, matters that express our person, our individuality, are we not speaking of who we are, fundamentally, as people? And who we are, far from being just our negative experiences, is also our aspirations, dreams, joys, and innumerable other things. It is silly to single out negative experiences. Going back to my friend’s friend in church, we can see why he felt so uncomfortable. The real problem is not that there is an expectation, but what that expectation is. Because everyone has a story, we assume there has to be some dark conflict within it, a defining struggle, like that of a tragedy. But life is not just a tragedy, it is also a comedy—not just in the narrow sense of being funny, but of being celebratory, happy, jubilant. Personal in-sight is literally seeing-into the other person. Sharing about what wakes one up in the morning, what drives one every day, or what passions one has, is just as telling as a fear or a crisis. Our response to “Who are you?” is not and should not just be, “This is what has happened to me”; it consists also of things like, “Here’s what I have done,” “Here’s what I wish to do,” “These are things for which I’m grateful,” etc. It’s about time we cleared up this confusion. Then we must consider that, while the feeling of being obligated exists, it is by no means exclusionary; that is, it is not as though, if one does not share, one will be ostracized. Harbinger himself writes about how, when it was his turn to share, he said he was uncomfortable with having to share, which was met with respect from several listeners. So this idea of an implicit social contract is overblown. The pressure exists, undoubtedly, but it is not tyrannical.


Next, Harbinger talks about motives, and here I entirely agree with him. As a principle, I have made sure never to aim at getting, only at being. I have no desire to make people talk in order that I may use this privileged information to blackmail them at a later point in time—not only is this a cynical way of thinking (not that I deny this happens), but it is intrinsically wrong and unconscionable, as I’m sure everyone would agree. For this reason, I see bonding as a means to knowledge, not manipulation; it is used to learn about others. Some may use this for their own gain, but these scoundrels are thankfully few. The key quality of vulnerability is, for Harbinger, authenticity. By this, he means that opening up must “come naturally,” which is notoriously vague. He subscribes to a cinematic view of reality in which we, at the height of our ecstasy, spontaneously spill forth our souls, letting the truth roll off our tongues effortlessly, to the enlightenment of everyone else present. Yet this idea of “coming naturally,” in my opinion, is no less artificial than the inducement thereof. This is not to say that spontaneous self-disclosure does not occur; it certainly does, just not as often as either Harbinger or I would like. Rarely does life work out this way. It is a Romantic, idealistic way of seeing the world; in fact, it places its own expectations on the situation, requiring that it unfold in the “right” way, lest it disrupt the natural order. Another aspect of authenticity is that it must “feel right”: “Because we know that what we’re feeling in those moments [of forced bonding] isn’t the real thing. We know that we’re being forced to open up, despite the fact that opening up is only meaningful when we choose to do it.”


But why can it not be meaningful when forced? Why can it not be chosen? Some hold this view that if it is forced, it is intrinsically bad. True, the situation is an imposition; the occasion has indeed been set upon us, but who says we cannot act freely therein? Every single encounter is such an imposition. The assumption is a deterministic one. It is like saying that, because my friend has approached me with the topic of homework, I must conform to that and not stray from it, precisely because it has been brought up, when I know for a fact that I can change the topic to whatsoever I desire. Harbinger seems to think that any time I am cornered, I immediately become inauthentic. And why is it that “what we’re feeling in those moments isn’t the real thing”? Is he proposing a radical presentism, according to which something is not real if it is not existent at that very moment? Therefore, in response to “What are you feeling right now?” I might as well say, “I feel the chair I’m sitting on.” Technically, everything is in the past, though, because our brains have to process every sensation. My point is, Harbinger’s contention that we cannot be authentic in the moment is far too constricting and nonsensical. Does he know better than I do about myself? If I give an answer, can he rightfully accuse me of having lied? I think not. Authenticity need not be spontaneous in the sense he prescribes.


The next argument against bonding is not only that it should not be forced, but that, fundamentally, it cannot be forced, similar to the myth of “coming naturally” or “happening of its own accord” in laissez-faire fashion. This argument derives from experience. Our deepest relationships, like those we have with our best friends, siblings, or spouses, also happen to be our longest relationships. The reason they are so deep, we gather, is because we have known them so long. And this is true. Experience accrues with time. Accordingly, relationships should be allowed to develop naturally, over time, without forced interference. As Lao-Tzu put it, “Nature does not hurry, yet everything is accomplished.” Once more, this anti-interference attitude comes into play. If it is forced, if it goes against nature, then it is automatically inferior—the naturalistic fallacy. Furthermore, proponents of this argument will say that people are like puzzles: We do not see the picture all at once, but put it together piece by piece, making more progress on some days than others, sticking to it, patient, persistent, stopping some days to rest, then picking up where we left off. Unlike ordinary puzzles, though, humans cannot be “completed”; we will never have the full image filled in; there will always be gaps—pieces—missing. “Very well,” they say, “but you can’t build a puzzle in a day.” Agreed. But—as you said, some days we make more progress than others—so why not put in a lot of work, make it an all-nighter? Then there is that great quote attributed to Plato, which he never actually said, but which Alan Loy McGinnis did: “You learn more about a person in an hour of play than in a year of conversation.” The first half contradicts what has just been said; more important is that the second half devalues the role of conversation. Really? A whole year—365 days—of conversation is not that telling? Whether or not it is hyperbole, I refuse to believe it.


In reply to this second objection, that bonding cannot be forced, I offer the argument from exigency: Simply put, we do not have the time to wait, to “let nature take her course.” There are over 7,000,000,000 fellow humans within this world of ours, of which an iota of a fraction graces our existence. And of this tiny proportion that even grazes us, passes us by, there is an even smaller number that stops in the race of life to engage us and stick by us. In our early years, we are lucky to know someone outside of our families for a decade. My best friends, the people I have known the longest since pre-school and with whom I am still in contact, have been my friends for about 13 years now—it will be 14 when we graduate, it will be 14 when, tragically, we part ways for college and enter the real world, likely to never meet again. During all these years, we have steadily grown apart, and I lack the real-time knowledge that I used to have; for, no longer being around one another 24/7, we do not have constant coverage of what is going on. And these are my best friends—what of peripheral friends, friends we talk to and have hung out with, but whom we do not know that well, who are destined to be forgotten except when we jaunt through our yearbooks a decade hence? Should we just “leave it to nature”? Are we to wait for spontaneity to unravel itself? The reality is that, often, these occasions never arise by themselves. Sometimes, bonding does not come at all and, rather than force it because we are afraid of forcing things, we decide to remain inactive, losing out on priceless memories and experiences. There are people whom we will encounter for one day in our lives, say, in a coffee shop or at a museum—why not engage them? Again, the expectation is not that we will interrogate a stranger about their life at home. It may be something as innocuous and banal as a classmate with whom we are paired one day, but to whom we never again utter a word. Life is short. It really is. We simply do not have the time to wait things out. Even supposing this were not true, as Emerson does when he says “We ask for long life, but ‘t is deep life, or grand moments, that signify,” the focus is on “deep life,” profundity, meaning, person-al-ity; our priority should be density, how much we can fit into as little space as possible. We want the densest relationships possible; the more we put into something small, the denser it is. This amounts to a kind of “Carpe diem!”


And pertaining to the McGinnis quote, in addition to its being hyperbolic, I might add that, although actions do speak louder than words, I think this must be qualified: Perhaps I might venture to interpret “conversation” as an intentional choice, indicating small talk, in which case I would agree. An hour of play is certainly more telling than an hour of small, trivial, insignificant chatter. However, to elevate play to the level of profound, rich discussion and self-disclosure—unthinkable, methinks. To determine who one is based on play is to research like a behaviorist, focusing only on observable behavior while neglecting consciousness, the private, subjective world behind such behavior; whereas talking with someone (assuming they are truthful and honest) reveals their intentions, feelings, ambitions, dreams, desires, etc., all of which form who they are, essentially. “Words are just words,” we say, but in saying so we devalue them. Words tell us things. They communicate. They bridge gaps as best they can. Of course, words are not transparent; they do not magically reveal the other as through revelation; they are more translucent, letting in a bit of light while still retaining opacity. But it is the best we can and will get. And perhaps it is best that way, neither transparent nor opaque—it adds mystery and life to the other person. We do not quite know what exactly is going on in their head, but we get clues, from which we can make inferences, and so on. It is fun that way. Stressful, too, but fun. As Aristotle stated, we naturally desire to know. As he also stated, we are political animals, meaning we thrive in communities. We desire belonging, we desire intimacy. We want to get close to others and know them. To me, it does not make sense to put this on hold. If we desire knowledge, intimacy, and belonging, then why wait? Why not seek it out?


The last obstacle to bonding, as I see it, is how we conceive of it. In a scene similar to my earlier anecdote, I was hanging out one night with some of my teammates, and I asked if we could bond. One of them said, “Good idea, let’s go around and say our favorite colors,” at which everyone laughed. Following through with it, we each said a color, before they said, “Now let’s say our favorite genres of music.” This was definitely an improvement on the previous question. A couple of years ago, coincidentally, I had written an essay on friendship in which I discussed the idea of favorite colors, using it as an example of the kind of knowledge which has been devalued over time. In other words, the older we get, the less we concern ourselves with the smaller, more trivial aspects of others, especially those whom we presume to know best. Do you know your best friend’s favorite color? If you do, then you are in good hands. My point is, a seemingly unimportant question such as that, while not telling us anything about the person themselves, indicates the level of commitment one puts into a friendship. This is not to say that, in order to be a good friend, you should be able to list off every single trait or preference of your friend; it is the spirit, not the letter, of the law with which I am concerned.


The reason I bring this up is that, when my teammate had us tell our favorite colors, they did not have this intention in mind; it was more of a joke. However, in spite of this, the innocence of it, the very fact that it seemed childish to ask, redeemed it. There we were, a bunch of high schoolers who, outside of track and field, hardly knew each other, sharing our favorite colors, as if we were strangers—”as if”?—no, we were strangers to each other. A while after the first anecdote I told (sorry to jump around), the one about my teammate at the team dinner who interpreted bonding as “sharing one’s deepest secrets,” we had another team dinner. Now, when I brought the topic up again, they said, “But we already bonded last time, remember?” I did remember. I remembered how we discussed where we were going and what we were doing for the upcoming vacation. While I admit it was interesting to learn about what my teammates were planning on doing and some past experiences they had had, I did not feel as though I knew them any better; it felt like I knew about them, but I did not know them themselves. In philosophical terms, I learned about accidents, things that are not inherent to a thing but happen to be there, i.e., things that are non-essential. Thus, I learned about some different facets and surfaces of my fellow runners, but I got no in-sight into them. Accordingly, my peers and I confuse small talk with bonding, thinking that sharing small pieces of trivial information is equivalent to self-disclosure. I want to emphasize that I am not advocating for the eradication of small talk; I wish simply to separate small talk from bonding, to isolate the latter as something in itself with which we ought to be more concerned.


My next point is as follows: In adolescence, we form friendships based on things we have in common, whether it be sports, videogames, music genres, personality traits, or something else. In consequence, making friends becomes a lot easier. Most of the time, we can identify who we wish to befriend in advance because we will know if they are a good fit for us. If I like a TV show, then I have a point of entry; if I like football, then I have a point of entry. This common trait then binds us together. However, this blessing is also a curse: Forming friendships based on a common trait is just that—a relationship centered around a fixed thing. A good friendship, mind you, will move beyond that initial shared thing, but most friendships do not. Hence, most of these relationships are what Aristotle would call “friendships of pleasure/utility”: They are relationships not immediately between two people, but mediated between people and something else, be it pleasure or some kind of use. As a result, we do not focus on the other as Other, but on the other as they relate to our shared object. The relationship becomes secondary. A narrow group identity, then, endangers authentic friendships. Take my track team, for example. We “know” each other. I know one teammate and label them “the guy who runs the 400 meters,” another as “the girl who does long jump.” Everywhere we go, to make sense of the world, we add epithets. “X, the one who…,” “Y… the intelligent one…,” etc. And yet, the ironic truth is that, the more people know us by these epithets, the less we are known. How it must feel, to come home, reduced to “that girl—she went skiing over the break—she is on the school newspaper,” and to be known for that, to be known as that—how lonely we all must be from time to time, when this all sinks in, for such is the human condition! Is that all we are? Epithets? Like Jean-Paul Sartre’s waiter from Being and Nothingness, none of us on the track team is fundamentally a track athlete; we merely play at being track athletes. It is a role, and we must acknowledge it as that. To do otherwise is to be in bad faith: We reduce our transcendence—our freedom—to our facticity—our past or present condition—thereby depriving ourselves of self-determination. I am nothing, according to Sartre, precisely because I can choose anything. I am not a track athlete. I am not a blogger. These are things I have chosen, and these choices, in turn, reveal something about me. But they are not me myself. Why limit ourselves to anything? For my teammates and me to say of each other, “We are runners/jumpers” is, in the words of Emmanuel Levinas, to totalize one another’s infinity; that is, when we try to apply some kind of image or representation to other people, we limit them in their potentiality-for-Being (had to get a Heidegger reference in somewhere!). We do not allow them to be. When Levinas describes “the Other” (another person) as infinity manifested, he is saying that the other person is always beyond us, incomprehensible, nuanced, interesting, unique—in a word, infinite. Yet we are constantly trying to totalize the Other, throwing a metaphorical net over them, as if to catch their essence. We want to finally “get” each other. We ourselves want to be “got.” A grand paradox in itself. The title of Levinas’ best-known work, Totality and Infinity, might as well be the subtitle to the movie of Life. In short, when we self-identify, we unintentionally mislead and deceive others, not just at their expense, but also at our own.
Misunderstanding is a two-way street. I myself am guilty hereof. Many a time have I been in bad faith, inauthentic, totalized. Rarely have I been swept up in the channels of infinity. 


This polemic is really an open letter-plus-criticism-plus-confession. As I said in the opening, I was motivated to write this mostly by personal experience. Already halfway through my junior year, and with one year left before I graduate, I am constantly reflecting on friendship because it is one of the most important things in life. There are so many people I have not met or talked to that it is dizzying at times. It is genuinely mind-blowing how a single conversation with someone can change things instantly, how the exchanging of mere words can have a tremendous effect on two people’s lives. In the past year, I have met many fellow track athletes—I mean, people who play at being track athletes—from different schools, people who were previously considered “enemies,” but who, as is always the case, ended up being just the opposite. But time is running out, and I know it. So the reason for my critical tone, I hope you can understand, reader, is my awareness of time. The reason I adjure my peers and myself so is not that I am disappointed or annoyed with them; quite the opposite—my urgency is a reflection of my concern for them. I hope whoever reads this may ponder what has been said and take it to heart, too, for thought is desperately needed at all times. Perhaps I am too late in delivering this message, perhaps the hands are nearing midnight, and the Owl of Minerva is taking flight—but then again, better late than never…

Phon(e)y – A Poem


We’re phonies when we’re on the phone,

To hide from the fact that we’re painfully alone,

Feeling as though we’re liked and popular—

But when it comes to making real friends, we demur,

Preferring the cover of a screen,

From which we cannot hope to ourselves wean,

Holding it close, as if our heart,

Foolish to think we could from it ever depart

Like a child dependent on a parent,

Our attachment to our devices is apparent:

As a toddler’s reluctant to let go,

So we constantly bring our phones in tow,

Fearing what others might of us think,

Causing us to incessantly blink


On others, we want to ourselves impress,

More often than not calling for redress,

Hoping that, as a result, they don’t like us less

For we’re concerned, foremost, with appearance,

Which thenceforth our personality tints,

And paranoia about whether it stints

By applying stereotypes,

Beneath which, like tight clothing, it gripes,

A suave saboteur who our chances snipes;

Creating for the other his own narrative,

As if to prevent him from having ever lived,

A historiography that is repressive

To superficial surfaces we’re thus subject,

Wherefore the lot of us are left deject,

And we our prospects betimes reject

 

Meaning, Motivation, and… Minecraft?

Recently, in 2019, Minecraft underwent a revival. It has re-emerged, as it were, from hiding, and is now back in the limelight, having finally beaten out Fortnite, once again establishing itself as the king of online video games. Upon first hearing of this, I ignored it, considering my Minecraft days as “long gone”; but after coercion from my friends, I decided to get back on to see what I was missing, as well as for memory’s sake, to return to one of the games that formed me. When I created my first new world, I was taken aback by the newness of it all, namely the new textures that had just been added, and I was opposed to it, just as many were, for who really likes change at first? But then, gradually, I warmed up to it. Besides some other new features, the game was largely the same, and to most people, that was a good thing—not so much to me, though. See, whereas most Minecraft players came running back to Minecraft with open arms and a warm sentiment of reconciliation, I came back to it hesitantly, reluctantly, as if coming back to someone I had not seen for a while, about whom I was not sure how to feel, having been away so long, like an awkward reunion where neither one of us was certain we wanted to be there. Whereas most Minecraft players never tired of the game and its infinite possibilities, picking themselves up after every setback, I could never seem to fight successfully against the inevitable ennui that would set in during my gameplay sessions, the despair of restarting, the frustration of dying. But the 2019 revival of Minecraft, I believed, would hopefully change that. After my friends convinced me to join their server with them, I logged on with promise, with hope, that maybe, just maybe, I could find happiness in this virtual block world, and never tire of it. I am happy to announce that this dream is not yet dead—that is to say, it is a work-in-progress. This post will be my brief meditations upon how Minecraft, and video games in general, have conditioned an entire generation, reinforced my feelings of absurdity, become a source of meaning for many, and also made me appreciate life.


A year ago I returned to watching videos by the YouTuber Markiplier, beginning with his series on “Getting Over It with Bennett Foddy,” a game in which you play as a man in a cauldron who must make his way up a massive obstacle course only by swinging a sledgehammer, an obstacle course littered with sadistic, infuriating points at which you risk losing most, if not all, of your progress, forcing you to go through it all over again. There were points while watching when I would just lean back and ask myself, “Why’s he getting so mad at the game… why do we get so mad at games?” And in several efforts to rationalize his suffering, Markiplier would tell himself, “It’s just a game, no need to get angry.” Whenever we are caught up in the moment, the best solution is always to take a step back, to “be objective,” that is, to look at the whole picture, from above, so that whatever is bothering us is made small and insignificant. By using our reason, we can minimize the problem, and by rationalizing it, we can come to endure it.


During sports, for example, when someone’s taking it too hard, we say, “It’s just a game.” But “being a game”—what does that signify? What does just being a game mean, further? Is something inherently less important in virtue of its being a game? Like, is there something in the notion of “game” that is self-mitigating? It certainly seems to be that way. Games are designed to be fun and inclusive. Video games, moreover, are not real; they are virtual. That much is understandable, seeing as, if something is not real, it should not be as important. After all, if the character we are playing falls off a cliff, it is not we who feel the impact, but the avatar. For sports, it is not so easy, which is why this blog is not about sports. In video games, we identify with our characters, we form attachments to them, we feel “at home” in the game, as if, in exploring the open world, we ourselves were doing it. Injecting ourselves into the game, we associate with the character, become them in a sense, and so we adopt their goals as our own. The simple reason for why we get frustrated, then, amounts to this: We get mad during video games because we feel as though what is happening to the character is happening to us. And yet, this answer is so… unsatisfactory, or at least, it seems so to me. It still seems to miss the underlying “Why?” we are seeking. What is really going on when we play a game and get frustrated? What does it mean?


Hundreds of swear words, screams, and chair throws later, Markiplier finished “Getting Over It” (but never seemed to “get over it”), then decided to try out the game that served as its inspiration: “Sexy Hiking.” Just like its offshoot game, “Sexy Hiking” enraged Markiplier, and he found himself screaming just as much, whereupon he relates, “If I was just flailing against madness here, I wouldn’t be too upset about it…. If I was just subjecting myself to impossible bullshit, I wouldn’t care. It’s because I care so much that [I try].” Here we get major insight into the gamer mindset. According to Markiplier, our attachment to and identification with video games comes from their possibility, as opposed to their im-possibility. His talk about “flailing against madness” is reminiscent of the French existentialist Albert Camus’ idea of “the Absurd,” the utter meaninglessness of life and the Universe, their indifference toward us, their totally blank expression, their refusal to accommodate us. But the important thing is that Markiplier says “if.” That means the game does have meaning; otherwise, he would not try. If something were impossible, if it could not happen, then you would have to be mad to attempt it. It would be absurd to attempt it. Camus tells us that, in order to cope with the Absurd, we just have to live with and embrace it. We have to celebrate absurdity. To embrace madness requires that one be mad oneself. Yet Markiplier is advancing a very different idea. He is telling us that, where there is madness, there is resignation. Whereas Camus advises against creating meaning, deeming it a futile task, Markiplier thinks to the contrary, declaring that it is in spite of madness that we create meaning. In this sense, Markiplier is more like Sartre, who championed the free creation of meaning, than Camus, who championed Sisyphus, the symbol of the Absurd.


Today, I was watching CaptainSparklez do a parkour course in Minecraft. Over and over, he would jump from bamboo stalk to bamboo stalk, sometimes progressing past an obstacle or two, sometimes doing worse, ending up farther back. One has to ask: In this situation, who is madder—he for doing the same thing over and over, or I for watching him do the same thing over and over? Inside both of our minds, there is that sliver of hope, that little part of us that keeps thinking, “This next one will be it.” Even though he is the one who is immediately doing the course, and I am only watching vicariously, we still have this same thought process. Gnawing at us constantly is the voice of possibility. Possibility literally means “the ability to be able,” like “being-able-ness.” So every time CaptainSparklez falls down and starts again, his actions speak, “I can,” “I am able.” Again, if it were impossible, he would not be trying. A parkour course makes us want to punch our computers because we know it can be done. In the moment we jump and narrowly miss the landing, we are at once frustrated and hopeful. We reflect, “If I had just jumped later…” And here is the next component: Repetition. Not only is parkour possible, but it is repeatable. When we fall down, it is not as though we are whisked away or our chances are forever left behind; we pick ourselves up, go to the start, and begin again. We lose nothing—except maybe some sanity and self-esteem. All that confronts us, really, is sheer possibility. Nothing is at stake but success itself, and this is purely abstract. In other words, navigating a parkour course will not cost us our lives or our livelihoods; instead, it frustrates us, and frustration is different from ordinary anger in that it is anger over impotence, or powerlessness, incapacity, non-possibility. But non-possibility is not the same thing as impossibility. When one jumps and messes up, one says to oneself, “I failed, but I can try again, because this time, I have the power to do it.” One does not say to oneself while playing “Getting Over It” or Minecraft, “I’m a failure”; rather, the frustration is more abstract, dealing not with the player but with their actions and skill. Missing a jump, I do not say, “I’m terrible,” but, “I’m bad at parkour,” or, “I’m bad at Minecraft.” This is because it is not I in concreto who has failed, per se, but my playing that has failed. Therefore, I can correct it, without fear of hurt to my physical, bodily self. In video games, we can do what we cannot do in real life with repetition, and without real consequences. Virtual failures do not carry over into real life. Falling in “Getting Over It” makes me mad because I could have made the jump, and so I am determined to try it again, whereas failing to get a job in real life may instill one with a crippling sense of inadequacy, discouraging one from trying again.


Video games, then, are not about their being meaningless or “just” anything. For this, we can draw our attention to Markiplier’s saying that he is not “just subjecting himself to impossible bullshit,” and again, his hypothetical “just flailing against madness.” The “just,” we have said, connotes inferiority, degradation. As such, Markiplier is not playing “just a game,” something non-existent in reality, only valuable virtually; that would be Absurd; it would be madness. The reason we get mad over video games is precisely that they have meaning. Markiplier’s statement asserts that madness itself warrants nothing. However, because the game does have meaning, because it is more than “just a game,” Markiplier has a reason to be upset. It is interesting to note that video games are their own microcosms; so this means that, in these mini, self-contained universes, there is meaning.


Perhaps, living in a Camusian time of rampant Absurdism, this is why so many flee to video games—because unlike our own ostensibly meaningless Universe, video game universes provide their own meaning? So far, this seems plausible, yet a major problem presents itself: Alright, so video games are appealing because they give us meaning, but that in itself is an unsubstantiated claim; that is, if video games can give us what life cannot, then what is it that video games give us? What meaning is conducted through them, and why do video games hold this privileged spot, as opposed to life itself? Suppose Markiplier found no meaning in life, but he did find it in video games—what might this tell us?


Having touched upon video games in general, I will now turn my focus to Minecraft specifically. Why does everyone love Minecraft? Because it gives them freedom, creativity, happiness. As children, the only thing that got us through the school day was the excitement we got from thinking about what we would do when we got back onto our world in Minecraft. It was a little break from school. In fact, it was a little corner we had carved out for ourselves in our lives. We speak of the public sphere and the private sphere, but sometimes we need something to escape from both of them. And Minecraft was that escape. Whenever we were stressed from school, or whenever it was raining outside, we would play Minecraft. This was turning away from the public sphere. At the same time, whenever we were bored, or whenever we were having problems at home, we would play Minecraft. This was turning away from the private sphere. If ever there was a problem in the outside world, we would retreat to the inside world, the private world. But often, we found that this was not sufficient, either. The beauty of Minecraft is that if someone were depressed, and he came home from a terrible day of school, his being home would not automatically make him happy, nor would being with his family—Minecraft lies outside both the public and private sphere because it is a sphere of its own. People speak warmly of how Minecraft provided an escape for them, and we can see why. However, we are still begging the question here, as we did when we said video games give meaning: By saying that Minecraft is beloved because it was an escape, we are not answering the question of what makes Minecraft this escape. Saying that Minecraft’s likeability derives from its being a safe haven does not tell us what about Minecraft makes it distinct. This is what we are after. Only then can we proceed to make a larger claim. If we can find out what is special about Minecraft, then we can speculate about what is special about video games, and thence how they give us meaning, allowing us to cope.


Minecraft is famous for its simplicity, its unassuming and modest nature, the fact that it promises so much by giving so little. You spawn in a world made up of cubes, encounter animals and evil monsters that need to be defeated, and along the way you meet villagers, all while trying to fend for yourself, surviving through farming, mining, building, and other minor activities. It’s not as though it were some high-quality game with the greatest graphics in the world. People actually make fun of the game for being so simple, questioning how something so childish and cubelike could possibly be so entertaining. And yet it is, somehow. Players talk of the calming, soothing, tranquil nature of the game. When you explore the infinite world, passing through the diverse biomes, seeing massive natural generations like forests and mountains and ravines, watching breathtaking sunsets, you cannot help but feel peaceful. It is just you, alone, traveling, with nowhere to go, filled with an insatiable wanderlust, and the whole world is there for you to see. Both Minecraft and real life are free-roam worlds; the only difference is that the former has no borders. Consequently, in Minecraft, one feels freer than in real life because there is nothing that cannot be overcome (except Bedrock). There are no borders, imagined or geographical, in Minecraft, as there are between separate nations today. The only obstacles to the gamer are mountains and oceans and ravines, all of which can be surpassed, by tunneling or sailing or building over. Although this is a dumb analogy, could we mine our way through city borders in real life? No. As a result, Minecraft is about being free in the most absolute sense. One is not constrained by anything. The whole world is manipulable to our whims. If there is a mountain we do not like, if it blocks our view, or if it is simply inconvenient, then we can get rid of it. The world is customizable, able to be adapted to our liking. And this idea of freedom leads to the next aspect of Minecraft, which is its appeal to the individual’s creativity, expressible through such freedom.


Whether in survival or creative mode, you can build whatever you want. Want a huge mansion? Build it. A spaceship? Build it. A medieval castle? Build it. Although there is a build limit in Minecraft, the sky is figuratively the limit. When we are alone, we can do whatever we want without judgment. We can bring out the inner artist or architect within us by designing our own houses, then decorating them afterward with furniture and other amenities. No one needs a degree in engineering or interior design or architecture—one can do whatever one’s heart desires. Playing with friends, we can cooperatively build things and admire each other’s work. After a stressful day, it reassures one to know one can come home and build, just build. If you get a large project going, and if you throw yourself into it, you will find yourself wholly absorbed in it, and nothing else will receive your attention, because everything in you is committed to finishing this project of yours, this vision, this dream. Once you finish, you stand back and gaze upon your work, satisfied, accomplished, proud—this, this monument, it came from my hands, it is mine, I built it. Finally, we have an outlet for ourselves. All of us have a repressed artist within, one we often cannot express in the real world. Some find their freedom in Minecraft because there they can unload. Minecraft is constructive because it allows for sublimation: If we have any urges, then we can channel them into building something, furthering our artistic inclinations.


Evidently, Minecraft is a calming game. Some cite stories of mining and finding immense relief therein, and I myself have experienced this. Yesterday, even, I was looking for some diamonds, strip mining in my hallways, digging through the stone, digging and digging, not really finding anything, but enjoying it nonetheless, because I got into a rhythm, and the digging, rather than being a chore, became a routine, a comforting practice. Even better, though, is when the music kicks in. As jschlatt mentions in his “Tribute to Minecraft,” the one constant in Minecraft is its soundtrack. Throughout the years, the classic works of C418 have aged well, and both early and new players of the game can appreciate the simplicity and beauty of the songs, which can evoke anything from naïve bliss to melancholic nostalgia, enhancing the gaming experience, really embedding the player in the game, grounding them in it, carrying them away in their absorption. Putting it all together, one can imagine the idyllic pleasure of exploring the vast world with a dog as one’s companion, the beneficiary of a sunset that puts the world to sleep, as the gentle music sets in, and one builds oneself a house that one will come to call home, feeling rooted, secure, strangely comforted in spite of the isolation, because it is as though the game itself were there as a companion—and that is where carefree happiness comes from.


So wherefore my troubled relationship with Minecraft? If it is the panacea I have just described it as, then why should I have been so reluctant to come back to it? Two reasons: Frustration and Absurdity. By far, the most annoying thing for me in Minecraft is dying; more specifically, dying in lava with good items. I cannot count how many times I have been mining, with diamonds in my inventory, when I broke the block on which I was standing or to which I was moving, and fell into lava, helpless to do anything about it, just bobbing up and down, looking around in vain for a way out, for some non-existent guardian angel to come and save me, watching with dejection as my hearts went down, and my happiness along with them. Then I watch all my hard-earned items burst from me and land in the lava, burnt into oblivion, never to be seen again, all my hours of work literally going up in flames in a single instant. That is when I exit out and close the game, never to open it again, at least for a day, until my grudge finally subsides. Really, if anything, this insight says more about me psychologically than it does about the game.


There are two types of people, basically: Those who, after they die, take a deep breath and press restart, undaunted by this setback; and those who, after they have lost everything, decide it is not worth it and give up, the latter being me. Someone who belongs to the first category has resilience. Although it makes them upset, it does not get to them personally. They detach themselves from the moment and say, “It is just a game; I can do all that again no problem, it’ll just take some more time.” I, on the other hand, like a Marxist laborer, see myself in my works; so when I die in the game, it is as though a little part of myself has died, and I feel as though my life in-game is in vain. No longer do I feel like doing all that work all over again. Sure, there are other diamonds out there to be found; and sure, that stack of iron ore is replaceable—but I mined them by my own hand, they fell into my inventory, they have a story in which I played a part. Perhaps I am just a sentimentalist, but I take my deaths very seriously in Minecraft. A single mistake, and everything is lost. It really reminds you of the impermanence of life. Other players, though, are like Buddhists, able to command non-attachment. Is there a right way to play? I would argue “No.” I will say that the first type of player definitely has more fun, because they are able to move past their mistakes, whereas I dwell on my failures, which prevents me from moving forward.


Recently, I have been working on this. In one of my worlds, I died in lava, losing a stack of iron and two of coal. I was very upset, as I usually am, and considered quitting, when, hovering over the “Title Screen” button, I decided I would change my ways, at which point I restrained myself and hit “Esc,” taking me back to the game with a fresh start. Having done this, I felt, to some degree, refreshed. I guess I can see why some people do not mind dying and restarting, or at least why they can tolerate it: Because it removes the burden of the past, as it were, making one lighter, freer. No longer was I weighed down by the items I had (though not literally, seeing as having more items does not actually slow you down), because I was no longer a slave to their histories, and I could forge a new path. In Minecraft, you do not have to follow a pre-set path; the point is that, with each new world, with each new life, you are literally reborn, you are given another chance at life, you can try out something new, in no way predestined to repeat what you had done in the previous life. And it is not as though, when you die, you have to create a new world, unless you are in hardcore; you are reborn in the same world. This is the kind of thing of which Milan Kundera, in The Unbearable Lightness of Being, was writing—the human dream of being able to repeat life, but without the weight of the past, being able to rewrite one’s history under different conditions. When we are light, when the past does not matter, when we have the freedom to be reborn without consequences, we are like a Buddhist caught in samsara with good karma.


So there’s no problem anymore, right? I’ve finally come to peace with myself and Minecraft? Well, not quite. See, even after respawning from my tragedy in the lava, and after reacquiring the iron and coal I had lost, there dawned on me a saddening dissatisfaction, a feeling of—dare I say it—Absurdity, meaninglessness? When I play Minecraft, the same question inevitably haunts me: “What is it all for?” It is actually quite funny, now that I think about it, how the Minecraft narrative fits the American one. Here in the U.S., we are expected to go to school for the first third of our lives, get good grades, go to a good college, get a good job, support ourselves and our families, retire, and watch the rest of life go on. In Minecraft, the expectation is that we find a place to settle, build a home, create a farm of some sort to give us a constant supply of food, get a full set of diamond equipment—tools and armor—fully enchanted, go to the Nether in order to get to the End, and defeat the Ender Dragon, whereupon one “wins.” In America, as in Minecraft, there are “winners,” people who have reached the end, who have lived “the dream” to its fullest. Not everyone makes it. Only a few of the lucky ones. He who goes to a good college and graduates, finding himself a secure job, is entitled to boast of his success, and everyone congratulates him accordingly. Similarly, he who can claim credit for having beaten the Ender Dragon in survival is met with great praise, and he is “one of them,” the elite, and is likewise received. The biggest difference, I suppose, is that in Minecraft, upon defeating the Ender Dragon, the game still goes on, and when you come back to the overworld, you can still build whatever or continue to amass resources for fun. Minecraft is a bigger sandbox than the real world. But just like with the American Dream, I am skeptical of the Minecraft Dream. This cynicism of mine may or may not be due to the fact that I myself have not yet defeated the Ender Dragon in survival… To reiterate, much of my feeling toward Minecraft can be explained psychologically, and this particular instance may be a manifestation of a deep-seated jealousy, i.e., “I haven’t defeated the Ender Dragon, so why should it be so great for everyone else?” Needless to say, it is a childish sentiment—but what about in the U.S., where so many genuinely try to achieve the Dream, but fail, or are by circumstance actually unable to? While this may be a false analogy, my point is that my grudge toward the Minecraft Dream is, perhaps, somewhat substantial, and should not be so readily dismissed as coming from jealousy. Maybe I just happen to be a terrible gamer.


Whatever the case may be, I always end up getting bored with Minecraft. I got a few diamonds—now what? I built a house—now what? I built a fully-automated mob farm—now what? Each time, I feel as though everything has been done, as though I have nothing left. After all, for what purpose is a house other than living in it? But in Minecraft, you cannot live an ordinary life as you do in the real world; you do not wake up and lounge around the house, because in Minecraft, it is impossible to stay still; there is no point in “enjoying one’s house,” for there is always something to do; Minecraft players thirst for adventure, for movement. So how can one derive anything but æsthetic enjoyment from a house or other structure? And the diamond tools for which you toiled, with their awesome enchantments, what is their use when, after all this hard work, they break on stone? Just build another one! But then what? Just keep mining?


I am always envious of YouTubers who, in their series, have full diamond armor and tools completely enchanted, for I have never been able to accomplish this. In response, I always resort to cheating, using commands to give myself these items, thinking it will make me enjoy the game. But it does not. Part of human nature is that we convince ourselves that we need certain things to be happy. The thing is, when I got my diamond tools with insane enchantments, my emptiness was only amplified. Now that the game was essentially completed, and now that I had everything I wanted, there was nothing left for me. All my tasks, all my ambitions, were desiccated, devoid of meaning. Just as people who feel empty take to filling themselves up with material things, so Minecraft players who have beaten the Ender Dragon take on ambitious building projects in an effort to defer their ennui, to “keep the game going,” to make sure the fun does not run out. But God knows if they ever stop, if they take a minute to reflect, then they will realize that it is all in vain, that there is nothing else to it. It is a coping mechanism to build. It is done out of desperation.
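(For anyone unfamiliar with the kind of cheating I mean: in a Java Edition world with cheats enabled, it amounts to typing chat commands roughly like “/give @p minecraft:diamond_pickaxe” and then “/enchant @p minecraft:efficiency 5” while holding the tool. The exact syntax varies by version, so take this only as an illustration, not a guide.)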


I know this is pessimistic thinking, maybe even nihilistic to an extent, but I think this is a good thing. A good thing—how so? Well, hitherto this whole blog has been about comparing me to the average Minecraft player. And so here is the decisive difference: Whereas most Minecraft players use the game as an escape from real life and find immense meaning in it, I find Minecraft to be meaningless, life meaningful. Strange, I know. Perhaps I am an outlier; for some reason, my psychology is inverted, and I am thankful therefor. Honestly, I would much rather live a meaningful life out there than one inside a video game. I would consider myself lucky. We are made to believe that, in Minecraft, you can do whatever you want because you are free, and for the most part, that is true. But can you read literature in Minecraft? Create literature? Fall in love? Watch cinema? Smell the freshly fallen rain? Hear the psithurism of the trees in Autumn?


Minecraft, rather than being more freeing than real life, is more constricting. I feel freer in real life to express myself than in Minecraft. Minecraft is vicarious; it is “just a game” to me. It is fun to return to if I ever want to feel calm and take a brief break; but I do not see it as “a way of life,” as my friend put it. Although it teaches resilience, and although it may teach us to not fear death, faithfully creating the experience of eternal return, we need to treat death as it really is and see the eternal recurrence of the same as Nietzsche saw it—as a thought experiment, a way of determining whether one is capable of affirming life. Minecraft does not teach us to affirm life, because there, it does not matter, given that we are “light,” in the sense of having no burdens upon us, no consequences for our actions; life is the only thing that can teach us to affirm itself, because we only have one shot at it, and ultimately, our experience is grounded herein, and Minecraft would not be possible were it not for the life-world, the Lebenswelt, in contrast to the overworld. Here, death is the “possibility of impossibility,” so we should approach it as though we had to relive it an infinite number of times—not because we actually do. Such is the difference between mortality here and mortality in Minecraft. The latter, understandably, is more comforting, whereas the former is harder to face—and for that reason, more admirable, in my opinion. In contrast to most Minecraft players, I do not find meaning in Minecraft. Quite the opposite. What others find in Minecraft, I find in real life. This is not to say that I am better than they, only that I have managed to mature, to overcome this old coping mechanism of mine. I will admit: I used to find meaning in Minecraft where it was lacking in real life. But there was a point when this all changed, when life became full of meaning, and Minecraft was deprived of it. And frankly, I prefer it this way; I am glad this shift occurred.


In the end, I am thankful for Minecraft, as thankful as any other player, for just like them, I owe who I am to it. Playing Minecraft has been an integral part of my life. I can still remember my first day of playing it on my friend’s account, unaware of how impactful it would be, unaware of the fact that, one day, in spite of how much fun I had playing it, it would no longer be a part of my life. It is bitter-sweet, of course, to come to this realization. Much of the happiness we attribute to our early days of Minecraft, it should be noted, can be explained in part by the nature of memory and nostalgia: We look favorably on the past, usually overlooking the negative aspects of a thing because we remember the positives most. As such, while I am sure I did have a blast playing Minecraft back in the early 2010s with my friends, I am also sure that those days are gone, belonging entirely to the past. Maybe it is not the Eden we remember from before the Fall; maybe we are just victims of memory. I miss Minecraft, to be sure, but letting go of it has been an important part of my maturation. Whether you find meaning in video games or in real life—stick with it as much as you can. Better meaning from video games than no meaning at all. Better to be mad at video games than mad in the way Sisyphus is, embracing the Absurd. One must imagine Steve happy.

Simulations, VR’s, and Nietzsche on the Defamation of the World: A Polemic (2 of 2)

Click here to read part 1 if you haven’t already!


We associate appearance with falsehood, as being something on the outside which it is not on the inside. That is, it tricks us, it lies to us. Set against this is truth, which is the opposite of appearance. A stick bent in water is appearance; the truth is that it is merely a refracted image. Consequently, we say truth has value, whereas appearance is valueless. And since this world contradicts the underlying world of truth according to science, religion, and philosophy, it means this world is valueless; it has no inherent meaning, no true value in itself. “Existence has no goal or end; any comprehensive unity in the plurality of events is lacking… Briefly: the categories ‘aim,’ ‘unity,’ ‘being’ which we used to project some value into the world—we pull out again; so the world looks valueless,” Nietzsche summarizes.[1]


Today, it is accepted that life is meaningless, so much so that this is treated as an objective, scientific fact. Under no micro- or telescope do we find any hint of meaning. The 21st-century is the triumph of absurdism. Even if we try to make sense out of history, if we put a narrative together for life, Nietzsche says, we can find no value to it. Postmodernism killed the meta-narrative: there is no way for us to come up with a story to explain history. Consequently, we are left in a vacuum, a vacuum of meaning, in which nothing we propose can stand on its own. There is no basis for meaning, no absolute foundation therefor, because “God is dead.” This valuelessness comes from a lack of “aim,” “unity,” and “being,” as Nietzsche writes. We know that the Universe has no end goal, no teleology, nothing toward which it moves; it is all random and arbitrary, lacking design and purpose. No matter how much we may try, we can also find no unity whatsoever, because any attempt to do so falls apart. Labels like “modernity” can no longer account for us; there is no way to impose an order on the chaos and irregularity that is life. Lastly, this world is constantly changing before our eyes, never staying still, always being mixed up and rearranged and broken down then built up again. The second law of thermodynamics: the Universe is becoming increasingly disorderly. Nothing remains over time, not even ourselves. All is temporary, all is fleeting; no substantiality supports the world. Each of these three things we “project… into the world,” yet today we have “pull[ed] [them] out… so the world looks valueless.”


Fed up with reality, annoyed that it does not meet our expectations, we abandon it, set it aside, and turn our attention elsewhere, to some other world we can create that corrects all these errors. VR and simulations offer us this escape, by allowing us to experience life anew in a meaningful, end-oriented way that progresses logically and sequentially, never changing radically or unexpectedly, staying faithful to us as the players. Now, VR is more appealing than real life. We would rather live in our computer-generated worlds, we say, unhindered by any stresses, than in the real world outside our computers, forced to carry the weight of responsibilities, stuck interacting with real people, people who are resistant to us, who cannot be controlled by us, who have to be encountered and treated in a certain way, while in our alternative worlds we can change how people respond to us. But we thereby deprive ourselves of real-world skills. “Real-world skills?” they say. “Who needs those when you have a simulation?” Indeed—who needs them when we can interact online without having to see people face-to-face? The future is now! Don’t be old-fashioned and talk to people in person—just add them online and video chat!


Nietzsche decries all of this. Virtual realities, simulations, video calls—all of this technology is but a repetition of religion and philosophy’s statements that reality is not what it seems, that beyond us is a perfect realm, from Plato’s realm of the Forms to Kant’s noumenal realm to science’s subatomic realm, all of which are “more real” than this realm in which we live. What does any of this matter, though, ultimately? Who cares if we leave behind this world? Who cares if we migrate online? Progress is inevitable, so we might as well go along with it and not fight what is natural. I have addressed this problem a little in another one of my posts on technology, and it relates to Robert Nozick’s “Pleasure Machine” thought experiment. Given the choice, we can hook ourselves up to a machine that will simulate a reality in which all our desires come true, although we can never come back to reality.


If we had taken a poll a while back asking who would say “Yes” and who “No,” I think the majority would have said “No,” afraid of the prospect of living in some kind of artificial Matrix. Today, things are different. I think if we made the poll now, the majority would say “Yes,” seeing as technology is becoming more integral to life, more real than life itself, in fact—superior to life, transcending life. “Things change, we change,” some may argue. True, say I, but I still find this frightening. We are now willing to give up this world for another one. We have no problem with saying “Yes” to the Pleasure Machine. Hook me up right now! Let’s get out of here ASAP! Nietzsche had this to say: “‘Eternal bliss’: psychological nonsense. Brave and creative men never consider pleasure and pain as ultimate values… one must desire both if one is to achieve anything.”[2]


Nozick’s thought experiment was designed to refute utilitarianism, the belief that the highest good is the greatest pleasure for the greatest number. His idea of the Pleasure Machine revolves around the idea of pleasure and its fulfillment. Hence, when Nietzsche denounces the pursuit of pleasure and avoidance of pain, he is arguing on the side of Nozick: there is more to life than pleasure. More than this, the “more” that exceeds pleasure is its opposite—pain. Pleasure alone is inhuman. The human condition is as such precisely because it does not consist solely of pleasure. We are human because we also experience pain. There is something unnatural to only getting our way. In actuality, we always come up against something in our way, an obstacle that has to be overcome. Struggle is inevitable, but struggle, for Nietzsche, is also essential to life; there is no life without struggle. The truth is, we find it hard to think of life without struggles—it is nearly inconceivable, for the seesaw of pleasure and pain is all we know. Nothing is more real than the pain we go through.


This is why I find apt Nietzsche’s comment that the “eternal bliss” promised by escapist methods, e.g., the Pleasure Machine, drugs, VR/simulations, is really “psychological nonsense.” It is impossible. Like the idea of a “true world,” eternal bliss is a chimerical notion, one we invented for ourselves. Nietzsche correctly observes that great men are shaped by their pursuit of both pleasure and pain. Without the latter, their achievements would have been impossible. But by retreating into our simulated realities, we give up pain altogether in favor of instant gratification, being able to glide through life without a care or struggle. Nothing great will ever be achieved hereby. Ultimately, the root of our current situation lies in a long-held denial of the world on the basis of the distinctions between truth and appearance, pleasure and pain.


Hitherto, I have been discussing VR primarily and mentioning “simulation” concurrently, without really delving into it. Admittedly, I am no expert on the Simulation Hypothesis, but, as I understand it, the gist goes as follows: because we are capable of creating realistic simulations that model our world, it stands to reason that perhaps we ourselves are mere simulations of another civilization more advanced than we. This hypothesis is not immune to Nietzsche’s attack: the “true world,” according to the Simulation Hypothesis, is what is called “base reality.” If we create a game like “The Sims,” then we are said to be in “base reality,” because we are the ones who are controlling the players in the video game. However, it is possible that we are not actually in base reality, that we are in a simulation, similar to the Matrix. There is posited an imaginary Universe or civilization that has programmed us. We are players in somebody else’s video game. But what is this civilization? Where is this Universe outside of us? Here again we are “reach[ing]… for nothingness,” in Nietzsche’s words. We have no evidence for such a race capable of designing and playing as us for entertainment.


Just because we can do it, we assume it might have already been done. If we went back in time, would people think we were in a simulation? Maybe a few—but they would be considered delusional, perhaps. The fact is, I feel like there is some fallacious reasoning here. Sure, we can program stuff, but that does not automatically mean that something else beat us to it. Possibility arises from possibility. If we could not program, would we think the same way? Who knows. Another question: we do not know if we are in a simulation for sure, we must concede, but what difference does it make whether we are or not? Why need we ask this? This kind of thinking belongs in the 17th-century with Descartes, whom we have long abandoned. Let us not even entertain the notion of a simulation, because it is useless. And do we really need to de-base (no pun intended) “base reality” as we know it in the process? How about we just stick to this world—it is quite fine as it is.


But let us say, hypothetically, that we discover we do, indeed, live in a simulation. People like Musk might say, “Ha, we told you we’re living in a simulation!”—but is that something to be happy about, something to be proud of? Will we be satisfied to discover that everything has been a lie? Artificial? Not our own? In essence, does the Simulation Hypothesis really help us in any way or make us happy? I do not think so. I side with Nietzsche on this: the Simulation Hypothesis contributes to what Nietzsche names “the defamation of the world,” i.e., its slandering. If we entertain the Simulation Hypothesis, we are not so different from the religious thinkers who believed in a perfect, transcendent world, thereby devaluing this world; we “slander and bespatter this world.” Musk, in one interview, said “base reality” would be boring because every simulation only takes what is interesting from life and puts it together, leaving out the boring stuff. If we accept that this world is base reality, then that would mean our lives, right now, are boring. Although we may feel this from time to time, convinced that our lives truly are boring, we cannot say that life as a whole is boring—how would we live otherwise? To claim that life is boring and that we need simulations to entertain ourselves is again to slander the world.


To bring this all to a close, we can summarize by advising the following: let’s just stick to common sense and life, shall we? No need for virtual realities that fulfill our fantasies or super advanced aliens that have simulated us. Sure, life is full of difficulties and complexities, but hey, we have made it this far—and c’est la vie, such is life. We cannot rid ourselves of this world any more than we can rid ourselves of our bodies (but then again, we might in the near future if we end up putting our brains in vats or supercomputers!). Nietzsche’s argument in favor of reality goes like this: rather than being deceiving, appearance is reality itself, because it is through appearances that we learn about and interact with the world; reality is what appears to us as such. This claim is evidenced by the fact that we are still here going about our everyday lives, going through the motions, carrying out everyday, practical tasks, even theorizing, never questioning whether we are living in a simulation or not, never thinking to ourselves voluntarily “I’d rather see it in VR.” Clearly the world as we live in it must be true and worth living in, must be livable, seeing as humans have been living in this world—and not some transcendent one—for hundreds of thousands of years, and it has supported them all the while. Life is a series of interconnected relationships and perspectives, according to Nietzsche; there is no “truth-in-itself” except reality itself and its appearances, no “final reality.” So let’s stop trying to replace this world with a virtual reality or theorizing about whether we are in a simulation, and just live life to its fullest! Otherwise, we are fulfilling Nietzsche’s depressing prognosis: “What is dawning is the opposition of the world we revere and the world we live and are. So we can abolish either our reverence or ourselves. The latter constitutes nihilism.”[3]

 

 


[1] Nietzsche, The Will to Power, §12a, p. 13
[2] Id., §579, p. 311
[3] Id., §69n39, p. 45

For further reading: Thus Spake Zarathustra by Friedrich Nietzsche (1995)
Twilight of the Idols by Friedrich Nietzsche (2008)
The Will to Power by Friedrich Nietzsche (1968)

Simulations, VR’s, and Nietzsche on the Defamation of the World: A Polemic (1 of 2)

In 2016, the Oculus Rift headset was released, and the popular tech entrepreneur Elon Musk suggested we all live in a simulation—two events that, although isolated, point to a powerful force at work in the 21st-century, the century of nonpareil technological progress. The former was a game changer, literally, in that it was one of the first majorly successful technological gadgets to allow gamers to immerse themselves in the virtual worlds in which they played, putting them right in the middle of the action, as if they were actually there in the virtual world, able to interact with it. On the other hand, Musk’s provocative claim marked a crucial stage in our intellectual development, one which signaled a shift toward the acceptance of the increasingly pervasive technological character of life, such that it is nowadays accepted, not at all eliciting raised eyebrows.


This is quite the deviation from, say, 60 years ago, with the popularization of science fiction novels that portrayed grim visions of such beliefs (a deviation I’ll explain). Together, the virtual reality headset and the unpromising words of Musk represent a new conception of reality and how we humans situate ourselves in it. If the philosopher Friedrich Nietzsche were alive to experience VR and hear today’s talk of technology, then he would shake his fist disdainfully, criticizing us for succumbing to one of the oldest philosophical prejudices in the books with his fiery aphorisms, before returning, head held low in contempt, to his cave like a disappointed Zarathustra, wondering with shame how we went so wrong. The man who proclaimed the death of God, Nietzsche was highly critical of both religion and philosophy, finding in each of them an attitude of denial—denial of the Will, denial of becoming, and, worst of all, denial of the world. Recently, I have been troubled by these two tendencies—those of VR and the proliferation of the simulation theory—so I will be undertaking a Nietzschean critique of both in order to point out the psychology of our times, namely, the denial of reality.


“You haven’t seen it until you see it in VR,” states the advertisement for Oculus VR’s latest product, the Oculus Go. The ad seemed to be everywhere on YouTube, following me around from video to video, trying to get me to buy the headset. When I first heard the slogan “You haven’t seen it until you see it in VR,” I became worried. It feels like something you would hear in a dystopian movie, something straight out of Brave New World or Fahrenheit 451, an attempt by the government or some big corporation to indoctrinate the masses to escape from reality into a virtual world, one where they can create whatever they want, as they please, without fear of the consequences and stresses of everyday life. Strap this headset on, they say, and all your problems will go away! Get transported to another world, a better world, a perfect world, where you can fulfill your wildest dreams. Forget this world—it’s awfully boring, because, after all, “until you see it in VR” “you haven’t seen it.” In other words, real life is not good enough for us anymore. Life? Pfffft, lame, I can see something more real in VR. With VR, I am immersed, I am in the center of it all, I am free. This world, though, reality, is alienating, is inadequate, is constraining. Our own eyes deceive us; reality is not as it seems. In contrast, VR is the real deal. As soon as I put on my headset, I am put into the world, thrown into it, and I can explore it to no end, seeing things as they are, how they are meant to be perceived. Why visit Niagara Falls when I can see it in VR, where, unlike real life, I can get up close to it and really experience it to its fullest?


Now, perhaps I am overreacting, perhaps I am making a big deal out of nothing. It is just an advertisement, in the end, just a playful attempt by a company to attract consumers with a fun slogan, pressuring them to join in the fun. By playing on the fear of missing out (FOMO), Oculus VR is inviting people to get the Oculus Go; until people purchase it, they are missing out, they are not seeing the world through the headset, which is an entirely new world. But while it is merely an advertisement, it betokens something rather sinister, a deviation: the consensually clandestine replacement of reality. That is to say, this whole advertising move conceals a dark underpinning in our society, which involves the forgetting of this world—life, reality—in place of a promised better one—in a word, escapism. In the introduction, I alluded to 20th-century science fiction, the focus of which was on predicting what the future would be like when it was dominated by technology. More often than not, these predictions were negative, rather than positive, shaped by the disastrous aftermath of technology’s fruits, the atom bombs of World War II.


However, the fact that we are no longer afraid, that, instead, we are open to and welcoming of such technological progress indicates a change in our historical consciousness. Rather than shun such immersive technology, we embrace it. No—we do not merely embrace it, we advertise it, we celebrate its advent, its prospects, and we encourage everyone to become a part of the movement, to literally buy into this alternate, Other, reality. Society has undergone a radical 180º turn. Of course, Oculus Go’s advertisement is harmless… directly, that is. What are we to make of this change in consciousness? Should we be worried? I think we should. We are in danger of escaping ourselves and this world into some fantasy realm. But at the same time, are we not denying these very things? We say VR is immersive, centralizing, and freeing—but is not this the essence of life, of reality as we know it? What can be more immersive than life itself? To what are we more central than reality? Where are we afforded greater freedom than this world? To have the freedom to create another world in which we can be free(?)—nowhere else can this be conceived except here and now. In this world. At this time. This is the immediate danger.


It is easy to know what the danger is, less so to know why or how it is. Why did we change our attitude toward virtual, alternative realities? What in our psyches instigated this revolution of ours? For this, we must look to Nietzsche, who, in addition to being a philosopher, was also a psychologist; one might say a psychiatrist, too, since he not only diagnosed society, but also prescribed for it certain remedies to treat its decadence. According to Nietzsche, our fascination with VR and simulations stems from an unlikely historical source: our Christian heritage. From Christianity, we inherit a dying, albeit still-alive, belief in the transcendent, the beyond. In addition to this world in which we live and have our business, there is another world out there, an improved, perfect world where there is happiness and permanence and goodness, i.e., the opposite of this world, full of “appearance, change, contradiction, struggle.”[1] An essential part of man’s imagination has been occupied with this transcendent world which lies beyond our experience, whose existence can be but brushed by the intellect in faint strokes or by brief revelations. Either way, it remains unknown, unknow-able, to us finite mortals. Christianity teaches that life as such is flawed, vicious (full of vice), cruel, and immoral; evil is a real thing, opposed to the inherent goodness of Jesus. The two are at odds: this world, full of evil, and God, full of good.


How can the two be reconciled? Nietzsche says: they cannot. So diametrically opposed, so antipodal are they, that the two are literally different worlds. This world in which we live, consequently, must be false due to its flaws and imperfections. This world is not the world, but an image of it. Hence, Nietzsche calls Christianity a Platonism for the masses. Contrasted to this world is the “real” world, the “true” one, which is, unlike ours, stable and unchanging and moral and perfect. It may not appear evident, but I think this way of thinking still underlies the 21st-century. For instance, for what purpose do we create games like “The Sims,” where we create people we can play as and give them families, recreating not only life as we live it, but life as we could have lived it, that is, in a stable world, one under our control, one where we have a say, one that is, perchance, perfect? Is it not for this express purpose, namely, reaching beyond this world to another one in the imagination, fitted to our unrealistic aims? Just as God created and designed a perfect world in contrast to this world, so computer and game designers can create and design virtual worlds in contrast to this world—worlds that are said to be “perfected” versions. Sure, they have bugs and other issues, but so does real life. These bugs are called vice, sin, etc. Through VR and simulations, we escape from this world into a fantastical one.


As such, man, Nietzsche claims, “invents a world so as to be able to slander and bespatter this world: in reality, he reaches every time for nothingness and construes this nothingness… as truth”; this reaction stems from a sort of “revenge on reality,” as Nietzsche puts it.[2] In examining this quotation, we find that the first phrase is perfectly applicable: we do, indeed, invent worlds. Yet is it explicitly to “slander and bespatter this world”? And what is this talk of “reach[ing] every time for nothingness”? Beginning with the last part, Nietzsche is referring to the fact that, in seeking another reality, we are looking for something that is not there; the transcendent world of the religions is fictitious, purely mental, a fabrication, just as the worlds in our video games are. While with VR we can go inside of them, we will never really experience them consciously, for this is the only world we have. Yet we “construe this nothingness… as truth,” and we convince ourselves that we are really playing in another world, one that is realistic and comparable to actuality, all the while belonging to ideality, to something that exists only in the mind, not out here, sensible, touchable, interactive in the truest sense. Being virtual, it does not exist concretely or fully, but abstractly and partly. To say we are “in a virtual world”—what does this mean? Can I “see” this virtual world through my own eyes, or only through a screen or headset? It appears to me to be nothing, nothingness, and yet I make it seem to exist side by side with our world.


This leads back to the other point of Nietzsche, which is that such a construal of nothingness is precisely to “slander and bespatter this world.” Our talk of another world outside of ours, alongside ours, better in some way, not entirely within our reach, is to Nietzsche a disrespect; we are spreading rumors about this world, ranting about how imperfect, how unfair, it is, corrupting its image through more images, trying to leave it behind and abandon it for something else, something Other, divorced from reality. For this reason, Nietzsche labels such actions as “slander” because we are going behind the world’s back and talking smack about it, whispering to each other about all the drama and gossip, spreading injurious words and phrases, accusing it, blaming it for all the iniquities in the world. Thus, we find that the whole of Nietzsche’s claim is appropriate.


But this prompts a deeper inquiry: why do we have this way of acting and thinking? Another way to put it is: why did the Christians abandon this world—what deep-rooted psychology have we inherited that makes us aspire toward the transcendent? We initially touched on this with the comparison of life to God. By comparing this world and its evils to God’s justness and goodness, we formulated a dichotomy between this world and that world, i.e., the “true” world. Everywhere we look, there is injustice and chaos; the Universe is ruled by entropy. Thus, there must be a world that has none of this. A claim like this is even supported by science, Nietzsche remarks. In light of modern physics, especially quantum mechanics, the world is nothing more than an illusion, it would seem: the world is reduced to appearance. It appears I am sitting on a green couch, although in reality it is nothing solid or green, but a collection of subatomic particles, practically indistinguishable from each other, conserved in their motion and energy. So science draws a line between reality as we experience it and reality as it really is. Our perceptions of things can be mistaken, as in optical illusions, and the things which we are perceiving do not even show themselves accurately, veiling their true nature; beneath all of this is another world, the “real” world of quarks, energy, and maybe even strings, which is permanent (law of conservation) and, arguably, perfect (that is, in terms of contingent habitability, cf. Anthropic Principle). Nietzsche finds irony in this, considering that science is supposed to study reality, when it really tries to get behind reality, just as metaphysicians and mystics do, proposing some kind of unknowable reality that is covered by the knowable one.


[1] Nietzsche, The Will to Power, §578, p. 310
[2] Id., §461, p. 253

For further reading: Thus Spake Zarathustra by Friedrich Nietzsche (1995)
Twilight of the Idols by Friedrich Nietzsche (2008)
The Will to Power by Friedrich Nietzsche (1968)

Bradbury, Martians, and the Threat of Technology

Picture what you think the future will be like in 50 years: Will there be flying cars, intergalactic space travel to other planets, space colonies, and smart houses that will be able to do anything you want them to, with everything perfected to function automatically, so that no one has to do any work? While this is a dream for many, it is a nightmare for author Ray Bradbury, whose science fiction book The Martian Chronicles details the dangers of a technologically dominated future. Written in the middle of the 20th-century, the collection of short stories is designed to showcase the many threats posed by technology against its very creators. He warned in his book that a reliance on technology is not to our benefit. Bradbury’s predictions about technology have become evident in today’s world. Just as he predicted, we humans have become so dependent upon our technology that we risk losing both our control and our humanity, and our abuses of it can ultimately threaten and endanger us.


Technology is used every day for tasks, yet this reliance can be carried to extremes, such that we are no longer independent, replaced by technology in even the simplest of tasks. In “There Will Come Soft Rains,” a fully-automated house is left isolated after a nuclear fallout. The owners were vaporized in the blast, leaving the house to continue functioning on its own. Describing its system, Bradbury writes, “The house was an altar with ten thousand attendants, big, small, servicing, attending, in choirs. But the gods had gone away, and the ritual of the religion continued senselessly, uselessly” (168). By depicting the house with religious metaphors, he creates a sense of sacredness surrounding the technology. Specifically, the word “altar” refers to a table used for religious rites, a center for worship. For this reason, the house can be construed as a holy place, one frequented by something supernatural. When he says “the gods had gone away,” Bradbury is referring to us humans, who are considered deities to the house in a double sense: We are both the owners of the house, whom it serves, and the creators of the technology, such that we are worshiped as gods. As such, we humans take refuge in a temple, a place of importance. Later, it is explained that the house is equipped with mice that clean the house, and an intercom system is in control of every household function, which is suggested by the fact that the house has “ten thousand attendants,” showing that the technology is overwhelmingly unnecessary; it replaces the humanness of the house due to its excessiveness. Because the humans have left, because the house remains a place of worship, the technology is deified. This swap of human-technological worship is representative of today, as Bradbury guessed, because no longer does technology serve man, but man serves technology. The image of the house surviving the blast while the humans died supplies readers with the idea of technology’s immortality and power over the natural world. Since the humans are gone from Bradbury’s world, it means everything is dependent on machinery; technology has replaced the need for humans because we grew too dependent upon it. Even though the owners are gone, the “rituals”—by which Bradbury means the everyday chores done by the house—“continued senselessly, uselessly.” Later in the story, the house makes breakfast even though there is no one to eat it, and it keeps making breakfast anyway, showing that the sacredness imputed to humans has become obsolete. Everyday tasks, relegated to monotony, have lost their meaning in the human realm. But the future Bradbury imagined is not so far from reality, because today robots are replacing humans in society. In “The Big Robot Questions,” writer Patrick Lin states, “[B]ecoming overly reliant on technology for basic work … seems to cause society to be more fragile” (Lin). Nowadays, there is a device for everything, and the “basic work” of which Lin speaks consists of things like making food, washing clothing, or waking up. It is basic since it is fundamental and simple—any human can do it. However, no longer doing such basic work ourselves leads to “fragil[ity],” or a state of being prone to breaking. If society is fragile, then it means it is not secure, that, at any moment, it can fall apart, like a dropped antique vase. Taken together, both Bradbury and Lin foresee a world where technology reduces humanity to mere idols, lazy, otiose.
Where Lin talks about “basic work” being taken over by machines, Bradbury mentions the “ritual[s]” of machines that will become regular. Automation can be a good thing, as it eliminates effort, yet the two authors believe that an overdependence on technology will remove meaning from human life, of which a large component is work, from which we derive meaning. With work eliminated, Bradbury fears the boring longevity of technology, whereas Lin fears the weakening of society. Through these examples, it can be concluded that technology, if it is used too much, can render life a senseless worshiping of idleness.


Technology is capable of both imaginative creation and unimaginable destruction if it falls out of our control, in which case it is liable to backfire and threaten both human and environmental safety. The short story “The Million-Year Picnic” follows a family who is supposedly visiting Mars for a fishing trip. After destroying their rocket, the dad drives them across the planet, where they claim a city as their own. The dad explains to his kids why he took them there: “[P]eople got lost in a mechanical wilderness, … emphasizing the wrong items, emphasizing machines instead of how to run the machines. Wars got bigger and bigger and finally killed the Earth” (180). Here, the imagery of humanity “lost in a mechanical wilderness” represents the confusion and directionlessness created by technology. The metaphor of technology as a “mechanical wilderness” shows that technology is a wide, pervasive, and dense network, a great big expanse of land into which it is easy to lose one’s way. When a person is “lost in the wilderness,” it means they are stuck because they do not know which way to go, how to get out, as they have no compass, no guide. “Wilderness” itself brings up connotations of danger and unpredictability, a feeling of being disoriented or out of place. Accordingly, Bradbury is saying that, when it comes to technology, humans do not have a moral compass, a device with which to navigate the dense thickets of technology, where we are left stranded. He talks about the difference between “machines” and “how to run the machines”; developing technology is dependent upon using that technology, although the latter is neglected, Bradbury thinks. The importance of the phrase “how to” lies in its indicating control. To ask how something is done is to ask the method by which it is done, so to emphasize “how to run the machines” is to emphasize the purpose, or utility, of technology. There is also another interpretation of “how to”—namely the avoidance of improper use. In the case of technology, Bradbury is predicting that we will become blind to the proper use of technology, resulting in its going awry, malfunctioning—functioning badly, not in its desired way. Asking “how to” is also asking “how not to,” implicitly, because it distinguishes between the correct and the incorrect utilization. These views are not unfounded, for even now machines are not functioning correctly. Patrick Lin, in the same article, recounts how, in 2007, an automated antiaircraft cannon failed to work properly, killing nine, injuring 14 (Lin). This incident reveals the shortcomings of technology, specifically its ability to malfunction. As a result of this malfunction, many people either lost—or came close to losing—their lives; consequently, technology, when it is not understood, when it is not under precise control, can harm us humans. Imagine if a nuke were to malfunction: Any flaw in its system would be devastating. Bradbury and Lin are noticing that technology is something not to be tampered with, a thing which can change on a whim, if we are not careful. The dad in “The Million-Year Picnic” talks about the devastation of the Earth and how wars got people killed, while Lin mentions an incident where a weapon went off and killed several people. Despite their differences, one being bigger than the other, they both express the same fear: Technology is dangerous.
Bradbury’s vision is much more extreme than Lin’s; however, the potential is implied by Lin, who cites the increasing use of militarized technology. Both authors, then, are in agreement that if we move too fast, if we do not look where we are going, then we will lose sight of what we are doing, with terrible consequences. In Bradbury, the machines are not used aright, and so contribute to the devastation of the Earth, and in Lin, a machine does not work, and so contributes to the death and injury of a group of people. A common theme is the imperfection of technology, its proneness to mistakes. Because we are imperfect creators, our creations are imperfect, too.


Originally designed to connect us across long distances and to help us at home, technology has begun to extend its range to the world at large; however, its effect on the environment is less than beneficial. “The Locusts” is a story centered on the arrival of hundreds of rockets on Mars, which has been deemed safe for living. Tens of thousands of people move to the new planet to set up their homes, in the process destroying the environment with their technology. Tersely, bluntly, Bradbury narrates, “In six months a dozen small towns had been laid down upon the naked planet, filled with sizzling neon tubes and yellow electric bulbs” (78). Bradbury calls the planet “naked,” symbolizing bareness and vulnerability. To be naked is to be without cover, and so to be uncovered, meaning to be without protection. Because Mars is unprotected, it is open; however, this natural openness, this simple innocence, also means it is able to be attacked, since it is without armor. In contrast, Bradbury says that the humans’ technology consists of “sizzling neon tubes.” “Sizzling” refers to burning in a hot or excited way, as in a sizzling fire, which is dangerous because of how easily it spreads. As such, a naked Mars is scorched by sizzling technology. Furthermore, the “neon tubes” create images of bright, disruptive, blinding light. Colors that disorient us are called “loud,” so it can be said that Bradbury is depicting technology as loud and obtrusive, and, it follows, unnatural. In short, technology is characterized as having a negative effect on the environment due to its ability to intrude upon nature. This conception of technology is similar to that of the German philosopher Martin Heidegger. In his work “The Question Concerning Technology,” written in 1954, around the time of Bradbury himself, he observes, “Enframing means the gathering together of that setting-upon which sets upon man, i.e., challenges him forth, to reveal the real, in the mode of ordering, as standing-reserve” (Heidegger). Although this passage is not clear at first, what he means is similar to Bradbury. “Enframing” is putting the world into a preconfigured framework, as in, “to put in a frame,” and the “setting-upon” to which he refers is likened to a seizure, or confrontation with something, often an object. Man, Heidegger explains, is “challenged forth,” or provoked, made to compete, “to reveal the real,” or show nature, “in the mode of ordering,” or in the mode of commanding into a certain arrangement. What this means is that man is called on to make nature into a specific orientation for his self-interest. This specific orientation with which he “reveal[s] the real” as “ordering” is what Heidegger calls “standing-reserve,” which is basically what it sounds like: Nature is turned into a commodity, something reserved, and, as such, is a resource awaiting its use. Nature is made to be on standby until it becomes useful. In this sense, technology, or enframing, is the process of converting nature into something of use. To summarize, technology for Heidegger is a worldview in which man seizes nature and commands it to reveal itself as a resource to be used up—and nothing more.
Bradbury’s idea of technology as a threat to the environment is based on its ability to disrupt the natural state of the world, to change it completely, whereas Heidegger’s idea of technology is based on its ability not to disrupt nature, but to reinterpret it, to change its form so as to be something manipulable, exploitable. In “The Locusts,” the humans’ homes are “laid down upon” Mars, and in Heidegger, technology “sets upon” nature; yet in both cases, this idea of a verb plus the preposition “upon” suggests a violation, an act of invading and attacking, hostile and unfriendly. When the two sources are looked at together, they both express a common concern regarding technology’s ability to ruin the natural beauty of the world. By replacing the natural with the unnatural, by seizing nature and commandeering it, technology, the two authors assert, harms the environment. Technology as enframing enframes the world as something to be used for our own instrumental purposes.


As Bradbury envisioned, our age is a technological one, one where we have become indebted to machines, where our lives have become dominated by them, where we are on the verge of losing our autonomy, and where our blindness to this predicament will be rewarded only with idleness and meaninglessness. By making technology an indispensable part of our lives, we have ourselves become dispensable to it, as Bradbury foretold. It is important in this technological era to retain and keep safe what it means to be human, what it means to engage in meaningful work, what it means to be safe, not threatened by the specter of nuclear war or technological revolutions. Technology is expanding and evolving faster and more efficiently with each year; and with each new iteration, we are more inclined to entrench ourselves in it, to bind ourselves to it. So what will the future be like in 50 years? With hope, we will be attentive enough—and still alive—to experience it, because appearances are often deceiving, and technology is a master of deception.

 

The Problem with Memetic Literacy

Immediately after they wake up, a large percentage of people check their phones to see the latest notifications from their social media or to respond to the influx of emails they have received. Similarly, a large percentage of people stay up late doing the same thing, checking their feeds, scrolling, nothing in particular on their minds, nothing to look out for—just scrolling, as if something will magically appear. Every day, millions of pairs of eyes flicker over their bright screens, either on Instagram, Snapchat, or iFunny, looking at hundreds of memes, short, humorous images or clips shared from person to person, starting with just one viewer, then spreading exponentially, until, like the game of telephone, it evolves with every share, becoming something new, something different, yet derivative, building off of the original, but with a new touch of interpretation by whoever appropriates it.


It can be said that memes are one of the greatest things of 21st-century technology since they are able to be universally understood, shared, and laughed at. Language barriers are no more, so someone in the U.S. can share a meme with someone in China, and they will both get it. How cool is that—to be able to communicate cross-culturally and get a laugh out of it? Memes allow for a shared knowledge and entertainment for people of all ages and backgrounds, connecting them through a single medium. While I myself like a good meme, just as anyone else does, and while they can be hilarious, I think the popularity of memes today, despite its benefits, also brings with it deficits, problems that can, and should, be addressed. The spread of a “memetic literacy,” as I like to call it, has supplanted a much more fundamental, more necessary cultural literacy, and so will, I believe, impoverish both today’s and tomorrow’s youths.


When we think of literacy, we think of reading and writing. To be literate is to be able to read and write; to be illiterate, to be able to neither read nor write. Defined this way, our generation has the highest literacy ever, according to the graph to the left. Over time, as education has become open to more people, as education has been improved, literacy has gone up, and will continue to. We are living in an Enlightened age, the most Enlightened age, with information stored in computers and more brains than there have ever been. However, there is a difference between being able to read and write and being able to read and write well. E. D. Hirsch defines literacy as “the ability to communicate effectively with strangers.”[1] What this means is that literacy is a common, shared knowledge. If I am literate, then I should be able to engage anyone on the street and have an understanding conversation with them, one in which I am able to understand them, and they me. Despite our different backgrounds, we both know what the other is talking about; both of us are comprehended.


During the 19th century, when the world was industrializing, education was universalized. Schools were implemented worldwide to teach a shared culture. National languages were codified in place of regional dialects so that people could understand one another, and thus, as in the Renaissance, reading was made available to everyone, not just the learned elite, who were usually members of religious orders. Because language was made singular and common, a koine, a vulgar tongue, the common folk could learn on a mass scale to read and write in school. Some argue that it is a language and a culture that create a nation, for what is spoken and what is spoken about constitute a common people. There is a sort of egalitarian principle behind this, a principle of making everyone equal, of giving everyone, no matter their makeup, their abilities, or their social position, the right to an education and the right to be a part of a culture. There are no distinctions between the advantaged and the disadvantaged, the educated and the uneducated.


Hirsch relates how the literate usually like to keep the illiterate illiterate by not telling them how to become literate, withholding the specific requirements for doing so. It is subtle: there is no single, agreed-upon list of things one must know in order to be literate, for the selection is just so vast. The Western Canon, for example, is but a sampling of the world’s greatest literature. So while some may call you literate for having read the whole Canon, others may not consider that criterion enough. As such, to be truly literate, to be well read, is to be a part of the elite, as opposed to the merely literate, those who are educated just enough to read and write.


I like to think that I am pretty literate in memes, but I was disabused of this notion when I was hanging out with a friend one time and could not relate to a single phrase that came out of his mouth. I thought I had a pretty solid grasp of memes, yet here was my friend, clearly more literate in memes than I, referencing jokes whereof I knew not. It was as if he were having an inside joke with himself that I could not understand; I lacked the shared background knowledge he had, and he assumed I had it when I did not.


On YouTube, there are famous playlists 300 videos long, lasting several hours, full of memes. If one can sit through all of them, then one, I suppose, can be called “literate” in memes. Yet one will still be lacking in other memes, meaning it is hard to specify which memes one should know in order to be literate in them. In my case, how am I to know which memes are in vogue? Moving past this, the better one can read, the better one does in other subjects. From experience, I can attest that reading a variety of texts leads to a bigger vocabulary, and thence to a larger store of knowledge and comprehension, resulting, ultimately, in easier learning through association. Such is Hirsch’s outline of literacy. Someone who is well-rounded in their reading, who reads not just fiction but non-fiction, who looks up words they do not know so they can improve, who not only specializes but also generalizes their knowledge, who associates what they do not know with what they do know: they are literate, and they are successful in reading and writing.


E. D. Hirsch writes of a study he once conducted at a community college in Richmond, Virginia. There, he interviewed students and asked them to write responses to his prompts. Eventually, he asked them to write an essay comparing the Civil War generals Ulysses S. Grant and Robert E. Lee, the latter of whom was himself a Virginian. Although they were in the capital of Virginia, once the capital of the South, the students were not able to write a response because they did not know who either of the two men was. Hirsch was flabbergasted, to say the least.


The point he was trying to prove was this: cultural literacy is integral to society. A universal background is always presupposed. We require tacit knowledge to understand things that are implicit, both in a text and in the world around us. The culture is greater than the sum of its parts, and it must be understood generally, in relation to all its parts, rather like a hermeneutic circle, in which the whole and its parts must be continually interpreted in light of each other. In this sense, cultural literacy comprises political, historical, social, literary, and scientific literacy, all in one, according to Hirsch. In other words, cultural literacy is the totality of all its subjects.


One must be well-rounded and not too specialized to be culturally literate, lest one neglect one subject for another. For instance, a writer writing a non-fiction book assumes his audience knows what he knows, or at least brings some kind of background information to it; the last thing he expects is for them to come in blind, without any preconceptions or context whatsoever. There should be an interplay between specialization and generalization, because a reader should have a grasp of the subject overall as well as of the details within it. What is assumed includes connotations, norms, standards, and values, among other things; in short, shared knowledge. To have this shared knowledge, this basic understanding of one’s culture, such that one is able to engage with it, “to communicate effectively with strangers,” is to be culturally literate.


Durkheim spoke of a “collective consciousness,” a totality of implicit, pre-existent notions within a society. Everyone in the given culture is under this collective consciousness and is part of it. It is collective because it is common to everyone; consciousness because everyone knows it, even without acknowledging it. Being an American, I have the idea of freedom as a part of my collective consciousness, just as over 300 million other people do. Were I to stop a stranger and ask them about freedom, I am sure they would have the same background knowledge as I do, such as the Fourth of July, which signifies independence for the U.S. This example illustrates an interaction grounded in cultural literacy.


Things are a part of our collective consciousness only because they are meaningful and valuable; if they are not, then they do not deserve to be presupposed by all. If something did not mean anything, why should it survive in all of us? Hirsch writes, “[T]he lifespan of many things in our collective memory is very short. What seems monumental today often becomes trivial tomorrow.”[2] It is hard to become a part of the collective memory. What makes good literature good is its longevity. Homer has long been considered one of the greatest ancient writers because he has continued to be read for millennia.


Compare this to pop singers today, whose meteoric rises soon meet an impasse, only to decline, impermanent, impertinent. The same can be said of memes. They all explode in popularity, only to reach their apex before either fading into obscurity or being replaced by another. A meme can be overhyped; it loses its importance, and although it seems “funny” or “important” one day, it may not be the next. Memes are volatile things. On a whim, they come and go. Even though some have a longer life than others, they all eventually go. The classic Vine “9+10=21” was once extremely popular and was quoted daily in school; now it hardly exists in our collective memory; it is a ghost, a fragment from oblivion.


Hirsch comments that about 80% of what constitutes the collective memory has already been taught for at least 100 years. The Western Canon, again, is a good example: its core works have been fixed since antiquity, and as civilization progressed, more works were added to keep up, all the way to the 20th century. In 100 years, it is incredibly unlikely, albeit still possible, that we will remember, much less care about, people chucking things while yelling, “YEET!” Memes, while communicating entertainment, do not express values. The Western Canon, therefore, is as it is because it has been formative in our world; its works have been studied for so long and by so many people that they have left an indelible influence, one that persists today.


Given all this, I can now address the main problem of this essay, namely the conflict between cultural literacy and “memetic literacy.” I have not said much about memes yet save in small bits, but I shall discuss them presently. For now, I wish to direct your attention to the issue at hand: the decline of cultural literacy. A teacher created a quiz full of famous, influential persons and gave it to his class to gauge their historical, artistic, literary, and philosophical literacy. He was disappointed when one of his students compared the test to a game of Trivial Pursuit, because it prompted the question, what counts as important or trivial today? This is a vital question that everyone needs to ask themselves. Are famous leaders like Napoleon now trivial, compared to the importance of Viners and YouTubers like Logan Paul? If both names were put on a test, would students cry, “Why do we have to know this Napoleon guy? Logan Paul obviously has a bigger influence today”? Is knowing who Napoleon is just trivia? Furthermore, the teacher found that his students had no knowledge of current events, specifically of their own country and its involvement in foreign affairs.


Jaime M. O’Neill, the teacher, states, “Communication depends, to an extent, upon the ability to make (and catch) allusions, to share a common understanding and a common heritage.”[3] Allusions are thought by many to be pretentious; those who make them are called name-droppers and are disparaged. Many others and I would argue to the contrary, for allusion connects directly to Hirsch’s idea of cultural literacy. Allusions are an example of shared knowledge. To be well-read, and therefore to know of many ideas and people, is to be involved in your culture. If I were to call something Kafkaesque, then I would be engaging with my culture, expressing a background in literature that the situation calls for.


In short, we are losing the ability to make references to the collective consciousness, the ability to commune with strangers on the same basis. There is a paucity of literacy in literature and history. All teenagers know these days is what they need to know. No one goes out of their way to study history or literature; they are content and complacent with what they know. O’Neill records, plaintively, that some of his students thought Pablo Picasso was a 12th-century painter and William Faulkner an English scientist during the Scientific Revolution.


Throughout my day, I hear my friends and classmates complaining about the impractical, specialized knowledge on their tests, knowledge they have to memorize. Although I can sympathize with them, and although I often agree that these tests are absurd, I also think they are in the wrong to say these things. Jeff Jacoby, a journalist for the Boston Globe, has written about the same subject. He observes that it is actually easier to memorize what is on standardized tests than to memorize what our peers’ standards demand. Put another way, we memorize so much useless information and trivia about sports, music, and TV on a daily basis in order to keep up with our peers that memorizing the facts on a test is comparatively easy.


Unlike peer culture, whose facts are prone to change and in constant flux, the facts on tests are fixed and unchanging. Whereas 1789 will always be the year the French Revolution began, the fact that Steph Curry is the point guard for the Golden State Warriors is bound to change in years to come. Memorizing the Pythagorean Theorem is applicable, as opposed to memorizing the names of all the members of One Direction, which is impressive but not applicable. The biggest complaints I hear, and which Jacoby also cites, are “I could spend my time more meaningfully” and “Why should we have to memorize facts?” Both points have merit, I concede, especially the latter.


Please do not interpret me as siding with the school and not the students; I have many a problem with education today, one of which is standardized testing, because the memorization of lifeless facts is indeed a problem. My point is this: we youths memorize countless dumb, trivial facts about pop culture and regurgitate them just as much as we do scientific facts, like mitochondria being the powerhouse of the cell. I am forced to ask: if you claim you could be spending your time better, what, then, would that look like? Simply put, teenagers, myself included, are false and hypocritical; and while I am not saying we should not complain at all, I think we should complain less, unless we truly have grounds for doing so.

Kids set truly high performance learning standards for each other…. If students don’t know the details of the latest clothing fashions or the hot computer games or the to-die-for movie stars, they’re liable to be mocked, shunned, and generally ‘flunked’ by others their age. That’s why so many spend hours each day absorbing the facts and names of popular culture.[4]

This is a particularly interesting insight. Writing for the Concord Review, Will Fitzhugh observes that teens memorize popular culture information to fit in with their peers, to pass their “informal tests” that they create for each other, to be cool. Just as school is standardized, so peer performance has standards, which, if not met, result in getting “flunked.” Students complain about testing in schools when life is a big test itself! One must struggle to stay afloat in the advancing rapids of entertainment that speed by. One must be “cool,” lest they be ostracized for not being a part of the peer culture. One should be studying hard for a test they have later that week, yet there they are, up late at night, stressing over whether they are literate enough in pop culture, cramming in short seven-second videos to fit in, obsessive, anxious. Memetic literacy is slowly overtaking cultural literacy. Jacoby concludes, “The question on the table is whether the subjects to be memorized will include English, math, science, and history—or whether the only mandatory subjects will be music, television, movies, and fashion.”[5]


So what actually is a meme? The following excerpt comes from the originator of the term, the scientist Richard Dawkins:

We need a name for the new replicator, a noun that conveys the idea of a unit of cultural transmission, or a unit of imitation…. [M]emes propagate themselves in the meme pool by leaping from brain to brain via a process which, in the broad sense, can be called imitation.[6]

A meme is a certain kind of gene, a strand of code that is inherited. But unlike biological genes, memes are what Dawkins calls “cultural genes” in that they are passed on not biologically but culturally. A meme is a gene on a mass level. Think viral. A “viral video” is so called because, like a virus, it spreads exponentially among its hosts, not through the air but digitally; the video goes “viral” as it is passed from person to person, computer to computer. Dawkins says a meme is a form of “imitation,” by which he means that the meme is copied and then replicated. It has copies made of it, either faithful ones or mutations. Memes are reproducible and copyable; in fact, there is a meta-meme, a meme about memes, about stealing memes: creators will take an already existing meme, put their own twist on it, then put their name on it to claim it, ad infinitum.


Memes are a favorable means of cultural transmission, as Dawkins puts it, because they are easily reproducible. The basic meme consists of a background picture with text above and below it that makes some kind of predictable joke along a patterned outline. The picture stays the same, but the text can be changed to allow for different jokes among different people. Memes are simple and easy to understand. Their punchlines are short and witty, and they are so widely recognized that anyone, regardless of ethnicity or language, will be able to get a laugh out of them.


Unlike cultural literacy, which differs from culture to culture, memes are universal. Any high schooler, I can guarantee, will know a meme from across the world if presented with one. Memes have become the source of new allusions. This means, after all, that memes are, if only briefly, a part of the collective consciousness. Seen by millions daily, memes are a worldwide shared knowledge. But of course, memes, for all their good, come with problems, too. What is most important in the definition of a meme, I feel, is the word “idea.” An idea can be many things: a song, a joke, a theory, an emotion, a fashion, a show, a video, and a dozen other things.


This said, memes have great potential because they are good at spreading ideas, including ideas that matter. The problem is that memes mostly spread ideas that do not matter. Viral videos are for entertainment and nothing else; one laughs at a sneezing panda for enjoyment, not for education or enlightenment. Memes are usually trivial, frivolous, meaningless, and humorous. Not all are, but most are. Despite their potential, most memes are vapid and disruptive. I get a good laugh out of memes, and sometimes they can even be intellectual in their content, like historical memes. But the majority of them are useless, fatuous entertainment. We need, in this age of ours, to find a balance between being literate in memes and being literate in our world.


To summarize, the problem at hand is that we are seeing a decline in cultural literacy, the ability to communicate with strangers with a shared, underlying knowledge; and a rise in memetic literacy, the ability to make allusions to videos, celebrities, sports, fashion, and other popular culture. This is not to say that memes should not be used at all, no; after all, Nietzsche said, “Without music life would be a mistake.”[7] A musician like Michael Jackson, being a part of popular culture, ought to be discussed just as much as Louis XVI because he is a part of our collective memory. Popular culture is, of course, a subdivision of cultural literacy, because without it, we would have little shared knowledge!


I fear the day we no longer know of classical literacy, when we can quote Lil Pump’s “Esketit” but not Shakespeare’s “To be or not to be.” We should be able to discuss music and fashion and sports, but they should not be the priority; they are entertainment. Memes do a lot of good: they spread universal joy, and they can get an idea seen by millions. But they can also do a lot of harm. What we need to do is ask ourselves questions. We need to consider what is trivial and what is important today. We need to decide what is worth studying, what ideas are worth spreading. Entertainment is essential, but spreading ideas, good ideas, is more important. We are undergoing a fundamental change in our world, and we need to be present to address it. This is a proposal to look inward instead of outward, to examine our values, to find out what we care about.


[1] Hirsch, The Dictionary of Cultural Literacy, p. xv
[2] Id., p. x
[3] O’Neill, “No Allusions in the Classroom” (1985), in Writing Arguments by John D. Ramage, pp. 400-1
[4] Will Fitzhugh, qtd. in Jacoby, “The MCAs Teens Give Each Other” (2000), in Elements of Argument by Annette T. Rottenberg, p. 99
[5] Id., p. 100
[6] Dawkins, The Selfish Gene, p. 192
[7] Nietzsche, The Twilight of the Idols, §33, p. 5

For further reading:
Elements of Argument: A Text and Reader, 7th ed., by Annette T. Rottenberg (2003)
Writing Arguments: A Rhetoric with Readings by John D. Ramage (1989)
The Dictionary of Cultural Literacy by E. D. Hirsch (1988)
Challenges to the Humanities by Chester E. Finn (1985)
An Incomplete Education by Judy Jones (2006)