Some Reflections on OreGairu (My Teen Romantic Comedy SNAFU): Rationalization (4 of 5)

Read the previous post here.

Rationalization


In his famous essay “The Myth of Sisyphus,” the French philosopher Albert Camus describes the ridiculous sight of a man wildly gesticulating in a phone booth, citing this as a demonstration of life’s absurdity.[1] This, he implies, is a representation of the entire human condition: All of us making a big fuss, as if anything meant anything! We are all, each of us, mimes trapped in our own phone booths. The problem with this, of course, is that Camus stands outside and reports upon the situation “objectively,” as if he were an entomologist observing the comical behavior of an insect. Such is the attitude of the crude behaviorist, who records outward gestures and acts and believes either that there is no inside at all or that the inside, being secondary, can be sufficiently inferred from the exterior.


What Camus hereby ignores and minimizes is the lived reality of the caller, his first-person experience, which to the caller himself is undeniably a matter of importance. Perhaps he is talking with his ailing wife, discussing his job with his boss, or hearing that his child has been hospitalized; Camus never considers any of these possibilities—in fact, he discounts them entirely, content instead to view human life from a detached, external perspective, from which he can pronounce a general judgment on existence. This is largely the same fault of which Hachiman is guilty, especially throughout the first and second seasons. For example, he is relentlessly critical of his classmate Hayato Hayama’s clique, commenting on it from a distance while sitting at his desk, head buried in his arms, mocking their falseness, their lack of intimacy, the shallowness of their conversations, the pettiness of their drama, and the weakness of their bonds. All groups, he has decided, are by default fake.


But like Camus, he has the privilege of being an outsider, which gives him total discretion. Now, it is true that in many cases Hachiman’s diagnoses are correct—Hayato and his “friends,” Tobe, Hina, Yumiko, Yui, et al., are, in fact, petty, shallow, and not especially close—but being correct in a few judgments does not justify his broader claims. An armchair, or rather desk, psychologist, Hachiman has nobody to consult, but remains inside his head; he has never been in a group, so he has no firsthand experience of group dynamics beyond what he has seen from afar. His deductions are nothing more than prejudgments. At the beginning of season two, Hachiman tells Hayato to his face that whatever he is trying to save by keeping his group from falling apart is probably not worth it if it needs saving.


The great irony is that by the end of the same season, he finds himself in the very same position, noting his own falseness in interacting with Yukino and Yui, the first group of which he has been a part and to which he feels an obligation. It is only when he is in a group himself, when he has surrounded himself with people who care for him and for whom he cares, when he is finally on the inside and learns what friendship is and can be, that Hachiman lives this out for himself and can truly understand what is at stake in relationships. An outsider or a stranger never has a stake in what they are looking at from the safety of the sidelines, whereas someone who commits themselves to something or someone has a stake therein, the effects of which they will feel deeply. Indeed, this is the exact criticism Ms. Hiratsuka directs against Haruno toward the end of the third season, when the latter tells Hachiman that his relationships with Yukino and Yui are not genuine, but merely expressions of co-dependence.


To elaborate: if, upon observing a couple passing by, and having no acquaintance with them, I declare what they have to be false or not “real,” attributing ulterior motives to either of them, then I have not made a deduction but have only succeeded in falsifying their relationship. This abstraction at a distance does not disillusion; to decide from the outside that a thing is false is nothing but a falsification. There is something deeply false, in other words, in the very act of falsifying, from which follows the corrosiveness of skepticism and cynicism. There is a subtle difference between the words “mistrust” and “distrust” that I think bears upon this analysis. The two are the same in terms of outcome, namely, not trusting others, but they differ in how they arrive at that conclusion. I would argue that distrust is founded in mistrust, and that the latter is narrower than the former. As the prefixes indicate, when I mistrust someone, I mistakenly trust them, usually unintentionally; and if I end up having enough negative experiences, then I may develop the inclination to distrust everyone, in which case, prior to all interactions, I automatically assume they are untrustworthy, self-interested, and disloyal.


This is sadly an all-too-common phenomenon; you most likely know someone in your life who may have been generally optimistic or happy but who, having had a bad romantic relationship or friendship, declares that they will “never love again,” “never open myself up to another,” “never let my guard down,” “never make the mistake of being vulnerable,” etc. Once one has been hurt, one understandably becomes closed off and protects oneself from the possibility of its ever happening again. As creatures of custom, we make inductions all the time, generalizing from a few cases, so that if we have been fooled once, five times, ten times—then the whole of humanity is to be distrusted henceforward! This is not to diminish the very real and painful experiences of having been wronged, having one’s trust broken, being manipulated, being abandoned, or anything like that; rather, the point is that from a few incidents, whether isolated or closely connected, we tend to overcompensate, thinking that we are helping ourselves by erecting walls, when in reality, by closing ourselves off from others, concealing or repressing our feelings, and forsaking relationships, we only exacerbate our hurt.


To live in distrust, I propose, is to live a diminished life. Such is the existence Hachiman leads. Watching how he interacts with Hayato and some girls, Haruno observes, “You always try to read what’s behind every word and action. It’s […] like you’re afraid everyone has evil intentions” (S2E4 13:26). There are two aspects to this. The first is, as I have noted, life history. Due to his negative early experiences, having mistrusted others, Hachiman is generally distrustful in the present. Yet the second is something introduced earlier, i.e., projection. There is an extent to which Hachiman attributes his thinking to others, partly for explanatory value, though partly for self-protection. That is to say, despite being (overly) honest with those he dislikes, Hachiman is far from being open when it comes to his genuine feelings, about which he remains silent, particularly with Yukino and Yui. Thus, his uneasiness or reluctance to believe in sincerity is itself an expression of insincerity on his part.


I take it this is the unstated rationale behind his notorious “Nice Girls” speech, in which he doubts the realness of any expression of kindness from a girl, seeing it as disingenuous, impersonal, or utilitarian. There is no way, he thinks, that a girl could possibly—and for seemingly no reason at all—want to be nice to him, from which he deduces that there must, in every instance, be some sort of unseen gain or ploy at work. Notably, the occasion for his monologue comes after a stiff conversation with Yui. Earlier in the school year, he saved her dog from being run over, getting hit in the process and having to skip school for a while until he recovered. Surely, he calculates, there is no conceivable reason why Yui should pay the slightest attention to him unless it is for the sake of paying him back, as it were, a way of saying “Thanks!”, in which case her kindness is ulteriorly motivated.


He reflects, “If the truth is cruel, then lies must be kind” (S1E5 21:37); and seeing as Yui is kind toward him, then as per the premises, her kindness must be a lie. The truth, correspondingly, is that in reality, beneath the appearances, Yui does not like him, or is indifferent to him. What he does not consider, of course—because, in this mode of thinking, he is incapable of considering it—is the possibility that her kindness is motivated by nothing other than kindness. He cannot comprehend kindness “for no reason,” even though it is arguably the case that kindness “for a reason” is undermined thereby. In other words, kindness is not “for” any reason because it should not be (unless we use a tautology: Being kind for the sake of kindness). Again, kindness, like other acts proceeding from the heart, does not fall under the domain of calculation or logical deduction. 


Even the Underground Man, the great rationalist, the ultimate overthinker, realizes at the end of his reflections the futility of his way of life: “But reasoning,” he painfully admits, looking back at a shameful incident with a prostitute, “explains nothing, and consequently there’s no point in reasoning” (124).[2] He comes to acknowledge, as Heidegger pointed out, that the demand to render reasons is itself without reason; the compulsion to explain and provide a reason for everything—the ideal of rationalism—is at its heart deeply irrational.[3] Although we think that reasons provide the ground for a thing, the truth is that rationalizing takes us away from life and from ourselves. As soon as we set upon conceiving the rationale behind something a friend said, a gesture a classmate made, or the act of a stranger, as if each were a code to be demystified, we have become detached from them, attached instead to the realm of imagination; we are farthest from them in thought.


There is a secret order to the heart, an ordo amoris, which cannot in the final analysis be comprehended, but which must rather be experienced. Or as Pascal eloquently put it, “The heart has its reasons, which reason does not know” (Pensées, §277).[4] Haruno further indicates this in the third season. She tells Hachiman that, like her, he “can’t get drunk” (S3E2 09:38). I take this to mean that, like the Underground Man, no matter what he does, Hachiman can never turn off his mind—or, more specifically, the restlessly critical, distrustful voice that is constantly operating therein. The idea is that drinking is a way of quieting the inner voice in our heads; we want a little respite from the judgments we are always making, whether about ourselves or others, and a little intoxication takes the edge off. Yet these three characters—the Underground Man, Hachiman, and Haruno—do not find even this adequate. When the Underground Man has dinner with some old schoolmates, he gets maddeningly drunk, which serves not to defuse his thinking but to fan its sparks into a conflagration. Hachiman only drinks coffee throughout the show, but Haruno, I suggest, is implying that his critical faculty is something from which he cannot escape, something he cannot “turn off”; it is, to echo the Underground Man, a real sickness to think too much, one that no numbing can mitigate.



Notes:
[1] The Myth of Sisyphus, and Other Essays (1991), p. 15

[2] This may seem ironic, if not textually dishonest, considering the bulk of the Underground Man’s criticisms are directed precisely at rationalism, which overlooks and comes at the expense of humanity’s irrational side, its capacity for freedom. However, I defend calling him a rationalist for two reasons: (1) although he praises passion and emotion while denouncing reason, he engages in the latter for most of the book and comically desists from the former, being utterly incapable of acting upon his impulses, which are often impeded by reflection; (2) he can recognize and praise the importance of irrationality only because he has been cut off from it, being a well-educated man and living in an increasingly rationalized, Westernized Russia; it is precisely in his disconnection from his emotions, his analytic approach to them, that he laments this condition, for the rationalism he denounces is not some empty ideal, but his very own rationalism. 

[3] Cf. The Principle of Reason (1962), pp. 11-12

[4] https://www.gutenberg.org/files/18269/18269-h/18269-h.htm

 

Works cited:
Camus, Albert. The Myth of Sisyphus, and Other Essays. Translated by Justin O’Brien, Vintage Books, 1991. 

Dostoevsky, Fyodor. Notes from Underground. Translated by Richard Pevear and Larissa Volokhonsky, Vintage Books, 1994.

Passion and Craft (4)

Read the previous part.


So to what extent is my role as a writing tutor futile? If I help a student with their writing, am I really, truly helping them as a writer? The most realistic answer, I think, is that I most likely have not converted, nor will I convert, anyone into a writer—at least, not within a half-hour session. While the possibility exists, the conditions would have to be much different; a lot of time, enthusiasm, and patience are required—more than I can provide in a single tutoring session to mostly one-time attendees.


Two metaphors are pertinent: The seed and the Sun. Of course, the seed metaphor is clichéd, expected, and perhaps overused: The mentor or tutor is someone who “waters the seed,” who actualizes and makes grow the “seed”—the inner potential—of another person. According to this worldview, every single person contains within them infinite possibilities, but only some of these are cultivated in a lifetime, more care and nourishment being shown to some than to others. We might make the metaphor more realistic by imagining that some seeds begin bigger than others, symbolizing predilections and skills, but the key point remains the same: Any seed, if shown enough attention and watered by the right person, can grow, and this growth, while limited in some cases, is growth nonetheless. Thus, as a writing tutor, my goal is either to “plant” the seed of writing if it does not already exist or to “fertilize and grow” that seed if it exists yet needs the extra attention.


Alternatively, I can act as the Sun: As the provider and source of light, of vision, I highlight and draw out otherwise neglected or unknown aspects. On this paradigm, my purpose consists in transforming the very vision with which one views writing. By altering their perspective, by “shedding light” on certain details, by “enlightening” them, and by letting them see what previously escaped their vision, I can hope to show them what they have been missing, that to which they have been blind. In a sense, I function as an omniscient overseer, since I “circle” the globe and have seen all there is to see, which is passed down as knowledge and maybe even wisdom. To be sure, the two metaphors can be combined, seeing as sunlight is necessary for the growth of a plant, which by nature turns toward the light.


All of this is just to say that it is difficult to get somebody to see the value of an art, but it is possible nonetheless. Writing for the sake of writing makes long stretches of time bearable. This is because when I write, I am concerned with the choices I want to make, the intentions I want to convey, and the effects thereof; I am absorbed in my task, which sometimes, though not always, leads to a flow state. As I write at this very moment, I am putting painstaking, meticulous thought into the exact words I want to use, the syntactical ordering of those words, the transitions between ideas, punctuation, and sentence construction.


Everyone writes differently, of course, and some may find themselves cringing at the way I go about writing, which sounds stilted, forced, unnatural, and slow. Some writers will say that the best, or even the only, way of writing is to “put it all out on the page without thinking,” to “write first, then edit later.” This spontaneous, thoughtless, impulsive method works for many, I am sure, but I cannot do it; as I am writing these sentences, I am thinking along with them, thinking them through, allowing the words to feel themselves out across the page. Long sentences, which I love writing, are a good challenge because they require planning beforehand: I must know in advance to what end my sentence is moving, and then it is merely a matter of removing the roadblocks that get in its way, which might require taking detours, backtracking, and adding more connections. Then I might ask myself whether a semicolon, colon, comma, or em dash is required, because they all do different things. I wonder whether the sentence will read smoothly, whether it has good rhythm, whether any pronouns are vague or ambiguous; I am constantly thinking about how best to express myself, and this is how I lose track of time.


There is evidence of what I am trying to get at in the fact that anyone, regardless of whether they like writing, can recognize what to them is an “elegant” or “beautiful” sentence, an “expressive” or “good” word, a “seamless” or “smooth” transition, an “evocative” or “stunning” description or passage; in other words, even those who do not speak the language of writing, who do not consider themselves writers, and who are not familiar with the jargon of the craft exercise a degree of fluency in it. They may not be able to identify what it is that lends a particular passage or sentence its eloquence, but they know it when they see or hear it—and that, I think, is an amazing, wondrous, and magical thing.


One time, as I was helping a classmate with an essay, I reworded one of their sentences to make it flow better, and they found it remarkable that I could do so. Their reaction struck me as odd and unwarranted, naturally, because all I did was change a few words around, yet it made me elated to know that such a small edit was not only noticed but appreciated and admired. I wanted to tell them that I had not done anything. But the moment made me realize that the labors of a humble writer such as myself were worthwhile and actually had some effect—that effect usually just goes unnoticed. In fact, this is why we so often feel uncomfortable, demurring, “It was nothing, really,” when a skill of ours is pointed out: Intrinsicality, which develops as a result of devotion, practice, and skill, becomes so sedimented that it is interpreted as “natural,” something that inheres in us, no more questionable than the act of speaking, walking, or swallowing; that is, a writer who writes well and is marveled at feels awkward because, as a writer, what else could have been expected of them?

Passion and Craft (3)

Read the previous part.


This widespread idea that people are at birth destined for some skill-based caste may seem very appealing, and perhaps there is more veracity to it than I have considered here; it is a question that will prompt some further research. A famous mathematician is likely to have a child with a knack for math and logic, while a poet will probably have a literary child—this seems to me likely. Yet it could also be the case that the environment in which the child is raised plays a much more significant role, since a mathematician will have lots of books on mathematics, for example, while a poet has poetry anthologies, thereby confounding the nature-nurture question. Again, I have not researched this in depth: These are my immediate, untested speculations.


So for now, suspending judgment on this deterministic picture’s validity, but supposing that it is true, which is what my peers and I like to do—in other words, assuming that we each have predispositions toward different subjects—we can consider some new ideas. If many of my friends confidently see themselves as physicists, chemists, coders, and engineers who have no grasp of words or grammar, then I must proudly and readily declare myself a writer and own the fact that I have a poor grasp of numbers, logical problem-solving, and hands-on practicality. And assuming, as I said, that this is set in stone, that there is little-to-no hope of immersing ourselves in radically different subjects, and that we shall be as foreigners who speak languages unintelligible to one another, does this mean that we must necessarily live such short-sighted, monochrome, and insulated lives?


My own solution comes from thinking of this problem as a microcosm of contemporary political issues, as a matter of fact: In the 21st century, we find ourselves amidst dizzying, overwhelming cultural, demographic, social, political, and informational changes brought on by a shrinking, globalizing world, which force us to contend with incommensurable and contradictory narratives, traditions, languages, governments, religions, and more. In short, we are dealing with pluralism. Atheists, Christians, Jews, Muslims, Hindus, and the adherents of thousands of other religions must coexist; English, Mandarin, Hindi, Spanish, and thousands of other languages must coexist; the United States, Ethiopia, Turkmenistan, and hundreds of other nations must coexist; and so on. My own belief is that cultural relativism, or at least pluralism, with all its flaws and shortcomings, is the only tolerable and tolerant way forward; it is odd and problematic to say that one culture is “superior” to another, that one language is “the best,” or that one religion is “more truthful.” Instead, we are faced with an unnerving, frightening openness. There are many issues with globalization, to be sure, but I think the benefits are important.


Essentially, the optimistic answer to my initial question before diverting to politics—must we necessarily live such short-sighted, monochrome, and insulated lives?—is as follows: I neither think nor hope so. Consequently, returning to the original problem at hand, namely, varying talent in various subjects, I think the key is to appreciate that we are good at certain classes or practices while recognizing that, although we may be bad at others, there is something equally valid and rewarding in those other activities, something that can be learned from them, leading to what the German philosopher Hans-Georg Gadamer called a “fusion of horizons.”


Returning, now, to my original point about being a writing tutor, I will elaborate further on what this pluralism looks like in practice and how I think it makes life more fulfilling, beautiful, and significant. How much time I spend on writing, as any writer will know, is inconstant. People can talk about typing speed, e.g., words per minute, but as is true of any test, such a measurement does not wholly map onto the real world. For instance, timing oneself typing a passage rarely matches the impressive, feverish speed at which the procrastinator writes the day before an assignment is due, or the deliberate, leisurely rate at which the prepared stylist writes for their own pleasure. Depending on what I write, be it fiction or nonfiction, dialogue or description, I can take anywhere from 20 minutes to an hour and a half to fill up one page.


The reason I wanted to write this post is the following event: A while ago, one night, I was writing a page for a fictional work, which usually takes me no more than 30 minutes, yet I ended up taking an entire hour and a half to write that single page. As I was getting ready for bed, I had an imaginary conversation with myself, pretending to be a friend of mine in order to gauge a reaction, and it occurred to me how insane it would appear to someone else that I had spent 90 minutes writing one page—and not even for school or because it was expected of me, but because I actually kind of had fun. Normally, I, like anyone else, would be frustrated spending so much time on something rather straightforward; it seems normal that anyone, if it took more than an hour, would call it quits and take it up the next day when they had the patience and energy; however, I kept going until I had written the entire page. At first, even I was confused by the fact that, against all expectations, I did not consider my night wasted or my work “uninspired” or “not worth it”; it was a genuinely enjoyable experience.


The reason comes down to this: Writing, for me, is a craft. For many, writing is like Hell because it is an assignment, something expected of them, and because many do not find it fun or interesting to begin with, which is understandable. When writing becomes a craft, though, a skill, a passion, something one does for its own sake—writing for the sake of writing, or for the sake of writing well—and which is intrinsically rewarding, things like time and energy do not matter as much as effort. But this does not solve the problem with which I began this discussion: It is great and awesome and wonderful that I should enjoy writing and see it as an end in itself, yet I cannot expect others to feel the same way about it as I do, considering they likely have other interests to which they are devoted, writing not being one of them. At the same time, I want to avoid a form of solipsistic or tribal determinism according to which every single person, because he or she prefers and is better at certain things than others, becomes hyper-specialized in that one thing; otherwise, this discussion would be unnecessary, and everyone would stick to their own sphere. That outcome is obviously undesirable and doomed to failure.

A Summary and Review of Huston Smith’s “Condemned to Meaning” (1 of 2)

In 1965, Huston Smith (1919-2016), a renowned scholar of world religions and then a philosophy professor at MIT, delivered a talk entitled “Condemned to Meaning” for the John Dewey Society for the Study of Education and Culture—later shortened to just the John Dewey Society—which had been founded 30 years earlier. Shortly after, the lecture was turned into a slim book of 90 pages, making for a quick, accessible, and informative read.


I came across the book by chance while browsing my university’s library, and, recognizing the author, and having a penchant for anything that has to do with the meaning of life—the title of the book is taken from the French philosopher Maurice Merleau-Ponty’s Phenomenology of Perception, which itself borrows from his contemporary Jean-Paul Sartre’s famous dictum in Being and Nothingness that we are condemned to be free—I rejoiced at having discovered a text of which I was formerly ignorant; only to discover, upon doing a quick Google search to find out more about it, that so, too, was the Internet! On Amazon, the book has neither description nor review; and on Goodreads, which boasts a database of over 12 million titles, the book did not even exist (until I registered it myself)!


However, one can at least find two academic reviews, one from the Journal of Religion and one from the Journal of Thought [1], that skim its contents in addition to providing some critical comments; yet to seek these out requires that one know about the book in the first place! Accordingly, in this post, I will provide a brief summary, both explanatory and evaluative, of Smith’s much-neglected lecture, in the hope of bringing it to more people’s attention.

Introduction


Arthur G. Wirth, who oversaw the lecture series, states the main contention of the talk in the foreword: “Professor Smith argues that the loss of life-meaning is unnatural for man.”[2] This claim is telling because it is really two claims, which is worth noting since it will clarify the course of the book. First, it is assumed that there is, in fact, a loss of meaning, which also implies that there was meaning beforehand. Second, this state of affairs, in which meaning is absent, is not normal, which suggests a moral claim: Although what is natural is not necessarily good, Wirth seems to indicate that, for Smith, what is ownmost to humanity has been compromised, and so stands in need of retrieval, reclamation, recovery.


Man, by nature—the argument goes—lives by and with meaning, which implies that the historical context in which Smith is speaking, the ’60s, is somehow aberrant, off course; it is his goal to explain and/or solve this crisis. Hence, in his introduction, Smith discusses how cultural pessimism characterizes the post-War world. He compares this zeitgeist with that of the Ancient Greeks, for whom pessimism was a fringe and extreme sentiment—“an occasional Greek,” he says, flirted with antinatalism [3]—in allusion, perhaps, to the wisdom of Silenus and Solon. This seems more rhetorical than accurate, though, given the popularity of the tragic worldview as espoused by Sophocles and Æschylus, and as celebrated by Aristotle and Nietzsche. Nonetheless, Smith is more spot-on when he points to the popularity and growth of Absurdism in contemporary culture, from the philosophy of Camus to the plays of Samuel Beckett, which depict the world as unforgiving, random, and devoid of significance.


The ‘50s and ‘60s, paradoxically, were a time of general thriving, especially in America, with its blossoming middle class, raging consumerism, and expanding higher education. For Smith, this is no paradox: inner poverty amidst outer prosperity is by no means contradictory but correlative, if not causative. He attributes this spiritual crisis to “the impact of science upon religious certainty and of technological progress upon the settled order of family, class, and community,” a perspective which, albeit true to a degree, is overly reductive (likely due to his bias as a religious scholar!) [4].


The anxiety of the Cold War, the bubbling up of the Civil Rights Movement, the assassination of an American president, the push of second-wave feminism, the high expectations of conformity, the sedimentation of corporate culture—historical and social forces like these, all preceding his lecture, are unquestionably contributing factors, although it is unfair to blame Smith, who was in the middle of it all. Regarding “technological progress,” he most likely would have been thinking of the atomic bomb, the threat of which was magnified by the USSR’s arms race and the resulting quandary of mutually assured destruction (M.A.D.), which had come to a head just years prior in the Cuban Missile Crisis; but he could also have had in mind the contraceptive pill or even mass-produced domestic utilities, both of which, in expanding freedom, led to more responsibility and new questions of roles and duties.


Another implicit factor, which I think Smith would acknowledge if pressed, is Nietzsche’s symbolic and notorious announcement of “God’s death,” despite its being an earlier development; for with it, an attitude of laxity and lostness sets in: without a transcendent foundation for morality or values, the nihilism of his and our day, Smith would contend, is only natural and to be expected. To fill this empty space, existentialism awards to humans the freedom to define themselves, to create their own values, a freedom which, as Kierkegaard and Sartre vividly put it, evokes immense angst and anguish. Although this is certainly a better position to be in than the pure empty abyss of nihilism, it still places a burden upon us to actually create these values.

Academia


To date, Smith believes that “meaning” has been given either insufficient or inadequate treatment within different academic disciplines. He highlights three in particular: anthropology, psychology, and philosophy.


  • Being the study of humans and the various ways in which they organize themselves, anthropology shows that civilizations require meanings; in order to succeed and thrive, they must have a sense of rootedness, a foundation from which to build and by which they understand themselves. A people without a meaning, Smith says, is directionless, and without direction, a civilization loses its way and flounders. He makes the provocative statement that we begin by posing answers and proceed from there to ask questions. For example, people create myths to survive, and from these stories, they create a narrative framework under which questions become possible and askable. It is on the basis of a story of the origin of water, for instance, that the question of its purpose can be raised. Yet this apparently insightful comment does not make much sense: an answer clearly cannot precede a question, for an answer is always an answer to something, which requires, first of all, that the something be raised, i.e., called into question. Anthropology, in short, teaches that meanings are vital.
  • Psychology, because it is the study not of collective humanity but of the individual human, arises in the midst of individualism. As soon as the collective grounding of meaning was lost, having been subjected to critical and historical analysis, resulting in a condition of anomie in which norms no longer hold up, psychology took it upon itself to provide help to those struggling to find value in their lives. Hence, Smith highlights the contributions of Viktor Frankl, the existential psychotherapist and Holocaust survivor who invented logotherapy, which declares the will-to-meaning to be man’s innate drive. Meaning, then, is not a luxury but a necessity; a “why” is needed to live on.
  • Lastly, contemporary philosophy, with its split between Analytic and Continental, has its own take on meaning—or rather, two takes. To oversimplify, where Continental philosophy concerns itself with existential meaning, Analytic philosophy concerns itself with semantic/logical meaning. So a philosopher like Albert Camus will ask what it means for my life, or life in general, to have any meaning, but Rudolf Carnap will ask what this “meaning” means, that is, what significance, if any, such a proposition has. It is evidently Continental philosophy that is more closely aligned with Smith’s problem; however, rather than view Analytic philosophy as misguided or useless—which it is not—Smith wants to apply its methods to Continental themes. While Analytic philosophy can get bogged down and restricted in its devotion to empiricism and cognitivism, its tools of inquiry can be carried over for better clarity. Such is Smith’s goal: to look at existential meaning through an analytic lens.

Analysis


He proposes four dichotomies that apply to meaning and, by distinguishing between their poles, arrives at a more precise conception of it.


  1. Meaning can be either atomic or global: does meaning relate to part of existence or to the whole of it?


  2. Meaning can be either extrinsic or intrinsic: is the meaning of existence outside of and “for” something else, or is it inside of and for itself?


  3. Meaning can be either articulable or inarticulable: can meaning be expressed in words, or does it remain outside the realm of expression? Here, Smith posits degrees of communicability. Some things are fully articulable and can be put into words; others are merely tacit, meaning they are felt in some way, intuited, yet they resist speech; and others are what he calls “subceptual,” indicating that they are unconscious, making them “unknown knowns”—things we do not know we know. These degrees, furthermore, are convertible, as when we attempt in painting to get across our unnameable impulses and feelings, or when in therapy a psychologist tries to bring out an unacknowledged truth.


  4. Meaning can either be individual or generic: is meaning unique to me alone or can it be communal and shared, perhaps even historical?

The Problem of Meaning


With these specifications made, Smith asks the official question that will guide his discussion: “What, insofar as it can be stated (rendered articulate), is the meaning of human life (global) considered in its own right (intrinsic) and as pertaining to all who live it (generic)?”[5] Additionally, as up until now he has not actually stated what he means by “meaning” except by saying that it is existential, he clarifies, “Is there a purpose which, if realized, would render life clearly worth living?”[6] Combining the two, we get a more exact guide: What, insofar as it can be stated (rendered articulate), is the purpose which, if realized, would render life clearly worth living (global) considered in its own right (intrinsic) and as pertaining to all who live it (generic)?


At this point, Smith takes on the mantle of the Enlightenment philosopher Immanuel Kant by proposing a critique of the meaning of life, critique being understood not as negative criticism but as positive analysis. In fact, he borrows Kant’s notion of categories, but instead of using them as the conditions of experience, Smith uses them as the conditions of meaning. What elements are needed, in other words, for meaning to exist? Unlike Kant, who enumerated 12 such categories, Smith mentions only five as constituting a theory of meaning.


  • (1) First, meaning requires pain and suffering. Alright, then. It is a brutal and honest start, to be sure, but it makes sense. Many philosophers, from the Buddha to Schopenhauer, have maintained that the one truth of life is the fact of suffering. Life can be, and often is, unpleasant. Such an acknowledgment is important because, without it, one would be lying; any account of existence that excludes suffering as a key component is misleading and false. Yet it is the existence of pain as a first category that makes the second possible.
  • (2) Hope. A philosophy which espouses only the primacy of suffering and nothing else is just as incomplete as one without suffering at all—that is, if one actually cares about achieving meaning. Suffering is a good starting point because it assumes an ending point, namely, the end of suffering. With such a goal and a means of working toward it, like a path to follow in the case of the Buddha, human transcendence, i.e., progress and making choices, becomes possible. There is some good awaiting one at the end, a source of salvation; however, this good need not be otherworldly, like Nirvana, Heaven, or union with God—merely a cessation, or at the very least a mitigation, of suffering. Tied to this is the third category, which Smith names endeavor.
  • (3) Endeavor. If I have a goal, an end to suffering, then I can work toward it. This presupposes that I have the capacity to do so; there is no point in having a goal which cannot be reached even if one tries. Such a philosophy would be hopeless by definition. “The capacity for intentional self-transcendence,” Smith declares, “is the chief attribute that divides man from the lower animals.”[7] We humans are unique in that, even if animals have consciousness, the difference is one of degree: we can envision a future and in most cases make it a reality, self-transcendence being the capacity to change oneself and one’s situation by means of projecting—creating projects and throwing ourselves after them, so to speak. To imagine a better state of things is already a step toward it. Next is trust, the fourth category.
  • (4) Trust. A meaningful life is one in which I have assurance not only in myself and the world but also in this assurance being reciprocated. Trust is a relational term, so it requires that the one who trusts be trusted in turn. Otherwise, if in doing something good I do not feel fulfilled, if the world actively pushes back and resists me, then such a life is not sustainable. The world may, in fact, be actively resisting me, but the key to trust is maintaining it in spite thereof: resilience is a matter of having faith.

[1] Cf. sources below. (If access is blocked, contact me for pdfs.) 
[2] Smith, Condemned to Meaning, p. 10.
[3] Ibid., p. 13.
[4] Ibid., p. 15.
[5] Ibid., p. 41.
[6] Ibid., p. 42.
[7] Ibid., p. 49.


Sources:

Christian, William A. The Journal of Religion, vol. 46, no. 1, University of Chicago Press, 1966, p. 56, http://www.jstor.org/stable/1201227.

Moore, Thomas D. Journal of Thought, vol. 2, no. 4, Caddo Gap Press, 1967, pp. 59–60, http://www.jstor.org/stable/42588045.

Smith, Huston. Condemned to Meaning. Harper & Row, Publishers, 1965.

 

Passion and Craft (2)

Read the first part here!


Accordingly, it strikes one as strange and impossible that a mathematician, a physicist, a musician, and a yo-yoer should all stand before a crowd of unshaped, directionless children and, with equal confidence, state matter-of-factly, “My path is the best”: “Math is the highest pursuit in life,” “Physics is the highest pursuit in life,” “Music is the highest pursuit in life”—”No, yo-yoing is!” We know none of this is true, though; even to the most inexperienced and naïve child, this state of affairs is sense-defying. What each speaker really means, and what we understand them to mean, even though it goes unstated, is, “My path is the best for me.”


But this is common sense. We surely understand these adults to be speaking subjectively, so it is absurd to expect them to append so trivial a phrase to the end of their speeches as though it were a much-needed disclaimer. If one occupation or passion were truly the best, greatest, and most meaningful, then we would expect more people to be pursuing it. But even this is a childish claim: not everyone can do what is most meaningful, or what they care most about, since there are jobs in society that need to be done and that are often neither desirable nor pleasurable. Both of these are fair and evident considerations, yet I do not intend to focus on them; instead, I will set them aside and express in more detail the uniqueness of one’s calling by sharing my recent thoughts.


This year, for class credit, I have been working in my school’s tutorial center as a writing tutor. My responsibility is to assist students with their English assignments, typically essays. For what deeper reason, beneath the surface-level motive of earning credit, did I choose to be a writing tutor? Simply and naïvely put, I wanted to get my fellow classmates to enjoy, or at least understand, their English classes. Ever since middle school, nearly every one of my contemporaries has consistently rated history as the most boring class and English as the most hated, the two opposite explanations being either that “We speak and write in English every day, so there is no reason to have an entire class on it” or that “I barely understand our language or what I’m saying.”


Then, in response to protests that English is useless and a waste of time, English teachers eagerly and desperately defend their class by arguing that, on the contrary, it is the most practical in everyday life, since we are expected to write and speak well regardless of the field we end up entering. This claim is then summarily attacked by students who are fed up with analyzing the symbolism hidden in an elaborately described doorway, the passing of nighttime, or a character’s name; explaining why an author used repetition in one place, emphasis in another, and rhetorical questions at the end; or having to engage in a “class Socratic” where participation is required and which is about a book that nobody cared to read or understand, that has no relevance to daily life in the 21st century, and that is difficult to read—even if, as they occasionally admit with sourness, the book was actually a good one, the fact that it was assigned to them, that they did not choose it but had it imposed upon them, being enough to tarnish, pervert, and obstruct their experience of it.


These are just a few of the most recurrent complaints about English, and I do not intend to answer any of them; in fact, even I, who have rated English highly and thoroughly enjoyed it throughout my schooling, resort to some of these criticisms from time to time, as I did last year, when I had to write rhetorical analyses, closely examining word choice, context, figurative language, etc., in speeches and other writings in order to demonstrate how the author accomplished some specific task. I certainly do not miss rhetorical analyses, and there have been books which I disliked and yet on which I had to write essays and hold discussions with classmates.


What I find problematic yet unavoidable is that the diversity of experiences in English class is thus attributed to some deterministic aspect; that is, whether one has a good time or does well in English is taken to depend upon the type of student one is, some people “just being good” at it while others are not. Those who have a good experience are therefore singled out as the exception because they have something in them, some quality—perhaps genetic, perhaps temperamental, perhaps superhuman—that sets them above the rest, distinguishing them as “those who do well in English,” as though it were a conspiratorial secret which only a few elites know but which is, and shall remain, forever inaccessible to those outside.


Writing well and speaking well are enviable skills that seem as far off as competing in the Olympics, and those who enjoy the class must surely be naturally talented; hence, it seems hopeless and futile for everyone else to “train,” as it were, or even to have hope, since “that is just how it is”—namely, they will never be good writers nor do well, writing being “not one of my strengths, fortes, or talents.” The sports analogy runs much deeper and is actually more apt than we might imagine, inasmuch as talent is overrepresented and overvalued in athletics just as it is in writing. The supposed heritability of the two quickly deconstructs when we examine it further: Inheriting “athleticism” is a vague, misleading notion, since athleticism is nothing but a set or bundle of other inherited variables, like height, weight, body composition, and coordination, somewhat like general intelligence (g); likewise, “writing/speaking well” is not some specific, discrete code that can be passed down, but is better thought of as a cluster of associated traits, like creativity, comprehension, and critical thinking.

Prospective vs. Inspective Thinking

When it comes to doing work, whether it be tedious or demanding, it is hard to get started and put ourselves in the right mental space to “get down to it.” Faced with a task we would rather not do, or one which, when we think about it, seems difficult and unreasonably long, we would rather put it aside, not think about it, and procrastinate; then, the day before it is due, we are infused with febrile adrenaline and hastily complete it. Especially nowadays, for a lot of students like me, when school is online, when accountability for work is placed solely on us, and when we have all the free time in the world, it can be difficult to jump into our work. In this post, I want to briefly introduce, by way of contrast, two ways of conceptualizing our approach to work that I have observed—namely, prospective and inspective thinking.


I’ll give two examples of homework assignments I’ve had. Last year, in Spanish class, my homework would sometimes consist of researching some topic and then creating a presentation on it, and the presentation would have a bunch of requirements, like (a) “it must be at least 10 sentences,” (b) “you must include five vocabulary words from this unit,” and (c) “you must include three examples of a specific verb conjugation.” This year, in my AP U.S. Government class, we had to find a recent news article, write two sentences explaining its credibility and at least six sentences explaining its relevance, then ask a question to which our classmates could respond. I use these two examples because, to my mind and my fellow students’, assignments like these, despite being summarized rather easily, can (and usually do) appear exhausting and, if we are feeling hyperbolic, impossible.


10 sentences—so a paragraph! And in Spanish?! Both my classmates and I pretty much always responded like this, filled with dread upon its being assigned. We would joke that we could hardly write a paragraph in English, so to have to do it in Spanish!—well, it couldn’t possibly be done. As our eyes flitted over the numerical requirements—five of these, 10 of these, and three of those—we groaned and agonized, not knowing how we would get through it, how we would even begin. It seemed a Herculean task. However, through the years, as I’ve completed more and more assignments, combating procrastination and overcoming feelings of futility and uselessness, I have observed in myself that there really are two ways of approaching work, as I have mentioned, though I found myself, and others, primarily stuck in the first. And I have found that, once the two are clarified, assignments which appear long and difficult tend not to take as long or require as much effort as I thought.


So to the two modes I now turn.


When I am assigned homework, I begin by prospecting—searching and surveying, planning, taking different elements like time and effort into account. This is where I often become stuck, though. The real breakthrough comes when I begin inspecting the assignment—engaging with it, working with it, recalling what I know, making progress. To further elaborate these two modes, I will view them through different categorical lenses.

Transcendence vs. Immanence

Prospective thinking is transcendent; that is, it views things from outside, from above, like a bird’s-eye view, taking the thing in as a whole. It represents the work to be done, creating an image of it, which is what we fret over; our mind amplifies it, enlarges it, and so it takes on an intimidating quality. Looking at my Spanish assignment, I go beyond the parts to what it is as a whole, and this yields something unmanageable, like a behemoth, a massive project that defies my grasp. In contrast, inspective thinking is immanent: Instead of looking at the project from beyond, it looks at the project from within, as it actually occurs within the bounds of what is asked. One does not stand outside, observing objectively and detachedly, like a spectator circling it; instead, one walks into it, feels one’s way through it, navigates it on the ground level. This means that I am not thinking about the project as a unified whole, but going through each part, one by one, as something in itself.

Infinitive vs. Progressive

Another way to conceptualize prospection is in grammatical terms, through the infinitive form of the verb, which takes the form of “To…” For example, I say to myself, “I have to write 10 sentences in Spanish.” From this perspective, my assignment consists entirely of infinitives; this means that something is expected of me, and these expectations have an indeterminate sense. When I say, “I have to write 10 sentences in Spanish,” there is that residual sense of incompleteness, of deferral; I have not yet written these sentences, but their completion, and the obligation it demands of me, looms, hangs over me, beckons—I have to x, I have to y. In this way, the project becomes overbearing.


Inspection, on the other hand, is characterized by the present progressive form, which takes the form of “-ing.” For example, I say to myself, “I am writing one of my 10 sentences in Spanish.” Here, the difference between the two is at its greatest, because one of them, prospection, is anticipatory in that it awaits an action to be completed, whereas the other, inspection, consists of the action itself, the engagement with the requirement. It is one thing to think that one has to write a sentence, which can be done whenever one wants, and another to realize that one is writing a sentence. Of course, before the sentence can be written, it has to be thought; and here, too, the distinction applies, for one person can sit and wonder, “What am I to write?” while another can think the sentence improvisationally: “Cuando yo… asisto a…” and proceed from there.


In short, the infinitive, by virtue of its indeterminate incompleteness and sense of obligation, lends a sense of infinity to one’s thinking, to the extent that the task can never be accomplished, only idly planned and mulled over; whereas the progressive, because it unfolds in time, in the present, is progressive—that is, it progresses, moves forward, continually, even if only an inch at a time.

Thinking About vs. Thinking Through

This is the simplest and clearest way of differentiating the two modes of cognition. In prospection, one thinks about a problem. Phrased accordingly, it sublates—combines and upholds—the previous two determinations of transcendence and infinitude. Etymologically, both prospection and inspection derive from the Latin root specere, meaning “to see,” so they refer to how we see and conceptualize the world and our projects in it. Where they differ is in the prefixes: Prospect has the prefix “pro-,” which can mean “forward” and “outside.” Thus, to prospect is to “look forward (to),” and this can be understood both in terms of space and time. Thinking of my Spanish assignment, I prospect it, I look forward to it, which now means one of two things: Either I am planning for it, thinking of it in advance, oriented toward the future, which is uncertain; or I am viewing it from outside, from afar, as if through binoculars—ultimately, the two meanings reinforce each other because I am at a distance from it, I distance myself from what “needs to be done.” Therefore, in prospecting, I can never hope to do my work because it is transcendent to me, because I transcend it, and because my futural grasp of it means it is infinitely undone, awaiting my initiative—in short, I am merely thinking about my Spanish project, which does nothing: It is impotent conjecture and imagination.


Or I can inspect my Spanish homework. The prefix “in-” is self-explanatory: It means “in,” “within.” Unlike the prospector, whom one imagines to be a miner in California during the Gold Rush, searching desperately for gold, though with bad odds, the inspector is like a detective who is holding not a metal detector, but a magnifying glass; and who is not looking for an object, but who, having already found the object, is examining it, probing it, analyzing it. Here, work is being done, and one is not stressed about one’s project; rather, one is immersed in the project, which is, if not somewhat enjoyable, at least engaging, taking up the whole of one’s focus. When I inspect an assignment, I am looking into it, exploring it like an adventurer or explorer.


Although both the explorer and the prospector are united in their search, the act of “Looking-for…,” there is a subtle difference between the two: the prospector depends on his find to justify himself, being extrinsically motivated, while the explorer is concerned with what she will actually find, being intrinsically motivated. Tasked with writing a sentence in Spanish, I am confronted with these two options: I can think about the sentence I will write, or I can think through the sentence I will write. Suppose, though, that I encounter a block and cannot think of a specific vocabulary term. If I merely think about the word, then I will rarely, if ever, find it; the word becomes a fog, vague and amorphous, drifting through my mind, eluding my grasp. Or I can actively inspect the situation, grabbing my Spanish dictionary and flipping through it, looking at the words in order to determine which will fit best. This means I have to actually do the work.


To use yet another example, I can refer to my Physics homework, which can often be daunting. Opening up my textbook and flipping to the designated pages, I read the problem I have been assigned: Oh boy. How will I ever solve this?! I think to myself. It seems impossible. I’ll never get it done. In fact, now that I think about the problem even more, I realize that I’m going to have to do it in parts, and I realize that I’m going to have to use a bunch of the formulas I learned throughout the chapters. But having the formulas is not enough: I must also know which values to substitute for the algebraic symbols, then relate them to the diagram, and then x, then y, and then… It’s no use! This’ll take me an hour to do! Notice, though, how I used phrases and words like “think about,” “going to,” “have to,” “must,” “then,” all of which are temporal and obligational. In thinking prospectively, I put myself outside of the task—stand outside it, before it—and it becomes an unruly monster that must be slain, but whose magnitude prevents me from doing so; I set before myself a bunch of expectations, things “to do,” which intimidate me, prevent me from starting. Viewing it as a whole, and predicting that it will take me an hour, I think it hopeless, and will likely put it off.


Alternatively, I can decide with some willpower to begin inspecting the problem, thinking through it rather than merely about it. Thus, I write out the variables and formulas I know, draw an image representing the scenario, and begin relating them to each other. These individual acts are all things that I first know and, second, know how to do. If I forget a formula, then I flip through the book; if a variable, then I reread the problem closely. I inspect each item. I do not get caught up in the entirety of the problem, but concern myself with what I know how to do, piece by piece, building from the bottom up. It is a matter of taking a deep breath and taking the leap. It is not enough to think “about” the thing; I must get up close, I must get within it, travel through it, as along a path, rather than flying above it, detached like a bird. Physics and Spanish must be done, not thought about. Prospecting leaves one in a ditch, where one becomes stuck; there is an impasse against which it is powerless. Sizing up the embankment will not get rid of it; rather, one must take up a pickaxe or spade and set to work.

Conclusion

As with any other contrast, there must be nuance. If we were merely to think about prospection and inspection, then we would hastily conclude that inspection is superior to prospection and that we should only inspect problems to solve them; however, if we go within the contrast, that is, if we think through these two modes of cognition, then we realize that theirs is a dialectical relationship: Inspection cannot exist without prospection, and vice versa. Both elements are needed in order to work. Without prospection, inspection would be blind. In order to do something, we need to know what we have to do. We need the aerial view in order to chart a clear path and see how the parts relate to the whole. Identifying the problem in the first place, then determining what it involves and what is required, is integral to problem-solving. Equally true is that prospection without inspection is like a candle without a match: It cannot light, and nothing will come of it. The real thesis, then, is not that we should do away with prospection in favor of inspection, but rather that we ought not mistake prospection for inspection or neglect the latter altogether; in other words, thinking about a problem is not the same thing as doing the problem.

Police Brutality (2 of 2)

Read part 1 here.


A Bad Barrel

However, attributing police brutality to “just a few bad apples,” pinning it on a couple of individuals who deviate from the norm, is not adequate and, frankly, ignores the fact that the problem is systemic. Given that individuals go through a screening process to become cops, and setting aside any faults which may be found in them, the question asked above—do violent people become cops, or do cops become violent people?—suggests that we need to look at the institution itself and its attendant problems.

Roles


In the early ’70s, the social psychologist Philip Zimbardo conducted an (in)famous experiment: the Stanford Prison Experiment. Gathering a group of students to serve as his test subjects, he randomly assigned them to two groups, guards and prisoners. After just six days, Zimbardo had to stop the study, for it had gotten out of hand: The guards, though psychologically stable, were equipped with uniforms, batons, and reflective glasses—providing a sense of anonymity, thus disinhibiting them—and became increasingly violent and abusive, both physically and emotionally, toward their fellow students, who were themselves dressed in prison uniforms, shackled, and identified by numbers, dehumanizing them.

Police Brutality (1 of 2)

The murder of George Floyd on May 25 at the hands of a police officer, while his fellow officers stood idly by, was worsened by the shooting soon after of Rayshard Brooks and a whole host of others, invoking the painful memories of the past murders of Breonna Taylor, Michael Brown, and Eric Garner, to name just a few. In light of these events, and in reaction to the violent law enforcement response to protests for the Black Lives Matter movement, people who are upset with these unjust acts, perpetrated by the very people supposed to protect us, are seeking radical change while asking themselves constantly: Why does this keep happening? Why does it seem that police never learn, that police brutality continues, and that justice is kept deferred? Why, with all that is happening, does senseless, unnecessary violence haunt the streets and neighborhoods of America? When people of color fear for their lives every day, when concerned citizens who are freely and peacefully protesting are shot at with rubber bullets, gassed, and beaten, and when safety is threatened by those who swear to ensure it, there will be cries of ACAB—All Cops Are Bastards—and calls for the defunding, and even abolition, of the police. In chaotic times like these, we look for reasons behind it all: What could possibly drive someone to act so cruelly? So in this post, I will be exploring (by no means exhaustively) the question of why police tend to resort to brutality.

TikTok and Trends

In the absence of school, work, and other obligations, and in the presence of our devices, which for the time being are our only means of access to our friends, family, and the “outside world,” what better way to spend one’s newly acquired leisure time than to lie on one’s bed and entertain oneself by scrolling through one’s TikTok feed, watching the latest trends as they play out on the “For You” page? Whether this is better classified as “using” or “wasting” one’s time, for many it is their only way of staying sane; love it or hate it, TikTok serves as a community in these times, an outlet where people can interact with others and express themselves, get a laugh, or maybe make new friends. During times of crisis, we look for comfort in humor and in other people. And when you pair this with the fact that everyone is locked in one place with nothing better to do, you get a recipe for immense productivity and creativity, everyone looking to outdo each other in their jokes and skits. As a result, we witness dozens of trends on TikTok, some funny and original, others not so much, but all of them united by one thing: time. That is, while the content might differ dramatically, it is the form, or character, of trends that remains universal, namely that they all last for a brief period before “dying out,” becoming unfunny and overused, then abandoned. In this post, inspired by a TikTok live stream, I want to explore what a trend is, what role TikTok plays in trends, and what makes trends problematic.

What are trends?


What is a trend? The answer would appear obvious, seeing as we have all experienced trends. It is, simply, a temporary popular movement; it is when a lot of people like something for a short period of time. However, we can also get technical because, on the sociological level, there are different ways of classifying collective behaviors. For example, we might now ask, “What is the difference between a trend, a fashion, and a fad?” Some will answer that a fashion is more historical, a fad more crazed, and a trend more lasting. Right away, though, we come up against the conflict of the lay and the educated: often, our attempts to classify, that is, to be scientific, are opposed to the way we experience things as they really happen. In other words, language is shared and, for lack of a better word, ordinary; rarely would we stop to consider and debate the merits of a fad versus a fashion. In everyday life, we do not speak so precisely. This ambiguity is evident in the way we speak for the most part: we say that a video “is trending,” or there is a “trending hashtag,” or it is “fashionable to….” It would seem, then, that a classification is not appropriate here. Again, we settle with the common consensus in saying that a trend is a short-lived burst of attention and attraction to a behavior or appearance. All trends tend; each movement is directed toward something, follows a course.


To explain how this collective behavior comes about, we can look to one of the founders of crowd psychology, Gustave Le Bon, who in 1895 published The Crowd, initiating the academic interest in mass movements. According to Le Bon, a crowd is distinguished from an ordinary group by two criteria:

  1. deindividuation
  2. the law of mental unity.

In order to be a crowd, the members of the group must give up their sense of personhood and have a common purpose. Hence, numbers do not matter; a crowd can be three people or it can be 50, just as long as it believes the same thing. Our idea of mob/herd mentality, or of a "hivemind," originates from Le Bon's work, in which he writes that the group assumes a collective mind, one that speaks for everyone involved. The collective mind is like the Leviathan in the English philosopher Thomas Hobbes' political theory, the monarch who, by representing all individuals, thereby takes away their freedom. Since it is a "collective," this mob mentality is greater than the sum of its parts, making it an entity of its own. No longer do the members make their own decisions; the collective mind makes them for them, and they obey it. It is as if each member dissolves into the collective.


Clicking on the sound of a TikTok, one sees everyone else who has used that same sound, and sees, more importantly, the repetition which occurs. It is usually the case that, as one scrolls through the “For You” page, one skips over the videos without much thought; it does not matter to us who made the video, unless, for some reason, it makes an impression on us; but what this shows us is that every single person who contributes to a trend on TikTok is essentially forgotten, overlooked by the bigger figures like Addison Rae, so that it would seem they are but a part in a big machine that rolls on without them. They are mere footnotes in trend history.


Another thing Le Bon observed about crowds is their susceptibility to influence, which is made possible by irrationality. It is very easy, he said, to use specific words in order to bring about action. Words are powerful because they conjure up images, emotions, and connotations. We act "as if [short syllables] contained the solution of all problems," Le Bon wrote (The Crowd, 96). These "short syllables," moreover, are more powerful the vaguer they are. When we think we know what a word means, when it awakens an association within us, we are subject to manipulation. Someone can easily shape a crowd's perception by abusing language, cloaking or redefining a word—e.g., chivalry devolves into "simpery," making an otherwise-positive gesture negative—a problem to which I will return later.


The most important implication of the crowd, though, is its attitude toward truth. This is particularly problematic today because we are living in a post-truth era, in which objectivity is discarded. Not only are crowds inherently credulous, but the added skepticism of our age worsens this tendency. As such, this psychologically and now historically conditioned disregard for truth endangers our communication. Only ignorance can follow.


Of course, Le Bon was writing over 100 years ago and, since his time, we have come up with more updated theories of social behavior, like emergent-norm theory, according to which a crowd will form when we are confronted with a confusing situation and need a strong principle to follow, and value-added theory, which states that crowd formation requires

  1. awareness of a problem
  2. tension
  3. common beliefs
  4. provocation
  5. organization
  6. reactivity.

Both of these sociological theories try to develop Le Bon’s by rationalizing individuals’ behaviors. 

What is TikTok?


Now we can look at the exact role that TikTok plays and how trends work there. To do this, it is important that we understand the function of TikTok. As a social media application, TikTok assumes its role as an extension of the public sphere. The public sphere is where we interact with others. Schools are a form of the public sphere because, in between classes, we get to talk with our peers and socialize. A better example would be any city, as that is clearly "public"; we are able to go to a coffee shop, order a coffee, and immerse ourselves amidst other people. But sociologists see the public sphere as doing more than just allowing us to socialize; fundamentally, the public sphere allows for socialization, "the lifelong process in which people learn the attitudes, values, and behaviors appropriate for members of a particular culture" (Schaefer, Sociology, 9th ed., p. 99). Put another way, the public sphere is where we are educated culturally and socially. Thus, we can see how TikTok might perform this task of socialization, because it brings a bunch of people together in one place to learn and enforce what we should and should not do.


However, it might seem strange to describe TikTok as a public sphere—and rightly so. Earlier, I described it as an "extension of the public sphere," which is more accurate. In fact, TikTok is unique because it constitutes a new sphere, what we would call the cybersphere. See, unlike a school or a downtown plaza, TikTok cannot be located on a map; I cannot say, "I'm going to TikTok to see a video." Unlike the public sphere, TikTok's cybersphere is virtual: it is spaceless. Recently, sociologists have accepted that crowds can form without their members being in contact with one another (recall that Le Bon discounted quantity). Crowds are a type of "secondary group," a gathering of people who do not know each other, are not close, and do not meet up frequently. TikTok users come from all over the world, and TikTok, while being a social media app, is not like Instagram or Facebook, which try to develop connections; rather, it operates on short, impersonal interactions.


One consequence of this is anonymity. Le Bon said that a crowd consists of deindividualized members, people who, in joining the crowd, lose their self-awareness. Likewise, on the Internet, or on TikTok, users (the fact that we call ourselves "users" demonstrates this very impersonality!) can create their own profiles, which means making up a name for oneself, ridding oneself of one's identity. At school, people know our names, know who we are; online, however, we are a blank slate, so nobody can hold us accountable. This is what makes cyberbullying prevalent: we cannot be held responsible because nobody knows who we are behind a screen. Putting this all together, one comes to a frightening thought: if the cybersphere simultaneously socializes—tells us what to value—and deindividualizes—takes away responsibility and selfhood—then to whom are we listening, and from where are we getting these so-called values? The psychoanalyst Erich Fromm called this "anonymous authority"—when we adopt values from seemingly nobody. After all, we can say that a trend on TikTok is perpetuated by individuals, and perhaps put together a chronology of who said what when, but at the end of the day, the truth is that it is not just one person to blame; on TikTok, values are truly anonymous (the word literally means "without a name").


Yet we can still add to this because Le Bon noted that a crowd is led; any crowd requires an opinion leader, someone popular or respected whose voice galvanizes. One of the core values of many TikTokkers is originality. People who use TikTok scorn those who copy something without crediting the creator. The original poster, the trendsetter, the one who sets the trend in motion, thus assumes the role of opinion leader. An example should suffice: the use of “Simp Nation” started by polo.boyy quickly spread, with many making their own spin-offs and commenting on others while tagging polo.boyy asking, “Is (s)he valid?”—i.e., do they live up to the original? Let us explore another aspect of TikTok now.


In sociology, gatekeeping is the process of filtering information. Media like CNN and Fox, for example, are gatekeepers because they let in certain information based on their agendas while blocking other information from getting through. CNN is more liberal, Fox more conservative, so they approve of different norms, which influences their output. Gatekeeping exists to protect and perpetuate dominant ideologies in a culture, beliefs that are held by powerful groups and which allow them to hold power over others. This leads to the oppression or silencing of certain minorities in most cases. So is TikTok a gatekeeper? At first glance, it would appear not. The question seems extreme. TikTok is not a news organization, you might say, so there is no need to block things. But is that so?


A look at the algorithms should tell us… only, we cannot look at them because TikTok, run by a Chinese company, does not make its algorithm public. However, efforts have been made to understand at least a little about it, such that we know it operates according to a process called "collaborative filtering," which makes predictions based on our past history as well as on what similar users like. The videos that appear on our "For You" page are therefore curated in ways we cannot fully inspect. Several experiments have shown that, based on one's liking tendencies, certain viewpoints become favored. This seems like common sense. What makes this troublesome, however, is the blurred distinction between description and prescription: is TikTok recommending things that we really like or that we should like? Is it just building off our preferences or imposing its own? Does it describe us or prescribe to us?
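Since the real algorithm is secret, here is only a toy sketch of what user-based collaborative filtering generally means (all data, names, and numbers here are invented for illustration; this is not TikTok's actual method): a user is recommended the videos liked by other users whose liking history most resembles theirs.

```python
import math

# Toy "like" table: 1 = user liked the video, 0 = did not. All data invented.
likes = [
    [1, 1, 0, 0, 1],   # user 0
    [1, 1, 1, 0, 0],   # user 1
    [0, 0, 1, 1, 0],   # user 2
    [0, 1, 1, 1, 0],   # user 3
]

def cosine(a, b):
    """Similarity between two users' like-vectors (0 = nothing in common)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def recommend(user, likes):
    """Pick the video `user` hasn't seen that the most similar users liked."""
    sims = [0.0 if u == user else cosine(likes[user], likes[u])
            for u in range(len(likes))]
    n_videos = len(likes[0])
    # Score each video by the similarity-weighted likes of other users:
    scores = [sum(sims[u] * likes[u][v] for u in range(len(likes)))
              for v in range(n_videos)]
    return max((v for v in range(n_videos) if likes[user][v] == 0),
               key=lambda v: scores[v])

print(recommend(0, likes))  # → 2
```

Notice that video 2 is surfaced for user 0 purely because users with overlapping tastes liked it; no judgment about the video itself, or about what the user "should" see, enters the calculation at all.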


On the one hand, we users are responsible for what we choose to like and dislike, which influences what we see; on the other, it is possible for the algorithm to disproportionately impose certain views on us, regardless of our liking for them—it assumes our likes, in other words. Just because I like a video that happens to be conservative, for example, does not mean that I like conservative content. Shouldn't the algorithm be based on providing us with new, fresh, funny, and original content instead of categorizing us? As a result of collaborative filtering, TikTok creates "filter bubbles," specialized niches that, in accordance with gatekeeping, block us from certain things, exposing us only to what has been selected. We can easily become trapped in these bubbles, unable to escape. The app becomes a positive feedback loop: when I like one thing, it shows me a similar one; liking that brings me more and more of the same. It is easy to see how views become polarized on TikTok.
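The feedback loop just described can be caricatured in a few lines. This is a deliberately crude model with invented parameters, not a claim about TikTok's real system: each served video nudges the user's estimated taste, and each interaction narrows the range of viewpoints the feed is willing to serve.

```python
import random

random.seed(0)

# Each video carries a 1-D "viewpoint" score from -1 to +1 (invented scale).
catalog = [random.uniform(-1, 1) for _ in range(10_000)]

def feed(estimate, width):
    """Serve a video whose viewpoint lies within `width` of the estimate."""
    pool = [v for v in catalog if abs(v - estimate) <= width]
    return random.choice(pool) if pool else random.choice(catalog)

estimate = 0.1   # the user starts with only a slight lean
width = 1.0      # ...and a feed spanning most of the spectrum

for _ in range(50):
    video = feed(estimate, width)
    estimate = 0.9 * estimate + 0.1 * video  # taste drifts toward what is served
    width = max(0.05, width * 0.93)          # each interaction narrows the pool

print(round(width, 2))  # → 0.05
```

After fifty interactions the feed's breadth has collapsed from the whole spectrum to a narrow sliver around the user's drifting estimate: the filter bubble in miniature.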


Polarization is something that ought to be taken seriously. Many of us, when we hear the word "polarize," think it means "to break apart," which is indeed one of its meanings—"to fragment." However, its psychological meaning is much more important. Already, within the word, we see "polar"—think North and South Pole. The phenomenon of group polarization, therefore, refers to radicalization; it is when groups become more extreme in their original views. Just as the North and South Poles are opposite each other, never able to meet, so polarized groups sit at opposite extremes and refuse conciliation. Polarization is a matter of pulling apart, of dividing. Consequently, we form in-groups with which we identify and out-groups which we designate as the enemy. Collaborative filtering creates filter bubbles, which polarize individuals into groups, creating an "us vs. them" mentality. Thus, we see the inevitable introduction of identity politics into TikTok.

Case study: sexism


Now, I wish to demonstrate what has just been said through an illustrative case study based on some of my own observations, in the hopes of providing insight into the collective behavior that takes place on TikTok. Specifically, I want to look at the case of sexism on TikTok. To begin, what is sexism? I shall define it as prejudice, or negative appraisal, toward members of another sex. Sexism is believing one sex is superior to another. (It should be noted that while it can affect men, sexism is primarily directed toward women.) Furthermore, one of the things which distinguishes sexism from other -isms and -phobias is its ambivalent character. Researchers contend that there are two types:

  1. benevolent sexism, based on patronizing and diminution
  2. hostile sexism, based on explicit hatred and discrimination.

Feminist theory seeks to critique society from the viewpoint of women. While it identifies many problems, here are three examples:

  1. women are underrepresented in education, politics, and more
  2. media proliferates stereotypes about women
  3. society enforces roles and male-dominated discourse, or the patriarchy.

My intention in bringing these up is not to evaluate these claims, to say whether they are right or wrong, or to challenge their fundamental beliefs as many are wont to do; instead, I present them to be considered further, on the assumption that they say something important about our society. Undeniably, our views of the sexes are shaped by gender roles, the existence of which is incontestable. Here in the U.S., for example, attitudes toward women have shifted markedly since the '50s. Gender roles are expectations for how men and women are supposed to behave, and they are kept alive by normative rewards and punishments, usually in social, political, or emotional forms.


What does this have to do with TikTok? Frankly, it is uncontroversial to state that TikTok is a place of tremendous strife with regard to sexism and prejudice in general. In reaction to the '50s, and reaching its height in the '70s, the Women's Liberation Movement made great strides in advancing women's standing in America. When I was younger, having been raised in a small, friendly, and liberal city, I took it for granted that men and women were equal; I did not understand why people claimed women were lesser in any way. This might be a purely subjective judgment, although maybe some will feel the same, but I feel that, in moving toward liberal progress, the U.S. has become complacent, leading many, including myself, to falsely believe that we live in a post-sexist society—that is to say, as we have become more progressive, we believe we have "moved past/beyond" sexism. What this does is silence the matter and de-problematize it.


What leads me to say this? Well, one might argue that TikTok, for example, is perfectly democratic because, like the American Dream's promise of making anyone rich, the TikTok Dream's promise of making anyone famous (if only for a while) is open to everyone. Creators can be male, female, non-binary, young, old, white, black—it matters not… or does it? Despite our apparent liberalism, sexism is far from gone. Take, for instance, the following remarks that can be found in pretty much any comment section: "If a male made this, it would be funny," "Waiting for a guy to remake this…," "The 'f' in woman stands for funny," "You're actually funny for a female," "What? A woman who's funny?," "We did it boys, we found one that's actually funny!," etc. If anything, these myriad comments indicate that sexism—the belief in the superiority of one sex over another—is as strong as it has ever been.


Is it the expression of "the people"? Is it representative of our times? I shall address this later. The fact is, each of the above-cited quotations is evidence of a lingering patriarchy or—if you prefer to deny the existence thereof—male dominance. Is it really indicative of sexism, though? Isn't it just an observation that, perhaps, this guy happened to be funnier than the average girl? That is to say, couldn't these just be preferences in humor, not motivated by negative attitudes toward women? No, they are most definitely motivated by sexism: "Men are more likely… to minimize the contributions and ideas of members of the opposite sex," reports one author (Schaefer, p. 288). The matter at hand is competency, and men are denying it. To be sure, if someone were to comment, "Men are stronger than women," then I would agree insofar as that is a biological, objective truth; however, to apply this standard of competency to the comedic level, which, mind you, is subjective, and to declare that women are not as funny as men, is not a matter of fact but a matter of personal beliefs—and not good ones. We men are taught at a young age that we are the more "successful" sex, success being measured by our wealth, our social status, our political standing, etc. It would seem logical that humor would be yet another category that we claim for ourselves; we assume that we are better than women, so we must be funnier, too, a fortiori. To deny a sex's humor is blatantly sexist; it is a denial of opportunity and an act of degradation.


One of the more interesting, and perhaps nuanced, aspects of this sexism on TikTok is the word "female." But what's the issue with "female," you ask? I, too, was not entirely sure until one night when I was watching a live stream, and the host was expressing her views on it. She said the word, for her, was immature and degrading. Admittedly, I was confused because, after all, "female" is a common word, one used in everyday language, so what could be so controversial about it? As she explained, though, how it was "unnatural"—forced—and thus overly formal—a cop might say, for instance, "The suspect is a female"—it made sense to me. It would be easy to play this off as mere "oversensitivity" or being a "snowflake"—I thought so myself when she first began—but when I really thought about it, I realized what it really meant. To me, the word "female" has an objectifying character. By objectifying, I do not mean sexualizing; rather, what I mean is that "female," drawing on its formality, its unnaturalness, turns women into an object of study, that is, a specimen. One thinks of the phrases "Look at that group of females" or "The females are approaching"—in either case, the utterer treats the women in question as they would an animal in the wild, a variant of Homo sapiens that is mysterious, dangerous, or even both. There is an air of caution, of wariness, that hangs about the word. The "scientist" finds himself (intentionally not neutral) in the midst of some-thing exotic. Other scholars point out that its provocative nature stems from the distinction between sex (female) and gender (woman).


In short, "female" becomes a formal, scientific, and classificatory term. As "female," the woman is reduced to a species, an object of study, a foreign or alien specimen, or—to put it in the terms of the existentialist-feminist philosopher Simone de Beauvoir—the woman becomes "other," in fact, The Other, completely different from man. Essentially, as I interpret it, the use of "female" amounts to an over-rationalization of women in response to their perceived irrationality. What I mean is, a common stereotype of women is that they are overly emotional, and they never say what they mean, making it hopeless for us men to understand them and what they want from us; and in response to this incomprehension on our parts, we decide to impose our "superior rationality" upon them, like the scientist upon an insect, in hopes of figuring them out and discovering what makes them tick. Members of the incel community have also contributed a word of their own: femoid, short for "female humanoid." Clearly, this is even more dehumanizing and repugnant than the use of female.


So, having taken all this into consideration, the "f" in "female" can stand for funny, if we so wish and open up our minds a little. If I had more followers, or if my blog were seen by more people, then I would probably be more hesitant to publish this for fear of being called a "simp," but fortunately for me, that is not the case.

The question of irony


From what has just been said, it would appear sexism is a big problem on TikTok. But earlier I raised a question that is hitherto unanswered. In my opinion, we are faced with an even bigger, more serious problem. We must ask the question earnestly: Do people mean what they say? Are problems like sexism, racism, homophobia, and more caused by people with misguided beliefs—or, at the end of the day, is it all some big joke? Are the trends which indict both women and men* (I know I haven’t addressed sexism toward men, but it is a big problem in itself, perhaps worthy of a separate blog) motivated by actual internal values or are they just playful contributions? The problem, as I see it, is one of irony, for today is the Age of Irony, as I like to say.


In the 21st century, irony has become incredibly complex, so much so that we can speak of things "ironically," by which we do not mean what we say; "unironically," by which one comes to like a thing after merely pretending to; "post-ironically," by which one pretends not to mean what one says; and "meta-ironically," by which what one says is meaningless and fluid. Accordingly, in this yawning abyss that opens before us in the absence of truth, we ask, Why do we say what we say on TikTok? At this point, we must dive into the deeper psychological and philosophical underpinnings of trends and how we participate in them. Psychologists distinguish between three main forms of social influence and their motivators:

  1. Compliance: Do we say what we say in order to gain rewards and avoid punishments? For instance, do we post a video of ourselves making a racist joke because we know that such humor is liked by many, and we expect to get a lot of likes and followers from it? One of the sad things I have observed on TikTok is the self-degradation in which some girls engage, seemingly for this reason; they "go along" with gender roles to elicit the approval of male followers. Of course, I do not want to generalize: some do genuinely believe in such roles, but what concerns me is when girls do it solely for attention, even when they know it is false. One is (seemingly) forced to put aside one's internal convictions in favor of public approval. Another problem associated with this comes up when we consider the number of young TikTok users: What happens to young, impressionable kids who see the divisive comments and offensive videos, and are thus socialized to find them acceptable?

  2. Identification: Do we say what we say in order to belong to a group? As a male, I feel a kinship with my fellow guys; a girl, similarly, will feel a kinship with her fellow "gals." This is a natural thing for us to do. But another thing which is natural yet should be avoided because it is harmful is what I discussed earlier: group polarization and in-group favoritism. When we are in any conflict, we will usually side with our tribe. At that point, communication between the two camps is fruitless. This, in turn, leads to the in- and out-group homogeneity effects. When this happens, we see the group of which we are a part as being alike in its virtues, and the group to which we are opposed as being alike in its vices. Seen from the perspective of sex, boys might say, "All girls are the same: they're promiscuous and stupid, whereas we boys have each other's backs and are funny," and girls might say, "All boys are the same: they're cheaters and objectifying, whereas we girls are compassionate and loyal." Do some say what they say because they have been indoctrinated by the all-encompassing monolith known as "The Boys"?

  3. Internalization: Do we say what we say in order to get our views across? Put in terms of sex: Are people sexist? This explanation does not have to do with influence; internalization means that we have encountered a belief and adopted it for ourselves. One who has internalized the belief that men are funnier than women is not making sexist remarks because one has social needs, but because one is sexist, plain and simple. This also makes it the hardest explanation to tackle, because it is hard to change someone's mind when it is already made up. In TikTok comments, one can read such things as "The girls are pressed now" or "All the females are silent because they can't respond." Whatever merit these may claim, they are inherently provocative; the very posts on which they appear were created to elicit such responses in the first place. Both men and women are guilty here, as they often post videos which are controversial because they want to blame the other sex, resulting in conflicts which divide even more.

  4. Probably a little bit of all of them. 


From what I have just sketched, it is apparent that ideas and values are difficult to communicate seriously these days. In our liberal era, it is difficult for many to express themselves if they feel their opinions are not mainstream; conservatives and right-leaning people, finding themselves cornered, unable to say openly what they feel, may fall back upon irony as a shield to deflect criticism, or they will appeal to some conspiracy like that of "postmodern cultural Marxism's attempt to destroy Western Civilization by means of identity politics." Thus, when faced with backlash, one can easily say, "I didn't mean it, it was just a joke"—but was it? That is the difficulty. It is hard to tell what anyone truly believes these days.


The German philosopher Martin Heidegger illuminated how this ambiguity results from trend-following in his famous 1927 book Being and Time. His term for this phenomenon was Gerede, which translates from German into “idle talk.” According to Heidegger, idle talk is intrinsically inauthentic because it is mediatory. What did he mean by “inauthentic”? The German word for authentic, eigentlich, derives from the word for “own,” eigen. Therefore, something inauthentic is something that is “not one’s own”; it is insincere, disingenuous, false. When he said that this kind of talk is mediatory, he meant that information gained through idle talk is never gained through oneself, but always through others. As such, I cannot claim it as “my own” knowledge.


To use an example: just a couple of weeks ago, there was a trend—now dead—on TikTok in which people found it funny to post their reactions to a video of a baby with stuff on its mouth, saying things like, "Why does he look middle-aged?", "I really wanna hit that baby so hard," "I can tell he smells like ketchup," and other stupid things. You have to ask: would they really do the things they said if they found themselves face-to-face with that child? Of course not. They said it to be funny, because it was "the trend." But this is not what is most interesting about the trend, no; what is most interesting about this particular trend is that one did not have to see the original in order to know and follow it. The TikTokker I was watching on live stream herself said, "I didn't see it [the video of the baby] before it got popular," and yet she knew what it was. One hears of it from others.


As Heidegger put it, idle talk is hearsay. The word hearsay is interesting in that it is self-evident: it literally means "hear, then say." One hears about a trend and, without giving it any thought, without being critical, passes it on. Communication, which for Heidegger functions as uncovering—language reveals a situation—becomes a covering-up instead. When we should be looking for the original, we cover up the very origin, thus obscuring its meaning. We mistake hearing of a trend for understanding it. "Oh, well everyone knows about the baby video, though," one says, concealing one's misunderstanding. Through idle talk, beliefs and opinions and values are picked up and "passed along," as in a game of telephone. On TikTok, "everything (and at bottom nothing) is happening" (Heidegger, Being and Time, p. 219). Should you take just a moment to break this idle talk down, however, you will easily discover its baselessness. If you were to ask someone why they hated the baby, for example, they would not be able to give you an authentic reason, that is, a reason of their own. There is nothing backing their belief; it is unsubstantiated; it is unreflective.


Moreover, ambiguity results from a lack of intention, Heidegger said. It is not as if, in spreading the baby video, the person genuinely hated the baby and wanted to deceive others with their opinions; rather, idle talk is pervasive precisely because it is intentionless: it is mindlessly, unthinkingly, and uncritically absorbed information that has not been digested. Idle talk quickly becomes normative and prescriptive when it is mixed with lots of free time needing to be filled with entertainment, for the TikTok filter bubbles created by collaborative filtering, which I discussed earlier, condition "what to watch," i.e., whatever appears on the "For You" page.


Psychologically, this resembles something known as "pluralistic ignorance." A social psychologist writes, "[W]e often misperceive what is normative, particularly when others are too afraid or embarrassed to publicly present their true thoughts, feelings, and behaviors" (Kassin, Social Psychology, 8th ed., p. 261). Pluralistic ignorance is when we disagree with something but support it openly because we assume everyone else supports it. If there is some prevailing view, like that of sexism, to which I am opposed, yet I see video after video voicing it, then I might think to myself, "Oh, everyone else supports it, and I can't be the only one left out, so I guess I'll hop on the trend"—even when everyone else, deep down, feels the same way I do. Thus, some end up participating unwillingly. It reminds one of dramatic irony; it is as if we are actors in a tragic drama, succumbing to a non-existent threat.
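The mechanism can be made vivid with a toy simulation (every number here is invented for illustration): each agent privately dislikes a trend, but, seeing others participate, concludes that the majority must genuinely support it and conforms.

```python
# Toy model of pluralistic ignorance (all parameters invented):
# agents privately dislike the trend but conform to its apparent popularity.

N = 100
private_support = [False] * N   # nobody actually likes the trend
public_support = []             # what each agent does in front of the others

for agent in range(N):
    observed = public_support   # the posts this agent has already seen
    seems_popular = not observed or sum(observed) / len(observed) > 0.5
    # Conform when the trend *looks* popular, whatever the private belief:
    public_support.append(seems_popular or private_support[agent])

print(sum(private_support), sum(public_support))  # → 0 100
```

Private support stays at zero while public support reaches one hundred percent: everyone ends up performing enthusiasm for something nobody actually endorses, which is exactly the "non-existent threat" to which we succumb.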


However, we must not forget that there are people out there who, through their courage, and despite their minority status, do speak up. As we know too well, though, whoever opposes the dominant ideology or disagrees with the majority is met with ostracism and derision. Heidegger stated, "[I]dle talk discourages any new inquiry and any disputation, and in a particular way suppresses them and holds them back" (Being and Time, p. 213). If a guy speaks up for a girl, he is automatically a "simp" or a "cuck"—notwithstanding the misapplication of these terms: those who throw them about do not even bother to look up what the words mean, merely taking their meaning for granted. As Heidegger would explain it, opinions become fixed, accepted, and established through the repetition of idle talk, regardless of their original meanings or histories.


Evidently, words like "simp," derived from "simpleton," and "incel," derived from "involuntary celibate," are overused and, as such, have lost their true meanings. Just as Le Bon explained, meaning and truth do not matter to crowds; as long as a word acquires some kind of normative significance, it can hold influence over people and their actions. As soon as a guy speaks out against mistreatment by girls—say, if he had been cheated on by an ex-girlfriend—he is labeled an "incel" because that word, through careless use, has somehow acquired the wide-ranging meaning of "anything remotely anti-feminist." Yet the word should be reserved for those men who, through inadequacies of their own, among which are their extremely prejudiced views of women, expect privileges and special treatment from women. Now, if a guy says, "This one time, a girl ghosted me," he is instantly an "incel"?


What all this inquiry has shown us, at bottom, is that originality, closely linked to authenticity, to ownness, is an endangered concept. To create things that are uniquely one's own—this practice is becoming increasingly difficult. "[W]hat is genuinely and newly created," Heidegger said, "is out of date as soon as it emerges before the public" (Being and Time, p. 218). At the beginning, I said that one of the defining characteristics of a trend is its ephemerality, its temporariness. To be ahead, Heidegger reflected, is to be on time; reflection is already behind, too late onto the scene. When one chooses to be authentic, one is left behind. I have neither the space nor the knowledge to engage in the philosophy of humor here, but suffice it to say, the question of what constitutes humor, as well as its fate in this century, becomes important, especially given the presence of apps like TikTok. Some take the view that whatever is mainstream is unfunny; a good joke is one that belongs to the few and which, for that reason, is appreciated for its comedic value. But once a joke becomes a trend and enters the mainstream, it erodes like a cliff exposed to water, becoming overused, annoying, and predictable—and predictability is the death knell of humor. As I like to say, all that is comic is novel.

Conclusion


In conclusion, we have explored what exactly a trend is and how it functions; what TikTok is and does; how trends express themselves through TikTok; and finally, what some of the ramifications of trends are on the collective conscience. As humans, we do crazy things together, and it is in our nature to then stop and ask, Why? This is why psychology and sociology, for example, are so fascinating; they help us to look at how and why we do the things we do. We learn, for example, about what enables a crowd to prosper, as well as the complex, nuanced reasons why we side with groups. In turn, this raises ethical questions: Is this a problem? Should we say the things we do? How do we fix it? And so on. I guess I should offer a disclaimer (more of a debrief, seeing as it is coming at the end) by saying that, for the most part, I like TikTok and derive a lot of enjoyment from it. Some of the trends I criticized in this post, for example, are actually among my favorites. It is good to be able to compartmentalize, to enjoy something on the one hand and to be able to step back and criticize it on the other. Life is a pendulum swinging between humor and seriousness (where does irony lie?). It is important that we stop and think before we post or comment, but equally important that we not take jokes too seriously (what’s the line, though?). In the end, it is most important that we make the best of our time in quarantine, whether that means getting a laugh out of a TikTok, spending time with family, going outside, etc. Always remain thoughtful: the unexamined life is not worth living.


* I’m thinking here of the now-dying trend that goes “If girls do ‘x,’ then why is it bad when guys do ‘x’?”, featuring a guy standing, back turned to the camera, looking up dramatically, as if pondering this cosmic question, or a sad girl on her bed wondering, “Why is it okay when guys do ‘y,’ but when girls…,” etc.

Several images taken from Pexels.com.

Print sources:
Sociology, 9th ed., by Richard T. Schaefer (2004)
Social Psychology, 8th ed., by Saul Kassin (2010)
Being and Time by Martin Heidegger (2019)
The Crowd by Gustave Le Bon (1897)

Online sources:

Glick, Peter, and Susan T. Fiske. “Ambivalent Sexism Revisited.” Psychology of Women Quarterly, vol. 35, no. 3, 2011, pp. 530–535., doi:10.1177/0361684311414832.

Haskins, Caroline. “TikTok Can’t Save Us from Algorithmic Content Hell.” Vice, 31 Jan. 2019, www.vice.com/en_us/article/kzdwn9/tiktok-cant-save-us-from-algorithmic-content-hell.

Heilweil, Rebecca. “There’s Something Strange about TikTok Recommendations.” Vox, Vox, 25 Feb. 2020, www.vox.com/recode/2020/2/25/21152585/tiktok-recommendations-profile-look-alike.

Mellor, Maria. “Why Is TikTok Creating Filter Bubbles Based on Your Race?” WIRED, WIRED UK, 2 Mar. 2020, www.wired.co.uk/article/tiktok-filter-bubbles.

Wu-Sharona, Qian. “Is Algorithm Really Isolating People in ‘Filter Bubble’?” meco6936, 9 Apr. 2020, meco6936.wordpress.com/2020/04/09/is-algorithm-really-isolating-people-in-filter-bubble/.

On Bonding: A Polemic

Every child believes itself to be the center of the world. We imagine the world to be a sort of shadow play: Others pass us by, like shadows projected onto the walls by our hands, talking and playing with each other, before disappearing into the nothingness, consumed by the dark, and we fall asleep. It is this magical egocentrism of ours which we never outgrow, a conceit that installs itself at the fore of our understanding of ourselves and others. We, the puppet masters, encounter shadow puppets every day, yet unbeknownst to us, they are not actually our projections, but people in- and of-themselves, at the center of their own worlds. Essentially, life amounts to a collective sleepwalking from which we rarely wake.


Over the past few months, from both personal experience and ideas I’ve been reading in and out of my psychology class, I have been thinking about bonding. When we hear that word, specific memories come to mind, instances in which we find ourselves with friends doing something fun, like going out for food or riding in the car, chatting and learning about one another, and walking away feeling like our loneliness has been shattered, feeling like we have been discovered by someone, finally freed from the captivity of our subjectivity. At the same time, the word seems corny: Being put into a group of classmates or strangers and forced to do activities with them in an attempt to become closer comes to mind. “Bonding,” therefore, is in my mind expressed by a smile in these two senses: It is a happy, pleasant affair, but it can also be disingenuous, plastic, clichéd. And it is precisely this ambiguity, I feel, which has not been taken seriously or considered as a problem by many, especially my peers, to whom this is dedicated, wherefore I think it necessary to reflect on it.


Several months ago, I had some people over from another school, whom I would ultimately befriend. Later that night, I stayed up talking with one of them on the phone, and we evaluated how the night had gone, agreeing that, despite everyone’s having fun, no one left knowing much about one another, at which point my friend said to me, “You can’t get to know people by making them share about personal stuff.” The remark unsettled me because it contradicted one of my core pillars: creating authentic relationships based on depth. It shook me up so much that I thought on it for the next couple of weeks as I considered how best to get to know someone, and then whether we could even know someone in the first place. Eventually, after consulting some friends, I was reassured: What my friend said was, in fact, nonsensical, and I should have known that when I first heard it, considering that its very phrasing, the use of the word “personal,” refutes it. Obviously, the only way to learn about a person is through person-al matters, right? This much should be self-evident.


Later, in my psych class, we were studying motivation, and my teacher had us read an article about “Project Aristotle,” a study on what makes the best team. The researchers reached the following conclusion: A good team fosters psychological safety, meaning the quality of openness and vulnerability between people, the ability to be and express oneself without judgment. As an example, the article cites a moment when, in one of the groups studied, the leader brought his group together, sat them down, and told them all he had cancer, a revelation that established a sense of safety and belonging among them, causing several others to open up in turn, after which the group became one of the most successful. Reading this eased my mind. I was correct after all—bonding is impossible without vulnerability, without deep diving. Even before reading the article, I had, without knowing it, been doing the same thing: trying to attain psychological safety by creating an environment of openness and acceptance in an attempt to get closer to others.


One of the arguments against bonding is that it should not be forced. According to Harbinger in “Stop Trying to Be ‘Vulnerable.’ Do This Instead,” forcing bonding is a bad idea for several reasons. First, it makes others feel uncomfortable. When people are forced to open up, there is bound to be resistance; after all, the element of coercion, of making someone do something against their own interest, is extremely repellent. “If she shared, then I must,” we think. Thus, it becomes an expectation, a requirement, instead of a meaningful opportunity. In other words, it creates a distressing obligation where there need not be one. My friend told me a friend of his had such an experience: At their church, they congregated in small groups and shared personal experiences, one member discussing how, as a high schooler, he had to skip school for a while to go to rehabilitation, having become addicted to drugs. Understandably, my friend’s friend felt overwhelmed, as if he were in the wrong place. One can imagine oneself in his place, thinking frantically, “Oh god, what am I going to say when it’s my turn? What can compare to that?” A second reason forcing bonding is a bad idea is that it may have ulterior motives. As Harbinger puts it, “The right motivations for opening up are about being: being ourselves, being connected, being authentic. The wrong motivations for opening up are about getting: getting sympathy, getting friendship, getting approval.” This is self-explanatory.


However, while I agree with some of Harbinger’s assessments, I disagree with his conclusion that bonding should not be forced; I aver that it ought to be cultivated. To begin with, self-disclosure is not, by definition, depressing, sad, or negative. For some reason, we have this notion that private matters are necessarily dark. At a team dinner, I suggested we talk about ourselves (not in a narcissistic way), to which one of my teammates replied jokingly, “What, you want us to share our deepest, darkest secrets?” This kind of response, which is common, gets at this misconception. While our “secrets,” as it is put, are often dark, they need not be. Indeed, when we speak of personal matters, matters that express our person, our individuality, are we not speaking of who we are, fundamentally, as people, which, far from comprising just negative experiences, also includes our aspirations, dreams, joys, and innumerable other things? It is silly to single out negative experiences. Going back to my friend’s friend in church, we can see why he felt so uncomfortable. The real problem is not that there is an expectation, but what that expectation is. Because everyone has a story, we assume there has to be some dark conflict within it, a defining struggle, like that of a tragedy. But life is not just a tragedy, it is also a comedy—not just in the narrow sense of being funny, but of being celebratory, happy, jubilant. Personal in-sight is literally seeing-into the other person. Sharing what wakes one up in the morning, what drives one every day, or what passions one has is just as telling as a fear or a crisis. Our response to “Who are you?” is not, and should not be, just “This is what has happened to me”; it consists also of things like, “Here’s what I have done,” “Here’s what I wish to do,” “These are things for which I’m grateful,” etc. It’s about time we cleared up this confusion.
Then we must consider that, while the feeling of being obligated exists, it is by no means exclusionary; that is, it is not as though one will be ostracized for not sharing. Harbinger himself writes about how, when it was his turn to share, he said he was uncomfortable with having to share, and this was met with respect from several listeners. So this idea of an implicit social contract is overblown. The pressure exists, undoubtedly, but it is not tyrannical.


Next, Harbinger talks about motives, and here I entirely agree with him. As a principle, I have made sure never to aim at getting, only at being. I have no desire to make people talk in order that I may use this privileged information to blackmail them later—not only is this a cynical way of thinking (not that I deny it happens), but it is intrinsically wrong and unconscionable, as I’m sure everyone would agree. For this reason, I see bonding as a means to knowledge, not manipulation; it is used to learn about others. Some may use this for their own gain, but these scoundrels are thankfully few. The key quality of vulnerability is, for Harbinger, authenticity. By this, he means that opening up must “come naturally,” which is notoriously vague. He subscribes to a cinematic view of reality in which we, at the height of our ecstasy, spontaneously spill forth our souls, letting the truth roll off our tongues effortlessly, to the enlightenment of everyone else present. Yet this idea of “coming naturally,” in my opinion, is no more artificial than the inducement thereof. This is not to say that spontaneous self-disclosure does not occur; it certainly does, just not as often as either Harbinger or I would like. Rarely does life work out this way, as much as we would love for it to. It is a Romantic, idealistic way of seeing the world; in fact, it places its own expectations on the situation, requiring that it unfold in the “right” way, lest it disrupt the natural order. Another aspect of authenticity is that it must “feel right”: “Because we know that what we’re feeling in those moments [of forced bonding] isn’t the real thing. We know that we’re being forced to open up, despite the fact that opening up is only meaningful when we choose to do it.”


But why can it not be meaningful when forced? Why can it not be chosen? Some hold the view that if it is forced, it is intrinsically bad. True, the situation is an imposition; the occasion has indeed been set upon us, but who says we cannot act freely therein? Every single encounter is such an imposition. The assumption is a deterministic one. It is like saying that, because my friend has approached me with the topic of homework, I must conform to that and not stray from it, precisely because it has been brought up, when I know for a fact that I can change the topic to whatsoever I desire. Harbinger seems to think that any time I am cornered, I immediately become inauthentic. And why is it that “what we’re feeling in those moments isn’t the real thing”? Is he proposing a radical presentism, according to which something is not real if it does not exist at that very moment? Then, in response to “What are you feeling right now?” I might as well say, “I feel the chair I’m sitting on.” Technically, everything is in the past, though, because our brains have to process every sensation. My point is, Harbinger’s contention that we cannot be authentic in the moment is far too constricting, even nonsensical. Does he know better than I do about myself? If I give an answer, can he rightfully accuse me of having lied? I think not. Authenticity need not be spontaneous in the sense he prescribes.


The next argument against bonding is not only that it should not be forced, but that, fundamentally, it cannot be forced, akin to the myth of “coming naturally” or “happening of its own accord” in laissez-faire fashion. This argument derives from experience. Our deepest relationships, like those we have with our best friends, siblings, or spouses, also happen to be our longest relationships. The reason they are so deep, we gather, is that we have known these people so long. And this is true. Experience accrues with time. Accordingly, relationships should be allowed to develop naturally, over time, without forced interference. As Lao-Tzu put it, “Nature does not hurry, yet everything is accomplished.” Once more, this anti-interference attitude comes into play. If it is forced, if it goes against nature, then it is automatically inferior—the naturalistic fallacy. Furthermore, proponents of this argument will say that people are like puzzles: We do not see the picture all at once, but put it together piece by piece, making more progress on some days than others, sticking to it, patient, persistent, stopping some days to rest, then picking up where we left off. Unlike ordinary puzzles, though, humans cannot be “completed”; we will never have the full image filled in; there will always be gaps—pieces—missing. “Very well,” they say, “but you can’t build a puzzle in a day.” Agreed. But, as you said, some days we make more progress than others—so why not put in a lot of work, make it an all-nighter? Then there is that great quote attributed to Plato, which he never really said, and which Alan Loy McGinnis actually said: “You learn more about a person in an hour of play than in a year of conversation.” The first half contradicts what has just been said; more important is that the second half devalues the role of conversation. Really? A whole year—365 days—of conversation is not that telling? Whether or not it is hyperbole, I refuse to believe it.


In reply to this second objection, that bonding cannot be forced, I offer the argument from exigency: Simply put, we do not have the time to wait, to “let nature take her course.” There are over 7,000,000,000 fellow humans in this world of ours, of which an iota of a fraction graces our existence. And of this tiny proportion that even grazes us, passes us by, there is an even smaller number that stops in the race of life to engage us and stick by us. In our early years, we are lucky to know someone outside of our families for a decade. My best friends, the people I have known the longest, since preschool, and with whom I am still in contact, have been my friends for about 13 years now—it will be 14 when we graduate, 14 when, tragically, we part ways for college and enter the real world, likely never to meet again. Over all these years, we have steadily grown apart, and I lack the real-time knowledge that I used to have; for, no longer being around one another 24/7, we do not have constant coverage of what is going on. And these are my best friends—what of peripheral friends, friends we talk to and have hung out with, but whom we do not know that well, who are destined to be forgotten except when we flip through our yearbooks a decade hence? Should we just “leave it to nature”? Are we to wait for spontaneity to unravel itself? The reality is that, often, these occasions never arise by themselves. Sometimes, bonding does not come at all and, rather than force it, because we are afraid of forcing things, we decide to remain inactive, losing out on priceless memories and experiences. There are people whom we will encounter for one day in our lives, say, in a coffee shop or at a museum—why not engage them? Again, the expectation is not that we will interrogate a stranger about their life at home. It may be something as innocuous and banal as a classmate with whom we are paired one day, but to whom we never again utter a word. Life is short. It really is.
We simply do not have the time to wait things out. Even supposing this were not true, as Emerson does when he says “We ask for long life, but ‘t is deep life, or grand moments, that signify,” the focus is on “deep life,” profundity, meaning, person-al-ity; our priority should be density, how much we can fit into as little space as possible. We want the densest relationships possible; the more we put into something small, the denser it is. This amounts to a kind of “Carpe diem!”


And pertaining to the McGinnis quote, in addition to its being hyperbolic, I might add that, although actions do speak louder than words, this must be qualified: Perhaps I might venture to interpret “conversation” as an intentional choice, indicating small talk, in which case I would agree. An hour of play is certainly more telling than an hour of small, trivial, insignificant chatter. However, to elevate play to the level of profound, rich discussion and self-disclosure—unthinkable, methinks. To determine who one is based on play is to study a person like a behaviorist, focusing only on observable behavior while neglecting consciousness, the private, subjective world behind such behavior; whereas talking with someone (assuming they are truthful and honest) reveals their intentions, feelings, ambitions, dreams, desires, etc., all of which form who they essentially are. “Words are just words,” we say, and so we tend to devalue them. But they tell us things. Words communicate. They bridge gaps as best they can. Of course, words are not transparent; they do not magically reveal the other as through revelation; they are more translucent, letting in a bit of light while still retaining opacity. But that is the best we can and will get. And perhaps it is best that way, neither transparent nor opaque—it adds mystery and life to the other person. We do not quite know what exactly is going on in their head, but we get clues, from which we can make inferences, and so on. It is fun that way. Stressful, too, but fun. As Aristotle stated, we naturally desire to know. As he also stated, we are political animals, meaning we thrive in communities. We desire belonging, we desire intimacy. We want to get close to others and know them. To me, it does not make sense to put this on hold. If we desire knowledge, intimacy, and belonging, then why wait? Why not seek it out?


The last obstacle to bonding, as I see it, is how we conceive of it. Similar to my earlier anecdote, I was hanging out one night with some of my teammates, and I asked if we could bond. One of them said, “Good idea, let’s go around and say our favorite colors,” at which everyone laughed. Following through with it, we each said a color, before they said, “Now let’s say our favorite genres of music.” This was definitely an improvement on the previous one. A couple of years ago, coincidentally, I had written an essay on friendship in which I discussed the idea of favorite colors, using it as an example of the kind of knowledge which has been devalued over time. In other words, the older we get, the less we concern ourselves with the smaller, more trivial aspects of others, especially those whom we presume to know best. Do you know your best friend’s favorite color? If you do, then you are in good hands. My point is, a seemingly unimportant question such as that, while not telling us anything about the person themselves, indicates the level of commitment one puts into a friendship. This is not to say that, in order to be a good friend, you should be able to list off every single trait or preference of your friend; it is the spirit, not the letter, of the law with which I am concerned.


The reason I bring this up is that, when my teammate had us tell our favorite colors, they did not have this intention in mind; it was more of a joke. In spite of this, however, the innocence of it, the very fact that it seemed childish to ask, redeemed it. There we were, a bunch of high schoolers who, outside of track and field, hardly knew each other, sharing our favorite colors, as if we were strangers—“as if”?—no, we were strangers to each other. A while after the first anecdote I told (sorry to jump around), the one about the teammate at the team dinner who interpreted bonding as “sharing one’s deepest secrets,” we had another team dinner. When I brought the topic up again, they said, “But we already bonded last time, remember?” I did remember. I remembered how we discussed where we were going and what we were doing for the upcoming vacation. While I admit it was interesting to learn what my teammates were planning to do and some past experiences they had had, I did not feel as though I knew them any better; it felt like I knew about them, but I did not know them themselves. In philosophical terms, I learned about accidents, things that are not inherent to a thing but happen to be there, i.e., things that are not essential. Thus, I learned about some different facets and surfaces of my fellow runners, but I got no in-sight into them. Accordingly, my peers and I confuse small talk for bonding, thinking that sharing small pieces of trivial information is equivalent to self-disclosure. I want to emphasize that I am not advocating for the eradication of small talk; I wish simply to separate small talk from bonding, to isolate the latter as something in itself with which we ought to be more concerned.


My next point is as follows: In adolescence, we form friendships based on things we have in common, whether it be sports, video games, music genres, personality traits, or something else. In consequence, making friends becomes a lot easier. Most of the time, we can identify whom we wish to befriend in advance because we will know if they are a good fit for us. If I like a TV show, then I have a point of entry; if I like football, then I have a point of entry. This common trait then binds us together. However, this blessing is also a curse: Forming friendships based on a common trait is just that—a relationship centered around a fixed thing. A good friendship, mind you, will move beyond that initial shared thing, but most friendships do not. Hence, most of these relationships are what Aristotle would call “friendships of pleasure/utility”: They are relationships not immediately between two people, but mediated by something else, be it pleasure or some kind of use. As a result, we do not focus on the other as Other, but on the other as they relate to our shared object. The relationship becomes secondary. A narrow group identity, then, endangers authentic friendships. Take my track team, for example. We “know” each other. I know one teammate and label them “the guy who runs the 400 meters,” another as “the girl who does long jump.” Everywhere we go, to make sense of the world, we add epithets. “X, the one who…,” “Y… the intelligent one…,” etc. And yet, the ironic truth is that the more people know us by these epithets, the less we are known. How it must feel, to come home, reduced to “that girl—she went skiing over the break—she is on the school newspaper,” and to be known for that, to be known as that—how lonely we all must be from time to time, when this all sinks in, for such is the human condition! Is that all we are? Epithets?
Like Jean-Paul Sartre’s waiter from Being and Nothingness, none of us on the track team is fundamentally a track athlete; we merely play at being track athletes. It is a role, and we must acknowledge it as such. To do otherwise is to be in bad faith: We reduce our transcendence—our freedom—to our facticity—our past or present condition—thereby depriving ourselves of self-determination. I am nothing, according to Sartre, precisely because I can choose anything. I am not a track athlete. I am not a blogger. These are things I have chosen, and these choices, in turn, reveal something about me. But they are not me myself. Why limit ourselves to anything? For my teammates and me to say of each other, “We are runners/jumpers” is, in the words of Emmanuel Levinas, to totalize one another’s infinity; that is, when we try to apply some kind of image or representation to other people, we limit them in their potentiality-for-Being (I had to get a Heidegger reference in somewhere!). We do not allow them to be. When Levinas describes “the Other” (another person) as infinity manifested, he is saying that the other person is always beyond us, incomprehensible, nuanced, interesting, unique—in a word, infinite. Yet we are constantly trying to totalize the Other, throwing a metaphorical net over them, as if to catch their essence. We want to finally “get” each other. We ourselves want to be “got.” A grand paradox in itself. The title of Levinas’s best-known work, Totality and Infinity, might as well be the subtitle to the movie of Life. In short, when we self-identify, we unintentionally mislead and deceive others, not just at their expense, but also at our own. Misunderstanding is a two-way street. I myself am guilty hereof. Many a time have I been in bad faith, inauthentic, totalized. Rarely have I been swept up in the channels of infinity.


This polemic is really an open letter-plus-criticism-plus-confession. As I said in the opening, I was motivated to write this mostly by personal experience. Already halfway through my junior year, with one year left before I graduate, I am constantly reflecting on friendship because it is one of the most important things in life. There are so many people I have not met or talked to that it is dizzying at times. It is genuinely mind-blowing how a single conversation with someone can change things instantly, how the exchange of mere words can have a tremendous effect on two people’s lives. In the past year, I have met many fellow track athletes—I mean, people who play at being track athletes—from different schools, people who were previously considered “enemies,” but who, as is always the case, ended up being just the opposite. But time is running out, and I know it. So the reason for my critical tone, I hope you can understand, reader, is my awareness of time. The reason I adjure my peers and myself so is not that I am disappointed or annoyed with them; quite the opposite—my urgency is a reflection of my concern for them. I hope whoever reads this may ponder what has been said and take it to heart, too, for thought is desperately needed at all times. Perhaps I am too late in delivering this message, perhaps the hands near midnight, and the Owl of Minerva is taking flight—but then again, better late than never…