Technology and Social Media: A Polemic

§1


Much gratitude is owed to our devices—those glorious, wonderful tools at our disposal, which grant us capabilities of which man centuries ago could only have dreamed, the culmination of years of technology combined in a single gadget, be it the size of one's lap or hand. What a blessing they are: they connect us to those around the world, give us access to a wealth of knowledge, and lend longevity to our lives, allowing us to create narratives and tell stories; and yet, what a curse they are, those mechanical parasites that latch onto their hosts and drain them of their vitality, much as a tick does. That phones and computers are indispensable, and further, that social media acts as a necessary sphere combining the private and the public, creating the cybersphere—such is incontrovertible; and yet they are abused to such an extent that these advantages have been corrupted and have lost their supremacy in the human condition.

§2


Technology is ubiquitous, inescapable, and hardwired into the 21st century, so that it is a priori, a given, a simple fact of being whose facticity is foreign to older generations, who generally disdain it, as opposed to today's youths, who have been, as Heidegger said, thrown into this world, this technologically dominated world, wherein pocket-sized devices—growing bigger by the year—are everywhere, the defining feature of the age, the zeitgeist, that indomitable force that pervades society not just concretely but abstractly, not just descriptively but normatively. In our being-in-the-world, we Millennials and we of Generation Z take technology as it is and accept it as such. To us, technology is present. It is present insofar as it is both at hand and here, by which I mean it is pervasive not just in terms of location but in terms of its presence. A fellow student once observed that we youths are like fish born in the water, whereas older generations are humans born on land: Born into our circumstances, as fish, we are accustomed to the water, while the humans, accustomed to the land, look upon us, upon the ocean, and think us strange, pondering, "How can they live like that?"

§3


As per the law of inertia, things tend to persist in their given states. As such, people, like objects, resist change. The status quo is a hard thing to change, especially when it was conceived before oneself was. To tell a fellow fish, "We ought to live on the land as our fathers did before us"—what an outlandish remark! Verily, one is disinclined to change one's perspective, and will rather cling to it with tenacity, to the extent that it develops into a complacency, a terrible stubbornness that entrenches one further within one's own deep-rooted ways. Such an individual is a tough one to change indeed. What is the case, we say, is what ought to be; this is the general principle upon which we rest our case, and anyone who says otherwise is either wrong or ignorant. Accordingly, the youth of today, the future of humanity, accepts technology as its own unquestioningly. As per the law of inertia, things tend to persist in their given states—that is, until an unbalanced force acts upon them.

§4


What results from deeply held convictions is dogmatism. A theme central to all users of devices, I find, is guilt; a discussion among classmates has led me to believe that this emotion, deeply personal, bitingly venomous, self-inflicted, and acerbic, is a product of our technological addictions. Addiction has the awesome power of distorting one's acumen, a power comparable to that of drugs, inasmuch as it compromises the mind's judicial faculty, preventing it from distilling events, from correctly processing experiences, and thereby corrupting our better senses. The teen who is stopped at dinner for being on their phone while eating with their family, or the student who claims to be doing homework when, in reality, they are playing a game or watching a video—what have they in common? The vanity of a guilty conscience, which would rather be defensive than apologetic. The man of guilt is by nature disposed to remorse, and thus he is naturally apologetic in order to right his wrong; yet today, children are by nature indisposed thereto, and are conversely defensive, as though they are the ones who have been wronged—yes, we youths take great umbrage at being called out, and instead of feeling remorse, instead of desiring to absolve our conscience of its intrinsic guilt, we feel that we have nothing from which to absolve ourselves, imputing the disrespect to those who called us out.

§5


Alas, what backward logic!—think how contrary it would be if the thief were to call out the poor inhabitant who caught him. Technology has led to moral bankruptcy. A transvaluation of morals, to use Nietzsche's terminology, is in this case to our detriment, I would think. Guilt is a reactive emotion: It is a reaction formed ex post facto, with the intent of further action. To be guilty is to want to justify oneself, for guilt is by definition self-defeating; guilt seeks to rectify itself; guilt never wants to remain guilty, no; it wants to become something else. But technology has reshaped guilt, turning it into an intransitive feeling, giving way, if at all, to condemnation, seeking not to vindicate itself but to remonstrate, recriminate, repugn, and retaliate. Through technology, guilt has gone from being passive and reactive to active and proactive, a negative emotion with the goal of worsening things, not placating them. Digital culture has perpetuated this; now, being guilty and remaining so is seen as normal, even valuable. Guilt is not something to be addressed anymore; guilt is to be kept as long as possible. But guilt, as I said, is naturally self-rectifying, so without an outlet, it must be displaced—in this case, into resentment, resentment directed toward the person who made us feel this way.

§6


—You disrupt me from my device? Shame on you!—It is no good, say you? I ought to get off it? Nay, you ought to get off me!—You are foolish to believe I am doing something less important than what we are doing now, together, to think it is I who am in the wrong, and consequently, to expect me thus to put it away—You are grossly out of line—You know naught of what I am doing, you sanctimonious tyrant!—

§7


Sadly, when asked whether they managed their time on their devices, all but one student claimed that they did; yet their own accounts, quite unsurprisingly, suggest otherwise, and serve as a frightful example of the extent to which our devices play a role in our lives. They were then asked some of the reasons they had social media, to which they replied: to get insights into others' lives, to de-stress and clear their minds after studying, and to talk with friends. A follow-up question asked whether using social media made them happy or sad, and the answers were mixed: Some said it made them happier, some said it made them sadder. An absurd statement was made by one interviewee who, when asked how they managed their time, said they checked their social media at random intervals throughout their studying in order to "clear their mind off of things" because their brains, understandably, were tired; another stated they measured their usage by the number of video-game matches played, which, once it was met, signaled them to move on to something else—not something physical, but some other virtual activity, such as checking their social media account. I need not point out the hypocrisy herein.

§8


I take issue with both statements together, for they complement each other and reveal a sad, distasteful pattern in today's culture, which I shall presently discuss. Common to all students interviewed was the repeated, woebegone usage of the dreaded word "should":
—"I should try to be more present"—
—"I should put my phone down and be with my friends"—
—"I should probably manage my time more"—

§9


Lo! for it is one thing to be obliged, another to want. Hidden beneath each of these admissions is an acknowledgment of one's wrongdoing—in a word, guilt. Guilt is inherent in "shoulds" because each names the justified course of action one failed to take: One should have done this, rather than that. Consequently, the repetition of "should" is vain, a mere placeholder for repressed guilt, a means of shedding some of the weight on one's conscience; the conditional, therefore, is as frustrated as the guilt harbored therein.

§10


Another thing with which I take issue is what the two students said about their means of time management. The first said they liked to play games on their computer, and they would take breaks intermittently by going elsewhere, either to their social media or to YouTube to watch videos. No less illogical, the other said they would take breaks by checking their social media, as they had just been concentrating hard. How silly it would be for the drug addict to heal himself with the very thing that plagues him! No rehabilitator plies their charges with alcohol; common sense dictates that curing a problem with the very thing that is the problem in the first place is nonsense! Such is the case with the culture of today, whose drugs are their devices.

In the first place, how exactly does stopping a game and checking some other website constitute a "break"? There is no breach of connection between user and device, so it is not in any sense a "break," but a mere switch from one thing to the next, which is hardly commendable but foolish, forasmuch as it encourages further usage, not less; as each activity is defined in relation to the next, it follows that this is a cycle, not a regimen, for there is no real resting period, only transition. Real time management would consist of playing a few games, then deciding to get off the computer, get a snack, study, or read; going from one device to another is not management at all. Similarly, regarding the other scenario, studying on one's computer and taking a break by checking one's media is no more effective. One is studying for physics, and after reading several long paragraphs, sets upon learning the vocabulary, committing to memory the jargon, then solving a few problems, but one is thus only halfway through: What now? Tired, drained, yet also proud of what has been accomplished thus far, one decides to check one's social media—only for 30 minutes, of course: just enough time to forget everything, relax, and get ready to study again. This is not the essence of management; nay, it is the antithesis thereof! No sound state of mind could think this reasonable. If one is tired of studying, which is justifiable and respectable, then one ought to (not should!) take a real break and really manage one's time! Social media is indeed a distraction, albeit of a terrible kind, and not the one we ought to be seeking. Checking a friend's or a stranger's profile and looking through their photos, yearning for an escape, hoping for better circumstances—this is neither calming nor productive.

A good break, good time management, is closing one's computer and doing something productive away from the screen. Social media serves only to irritate the brain further after exhaustion; instead, healthy and productive tasks, whose benefits have been proven, ought to be taken up, such as reading, taking a walk, or exercising, among other things: A simple search will show that any of the aforementioned methods is extremely effective after intense studying, and yields better memory, better focus, and better overall well-being, not to mention the subconscious aspect, whereby recently learned information is better processed when put in the back of the mind during some other activity, such as the latter two, which are both physical, bringing with them physiological and psychological advantages alike. In conclusion, time management consists not in transitioning between devices, but in transitioning between states of mind and body.

§11


The question arises: Why is spending too much time on our devices a problem in the world? Wherefore, asks the skeptic, is shutting oneself off from the world and retreating into cyberspace, where there are infinite possibilities, a "bad" thing? Do we really need face-to-face relationships or wisdom or ambitions when we can scroll through our media without interference, getting a window into what is otherwise unattainable? Unfortunately, as with many philosophical problems, including the simulation hypothesis, solipsism, and the mind-body problem, no matter what is argued, the skeptic can always refute it. While I or anyone could give an impassioned speech in defense of life and about what it means to be human, it may never be enough to convince the skeptic that there is any worth in real-world experiences. It is true that one could easily eschew worldly intercourse and live a successful life on one's device, establishing one's own online business, finding that special person online and loving them long distance—what need is there for the real world, for the affairs of everyday men? The philosopher Robert Nozick asks us to consider the Experience Machine: Given the choice, we can either hook ourselves up to a machine that simulates a perfect, ideal, desirable world wherein all our dreams come true, and everything we want, we get, like becoming whatever we always wanted to become, marrying whomever we have always wanted to marry, yet which is artificial and, again, simulated; or we can remain in the real world, where there is inevitable strife and struggle, but also triumph, and where we experience pleasure and pain, happiness and sadness—but all real, all authentic. There is, of course, nothing stopping one from choosing the machine, and the skeptic will still not be swayed; but I think the sanctity of humanity, that which constitutes our humanity, ought never be violated.

§12


What, then, is the greatest inhibition to a healthy, productive digital citizenship? What can we do to improve things? The way I see it, the answer lies in the how, not the what. Schools can continue to hold events where they warn students of the dangers of technology, advise them on time management, and educate them about proper usage of technology and online presence; but while these can continue ad infinitum, the one thing they can never instill is our—the students'—want to change. Teachers, psychologists, and parents can keep teaching, publishing, and lecturing ever more convincingly and authoritatively, but unless the want to change is instilled in us, I am afeard no progress will be made. Today's generation will continue to dig itself deeper into the technological world. They say the first step in overcoming a bad habit or addiction is to admit you have a problem. As I said earlier, technology just is for us youths, and it always will be henceforth; there will never again be a time without technology, so it is seen as a given, something essential, something humans have always needed and will continue to need. Technology is a tool, not a plaything. Technology is a utility, not a distraction. Social media is corrupting, neither clarifying nor essential. We have been raised in the 21st century to accept technology as a fact, and facts cannot be disproven, so they will remain, planted, their roots reaching deeper into the soil, into the human psyche. Collectively, we have agreed that technology is good, but this is "technology" in its broadest sense, and that breadth clouds our view of it. We believe our phones and computers are indispensable, that were we to live without them, we would rather die. To be without WiFi is comparable to anxiety: an objectless yearning, an emptiness in our souls. How dependent we have become, we "independent" beings! This is the pinnacle of humanity, and it is still rising! Ortega y Gasset, in the style of Nietzsche, proclaimed, "I see the flood-tide of nihilism rising!"¹ We must recognize technology as a problem before we can reform it and ourselves. A lyric from a song goes, "Your possessions will possess you." Our devices, having become a part of our everyday lives to the extent that we bring them wheresoever we go, have become more controlling of our lives than we are of ourselves, which is a saddening prospect. We must check every update, every message, every notification we receive, lest we miss out on anything! We must miss out on those who care about us, who are right in front of us, in order not to miss out on that brand-new, for-a-limited-time sale! But for as long as we keep buying into these notifications, for as long as we refuse to acknowledge our addictions and the problem before us, we will continue to miss out on life and waste moments of productivity, even if only for a few minutes at a time, which, when added up at the end of our lives, will turn out to be days (ten minutes a day, after all, comes to more than sixty hours, or two and a half days, a year). As my teacher likes to say, "Discipline equals freedom." To wrest ourselves from our computers or phones, we must first discipline ourselves to do so; and to discipline ourselves, we must first acknowledge our problem, see it as one, and want to change. As per the law of the vis viva (and not the vis inertiæ), things tend to persist in their given states until their internal force wills otherwise. We bodies animated with the vis viva have the determination and volition to will ourselves otherwise, to counter the inertia of being-in-the-world, of being-online, whence we can liberate ourselves and awaken, so to speak. We addicts have no autonomy with our devices—we are slaves to them. Until we break out of our complacency, until we recognize our masters and affirm our self-consciousness thence, and until we take a stand and break from our heteronomy, we will remain prisoners, automata, machines under machines. We must gain our freedom ourselves. But we cannot free ourselves if we do not want to be freed, if we want to remain slaves, if we want to remain in shackles, if we want to plug into the machine. A slave who disdains freedom even when freed remains a slave. Consequently, we cannot be told to stop spending so much time on our devices, to pay attention to whom or what is in front of us; we must want it ourselves. Yet no matter how many times or by whom they are told, today's youth will never realize it unless they do so themselves. They must make the decision for themselves, which, again, I must stress, must be of their own volition. Until then, it is merely a velleity: a desire to change, but a desire in-itself—nothing more, a wish with no intent to act. It is one thing to say we should spend less time, another that we ought to.

 


¹Ortega y Gasset, The Revolt of the Masses, p. 54


Harper Lee’s Guide to Empathy

In the 21st century, surrounded by technologies that distance us, by worldviews that divide us, and by identities that define us, we do not see much empathy among people. While we see friends and family every day, we never really see them, nor do we acknowledge that they, too, are real people, people who have opinions like us, feelings like us, and perspectives like us. Harper Lee is the author of To Kill a Mockingbird, a novel that itself contains many perspectives, many of them in conflict with one another. Set in the 1930s South, the book takes place during the Great Depression, when many lost their jobs, and in a time of racism, when laws were passed that restricted the rights of black people. The protagonist is a girl named Scout, who lives in the fictional town of Maycomb with her brother Jem and her father Atticus, an empathetic lawyer. Through interactions with her peers, Scout learns to take others' perspectives and walk in their shoes. In To Kill a Mockingbird, Harper Lee teaches that, in order to take another's perspective and practice empathy, one must understand someone else's thoughts and background, try to relate to them, and then become aware of how the consequences of one's actions affect them.


Before one can truly take another's perspective, Lee argues, one must first seek to understand how someone thinks and where they come from. After hearing about Mr. Cunningham's legal entailment, Scout asks if he will ever pay Atticus back. He replies that the Cunninghams will, just not in money. She asks, "Why does he pay you like that [with food]?" "Because that's the only way he can pay me. He has no money… The Cunninghams are country folk, farmers, and the crash hit them the hardest… As the Cunninghams had no money to pay a lawyer, they simply paid us with what they had" (Lee 27-8). Scout is confused why the Cunninghams pay "like that" because it is not the conventional way of paying debts. Money is always used in business transactions, yet Atticus allows them to pay through other means. Atticus acknowledges that the Cunninghams are having economic problems. He empathizes with Mr. Cunningham by drawing on his background knowledge, namely that, because he is a farmer who gets his money from agriculture, he does not have the means to pay. The Great Depression left many poor and without jobs, so Atticus is easier on Mr. Cunningham; he knows it would be unfair to make him pay when he hardly has any money. Accordingly, Atticus accepts that the Cunninghams are trying their best, and he compromises with them. He willingly accepts anything Mr. Cunningham will give him, since he knows it will come from the heart. For this reason, Atticus can empathize by thinking outside normal conventions to accommodate Mr. Cunningham's situation. Just as Atticus understands the Cunninghams, so Calpurnia empathizes with them when she lectures Scout not to judge them. Jem invites Walter Cunningham from school over to have dinner with him and Scout. Reluctantly, Walter agrees, but once he starts eating, Scout takes issue with his habits, and Calpurnia scolds her: "There's some folks who don't eat like us… but you ain't called on to contradict 'em at the table when they don't… [A]nd don't you let me catch you remarkin' on their ways like you was so high and mighty!" (Lee 32-3). Because Scout is not used to the way Walter eats, she immediately judges his way as different from her own, thereby patronizing him. Hence, she is not empathizing, because she is not considering his point of view, but only her own. Calpurnia states that not everyone eats like Scout does, showing that she, unlike Scout, does not form generalizations; rather, she rationalizes, recognizing that he comes from a different home, a home with different manners. Since she empathizes with Walter in this way, Calpurnia tells Scout not to "contradict" him, meaning it is rude and unsympathetic not to consider Walter and his background. Furthermore, she warns Scout not to act as though she is "so high and mighty," especially around others who are less fortunate and who differ from her, such as Walter. By criticizing Walter's eating and thereby abashing him, Scout is being sanctimonious, declaring that her way is better than anyone else's. Calpurnia gets mad at Scout for this, as it is egocentric; that is, Scout is concerned with herself and cannot consider others' perspectives. Consequently, Calpurnia shows empathy by understanding that people have different perspectives, while Scout does not. Both Atticus and Calpurnia are empathetic because, as shown, they actively try to understand other people and selflessly consider their perspectives.


Once a person's way of thinking and past is understood, one is able to see oneself in that other and make connections with them. One night, Scout, Jem, and Dill sneak off to the Radley house and are scared away, Jem losing his pants in the process. Jem decides to retrieve his pants, regardless of the danger involved therewith. The next morning, he is moody and quiet, and Scout does not know why. Upon some reflection, she says, "As Atticus had once advised me to do, I tried to climb into Jem's skin and walk around in it: if I had gone alone to the Radley Place at two in the morning, my funeral would have been held the next afternoon. So I left Jem alone and tried not to bother him" (Lee 77). Scout follows her father's advice and "climb[s] into Jem's skin," symbolizing that she has taken his perspective and seen life therethrough. She asks herself the vital question of what it would be like to be Jem; in doing this, she has visualized herself as Jem, has visualized herself doing what he did, thereby understanding him. The first step in empathizing—understanding—allows her to relate to Jem and put herself in his position: She imagines what it would have been like to risk her own life, how she would have felt doing so. As a result, she examines her emotional reaction and projects it onto Jem, relating to him, feeling as he would feel. Had she not tried to understand Jem's position, had she not related to him emotionally, she would never have known why Jem was being moody. Jem's "funeral would have been held the next afternoon," says Scout, realizing why Jem is upset. If she felt that way herself, then she would not want anyone bothering her, either, seeing as it is a traumatic event. Scout connects to Jem on an emotional level, empathizing with him. Another instance in which Scout shows empathy by relating is when she connects with Mr. Cunningham. Jem and Scout sneak out at night to find Atticus, who is at the county jail keeping watch over his client, Tom Robinson. As they near him, a mob closes in on Atticus and threatens to kill Robinson, so Scout tries to find a way of civilizing them and talks to Walter's father. Thinking of conversation, she considers, "Atticus had said it was the polite thing to talk to people about what they are interested in, not what you were interested in. Mr. Cunningham displayed no interest in his son, so I tackled his entailment once more in a last-ditch effort to make him feel at home" (Lee 205). In this moment, Scout recalls that it is polite to relate to others and consider their views rather than her own. She hereby distances herself from her egocentrism, instead concerning herself with what someone other than herself wants. Empathizing requires that one cross the gorge of disparity, and Scout bridges this gap between self and other to find that she has things in common with Mr. Cunningham, things of which she would never have thought prior. Before this connection could occur, Scout had to know his background, of which she learned when talking to Atticus; additionally, she had his son over and learned about him then, giving her something in common to talk about. Since Scout knows Walter, she thinks him a topic to which the two can both relate, seeing as Walter is close to his father, creating a strong connection. However, she notes that he "displayed no interest in his son"; thus, she thinks back further, remembers another thing they have in common, then relates to it in an attempt to "make him feel at home." The phrase "feel at home" denotes acceptance, belonging, and coziness—being warm and welcome—so Scout, in coming up with topics that will interest Mr. Cunningham, seeks to make him feel like a welcome person, to put herself in his shoes and consider what he would like to talk about, what would make him feel accepted as it would her. Through these moments in the text, Lee shows that empathy is relating to and identifying with another by setting aside one's own position and taking theirs.


Empathy is accomplished when one takes another's perspective in order to know how one's actions will affect them and to consider how those actions would make them feel. Jem and Scout find out in chapter 23 that Atticus has been insulted and threatened by Bob Ewell. They are confused as to why their dad did nothing to retaliate, why he just took it. He tells Jem, "[S]ee if you can stand in Bob Ewell's shoes a minute. I destroyed his last shred of credibility at the trial, if he had any to begin with… [I]f spitting in my face and threatening me saved Mayella Ewell one extra beating, that's something I'll gladly take. He had to take it out on somebody and I'd rather it be me than that houseful of children out there" (Lee 292-3). Atticus directs Jem to "stand in Bob Ewell's shoes" so that he can understand his perspective, and therefore how Atticus' actions could have affected him. Knowing Mr. Ewell has many children, and finding a common link therein, Atticus can relate to him, imagining how horrible it would be if his own children were beaten. Bob Ewell, upset over the trial, wants to take out his anger, so he displaces it onto Atticus, which Atticus says is better than his displacing it onto his children. Taking the pacifist route, Atticus avoids exacerbating the situation, aware that fighting back would cause things to worsen, and he steps outside himself to become aware of how his actions will have not just direct effects, but indirect effects as well: Angering Bob Ewell would make him want to physically harm Atticus, and would further encourage him to be more hostile to his children in addition. As such, Atticus takes into account the long-term consequences and empathizes because he is aware of how his actions could obviate a disaster. He thinks ahead—to Bob Ewell's children, to his own children, concluding, "I'd rather it be me than that houseful of children." A second example of considering the consequences of one's actions on another takes place when Scout, a couple of years later, reflects on how she treated Arthur "Boo" Radley. At the beginning of chapter 26, Scout is thinking about her life and passes the Radley house, of which she and Jem were always scared, and about which they had always heard rumors. She remembers all the times in the past she and her brother and their friend played outside, acting out what happened at the house. Pensively, she ponders, "I sometimes felt a twinge of remorse when passing by the old place [Radley house], at ever having taken part in what must have been sheer torment to Arthur Radley—what reasonable recluse wants children peeping through his shutters, delivering greetings on the end of a fishing-pole, wandering in his collards at night?" (Lee 324). Lee uses the word "remorse" here to conjure up feelings of guilt, regret, and shame, all associated with the way Scout feels about her actions. To say she feels a "twinge of remorse" is to say she feels compunction; that is, morally, she feels she has wronged the Radleys and, looking back, that what she did was wrong. She is contrite because she can stand back and objectively evaluate her deeds, deeds she deems unempathetic, considering they were inconsiderate of Arthur. Having become aware of the weight of her choices, Scout experiences regret, an important emotional reaction because it signifies empathy, insofar as it is representative of her taking into account how she affected another person; and, in this case, how she negatively impacted Arthur, which itself requires understanding him and relating to him. This regret, this guilt, is caused by the realization that her past actions were mean and thus morally wrong. Again, Scout puts herself in Arthur's shoes, imagining what it would reasonably be like to be a "recluse": Certainly, she affirms, she would not want "children peeping,… delivering greetings,… [or] wandering in [her] collards." The thought process is meant to mirror Arthur's, so Scout is actively relating to and understanding him, ultimately realizing how her conduct impacted him. Her scruples finally notify her that, from the perspective of the solitary Arthur, her behavior had a negative effect. Scout's awareness of the consequences of her actions makes her empathetic, for she has introjected Arthur's perspective. In conclusion, Atticus and Scout exhibit empathy because they both consider how their comportment affects others.


According to Lee, empathy is put into practice when one takes time to learn about another person, makes a personal connection with them, and considers how one's actions will affect them. We are social animals by nature, which means we desire close relationships; unfortunately, most of us seldom recognize the importance of understanding those with whom we have a relationship, leading to inconsiderateness, ignorance, and stereotypes. For such social animals, we all too often neglect the feelings and thoughts of others, even though they are of no less priority than ours. Therefore, empathy is a vital, indispensable tool in social interaction, one that helps us connect with others. As communication is revolutionized, worldviews shaken, and identities changed, it is essential that we learn to better understand others and never forget to empathize, lest we lose our humanity.

 


To Kill a Mockingbird by Harper Lee (1982)

Attention and Mindfulness (2 of 2)

Summary of part one: Attention is “the process of focusing conscious awareness, providing heightened sensitivity to a limited range of experience requiring more extensive information processing” and requires an external stimulus. Research by Colin Cherry (1953), Donald Broadbent (1958), and Anne Treisman (1964) found that we can attend to one task at a time, suppressing all other incoming stimuli, based on quality of sound.


"It is easy to eat without tasting," says Jon Kabat-Zinn in Coming to Our Senses (p. 118). At first glance, this sentence seems random, out of nowhere, and completely absurd. Of course we taste our food when we eat it! However, Kabat-Zinn argues that while we claim to experience and sense things, we do not truly experience them. His message throughout the book is that we have become out of touch with ourselves, with our senses, our bodies, and with the world around us; we fail to experience things for themselves, insofar as we rush through our lives, treating food as "just another meal," hastily consuming it, not really taking the time to taste each individual flavor. When we eat a hamburger, all we taste is hamburger, not meat, lettuce, tomato, etc., but just hamburger. Our meals are prepared, then eaten, but we do not taste them as they should be tasted. Kabat-Zinn states that when attention and intention team up, we are rewarded with connection; from connection, regulation; from regulation, order; and from order, we arrive at ease, at contentment.

There is an effect called sensory adaptation that we seldom recognize yet is always at work. Constant exposure to an external stimulus builds up our tolerance to it, resulting in the numbing of that sense, to the point that we do not notice the stimulus at all. That others can smell our body odor but we ourselves cannot is an example of this: our odor is constantly emanating, and the brain, to avoid distraction, builds up a tolerance, to the extent that we no longer smell our own bodies. The purpose of sensory adaptation is to prevent us from becoming entirely distracted. The world is full of smells, sounds, sights, touches, and tastes, but imagine if we were exposed to all of them at once—this is why we need to adapt to our senses. Of course, were we so rapt in studying that all else was ignored, the sound of a car would still interrupt us, since its intensity would overstimulate our senses. While sensory adaptation has helped us biologically, Kabat-Zinn notes that it also works to our disadvantage, particularly in the dampening of our senses, without which we cannot live.

Breathing is of especial importance in meditation. It is necessary to all living things, we must remember; yet we take it for granted, repeatedly neglecting it, forgetting to check how we are doing it. If we took a few minutes every day to attend to our breathing, we could all reduce stress, find composure, and even, with practice, lower our heart rate. This applies to all the senses. As Aristotle keenly reminds us, "[O]ur power of smell is less discriminating and in general inferior to that of many species of animals."[1] Unlike most animals, humans' sense of smell is weak, and so we rely less upon it. Smell and taste are underrated among the senses, although they are of equal merit. Like breathing, both are taken for granted, appreciated only when we are sick, when we can no longer use them—only then do we wish we could taste and smell again. Just as Kabat-Zinn said, we truthfully eat without tasting. Eating our food, we feel pleasure in the moment; but if we were sick in the same circumstances, we would appreciate our senses that much more; as such, we ought to live each day as though we were sick.


There are different kinds of meditation, different ways of being mindful. During meditation, you can do a body or sense scan, where you spend a few moments going through your body, focusing on the sensations in a particular part, examining it, then moving on; or you can, for a few minutes at a time, focus on each of your main senses, perhaps using only your ears for a minute, your nose the next. Proprioception is an obscure sense: it is the sensation of each body part in relation to the others. In a body scan, this is most prevalent, when you feel your body in its totality, as a whole, yet are able to focus on one part of it. William James, writing about boredom, could just as easily have been writing about this state of meditation:

The eyes are fixed on vacancy, the sounds of the world melt into confused unity, the attention is dispersed so that the whole body is felt, as it were, at once, and the foreground of consciousness is filled, if by anything, by a solemn sense of surrender to the empty passing of time.[2]

Typically, when one meditates, one can either close or open one's eyes, fixing them at a certain point, listening to the sounds of the world around, acknowledging every part of the body, paying attention to the breath, overcome by a static sense of stillness, as one is neither in the past nor the future but in the present, simply being, moment to moment. There are two types of attention in meditation: abstract, or inward, attention and sensory, or outward, attention. The former involves impartial introspection, the clearing of the mind, the decluttering of ideas. "This curious state of inhibition can for a few moments be produced by fixing the eyes on vacancy. Some persons can voluntarily empty their minds and 'think of nothing,'" wrote James, describing hypnotism, though inadvertently describing meditation as well.[3] Sensory attention, on the other hand, is simply being attentive to the senses and all incoming stimuli. If you are interested in meditation, there are several exercises that can sharpen your attentiveness, like dhāraṇā, jhāna, and samādhi, or you can practice some brahmavihāras. In dhāraṇā, the meditator is aware of themselves, as a whole and as meditating, and of an object; after dhāraṇā, they move to jhāna, which is awareness of being and of an object; and finally, in samādhi, they find themselves in unity with the object. Samādhi is translated as "one-pointedness" and refers to pure concentration, pure attention. When in this state, the meditator is in what William James calls voluntary attention. This attention occurs when there is a powerful stimulus, yet you focus on something of lesser intensity. If you are studying and there is noisy construction outside, focusing on the studying, even though the construction is louder and demands your attention, would be an act of voluntary attention. This state, however, cannot be held indefinitely. As James writes, "[S]ustained voluntary attention is a repetition of successive efforts which bring back [a] topic to the mind."[4] Hence there is no such thing as maintaining voluntary attention, only coming back to it over and over. Brahmavihāras are like reflections upon Buddhist virtues; there are four traditional ones: loving-kindness, compassion, joy, and equanimity. Feel free, too, to make your own meditation, wherein you reflect on something outside the given topics—questions in philosophy, like good and evil, justice, and the sort, are some starters.


I briefly mentioned the idea of clearing the mind, of emptying it of ideas, and to that I shall turn again. Thoughts, in Buddhist writings, are treated like clouds, wispy and flowing; they are temporary; sometimes they are clear, sometimes they clump together; sometimes they are sunny, sometimes they are like a storm. Either way, thoughts are not permanent, nor can they harm you in any way. Generally, we ought to be on the lookout for negative thoughts. When they arise, we must simply dismiss them. Thinking about our thoughts is gasoline on their fire, for it merely propagates more of them and makes them worse. It is better to let thoughts pass than to intervene through force. Meditation calls for dispelling all thoughts, good or bad. It is misleading to think that we are trying to get rid of them, that we are trying to single some thoughts out from others. This is not the case; rather, we must acknowledge that we are thinking and let the thoughts pass. If a positive thought comes, do not perpetuate it; let it pass. If a negative thought comes, do not perpetuate it; let it pass. Another thing to remember is that simply acknowledging that you are thinking is itself being mindful, and you should not get frustrated with yourself on that account.

An important facet of Buddhist psychology is the distinction between perception and conception. Perception is pure sensation, and conception is labeling, to put it simply. Sitting in peace and silence, you hear a sound, process it, identify it as the rustling of the trees or the singing of birds, and continue meditating—such is an act of conception, for hearing a sound is perception, but classifying it, labeling it, is conception. Labeling is necessary for living. Without it, there would be no way to comprehend the world; we would be exposed to a chaotic mess, an overwhelming tidal wave of sensations we could not understand. Almost everything we see and process is conceptualized: this is a tree, that is a plant, this is grass, that is dirt on which I am walking. One is tempted to think of Kant's categories of the mind and the differentiation between phenomena and noumena. Our mind actively shapes our world, grouping things together, creating causal links, imposing spatiotemporal relations, constantly conceiving things. Perception is to noumena as conception is to phenomena. Rarely do we perceive things as they are, as things-in-themselves; rather, we conceive them imperfectly. We need to carry this into meditation, in thought and in sensation. We must try not to classify things by texture, color, or shape, nor judge thoughts by appearance, nor label anything as "good" or "bad."

Another danger of thinking is daydreaming, to which all meditators are vulnerable, especially if their eyes are closed. When we doze off, finding comfort and relaxation, following our breath, we might accidentally slip into our fantasies, moving from the external to the internal, where we begin to plan for the future or reminisce about the past. Whichever we do, neither is good. William James warns us, "When absorbed in [passive] intellectual attention we become so inattentive to outer things as to be 'absent-minded,' 'abstracted,' or 'distrait.' All revery or concentrated meditation is apt to throw us into this state."[5] By meditation, James is referring not to it in our sense, but to the act of pondering. We should not fall into the trap of thinking about the future or ruminating on the past, because, as Marcus Aurelius said, "[M]an lives only in the present, in this fleeting instant: all the rest of his life is either past and gone, or not yet revealed."[6] The past is in the past, and there is nothing we can do to change it; wishing we could redo something will not help. And the future has not happened yet, so forming unrealistic expectations will not help either.


"But we do far more than emphasize things, and unite some, and keep others apart. We actually ignore most of the things before us," notes William James.[7] For so formidable a tool, one to which we all have access, the art of attention and its proper application has all but been forgotten by today's society, to its disadvantage. We live in an age when ADD is rampant and more and more kids are diagnosed with it. Further, our technology strips us of our connection to nature, to the world, to each other. We are no longer in touch with ourselves or our senses. With mindfulness and meditation, however, by living in the present and embracing our senses and our lives, we can make our lives meaningful.

 


[1] Aristotle, De Anima II.8, 421a9-10
[2] James, The Principles of Psychology, XI.2, p. 261
[3] Ibid.
[4] Id., XI.6, p. 272
[5] Id., p. 271
[6] Aurelius, Meditations, III.10
[7] James, op. cit., IX.5, p. 184

 

For further reading: Buddhist Psychology Vol. 3 by Geshe Tashi Tsering (2006)
The Principles of Psychology by William James (1990)
Coming to Our Senses by Jon Kabat-Zinn (2005)
Mindfulness by Joseph Goldstein (2016)
Meditations by Marcus Aurelius (2014)
Zen Training by Katsuki Sekida (1985)
De Anima by Aristotle (1990)

Attention and Mindfulness (1 of 2)

Attention is vital to our everyday lives. Some of us are better than others at paying attention, but regardless of skill, we all need it, whether we are learning in class or playing out in the field. In a world that values fast, immediate, instantaneous things, attention is slowly fading away, leaving us disoriented and scattered, in a culture where it is easy to be left behind if you are not fast enough. Not enough of us pay attention in our everyday lives, even in the simplest of tasks, failing to appreciate the beauty of life, missing the important things, letting life slip out of our grasp. Through a better understanding of what attention is and how it can be used in mindfulness, I believe we can all live more fulfilling lives.



In psychology, attention refers to "the process of focusing conscious awareness, providing heightened sensitivity to a limited range of experience requiring more extensive information processing."[1] Simply put, attention is the ability to focus your awareness and senses on a particular task, leading to better experience and understanding. In order for this focusing to occur, you need an external stimulus, such as a sound, and an active goal, which is your response to or classification of that stimulus. For example, if you hear a dog bark, the barking is the external stimulus, and your realizing it is a dog that is barking is the active goal. The act of paying attention is no single process but a combination of three processes (Posner, 1995): orienting the senses, controlling consciousness and voluntary behavior, and maintaining alertness. The first stage, orienting the senses, is what happens when your sensory organs are directed to the source of a stimulation. When you hear a sound coming from the left, it is your left ear that processes it first, as it is oriented to the direction from which the sound came. Similarly, when you touch something, your hand comes into direct contact with the object. Depending on which sense the stimulus activates, your cortex suppresses the other sensory organs while focusing on the active one: rarely do you need your eyes to smell something—it is the nose's job to do that. When you orient your senses, you tend to use your superior colliculus, responsible for eye movement; the thalamus, responsible for activating specific sensory systems; and the parietal lobe, which is usually responsible for giving us a sense of direction. The next stage is controlling consciousness and voluntary behavior, in which your brain decides just how much you want to focus on a particular sense. Your eyes, when paying attention to something, can dilate or constrict depending on the light, for example. This second stage's job, therefore, is to control your response to stimuli, and it uses the frontal lobe and basal ganglia, known for their role in controlling thoughts and actions. Third is maintaining alertness, which is indispensable for attention, for its job is to remain focused on a sense and ignore possible distractions. When you maintain alertness, you use distinct neural patterns in your reticular formation and frontal lobe. A type of attention known as selection is described as "the essence of attention" (Rees et al., 1997).[2] Selective attention is the ability to focus on something important and ignore other things, whereas selective inattention is the ability to ignore something important and focus on other things; the latter is used most often, either for good, as in diverting stress, or for bad, as in procrastinating.


Imagine you are at a party. You are sitting at a table with your friends, deep in conversation; the speakers are blasting music; there are people dancing; and there is another conversation across the room. Engrossed in the talk, you block out all sound besides your own conversation, when all of a sudden, you hear your name mentioned in the conversation across the room. The Cocktail Party Phenomenon, as it came to be called, was studied by Colin Cherry (1953), who found, startlingly, that not only is most information unconsciously processed, but some of this information, conscious or not, is prioritized above the rest. A contemporary of his, Donald Broadbent, developed the Broadbent Filter Model (1958) to attempt to explain why this is so. Fascinated by air traffic controllers, whose job it is to receive multiple incoming messages at once and in mere seconds make quick judgments about which is most important, Broadbent began to study divided attention, "the capacity to split attention or cognitive resources between two or more tasks"[3] (Craik et al., 1996), by using a method of testing called dichotic listening, in which a subject puts on a pair of headphones and is played a different message in each ear, simultaneously. Broadbent found that only one channel can be understood at a time, while the other is blocked out. He reasoned that there must be a theoretical, Y-shaped divergence in our minds that, when two inputs try to pass, lets one through and blocks access to the other. He said, further, that we have a short-term memory store that keeps track of these channels. The question remained, though: How does the brain decide which channel to let through? In another surprising conclusion, he found that in spoken language, meaning is understood only after processing; as such, content is not the decisive factor but rather quality of sound, like loudness, harshness, and the sex of the speaker. A loud, domineering voice, therefore, will be prioritized over a softer, nicer voice, even if the latter carries the more important message. Broadbent later went back and revised his model, stating that priority is based on a combination of the quality of the voice, the content of the words, and prior experience; a later psychologist, Anne Treisman, however, argued that during the Y-exchange, the second channel is not ignored, per se, but attenuated—this would explain the Cocktail Party Effect, because although you do not consciously hear your name, you still process it.
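To make the filter model concrete, here is a minimal sketch in Python (my own illustration, not taken from Broadbent, Treisman, or any of the texts cited here; all names and thresholds are invented for the example). Of two simultaneous channels, the one with the better quality of sound is passed through in full, while the other is merely attenuated, so that a highly salient word, such as one's own name, can still break through:

from dataclasses import dataclass

@dataclass
class Channel:
    words: list[str]
    loudness: float  # stand-in for Broadbent's "quality of sound," on a 0-to-1 scale

SALIENT = {"anna"}  # words with permanently low thresholds, e.g. one's own name

def attend(a: Channel, b: Channel) -> list[str]:
    # Broadbent's filter: select the channel with the better quality of sound.
    selected, unattended = (a, b) if a.loudness >= b.loudness else (b, a)
    heard = list(selected.words)  # the selected channel passes through in full
    # Treisman's revision: the unattended channel is attenuated, not blocked,
    # so a sufficiently salient word can still break through to awareness.
    attenuation = 0.3
    for word in unattended.words:
        if word.lower() in SALIENT and unattended.loudness * attenuation > 0.05:
            heard.append(f"[breakthrough] {word}")
    return heard

conversation = Channel(["let's", "grab", "coffee"], loudness=0.9)
across_room = Channel(["did", "you", "see", "Anna", "today"], loudness=0.5)
print(attend(conversation, across_room))
# -> ["let's", 'grab', 'coffee', '[breakthrough] Anna']

Run on the party scenario above, the sketch "hears" only the louder conversation, yet the name still slips past the filter, which is just the Cocktail Party Effect the two models were built to explain.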

 


[1] Westen, Psychology: Brain, Mind, & Culture, 2nd ed., p. 395
[2] Ibid., pp. 395-6
[3] Id., pp. 397-8

 

For further reading: Psychology: Brain, Mind, & Culture 2nd ed. by Drew Westen (1999)
Essentials of Psychology by Kendra Cherry (2010)
The Psychology Book by Wade E. Pickren (2014)
The Psychology Book by DK (2012)

Some Experiments on the Recognition of Speech, with One and with Two Ears by Colin Cherry (1953)

If Thou Art Pained By Any External Thing, It Is Not This Thing That Disturbs Thee, But Thy Own Judgment About It.

In book 8, section 47 of the Meditations, Marcus Aurelius writes,

If thou art pained by any external thing, it is not this thing that disturbs thee, but thy own judgment about it. And it is in thy power to wipe out this judgment now. But if anything in thy own disposition gives thee pain, who hinders thee from correcting thy opinion? And even if thou art pained because thou art not doing some particular thing which seems to thee to be right, why dost thou not rather act than complain?—but some insuperable obstacle is in the way?—Do not be grieved then, for the cause of its not being done depends not on thee.

Things in themselves are not nuisances; rather, we make them so ourselves. Nothing is either good or bad in itself, although we commonly, and incorrectly, think otherwise. The crucial point is that our thoughts, unlike external events, are within our power, and so we are able to change our thoughts, judgments, and perceptions to make things bearable. Aurelius tells us that if we are annoyed by ourselves, we ought not blame others or anything else; instead, we should take to correcting our opinions, for they belong to us, and so we can fix them. No one stops us from changing our way of thinking except ourselves. Further, he points out that when we think things we would rather not think, we often complain to others and to ourselves, ignorant of the true nature of the annoyance; as such, he advises us simply to change our thinking when it appears to be straying. When we notice negative thinking, acknowledging it and knowing it is the cause of our problems is one thing—but actually acting on it and changing it is another, and it is what we ought to do. Oftentimes, however, we impute our misfortune to some external thing, such as the day, leading to remarks like, "Today is not a good day," or, "Today does not like me"; but to attribute our personal torment to something impersonal, something external to us, is backward, for if the "day" is what causes our problems, then it is a force greater than us, outside of us, and therefore unconcerned with us. Because the courses of our days are outside our power, we can do nothing to change them; so instead of resisting them, we should allow them to carry on as they please, as per nature, and leave our attitude to ourselves. Hamlet expresses the same idea: "[T]here is nothing either good or bad, but thinking makes it so" (II.ii.265-66). Our perception, therefore, is what shapes our attitude, and so our life is what we make of it.

 

For further reading: Meditations by Marcus Aurelius (2014)

What is Humorism?

Psychology and medicine, which found their beginnings in Greek culture, have come a long way; since their speculative foundations, their influence has become larger, more pertinent, and more accurate than ever before, with the invention of prosthetics in medicine and the development of cognitive studies in psychology, for example. It seems as though anything is possible, as though nothing is beyond achievement. One may wonder, then, from where psychology came, and from whom modern medicine developed. Small questions never fail to come up regarding the origins of either discipline: why, when someone is in a bad mood, do we say they are in bad humor? Why, when someone is angry, do we call them hot-blooded or short-tempered? A glance through history, to the invention of psychology, can show us the foundation of both disciplines—the ancient system of humorism.


The theory of the four humors derives from the Pre-Socratic philosopher Empedocles (c. 490-430 BC), who posited the existence of four basic elements constituting all of reality: air, fire, earth, and water. Everything in the world, he explained, was a synthesis of all four, each contributing its unique characteristics and properties to create everyday objects. Early medical theory was thus based on philosophical theory, and the two subjects were closely intermingled, the cause of many a medical error in ancient times. The man whom we ought to credit for the beginnings of modern medicine is the Greek physician Hippocrates (c. 460-370 BC), who is most renowned for the Hippocratic Oath, which is still used today. Despite the countless contributions he made to medicine, there is difficulty in pinpointing which works he actually wrote and which were written by his student Polybus or perhaps even by rival doctors. Some of his works, furthermore, seem to diverge in content, contradicting earlier theories. Central to Hippocrates' method was a holistic approach to the body. "Hippocrates the Asclepiad says that the nature even of the body can only be understood as a whole," remarked Plato.[1] Each part of the body was to be examined against every other part, so as to treat everything as one. He wrote of a popular principle of the time: "Certain sophists and physicians say that it is not possible for any one [sic] to know medicine who does not know what man is."[2] Such importance placed upon the human body and its composition made the humoral theory possible, as well as the secularization of medicine itself. Apollo and Asclepius, the gods of medicine, were thought to be the causes of disease up until Hippocrates, who, diagnosing epilepsy—once thought the work of the gods—said it "appears… to be nowise more divine nor more sacred than any other disease, but has a natural cause from which it originates like other affectations."[3]


The natural cause of which Hippocrates spoke was the humors. From the Latin word umor, meaning fluid, the humors were four fluids within the body, each aligning with one of the four elements of Empedocles: blood with air, phlegm with water, black bile with earth, and yellow bile with fire. During the Scientific Revolution, the 17th-century physician William Harvey performed the studies on the circulatory system with which he would eventually disprove Hippocrates and Galen. Acknowledging the two physicians and Aristotle (who also supported the humoral theory), he wrote in his book on animal generation, “And thus they [the Ancients] arrived at their four humors, of which the pituita [phlegm] is held to be cold and moist; the black bile cold and dry; the yellow bile hot and dry; and the blood hot and moist.”[4] According to Hippocrates, one could tell whether the upcoming season would be one of sickness or health by analyzing the weather: if there were extreme conditions, like extreme coldness during winter or heavy rains during spring, then more diseases were to be expected, whereas normal conditions foretold health and prosperity. Cold seasons exacerbated the cold humors, phlegm and black bile, while warm seasons exacerbated the warm humors, yellow bile and blood. The alchemist Philippus Aureolus Theophrastus Bombastus von Hohenheim, or Paracelsus (1493-1541), was a notorious physician in his time, often burning the works of Galen in public to disrespect him and his theories. Instead of the four humors, Paracelsus preferred a more alchemical approach, diagnosing based on saltiness, sweetness, bitterness, and sourness, adding a fifth property, life. In addition, he gave these elements their own properties, such as combustibility, solidness, fluidity, and vaporousness. The human body has a balance to it, what Hippocrates judged as a body’s krasis (κρασις), or mixture. A healthy body has a good mixture, eucrasia (ευκρασια), meaning it has an even amount of all four humors. Pausanias, a doctor in The Symposium, explains that,

The course of the seasons is also full of both these principles; and when, as I was saying, the elements of hot and cold, moist and dry, attain the harmonious love of one another and blend in temperance and harmony, they bring to men,… health and plenty, and do them no harm.[5]

While one should strive for an ideal balance, eucrasia, one should stay as far away as possible from dyscrasia (δυσκρασια), or bad mixture, for if it is extreme, it can result in death. Too much phlegm (mucus), warns Hippocrates, can clog the throat, choking off airflow and resulting in asphyxiation, for instance. Another Renaissance physician, shortly after Paracelsus, Santorio Santorio (1561-1636), calculated that between the perfect balance of eucrasia and the imperfect balance of dyscrasia lie 80,000 unique diseases stemming from different combinations of humors. Determined to prove Hippocrates and Galen right, Santorio carried out extensive experiments, measuring the body’s temperature with a thermoscope before and after diagnosis, then measuring it daily thereafter, comparing each new temperature to the healthy one.


“Those diseases which medicines do not cure, iron cures; those which iron cannot cure, fire cures; and those which fire cannot cure, are to be reckoned wholly incurable,” stated Hippocrates confidently [6]. Should some poor soul suffer from dyscrasia, there were several cures to which he could turn, one for each type of imbalance. Hippocrates invested his faith in incisions, stating that iron, by which he means the knife, is the next step up from remedies; if surgery does not work, he says, one should proceed to cauterize; but if fire does not work, then one is out of luck. Other proposed cures were sweating and vomiting, which would expel or purge any excess humors. Then, of course, there was bloodletting, the deadly, imprecise method of making a cut in the skin and “cleansing” the blood. So popular was bloodletting that by the 1500s, “[t]reatment was still based on the Hippocratic theory of humors, and bloodletting was a panacea.”[7] Virtually any disease could be cured by bloodletting—that is, until William Harvey. Besides these cleansing methods, there was an easier, more efficient way of handling humoral diseases, one which required neither knife nor fire: using opposites to counteract. If there was too much blood, a doctor could counteract it with black bile, opposing the hotness and moistness of the former with the coldness and dryness of the latter; similarly, too much yellow bile could be countered with phlegm, and vice versa. Hippocrates was also famous for prescribing his patients varying diets that would counter the excess humor in the same way, usually advising the replacement of wheat with bread, of water with wine.
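To make the treatment-by-opposites logic concrete, here is a minimal sketch in Python (the table and function are my own illustration, not any historical formula): each humor carries two qualities, and an excess is countered by the humor bearing the opposite qualities.

```python
# Each humor paired with its two qualities, per the Harvey quote above.
QUALITIES = {
    "blood":       ("hot",  "moist"),   # air
    "yellow bile": ("hot",  "dry"),     # fire
    "black bile":  ("cold", "dry"),     # earth
    "phlegm":      ("cold", "moist"),   # water
}

OPPOSITE = {"hot": "cold", "cold": "hot", "moist": "dry", "dry": "moist"}

def counteracting_humor(excess: str) -> str:
    """Return the humor whose qualities oppose those of the excess humor."""
    target = tuple(OPPOSITE[q] for q in QUALITIES[excess])
    return next(h for h, qs in QUALITIES.items() if qs == target)

for humor in QUALITIES:
    print(f"Excess {humor} is countered by {counteracting_humor(humor)}.")
# Excess blood -> black bile; excess yellow bile -> phlegm; and vice
# versa: exactly the pairings described above.
```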


This raises the question, though: Why does humorism matter, why is it relevant at all, considering it is outdated and completely incorrect, and why should we be interested? As I said at the beginning, humorism was the foundation for psychology; specifically, the foundation for the psychology of personality, a much-studied and much-debated area of research today. The Roman physician Galen (c. 130-200) was arguably the first person to attempt a formal study of personality. A studied reader of the Hippocratic writings, a learned student of Stoic logic, Galen was an empiricist at heart, emphasizing experience over speculation, what he called demonstrative knowledge (επιστημη αποδεικτικη). Neither Hippocrates nor Galen studied the interior of the human body, as the dissection of humans was taboo; thus, their findings were purely theoretical, which is rather ironic for Galen, who did cut open animals, just not humans. Galen identified two types of passions: irascible passions, those which are negative, and concupiscible, those which are positive [8]. He observed four temperaments arising from the four humors. (Temperament, interestingly, translates to mixture!) In fact, “Before the invention of the clinical thermometer and even for some time afterwards, bodily ‘temperature’ was only a synonym for ‘temperament.’”[9] His theory of the four temperaments is so influential that its adjectives have carried over to today: too much blood creates a sanguine character, who is cheerful; too much phlegm a phlegmatic, who is calm; too much yellow bile a choleric, who is angry; and too much black bile a melancholic, who is gloomy; and for the latter two, one can say bilious. Hippocrates noticed these characteristics in his own time and attested to them, commenting, “Those who are mad from phlegm are quiet, and do not cry nor make a sound; but those from bile are vociferous, malignant, and will not be quiet, but are always doing something improper.”[10]
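The humor-to-temperament correspondence is mechanical enough to state as a lookup table. Here is a toy sketch of it (the humor levels and the diagnose function are hypothetical, my own illustration rather than any Galenic procedure):

```python
# Galen's four temperaments, keyed by the humor in excess (as above).
TEMPERAMENT = {
    "blood":       ("sanguine",    "cheerful"),
    "phlegm":      ("phlegmatic",  "calm"),
    "yellow bile": ("choleric",    "angry"),
    "black bile":  ("melancholic", "gloomy"),
}

def diagnose(levels: dict) -> str:
    """Name the temperament produced by the most abundant humor."""
    dominant = max(levels, key=levels.get)
    name, mood = TEMPERAMENT[dominant]
    return f"Excess {dominant}: a {name} character, who is {mood}."

print(diagnose({"blood": 3, "phlegm": 2, "yellow bile": 2, "black bile": 5}))
# Excess black bile: a melancholic character, who is gloomy.
```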


One may dissent again: Why is this relevant, if it is outdated? Although Galen’s theory of the four temperaments is largely out of use [11], it has spawned interest in succeeding psychologists of personality. The infamous Myers-Briggs Type Indicator, or MBTI (1943), can be seen as a derivative. It utilizes different traits to arrive at a certain personality: those who wish to know their personality have to decide whether they are introverted or extroverted, whether they intuit or sense, think or feel, and perceive or judge. Another option, the Big Five, or Big Three (1949), identifies people based on their levels of openness, conscientiousness, extraversion, agreeableness, and neuroticism; the Big Three limits the scales to neuroticism, extraversion, and openness. Lastly, the direct descendant is the psychologist Hans J. Eysenck (1916-1997), whose method of deducing personality was influenced entirely by Galen. Eysenck plotted personality along two axes, extraversion-introversion and neuroticism-stability (later adding psychoticism as a third dimension), recognizing in the resulting quadrants several character traits reminiscent of Galen.
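Eysenck’s scheme, in turn, can be pictured as a simple two-axis classifier. The sketch below places Galen’s four temperaments in the quadrants of Eysenck’s well-known diagram; the function itself is my own illustration of that figure, not Eysenck’s method:

```python
def temperament(extraverted: bool, neurotic: bool) -> str:
    """Map a position on Eysenck's two axes to a Galenic temperament."""
    if extraverted:
        return "choleric" if neurotic else "sanguine"
    return "melancholic" if neurotic else "phlegmatic"

print(temperament(extraverted=True, neurotic=False))   # sanguine
print(temperament(extraverted=False, neurotic=True))   # melancholic
```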


[1] Plato, Phaedrus, 270c
[2] Hippocrates, On Ancient Medicine, p. 13b*
[3] Hippocrates, On the Sacred Disease, p. 326a
[4] Harvey, Anatomical Exercises on the Generation of Animals, p. 435b*
[5] Plato, The Symposium, 188a
[6] Hippocrates, Aphorisms, §7, 87
[7] Durant, The Story of Civilization, Vol. 5, p. 532
[8] This is a very superficial description; for a more detailed one, read Aquinas’ Summa Theologica, 1.81.2, ad 1
[9] Boorstin, The Discoverers, p. 341
[10] Hippocrates, On the Sacred Disease, 337a
[11] Read Florence Littauer’s Personality Plus for a modern perspective

For further reading: 
Greek Thought: A Guide to Classical Knowledge by Jacques Brunschwig (2000)
The Oxford Companion to Classical Civilization by Simon Hornblower (1998)
Anatomical Exercises on the Generation of Animals by William Harvey

An Intellectual History of Psychology by Daniel N. Robinson (1995)
The Encyclopedia of Philosophy Vol. 3 by Paul Edwards (1967)
The Encyclopedia of Philosophy Vol. 4 by Paul Edwards (1967)
The Encyclopedia of Philosophy Vol. 6 by Paul Edwards (1967)
The Story of Civilization Vol. 2 by Will Durant (1966)
The Story of Civilization Vol. 3 by Will Durant (1972)
The Story of Civilization Vol. 5 by Will Durant (1953)
The Psychology Book by Wade E. Pickren (2014)
The Story of Psychology by Morton Hunt (1993)
The Discoverers by Daniel J. Boorstin (1983)
On the Sacred Disease by Hippocrates
On Ancient Medicine by Hippocrates
On the Natural Faculties by Galen

Extra reading for fun: Personality Plus by Florence Littauer (1992)

*Page references are to Great Books of the Western World, Vols. 9 and 26, ed. Mortimer J. Adler (1990), respectively

Jack and His Discontents (2 of 2)

We now move on to the late stages of Jack’s neuroticism. Jack, as we have learned, has been repressing his primitive instincts, meaning he has kept them out of the conscious, leaving the ideational presentations stuck in the unconscious, forgotten, neglected, left to multiply like fungus. As Freud said, the longer we keep our instincts repressed, the more time they have to regroup, come together, and create more resistance in our minds, creating tension and resulting in the censuring of the ego by the superego, ultimately creating a sense of guilt, the result of a fight-or-flight response. Freud spoke of an economy in the mind, a national reserve of sorts; when this reserve is depleted, the defense mechanisms of our mind break down. Repression requires energy, and the longer an idea is repressed, the more energy is consumed. By killing the pig, Jack has given his aggression a catalyst, so the impulses grow stronger, eating more energy, his repression slowly breaking down, his aggression shining through the cracks in little bits. We see that, after killing the pig, Jack becomes increasingly aggressive. Slowly but surely, the walls of his mind are crumbling down, and his aggression is able to come through. Ralph lectures Jack for not looking after the fire. Jack notices that he is in hostile territory, and his superego begins to hammer on his ego. The guilt that arises thereafter cannot be tolerated by Jack, who is guilty of not completing his duties and who, feeling threatened, turns the anger onto Piggy, presently punching him and knocking him down (Golding 66). Here, there is a struggle between the id, which wants to take out its aggression, and the superego, which instills a sense of guilt in Jack. The result is displacement: unable to cope with the greed of the id and the morality of the superego, the ego decides to appease them both by taking out its feelings on something weak, vulnerable, and defenseless—Piggy. In so doing, Jack has temporarily satisfied his id. Like a hungry child, the id, once fed, will return to normal, until it begins to grow hungry once more. What has just occurred is Jack acting out. Roger and Jack are both sadists. Golding describes a scene in which Roger throws rocks at the littlun Henry:

Here, invisible yet strong, was the taboo of old life. Round the squatting child was the protection of parents and school and policemen and the law. Roger’s arm was conditioned by a civilization that knew nothing of him and was in ruins (57).

Roger and Jack have both been raised in a society that values temperance, control, and politeness. They were scolded by their parents not to hurt their siblings; taught in school not to do mean things to other students; warned by the police not to break the law; conditioned by society to be behaved, to be like everyone else, to resist all urges. Think, then, what this has done to their inner aggression, to have been repressed to such an extent! But here, on the island, things are different; no longer is there a higher authority to keep the boys in check. Roger, free to do as he pleases, unable to be punished, can be aggressive and not get in trouble. However, it is strange that he refuses to hit Henry directly, throwing instead into a small circle around the boy. Law and morality still remain with him. Despite his freedom, the idea of restraint has been ingrained into his mind. That there is no evil in him is false; his throwing rocks at Henry is proof of the opposite—Roger’s dark side is stronger than his good, for all this time it has been growing uncontrollably powerful. All it took to release it was the absence of punishment, be it from an external force, like a parent, or an internal force, namely the superego. Without the restraints of civilization, Roger, like Jack, regresses to his primal self, his aggressive, savage self. Fromm wrote,

[I]f the situation changes, repressed desires become conscious and are acted out…. Another case in point is the change that occurs in the character when the total social situation changes. The sadistic character who may have posed as a meek or even friendly individual may become a fiend in a terroristic society…. Another may suppress sadistic behavior in all visible actions, while showing it in a subtle expression of the face or in seemingly harmless and marginal remarks.[1]

Put another way, Fromm is saying that the sadist will feign a pleasant character in a certain environment, say a school, but will reveal himself in a different context, such as an island. This echoes Freud, who also noted that society forces us to create reaction-formations: because we cannot satisfy our aggressive tendencies, we must be exceedingly gentle. Fromm also notes that the sadist, even in a safe environment, will not completely hide his nature, as there will be minor signs, like the expressions of the face of which he spoke.


Following this event, the next major stage in Jack’s neuroticism happens shortly before he kills the pig. Jack is by the riverside, collecting clay, then smearing it on his face, covering it up. He looks at himself in the river and is satisfied. “[T]he mask was a thing on its own, behind which Jack hid, liberated from shame and self-consciousness,” writes Golding (59). Hereafter, Jack relinquishes all remnants of his past life, devoured by his aggression, which takes control for the rest of the story. A small detail, the mask allows for disinhibition, letting Jack take on a whole new persona. This mask hides who Jack was, endows him with new strength, and lets him get away with anything. It is no longer Jack who is acting but the mask. If Jack kills Ralph, it is not Jack who does it, but the mask. One can think of the story of Gyges’ Ring as told in the Republic, in which a shepherd finds a ring that can make him invisible. Granted this awesome power, Gyges abuses it, making himself invisible, killing the king, and marrying his wife. Anonymity bestows upon its subject great powers, including license for immorality. The mask on Jack’s face lets him be sadistic, for he can no longer be ashamed. A sense of invincibility is coupled with invisibility, seeing as Jack, hiding himself behind the mask, feels untouchable, as though he can do whatever he wants, since it is not he who is doing it. Hence, no more responsibilities are expected of Jack. When Jack steals fire from Ralph, the two come face-to-face. Committing an unforgivable act, Jack would normally not be able to look the other boy in the face, an overwhelming feeling of guilt preventing him; but with his mask, Jack can easily steal from Ralph without thinking twice. Ralph, Piggy, and Samneric try to go after Jack and his hunters at the end, except that “[t]hey understood only too well the liberation into savagery that the concealing paint brought” (Golding 170). Golding adds further that, “Freed by the paint,… they were more comfortable than he [Ralph] was” (173). Anyone who puts on the mask of paint is relieved of all expectations, of all moral obligations, of all sensibleness. Freud observed that the barbarian was happier than the civilized man, inasmuch as the former could satisfy his impulses, whereas the latter could not; similarly, the hunters are more comfortable than Ralph because they can do what he cannot: gratify their aggression.


Thanatos, the major force through which Jack now operates, is committed to but one task: self-destruction, the return to the womb, to nothingness. Jack is never seen backing away from a daunting task, always one for a challenge, even if it may end up killing him. Eager to kill, Jack volunteers to go on pig hunts constantly, going so far as to hunt the dreaded beast that threatens their existence. Upon climbing the mountain, Ralph considers going back, but Jack calls him a coward, insisting that they go up. Ralph calls their mission a foolish one, and Jack agrees, continuing up the mountain, determined to kill the beast. If this is so, if Jack wants to destroy himself, why is it, then, that he kills the pig earlier in the book? Freud would answer, “It really seems as though it is necessary for us to destroy some other thing or person in order not to destroy ourselves.”[2] The real goal of Thanatos is destruction of the self, but Jack obviously does not want to die, consciously that is, so he must satisfy his death-instinct some other way, viz., by killing something else. It is a simple trade-off: kill something else to avoid killing myself. Like Prometheus, Jack tries to defy his god (his superego, rather) by stealing fire from their sacred home. It is a forbidden task, one that will surely result in suffering. Only, unlike Prometheus, Jack, despite almost being caught, gets away with it. This small act of defiance further tips the scale of his death-instinct.

Another trait of the sadist is that he is stimulated only by the helpless, never by those who are strong…. For the sadistic character there is only one admirable quality, and that is power. He admires,… those who have power, and he despises and wants to control those who are powerless and cannot fight back.[3]

Jack emulates Fromm’s description of the sadistic character when he orders his hunters to take the innocent Wilfred into custody to be tortured for no reason. Ralph asks Samneric why Jack ordered Wilfred to be tortured, but the twins have no answer. It seems Jack did so purely for pleasure, for fun, to fulfill his aggressive death-instinct. There is no rational reason for what he did, obviously, except that it was in his own self-interest and that he was able to exert control over a powerless being. The relationship between Ralph and Jack is an odd one, the latter’s respect for the former strained by his desire to remove him from power. Jack does not truly want to kill Ralph, for he harbors a sort of respect for him, for his demotic popularity; what Jack really wants is to have all the power for himself. Just a few hours before Jack captured and had Wilfred beaten, Roger horrendously killed Piggy, to which Jack reacted apathetically, coldly, disturbingly, responding by threatening Ralph that the same could happen to him. If Jack wanted Ralph dead, he could have done it long ago, and easily—but he did not.


“Few people ever have the chance to attain so much power that they can seduce themselves into the delusion that it might be absolute,”[4] commented Erich Fromm. Fortunately, this is true; unfortunately, it is still possible. Completely neurotic now, Jack has become like Mr. Kurtz, gaunt and savage, his loyal hunters willing to do anything for him, as he sits on his throne as though he were an idol, or a god. Power has indeed gotten to him now, to the point that he is worshiped, thought invincible, the true leader of the boys on the island.

In many cases the sadism is camouflaged in kindness and what looks like benevolence toward certain people in certain circumstances. But it would be erroneous to think that the kindness is simply intended to deceive, or even that it is only a gesture, not based on any genuine feeling. To understand this phenomenon better, it is necessary to consider that most sane people wish to preserve a self-image that makes them out to be human in at least some respects. [5] 

Jack may not be totally sane, but he does seek to maintain his human appearance. When he is not off hunting pigs, stealing fire, or torturing kids, Jack is seen giving plentiful rations to his own and his enemies’ people, not as an illusion, not as bait, but to appear in some way humane, to preserve what remains of his character. In fact, Jack invites Ralph and his friends to join his tribe rather pleasantly, offering them food and protection, all in a friendly tone, no force necessary. It is only later, when he has been confronted, that he compels Samneric to join the tribe by force. While this may be the last of his humanity, it does not change the fact that he is still savage. Having regressed completely to the beginning, Jack is now like his hunting ancestors, hosting ritualistic dances centered on sacrifices, complete with disturbing chants and entrancing rhythms. Jack has become so ill, so neurotic, so sadistic, that he has nearly fallen out of touch with reality, becoming more of a black hole than a human, sucking in all light, all that is good. Even pure-hearted Ralph and Piggy succumb to his darkness, joining one of the rituals, eventually killing their friend Simon in cold blood. Conclusively, Jack has become a deranged, sadistic neurotic.


In conclusion, to use the wise words of Piggy, “[P]eople [are] never quite what you thought they were” (Golding 49).

 

Glossary:
(Retrieved from Stephen Glazier’s Word Menu)


Acting out- Unconscious expression of previously repressed feelings through specific behavior
Aggression- Hostile, destructive behavior towards others
Death-instinct/Thanatos- Destructive, aggressive compulsion to achieve nonexistence
Defense mechanism- Any of various mental processes, including… displacement,… projection,… reaction-formation, regression, repression,…, used by the ego for protection against instinctual demands and to reduce anxiety
Disinhibition- Removal of inhibition (process of stopping an impulse)
Ego- Reality-oriented, structured component of personality that enables individual to function autonomously in the world
Ego-ideal/Superego- Aspect of personality involving conscience, guilt, imposition of moral standards, and introjected authoritative and ethical images
Guilt- Recurrent feeling of self-reproach or self-blame for something wrong, often something beyond one’s control
Id- Unconscious, unsocialized component of personality, containing unexpressed desires and motivations and driven by pleasure principle
Neuroticism- Emotional disorder involving basic repression of primary instinctual urge and reliance on defense mechanisms that results in symptoms or personality disturbance
Reaction-formation- Defense mechanism involving denial of unacceptable unconscious urges by behavior contrary to one’s own feelings
Regression- Defense mechanism involving return to behavior expressive of earlier developmental stage, usu. due to trauma, fixation, anxiety, or frustration
Repression- Defense mechanism in which threatening or unacceptable ideas or urges are forgotten
Sadism- Condition in which pleasure, esp. sexual, is derived from inflicting pain on others

 


[1] Fromm, The Anatomy of Human Destructiveness, pp. 107-8
[2] Qtd. in Fromm, id., p. 492
[3] Id., p. 325
[4] Id., p. 323
[5] Id., pp. 329-30

 

For further reading: 
A General Introduction to Psychoanalysis by Sigmund Freud (1975)
The Anatomy of Human Destructiveness by Erich Fromm (1992)
Civilization and Its Discontents by Sigmund Freud (1929)
Instincts and Their Vicissitudes by Sigmund Freud (1915)
The Ego and the Id by Sigmund Freud (1923)
Lord of the Flies by William Golding (2011)
Repression by Sigmund Freud (1915)

Jack and His Discontents (1 of 2)

So far I have examined Lord of the Flies under the microscopic lenses of Plato, Hobbes, and Nietzsche. One form of literary theory, a favorite among many, one which has been used on many pieces of writing, and which I will be using in this blog, is that of psychoanalysis, a branch of psychology developed by Sigmund Freud. A simple search containing both Lord of the Flies and psychoanalysis will easily generate several results, all of which are exactly the same, all of which are shallow, each of them focusing on the tripartite theory of the id, ego, and superego. What I seek to do in this blog, therefore, to distinguish my analysis from the others out there, is perform a case study on Lord of the Flies, a case study focused on one character in particular, a character central to the story, a character whose inner struggle is perfect for psychoanalyzing: Jack Merridew. By the end of this blog, I hope to prove that Jack suffers from neurotic sadism. A glossary can be found at the end to clarify any psychoanalytical terminology that I will be using.

Psychoanalysis is the study of the unconscious and how it affects the conscious mind. It was initially conceived by Freud under the impression that all mental illnesses were caused by sexual tensions derived from a young age. This first stage of his thought, in which sexual energy, or libido, caused mental illness after being kept out of the conscious, was eventually replaced by a finalized stage, characterized by a complete break from the libidinal theory, in which Freud turned instead to the life- and death-instincts, the latter earning heavy criticism from his followers. These two instincts are the main forces behind human behavior, and each has a different motivation: the life-instinct, called Eros, seeks self-preservation and reproduction, while the death-instinct, usually referred to as Thanatos, seeks self-destruction, sometimes “[expressing] itself as an instinct of destruction directed against the external world and other living organisms.”[1] Freud thus created a dualism of impulses in man, a Manichaean tension caused by an internal war of life against death, of creation against destruction. Freud wrote that

Civilization has been built up, under the pressure of the struggle for existence, by sacrifices in gratification of the primitive impulses, and it is to a great extent for ever being re-created, as each individual, successively joining the community, repeats the sacrifice of his instinctive pleasures for the common good.[2]

According to Freud, the only reason society exists is that individuals give up their individual instincts. If each individual were to indulge their death-instinct, the very instinct of aggression, the very instinct present in everyone, then there would be constant warfare, reckless murder, and rife torture; but, by renouncing and rejecting our impulses, by stifling them, by keeping them out of our conscious, we are able to coexist, to live peacefully and without fear of our aggressive tendencies kicking in and dominating us. There would be no more destruction, either of ourselves or of others. Freud said that civilization represses its desires, by which he means that we force these unacceptable ideas and fantasies out of our minds and into the unconscious, where they are left to fester, unable to torment the conscious mind.

[T]he more a man checks his aggressive tendencies towards others the more tyrannical, that is aggressive, he becomes in his ego-ideal…. [T]he more a man controls his aggressiveness, the more intense become the aggressive tendencies of his ego-ideal against his ego.[3]

Here Freud is saying that, over time, the repression of our instincts only makes the tension worse, as the longer they stay in the unconscious, the more persistent they become. The ego-ideal, synonymous with our conscience, becomes stressed as a result, censuring us in a harsher tone, criticizing our lack of control, nagging at us, the voice of authority growing stronger. As this happens, our reasoning diminishes, we lose control of our conscious, and slowly but surely our instincts slip out. However, civilization has not wholly reached this point, the reason being that we have redirected our instincts; Freud says that civilization thrives on sublimation, for it is the only productive way of combatting our desires. Because we all have within us aggression, a seething beast waiting to be released, we usually end up creating reaction-formations to fight back. Instead of letting all of our aggression out, we pretend as though we are happy and grateful, despite the terrifying reality below the surface. Little do we know that this pressure, this aggression, is bubbling in our depths.


Jack Merridew is an adolescent boy who was raised in England. In the beginning of the book, we immediately recognize him as a natural leader, a boy whose inherent nature is that of commanding, of gaining respect, of having his voice heard, of getting things done. For the most part, having grown up an English boy, in a Catholic household, as the head of his choir, he has good and proper morals. Jack’s whole life seems to be headed in a good direction, as he has excellent training in being a leader and in displaying Catholic morals. And like everyone else in society, he has been taught to sublimate his instincts, to hide them, to turn them into something productive. In the choir, Jack is able to reach deep into himself, take his inner aggression—with which he has not yet come to terms—and turn it into art, using his voice to express himself creatively, thereby redirecting his impulses into something acceptable. Further, as a devout Catholic, Jack has been disciplined to act faithfully and morally. Indulging in his dark instincts would not be very Catholic of him, so he has been taught to repress his desires and act out of kindness and compassion; as we know, though, this is the opposite of what he truly is inside: proof of reaction-formation. When Simon talks to the Lord of the Flies, the pig takes on the guise of a schoolteacher who says, “This has gone on far enough. My poor, misguided child, do you think you know better than I do?” (Golding 141). Golding himself was influenced to write the book after he taught at a young boys’ Catholic school, so it is no surprise that he should put a reference here. One can easily imagine the Lord of the Flies as a concerned, patronizing schoolteacher shaking his head disapprovingly, mocking Simon, for he knows that there is a darkness in all the boys, yet Simon has not yet embraced it. Jack has already given up his Catholic values and given in to his darkness, to the disappointment of the imaginary schoolteacher. The death-instinct still lurks unconsciously in Jack, however, and strongly, throughout the first half of the novel. When Jack tries to kill the first pig, he hesitates to drive the knife into it (Golding 25-6). There is a voice in Jack telling him that it is immoral, that the blood will be overwhelming, and that, ultimately, it will haunt him forever. Later, when the boys create a fire, Jack and Ralph both hesitate to light it, because the warnings of their parents still echo in their heads: Do not play with fire! Though they are boys held back by the words of adults, there is still aggression inside them, waiting to be acted upon.


The next stage of Jack’s neuroticism occurs with the whole pig incident, at which we just glanced. This stage is, perhaps, the most formidable, as it is the first sign we see of Jack’s aggressiveness being released. I like to think of Jack in this stage as regressing, not in the traditional sense, but in an evolutionary sense, insofar as he is almost reverting to his ancestral roots in hunter-gatherer civilizations. There is a scene when Jack goes hunting, in which we see him get down on all fours, as though stalking; in which we see him sniffing the ground, going so far as to sniff droppings; in which he traverses the jungle, spear in hand, ready to slaughter the pig without mercy (Golding 43-4). Erich Fromm captures this mentality in the following quote:

He [the hunter] returns to his natural state, becomes one with the animal, and is freed from the burden of the existential split: to be part of nature and transcend it by virtue of his consciousness. In stalking the animal, he and the animal become equals, even though man shows his superiority by the use of his weapons.[4]

Jack is seen reverting to his natural state of being, as a predator, as a hunter, getting down on all fours, so as to become one with nature, with the animal, so he can kill it, get food, and feed himself. There is a return, then, to the primitive instincts. Freud declared that “it is easy,… for a barbarian to be healthy; for the civilized man the task is a hard one.”[5] The barbarian, or in this case the hunter, is able to act freely on his aggression, for in doing so he gets to kill and ends up with food and is therefore happy; modern man, contrarily, must keep his aggression in check, must restrain himself from hurting, and hence he is tormented. Jack, channeling his inner hunter, is able to engage his aggression naturally, for it is natural, allowing him to kill without fear of reproach. For the hunter, killing is not for pleasure; killing is about survival. The question arises: Why the pig? We see that Jack becomes utterly obsessed with the pig, fixated even. Psychoanalytically, he does have a fixation. Thanatos, because it is pure energy, is expressed in a directed charge, similar to an electric current. Now that Jack can channel his death-instinct, he cathects it to the pig—that is, he directs his energy to an object: the pig. Consequently, Jack develops an object-cathexis, his instincts now fixated on the pig, the vulnerable animal now his prey. Evidence of this fixation is the fact that Jack claims that he will kill the pig “Next time—!” (Golding 26), not once, but thrice (Golding 28, 46). On three separate occasions Jack seems to take offense when someone asks him about the pig. It is safe to say that this is a sort of inferiority complex in Jack, a sort of rejection of himself. When he tried to kill the pig, he hesitated, and now he feels rejected, as though everyone thinks him weak as a result. Jack develops the strange idea that he is being judged, that he is an incompetent hunter, since he is unable to complete such a simple task, causing frustration. This pressure creates a stronger cathexis in Jack’s mind, for his failure to kill the pig makes him want to kill it even more, as he feels doing so will prove him both worthy and competent. At this point, Jack is concerned with meat and meat alone, not rescue, not building huts, but getting meat. Food was of paramount importance in hunter-gatherer society, especially meat, for it was more difficult to acquire than berries or nuts. It is logical, then, that Jack should become so obsessed with this task. During the time that Jack is fixated on the pig, there still remains resistance in him, resistance to the idea of killing—indeed, a man’s first kill haunts him forever, so it is a frightening ordeal for Jack. Talking to Ralph, Jack tries “to convey the compulsion to track down and kill that was swallowing him up” (Golding 46). Reflecting on his two failed missions to hunt the pig, Jack is in disbelief, repeating dreadfully, “I thought I might kill” (Ibid.). In Jack’s voice one can sense a feeling of unreality, considering he nearly killed for the first time. After killing the pig, Jack describes the experience as follows:

His mind was crowded with memories; memories of the knowledge that had come to them when they closed in on the struggling pig, knowledge that they had outwitted a living thing, imposed their will upon it, taken away its life like a long satisfying drink. (Golding 65)

Notwithstanding his initial fear of killing, Jack is overcome with great ecstasy. This disturbing imagery, that of killing being like “a long satisfying drink,” is not one of kindness and compassion, but of sadism, pure and simple. In addition to these early signs of sadism latent in Jack, there also arises evidence of paranoia, further suggestive of neuroticism. “If you’re hunting sometimes you catch yourself feeling as if…. [y]ou’re not hunting, but—being hunted, as if something’s behind you all the time in the jungle,” Jack confides in Ralph (Golding 48). This comment reveals another psychoanalytic insight into Jack, in that it reflects his projection of his aggression. Because he has not come to terms with the aggression that lingers inside him, because he feels threatened by this new-found aggression, Jack finds it necessary to project his aggression onto the world instead of taking responsibility for it himself, because doing so makes him feel safe, because it takes away the burden of having to deal with it.

 

Glossary:
(Retrieved from Stephen Glazier’s Word Menu)


Aggression- Hostile, destructive behavior towards others
Cathexis- Concentration or buildup of mental energy and emotional significance in connection with an idea, activity, or object
Death-instinct/Thanatos- Destructive, aggressive compulsion to achieve nonexistence
Ego- Reality-oriented, structured component of personality that enables individual to function autonomously in the world
Ego-ideal- Aspect of personality involving conscience, guilt, imposition of moral standards, and introjected authoritative and ethical images
Fixation- Extreme attachment to object or ideas associated with earlier stage of psychic development; halting of stage of personality development
Frustration- Disturbed state occurring when individual cannot attain goal or relieve tension
Neuroticism- Emotional disorder involving basic repression of primary instinctual urge and reliance on defense mechanisms that results in symptoms or personality disturbance
Object- “[T]hat in or through which it [an instinct] can achieve its aim” (Freud, Instincts and their Vicissitudes, p. 414b)
Obsession- Persistent, pervasive, disturbing fixation on an emotion, idea, object, or person
Paranoia- Persistent delusions of persecution or suspicion of others
Projection- Defense mechanism involving attribution of one’s own unacceptable or unwanted qualities and motives to others
Reaction-formation- Defense mechanism involving denial of unacceptable unconscious urges by behavior contrary to one’s own feelings
Regression- Defense mechanism involving return to behavior expressive of earlier developmental stage, usu. due to trauma, fixation, anxiety, or frustration
Repression- Defense mechanism in which threatening or unacceptable ideas or urges are forgotten
Sadism- Condition in which pleasure, esp. sexual, is derived from inflicting pain on others
Sublimation- Defense mechanism involving substitution of socialized behavior for unacceptable acting out of primary urge

 


[1] Freud, The Ego and the Id, p. 709b*
[2] Freud, A General Introduction to Psychoanalysis, p. 27
[3] Freud, The Ego and the Id, p. 715a-b
[4] Fromm, The Anatomy of Human Destructiveness, p. 156
[5] Qtd. in Seldes, The Great Thoughts, p. 149

 

For further reading: 
A General Introduction to Psychoanalysis by Sigmund Freud (1975)
The Anatomy of Human Destructiveness by Erich Fromm (1992)
Civilization and Its Discontents by Sigmund Freud (1929)
Instincts and Their Vicissitudes by Sigmund Freud (1915)
The Ego and the Id by Sigmund Freud (1923)
Lord of the Flies by William Golding (2011)
Repression by Sigmund Freud (1915)

*Asterisked page references are to Great Books of the Western World Vol. 54, ed. Mortimer J. Adler (1990)

 

The German Romantic Philosophers (2 of 5)

As we have seen, Romanticism was a revolt against the rational, ordered view of the world championed by the Enlightenment, particularly in France. The Germans responded vehemently with their own revolution, inspiring a surge in faith, in the visceral, the emotional. Intuition was favored above knowledge, perception above conception, passion above reason. Part two shall analyze the first two figures of German Romanticism, Jacobi and Herder, the former a champion of faith, the latter of nationalism.

Friedrich Heinrich Jacobi (1743-1819) was a controversial figure, partly because he wrote several polemics, many of which sought to challenge the status quo established by the Enlightenment, and partly because of his fiery personality, the cause of many a bitter falling-out (like the pantheism controversy), despite his befriending several vanguards of the time. Recognized primarily as a polemicist, Jacobi was an obstinate critic of the Enlightenment. He saw the apotheosis of reason as distasteful, for reason was, in his view, subordinate to faith, an opinion shared by a fellow Romantic and, at one time, friend, Johann Georg Hamann. Science, obtained by observation and supposedly based on reason, had its foundations built upon empiricism, thus making it not objective, as it is advertised. Jacobi detested science, not only because it aided the Enlightenment in forming a logical theory of the universe, but because it deified the objective, the so-called “true reality.” In all reality, all the scientists were doing was watching phenomena through their senses, then turning those observations into laws, laws that put boundaries on the world, limiting it, depriving it of its beauty, supplanting faith with reason. This, Jacobi thought, led to nihilism, a term actually first used by Jacobi himself. The Enlightenment was nihilistic in that it was fatalistic–it condemned humanity, the entire universe, to a predetermined course, all of its occurrences explainable by scientific laws, thereby reducing all values to science. Immanuel Kant was one of the biggest influences on the German philosophers, on modernity, and, at one point, on Jacobi; yet this did not last long, for Jacobi later criticized Kant, dismissing his theory of things-in-themselves and charging him with creating a system of subjective idealism. Faith, as I have said, was central to Jacobi, and the word for faith, in German Glaube, caused him some problems when it was interpreted in different ways. Hence Jacobi identified faith with both its traditional definition and with belief, since he felt both were necessary. Intuition, according to Jacobi, was an experienced truth supported by faith. If we drop an object, it will fall. But just seeing this truth in action is not enough; we must also have faith in this empirical observation, because without it there still exists doubt, uncertainty. Further, faith and feeling are superior to reason for the simple fact that reason is formed after sensation, after feeling: first we see the object drop, then we formulate the notion that there is an effect which draws the object downward. In order for faith to work, Jacobi invites us all to take a salto mortale, a leap of faith, a concept mistakenly attributed to Kierkegaard. Faith is almost like a blind trust, for we must put our whole certainty into something, regardless of whether we are certain or not. This leap, then, signifies our trust in the world; it is the first principle upon which philosophy is based for Jacobi. Critiquing Kant, Jacobi noted that the thing-in-itself, existing in the transcendental noumenal realm, cannot be grasped by reason, precisely as Kant said. If that is so, how can Kant be sure the things-in-themselves exist? Jacobi says that we know they exist not through reason, but through faith. It follows that faith in the phenomenal world means faith in the noumenal world. Jacobi can hardly be called an Existentialist, but he did place emphasis on the person, rather than on speculative metaphysics.
In an idea we will investigate further with Fichte, Jacobi said there is an I and a Thou, in other words a subject and an object. Only through the object can the subject be known, for the subject is revealed by resisting something outside of itself. Finding too much emphasis placed on theory in philosophy, Jacobi turned to action, stating that we are identifiable through action; we are what we do.


While arguably one of the most influential Romantics, yet also one of the least known, Johann Gottfried Herder (1744-1803) made a lasting contribution to civilization, not only as a philosopher, but as a historian, a debatable nationalist, a psychologist, and a linguistic theorist. Nationalism can be traced neither exactly nor accurately to Herder, but he is definitely responsible for a different type: cultural nationalism. In his four-volume series Ideas for the Philosophy of the History of Mankind (1784-1791), Herder sets up the idea of a creative consciousness. This consciousness is a historical process, one that has shaped, shapes, and will shape society, its work seen throughout history. Creative consciousness is not exactly a forerunner of evolution, yet it is similar, inasmuch as it likens society to a species of some sort, constantly evolving, adapting, assimilating, progressing. As this consciousness progresses, cultures come and go, cultures being different groups of people sharing common beliefs and practices. Every culture is distinct from its contemporaries, exhibiting its own characteristics, the likes of which have never been seen before nor will be seen again. Cultures, therefore, are unique, and there exists no blueprint, no recognizable pattern, for them; they cannot be predicted, insofar as there is no objective framework on which they are based; cultures are blank slates, each able to receive its own special impress. Herder believed in world peace: cultures, he thought, should coexist without encroaching on one another, and should respect one another. Based on these beliefs, Herder did not like empires, nations that conquered other nations: by removing a culture and reorienting it, empires remove an entire past, an entire history distinct to that culture alone; and that history can never be recovered, for it has already been disrespected. Moreover, conquest removes the identity of the culture itself, practically wiping it clean from the world. The discipline of historicism was also influenced by Herder. To understand a certain aspect of a culture, Herder said, you must understand the entire culture. For example, I took a similar approach when researching this topic: so that I could understand the environment in which the Romantics were operating, I had to study not just the philosophy of the period, but the history, the art, the literature, the music, and the people. Reserved for every culture is a specific goal, a main focus, what Herder calls the Schwerpunkt of a culture. Identifying this central idea will help you envisage the context into which you are delving. And what is culture without people? The people, the Volk, constitute a collective spirit, a Volksgeist, that is immanent in that culture. A culture is created by the nature of humans, their expression, and their environment. Regarding the first, humans are sociable; second, Herder praises art, seeing it as the expression not just of a person but of a people, there being no objective beauty, allowing cultures to figuratively speak to one another; third, he sets up environmental studies, asserting that the development of a culture is dependent upon its environment. “We live in a world we ourselves create,”[1] and, “Nature has separated nations not only by woods and mountains, seas and deserts, rivers and climates but most particularly by languages, inclinations, and character…,”[2] he says.
Also a psychologist, Herder can be called an eliminativist, a view I explain here. Simply put, Herder rejects all “folk psychology,” meaning he discredits claims about faculties like “reason,” “will,” or “desire.” He also believes that Man is entirely physical, discarding Cartesian dualism altogether and stating that only the brain and the body make up the individual. The idea that the brain is compartmentalized, with individual sections hooked up to individual functions like eating, is also rejected. Herder thought it ridiculous, mystical even, that one part of the brain could do one thing and another something else. Predating Gestalt theory, he says, “The inner man, with all his dark forces, stimuli, and impulses, is simply one.”[3] The whole is greater than the sum of its parts, in other words. Man is not identified with his desires, with his thinking, with his instincts, but with all of them; Man is not a single part but a collection. Within every man is an animating force, Kraft. Herder theorized this living force as fundamental to being, to existence; without it, there would be no life. Kraft is what allows us to function, to live, to interact with the real world, with objects outside of us, phenomena. Moving on to his linguistic theory, we find a revolutionary view, one that gave me an epiphany. Language developed and evolved alongside humans–that much is evident, and it was accepted. Each culture had a language different from its neighbors’, for they progressed differently, and their language had to fit their specific needs. This led Herder to create a theory that inspired the Sapir-Whorf hypothesis: because language was relative to each culture, it followed that language must accordingly affect how cultures experienced the world. Insofar as there is no such thing as a perfect translation, words will have different connotations to different people; therefore, we all see the world differently based on our language. What made me really think was Herder relating the conscience to an “inward speaking.”[4] This, coupled with Wittgenstein’s picture theory of language, makes up the internal dialogue of the mind. We tend to think in pictures, and when we talk to ourselves in our heads, there is an inaudible voice, but a voice that we can altogether understand, as it uses language–it speaks.

 


[1] Smith, The Norton History of the Human Sciences, p. 337
[2] Meinecke, Cosmopolitanism and the National State, pp. 25-26
[3] Herder, Sämtliche Werke, Vol. 8, p. 179
[4] Herder, Sämtliche Werke, Vol. 21, p. 88

 

For further reading:
Ideas: A History of Thought and Invention, from Fire to Freud by Peter Watson (2006)
Oxford Dictionary of Philosophy 3rd ed. by Simon Blackburn (2016)
The Oxford Companion to Philosophy by Ted Honderich (1995)
The Encyclopedia of Philosophy Vol. 3 by Paul Edwards (1967)
The Encyclopedia of Philosophy Vol. 4 by Paul Edwards (1967)
The Passion of the Western Mind by Richard Tarnas (1993)
The Story of Civilization Vol. 10 by Will Durant (1967)
Dictionary of Philosophy by Thomas Mautner (2005)
Stanford Encyclopedia of Philosophy: Friedrich Jacobi
The Roots of Romanticism by Isaiah Berlin (1999)

Conformity – Part 1

That we are social animals is not up for question. In order to survive, we must stick together, form communities, and work with each other. Without teamwork, without cooperation, without a common goal, and without a mold to which we shape ourselves, civilization would not be possible, seeing as we rely on our peers to overcome obstacles and create laws, which we obey. Put simply, it is within our nature to conform–to shape together. Trends, norms, and values would not be possible were conformity to disappear from our nature. We must, however, begin to ask ourselves: Just how much should we conform? How much of our individuality ought we give up to the group? Postwar research and recent studies are beginning to show exactly how much freedom we are giving up.

In the early 20th century, the psychologist Muzafer Sherif conducted one of the first experiments in social psychology. Sherif placed three individuals in a pitch-black room with only a single point of light, exploiting an optical illusion known as autokinesis, which misleads observers into believing the stationary light is moving. Sherif would then ask the subjects to estimate how far the light moved. One by one the participants gave their own guesses, each adjusting their number according to the previous estimate. As Sherif carried this out several times, he noticed that the guesses converged: the subjects would change their opinions to fit in with the others. Because there was no real answer, and because they were unsure, the participants, Sherif concluded, would always look to their peers for guidance. When he did the same experiment in private, the individuals would still give the collective opinion, the group’s effect still fresh in their minds, having made a permanent impression. Whenever we are approached with an ambiguous problem, we are disposed to conforming with the popular opinion, so as to derive confidence. A score of years later, Solomon Asch, skeptical of Sherif’s test, which had no objectively correct answer, carried out the experiments now known as the Asch paradigm. With a sample of 123 participants, Asch took groups of eight: seven of them confederates, accomplices of the experimenter with knowledge of the experiment, and one subject, who had no knowledge of what the experiment entailed. Every experiment comprised 18 tests wherein the participants were presented with two cards, a single line on one and a set of three lines of different lengths on the other, exactly one of which matched the line on the first card. The eight participants were seated at a table, and the subject was always seated either second-to-last or last, so they would hear the confederates first. On six of the 18 tests, the confederates were instructed to give the correct answer, and the subject did as well; however, on the remaining 12, the confederates purposely gave incorrect answers. Come the subject’s turn, they would surrender to peer pressure, selecting the wrong line 37% of the time, even when the correct answer was obvious. Though two-fifths does not seem substantial, it is a shockingly clear demonstration of how much we conform. Later, Asch did the same test with the same participants, this time in private. Without the confederates, the subjects answered correctly, admitting that they would have felt embarrassed had they not said what the majority had. When asked how they felt, those who had answered correctly–even with the confederates–said they felt nervous and not entirely sure. “Living in society requires consensus as an indispensable condition. But consensus, to be productive, requires that each individual contribute independently out of his experience and insight,”¹ wrote Asch after publishing his studies.
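The convergence Sherif observed is easy to caricature in code. Below is a toy simulation of norm formation in Python: each round, every participant nudges their estimate toward the group mean. The starting range and the adjustment weight of 0.5 are my own assumptions, chosen purely for illustration, not parameters from Sherif’s study.

```python
import random

# Three subjects repeatedly estimate how far the light "moved" (in inches),
# each time adjusting halfway toward the group mean.
random.seed(0)
estimates = [random.uniform(2.0, 10.0) for _ in range(3)]

for round_number in range(1, 6):
    group_mean = sum(estimates) / len(estimates)
    estimates = [e + 0.5 * (group_mean - e) for e in estimates]
    print(round_number, [f"{e:.2f}" for e in estimates])
# The three estimates collapse toward a shared group norm within a few
# rounds, mirroring the convergence Sherif reported.
```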

Anyone who plays sports knows how it feels to have family and friends cheering them on from the stands. Some find it supportive, and it helps them focus; others find it distracting, and it disrupts their performance. Psychologically, it is much more fascinating, much more complex, its effects dependent upon the circumstances. Personally, as a runner who has competed in numerous competitions, I have had my fair share of passionate spectators yelling my name and screaming words of encouragement; in sprinting, however, the moment is gone in seconds, so I never have a chance to register them. When someone’s performance is enhanced by an audience, the psychological phenomenon is called social facilitation, a finding from experimentation done by Lee E. Travis (1925), which saw 80% of participants improve. J. Pessin, Richard W. Husband, and Robert Zajonc, in the 1930s and mid-1960s, respectively, found that social facilitation occurs only when an individual has mastered a specific skill, meaning it is harder to learn or practice a new skill in front of an audience than a learned one. In accord with the latter of the three, Edward E. Jones and H.B. Gerard (1967) believed this was a result of arousal, stating that an audience acts as a stimulus, but that it also places more stress on the performer, greatly distracting them from the task at hand. Nickolas B. Cottrell, Dennis L. Wack, Gary J. Selerack, Robert H. Rittle, Thomas Henchy, and David C. Glass (1968) conducted their own tests, claiming instead that the phenomenon occurs as a result of apprehension about being evaluated. They found that subjects performed worse both before a blindfolded audience and when told the audience would not be evaluating them; contrariwise, if they were told they would be evaluated afterward, they did better. Further research from Alan E. Gross, B.S. Riemer, and Barry E. Collins (1973) supported this, adding that the audience’s perceived attitude, positive or negative, affected performers’ confidence, too. Were the audience to provide encouraging feedback, all the while watching, the performer felt more confident, more optimistic, whereas those given discouraging feedback exhibited lower confidence. What all of these studies show is that the individual is driven by two factors: the need to be watched and evaluated and the need for a positive self-image. In other words, we care a lot about how we are perceived by others.

Conformity is a “change in behavior or belief toward a group as a result of real or imagined group pressure,”² wrote Charles and Sara Kiesler (1969). While it is helpful to know what conformity is and how it occurs, it is important to know why it occurs, why we conform. Richard S. Crutchfield (1955) stated that when confronted with group pressure, an individual can choose to conform, to remain independent, or to anti-conform. The first option, of course, means doing whatever the group is doing, joining in; the second allows individuals to be themselves, to do whatever they want, regardless of the group’s desires; the third and final option is dissent, an act of rebellion, wherein the individual does the opposite of what the group does. Interestingly, this last can be a paradox: are anti-conformists true individuals, or are they simply conforming to non-conformity? A triangle is used to represent the three choices, with independence at its own vertex, as it lies on a different plane from the other two. Richard H. Willis (1965), on the other hand, thought there were four choices: independence, variability, conformity, and anti-conformity. Using a diamond, Willis suggested measuring an individual’s decisions by movement along its vertical and horizontal axes. Now that we have a solid layout of how we may choose what to do, the question remains why we may decide to join a group at all. Once again there are three core ideas: attractiveness, competence, and status. When speaking of attractiveness, psychologists refer not to physical attraction but to the power of something, namely the group in question, to attract. A group’s attractiveness can be gauged by how cohesive it is, how united its members are, how well it functions as a team. Obviously a unanimous vote correlates with stronger attractiveness, as in Sherif’s autokinetic experiment, when participants would go with the collective answer. In Asch’s experiment, if six of the confederates gave one wrong answer and the seventh gave a completely different one, the subject was more willing to give their own answer, the group’s cohesion being no longer so powerful. Thus, the bigger the group and the more cohesive it is, the higher the probability of conforming. The second and third factors, competence and status, concern how experienced a group or its members are and where they stand socially. Those with lower confidence are more likely to conform, for they rely on more intelligent, more skilled individuals whom they look up to as leaders. Usually a single person will be considered the leader by the rest of the group: a strong-willed person, one with experience, one with knowledge, one considered “higher” in rank. It is easy to understand, then, why the masses are so easily exploited, especially by demagogues, who take up the guise of a leader, who take advantage of the people, moving them with their words, molding them; and the people almost always willingly listen, so they conform. One need not look further than Hitler or Stalin, who used their enticing, volatile propaganda to convince an entire people to conform, to ask no questions, to heed the majority opinion. Occasionally the leader is put under pressure, inasmuch as they are forced to make all the decisions; with the people’s faith placed in them, it is incumbent upon the leader to do the right thing, and everyone will follow. Conformists seek approval; they seek acceptance.
Adolescents are predisposed toward conformity, since it is at that age that they begin to discover who they are, and so they seek identity in others. Teenagers will go out of their way to be someone they are not, so as to elicit their peers’ approval, to feel like “one of the cool kids.”

In conclusion, conformity is natural, an inherent tendency from birth, a need to feel belonging, to feel a part of something greater than oneself, to be part of a community. Not a day goes by that we do not choose whether to conform, to be independent, or to anti-conform. Every day bears a new choice, a new opportunity to choose whom you will follow, whom you will lead, who you will become. The bottom line is, you have to conform in one way or another, but just how much you conform is up to you.

¹ Pickren, The Psychology Book, p. 320
² Tedeschi, Social Psychology, p. 547

For further reading:
1001 Ideas That Changed the Way We Think by Robert Arp (2013)
The Psychology Book by Catherine Collin (2012)
The Psychology Book by Wade Pickren (2014)
Social Psychology by James Tedeschi (1976)