Why Do We Root for the Good Guys?

Warning: Lord of the Flies and Game of Thrones (Season 6) Spoilers! 

I grew up watching movies. My favorites were action movies, where the good guy shot up his enemies and performed exciting stunts in flaming buildings in order to stop some evil-doer from doing something terrible. Of course, there were also the classics that I adored, such as Star Wars, a classic good vs. evil story. Back then, I liked to think myself quite the devil’s advocate, hopping to the other side, wondering what would happen if the bad guy won this time, then cheering for them. It made me wonder as a young child: Why do the good guys always win? There are always two sides to the story, so why weren’t the villains’ sides considered? No matter whom I rooted for, good or bad, it was always the good who vanquished the bad, who stood victorious in the name of peace and order. This eternal struggle between good and evil, this Manichæan theme, this dualistic battle—it is not just present in cinema, but permeates all of Western culture, from its videogames to its literature to its mythologies to its historiography. This narrative is woven into our daily life. As such, how earth-shattering it is to read Nietzsche: “No one has… expressed the slightest doubt or hesitation in judging the ‘good man’ to be of a higher value than the ‘evil man….’ But! What if we suppose the reverse were true? What then?”—indeed, what then? [1]

Everyone has a Will to Power, believed Nietzsche. Deep down, hidden in the unconscious, there is an unknown, life-preserving, exploitative, driving urge that permeates every living thing. When people act out of this unconscious Will, they are not to be blamed, for this Will is natural. To Nietzsche, it seemed absurd to say that anyone who acted on this Will to Power was blameworthy because, in essence, it is the Will that is intrinsic to them. “A measure of force,” he said, “is just such a measure of impetus, will, action.”[2] Therefore, throughout nature, embedded in all our willed, voluntary actions is the Will to Power. The Will to Power is inherent to all animals, which are always seeking not the most happiness, but the most power, and are always avoiding that which prevents power. By power, Nietzsche meant the ability to triumph, to master one’s surroundings and prevail, to exploit to the best of one’s abilities so as to live longer, by whatever means necessary. Hence, “[A]n injurious, oppressive, exploitative or destructive action cannot be intrinsically wrong, inasmuch as life is essentially something which functions by injuring, oppressing, exploiting, and destroying, and it is absolutely inconceivable without such a characteristic.”[3] Basically, all actions we judge today as wrong are, to Nietzsche, natural expressions of the Will to Power. In fact, we should not judge them at all, because, as illustrated in the quote above, Nietzsche saw life rather pessimistically, describing it as a dog-eat-dog, every-man-for-himself competition, where only the strongest survive. One gets the idea from Nietzsche, then, that one can only make it through life by embracing these violent, aggressive, harmful qualities. A philologist and historian, Nietzsche concluded from his studies that ancient man was naturally sadistic: He enjoyed participating in violence and loved inflicting cruelty, deriving a savage pleasure from it.
Punishment was an important part of daily life back then, so, Nietzsche proposed, those who were quick to inflict suffering were seen as good, while those who were hesitant, who were slow to deliver punishment for a forgotten debt, were seen as incompetent. Nietzsche held this cruelty to be the direct product of the Will to Power. He went so far as to say that cruelty is “something to which the heart says a hearty yes.”[4] This sounds frightening. Do we really delight in cruelty, even in today’s modern, civilized world, so distant from our barbaric past? While we may be in denial or firm disagreement, thinking such a sentiment disgusting or repugnant, we must concede that we do take pleasure in cruelty, even if it is minimal. After all, we all know that wonderful German word schadenfreude—the joy we get from watching others’ misfortune. Nietzsche remarked that today, although we do not go around gaily slaughtering each other as our ancestors did, we still enjoy cruelty in other, less explicit ways, such as video games, movies, and combat sports like wrestling and MMA. In this way, we have not completely gotten rid of cruelty, but have rather channeled it through vicarious means, not directly inflicting it, but still experiencing it. But how many of us would willingly admit that we enjoy watching—or even inflicting—pain? Nietzsche foresaw this, even saw it in his own time: We are more likely to believe in fate or chance or free will than in the Will to Power, an idea that repulses us and that, we insist, could not possibly be in our psyches. Our unwillingness to accept this exploitative Will, reasoned Nietzsche, leads to what he called “misarchism,” or hatred of rulers and ruling. By this he meant that we hate the idea of power and all its associations. To say that history’s great men were shaped by this Will to Power rather than by their cultures or destinies seems to us impossible to accept.
Think of all the brutal, bloodthirsty dictators and authoritarians throughout history! We fear power, to the point of detesting it, and we worry about its applications everywhere. Nietzsche passionately rejected Darwin’s theory of natural selection, explaining that organisms sought not survival, but flourishing. Organisms are not content with simply surviving. The lion did not survive natural selection only to settle down, feeling himself lucky to have outlived his competitors; he survived to gain more power, to be dominant, and therefore to dominate his environment and prey. Adaptation is more about being proactive than reactive. Adaptation is achieved through internalizing conflicts. Progress, in Nietzsche’s eyes, is a necessary sacrifice of the weak to the powerful. He thought the strong could live by themselves. They were autonomous. In following their own morality, they could live on their own terms, unbeholden. The weak hold us back, he wrote. This gives us a picture of Nietzsche’s ideal man. The ideal man affirms, not denies, his Will to Power. Just as the best government has the fewest laws, so the best man has the fewest moral values save his own. He follows his own morality, not society’s. He stands out from the herd. He seeks power, not pleasure; those who seek pleasure avoid pain, but pain is inevitable, leading to “pessimism of sensibility,” or conscience. In what Mencken calls “ideal anarchy,” every man does what pleases him, and him alone. The ideal man concerns himself with himself, and no one else. Spontaneous, instinctive, and unconscious, he acts on his Will, embracing what Nietzsche calls his instinct for freedom. Unlike the weak, who feel responsibility for their actions, the strong feel no guilt or responsibility, but act in the moment, unafraid of the consequences, yet wholly accepting them.

There are two kinds of people in this world: Masters and slaves. According to Nietzsche, all moralities can be divided under these two classes. In tracing the history of the concepts of Good and Evil, Nietzsche found in early societies a primitive form of this duality, finding it to be between not Good and Evil, but Good and Bad. He discovered that these two words are linked etymologically to the aristocracy: The aristocrats, the rich and powerful, called themselves “Good” and everyone who was not an aristocrat, the poor and powerless, “Bad.” In other words, the idea of Goodness developed from the nobility, from the upper class, which often consisted of the dominant few who had most of the land and owned slaves. They thought themselves the best, superior to everyone else, as they had control over resources—among them, people.[5] Seeing as they were educated and could do whatever they pleased with their property, it was only fitting, Nietzsche thought, that they should differentiate themselves from the masses, whom they considered lowly and base. The nobility possessed what Nietzsche calls the pathos of distance—that feeling of separation between oneself and others, especially of higher from lower, owner from owned. This worldview said that whatever was not aristocratic was bad, so all slaves were bad, in that they lacked everything the nobility had. What distinguishes the master from the slave is power. Thus, anything that goes against power is slavish and therefore bad, meaning the virtues we so often praise, such as temperance and compassion, are bad qualities, to the extent that they are anti-power. A change took place in these societies when religions like Judaism and Christianity began amassing followers, pandering to the masses, particularly the slaves.
Suddenly, the consensus was, “The wretched are alone the good; the poor, the weak, the lowly are alone the good… but you, on the other hand,… you men of power, you are for all eternity the evil, the horrible, the covetous, the insatiate, the godless.”[6] Religion created an inversion of the noble morality, turning Good and Bad into Good vs. Evil. There was, accordingly, a twofold inversion: The Bad became the Evil, and it was no longer a coexistence but a competition of values, of which there could be only one victor. Through this inversion, the weak made themselves “stronger” than their oppressors. By painting their enemies as Evil, the manifestation of all things contemptible, the slaves managed to get the upper hand, convincing themselves that they were happier than their masters. They glorified suffering rather than domination. Nietzsche named this approach the ascetic ideal, which he defined as “an attempt to seem ‘too good’ for this world, a sort of holy debauchery.”[7] He says “too good for this world” as a way of satirizing this otherworldly approach, which emphasizes the pure and the heavenly, calling for the renunciation of the appetites, a call to a virtuous life, one that will be rewarded in the next life. These ascetics parade their “holy debauchery,” whereby they take pride in their virtuous, saintly life; in their denial of this world; and in their holier-than-thou comportment. Foreshadowing Freud, Nietzsche theorized that the repression of the Will to Power that took place in asceticism led to “bad conscience,” a concept similar to guilt. Simply, Judeo-Christian morality taught that it was wrong to act on the Will to Power, so its followers repressed, or kept in check, their instincts; guilt arises, then, when one’s instincts turn upon oneself. These built-up instincts, having no outlet, are accordingly relieved by self-inflicted suffering.
This “internalization of man,” Nietzsche diagnosed, is what made the weak appear strong yet remain weak; for the Will cannot be fully renounced after all, but finds its way out in the cleverest of ways. He noted how they paradoxically “use[d] power to dam the sources of power…. [A] baleful eye is cast at physiological well-being, especially against the expression of such well-being,… while a sense of joy is experienced and sought in… wilful privation, in self-denial and flagellation.”[8] It is through the Will that the weak try in futility to deny it. They cast away their inner nature, condemning those who are complicit, who partake in it. Though a minority, they convince themselves that they are right and the others wrong, that they are guided aright while the others are misguided; and they take pride in their apparent pureness, meekly seeking absolution, as if it were the proper pursuit, a struggle that will, in the end, be rewarded justly in the next life, where those who gave in to temptation suffer eternally in damnation. Psychologically, this results in ressentiment, a feeling of deep-seated animosity or hatred of the oppressed directed toward the oppressor, over whom they have no control. Again prefiguring Freudian theory, Nietzsche develops an early form of displacement; i.e., redirecting one’s feelings onto an object or person. In this case, the oppressed, who in reality can do nothing against their powerful rulers, fabricate their own mythology, in which the oppressors are punished in the name of the weak. Therefore, ressentiment is a form of catharsis, a release, if you will, of anger, which is relieved through imagined retribution. The slaves, who are by nature weak and bear their suffering thereby, impute this suffering to the strong, whom they blame for their condition. Pleasing oneself, or indulging the Will, consequently, is seen as bad.
All acts exhibited as Will become frowned upon, made into crimes: Those who want something and take it for themselves—a quality admired by the noble—are called covetous, and those who please themselves tirelessly, always taking more—self-preservation, and thus symbolic of a master—are called insatiate. Evidently, noble virtues become slavish vices, and noble vices become slavish virtues. The Will presents itself as weakness, which is interpreted by the slaves as strength, so they convince themselves that they chose it, that it is, as Nietzsche called it, an “achievement.” They are excited to have “tamed” the Will! To summarize, “The strong man’s objective is to take as much as he can from his victim; the weak man’s is to save as much as he can from his conqueror.”[9] Without hesitation, without thought, the strong man takes what he wants; the slave denies his Will and represses it.

All this sounds quite abstract and foreign, admittedly, as it might to most of us at first. However, I shall highlight some relevant, modern-day examples to illustrate that what Nietzsche describes is not idle speculation about a distant time period, but entirely applicable and easily found in Western culture. A while ago, I wrote a blog post on Lord of the Flies, wherein I discussed the Will to Power. Based on this discussion, I would ask, Who really won in Lord of the Flies? The answer, undoubtedly, is Jack. Although Ralph may have been saved by civilization, the damage was done, and in an alternate ending, he would have ended up dying at the hands of Jack and his merciless tribe. All throughout the novel, we readers are quietly cheering for Ralph and Piggy, the untainted, the pure, the civilized, to survive and triumph over the brutal savages into which the other boys had devolved. How terrible it would be if those brutes, those aggressive, violent, primitive hunters had the island to themselves! What chaos would ensue! Yet, in the end, Ralph and Piggy, the protagonists, were slaves to society’s morality; they unthinkingly followed the herd instinct. They did not question the morality imposed on them by society, which taught them to behave and to control their impulses, to stifle their Will. On the other hand, Jack and his tribe fully embraced their Will to Power. Channelling the primordial hunter within them, they expressed their instincts through aggression, such as when Jack hunts the pig or when Robert terrorizes the smaller boys—in either case, the boys felt not just a great pleasure but a feeling of power, of power over something: exploitation. Whereas Piggy and Ralph were like small gazelles trying to survive, Jack was like a lion trying to predominate. It was the strongest who won.

A classic example of the battle between Good and Evil is the (currently) seven-film Star Wars saga. Based on Campbell’s The Hero with a Thousand Faces, Star Wars follows the age-old theme of Light and Dark and the cosmic duel between opposing forces. Interwoven into its narrative is the desire for the good guys—the Jedi, in this case—to beat the bad guys—the Sith—so that intergalactic peace can be maintained. So why exactly are the Jedi and Sith at odds? Why are they enemies of each other even though they both harness the same energy—the Force? The Sith, who practice what is called the “Dark side of the Force,” are called Evil by the Jedi because the Dark side is known to be tempting and thence corrupting. The learned masters warn their padawans not to be drawn to the Dark side, lest they gratify their instincts, no matter how natural or easy those instincts are to gratify. In essence, the Jedi are saying to choose virtue over vice. Sound familiar? The Jedi are the slaves, the Sith the masters. If we further examine the two orders, we shall find even better evidence. Both orders adhere to their respective codes, which outline their core beliefs. Here is the Sith Code:

Peace is a lie. There is only Passion.

Through Passion I gain Strength.

Through Strength I gain Power.

Through Power I gain Victory.

Through Victory my chains are Broken.

The Force shall free me.

It can be gathered from this that central to the Sith philosophy is the idea of a blind, erratic chaos which governs all. There is no order in the galaxy, only disorder. The key to the Sith is aggression, which comes from the Will, and is pure, focused anger. It is through the instincts that power is both achieved and channeled, from which comes victory, after which follows freedom. Accordingly, it is the directing of the Will that sets them free; they engage their instinct for freedom, which the slaves deny. Another part of their code “encouraged the strong to destroy the weak, and insisted on the importance of struggling and surviving”; and the master and his student always sized each other up, for “a weak master deserved to be overthrown by their pupil, just as a weak pupil deserved to be replaced by a worthier, more powerful recruit.”[10] Words like “worthier,” “powerful,” and “weak” can all be connected to the master-slave morality, having originated from the aristocracy. From this perspective, the Sith favor the strong, thinking themselves superior to the Jedi, whom they consider, conversely, the slaves. Nietzsche emphasized overcoming one’s struggles through exploitation, sort of like an extreme survival of the fittest, to use Spencer’s term. Therefore, the students of Sith masters, if they were deemed too weak, were replaced to make room for better, stronger, more Willful students. Darth Vader said, “Anger and pain are natural and part of growth…. They make you strong.” Both emotions named stem from the unconscious, the self-preservational, and both are biologically necessary, according to Nietzsche. Today’s Western civilization devalues anger, calling it an ugly, unproductive emotion, and discourages it. To the Sith and Nietzsche, however, anger is a necessary emotion through which the individual overcomes himself and becomes something, someone, better. Now let’s examine the Jedi:

There is no emotion, there is peace.

There is no ignorance, there is knowledge.

There is no passion, there is serenity.

There is no chaos, there is harmony.

There is no death, there is the Force.

Looking at the parallel structures of the two codes, you will notice the Jedi Code is an exact inversion of the Sith Code! Compare this to what Nietzsche claims occurred millennia ago, when the Judeo-Christian slaves pulled a complete reversal on their masters, thus establishing the slave morality, the opposite of the noble values. The Jedi deny any chaos, instead affirming harmony; the Jedi deny the passions, instead affirming asceticism, a turn away from them. To say someone is emotional is usually not a compliment, as it usually means they are over-dramatic, easily upset, or moody; so when the Jedi say there are no emotions, they are basically denying the Will to Power, excising it entirely from their worldview, because according to them, emotions lead to chaos, whereas the absence of passion leads to peace. The wisest of the Jedi, Master Yoda—everyone’s favorite backwards-speaking native of Dagobah—has a wealth of quotable adages, among them many attacks on the Sith, one of which goes, “Powerful you have become, the dark side I sense in you.” Automatically, he associates “power” with the dark side, for it denotes exploitation, injury, and all the other volitions Nietzsche named. He also says, “[I]f you choose the quick and easy path… you will become an agent of evil.” Yoda uses the phrase “agent of evil” deliberately here; his choice of words is intentional. Recall that through ressentiment, the slaves change Bad to Evil so that it looks like they are being oppressed; similarly, Yoda calls the Sith Evil, whereas the Sith would most likely call Yoda Bad, in accordance with the aristocratic morality. And when he calls the dark side the “quick and easy path,” he calls it such because it is easier, he knows, to gratify one’s instincts than to repress them, as he does.

Finally, I shall examine the very popular HBO show Game of Thrones, in which I found much food for thought. As with every narrative, we always cheer for the good side and boo the bad side. While watching, I asked myself, Why do we like the Starks and hate the Lannisters? What is it about the two houses that makes us favor one over the other? How do our values shape whom we side with? Eddard “Ned” Stark is the first major character with whom the audience starts to feel an affinity. He is the archetypal “good guy” because he is pure and ascetic, and he denies his Will. Compassionate, considerate, fatherly, and humble, Ned is loved by all because he is so virtuous and caring—we would never expect him to burn down a village of innocents, for example: It is not in his character to do so. His resistance to his Will made him weak and oppressed, though. Why would we be cheering for an oppressed character? It is precisely because of his weakness that we like him: We feel pity for him, and we want him to prevail against evil, we want him to succeed, we want him to stand up against the oppressors, we want retribution, we want a David and Goliath story. The weak, we have learned, always blame their oppressors, so we naturally blame the Lannisters and acquit the Starks, who have suffered at the hands of the former. Unfortunately, it is Ned’s purity, his refraining from the rampant corruption, dishonesty, and moral bankruptcy around him, and his loyalty to a moral code that lead to his downfall. With every step backward the Starks take and every step forward the Lannisters take, the more we love and pity the Starks and the more we hate and abhor the Lannisters, who seem to take everything they want, rapacious, immoral, and exploitative.
We viewers suffer from the pessimism of sensibility: There is so much suffering in the show—too much—that we become disillusioned, feeling that life is unfair, that there is no equality, and so we become disheartened every time the Starks suffer a loss; we suffer with them. We want justice for the cruel acts the Lannisters commit against the defenseless. The Lannisters do anything that will get them ahead, even if it means blurring the lines of what is considered moral, using whatever is to their advantage, cheating when they can. Hence, Jaime and Cersei, the foremost Lannisters, are masters. Jaime Lannister has a simple, egocentric worldview: He and Cersei are the only two people in the world who matter, and nothing else does. In other words, Jaime cares only about himself and Cersei, and he is willing to do whatever he needs to in order to protect her. Instead of compiling a list of ethics, Jaime has a simple goal, with no guidelines. Anything goes. He can do whatever he pleases, as long as it is for his and Cersei’s sake. Even when Jaime is the prisoner of Brienne, supposedly making Brienne the master and Jaime the slave, Jaime remains the master after all. Pretty much every action movie I have seen has a scene where the good guy has a captured enemy who taunts him, encouraging him to strike, to lose his temper and ignite his fury, but the good guy refuses, calms himself, collects his nerves, remembers his values, and does not give in to the volatile words. Just as Emperor Sidious in Star Wars tells Luke to act on his anger but Luke refuses to surrender to the dark side, so Jaime tries to enrage Brienne, clearly unnerving her, then tells her to release her anger on him, because he knows she wants to; the fire lights in her eyes and she raises her sword, but then she drops it, remembering her promise, and chooses the “noble path,” the ascetic path. She wants to hurt him, deep down. She wants to be cruel.
But she resists her Will on account of a “higher order.” Jaime, then, has the real advantage over Brienne. While she may be the one with the sword, and while he may be the one tied up, it is he who holds dominance, who is most powerful. Another encounter, this time with Edmure Tully, takes place in a tent; this time, the positions have changed, Edmure being the prisoner, Jaime being the keeper. Edmure tells Jaime, “You understand you’re an evil man.” After a discussion that leads to the subject of Catelyn Stark, Edmure’s sister and Jaime’s former captor, Jaime states, “Catelyn Stark hated me like you hate me, but I didn’t hate her. I admired her, far more than I did her husband or her son” (S6:E08). Like Yoda, Edmure Tully calls Jaime “Evil” to mark him as his opposite. While Edmure is Good, a saint, Jaime is Evil, a sinner. One of the characteristics of the noble master, Nietzsche claimed, is a “love of their enemy”; meanwhile, the slaves despise those they call Evil. The strong respect their enemies because they define themselves in relation to them. Without the Bad, there can be no Good. Nobles, therefore, respect those lower than them, because they have power over them. Jaime’s sister, Cersei, also has a straightforward moral code: “I do things because they feel good” (S6:E10). In that episode, Cersei turns the tables on her zealot-captor Septa Unella. She says Unella made her suffer not out of compassion or a desire to see her purify herself, but out of an inner, biological craving for cruelty that comes from the Will. She made her miserable because she loved to inflict pain, which, Cersei confides, she, too, experiences. Cersei does not follow a pre-established morality; rather, she makes her own, doing whatsoever she pleases, whensoever she pleases, if it benefits her, even if it means killing thousands—even if, among those thousands, there are innocents.
That is, she does not think before acting, but forms her morality afterward, from what she has done. Nietzsche explained that pleasure is not what is good for oneself or what makes one feel pleasant; pleasure is just a byproduct which accompanies an increase in power. Consequently, whenever Cersei does something because it pleases her, it really means she does it because she gains power, and her Will to Power is fulfilled. When she makes a decision, Cersei does not consider what effect it may have on others, especially the slaves; she only does what will further her cause. Another character who values power is Ellaria Sand, widow of Oberyn Martell, who, after killing Doran Martell, proclaims, “Weak men will never rule Dorne again” (S6:E01). Because Doran did nothing, Ellaria decided to take power into her own hands, stabbing him in order to gain control, so that she could rule Dorne, this time with purpose and conviction. Doran did not do anything. He preferred peace and was thus inactive. And weak. He did not take initiative, did not affirm his Will, and so let his country suffer. Instead of a slave, Dorne needed a master to rule. Two other characters—Daenerys and Grey Worm—ought to be evaluated as well. Dany, the so-called liberator of men, is not herself liberated, but enslaved, not in the sense of being indebted to another, but insofar as she is dependent on a higher morality, one that demands quiescence of the Will, and which seeks to eliminate the Will in others, the masters of Slaver’s Bay. She is pitying and merciful, yet at the same time she possesses a certain brutality. As it is, Dany cannot be strictly classified as a master or a slave, insofar as she simultaneously hinders her Will and incites it. Her loyal soldier, Grey Worm, has a telling exchange with Tyrion.
Tyrion asks, “Why don’t either of you ever drink?” to which Grey Worm replies, “Unsullied never drink.” Unconvinced, Tyrion queries, “Why not?” Grey Worm says, “Rules,” and Tyrion counters, “And who made these rules, your former masters?” (S6:E08). Here, Tyrion remarks that Grey Worm, despite being a freed man, still lives by his old masters’ rules, and so remains enslaved. Morality, to Nietzsche, is a herd instinct; put another way, morality is something to which the weak flock, as though they are herd animals, and in which they invest blind trust, accepting it without questioning it, living by its rules without ever stopping to ask why, slaves to tradition, shackled to its ascetic ethics. Grey Worm does not live by his own, self-invented rules; he does not affirm himself; he denies his power and surrenders it to another.

What Nietzsche painted is a bleak, unaffectionate, uninviting, savage picture, in which the strong dominate the weak, and inequality reigns supreme alongside chaos and anarchy. Do I personally agree with what he said? I agree that our Western values have been and are influenced by, and even derived from, the Judeo-Christian traditions, which valued asceticism and renunciation of the passions in favor of a virtuous, happy, and content life lived with value. It is not hard to see that this morality is ingrained in our culture, even in the 21st century. I agree that we are approaching a time of nihilism, when our traditions are collapsing around us, and we are slowly losing these long-cherished values. I disagree with Nietzsche, however, that it is the strong and powerful who must triumph, that the slave morality is subversive and self-defeating. It is true that Nietzsche never explicitly expressed contempt for the slave morality; he just disapproved of it. Notwithstanding, today’s values have undergone changes within the last two millennia, and they will inevitably continue to change with the ages. The next time you are watching a movie or TV show, the next time you find yourself cheering for the good guy, remember that there are two sides to every story. Our protagonists all have motivations, but so do our villains. Whether you find yourself lounging on the couch, in bed, or in the theater, watching the eternal cosmic dance of Good and Evil, consider what you value and why you value it. Was the point of this essay to convince you to start rooting for the bad guys? Not at all. It is to get you thinking. It is to get you to consider things from a different perspective—something we all ought to do every now and then. “You are aware of my demand upon philosophers,” said Nietzsche—“that they should take up a stand Beyond Good and Evil.”[11]

[1] Nietzsche, On the Genealogy of Morals, p. 9, Preface, §6
[2] Id., p. 32, Essay 1, §13
[3] p. 62, Essay 2, §11
[4] p. 52, Essay 2, §6
[5] Aristocrat derives from the Greek aristos, meaning “best.”
[6] Nietzsche, op. cit., p. 22, Essay 1, §7
[7] p. 81, Essay 3, §1
[8] p. 104, Essay 3, §11
[9] Mencken, The Philosophy of Friedrich Nietzsche, p. 61
[10] http://starwars.wikia.com/wiki/Sith
[11] Nietzsche, Twilight of the Idols, p. 33
For further reading: On the Genealogy of Morals by Friedrich Nietzsche (2013)
The Philosophy of Friedrich Nietzsche by H.L. Mencken (2006)
Twilight of the Idols by Friedrich Nietzsche (2008)

Human Rights During the Industrial Revolution


At the beginning of the 18th century, people lived in small, isolated rural communities where they worked their farms. Families tended their land and grew what they needed, subsisting on it. Because these communities were small, everyone knew each other. Then wealthy landowners began to enclose the rural lands, forcing farmers to migrate to the cities, where, the farmers thought, they could find work and build a new life. Factories were beginning to replace farms as the engines of production. The spreading use of coal powered these factories, making them more plentiful but also more dangerous. As more people moved to the cities, and as factories popped up by the tens, the cities became massive centers of civilization, growing substantially from the 1750s to the late 1800s. The Industrial Revolution was dramatically harmful to human rights because, although it provided opportunities for work and a new life, it violated people’s rights to social services and to work far more than it furthered them.

During industrialization, people’s right to social security—being secure in their lives—and to social services was infringed upon because they could not support themselves. According to Elizabeth Gaskell, those living in industrial Manchester lived sordidly in cheap, low-quality apartments called tenements, whose conditions were unsightly and barely fit for humans (Gaskell, A Tale of Manchester Life). Gaskell wrote the book in the middle of Manchester’s industrialization for audiences outside England, so that they could read about the horrors of living in the city. As a professional writer, it was her job to write objectively yet engagingly. Her negative attitude toward the people’s living conditions is evident in her criticism of their homes: she points out how hastily they were built, such that they posed a danger to those living in them. This supports the idea that social security was not present in Manchester, because everyday people could not afford to live in a safe home. Compared with their previous lives on their farms, the people who moved to the city were miserable, for they could not support themselves on the meager wages they earned. That residents often could not pay for education or for a decent home confirms that social security was lacking during the industrial period. A Timelines.tv documentary states that residents of Manchester were crowded into tiny spaces and shared bathrooms along the river, which propagated cholera. Cramped together, they infected one another and could afford only shabby, crudely built houses. The documentary’s description confirms the terrible standards of living in Manchester. Confined to small spaces unfit for full families, and susceptible to disease, residents’ lives were always in danger, whether inside or out.
Despite advances in medicine since their agricultural days, the people had been less prone to disease before moving to the city. After moving, they lost their rights to comfortable housing, affordable medical care, good food, security, and livelihood.

Factories were the powerhouses of the Industrial Revolution, and the workers who ran them were deprived of their rights as laborers. Flora Tristan wrote that the workers in Manchester were in a wretched condition: poor, starved, choked, unhealthy, and forced to labor all day, they lived miserable lives. Half the day they spent working, only to return home to an empty stomach and no proper comfort (Source J). An activist and defender of women’s rights, Tristan wrote her journal in response to the industrialization around her, to the rapid changes occurring in 1842, her focus being on how people were treated. Her journal was published, so it was most likely publicized with the intent of letting people know how abhorrent working conditions in Manchester were, or perhaps of winning reforms. Given her background, it is clear she would have looked down upon industrialization, particularly upon how it debased human rights. As such, her journal was a plea to make workers’ conditions more humane. Tristan attests to the poor treatment of workers who, before living in the city, had been their own bosses, worked their own hours, and earned their respective pay. The factory work was long, tedious, and demanding, yet the workers were not paid fairly for it; as a result of their unfair wages, workers were denied the right to a dignified existence from their work, and had to go home to run-down apartments, sparse in furniture, lacking clothes, and filled with the alcohol their unfulfilling lives forced upon them. The article “Why did Great Britain Industrialize 1st?” notes that industrial England had no unions and actively banned them, meaning workers could not come together and argue for their rights; hence, entrepreneurs could do with them whatever they liked.
The article reveals the unfairness and inequality with which workers were treated. Workers’ guilds, established to give workers representation, had kept them from being exploited; but when the guilds were banned, workers were left open to exploitation, in the hands of rich business owners. This supports the idea that workers were deprived of their right to form a union because unions themselves were discouraged and disbanded; workers could therefore not represent themselves as they had in the past, but had their rights violated at the hands of the government.

To industrialize effectively without impinging on human rights, it is necessary to go slowly and thoughtfully, not to rush. Industrialization requires that businesses be created so that the economy can prosper and more jobs can become available. All of these things are vital to making progress, as they improve human rights. But when unions are removed, leaving workers exposed; when houses are built quickly and thoughtlessly, close together and dirty, dark and dank; when factories rise by the tens, with no supervision, then industrialization turns into a nightmare. Hence, it is important that an industrializing nation undertake the coordination and planning required to build a nation that helps, not hinders, its people.

Huxley on Experiencing Life and Being Human

“‘But I don’t want comfort. I want God. I want poetry. I want real danger. I want freedom. I want goodness. I want sin.'”
“‘In fact… you’re claiming the right to be unhappy.'”
“‘All right then… I’m claiming the right to be unhappy.'”


Aldous Huxley (1894-1963), noted philosopher and author of the dystopia Brave New World, from which these lines are taken.


Brave New World by Aldous Huxley (2007)

Who was Solon?

Today’s politics hardly takes itself seriously. With weak leadership, horrible class inequality, and polarization, this generation is going through a rough time in a democracy where its voice is rarely heard, let alone acted upon. Back in Ancient Greece, politics was everybody’s business; it was every citizen’s duty to contribute to the polis and partake in its affairs. At a young age, children were taught rhetoric and schooled in politics to prepare them for leadership, as a good leader was valued above all else. The Greeks had the same struggles we have today, including the abuse of power by the rich, select few; the inept distribution of wealth; and conflicting party viewpoints. And like today, the Greeks had their fair share of bad leadership and lack of prudence, with terrible results. One man in 6th-century B.C. Athens, however, took his place in office and, resisting the temptations of power, tried his best to bring equality and prosperity to his city. His legacy is one of great wisdom mixed with triumphs and failures, the story of a man who struggled to make Athens free. Solon of Athens, although he did not create democracy, laid the necessary foundations for it.

As with most figures of such antiquity, the date of Solon’s birth is not exact, nor is that of his death, but he is generally thought to have been born in 638 B.C. The son of either Euphorion or Execestides, Solon was nonetheless of noble birth: an aristocrat, a eupatrid, meaning “of a good father.” Despite his upbringing, Solon was sympathetic toward the poor, with whom he felt an affinity that would influence his views as a politician. To make ends meet, he became a merchant, which let him travel and make money. Plutarch claimed he had not money in mind, but experience: “It is certain that he was a lover of knowledge (φιλόμαθος), for when he was old he would say, that he — ‘Each day grew older, and learnt something new.’”[1] As a merchant, Solon was able to travel across the seas, giving him access to all sorts of knowledge; already at a young age, he showed signs of being a devoted man of wisdom and learning. He gained his reputation as a brilliant strategist after he won the island of Salamis for Athens. Salamis, seized by the Megarians, was heavily fortified, and many attempts had been made to take it back, all in vain. Solon rallied the Athenians in the market and told them of a plan, which, when carried out, successfully recovered the island, earning him the respect of all the Athenians, who were indebted to him. So, in 594 B.C. the Athenians unanimously elected Solon archon. The condition of Athens at the time was horrible; the city was in crisis: the poor could not pay for their land, so they sold themselves to the aristocrats, who treated them unfairly, driving the peasants to revolt against their masters. Precipitously close to civil war, and desperate for a peaceful, bloodless resolution, the Athenians, poor and rich alike, turned to the one man they knew could resolve it in all his wisdom: Solon, the man who had won Salamis from the Megarians and defeated Crisa two years earlier.
Those who lived on the coast wanted the focus to be on the economy; those on the plains, on the land. The Hills wanted a democracy, the Plains an oligarchy, and the Coast a mixed government. Humble, modest, and temperate, Solon was suspicious of power, fearing its ability to overtake a man’s better senses. Offered absolute power, he declined. The people insisted, and he was granted dictatorial authority, which allowed him to do anything at all without question. A popular poem mocks Solon’s humility:

Solon surely was a dreamer, and a man of simple mind;
When the gods would give him fortune, he of his own will declined;
When the net was full of fishes, over-heavy thinking it,
He declined to haul it up, through want of heart and want of wit.
Had I but the chance of riches and of kingship, for one day,
I would give my skin for flaying, and my house to die away.

Solon was a man of virtue. He preferred virtue to vice, and wealth and greed he counted among the vices. In one of his own poems, he disdains the wealthy and champions those who live virtuous lives:

Some wicked men are rich, some good are poor,
We will not change our virtue for their store:
Virtue’s a thing that none can take away;
But money changes owners all the day.

Solon’s famous reforms are thought by some historians to have occurred 20 years after his election to the archonship, in the 570s B.C., but no one knows for sure. His first and most famous reform was known as the Seisachtheia (σεισαχθεια), the “shaking off of burdens.” Before his election, the Greek farmers had barely any money, and they could not manage to pay for their land. As a result, they became serfs and worked the nobles’ lands, paying ⅙ of their yield every harvest, which gave them the name “Hektemoroi” (εκτημοροι), or “sixth-partners.” Some were better off than others: those who were lucky became serfs and paid their debt off over time, while others had to sell themselves as slaves, rather like indentured servants, and pay off their debt that way. The Hektemor system, then, was an early form of the feudalism that would become prevalent in Medieval Europe: noble lords had peasants, known as serfs or vassals, who did all the work and paid them as a debt, just as those who live in apartments pay their landlords. This system created a great deal of unhappiness and inequality. The upper class, who got paid and kept their own slaves, were happy; but the lower class evidently was not, and was moved to revolt. Upon becoming archon, Solon cancelled all debts outright, so that the poor never had to pay their former masters another cent. As he put it, “The mortgage-stones that covered her, by me / Removed, —the land that was a slave is free.” Solon removed all traces of serfdom, going so far as to buy back slaves who had been sold across the sea, claiming, “—so far their lot to roam, / They forgot the language of their home.” His closeness to and pity for the poor inspired him to bring everyone home. They had been gone so long, he says, that they had even forgotten how to speak their native language. Furthermore, Solon banned all future loans secured on the body, making it illegal for anyone to pay off a debt through slavery.
One might think the Hektemoroi would have been happy, now that they were free men. Unfortunately, they were no more pleased than when they had been enslaved, for they desired a redistribution of land, which Solon never granted. The upper classes, meanwhile, were unhappy too, because they had lost their slaves. Solon, disappointed, reflected,

Formerly they boasted of me vainly; with averted eyes
Now they look askance upon me; friends no more, but enemies.

Even though they had been quick to ask for his judgment, the Athenians ultimately turned their backs on him, their hero, the miracle-worker who was supposed to fix everything. They had held expectations too high to ask of Solon without becoming unfair, and he was left to live with his decision. The next thing Solon sought to reform was the government. Slow and steady, he transformed Athens from an oligarchy into a timocracy, replacing blood with wealth, family with property. This was known as the “timocratic principle”: the movement away from privilege toward success. He divided the Athenians into four classes. The Pentakosiomedimnoi (πεντακοσιομέδιμνοι), so named because they produced 500 measures of any product, were of the highest rank and consequently eligible for the highest offices, such as archon, treasurer (ταμιας), and the magistracy. The Hippeis, named for the horses (ἵππος) they could afford, were of the second greatest wealth, with between 300 and 500 measures to their name, making them eligible for the cavalry and the magistracy. The Zeugitai (ζευγίται), who produced 200 to 300 measures, could fill the lower offices and were reserved for the hoplite phalanx, which was then the infantry. And the Thetes (θητες), the lowest class, who produced under 200 measures, were incapable of taking office, their only options being to become workers or join the assembly. Next, he made economic reforms that greatly benefited Athens. First, he banned all exports except olive oil. Grain was hard to grow on the mountainous, rough terrain of Greece, a region better fitted to olive trees; hence grain was scarce, and the Greeks needed it more than other cities did. Since olive trees grew readily and abundantly, they became the focus of the economy, and a great success it was! Thanks to Solon, the economy grew far more quickly and efficiently than before.
Second, he invited artisans and craftsmen from other poleis to settle in Athens with their families, so as to grow both the population and the tradesmanship of the city-state. Because Solon believed strongly in self-reliance and in developing one’s skills, he wanted people to learn the importance of techne (τέχνη), an important Greek term referring to “knowledge of a craft” and “skill.” (Whence we get “technique” and “technology.”) He promoted apprenticeship, confident he could make Athens a great center for arts and crafts. It was made mandatory for a father to teach his son his craft; if he did not, and the child had no craft, the son was in no way obligated to look after his father in his later years. In granting citizenship to foreigners, Solon was seen as very liberal, for citizenship had theretofore been strict and reserved; such a law, however, led, historians say, to the rise of the amazing pottery we today see and admire from Athens. Metics (μέτοικος), or resident aliens, were able to obtain Athenian citizenship. Moving on to political reforms, Solon created a law which “disenfranchize[d] all who st[ood] neuter in a sedition.”[2] In other words, during a revolt, anyone who did not take a side was stripped of his citizen rights. Sounds counter-intuitive, doesn’t it? Solon’s intention was to enforce loyalty and patriotism: politics was everyone’s business, so Solon expected his people to fight for a side, a side they thought worthy of fighting for. Two of his greatest reforms came when he devised a probouleutic council of 400 and the Heliaia (Ηλιαία). The former’s job was to hold preliminary discussions and debates before passing matters on to the main assembly, the Ekklesia (Eκκλησια); it was composed of 100 representatives from each of the four Attic tribes. The latter was a court system, of which the Thetes could be a part, but from which women, slaves, and metics were excluded.
The role of the Heliaia was to handle public litigation. Thitherto, only cases regarding familial or tribal matters could be taken up: if one person harmed another, only the family, not the individual, could bring the case. Now the power of the law extended beyond the family to the community, so that if an individual was robbed, he could litigate himself. Further, if one was unhappy with one’s verdict, one could appeal to the Heliaia, much as one can today to the Supreme Court, whose Greek equivalent was the Council of the Areopagus. In this way, the court system of Athens was a nomothetic dikastery; i.e., a law-giving (nomothetic) institution built on trial by jury (dikastery). Aristotle commented that Solon “formed the courts of law out of all the citizens, thus creating the democracy.”[3] Solon went on to formulate new laws, having repealed all of Draco’s except that regarding homicide. He thus reduced the severity of Athenian law and granted amnesty to all criminals save murderers. Regarding family matters, Solon was skeptical of the rich and powerful families who had long held supremacy in the city. He made it so that every childless man, like himself, could leave his property to whomever he wanted; formerly, the property automatically went to his relatives. He placed stringent regulations on women and on the size of funerals, for, favoring the poor, he did not like seeing the rich flaunt their money in public. “In all other marriages,” wrote Plutarch, “he forbade dowries,” because marriages were not supposed to be “for gain or an estate, but for pure love, kind affection, and the birth of children.”[4]

Solon finally decided, after all he had done, to leave Athens for ten years. While he claimed to have left because he wanted “to travel,” most think he was trying to escape the inevitable criticism of his reforms, which were unpopular with everybody. In the end, in spite of everything he had done for Athens and the Athenians, he had appeased no one; yet no one walked out the victor, and none the loser either. “In large things,” he would say, “it is hard to please everyone.”[5]

Such power I gave the people as might do,
Abridged not what they had, now lavished new,
Those that were great in wealth and high in place
My counsel likewise kept from all disgrace.
Before them both I held my shield of might,
And let not either touch the other’s right.

Here, Solon describes how he attempted to give the Athenians what was equitable. He tried to the best of his abilities to preserve equality between poor and rich, giving them what they needed, not what they wanted. Although he favored the poor and wanted the best for them, he also sought to remain impartial, as justice is, and to give the rich what they deserved as well, careful not to upset the social order, the only thing standing between the two, the keeper of order, the defender of peace. Looking at the political situation after he left, one can compare it to the French Revolution, insofar as the poor were radical and the rich reactionary: the former wanted more than they got, and they wanted change immediately, in hopes of erasing the vestiges of aristocratic life; the latter wanted to go back to the way things were, when they were in charge and could show off what wealth they had. Either way, no party got what it wanted, and so what seemed a failure for Solon was really a success. Before departing, Solon decreed that his laws were to stay in place for 100 years, and they were recorded on axones, wooden posts, in the agora for everyone to see. Of course, many of the poor were illiterate and could therefore not read many of the laws; but those who could, and who broke them anyway, had to dedicate a golden statue in the square in their name. Meanwhile, Solon was off seeing the world. He visited Egypt, where a priest at Sais told him the tale of Atlantis, the very tale that would later be told to Plato. The historian Herodotus recounted that Solon also visited Crœsus, but scholars object that this is anachronistic: the two lived at different times.
Returning home, Solon found Athens under the sway of the young Peisistratus; he warned the Athenians not to trust him, to no avail, for during his travels he had lost his credibility, power, and esteem. He died in 559 B.C. at about the age of 80. Solon was named one of the Seven Sages, earning the title of “sophist,” a title that, ironically, would take on a much more negative meaning in the next century. The famous adage “Nothing in excess” (μηδέν ἄγαν) is attributed to him. Appropriately, he said, “But the hardest thing of all is to recognize the invisible Mean of judgment, which alone contains the limits of all things.”[6] Perhaps the greatest part of Solon’s legacy is his reputation as a politician-poet, a leader who led with wisdom, grace, beauty, and eloquence. His poems, some of which have been quoted above, reveal his morals and political motives:

I gave to the mass of the people such rank as befitted their need,
I took not away their honor, and I granted naught to their greed;
While those who were rich in power, who in wealth were glorious and great,
I bethought me that naught should befall them unworthy their splendor and state;
So I stood with my shield outstretched, and both were safe in its sight,
And I would not that either should triumph, when triumph was not with right.

Dark Earth, thou best canst witness, from whose breast
I swept the pillars broadcast planted there,
And made thee free, who hadst been slave of yore.
And many a man whom fraud or law had sold
For from his god-built land, an outcast slave,
I brought back again to Athens; yea, and some,
Exiles from home through debt’s oppressive load,
Speaking no more the dear Athenian tongue,
But wandering far and wide, I brought again;
And those that here in vilest slavery
Crouched ‘neath a master’s frown, I set them free.
Thus might and right were yoked in harmony,
Since by the force of law I won my ends
And kept my promise. Equal laws I gave
To evil and to good, with even hand
Drawing straight justice for the lots of each.
But had another held the goad as I,
One in whose heart was guile and greediness,
He had not kept the people back from strife.
For had I granted, now what pleased the one,
Then what their foes devised in counterpoise,
Of many a man this state had been bereft.
Therefore I showed my might on every side,
Turning at bay like wolf among the hounds.

So popular were his poems, imbued with such moral value, that children customarily memorized them. Solon’s political philosophy centered on “Eunomia” (Ευνομια), which translates roughly to “good government,” referring to exactly the stability and equality of which Solon himself dreamed. He personified Eunomia as the goddess of the “peace and harmony of the whole social cosmos”: the well-being of the people and, in general, communal happiness.[7] Another large part of his philosophy was the divine principle of Justice (Δικη); Justice played the role not of divine punishment but of political and social punishment, a penalty imposed on the people when there was strife and inequality. Jæger said, “It is the first objective statement of the universal truth that the violation of justice means the disruption of the life of the community.”[8] Solon believed injustice was human-caused; he therefore believed in the responsibility of the individual to bear the consequences of his actions, and specifically of his vices, which affect not only himself but his community as a whole. Most importantly, though, Solon is revered as the founder of democracy. It would be imprecise to call him the founder per se, for it is Cleisthenes who is regarded as such, but it was Solon who made democracy possible. In giving power to the masses and opening up the rights of citizenship, he “put an end to the exclusiveness of the oligarchy, emancipated the people, established the ancient Athenian democracy, and harmonized the different elements of the state.”[9] Jæger, I feel, does Solon more justice than Aristotle in describing his impact: “Because he brought together the state and the spirit, the community and the individual, he was the first Athenian.”[10]

[1] Plutarch, Twelve Lives, p. 82
[2] Id., p. 96
[3] Aristotle, Politics, II.12.1274a1-5
[4] Plutarch, ibid. 
[5] Pomeroy, Ancient Greece, p. 187
[6] Jæger, Paideia, Vol. 1, p. 148
[7] Id., p. 141
[8] Ibid.
[9] Aristotle, op. cit., 1273b35-40
[10] Jæger, op. cit., p. 149


For further reading: Ancient Greece: A Political, Social, and Cultural History 2nd ed. by Sarah B. Pomeroy (2008)
The Oxford Companion to Classical Civilization by Simon Hornblower (1998)
Ancient Greece and the Near East by Richard Mansfield Haywood (1968)
The Illustrated Encyclopedia of Ancient Greece by Nigel Rodgers (2017)
Paideia: The Ideals of Greek Culture Vol. 1 by Werner Jæger (1945)
A History of the Ancient World by Chester C. Starr (1991)
The Story of Civilization Vol. 2 by Will Durant (1966)
A History of Greece Vol. 3 by George Grote (1899)
Twelve Lives by Plutarch (1950)

Technology and Social Media: A Polemic


Much gratitude is owed to our devices, those glorious, wonderful tools at our disposal, which grant us capabilities of which man centuries ago could only have dreamed, the culmination of years of technology combined in a single gadget, be it the size of your lap or your hand. What a blessing they are, connecting us to those around the world, giving us access to a wealth of knowledge, and lending longevity to our lives by letting us create narratives and tell our stories; and yet how much of a curse they are, those mechanical parasites that latch onto their hosts and drain their vitality, much as a tick does. That phones and computers are indispensable, and further, that social media acts as a necessary sphere combining the private and the public to create the cybersphere, is incontrovertible; yet they are abused to such an extent that these advantages have been corrupted and have lost their supremacy in the human condition.


Technology is ubiquitous, inescapable, and hardwired into the 21st century, so that it is a priori, given, a simple fact of being. Its facticity is such that it is foreign to older generations, who generally disdain it, as opposed to today’s youths, who have been, as Heidegger said, thrown into this world, this technologically dominated world, wherein pocket-sized devices, growing bigger by the year, are everywhere: the defining feature of the age, the zeitgeist, that indomitable force which pervades society not just concretely but abstractly, not just descriptively but normatively. In being-in-the-world, we of the younger generations take technology as it is and accept it as such. To us, technology is present: present insofar as it is both at hand and here, by which I mean it is pervasive, not just in location but in presence. A fellow student once observed that we youths are like fish born in the water, whereas older generations are humans born on land. Born into our circumstances, we fish are accustomed to the water, while the humans, accustomed to the land, look upon us, upon the ocean, and think us strange, pondering, “How can they live like that?”


As per the law of inertia, things tend to persist in their given states. As such, people, like objects, resist change. The status quo is a hard thing to change, especially when it was conceived before oneself was. To tell a fellow fish, “We ought to live on the land as our fathers did before us”: what an outlandish remark! Verily, one is likely to be disinclined to change one’s perspective, and will rather cling to it with such tenacity that it develops into complacency, a terrible stubbornness that entrenches one further within one’s own deep-rooted ways. Such an individual is a tough one to change indeed. What is the case, we say, is what ought to be; this is the general principle whereupon we take our stand, and anyone who says otherwise is either wrong or ignorant. Accordingly, the youth of today, the future of humanity, accepts technology as its own unquestioningly. As per the law of inertia, things tend to persist in their given states, that is, until an unbalanced force acts upon them.


What results from deeply held convictions is dogmatism. A theme common to all users of devices, I find, is guilt; a discussion among classmates has led me to believe that this emotion, deeply personal, bitingly venomous, self-inflicted, and acerbic, is a product of our technological addictions. Addiction has the awesome power of distorting one’s acumen, a power comparable to that of drugs, inasmuch as it compromises the mind’s judiciary faculty, preventing it from distilling events and correctly processing experiences, thereby corrupting our better senses. The teen who is stopped at dinner for being on their phone while eating with their family, or the student who claims to be doing homework when, in reality, they are playing a game or watching a video: what have they in common? The vanity of a guilty conscience, which would rather be defensive than apologetic. The man of guilt is by nature disposed to remorse, and thus naturally apologetic, seeking to right his wrong; yet today’s children are by nature indisposed thereto, and are conversely defensive, as though they were the ones who had been wronged. Yes, we youths take great umbrage at being called out, and instead of feeling remorse, instead of desiring to absolve our conscience of its intrinsic guilt, we feel we have nothing from which to absolve ourselves, imputing the disrespect to those who called us out.


Alas, what backward logic!—think how contrary it would be if the thief were to call out the poor inhabitant who caught them. Technology has led to moral bankruptcy. A transvaluation of morals in this case, to use Nietzsche’s terminology, is to our detriment, I would think. Guilt is a reactionary emotion: It is a reaction formed ex post facto, with the intent of further action. To be guilty is to want to justify oneself, for guilt is by definition self-defeating; guilt seeks to rectify itself; guilt never wants to remain guilty, no; it wants to become something else. But technology has reshaped guilt, turning it into an intransitive feeling, often giving way, if at all, to condemnation, seeking not to vindicate itself but to remonstrate, recriminate, retaliate, and repugn. Through technology, guilt has gone from being passive and reactive to active and proactive, a negative emotion with the goal of worsening things, not placating them. Digital culture has perpetuated this; now, being guilty and remaining so is seen as normal and valuable. Guilt is not something to be addressed anymore. Guilt is to be kept as long as possible. But guilt, as I said, is naturally self-rectifying, so without an outlet, it must be displaced—in this case, into resentment, resentment directed toward the person who made us feel this way.


—You disrupt me from my device? Shame on you!—It is no good, say you? I ought get off it? Nay, you ought get off me!—You are foolish to believe I am doing something less important than what we are doing now, together, to think it is I who is in the wrong, and consequently, to expect me to thusly put it away—You are grossly out of line—You know naught of what I am doing, you sanctimonious tyrant!—


When asked whether they managed their time on devices, some students replied, quite unsurprisingly, that they did not (sadly, all but one claimed that they did); notwithstanding, this serves as a frightful example of the extent to which our devices play a role in our lives. They were then asked some of the reasons they had social media, to which they replied: to get insights into others’ lives, to de-stress and clear their minds after studying, and to talk with friends. A follow-up question asked whether using social media made them happy or sad, the answer to which was mixed: Some said it made them happier, some said it made them sadder. An absurd statement was made by one of the interviewees who, when asked how they managed their time, said they checked their social media at random intervals throughout their studying in order to “clear their mind off of things” because their brains, understandably, were tired; another stated they measured their usage by the number of video game matches played, which, once it was met, signaled them to move on to something else—not something physical, but some other virtual activity, such as checking their social media account. I need not point out the hypocrisy herein.


I take issue with both statements combined, for they complement each other and reveal a sad, distasteful pattern in today’s culture which I shall presently discuss. Common to all students interviewed was the repeated, woebegone usage of the dreaded word “should”:
—”I should try to be more present”—
—”I should put my phone down and be with my friends”—
—”I should probably manage my time more”—


Lo! for it is one thing to be obliged, another to want. Hidden beneath each of these admissions is an acknowledgment of one’s wrongdoing—in a word, guilt. Guilt is inherent in “shoulds” because they represent a justified course of action left untaken. One should have done this, rather than that. Consequently, the repetition of “should” is vain, a mere placeholder for the repressed guilt, a means of getting rid of some of the weight on one’s conscience; therefore, it, too, the conditional, is as frustrated as the guilt harbored therein.


Another thing with which I take issue is the two students’ means of time management. The first said they liked to play games on their computer and would take breaks intermittently by going elsewhere—to social media or to YouTube to watch videos. No less illogical, the other said they would take breaks by checking their social media, as they had just been concentrating hard. How silly it would be for the drug addict to heal himself with the very thing that plagues him! No rehabilitator treats their patients with alcohol; common sense dictates that stopping a problem with the very thing that caused it in the first place is nonsense! Such is the case with the culture of today, whose drugs are its devices. In the first place, how exactly does stopping a game and checking some other website constitute a “break”? There is no breach of connection between user and device, so it is not in any sense a “break,” but a mere switch from one thing to the next, which is hardly commendable, but foolish forasmuch as it encourages further usage, not less; as one defines the one in relation to the next, it follows that it is a cycle, not a regimen, for there is no real resting period, only transition. Real time management would consist of playing a few games, then deciding to get off the computer, get a snack, study, or read; going from one device to another is not management at all. Similarly, regarding the other scenario, studying on one’s computer and taking a break by checking one’s media is no more effective. One is studying for physics, and after reading several long paragraphs, sets upon learning the vocabulary, committing to memory the jargon, then solving a few problems, but one is thus only halfway through: What now?
Tired, drained, yet also proud of what has been accomplished thus far, one decides to check one’s social media—only for 30 minutes, of course: just enough time to forget everything, relax, and get ready to study again. This is not the essence of management; nay, it is the antithesis thereof! No sound state of mind could possibly think this reasonable. If one is tired of studying, which is justifiable and respectable, then one ought to (not should!) take a real break and really manage one’s time! Social media is indeed a distraction, albeit of a terrible kind, and not the one we ought to be seeking. Checking a friend’s or a stranger’s profile and looking through their photos, yearning for an escape, hoping for better circumstances—this is not calming, nor is it productive. A good break, good time management, is closing one’s computer and doing something productive. Social media serves only to irritate the brain further after exhaustion; instead, healthy and productive tasks whose benefits have been proven ought to be taken up, such as reading, taking a walk, or exercising. A simple search will show that any of the aforementioned methods is effective after intense studying, yielding better memory, better focus, and better overall well-being. Nor should we forget the subconscious aspect, by which recently learned information is better processed when put in the back of the mind during some other activity, such as the latter two, which are physical and so bring both physiological and psychological advantages. Conclusively, time management consists not in transitioning between devices, but in transitioning between states of mind and body.


The question arises: Why is spending too much time on devices a problem? Wherefore, asks the skeptic, is shutting oneself off from the world and retreating into cyberspace, where there are infinite possibilities, a “bad” thing? Do we really need face-to-face relationships or wisdom or ambitions when we can scroll through our media without interference, getting a window into what is otherwise unattainable? Unfortunately, as with many philosophical problems, including the simulation theory, solipsism, and the mind-body problem, no matter what is argued, the skeptic can always refute it. While I or anyone could give an impassioned speech in defense of life and about what it means to be human, it may never be enough to convince the skeptic that there is any worth in real-world experiences. It is true that one could easily eschew worldly intercourse and live a successful life on one’s device, establishing an online business, finding that special person online and being in love long distance—what need is there for the real world, for the affairs of everyday men? The philosopher Robert Nozick asks us to consider the Experience Machine: Given the choice, we can either hook ourselves up to a machine that simulates a perfect, ideal, desirable world wherein all our dreams come true and everything we want, we get, like becoming whatever we have always wanted to become, marrying whomever we have always wanted to marry, yet which is artificial and, again, simulated; or we can remain in the real world, where there are inevitable strife and struggle, but also triumphs, and where we experience pleasure and pain, happiness and sadness—but all real, all authentic. There is, of course, nothing stopping one from choosing the machine, and the skeptic will still not be swayed; but I think the sanctity of humanity, that which constitutes our humanity, ought never be violated.


What, then, is the greatest inhibition to a healthy, productive digital citizenship? What can we do to improve things? The way I see it, the answer is in the how, not the what. Schools can continue to hold events where they warn students of the dangers of technology, advise them on time management, and educate them about proper usage of technology and online presence; but while these can continue ad infinitum, the one thing that will never change is our—the students’—want to change. Teachers, psychologists, and parents can keep teaching, publishing, and lecturing more and more convincingly and authoritatively, but unless the want to change is instilled in us, I am afeard no progress will be made. Today’s generation will continue to dig itself deeper into the technological world. They say the first step in overcoming a bad habit or addiction is to admit you have a problem. As I said earlier, technology just is for us youths, and it always will be henceforth; there will not be a time when there is no technology, meaning it is seen as a given, something essential, something humans have always needed and will continue to need. Technology is a tool, not a plaything. Technology is a utility, not a distraction. Social media is corrupting, not clarifying, nor essential. We have been raised in the 21st century such that we accept technology as a fact, and facts cannot be disproven, so they will remain, planted, their roots reaching deeper into the soil, into the human psyche. Collectively, we have agreed that technology is good, but this is “technology” in its broadest sense, thereby clouding our view of it. We believe our phones and computers are indispensable, that were we to live without them, we would rather die. To be without WiFi—it is comparable to anxiety, an object-less yearning, an emptiness in our souls. How dependent we have become, we “independent” beings! This is the pinnacle of humanity, and it is still rising!
Ortega y Gasset, in the style of Nietzsche, proclaimed, “I see the flood-tide of nihilism rising!”¹ We must recognize technology as a problem before we can reform it and ourselves. A lyric from a song goes, “Your possessions will possess you.” Our devices, having become a part of our everyday lives to the extent that we bring them wheresoever we go, have become more controlling of our lives than we are of ourselves, which is a saddening prospect. We must check every update, every message, every notification we receive, lest we miss out on anything! We must miss out on those who care about us, who are right in front of us, in order not to miss out on that brand new, for-a-limited-time sale! But as long as we keep buying into these notifications, for so long as we refuse to acknowledge our addictions and the problem before us, we will continue to miss out on life and waste moments of productivity, moments which, even if only a few minutes each, will, when added up at the end of our lives, turn out to be days, days we missed out on. As my teacher likes to say, “Discipline equals freedom.” To wrest ourselves from our computers or phones, we must first discipline ourselves to do so; and to discipline ourselves, we must first acknowledge our problem, see it as one, and want to change. As per the law of the vis viva (and not the vis inertiæ), things tend to persist in their given states until their internal force wills otherwise. We bodies animated with the vis viva have the determination and volition to will ourselves, to counter the inertia of being-in-the-world, of being-online, whence we can liberate ourselves and awaken, so to speak. We addicts have no autonomy with our devices—we are slaves to them. Until we break out of our complacency, until we recognize our masters and affirm our self-consciousness thence, and until we take a stand and break from our heteronomy, we will remain prisoners, automata, machines under machines. We must gain our freedom ourselves.
But we cannot free ourselves if we do not want to be freed, if we want to remain slaves, if we want to remain in shackles, if we want to plug into the machine. A slave who disdains freedom even when freed remains a slave. Consequently, we cannot be told to stop spending so much time on our devices, to pay attention to whom or what is in front of us; we must want to ourselves. Yet no matter how many times or by whom they are told, today’s youth will never realize it unless they do so themselves. They must make the decision for themselves, which, again, I must stress, must be of their own volition. Until then, it is merely a velleity, a desire to change, but a desire in-itself—nothing more, a wish with no intent to act. It is one thing to say we should spend less time, another that we ought to.


¹Ortega y Gasset, The Revolt of the Masses, p. 54

Descartes on Great Books


I was aware that the reading of all good books is indeed like a conversation with the noblest men of past centuries who were the authors of them, nay a carefully studied conversation, in which they reveal to us none but the best of thoughts.



René Descartes (1596-1650), French philosopher and mathematician of the Early Modern period.

Discourse on the Method of Rightly Conducting the Reason by René Descartes (1990)

Harper Lee’s Guide to Empathy

In the 21st century, surrounded by technologies that distance us, by worldviews that divide us, and by identities that define us, we do not see a lot of empathy among people. While we see friends and family every day, we never really see them, nor do we acknowledge that they, too, are real people, people who have opinions like us, feelings like us, and perspectives like us. Harper Lee is the author of To Kill a Mockingbird, a novel that itself has many perspectives, many of which are in conflict with each other. Set in the South of the 1930s, the book takes place during the Great Depression, when many lost their jobs, and in a time of racism, when laws were passed that restricted the rights of black people. The protagonist is a girl named Scout who lives in the fictional town of Maycomb with her brother Jem and father Atticus, who is an empathetic lawyer. Through interactions with her peers, Scout learns to take others’ perspectives and walk in their shoes. In To Kill a Mockingbird, Harper Lee teaches that, in order to take another’s perspective and practice empathy, one must understand someone else’s thoughts or background, try to relate to them, and then become aware of how the consequences of one’s actions affect them.

Before one can truly take another’s perspective, Lee argues, one must first seek to understand how someone thinks and where they come from. After hearing about Mr. Cunningham’s legal entailment, Scout asks if he will ever pay Atticus back. Atticus replies that he will, just not in money. She asks, “‘Why does he pay you like that [with food]?’ ‘Because that’s the only way he can pay me. He has no money… The Cunninghams are country folk, farmers, and the crash hit them the hardest… As the Cunninghams had no money to pay a lawyer, they simply paid us with what they had’” (Lee 27-8). Scout is confused as to why the Cunninghams pay “like that” because it is not the conventional way of paying debts. Money is always used in business transactions, yet Atticus allows them to pay through other means. Atticus acknowledges that the Cunninghams are having economic problems. He empathizes with Mr. Cunningham by drawing on his background knowledge, namely that, because he is a farmer who gets his money from agriculture, he does not have the means to pay. The Great Depression left many poor and without jobs, so Atticus is easier on Mr. Cunningham; he knows it would be unfair to make him pay when he hardly has any money. Accordingly, Atticus accepts that the Cunninghams are trying their best, and he compromises with them. He willingly accepts anything Mr. Cunningham will give him, since he knows it will come from the heart. For this reason, Atticus can empathize by thinking outside normal conventions to accommodate Mr. Cunningham’s situation. Just as Atticus understands the Cunninghams, so Calpurnia empathizes with them when she lectures Scout not to judge them. Jem invites Walter Cunningham from school over to have dinner with him and Scout. Reluctantly, Walter agrees, but once he starts eating, Scout takes issue with his habits; so Calpurnia scolds her.
Calpurnia yells, “‘There’s some folks who don’t eat like us… but you ain’t called on to contradict ‘em at the table when they don’t… [A]nd don’t you let me catch you remarkin’ on their ways like you was so high and mighty!’” (Lee 32-3). Because Scout is not used to the way Walter eats, she immediately judges his way as different from her own, thereby patronizing him. Hence, she is not empathizing, because she is not considering his point of view, but only evaluating her own. Calpurnia states that not everyone eats like Scout does, showing that she, unlike Scout, does not form generalizations; rather, she rationalizes, recognizing that Walter comes from a different home, a home with different manners. Since she empathizes with Walter in this way, Calpurnia tells Scout not to “contradict” him, meaning it is rude and unsympathetic not to consider Walter and his background. Furthermore, she warns Scout not to act as though she is “so high and mighty,” especially around others who are less fortunate and who differ from her, such as Walter. By criticizing Walter’s eating and thereby abashing him, Scout is being sanctimonious, declaring that her way is better than anyone else’s. Calpurnia gets mad at Scout for this, as it is egocentric; i.e., Scout is concerned with herself and cannot consider others’ perspectives. Consequently, Calpurnia shows empathy by understanding that people have different perspectives, while Scout does not. Both Atticus and Calpurnia are empathetic because, as shown, they actively try to understand other people and selflessly consider their perspectives.

Once a person’s way of thinking and past is understood, one is able to see oneself in that other and make connections with them. One night, Scout, Jem, and Dill sneak off to the Radley house and are scared away, Jem losing his pants in the process. Jem decides to retrieve his pants, regardless of the danger involved therewith. The next morning, he is moody and quiet, and Scout does not know why. Upon some reflection, she says, “As Atticus had once advised me to do, I tried to climb into Jem’s skin and walk around in it: if I had gone alone to the Radley Place at two in the morning, my funeral would have been held the next afternoon. So I left Jem alone and tried not to bother him” (Lee 77). Scout follows her father’s advice and “climb[s] into Jem’s skin,” symbolizing that she has taken his perspective and seen life therethrough. She asks herself the vital question of what it would be like to be Jem; in doing this, she visualizes herself as Jem, visualizes herself doing what he did, thereby understanding him. The first step in empathizing—understanding—allows her to relate to Jem and put herself in his position: She imagines what it would have been like to risk her own life, how she would have felt doing so. As a result, she examines her emotional reaction and projects it onto Jem, relating to him, feeling as he would feel. Had she not tried to understand Jem’s position, had she not related to him emotionally, she would never have known why Jem was being moody. Jem’s “funeral would have been held the next afternoon,” says Scout, realizing why Jem is upset. If she felt that way herself, she would not want anyone bothering her, either, seeing as it is a traumatic event. Scout connects to Jem on an emotional level, empathizing with him. Another instance in which Scout shows empathy by relating is when she connects with Mr. Cunningham.
Jem and Scout sneak out at night to find Atticus, who is at the county jail keeping watch over his client, Tom Robinson. As they near him, a mob closes in on Atticus and threatens to kill Robinson, so Scout tries to find a way of civilizing them and talks to Walter’s father. Thinking of conversation, she considers, “Atticus had said it was the polite thing to talk to people about what they are interested in, not what you were interested in. Mr. Cunningham displayed no interest in his son, so I tackled his entailment once more in a last-ditch effort to make him feel at home” (Lee 205). In this moment, Scout recalls that it is polite to relate to others and consider their views rather than her own. She hereby distances herself from her egocentrism, instead concerning herself with what someone other than herself wants. Empathizing requires that one cross the gorge of disparity, and Scout bridges this gap between self and other to find that she has things in common with Mr. Cunningham, things of which she would never have thought prior. Before this connection could occur, Scout had to know his background, of which she learned when talking to Atticus; additionally, she had his son over and learned about him then, giving her something in common with him to talk about. Since Scout knows Walter, she thinks him a topic to which the two can both relate, seeing as Walter is close to his father, creating a strong connection. However, she notes that he “displayed no interest in his son”; thus, she thinks back further, remembers another thing they have in common, then relates to it in an attempt to “make him feel at home.” The phrase “feel at home” denotes acceptance, belonging, and coziness—being warm and welcome—so Scout, in coming up with certain topics that will be of interest to Mr.
Cunningham, seeks to make him feel like he is a welcome person, to put herself in his shoes and consider what he would like to talk about, what would make him feel accepted as it would her. Through these moments in the text, Lee shows that empathy is relating to and identifying with another by removing one’s own position and taking theirs.

Empathy is accomplished when one takes another’s perspective in order to know how one’s actions will affect them and to consider how those actions would make them feel. Jem and Scout find out Atticus has been insulted and threatened by Bob Ewell in chapter 23. They are confused as to why their dad did nothing to retaliate, why he just took it. He tells Jem, “[S]ee if you can stand in Bob Ewell’s shoes a minute. I destroyed his last shred of credibility at the trial, if he had any to begin with… [I]f spitting in my face and threatening me saved Mayella Ewell one extra beating, that’s something I’ll gladly take. He had to take it out on somebody and I’d rather it be me than that houseful of children out there’” (Lee 292-3). Atticus directs Jem to “stand in Bob Ewell’s shoes” so that he can understand his perspective, and therefore how Atticus’ actions could have affected him. Knowing Mr. Ewell has many children, finding a common link therein, Atticus can relate to him, imagining how horrible it would be if his own children were beaten. Bob Ewell, upset over the trial, wants to take out his anger, so he displaces it onto Atticus, which Atticus says is better than his displacing it onto his children. Taking the pacifist route, Atticus avoids exacerbating the situation, aware that fighting back would cause things to worsen, and he steps outside himself to become aware of how his actions will have not just direct effects, but indirect effects as well: Angering Bob Ewell would make him want to physically harm Atticus, and would further encourage him to be more hostile to his children in addition. As such, Atticus takes into account the long-term consequences and empathizes because he is aware of how his actions could avert a disaster.
He thinks ahead—to Bob Ewell’s children, to his own children, concluding, “‘I’d rather it be me than that houseful of children.’” A second example of considering the consequences of one’s actions on another takes place when Scout, a couple of years later, reflects on how she treated Arthur “Boo” Radley. At the beginning of chapter 26, Scout is thinking about her life and passes the Radley house, of which she and Jem were always scared, and about which they had always heard rumors. She remembers all the times in the past she and her brother and their friend played outside, acting out what happened at the house. Pensively, she ponders, “I sometimes felt a twinge of remorse when passing by the old place [Radley house], at ever having taken part in what must have been sheer torment to Arthur Radley—what reasonable recluse wants children peeping through his shutters, delivering greetings on the end of a fishing-pole, wandering in his collards at night?” (Lee 324). Lee uses the word “remorse” here to conjure up feelings of guilt, regret, and shame, all associated with the way Scout feels about her actions. To say she feels a “twinge of remorse” is to say she feels compunction; that is, morally, she feels she has wronged the Radleys and that, looking back, what she did was wrong. She is contrite because she can stand back and objectively evaluate her deeds, deeds she deems unempathetic, considering they were inconsiderate of Arthur. Having become aware of the weight of her choices, Scout experiences regret, an important emotional reaction because it signifies empathy, insofar as it is representative of her taking into account how she affected another person; and, in this case, how she negatively impacted Arthur, which itself requires understanding of and relation to him. This regret, this guilt, is caused by the realization that her actions in the past were mean and therefore morally wrong.
Again, Scout puts herself in Arthur’s shoes, imagining what it would reasonably be like to be a “recluse”: Certainly, she affirms, she does not want “children peeping,… delivering greetings,… [or] wandering in [her] collards.” The thought process is supposed to relate to Arthur’s, so Scout is actively relating to and understanding him, ultimately to realize how her conduct impacts him. Her scruples finally notify her that, from the perspective of the solitary Arthur, her behavior had a negative effect. Scout’s awareness of the consequences of her actions makes her empathetic, for she has introjected Arthur’s perspective. In conclusion, Atticus and Scout exhibit empathy because they both consider how their comportment has an effect on others.

According to Lee, empathy is put into practice when one takes time to learn about another person, makes a personal connection with them, and considers how one’s actions will affect them. We are social animals by nature, which means we desire close relationships; unfortunately, most of us seldom recognize the importance of understanding those with whom we have a relationship, leading to inconsiderateness, ignorance, and stereotypes. For such intimate animals, we all too often neglect the feelings and thoughts of others, even though they are of no less priority than ours. Therefore, empathy is a vital, indispensable tool in social interaction that helps us connect with others. As communication is revolutionized, worldviews shaken, and identities changed, it is integral that we learn to better understand others and never forget to empathize, lest we lose our humanity.


To Kill a Mockingbird by Harper Lee (1982)

Plutarch on Contemplation

“We are right to blame those who misuse the natural love of inquiry and observation by extending it on unworthy objects. Every man is able to turn his mind easily upon what he thinks good. It is a duty to contemplate the best.”

-Plutarch (46-120 AD), Greek philosopher and biographer.



Hamilton, The Echo of Greece, p. 198

The Echo of Greece by Edith Hamilton (1957)

Summary of Leibniz’s Philosophy

Born in 1646, Gottfried Wilhelm von Leibniz was a German polymath. He studied many subjects and wrote many essays on them, including philosophy, mathematics, science, logic, theology, and language. A contemporary of Isaac Newton, he and the natural philosopher feuded over who invented calculus. While Leibniz published first, it was Newton who invented it first, although today it is the former’s notation that is used more. Leibniz combined philosophy with science in order to arrive at a systematic philosophy that, by today’s standards, is very modern. Some of his findings in the 17th century anticipated many of the findings of modern physics. In this post, which serves as a more concise counterpart to my other, more in-depth essay on Leibniz, I will summarize Leibniz’s main ideas regarding logic, metaphysics, and theology.

There are two types of truths according to Leibniz: truths of reason and truths of fact. Truths of reason cannot be proven false, for they are necessary. It is impossible for a truth of reason to be any way other than it is. For example, 2+2 always equals 4. It is a necessary truth because it cannot be false. Leibniz uses the law of noncontradiction to justify these kinds of truths: the denial of such a truth involves a self-contradiction. Saying that a circle has edges involves a self-contradiction because, by definition, a circle cannot have edges—it is impossible! Accordingly, “No circles have edges” is a truth of reason, as to say otherwise would be contradictory. Truths of fact, contrariwise, are contingent, meaning they can be either true or false. Whereas truths of reason are given and innate, truths of fact are gained through experience. A claim such as “Pumpkins are orange” is a truth of fact because it is contingent; a pumpkin does not necessarily have to be orange, but can be yellow or green, among other colors. If the pumpkin you find happens to be orange, the claim is true. For these kinds of truths, Leibniz invokes the principle of sufficient reason, whereby everything that exists has a reason for existing as it does.

This world, Leibniz contends, is one of many possible worlds. When multiple truths of fact are compatible and can exist with each other, they are called compossibilities. Having two feet is compossible with having two legs, but having two feet is not compossible with having one foot, for one negates the other: only one can be true. The sum total of compossibilities constitutes a possible world.

The universe is composed not of atoms but of monads, says Leibniz. Because atoms are physical, they can be divided in half, then halved again, and so on; if we keep dividing, we find that they are always made of something simpler. Leibniz claims instead that the building blocks of reality are immaterial consciousnesses. They occupy no space and are simple, which is to say that they are not made of parts. These monads are all distinct from each other and cannot interact with each other. When Leibniz says monads are immaterial, he suggests they are pure energy, because motion is intrinsic to them. Monads are substances in that they bear properties but are not themselves properties of anything else. A car can have the property of being red or blue, but it remains a car all the same.

In English, the subject is the doer and the predicate is what the doer does. Leibniz argues that all predicates are contained in their subjects due to a pre-established harmony. Saying “Socrates was born in 469 BC,” one makes the claim that the predicate “was born in 469 BC” belongs to the subject, “Socrates,” alone. Being born in 469 BC is unique to this particular Socrates and is part of what makes him Socrates. Similarly, “Socrates died in 399 BC” is contained in “Socrates” because it is a part of him. When one studies Socrates, one learns that he died in 399 BC, and he could not have died at any other time, because that is the way it happened. Remember that monads cannot interact, so when Leibniz speaks of a pre-established harmony, he means that every monad is determined before it is created. Before Socrates was created, it was pre-established that he would die in 399 BC, and it happened in harmony with the Athenians at the court who sentenced him to death. Socrates was sentenced and the Athenians sentenced Socrates even before they were created! Because no monad can actually interact with another, they do not affect each other directly, but unfold at the same time.

Monads reflect the universe within themselves. Each has a unique perspective on the universe, just as people have different perspectives, and each perspective is necessary for creating a single, unified picture of reality. By piecing together every microcosm, Leibniz says, we can see the macrocosm.

Monads can perceive other monads unfolding according to the pre-established harmony, use appetition to move from one perception to the next, and engage in apperception to gain self-consciousness, although this last is reserved for humans. Some monads are clearer in their perceptions than others. Bare monads are confused and inanimate, like rocks; integral monads, which include humans (“corporeal substances”), are made of many monads topped off with a soul and have the power of memory; and essential monads, such as God, are truths of reason and have the most clarity.

Space and time are relative. Space exists only when bodies are present, and time is measured by the sequence of monads as they harmonize. To measure time, for example, one must measure it relative to something; one cannot objectively measure time by itself.

God, being all-good and all-powerful, has the ability to create any world He chooses. An ideal world has the fewest causes and the most effects. Accordingly, because He is a perfect, necessary being, He must have chosen the “Best of all possible worlds”; choosing otherwise would not bear as many compossibilities. How, then, is evil explained? The world is not perfect, and evil is the absence of good. But God has sufficient reason: Everything exists for a reason, but humans have a hard time understanding these reasons and so are convinced of evil, when in reality this is a great world in Leibniz’s eyes.

A very simple visual showing some of Leibniz’s main ideas and their connections:
Screen Shot 2017-11-02 at 2.48.57 PM.png

Who was Gottfried Wilhelm von Leibniz?

In the tradition of Modern philosophy, the rationalist movement was spearheaded by Descartes and then Spinoza, both of whom devised profound and logical systems built solely on reason. Gottfried Wilhelm von Leibniz, another of the great rationalists, was a polymath by nature: a scientist and physicist, mathematician, logician, theologian, diplomat, linguist, geologist, politician, and, among other things, philosopher. He lived in the second half of the 17th century and was a contemporary of the natural philosopher Isaac Newton, with whom he would feud on several key points. Besides being a brilliant philosopher, he was an amazing and talented mathematician and scientist who developed his own method of calculus, leading to one of the greatest scientific controversies in history. While Newton thought of his calculus first, for which he deserves the credit of priority, Leibniz developed his own independently and published it several years before Newton published his. Unfortunately, Newton was much more revered and had the higher reputation, so Leibniz was soon overshadowed and faded into history, neither his physics nor his philosophy being put in the spotlight, such that his contemporaries’ names are remembered more than his. But perhaps Leibniz is most known for being the victim of Voltaire’s lampoon: He is represented as Dr. Pangloss, the unconditional optimist who claims this is the “Best of all possible worlds,” in Voltaire’s novel Candide. The following essay shall provide a succinct and, hopefully, simple and comprehensible look at Leibniz’s philosophy.

Bertrand Russell, in The History of Western Philosophy, remarked that Leibniz was one of the only philosophers to construct his whole system—even his metaphysics—on the foundations of logic; it is thus that I shall begin. Leibniz begins by dividing all truths into two types: those of reason, and those of fact. A truth of reason is necessary, which is to say that it has to be the way it is, that it cannot be otherwise. He uses the law of noncontradiction in order to justify them. It states that the opposite of a truth of reason results in a self-contradiction. For example, to say that 2+2=5 is a contradiction, for it goes against the truth, namely that 2+2=4. As such, 2+2=4 is a truth of reason: To say its opposite is a self-contradiction. Truths of reason are impossible to refute. They are incontrovertible. Later, the German philosopher Kant, having read Leibniz, would borrow this idea and call it an analytic a priori judgement. Basically, a truth of reason is true by definition; it is given, it is innate. In this manner, Leibniz stands in contrast to Locke, who claimed innate knowledge is impossible; Leibniz, then, is an innatist, in that he believes that certain truths, truths of reason, are already in our minds. Truths of fact, on the other hand, are contingent. This means they can be either true or false; it is not necessary for them to be one way rather than another. Whereas truths of reason, when denied, become self-contradictions and are therefore impossible to confute, the opposite of a truth of fact is possible, for truths of fact are, in essence, possibilities. An example would be saying that an apple is red. Saying an apple is not red does not result in a contradiction, because it does not necessarily have to be red, but can be green as well. Being red is a possibility, but it is a possibility that an apple may be green, too. 
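The two kinds of truth can be glossed compactly in modern modal notation, a formalism that postdates Leibniz and is offered here only as an illustrative sketch, not as his own symbolism:

```latex
% A truth of reason is necessary; denying it yields a contradiction:
\text{Truth of reason: } \Box p \qquad (\neg p \text{ is self-contradictory})
% A truth of fact is contingent: actually true, but possibly false:
\text{Truth of fact: } p \,\wedge\, \Diamond\neg p
```

On this gloss, “2+2=4” is a truth of reason (□p), while “this apple is red” is a truth of fact (p ∧ ◇¬p): true, yet it could have been otherwise.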
Leibniz justifies truths of fact with the principle of sufficient reason, which states that everything exists for a reason. There is a reason why one particular apple is red and another green, and this is God’s doing, according to Leibniz. He states that while everything has a reason, we humans are incapable of conceiving the final cause of all things—only God can. Another support for the principle of sufficient reason is the argument for metaphysical perfection, whereby Leibniz argues that existence is better than non-existence. It is better for more things to exist than for fewer things to exist. Hence, Leibniz calls upon us always to appeal to logic and reason in order to find the reason for everything. A question may come to mind right now: If God creates everything with a sufficient reason, including contingent, or possible, truths, does that imply that contingent truths are actually necessary? If there is a reason one apple is red, does that mean that that particular apple is necessarily red and cannot be green, since that possibility has not been actualized? As said earlier, God knows the sufficient reason for everything, but we do not. Just as with analytic a priori judgements, Kant adapted this type of truth and turned it into synthetic a posteriori judgements, which are propositions that are gained from experience and are contingent. Similarly, Hume, who preceded Kant, is famous for his logical fork, which divides truths respectively into matters of fact and relations of ideas. The theme of contingency is essential to Leibniz’s philosophy. Contingent truths are everywhere, and they exist because there are sufficient reasons for them. When two or more possibilities are compatible and can co-exist without logical problems, a condition is met called compossibility, which translates to “possible with.” Problems arise when one possibility is not compatible with another. 
An illustration: It is a fact that humans have two eyes, but this is one of an infinity of possibilities, another being that humans have one eye, like a cyclops. It is impossible for both possibilities to co-exist: We cannot have two eyes and only one eye at the same time. If you were to look around, you would see that we have two eyes, not one. Accordingly, when one possibility is actualized, it negates the other. Contrast this with a truth of reason. One cannot say “All triangles have four sides,” as this is a contradiction; it is simply impossible. Leibniz proposes that a world such as ours is the sum total of all its compossibilities. In our world, humans have two eyes, two ears, and a nose. However, Leibniz says that there are infinite possible universes. It is important to note that they are possible universes, not plain universes, because the existence of our world negates the existence of the others. In another contingent universe, humans have one eye, one ear, and two noses, but because this world exists and not that one, it does not exist in actuality. One may ask the age-old question “Why is there something rather than nothing?” to which Leibniz would reply: the principle of sufficient reason. God created the world based on metaphysical perfection and the identity of indiscernibles. The identity of indiscernibles says that if A and B are completely identical and share every property, then they are indistinguishable and consequently the same thing. One can substitute A for B and B for A. Using this principle, Leibniz reasons that, in creating our world, God would have no reason to choose here rather than there, or now rather than then, insofar as they are all identical before the existence of the world. For this reason, everything is unique, and no two things are the same.
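The identity of indiscernibles can be stated in standard second-order notation, a modern rendering rather than Leibniz’s own:

```latex
% If A and B share every property F, they are one and the same thing:
\forall F\,\bigl(F(A) \leftrightarrow F(B)\bigr) \;\rightarrow\; A = B
% The converse, the indiscernibility of identicals, is uncontroversial:
A = B \;\rightarrow\; \forall F\,\bigl(F(A) \leftrightarrow F(B)\bigr)
```

The first implication is the controversial one: it rules out two genuinely distinct things that agree in every property, which is exactly why Leibniz thinks God could not have placed an otherwise identical world “there” rather than “here.”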

All is monads. So says Leibniz. A monad, from the Greek monas, meaning “one” or “unit,” is an independent, individual, and self-contained entity. The monad is its own entity to the extent that it is completely separate from all other monads and contains within it its own individuality, by which it distinguishes itself from the others, as in the identity of indiscernibles. Leibniz claims monads are “windowless.” Unlike biological cells, which have a permeable membrane allowing resources to pass in and out, monads are enclosed and shut off from everything else, allowing nothing to come in or out or affect them. Thus, when Leibniz speaks of monads as being self-contained, he means they cannot be affected from the outside, but contain inside themselves their own causality. Another thing about monads is that they are not like your average atoms, inasmuch as they are simple, indivisible “points” of consciousness. A “point” in geometry lacks any and all dimensions yet constitutes a location in space, and this is what an atom is supposed to do. Leibniz counters this proposition, saying atoms are not the fundamental constituents of reality. Arguing against the Cartesian concept of matter, which states that matter is “extended”—that it has physical shape and size, that it is located in space—Leibniz says that anything extended is divisible. As in Zeno’s paradoxes, take a line and divide it in half, then divide that half in half, and then that half, and so on: The line, which is extended, can always be broken down—it can be made simpler. Leibniz claims that atoms are the same way. Atoms are not simple, but complex. Because material atoms can be infinitely divided, Leibniz suggests that the building blocks of reality are immaterial. Such a building block would be simple because it has no parts; in fact, it is the part from which complexities, or aggregates—groupings of simple parts into more complex ones—are made. 
In another argument against atomism, Leibniz tackles the physics laid down by Descartes and Newton. If an atom is a lifeless extension, then it requires an outside body or force to move it; but, Leibniz points out, mere extension offers no resistance, and so cannot be moved by outside force alone. Monads, then, are energy. Inherent in monads are inertia and force. Leibniz posits a vis viva, or living force—an entelechy, or internal drive—that is inherent to monads, a force which has a tendency to motion. This mirrors the concept of conatus, which is like the beginning of a succession of motion; conatus is that initial force in the instant that makes a body move. Leibniz’s calculations showed that a certain quantity of energy remains constant in a collision, a quantity he formulated as mv^2 (mass × velocity squared). Singlehandedly, Leibniz devised a formula for kinetic energy, a type of energy many suspected existed, but for which there was no mathematical expression. Leibniz states that kinetic energy, not momentum as Newton said, is the real cause of motion. And because kinetic energy is energy in action and requires potential energy first in order to be active, it must mean activity is intrinsic to monads. Amazingly, Leibniz was a precursor to modern physics. He almost anticipated Einstein’s famous E=mc^2, and he was off on kinetic energy only by a factor of ½ (the modern formula is ½mv^2)! Leibniz, nearly 300 years before Einstein, was nearly able to prove through reasoning that matter is actually energy. Monads are substances. Substances, as opposed to matter, are simple. Substance is like a noun: It is a concept and a proper thing that can be described. Descriptors, adjectives, are called “accidents,” because qualities are contingent, whereas substance is necessary; contingent properties are applied to the substance, but they do not change the substance’s form, for they are additions and merely add to it. 
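In present-day terms, Leibniz’s conserved quantity relates to kinetic energy as follows (a modern reconstruction, not his notation):

```latex
% Leibniz's vis viva is twice the modern kinetic energy:
\text{vis viva} = m v^2 = 2\,E_k, \qquad E_k = \tfrac{1}{2} m v^2
% Both are therefore conserved in an elastic collision of two bodies:
m_1 v_1^2 + m_2 v_2^2 = m_1 v_1'^2 + m_2 v_2'^2
```

Since the vis viva is just twice the kinetic energy, a collision that conserves one conserves the other; the factor of ½, fixed in the 19th century, changes nothing about which quantity stays constant.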
Leibniz proceeds to construct his philosophy with the aid of grammar. As in English, a subject is an actor, and a predicate is an action done by it. He defines substance, then, as “unextended subjects… individuated by predicates”; i.e., immaterial forms are made distinct by their actions, or what is said (predicated) of them.[1] Here, Leibniz puts forth his famous idea of the “pre-established harmony.” Simply put, since monads cannot interact with each other, they are set in harmony before creation by God. In short, all predicates are contained in their subjects. This somewhat echoes predestination, because it says that everything—past, present, and future—is hardwired into each monad so that they act not on each other, but with each other, such that “the state of the whole universe could be read off from any one Monad.”[2] God creates monads as though they are clocks, each of which is designed to strike the same hour at the same time without cooperation between them. Because God designed them, He is the clock of which they are copies, so they mirror Him. Man is limited in his reason, so he cannot grasp this harmony in its entirety. When the monads are created, they are created with internal, self-regulatory laws that tell them what to do and when, like a clock mechanism. These monads are therefore spontaneous: They change on their own because they contain within themselves their future. Take the statement “Leibniz was born in 1646.” Leibniz is a monad, a substance, and consequently a subject, a self-contained entity, and the predicate “was born in 1646” is contained in the subject, Leibniz, by which I mean that part of what makes Leibniz Leibniz is the fact that he was born in 1646, and not in 1647, for example. Yet another characteristic of monads is that they are microcosms—miniature universes that mimic the cosmos inside themselves. Each monad reflects the universe from its unique perspective. 
This concept is hard to grasp, but think of a room with furniture in it. A painting on the wall will have a wide view of the room, and the carpet will see everything above it; the fan mounted on the ceiling has a bird’s-eye view, but it cannot see from the perspective of either the painting or the carpet. Thus, each monad is essential to the universe, and each perspective is unique. But what do monads actually do? Monads are capable of three things: perception, appetition, and apperception. Perception is active and non-reflexive, or outward. It is the external representation of the unfolding of other monads. A dog perceives a squirrel running up a tree, but this perception is just a phenomenon; the dog is witnessing the squirrel enacting one of its predicates, namely running up a tree; it is unfolding. Appetition is the ability to progress from one perception to the next. Apperception is reflexive and passive—it is self-consciousness, and it is reserved for man alone. Humans are themselves monads, albeit of a different form. A human is a “corporeal substance,” which is made up of a dominant monad (the soul) and an aggregate (multiple monads).[3] Regarding the mind-body problem, Leibniz rejects Cartesian dualism with its resulting interactionism, as well as Malebranche’s occasionalism. I think it interesting that Newton likened God to a clockmaker who, every now and then, had to rewind the cosmic clock, on the grounds that if He were to make the universe fully automatic, it would render Him impotent; Leibniz claims the contrary for the exact opposite reason. Leibniz considered it silly to think that God had to continually rewind His own mechanism, which turned God into a functionary whose job was lowly, whereas He could exhibit His power by creating a self-regulatory universe. 
Leibniz’s solution to the mind-body problem is parallelism: The body and the mind, separate monads, work at the same time without causally interacting, because of the pre-established harmony. Now, as monads are unique, it would be strange to assert that rocks are of the same order as humans, and humans as God. To account for this, Leibniz creates a hierarchy of monads, the criterion of ranking being clarity of perception. There are aggregate, integral, and essential monads, which correspond, respectively, to bare, animal, and rational/spiritual monads. Bare aggregates are composites, meaning they are composed of many monads. They are inanimate and unconscious, often with blurry or confused perceptions. An example would be a rock. Animal integral monads are, as the name says, animals that are made of an aggregate and a soul, endowed with the power of memory. Humans are integral but rational monads, which means they, like animals, are made of two parts: an aggregate, and, unlike animals, a spirit rather than a soul.[4] Man is also endowed with consciousness. What distinguishes man from animals most saliently, however, is his knowledge of truths of reason. Because he is self-conscious, because he has the ability to introspect, and because a priori knowledge is innate to him, man can grasp necessary truths. Last are essential monads, which are equivalent to truths of reason. A triangle, or God, is an essential monad because it is simple and cannot be refuted. The last thing Leibniz has to say about metaphysics concerns space and time. Newton believed space and time were absolutes. Space is an entity that extends everywhere, and time is another entity in which events happen. Leibniz disagrees, stating that space and time are relative. Recall the identity of indiscernibles. If space and time were absolute, then no point of space and no instant of time could be differentiated from the next, meaning they would collapse into a single point rather than being independent dimensions. 
Space is defined as the coexistence of bodies, and time as the succession of the monads’ eternal unfolding. As a result, space is only space when it is used in reference to two or more bodies. Time, in like manner, cannot be objectively measured, but must be measured in reference to something. Spacetime, it can be inferred, is relative, in that it depends on what you are measuring; in this manner, Leibniz can be seen as anticipating the modern theory of relativity, too.

A theologian, Leibniz argued for the existence of God in two main ways: the Ontological Argument, and the argument from pre-established harmony. Borrowed from Saint Anselm, the Ontological Argument runs as follows: If a perfect being is imagined, it is predicated of it that it must have all perfect qualities, one of which is existence; but a being existing only in the imagination would lack this perfection, so the perfect being must exist in order to be perfect, and this perfect being is God; therefore, God exists. The argument from pre-established harmony is similar to the Teleological Argument in that it argues that there is a clearly observable harmony in nature, and this perfect harmony must have been orchestrated by some perfect being who oversaw it—God. His most famous work, the Theodicy, seeks to answer the Problem of Evil, which asks how a benevolent, omnipotent God could allow evil. Assuming God is benevolent and omnipotent, Leibniz writes, He must have chosen, out of all the possible worlds, the best one: this one. The best world is the one with the fewest causes and the most effects; in a word, an optimal world. For this reason, he proclaims we live in the “Best of all possible worlds.” He reminds us that happiness is not the only measure of good, and that evil is the absence of good, a remark made earlier by Augustine. This world, he admits, is not perfect. But it does not need to be. Rather, because God Himself is perfect, it would be impossible for Him to create a world as perfect as Himself; evidently, this world cannot be as perfect as He, but must be at least a little bit flawed, so as to distinguish it from Him. Again, according to the principle of sufficient reason, everything happens and exists for a reason. 
Humans—imperfect, rational beings—cannot comprehend every reason God decrees, so even if we experience evil and cannot justify it reasonably, it stands that it happened for a reason, albeit one of which we are ignorant; coming from God, it must be so. As God is the highest, clearest monad, all monads mirror Him imperfectly. Leibniz assures us that God did not create this world out of logical or metaphysical necessity, but out of ethical necessity. God created, in the words of Leibniz, “a moral world within the natural world.”[5] Ruth L. Saw wrote bluntly, “Leibniz cannot be described as a man of great moral insight.”[6] I would agree with this generalization, but only to the extent that he did not produce any substantial works on ethics. Leibniz equated knowledge with power. Happiness is correlative to clarity, so clearer monads will be happier because they are closer to perceiving God. Using this reasoning, Leibniz is able to take a jab at the ignorant, who are not actually in bliss, but rather in a stupor. Those with more understanding can follow and adhere to necessary truths, which, Leibniz says, are obligations of the moral man. Charity, in this manner, is an obligation, a necessary action. Leibniz also antedated utilitarianism, devising a calculus similar to Bentham’s, the purpose of which was to determine the benefits of more “perfect” (well-off) beings.[7] A problem arises which has not yet been addressed and which remains an elephant in the room: the problem of free will. The pre-established harmony certainly seems to leave no room for free will, prompting the question: In a determined world, is freedom possible? Leibniz answers yes. He clarifies that some predicates are contingent, leaving room for free will. 
For example, take “Napoleon became emperor in 1804.” It may seem that it was necessary for Napoleon to become emperor in that year, seeing as it happened that way and not otherwise; however, while the subject Napoleon contains the predicate “became emperor in 1804,” thus defining Napoleon, it is possible that God might have made him emperor a year earlier, meaning the predicate is contingent, a compossibility. But while God leaves room for free will, do we have self-determination? Are we able actually to cause things through our own causal power? Technically, yes, says Leibniz, controversially. It all depends on what exactly self-determination entails. “The free man is one who knows why he does what he does.”[8] Our actions, mind you, are internally determined by our predicates, which are already contained within ourselves. In this sense, we have no free will. But if we can understand our motives, if we can understand the pre-established harmony, we can realize our thoughts and total possibilities. Because we unfold according to a pre-drawn map, if we are able to find this map, study it, and predict from it, we are, in a sense, in control of our actions. We know what we will do—we are just destined to do it.

One of the lesser-appreciated and lesser-studied philosophers, Gottfried Leibniz remains an insightful and prescient rationalist, a truly modern philosopher whose genius was far ahead of his time, and whose cleverness was recognized too late. A physicist just as much as a philosopher, he remains an important figure in the history of science. In his life, he designed several revolutionary inventions, although none of them worked. Leibniz has gone down in history as one of the first rationalist advocates of optimism, yet despite his Panglossian philosophy, he ironically did not find much success in life. In the end, Leibniz stands as one of the great systematizers of philosophy and one of the most intelligent men in history.


[1] Ferm, A History of Philosophical Systems, p. 248
[2] O’Connor, A Critical History of Western Philosophy, p. 224
[3] Leibniz never really answers how immaterial points can constitute a physical body.
[4] According to Leibniz, a spirit is higher than a soul.
[5] Leibniz, Monadology, §86
[6] O’Connor, op. cit., p. 234
[7] By “perfect” beings, he refers to wealthy, fortunate people. His determinism precludes simpler people, making him, arguably, an elitist.
[8] Stumpf, Socrates to Sartre, p. 250


For further reading: 
The Columbia History of Western Philosophy by Richard R. Popkin (1999)
A Critical History of Western Philosophy by D.J. O’Connor (1964)
The History of Western Philosophy by Bertrand Russell (1972)
The Encyclopedia of Philosophy Vol. 4 by Paul Edwards (1967)
A History of Philosophical Systems by Vergilius Ferm (1950)
Socrates to Sartre by Samuel Enoch Stumpf (1982)
History of Philosophy by Julián Marías (1967)
The Philosophers by Ted Honderich (2001)