Technology and Social Media: A Polemic

§1


Much gratitude is to be given to our devices—those glorious, wonderful tools at our disposal, which grant us capabilities of which man centuries ago could only have dreamed, the culmination of years of technology, all combined in a single gadget, be it the size of your lap or your hand. What a blessing they are, able to connect us to those around the world, to give us access to a wealth of knowledge, and to give longevity to our lives, allowing us to craft narratives and tell our stories; and yet, how much of a curse they are, those mechanical parasites that latch onto their hosts and drain them of their vitality, much as a tick does. That phones and computers are indispensable, and further, that social media acts as a necessary sphere combining the private and the public, creating the cybersphere—such is incontrovertible; and yet they are abused to such an extent that these advantages have been corrupted and have lost their supremacy in the human condition.

§2


Technology is ubiquitous, inescapable, and hardwired into the 21st century, so that it is a priori, given, a simple fact of being whose facticity is such that it is foreign to older generations, who generally disdain it, as opposed to today’s youths, who have been, as Heidegger said, thrown into this world, this technologically dominated world, wherein pocket-sized devices—growing bigger by the year—are everywhere, the defining feature of the age, the zeitgeist, that indomitable force that pervades society, not just concretely but abstractly, not just descriptively but normatively. In being-in-the-world, we Millennials and we of Generation Z take technology as it is, and accept it as such. To us, technology is present. It is present insofar as it is both at hand and here, by which I mean it is pervasive, not just in terms of location but in terms of its presence. A fellow student once observed that we youths are like fish born in the water, whereas older generations are humans born on land: Born into our circumstances, as fish, we are accustomed to the water, while the humans, accustomed to the land, look upon us, upon the ocean, and think us strange, pondering, “How can they live like that?”

§3


As per the law of inertia, things tend to persist in their given states. As such, people, like objects, tend to resist change. The status quo is a hard thing to change, especially when it was conceived before oneself was. To tell a fellow fish, “We ought to live on the land as our fathers did before us”—what an outlandish remark! Verily, one is likely to be disinclined to change one’s perspective, and will rather cling to it with tenacity, to the extent that it develops into a complacency, a terrible stubbornness that entrenches one further within one’s own deep-rooted ways. Such an individual is a tough one to change indeed. What is the case, we say, is what ought to be, and so it becomes the general principle whereupon we rest our case, and anyone who says otherwise is either wrong or ignorant. Accordingly, the youth of today, the future of humanity, accepts technology as its own unquestioningly. As per the law of inertia, things tend to persist in their given states—that is, until an unbalanced force acts upon them.

§4


What results from deeply held convictions is dogmatism. A theme central to all users of devices, I find, is guilt; a discussion among classmates has led me to believe that this emotion, deeply personal, bitingly venomous, self-inflicted, and acerbic, is a product of our technological addictions. Addiction has the awesome power of distorting one’s acumen, a power comparable to that of drugs, inasmuch as it compromises the mind’s faculty of judgment, preventing it from distilling events, from correctly processing experiences, and thereby corrupting our better senses. The teen who is stopped at dinner for being on their phone while eating with their family, or the student who claims to be doing homework when, in reality, they are playing a game or watching a video—what have they in common? The vanity of a guilty conscience, which would rather be defensive than apologetic. The man of guilt is by nature disposed to remorse, and thus he is naturally apologetic in order to right his wrong; yet today, children are by nature indisposed thereto, and are conversely defensive, as though they are the ones who have been wronged—yes, we youths take great umbrage at being called out, and instead of feeling remorse, instead of desiring to absolve our consciences of their intrinsic guilt, feel that we have nothing from which to absolve ourselves, imputing the disrespect to those who called us out.

§5


Alas, what backward logic!—think how perverse it would be if the thief were to call out the poor householder who caught him. Technology has led to moral bankruptcy. A transvaluation of morals in this case, to use Nietzsche’s terminology, is to our detriment, I would think. Guilt is a reactionary emotion: It is a reaction formed ex post facto, with the intent of further action. To be guilty is to want to justify oneself, for guilt is by definition self-defeating; guilt seeks to rectify itself; guilt never wants to remain guilty, no; it wants to become something else. But technology has reshaped guilt, turning it into an intransitive feeling, giving way, if at all, to condemnation, seeking not to vindicate itself but to remonstrate, recriminate, retribute, repugn, and retaliate. Through technology, guilt has gone from being passive and reactive to active and proactive, a negative emotion whose goal is to worsen things, not to placate them. Digital culture has perpetuated this; now, being guilty and remaining so is seen as normal, even valuable. Guilt is not something to be addressed anymore. Guilt is to be kept as long as possible. But guilt, as I said, is naturally self-rectifying, so without an outlet it must be displaced—in this case, into resentment, resentment directed toward the person who made us feel this way.

§6


—You disrupt me from my device? Shame on you!—It is no good, say you? I ought get off it? Nay, you ought get off me!—You are foolish to believe I am doing something less important than what we are doing now, together, to think it is I who is in the wrong, and consequently, to expect me thus to put it away—You are grossly out of line—You know naught of what I am doing, you sanctimonious tyrant!—

§7


When asked whether they managed their time on devices, some students replied, quite unsurprisingly, that they did not; this alone serves as a frightful example of the extent to which our devices play a role in our lives. (Sadly, all but one student claimed that they actually managed their time.) They were then asked some of the reasons they had social media, to which they replied: to get insights into others’ lives, to de-stress and clear their minds after studying, and to talk with friends. A follow-up question asked whether using social media made them happy or sad, and the answer was mixed: Some said it made them happier, some said it made them sadder. An absurd statement was made by one interviewee who, when asked how they managed their time, said they checked their social media at random intervals throughout studying in order to “clear their mind off of things” because their brains, understandably, were tired; another stated they measured their usage by the number of video game matches played, which, once it was met, signaled them to move on to something else—not something physical, but some other virtual activity, such as checking their social media account. I need not point out the hypocrisy herein.

§8


I take issue with both statements combined, for they complement each other and reveal a sad, distasteful pattern in today’s culture which I shall presently discuss. Common to all students interviewed was the repeated, woebegone usage of the dreaded word “should”:
—“I should try to be more present”—
—“I should put my phone down and be with my friends”—
—“I should probably manage my time more”—

§9


Lo! for it is one thing to be obliged, another to want. Hidden beneath each of these admissions is an acknowledgment of one’s wrongdoing—in a word, guilt. Guilt is inherent in “shoulds” because they represent a course of action one knows to be right yet did not take. One should have done this, rather than that. Consequently, the repetition of “should” is vain, a mere placeholder for the repressed guilt, a means of shedding some of the weight on one’s conscience; therefore, it, too, the conditional, is as frustrated as the guilt harbored therein.

§10


Another thing with which I take issue is what the two students said about their means of time management. The first said they liked to play games on their computer, taking breaks intermittently by going elsewhere, either to their social media or to YouTube to watch videos. No less alogical, the other said they would take breaks by checking their social media, as they had just been concentrating hard. How silly it would be for the drug addict to heal himself with the very thing which plagues him! No counselor treats the alcoholic with alcohol; common sense dictates that solving a problem with the very thing that is the problem in the first place is nonsense! Such is the case with the culture of today, whose drugs are their devices. In the first place, how exactly does stopping a game and checking some other website constitute a “break”? There is no breach of connection between user and device, so it is not in any sense a break, but a mere switch from one thing to the next, which is hardly commendable, but foolish forasmuch as it encourages further usage, not less; as one activity is defined in relation to the next, it follows that this is a cycle, not a regimen, for there is no real resting period, only transition. Real time management would consist of playing a few games, then deciding to get off the computer, get a snack, study, or read; going from one device to another is not management at all. Similarly, regarding the other scenario, studying on one’s computer and taking a break by checking one’s media is no more effective. One is studying for physics, and after reading several long paragraphs, sets upon learning the vocabulary, committing to memory the jargon, then solving a few problems, but one is thus only halfway through: What now? Tired, drained, yet also proud of what has been accomplished thus far, one decides to check one’s social media—only for 30 minutes, of course: just enough time to forget everything, relax, and get ready to study again—this is not the essence of management; nay, it is the antithesis thereof! No reasonable state of mind could deem this sensible. If one is tired of studying, which is justifiable and respectable, then one ought to (not should!) take a real break and really manage one’s time! Social media is indeed a distraction, albeit of a terrible kind, and not the one we ought to be seeking. Checking a friend’s or a stranger’s profile and looking through their photos, yearning for an escape, hoping for better circumstances—this is not calming, nor is it productive. A good break, good time management, is closing one’s computer and doing something productive. Social media serves only to irritate the brain further after exhaustion and is not healthy; instead, healthy and productive tasks, whose benefits have been well documented, ought to be taken up, such as reading, taking a walk, or exercising, among other things: A simple search will show that any of the aforementioned methods is remarkably effective after intense studying, yielding better memory, better focus, and better overall well-being, not to mention the subconscious aspect, by which recently learned information is better processed if set in the back of the mind during some other activity, such as the latter two, which are both physical, bringing with them both physiological and psychological advantages. Ultimately, time management consists not in transitioning between devices, but in transitioning between states of mind and body.

§11


The question arises: Why is spending too much time on devices a problem in the world? Wherefore, asks the skeptic, is shutting oneself off from the world and retreating into cyberspace, where there are infinite possibilities, a “bad” thing? Do we really need face-to-face relationships or wisdom or ambitions when we can scroll through our media without interference, getting a window into what is otherwise unattainable? Unfortunately, as with many philosophical problems, including the simulation hypothesis, solipsism, and the mind-body problem, no matter what is argued, the skeptic can always refute it. While I or anyone else could give an impassioned speech in defense of life and about what it means to be human, it may never be enough to convince the skeptic that there is any worth in real-world experiences. It is true that one could easily eschew worldly intercourse and live a successful life on one’s device, establishing one’s own online business, finding that special person online and being in love long distance—what need is there for the real world, for the affairs of everyday men? The philosopher Robert Nozick asks us to consider the Experience Machine: Given the choice, we can either hook ourselves up to a machine that simulates a perfect, ideal, desirable world wherein all our dreams come true and everything we want, we get, like becoming whatever we have always wanted to become, marrying whomever we have always wanted to marry, yet which is artificial and, again, simulated; or we can remain in the real world, where there is inevitable strife and struggle, but also triumph, and where we experience pleasure and pain, happiness and sadness—but all real, all authentic. There is, of course, nothing stopping one from choosing the machine, and the skeptic will still not be swayed; but I think the sanctity of humanity, that which constitutes our humanity, ought never be violated.

§12


What, then, is the greatest inhibition to a healthy, productive digital citizenship? What can we do to improve things? The way I see it, the answer is in the how, not the what. Schools can continue to hold events where they warn students of the dangers of technology, advise them on time management, and educate them about the proper usage of technology and online presence; but while these can continue ad infinitum, the one thing they cannot supply is our—the students’—want to change. Teachers, psychologists, and parents can keep teaching, publishing, and lecturing ever more convincingly and authoritatively, but unless the want to change is instilled in us, I am afeard no progress will be made. Today’s generation will continue to dig itself deeper into the technological world. They say the first step in overcoming a bad habit or addiction is to admit you have a problem. As I said earlier, technology just is for us youths, and it always will be henceforth; there will not be a time when there is not technology, meaning it is seen as a given, something essential, something humans have always needed and will continue to need. Technology is a tool, not a plaything. Technology is a utility, not a distraction. Social media is corrupting, not clarifying, nor essential. We have been raised in the 21st century such that we accept technology as a fact, and facts cannot be disproven, so they will remain, planted, their roots reaching deeper into the soil, into the human psyche. Collectively, we have agreed technology is good, but this is “technology” in its broadest sense, and that breadth clouds our view of it. We believe our phones and computers are indispensable, that were we to live without them, we would rather die. To be without WiFi—it is comparable to anxiety, an object-less yearning, an emptiness in our souls. How dependent we have become, we “independent” beings! This is the pinnacle of humanity, and it is still rising! Ortega y Gasset, in the style of Nietzsche, proclaimed, “I see the flood-tide of nihilism rising!”¹ We must recognize technology as a problem before we can reform it and ourselves. A lyric from a song goes, “Your possessions will possess you.” Our devices, having become a part of our everyday lives to the extent that we bring them wheresoever we go, have become more controlling of our lives than we are of ourselves, which is a saddening prospect. We must check every update, every message, every notification we receive, lest we miss out on anything! We must miss out on those who care about us, who are right in front of us, in order not to miss out on that brand new, for-a-limited-time sale! But as long as we keep buying into these notifications, for so long as we refuse to acknowledge our addictions and the problem before us, we will continue to miss out on life and waste moments of productivity, even if only for a few minutes at a time, minutes which, when added up at the end of our lives, will turn out to be days, days we missed out on. As my teacher likes to say, “Discipline equals freedom.” To wrest ourselves from our computers or phones, we must first discipline ourselves to do so; and to discipline ourselves, we must first acknowledge our problem, see it as one, and want to change. As per the law of the vis viva (and not the vis inertiæ), things tend to persist in their given states until their internal force wills otherwise.
We bodies animated by the vis viva, we have the determination and the volition to will ourselves, to counter the inertia of being-in-the-world, of being-online, and thereby to liberate ourselves and awaken, so to speak. We addicts have no autonomy over our devices—we are slaves to them. Until we break out of our complacency, until we recognize our masters and affirm our self-consciousness thence, and until we take a stand and break from our heteronomy, we will remain prisoners, automata, machines under machines. We must gain our freedom ourselves. But we cannot free ourselves if we do not want to be freed, if we want to remain slaves, if we want to remain in shackles, if we want to plug into the machine. A slave who disdains freedom even when freed remains a slave. Consequently, we cannot be told to stop spending so much time on our devices, to pay attention to whom or what is in front of us; we must want to do so ourselves. Yet no matter how many times or by whom they are told, today’s youth will never realize it unless they do so themselves. They must make the decision for themselves, which, again, I must stress, must be of their own volition. Until then, it is merely a velleity: a desire to change, but a desire in-itself—nothing more, a wish with no intent to act. It is one thing to say we should spend less time, another that we ought to.

 


¹Ortega y Gasset, The Revolt of the Masses, p. 54


Athletics in Ancient Greece

Ancient Greece is remembered for many things, among them philosophy, science, architecture, and drama. Rich in culture and diversity, the Greek city-states were the perfect place for innovation and achievement. Beyond their intellectual climate, the Greeks were famous for their athletics as well. In Ancient Greece, the athlete received just as much honor as the intellectual, and thus the mental and the physical flourished together. Of the athletic achievements in Greece, the most notable are the Olympics. The Games saw the coming together of the city-states in a collective embrace of the athlete and his feats. And outside of the Olympics, the Greeks continued their love for sport.


The first Olympics, it is said, were held in 776 BCE, their purpose not athletic but religious. When the Olympics were first conceived, the Greeks intended them to be a religious ceremony, a way to honor Zeus. Introducing athletics into the Olympics was a way of pleasing the gods, whom, the Greeks hoped, the performances would entertain. However, the ceremony was not wholly religious, in that the Greeks held it also to celebrate their humanism, specifically that of which the body was capable. Athletic achievement was one of the highest honors; it showed what discipline and dedication could lead to, and it inspired others by example. Originally, the competitions extended only to foot races and wrestling. Only later were horse and chariot racing, boxing, and javelin added. The popularity of the Games grew thereafter, and in 582 BCE Delphi initiated the Pythian Games; a year later, Corinth hosted the Isthmian Games; and in 573 BCE, Nemea began its own games. Popular legend says the marathon is derived from the historical battle of the same name. The historian Herodotus recorded that the Greeks “sent off to Sparta a herald, one Pheidippides, who was by birth an Athenian, and by profession and practice a trained runner…. The Athenians… established in his honour yearly sacrifices and a torch-race.”[1] In Herodotus’s telling, Pheidippides ran the roughly 140 miles to Sparta to seek aid against the massive Persian army; later legend has a runner carrying news of the Athenian victory the 26-odd miles from Marathon to Athens, the feat which inspired the modern-day marathon. In 394 CE, the Olympic Games were outlawed by Theodosius.


The standard performance was the pentathlon, a five-event competition consisting of the broad jump, discus, javelin, wrestling, and a 200-yard dash. Jumpers would begin at a standstill, dumbbells in hand, and leap; discus throwers used 12-pound discuses; wrestlers were graded by referees on takedowns and form; and the dash was called the stadion (σταδιον), since it was the length of the stadium. Running in Ancient Greece was comparable to today’s, with up to three further events: the diaulos (διαυλος), a single lap of the stadium, down and back; the dolichos (δολιχος), a long-distance race of some 12 laps; and the armor race, adapted from military training, in which the competitors sprinted wearing armor. Marble starting blocks, worn down from repeated use, can still be found in the stadiums. Boxing was a popular sport, more so than today, and attracted large audiences. Boxers wore hide gloves that extended to the elbows, and blows were restricted to the head alone. There were no rounds; the match ended when one fighter surrendered. Unlike modern boxing, the Greeks did not compete in weight classes, so the competition devolved from a sport of skill into a sport of pure brawn and muscle. A hippodrome, built specifically for horse racing, was constructed for the Olympics, the arena wide enough for 10 four-horse chariots to race at once, everyone scrambling through the course’s 23 turns. To the enjoyment of the crowd, this affair would usually end with one racer making it to the finish line successfully—the others, due both to the lack of space and the tight corners, all wiping out. Like today, this event caught the attention of the rich, who would bet on horses; if their bet paid off, they—not the racer—got the horse. More popular than all the other events combined was the pankration (παγκρατιον), which translates to “all-strength.” This event was a mix of boxing and wrestling, and the only rules were no biting and no gouging. Gory tales of famous pankratiasts survive, some accounts telling of one who killed his opponent by ripping out his innards. Women were not allowed to compete in the Olympic Games, so they had their own version, the Heraea, whose single event was a footrace.


Today, athletes have four years to train for the Olympics, whereas the Greeks had 10 months of training. Athletes trained in the gymnasium (γυμνασιον) or the xystos (ξυστος), a type of colonnade. Wrestlers had their own training grounds, called the palæstra (παλαιστρα). Runners, on the other hand, trained outside. “They [athletes] also set the example of contending naked, publicly stripping and anointing themselves with oil in their gymnastic exercises. Formerly, even in the Olympic contests, the athletes who contended wore belts across their middles,” wrote Thucydides.[2] A shocking fact to some: the Greeks competed naked, covering themselves in olive oil to keep off the mud and to make themselves more mobile and slippery. Married women were not allowed to spectate during the Games, but unmarried girls were, in part so that they might find future husbands. The city-states came to a truce during the Games, even if they were in the middle of a war, because everyone looked forward to the Games, which occurred every four years, as they do today. The city-states even had separate games for younger athletes, those not yet fully grown: Olympia and Pythia had a boys’ division, and Nemea and Isthmia had an intermediate (ageneioi, αγενειοι) competition. So competitive were the Greeks that they had only first-place prizes; there was no second or third, nor was there a team prize; the individual athlete had his time to shine in the Olympics—it was, after all, a celebration of the body and of human excellence. The Greek roots athlon-, meaning prize, and agon-, meaning contest (whence come “agony” and “antagonist”), both carried connotations of suffering. The agonistic games were not meant to be fun for the athletes; rather, they were vigorous, challenging tests that pushed them to their limits, forcing them to endure more. At each Games, an estimated 40,000-50,000 spectators came from around Greece, and each athlete was announced by name, followed by his home city.


Upon winning an event, the victor would be crowned: Olympia gave out olive wreaths, the Pythian Games laurel, and the Isthmian and Nemean Games parsley. Those who won were rewarded lavishly. Plutarch says Solon set a handsome reward for those who took first: “[T]he victor in the Isthmian games was to have for reward an hundred drachmas; the conqueror in the Olympian, five hundred.”[3] During the winner’s celebration he was showered with leaves (phyllobolia, φυλλοβολια), given free meals for life by his home polis (sitesis, σιτησις), awarded all kinds of gifts, promised free seats at future Games (prohedria, προεδρια), praised by poets, made into sculptures, and granted the honor of having his name engraved in the corridor leading to the arena. The Persians, when they witnessed the Olympic Games, purportedly remarked that the Greeks were “men who contend with one another, not for money, but for honour!”[4] It was strange to them that these people would commit themselves to such arduous training and fight their brethren for the sake of their name, not in anticipation of compensation. But according to Homer, “there is no greater glory that can befall a man than what he achieves by speed of his feet or strength of his hands.”[5] Those who lost at the Olympics, understandably, fell victim to dejection and carried with them tremendous shame, a stigma which stuck with them through life.


Greek children would play in ball rooms (sphairisteria, σφαιριστερια), where they would play, as you might guess, ball. It is thought that they played a version of wall ball, in which they would bounce a ball, either off the wall or off the ground, catch it, then throw it back. Evidence also suggests that they might have had their own version of lacrosse: using sticks with nets at the end, they would play on two teams, each trying to get the ball past the other. Youths had trainers of their own, paidotribai (παιδοτριβαι) and gymnastai (γυμνασται), who were, respectively, the equivalents of wrestling coaches and physical educators. Some other games they played were khytrinda, a variation of monkey-in-the-middle and tag; posinda, a guessing game; and drapinda, which was like duck-duck-goose, the objective being to catch the other children, who pretended to be “runaway slaves.”


Amidst the philosophical contemplation, political strife, and cultural growth, the Ancient Greeks found the time to enjoy their quadrennial Olympic Games, which united the city-states and reminded them of the joy they shared in athletic competition and glory. Victory odes from the poets are plentiful and tell of the greatest athletes, all of whom trained hard, fought hard, and won gloriously. Sometimes we forget how similar we are to the Ancients, who, once you think about it, are not so far off from us as we suppose: We share the same love of athletics and the same appreciation for the wonders the body can do when put to the test.


[1] Herodotus, The Histories, VI.105
[2] Thucydides, The History of the Peloponnesian War, I.6
[3] Plutarch, Twelve Lives, p. 99
[4] Herodotus, op. cit., VIII.26
[5] Homer, The Odyssey, VIII.145

For further reading: Ancient Greece: A Political, Social, and Cultural History 2nd ed. by Sarah B. Pomeroy (2008)
The Oxford Companion to Classical Civilization by Simon Hornblower (1998)
The Story of Civilization Vol. 2: The Life of Greece by Will Durant (1966)
Paideia: The Ideals of Greek Culture Vol. 1 by Werner Jaeger (1945)
The Story of Man: Greece and Rome by Paul MacKendrick (1977)
The Western Experience 6th ed. by Mortimer Chambers (1995)
The Founders of the Western World by Michael Grant (1991)
The History of the Peloponnesian War by Thucydides (1990)

The Classical World by Robin Lane Fox (2006)
The Histories by Herodotus (1990)
Twelve Lives by Plutarch (1950)
The Odyssey by Homer (1990)
The Iliad by Homer (1990)

Attention and Mindfulness (2 of 2)

Summary of part one: Attention is “the process of focusing conscious awareness, providing heightened sensitivity to a limited range of experience requiring more extensive information processing” and requires an external stimulus. Research by Colin Cherry (1953), Donald Broadbent (1958), and Anne Treisman (1964) found that we can attend to only one stream of input at a time, filtering out other incoming stimuli based on physical qualities of the sound.


“It is easy to eat without tasting,” says Jon Kabat-Zinn in Coming to Our Senses (p. 118). At first glance, this sentence seems random, out of nowhere, and completely absurd. Of course we taste our food when we eat it! However, Kabat-Zinn argues that while we claim to experience and sense things, we do not truly experience them. His message throughout the book is that we have become out of touch with ourselves, with our senses, our bodies, and the world around us; we fail to experience things for themselves, insofar as we rush through our lives, treating food as “just another meal,” hastily consuming it, not really taking the time to taste each individual flavor. When we eat a hamburger, all we taste is hamburger, not meat, lettuce, tomato, and so on, but just hamburger. Our meals are prepared and then eaten, but we do not taste them as they should be tasted. Kabat-Zinn states that when attention and intention team up, we are rewarded with connection; from connection, regulation; from regulation, order; and from order, we arrive at ease, at contentment. There is an effect called sensory adaptation that we seldom recognize yet is always at work: constant exposure to an external stimulus builds up our tolerance to it, resulting in the numbing of that sense, to the point that we no longer notice the stimulus at all. That others can smell our body odor but we ourselves cannot is an example of this: our odor is constantly emanating, and the brain, to avoid distraction, builds up tolerance until we no longer smell our own bodies. The purpose of sensory adaptation is to keep us from becoming entirely distracted. The world is full of smells, sounds, sights, touches, and tastes, but imagine if we were exposed to all of them at once—this is why we need to adapt. Of course, were we so rapt in studying that all else was ignored, the sound of a car would still interrupt us, since its intensity would overwhelm our senses. While sensory adaptation has helped us biologically, Kabat-Zinn notes that it also works to our disadvantage, particularly in the dampening of the very senses without which we cannot live. Breathing is of especial importance in meditation. It is necessary to all living things, we must remember; yet we take it for granted, repeatedly neglecting it, forgetting to check how we are doing it. If we took a few minutes every day to attend to our breathing, we could all reduce stress, find composure, and, with practice, even lower our heart rate. This applies to all the senses. As Aristotle keenly reminds us, “[O]ur power of smell is less discriminating and in general inferior to that of many species of animals.”[1] Unlike most animals, humans have a weak sense of smell, and so we rely upon it less. Smell and taste are underrated among the senses, although they are of equal merit. Like breathing, both are taken for granted, appreciated only when we are sick, when we can no longer use them—only then do we wish we could taste and smell again. Just as Kabat-Zinn said, we truly eat without tasting. Eating our food, we feel pleasure in the moment; but if we were sick in the same circumstances, we would appreciate our senses that much more; as such, we must live each day as though we were sick.


There are different kinds of meditation, different ways of being mindful. During meditation, you can do a body or sense scan, in which you spend a few moments going through your body, focusing on the sensations in a particular part, examining it, then moving on; or you can, for a few minutes at a time, focus on each of your main senses, perhaps using only your ears for a minute, your nose the next. Proprioception is a lesser-known sense: the sensation of each body part in relation to the others. In a body scan it is most prevalent, when you feel your body in its totality, as a whole, yet are able to focus on one part at a time. William James, writing about boredom, could just as easily have been writing about this state of meditation:

The eyes are fixed on vacancy, the sounds of the world melt into confused unity, the attention is dispersed so that the whole body is felt, as it were, at once, and the foreground of consciousness is filled, if by anything, by a solemn sense of surrender to the empty passing of time.[2]

Typically, when one meditates, one can close one’s eyes or leave them open, fixed on a certain point, listening to the sounds of the world, acknowledging every part of the body, paying attention to the breath, overcome by a static sense of stillness, neither in the past nor the future but in the present, simply being, moment to moment. There are two types of attention in meditation: abstract, or inward, attention and sensory, or outward, attention. The former involves impartial introspection, the clearing of the mind, the decluttering of ideas. “This curious state of inhibition can for a few moments be produced by fixing the eyes on vacancy. Some persons can voluntarily empty their minds and ‘think of nothing,’” wrote James, describing hypnotism, though inadvertently describing meditation as well.[3] Sensory attention, on the other hand, is simply being attentive to the senses and all incoming stimuli. If you are interested in meditation, there are several exercises that can sharpen your attentiveness, like dhāraṇā, jhāna, and samādhi, or you can practice some brahmavihāras. In dhāraṇā, the meditator is aware of themselves, as a whole and as meditating, and of an object; after dhāraṇā, they move to jhāna, which is awareness of being and of an object; and finally, in samādhi, they find themselves in unity with the object. Samādhi is often translated as “one-pointedness” and refers to pure concentration, pure attention. When in this state, the meditator is in what William James calls voluntary attention. This attention occurs when there is a powerful stimulus, yet you focus on something of lesser intensity. If you are studying and there is noisy construction outside, focusing on the studying, even though the construction is louder and demands your attention, would be an act of voluntary attention. This state, however, cannot be held indefinitely. As James writes, “[S]ustained voluntary attention is a repetition of successive efforts which bring back [a] topic to the mind.”[4] Hence there is no such thing as maintaining voluntary attention, only coming back to it over and over. Brahmavihāras are reflections upon Buddhist virtues; there are four traditional ones: loving-kindness, compassion, joy, and equanimity. Feel free, too, to make your own meditation, in which you reflect on something outside the given topics—questions in philosophy, like good and evil, justice, and the like, are good starters.


I briefly mentioned the idea of clearing the mind, of emptying it of ideas, and to that I shall now return. Thoughts, in Buddhist writings, are treated like clouds, wispy and flowing; they are temporary; sometimes they are clear, sometimes they clump together; sometimes they are sunny, sometimes they are like a storm. Either way, thoughts are not permanent, nor can they harm you in any way. Generally, we ought to be on the lookout for negative thoughts; when they arise, we must simply dismiss them. Thinking about our thoughts is like pouring gasoline on a fire: it merely propagates more of them and makes them worse. It is better to let thoughts pass than to intervene through force. Meditation calls for letting go of all thoughts, good or bad. It is misleading to think that we are trying to get rid of them, that we are trying to single some thoughts out from others. This is not the case; rather, we must acknowledge that we are thinking and let the thoughts pass. If a positive thought comes, do not perpetuate it; let it pass. If a negative thought comes, do not perpetuate it; let it pass. Another thing to remember is that simply acknowledging that you are thinking is itself being mindful, and you should not get frustrated with yourself for it. An important facet of Buddhist psychology is the distinction between perception and conception. Perception is pure sensation; conception is labeling, to put it simply. Sitting in peace and silence, you hear a sound, process it, identify it as the rustling of the trees and the singing of birds, and continue meditating—such is an act of conception, for hearing the sound is perception, but classifying it, labeling it, is conception. Labeling is necessary for living. Without it, there would be no way to comprehend the world; we would be exposed to a chaotic mess, an overwhelming tidal wave of sensations we could not understand. Almost everything we see and process is conceptualized: this is a tree, that is a plant, this is grass, that is dirt on which I am walking. One is tempted to think of Kant’s categories of the mind and the differentiation between phenomena and noumena. Our mind actively shapes our world, grouping things together, creating causal links, imposing spatiotemporal relations, constantly conceiving things. Perception is to noumena as conception is to phenomena. Rarely do we perceive things as they are, as things-in-themselves; we conceive them imperfectly. We need to carry this into meditation, in thought and in sensation: we must try not to classify things by texture, color, or shape, nor judge thoughts by appearance, nor label anything as “good” or “bad.” Another danger of thinking is daydreaming, to which all meditators are vulnerable, especially if their eyes are closed. When we doze off, finding comfort and relaxation, following our breath, we might accidentally slip into our fantasies, moving from the external to the internal, where we begin to plan for the future or reminisce about the past. Neither is good. William James warns us, “When absorbed in [passive] intellectual attention we become so inattentive to outer things as to be ‘absent-minded,’ ‘abstracted,’ or ‘distrait.’ All revery or concentrated meditation is apt to throw us into this state.”[5] By meditation, James is not referring to it in our sense, but to the act of pondering.
We should not fall into the trap of planning for the future or ruminating on the past, because, as Marcus Aurelius said, “[M]an lives only in the present, in this fleeting instant: all the rest of his life is either past and gone, or not yet revealed.”[6] The past is the past, and there is nothing we can do to change it; wishing you could redo something will not help. And the future has not happened yet, so forming unrealistic expectations will not help either.


“But we do far more than emphasize things, and unite some, and keep others apart. We actually ignore most of the things before us,” notes William James.[7] For so formidable a tool, one to which we all have access, the art of attention and of applying it properly has all but been forgotten by today’s society, to its disadvantage. We live in an age in which ADD is rampant and more and more children are diagnosed with it. Further, our technology strips us of our connection to nature, to the world, to each other. We are no longer in touch with ourselves or our senses. With mindfulness and meditation, however, by living in the present and embracing our senses and our lives, we can make our lives meaningful.

 


[1] Aristotle, De Anima II.8, 421a9-10
[2] James, The Principles of Psychology, XI.2, p. 261
[3] Ibid.
[4] Id., XI.6, p. 272
[5] Id., p. 271
[6] Aurelius, Meditations, III.10
[7] James, op. cit., IX.5, p. 184

 

For further reading: Buddhist Psychology Vol. 3 by Geshe Tashi Tsering (2006)
The Principles of Psychology by William James (1990)
Coming to Our Senses by Jon Kabat-Zinn (2005)
Mindfulness by Joseph Goldstein (2016)
Meditations by Marcus Aurelius (2014)
Zen Training by Katsuki Sekida (1985)
De Anima by Aristotle (1990)

What is Humorism?

Psychology and medicine, finding their beginnings in Greek culture, have come a long way; since their speculative foundations, their influence has become larger, more pertinent, and more accurate than ever before, with advances such as prosthetics in medicine and cognitive studies in psychology. It seems as though anything is possible, as though nothing cannot be achieved. One may wonder, then, where psychology came from, and from whom modern medicine developed. Small questions, like why, when someone is in a bad mood, we say they are in bad humor, or why, when someone is angry, we say they are hot-blooded or short-tempered, never fail to come up regarding the origins of either discipline. A glance through history, back to the invention of psychology, can show us the foundation of both psychology and medicine—the ancient system of humorism.


The theory of the four humors derives from the Pre-Socratic philosopher Empedocles (c. 490-430 BC), who posited four basic elements constituting all of reality: air, fire, earth, and water. Everything in the world, he explained, was a synthesis of all four, each contributing its unique characteristics and properties to create everyday objects. Early medical theory was thus based on philosophical theory, and the two subjects were closely intermingled, the cause of many a medical error in ancient times. The man whom we ought to credit for the beginnings of modern medicine is the Greek physician Hippocrates (c. 460-370 BC), who is most renowned for the Hippocratic Oath, still used today. Despite his countless contributions to medicine, it is difficult to pinpoint which works he actually wrote and which were written by his student Polybus or perhaps even by rival doctors. Some of his works, furthermore, seem to diverge in content, contradicting earlier theories. Central to Hippocrates’ method was a holistic approach to the body. “Hippocrates the Asclepiad says that the nature even of the body can only be understood as a whole,” remarked Plato.[1] Each part of the body was to be examined against every other part, so as to treat everything as one. He wrote of a popular principle at the time: “Certain sophists and physicians say that it is not possible for any one [sic] to know medicine who does not know what man is.”[2] Such importance placed upon the human body and its composition made the humoral theory possible, as well as the secularization of medicine itself. Apollo and Asclepius, the gods of medicine, were thought to be behind disease up until Hippocrates, who, diagnosing epilepsy—once thought the work of the gods—said it “appears… to be nowise more divine nor more sacred than any other disease, but has a natural cause from which it originates like other affectations.”[3]


The natural cause of which Hippocrates spoke was the humors. From the Latin word umor, meaning fluid, the humors were four fluids within the body, each aligning with one of the four elements of Empedocles: blood with air, phlegm with water, black bile with earth, and yellow bile with fire. Much later, during the Scientific Revolution, the 17th-century physician William Harvey performed studies on the circulatory system that would eventually disprove Hippocrates and Galen. Acknowledging the two physicians and Aristotle (who also supported the humoral theory), Harvey wrote in his book on animal generation, “And thus they [the Ancients] arrived at their four humors, of which the pituita [phlegm] is held to be cold and moist; the black bile cold and dry; the yellow bile hot and dry; and the blood hot and moist.”[4] According to Hippocrates, one could tell whether the upcoming season would be one of sickness or health by analyzing the weather: if there were extreme conditions, like severe cold during winter or heavy rains during spring, then more diseases were to be expected, whereas normal conditions foretold health and prosperity. Cold seasons exacerbated the cold humors, phlegm and black bile, while warm seasons exacerbated the warm humors, yellow bile and blood. The alchemist Philippus Aureolus Theophrastus Bombastus von Hohenheim, or Paracelsus (1493-1541), was a notorious physician in his time, often burning the works of Galen in public to show his contempt for him and his theories. Instead of the four humors, Paracelsus preferred a more alchemical approach, diagnosing on the basis of saltiness, sweetness, bitterness, and sourness, adding a fifth property, life. In addition, he gave these elements their own properties, such as combustibility, solidity, fluidity, and vaporousness. The human body has a balance to it, what Hippocrates called the body’s krasis (κρασις), or mixture. A healthy body has a good mixture, eucrasia (ευκρασια), meaning it has an even amount of all four humors. Eryximachus, the doctor in The Symposium, explains that,

The course of the seasons is also full of both these principles; and when, as I was saying, the elements of hot and cold, moist and dry, attain the harmonious love of one another and blend in temperance and harmony, they bring to men,… health and plenty, and do them no harm.[5]

While one should strive for the ideal balance, eucrasia, one should stay as far away as possible from dyscrasia (δυσκρασια), or bad mixture, for in the extreme it can result in death. Too much phlegm (mucus), warns Hippocrates, can clog the throat, choking off airflow and resulting in asphyxiation, for instance. A later Renaissance physician, Santorio Santorio (1561-1636), working shortly after Paracelsus, calculated that between the perfect balance of eucrasia and the imperfect balance of dyscrasia lie 80,000 unique diseases stemming from different combinations of the humors. Determined to prove Hippocrates and Galen right, Santorio carried out extensive experiments, measuring the body’s temperature with a thermoscope before and after diagnosis, then measuring it daily thereafter and comparing each new temperature to the healthy one.


“Those diseases which medicines do not cure, iron cures; those which iron cannot cure, fire cures; and those which fire cannot cure, are to be reckoned wholly incurable,” stated Hippocrates confidently.[6] Should some poor soul suffer from dyscrasia, there were several cures with which he could proceed, one for each type of imbalance. Hippocrates invested his faith in incisions, stating that iron, by which he meant the knife, is the next step up from remedies; if surgery does not work, one should proceed to cauterize; but if fire does not work, then one is out of luck. Other proposed cures were sweating and vomiting, which would excrete or purge any excess humors. Then, of course, there was bloodletting, the deadly, imprecise method of cutting the skin and draining off blood. So popular was bloodletting that by the 1500s, “[t]reatment was still based on the Hippocratic theory of humors, and bloodletting was a panacea.”[7] Virtually any disease, it was believed, could be cured by bloodletting—that is, until William Harvey. Besides these cleansing methods, there was an easier, more efficient way of handling humoral diseases, one which required neither knife nor fire: using opposites to counteract. If there was too much blood, a doctor could counteract it with black bile, opposing the hotness and moistness of the former with the coldness and dryness of the latter; similarly, too much yellow bile could be countered with phlegm, and vice versa. Hippocrates was also famous for prescribing his patients varying diets that would in the same way counter the excess humor, usually advising the replacement of wheat with bread, of water with wine.
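The logic of curing by opposites can be pictured as a simple lookup: each humor carries the two qualities given in the Harvey quotation above, and the prescribed counter-humor is the one whose qualities are both reversed. The following Python snippet is a toy illustration of that pairing, not a historical procedure or anyone’s actual method:

```python
# Toy illustration of humoral "cure by opposites": each humor has two
# qualities, and the counter-humor is the one with both qualities reversed.

QUALITIES = {
    "blood":       {"hot", "moist"},
    "yellow bile": {"hot", "dry"},
    "black bile":  {"cold", "dry"},
    "phlegm":      {"cold", "moist"},
}

OPPOSITE = {"hot": "cold", "cold": "hot", "moist": "dry", "dry": "moist"}

def counter_humor(excess: str) -> str:
    """Return the humor whose two qualities oppose those of the excess humor."""
    target = {OPPOSITE[q] for q in QUALITIES[excess]}
    return next(humor for humor, qs in QUALITIES.items() if qs == target)

print(counter_humor("blood"))        # black bile
print(counter_humor("yellow bile"))  # phlegm
```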


This raises the question, though: Why does humorism matter, why is it relevant at all, considering it is outdated and completely incorrect, and why should we be interested? As I said at the beginning, humorism was the foundation for psychology, specifically for the psychology of personality, a much-studied and much-debated area of research today. The Roman physician Galen (c. 130-200) was arguably the first person to attempt a formal study of personality. A studied reader of the Hippocratic writings and a learned student of Stoic logic, Galen was an empiricist at heart, emphasizing experience over speculation, what he called demonstrative knowledge (επιστημη αποδεικτικη). Neither Hippocrates nor Galen studied the interior of the human body, as the dissection of humans was taboo; thus their findings were purely theoretical, which is rather ironic for Galen, who did cut open animals, just not humans. Galen identified two types of passions: the irascible, those which are negative, and the concupiscible, those which are positive.[8] He observed four temperaments arising from the four humors. (Temperament, interestingly, translates to mixture!) In fact, “Before the invention of the clinical thermometer and even for some time afterwards, bodily ‘temperature’ was only a synonym for ‘temperament.’”[9] His theory of the four temperaments is so influential that its adjectives have carried over to today: too much blood creates a sanguine character, who is cheerful; too much phlegm a phlegmatic, who is calm; too much yellow bile a choleric, who is angry; and too much black bile a melancholic, who is gloomy; for the latter two, one can also say bilious. Hippocrates noticed these characteristics in his own time, commenting, “Those who are mad from phlegm are quiet, and do not cry nor make a sound; but those from bile are vociferous, malignant, and will not be quiet, but are always doing something improper.”[10]


One may dissent again: Why is this relevant, when it too is outdated? Although Galen’s theory of the four temperaments is largely out of use,[11] it has inspired later psychologists of personality. The infamous Myers-Briggs Type Indicator, or MBTI (1943), can be seen as a derivative: it combines different traits to arrive at a personality type. Those who wish to know their type have to decide whether they are introverted or extraverted, intuiting or sensing, thinking or feeling, and perceiving or judging. Another option, the Big Five, or the Big Three (1949), identifies people by their levels of openness, conscientiousness, extraversion, agreeableness, and neuroticism; the Big Three limits the scales to neuroticism, extraversion, and openness. Lastly, the most direct descendant is the psychologist Hans J. Eysenck (1916-1997), whose method of deducing personality was deeply influenced by Galen. Eysenck plotted personality along two axes, extraversion-introversion and neuroticism-stability (later adding psychoticism), and recognized in the resulting quadrants character traits reminiscent of Galen’s four temperaments.
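The conventional overlay of Galen’s temperaments onto Eysenck’s axes places one temperament in each quadrant: the stable extravert is sanguine, the unstable extravert choleric, the stable introvert phlegmatic, and the unstable introvert melancholic. Below is a minimal, purely illustrative sketch of that mapping; the numeric scales and zero thresholds are assumptions made for the example, not Eysenck’s actual scoring procedure.

```python
# Illustrative quadrant mapping of Eysenck's two axes onto Galen's temperaments.
# Positive extraversion = extraverted; positive neuroticism = emotionally unstable.

def temperament(extraversion: float, neuroticism: float) -> str:
    """Return the Galenic temperament conventionally placed in this quadrant."""
    if extraversion >= 0:
        return "choleric" if neuroticism >= 0 else "sanguine"       # extraverts
    return "melancholic" if neuroticism >= 0 else "phlegmatic"      # introverts

print(temperament(0.7, -0.4))   # sanguine: a stable extravert
print(temperament(-0.5, 0.6))   # melancholic: an anxious introvert
```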


[1] Plato, Phaedrus, 270c
[2] Hippocrates, On Ancient Medicine, p. 13b*
[3] Hippocrates, On the Sacred Disease, p. 326a
[4] Harvey, Anatomical Exercises on the Generation of Animals, p. 435b*
[5] Plato, The Symposium, 188a
[6] Hippocrates, Aphorisms, §7, 87
[7] Durant, The Story of Civilization, Vol. 5, p. 532
[8] This is a very superficial description; for a more detailed one, read Aquinas’ Summa Theologica, 1.81.2,ad.1
[9] Boorstin, The Discoverers, p. 341
[10] Hippocrates, On the Sacred Disease, 337a
[11] Read Florence Littauer’s Personality Plus for a modern perspective

For further reading: 
Greek Thought: A Guide to Classical Knowledge by Jacques Brunschwig (2000)
The Oxford Companion to Classical Civilization by Simon Hornblower (1998)
Anatomical Experiments on the Generation of Animals by William Harvey

An Intellectual History of Psychology by Daniel N. Robinson (1995)
The Encyclopedia of Philosophy Vol. 3 by Paul Edwards (1967)
The Encyclopedia of Philosophy Vol. 4 by Paul Edwards (1967)
The Encyclopedia of Philosophy Vol. 6 by Paul Edwards (1967)
The Story of Civilization Vol. 2 by Will Durant (1966)
The Story of Civilization Vol. 3 by Will Durant (1972)
The Story of Civilization Vol. 5 by Will Durant (1953)
The Psychology Book by Wade E. Pickren (2014)
The Story of Psychology by Morton Hunt (1993)
The Discoverers by Daniel J. Boorstin (1983)
On the Sacred Disease by Hippocrates
On Ancient Medicine by Hippocrates
On the Natural Faculties by Galen

Extra reading for fun: Personality Plus by Florence Littauer (1992)

*Page numbers refer to Great Books of the Western World, Vols. 9 and 26, ed. Mortimer J. Adler (1990), respectively

The Art of Mindfulness

Take a moment to breathe. It does not matter where you are, whether you are sitting down or on a plane; just take a deep breath. We seem to neglect one of the most fundamental and vital parts of living. The concept of mindfulness is often associated with Zen meditation and Buddhism, but it can be applied in our everyday lives. You do not have to be spiritual to be mindful; all it takes is patience and commitment.

Mindfulness is defined as “the quality or state of being conscious or aware of something.” In reality, the concept is truly simple. One mistake people make is assuming that there is some sort of immediate gratification or enlightenment to be derived from being aware. “As soon as I start to focus on my surroundings, I will be wise” is a mistaken belief. Contrary to what is commonly supposed, mindfulness is not a way of gaining wisdom; rather, it is a new way of experiencing everyday life. Luckily for those of us who have stress-filled lives, mindfulness can be practiced anywhere, at any time. If you are at work, if you are napping, or even if you are on the toilet, mindfulness can be applied.

So what exactly does mindfulness entail? There is a principle in Daoism called Wu Wei, which translates roughly to non-doing. While it seems a bit contradictory, non-doing is an important constituent of being conscious. Wu Wei is not about doing nothing; it is a little more than that. Non-doing is about relaxing into the moment. Relaxation is key here: your attention should not be fixed solely on what is in front of you or on that one itch you have; it is about acknowledging everything around you. Pretend that reality is a picture snapped by a camera. You, the viewer, are examining this picture as it really is. But you should not just look at this picture. Remember to be constantly using your five senses (contrary to popular belief, we have many more than five)! Listen to whatever is going on. Feel the pressure you exert on whatever is beneath you. Smell the air. When I say that mindfulness is applicable anywhere, that even includes eating: your senses of taste, smell, hearing, touch, and sight can all be activated and brought into the moment as you eat. The main idea, if you recall, is to be in the moment. To be present.

We constantly judge things. Whenever we see a person or a thing, we seem to filter it immediately; thoughts are constantly passing through our minds. Remain objective; do not let personal opinions cloud your sight. Instead of noting the color of something or noticing that something does not look right, simply see it as it is. That chair in the corner that keeps bothering you is neither ugly nor beautiful. It is a chair. Appreciate things as they are. Be thankful for being able to be here. Now. In the present. Accept the gift of living. Too much time is spent on that which has passed or on that which has yet to come. Life is too short, so live now. Admittedly, this can be boring, and you can find yourself drifting off. Everything is constantly moving. Your breathing, though, is constant: a steady rhythm of in, then out. Focusing on the breath is what keeps us grounded and in the moment. Again, use your senses to examine the breathing, but do not be subjective; feel it as it is. Not everything is under our control, and we have to learn that the hard way. Sometimes you just have to let things happen. Sometimes you just have to do not-doing. As we do this, we must inquire into ourselves, for part of mindfulness is reflecting on ourselves. What do I want to get out of this? Where do I want to go? Am I awake? Mindfulness is not a state; it is a path. The journey toward awakening is up to us as individuals.

The idea of “me,” “I,” and “mine” frequently finds its way into our thought process. When it comes to mindfulness, this is to be avoided. It has been proven, after all, that the world does not revolve around us. The course of history is not going to pause for your needs alone. You are not always in control of fate, but you are in control of yourself. And while you may not have control over what happens, you may discover that everything in this world is part of a cycle. Wholeness is another key aspect: connect yourself with those around you. To explain it in Jon Kabat-Zinn’s terms, life is like climbing a mountain. We experience the world as we go up. Once we reach the top, we have a clear view of our surroundings. Then we must journey back down. In the fragility of life, there is only so much we can do. One thing many meditators like to do is meditate on an emotion, such as anger, generosity, compassion, joy, or equanimity, to name a few.

Life is a one-time thing; do not waste it. The time we spend dwelling on things we cannot change will not benefit us. All the time we spend texting and browsing the internet cannot be brought back. If there is one thing we can do, it is to be present. We must be mindful. No more seeing life in black and white. From now on, we must experience the world for what it is. Nature is beautiful, and so is life. There is still time to live freely and without judgment. All it takes is patience and commitment. Carpe diem!

 

For further reading: Wherever You Go, There You Are by Jon Kabat-Zinn (1994)
Running with the Mind of Meditation by Sakyong Mipham (2012)
1001 Ideas That Changed the Way We Think by Robert Arp (2013)
Eastern Philosophy: Wu Wei

A Brief History of Running

Whether you hate it or love it, running is one of the most primitive sports in the world. From the beginning of humankind to the present day, humans have been running. But what most people don’t realize is that there is far more to it than just traveling on your feet for exercise. Whether we take it up for survival, for exercise, or for fun, running has always been in our nature. Fortunately, running has come back into vogue, echoing its primordial importance.

Christopher McDougall’s Born to Run became a massive success by detailing the lives of the secluded Tarahumara tribe. This culture of ultramarathon runners has adapted specifically to running; for 400 years they have run races well over a hundred miles long for the pure fun of it. The book also covers the history of running and how its beauty has been tainted over the years. To sum it up, long ago, when we were hunter-gatherers, we “persistence hunted.” Our upright posture gave us the endurance to literally run animals to death. The gist of it is that perspiration is our automatic cooling system, our arched feet propel us forward, and our upper bodies support us, increase our efficiency, and keep us balanced. The recent introduction of running shoes, McDougall points out, has provided over-protection that leads to increased injury. Wearing these shoes prevents us from running the way we are “supposed” to.

Let’s jump forward a little bit. You may recall the ancient Battle of Marathon between the Athenians and the Persians. As the name hints, this is the origin of our modern-day marathon distance. According to legend, a messenger ran from Marathon to Athens, roughly 25 miles, to announce the Greeks’ victory; the historian Herodotus actually records an even longer run, about 140 miles from Athens to Sparta, made before the battle to ask for help. Either way, the supposed feat is incredible, to say the least. A glorious athletic festival was already a Greek tradition by then: in 776 B.C., the first Olympic Games had been held in honor of the gods. What started as simple footraces eventually took on more competitive sports, and the ancient Games were later revived as the modern Olympics. Running was held in great reverence by the Greeks, and their later studies of anatomy and physiology would greatly advance the art.

 

For further reading: Born to Run by Christopher McDougall (2009)
It’s All Greek to Me by Charlotte Higgins (2008)

Why I Love Running – Narrative*

Running is one of those things you either love or hate. I am one of those people with an on-again, off-again relationship with it. I am a sprinter by nature, but I am also a decent long-distance runner. A year ago I absolutely despised the mile (or anything longer), and in middle school it was pretty normal to fear it. However, being the diligent person I am, I decided to learn to embrace long distance. In the sixth grade, I joined the after-school running club. As you might expect, all we did was run the mile, sometimes more, every week. Like anything you want to get better at, running requires long hours of practice and dedication. I decided to order a book about long-distance running: Born to Run. It was through these weeks of tedious, exhausting running and reading that I started to see a change. My pacing, my breathing, and my form slowly progressed, to the point that a mile was only semi-challenging for me. It is in these brief moments of progression that I truly enjoy running.

Two components make up running. First and foremost is the physiology. In the middle of a run, you can feel your lungs practically dying, and your legs fill with that irritatingly lethargic lactic acid. You can feel the hot sweat on your face, and you might just have a cramp. I usually want to give up at this point, but that is where the second and most important part comes in: mentality. Every sport requires a focused and confident mindset. Half of my current knowledge of running comes from the book I mentioned earlier. In Born to Run, Christopher McDougall details the lives of the “ultramarathon-running” Tarahumara tribe of Mexico. He also talks about his own struggles and how he overcame his running problems. I learned quite a bit about our primitive ancestors, who were adapted, and pretty much made, for running long distances. My running teacher told me never to run on my heels; most injuries can be avoided by landing on the balls of your feet, the way we are supposed to. And while I happily die inside, I love to just relax: let the air cool you down, look around at nature, think about how every individual muscle is working so flawlessly, and visualize yourself getting stronger and healthier. It is in these brief moments of contemplation that I truly enjoy running.

This brings us to the present. Every now and then I go running around the hills with my friend Kevin, an avid runner. Kevin is my role model, for the enthusiasm, devotion, and pure enjoyment he brings to every run are inspiring. Oftentimes he will go for a casual 5–10 mile run (awe-inspiring and concerning). I still smile every time I recall stopping mid-run to walk off a cramp or give my legs a rest, only to have Kevin lecture me. Even when his words of encouragement and incessant passion do not quite work, I can see how much Kevin cares about running. So even though my calves might fall off, I power over that last hill. Thoughts of “Give your calves a rest. That hill is too high” subside into determination and gratification. It is in these brief moments of hope that I truly enjoy running.

My former P.E. teacher, Joseph, and I go on runs once a week. Joseph is like my other Kevin. Because he helped me get into fitness, I look up to him as a teacher yet again. On our runs, we talk about science, specifically the questions I ask about the mechanics of running. The best part of his teaching is that he makes it feel as though I am talking to a friend, not a teacher. His community running club, which has yet to catch on, speaks candidly to his intentions: Joseph wants to share his enjoyment of running and use it to bring the community together. He has recently run a couple of marathons and hikes regularly to work on his running. Even when it was 90° outside, I still appreciated the run with him. It is in these brief moments of learning that I truly enjoy running.

You either hate it or love it, unless it is that constant struggle between desperation and inspiration that keeps you going. The failures that stop you along the way only make you more focused, more resilient, better than before. As my current P.E. teachers tell me, “You’re only racing against yourself.” As I write this article, I am realizing why I love running, why I keep going despite the odds. This rebirth has been a long, tortuous journey that has expanded my horizons and given me something new to improve on. I will end with my favorite Kevinism: “We da best.”

Special thanks to Kevin, Mr. S, and Ms. W for getting me to where I am today. I went from running an 11:30 mile to a 6:30 mile (5-minute difference)! I am so honored to have had such great role models. Without them, I would still hate the mile.

(*Very rarely will I upload a narrative)

 

For further reading: Born to Run by Christopher McDougall (2009)