Lines Composed During Overnight Camp, June 13, 2018

While the parents are standing, waiting on the grass,
A steady line of children continues to amass
We slowly load our things and get on the bus;
Meanwhile, the kids are still making a fuss
Making sure our seatbelts are on, we ourselves seat,
And we open the windows to let out the heat
We pass by beaches, forests, towns, and more,
Protected overhead by towering trees galore
Time passes on its own terms, at its own pace,
Such that it leaves its ghostly trail for our grace
The children, though, are restless and restive:
They want to get up, move around, be festive
What’s to them an eternity, is a second;
Yet when it’s over, it’s like match to flint
For landing, finally, at Camp Jones Gulch,
They dance ’round excitedly on the mulch.

But let us pause and appreciate Nature
We’ll bask in the sun, and ourselves the day inure
Everywhere we look, there are tall, majestic trees,
The sights whereof invite us to ponder and freeze
Solitary, regal, they reach high to the sky
Oh, imagine what it would be like to fly:
Soaring weightlessly, looking down with bird’s eye view
Surrounded by an infinite sea of blue,
Within whose depths you could see the whole world,
Entrapped by mighty trees that upward curled,
So that all of life were in Nature’s cradle,
And nurtured by Her, by Her ladle.

Here we are, returned to our old home,
Around which, with discretion, we roam;
But having been from her disconnected so long,
Having journeyed forth too far, we’ve forgotten our native song:
We don’t look upon Her children with curious eyes;
Instead, we ignore them, and so lose our ties
It’s as though we have been ripped from our Mother—
Now aught’s that not technology’s considered “Other”
Mother Nature’s been from us strangely estranged
—All is foreign that is not within Wi-Fi’s range
A dose of Nature’s medicine is what we need,
Lest we our apathy to it concede.

So how do we regain interest in whence we grew?
Firstly, by giving up what we took in lieu:
This means we must give up our devices,
Which also brings with it clearing our vices
For instance, we have succumbed to accedia;
Consequently, we accede to exceed in a
Temperament of mild intemperance
—And tempered so, have severe severance
To turn a new leaf, we foster compassion
As such, environment becomes the new fashion
And soon enough, conservation spreads like wildfire
Hopefully, this way, Nature will not tire,
Having suffered through our tire-less neglect
We’ve reneged on our promise, which we must re-elect
Our duty to uphold and love Nature—we fail
Oblig’d to make her travel her own trail of travail
This, the trail littered with our own trash, o lowly path;
Wherefore we have fated ourselves to her wrath
So, when She to us doth Herself disclose,
Acknowledge that She has Herself exposed
Her self-disclosure, methinks, is, to us, a gift
From hence to drift, is to betwixt us deepen rift
If Man’s legacy is not to be Nature’s reaper,
Then it is up to us to be Her keeper.

Nature we ought to embrace with open arms,
But if we close ourselves off, then it harms
This uttermost disgrace, I doth opine
To deal such disrespect to a tree—oh Pine!
Unto you all, to my clarion hark:
Denude not the trees, and rip not their bark!
Upon the banks, tread cautious ’bout the leaves;
As Nature appealeth to he who to her cleaves
With patient hands, preserve Her precious twigs—
Or prepare to be privy to what She rigs
Though its bathos be blue, keep Her waters see-through,
And do your best not to dirty her lands, too
Of the birds who in spring sing their songs, stay clear
Admire their distant-heard tunes, sincere
Nature’s rocks are old and full of stories,
So be mindful not to make out of them quarries;
Because even if to climb be your knack,
Surely, your mindlessness will cause in them a crack

Nature, you’ll observe, is full of beauteous sights,
But just like us, Nature, too, has Her own natural rights;
Therefore, let us in Nature enjoy our stay,
And for the whole day, we’ll have fun in our play.

~Fin

What is Dreaming, What Do Dreams Mean, and Why Do We Dream?

Martin Luther King, Jr. once had a dream—and last night, so did I. At the end of a long day, we all get in bed, close our eyes, and go to sleep. Then the magic happens. It is as if, when our eyes close, the curtains of the mind are opened, and a lively and magical performance begins on the stage of our unconscious, with dances, songs, and monologues. Bright, intense, and nonsensical, these images in our heads visit us every night, although we are quick to forget them, as they soon fade away, almost as though they never happened. Dreams feel real, yet they are unreal, illusory. Sometimes they capture things that have happened to us, but sometimes they show us things that have not yet happened, and sometimes yet they show us things that are happening. Dreams are among the greatest mysteries of the night, which is why they have attracted so much attention, both from individual thinkers and from whole civilizations, all of whom have attributed to dreams some sort of importance. What are dreams, really? Why do we dream? Do other animals dream? These are all questions psychology has been asking and will continue to ask. As of right now, none of these questions has a confident answer; each is constrained to theory. We humans will not rest (no pun intended), though, until we get the answer; we will refuse to just “sleep on it”—literally, because we cannot. So in today’s post, we will explore the science behind dreaming, the history of dreaming, and the different interpretations of dreaming that have been proposed. Although no definitive answers will be yielded, we will still gain some valuable insights into the nature of dreaming.


What is dreaming?

It is not as if we start dreaming as soon as we get into bed. Instead, sleep has to pass through several stages before dreaming can begin. Researchers study brain waves with electroencephalograms (EEGs)—a fancy word for recordings of the electrical activity in the brain at a given moment. With these brain waves, psychologists have found that there are at least four stages in the sleep process: First, in our everyday waking lives, the brain produces beta waves, which appear when we are alert and thinking, usually at 13 or more cycles per second; second, when we close our eyes and start to relax, alpha waves start to kick in at 8-12 cycles per second; third, theta waves are produced at 4-7 cycles per second when we enter light sleep, or NREM, and begin to feel drowsy; fourth, we experience delta waves, at 4 or fewer cycles per second, created during deep, slow-wave sleep. From this deep sleep the brain periodically cycles back up into REM sleep, and it is in REM that we experience most of our vivid dreams. But what is a dream?
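
To keep the four bands straight, here is a minimal sketch in Python (my own illustration, not anything from the sources this post cites) that maps a dominant EEG frequency onto the band names used in this paragraph. The cutoffs are simply the approximate figures given above; real sleep scoring is far more involved.

    # Illustration only: classify a dominant EEG frequency into the approximate
    # bands described in the paragraph above.
    def classify_wave(cycles_per_second: float) -> str:
        if cycles_per_second >= 13:
            return "beta (awake and thinking)"
        elif cycles_per_second >= 8:
            return "alpha (relaxed, eyes closed)"
        elif cycles_per_second >= 4:
            return "theta (light sleep / NREM, drowsy)"
        else:
            return "delta (deep, slow-wave sleep)"

    for hz in (20, 10, 5, 2):
        print(hz, "->", classify_wave(hz))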


A dream is “a sequence of images, emotions, and thoughts passing through a sleeping person’s mind.” In addition, “Dreams are notable for their hallucinatory imagery, discontinuities, and incongruities, and for the dreamer’s delusional acceptance of the content and later difficulties remembering it.”[1] One important thing to be gleaned from this definition is that dreaming is not confined to visual displays and imagery; rather, dreaming can involve many other senses. A question many people are curious about is whether blind people can see things in their dreams, or deaf people hear. What has been found is that people who are blind, because they have never seen anything, dream using senses other than sight, and the same applies to deaf people. In other words, people with congenital blindness, who were born blind, can hear or smell things in their dreams, since they have been exposed to such stimulation, just not with their eyes. Another interesting thing about dreams is that, besides not being just about pictures, they can also communicate intentional states, i.e., motivations, fears, desires, etc. Dreams are set apart from waking life due to their being illogical. Whereas in real life there is a logical cause-and-effect and a sequence of narration that follows a common story, in dreams there are random and disorganized events. As such, they are characterized as fantastical, belonging more to fantasy and fiction than reality, adopting unrealistic exaggerations and possibilities, more incredible than realistic. When we say there is a uniform narrativity to life, we mean there is a set plot, with a beginning, middle, and end; but with dreams, there is no such narrativity, for there is nothing that links events together in any reasonable way.


Now, regarding what actually happens during dreaming: Once we have fallen asleep, the brain switches between two stages, REM and NREM. REM stands for “rapid eye movement,” and about 4-5 REM periods cycle through the night, roughly 90 minutes apart. While we are in REM, brain waves paradoxically resemble those that occur when we are awake. If we were to look at the brain when we were awake, it would look just like it does when we are in REM—despite the fact that during REM the entire body is paralyzed and in total relaxation. This is a kind of “dream-state,” as psychologists like to refer to it, in which the animals who undergo it are highly stimulated. All mammals, not just humans, experience this dream-state. The only difference is how long each animal spends in it; depending on the average lifespan of the animal, it will dream more or less. We humans are in the middle. REM is so named because the eyes literally twitch rapidly while shut, which seems to contradict all logic and is the reason psychologists are so puzzled by the phenomenon. Speaking of puzzling phenomena, some people experience “sleep paralysis” when they regain consciousness during sleep, only to find their bodies rigid, unable to move, as if stapled to their beds, their throats pinched. Why we wake up randomly, we do not know. Why we are paralyzed—this we do know: Psychologist Michel Jouvet found in an experiment that the pons, located in the lower region of the brain stem, actually inhibits all motor activity during sleep, meaning the muscles are completely stopped. He performed a study in which cats had this part of the pons lesioned, and then he watched them at night to see what happened. Because he had disabled the part of the brain that stopped muscles from being used, the cats, he observed, actually moved around quickly and ferociously, in a predatory manner, because they were, Jouvet supposed, dreaming about catching mice. What this revealed is that, if the pons were not activated during sleep, there would be many more injuries at night. A number of people report experiencing a sort of “falling reflex”: upon falling in their dream, they wake up, as if reacting to the fall and catching themselves. Imagine, then, what would happen to many of us in some of our more threatening dreams, if it were not for the pons in the brain stem.


What about NREM? NREM stands for “non-rapid eye movement”—I know, creative. As is to be expected, NREM is not called “deep sleep,” and for good reason: it is a lighter form of sleep that is not as engaging. To better illustrate the difference, take people who can sleep through their alarms, and those who cannot; the former are in deep sleep, the latter in light. For a time, it was thought that dreaming only occurred during REM; however, later studies disproved this, showing that dreams do occur during NREM, just that they are less memorable and exciting. Other things that have been found about dreaming regard the environment and dream content. The external environment of a sleeper has been discovered to affect their dreams. For example, in one study, testers waited for subjects to enter REM sleep and then sprayed them with water. Upon waking, the subjects said they dreamt of some form or another of water, be it seeing a waterfall or swimming in a pool. What surrounds a dreamer or what they touch can create associations related to the outside stimulus, or effector. Such dreams are “self-state dreams,” since their content is centered around the state in which the self finds itself. Sometimes, self-state dreams can also lend insight into future actions. One thought-provoking fact is that 80% of reported dreams are negative (Domhoff, 1999). Accordingly, for every five dreams we have, only one of them does not involve bad things happening to us.


Another subject of inquiry—one which is unbelievably trippy—is lucid dreaming. When dreams are very high in lucidity, or clearness, we are aware of ourselves as dreaming. Let us put it this way: Lucid dreaming is knowing that we are dreaming. But are we just dreaming that we are dreaming? If you want a quick look at the philosophical problem of dreaming, then you can read here! Aside from the armchair philosophy of dreaming, there is a little more substance to lucid dreaming. For instance, lucid dreamers feel like they have no sense of touch, allowing them to pass through otherwise impassable obstacles, and they also apparently lack any other sense besides sight. Lucid dreams are also said to be brighter than regular dreams. When aware of dreaming, dreamers can ignore natural laws, doing things that defy logic and physics. All of this raises the question of why we even dream in the first place. If sleep is necessary for us to rest our bodies, then why not just sleep, why have hallucinatory visions at night? Unfortunately, we have no solid answers. There is only speculation. I will discuss these speculations in further detail at the end, but for now, here is a brief overview.

  1. Wish-fulfillment. According to this theory, dreams are symbolic representations of repressed, unconscious urges that are mostly erotic. The problem with this theory is that, surprisingly, dreams with sexual content are actually quite rare (recall that 80% of dreams are negative).
  2. Memory storage. Those who support this theory argue that because memory is improved during REM, it stands to reason that the purpose of dreams is to filter out the day’s experiences. If you have ever heard that it is unwise to study right before going to bed, this is where that advice comes from. Just like your body, your brain needs time to recover, so if you jam it with knowledge right before bed, then you will overload it, and your learning will not be as effective; the brain works more efficiently if it takes in smaller chunks over a longer amount of time.
  3. Neural pathways. Random sensory information from outside stimulates the brain as it sleeps, strengthening its neural connections. Thus, this theory says dreaming’s purpose is to solidify our neural pathways.
  4. Biological model. Activation-synthesis is the theory that the brain stem creates random imagery that is interpreted by the limbic system, which colors it. Hence, seemingly meaningless visuals are turned into emotional, colorful images that resemble conscious life.
  5. Cognitive development. For some, dreams reflect our cognitive development. As evidence, they use the fact that children have relatively simple, crude dreams, whereas adults have more complex, egocentric dreams. The complexity of dreams depends on how much knowledge one has.

A History of Dream Interpretation

Since the earliest civilizations of man, dreaming has held an important place in our culture. If we explore the human mind of over 4,000 years ago, we find the earliest records of dreaming to date. A document known as the “Chester Beatty papyrus,” excavated and dated to around 2000 B.C., records some 200 dreams that were reported and interpreted by the Egyptians. Based on Mesopotamian mythology, and adapted from Assyrian sources, this Egyptian dream codex reveals the universal nature of dreaming. The fact that these three great civilizations—Egypt, Mesopotamia, and Assyria—all gave such immense attention to dreams, and that their interpretations were so closely related, shows how intimate dreams are to the collective consciousness of a people. In all three societies, dreams were ways of contacting invisible realms through the guidance of either good or bad spirits. Then came Abrahamic monotheism. Christianity, Judaism, and Islam all interpreted dreams as messages coming directly from God to the sleeper. Understandably, these dreams were heavily laden with religious metaphors and symbolism.


A little later, the Greeks became fascinated with dreams. The Greeks had their own religious groups—some might say cults—called “Mysteries,” and many a Mystery was focused on dreaming. In order to have better dreams, Greeks encouraged sleep with oils and drugs, so that the dreamer would be more immersed. An important aspect of Greek life was the oracle: Each day, hundreds of travelers would go to oracles to have their fortunes told. Dream interpretation was done in the same manner. Specialized interpreters had a place in the temple, where they were surrounded by natural smoke that they would read and decode, then pass on to the dreamer. During the Archaic period, though, a shift occurred. The Pre-Socratic philosophers began to steer away from religion and toward scientific, rational thought. Mystery and dream divination, or magic (oneiromancy), would be replaced with more empirical observations. Each of the following philosophers, on his own, anticipated modern-day conclusions.

  • Heraclitus (c. 535-475 B.C.) claimed dreams were nothing more than day residue, i.e., recollection of things that happened throughout the day.
  • Democritus (c. 460-370 B.C.) thought dreams were the result of the external environment acting on an individual’s consciousness.
  • Plato (428-348 B.C.) proposed that dreams were a manifestation of wish-fulfillment based on repressed desires in the soul. He also thought dreams were divinely inspired and could grant people creative impulses.
  • Aristotle (384-322 B.C.) argued against prophetic interpretations, instead declaring dreams to be the result of sensory stimulation that could potentially affect our future actions based on their content.

Thus, the study of dreams officially became scientific in nature. Artemidorus, coming 400 years after Aristotle, born in the same country as Heraclitus, wrote the largest encyclopedia of dreams for his time, the Oneirocritica. In it, he distinguished between two types of dreams: Insomnium and somnium. Insomnium is a dream-state whose contents are about the present. These are dreams that deal with current problems and daily occurrences. Somnium is a dream-state whose contents are about the future—self-state dreams, in other words. These dreams are “deeper,” “more profound,” than insomnium dreams because they give us insight. But Artemidorus came up with an even more fascinating idea, one that has hitherto been neglected and still does not receive a lot of attention today: Dream interpretation reveals more about the interpreter than it does the dreamer. Apparently, according to Artemidorus, by gaining the background of a person, and by interpreting their visions in light of this, we gain insight into ourselves, because we mix in our own beliefs and symbolism that they would otherwise miss. Contemporaries of the Pre-Socratics in the East—the Chinese, Buddhists, and Hindus—were the heirs of the Egyptians inasmuch as dreams were, to them, glimpses of a higher realm, a truer reality. In their dreams, they would experience the transcendence of their souls from the corporeal world.


The scientific study of dreams would come crashing down in the Middle Ages, which saw a reversion to religious symbolism. Only this time, the underpinnings were moral and theistic. The problem of interpretation came down to whether or not a dream was communicated by God: either it came from angels, and was therefore holy, or from demons, and was therefore wicked. Thus, medieval dreamers had to discern between truth and untruth. A few hundred years more, and we get the great rebirth, the Renaissance. It is from the Renaissance that we get our contemporary connotations of dream interpretation, for it was during this time that divination once again became dominant. The Renaissance saw a surge of interest in practices like occultism, mysticism, numerology, astrology, prophecy, and hermeticism—in a word, magic. Nowadays, these associations still carry over, so when we hear people talking about interpreting dreams or discussing horoscopes, we tend to brush them off as useless, arcane magic.


Fast forward 400 years to the Modern Age in the 19th century. Still traumatized by the Renaissance, people in the 1800s were hesitant to study dreams or consider their importance, since dreams were seen as “unscientific” and therefore unworthy of serious thought. The magical side of dreams was not wholly abandoned or dismissed, contrary to what some might think; literary geniuses celebrated dreams for their creativity. Famous Romantic poet Samuel Taylor Coleridge wrote his unfinished poem “Kubla Khan” after an inspiring dream, but he never finished it because he was interrupted by a mysterious “person from Porlock”; novelist Robert Louis Stevenson wrote Strange Case of Dr. Jekyll and Mr. Hyde based on a dream he had, too, in which he saw his hidden, unconscious urges battling his outward, conscious behavior; and Edgar Allan Poe also said his dreams contributed to his works. Around this time, in the mid-1800s, anthropology was becoming a much-studied topic, so anthropologists were traveling around the world studying primitive tribes. What they found anticipated Jung’s theory of archetypes, and they also found that these tribes usually made their decisions based on dreams they had—the resurgence of prophecy. Next comes the 20th century and the rise of psychoanalysis, dominated by two figures, Sigmund Freud and Carl G. Jung, to whom we shall presently turn.


Modern Day Dream Interpretation Models

Before discussing the psychoanalytic tradition, we will first return to the earlier models of dream interpretation (the cool name for the study of dreams is oneirology) mentioned above. The first model is the cognitive model, according to which dreams are a means of thinking about the day during the night. When we dream, our mind is thinking just as we normally would, but with multiple layers of metaphors emphasized unconsciously. In this way, everyday imagery is “translated,” so to speak, into metaphorical forms that stand in for ordinary things. These forms, furthermore, are colored by our emotions, so that they reflect our inner world of moods and feel significant to us. This theory also subsumes the cognitive-development theory, so dream quality will differ based on one’s brain development. Some scientists contend that dreams are important for problem-solving. There is a scientific basis for the phrase “sleep on it,” after all. When we sleep, our unconscious and subconscious are most active, so thoughts we did not know we even had float around, and some by chance end up back in our conscious, while those in our conscious sometimes drift off into the subconscious. Either way, ideas move around. A friend of mine told me the story of how he lost his headphones, only to dream about how he lost them two months later, whereupon he found them in the exact location of which he dreamt. How did something so insignificant, something that happened two months in the past, chance to occupy his dreams? The best explanation, I told him, was that after a while, his brain, by its own whims, conjured up the memory of where he had left them. Why it took so long, I do not know. Whether timing is important or not and how long an average memory takes to resurface are also questions worth asking. Over time, the brain will relax, and things that were troublesome and problematic will be resolved, I can only theorize. This leads to the next idea, namely that dreams reflect our current state and condition, environment, gender, age, and emotions, according to the cognitive model.


Another model we discussed briefly was the biological model. In light of biopsychology, dreams are nothing more than creations of neuronal firings, processed by the thalamus into visual displays that make no sense. As such, interpreting dreams is useless, considering they have no inherent meaning. Personally, I am not a proponent of the biological model, for two reasons: First (I know this is a terrible reason), it is too bland and boring, and it is too reductive for my tastes; and second, if these neuronal firings are so random, then how can they create coherent (in the sense of “being familiar”) images that do make sense and that resemble complete narratives and sequences? This is not to say that the cognitive model is more correct than the biological model—not at all. As I have said, these are just theories, and neither has been verified indubitably.


Most famous, hands down, is the psychoanalytic theory, first propounded by Freud, and then expanded upon by his student, Jung. Starting with Freud: he described dreams like this: “Dreams are the means of removing, by hallucinatory satisfaction, mental stimuli that disturb sleep.”[2] In Freud’s eyes, dreams arise from the irrational, hidden side of ourselves—the unconscious. As a result, dreams need to be interpreted by a therapist. Dreams work through association, creating nonsense connections between ideas that are seemingly unrelated. Since dreams are irrational and incoherent, interpreters use a technique Freud loved, called “free association.” The analyst says a word, and the patient says whatever comes to mind. The logic goes that if the dream is formed by associations, then the intuitive associations spoken by the patient will point back to their roots. Having done this, the analyst can then find associations of which the patient was initially unaware. One thing Freud did that remains a subject of interest is his splitting of dream content into manifest and latent content. Manifest content is the storyline of the dream, the surface-level meaning. On the other hand, latent content is the deeper, symbolic, underlying meaning of the dream. Whereas the dreamer has access to the manifest content, only the analyst has access to the latent content, because latent content is unconscious and therefore hidden from view; it has to be uncovered through free association. What is this elusive latent content, and why does the mind go through the trouble of disguising it? Freud said that dreams protect us from waking up due to “mental stimuli”—but to what kind of mental stimuli was he referring? He believed that the latent meanings of dreams were repressed, unacceptable ideas.


The basic formula for a Freudian dream is “any kind of trivial occurrence + a traumatic childhood memory.” In other words, dreams take some kind of ugly truth and dress it up in ordinary occurrences. This is why Freud said that dreams protect us from disturbances. If these unacceptable ideas were shown to us in full light, we would never be able to sleep; we would be too disgusted or traumatized. Dreams prevent us from waking up by playing out fantastical scenarios that reflect our wishes, goals, and fears. By hidden means, the dream releases our repressed memories. Freud posited a theoretical “censor” inside the mind, a kind of watchman that makes sure nothing from the unconscious creeps into the conscious. Obviously, then, a feeling of aggression cannot be made manifest; instead, the unconscious is clever, so it disguises the feeling of aggression, such that it is able to sneak past the sentry and make it into the conscious in the form of a dream that makes no sense, but which nonetheless has a deeper meaning. This explains why dreams are confusing and unclear, yet meaningful. How the unconscious goes about disguising the repressed ideas is called the “dream-work.” Its four methods are condensation, displacement, symbolization, and secondary elaboration.

  1. Condensation is what happens when two or more ideas are merged together into a single thought.
  2. Displacement is what happens when an emotion is misdirected toward something other than its target.
  3. Symbolization is what happens when an object is made to stand in for another.
  4. Secondary elaboration is what happens when the subject tries to recall their dream, only to distort the facts.

By using all four tricks, unconscious impulses manage to invade the conscious mind. Freud went further and identified two types of dreams. Dreams of convenience are dreams related to one’s day. Closely linked to day residue, dreams of convenience replay, in visual form, some fear or wish that occurred during the day. The other type of dream is one of wish-fulfillment, for which Freud is most well known. Basically, he said that dreams are a way of satisfying our desires with our imagination. Because we cannot satisfy these desires in reality, we are forced to do so in sleep, in ideality. These desires are either erotic or aggressive. To use an example, one night I was really thirsty, and I went to bed on my trampoline (for fun, of course!). I dreamt I got off the trampoline, went all the way inside the house, got a drink of water, walked back to the trampoline, and fell asleep. When I woke up, I had no memory of getting up, and I realized that I could not possibly have gotten water, as it was far too cold, and it was a long walk. Thus, I came to the conclusion that I dreamed about getting water in order to satiate my thirst while I slept. To summarize, here are Freud’s ideas about dreams:

  1. Repressed childhood memories are revealed through associations.
  2. Said memories are either painful or unrefined, which is why they are repressed.
  3. Dreams are illogical, resembling an infantile imagination.
  4. Dreams have sexual and/or aggressive themes.
  5. Dreams are disguised wish-fulfillment.

The reason we no longer believe in the psychodynamic model of dreams is that, simply put, there is no evidence at all that supports it. Carl Jung was Freud’s student, although he would later distance himself from his teacher’s ideas in order to develop his own in more detail. To begin, he classified dreams into three categories. The lowest level of dreams consists of day residues, which just focus on things that happened throughout the day. Above these are self-related dreams, dreams that are about us, our mental states—stuff like that. The highest dreams, however, are archetypal dreams, which are the deepest ones possible, for they connect us with each other through the collective unconscious. I feel the quickest way to present Jung’s views is by enumerating them and then contrasting them with Freud’s:

  1. Dreams are essentially creative.
  2. Dreams are a part of the collective unconscious. Each of us, no matter who we are, shares the same symbols and universal characters, or archetypes.
  3. Dreams reveal the personal unconscious, too. We learn about the hidden parts of who we are through dreams.
  4. Dreams give insights into the future.
  5. Dreams are positive and constructive, providing insights to the self.

And as contrasted to Freud:

  1. Dreams are meaningful in and of themselves, not by interpretation.
  2. Dreams represent present, not past, problems.
  3. Dreams are best interpreted based on patterns and recurrences rather than in isolation. Rather than look at each dream by itself, it is better to look at them together.
  4. A holistic analysis of dreams is more efficient than free association.
  5. Symbolism is not repressed, but archetypal.

If we want a quick summary of the psychoanalytic model, then we can say that Freud’s focus was sexual, and Jung’s archetypal. But while the two differed in many respects, they also held these ideas in common, ideas that carry over into the modern study of dreams:

  1. Dreams give clues to life.
  2. Dreams bring the unconscious to the surface.
  3. Dreams are based on day residue.
  4. Sensory stimulation affects our dreams.
  5. Universal archetypes are a part of our collective unconscious.
  6. Dreams are a.) repressed or b.) creative.

In conclusion, while there is a rich history of studying dreams, there are also countless unanswered questions regarding dreaming. Will we ever answer them? Who knows. Until then, we can only dream of what the answers might be. From the Egyptians, who believed in otherworldly journeys, to the modern psychoanalysts, who believed in hidden symbols, there have been many views of what dreams are, and many revisions, too. What we can see from the history of oneirology is that how dreams are interpreted depends upon the culture in which one finds oneself. Where one lives, how one lives, what language one speaks—these can all affect how we interpret dreams. Does this mean that there is no objective meaning of dreams, that the purpose of dreams differs between peoples? The question remains of whether dreams are even meaningful in the first place, or whether they are, in fact, just biological accidents created by the brain. These questions create a living nightmare for psychologists. One thing that is for certain is that dreams are very personal, intimate things that happen to all of us, that are unique, and that are private to us alone. I have my dreams, and you yours. (Get ready for the cliché ending…). But then again, what if this is all a dream?

 

 


[1] Myers, Psychology, 8th ed., p. 285
[2] Freud, The Interpretation of Dreams, p. 499c*

*From Adler’s Great Books of the Western World, Vol. 54

 

For further reading:
The Encyclopedia of Human Behavior, Vol. 1 by Robert M. Goldenson (1970)
Psychology: Mind, Brain, & Culture, 2nd ed. by Drew Westen (1999)
In Defense of Human Consciousness by Joseph F. Rychlak (1997)
Introduction to Psychodynamics by Mardi J. Horowitz (1988)
Schools of Psychoanalytic Thought by Ruth L. Munroe (1956)
The Secret Language of the Mind by David Cohen (1996)
Psychology, 8th ed. by David G. Myers (2007)

Anamnesis—Why We Know More Than We Think We Do: A Polemic

We know more than we think we do.


Plato was an innatist. He believed in innatism, which states that all knowledge is “innate.” The word comes from Latin roots meaning “born in,” so for knowledge to be innate means for it to be present at birth. This is the basis for his theory of anamnesis (ἀνάμνησις), otherwise known as his theory of recollection. Because he believed in the immortality of the soul, and because he posited a transcendent realm of Forms, Plato wrote that the soul of an individual leaves the body upon death and enters the realm of Forms, where it is able to see them all in their perfection; and having seen these Forms, having seen and retained them, it returns to another body in another life to start anew. Though still endowed with the memory of the Forms, the soul is tainted and stained by the foulness of the physical world, which causes it to forget the Forms; or rather, it represses them, and they enter the unconscious, to put it in psychological terms. These ideas are latent within the soul; they are resting, waiting to be awoken or excited and so elevated to consciousness.


In his dialogue the Meno, Socrates—Plato’s teacher—discusses the nature of virtues and whether they are teachable, but is caught off guard by an argument that goes like this: If you are asking about the definition of something, then it means you do not know it, so even if you were to find it, then how would you know you had found it, since you do not know for what you are looking? On the flip side, if you already know the definition of something, then why ask about it? In other words, there is no point in trying to learn. Socrates says in response, “[S]eeking and learning are in fact nothing but recollection” (Meno, 81d). Virtues are knowable, Socrates argues, because we already know them; it is just a matter of re-collecting them. To defend his position, Socrates asks for a slave of Meno’s, a boy who has not been educated at all. Socrates proceeds to question the boy about the area of a square, slowly but surely getting him closer and closer to the answer. During this demonstration, Socrates never actually tells the slave anything directly—“There is no such thing as teaching, only recollection” (Meno, 82a)—but only questions him, which guides the slave through critical thinking, allowing him to arrive at the answer without ever having studied math. Thus, Socrates shows that knowledge is innate and is always present, but it needs to be awakened within the soul through the pursuit of knowledge.
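
For concreteness, the geometry problem in the dialogue is the famous “doubling the square”: given a square, find the square with twice its area. A minimal sketch of the arithmetic (my own illustration; the numbers and variable names are not from the Meno itself) shows why the slave’s first guess fails and why the square built on the diagonal works:

    # Illustration only: the slave first guesses that doubling the side doubles
    # the area; Socrates leads him to the square built on the diagonal instead.
    import math

    side = 2
    area = side ** 2                                # 4
    naive_guess = (2 * side) ** 2                   # 16, four times the area, not double
    diagonal = math.sqrt(side ** 2 + side ** 2)     # about 2.83
    area_on_diagonal = diagonal ** 2                # 8, exactly double the original area
    print(area, naive_guess, area_on_diagonal)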


In today’s world, the theory of recollection seems like spiritual nonsense. Not many believe in a soul, much less the immortality thereof; however, it has been suggested by a number of scholars that Plato’s anamnesis is actually mythical in nature, which would mean that it is not literal in its telling, but metaphorical, told through a spiritual lens, not for credence but for interest.¹ So what would this mean in the real world, to apply anamnesis, the theory of recollection?


The reason why many of us do not think we know as much as we really do is that we seldom think in the first place. As a result, we underestimate our abilities and ourselves. Now, this is a generalization, since many do think well, but my target audience here is Generation Z, who, in my opinion, is rather thoughtless (which, I realize, is a generalization in itself). Notwithstanding, my point is that the majority of us in the 21st century do not think. The kind of unthinking of which I am speaking is not simply having thoughts—that kind of thinking is done by everyone. No, the thinking with which I am concerned is something like analytical or critical thinking, a skill that has been missing lately.


The best examples I can provide come from school, where a lot of thinking should, but sometimes does not, occur. Oftentimes, the teacher will call on a student who is not paying attention, either because they are daydreaming or thinking about other things or because they are doing something else, like playing games on their devices. More often than not, these questions are pretty basic, yet they catch the student off guard. It hits them like a cold water balloon. Looking up at the teacher, dozens of pairs of eyes looking at them, the student blinks, goes blank, and maybe, after hesitating, blurts out some answer they know is not correct, but which will get the attention off of them. Although they are met with a few laughs from the class and a disappointed look from the teacher, they are relieved; thereupon, they go back to whatsoever they were on before they were “interrupted.” Obviously, this will continue in a cycle, because as soon as they return to the distraction, they will be called out again. However, let us not get caught up in the cycle, for we wish to pause at the moment—freeze-frame it—when the student tries to deflect the focus.


History and math are common classes in which non-thinking occurs, considering both involve lifeless, impractical facts that are usually just memorized. As such, the person who is called on, after asking to hear the question again, thinks, Why me? in two senses: First, they are confused and irritated that the teacher chose them; and second, they do not understand for what reason they have to know whatever the question is asking. Hence, the student detaches themselves from the question, creates a distance between themselves and it, a distance that is thereafter insurmountable. Once detachment happens, there is no hope of reconciliation. They are lost. In order to quickly de-escalate this perceived threat, the student, instead of thinking, instead of paying attention, hardly tries and so says something they “think” is correct, whether or not it is.


This is what it looks like not to think. Not thinking is not trying to solve the problem at hand. Again, this is a generalization on my part: Sometimes people do know the answer, and the teacher catches them off guard, such that they blank or get nervous and forget the answer—this is totally fine because we are human, and we make mistakes; or sometimes, the question is a hard one. But besides this, the blatant unthinking that goes on in and out of classrooms is overwhelming and far more frequent than the type I just described, which can best be called a mistake. The difference between unthinking and a fault of human nature is that the former is intentional, whereas the latter is accidental, given that it is outside of our control. All of us, when asked to think, have the choice of how we will respond. Not only is unthinkingness intentional, but it is also indifferent. To respond unthinkingly reflects on a person because it tells you that they do not care to think about the question, since it is not worth their time or effort. But what should take place when we think? When we think, we think back. When we think back, we remember, we re-collect. Of course, this takes time and effort, as I have said, so it is the harder of the two paths. Now I will explore some more examples and their implications:


One time, my friend and I were doing our Spanish homework together. I got through mine relatively quickly, so I went over to check on his progress. He was a few activities behind, although he was completing them fairly well. As I watched him, I noticed he was not actually doing the activity so much as guessing. He would glance at the question far faster than he could comprehend it, then he would put a random word in the blank space and check on Google Translate to see if his answer was correct; if it was not, he would repeat the process until he got it right. I told him to try to do the next problem without using Google Translate. Immediately, he stared at the screen, frozen, not knowing what to do. His crutch had been taken away, so he had to stand on his own. Then I said, “Alright, first try translating the sentence word by word, so you know what it’s asking.” He slowly read out each word, replacing it with its English equivalent. “Oh, I get what it’s asking. This is so easy!” he exclaimed. I smiled as he looked at his options and chose the one that fit the blank. During the next one, he did not know the meaning of one of the word choices, so he opened a new tab and was about to type it in when I told him to close it and do it himself. Once more, I gave him a little push: “Well, you know what these words mean, so through process of elimination, you can probably say it is not the ones you know, otherwise they’d work.” He nodded, chose the next word, checked it, got it correct. I said, “See, you can do Spanish if you actually try. You have it in you, you just need to think it through.”


My friend told me he was not the greatest at Spanish, yet he did perfectly well on his own. He underestimated his own abilities; he thought he did not know much, when, in reality, he was fully capable. Later in a conversation, I told him that this could be applied to all his classes: Imagine if he depended more on himself and felt confident he knew the answers—because he did have the answers. My friend knew more than he thought he knew; it just had to be remembered. This prompted me to ask the following question: If he knew it all along, and if he was fully capable of summoning this knowledge, then why did it require my coming to help him? Why could he not have done it himself? It is analogous to the student playing “.io” games on their computer while the teacher is lecturing, only to close out of the tab or hide it and get on their classwork as soon as the teacher comes up behind them. It is similar in that they could easily be doing their work themselves, but it requires some kind of authority—in this case, the teacher, or in mine, me—to enforce it. Why is it that people feel like they can get away with not thinking? Why do they scheme and choose the easy way out?


Another example is one that particularly annoys me on a personal level. Whenever I tell friends or classmates about my blog and ask them to read it, they always come back to me saying, “Wow, it’s so cool… except that I didn’t understand any of it,” and I reply with a sad face, both over text and in person. In reality, what the person is really saying when they say “I attempted to read it” is “I did not try to read it fully, attentively, and thoughtfully.” One specific instance: A classmate who runs track with me, who is very intelligent and hardworking, came to me during practice and told me he did not understand my blog on whether babies exist or not. He said he tried to read the first few sentences, but did not understand them. Right away, I knew he did not read it with his full attention, seeing as the first few sentences had nothing to do with philosophy but were part of an anecdote. But then he said he was confused by the “Descartes-I-Think-Therefore-I-Am” part. I asked him to break down, argument by argument, how Descartes came to doubt everything, leading him to declare that his existence was of absolute certainty, whereupon my classmate said he understood it. “Not so hard, was it?” I told him. After thinking it through, he arrived at Descartes’ Cogito argument. He knew more than he thought he knew; he just had to remember it. To be fair, though, I understand that my blog can sometimes be hard to read, which is no one’s fault but my own. As such, while the philosophical knowledge I share is not innate, the cognitive capabilities for understanding and learning it—Socrates’ definition of recollection—are.


And lastly, a scenario we have all experienced: Math class. Whether you love it or hate it, you are inevitably going to have difficulties in it. Even the smartest people in math get tripped up on a problem that stumps them. But there is something unique about math that makes it different in terms of thinking. Say you are trying to solve for a right triangle, which requires the Pythagorean Theorem and some trigonometric ratios. You have no idea how to begin because you are so overwhelmed, so you ask the teacher for help. The teacher comes over, and the first thing they will ask is for what you are trying to solve. While it seems obvious, solving for a right triangle, repeating the objective reminds you where to begin. Next, they ask, “What is the formula?” As if expecting some kind of trick, you reply, skeptically, “a^2 + b^2 = c^2” and “tangent of x is opposite over adjacent.” You then plug in the values, do some calculations, and—something clicks—“Ah!” you shout excitedly. It all makes sense! The teacher, having done their job, moves on to the next student. It feels as if you are Archimedes when he discovered water displacement and cried out “Eureka!” You have had a eureka moment. Eureka, traditionally translated, means “I found it,” so it is as though you remembered it; in searching the back of your mind, looking here and there, under this pile and that, you finally found it, re-collected it—recollection. What makes math unique is that it has formulas. With these schemas, we can easily and efficiently find the solution to a problem. In a sense, the answer is innate, is already in us, or in the formula, and we just have to educe it by plugging in the values. As with my friend and Spanish, the answer can be found by oneself, yet one does not do this, relying instead on something external for a push. For instance, in math, one only thinks as soon as one is vulnerable, when one is put on the spot, forced to think for oneself and put one’s mind to use.
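
As a minimal sketch of the arithmetic described above (the side lengths here are made up purely for illustration), here is how those two formulas play out in Python:

    # Illustration only: solve a right triangle from its two legs using the
    # Pythagorean theorem and the tangent ratio mentioned above.
    import math

    opposite, adjacent = 3.0, 4.0
    hypotenuse = math.sqrt(opposite ** 2 + adjacent ** 2)     # a^2 + b^2 = c^2 -> 5.0
    angle_x = math.degrees(math.atan(opposite / adjacent))    # tan(x) = opposite / adjacent
    print(hypotenuse, round(angle_x, 1))                      # 5.0 36.9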


So why do we not think? Why are we not self-sufficient in the 21st century, and why do we depend on external enforcement or pressure? Importantly, pressure is the only means of encouraging thinking, it would seem. Implied by this is the idea that we will naturally resist thinking; not until pressure is applied will we be forced to fall back upon ourselves and think. In a talk with my friend—the one whom I helped with Spanish—he suggested that not thinking is like a defense mechanism, an idea I found interesting. Just as Freud and his daughter Anna proposed “ego defense mechanisms” designed to protect the conscious mind and conserve its energy, so my friend proposed something like a “rational defense mechanism” designed to protect the thinking mind and conserve its energy. We have gotten to such a point that thinking is “too much,” “too draining,” and that we must defend ourselves from it. The thought of thinking tires us. We dread going to school because we have to use our brains. We would rather keep our peace of mind and ignorance than think hard to answer a question in class. We would rather look stupid and not waste mental energy than actually think and get a simple question right. There is a word, malinger, which means to pretend to be sick in order to avoid working. Unthinking is comparable to malingering, except that it is a very real sickness of the mind, a mental laziness, an apathy.


A student in my history class said, “I’m jealous of naturally smart people,” a remark very characteristic of this attitude. Whether people are naturally smart or not is for another blog at another time; regardless, while some people may be predisposed toward or may have an advantage in intelligence, implying that one is not among that population is a self-fulfilling prophecy, and a detrimental one at that. The people who are “naturally smart” are perceived as such because they think. Ought this to be discouraging? No, encouraging. If one gets to thinking, “I’m not smart, I can’t study, I’ll never be as smart as so-and-so,” if one compares one’s grades to another’s, then one will ineluctably study worse and perform worse on tests. By comparing oneself to another, one stops oneself from focusing on what one needs to do. Consequently, such disaffected students decide not to try at all, thinking they will never succeed, thinking they will never be able to think as well as others, so they do not think at all; they are allergic to thinking and will run from it. Stubborn, unyielding, they will not try, will not care, unless forced, pressured, or expected to. It is all the more intriguing that this same student, if alone with a teacher, will perform better—yet then again, this is to be expected. Without the assent of their peers, without the freedom not to think, the student is forced to confront themselves, and they can be guided by the teacher.


I am thankful I had this conversation with my friend, as he always thinks from different angles. He interrupted me and asked the following question: “So you’re saying that we actually think better when a teacher or parent is near. But if we have a parent, teacher, or Google always at our fingertips, ready to help us, then won’t we rely entirely on them? Like, let’s say I’m doing math—what’s to stop me from having my mom or dad solve it for me? Doesn’t having someone to help guide us make us risk losing our self-sufficiency because we’ll be tempted to use them?” I formulated my response accordingly: Educators, be they parents or teachers, are much like the training wheels on bikes insofar as they are the prerequisites for learning to function by oneself. Before a kid can ride their bike by themselves, they must first use training wheels to get used to riding; then, when they are good enough, they can ride by themselves, will be independent, and shall forever remember how to ride. Likewise, before a person can think for themselves, they must first have a guide to conduct them; then, when they are thoughtful enough, they can think by and for themselves, will be independent, and shall forever remember how to think. Now, this solution of mine is not a perfect one, but a solution nonetheless. How does one know when one is ready? Is there an age whereat a thinker is mature? Does it differ between people? Surely. These are all questions for further consideration, further thought.


Who is the educator, and what is their role? To educate means many things, yet it originally meant “to lead or guide out (from),” for it derives from e-, “out,” and ducere, “to lead or guide.” What exactly this means and whether it even clears anything up deserves attention. To lead what out of where? This phrase, “to lead or guide out from,” can be interpreted in two ways: First, in light of the Socratic tradition, it can refer to the dialectic, known as elenchus, whereby the teacher and disciple arrive at the answer through question-and-answer. As Socrates said, “[I]f the question is put in the right way they [the student] can give a perfectly correct answer, which they could not possibly do unless they had some knowledge and a proper grasp of the subject” (Phaedo, 73a). Put another way, the teacher as the Socratic ideal guides the student along the path of thinking without actually intervening. Think of it like a blindfolded man in a maze and his partner on the outside who has to guide him. The partner on the outside cannot give direct help, but he can guide the blindfolded partner indirectly. As the Socratic gadfly, the educator guides out from the student. Guides out what? Knowledge. It is the job of the educator to lead the student to recollection. By helping them to bring their knowledge to the fore, the educator conducts the disciple to thinking. In the Meno, Socrates guides the slave to solving the geometry problem by probing him. When helping my friend with Spanish and my classmate with reading my blog, I asked them indirect questions, which led them onto the right path, on which they embarked by themselves—I was merely their compass. Second, in light of the Platonic tradition, “to lead or guide out from” can refer to the Allegory of the Cave, in which sense the educator takes the student and leads them by the hand out of the cave of ignorance and into the light of reason, where reside the Forms. Symbolically, the student is the prisoner shackled to look at shadows, but as soon as they ascend, as soon as they behold the sun as the Good, they are beholden to a new, elevated kind of thinking. And having grown used to thinking, the newly bloomed thinker can try to help others to think, with varying success.


In the end, two things about these two interpretations remain constant: maturity and curiosity. Like training wheels, the educator is eventually outgrown. A quote, after much distortion and misattribution, comes down to us from Plutarch that likens education more to the lighting of a flame than to the filling of a vessel. In other words, the purpose of the educator, contrary to modern-day expectations, is not to fill their students’ heads with facts but to inspire within them a burning passion and curiosity for learning. Thus, the student will seek knowledge by themselves without having to be asked. Socrates states that it was only by stumping the slave that he was able to conduct him. This way, by leaving him in confusion, by leaving him with an unsolved problem, he was able to kindle within him a flame. All of Socrates’ interlocutors are frustrated by the time they leave, for they are always defeated by his thinking. We in math class are just like they are, because we feel frustrated when we cannot solve a problem. Our heads hurt, and we feel stuck, as though in mud, but we know there is an answer—it is just a matter of finding, or rather, recollecting, it.


In conclusion, we live, as I have said before, in an unthinking age. Students all around the world are unsatisfied with education and are left with a sour impression of school, leaving them deprived of its fruits, both bitter and sweet. Jaded, cynical, they underestimate themselves, compare themselves to others, and give up trying. Thinking becomes too demanding a task, and they would much rather preserve their image than uphold their dignity and fight for the mind. Knowledge is transformed from power into weakness, a disease that causes muscle atrophy and mental exhaustion, and so must be avoided at all costs. Questions become interrogations, problems torture. Yet times are such that interrogation is the only means whereby we can be made to think, for we almost certainly do not do it of our own volition; we must be forced into doing it. When students contend they “don’t know the answer,” they really mean, “I’m not thinking hard enough—if at all.” They refuse to think back, to remember. Their chronic short-term memory loss is acute. The problem lies both with the disciple and with the educator, because both have failed, and the institution along with them. Not only have we forgotten loads of knowledge, we have also forgotten how to think in the first place. But whereas knowledge can be recollected, thinking must be re-learned.


We just need to remember:

We know more than we think we do.


¹Friedländer, Introduction to Plato, p. 340n7 

Why Are Owls Wise?

What is more symbolic of wisdom than the owl? When asked to think of an animal that is smart, mysterious, or nocturnal, we automatically think of the owl, who alights upon the trees of the forest in the night, its big, piercing eyes glowing in the dark, its haunting call—Hoo, hoo—like a wistful calling for someone who is gone, its panoramic view taking into account the entire landscape, watching patiently at twilight. Some people like to think of the owl as their “spirit animal,” an animal that represents their inner nature, their personality, that symbolizes who they are. But from where do we get these associations? Why is it that we associate owls with wisdom? Were owls always wise, or did they mean something else at another time? I myself am quite fond of owls and am in possession of a collection of owl stuffed animals, so this question appealed to me. Reaching back over 2,000 years, we find yet another enduring contribution from the Ancient Greeks, from whom we get our archetypal “wise owl.”


It is important to note that, as with many symbols, meanings can change. While we nowadays impute wisdom to owls, they were once regarded as evil. Cloaking themselves in the darkness, stalking silently and surreptitiously, owls represented solitude. They hid in the shadows, unseen, and so were viewed negatively, in some cultures as the bringer of death, or at least the messenger thereof. Like the raven, the owl became an image of death and the afterlife, thought to be the animal that guided the spirits from this life to the next. Ancient civilizations in Mexico, the Middle East, and especially China created horrible myths around the owl, making it the pet of Hell or the punisher of those who have done wrong. Its loud, longing screech was unsettling; its ability to see in the dark suggested that it could see into the future, yet in the Christian and Judaic traditions the owl also meant blindness, an inability to pierce through the darkness, ultimately preventing spiritual insight. As such, early people saw the owl as a negative force, rather than a positive one.


However, this was not true for all the world, for other cultures, like the Native Americans and Greeks, designed elaborate mythologies that lionized, not demonized, the owl. What the eagle was to the sun, the owl was to the moon. Whereas other cultures linked the owl’s nocturnal nature with depravity, the Greeks linked its night vision with a special sight, a clairvoyance. Fortune tellers, seers, soothsayers, and augurs, all of whom specialized in predicting the future, had as their symbol the owl. It seems plausible, too, that owls’ nocturnal vision suggests a kind of sight that, by lighting up the dark, is revelatory, or which is diametrically opposed to darkness, a kind of clearing therein, or, as some scholars say, an ability to see through the shroud of obscurity. In the dark, things appear faint, in mere outlines, unable to be made out; but the owl is wholly perceptive and has clear vision. The owl stands for rational, inner knowledge because it, like a mirror, reflects the light of the moon. This lunar reflection leads to the owl’s being described as pensive, as deeply thoughtful, and, consequently, as reflective. Quiet, reserved, yet vigilant, the owl kept watch, observant, cautious, curious. Owls tilt their heads to the sides, much as we do when we are confused or puzzled, as though they are mimicking our curiosity—their way of scratching their heads. Thus, it is no surprise that the Greeks related learning and studying to owls. The aloofness of the owl also lends itself to the idea of “bookishness” or “studiousness,” an image closely related to the scholar who stays up at night, working by lamplight (lucubration), disengaged from the rest of the world. It is from this comparison that we call people “owlish,” referring to the silent, intellectual type, who resembles the owl, both behaviorally and physically, in that they stereotypically wear big glasses, which look like an owl’s blank, penetrating stare. Owls seem to stay where they are and rarely move. They are some of the most patient birds we know. It is as if they are waiting for something, as if owls are awaiting something. Perhaps it is their prophetic wisdom at work. Owls seem to know something we do not. They are symbols of inner knowledge, of looking inward. They are serious and lack humor. They are constantly engaged in thought. Being able to fly, to soar high above us, and to see in the dark, where everything appears concealed, owls have a perspective much more inclusive than ours: Owls have a bird’s eye view, an ability to look down upon us, to ponder and perceive the insignificance of our actions. Maybe when they are sitting in their trees, or hiding in their little nooks, they are, like a knowing parent, shaking their heads, wondering if we humans will ever learn; and therein lies the owl’s wisdom—to be patient and consider things from a grand point of view, with matters brought forth from the dark into the light, wide-eyed, all-knowing, and waiting until we are ready to receive their wisdom. But this does not yet answer the question: Why are owls wise?


Allow me to introduce to you the little owl, known also as Athene noctua, from the family Strigidæ. Only 8.5 inches, or 22 cm, long, it dons a wide, low, and small forehead, putting it in a scowl, and it lives in wide, open spaces, like fields. What is so special about this small bird? The little owl is the very owl that rests on the Greek goddess Athena’s shoulder! Yes, that is right: The famous Owl of Athena, or Owl of Minerva in the Roman tradition, is a real owl—the little owl. The little owl became Athena’s symbol because little owls could be found everywhere in Athens. As Matt Sewell writes in his charming little book Owls, “The Acropolis [a fort in a Greek city-state, or polis] was once full of Little Owls, living amongst the pillars and rocks, looking down upon a great civilization.”[1] Again the imagery of “looking down upon” is supposed to connote protection and vigilance and insight. There is an idiom—“bringing an owl to Athens”—that refers to the abundance of owls in Greece; to bring an owl to Athens would be completely unnecessary, given the large numbers that already frequented it. Athens was one of the most famous Greek poleis, and after it the goddess Athena, who happened to be the goddess of wisdom, was named. The logic goes: Because the goddess protected the city, she was named after the city, and because little owls could be found within the said city, they were to be associated with the goddess. Hence, little owls came to be Athena’s symbol. Later, around the first quarter of the 5th century B.C., Athens adopted its silver coinage with the owl of Athena stamped on one side. There are many versions of Athena, including Pallas Athena and Athena Pronoia. Pronoia (πρόνοια) means “Providence,” or “foresight,” in Greek, from which came the idea that owls could see into the future.


G.W.F. Hegel in The Philosophy of Right wrote in his preface, “The owl of Minerva spreads its wings only with the falling of the dusk”[2]. In other words, what Hegel is saying is: True insight, or wisdom, can come only in retrospect. Dusk is the last part of the day, the onset of night, and so, metaphorically, the owl of Minerva, representing wisdom, reveals the lessons of life only after they have happened; it is then that they are taught to us, and that we can apply them.


So what can we take from the majestic owl? From the owl, we can all learn to be patient, attentive, humble, introspective, thoughtful, and reflective. Then, and only then, can we hope to achieve wisdom.


 


[1] Sewell, Owls: Our Most Charming Bird, p. 19
[2] Hegel, The Philosophy of Right, p. 7 

 

For further reading: Owls: Our Most Charming Birds by Matt Sewell (2014)
The Complete Dictionary of Symbols by Jack Tresidder (2004)
Dictionary of Symbolism by Hans Biedermann (1992)
A Dictionary of Symbols by Jean Chevalier (1994)
Birds of the World by Colin Harrison (1993)

Abulidecidibiblism: A Poem

I experience a great deal of anguish when I try to decide what to read. Perusing my library, scanning up and down, left and right, running my eyes over gripping titles, I find myself struggling to comprehend the true extent of what is before me. Stacks upon stacks of books, some layered in front of each other, present themselves to me, each calling my name, as if asking to be read; and I, not sure which to take first, am paralyzed, stuck in place, for I know not what to do. I say to myself, I cannot decide which to read first, so I will continue looking, which is a terrible habit because I then find more books worth reading, only to realize—I have such limited time, how could I possibly read all of these books? It is not to say that some are more deserving than others, no; each is intrinsically valuable, and the problem lies therein. A particular philosophy book is appealing to me, and it conjures up a longing for history, an urge which can only be placated through psychology, but only if I first read classic literature. For this kind of paralyzing indecision regarding books, I have invented the word abulidecidibiblism. It is difficult to capture the exact feeling one has when one experiences abulidecidibiblism, but below is a poem I wrote:


 

There are so many books
Oh time, thou crooked crook!
Thou hast planted in me a seed,
To cultivate it, I need to read
With time doth a crop grow,
So when I read, I best read slow
But I lack such patience;
I desire to read every word,
Yet if I read too much,
They all become blurred
So I guess my energy should be conserved,
As my list of books is long
Oh but where to start!
Must I fling myself into the throng?
I cannot help that more books end up in my cart,
Even if they be unread, they’re in my heart
My oh my! they’re piled so high,
They nearly touch the roof
The ceiling is the least of my problems:—
To read them all, I’d have to be aloof

Each book contains so much knowledge,
Yet they remain on the shelf
Can I really blame myself?
My shelf is limited on self-help
But God knows I need it:
I can hardly help myself
There’s not enough time in the world,
To be whirled away to another world
For as many books as there are

Must I choose between them,
Like choosing between kids?
I cannot choose favorites;
If I could choose all—
Then I should favor it
But each is precious in its own way
Unique, with its own story
Even if their messages,
Are retained in mere vestiges,
Then I shall love them—
But like children, I cannot just choose one

Cursed abulidecidibiblism:
Decision-making is a cataclysm!

My weekend is booked.

The Problem with Memetic Literacy

Immediately after they wake up, a large percentage of people check their phones to see the latest notifications from their social media or to respond to the influx of emails they have received. Similarly, a large percentage of people stay up late doing the same thing, checking their feeds, scrolling, nothing in particular on their minds, nothing to look out for—just scrolling, as if something will magically appear. Every day, millions of pairs of eyes flicker over their bright screens, either on Instagram, Snapchat, or iFunny, looking at hundreds of memes, short, humorous images or clips shared from person to person, starting with just one viewer, then spreading exponentially, until, like the game of telephone, it evolves with every share, becoming something new, something different, yet derivative, building off of the original, but with a new touch of interpretation by whoever appropriates it. It can be said that memes are one of the greatest products of 21st-century technology since they are able to be universally understood, shared, and laughed at. Language barriers are no more, so someone in the U.S. can share a meme with someone in China, and they will both get it. How cool is that—to be able to communicate cross-culturally and get a laugh out of it? Memes allow for shared knowledge and entertainment for people of all ages and backgrounds, connecting them through a single medium. While I myself like a good meme, just as anyone else does, and while they can be hilarious, I think the popularity of memes today, despite its benefits, also brings with it deficits, problems that need to be, and should be, addressed. The spread of a “memetic literacy,” as I like to call it, has supplanted a much more fundamental, more necessary cultural literacy, and so will, I believe, impoverish both today’s and tomorrow’s youths.


When we think of literacy, we think of reading and writing. To be literate is to be able to read and write; to be illiterate, to be able to do neither. Defined this way, our generation has the highest literacy rate ever recorded. Over time, as education has become open to more people, as education has been improved, literacy has gone up, and will continue to. We are living in an Enlightened age, the most Enlightened age, with information stored in computers and more brains than there have ever been. However, there is a difference between being able to read and write and being able to read and write well. E. D. Hirsch defines literacy as “the ability to communicate effectively with strangers.”[1] What this means is that literacy is a common, shared knowledge. If I am literate, then I should be able to engage anyone on the street and have an understanding conversation with them, one in which I am able to understand them, and they me. Despite our backgrounds, we both know what the other is talking about; each of us is comprehended. During the 19th century, when the world was industrializing, education was universalized. Schools were established worldwide to teach a shared culture. National languages were codified in place of regional dialects so that people could understand one another, and thus, as in the Renaissance, reading was made available to everyone, not just the learned elite, who were usually members of religious orders. Because language was made singular, common—the koine, the vulgar tongue—the common folk could on a mass level learn to read and write in school. Some argue that it is a language and a culture that create a nation, for what is spoken and what is spoken about constitute a common people. There is a sort of egalitarian principle behind this, a principle of making everyone equal, of giving everyone—no matter their makeup, their abilities, or their social position—the right to an education, the right to be a part of a culture. There are no distinctions between the advantaged and the disadvantaged, the educated and the uneducated.


Hirsch relates how the literate usually like to keep the illiterate illiterate by not telling them how to be literate, withholding the specific requirements for becoming so. It is subtle: There is no single, agreed-upon list of things to know in order to be literate, for the selection is just so vast. The Western Canon, for example, is but a sampling of the world’s greatest literature. So while some may call you literate for having read the whole Canon, others may not consider that criterion enough. As such, to be truly literate, to be well read, is to be a part of the elite, as opposed to the merely literate, composed of those who are educated enough to read and write. I like to think that I am pretty literate in memes, but I was disabused of this notion when I was hanging out with a friend one time, and I could not relate to a single phrase I heard out of his mouth. I thought I had a pretty solid grasp of memes, yet here was my friend, who was clearly more literate in memes, referencing different jokes whereof I knew not. It was like he was having an inside joke with himself that I could not understand; I lacked the shared background knowledge he had, and he assumed I had it, when I did not. On YouTube, there are famous playlists 300 videos long, lasting for several hours, full of memes. If one can sit through all of them, then one, I guess, can be called “literate” in memes. However, one will still be lacking in other memes, meaning it is hard to specify what memes one should know if one is to be literate in them. In my case, how am I to know which memes are in vogue? Moving past this, the better one can read, the better one does in other subjects. From experience, I can attest to the fact that reading a variety of texts leads to a bigger vocabulary, and thence to a larger store of knowledge and comprehension, resulting, ultimately, in easier learning through association. Such is Hirsch’s outline of literacy. Someone who is well-rounded in their reading, who reads not just fiction but non-fiction, who looks up words they do not know so they can improve, who not only specializes but generalizes their knowledge, who associates what they do not know with what they do know—they are literate, and they are successful in reading and writing.


E. D. Hirsch writes of a study he once conducted at a community college in Richmond, Virginia. There, he interviewed students and asked them to write responses to his prompts. Eventually, he asked them to write an essay in which they compared Civil War generals Ulysses S. Grant and Robert E. Lee, the latter of whom was himself a Virginian. Although they were in the capital of Virginia, what was once the capital of the South, the students were not able to write a response because they did not know who either of the two men was. Hirsch was flabbergasted, to say the least. The point he was trying to prove was this: Cultural literacy is integral to society. A universal background is always presupposed. We require tacit knowledge to understand things that are implicit, both in a text and in the world around us. The culture is greater than the sum of its parts. Culture must be understood generally, in relation to all its parts, rather like a Hermeneutic Circle, where the whole and its parts must be continually interpreted in light of each other. In this sense, cultural literacy comprises political, historical, social, literary, and scientific literacy, all in one, according to Hirsch. In other words, cultural literacy is the totality of all its subjects. One must be well-rounded and not too specialized to be culturally literate, lest one neglect one subject in favor of another. For instance, a writer writing a non-fiction book assumes his audience knows what he knows, or at least has some kind of background information coming into it; he hardly expects them to come in blind, without any preconceptions or context whatsoever. There should be an interplay between specialization and generalization, because a reader should have a grasp of the subject overall, on the one hand, and of the details within it, on the other. Among the things assumed are connotations, norms, standards, and values—in short, shared knowledge. To have this shared knowledge, this basic understanding of one’s culture, such that one is able to engage with it, “to communicate effectively with strangers,” is to be culturally literate.


Durkheim spoke of a “collective consciousness,” a totality of implicit, pre-existent notions that exist within a society. Everyone in the given culture is under this collective consciousness, is part of it. It is collective because it is common to everyone; consciousness because everyone knows it, even without acknowledging it. Being an American, I have the idea of freedom as a part of my collective consciousness, just as over 300 million other people do. Were I to stop a stranger and ask them about freedom, I am sure they would have the same background knowledge as I, such as the 4th of July, which signifies independence for the U.S. This example illustrates an interaction in cultural literacy. Things are a part of our collective consciousness only because they are meaningful and valuable; if they are not, then they do not deserve to be presupposed by all. If something did not mean anything, why should it survive in all of us? Hirsch writes, “[T]he lifespan of many things in our collective memory is very short. What seems monumental today often becomes trivial tomorrow.”[2] It is hard to become a part of the collective memory. What makes good literature good is its longevity. Homer has long been considered one of the greatest ancient writers because he has remained read for millennia. Compare this to pop singers today, whose meteoric rises soon meet an impasse, only to decline, impermanent, impertinent. With memes, the same can be said. They all explode in popularity, only to reach their apex before either fading into obscurity or being replaced by another. A meme can be overhyped. It loses its importance, and although it seems “funny” or “important” one day, it may not be the next. Memes are volatile things. On a whim, they come and go. Even though some have a longer life than others, they all eventually go. The classic Vine “9+10=21” was once extremely popular, and was quoted daily in school; now, it hardly exists in our collective memory; it is a ghost, a fragment from oblivion. Hirsch comments that about 80% of what is carried in the collective memory has already been taught for at least 100 years. The Western Canon, again, is a good example: Its core works have been fixed since antiquity, and as civilization progressed, more works were added to it to keep up, all the way to the 20th century. In 100 years, it is incredibly unlikely—albeit still possible—that we will remember, much less care about, people chucking things while yelling, “YEET!” Memes, while communicating entertainment, do not express values. Therefore, the Western Canon is as it is because it has been formative in our world; its works have been studied for so long and by so many people that it has left an indelible influence, an influence that persists today.


Given all this, I can now address the main problem of this essay, namely the conflict between cultural literacy and “memetic literacy.” I have not spoken a lot about memes yet save in small bits, but I shall discuss them presently. For now, I wish to direct your attention to the issue at hand: The decline of cultural literacy. A teacher created a quiz full of famous, influential persons and gave it to his class to gauge their historical, artistic, literary, and philosophical literacy. He was disappointed when one of his students compared the test to a game of Trivial Pursuit, because it prompted the question, What counts as important or trivial today? This is a vital question that everyone needs to ask themselves. Are famous leaders like Napoleon trivial today, compared to the importance of Viners and YouTubers like Logan Paul? If both names were to be put on a test, would students cry, “Why do we have to know this Napoleon guy? Logan Paul obviously has a bigger influence today”? Is knowing who Napoleon is just trivia? Furthermore, the teacher found that his students had no knowledge of current events, specifically of their own country and its involvement in foreign affairs. Jaime M. O’Neill, the teacher, states, “Communication depends, to an extent, upon the ability to make (and catch) allusions, to share a common understanding and a common heritage.”[3] Allusions are thought by many to be pretentious. Those who make allusions are called name-droppers, and are disparaged. I, along with many others, would argue the contrary: Allusion connects to Hirsch’s idea of cultural literacy. Allusions are an example of shared knowledge. To be well-read, and therefore to know of many ideas and people, is to be involved in your culture. If I were to call something Kafkaesque, then I would be engaging with my culture, as I am drawing on a background in literature that the situation calls for. In short, we are losing the ability to make references to the collective consciousness, the ability to commune with strangers on the same basis. There is a paucity of literacy in literature and history. All teenagers know these days is what they need to know. No one goes out of their way to study history or literature; they are content and complacent with what they know. O’Neill records, plaintively, that some of his students thought Pablo Picasso was a 12th-century painter and William Faulkner an English scientist during the Scientific Revolution.


Throughout my day, I hear my friends and classmates complaining about impractical, specialized knowledge on their tests, knowledge they have to memorize. Although I can sympathize with them, and although I often agree that these tests are absurd, I also think they are in the wrong to say these things. Jeff Jacoby, a journalist for the Boston Globe, has written about the same subject. He talks about how it is actually easier to memorize what is on standardized tests than it is to meet our peers’ standards. Put another way, we memorize so much useless information and trivia on a daily basis about sports, music, and TV in order to keep up with our peers that it is easier to memorize facts that are on a test. Unlike peer culture, whose facts are prone to change and in constant flux, a test’s facts are fixed and unchanging. Whereas 1789 is always the date of the start of the French Revolution, the fact that Steph Curry is the point guard for the Golden State Warriors is bound to change in years to come. Memorizing the Pythagorean Theorem is applicable, as opposed to memorizing all the names of the band members of One Direction, which is impressive, but not applicable. The biggest complaints I hear, and which Jacoby also cites, are “I could spend my time more meaningfully” and “Why should we have to memorize facts?” Both points have merit, I concede, especially the latter. Please do not interpret me as supporting the school and not the students; I have many a problem with education today, of which one is standardized testing, because the memorization of lifeless facts is indeed a problem. My point is: We youths memorize countless dumb, trivial facts about pop culture and regurgitate them just as much as we do scientific facts, like mitochondria being the powerhouse of the cell. I am forced to ask, If you claim you could be spending your time better, what, then, would that look like? Simply put, teenagers, myself included, are being disingenuous and hypocritical; and while I am not saying we should not complain at all, I think we should complain less, unless we truly have grounds for doing so.

Kids set truly high performance learning standards for each other…. If students don’t know the details of the latest clothing fashions or the hot computer games or the to-die-for movie stars, they’re liable to be mocked, shunned, and generally ‘flunked’ by others their age. That’s why so many spend hours each day absorbing the facts and names of popular culture.[4]

This is a particularly interesting insight. Writing for the Concord Review, Will Fitzhugh observes that teens memorize popular culture information to fit in with their peers, to pass the “informal tests” that they create for each other, to be cool. Just as school is standardized, so peer performance has standards, which, if not met, result in getting “flunked.” Students complain about testing in schools when life is a big test itself! One must struggle to stay afloat in the advancing rapids of entertainment that speed by. One must be “cool,” lest one be ostracized for not being a part of the peer culture. A student should be studying hard for a test they have later that week, yet there they are, up late at night, stressing over whether they are literate enough in pop culture, cramming in short seven-second videos to fit in, obsessive, anxious. Memetic literacy is slowly overtaking cultural literacy. Jacoby concludes, “The question on the table is whether the subjects to be memorized will include English, math, science, and history—or whether the only mandatory subjects will be music, television, movies, and fashion.”[5]


So what actually is a meme? The following excerpt comes from the originator of the term, the scientist Richard Dawkins:

We need a name for the new replicator, a noun that conveys the idea of a unit of cultural transmission, or a unit of imitation…. [M]emes propagate themselves in the meme pool by leaping from brain to brain via a process which, in the broad sense, can be called imitation.[6]

A meme is a certain kind of gene, a strand of code that is inherited. But unlike biological genes, memes are what Dawkins calls “cultural genes” in that they pass not merely from person to person but from culture to culture. It is a gene on a mass level. Think viral. A “viral video” is so called because, like a virus, it spreads exponentially in its hosts, not just through the air, but digitally. The video goes “viral” as it is passed from person to person, computer to computer. He says a meme is a form of “imitation,” by which he means that the meme is copied and then replicated. It has copies made of it, either new ones or mutations. They are reproducible and copyable—in fact, there is a meta-meme, a meme about a meme, about stealing memes: Creators will take an already existing meme and put their own twist on it, then put their name on it to claim it, ad infinitum. Memes are a favorable means of cultural transmission, as Dawkins puts it, because they are easily reproduced. The basic meme consists of a picture background with text above and below that makes some kind of predictable joke along a patterned outline. The picture stays the same, but the text can be changed to allow for different jokes among people. They are simple and easy to understand. Punchlines are short and witty, and they are so widely recognized that anyone, regardless of ethnicity or language, will be able to get a laugh out of them. Unlike cultural literacy, which differs transculturally, memes are universal. Any high schooler, I can guarantee, will know a meme from across the world if presented one. Memes have become the source of new allusions. This means, after all, that memes are, if briefly, a part of the collective consciousness. Seen by millions daily, memes are a worldwide shared knowledge. But of course, memes, for all their good, come with problems, too. What is most important in the definition of a meme, I feel, is the word “idea.” An idea can be many things—a song, a joke, a theory, an emotion, a fashion, a show, a video, and a dozen others. This said, memes have great potential because they are good for spreading ideas that matter. The problem is: Memes spread ideas that do not matter. Viral videos are for entertainment, and nothing else. One laughs at a sneezing panda for enjoyment, not education, nor enlightenment. Memes are usually trivial, frivolous, meaningless, and humorous. Not all are, but most are. Despite their potential, memes are actually vapid and disruptive. I get a good laugh out of memes, and sometimes they can even be intellectual in their content, like historical memes. But the majority of them are useless, fatuous entertainment. We need, in this age of ours, to find a balance between being literate in memes and being literate in our world.
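To make the mechanism of imitation concrete, here is a minimal, purely illustrative sketch of a meme “leaping from brain to brain”: each round every copy is imitated, and some copies mutate, so the pool fills with derivative variants and grows exponentially. The names (CAPTIONS, mutate, share) and the mutation rate are my own assumptions for the example, not anything taken from Dawkins.

```python
import random

# A meme as a unit of imitation: a caption copied from brain to brain.
# Occasionally a copy mutates -- someone puts their own twist on it.

CAPTIONS = ["top text / bottom text"]  # hypothetical starting meme

def mutate(caption):
    """Return a derivative of the caption: same pattern, new twist."""
    twists = ["but in 2018", "(colorized)", "nobody:", "*record scratch*"]
    return f"{caption} {random.choice(twists)}"

def share(meme_pool, rounds, mutation_rate=0.2):
    """Each round, every meme in the pool is imitated; some copies mutate."""
    for _ in range(rounds):
        copies = []
        for caption in meme_pool:
            copy = mutate(caption) if random.random() < mutation_rate else caption
            copies.append(copy)
        meme_pool = meme_pool + copies  # the pool doubles each round: exponential spread
    return meme_pool

if __name__ == "__main__":
    pool = share(CAPTIONS, rounds=5)
    print(len(pool), "copies in the meme pool after 5 rounds")
    print(len(set(pool)), "distinct variants produced by imitation plus mutation")
```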


To summarize, the problem at hand is that we are seeing a decline in cultural literacy, the ability to communicate with strangers with a shared, underlying knowledge, and a rise in memetic literacy, the ability to make allusions to videos, celebrities, sports, fashion, and other popular culture. This is not to say that memes should not be used at all, no; after all, Nietzsche said, “Without music life would be a mistake.”[7] A musician like Michael Jackson, being a part of popular culture, ought to be discussed just as much as Louis XVI because he is a part of our collective memory. Popular culture is, of course, a subdivision of cultural literacy, because without it, we would have little shared knowledge! I fear the day we lose our classical literacy, when we can quote Lil Pump’s “Esketit” but not Shakespeare’s “To be or not to be.” We should be able to discuss music and fashion and sports, but they should not be the priority; they are entertainment. Memes do a lot of good, but they can also do a lot of harm. They spread universal joy. They can get an idea to be seen by millions. What we need to do is ask ourselves questions. We need to consider what is trivial and what is important today. We need to decide what is worth studying, what ideas are worth spreading. Entertainment is essential, but spreading ideas, good ideas, is more important. We are undergoing a fundamental change in our world, and we need to be present to address it. This is a proposal to look inward instead of outward, to examine our values, to find out what we care about.

 


[1] Hirsch, The Dictionary of Cultural Literacy, p. xv
[2] Id., p. x
[3]  O’Neill, “No Allusions in the Classroom” (1985), in Writing Arguments by John D. Ramage, pp. 400-1
[4] Will Fitzhugh, qtd. in Jacoby, “The MCAs Teens Give Each Other” (2000), in Elements of Argument by Annette T. Rottenberg, p. 99
[5] Id., p. 100
[6] Dawkins, The Selfish Gene, p. 192
[7] Nietzsche, The Twilight of the Idols, §33, p. 5

 

For further reading: 
Elements of Argument: A Text and Reader 7th ed. by Annette T. Rottenberg (2003)
Writing Arguments: A Rhetoric with Readings by John D. Ramage (1989)
The Dictionary of Cultural Literacy by E. D. Hirsch (1988)
Challenges to the Humanities by Chester E. Finn (1985)

An Incomplete Education by Judy Jones (2006)

Philosopher Clerihews

Invented by Edmund Clerihew Bentley, the clerihew is a poem form composed of two rhyming couplets with the scheme AABB, wherein a famous person is mentioned in the first line, and the last three lines relate an accomplishment, failure, biography, anecdote, rumor, or joke about them. Contrived, silly, and fun to read, these humorous poems can actually be quite educational while still being entertaining. I was inspired after reading some of Jacques Barzun’s clerihews on philosophers to write my own. Following are 16 clerihews on different philosophers. I have tried my best to make them concise summaries of their philosophies!

 


 

 

 

Henry David Thoreau
Was a very thorough
Observer of nature
Who used botanical nomenclature



Martin Heidegger
Conceived upon his ledger,
That what was once concealed
Would in a new beginning be revealed



Michel Henry
Did French phenomenology
And he into life inquired
Whence he from interiority acquired



Friedrich Wilhelm Nietzsche
Tried to preach the
Death of God, and of the slave morality
Favoring instead: Übermensch mentality



Arthur Schopenhauer
Believed in the instinctive power
Of the blind Will-to-Life,
So his pessimism was rife



Epictetus
Had to accede this:
Some things are outside our control
So with the punches we must roll



Edmund Husserl
Made unfurl
In his phenomenological prolegomena
The bracketing of experienced phenomena



Plato, or Aristocles,
Had found the keys
To the fundamental reality,
Which was actually ideality



Socrates
Did not like Apologies
So he rushed out of the cave
And made dialectic all the rave



John Stuart Mill
Had had his fill
Of individual liberty:
He used it as a Utility



Thomas Kuhn—
Why’d you have to ruin
All of scientific history
By reducing it to anomalistic mystery?



Søren Kierkegaard
Was the first of Existential regard
Whose melancholy made him weep
And whose faith made him take a Leap



Thomas Hobbes
Was moved to sobs
When he found life was short
And served the Leviathan’s royal court



Blaise Pascal
Was a real ras-cal
Who liked to gamble
In his theological preamble



John Locke
Pictured a rock
And said it was qualities, primarily
Conceived on a blank slate, summarily



George Berkeley
Said, “Esse est percipi,”
Meaning he couldn’t find
Anything outside his mind


Should I write more philosophical clerihews? Maybe in other subjects as well, like history, literature, and psychology? Make sure to leave your own in the comments, and I’ll be sure to read them!

 

Kafka’s “The Trial” in a Poem

Suddenly one morning, Joseph K is arrested at his home
Apartment to apartment, from lawyer to lawyer, whither he roams,
He discovers everything is beneath the Court’s unassailable dome.

The trial wraps itself around K’s neck like a noose;
It looms overhead, ambiguous, following like a cloud,
So that K, argumentative, confident, innocent, cannot hang loose.

On consulting the painter, K decides to drop his domineering lawyer,
With whom he’s dissatisfied, despite the daunting danger,
And of all the women he’s been with, he harangues her (Leni).

Reposed and ready for his final trial, K’s once more ripped from his room;
And dragged through the streets, as if “guilty” of a crime, he finds he can’t fight time,
For “the Law” has spoken, has driven into his heart a knife—yes, the clouds still loom.

Ycleped by a priest, a “door-keeper” of the Court, K is told a story:
A man is kept from the Law by a door-keeper, who closes it off for him.
K cries, “The door-keeper’s deceptions do himself no harm but do infinite harm to the man” (242)

A Very Short History of the Dream Argument

Dreaming is an integral part of our lives, occurring every night when we are asleep. While the body relaxes, the brain stays active, creating a stream of thought, a stream that comes from the unconscious. Recent research into a method called “lucid dreaming” suggests that people can control their dreams, place themselves within their illusory world, and make their dreams a reality; however, lucid dreaming, as cool as it is, presents a troubling problem, one that has intrigued humans for millennia: How do we know for certain we are not lucid dreaming right now? How do we distinguish our consciousness, our awareness, from the unconscious, the unaware? Are we actually asleep at this moment, life but a mere string of thoughts and sensations?


Defining dreaming and consciousness will help, as both concepts, simple though they may seem, are highly complex, each with its own requirements, psychologically and philosophically. Consciousness refers to “the quality or state of being aware especially of something within oneself”; in other words, consciousness refers to the realization or acknowledgement of the mind and its inner workings.[1] If you acknowledge that you are reading right now, you are conscious of yourself as reading, so consciousness is always consciousness of something, be it an activity or a mental state. American psychologist William James thought consciousness was not an existent thing, relating it to a stream, a series of experiences, one after the other, every one distinct from the other. Neurological studies later linked consciousness, the brain’s awareness, to a process within the brain itself, located in the thalamus. Dreams, on the other hand, are defined as “a succession of images, thoughts, or emotions passing through the mind during sleep.”[2] Dreams are specific to each person, which makes it difficult, then, to verify a “remembered” dream, considering it can be proven neither true nor false. Therefore, it is difficult to differentiate the waking state from the dream state, insofar as both are collections of experiences.


Many philosophers, dating from the 5th century B.C. to the modern day, have attempted to tackle the “Dream Argument,” trying to prove that we are in fact living consciously. For example, Plato mentions it in a dialogue: “How can you determine whether at this moment we are sleeping, and all our thoughts are a dream; or whether we are awake, and talking to one another in waking state?”[3] Socrates was interested in finding out if our senses were reliable, if what we see, hear, taste, feel, and smell is real or a figment of our active minds. Perhaps when we fall asleep, when our brains switch to R.E.M., when we dream, there is a dreamer dreaming this dream. Another philosopher, René Descartes of the 17th century, in refuting the Dream Argument, famously proposed, “I think, therefore I am.” Descartes entertained the thought that his whole life might be an illusion, a trick played on him by a malicious being, that he was misled into believing in reality. He started to doubt everything, including his senses; but one thing he could not possibly doubt was his existence, his self, because in order for him to doubt, there had to be a him to doubt in the first place!


Even though some of the greatest thinkers could not refute the Dream Argument definitively, at least we know from science that we exist, that dreams are just processes happening in the brain, and that reality is as real as it gets, dreams being a product of our imagination… unless we actually are dreaming, just waiting to be woken.

 

 


[1] “Consciousness.” Merriam-Webster.com. (January 19th, 2017)
[2] “Dreaming.” Merriam-Webster.com. (January 19th, 2017)
[3] Plato, Theætetus, 158d

 

If you have a lot of free time:
https://plato.stanford.edu/entries/dreams-dreaming/
http://www.iep.utm.edu/dreaming/

The Media, Democracy, and the Public Sphere [2 of 2]

Click here to read part 1 if you have not already (and make sure to leave a like)!


Today’s technology-driven world is also system-dominated. A system is any division of labor paired with productive forces and knowledge, thinks Habermas. Systems operate through instrumental reason, or ends-means rationality. The ends justify the means. Organizations and the state, accordingly, can manipulate the public with publicity, diverting its attention. The government tends to focus on technical problems, replacing democracy with bureaucracy, resulting in a democratic deficit, where principles of equality and consent of the governed lose their importance to Habermas’ “technocratic consciousness,” a state of mind brought forth by increasing specialization handled by authorities, experts, and professionals, each of whom spreads propaganda under technical jargon, claiming to be “fixing” some new problem. These technical problems are those to which social, pragmatic, pressing, and vital problems are subordinated. Technological ideology is not delusional per se, as other ideologies are, such that their believers are under an illusion, misguided and misled, although it is ubiquitous, as other ideologies are, infectious, spreading like wildfire. As such, the technical dominates the practical, removing thereby personal ethics. When a decision is made, its ethical dimensions are not considered; it is an ends-means instrumentality. Simply put, technology is self-determinative in terms of its values, which makes it a threat to democracy (in excess, of course, as technology is not intrinsically bad).


The commercialization of the press has led to the death of intellectual journalism. Drama takes precedence over detail, personality over policy. During the election, the press notably focused less on the actual and real issues and more on the candidates themselves. Rational discussion was thereby taken from the people, who were distracted from it. Back in the 18th century, the bourgeois educated middle class read the newspaper daily, then went to the salon to discuss it with their peers. Now, the newspaper is still read daily, although not to the same extent. Consumers watch TV for hours every day, without ever exchanging discourse. Listening to the radio, watching TV, we cannot “disagree” with the media, in a sense, because it “takes away distance,” to use one of Habermas’ phrases, by which he means that we are so close to the media that we cannot engage with it; we cannot talk face-to-face with the television or the interviewer or host who is speaking, but are forced to sit there, inactive, passive, taking it in, unable to respond critically. “The public sphere,” notes Habermas, “becomes the sphere for the publicizing of private biographies.”[1] News, publicity, focuses on celebrities, scandals, and politicians. It dramatizes everything they do, reporting it as news, using names to attract and tempt us, making a story out of anything it can get, in order to profit off of it. Rather than examine the policies and character of a person, the news analyzes their personal life. Habermas reflects ironically on the fact that, in the 19th century, ads in the press were considered dishonest, so they took up only 1/20 (5%) of the page. —How things have changed!— Take a look at any newspaper, even a respectable one, and behold how the whole page is practically taken over by ads! Editorials are advertised and lose their meaning. Advertisement gives a sales pitch, clear as day, but PR is more dangerous than advertisement because it exploits the public with attention-grabbing publicity, taking cover beneath the protection of the press.


Moreover, newspapers are dumbed down. Publishers play around with type and font, adding flashy images and illustrations that distract from the content, Habermas points out. The supervisors, just figureheads for their representative companies, get to control which topics are covered, scrapping any they disapprove of. They “serv[e] up the material as ready-made convenience, patterned and predigested. Editorial opinions recede behind information from press agencies and reports from correspondents; critical debate disappears behind the veil of internal decisions concerning selection and presentation of the material.”[2] Debate, once a byproduct of the press, is itself commodified, restricted by formalities, aired to be watched without intervention or follow-up discussion. For this reason, debates are reduced to mere “personal incompatibilities,” trifles, minor disagreements, surrendering themselves to the rampant relativism of the 21st century. In newspapers, “delayed-reward news,” valuable and informative, is vanishing, in its place “immediate-reward news,” which is tainted with too many clichés, touched up with drama, and made to sparkle with hyperbole, such that “the rigorous distinction between fact and fiction is ever more frequently abandoned.”[3]


By commercializing the press, the rich manage to hold onto power. They use propaganda to limit democracy. Playing the victim card, they complain that the wealthy minority are under attack from the powerless, uneducated majority. To combat the democratic instinct, they push for the “indoctrination of the youth,” a phrase actually used in official documents, emphasized by American philosopher Noam Chomsky (1928-) in his A Requiem for the American Dream (2016) to critique the abuses of the media. Institutions like schools were told to be stricter in their requirements, to create criteria for education to brainwash children. The term “brainwashing” probably conjures up connotations of conspiracy; the fact is, brainwashing is very real, and very common, a technique mastered to influence people. Institutions try to limit free thought, in hopes of making everyone conform to a single cutout. To cite an example, Chomsky refers to the Trilateral Commission, an organization which, responding to the 1960s, attempted to develop a “proper” society. There was purportedly “too much democracy,” so they needed to keep the masses in check, making people conform, passive, unquestioning. In the post-Cambodia U.S. of the ’70s, common spaces like libraries and debate halls were closed off in universities to discourage critical discussion. In other words, the government attempted to shut down the public sphere, to prevent any criticisms of the state. Anyone who critiques the government, usually the educated minority of intellectuals, who impugns the media, is denounced as “anti-American,” a term which Chomsky traces to totalitarian regimes. To reduce criticisms of “concentrated power” (the state + corporations), the government discourages critical talk, alienating the critics, calling them traitors to the state, much as the Soviet Union did. Journalism is stifled. The public sphere cannot engage critically or rationally.


Famously, Chomsky said, “Propaganda is to democracy what violence is to totalitarianism.”[4] PR, then, is a method of cracking down on dissent, be it violent or nonviolent—a means of silencing and enforcing strict rules. Propaganda is more dangerous than censorship, he argues, because it, like PR, parades around as the public sphere, but is actually deceptive and misleading. Propaganda is brainwashing. This development of PR and of propaganda stems from Edward Bernays, who coined the phrase “engineering consent,” a concept studied in depth by both Chomsky and Habermas. Bernays created what one official called “consent without consent,” because with the work of Bernays, PR was able to make decisions for people. As Chomsky relates from David Hume, power lies in the hands of the people; but if the people are made to think they have none, they will be powerless, and the government powerful. So the government exploits this. Fabricated consumption, a Veblen-esque term used by Chomsky, refers to the consumer culture of today, a culture in which we are told we need things, rather than want them. The media everywhere shouts, “Look at me!” “Buy this product!” Consumption is both uninformed and irrational, when it is supposed to be informed and rational! Evidently, all this has played a role in the 2016 Election. Rather presciently, Habermas writes that, with the decline of the critical public, those who do not ordinarily vote are swayed “by the staged or manipulatively manufactured public sphere of the election campaign”—notice the use of the word “manufactured.”[5] The presidential candidates were portrayed in a certain manner on purpose, because the corporations that owned the outlets covering them leaned in a certain direction. Because the media was biased and commercially influenced, it created a terrible environment where discussion could not grow—a desert without water, in which nothing could take root. Discussion was neither informed nor rational. Even if there were rational discussions, they were not factual, for the media reported no facts upon which to base them. This kind of political climate is poisonous, and offers no room for critical debate. “[A]n acclimation-prone mood comes to predominate, an opinion climate instead of public opinion,” declares Habermas; i.e., there was no talk about policy or the positions of the candidates; all there were were empty declarations like “I’m voting for blah blah” and “I’m pro so-and-so,” utterly devoid of thoughtfulness or decision.[6]


The decline of the public sphere and the commercialization of the media are nothing new, even here in the U.S. In the year 1934, the first Communications Act was passed, which formally established the Federal Communications Commission (FCC). This organization was created to handle media concerns, its service to the public interest. Then, in 1949, the controversial Fairness Doctrine was passed, a policy that required broadcasters to focus on pertinent, controversial topics and give equal airtime to opposing viewpoints, so as to allow for fair, balanced reporting based on facts, promoting discussions between parties, not just parochial, sectarian biases that supported one side, saying bad things about the other. In instating this, the FCC wanted to foster rational discussions, where both sides could be heard, and then citizens could make up their minds, instead of just listening to one and forming their decision without a second thought. With the Fairness Doctrine, the pros and cons could be heard and rationalized, challenged and defended. There would be less party polarization as a result—a problem we face very much today. The problem of the policy’s constitutionality arose, and it was challenged for impinging on First Amendment rights, so it was repealed in 1987, and formally eliminated in 2011. In 1975, the Cross-ownership Rules were passed by the FCC to “[set] limits on the number of broadcast stations — radio and TV — an entity can own, as well as limits on the common ownership of broadcast stations and newspapers.”[7] These rules stipulated that a company could not own multiple media in the same market. Regulation of ownership was first defined thus. Giving equal voice to all media, the FCC made these rules to reduce and prevent media consolidation—the process in which big companies, or conglomerates, buy out other media companies, and thus hold legal and economic ownership of them. Like Chomsky, the FCC wanted to stop concentration of power. This set of rules appears to be a victory for the public sphere; unfortunately, it did not last long, and tragedy struck when the Telecommunications Act of 1996 took effect. Suddenly, the FCC repealed ownership regulations—hence, they deregulated the media—allowing for more companies to merge together and consolidate. From 2003 to 2007, slowly but surely, the media was increasingly deregulated. Eventually, the Cross-ownership Rules of 1975 were rendered null. Private concentration opened up. One of the terms stated that “whether a channel actually contains news is no longer considered in counting the percentage of a medium owned by one owner.” Companies could now hold 45% of the media market, as opposed to the previous 25% in 1985.[8] This, the rise of oligopoly. By 1985, 50 companies controlled the media. Since the Telecommunications Act of 1996, over a course of several years, the number dropped infamously to five (or six, depending on the source) companies: Comcast, The Walt Disney Company, 21st Century Fox, Time Warner, and CBS/Viacom. Most recently, many an American has prophesied the “death of the Internet” as a result of a decision that took place on December 14, 2017: The FCC, after a long fight, repealed Net Neutrality. Why is it regarded as the death of a free Internet?—Because big corporations, such as Comcast, can now control data as they please.
It used to be that carriers treated all data equally, but now that Net Neutrality has been repealed, just like the Cross-ownership Rules, oligopoly can thrive, meaning big companies control the market, stamping out smaller competitors, all in the name of money.


And what of fake news? What is it, and what implications has it for democracy and the public sphere? Fake news is defined as “false, often sensational, information disseminated under the guise of news reporting.”[9] Put another way, fake news is erroneous, nonfactual information based on getting attention, often with the use of shock to attract people. It conceals its falsehood under the “guise,” or cover, of “news reporting”; it uses the authority of the media to pull off its stunts. This is an existential threat to democracy for several reasons. First, it deceives the public. The public relies on the media to get information, but the press supplies them with none—or rather, it does, but it misinforms them, about everything, seeing as it is fake. Second, it besmirches the reputation of the media. Each time we read fake news and catch it, we lose more and more trust in the media, because we know we cannot believe a word it says. Considering there are good, factual, respectable presses out there, this is disadvantageous because the preponderance of fake news overshadows the good reporting, so that media in general loses its credible character. Third, fake news does not make for critical discussion. If it is fake, then it is not factual, and if it has no facts, no logic, then it cannot be rational in any capacity. Fourth, it signals the collapse of the public sphere and the recrudescence of feudalism, devoid of any criticism.


In a study done by Media Matters, Facebook was found to be one of the leading sources behind fake news circulation. Due to its algorithms, Facebook works like this: The more likes or views an article gets, the more it circulates, the more it spreads. The circulation of news is an active engagement; the more we interact with it, the more it interacts with us. Like a hot agent, the more it spreads, the more hosts it enters, which, in turn, spread it more, multiplying exponentially. Just clicking on the article, just coming into contact with it—this tells the system to send it to more people. The code says, “Oh! this must be popular, seeing as many people are clicking it; I’m sure everyone else will like it…,” and so sends it to more and more people, who then send it further (a toy sketch of this feedback loop follows this paragraph). Worst of all is the fact that fake news is not ideological but commercial. Fake news is not necessarily for promoting a party, supporting one candidate intrinsically; rather, it is all for money, not surprisingly. One might find this fact hard to believe, as there were countless pro-Trump or pro-Hillary (and vice versa, anti-) articles. But the fact is, these fake articles that spread rumors or intentionally provocative comments are advertised not to gain support for either candidate, but to pander to their supporters, and so to make money. Yes, the advertisements were sent to the respective supporters, but not to help the candidates grow; they were made, by their very essence, to be clicked on, thus making money. It is not unknown that Facebook sells private information about its users. Millions of private accounts have their information sold to companies for large sums of money. Once the companies have our private information, they can manipulate us; they can manufacture our consent. If I were to put on my account that I supported a particular candidate, and if my information, which is kept private, concealed from public view, were to be sold to a company, then they could look at my profile, see who it is I support, and send me advertisements and articles supporting that candidate, or denouncing the other candidate, and I would not be able to resist: After all, we love to engage our subconscious biases. Any contrary information strengthens our resistance. Large companies, then, do us a disservice, pandering to us, selling us what we already like and know, entrenching us in our beliefs, leading to confirmation bias, ultimately making them lots and lots of money. Facebook has ads absolutely everywhere. Hence, they make money off of us. Going back to the threat of fake news, the biggest problem is its evolution. Originally, fake news used to be intentionally false, provocative, and contentious, designed to draw its readers in, interested in finding out about the latest scandals, whether believable or not, obviously fake, with the purpose of entertaining. An example would be some kind of conspiracy, like “Hitler still alive in secret bunker in Africa.” This is “sensational” news. Fake news is now a disguised predator, a wolf in sheep’s clothing, preying on us gullible readers, presenting itself as real, authentic news. See, whereas sensational news was meant to be explicitly entertaining and false, fake news is more believable than it used to be, meant to mimic real news, to pull us in with facts; it looks real, but is deceptive, too good to be true.
Taking up the mask of real, credible news sources—which, notwithstanding, are fake—these sites adopt media names, like “San Francisco Chronicle Gazette” or “Denver Guardian.” The president of Media Matters, Angelo Carusone, remarks, “These sites actually capitalize on people’s inherent trust in the news media.”[10]
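The feedback loop described above—each click reads to the system as a signal of popularity, so the article is shown to more people, which produces more clicks—can be sketched in a few lines. This is only an illustrative toy model under my own assumptions (a fixed click-through rate and a fixed “show it to a few more people per click” fan-out), not Facebook’s actual algorithm.

```python
import random

def circulate(initial_views, click_rate, fanout, rounds):
    """Toy model of engagement-driven amplification.

    Every click is read by the system as a signal of popularity, so each click
    causes the article to be shown to `fanout` more people in the next round.
    """
    views_per_round = [initial_views]
    views = initial_views
    for _ in range(rounds):
        clicks = sum(1 for _ in range(views) if random.random() < click_rate)
        views = clicks * fanout  # more engagement -> wider circulation
        views_per_round.append(views)
    return views_per_round

if __name__ == "__main__":
    # With click_rate * fanout > 1, circulation grows like a hot agent spreading
    # through hosts; below 1, the article fizzles out. Sensational headlines,
    # in effect, raise the click rate.
    print(circulate(initial_views=100, click_rate=0.3, fanout=5, rounds=6))
```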


We pride ourselves on our democratic freedoms of speech and press, yet nothing could be further from the truth. Today is the age wherein left becomes right, up down, and right wrong, when everything we have come to know is flipped upside down, every fact we have accepted needing to be checked, then re-checked, just to make sure it is not “fake.” Such is the time we occupy. We cannot trust our media. There is a fundamental lack of discussion. Silent, powerless yet powerful, we have the power to make a change, if we want to. I am sure none of us would like to live in a country where the media purposefully obscures the news, covering up the government’s actions, adding glitter to it, to keep it from appearing as it is. And yet, we live in one. It is not so distant from a totalitarian state as we might think. Chomsky thought Orwell would be impressed, impressed beyond horror, at the extent to which we as a civilization have abandoned truth and honesty in our coverage of the government. The public sphere as we have come to know it has faltered, trampled beneath our feet, like a clerk on Black Friday, as we insatiable consumers burst through the doors, indiscriminate, hungry, willing to feast on whatever is presented before us on a fancy platter. Bibs fastened around our necks, knives and forks tight in our fists, we voluntarily feast on the shiny and tasty-looking desserts placed in front of us, instead of eating our vegetables, salutary, good for us, though not as inviting. We have failed the public sphere. Rational discourse has been abandoned. But if we take the time to talk with one another, engage in discussion, and do our research, reading up on the latest news, attentive, then we can bring back honest, intellectual journalism. We must make our communication authentic.

 


[1] Habermas, The Structural Transformation of the Public Sphere, p. 171
[2] Id., p. 169
[3] Id., p. 170
[4] Chomsky, The Chomsky Reader, “The Manufacture of Consent (1984),” p. 136
[5] Habermas, op. cit., p. 214
[6] Id., p. 217
[7] http://transition.fcc.gov/cgb/consumerfacts/reviewrules.pdf
[8] https://en.wikipedia.org/wiki/Media_crossownership_in_the_United_States#Since_2000
[9] https://www.youtube.com/watch?v=_fc5yayLkI0&t=1s (9m10s)
[10] Id., (9m18s)

For further information:
The Structural Transformation of the Public Sphere by Jürgen Habermas (1991)
Introduction to Critical Theory: Horkheimer to Habermas by David Held (1980)
The Penguin Dictionary of Critical Theory by David Macey (2000)

Chomsky on Democracy & Education by Noam Chomsky (2003)
A Requiem for the American Dream by Noam Chomsky (2017)
Dictionary of Sociology by Nicholas Abercrombie (2006)
The Chomsky Reader by Noam Chomsky (1987)

Social Imaginaries by Charles Taylor (2005)
Media Cross-ownership
Consolidation of Media

Facebook and Fake News