Category: Social Commentary

Charity Phlegm

Bacteria under a microscope (illustration)

This is a gross understatement, but there are a lot of Ice Bucket Challenges circulating the internet right now: innumerable athletes enduring the cold, celebrities like Chris Pratt taking multiple buckets from their laughing wives, and that girl from elementary school you forgot you were Facebook friends with until you saw her screaming and racing circles around a yard for charity. The Ice Bucket Challenge is a veritable epidemic of good Samaritanism, and while I applaud these brave souls drenching themselves to support ALS patients, some critics note that because the trending sensation focuses primarily on the dousing and subsequent nominations, many participants may not know much about the cause they’re freezing for.

That’s why, when my mom sent a nomination my way, I decided that in order to steer clear of the desensitization taking place for many internet acolytes who merely scroll past the deluge of watery videos filling their newsfeeds, I’d have to conduct my own IBC very differently. With a humanitarian fire lit beneath me, I set to work learning all I could about the cause itself, researching ALS on various medical forums, catching up on Pete Frates biopics, and looking into the history of chilling challenges in general. It was and continues to be important to me that both those happy to take a bucket to the head and those quick to roll their eyes at “yet another viral campaign” understand why it matters to keep spreading the word about ALS.

Also known as Lou Gehrig’s Disease for the New York Yankees’ Iron Horse whose career ended with his diagnosis in 1939, amyotrophic lateral sclerosis is a neurodegenerative disease that impedes the brain’s motor neurons from sending impulses through the spinal cord to the subject’s muscle fibers. This neurological disease gradually atrophies (decreases the size and, in turn, the strength of) the muscles that control limb movement, swallowing, speech, and even breathing. There is not yet a cure for ALS, but aids and therapies exist to maintain degrees of independence and prolong survival.

As for the Ice Bucket Challenge itself, its inspiration, Pete Frates, began his crusade against ALS by calling for more attention and action from the Food and Drug Administration in the ongoing search for a cure. Once the team captain and an outfielder for Boston College, Frates has receded into immobility, necessitating the aid of a full-time nurse, a feeding tube, and a computer for communication.

While the Team Frate Train helped skyrocket the Ice Bucket Challenge to this year’s biggest viral sensation and catalyzed a hugely successful fundraiser for ALS research, shivering for charity has been an altruistic tool for numerous awareness campaigns. Since 1904, people have been plunging into icy waters for polar bears, dousing themselves for the Kay Yow Cancer Fund, lying in freezing tubs to raise funds for Madi Rogers, a victim of severe juvenile diabetes, and participating in Cold Water Challenges to spur philanthropic action for clean water, hospitals, and housing in Liberia.

Armed with this new knowledge, I set about dusting off my speechwriting skills and spent an entire day crafting and then trying to memorize a four-minute soliloquy that I hoped might educate viewers on ALS and the challenge taken in support of its victims. As the light of day waned into early evening, I tore around the house looking for ways to actualize my message. The only place I could conduct the challenge without damaging the downstairs neighbors’ ceiling was in the shower; I no longer have a tripod, so I’d have to stack packing boxes on top of a mini fridge to support the video camera; I’d need not one but two buckets to achieve my vision; and in case I forgot anything from my speech, I needed my laptop to serve as an amateur teleprompter and my boyfriend’s assistance to operate it.

Finally ready, I called action, started my speech, and began to pour. What I hadn’t anticipated was that four straight minutes of slowly dousing your skull with ice negatively impacts your memory the way the pretty lifeguard affected Squints in The Sandlot. Not to mention the fact that ice in your eyes makes it impossible to recover your forgotten material from the faux teleprompter that wavers between blurriness and brief clarity in the distance. I was able to get a lot of my speech out, but the moments where I had to stop and start over or spit out a watery word resulted in an editor’s nightmare, and I would never subject my Final Cut-savvy boyfriend to that torture. So I ended up having to scrap the project and conceptualize anew, devising a different approach to filming my speech that I looked forward to completing in a couple days’ time.

Two days later, I woke up with the fingers of the common cold drumming at my throat. No stranger to sore esophagi after enduring them for eight years before realizing I was allergic to my mom’s cats, I spent the rest of the day self-medicating with colloidal silver, cup after cup of tea, and day-long parades to the bathroom. Regardless of my efforts, I drove to work the next morning sick as a dog, and despite my boss’ repeated instructions to keep drinking water, I left unable to ingest anything without feeling like I was going to keel over and face-plant onto my already fragile laptop. It was official: I had the flu.

I spent 48 hours being sicker than I’d been since last Valentine’s Day when ol’ influenza decided it wanted to attend the surprise getaway my boyfriend had planned. This time around, I ended up missing a day of work and had to conduct the next from home to keep my contagions to myself. DayQuil and NyQuil became my new best friends, and the food I usually admire for its incredible versatility and piquancy was deemed an enemy. The heat of the Los Angeles summer made sleeping in bed with a high fever akin to sleeping in a muggy, half-filled kiddie pool. And bedtime became an ambiguous, all-day affair.

When the flu finally began to subside and the virus returned to my throat–bringing along an inflatable bouncy house based on the scale of my swollen glands–I thought the end was in sight. Usually, my ailments start in the throat, escalate according to the virus I’ve contracted, and culminate in a day’s worth of coughing. That’s why when the coughing began and I traded my various Quils for Halls and vitamin C, I could have praised Allah: finally I’d be myself again in one last 24-hour cycle of hell!

But the weekend saw to it that I wouldn’t get off the hook that easily, and as the days passed the cough increased until I was hacking up phlegm in a performance art homage to my fifteen-year-old do-si-do with pneumonia. At fifteen I held out against a trip to the doctor until I’d been afflicted with the illness for three months, simply because my family believed more in vitamins and orange juice than professional care and pharmaceuticals. This time around I kept naysaying my boyfriend’s wise suggestions that a medical opinion was warranted because I knew my Obama-ordered Oregon health insurance wasn’t applicable in my new state of residence and copays are steep enough as it is.

On the morning when my coughs tried suffocating me awake, blood poured faucet-like from my nose, a very bizarre rash broke out all around my neck, and I’d somehow contracted pink eye on top of everything else, I gave in: it was absolutely time to visit Dr. Stranger. Per usual, the doctor was incredibly nonchalant about all of my symptoms, causing unnerving flashbacks to the time my consistently incompetent pediatrician misdiagnosed my bout of flesh-eating bacteria as a temporary skin irritation (thank God for the Urgent Care doctor who thought to actually perform a biopsy). According to Dr. Stranger, my illness had started out as a run-of-the-mill viral infection contracted when a good friend’s cold and my boss’ fever of a week before combined to create my Super Flu. With my immune system weakened, bronchial bacteria had easily hopped on board to join the party, and now I’d have to fill an antibiotic prescription to rid myself of bronchitis by the end of yet another week. The rash, he said, was totally unrelated and most likely an allergic reaction to… something. For this he prescribed Benadryl and hydrocortisone and sent me off with the promise that I could come back for a real checkup should the rash persist or spread.

So here I sit three days after diagnosis and almost two weeks after those precursory inklings of a sore throat, my bedside table weighed down by a water-filled Tervis Tumbler, tissues, cough drops, multiple Vicks cold and flu remedies, Sovereign Silver, allergy medicine, anti-rash cream, Azithromycin that I pray will kick in soon, and floss. I myself am weighed down by phlegm and the regret that by failing miserably at my attempt to complete an educational version of the Ice Bucket Challenge, I’m letting down my mother, Pete Frates, and Chris Pratt.

That’s why I’m glad there are still hundreds of people out there bolstering the internet’s incredible ability to spread awareness and simultaneously proving that philanthropy is alive and well. While my personal icy contribution has been delayed, I hope that other participants go beyond the bucket to educate themselves and others about both the fight against ALS and all the charitable movements that people have been freezing for over the decades. Spreading not only nominations but new knowledge will add a whole new element of significance to the thousands of pounds of ice that have been dumped since Frates’ recently deceased friend Corey Griffin first took up the challenge in Pete’s name. Even if bronchitis or another ailment is keeping you from joining the soaking phenomenon, take a minute to find a new, creative way to support research into ALS and the other diseases that still await a cure.

The Gripes of Wrath

In our contemporary blogosphere, it’s becoming commonplace for opinion pieces to spark galled backlash. With these internet ripostes in mind, allow me to start by addressing a point that’s heated “Jezebel” and “The Gloss” writers alike. I am now and have always been of the opinion that skinny female characters and the actresses who portray them can most certainly be strong. Some examples of this reality include Jada Pinkett Smith as The Matrix trilogy’s Niobe: a skinny woman who’s physically buff and whose on-screen presence as captain of the Logos packs a commanding wallop. Jennifer Lawrence as The Hunger Games’ Katniss Everdeen: a skinny woman who exudes strength in controlled stoicism, perseverance, deft reflexes, and cunning. Emilia Clarke as Game of Thrones’ Daenerys Targaryen: a skinny woman who commands dragons, the most powerful weapons in all of Westeros… when she can find them. In fact, my list of skinny badass women could go on for the duration of this entry because, quite frankly, skinny badass women proliferate throughout the action genre. In some cases, the tiny but strong physiques that parade across theater screens are totally warranted, such as Katniss Everdeen’s lifetime of meager rations and near-starvation, which produced not only her precision in archery but also her emaciated frame.

These rare cases of justified slenderness aside, last weekend I begrudgingly sat through an aspiring blockbuster that reminded me all too blatantly that in most cases Hollywood’s coveted runway-ready action heroines possess slender builds that go without explanation, or outright contradict their characters’ backstories. Said film, which I knew I would execrate from the first teaser trailer, was 300: Rise of an Empire.

Now, in a continued effort to keep the peace, let me apologize in advance to anyone who adored the second installment of the 300 franchise and warn anyone who’s optimistically awaiting the DVD release that these next paragraphs aren’t for you. I would hate to rain on anyone’s pending parade, but in my effort to lay down the skinny, an explanation is in order.

Perhaps my seventh-grade history reenactment of the Battle of Thermopylae gives me a sense of personal connection to the story, or perhaps sneaking into the sold-out premiere after enduring Wild Hogs and being forced to sit in my friend Davin’s lap in the front row made it all the more exhilarating, but 300 is one of my all-time favorite movies. A hyper-stylized, Greco-fetishism action film whose place in my heart outlived my teenage affinity for violent, adrenaline-fueled cinema, 300 is like a classical painting injected with testosterone and set to a mashup of choral hymns and industrial guitar performed in a Mediterranean arena. Couple this with my Frank Miller phase sophomore year of high school, during which all my paintings suddenly looked like Sin City and all the dialogue in my stories had the private investigator timbre of a film noir revival, and it was all but guaranteed that 300 would have me hooked.

As such, from the minute my ear registered the first inklings of a possible sequel, I knew I was in for a pile of sepia-toned, slow-motion dog crap. And when I broke down and saw the film as an escape from last Sunday’s heatwave, my expectations were not disappointed. Seven years ago when it was released, 300 was something we hadn’t seen before. Sure, Neo’s back-bending, decelerated bullet dodge in The Matrix ushered in the stylized fight sequences that pervade action films to this day, but as far as I’m concerned 300 was a new form of visual gluttony that was candidly cool. From the sheer mythos of ancient Spartans, to the absorbing narration, to the gritty and simultaneously painterly aesthetic, to the machismo choreography, to Gerard Butler and his conical beard, and to the archetypal characterizations–every facet of this narrative oozed cool.

Zack Snyder may have taken the M. Night Shyamalan route and fallen quickly from a laudable perch in film esteem to directorial leper, but based on the utter disaster that is 300: Rise of an Empire, incoming director Noam Murro could do with a touch of Shyamalan. Where 300 was fresh cinematic confectionery, 300: Rise of an Empire came seven years too late, after a horde of fanboys reproduced its aesthetic to death in both film and television, à la The Immortals and Spartacus. As if this latency weren’t enough, 300: Rise of an Empire then took everything that was impressive about its predecessor and lamed it past the point of entertainment. Where 300 presented us with hardcore-by-definition Spartans, its sequel centered on the farmers and poets of Athens, and expected us to believe that men of these dispositions would possess the same chiseled and airbrushed abs as lifelong, fanatical warriors. Where 300 brought us iconic dialogue to rev up battles of hand-to-hand combat and impossible feats of flight and strength, Rise of an Empire gave us horrendously convoluted and unimpressive speeches, generally followed by tedious ellipses, before the fleets merely smashed their CGI ships into one another. Where 300 brought us powerful archetypes, such as the inexplicably behemoth god-king Xerxes, the sequel squandered said mystique with inane, humanizing backstories. Where 300 brought us bizarre, prosthetic monsters that served a purpose, the new release tossed in a couple half-attempts at poorly animated creatures that did nothing but hiss, spit, and disrupt deep sea dreams. And where 300 brought us female dynamism in Queen Gorgo’s plight to aid her husband and her people by whatever means necessary, Rise of an Empire brought us Eva Green.

Prior to seeing the film, I read a review in the Los Angeles Times written by a woman who ranted and raved about Eva Green’s magnetism as the Persian navy’s most formidable commander. While it’s nothing against Green’s acting skills, I found both the writing and choreography for Artemisia dry and unimpressive, and the casting of waif-like Green (who attributes her paper-thin frame to her French affinity for “cigarettes and laziness”) really got my goat for the very reasons that opened this blog entry.

Artemisia is a Greek woman betrayed by her countrymen and hot on the trail of vengeance, and as such she’s been training in combat with the Persian herald (of all people) since childhood. Her lifelong vendetta builds her up to be one of Darius’ and Xerxes’ most vicious soldiers, and when she’s pitted against the Athenians at sea, her skill with a sword makes her a killing machine amidst the onslaughts of inexplicably robust seafarers.

With a backstory and profile like that, you’re not pulling the wool over my eyes this time, Hollywood. If Artemisia were a real woman who’d devoted her entire life to Greek-Decapitation Boot Camp, she would at least have arms like these female Adonises:

Female Adonises

(And look, Hollywood, you could even keep the cinched waist for sex appeal!)

For some reason, despite the existence of beautiful, muscular women like the afore-pictured 1905 “circus strong woman” and the 1890s’ Vulcana (a woman who actually looks like she beats foes to a pulp for a living), Hollywood insists on ignoring characters’ profiles and casting the Gal Gadots of the industry as Diana of Themyscira.

Gal Gadot and Wonder Woman

Occasionally, Hollywood does get it right, utilizing Gwendoline Christie’s striking height of 6’3″ to create a totally believable warrior in Brienne of Tarth, or casting stuntwoman-turned-leading-lady Zoë Bell for her genuine physical prowess and ability to literally kick butt. More often than not, though, Hollywood makes feeble efforts at best, tailoring B-movies to women like mixed martial artist Gina Carano, whose leg locks far surpass her abominable acting. That, or it bypasses accuracy altogether in favor of sex appeal.

As a girl who’s been a limp noodle for far more of my life than the periods when lopsided racquetball strength and one year of track and field made me muscular, I completely understand the argument that thinness does not equate to weakness. After all, there are numerous fighting styles out there that enable a narrow figure to bring down someone twice their size. Plus, there’s always the fact that a thin actress can bulk up for a role. But I’m not going to kid myself into believing that Hollywood’s decision to cast skinny women as beastly characters is an attempt to project female empowerment. Rather than utilizing low-weight modes of combat to their advantage or following in the BBC’s footsteps and casting actors who realistically look the part of their roles, Hollywood is clearly only concerned with selling tickets via sex, and the current mainstream definition of feminine attractiveness is runway-model thin… with breasts, if she can manage to pull off that Victoria’s Secret feat.

Thus, until the media’s interpretation of desirability begins to morph towards something of Polynesian proportions, I’ll have to buckle down and swallow my gripes, watching adequately muscular film and television contenders get passed by in the casting hunt for the fiercest commanders of the shitty-remake sea.

Hungry Like the Digitally Domesticated Wolf

I live in Hollywood, Los Angeles, California, the capital of progression in entertainment. As such, I don’t know if I could possibly be more saturated in a trend that future decades may well identify as the zeitgeist of our era. In the way that the 80s are stereotypically characterized by teased hair and overzealous synthesizers and the 20s are remembered for board-thin flappers and sexual revolution, I think our period might be historically defined by the beginnings of the technological takeover George Orwell prophesied. Only rather than relying on technology for every facet of both survival and comfortable living (as science fiction likes to predict), our era seems to devote the majority of our technological strides to the very concept that makes my current hometown a tourist Mecca: entertainment.

In this day and age, we spend so much time sapping entertainment from our televisions, computers, and cell phones (more aptly known as “cellular devices” due to the increasing antiquity of actual phone calls), that it makes the deeply repressed wild child in me sick beyond Pepto-Bismol relief. So much so that I resorted to college-ruled paper for the crafting of this entry, just to spare my eyes the LED glare of my laptop as long as possible.

When I was a child, long before the invention of Smartphones, Rokus, iPads, and Netflix, I technically had far less access to information. In order to garner new knowledge via the answers to numerous queries, people and books already possessing said wisdom had to be sought out–and this process of learning could take far longer than tapping into your Wi-Fi and posting a thread on Yahoo Answers. But despite the hefty girth of old school dictionaries and the time it took to navigate them, the pre-MP3 world I was brought into was far more wondrous. For entertainment, we looked to nature to provide us with sand to sculpt, rocks to climb, mud to throw, trails to explore, and water to paddle. We looked to our toy box for blueprint-less Lego castles to build, Barbies to direct in plays, and whole worlds to fabricate from disparate pieces. We looked to our friends and relatives for tag between the cherry trees, trampoline acrobatics, and lava monster on the stairwells. And in the pursuit of new knowledge, where wise people and books were scant, personal experimentation in pursuit of an answer thrived. In all, it was a time when imagination and the endless joy you could glean from it ran rampant.

Now, I’m not saying the child of my youth doesn’t exist anymore. Trying my hand at teaching elementary and middle school art for several years has proven that there exist many amongst the post-millennium babies who still get a kick out of seed-spitting contests, capture the flag, and playing the time-resistant “house.” But my observations have also yielded a great number of children taking cues from the modern adult: riveted to their iPhones, Angry Birds, Facebook, PSPs, and cable television, sedentary hobbies that I fear will only continue to escalate in child popularity.

Frankly though, I’m one to talk. My sister and I may as well have ushered in the child cell phone craze when at ages 9 and 11 we were envied by our peers as the only two children in school to possess brick-sized, antennae-toting Nokia 5110s. The year was 2001, Snake was one of the few 8-bit games a cellular device could support, and cell phones were still such an up-and-coming phenomenon that instead of confiscating mine when it went off in class one day, my fifth grade teacher merely laughed. But even as early prototypes of elementary school cellonistas, my sister and I only had them as safety precautions for the long, unsupervised walks home from school, not as idle distractions. And when cell phones began to proliferate throughout school systems by the eighth grade, my dad decided our exponential texting warranted the cancellation of our family plan, an act that may have rendered us social pariahs throughout high school, but ultimately did us and our eyesight a world of good.

Nine years later, sitting in a Hollywood apartment with my laptop blinking at me sleepily from the bed, my Smartphone sedate on the table, and my image reflected back at me on my boyfriend’s flatscreen TV, the thought of pre-adolescent children fixating on their digital devices with the same vim the characters of Her demonstrated with their Operating Systems is a frightening one. I’m 23 years old, living in the age that witnessed the birth and demise of CDs, DVDs, and Blackberries; an age in which the rapidity of technological advancement grants our lifestyles increasing facility on an annual basis. And yet rather than celebrating the ease with which I can archive my music or send my sister messages via satellite, all I really yearn to do right now is ditch the muffled television conversations that seep through every Hollywood wall, throw my phone and its tempting crossword puzzles to the wayside, bid adieu to the computer that served as my life support and safe haven throughout college, and take up residence in a remote, mountain-ringed field somewhere.

For as an active participant in the age of intensifying technological reliance and reproduction, it’s nerve-wracking enough pondering ways to go about shielding my future children from the comparatively substandard Harry Potter films long enough for them to read the books. With this and similar obstacles amassing by the day, it’ll be a wonder if I can convince these pending Moon babies that racing you to the other side, climbing to the highest peak, and letting your imagination run away with you provides entertainment that simply can’t be found by poring over an iPhone.

This American Life, 101

My family has a tendency to nag my sister about her repudiation of the national criterion that expects all nineteen-year-olds to immediately enroll in a four-year college upon high school graduation, lest they wish to toil through a life of welfare or, God forbid, work a blue-collar job for the rest of their lives. When I was still enrolled in fastidious studenthood, I might have agreed with my family’s concerns for my sister. After all, she’s an incredibly bright human being with a charming personality and the same fierce drive that makes all us Moon-Woods workaholics. If she had found a college program that appealed to her, there’d be no doubt in my mind that she would excel at it. But thus far in her life, nothing a college degree can offer has beguiled her into attendance, and surprisingly, I commend her for standing by that fact. I’m probably going to be ostracized from the family for the newfound beliefs I’m about to confess, but after making the decision to adhere to The Official Timeline of an American Life myself, I wholeheartedly support my kid sister’s decision to depart from the norm.

For some reason, the overarching sentiments of this country seem to suggest that adults who veer from the expected college track will become workforce pariahs, too burdened by ignorance to climb the occupational ladder and attain the life of monetary leisure the American dream extols. We have a tendency to completely discredit other forms of learning in the face of institutionalized academia, and pity those who reject the increased opportunities a diploma provides. But it’s an obvious fact that the boot of school does not fit every foot–especially since many of our schools operate under the delusion that packing young, overworked brains with a winter quarter’s worth of knowledge and then testing them to assess and grade their intelligence is a universally beneficial system. For some students, this rote methodology works wonders, but for many–including obsessive-compulsive grade point extremists like myself–this system is incredibly faulty, prioritizing a numerical outcome over the individualized educations every child would receive if all schools truly fulfilled their self-professed mission.

While this is certainly a cynical take on institutionalized learning, I’m not discrediting the value of education in the slightest. I think actively broadening your mind in the pursuit of knowledge is far more important than seeking a degree for the future income it might secure, and therefore I’m a huge proponent of the academic value of continuing on to college after high school. Sadly though, elements of my schooling reinforced the fact that monetary gain takes precedence in the eyes of our Capitalist system, demonstrated by my required enrollment in several courses that were entirely useless, taught by so-called “educators” who had nothing to teach and instead formed an obligatory conveyor belt in the production line that is contemporary college.

Based on my experience, college is a business bent on perpetuating the larger mechanism of national wealth. While the notion of putting a price on knowledge is completely counterintuitive, the idea of coupling education with the exclusivism of astronomically high tuition is outright idiotic. Yes, garnering an education at a community college is a much cheaper route, but for those who can afford and choose to attend a community college, there’s still the stigma that their educational institution is merely a stepping stone to a more expensive school, where greater resources supposedly ensure better academia and, in turn, more profitable jobs.

But when we talk about the value placed on today’s “premier educations,” we’re talking about exorbitant prices. Even with the incredible four-year scholarship I received, the 16-hour days I worked without breaks, and the nerve-wracking amounts my parents had to proffer every quarter, the remaining bill still weighs on me like an unmanageable dumbbell, and I’m officially a statistic on the long list of post-grads facing a lifetime of staggering college debt.

To make matters worse, I’ve now witnessed the fact that many college graduates who’ve been roaming the “real world” much longer than I have are victims of the twisted notion of the internship. This concept might once have meant a brief occupational transition between school and adult responsibility, but it has since evolved into an interim period of strenuous unpaid labor that (like my boyfriend’s internship) can demand seven straight days of serious work, imperative to the company’s success. All of this sans even the pay an uneducated fast-food employee makes in one hour.

Because the American system condones the idea of unpaid labor and demands five years’ worth of experience for numerous entry-level jobs, many recent grads have to become fast food employees, waiters, sales floor reps, and grocery store baggers just to afford residency in the city that hosts their internship, which, in the arts industry my former college caters to, means the extremely expensive cities of New York and Los Angeles. Enter a restaurant in LA and if your server isn’t an out-of-work actor, they’re likely a post-grad with a bachelor’s or master’s under their belt and at least five internships on their resumé. And I’m not exaggerating. Since moving to Los Angeles, I’ve become friends with a law school graduate who has to waitress here in the City of Angels and can cite the impressive degrees of everyone on her restaurant’s waitstaff, and I’ve met innumerable people who shake their heads in exasperation when they tell you that yes, their fifth internship is also unpaid.

This whole transitional interlude is an incredibly stressful time, and if you took International Baccalaureate classes in high school in the hopes of attending a prestigious college that supposedly guarantees a comfortable job, you’ve been extremely stressed since you were sixteen. I folded this stress into my life as a natural part of living, and thanks to cautionary familial examples of the toll eschewing college can take, I always figured I’d made the right choice. But then my sister chose otherwise, and I had a new example to behold.

My sister works at a job that many would consider undesirable. In fact, having worked in the same establishment, I can vouch for those dissenting opinions myself. But my sister and I are two very different peas from the same pod of sweat and determination, and despite some displeasing elements, my sister loves her job. She’s incredibly popular amongst her coworkers, supervisors, and the customers she serves, she gets to arrange her own schedule (which happens to begin at four in the morning, at her request), she gets paid well and receives numerous benefits, and she has plenty of time to engage in her favorite after-work hobby of toning those buns and thighs at the gym she frequents. She may not have the salary of a med school-trained neurosurgeon, but she has an even more beneficial facet of life: she’s happy. And for all six years that I was tearing my hair out in academic exasperation, she was approaching life with a relaxed mindset that maintained her persistent, bona fide smile. Yes, there’s no telling what her monetary future holds without an official stamp of institutionalized approval, but is that really the crux of a human life?

To conclude, I applaud your decision to take things in stride, little sister, to live for the moment even though we can’t resist heckling you about the future, because we, like the rest of the country, abide by the fear that if you don’t acquire financial security there’s little hope for happiness. If you should ever want to learn a new skill set or venture into a new occupation that requires a piece of verifying paper, I encourage you to look into colleges or trade schools. Resist being swayed by money-hungry recruiters who’ll sing any school’s praises, and conduct your own research about the professors and the real success stories instead of the advertised statistics. Find an institution that will really give you your money’s worth, and attend it with a desire to learn, not a desire to merely graduate. And should you ever find yourself suddenly living a cautionary tale of your own, do what most narrators don’t: make an effort to change it. Life is too dang short to spend it mimicking the rest of society because that’s what’s expected of you, so go out and garner wisdom, work, and happiness however you see fit, little girl. Keep approaching life with the sense of excitement and wonder you’ve always possessed, and I know you’ll do just fine.

Cirque du Inconduite

An admitted dolt in the realm of pop culture, I am not one to devote two hours of my innately fickle attention to a show that awards celebrities for their societal merit, and MTV’s Video Music Awards are definitely no exception. But when the chaos of the proceedings catches the attention of my boyfriend, the diners seated next to us at Phở Show, and the old lady who rings a loud bell as she pushes her cart of purchasable goods down our street every day, I figured there was no use fighting the tide of insignificant viral knowledge and succumbed to a few recaps. Specifically, the award ceremony’s shocking crème de la crème in the form of Miley Cyrus.

In keeping with the pop culture ignorance that replaced the actor idolatry of my youth, I don’t really know anything about Miley Cyrus beyond the fact that she used to wear a wig on TV and mesmerize kids with a proclivity for hero worship; those of us wrought with country music ineptitude consider her father an achy-breaky one-hit wonder; she starred in some movie filmed on Tybee Island while I attended class completely unawares only 18 miles away; she may or may not have married the arguably less attractive Hemsworth brother; and her sexually suggestive shenanigans have been curdling PTA members’ breakfast milk for the past several years of her waning adolescence. On top of all that, I know that she’s the same age as my younger sister–born only a few days prior–and having been around both my sister’s crew and whole troops of them back in my college days, I know how 20-year-olds act, and can only imagine how the constant accompaniment of a blinding limelight would amplify said behavior.

Thus, I find it hilarious that a celebrity like Miley Cyrus can get so much opprobrium for parading around in feigned nudity and conducting lewd, embarrassingly uncoordinated dance moves during her VMA performance, while the backup dancers who successively march out to the beat of a sex-driven drum in cliché, skin-tight spandex go virtually unnoticed. Yeah, yeah, Miley Cyrus was a child role model–I remember how excited my young cousins were to unwrap Hannah Montana paraphernalia at Christmas. But we sure are quick to forget that Britney Spears was a Mickey Mouse Club member (along with Justin Timberlake and his jheri curls) whose first album beguiled the nation’s youngins, and look where she ended up–a parallel that Trey Parker and Matt Stone had already drawn to Miley Cyrus long before this public debacle. Is this recurrent trend not a blatant sign that we as a society keen on the scandals splayed across People Magazine are culpable for the shocking behaviors of our young icons? If we weren’t a species akin to the Ashleys of Recess fame (crying, “Scaaaandalous!” at the slightest inkling of amorality), then those young superstars we love to distract our kids with might keep their pants on for a change.

As it is, pop culture has always been a game of one-upping the last controversy to obtain some free publicity. You need to be brash to sell tickets to a society that claims to have seen it all, and if standing out means upsetting the mothers who once called you adorable, then by God, the increased attention is worth a clumsy attempt at half-nude twerking. Especially when your competition operates under the moniker “Gaga,” serves as a gay bar icon second only to Cher, and constructs her public persona from Madonna hand-me-downs, Harajuku fashion, and what must have been the deranged visions of an acid trip.

Ironically, the polls say young Miss Cyrus and her unremitting penis innuendos trumped Lady Gaga’s bug-eyed, postmodern nun, Bauhaus-ish choreography, and tacky shell bikini, a feat that even Madonna herself couldn’t pull off when she abandoned the hippie phase that produced Ray of Light, filled in her gap, donned a faux British accent, and attempted to regain popularity by enlisting the aid of M.I.A., Nicki Minaj, and some pompoms. Misinformation or not, however, according to The Slatest, Lady Gaga still managed to perturb Will Smith’s family with the nutty schtick the masses are beginning to deem passé, so perhaps there’s hope for her next public stunt yet.

Overall, the whole Video Music Awards ordeal is a silly affair sprung from a Victorian-era affinity for scandal. We the people of the United States of Rabble-Rousing fuel the raunchy flames of fame-crazed twenty-somethings by making a big fuss over behaviors that attention-seeking young adults conduct for small beer pong audiences on a weekly basis. The controversy we engender is the coal that keeps this monkey train rolling. Lose the voracious appetite for muckraking, and maybe we won’t have to watch girls the same age as our little sisters defile the innocence of teddy bears and #1 foam fingers with their bad dance moves and flesh-tinted ensembles.

The Tinseltown Trope

By way of the media-sharing, social networking, and stalker-encouraging faculties of a little web sensation known as Facebook, my attention was recently directed to an article written by author Sophia McDougall for NewStatesman entitled “I Hate Strong Female Characters.” Though it initially struck me as an odd subject for a woman in full advocacy of female heroism, the article reveals the author’s vexation with the fact that the few female characters Hollywood’s male-dominated industry engenders these days are whittled down to mere “strong” women.

As if to pacify the contemporary consumer’s aversion to the antiquated “damsel in distress,” screenplays today produce a myriad of women who not only serve as the male protagonist’s necessary love interest, but who also pack a punch. To illustrate the media’s attempt to reverse the princess hype of bygone eras, McDougall cites kung-fu-savvy Fiona from Shrek, trigger-happy Peggy Carter from Captain America, Buffy of vampire slaying fame, and Black Widow from The Avengers (am I sensing an anti-Joss Whedon trend here?), all of whom resort to violence to establish their auras of sexually intriguing power. While there’s no denying these kick-ass women have right hooks and roundhouse kicks in heels down to a T, McDougall’s article surmises that this modern cinematic woman may be nothing more than a convenient ruse to keep the idolatrous masses at bay–to paraphrase Walter Benjamin. In today’s big Hollywood blockbuster, women have to be portrayed as strong in order to receive the respect their male counterparts garner, even though a man can be prone to addictive neuroses à la Sherlock Holmes, and still be considered a hero. Ultimately, McDougall asks for equality between male and female characters. Instead of one gal and five guys in a superhero posse, why not level the gender playing field? And instead of emphasizing nothing beyond that one female character’s strength and sexual magnetism, why not add the dimensions of reality afforded to male protagonists like Spider-Man, Hamlet, and Daniel Craig’s James Bond?

After reading this opinionated plea for equality (akin to the egalitarianism my inner, scale-toting Libra is always intent on), I got to thinking. On the one hand, I could decry this cinematic platitude as reverse discrimination: a Hollywood ploy so keen on eradicating the helplessness of damsels past that it’s catapulted the blockbuster heroine into a predictable facade of strength, as if to suggest that while men are expected to be strong and therefore require ulterior characteristics to be captivating, women are expected to be weak, and therefore easily transition into compelling characters when caustic gun-wielding comes naturally. But is The Avengers’ Black Widow, with her monotonous, expository lines and repetitive harnessed flips, actually a compelling character?

On the other hand, I realized, as I pondered this crux, that I myself am at fault for the fact that the sheer number of male protagonists–be it in The Avengers, Inception, or even The Smurfs–tends to exceed the number of female characters. I haven’t written recreational fiction in years and have honestly evolved well beyond the anti-feminist, male-idolizing yahooligan of my youth, but back when I was able to document the adventures of my imagination on a daily basis, I was undeniably responsible for the adolescent egocentrism that results in one primary female character and a horde of dudes. Yes, there was the Holes fan-fiction from middle school that introduced a cast of female equivalents for each of the male Green Lake inhabitants, and yes, the three women featured in my story “Pampa” outnumbered the two men, but generally, my writing enveloped a sole heroine based on some constituent of myself and a host of male characters based on other personal facets. Blame it on latent tomboyism, but as a girl who found herself easily relating to a male mindset, it just felt more natural to translate my sardonic voice through a male medium and reserve my sense of teenage trepidation about body image, boys, and school for my female characters.

But just because the men outnumbered the women in my writing didn’t mean my female characters conformed to classic Hollywood’s helpless maidens or today’s revamped sword-brandishing pseudo-mutes. My characters may have been uncertain about a lot of the things life presented them with, but some of them certainly emanated natural strength, a couple of them had pulled through harrowing circumstances hardened but notably wiser, many of them could riposte circles around their male companions, and all of them had individual perspectives, experiences, and a distinctive voice of their own. None of them used kung-fu to merit respect (in fact, one character hid her penchant for violence as a hired gun in order to assimilate into the new identity she’d devised), and while a couple of them (like my sister’s analog in the Holes fan-fic) had the men drooling, most of them deviated from the stereotypical sex symbol that makes a female character profitable in the eyes of Hollywood.

In fact, as I pondered the subject further, I realized that even though hero movies (generally inspired by comics made by men and produced by men for men) have created the Disney princess foil via their violent, "strong" female archetype, women have come a long way in the media. Just look at Tina Fey and Amy Poehler, idolized for their hilarious goofiness and witty intelligence without having to step into a leather catsuit or be raised by a pulley to conduct Crouching Tiger, Hidden Dragon-esque combat. And even in the realm of tough cookies, Arya Stark combines the honest vulnerability of youth with an adult desire to aid her family and fight because it’s inevitable, not because it’s sexy. While these multidimensional women offer hope to irritated consumers like McDougall, I won’t deny that they’re a long way off from representing the schema perpetuated by our summer blockbusters–that of the disposable, hyper-sexualized Bond girl or the infamous “strong female character.”

I suppose that when you reside in a country where female politicians still don pantsuits to be taken seriously, it’s no wonder Hollywood imbues strength in its female characters to elicit respect. While rugged gals can punch a chauvinist into silence or shoot their loved ones with fifty arrows out of unverified jealousy, you know our blockbuster screenplays have a few reality checks in order when the closest fictional woman I can relate to for her perseverance is Liz Lemon (that, or I just really like ham).

I’m a proud proponent of the fact that Hollywood has come a long way since Snow White lay in entombed waiting after a gullible run-in with an apple, but I can also recognize the validity in McDougall’s sentiments. Hollywood seems to be opposed to the notion of a female hero carrying her own film (and headlining a movie poster rather than standing behind Robert Downey Jr. and Chris Hemsworth) because, quite frankly, they haven’t figured out how to make her compelling enough yet. Personally, I don’t want them to make that movie until they learn to do it right–two hours’ worth of Scarlett Johansson’s blank expression while she pulverizes villains with the powers of… karate would make for a sure-fire box office flop. Perhaps the secret lies in employing female writers, girls who, like my adolescent self, dreamt up women who equalled men in battle but possessed senses of humor and honest queries about life to boot. Maybe Hollywood just needs to hand over the reins to the female script writers and guys in tune with their feminine sides, thereby enabling those underused artists to revel in a little geeking out of their own.

Reverie Interrupted

Original Artwork © Emily Moon

Along with learning how to complete tax forms and fill out checks, one of the saddest, most inevitable aspects of aging is the gradual diminution of daydreaming. That isn’t to say that there aren’t adults out there who still pass the hours with their heads in the clouds, seemingly idling away while their imaginations rev with steam power, but I beg to proffer a generalization when I say that most adults in our Capitalist system don’t have the time or mental energy to dream like they used to.

This unfortunate phenomenon occurred to me last night after seeing Pacific Rim in IMAX 3-D at Universal’s neon-lit CityWalk. Despite the obvious holes that even Guillermo Del Toro admits to, this film was an increasingly rare personal experience in which I was actually able to relax and enjoy a summer blockbuster and all the giant, sword-wielding robots it had to offer. But while beating back motion sickness for the thrill of prismatic kaiju-jaeger carnage, the thought occurred to me that if I were a twelve-year-old kid watching this movie, my mind would be racing to fabricate a myriad of subplots and potential characters, and as soon as the movie ended I would hurry home to manifest my alternate narratives via writing, illustration, or a long bout of daydreaming. As it was, the movie ended and I hustled home to collapse exhaustedly into bed.

It’s a real shame that daydreaming seems to be a pastime literally and ideologically reserved for children. Even for those adults fortunate enough to still possess the active reveries of juvenescence, our culture seems to perpetuate a social stigma about daydreaming after a certain age. The phrase, “get your head out of the clouds,” comes to mind when pondering the fact that idle behavior in adults is generally chastised by the United States’ emphasis on productivity. Since youth, aging in America runs parallel to an exponential loss of time: our homework starts to amass in middle school to ensure that we’ll be ready for high school; our social lives have to be marginalized in order to complete all the high school work that prepares us for college; college buries us so deep in post-college preparation that sleep becomes an irregular recreation; the five unpaid internships a city like Los Angeles demands from us and the secondary jobs we fill just to make rent consume every waking hour of the day in preparation for a career; and unless we’re lucky enough to secure a relaxing schedule and ample time off, our careers become synonymous with “life.” Of course it all peters out eventually, and one can only hope that the reinstated free time of retirement might kindle some sense of contemplative woolgathering… as long as the exhaustion of the years prior doesn’t preoccupy the mind.

I think Hispanic countries got it right when they established midday siestas as a cultural repose. Providing people with an opportunity to regain their energy and cerebrate at their leisure is a genius social strategy that not only aids in employee stamina but also in creative output. Daydreaming, while criticized as mere inattentiveness, self-absorption, and absentmindedness, is a progenitor of art and innovation. Back when I had the time and the energy to simply explore the contents of my imagination for as long as I saw fit, my artistic output was tenfold its current yield. Today, if I’m lucky enough to have a writing implement on hand to jot down a creative thought when it gallivants my way, I still have to seek time to flesh it out, and by then I might already be preoccupied with the next fleeting fancy.

But I shouldn’t be so quick to bellyache about the future to come, for having attended art school, I’m geared up for a career in creative ideation. Despite these occupational prospects, the expectations of most middle- and lower-class vocations that I grew up amongst are worrisome on the creativity front. Unless you have a job at Pixar or in an advertising agency, work schedules are not conducive to imaginative thought. And even with a creative occupation, daydreaming just isn’t the same when you work to produce creative ideas versus spontaneously slipping into hours of free associative contemplation.

I suppose if there’s any consolation to be garnered from this predicament, it’s that even though the American system demands that we work hard to afford the necessities of life and work even harder to live leisurely, creativity continues to flourish. Eccentric couture designs continue to catwalk their way into fashion shows, anonymous muralists continue to adorn city streets with whimsical illustrations, teachers continue to create innovative curricula to engage their students, architects and urban planners continue to brainstorm new strategies for cost-effective living, and artists like those assigned to Pacific Rim continue to dream up bigger, more fantastical monsters. With creativity manifesting all around us every day, it’s clear that innovation is not solely the product of excessive daydreaming, and with the help of these imaginative adults, creativity will continue to augment social progress. Yet despite this propitious silver lining, I can’t help but wonder what this country would be like if everyone still had the time to dream with the same fervor that propels a child to build castles in the sky.