It’s The End Of The World As We Know It… And I Feel Fine

According to the Mayan calendar — as translated by new-age hippies I used to know, and depicted by Roland Emmerich — the year 2012 is alleged to herald the apocalypse. Perhaps this collective unconscious sense of mass destruction is what’s driving the popularity of turn-of-the-millennium musings about the end of the world. In June 2008, Adbusters’ cover story was literally titled “Hipster: The Dead End of Western Civilization.” Three and a half years later, Vanity Fair’s first issue of 2012 asks, “You Say You Want a Devolution? From Fashion to Housewares, Are We in a Decades-Long Design Rut?” While these two publications could arguably not be further apart on the target-audience spectrum, they’re singing the same doomsday tune. As Kurt Andersen writes in the Vanity Fair piece, “The past is a foreign country, but the recent past—the 00s, the 90s, even a lot of the 80s—looks almost identical to the present.” The article concludes, “I worry some days, this is the way that Western civilization declines, not with a bang but with a long, nostalgic whimper.” But has cultural evolution really come to a grinding halt in the 21st century, or are we simply looking in all the old places, not realizing it’s moved on?

In Adbusters, Douglas Haddow sets up the alleged apocalypse like so:

Ever since the Allies bombed the Axis into submission, Western civilization has had a succession of counter-culture movements that have energetically challenged the status quo. Each successive decade of the post-war era has seen it smash social standards, riot and fight to revolutionize every aspect of music, art, government and civil society. But after punk was plasticized and hip hop lost its impetus for social change, all of the formerly dominant streams of “counter-culture” have merged together. Now, one mutating, trans-Atlantic melting pot of styles, tastes and behavior has come to define the generally indefinable idea of the ‘Hipster.’

Echoing that sentiment in Vanity Fair, Andersen writes:

Think about it. Picture it. Rewind any other 20-year chunk of 20th-century time. There’s no chance you would mistake a photograph or movie of Americans or an American city from 1972—giant sideburns, collars, and bell-bottoms, leisure suits and cigarettes, AMC Javelins and Matadors and Gremlins alongside Dodge Demons, Swingers, Plymouth Dusters, and Scamps—with images from 1992. Time-travel back another 20 years, before rock ’n’ roll and the Pill and Vietnam, when both sexes wore hats and cars were big and bulbous with late-moderne fenders and fins—again, unmistakably different, 1952 from 1972. You can keep doing it and see that the characteristic surfaces and sounds of each historical moment are absolutely distinct from those of 20 years earlier or later: the clothes, the hair, the cars, the advertising—all of it. It’s even true of the 19th century: practically no respectable American man wore a beard before the 1850s, for instance, but beards were almost obligatory in the 1870s, and then disappeared again by 1900.

Writing about the Adbusters piece in 2008, I pointed to a central flaw in the premise: the emergence of what Chris Anderson, in his 2006 book of the same name, calls The Long Tail. Digital technology, Anderson writes, has ushered in “an evolution from an ‘Or’ era of hits or niches (mainstream culture vs. subcultures) to an ‘AND’ era.” In this new, rebalanced equation, “Mass culture will not fall, it will simply get less mass. And niche culture will get less obscure.” What Adbusters saw as the end of Western civilization was actually the end of mass culture: a transition to a confederacy of niches. So, if mass culture as we (and Adbusters) had known it was over, what was there to be “counter” to anymore? (While, more recently, Occupy Wall Street has thrown its hat into the ring, it’s not so much anti-mass culture as it is pro-redefining the concept: the 99%, through the movement’s message — let alone mathematics — is not the counterculture. It IS the culture.)

Unlike Haddow, Andersen doesn’t blame the purported cultural stagnation on any one group of perpetrators. Rather, the “decades-long design rut” has descended upon us all, he suggests, like an aesthetic recession, the result of some unregulated force originating in the 1960s and depreciating steadily until it simply collapsed, and none of us noticed until it was too late. “Look at people on the street and in malls,” Andersen writes. “Jeans and sneakers remain the standard uniform for all ages, as they were in 2002, 1992, and 1982. Since 1992, as the technological miracles and wonders have propagated and the political economy has transformed, the world has become radically and profoundly new.” And yet, “during these same 20 years, the appearance of the world (computers, TVs, telephones, and music players aside) has changed hardly at all, less than it did during any 20-year period for at least a century. This is the First Great Paradox of Contemporary Cultural History.”

Or is it?

In a 2003 New York Times article titled “The Guts of a New Machine,” the design prophet of the 21st century revealed his philosophy on the subject: “People think it’s this veneer,” said the late Steve Jobs, “that the designers are handed this box and told, ‘Make it look good!’ That’s not what we think design is. It’s not just what it looks like and feels like. Design is how it works.”

Think about it. Picture it. Those big, bulbous cars Andersen describes, with their late-moderne fenders and fins, so unmistakably different from 1952 to 1972: just how different were they, really, in how they worked? Not that much. In the 20th century you could pop open the hood of a car and, with some modicum of mechanical knowledge, know what you were looking at. Now, the guy in the wifebeater working on the Camaro in his garage is an anachronism. You’ll never see that guy leaning over the guts of a post-Transformers 2012 Camaro. Let alone a hybrid or an electric vehicle. “With rare exceptions,” Andersen argues, “cars from the early 90s (and even the late 80s) don’t seem dated.” And yet, there’s no way anyone would confuse a Chevy Volt with anything GM was making 10 years ago, or a Toyota Prius with what was on the road in the early ’90s, or mistake the voice recognition now common in a 2012 model for anything but a science fiction conceit from a show starring David Hasselhoff in the ’80s. While the claim that exterior automotive styling hasn’t changed in the past 30 years is debatable (remember the Tercel? The station wagon? The Hummer? A time before the SUV?), it’s indisputable that the way a 2012 automobile works has changed.

For the majority of human history the style shifts between eras were pretty much entirely cosmetic. From the Greeks to the Romans, from the Elizabethans to the Victorians, what fluctuated most was the exterior. It wasn’t until the pace of technological innovation began to accelerate in the 20th century that design became concerned with what lay beneath the surface. In the 1930s, industrial designer Raymond Loewy forged a new design concept called Streamlining. One of the first and most widespread design concepts to draw its rationale from technology, Streamlining was characterized by stripping Art Deco, its flamboyant 1920s predecessor, of all nonessential ornamentation in favor of a smooth, pure-line concept of motion and speed. Under the austerity of the Depression era, the superficial flourishes of Art Deco became fraudulent, falsely modern. Loewy’s vision of a modern world was minimalist, frictionless, developed from aerodynamics and other scientific concepts. By the 1960s Loewy’s streamlined designs for thousands of consumer goods — everything from toasters and refrigerators to automobiles and spacecraft — had radically changed the look of American life.

What began in the 20th century as a design concept has, in the 21st, become THE design concept. Technological innovation — the impact of which Andersen breezes past — has become the driving force behind aesthetic innovation. Design is how it works. Aerodynamics has paved the way for modern considerations like efficiency, performance, usability, sustainability, and more. But unlike fluctuating trends in men’s facial hair or collar size, technology moves in one direction. It does not vacillate, it iterates, improving on what came before, building incrementally. The biggest aesthetic distinctions, therefore, have become increasingly subtle.

Consider, for example, this optical illusion:

What, exactly, is the difference between the two things above? Rewind twenty years, and it’s already unlikely most people would have been able to tell the difference in any meaningful way. Go back even further in time, and these things become pretty much identical to everyone. Yet we, the inhabitants of 2012, would never, ever, mistake one for the other. The most minute, subtlest of details are huge universes of difference to us now. We have become obsessives, no longer just consumers but connoisseurs, fanatics with post-industrial palates altered by exposure to a higher resolution. And it’s not just about circuitry. In fashion, too, significant signifiers have become more subtle.

The New York Magazine writeup for Blue in Green, a Soho-based men’s lifestyle store, reads:

Fifteen hard-to-find, premium brands of jeans—most based in Japan, a country known for its quality denim—line the walls. Prices range from the low three figures all the way up to four figures for a pair by Kyuten, embedded with ground pearl and strips of rare vintage kimono. Warehouse’s Duckdigger jeans are sandblasted in Japan with grains shipped from Nevada and finished with mismatched vintage hardware and twenties-style suspender buttons. Most jeans are raw, so clients can produce their own fade, and the few that are pre-distressed are never airbrushed; free hemming is available in-house on a rare Union Special chain-stitcher from an original Levi’s factory.

(Sidenote: it’s not just jeans. Wool — probably not the next textile in line on the cool spectrum after denim — is catching up. Esquire apparently thinks wool is so interesting to its readers that it created an illustrated slide show about different variations of sheep.)

“Our massively scaled-up new style industry naturally seeks stability and predictability,” Andersen argues. “Rapid and radical shifts in taste make it more expensive to do business and can even threaten the existence of an enterprise.” But in fact, when it comes to fashion, quite the opposite is true. To keep us buying new clothes — and we do: according to the Daily Mail, women have four times as many clothes in their wardrobe today as they did in 1980, buying, and discarding, half their body weight in clothes per year — styles have to keep changing. Rapid and radical shifts in taste are the foundation of the fashion business; a phenomenon the industry exploits, not fears. And the churn rate has only accelerated. “Fast Fashion,” a term coined in the mid-2000s, means more frequent replacement of cheaper clothes that become outdated more quickly.

“The modern sensibility has been defined by brief stylistic shelf lives,” Andersen writes, “our minds trained to register the recent past as old-fashioned.” But what has truly become old-fashioned in the 21st century, whether we’ve realized it or not, is the idea of a style being able to define a decade at all. It’s as old-fashioned as a TV with a rotary dial or retail limitations dictated by brick and mortar. As Andersen himself writes, “For the first time, anyone anywhere with any arcane cultural taste can now indulge it easily and fully online, clicking themselves deep into whatever curious little niche (punk bossa nova, Nigerian noir cinema, pre-war Hummel figurines) they wish.” And primarily what we wish for, as Andersen sees it, is what’s come before. “Now that we have instant universal access to every old image and recorded sound, the future has arrived and it’s all about dreaming of the past.” To be fair, there is a deep nostalgic undercurrent to our pop culture, but to look at the decentralization of cultural distribution and see only “a cover version of something we’ve seen or heard before” is to miss the bigger picture of our present, and our future. The long tail has dismantled the kind of aesthetic uniformity that could have once come to represent a decade’s singular style. In a confederacy of niches there is no longer a media source mass enough to define and disseminate a unified look or sound.

As with technology, cultural evolution in the 21st century is iterative. Incremental changes, particularly ones that originate beneath the surface, may not be as obvious through the flickering Kodak carousel frames of decades, but they are no less profound. In his 2003 book, The Rise of the Creative Class: And How It’s Transforming Work, Leisure, Community, and Everyday Life, Richard Florida opens with a similar time travel scenario to Andersen’s:

Here’s a thought experiment. Take a typical man on the street from the year 1900 and drop him into the 1950s. Then take someone from the 1950s and move him Austin Powers-style into the present day. Who would experience the greater change?

On the basis of big, obvious technological changes alone, surely the 1900-to-1950s traveler would experience the greater shift, while the other might easily conclude that we’d spent the second half of the twentieth century doing little more than tweaking the great waves of the first half.

But the longer they stayed in their new homes, the more each time-traveler would become aware of subtler dimensions of change. Once the glare of technology had dimmed, each would begin to notice their respective society’s changed norms and values, and the ways in which everyday people live and work. And here the tables would be turned. In terms of adjusting to the social structures and the rhythms and patterns of daily life, our second time-traveler would be much more disoriented.

Someone from the early 1900s would find the social world of the 1950s remarkably similar to his own. If he worked in a factory, he might find much the same divisions of labor, the same hierarchical systems of control. If he worked in an office, he would be immersed in the same bureaucracy, the same climb up the corporate ladder. He would come to work at 8 or 9 each morning and leave promptly at 5, his life neatly segmented into compartments of home and work. He would wear a suit and tie. Most of his business associates would be white and male. Their values and office politics would hardly have changed. He would seldom see women in the workplace, except as secretaries, and almost never interact professionally with someone of another race. He would marry young, have children quickly thereafter, stay married to the same person and probably work for the same company for the rest of his life. He would join the clubs and civic groups befitting his socioeconomic class, observe the same social distinctions, and fully expect his children to do likewise. The tempo of his life would be structured by the values and norms of organizations. He would find himself living the life of the “company man” so aptly chronicled by writers from Sinclair Lewis and John Kenneth Galbraith to William Whyte and C. Wright Mills.

Our second time-traveler, however, would be quite unnerved by the dizzying social and cultural changes that had accumulated between the 1950s and today. At work he would find a new dress code, a new schedule, and new rules. He would see office workers dressed like folks relaxing on the weekend, in jeans and open-necked shirts, and be shocked to learn they occupy positions of authority. People at the office would seemingly come and go as they pleased. The younger ones might sport bizarre piercings and tattoos. Women and even nonwhites would be managers. Individuality and self-expression would be valued over conformity to organizational norms — and yet these people would seem strangely puritanical to this time-traveler. His ethnic jokes would fall embarrassingly flat. His smoking would get him banished to the parking lot, and his two-martini lunches would raise genuine concern. Attitudes and expressions he had never thought about would cause repeated offense. He would continually suffer the painful feeling of not knowing how to behave.

Out on the street, this time-traveler would see different ethnic groups in greater numbers than he ever could have imagined — Asian-, Indian-, and Latin-Americans and others — all mingling in ways he found strange and perhaps inappropriate. There would be mixed-race couples, and same-sex couples carrying the upbeat-sounding moniker “gay.” While some of these people would be acting in familiar ways — a woman shopping while pushing a stroller, an office worker having lunch at a counter — others, such as grown men clad in form-fitting gear whizzing by on high-tech bicycles, or women on strange new roller skates with their torsos covered only by “brassieres” — would appear to be engaged in alien activities.

People would seem to be always working and yet never working when they were supposed to. They would strike him as lazy and yet obsessed with exercise. They would seem career-conscious yet fickle — doesn’t anybody stay with the company more than three years? — and caring yet antisocial: What happened to the ladies’ clubs, Moose Lodges and bowling leagues? While the physical surroundings would be relatively familiar, the feel of the place would be bewilderingly different.

Thus, although the first time-traveler had to adjust to some drastic technological changes, it is the second who experiences the deeper, more pervasive transformation. It is the second who has been thrust into a time when lifestyles and worldviews are most assuredly changing — a time when the old order has broken down, when flux and uncertainty themselves seem to be part of the everyday norm.

It’s the end of the world as we’ve known it. And I feel fine.







Charlie Sheen Is Not Crazy

Image: Culture Wins

Charlie Sheen is not crazy. Or, at least, he’s not crazy the way you think he is. Charlie Sheen may finally be admitting that he’s lost his mind — exclusively to Life&Style, of all places, if we are to believe it — but that’s something that would have already been a long, long time in the making. What’s been happening over the past few weeks is not Charlie Sheen going crazy. Although it’s certainly easy to get confused. No doubt, Charlie Sheen wants you to think he’s crazy. After all, the boring recovering-addict Charlie Sheen Show — or the boring functioning-addict Charlie Sheen Show, depending on your preference — is much less interesting to watch than the “Crazy” one. And we are still watching….

In the course of this production it’s hard not to think about the film I’m Still Here, the cinéma vérité chronicle of Joaquin Phoenix’s “retirement from acting.”



For a year and a half, the twice Oscar-nominated Phoenix gained weight, stopped shaving, and tried to start a career as a rapper while his brother-in-law and fledgling filmmaker Casey Affleck came along for the ride to document this seeming descent into madness. Phoenix even famously came on Letterman in the course of I’m Still Here’s production, disheveled and incoherent — an appearance that, by the end, prompted Letterman to say he owed an apology to Farrah Fawcett, until then considered his most disastrous guest of all time.

Of course, in the end it turned out this was not just another overindulged celebrity losing his mind. Nor, even after it was revealed that Phoenix’s “retirement” and subsequent actions weren’t exactly the plot of a straight “documentary,” was it simply a hoax. Back on the Late Show a year and a half later, now clean-shaven, and charming as usual, Phoenix explained:

We wanted to do a film that explored celebrity, and explored the relationship between the media and the consumers and the celebrities themselves. We wanted something that would feel really authentic. I’d started watching a lot of reality shows and I was amazed that people believed them; that they called them, like, ‘reality.’ I thought the only reason why is because it’s billed as being ‘real’ and the people use their real names. But the acting is terrible. I thought I could handle that. Because you don’t have to be very good. You just use your name, and people think that it’s real.

For a year and a half, Joaquin Phoenix lived the life of a character who shared his name and history and circumstances, both in private scenes and in the public eye. What then, truly, is the difference between what’s “real” and what isn’t? What does “hoax” even mean in the age of “reality TV?” I’m Still Here, along with the context around it, is a philosophical exploration of these questions.

It’s a very similar postmodern paradox that is at the heart of Banksy’s Exit Through The Gift Shop:



“The world’s first street art disaster movie” tells the story of Thierry Guetta, an eccentric French-born shopkeeper living in L.A. whose compulsive need to record every waking moment, and a cousin who happens to be the street artist Space Invader, combined to lead Guetta to become the de facto documentarian of the street art scene, tagging along on late-night art missions with its luminaries, including L.A.’s Shepard Fairey and, ultimately, the elusive reigning godfather of street art himself, Banksy. About two thirds of the way through the movie, Guetta, who had never previously edited any of the mountains of footage he’d been obsessively recording, goes to the U.K. to present a first draft of his “street art documentary” to Banksy for feedback. Deflecting his true opinion of the unwatchable film, Banksy suggests that perhaps Guetta should consider becoming a street artist himself and sends him back to L.A. with the idea of putting on a small show. Banksy also requests Guetta send him his raw video footage so that he can re-edit it himself. And this is where the movie becomes something like an Andy Warhol adaptation of The Blair Witch Project.

A few months before Joaquin Phoenix would be announcing his acting “retirement,” Guetta’s artist persona, Mr. Brainwash, or MBW, had moved from plastering L.A. with his own likeness — an image of a guy holding a video camera — straight to mounting a massive “street art” show, called “Life Is Beautiful,” in a 15,000 square-foot venue. Seemingly overnight, Mr. Brainwash was being positioned as an up-and-comer with the oeuvre of a Shepard Fairey or a Banksy — by then both artists, as well as many other leading names in the street art world, had begun having their art on display inside galleries as opposed to on the exterior of walls — except unlike these artists with years, even decades of creative evolution and refinement, Guetta had no experience. He’d hired an army of sculptors and designers to manufacture the pieces for his show, ripped straight from bookmarks in art books — even the illustration of Guetta holding the camera had been created by someone else.

The day of the show the line to get in stretched for blocks. Four thousand people attended the opening. By the end of the day nearly a million dollars worth of Mr. Brainwash art had been sold.

The story, at face value, seems so preposterous that the question of whether it could truly be real has dogged the film, as well as created the suspense that’s made it even more of a phenomenon. Could an amateur who’d never actually made art himself succeed at pulling off a show that so blatantly counterfeited and so quickly eclipsed those of the art form’s recognized heavyweights? And would they really release a movie about it happening? Or is all of it — the movie, Life is Beautiful, Mr. Brainwash — simply Banksy’s greatest prank yet? Theories abound. The New York Times labeled it a harbinger of a new cinematic subgenre: The Prankumentary. “The whole thing, it’s clear now,” Fast Company insisted, “was an intricate prank being pulled on all of us by Banksy, who has never publicly revealed his identity, with Fairey as his accomplice.” Their conjecture about what really happened: “Banksy… convinced Guetta to pose as a budding graffiti artist wannabe so he and Fairey could ‘direct’ him in real life — manufacturing a brand new persona.” Yet when asked at the end of the film how he feels knowing that he is in part responsible for Mr. Brainwash, Shepard Fairey laughs ruefully, “I had the best intentions. But sometimes even when you have the best intentions things can go awry…. The phenomenon of Thierry becoming a street artist, and a lot of suckers buying into his show and him selling a lot of expensive art very quickly, anthropologically, sociologically, it’s a fascinating thing to observe. And maybe there’s some things to be learned from it.” For his part, Banksy, even as his voice is scrambled beyond recognition, conveys unmistakable melancholy as he says, “I used to encourage everyone I met to make art. I used to think that everyone should do it…. I don’t really do that so much anymore.”

“This brutal and revealing account of what happens when fame, money and vandalism collide” could just be an L.A. story simply too bizarre to have been made up, and just as easily, it could all be a fabricated fable about what happens to an artistic movement when it becomes commercialized. From “selling out” to “cashing in,” the concept is so mundane it’s a cliché, but Exit Through The Gift Shop’s treatment is primarily to emphasize the absurdity of the progression of events rather than to make any concrete statement about them. As Banksy’s art dealer says at the end of the film, “I think the joke is on… I don’t know who the joke is on, really. I don’t even know if there is a joke.”

Which brings us back to Charlie Sheen. Not that what Sheen’s doing is any kind of joke or “prank.” This is all very much for real for him. And it is also a very deliberate performance. How did we get here? On February 28, Charlie Sheen goes on Good Morning America, The Today Show, TMZ, Radar, Piers Morgan on CNN, 20/20 — basically, every celebrity interview news show he possibly can — and attracts a tsunami of flabbergasted attention for bein’ all ka-raaaazy. The next day he launches a social media empire.

Suddenly sounding not so crazy. Hell, as a digital strategist, I’d say it’s a pretty smart move. Within 25 hours and 17 minutes, Charlie Sheen had broken the world record, amassing 1 million Twitter followers faster than anyone else. Less than a week after his first tweet, he’d reached 2 million. “Another record shattered,” he tweeted, “We gobbled the soft target that was 2.0 mil, like a bag of troll-house zombie chow.” By then, he’d also launched a social media intern search:

which received over 74 THOUSAND! submissions in 5 days. Arguably no other celebrity has “gotten” the way social media works as fast. Even Conan had a slower uptake, though he’s undeniably provided a template for Sheen to work off of. (After getting canned from his TV job, Sheen did like MBW to Conan’s Banksy and announced he’s going on tour — the “Violent Torpedo of Truth/Defeat is Not An Option” Tour — just like Conan’s Banned From Television Tour last year in the wake of his own network debacle.) And, obviously, Sheen’s not doing it all on his own.

In Sheen’s 11-minute livestream episode titled “Torpedoes of Truth Part 2,” recorded on March 7th, 2011 — a week after his “old media” blitzkrieg — a curmudgeonly, borderline belligerent Sheen looks like he might not have showered for days, then rolled out of bed that morning, turned on his laptop, and started recording through the built-in camera above the screen. The video is terribly lit and grossly contrasted. But at 6 minutes, 40 seconds, when Sheen ducks “below the frame line,” the camera moves. This is a recording made to look like it’s being done through a shitty built-in computer camera, but when it moves to follow Sheen as he ducks, it’s suddenly clear there may be a camera person involved. And if there is someone behind the camera, there could just as easily have been a lighting guy, a makeup person. But no! “Make me look as crazy as possible” was clearly the direction here. By episode four it’d been announced that Sheen had officially been fired from his sitcom. The ante was upped. Suddenly Sheen, well-lit, made-up, looking as healthy as a marathoner — if not for the chain-smoking — in his sweat-wicking Nike shirt, was performing a soliloquy sounding like some misplaced Hunter S. Thompson diatribe. Clearly some writing talent may have been called in, if it hadn’t been already: consider that basically everything coming out of Charlie Sheen’s mouth becomes a meme. It’s been impossible to escape hearing someone say #winning (a hashtag in Charlie Sheen’s very first tweet) for weeks; then there’s #tigerblood, which is so meme-able it can’t even be summarized properly:


Tiger Blood Energy Potion, found in a hotel lobby at SXSW Interactive. Photo: Danny Newman

Right now 4chan, the primordial ooze that has spawned everything from lolcats to Rickrolling to Sad Keanu to every other Internet meme you’ve ever heard of, is looking at Charlie Sheen like Whoa. The last guy anywhere near this unstoppably memetastic was the Old Spice Guy–

and that guy was created by an AD AGENCY!

Something else you might notice — Charlie Sheen almost never swears. You have never heard him bleeped in any of the interviews he’s done on TV. There are no R-rated words on his Twitter stream. Every so often there’s some sprinkled in his livestreams, but for the most part The Charlie Sheen Show is all-ages. Where he could say “assholes” or “douchebags,” he says “silly fools” or “trolls.” These Playskool insults are unexpected, amusing, almost benign, yet nostalgically cruel. This is not the syntax of a man out of control.

“Where do these words come from, Charlie?” 20/20’s Andrea Canning asked.

“I don’t know,” he said, rolling his eyes. “They’re just words that sound cool together. Stuff just comes out and it’s entertaining and it’s fun and it sounds different from all the other garbage people are spewing, you know?”

Charlie Sheen doesn’t have Tourette’s. He is deliberately saying these things to entertain and be funny and unique. And he’s good at it. Bret Easton Ellis — the author of Less Than Zero and American Psycho, as well as Lunar Park, a haunted house story in which the main character is a writer named Bret Easton Ellis who’s lived the same history as his eponymous creator (“It was always the A booth. It was always the front seat of the roller coaster. It was never ‘Let’s not get the bottle of Cristal’ … It was the beginning of a time when it was almost as if the novel itself didn’t matter anymore — publishing a shiny booklike object was simply an excuse for parties and glamour.”) or is it, rather, the life he was expected to have been leading? (“What was I doing hanging out with gangbangers and diamond smugglers? What was I doing buying kilos? My apartment reeked of marijuana and freebase. One afternoon I woke up and realized I didn’t know how anything worked anymore. Which button turned the espresso machine on? Who was paying my mortgage? Where did the stars come from? After a while you learn that everything stops.”) — writing in an article titled “Notes on Charlie Sheen and the End of Empire,” calls Sheen “the most fascinating person wandering through the culture”:

You’re completely missing the point if you think the Charlie Sheen moment is really a story about drugs. Yeah, they play a part, but they aren’t at the core of what’s happening—or why this particular Sheen moment is so fascinating…. This privileged child of the media’s sprawling entertainment Empire has now become its most gifted ridiculer. Sheen has embraced post-Empire, making his bid to explain to all of us what celebrity now means. Whether you like it or not is beside the point. It’s where we are, babe. We’re learning something. Rock and roll. Deal with it.

Post-Empire isn’t just about admitting doing “illicit” things publicly and coming clean—it’s a (for now) radical attitude that says the Empire lie doesn’t exist anymore, you friggin’ Empire trolls. For my younger friends, it’s no longer rare; it’s now the norm. To Empire gatekeepers, Charlie Sheen seems dangerous and in need of help because he’s destroying (and confirming) illusions about the nature of celebrity.

It’s thrilling watching someone call out the solemnity of the celebrity interview, and Sheen is loudly calling it out as the sham it is. He’s raw and lucid and intense…. We’re not used to these kinds of interviews. It’s coming off almost as performance art and we’ve never seen anything like it—because he’s not apologizing. It’s an irresistible spectacle. We’ve never seen a celebrity more nakedly revealing.

It’s the contradiction we could never quite reconcile in I’m Still Here or Exit Through The Gift Shop; one we can accept in Lady Gaga because she’s not using her real name and we’re sort of OK with it when it’s just a “character.” Charlie Sheen is real and not real at once: a spectacle and a revelation. It’s meta-postmodernism. It’s existential performance art. Minutes before Charlie Sheen’s first livestream was set to start, the audio feed came on. You could hear Sheen rehearsing the rant he would perform that night, prompting the question: is this all an act? Of course it is! He’s an acTOR. From a family of actors, who’s spent his entire life performing. There’s no way he’d go on camera ever without rehearsing. Charlie Sheen’s whole life has been a performance, and this now is not so much different, just with a bigger audience and, as we say in the 21st century music business, cutting out the middleman. As far as Charlie Sheen knows, this is what real is. And as far as we know, that’s what it is, too.

Ellis writes:

If you can’t accept the fact that we’re at the height of an exhibitionistic display culture and that you’re going to be blindsided by TMZ (and humiliated by Harvey Levin, or Chelsea Handler—princess of post-Empire) while stumbling out of a club on Sunset Boulevard at 2 in the morning, then you should be a travel agent instead of a movie star. Being publicly mocked is part of the game, and you’re a fool if you don’t play along. This is why Sheen seems saner and funnier than any other celebrity right now. He also makes better jokes about his situation than most worried editorialists or late-night comedians.

What does shame mean anymore? my friends in their 20s ask. Why in the hell did your boyfriend post a song called “Suck My Ballz” on Facebook last night? my mom asks. But nothing yet compares to the transparency that Sheen has unleashed in the past two weeks—contempt about celebrity, his profession, the old Empire world order.

Ellis’s “Empire” is a reference to Gore Vidal’s definition of global American hegemony, which Ellis dates from 1945 until 2005: the era that defined the 20th century. Post-Empire is where we are now. For Ellis, Empire is the lie, the having to hide who you really are, the keeping up appearances; post-Empire, on the other hand, is what Ellis calls “aggressive transparency.” But his perspective has one flaw: for Ellis, both Empire and post-Empire are binary. It’s one or the other. It’s true or it’s a lie; it’s real or it’s counterfeit. The post-Empire reality, however, is not the end of the lie, it’s the end of the binary. Sure, “radical transparency” has become a 21st century marketing buzzword. Sure, Mark Zuckerberg believes that Privacy is Dead and has remade Facebook in that image. Sure, I wrote last year, what makes Iron Man the first 21st century superhero? His lack of alter ego; his unconflicted, absolute identity. But that all is only part of the Millennial story.

Social media researcher danah boyd writes:

There’s an assumption that teens don’t care about privacy but this is completely inaccurate. Teens care deeply about privacy, but their conceptualization of what this means may not make sense in a setting where privacy settings are a binary. What teens care about is the ability to control information as it flows and to have the information necessary to adjust to a situation when information flows too far or in unexpected ways.

Just because teens choose to share some content widely does not mean that they wish all content could be universally accessible. What they want is a sense of control.

I’d argue this is, in fact, true of all of us now in the post-Empire. Not just teens. “What Sheen has exemplified and has clarified,” writes Ellis, “is the moment in the culture when not caring what the public thinks about you or your personal life is what matters most—and what makes the public love you even more (if not exactly CBS or the creator of the show that has made you so wealthy).” Except that Charlie Sheen still very much DOES care. And so do all the rest of us in the 21st century. It’s there in every Facebook photo you’ve untagged yourself from. You had your reasons. It’s there in every location you pulled out your phone to check in at, and then decided not to. It’s there every time you hovered over, and then didn’t click, the “Like” button. As tech blogger Robert Scoble writes:

The other day I found myself over at Yelp.com clicking “like” on a bunch of Half Moon Bay restaurants. After a while I noticed that I was only clicking “like” on restaurants that were cool, hip, high end, or had extraordinary experiences.

That’s cool. I’m sure you’re doing the same thing.

But then I started noticing that…. What I was presenting to you wasn’t reality.

See, I like McDonalds and Subway. But I wasn’t clicking like on those. Why not?

Because we want to present ourselves to other people the way we would like to have other people perceive us as.

I’d rather be seen as someone who eats salad at Pasta Moon than someone who eats a Big Mac at McDonalds.

This is the problem with likes and other explicit sharing systems. We lie and we lie our asses off.

Not only do we still care what other people think about us, we now curate it more obsessively. Trent Reznor calls it “A hyper-real version of yourself.”

This is the hyper-real version of Charlie Sheen. It is a role that Charlie Sheen is performing. And it is also who he actually is. Because how could he not be? Whatever Charlie Sheen does, that is who he is. This is the only way he has to take control over the flow of his information. For a celebrity in particular, as Ellis points out, that control is virtually non-existent. So how did Charlie Sheen wrest it back? By outdoing TMZ and the news shows and the magazines at their own game. He is no longer just a commodity of the tabloid industrial complex. He is the creator and star of his own show, the Crazy Charlie Sheen Show, and all the press is simply promotion.

Then again, it could be something much simpler. At Coachella 2008, Prince, headlining, kept demanding over and over, “Say my name, Coachella! Say my name, Coachella! Say my name, Coachella!” And like some epic call-and-response, an ocean of 150,000 voices roared back: “Prince! Prince! Prince!” And I realized that if you’re Prince, there’s probably no way you can even get off anymore without 150,000 people screaming your name. Perhaps, if you’re Charlie Sheen, you can’t stay sober unless two million people are following your every move — just over two weeks after his first tweet, it’s now closing in on 3 million.

“We’ve come a long way in the last two weeks,” Ellis concludes. “Sheen is the new reality, bitch, and anyone who’s a hater can go back and hang out with the rest of the trolls in the graveyard of Empire.” Like I’m Still Here and Exit Through The Gift Shop, what Charlie Sheen is doing is part of a continuum exposing the now inherent unreliability of the markers we’d previously depended on to tell the difference between what’s real and what isn’t. In some ways it’s as basic as the shift from the 20th century to the 21st; from analog to digital, from binary to exponential complexity. What, truly, does reality mean when it’s photoshoppable? Or just another marketing campaign for some new movie? Not that reality doesn’t exist. Things are, out in the world; you can touch them. Earthquakes happen; nuclear reactors break; nations perch perilously on the verge of catastrophe. Reality exists, but it is no different from not-reality: from the inscrutably contradictory government statements about radiation levels to the fake nuclear fallout maps that spread like wildfire, reality and not-reality now exist on the same plane. It’s enough to make you go crazy. Unless you’re Charlie Sheen. In which case you’re not crazy. You simply are as your world is.







T.V. Killed The Movies’ Star

Image: Vanity Fair Mad Men shoot

In college, we film students had a certain sense of disdain and smug superiority towards our TV-major classmates. Miramax, along with the whole independent film movement it was spearheading, had just hit its apex while we’d been in high school, and the late ’90s / early 2000s saw the releases of such epics as The Matrix, American Beauty, Fight Club, Requiem For A Dream, and many, many more. Meanwhile, the most relevant cultural content TV had managed to produce at the time was shows like Seinfeld, Friends, and Survivor. I remember being simply dumbfounded that anyone would want to major in TV at all. I mean, like, what for? The big screen is where the REALLY cutting-edge, fascinating, intelligent, and just plain COOL stuff was at.

Was at.

Slowly, over the course of the decade, in sync with another major trend that has been gradually, and then suddenly, taking over our world, TV has changed. These days, there is a slew of phenomenal output coming off the small screen and, conversely, a big fat quagmire of mediocrity projected in theaters. TV is killing the movies.

In a recent Vanity Fair article on Mad Men, Bruce Handy offers this thumbnail history of Hollywood:

Once upon a time, the studios reigned supreme. They bulldozed geniuses and turned out dreck, but in applying Henry Ford discipline and efficiencies to filmmaking they also gave us The Lady Eve, Casablanca, and Singin’ in the Rain. By the 1960s, however, the factory system began to give way, power shifted to directors and stars, and a new generation of independent-minded auteurs crafted sometimes indulgent but often original and even brilliant films such as Bonnie and Clyde, Midnight Cowboy, Taxi Driver, and Apocalypse Now. Then, another turn: studios got the upper hand back, or learned to share it grudgingly with a handful of superstars and A-list directors. But without the old assembly-line rigor the result has too often been big, bloated dreck, like the films of Michael Bay, or the gaseous Oscar bait that bubbles up every fall—the worst of all movie worlds.

But, ah, television. Its great accomplishment over the past decade has been to give us the best of all movie worlds, to meld personal filmmaking, or series-making, with something like the craft and discipline, the crank-’em-out urgency, of the old studio system. I’m thinking first and foremost of The Sopranos, which debuted in 1999 and sadly departed in 2007. This strange and entertaining series, as individual a work as anything by Hitchcock or Scorsese, was the creation of David Chase, and it paved the way for The Wire, Deadwood, Rescue Me, Damages, and its successor as the best drama on television, the equally strange and entertaining Mad Men, which launch[ed] its third season on AMC August 16.

I’ve got my own theory, tho, and it goes something like this: digital technology saved television. Not that it meant to. It just happened by accident. See, the shows of the ’90s and before were, by and large, episodic. Things basically stayed the same from episode to episode. The characters didn’t really change much. The storyline didn’t really go anywhere unexpected, and if it did, it would always manage to resolve the issue and find its way back to the beginning by the end of each episode. Things like Ross and Rachel getting together or breaking up or getting back together were EVENTS, reserved for seasonal ratings sweeps.

The new shows we all watch and love, however, are not episodic; they are serial. They typically start with a “previously on” montage. Episodes build on one another in a series, relationships grow, change happens — or perhaps it doesn’t, and that’s exactly where the tension comes from — characters make life-altering decisions, or maybe we simply find out more about their back-stories, which lets us see their current predicament in a totally new light. Serial shows evolve. And up until this decade that used to scare the shit out of TV networks. Cuz that narrative evolution can quickly become confusing. Lost, as its name would suggest, is perhaps the extreme example of this kind of narrative disorientation. If you miss one episode, shit’s changed and you just have no idea what’s going on anymore, which is off-putting, and might make you more likely to switch the channel to something more familiar. Since greater audience retention means more commercial watchers and higher prices for ad slots, this sort of confusion-induced channel surfing is why TV execs generally wanted to avoid complicated serial content as much as possible.

And then digital technology came along. Technically, HBO was first, with its seminally serial Sopranos, as Handy mentioned, which it could get away with for the same reason it could get away with all its other controversial programming — on premium cable, shows aren’t at the mercy of advertisers. Nowadays, between Hulu, TiVo, and DVDs, not to mention all the torrent sites for downloading shows, if you’re so inclined, it’s virtually impossible NOT to keep up with a show you really dig, on whatever schedule you prefer. It is absolutely no overstatement to say that these new digital tools have not only had a profound impact on the actual content of television, they’ve helped release the latent art form in the medium itself.

As Handy writes:

At its core Mad Men is a moving and sometimes profound meditation on the deceptive allure of surface, and on the deeper mysteries of identity. The dialogue is almost invariably witty, but the silences, of which there are many, speak loudest: Mad Men is a series in which an episode’s most memorable scene can be a single shot of a woman at the end of her day, rubbing the sore shoulder where a bra strap has been digging in. There’s really nothing else like it on television.

There isn’t even anything else like it in the theaters! And this leads me to another change that the new technologies have enabled in television. Because of the new, truly serial format (unlike even shows like Buffy or The X-Files that came before, which were still a mix of episodic and serial episodes per season), the new TV series story arc has been extended exponentially. Every episode ends on a cliffhanger. Nothing is settled. The through-line isn’t just 45 minutes (the duration of a typical hour-long episode, allowing for commercials); it’s now a full season long.

Handy goes on:

I asked David Carbonara, the show’s composer, about a lovely piece of music he used to score a small but key scene in the second-season opener (Episode 201, by the production’s accounting), in which Don, intoxicated for once by his wife, watches a mink-clad Betty descend a hotel’s grand staircase as she arrives for a night out in the city. This was Carbonara’s answer, by e-mail: “It’s a piece written by Nikolai Rimsky-Korsakov called ‘Song of India’ from his opera Sadko. Tommy Dorsey had a hit with an up-tempo version in 1937. Matthew Weiner [Mad Men’s meticulous creator and executive producer] wanted a harp in the hotel lobby to be playing the song, then have the arrangement become larger for scoring Betty’s entrance.… But my favorite use of ‘Song of India,’ and sadly I don’t think anyone noticed, was in episode 211, ‘The Jet Set.’ This time it’s played as a jazz samba in yet another hotel bar as Don thinks he sees Betty! It’s played as source music with a bit of score overlaid on top hopefully calling us back to the previous hotel lobby in episode 201 [which had aired 11 weeks earlier in the series’ initial run], when they were very much in love. I admit it was a bit subtle, but maybe (hopefully!) it had an effect in the viewer’s subconscious.”

There’s just no way a 90-minute movie can compete with something like this. There’s simply no opportunity for this kind of subtlety and nuance and atmosphere in the timing. It’s incomparable. In fact, watching “The Jet Set” episode Carbonara mentions, at the very end, when the camera pulls back from Don’s naked arm outstretched over the back of the couch in a strange house in Palm Springs, I had a kind of epiphany about the show….


This shot is a direct mirror of the iconic Mad Men silhouette, from over Don’s other, shirt-clad arm, stretched over a couch in his Sterling Cooper office in New York….


With just this single, slow, meditative stroke the shot silently articulates everything you need to understand about the strangeness of this Californian mirrorland that our hero has found himself in, his own strangeness at being there, and how far removed and flipped around everything there is in contrast to his New York reality. Watching this almost subliminal storytelling layer that I’d previously known solely as an achievement of cinema, I suddenly realized that Mad Men had left TV show territory entirely. It had become almost mathematically perfect, a number multiplied by its reciprocal, always equaling 1. It had become a kind of poetry, where every single word and punctuation mark is critical to maintaining the meaning and integrity of the overall structure, which would otherwise collapse if even a single element were removed.

Sure, not every TV show is Mad Men, but there are more and more shows edging closer. Some of my personal favorites:

  • Sons of Anarchy: Hamlet, set in the world of a Central Coast Harley club. As in, “Something is rotten in the state of California.” I kid you not, the Shakespearean tragedy was a deliberate plot basis. And especially after last year’s Mongols bust, it’s an endlessly fascinating glimpse into a truly subversive culture that’s as much an alternate reality as the world of the Irish Traveller gypsies in the now sadly defunct The Riches.
  • True Blood: the grown-up antidote to the hormonal immaturity and teenybopper banality of Twilight’s vampires. Thank you, Alan Ball (writer of American Beauty, no less), for the sophistication and wit to portray immortality as an existential boredom. There is something absolutely hilarious about an ancient viking vampire complaining, “I texted you three times. Why didn’t you reply?” And a Civil War veteran vampire responding irritated, “Ah hate using the number keys to tah-ype.” Twilight couldn’t summon this much humor from its characters in a million years… literally.
  • Californication: If its tortured, satirical, manic celebration of hedonistic nihilism doesn’t feel familiar to you, you’ve probably never been alive in the 21st century… or lived in Los Angeles. Also, not since Buffy have I wished for occasion to use the quips and one-liners from a show more.
  • Weeds: The concept alone is fantastic, plus there’s the razor sharp commentary on race and class relations, but it’s the tight structure of the writing that takes it over the edge. With every episode the rule is: Nancy gets something big; Nancy has something bigger taken away. It’s a narcotically addictive formula.
  • I’d mention Lost, too, since people still seem to like it, I guess, and at one point I was among them, until everyone went BACK to the goddamn island last season (are you fucking kidding me?!) and the show became a narrative jerkoff. (For context: Mad Men = narrative sex).

Think about the last movie that you really loved. Was there even one this year? More than one?

Probably not. The economic downturn has screwed the movie industry. Studios’ profits have plummeted. DVD buying, which might have once helped salvage theatrical-release turds, is way down in North America, and in other markets is basically nonexistent due to piracy. With a lot less money coming in, and with production costs continuing to rise, studios are pouring more money into “branded entertainment”—movies based on franchises that have strong brand recognition and can, theoretically, provide a decent opening weekend, a la G.I. Joe. According to the LA Times, an adaptation of the board game Battleship is scheduled for release in July 2011, the same month as a third “Transformers” film. Studios have even recently announced the development of new movies based on Monopoly, Clue, and Candy Land. Meanwhile, as traditional movie stars are becoming less and less reliable for drawing an audience, major studios are producing far fewer adult dramas, and the independent film world is slowly collapsing under the weight of the recession as well. Last year alone saw the dissolution of three major independent film companies: Time Warner closed Warner Independent Pictures (Little Miss Sunshine, Good Night and Good Luck) and Picturehouse Entertainment (The Women, Mongol), and Viacom closed Paramount Vantage (No Country For Old Men, There Will Be Blood). Things have gotten so whack that Paramount has even had to delay the Martin Scorsese-Leonardo DiCaprio thriller Shutter Island from October to February of next year because it couldn’t afford the necessary marketing budget that kind of vehicle requires.

It’s no surprise, then, that so many movie actors are working on the small screen. Once considered a fatal oblivion for movie stars, TV these days features names like Alec Baldwin, Tim Roth, Laurence Fishburne, Ron Perlman, Anna Paquin, Minnie Driver, Eddie Izzard, Jonathan Rhys Meyers, and Kiefer Sutherland, and those are just off the top of my head; clearly, you’ve noticed this trend yourself. It’s pretty unmistakable. So this is where we find ourselves. Hulu is developing more of a brand online than the big broadcast networks that own shares of it, overtaking ABC, NBC, and Fox in web traffic for the first time in June. 1 in 3 households owns a DVR (digital video recorder) — 33%, in fact, up from 28% a year ago — adding significant numbers of time-shifted viewers to shows’ ratings: 36 shows now add 1 million or more viewers one to seven days after the original air date. And as movies have sunk to the new low of board-game franchise tie-ins, television has woken up out of its reality-TV coma and become the far more innovative, dynamic, and risk-taking medium.

Charlie Collier, president of AMC, quoted in the Vanity Fair article, describes Matthew Weiner’s vision for Mad Men in terms that could as easily be applied to the current state of the tube in general: “It’s not television; it’s a world.”









how not to use condoms

I know the Trojan “Evolve” Campaign has been going on for a while now, but just recently something occurred to me that I hadn’t quite realized about it before.

The campaign started out last June, with the premiere of a commercial featuring women being hit on by a bar full of anthropomorphized pigs. It’s only when one of the pigs finally shuffles off to the men’s room and purchases a condom that he is transformed into a hot guy, and returns to the girl he was chatting up to find that she’s now suddenly totally interested in him.

In addition to the ad, whose message at the end reads: “Evolve. Use a condom every time,” the campaign also includes a website, evolveoneevolveall.com, driven by celebrity and user-generated videos dealing with the subject of sexual health, the Trojan Evolve National Tour, a mobile, experiential campaign “Raising awareness and stimulating dialogue about America’s sexual health in towns and campuses across the country,” radio ads that deal with STDs as Christmas gifts (“How about Herpes? It’s the gift that keeps on giving.” / “Would you like Chlamydia wrapped?” / “No, I’ll give it to her unwrapped.”) and more. All of this, hinging on the word “Evolve.”

“Evolve is a wake-up call to change attitudes about using condoms and, on a larger scale, the way we think and talk about sexual health in this country,” said Jim Daniels, Trojan’s VP of marketing. As Andrew Adam Newman pointed out in the New York Times piece, “Pigs With Cellphones, but No Condoms,” the campaign is an evolution for Trojan itself:

While Mr. Daniels does not disparage the company’s double-entendre-heavy “Trojan Man” campaign from the 1990s or similar Trojan Tales Web site today, the tone of the company’s promotions is moving away from “Beavis and Butthead” and toward “Sex and the City.”

“The ‘Evolve’ ad does a nice job of being humorous, but it’s also a serious call to action,” Mr. Daniels said. “The pigs are a symbol of irresponsible sexual behavior, and are juxtaposed with the condom as a responsible symbol of respect for oneself and one’s partner.”

Newman suggests that “The perennial challenge for Trojan and its competitors is the perception that [condoms] are unpleasant to use.” But I think, for a company that, according to A. C. Nielsen Research, has 75 percent of the condom market (Durex is second with 15 percent, LifeStyles third with 9 percent), Trojan oughtta have really known better than that.

“Over the last few years conservative groups in President Bush’s support base have declared war on condoms,” wrote Nicholas D. Kristof, in an opinion piece, also in the New York Times:

I first noticed this campaign last year, when I began to get e-mails from evangelical Christians insisting that condoms have pores about 10 microns in diameter, while the AIDS virus measures only about 0.1 micron. This is junk science (electron microscopes haven’t found these pores), but the disinformation campaign turns out to be a far-reaching effort to discredit condoms, squelch any mention of them in schools and discourage their use abroad.

Then there are the radio spots in Texas: “Condoms will not protect people from many sexually transmitted diseases.”

A report by Human Rights Watch quotes a Texas school official as saying: “We don’t discuss condom use, except to say that condoms don’t work.”

Last month at an international conference in Bangkok, U.S. officials demanded the deletion of a recommendation for “consistent condom use” to fight AIDS and sexual diseases. So what does this administration stand for? Inconsistent condom use?

Kristof was posing this question back in 2003, while he could still add, “So far President Bush has not fully signed on to the campaign against condoms, but there are alarming signs that he is clambering on board.”

In the now almost six years since, the very subject of contraception has become as politicized as abortion, and the emphasis on condoms’ ineffectiveness has become a standard component of abstinence-only sex education. (You knew about that, right?) It’s even begun to affect mass media. In a written response to Trojan about why it would not air the pigs-with-cell-phones ad, Fox (which had aired prior Trojan ads) said, “Contraceptive advertising must stress health-related uses rather than the prevention of pregnancy.” CBS refused to air it, too, and didn’t even offer further comment. Meanwhile, as paid advertising for condoms is being turned away, in the past few months I’ve seen at least two TV shows where characters made a point of mentioning that condoms don’t work: Fringe, and Private Practice, a show about DOCTORS, for cryin’ out loud! (Clearly, “First do no harm” must not apply to the practice of TV medicine.)

As a teenager of the ’90s, I’ve never known a world where AIDS didn’t exist, and where condoms were anything but an unequivocal necessity for “safe sex” (also a ’90s-ism that seems to no longer be in use, replaced instead by the millennial “sexual health crisis”). Sure, no one was going around preaching that condoms are 100% fail-proof, but in the decade when Magic Johnson and Greg Louganis both came out as HIV-positive, I can’t imagine any TV program deliberately broadcasting (or being allowed to get away with it) the kind of message that says, “Condoms don’t work. So why bother using them at all?”

As of 2006, the birth rate among 15-to-19-year-olds in the United States has risen for the first time since 1991 (that was the year of Johnson’s announcement). While teenage sex rates have risen since 2001, condom use has dropped since 2003. In other words, more teenagers are having more sex, and using fewer and fewer condoms in the process. But then, Jamie Lynn Spears or Bristol Palin could have told you that.

And so it is we find ourselves in a situation where Church & Dwight—the consumer products company that owns Trojan—is taking on what should have been the responsibility of the Department of Health and Human Services. Teenage or not, the U.S. apparently has the highest rates of unintended pregnancy (three million per year) and sexually transmitted infections (19 million per year) of any Western nation. (What the fuck?!)

“Right now in the U.S. only one in four sex acts involves using a condom,” says Daniels. “Our goal is to dramatically increase use.” Then what in God’s name convinced the Kaplan Thaler Group, the New York advertising agency that created the “Evolve” campaign, that aligning condoms with evolution was the way to go about achieving this?

Cuz here’s the thing: The majority of Americans do not believe in evolution!

http://graphics8.nytimes.com/images/2007/06/18/business/media/18adcol.600.jpg

(CRAP!)

In fact, according to 2006 research in Science magazine, out of 33 European countries where people were asked to respond “true,” “false,” or “whuuuu?” to the statement “Human beings, as we know them, developed from earlier species of animals,” the only country that scored lower on belief in evolution than the US was Turkey. (Also what the fuck?!)

Disturbing as this unfortunate reality may be, this is the contemporary American landscape, and pushing Trojan as “Helping America evolve, one condom at a time” in the face of it seems ludicrous.

Hell, why not just call the campaign “Darwin’s theory of contraception,” while you’re at it?

The biggest threat to condoms is not the perception that they don’t feel good. It’s not even condom fatigue. The biggest threat to condoms is the Christian Right’s propaganda that they don’t work, and the government’s, and much of the media’s, wholehearted complicity. And it’s the same people waging a war on contraception who don’t like evolution either. I don’t know what ultimate impact the Evolve campaign is having (or not), but in my view, if, as Daniels says, Trojan’s focus is on growing the market beyond the–pardon the irony here–already converted, and getting more people to use condoms, a completely different slogan/campaign theme would be the way to go.
