The Last Exit To The Millennium

“Those of us who watched Kids as adolescents,” writes Caroline Rothstein in her Narrative.ly piece Legends Never Die, “growing up in an era before iPhones, Facebook, and Tiger Moms, had our minds blown from wherever we were watching–whether it was the Angelika Film Center on the Lower East Side or our parents’ Midwestern basements. We were captivated by the entirely unsupervised teens smoking blunts, drinking forties, hooking up, running amok and reckless through the New York City streets…. Two decades after [the] film turned Washington Square skaters into international celebrities, the kids from ‘Kids’ struggle with lost lives, distant friendships, and the fine art of growing up.”

If you came up in the ’90s, you remember Kids. But I’d hardly given it a backward glance in ages. Had it really been two decades? It seemed somehow inconceivable. The cast, none of them professional actors, all plucked from the very streets they skated on, had become fixed in my mind as eternal teenagers, immortalizing a hyperbolized — and yet, not entirely foreign — experience. Kids was grotesque and dirty and self-indulgent and unignorable, and so was high school. Which is where I, and my friends, were at the time. The movie had become internalized. I had entirely forgotten that this was where Chloe Sevigny and Rosario Dawson had come from. Like a rite of passage, it seemed to carry a kind of continuity, something everyone goes through. It seemed disconnected from any kind of evolving timeline.

And yet time had passed. Revisiting the lives of the cast 20 years later, Rothstein writes, “Justin Pierce, who played Casper, took his life in July 2000, the first of several tragedies for the kids. Harold, who played himself in the film and is best remembered for swinging his dick around in the pool scene—he was that kid who wasn’t afraid, who radiated a magnetic and infectious energy both on and off screen—is gone too. He died in February 2006 from a drug-induced heart attack.” Sevigny and Dawson have become successful actors. Others tied to the crew have gone on to lead the skate brand Zoo York, and to start a foundation that aims to “use skateboarding as a vehicle to provide inner-city youth with valuable life experiences that nurture individual creativity, resourcefulness and the development of life skills.” The most striking story for me, however, was what happened over the past 20 years to the movie’s most central character, New York City itself:

“I think that Kids is probably the last time you see New York City for what it was on film,” [says Jon “Jonny Boy” Abrahams]. “That is to me a seminal moment in New York history because right after that came the complete gentrification of Manhattan.”

Kids immortalizes a moment in New York City when worlds collided–“the end of lawless New York,” Eli [Morgan, co-founder of Zoo York] says–before skateboarding was hip, before Giuliani cleaned up, suited up, and wealthy-ed up Manhattan.

“I don’t think anyone else could have ever made that movie,” says Leo [Fitzpatrick, who played the main character, Telly]. “If you made that movie a year before or after it was made, it wouldn’t be the same movie.”

Kids’ low-budget grit and amateur acting gave it a strange ambivalence. It was neither fully fictional nor fully real. It blurred the line between the two in a way that it itself did not quite fully understand — it was the very, very beginning of “post-Empire,” when such ambiguities would become common — and neither did we. Detached from the confines of the real and the fictional, it had a sense of also being out of time. But it turns out it was in fact the opposite. Kids was a time capsule. As Jessica [Forsyth] says in the article: “It’s almost like Kids was the dying breath of the old New York.”

It’s a strange thing. One day you wake up and discover that culture has become history. In the end it wasn’t a dramatic disaster or radical new technology that changed the narrative in an instant. It was a transition that happened gradually. The place stands still, and time revolves around it, changing it the way wind changes the topography of dunes.

Just a few days after Rothstein’s piece appeared, I read these truly chilling words in The New York Times:

“The mean streets of the borough that rappers like the Notorious B.I.G. crowed about are now hipster havens, where cupcakes and organic kale rule.”

For current real estate purposes, the block where the Brooklyn rapper Notorious B.I.G., whose real name was Christopher Wallace, once sold crack is now well within the boundaries of swiftly gentrifying Clinton Hill, though it was at the edge of Bedford-Stuyvesant when he was growing up. Biggie, who was killed under still-mysterious circumstances in 1997, was just one of the many rappers to emerge from Brooklyn’s streets in the ’80s and ’90s. The then-mean streets gave birth to an explosion of hip hop: successful hardcore rappers, alternative hip-hop M.C.s, respected but obscure underground groups, and some — like KRS-One and Gang Starr — who were arguably all of the above. Among the artists who lived in or hung out in this now gentrified corner of the borough: not only Jay-Z, but also the Beastie Boys, Foxy Brown, Talib Kweli, Big Daddy Kane, Mos Def and Lil’ Kim.

For many, the word “Brooklyn” now evokes artisanal cheese rather than rap artists. The disconnect between brownstone Brooklyn’s past and present is jarring in the places where rappers grew up and boasted about surviving shootouts, but where cupcakes now reign. If you look hard enough, the rougher past might still be visible under the more recently applied gloss. And if you want to buy a piece of the action, Biggie’s childhood apartment, a three-bedroom walk-up, was recently listed by a division of Sotheby’s International Realty. Asking price: $725,000.

When we imagine the world of the future, it is invariably a world of science fiction. It’s always, “Here’s what Los Angeles might look like in seven years: swamped by a four-foot rise in sea level, California’s megalopolis of the future will be crisscrossed with a thousand miles of rail transportation. Abandoned freeways will function as waterslides while train passengers watch movies whiz by in a succession of horizontally synchronized digital screens. Foodies will imbibe 3-D-printed protein sculptures extruded by science-minded chefs.”

It’s always impersonal. The future, even one just seven years away, seems always inhabited entirely by future-people. It’s not a place where we actually imagine… ourselves. Who will we be when the music that speaks to us now becomes “Classic” (attention deficit break: “Elders react to Skrillex”); when the movies or TV shows or — let’s be real, it’s most likely going to be — web content that captures the spirit of this moment becomes a time capsule instead of a reflection? When once counter-cultural expressions — like skating, or hip hop — become mainstream? Who will we be when there is no longer a mainstream, or a counter-culture, for that matter? And who will the teenagers of this future be when the culture of their youth ages?

The past isn’t a foreign country. It’s our hometown. It’s the place we left, that has become immortalized in our memory the way it was back then. We return one day to discover new buildings have sprung up in empty lots, new people have moved in and displaced the original residents. Some from the old neighborhood didn’t make it out alive. The past has moved while we weren’t looking. It’s no longer where it was at all.

“In the ’80s and ’90s–as strange as it may seem to say this–we had such luxury of stability,” William Gibson, the science-fiction writer who popularized the word “cyberspace” and turned realist novelist in the 21st century, said in a 2007 interview. “Things weren’t changing quite so quickly in the ’80s and ’90s. And when things are changing too quickly you don’t have any place to stand from which to imagine a very elaborate future.”

Yet this week, it seems to me that the more mysterious our future becomes, the more the past turns into a moving target.

Then again, perhaps it always was.

“Strange memories on this nervous night in Las Vegas. Five years later? Six? It seems like a lifetime, or at least a Main Era—the kind of peak that never comes again. San Francisco in the middle sixties was a very special time and place to be a part of. Maybe it meant something. Maybe not, in the long run… but no explanation, no mix of words or music or memories can touch that sense of knowing that you were there and alive in that corner of time and the world. Whatever it meant.…

History is hard to know, because of all the hired bullshit, but even without being sure of “history” it seems entirely reasonable to think that every now and then the energy of a whole generation comes to a head in a long fine flash, for reasons that nobody really understands at the time—and which never explain, in retrospect, what actually happened.

There was madness in any direction, at any hour. You could strike sparks anywhere. There was a fantastic universal sense that whatever we were doing was right, that we were winning.… We had all the momentum; we were riding the crest of a high and beautiful wave.…

So now, you can go up on a steep hill in Las Vegas and look West, and with the right kind of eyes you can almost see the high-water mark—that place where the wave finally broke and rolled back.”

– Hunter S. Thompson

Map of New York City showing the remnants of the 6ft high water line from Hurricane Sandy. Crom Martial Training, Rockaway Beach.

 

    



It’s The End Of The World As We Know It… And I Feel Fine

According to the Mayan calendar — as translated by new-age hippies I used to know, and depicted by Roland Emmerich — the year 2012 is alleged to herald the apocalypse. Perhaps this collective unconscious sense of mass destruction is what’s driving the popularity of turn-of-the-millennium musings about the end of the world. In June 2008, Adbusters’ cover story was literally titled “Hipster: The Dead End of Western Civilization.” Three and a half years later, Vanity Fair’s first issue of 2012 asks, “You Say You Want a Devolution? From Fashion to Housewares, Are We in a Decades-Long Design Rut?” While these two publications could arguably not be further apart on the target audience spectrum, they’re singing the same doomsday tune. As Kurt Andersen writes in the Vanity Fair piece, “The past is a foreign country, but the recent past—the 00s, the 90s, even a lot of the 80s—looks almost identical to the present.” The last line of the article concludes, “I worry some days, this is the way that Western civilization declines, not with a bang but with a long, nostalgic whimper.” But has cultural evolution really come to a grinding halt in the 21st century, or are we simply looking in all the old places, not realizing it’s moved on?

In Adbusters, Douglas Haddow sets up the alleged apocalypse like so:

Ever since the Allies bombed the Axis into submission, Western civilization has had a succession of counter-culture movements that have energetically challenged the status quo. Each successive decade of the post-war era has seen it smash social standards, riot and fight to revolutionize every aspect of music, art, government and civil society. But after punk was plasticized and hip hop lost its impetus for social change, all of the formerly dominant streams of “counter-culture” have merged together. Now, one mutating, trans-Atlantic melting pot of styles, tastes and behavior has come to define the generally indefinable idea of the ‘Hipster.’

Echoing that sentiment in Vanity Fair, Andersen writes:

Think about it. Picture it. Rewind any other 20-year chunk of 20th-century time. There’s no chance you would mistake a photograph or movie of Americans or an American city from 1972—giant sideburns, collars, and bell-bottoms, leisure suits and cigarettes, AMC Javelins and Matadors and Gremlins alongside Dodge Demons, Swingers, Plymouth Dusters, and Scamps—with images from 1992. Time-travel back another 20 years, before rock ’n’ roll and the Pill and Vietnam, when both sexes wore hats and cars were big and bulbous with late-moderne fenders and fins—again, unmistakably different, 1952 from 1972. You can keep doing it and see that the characteristic surfaces and sounds of each historical moment are absolutely distinct from those of 20 years earlier or later: the clothes, the hair, the cars, the advertising—all of it. It’s even true of the 19th century: practically no respectable American man wore a beard before the 1850s, for instance, but beards were almost obligatory in the 1870s, and then disappeared again by 1900.

Writing about the Adbusters piece in 2008, I pointed to a central flaw in the premise: the emergence of what Chris Anderson, in his 2006 book of the same name, calls The Long Tail. Digital technology, Anderson writes, has ushered in “an evolution from an ‘Or’ era of hits or niches (mainstream culture vs. subcultures) to an ‘AND’ era.” In this new, rebalanced equation, “Mass culture will not fall, it will simply get less mass. And niche culture will get less obscure.” What Adbusters saw as the end of Western civilization was actually the end of mass culture; a transition to a confederacy of niches. So, if mass culture, as the construct we, and Adbusters, had known it to be was over, what was there to be “counter” to anymore? (While, more recently, Occupy Wall Street has thrown its hat into the ring, it’s not so much anti-mass culture as it is pro-redefining the concept: the 99%, through the movement’s message — let alone mathematics — is not the counterculture. It IS the culture.)

Unlike Haddow, Andersen doesn’t blame the purported cultural stagnation on any one group of perpetrators. Rather, the “decades-long design rut” has descended upon us all, he suggests, like an aesthetic recession, the result of some unregulated force originating in the 1960s and depreciating steadily until it simply collapsed, and none of us noticed until it was too late. “Look at people on the street and in malls,” Andersen writes. “Jeans and sneakers remain the standard uniform for all ages, as they were in 2002, 1992, and 1982. Since 1992, as the technological miracles and wonders have propagated and the political economy has transformed, the world has become radically and profoundly new.” And yet, “during these same 20 years, the appearance of the world (computers, TVs, telephones, and music players aside) has changed hardly at all, less than it did during any 20-year period for at least a century. This is the First Great Paradox of Contemporary Cultural History.”

Or is it?

In a 2003 New York Times article titled “The Guts of a New Machine,” the design prophet of the 21st century revealed his philosophy on the subject: “People think it’s this veneer,” said the late Steve Jobs, “that the designers are handed this box and told, ‘Make it look good!’ That’s not what we think design is. It’s not just what it looks like and feels like. Design is how it works.”

Think about it. Picture it. Those big, bulbous cars Andersen describes, with their late-moderne fenders and fins, so unmistakably different from 1952 to 1972: just how different were they, really, in how they worked? Not that much. In the 20th century you could pop open the hood of a car and, with some modicum of mechanics, know what it was you were looking at. Now, the guy in the wifebeater working on the Camaro in his garage is an anachronism. You’ll never see that guy leaning over the guts of a post-Transformers, 2012 Camaro. Let alone a hybrid or an electric vehicle. “With rare exceptions,” Andersen argues, “cars from the early 90s (and even the late 80s) don’t seem dated.” And yet, there’s no way anyone would confuse a Chevy Volt with anything GM was making 10 years ago, or a Toyota Prius with what was on the road in the early ’90s, or mistake voice-recognition capability, commonplace in a 2012 model, for anything but a science fiction conceit from an ’80s show starring David Hasselhoff. While it’s debatable whether exterior automotive styling has changed in the past 30 years (remember the Tercel? The station wagon? The Hummer? A time before the SUV?), it’s indisputable that the way a 2012 automobile works has changed.

For the majority of human history the style shifts between eras were pretty much entirely cosmetic. From the Greeks to the Romans, from the Elizabethans to the Victorians, what fluctuated most was the exterior. It wasn’t until the pace of technological innovation began to accelerate in the 20th century that design became concerned with what lay beneath the surface. In the 1930s, industrial designer Raymond Loewy forged a new design concept called Streamlining. One of the first and most widespread design concepts to draw its rationale from technology, Streamlining was characterized by stripping Art Deco, its flamboyant 1920s predecessor, of all nonessential ornamentation in favor of a smooth, pure-line concept of motion and speed. Under the austerity of the Depression era, the superficial flourishes of Art Deco came to seem fraudulent, falsely modern. Loewy’s vision of a modern world was minimalist, frictionless, developed from aerodynamics and other scientific concepts. By the 1960s Loewy’s streamlined designs for thousands of consumer goods — everything from toasters and refrigerators to automobiles and spacecraft — had radically changed the look of American life.

What began in the 20th century as a design concept has, in the 21st, become THE design concept. Technological innovation — the impact of which Andersen breezes past — has become the driving force behind aesthetic innovation. Design is how it works. Aerodynamics has paved the way for modern considerations like efficiency, performance, usability, sustainability, and more. But unlike fluctuating trends in men’s facial hair or collar size, technology moves in one direction. It does not vacillate; it iterates, improving on what came before, building incrementally. The biggest aesthetic distinctions, therefore, keep getting smaller.

Consider, for example, this optical illusion:

What, exactly, is the difference between the two things above? Rewind twenty years, and it’s already unlikely most people would have been able to really tell a difference in any meaningful way. Go back even further in time, and these things become pretty much identical to everyone. Yet we, the inhabitants of 2012, would never, ever, mistake one for the other. The most minute, subtlest of details are huge universes of difference to us now. We have become obsessives, no longer just consumers but connoisseurs, fanatics with post-industrial palates altered by exposure to a higher resolution. And it’s not just about circuitry. In fashion, too, significant signifiers have become more subtle.

The New York Magazine write-up for Blue in Green, a Soho-based men’s lifestyle store, reads:

Fifteen hard-to-find, premium brands of jeans—most based in Japan, a country known for its quality denim—line the walls. Prices range from the low three figures all the way up to four figures for a pair by Kyuten, embedded with ground pearl and strips of rare vintage kimono. Warehouse’s Duckdigger jeans are sandblasted in Japan with grains shipped from Nevada and finished with mismatched vintage hardware and twenties-style suspender buttons. Most jeans are raw, so clients can produce their own fade, and the few that are pre-distressed are never airbrushed; free hemming is available in-house on a rare Union Special chain-stitcher from an original Levi’s factory.

(Sidenote: it’s not just jeans. Wool — probably not the textile you’d expect to be next in line on the cool spectrum after denim — is catching up. Esquire apparently thinks wool is so interesting to its readers that it created an illustrated slide show about different variations of sheep.)

“Our massively scaled-up new style industry naturally seeks stability and predictability,” Andersen argues. “Rapid and radical shifts in taste make it more expensive to do business and can even threaten the existence of an enterprise.” But in fact, when it comes to fashion, quite the opposite is true. To keep us buying new clothes — and we do: according to the Daily Mail, women have four times as many clothes in their wardrobe today as they did in 1980, buying, and discarding, half their body weight in clothes per year — styles have to keep changing. Rapid and radical shifts in taste are the foundation of the fashion business; a phenomenon the industry exploits, not fears. And the churn rate has only accelerated. “Fast Fashion,” a term coined in the mid-2000s, means more frequent replacement of cheaper clothes that become outdated more quickly.

“The modern sensibility has been defined by brief stylistic shelf lives,” Andersen writes, “Our minds trained to register the recent past as old-fashioned.” But what has truly become old-fashioned in the 21st century, whether we’ve realized it or not, is the idea of a style being able to define a decade at all. It’s as old-fashioned as a TV with a rotary channel dial or retail limitations dictated by brick and mortar. As Andersen himself writes, “For the first time, anyone anywhere with any arcane cultural taste can now indulge it easily and fully online, clicking themselves deep into whatever curious little niche (punk bossa nova, Nigerian noir cinema, pre-war Hummel figurines) they wish.” And primarily what we wish for, as Andersen sees it, is what’s come before. “Now that we have instant universal access to every old image and recorded sound, the future has arrived and it’s all about dreaming of the past.” To be fair, there is a deep nostalgic undercurrent to our pop culture, but to look at the decentralization of cultural distribution and see only “a cover version of something we’ve seen or heard before” is to miss the bigger picture of our present, and our future. The long tail has dismantled the kind of aesthetic uniformity that could have once come to represent a decade’s singular style. In a confederacy of niches there is no longer a media source mass enough to define and disseminate a unified look or sound.

As with technology, cultural evolution in the 21st century is iterative. Incremental changes, particularly ones that originate beneath the surface, may not be as obvious through the flickering Kodak carousel frames of decades, but they are no less profound. In his 2003 book, The Rise of the Creative Class: And How It’s Transforming Work, Leisure, Community, and Everyday Life, Richard Florida opens with a time-travel scenario similar to Andersen’s:

Here’s a thought experiment. Take a typical man on the street from the year 1900 and drop him into the 1950s. Then take someone from the 1950s and move him Austin Powers-style into the present day. Who would experience the greater change?

On the basis of big, obvious technological changes alone, surely the 1900-to-1950s traveler would experience the greater shift, while the other might easily conclude that we’d spent the second half of the twentieth century doing little more than tweaking the great waves of the first half.

But the longer they stayed in their new homes, the more each time-traveler would become aware of subtler dimensions of change. Once the glare of technology had dimmed, each would begin to notice their respective society’s changed norms and values, and the ways in which everyday people live and work. And here the tables would be turned. In terms of adjusting to the social structures and the rhythms and patterns of daily life, our second time-traveler would be much more disoriented.

Someone from the early 1900s would find the social world of the 1950s remarkably similar to his own. If he worked in a factory, he might find much the same divisions of labor, the same hierarchical systems of control. If he worked in an office, he would be immersed in the same bureaucracy, the same climb up the corporate ladder. He would come to work at 8 or 9 each morning and leave promptly at 5, his life neatly segmented into compartments of home and work. He would wear a suit and tie. Most of his business associates would be white and male. Their values and office politics would hardly have changed. He would seldom see women in the work-place, except as secretaries, and almost never interact professionally with someone of another race. He would marry young, have children quickly thereafter, stay married to the same person and probably work for the same company for the rest of his life. He would join the clubs and civic groups befitting his socioeconomic class, observe the same social distinctions, and fully expect his children to do likewise. The tempo of his life would be structured by the values and norms of organizations. He would find himself living the life of the “company man” so aptly chronicled by writers from Sinclair Lewis and John Kenneth Galbraith to William Whyte and C. Wright Mills.

Our second time-traveler, however, would be quite unnerved by the dizzying social and cultural changes that had accumulated between the 1950s and today. At work he would find a new dress code, a new schedule, and new rules. He would see office workers dressed like folks relaxing on the weekend, in jeans and open-necked shirts, and be shocked to learn they occupy positions of authority. People at the office would seemingly come and go as they pleased. The younger ones might sport bizarre piercings and tattoos. Women and even nonwhites would be managers. Individuality and self-expression would be valued over conformity to organizational norms — and yet these people would seem strangely puritanical to this time-traveler. His ethnic jokes would fall embarrassingly flat. His smoking would get him banished to the parking lot, and his two-martini lunches would raise genuine concern. Attitudes and expressions he had never thought about would cause repeated offense. He would continually suffer the painful feeling of not knowing how to behave.

Out on the street, this time-traveler would see different ethnic groups in greater numbers than he ever could have imagined — Asian-, Indian-, and Latin-Americans and others — all mingling in ways he found strange and perhaps inappropriate. There would be mixed-race couples, and same-sex couples carrying the upbeat-sounding moniker “gay.” While some of these people would be acting in familiar ways — a woman shopping while pushing a stroller, an office worker having lunch at a counter — others, such as grown men clad in form-fitting gear whizzing by on high-tech bicycles, or women on strange new roller skates with their torsos covered only by “brassieres” — would appear to be engaged in alien activities.

People would seem to be always working and yet never working when they were supposed to. They would strike him as lazy and yet obsessed with exercise. They would seem career-conscious yet fickle — doesn’t anybody stay with the company more than three years? — and caring yet antisocial: What happened to the ladies’ clubs, Moose Lodges and bowling leagues? While the physical surroundings would be relatively familiar, the feel of the place would be bewilderingly different.

Thus, although the first time-traveler had to adjust to some drastic technological changes, it is the second who experiences the deeper, more pervasive transformation. It is the second who has been thrust into a time when lifestyles and worldviews are most assuredly changing — a time when the old order has broken down, when flux and uncertainty themselves seem to be part of the everyday norm.

It’s the end of the world as we’ve known it. And I feel fine.

    




The First 21st Century Vampires


A month before the premiere of True Blood’s third season earlier this summer I wrote a post about the first 21st century superhero. The new Iron Man, as reimagined by Jon Favreau and portrayed by Robert Downey Jr., had broken the mold constricting the superhero archetype since its inception back in the late 1930s, and in its place offered a vibrantly modern model for the character, reflecting the unique culture, ethos, and mores of the 21st century. True Blood, I’m realizing, is now doing the same for that other undying superhuman trope: the vampire.

Of course, the vampire has been undead for a lot longer. The earliest recorded vampire myth dates back to Babylonia, about 4,000 years ago, and over the millennia it has appeared in almost every culture. But let’s cut to the chase: 1922 was the year vampires broke ground in film (though, technically, they’d made a few cameos before then). It was the year F. W. Murnau’s “Nosferatu” came out.

Still from F. W. Murnau’s Nosferatu (1922).

Take a good look. That’s what a movie vampire used to be. A creature no teen girl, or anyone else for that matter, would want to see as a lead in a summer mystical romance franchise. In all the silent films that featured vampires there was always a clear and consistent view: here be monsters.

While this original archetype might have undergone a radical transformation over the past 80+ years of cinema — from grotesque monster to, ironically, heartthrob, a result of the only evolutionary force vampires are actually subject to: sexual selection, naturally — don’t be fooled. Just because Twilight’s Edward Cullen or the whatever-their-names-are characters of The Vampire Diaries happen to be getting panties in a twist at the moment, they are not in any way contemporary. Much has been made about the exceptionally “old-fashioned” gender roles in Twilight, but that analysis is basically missing the forest for one tree. Think about it: is there ANYTHING that happens in Twilight that could not have happened just as easily 50 years ago? You could turn Twilight into a 1950s period piece and basically NOTHING about the major plot points, dialogue, personalities, relationships, or motivations — of either the vampires OR humans in this saga — would need to change. This does not a 21st century story make. In fact, if you’re curious about exactly why Twilight is so popular, the mechanics of this process are actually quite timeless:

Twilight’s preternatural hotties aren’t so much throwbacks as they are completely out of time. The story could be happening in any age; its characters’ capacity to reflect some kind of cultural context is irrelevant, probably detrimental.

The predominant Millennial quality that grounds Iron Man in the 21st century, I wrote, is transparency. In his total openness about everything from his deepest secret to his fleeting impulses he is as “post-privacy” as Facebook would have us all become. To suggest that True Blood’s vampires are uniquely modern because they too, like Tony Stark, have revealed their secret identity to the world would be easy — it is, after all, the premise that the entire show is based on — but it wouldn’t be accurate. For Stark, radical transparency is a way of life. You never have to wonder what Tony Stark is thinking because it’s usually exactly what’s coming out of his mouth at any given moment. The vampires on True Blood are anything but transparent. Their secret truths and ulterior motives are consistently obscure. Tellingly, even Sookie Stackhouse, the show’s mind reader, can’t penetrate their thoughts. Despite a superficial simulation, transparency is not really a quality that connects True Blood’s vampires to the modern age. But you know what does?

Recycling.

These vampires are environmentally conscious! Hey, it’s the 21st century; caring about the environment is hot! In fact, in the wake of the BP Oil Spill disaster which has affected all the Gulf states — chief among them, Louisiana, True Blood’s setting — there is a subtly startling undercurrent of environmentalism running through this season’s subplot. At one point, Russell Edgington, the 3,000-year-old vampire King of Mississippi, a new character introduced this season, rhapsodizes, “I mean, do you remember how the air used to smell? How humans used to smell? How they used to taste?” Earlier, the vampire Queen of Louisiana describes a rare delicacy: “A Latvian boy. Has to be tasted to be believed. Not polluted like most humans. Tastes exactly the way they used to taste before the industrial revolution fucked everything to hell.” When Russell asks rhetorically, “What other creature actively destroys its own habitat?” one imagines these vampires didn’t need to see An Inconvenient Truth because they’ve lived it. They may be blood-sucking fiends but destroying the planet is below even their standards.

Nevertheless, given the consumer culture they’ve lived to find themselves in, they’re not above shopping at the mall. (Looking good is, after all, a vampire priority.)


No doubt, there’ll be some anecdote about a vampire shopping online eventually. Most likely Eric will get there before Bill, I’m assuming, based on this classic exchange from season 1:

Eric: “I sent you three texts, why didn’t you reply?”
Bill: “I hate using the number keys to type.”

In fact, while Bill might be True Blood’s most conservative vampire (how postmodern!) — his education on how to be a vampire for the 17-year-old girl he’s just been forced to turn into one is about as awkward and evasive as the birds-and-the-bees talk from a religious dad — Eric is, arguably, its most progressive. That is, he has no fear of progress. Eric might be 1,000 years old but he’s as naturally at ease with his tech gadgets as any “digital native.” So far, he’s the only vampire I’ve seen use a Bluetooth device. Ever.


As the proprietor of a popular vampire bar called Fangtasia, Eric clearly recognized “The Great Revelation” — as the vampires call their coming out to the world — as a great business opportunity. Entrepreneurship is an unexpected quality for a vampire in general — I mean, why bother with such pedestrian concerns when you’re immortal, right? On the other hand, what else would you do with an eternity of nights? Might as well launch a nightlife startup. According to the Wall Street Journal, the Great Recession, which began in full force around the time True Blood first got on the air, is churning out ever more entrepreneurs. Entrepreneur.com reports that 8.7% of job seekers gained employment by starting their own businesses in the second quarter of 2009, and it expects to see even more people starting their own businesses in 2010. So it’s no surprise that 21st century vampires would be business-minded. Upon visiting Fangtasia, Russell, himself a semi-silent owner of a werewolf bar in Mississippi called Lou Pines, even tells Eric, “We must talk of franchising.”

If being an entrepreneur isn’t your thing, there’s always the royal route: seizing assets from your subjects. In the vampire Queen’s case, that asset is vampire blood, which she then has other vampires move as a black-market narcotic. Since selling their blood is a high crime among vampires, it’s initially unclear why the Queen would be doing this. What inscrutable and ominous vampiric motives could she have? By season 3 it’s revealed that the Queen needs the money to pay off the IRS. For vampires in the 21st century, death might not be certain, but taxes are. Indeed, True Blood’s portrayal of vampire culture is more of a bureaucracy than any other cinematic depiction. After a religious fanatic suicide bomber self-detonates at a party in a vampire lair, killing a number of humans and vampires in attendance, there are, literally, forms the lair’s owner has to fill out in this situation — a sequence that encapsulates the equally bizarre extremes of both the terrorism and the banality of our age.

While just last Wednesday U.S. District Judge Vaughn Walker ruled that California’s Proposition 8 initiative, which denies marriage rights to same-sex couples, was unconstitutional, on True Blood, same-sex couple Russell and Talbot have been married for 700 years. Homoerotica is by no means anything new in vampire lore, but gay marriage?? There’s a concept that barely existed in the public discourse before the 21st century. And Russell and Talbot’s relationship is exactly what you’d expect from a couple that’s been married for 7 centuries — anything but erotic. A particularly noticeable departure for the otherwise seriously aggro-sexual HBO series. Of course, the new phenomenon of marriage between vampire and human — which, though legal in the world of True Blood, is still highly controversial — has, from the show’s beginnings, served as a running metaphor for “marriage equality.” Alan Ball, the creator of True Blood, as well as Six Feet Under, and the Oscar-winning screenwriter of American Beauty, is not only someone who clearly understands a thing or two about the modern existential condition, he is also an openly gay man. No surprise, then, that True Blood’s very opening credits sequence weekly drives home a starkly unfantastical image that connects vampires to that other minority fighting religious opposition for equal rights in the 21st century.

The “God Hates Fangs” church sign from the show’s opening credits.

“Alternative lifestyle,” an often-used euphemism for homosexuality, is actually a perfect way to describe True Blood’s approach to vampirism. Even the show’s brilliantly integrated marketing campaigns have sought to bring True Blood’s fictional world off the screen and into reality by treating vampires as an increasingly visible minority with their own lifestyle brands and targeted advertising:

Faux vampire-targeted lifestyle ads: Monster, Mini, Harley-Davidson, and Ecko.

True Blood’s vampires even blog. Well, technically, it’s only Jessica, with her http://babyvamp-jessica.com blog, but as a 17-year-old who just became undead last year she’s the only Gen-Y vampire on the show, so obviously she’d be the one blogging — check out the awesomely pointless first few entries — 1, 2, 3 — this directionless experimentation with a new “toy” is exactly how a teenager would start a blog. (Vampire diaries?? Who the hell keeps a “diary” anymore in the age of social media? Sheesh.)

Overall, there is a deep, underlying theme about progress coursing through True Blood. “It’s vampires like you, who’ve been holding the rest of us back for centuries,” sneers Russell before destroying a Spanish Inquisition-era vampire Magister. It’s the vampires most hung up on the past who are some of the show’s craziest messes. The psychotic vampire Queen, who’s stuck in some perpetual 1940s costume drama, has just been stripped of power; Lorena, whose inability to get over her past with Bill becomes her destruction; Eric’s newly revealed 1,000-year-old revenge obsession over the murder of his father will no doubt promptly lead him into some kind of trouble this season. Godric, Eric’s maker, even destroyed himself in part because after 2,000 years he could no longer bear that vampires had not progressed; that he hadn’t. Unlike the atemporal caricatures of the other franchises, True Blood’s vampires offer a uniquely compelling commentary on our rapidly changing present through their own, archly extrahuman, relationship to it. We are living in a time when change, whether we like it or not, is coming at us so fast and furious we can barely comprehend it — speaking on a panel at Techonomy last week, Google CEO Eric Schmidt said we now create 5 exabytes of data every two days, an amount equal to all the information created from the dawn of civilization through 2003. Who can really understand whatever the hell that even means? (Some back-of-the-napkin math below takes a stab.) True Blood’s vampires are at once representations of cultural change within the narrative of the show, and, likewise, must themselves confront a new millennium’s progress. Some adapt better than others. Some have more sinister interpretations of where progress should lead, but they, like the rest of us in the 21st century, either accept change, or deny it at their own peril.
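For a rough sense of scale, here is that back-of-the-napkin conversion of Schmidt’s figure — a sketch that takes the 5-exabytes-per-two-days number at face value and uses the 4.7 GB capacity of a single-layer DVD as an arbitrary yardstick:

```python
# Back-of-the-napkin scale check for "5 exabytes every two days."
# Assumptions: Schmidt's figure taken at face value; a single-layer
# DVD holds 4.7 GB; decimal (SI) units throughout.

EXABYTE = 10**18                     # bytes
DVD_BYTES = 4.7 * 10**9              # bytes on a single-layer DVD
TWO_DAYS_IN_SECONDS = 2 * 24 * 3600  # 172,800 seconds

bytes_per_second = 5 * EXABYTE / TWO_DAYS_IN_SECONDS
dvds_per_second = bytes_per_second / DVD_BYTES

print(f"~{bytes_per_second / 10**12:.0f} TB of new data per second")  # ~29 TB/s
print(f"~{dvds_per_second:,.0f} DVDs' worth of data per second")      # ~6,156 DVDs/s
```

In other words: something like six thousand DVDs burned every second, around the clock. Whatever the hell that even means.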

    




What The F**k Is Social Media NOW?


For the past two years, Espresso has taken a stab at answering a simple, compelling question: “What The F**k Is Social Media?” The answer has turned into a series of presentations that have been viewed over 750,000 times, translated into about 10 languages (including Russian, so I’ve finally been able to explain to my parents what it is I “do”), and proclaimed “a social media hit for its wit and its very convincing case for the raw power of social media,” by Mashable.

This year, I’m proud to say I helped research and cowrite the third installment in Espresso’s “blockbuster summer franchise.” Check it out!

If you’d prefer a non-“parental advisory” rendition, the “radio version” is here.

    




the integration is the message

Have you seen these billboards?

They’re all over the place:

Been wondering what the hell that’s all about, maybe?

Well, answer #1 goes like this:

Hope people have been seeing the billboards that I have put up around town. I think its important everyone knows how much Sarah Marshall SUCKS! How she does look fat in those jeans! How my mom never liked her! How over her I am!

So, I used the money that I spent on her engagement ring to buy every available billboard around town. (That’s right Sarah I was going to propose to you. I was just waiting for the right time. I guess that time is never O’clock in the month of Nev-ruary).

Sarah, I really hope you are un-happy for the rest of your life – that you understand how totally over you I am.

That said, you should call me if you want to talk, I can have these things taken down.

Answer #2 goes like this:

While driving home I saw a billboard that read “You Suck Sarah Marshall”. At the bottom of the message I saw the URL www.ihatesarahmarshall.com so when I got home I jumped on my computer and checked out the website. [It’s] a blog that is currently being written by a loved obsessed 26 year old guy who is YouTubing videos about how infatuated he is with his hot TV star girlfriend.

Well, as it turns out this website is the launch of a new marketing campaign for a movie “Forgetting Sarah Marshall“. I have no idea if this movie is any good although it is brought to us by the guys who gave us the 40 Year Old Virgin. On a quick side note I can not recall seeing the R rated warning on the billboard but if I had I would have known right away it was a movie.

The point of this post is to point out the way this movie is being marketed. They are utilizing a combination of vague yet somewhat shocking billboard ads to drive people to a Google Blog thats incorporating YouTube videos as a way to create buzz. It should be interesting to see how it works out.

And answer #3 goes like this:

At OMMA a few weeks ago the theme was “Welcome to the Machine.” All the panels and presentations were framed around the question: How to prepare for the kind of dubious advertising that would be in store in the “Machine”-mediated future? (At least that’s what I think the theme was supposed to mean.)

The model for creating advertising has, in general, been pretty conglomerative. The media department buys the adspace, the creative department puts stuff in the adspace, the “new media” department does… who knows what, and the whole process is as compartmentalized as an assembly line. You know, it’s funny. There’s now hypersonic sound technology, which can be used to literally beam audio ads DIRECTLY at individuals in its path, yet we still insist on referring to the internet as “new media.” And that kind of segregated perspective may be part of the problem.

In strict media-buy terms, all that’s going on in the IHSM campaign is a grip of outdoor and a domain name. You could even say that ihatesarahmarshall.com is a kind of “microsite,” I suppose, or maybe an “adverblog,” but are any of those elements individually responsible for the effectiveness of the campaign? While there’s certainly no shortage of ads out there that make a play on our curiosity, the IHSM billboards are the first that immediately struck me as possessing a deliberate, blatant, “What the hell is that about? Oh, I’ll just check it out on my iPhone,” quality.

There are now more and more people carrying the internet around in their pocket. What does that mean in terms of how we approach Mobile, Online, Experiential, Outdoor, or Out-of-home media–all together? Then multiply all of that by the coefficient of search.

From Boing Boing:

Cabel Maxfield Sasser recently went to Japan and noticed an interesting trend in advertising there: search boxes have replaced URLs.

Within minutes of riding on the first trains in Japan, I notice a significant change in advertising, from train to television. The trend? No more printed URL’s. The replacement? Search boxes! With recommended search terms!

An ARG–which stands for Alternate Reality Game–is defined as “an interactive narrative that uses the real world as a platform, often involving multiple media and game elements, to tell a story that may be affected by participants’ ideas or actions.” That definition could also serve as both a philosophical description of marketing in general and a more advanced version of what the I Hate Sarah Marshall campaign has started to touch on in a very basic, accessible way. The opportunity is now there to create advertising that works not by managing to take our attention hostage for an instant, but because it’s able to move between media the same way that our attention does.

“Integration” may be getting primed to become the next “viral” when it comes to overabused industry buzzwords, but it’s more than just a trendy new widget. The next phase is not about defeating some monolithic “Machine.” It’s about figuring out: How do we create messages that cater to the way technology lets us interact with all different media? Meanwhile, the paint-by-numbers, assembly-line approach is still trying to figure out which department’s responsibility it is to come up with the answer.

    


