Violate Me

Here is what I can tell you. When I was in New York a month ago and one night someone suggested we go to an MTV party, the first thought I had was — wait, MTV still exists?

But I guess it does, because this week I’ve spent a lot of time talking about MTV. Well, not really so much MTV as the MTV Video Music Awards. Well, not even that, so much as Miley Cyrus’s performance. Yeah, I’ve spent a lot of time talking about Miley Cyrus’s performance at the VMAs. And so has the rest of America. Not only was a story about her performance the main event on the CNN homepage the next day, I then saw The Onion’s fictional op-ed, ostensibly written by the managing editor of CNN.com, with the headline, “Let Me Explain Why Miley Cyrus’ VMA Performance Was Our Top Story This Morning” (CNN spoiler alert: ad revenue), retweeted in my feed no fewer than nine times in a matter of hours (The Onion spoiler alert: ad revenue).

For a culture that has become desensitized to multi-million dollar celebrity media empires built off the backs of sex tapes, something about Cyrus’s performance nevertheless managed to strike a nerve.

Here’s what we saw:

[Video: Miley Cyrus performing “We Can’t Stop” and “Blurred Lines” at the 2013 VMAs]

Afterwards, I don’t think any of us were quite sure exactly what had just happened to us.

It wasn’t just the raunchiness or the shock value. This is the VMAs, after all, where Madonna kicked things off 30 years ago by dry humping the stage in a punk wedding dress; where Britney sang “I’m a Slave 4 U” while dancing in a green version of Cyrus’s flesh-toned two-piece, with a live python draped around her body, and later where Madonna and Britney and Christina all made out, and then after that, where Lady Gaga hanged herself. The controversial VMA performance is now pretty much a traditional rite of passage in the transition from Disney child star into adult entertainer.

Wait… what?

Anyway, we expect this. We’re practically inured to it at this point. But this show, Cyrus’s show, got under our skin. And not in, like, a good way.

“It seems everyone hated whatever it was Miley Cyrus was doing at last night’s VMAs,” Neetzan Zimmerman wrote on Gawker.

Whatever it was she was doing… we couldn’t even be sure. The next morning we woke up in turns “stunned,” “shocked,” “outraged,” outraged by the outrage. From the moment Cyrus first stuck out her tongue, things felt weird. We’re so used to performers adhering to a strict, media-trained code of conduct — gliding through precise sequences of polished, camera-ready choreography. You want this to wind up being the image that follows you around the internet tomorrow? we thought to ourselves, watching Cyrus gag.

Little did we know.

Then the performance began in earnest, Cyrus singing and dancing to her summer jam, “We Can’t Stop,” and we tried to relax. But 90 seconds in, as Jody Rosen writes on the Vulture blog, “pausing to spank and simulate analingus upon the ass of a thickly set African-American backup dancer, her act tipped over into what we may as well just call racism: a minstrel show routine whose ghoulishness was heightened by Cyrus’s madcap charisma.”

Awkward.

And that was all before Robin Thicke got onstage and Cyrus snapped out of her teddy-bear teddy, down to a nude, vinyl bikini, to duet on Thicke’s own controversial summer hit, “Blurred Lines,” and the REALLY uncomfortable shit happened. The most disconcerting thing about their performance was Thicke’s consistent lack of… engagement. While Cyrus twerked all over his body, Thicke seemed barely aware she was there. The New York Times described Cyrus’s behavior as a “molesting” of Robin Thicke. Behind his shades you couldn’t be sure whether he was even making eye contact. Of course, what Thicke was doing was reenacting the “Blurred Lines” video. Directed by Diane Martel, who’s also responsible for the video for “We Can’t Stop,” it features basically naked women dancing next to, strutting past, facing away from, and engaging in a host of other activities that in general involve pretty much anything except actually acknowledging the presence of Robin Thicke. Or of T.I. or Pharrell Williams. The non-interactions between the fully dressed men in the video and the naked women seem so unaligned and asynchronous and non-sequitured they might as well be Snapchatting them in. “I directed the girls to look into the camera,” Martel explained on Grantland. “This is very intentional and they do it most of the time; they are in the power position. I wanted to deal with the misogynist, funny lyrics in a way where the girls were going to overpower the men. Look at Emily Ratajkowski’s performance; it’s very, very funny and subtly ridiculing. I find [the video] meta and playful.”

Whether the end result really succeeds in its intention is debatable (“Is meta-nudity a thing? Is there such thing as ‘ironic objectification?'” Callie Beusman asks on Jezebel), but this conceit at least makes sense in the context of a music video — and, by the way, subconsciously speaks to all of us and our modern experience of hyper-mediated, asynchronous connection. But you know where it doesn’t actually work? Live, on stage, as a visual to support a 20-year-old former child star’s transformation into a woman claiming her sexuality.

“Performing near-nude on the VMA stage 10 years earlier,” Daniel D’Addario writes on Salon.com, “Christina Aguilera was singing an ode to her own empowerment and desire to get sexual satisfaction on her own terms. Last night, Miley was singing a song about how good Robin Thicke is at sex.” And in this context, Thicke’s lack of engagement in the proceedings made Cyrus’s relentless hypersexualization look desperate, or worse yet, depraved. At first Cyrus came across like that girl you knew in college, drunk at a party, looking to fuck for validation. If you happened to stop to factor in the 16-year age difference between Thicke and Cyrus, a whole other kind of psychological issue could, conceivably, have seemed to be spilling itself out all over MTV. But the real cringe-worthy element of the experience was that, in the absence of active participation — and its implicit consent — from anyone sharing the stage with her, Cyrus’s aggro-sexual zeal very quickly began to look kinda… um… predatory.

In one singular moment, Cyrus appeared to us as victim and predator. The violated, and the violator. No wonder we weren’t sure what we were even looking at. Cognitive dissonance, haaaaaaaay! Miley Cyrus had roofied us all. You could understand why, the next morning, MSNBC’s Mika Brzezinski would call her “disturbed.”

Perhaps the problem is that “no one has apparently said ‘no’ [to Cyrus] for the last six months,” Jon Caramanica suggested in The New York Times.

But it sure did make for some great GIFs tho, amirite?!

From its very first steps, Cyrus’s performance felt, unmistakably, like watching a GIF happen in real time. At The Atlantic, Nolan Feeney called this “the most GIFable award show ever,” and, indeed, Cyrus’s performance felt like the first one truly made for the age of the Internet. The act was speaking the native tongue — stuck all the way out — of the digital age, its direct appeal to meme culture as blatant and aggressive as the display of sexuality. All the performances before it had been made for TV. This show changed that. The source material and its inevitable meme-ification appeared to be happening simultaneously. The Internet was inherently integrated within the performance. It was no longer a “second” screen; it was the same damn screen. If you go to watch the performance now on MTV.com, a bright pink button, set in stark relief against the site’s black background, blares, “GIF THIS!”

You want this to wind up being the image that follows you around the internet tomorrow? 

Yes. That was the whole point.

It’s our party we can do what we want
It’s our party we can say what we want
It’s our party we can love who we want
We can kiss who we want
We can sing what we want
– “We Can’t Stop”

Six years before Cyrus was even born, a trio of dudes demanded you had to fight for your right to party. But that’s not what “We Can’t Stop” is about. This song is a rallying cry for the right to be your own person. Something the human collateral of the Disney industrial complex, and the daughter of a Hollywood dad, would know something about, no doubt. (“It’s my mouth; I can say what I want to.”) But it’s also something that any adolescent can relate to, especially now.

“It’s like a giant, fucked-up selfie,” Martel said, explaining the concept behind the “We Can’t Stop” video, on RollingStone.com. “She’s absolutely taking the piss out of being in a pop video.” Even if you haven’t had to shoulder the weight of a multi-million dollar entertainment franchise since you were a child, everyone growing up now is saddled with the responsibility of managing their mediated identities. So how do you rebel against that responsibility? How do you subvert the expectation to maintain your put-together, meticulously edited persona? Maybe you have a video of yourself doing a Salvia bong hit at a house party on your 18th birthday end up on TMZ. Maybe you fuck your image up. You don’t try to look good. You grimace and stick your tongue out and take a photo and post that fucked-up selfie for the world to see.

Because if you don’t do it on your terms, the Internet meme hive force will do it for you. Here’s a pic that made the Internet meme rounds in the wake of Beyoncé’s Super Bowl performance earlier this year:

[Photo: Beyoncé mid-performance at Super Bowl XLVII, Mercedes-Benz Superdome]

And here’s Cyrus fucking the shot up on purpose, before you could do it to her:

[Photo: Cyrus deliberately mid-grimace on the VMA stage]

If you think Cyrus was trying to look good for you, if you think that no one was telling her “no” as she was putting the VMA performance together, that she herself wasn’t scrutinizing each frame of rehearsal video, and keenly understanding just how wrong it all looked, you’re completely missing the point.

We live in an age of violation. From News of the World hacking the cell phones of celebrities and bombing victims, to PRISM hacking everyone, everything, all the time. From doxxing to TMZ, from Wikileaks to Kiki Kannibal to Star Wars Kid to so many victims of online harassment driven to suicide, to Diana dying in a car crash in a French tunnel while being chased by paparazzi, to “Sad Keanu.”

The meme hive force is the digestive system of our networked world, capable of gleefully devouring its victims — or at least its objects — alive. Cyrus doing it to herself is “disturbed,” but the violating, exploiting meme hive force doing it to her is just another Tuesday on the Internet? And we’re totally cool with that. But, see, Cyrus thinks this is her song. And she can sing if she wants to. Her performance, crass, lewd, uncomfortable, disturbing, whatever, turned the hive force dynamic on its head. The meme object rolled out of a giant teddy bear, landed on stage and screamed, “GIF THIS!” It stuck its tongue out at all of us, belted “FIRST!” at the top of its lungs, and memed itself. Before anyone else could. The show got the upper hand by turning itself into the object of its own violation.

Because when we’ve already been titillated in every way imaginable, what else is there left to do? Cyrus basically didn’t do anything on the VMA stage that hasn’t been simulated there in one way or another before. So how else is there for a female pop star to traffic in her own sexuality in any new way, except to make us all feel like she was coercing us into violating her?

It was a new one for me. Was it weird for you, too?

“The Internet is fickle,” Martel said on Grantland, “But if a video is strong and entertaining, it is going to get massive hits, so of course strong work is going to have an effect on record sales. As I said, I’m mega-focused on selling records right now, so I’m doing that. I’m only taking jobs where this is a possibility. There is a new generation of kids that are overstimulated as viewers and you have to address that somehow. I’m just paying attention to the audience and their movements.”

What I learned from the 2013 VMAs is that owning your sexuality is passé, but owning meme culture by exploiting your sexuality is now. After all, in the attention economy, self-exploitation is self-empowerment. (Miley Cyrus spoiler alert: ad revenue).

Whatever you think of it, Cyrus’s performance was a deliberate reflection of where we are as a culture. Calling it a “commentary” may be an overstatement, but it’s definitely a comment:

R U NOT ENTERTAINED?????

Oh, and guess what else? MTV, it turns out, still exists.

The Search For Stark

First of all, do yourself a favor and watch this 2 minutes and 44 seconds of utter awesomeness above.

Then recall the ending of Iron Man 3. In fact, recall the entire 130 minutes of its insulting, technology guilt-laden self-hatred.

Or better yet, don’t do that.

If you’ve been here since 2010, you know that I have had a special place in my heart for the character I called “The First 21st Century Superhero.” Tony Stark — as reimagined by Jon Favreau, and reincarnated by Robert Downey Jr. — and I have had an unexpectedly personal relationship these past three years. Ever since Favreau retweeted my post and it took on a life of its own and became the most popular thing I’d ever written. From the intimacy of Tony Stark’s relationship with his gadgets, to his eschewal of a secret identity in favor of that uniquely post-digital virtue of radical transparency, to his narcissism, Favreau’s Iron Man reflected a radical departure from the tropes that defined the 20th century superhero.

I could tell you about how Shane Black, who directed this third installment in the Iron Man franchise, tried his best to undo all that. How deliberately he went after the things that not only made Tony Stark so brilliantly modern, but also lay at the very heart of his character. I could tell you about the relentless “techno fear” that ran like an electromagnetic current through the entire movie from start — on New Year’s Eve 1999, ground zero of the Y2K paranoia — to finish — with Stark throwing his arc reactor heart into the ocean like he’s an old lady, letting go of a luminescent, blue burden at the end of fucking Titanic. Or some shit.

I could tell you how this conflicted, 20th century relationship to technology, wielded with all the subtlety of Catholic guilt, bashed all of us over the head like a blunt instrument the first time we saw Pepper and Tony on screen together — but wait! That’s not actually Tony. It’s a Siri-powered autonomous-driving Iron Man suit, and it’s just asked Pepper to, quote, “Kiss me on my mouth slit.”

(I seriously feel like I need to go wash my hands with soap now after typing those words.)

And yet, under Favreau’s direction, Pepper kissing Tony’s helmet in Iron Man 2 was most likely one of the sexiest moments Gwyneth Paltrow has ever had on film:

I could tell you how Black drove Tony Stark into hiding (while Favreau celebrated his coming out) and stripped him of his suit and access to his technology, making him fight his battles in the flesh for most of the film. We’re to believe Stark built a more advanced suit while a POW in a cave in fucking Afghanistan than he could on his credit limit in Tennessee??

I could tell you how the thing I was thinking about the most as I walked out of the theater — even more than that Black got thisclose to turning Pepper into a legitimate superhero in her own right, which would have been practically the only 21st-century-compliant move he’d have made in the whole movie, but then, of course, Tony had to “fix” her back to normal — was:

THANK GOD STEVE JOBS DID NOT LIVE TO SEE TONY STARK THROW HIS HEART INTO THE FUCKING OCEAN.

Do you remember the love that the first Iron Man movie, and the Tony Stark in it, had for his first suit? The one he made in captivity. The painstaking, terrifying labor that birthed this child of necessity? The metal manifestation of the power of ingenuity and creativity and talent that won him his freedom? Remember his second suit? The one he built once he got back home. The hotter, cooler, younger sibling of the scrap heap he’d left in the desert. The first real Iron Man suit. How much fun he had making it, tweaking it, perfecting it, and how much fun we had going along on the joyride? Tony Stark fought a custody battle against the American government for the suit in Iron Man 2. He said no one else could have it. He said the suit he created was a part of him, that he and it were one. And we all intimately understood exactly what he meant. Because even if the rest of us don’t actually literally plug our gadgets into our chest cavities, 80% of us go to sleep with our phone by our bedside.

I could tell you how Shane Black changed all that for Tony, replaced his passion for innovation with a 20th century irreconcilability. His suits, once so precious the greatest military superpower in the world couldn’t force him to part with just one, have been rendered as meaningless as disposable cups. For Black’s Iron Man, technology still has friction. He can “disconnect,” can “unplug.” This feels like a “real” thing to do. As if there is still a world that isn’t part of the digital world. It’s not just an anachronistic, Gen X misunderstanding of the Millennial reality; it kills what makes Tony Stark, Tony Stark.

“We create our own demons” are the first words we hear as the movie begins. Stark is speaking in voiceover, and this becomes his ongoing refrain throughout the movie. We create our own demons. We create our own demons. By the end, when Stark destroys all of his dozens of indistinguishable suits — because they are “distractions” (the actual word he uses, twice), because we create our own demons and these are his creations, because (and this is the most fucked up part of all) he thinks this is what will make Pepper happy — it is the moment that Black destroys the soul of this character.

[Image: “Proof That Tony Stark Has a Heart”]

Imagine Steve Jobs throwing the iPhone prototype into the ocean and walking away.

Imagine Elon Musk, after whom Favreau modeled his interpretation of the modern-day tech genius inventor, driving a fleet of Teslas off a cliff.

I could tell you how Shane Black imagined it.

Speaking to an audience at Stanford in the wake of The Social Network, Mark Zuckerberg said, “The framing [of the movie] is that the whole reason for making Facebook is because I wanted to get girls, or wanted to get into clubs…. They just can’t wrap their head around the idea that someone might build something because they like building things.”

This is why Tony Stark builds things. Because he likes building things. Technology is not a “distraction” from something realer; it is a part of what IS real. The digital and the analog worlds aren’t binary. They are inextricably intertwined. Technology is as much a part of us now as it has always been for Tony Stark — corporeally and philosophically. And there is no going back. Texting is not a distraction from the “realness” of the telephone — itself, a completely unnatural, manufactured, awkward medium that we all learned to take communication through for granted. Electricity is not a distraction from the “realness” of candlelight. Driving a car is not a distraction from the “realness” of riding a horse.

Which brings us back to this impeccably clever Audi commercial.

Featuring the two actors who’ve played Spock — himself an embodiment of hybridity — in a battle that starts out via iPad chess, doubles down over the phone, escalates by car, and culminates with the finishing touch of a Vulcan nerve pinch, the ad makes the depiction of the permeable membrane between the digital and the analog, of the seamless absorption of a “fictional” personality into the “real” self, and of unapologetic techno-joy look effortlessly cool.

This is the Audi ad Iron Man USED TO BE!

In 2010, I wrote:

The first 21st century superhero is a hedonistic, narcissistic, even nihilistic, adrenaline junkie, billionaire entrepreneur do-gooder. If Peter Parker’s life lesson is that “with great power comes great responsibility,” Tony Stark’s is that with great power comes a shit-ton of fun.

You can’t get any more Gen Y than that.

Three Mays later, Tony Stark has changed. He’s entirely forgotten how to have fun. He doesn’t even get joy out of building things anymore — hell, he was having a better time when he had a terminal illness, back when Favreau was at the helm. Under Black’s direction, Stark doesn’t seem excited about anything. He’s on Xanax for his panic attacks — I’m assuming, since there isn’t a single thing that fills him with anywhere near the kind of fascination Leonard Nimoy and Zachary Quinto express as they watch a self-driving Audi pull out of a golf club driveway. As Black sees it, to embrace the technological innovation that is in Tony Stark’s blood — both figuratively and literally — to create something that isn’t a demon, to want to build things because he likes building things, all of that would somehow make Stark less human.

But as the mixed-race Spock always knew — what makes us human can’t be measured in degrees.

Oh well.

[Image: “Thanks for keeping the seat warm, Gen X. We’ll take it from here. Sincerely, Gen Y”]

After all… it’s only logical.

Lose My Number

One of the 20-somethings I’ve been working with over the past few weeks sent me a summary, put together by another Millennial colleague, of Sherry Turkle’s book, Alone Together: Why We Expect More from Technology and Less from Each Other. An MIT technology and society specialist, Turkle argues that our relentless modern connectivity leads to a new disconnect. As technology ramps up, our emotional lives ramp down. Hell in a handbasket, yadda yadda.

The Millennial reviewer called it “a fascinating and highly depressing affirmation of many of the problems that face my sister and I’s generation.” The summary includes such gems as:

– Young adults often cite the awkwardness of attempting to steer a phone call to its conclusion as one of the top reasons they avoid phone calls. This skill is not one that teens seem open to learning as a quick “got to go, bye” is easier than determining a natural way to break off a conversation.
 
– Teens avoid making phone calls for fear that they reveal too much. Texting is easier and “more efficient” than a human voice. Things that occur in “realtime” take far too much time. Even adults and academics admit that they would rather leave a voicemail or send an email than talk face-to-face.
 
– We used to live in an era when teenagers would race to the ringing phone after suppertime; now teens are content with receiving fewer calls in favor of texts or Facebook messages.

But here’s what I want to know:

Why is telephone-based behavior the benchmark of communication proficiency?

The telephone isn’t part of our biology. It is, itself, a completely unnatural, manufactured, utterly awkward medium that we all learned to take communication through for granted.  

It could be lamented that “kids these days” don’t know the etiquette of visiting card communication either — which is what people had to resort to before the phone:

Visiting cards became an indispensable tool of etiquette, with sophisticated rules governing their use. The essential convention was that one person would not expect to see another person in her own home (unless invited or introduced) without first leaving his visiting card for the person at her home. Upon leaving the card, he would not expect to be admitted at first, but might receive a card at his own home in response. This would serve as a signal that a personal visit and meeting at home would not be unwelcome. On the other hand, if no card were forthcoming, or if a card were sent in an envelope, a personal visit was thereby discouraged. As an adoption from French and English etiquette, visiting cards became common amongst the aristocracy of Europe, and also in the United States. The whole procedure depended upon there being servants to open the door and receive the cards and it was, therefore, confined to the social classes which employed servants. If a card was left with a turned corner it indicated that the card had been left in person rather than by a servant.

I mean, oy. You know?

The summary goes on:

For young adults electronic media “levels the playing field” between outgoing people and shy people, removing the barrier of awkward conversations and body language. Texts allow a two-minute pause to become normal and acceptable while the recipient thinks of an adequate response. The same is not possible in face-to-face conversations. Screens offer a place to reflect, retype and edit.

The assumption here being that two minutes is NOT acceptable? For the thousands of years when all we had was the paper operating system, it could take two months, or TWO YEARS, to get a reply. The drama of the entire oeuvre of Jane Austen and the Brontë sisters hinges on telecommunications not having been invented, just as much as the drama of Ferris Bueller’s Day Off hinges on cell phones not having been invented.

So why are we lamenting the telephone and all its associated UI becoming irrelevant and obsolete? (Current unlistened-to voicemail count: 49. You?) It was never meant to last forever. The telephone is like whale blubber. Brian Eno knows what I’m talking about:

“I think records were just a little bubble through time and those who made a living from them for a while were lucky. I always knew it would run out sooner or later. It couldn’t last, and now it’s running out. I don’t particularly care that it is and like the way things are going. The record age was just a blip. It was a bit like if you had a source of whale blubber in the 1840s and it could be used as fuel. Before gas came along, if you traded in whale blubber, you were the richest man on Earth. Then gas came along and you’d be stuck with your whale blubber. Sorry mate – history’s moving along. Recorded music equals whale blubber. Eventually, something else will replace it.”

Music existed before the plastic disc, and it will continue to exist after the MP3. Communication existed before the telephone, and it will continue to exist after the text message.

It just takes a frame of reference broader than what one generation takes for granted — or finds foreign — to see it.

    



Subscribe for more like this.






The Next 21st Century Superhero Will Be a Chick

A musician friend of mine was once seeing the best friend of a famous heiress and he told me this story: “I had been dating her for a month and one night she invited me out to go meet her whole crew for the first time. I was SUPER nervous. Meeting the group of friends of someone you’re dating for the first time can be nerve-racking anyway, but especially if they are like…. that. I drove there and I was standing outside like, ‘OK… I need to get my shit straight and go in there and own this place.’ All of a sudden it hit me: ‘Channel your inner Tony Stark!'” It worked, he said, “Game over.”

Hearing this story, I wondered, who was my inner spirit superheroine? What clever badass would I conjure for existential ammo in a situation like this? I started searching my mental pop culture database for an acceptable candidate and this is when I realized I could barely think of a single one. The only two vaguely applicable options coming to mind were both from a decade ago: Buffy foremost, and, more hazily, Trinity. But Buffy’s final episode had aired, and Trinity had devolved from enigma to boring love interest saved by her boyfriend at the end of the Matrix trilogy, both back in 2003. As far as contemporary, mainstream, pop culture was concerned, there was a giant void.

I turned to the Internet for help, and found a list of the 100 Greatest Female Characters, compiled by Total Film. While not exactly rigorous in its methodology (fully 6% of the list’s alleged 100 greatest female characters are not actually human; 3 — Audrey 2 from Little Shop of Horrors, Lady from Lady and the Tramp, and Dory from Finding Nemo — aren’t even humanoid), the audit is, at the very least… directional. Narrowing the list down to just those heroines who’ve graced the big screen within the past 10 years (minus the non-human entries), the chronological order looks like this:

Among these 15 possible spirit superheroine candidates, 6 are victims of sexual abuse, 3 are dealing with some form of depression, 4 haven’t hit puberty, 2 are addicts — including one vampire — and, most notably, a full third would sooner slaughter a party than charm it. New York Times film critic Manohla Dargis observed this trend last year, writing:

It’s no longer enough to be a mean girl, to destroy the enemy with sneers and gossip: you now have to be a murderous one. That, at any rate, seems to be what movies like Hanna, Sucker Punch, Super, Let Me In, Kick-Ass and those flicks with that inked Swedish psycho-chick seem to be saying. One of the first of these tiny terrors was played by the 12-year-old Natalie Portman in Luc Besson’s neo-exploitation flick The Professional (1994). Her character, a cigarette-smoking, wife-beater-wearing Lolita, schooled by a hit man, was a pint-size version of the waif turned assassin in Mr. Besson’s Femme Nikita (1990), which spawned various imitators. Mr. Besson likes little ladies with big weapons. As does Quentin Tarantino and more than a few Japanese directors, including Kinji Fukasaku, whose 2000 freakout, Battle Royale, provided the giggling schoolgirl who fights Uma Thurman’s warrior in Kill Bill Vol. 1. Mr. Tarantino and his celebrated love of the ladies of exploitation has something to do with what’s happening on screens. Yet something else is going on…. The question is why are so many violent girls and women running through movies now.

That question is particularly pointed since this genre is not exactly blockbuster material. Hanna was only slightly profitable. Sucker Punch flopped, as did Haywire and the Besson-produced Colombiana; both Kick-Ass and Let Me In were “gore-athons that movieplexers don’t want to see,” and, in spite of all its hype, the American remake of The Girl With The Dragon Tattoo was a “huge box office disappointment.” And that’s all just in the past two years.

In an April 2011 New Yorker article titled “Funny Like A Guy: Anna Faris and Hollywood’s Women Problem,” Tad Friend wrote:

Female-driven comedies such as Juno, Mean Girls, The House Bunny, Julie & Julia, Something’s Gotta Give, It’s Complicated, and Easy A have all done well at the box office. So why haven’t more of them been made? “Studio executives think these movies’ success is a one-off every time,” Nancy Meyers, who wrote and directed Something’s Gotta Give and It’s Complicated, observes. “They’ll say, ‘One of the big reasons that worked was because Jack was in it,’ or ‘We hadn’t had a comedy for older women in forever.’”

Amy Pascal, who as Sony’s cochairman put four of the above films into production, points out, “You’re talking about a dozen or so female-driven comedies that got made over a dozen years, a period when hundreds of male-driven comedies got made. And every one of those female-driven comedies was written or directed or produced by a woman.” Studio executives believe that male moviegoers would rather prep for a colonoscopy than experience a woman’s point of view. “Let’s be honest,” one top studio executive said. “The decision to make movies is mostly made by men, and if men don’t have to make movies about women, they won’t.”

Except, it seems, if those women happen to be traumatized, ultra-violent vigilantes of some sort. Perhaps these movies keep getting made because their failure is seen as a one-off every time, too.

“Men just don’t understand the nuance of female dynamics,” Friend quotes an anonymous, prominent producer. Although the conversation is about comedy (why men can’t relate to Renee Zellweger in Bridget Jones, for example), it could explain why all these vengeful heroines seem to inevitably wind up defective. This violent femmes sub-genre — which expands the traditional Rape/Revenge archetype to also encompass psychologically violated prepubescents — by default demands female protagonists. But since their creators don’t understand how to make them, they stick to what they know. Consider that the title role in Salt was originally named Edwin, and intended for Tom Cruise, before she became Evelyn and went to Angelina Jolie. The emotionally stunted, socially inept, tech-savant protagonists of David Fincher’s two latest films — male in The Social Network, female in The Girl With The Dragon Tattoo — are equally as interchangeable. From Hanna to Hit-Girl, all the way back to Matilda in The Professional, it’s always been a father, or father figure, who’s trained them. A woman, this narrative suggests, would have nothing to offer in raising a powerful daughter. When a film needs a Violent Femme the solution has become to simply write a man, and then cast a girl. (Failing that, just mix up a cocktail of disorders — Asperger’s, attachment disorder, PTSD; a splash of Stockholm Syndrome — where a character needs to be.) No understanding of female dynamics required.

“What if the person you expect to be the predator is not who you expect it to be? What if it’s the other person?” asks producer David W. Higgins, on the DVD featurette for his 2005 film, Hard Candy, about a 14-year-old girl, played by Ellen Page, who blithely brutalizes a child molester. Whereas for 20th century heroines like Princess Leia (#5 on Total Film’s 100 Greatest Female Characters), Sarah Connor (#3), or Ellen Ripley (#1 — of course), not to mention their brethren, overcoming trauma is what made them become heroes, for this new crop, trauma is what excuses them from seeming like villains in their own right. We love to see the underdog triumph, but do we really want to watch a victim become the predator, and a predator become the hero? The ongoing failures of films fetishizing this scenario suggest we’re just not that into this cognitive dissonance.

So much for movies no one wants to see, but what about those every girl has? On the one hand there’s Twilight, whose Bella Swan is a dishrag of a damsel in distress so useless her massive popularity is a disturbing cultural atavism. On the other, there’s the Harry Potter series, whose Hermione Granger (#7) might be “The Heroine Women Have Been Waiting For,” according to Laura Hibbard in the Huffington Post. “The early books were full of her eagerly answering question after question in class, much to the annoyance of the other characters. In the later books, that unapologetic intelligence very obviously saves Harry Potter’s life on more than one occasion. Essentially, without Hermione, Harry wouldn’t have been ‘the boy who lived.'” Meanwhile, here’s how Total Film describes Leia: “Royalty turned revolutionary, a capital-L Lady with a laser gun in her hand. Cool, even before you know she also has Jedi blood.”

And that is the one, simple, yet infinitely complex element that is consistently missing across the entire spectrum of stiff, 21st century downers: Cool. “Of all the comic books we published at Marvel,” said Stan Lee, the creator of Iron Man, Spider-Man, the Hulk, the X-Men, and more, “we got more fan mail for Iron Man from women than any other title.” Cool is the platonic ideal Tony Stark represents. It’s what makes him such an effective spirit superhero for the ordeal of a party. But while Stark may be special he’s not an anomaly. From James Bond to Tyler Durden, male characters Bogart the cool. And it’s not because they’re somehow uniquely suited for it (see: the femme fatale). It’s because their contemporary female counterparts are consistently forced to be lame.

“You have to defeat her at the beginning,” Tad Friend quotes a successful female screenwriter describing her technique. “It’s a conscious thing I do — abuse and break her, strip her of her dignity, and then she gets to live out our fantasies and have fun. It’s as simple as making the girl cry fifteen minutes into the movie.” That could just as easily describe Bridesmaids as The Girl With The Dragon Tattoo. Which is totally fucked, first of all. And secondly, it’s boring. You’d think there’d be more narrative to go around — though I suppose I did just see the once female-driven Carrie and The Craft remade as an all-male superhero origin flick called Chronicle. Perhaps we really have reached Peak Plot. In which case now would really be the time to be R&Ding some alternatives.

“I love to take reality and change one little aspect of it, and see how reality then shifts,” said director Jon Favreau. “That was what was fun about Iron Man, you [change] one little thing, and how does that affect the real world?” Favreau’s experiment has yielded a superhero archetype that reflects a slew of Millennial mores, from the intimacy of his relationship with his gadgets, to his eschewal of a secret identity in favor of that uniquely post-digital virtue of radical transparency, to his narcissism. “If Peter Parker’s life lesson is that ‘with great power comes great responsibility,'” I wrote in a post titled, Why Iron Man is the First 21st Century Superhero, “Tony Stark’s is that with great power comes a shit-ton of fun. Unlike the prior century’s superhero, this new version saves the world not out of any overwhelming sense of obligation or indentured servitude to duty, but because he can do what he wants, when he wants, because he wants to. Being Iron Man isn’t a burden, it’s an epic thrill-ride.” Breaking with the established conventions of the genre to create a uniquely modern superhero has made Iron Man a success, to the tune of a billion-dollar box office between the two movies, and launched Marvel Studios and the ensuing Avengers franchise in its wake. But there’s one 21st century shift Tony Stark will never be able to embody. And it’s kind of a big one.

From The Atlantic Magazine:

Man has been the dominant sex since, well, the dawn of mankind. But for the first time in human history, that is changing—and with shocking speed.

In the wreckage of the Great Recession, three-quarters of the 8 million jobs lost were lost by men. The worst-hit industries were overwhelmingly male and deeply identified with macho: construction, manufacturing, high finance. Some of these jobs will come back, but the overall pattern of dislocation is neither temporary nor random. The recession merely revealed—and accelerated—a profound economic shift that has been going on for at least 30 years, and in some respects even longer.

According to the Bureau of Labor Statistics, women now hold 51.4 percent of managerial and professional jobs—up from 26.1 percent in 1980. About a third of America’s physicians are now women, as are 45 percent of associates in law firms—and both those percentages are rising fast. A white-collar economy values raw intellectual horsepower, which men and women have in equal amounts. It also requires communication skills and social intelligence, areas in which women, according to many studies, have a slight edge. Perhaps most important—for better or worse—it increasingly requires formal education credentials, which women are more prone to acquire, particularly early in adulthood.

To see the future—of the workforce, the economy, and the culture—you need to spend some time at America’s colleges and professional schools, where a quiet revolution is under way. Women now earn 60 percent of master’s degrees, about half of all law and medical degrees, and 42 percent of all M.B.A.s. Most important, women earn almost 60 percent of all bachelor’s degrees—the minimum requirement, in most cases, for an affluent life. In a stark reversal since the 1970s, men are now more likely than women to hold only a high-school diploma.

American parents are beginning to choose to have girls over boys. As they imagine the pride of watching a child grow and develop and succeed as an adult, it is more often a girl that they see in their mind’s eye.

Yes, the U.S. still has a wage gap, one that can be convincingly explained—at least in part—by discrimination. Yes, women still do most of the child care. And yes, the upper reaches of society are still dominated by men. But given the power of the forces pushing at the economy, this setup feels like the last gasp of a dying age rather than the permanent establishment. It may be happening slowly and unevenly, but it’s unmistakably happening: in the long view, the modern economy is becoming a place where women hold the cards.

That view makes even comedian (and father of two daughters) Louis C.K.’s pronouncement in a recent Fast Company article that “The next Steve Jobs will be a chick” not unimaginable. And when she is, who will be her inner superheroine? Any of the girls brandishing medieval weaponry headed, like crusaders, for movie theaters this year?

Considering the cruel, dystopian premise of The Hunger Games, Katniss will likely get to have as much fun as an overachiever prepping for the SATs. And while Kristen Stewart as persecuted maiden turned, apparently, warrior in Snow White and the Huntsman (whose producer previously suited up Alice for battle in Wonderland) couldn’t possibly be more joyless and blank than as Bella (…right??), my money’s on Brave’s Merida to win in the flat-out cool department, here:

Either way, while Tony Stark is an archetype boys grow into, the above are all manifestations of one that girls grow out of, and when they do, they will expect a spirit superheroine of their own to aspire to. Someone who doesn’t have to be brutalized to be a badass, or a predator to be a hero. Someone clever and charming and cool as fuck, whom you’d just as soon want to party with as have saving the world; who’s faced the dark forces that don’t understand her and threaten to break her and strip her of her dignity, and, like the century of superheroes before her, has overcome. The next 21st century superhero will be a chick. The girls coming for the 21st century won’t be satisfied with anything less.

Who The iPad Ads Are For

Ever since Apple started putting a lowercase i in front of its products, their advertisements have been known for basically two things — articulating a visceral, transcendent grace inherent within the Mac product experience:

…and making fun of people who don’t already use Macs:

Which is why the iPad ads — with their exaggeratedly simplistic gestures, their induced first-person perspective (the people in the photos always seem to be seated in some awkward position in order to give us, the viewers, the perspective of being the “user” in the image), and above all, the blatantly basic depiction of the product experience — just don’t quite fit with the image of what an Apple ad is supposed to be.

If these ads seem like a departure, it’s because they are.

In the ’60s, Everett Rogers broke down the process by which trends, products, and ideas proliferate through culture. There are five basic types of adopter personas in his diffusion of innovation theory:

Innovators are the first to adopt an innovation. They are, by default, risk-takers, since being on the front lines means they are likely to adopt a technology or an idea which may ultimately fail. Early Adopters are the second fastest category to adopt an innovation. They’re more discreet in their adoption choices than Innovators, but have the highest degree of opinion leadership among the adopter categories. Individuals in the Early Majority adopt an innovation after having let the Innovators and Early Adopters do product-testing for them. The Late Majority approaches an innovation with a high degree of skepticism, and only after the majority of society has adopted the innovation first. And finally, Laggards are the last to get on board with a new innovation. These individuals typically have an aversion to change-agents, tend to be advanced in age, and to be focused on “traditions.”
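For the quantitatively inclined: Rogers carved these five segments out of a normal (bell) curve of adoption times, with cut points at one and two standard deviations from the mean; that is where his canonical splits of roughly 2.5%, 13.5%, 34%, 34%, and 16% come from. Here’s a minimal Python sketch (my own illustration, not anything from Rogers) that recovers those shares from the curve:

```python
from math import erf, sqrt

def normal_cdf(x):
    """Cumulative distribution function of the standard normal curve."""
    return 0.5 * (1 + erf(x / sqrt(2)))

# Rogers segments adopters by standard deviations from the mean adoption
# time: cut points at -2, -1, 0, and +1 sigma.
segments = [
    ("Innovators",     float("-inf"), -2),
    ("Early Adopters", -2,            -1),
    ("Early Majority", -1,             0),
    ("Late Majority",   0,             1),
    ("Laggards",        1,  float("inf")),
]

for name, lo, hi in segments:
    share = normal_cdf(hi) - normal_cdf(lo)
    print(f"{name:<15} {share:.1%}")  # ~2.3%, 13.6%, 34.1%, 34.1%, 15.9%
```

The printed shares round to the 2.5/13.5/34/34/16 split you’ll see in any diffusion-of-innovation diagram.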

The thinking in marketing, especially when launching a new product, generally tends to be about aiming at the early adopters over on the left side of the adoption bell curve. Once the early adopters get into it, the thinking goes, whatever it is will trickle down through all the rest of the early and late majority who make up the vast bulk of the market share. A few years back I wrote about how Nintendo was going for a “late adopter strategy” with its Wii console. At the time (and perhaps still now) the Wii was outselling both Sony’s PlayStation and Microsoft’s Xbox combined. The Wii’s uniquely simple controller and intuitive game-play enabled it to appeal to a much broader audience than the more complicated, hardcore-gaming consoles.

From a Time Magazine article on the eve of the Wii release in 2006:

“The one topic we’ve considered and debated at Nintendo for a very long time is, Why do people who don’t play video games not play them?” [Nintendo president Satoru] Iwata has been asking himself, and his employees, that question for the past five years. And what Iwata has noticed is something that most gamers have long ago forgotten: to nongamers, video games are really hard. Like hard as in homework.

The key to the Wii’s success is that it made gaming simple, broadly accessible, and inherently intuitive. Later that year, AdAge wrote that the Wii’s popularity is “part of a growing phenomenon that’s overhauling the video-gaming industry…. Video gaming is beginning to transcend the solitary boy-in-the-basement stereotype with a new generation of gamers including women, older people and younger children.”

Anyone who has bought, or even used, an iPhone at some point during the three years since the first iteration was released already understands what the iPad is all about without any help from an ad. Indeed, Apple has done such a good job of making ads aimed at early adopters for the past decade, they no longer need to. An ad is not going to make a difference in whether someone on the left-hand side of Apple’s adopter bell curve buys an iPad or not. Instead, these ads are targeted straight at the people on the downhill slope.

New results from a Pew Research Center survey tracking 2,252 adults 18 and older show that use of social network sites among older adults has risen dramatically over the past two years:

While overall social networking use by online American adults has grown from 35% in 2008 to 61% in 2010, the increase is even more dramatic among older adults. The rate of online social networking approximately quadrupled among Older Boomers (9% to 43%) and the GI Generation (4% to 16%).

Of course, Millennials still have a healthy lead among all age groups in social network use, with 83% of online adults aged 18 to 33 engaging in social networking, but grandma and grandpa are just catching up. Particularly grandma. Last year, the fastest growing demographic on Facebook was women over 55.

Unlike the Apple ads we’ve become accustomed to in the 2000s, these iPad ads are no longer touting the product’s “higher resolution experience” to digital natives. That is, they are not emphasizing the ephemeral or smugly superior subtleties that are inaccessible to anyone who does not intuitively “get it.” These ads are, instead, paring the experience down to be as unintimidating as possible. Not only is the iPad a completely new way to experience personal computing, it is as effortless to use this technology, the ads say to you, the viewer, as if you were, yourself, a digital native.
