“Web Design” Is Dead

[Image: iOS 7]

One of my favorite Steve Jobs quotes — if not THE favorite — is this:

People think it’s this veneer — that the designers are handed this box and told, ‘Make it look good!’ That’s not what we think design is. It’s not just what it looks like and feels like. Design is how it works.

It’s a sentiment echoed in the philosophy behind Apple’s completely rethought new design language for the forthcoming iOS7, just announced this week:

Nothing we’ve ever created has been designed just to look beautiful. That’s approaching the opportunity from the wrong end. Instead, as we reconsidered iOS, our purpose was to create an experience that was simpler, more useful, and more enjoyable — while building on the things people love about iOS. Ultimately, redesigning the way it works led us to redesign the way it looks. Because good design is design that’s in service of the experience.

This is not only a standard that Apple holds itself to; it now extends to all those who develop on the iOS platform, with comprehensive guidelines on how third-party developers should design for iOS 7 to match Apple’s own style. As TechCrunch puts it, “Developers will have to adapt their apps to match the rest of the operating system if they don’t want them to look antiquated.”

Here are Apple’s three main themes for developing for iOS 7:

Deference. The UI helps users understand and interact with the content, but never competes with it.

Clarity. Text is legible at every size, icons are precise and lucid, adornments are subtle and appropriate, and a sharpened focus on functionality motivates the design.

Depth. Visual layers and realistic motion heighten users’ delight and understanding.

On Apple’s list of things app developers should do to get ready for iOS 7 are instructions like: “Revisit the use of drop shadows, gradients, and bezels. Because the iOS 7 aesthetic is smooth and layered — with much less emphasis on using visual effects to make UI elements look physical — you may want to rethink these effects.”
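To make that guidance concrete, here is a minimal, hypothetical UIKit sketch, written in present-day Swift rather than the Objective-C of the era and not taken from Apple’s documentation, contrasting a “physical”-looking button built from a gradient, bezel, and drop shadow with the flat, borderless system button iOS 7 favors. The titles, sizes, and colors are illustrative assumptions.

```swift
import UIKit

// A hypothetical sketch (not Apple's sample code): the same action rendered
// the pre-iOS-7 way, with "physical" effects, and the flat iOS 7 way.

func skeuomorphicBuyButton() -> UIButton {
    let button = UIButton(type: .custom)
    button.frame = CGRect(x: 0, y: 0, width: 120, height: 44)
    button.setTitle("Buy", for: .normal)
    button.setTitleColor(.white, for: .normal)

    // Gradient, bezel, and drop shadow: the effects the guidelines ask you to revisit.
    let gradient = CAGradientLayer()
    gradient.frame = button.bounds
    gradient.colors = [UIColor.lightGray.cgColor, UIColor.darkGray.cgColor]
    button.layer.insertSublayer(gradient, at: 0)
    button.layer.cornerRadius = 8
    button.layer.borderWidth = 1
    button.layer.borderColor = UIColor.darkGray.cgColor
    button.layer.shadowOpacity = 0.4
    button.layer.shadowOffset = CGSize(width: 0, height: 2)
    return button
}

func flatBuyButton() -> UIButton {
    // A borderless system button: color and type alone signal that it's tappable.
    let button = UIButton(type: .system)
    button.setTitle("Buy", for: .normal)
    button.tintColor = .systemBlue
    return button
}
```

The point of the contrast is that the flat version does less: it leans on color and typography to signal interactivity, which is the “deference” and “clarity” Apple describes above.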

This kind of “smooth” digital design aesthetic, which rejects the skeuomorphism of making icons on a flat screen look like 3-dimensional, analog objects, has a name — “flat design.” And Apple was not even the first to adopt it. (They were the last holdout, in fact.) Microsoft and Google got there first. Back in 2011, when Microsoft unveiled its “Metro” design language, now simply referred to as “Windows 8,” its design principles were:

Clean, Light, Open and Fast

We took an approach that we call “Fierce Reduction” to remove any elements in the UI that we felt were unnecessary; both visual elements and feature bloat. It allows us to shine a focus on the primary tasks of the UI, and makes the UI feel smart, open, fast, and responsive.

Alive in Motion

The transitions between screens in a UI are as important as the design of the screens themselves. Motion gives character to a UI, but also communicates the navigation system, which helps to improve usability.

Celebrate Typography

Our design inspiration is very typographic, and it felt like it was time for User Interfaces to be uncompromising about type as well. Type is information, type is beautiful.

Content, Not Chrome

It’s the content on the phone that people want, not the buttons. Reducing the visuals on the phone that aren’t content will help you create a more open UI, and it also promotes direct interaction with the content.

Authentically Digital

Finally, we believe in honesty in design. A user interface is created of pixels, so in Metro we try to avoid using the skeuomorphic shading and glossiness used in some UIs that try to mimic real world materials and objects.

[Image: Windows 8]

With Apple joining the tech trifecta (“Leading by following”), Flat Design has been transformed from a trend into a manifesto. It marks a fundamental philosophical shift, away from what came before, in what design’s role is in the natively digital experience.

Living in L.A., it’s very easy for all metaphors and analogies to get reduced to the automotive experience. So here we go — for a long time on the web, we had the same people, approaching in the same way, the design of something like this:

[Image: a car]

 

And something like this:

[Image]

I mean, we just didn’t know any better. We didn’t really grasp that we needed completely different types of design philosophies. All we knew was that things on the web had to…. look….. like something….. Maybe … something pretty?…. Or… Cool? Anyway, we had to get designers. To design them. And what visual arts genius was going to want to create a digital masterpiece that looks like they were barely there in the first place?

 

[Image: iOS 7 UI Transition Guide: layout and appearance]

Orite…. Apple.

For many a designer — sadly, still — whatever reason a user has for coming to the destination they are designing, it comes second to the privilege of being exposed to the designer’s creative brilliance and superior taste.

Apple is saying, this is where we are as a culture: we’re past that now. Apple wants “designers” to get out the way. Users are not here to marvel at your “design.” They’re here to get to the shit your design is jumping up and down, waving its hands frantically trying to get attention, getting in the way of. Apple wants to make it very clear that the UI — that layer between the human, and the content that this human is trying to access, aka the “design” layer — is not the star. It should, quote, “play a supporting role.”

Commuters don’t care about your “creative vision.” They are just trying to get fucking home.

If you want to be Dali, you should probably not be a freeway designer. But if you want to design freeways — or iOS experiences — then your art is about making something sublimely useful, usable, and effective. This is what the companies defining the way we access the digital world all stand for now. This is what they believe makes for a beautiful experience on their devices and on their operating systems.

“Web design” is dead because everywhere the “design layer” of the web is being sandblasted off, the interface reduced down to its barest essence. This is why the new, natively digital design disciplines are found deep beyond the surface of aesthetics, in user experience design, in information architecture, in interaction design.

More than ever, Jobs’ words are true: design is now a fundamentally inextricable part of how it works.

There’s simply not much room left for anything else.

And if you think you’ll at least get to choose the colors based on your personal design taste…. like Apple says, “you may want to rethink” that as well.

From Fast Company’s “The Science Behind Colors in Marketing”:

“Green connotes ideas like “natural” and “environment,” and given its wide use in traffic lights, suggests the idea of “go” or forward movement. The color red, on the other hand, is often thought to communicate excitement, passion, blood, and warning. It is also used as the color for stopping at traffic lights. Red is also known to be eye-catching.”

So, clearly an A/B test between green and red would result in green, the more friendly color. At least that was their guess. Here is what their experiment looked like:

[Image: the green vs. red button test, from “Why Is Facebook Blue? The Science Behind Colors in Marketing”]

The red button outperformed the green button by 21%.

What’s most important to consider is that nothing else was changed at all: 21% more people clicked on the red button than on the green button. Everything else on the pages was the same, so it was only the button color that made this difference.
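As an aside, the arithmetic behind a claim like that is easy to sanity-check with a two-proportion z-test. The sketch below (in Swift, to match the earlier example) uses made-up visitor and click counts that produce a 21% relative lift, since the excerpt reports only the percentage and not the underlying sample sizes.

```swift
import Foundation

// Hypothetical numbers: the excerpt gives only the relative lift (21%),
// so these visitor/click counts are illustrative, not the real experiment.
struct Variant {
    let visitors: Double
    let clicks: Double
    var rate: Double { clicks / visitors }
}

// Two-proportion z-test: is the difference in click-through rates
// bigger than what random noise would plausibly produce?
func twoProportionZTest(_ a: Variant, _ b: Variant) -> (z: Double, pValue: Double) {
    let pooled = (a.clicks + b.clicks) / (a.visitors + b.visitors)
    let standardError = sqrt(pooled * (1 - pooled) * (1 / a.visitors + 1 / b.visitors))
    let z = (a.rate - b.rate) / standardError
    let pValue = erfc(abs(z) / sqrt(2.0))   // two-sided p-value from the normal CDF
    return (z, pValue)
}

let red   = Variant(visitors: 2_000, clicks: 242)   // 12.1% click-through
let green = Variant(visitors: 2_000, clicks: 200)   // 10.0%, i.e. red is roughly +21% relative

let result = twoProportionZTest(red, green)
print("z = \(result.z), p ≈ \(result.pValue)")      // about 2.1 and 0.03 with these counts
```

With those assumed counts the difference clears conventional statistical significance (p ≈ 0.03), which is the kind of check that separates “red happened to win this week” from a result worth shipping.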

    



The Search For Stark

First of all, do yourself a favor and watch this 2 minutes and 44 seconds of utter awesomeness above.

Then recall the ending of Iron Man 3. In fact, recall the entire 130 minutes of its insulting, technology guilt-laden self-hatred.

Or better yet, don’t do that.

If you’ve been here since 2010, you know that I have had a special place in my heart for the character I called “The First 21st Century Superhero.” Tony Stark — as reimagined by Jon Favreau, and reincarnated by Robert Downey Jr. — and I have had an unexpectedly personal relationship these past 3 years. Ever since Favreau retweeted my post and it took on a life of its own and became the most popular thing I’d ever written. From the intimacy of Tony Stark’s relationship with his gadgets, to his eschewal of a secret identity in favor of that uniquely post-digital virtue of radical transparency, to his narcissism, Favreau’s Iron Man reflected a radical departure from the tropes that defined the 20th century superhero.

I could tell you about how Shane Black, who directed this third installment in the Iron Man franchise, tried his best to undo all that. How deliberately he went after the things that not only made Tony Stark so brilliantly modern, but also lay at the very heart of his character. I could tell you about the relentless “techno fear” that ran like an electromagnetic current through the entire movie from start — on New Year’s Eve 1999, ground zero of the Y2K paranoia — to finish — with Stark throwing his arc reactor heart into the ocean like he’s an old lady, letting go of a luminescent, blue burden at the end of fucking Titanic. Or some shit.

I could tell you how this conflicted, 20th century relationship to technology, wielded with all the subtlety of Catholic guilt, bashed all of us over the head like a blunt instrument the first time we saw Pepper and Tony on screen together — but wait! That’s not actually Tony. It’s a Siri-powered autonomous-driving Iron Man suit, and it’s just asked Pepper to, quote, “Kiss me on my mouth slit.”

(I seriously feel like I need to go wash my hands with soap now after typing those words.)

And yet, under Favreau’s direction, Pepper kissing Tony’s helmet in Iron Man 2 was most likely one of the sexiest moments Gwyneth Paltrow has ever had on film:

 

[Image: Pepper kisses Tony’s helmet in Iron Man 2]

 

I could tell you how Black drove Tony Stark into hiding (while Favreau celebrated his coming out) and stripped him of his suit and access to his technology, making him fight his battles in the flesh for most of the film. We’re to believe Stark built a more advanced suit while a POW in a cave in fucking Afghanistan than he could on his credit limit in Tennessee??

 


 

I could tell you how the thing I was thinking about the most as I walked out of the theater — even more than that Black got thisclose to turning Pepper into a legitimate superhero in her own right, which would have been practically the only 21st-century compliant move he’d have made in the whole movie, but then, of course Tony had to “fix” her back to normal — was:

THANK GOD STEVE JOBS DID NOT LIVE TO SEE TONY STARK THROW HIS HEART INTO THE FUCKING OCEAN.

Do you remember the love that the first Iron Man movie, and the Tony Stark in it, had for his first suit? The one he made in captivity. The painstaking, terrifying labor that birthed this child of necessity? The metal manifestation of the power of ingenuity and creativity and talent that won him his freedom? Remember his second suit? The one he built once he got back home. The hotter, cooler, younger sibling of the scrap heap he’d left in the desert. The first real Iron Man suit. How much fun he had making it, tweaking it, perfecting it, and how much fun we had going along on the joyride? Tony Stark fought a custody battle against the American government for the suit in Iron Man 2. He said no one else could have it. He said the suit he created was a part of him, that he and it were one. And we all intimately understood exactly what he meant. Because even if the rest of us don’t actually literally plug our gadgets into our chest cavities, 80% of us go to sleep with our phone by our bedside.

I could tell you how Shane Black changed all that for Tony, replaced his passion for innovation with a 20th century irreconcilability. His suits, once so precious the greatest military superpower in the world couldn’t force him to part with just one, have been rendered as meaningless as disposable cups. For Black’s Iron Man, technology still has friction. He can “disconnect,” can “unplug.” This feels like a “real” thing to do. As if there is still a world that isn’t part of the digital world. It’s not just an anachronistic, Gen X misunderstanding of the Millennial reality; it kills what makes Tony Stark, Tony Stark.

“We create our own demons” are the first words we hear as the movie begins. Stark is speaking in voiceover, and this becomes his ongoing refrain throughout the movie. We create our own demons. We create our own demons. By the end, when Stark destroys all of his dozens of indistinguishable suits — because they are “distractions” (the actual word he uses, twice), because we create our own demons and these are his creations, because (and this is the most fucked up part of all) he thinks this is what will make Pepper happy — it is the moment that Black destroys the soul of this character.

[Image: “Proof That Tony Stark Has a Heart”]

Imagine Steve Jobs throwing the iPhone prototype into the ocean and walking away.

Imagine Elon Musk, who Favreau modeled his interpretation of the modern-day tech genius inventor after, driving a fleet of Teslas off a cliff.

I could tell you how Shane Black imagined it.

Speaking to an audience at Stanford in the wake of The Social Network, Mark Zuckerberg said, “The framing [of the movie] is that the whole reason for making Facebook is because I wanted to get girls, or wanted to get into clubs…. They just can’t wrap their head around the idea that someone might build something because they like building things.”

This is why Tony Stark builds things. Because he likes building things. Technology is not a “distraction” from something realer, it is a part of what IS real.  The digital and the analog worlds aren’t binary. They are inextricably intertwined. Technology is as much a part of us now as it has always been for Tony Stark — corporeally and philosophically. And there is no going back. Texting is not a distraction from the “realness” of the telephone — itself, a completely unnatural, manufactured, awkward medium that we all learned to take communication through for granted. Electricity is not a distraction from the “realness” of candle-light. Driving a car is not a distraction from the “realness” of riding a horse.

Which brings us back to this impeccably clever Audi commercial.

Featuring the two actors who’ve played Spock, himself an embodiment of hybridity, in a battle that starts out via iPad chess, doubles down over the phone, escalates by car, and culminates with the finishing touch of a Vulcan nerve pinch. It makes the depiction of the permeable membrane between the digital and the analog, of the seamless absorption of a “fictional” personality into the “real” self, and of unapologetic techno-joy look effortlessly cool.

This is the Audi ad Iron Man USED TO BE!

In 2010, I wrote:

The first 21st century superhero is a hedonistic, narcissistic, even nihilistic, adrenaline junkie, billionaire entrepreneur do-gooder. If Peter Parker’s life lesson is that “with great power comes great responsibility,” Tony Stark’s is that with great power comes a shit-ton of fun.

You can’t get any more Gen Y than that.

Three Mays later, Tony Stark has changed. He’s entirely forgotten how to have fun. He doesn’t even get joy out of building things anymore — hell, he was having a better time when he had a terminal illness, back when Favreau was at the helm. Under Black’s direction, Stark doesn’t seem excited about anything. He’s on Xanax for his panic attacks — I’m assuming, since there isn’t a single thing that fills him with anywhere near the kind of fascination Leonard Nimoy and Zachary Quinto express as they watch a self-driving Audi pull out of a golf club driveway. As Black sees it, to embrace the technological innovation that is in Tony Stark’s blood — both figuratively and literally — to create something that isn’t a demon, to want to build things because he likes building things, all of that would somehow make Stark less human.

But as the mixed-race Spock always knew — what makes us human can’t be measured in degrees.

Oh well.

[Image: “Thanks for keeping the seat warm, Gen X. We’ll take it from here. Sincerely, Gen Y”]

 

After all….. It’s only logical.

    



Lose My Number

[Image: banana phone]

One of the 20-somethings I’ve been working with over the past few weeks sent me a summary put together by another Millennial colleague of Sherry Turkle’s book, Alone Together: Why We Expect More from Technology and Less from Each Other. An MIT technology and society specialist, Turkle argues that our relentless modern connectivity leads to a new disconnect. As technology ramps up, our emotional lives ramp down. Hell in a handbasket, yadda yadda.

The Millennial reviewer called it “a fascinating and highly depressing affirmation of many of the problems that face my sister and I’s generation.” The summary includes such gems as:

– Young adults often cite the awkwardness of attempting to steer a phone call to its conclusion as one of the top reasons they avoid phone calls. This skill is not one that teens seem open to learning, as a quick “got to go, bye” is easier than determining a natural way to break off a conversation.
 
– Teens avoid making phone calls for fear that they reveal too much. Texting is easier and “more efficient” than a human voice. Things that occur in “realtime” take far too much time. Even adults and academics admit that they would rather leave a voicemail or send an email than talk face-to-face.
 
– We used to live in an era when teenagers would race to the ringing phone after suppertime; now teens are content with receiving fewer calls in favor of texts or Facebook messages.

But here’s what I want to know:

Why is telephone-based behavior the benchmark of communication proficiency?

The telephone isn’t part of our biology. It is, itself, a completely unnatural, manufactured, utterly awkward medium that we all learned to take communication through for granted.  

It could be lamented that “kids these days” don’t know the etiquette of visiting card communication either — which is what people had to resort to before the phone:

Visiting cards became an indispensable tool of etiquette, with sophisticated rules governing their use. The essential convention was that one person would not expect to see another person in her own home (unless invited or introduced) without first leaving his visiting card for the person at her home. Upon leaving the card, he would not expect to be admitted at first, but might receive a card at his own home in response. This would serve as a signal that a personal visit and meeting at home would not be unwelcome. On the other hand, if no card were forthcoming, or if a card were sent in an envelope, a personal visit was thereby discouraged. As an adoption from French and English etiquette, visiting cards became common amongst the aristocracy of Europe, and also in the United States. The whole procedure depended upon there being servants to open the door and receive the cards and it was, therefore, confined to the social classes which employed servants. If a card was left with a turned corner it indicated that the card had been left in person rather than by a servant.

I mean, oy. You know?

The summary goes on:

For young adults electronic media “levels the playing field” between outgoing people and shy people, removing the barrier of awkward conversations and body language. Texts allow a two-minute pause to become normal and acceptable while the recipient thinks of an adequate response. The same is not possible in face-to-face conversations. Screens offer a place to reflect, retype and edit.

The assumption here being that two minutes is NOT acceptable? For the thousands of years when all we had was the paper operating system, it could take two months, or TWO YEARS to get a reply. The drama of the entire oeuvre of Jane Austen and the Brontë sisters hinges on telecommunications not having been invented, just as much as the drama of Ferris Bueller’s Day Off hinges on cell phones not having been invented.

So why are we lamenting the telephone and all its associated UI becoming irrelevant and obsolete? (Current unlistened-to voicemails count: 49. You?) It was never meant to last forever. The telephone is like whale blubber. Brian Eno knows what I’m talking about:

“I think records were just a little bubble through time and those who made a living from them for a while were lucky. I always knew it would run out sooner or later. It couldn’t last, and now it’s running out. I don’t particularly care that it is and like the way things are going. The record age was just a blip. It was a bit like if you had a source of whale blubber in the 1840s and it could be used as fuel. Before gas came along, if you traded in whale blubber, you were the richest man on Earth. Then gas came along and you’d be stuck with your whale blubber. Sorry mate – history’s moving along. Recorded music equals whale blubber. Eventually, something else will replace it.”

Music existed before the plastic disc, and it will continue to exist after the MP3. Communication existed before the telephone, and it will continue to exist after the text message.

It just takes a frame of reference broader than what one generation takes for granted — or finds foreign — to see it.

    



The Next 21st Century Superhero Will Be a Chick

A musician friend of mine was once seeing the best friend of a famous heiress and he told me this story: “I had been dating her for a month and one night she invited me out to go meet her whole crew for the first time. I was SUPER nervous. Meeting the group of friends of someone you’re dating for the first time can be nerve-racking anyway, but especially if they are like…. that. I drove there and I was standing outside like, ‘OK… I need to get my shit straight and go in there and own this place.’ All of a sudden it hit me: ‘Channel your inner Tony Stark!'” It worked, he said, “Game over.”

Hearing this story, I wondered, who was my inner spirit superheroine? What clever badass would I conjure for existential ammo in a situation like this? I started searching my mental pop culture database for an acceptable candidate and this is when I realized I could barely think of a single one. The only two vaguely applicable options coming to mind were both from a decade ago: Buffy foremost, and, more hazily, Trinity. But Buffy’s final episode had aired, and Trinity had devolved from enigma to boring love interest saved by her boyfriend at the end of the Matrix trilogy, both back in 2003. As far as contemporary, mainstream, pop culture was concerned, there was a giant void.

I turned to the Internet for help, and found a list of the 100 Greatest Female Characters, compiled by Total Film. While not exactly rigorous in its methodology (fully 6% of the list’s alleged 100 greatest female characters are not actually human; 3 — Audrey 2 from Little Shop of Horrors, Lady from Lady and the Tramp, and Dory from Finding Nemo — aren’t even humanoid), the audit is, at the very least… directional. Narrowing the list down to just those heroines who’ve graced the big screen within the past 10 years (minus the non-human entries) the chronological order looks like this:

Among these 15 possible spirit superheroine candidates there are 6 victims of sexual abuse, 3 dealing with some form of depression, 4 who haven’t hit puberty, 2 addicts — including one vampire — and, most notably, a full third who would sooner slaughter a party than charm it. New York Times film critic Manohla Dargis observed this trend last year, writing:

It’s no longer enough to be a mean girl, to destroy the enemy with sneers and gossip: you now have to be a murderous one. That, at any rate, seems to be what movies like Hanna, Sucker Punch, Super, Let Me In, Kick-Ass and those flicks with that inked Swedish psycho-chick seem to be saying. One of the first of these tiny terrors was played by the 12-year-old Natalie Portman in Luc Besson’s neo-exploitation flick The Professional (1994). Her character, a cigarette-smoking, wife-beater-wearing Lolita, schooled by a hit man, was a pint-size version of the waif turned assassin in Mr. Besson’s Femme Nikita (1990), which spawned various imitators. Mr. Besson likes little ladies with big weapons. As does Quentin Tarantino and more than a few Japanese directors, including Kinji Fukasaku, whose 2000 freakout, Battle Royale, provided the giggling schoolgirl who fights Uma Thurman’s warrior in Kill Bill Vol. 1. Mr. Tarantino and his celebrated love of the ladies of exploitation has something to do with what’s happening on screens. Yet something else is going on…. The question is why are so many violent girls and women running through movies now.

That question is particularly pointed since this genre is not exactly blockbuster material. Hanna was only slightly profitable. Sucker Punch flopped, as did Haywire and the Besson-produced Colombiana; both Kick-Ass and Let Me In were “gore-athons that movieplexers don’t want to see,” and, in spite of all its hype, the American remake of The Girl With The Dragon Tattoo was a “huge box office disappointment.” And that’s all just in the past two years.

In an April 2011 New Yorker article titled “Funny Like A Guy, Anna Faris and Hollywood’s Women Problem,” Tad Friend wrote:

Female-driven comedies such as Juno, Mean Girls, The House Bunny, Julie & Julia, Something’s Gotta Give, It’s Complicated, and Easy A have all done well at the box office. So why haven’t more of them been made? “Studio executives think these movies’ success is a one-off every time,” Nancy Meyers, who wrote and directed Something’s Gotta Give and It’s Complicated, observes. “They’ll say, ‘One of the big reasons that worked was because Jack was in it,’ or ‘We hadn’t had a comedy for older women in forever.’”

Amy Pascal, who as Sony’s cochairman put four of the above films into production, points out, “You’re talking about a dozen or so female-driven comedies that got made over a dozen years, a period when hundreds of male-driven comedies got made. And every one of those female-driven comedies was written or directed or produced by a woman.” Studio executives believe that male moviegoers would rather prep for a colonoscopy than experience a woman’s point of view. “Let’s be honest,” one top studio executive said. “The decision to make movies is mostly made by men, and if men don’t have to make movies about women, they won’t.”

Except, it seems, if those women happen to be traumatized, ultra-violent vigilantes of some sort. Perhaps these movies keep getting made because their failure is seen as a one-off every time, too.

“Men just don’t understand the nuance of female dynamics,” Friend quotes an anonymous, prominent producer as saying. Although the conversation is about comedy (why men can’t relate to Renee Zellweger in Bridget Jones, for example), it could explain why all these vengeful heroines seem to inevitably wind up defective. This violent femmes sub-genre — which expands the traditional Rape/Revenge archetype to also encompass psychologically violated prepubescents — by default demands female protagonists. But since their creators don’t understand how to make them, they stick to what they know. Consider that the title role in Salt was originally named Edwin, and intended for Tom Cruise before she became Evelyn and went to Angelina Jolie. The emotionally stunted, socially inept, tech savant protagonists of David Fincher’s two latest films — male in The Social Network, female in The Girl With The Dragon Tattoo — are equally interchangeable. From Hanna to Hit Girl, all the way back to Matilda in The Professional, it’s always been a father, or father figure, who’s trained them. A woman, this narrative suggests, would have nothing to offer in raising a powerful daughter. When a film needs a Violent Femme, the solution has become to simply write a man, and then cast a girl. (Failing that, just mix up a cocktail of disorders — Asperger’s, attachment disorder, PTSD; a splash of Stockholm Syndrome — where a character needs to be.) No understanding of female dynamics required.

“What if the person you expect to be the predator is not who you expect it to be? What if it’s the other person,” asks producer David W. Higgins on the DVD featurette for his 2005 film, Hard Candy, about a 14-year-old girl, played by Ellen Page, who blithely brutalizes a child molester. Whereas for 20th century heroines like Princess Leia (#5 on Total Film’s 100 Greatest Female Characters), Sarah Connor (#3), or Ellen Ripley (#1 — of course), not to mention their brethren, overcoming trauma is what made them become heroes, for this new crop, trauma is what excuses them from seeming like villains in their own right. We love to see the underdog triumph, but do we really want to watch a victim become the predator, and a predator become the hero? The ongoing failures of films fetishizing this scenario suggest we’re just not that into this cognitive dissonance.

So much for movies no one wants to see, but what about those every girl has seen? On the one hand there’s Twilight, whose Bella Swan is a dishrag of a damsel in distress so useless her massive popularity is a disturbing cultural atavism. On the other, there’s the Harry Potter series, whose Hermione Granger (#7) might be “The Heroine Women Have Been Waiting For,” according to Laura Hibbard in the Huffington Post. “The early books were full of her eagerly answering question after question in class, much to the annoyance of the other characters. In the later books, that unapologetic intelligence very obviously saves Harry Potter’s life on more than one occasion. Essentially, without Hermione, Harry wouldn’t have been ‘the boy who lived.’” Meanwhile, here’s how Total Film describes Leia: “Royalty turned revolutionary, a capital-L Lady with a laser gun in her hand. Cool, even before you know she also has Jedi blood.”

And that is the one, simple, yet infinitely complex element that is consistently missing across the entire spectrum of stiff, 21st century downers: Cool. “Of all the comic books we published at Marvel,” said Stan Lee, the creator of Iron Man, Spider-Man, the Hulk, the X-Men, and more, “we got more fan mail for Iron Man from women than any other title.” Cool is the platonic ideal Tony Stark represents. It’s what makes him such an effective spirit superhero for the ordeal of a party. But while Stark may be special, he’s not an anomaly. From James Bond to Tyler Durden, male characters Bogart the cool. And it’s not because they’re somehow uniquely suited for it (see: the femme fatale). It’s because their contemporary female counterparts are consistently forced to be lame.

“You have to defeat her at the beginning,” Tad Friend quotes a successful female screenwriter describing her technique. “It’s a conscious thing I do — abuse and break her, strip her of her dignity, and then she gets to live out our fantasies and have fun. It’s as simple as making the girl cry fifteen minutes into the movie.” That could just as easily describe Bridesmaids as The Girl With The Dragon Tattoo. Which is totally fucked, first of all. And secondly, it’s boring. You’d think there’d be more narrative to go around — though I suppose I did just see the once female-driven Carrie, and The Craft remade as an all-male superhero origin flick called Chronicle. Perhaps we really have reached Peak Plot. In which case, now would really be the time to be R&Ding some alternatives.

“I love to take reality and change one little aspect of it, and see how reality then shifts,” said director Jon Favreau. “That was what was fun about Iron Man, you [change] one little thing, and how does that affect the real world?” Favreau’s experiment has yielded a superhero archetype that reflects a slew of Millennial mores, from the intimacy of his relationship with his gadgets, to his eschewal of a secret identity in favor of that uniquely post-digital virtue of radical transparency, to his narcissism. “If Peter Parker’s life lesson is that ‘with great power comes great responsibility,’” I wrote in a post titled Why Iron Man is the First 21st Century Superhero, “Tony Stark’s is that with great power comes a shit-ton of fun. Unlike the prior century’s superhero, this new version saves the world not out of any overwhelming sense of obligation or indentured servitude to duty, but because he can do what he wants, when he wants, because he wants to. Being Iron Man isn’t a burden, it’s an epic thrill-ride.” Breaking with the established conventions of the genre to create a uniquely modern superhero has made Iron Man a success, to the tune of a billion-dollar box office between the two movies, and launched Marvel Studios and the ensuing Avengers franchises in its wake. But there’s one 21st century shift Tony Stark will never be able to embody. And it’s kind of a big one.

From The Atlantic Magazine:

Man has been the dominant sex since, well, the dawn of mankind. But for the first time in human history, that is changing—and with shocking speed.

In the wreckage of the Great Recession, three-quarters of the 8 million jobs lost were lost by men. The worst-hit industries were overwhelmingly male and deeply identified with macho: construction, manufacturing, high finance. Some of these jobs will come back, but the overall pattern of dislocation is neither temporary nor random. The recession merely revealed—and accelerated—a profound economic shift that has been going on for at least 30 years, and in some respects even longer.

According to the Bureau of Labor Statistics, women now hold 51.4 percent of managerial and professional jobs—up from 26.1 percent in 1980. About a third of America’s physicians are now women, as are 45 percent of associates in law firms—and both those percentages are rising fast. A white-collar economy values raw intellectual horsepower, which men and women have in equal amounts. It also requires communication skills and social intelligence, areas in which women, according to many studies, have a slight edge. Perhaps most important—for better or worse—it increasingly requires formal education credentials, which women are more prone to acquire, particularly early in adulthood.

To see the future—of the workforce, the economy, and the culture—you need to spend some time at America’s colleges and professional schools, where a quiet revolution is under way. Women now earn 60 percent of master’s degrees, about half of all law and medical degrees, and 42 percent of all M.B.A.s. Most important, women earn almost 60 percent of all bachelor’s degrees—the minimum requirement, in most cases, for an affluent life. In a stark reversal since the 1970s, men are now more likely than women to hold only a high-school diploma.

American parents are beginning to choose to have girls over boys. As they imagine the pride of watching a child grow and develop and succeed as an adult, it is more often a girl that they see in their mind’s eye.

Yes, the U.S. still has a wage gap, one that can be convincingly explained—at least in part—by discrimination. Yes, women still do most of the child care. And yes, the upper reaches of society are still dominated by men. But given the power of the forces pushing at the economy, this setup feels like the last gasp of a dying age rather than the permanent establishment. It may be happening slowly and unevenly, but it’s unmistakably happening: in the long view, the modern economy is becoming a place where women hold the cards.

That view makes even comedian (and father of two daughters) Louis C.K.’s pronouncement in a recent Fast Company article that “The next Steve Jobs will be a chick” not unimaginable. And when she is, who will be her inner superheroine? Any of the girls brandishing medieval weaponry headed, like crusaders, for movie theaters this year?

Considering the cruel, dystopian premise of The Hunger Games, Katniss will likely get to have about as much fun as an overachiever prepping for the SATs. And while Kristen Stewart as persecuted maiden turned, apparently, warrior in Snow White and the Huntsman (whose producer previously suited up Alice for battle in Wonderland) couldn’t possibly be more joyless and blank than as Bella (….right??), my money’s on Brave’s Merida to win in the flat-out cool department, here:

Either way, while Tony Stark is an archetype boys grow into, the above are all manifestations of one girls grow out of, and when they do, they will expect their own spirit superheroine to aspire to. Someone who doesn’t have to be brutalized to be a badass, or a predator to be a hero. Someone clever and charming and cool as fuck, whom you’d just as soon want to party with as have saving the world; who’s faced the dark forces that don’t understand her and threaten to break her and strip her of her dignity, and, like the century of superheroes before her, has overcome. The next 21st century superhero will be a chick. The girls coming for the 21st century won’t be satisfied with anything less.

    



It’s The End Of The World As We Know It…. And I Feel Fine

According to the Mayan calendar — as translated by new-age hippies I used to know, and depicted by Roland Emmerich — the year 2012 is alleged to herald the apocalypse. Perhaps this collective unconscious sense of mass destruction is what’s driving the popularity of turn-of-the-millennium musings about the end of the world. In June 2008, Adbusters’ cover story was, literally, titled, “Hipster: The Dead End of Western Civilization.” Three and a half years later, Vanity Fair’s first issue of 2012 asks, “You Say You Want a Devolution? From Fashion to Housewares, Are We in a Decades-Long Design Rut?” While these two publications could arguably not be further apart on the target audience spectrum, they’re singing the same doomsday tune. As Kurt Andersen writes in the Vanity Fair piece, “The past is a foreign country, but the recent past—the 00s, the 90s, even a lot of the 80s—looks almost identical to the present.” The last line of the article concludes, “I worry some days, this is the way that Western civilization declines, not with a bang but with a long, nostalgic whimper.” But has cultural evolution really come to a grinding halt in the 21st century, or are we simply looking in all the old places, not realizing it’s moved on?

In Adbusters, Douglas Haddow sets up the alleged apocalypse like so:

Ever since the Allies bombed the Axis into submission, Western civilization has had a succession of counter-culture movements that have energetically challenged the status quo. Each successive decade of the post-war era has seen it smash social standards, riot and fight to revolutionize every aspect of music, art, government and civil society. But after punk was plasticized and hip hop lost its impetus for social change, all of the formerly dominant streams of “counter-culture” have merged together. Now, one mutating, trans-Atlantic melting pot of styles, tastes and behavior has come to define the generally indefinable idea of the ‘Hipster.’

Echoing that sentiment in Vanity Fair, Andersen writes:

Think about it. Picture it. Rewind any other 20-year chunk of 20th-century time. There’s no chance you would mistake a photograph or movie of Americans or an American city from 1972—giant sideburns, collars, and bell-bottoms, leisure suits and cigarettes, AMC Javelins and Matadors and Gremlins alongside Dodge Demons, Swingers, Plymouth Dusters, and Scamps—with images from 1992. Time-travel back another 20 years, before rock ’n’ roll and the Pill and Vietnam, when both sexes wore hats and cars were big and bulbous with late-moderne fenders and fins—again, unmistakably different, 1952 from 1972. You can keep doing it and see that the characteristic surfaces and sounds of each historical moment are absolutely distinct from those of 20 years earlier or later: the clothes, the hair, the cars, the advertising—all of it. It’s even true of the 19th century: practically no respectable American man wore a beard before the 1850s, for instance, but beards were almost obligatory in the 1870s, and then disappeared again by 1900.

Writing about the Adbusters piece in 2008, I pointed to a central flaw in the premise: the emergence of what Chris Anderson, in his 2006 book of the same name, calls, The Long Tail. Digital technology, Anderson writes, has ushered in “An evolution from an ‘Or’ era of hits or niches (mainstream culture vs. subcultures) to an ‘AND’ era.” In this new, rebalanced equation, “Mass culture will not fall, it will simply get less mass. And niche culture will get less obscure.” What Adbusters saw as the end of Western civilization was actually the end of mass culture; a transition to a confederacy of niches. So, if mass culture, as the construct we, and Adbusters, had known it to be was over, what was there to be “counter” to anymore? (While, more recently, Occupy Wall Street has thrown its hat into the ring, it’s not so much anti-mass culture as it is pro-redefining the concept: the 99%, through the movement’s message — let alone mathematics — is not the counterculture. It IS the culture.)

Unlike Haddow, Andersen doesn’t blame the purported cultural stagnation on any one group of perpetrators. Rather, the “decades-long design rut” has descended upon us all, he suggests, like an aesthetic recession, the result of some unregulated force originating in the 1960′s and depreciating steadily until it simply collapsed, and none of us noticed until it was too late. “Look at people on the street and in malls,” Andersen writes, “Jeans and sneakers remain the standard uniform for all ages, as they were in 2002, 1992, and 1982. Since 1992, as the technological miracles and wonders have propagated and the political economy has transformed, the world has become radically and profoundly new.” And yet, “during these same 20 years, the appearance of the world (computers, TVs, telephones, and music players aside) has changed hardly at all, less than it did during any 20-year period for at least a century. This is the First Great Paradox of Contemporary Cultural History.”

Or is it?

In a 2003 New York Times article titled “The Guts of a New Machine,” the design prophet of the 21st century revealed his philosophy on the subject: “People think it’s this veneer,” said the late Steve Jobs, “that the designers are handed this box and told, ‘Make it look good!’ That’s not what we think design is. It’s not just what it looks like and feels like. Design is how it works.”

Think about it. Picture it. Those big, bulbous cars Andersen describes, with their late-moderne fenders and fins, so unmistakably different from 1952 to 1972, just how different were they, really, in how they worked? Not that much. In the 20th century you could pop open the hood of a car and, with some modicum of mechanical knowledge, know what it was you were looking at. Now, the guy in the wifebeater working on the Camaro in his garage is an anachronism. You’ll never see that guy leaning over the guts of a post-Transformers, 2012 Camaro. Let alone a hybrid or an electric vehicle. “With rare exceptions,” Andersen argues, “cars from the early 90s (and even the late 80s) don’t seem dated.” And yet, there’s no way anyone would confuse a Chevy Volt with anything GM was making 10 years ago, or a Toyota Prius with what was on the road in the early 90s, or voice recognition capability, completely common in a 2012 model, with anything but a science fiction conceit in a show starring David Hasselhoff in the ’80s. While it’s debatable that exterior automotive styling hasn’t changed in the past 30 years (remember the Tercel? The station wagon? The Hummer? A time before the SUV?), it’s indisputable that the way a 2012 automobile works has changed.

For the majority of human history, the style shifts between eras were pretty much entirely cosmetic. From the Greeks to the Romans, from the Elizabethans to the Victorians, what fluctuated most was the exterior. It wasn’t until the pace of technological innovation began to accelerate in the 20th century that design became concerned with what lay beneath the surface. In the 1930s, industrial designer Raymond Loewy forged a new design concept, called Streamlining. One of the first and most widespread design concepts to draw its rationale from technology, Streamlining was characterized by stripping Art Deco, its flamboyant 1920s predecessor, of all nonessential ornamentation in favor of a smooth, pure-line concept of motion and speed. Under the austerity of the Depression era, the superficial flourishes of Art Deco became fraudulent, falsely modern. Loewy’s vision of a modern world was minimalist, frictionless, developed from aerodynamics and other scientific concepts. By the 1960s, Loewy’s streamlined designs for thousands of consumer goods — everything from toasters and refrigerators to automobiles and spacecraft — had radically changed the look of American life.

What began in the 20th century as a design concept has, in the 21st, become THE design concept. Technological innovation — the impact of which Andersen breezes past — has become the driving force behind aesthetic innovation. Design is how it works. Aerodynamics has paved the way for modern considerations like efficiency, performance, usability, sustainability, and more. But unlike fluctuating trends in men’s facial hair or collar size, technology moves in one direction. It does not vacillate, it iterates, improving on what came before, building incrementally. The biggest aesthetic distinctions, therefore, have become increasingly smaller.

Consider, for example, this optical illusion:

What, exactly, is the difference between the two things above? Rewind twenty years, and it’s already unlikely most people would have been able to really tell a difference in any meaningful way. Go back even further in time, and these things become pretty much identical to everyone. Yet we, the inhabitants of 2012, would never, ever, mistake one for the other. The most minute, subtlest of details are huge universes of difference to us now. We have become obsessives, no longer just consumers but connoisseurs, fanatics with post-industrial palates altered by exposure to a higher resolution. And it’s not just about circuitry. In fashion, too, significant signifiers have become more subtle.

The New York Magazine writeup for Blue in Green, a Soho-based men’s lifestyle store, reads:

Fifteen hard-to-find, premium brands of jeans—most based in Japan, a country known for its quality denim—line the walls. Prices range from the low three figures all the way up to four figures for a pair by Kyuten, embedded with ground pearl and strips of rare vintage kimono. Warehouse’s Duckdigger jeans are sandblasted in Japan with grains shipped from Nevada and finished with mismatched vintage hardware and twenties-style suspender buttons. Most jeans are raw, so clients can produce their own fade, and the few that are pre-distressed are never airbrushed; free hemming is available in-house on a rare Union Special chain-stitcher from an original Levi’s factory.

(Sidenote: it’s not just jeans. Wool — probably not the next textile in line on the cool spectrum after denim — is catching up. Esquire apparently thinks wool is so interesting to their readers they created an illustrated slide show about different variations of sheep.)

“Our massively scaled-up new style industry naturally seeks stability and predictability,” Andersen argues. “Rapid and radical shifts in taste make it more expensive to do business and can even threaten the existence of an enterprise.” But in fact, when it comes to fashion, quite the opposite is true. To keep us buying new clothes — and we do: according to the Daily Mail, women have four times as many clothes in their wardrobe today as they did in 1980, buying, and discarding, half their body weight in clothes per year — styles have to keep changing. Rapid and radical shifts in taste are the foundation of the fashion business, a phenomenon the industry exploits, not fears. And the churn rate has only accelerated. “Fast Fashion,” a term coined in the mid-2000s, means more frequent replacement of cheaper clothes that become outdated more quickly.

“The modern sensibility has been defined by brief stylistic shelf lives,” Andersen writes, “Our minds trained to register the recent past as old-fashioned.” But what has truly become old-fashioned in the 21st century, whether we’ve realized it or not, is the idea of a style being able to define a decade at all. It’s as old-fashioned as a TV with a radial dial or retail limitations dictated by brick and mortar. As Andersen himself writes, “For the first time, anyone anywhere with any arcane cultural taste can now indulge it easily and fully online, clicking themselves deep into whatever curious little niche (punk bossa nova, Nigerian noir cinema, pre-war Hummel figurines) they wish.” And primarily what we wish for, as Andersen sees it, is what’s come before. “Now that we have instant universal access to every old image and recorded sound, the future has arrived and it’s all about dreaming of the past.” To be fair, there is a deep nostalgic undercurrent to our pop culture, but to look at the decentralization of cultural distribution and see only “a cover version of something we’ve seen or heard before” is to miss the bigger picture of our present, and our future. The long tail has dismantled the kind of aesthetic uniformity that could have once come to represent a decade’s singular style. In a confederacy of niches there is no longer a media source mass enough to define and disseminate a unified look or sound.

As with technology, cultural evolution in the 21st century is iterative. Incremental changes, particularly ones that originate beneath the surface, may not be as obvious through the flickering Kodak carousel frames of decades, but they are no less profound. In his 2003 book, The Rise of the Creative Class: And How It’s Transforming Work, Leisure, Community, and Everyday Life, Richard Florida opens with a similar time travel scenario to Andersen’s:

Here’s a thought experiment. Take a typical man on the street from the year 1900 and drop him into the 1950s. Then take someone from the 1950s and move him Austin Powers-style into the present day. Who would experience the greater change?

On the basis of big, obvious technological changes alone, surely the 1900-to-1950s traveler would experience the greater shift, while the other might easily conclude that we’d spent the second half of the twentieth century doing little more than tweaking the great waves of the first half.

But the longer they stayed in their new homes, the more each time-traveler would become aware of subtler dimensions of change. Once the glare of technology had dimmed, each would begin to notice their respective society’s changed norms and values, and the ways in which everyday people live and work. And here the tables would be turned. In terms of adjusting to the social structures and the rhythms and patterns of daily life, our second time-traveler would be much more disoriented.

Someone from the early 1900s would find the social world of the 1950s remarkably similar to his own. If he worked in a factory, he might find much the same divisions of labor, the same hierarchical systems of control. If he worked in an office, he would be immersed in the same bureaucracy, the same climb up the corporate ladder. He would come to work at 8 or 9 each morning and leave promptly at 5, his life neatly segmented into compartments of home and work. He would wear a suit and tie. Most of his business associates would be white and male. Their values and office politics would hardly have changed. He would seldom see women in the work-place, except as secretaries, and almost never interact professionally with someone of another race. He would marry young, have children quickly thereafter, stay married to the same person and probably work for the same company for the rest of his life. He would join the clubs and civic groups befitting his socioeconomic class, observe the same social distinctions, and fully expect his children to do likewise. The tempo of his life would be structured by the values and norms of organizations. He would find himself living the life of the “company man” so aptly chronicled by writers from Sinclair Lewis and John Kenneth Galbraith to William Whyte and C.Wright Mills.

Our second time-traveler, however, would be quite unnerved by the dizzying social and cultural changes that had accumulated between the 1950s and today. At work he would find a new dress code, a new schedule, and new rules. He would see office workers dressed like folks relaxing on the weekend, in jeans and open-necked shirts, and be shocked to learn they occupy positions of authority. People at the office would seemingly come and go as they pleased. The younger ones might sport bizarre piercings and tattoos. Women and even nonwhites would be managers. Individuality and self-expression would be valued over conformity to organizational norms — and yet these people would seem strangely puritanical to this time-traveler. His ethnic jokes would fall embarrassingly flat. His smoking would get him banished to the parking lot, and his two-martini lunches would raise genuine concern. Attitudes and expressions he had never thought about would cause repeated offense. He would continually suffer the painful feeling of not knowing how to behave.

Out on the street, this time-traveler would see different ethnic groups in greater numbers than he ever could have imagined — Asian-, Indian-, and Latin-Americans and others — all mingling in ways he found strange and perhaps inappropriate. There would be mixed-race couples, and same-sex couples carrying the upbeat-sounding moniker “gay.” While some of these people would be acting in familiar ways — a woman shopping while pushing a stroller, an office worker having lunch at a counter — others, such as grown men clad in form-fitting gear whizzing by on high-tech bicycles, or women on strange new roller skates with their torsos covered only by “brassieres” — would appear to be engaged in alien activities.

People would seem to be always working and yet never working when they were supposed to. They would strike him as lazy and yet obsessed with exercise. They would seem career-conscious yet fickle — doesn’t anybody stay with the company more than three years? — and caring yet antisocial: What happened to the ladies’ clubs, Moose Lodges and bowling leagues? While the physical surroundings would be relatively familiar, the feel of the place would be bewilderingly different.

Thus, although the first time-traveler had to adjust to some drastic technological changes, it is the second who experiences the deeper, more pervasive transformation. It is the second who has been thrust into a time when lifestyles and worldviews are most assuredly changing — a time when the old order has broken down, when flux and uncertainty themselves seem to be part of the everyday norm.

It’s the end of the world as we’ve known it. And I feel fine.

    


