Who The iPad Ads Are For

Ever since Apple started putting a lowercase i in front of its products, its advertisements have been known for basically two things — articulating a visceral, transcendent grace inherent within the Mac product experience:

…and making fun of people who don’t already use Macs:

Which is why the iPad ads — with their exaggeratedly simplistic gestures, their induced first-person perspective (the people in the photos always seem to be seated in some awkward position in order to give us, the viewers, the sense of being the “user” in the image), and above all, their blatantly basic depiction of the product experience — just don’t quite fit with the image of what an Apple ad is supposed to be.

If these ads seem like a departure, it’s because they are.

In the ’60s, Everett Rogers broke down the process by which trends, products, and ideas proliferate through culture. There are five basic types of adopter personas in his diffusion of innovations theory:

Innovators are the first to adopt an innovation. They are, by default, risk-takers, since being on the front lines means they are likely to adopt a technology or an idea which may ultimately fail. Early Adopters are the second-fastest category to adopt an innovation. They’re more discreet in their adoption choices than Innovators, but have the highest degree of opinion leadership of all the adopter categories. Individuals in the Early Majority adopt an innovation after having let the Innovators and Early Adopters do the product-testing for them. The Late Majority approaches an innovation with a high degree of skepticism, and only after the majority of society has already adopted it. And finally, Laggards are the last to get on board with a new innovation. These individuals typically have an aversion to change-agents, tend to be advanced in age, and tend to be focused on “traditions.”
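A quick aside for the quantitatively inclined: Rogers carved these five groups out of a normal distribution of adoption times, segmented by standard deviations from the mean. Here is a minimal Python sketch of that segmentation; the cutoffs, and the percentages they produce, are Rogers’ standard ones rather than figures from this post:

```python
from math import erf, sqrt

def phi(x: float) -> float:
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# Rogers segments adopters by how many standard deviations their adoption
# time falls from the mean. These cutoffs (and hence the percentages) are
# his standard ones, not figures taken from the post above.
segments = [
    ("Innovators",     float("-inf"), -2.0),
    ("Early Adopters", -2.0,          -1.0),
    ("Early Majority", -1.0,           0.0),
    ("Late Majority",   0.0,           1.0),
    ("Laggards",        1.0,  float("inf")),
]

for name, lo, hi in segments:
    print(f"{name:>14}: {phi(hi) - phi(lo):.1%}")
```

Run it and the familiar proportions fall out: roughly 2.5% Innovators, 13.5% Early Adopters, 34% each for the two Majorities, and 16% Laggards. The first two slices together make up only about 16% of the market, which is why the conventional marketing wisdom treats them as a beachhead rather than the prize.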

The thinking in marketing, especially when launching a new product, tends to be about aiming at the early adopters over on the left side of the adoption bell-curve. Once the early adopters get into it, the thinking goes, whatever it is will trickle down through the rest of the early and late majority, who make up the vast bulk of the market share. A few years back I wrote about how Nintendo was going for a “late adopter strategy” with its Wii console. At the time (and perhaps still now) the Wii was outselling Sony’s PlayStation and Microsoft’s Xbox combined. The Wii’s uniquely simple controller and intuitive gameplay enabled it to appeal to a much broader audience than the more complicated, hardcore-gaming consoles.

From a Time Magazine article on the eve of the Wii release in 2006:

“The one topic we’ve considered and debated at Nintendo for a very long time is, Why do people who don’t play video games not play them?” [Nintendo president Satoru] Iwata has been asking himself, and his employees, that question for the past five years. And what Iwata has noticed is something that most gamers have long ago forgotten: to nongamers, video games are really hard. Like hard as in homework.

The key to the Wii’s success is that it made gaming simple, broadly accessible, and inherently intuitive. Later that year, AdAge wrote that the Wii’s popularity was “part of a growing phenomenon that’s overhauling the video-gaming industry…. Video gaming is beginning to transcend the solitary boy-in-the-basement stereotype with a new generation of gamers including women, older people and younger children.”

Anyone who has bought, or even used, an iPhone at some point during the three years since the first iteration was released already understands what the iPad is all about without any help from an ad. Indeed, Apple has done such a good job of making ads aimed at early adopters for the past decade that it no longer needs to. An ad is not going to make a difference in whether someone on the left-hand side of Apple’s adopter bell-curve buys an iPad or not. Instead, these ads are targeted straight at the people on the downhill slope.

New results from a Pew Research Center survey tracking 2,252 adults 18 and older show that use of social network sites among older adults has risen dramatically over the past two years:

While overall social networking use by online American adults has grown from 35% in 2008 to 61% in 2010, the increase is even more dramatic among older adults. The rate of online social networking approximately quadrupled among Older Boomers (9% to 43%) and the GI Generation (4% to 16%).

Of course, Millennials still have a healthy lead among all age groups in social network use, with 83% of online adults ages 18-33 engaging in social networking, but grandma and grandpa are catching up. Particularly grandma. Last year, the fastest-growing demographic on Facebook was women over 55.

Unlike the Apple ads we’ve become accustomed to in the 2000s, these iPad ads are no longer touting the product’s “higher resolution experience” to digital natives. That is, they are not emphasizing the ephemeral or smugly superior subtleties that are inaccessible to anyone who does not intuitively “get it.” These ads are, instead, paring the experience down to be as unintimidating as possible. Not only is the iPad a completely new way to experience personal computing, the ads say to you, the viewer; using it is as effortless as if you were, yourself, a digital native.

    









The Peril of Perfect Evil


Have you noticed the slate of WWII resistance movies lately? There’s last year’s Valkyrie, starring Tom Cruise, which depicts the real-life plot devised by a cadre of senior German officers to assassinate Hitler. Earlier this year saw the release of Defiance, also based on a true story, with Daniel Craig and Liev Schreiber portraying the Bielski brothers, who formed a Jewish resistance otryad in Nazi-occupied Eastern Europe and helped over 1,000 people survive in the Belorussian forest. Not to be outdone by American productions, Europe is getting in on the action with the Danish film Flame & Citron, which came out a few weeks ago: an ultra-stylized spy noir based, once again, on the true story of two resistance fighters, nicknamed, as one might expect, Flame and Citron, who became heroes of the underground through their violent dealings. And finally (or perhaps not?) there’s Inglourious Basterds, due out this Friday, starring Brad Pitt and directed by Quentin Tarantino.


The majority of onscreen depictions of WWII warfare have focused specifically on military combat (Saving Private Ryan, Band of Brothers, Flags of Our Fathers, etc.). What’s striking about this current slew of films, however, is that the focus has shifted to stories of renegade insurgence. One could postulate all sorts of hypotheses about why that shift might have gained traction recently, but regardless, there really were a lot of resistance movements going on during WWII, and there are a lot of incredible stories of heroism and courage to be told. Yet I find the particular cultural territory Inglourious Basterds is invading rather dangerous and troubling.

Of all the recent resistance-themed movies, Basterds is the only one that does not appear to be based, as far as I can tell, on any sort of actual events, and from what I can gather from the movie’s trailer, the whole premise basically just seems like an excuse for Tarantino, the Auteur of American Violence Porn, to do a 1940s period flick — although, allegedly, there is some sort of plot, also. After the trailer’s initial 50 seconds of banter, when we’re watching a Nazi getting his head glibly bashed in with that All-American weapon, the baseball bat, like it’s a Weimar Goodfellas, it seems the huffing, gleaming black trend train, trailing more than a half century’s smoke behind it, might have arrived at its ultimate cinematic destination. Though not before — and this is perhaps reflective of what Tarantino does with his downtime — video games got there first.

Wolfenstein, Call of Duty: World at War, Medal of Honor, Mortyr, ÜberSoldier, Commandos: Strike Force — these are all first-person shooter games (and doubtless there are others I’m missing) wherein players are, as Pitt puts it in the trailer, “in the killin’ Nazi bidness.” In fact, the film is so seamlessly aligned with this game genre that much of the promotion for it is happening through online gamer destinations like IGN and GGL. Even the movie’s iconography might as well be for an Inglourious Basterds video game (which, I’m sort of surprised, it isn’t):

[Inglourious Basterds promotional art, styled like video game key art]

Not that I’m against the killin’ Nazi bidness; it’s just that I find the progressive reduction of the Third Reich to a cartoon rather tasteless — also, a little bit queasily horrifying.

65 years after the defeat of Nazi Germany, our concept of Nazism seems to be losing its reality. More and more, we are turning a cancerous pathology of human behavior into a fantasy of evil. Their atrocious actuality wiped away by time, Nazis have become almost too perfectly evil to have been true. They now serve as a broad cultural shorthand for the ultimate, rottenest badness, or otherwise for just whatever we happen to find personally distasteful. (See: Bill O’Reilly expounding in all seriousness on why the Huffington Post is a bunch of Nazis, for an example of just how utterly degenerated the cultural understanding of the term’s meaning has become.) Nazis have evolved into mythical, timeless, uncomplaining boogeymen, always on call to play the supreme Hollywood villains or video game baddies now that the idea of Soviets as arch-enemies is an anachronism and Arab villains still feel way too real to be fantasy.

But there is a hugely real and present danger in treating Nazis like the occupied-Europe equivalent of ninja adversaries in revenge/action flicks, or like human-looking-yet-conveniently-not-human alien monsters. It is at our own peril that we think of the absolute, explicit worst that humanity is capable of doing as if it were a supernatural, science-fiction evil, safely beyond human achievement. It’s very much not. Nazis are not Hollywood creations. They were REAL. And the fact of their existence is STILL real. And there is no fantasy for any of us in forgetting.


    











taste the difference

…And I can make you wanna buy a product
Movers shakers and producers
Me and my friends understand the future
– The Flobots: “Handlebars”

I’ve been trying to get through Matt Mason’s The Pirate’s Dilemma for a while. It’s an easy read, but between digging up mind-blowing historical discoveries from the cultural strata (did you know that a nun at the orphanage where David Mancuso was raised is pretty much responsible for modern dance culture? Dude, I know, it’s insane) and so many unconscious ironies and philosophical inconsistencies that I’m tempted to write a post, after I finally do finish it, called “The Pirate’s Contradiction”… it’s difficult to read too much of it at a time.

There’s one very interesting section in it, however, that I think can be dealt with outside of the rest of the book. In keeping with the recent theme of musings on contemporary adulthood, here’s an excerpt from a section called “Parents Just Do Understand”:

The hip-hop generation was the first to grow up in a brand-saturated world. Before hip-hop, as Will Smith and DJ Jazzy Jeff once postulated, it was a given that parents just didn’t understand. But now parents who are the age of Smith have the same albums on their iPods as their kids, and the same reissued retro sneakers on their feet. This has serious ramifications for youth culture, commerce, and everything else.

…What does it mean now to “grow up” in a world where we all want a Nintendo Wii for Christmas?

BAM!

And while Mason presents the caveat that younger generations now find their outlet for rebellion through media and technology, that last bastion where parents and kids are still reliably segregated, in general his conclusion is that “The generation gap has become obsolete.”

But I wonder if perhaps it’s not quite that simple. Maybe the generation gap hasn’t gotten filled in and paved over but has, in fact, gone deeper below the surface. From above, the divisions that once defined a generational cohort and distinguished it from its predecessors appear to have eroded, but underneath, a different separation is very much intact.

A 2006 Rolling Stone article called “Teens Save Classic Rock” talks about how the genre of Hendrix, Floyd and Zeppelin is experiencing a resurgence among a whole new generation of kids. “We’re now seeing an audience that goes from sixteen to sixty,” said Allman Brothers manager Bert Holman.

The internet made this possible. iTunes means the music we can listen to is no longer determined solely by the offerings of an ever more homogenized radio, or limited to the finite selection of a physical record store. And while we can now instantly hear a broader range of music across genres and eras than was ever possible before, the question remains, as Rolling Stone points out: “Why would kids born in the Nineties turn to timeworn guitar anthems?”

One answer:

For all of the vibrant rock recorded in the past ten years — from pop punk to neogarage to dance rock — no new, dominant sound has emerged since grunge in the early Nineties. “I can’t think of a record recently that blew people’s minds,” says Jeff Peretz, a Manhattan producer and guitar teacher. “And there aren’t really any guitar heroes around anymore. Kids don’t come in and say, ‘I want to play like John Mayer.’”

“There is such a drought that kids are going back and rediscovering the Who and Sabbath,” says Paul Green, who runs the Paul Green School of Rock Music.

But I don’t think it’s a “drought” so much as a glut. Popular, contemporary music is so omnipresent and obvious there’s barely room for kids to even figure out if they like it. By default, it’s what they’re expected to be listening to. The hideaway of classic rock, where no doubt no one expected to find them, is a relished escape. The musical equivalent of disobeying your mom when she tells you, “Just stay where I can see you.”

According to Rolling Stone, “9% of kids ages 12-17 listened to classic-rock radio in any given week in 2005 — marking a small but significant increase during the past three years, according to the radio-ratings company Arbitron.” It’s not just a sign of teen taste, it’s a sign of teen distinction. If you’re listening to classic rock in high school, you’re doing something the other 91% of kids at your school aren’t into, or onto, yet. That’s some indisputable early-adopter appeal right there.

Which is perhaps the complete opposite of what appeals to adults about listening to the music of their own youth.

In a 2004 USA Today article about how Kids Are Listening To Their Parents’ Music, Jeremy Hammond, head of artist development at Sanctuary Records, noted: “There’s not so much peer pressure to identify with a particular genre or even generation of music. Back then, you had to choose a lifestyle associated with a genre. In England, you were in a gang of rockers or skinheads or Mods. Potheads wanted psychedelic music. Those boundaries are gone. [Now] It’s much more about defining one’s own unique tastes.”

The way a modern identity is constructed has changed. It’s no longer something as simple as how old we are that determines what is or is not “for us” to buy, or listen to, or dress like. The mechanics of taste are the next marketing frontier.

“I think the rebellion is that kids aren’t rebelling,” says Rana Reeves, creative director of Shine Communications, in The Pirate’s Dilemma. “They aren’t rebelling against the marketers; they want to be marketers.”

    


