The story of the biggest transformation of our time has a marketing problem: no one knows it’s happening.
Many important events happened in 2016. Some were deafening, trumpeting the seemingly inexplicable ascent of backwards-facing forces. But one event of great historical significance went largely unremarked upon.
In 2016, solar power became the cheapest form of new electricity on the planet and, for the first time in history, accounted for more newly installed electric capacity than any other energy source.
Amid the sepia haze oozing from the past’s rusting, orange pipeline, humanity was placing a serious bet on a new kind of future. And you didn’t even know about it.
That’s a problem.
Powering Disruption
It was a bit like if you had a source of whale blubber in the 1840s and it could be used as fuel. Before gas came along, if you traded in whale blubber, you were the richest man on Earth. Then gas came along and you’d be stuck with your whale blubber. Sorry mate — history’s moving along.
— Brian Eno
“The beginning of the end for fossil fuels,” according to Bloomberg, occurred in 2013. “The world is now adding more capacity for renewable power each year than coal, natural gas, and oil combined. And there’s no going back…. The shift will continue to accelerate, and by 2030 more than four times as much renewable capacity will be added.”
The International Energy Agency’s Executive Director, Fatih Birol, said, “We are witnessing a transformation of global power markets led by renewables.”
“While solar was bound to fall below wind eventually, given its steeper price declines, few predicted it would happen this soon,” notes Bloomberg.
Later campaigns for the iPhone didn’t even show the product at all:
The product became the conduit to the experience. And the experience that solar has to sell is Future.
– Claim the Narrative of Future –
Two decades ago — back when it was still possible to talk about the future as anything but dystopia — a series of ads painted a striking vision of how that future was going to unfold. “Have you ever borrowed a book from thousands of miles away,” asked the ad voice. “Crossed the country without stopping to ask for directions? Or watched the movie you wanted to, the minute you wanted to?”
“You will,” said the voice, “and the company that will bring it to you: AT&T.”
Today I use a device to do basically 90% of what those ads predicted. (OK, I’ve never sent a fax from the beach, or tucked a baby in from a phone booth, but you can’t get the Future 100% right.) All of these things are so obvious and mundane now that we barely even remember — some of us never knew — there was a time before. But, indeed, there was a point when this fantastical world was the future, and the future still seemed like a fantastical world.
There are no grand visions for the future now, no scenarios for humanity that don’t fill us with dread. A dying oligarchy tells us dissolution is freedom; regression is hope. It has disfigured our understanding of what’s happening in our world. The result is a gaping void in our collective vision when we look ahead. Seventeen years into our new century there is a desperate hunger for a bright vision for the future, and at the moment arguably no one outside the world of clean energy has a legitimate claim to one. In the end, it’s not about utility bills or net metering laws or even solar panels for that matter. It’s about a vision of a Future worth demanding. Solar has the opportunity to be the voice of that vision for decades to come with a simple, cohesive, culture-focused messaging strategy.
Don’t blame it on the algorithm — assuming you’re designing experiences for “happy, upbeat, good-life users” might make you a terrible person.
My friend is going through a divorce. Like nearly 5 million other Americans. And recently Facebook greeted her with this careless user experience:
When this UX intrusion happened to her, it reminded me of a similar, psychological violation I’d read about four months earlier. That post, by Eric Meyer, had begun:
I didn’t go looking for grief this afternoon, but it found me anyway, and I have designers and programmers to thank for it. In this case, the designers and programmers are somewhere at Facebook.
I know they’re probably pretty proud of the work that went into the “Year in Review” app they designed and developed, and deservedly so—a lot of people have used it to share the highlights of their years. Knowing what kind of year I’d had, though, I avoided making one of my own. I kept seeing them pop up in my feed, created by others, almost all of them with the default caption, “It’s been a great year! Thanks for being a part of it.” Which was, by itself, jarring enough, the idea that any year I was part of could be described as great.
Still, they were easy enough to pass over, and I did. Until today, when I got this in my feed, exhorting me to create one of my own. “Eric, here’s what your year looked like!”
Yes, my year looked like that. True enough. My year looked like the now-absent face of my little girl. It was still unkind to remind me so forcefully.
I remember first reading this post the day it was published, Christmas Eve 2014. When I went to look it up after my friend’s own violation by a Facebook app module, I was surprised to (re)discover that it had been titled, generously, “Inadvertent algorithmic cruelty”:
And I know, of course, that this is not a deliberate assault. This inadvertent algorithmic cruelty is the result of code that works in the overwhelming majority of cases, reminding people of the awesomeness of their years, showing them selfies at a party or whale spouts from sailing boats or the marina outside their vacation house.
But for those of us who lived through the death of loved ones, or spent extended time in the hospital, or were hit by divorce or losing a job or any one of a hundred crises, we might not want another look at this past year.
To show me Rebecca’s face and say “Here’s what your year looked like!” is jarring. It feels wrong, and coming from an actual person, it would be wrong. Coming from code, it’s just unfortunate.
But of course, it did come from an actual person. “[The app] was awesome for a lot of people,” the product manager for Facebook’s Year in Review app, Jonathan Gheller, later told The Washington Post. Like all the digital experiences with, and within, which we all increasingly live our lives, an actual person — in fact a whole team of people — was responsible for concepting, designing, building, testing, and iterating this experience. No doubt, the responsibility for the rollout of this particular app featured prominently in a number of Facebook employees’ job performance reviews. From start to finish, this experience was crafted by people (not code). Calling its end result “inadvertent algorithmic cruelty” is like describing a drunk driving accident as “inadvertent gasoline cruelty.” For sure, it could have been avoided with an empty gas tank, but is that really the most accurate way to ascribe accountability in this situation? (Don’t blame it on the algohol).
“In creating this Year in Review app, there wasn’t enough thought given to cases like mine, or anyone who had a bad year,” Meyer wrote. “If I could fix one thing about our industry, just one thing, it would be that: to increase awareness of and consideration for the failure modes, the edge cases, the worst-case scenarios.”
If I could fix one thing about our industry, it would be to destroy the idea that these scenarios are edge cases.
Last year in the US, 2.6 million people died, leaving behind untold numbers of Facebook users who mourn the absence of their loved ones.
These are not “edge cases.” These are not “worst case scenarios.” These are all people who use Facebook. And that’s not even counting your run-of-the-mill disappointments, broken hearts, and inevitable wrongs and slights and meannesses that are, basically, life.
“The design [of the Year in Review app] is for the ideal user, the happy, upbeat, good-life user,” Meyer wrote. But if you are a product manager or UX designer creating experiences that will affect hundreds of millions of people and you are only designing for an “ideal user”… at best that’s just lazy, and at worst — it’s creating LITERAL suffering.
The world, obviously, is a manifestly unjust place: people are always meeting fates they didn’t deserve, or not receiving rewards they did deserve for hard work or virtuous behaviour. Yet several decades of research have established that our need to believe otherwise runs deep.
If we didn’t all believe that [things happen for a reason] to some degree, life would be an intolerably chaotic and terrifying nightmare in which effort and payback were utterly unrelated, and there was no point planning for the future, saving money for retirement or doing anything else in hope of eventual reward. We’d go mad.
Yet, ironically, this desire to believe that things happen for a reason leads to the kinds of positions that help entrench injustice instead of reducing it.
Much in the same way that the “just world” cognitive bias can actually lead us to make crueler decisions, designing product features with the “happy, upbeat, good-life” ideal user bias can lead us to create crueler user experiences.
“To shield ourselves psychologically from the terrifying thought that the world is full of innocent people suffering,” Burkeman writes, we, as humans, “endorse policies more likely to make that suffering worse.” And by denying the full spectrum of the realities of people’s lives, awesome and tragic, we, as experience designers, do the same. Except we’re the ones with the power to actually do something about it.
“Just to pick two obvious fixes,” Meyer wrote at the end of his post, “First, don’t pre-fill a picture until you’re sure the user actually wants to see pictures from their year. And second, instead of pushing the app at people, maybe ask them if they’d like to try a preview—just a simple yes or no. If they say no, ask if they want to be asked again later, or never again. And then, of course, honor their choices. This is… designing for crisis, or maybe a better term is empathetic design.”
Or how about just, you know, design.
In the wake of Meyer’s post, the product manager for Facebook’s Year in Review app told The Washington Post: “We can do better.”
But four months later, Facebook’s photo collage assault on my friend suggests perhaps they don’t really think they can.
“Faced with evidence of injustice, we’ll certainly try to alleviate it if we can,” Burkeman wrote, “But, if we feel powerless to make things right, we’ll do the next best thing, psychologically speaking: we’ll convince ourselves that the world isn’t so unjust after all.”
We used to understand that brands were run by humans. But now, a decade into social media, we are beginning to experience brands as human. And our technology is getting ever better at executing the simulation.
.
.
In the future, it will have begun, like you knew it would, during the 2015 Super Bowl.
“The Coca-Cola Company spent a ridiculous sum of money during America’s No. 1 National Pastime on the evening’s most cynical advertising blitz: the ‘MakeItHappy’ campaign,” Sam Biddle wrote on Gawker. “The premise was simple and also dumb: the internet is a mean place, and Coca-Cola was going to try to make the internet a nice place. It was attempting to be the ‘I’d like to buy the world a Coke’ for our modern digital idiot age: The company created a Twitter bot to take ‘mean’ tweets and reformat their words into a cartoon rabbit playing the drums, or a cat. With this, the toxic web would be steam-cleaned, or something. So, in the hopes of making a minor point about the automated vacuum at the heart of Coke’s cynical anti-meanness push, we built a bot to tweet [Hitler’s autobiography] Mein Kampf through Coke’s automated positivity generator:
It has turned out fortunate for me to-day that destiny appointed Braunau-on-the-Inn to be my birthplace.
For that little town is situated just on the frontier between those two States the reunion of which seems,
at least to us of the younger generation,
a task to which we should devote our lives and in the pursuit of which every possible means should be employed.
German-Austria must be restored to the great German Motherland. And not indeed on any grounds of economic calculation whatsoever. No, no.
There’s more of these, but you get the idea.
“We assumed that the response to our little stunt would be largely apathetic,” Biddle writes:
Not only was our point obvious and slight, but in tweeting hateful sentiment at @CocaCola, we were doing exactly what the marketing campaign had asked us to do.
And then Coca-Cola, slow-witted and cowardly like all global megabrands, killed its bot, and suddenly countless people across the internet were aghast. We hadn’t thrown a tiny wrench into the slickly oiled workings of a $3 billion marketing operation, we’d embarrassed someone’s pal. Someone’s pal who was just trying to do some good online! We’d brought negativity into the positive sphere of Coke-swilling. For something totally devoid of humanity, Coca-Cola—a brutish company that condones slave labor and anti-union kidnapping and murder and whose CEO netted $30 million in 2012—was able to muster levels of smarmy cybertears not seen since Kony’s reign of terror with its Twitter stunt.
Coca-Cola's effort to clean up negativity on social media becomes the victim of a Gawker hate crime. http://t.co/Q5Ay9kQTjL
Actual flesh-and-blood humans felt bad for a corporation, in public. Real people poured the kind of empathy and anguish that’s historically been reserved for other real people upon a multinational conglomerate worth billions of dollars that sells liquid fructose poison and has a history of literally enslaving impoverished workers.
Human beings—including journalists—flocked to Coke’s side. The Verge sobbed that we’d “ruined” Coke’s “courage and optimism,” AdWeek called our work a “debacle,” and Coke itself feigned dismay: “It’s unfortunate Gawker made it a mission to break the system, and the content they used to do it is appalling.” “Have a Coke and a—frown,” bleated some dunce at USA Today. Coca Cola’s rough approximation of humanity had made an enormous impression, and its drinkers and friends took a stand. No more, they tweet-chanted in unison, no more unkind words for this maker of sweet liquid toxins.
“On Facebook, the button to ‘like’ a brand (like a brand!) is functionally identical to ‘liking’ another person,” Biddle writes. And more than 34 million people have “liked” Pepsi. “More than a million people have made a similar life decision with Mr. Clean, more than 300,000 people are Facebook friends with Jimmy Dean Sausages and Kleenex.” What has happened in the “friendification of corporate brands” is that advertising messages now co-exist in the same newsfeed as “mom and bae and Brian from hockey practice.” News from brands and people we care about has blended into the same stream. And at this point, not only are there a lot of people using social media who don’t really remember or relate to the time before this happened, there are a lot of brands using social media that are starting to forget, too.
Increasingly, the way brands (try to) sound is less and less like brands, and more and more… like just actual people.
.
“This was [the] year of galumphing attempts of consumer brands to curry favor with #millennials on their #social networks with #memes designed to go #viral,” Annie Lowrey wrote in December in New York Magazine. “A new, horrible-brilliant Twitter account distills the trend down to its essence. It is called @BrandsSayingBae. It is comprised of brands tweeting the word bae or other trending neologisms. And it is, as the Verge puts it, just what “we’ve needed in 2014.”
“You can almost hear the white-collar conversation leading to tweets like these if you listen closely enough,” Lowrey adds, pantomiming: “’Jones, the youths have adopted new phraseology again! This time it’s bae. Pronounced like the Chesapeake, spelled like babe with one letter missing!’ Sometimes, the results of such corporate-think are really funny. [For example] Denny’s stoner-Dada Twitter account.”
The best Coachella look is french toast remnants all over yr face while not appropriating any other cultures.
Why are brands doing this? Lowrey attempts to explain:
They [saw] lightning get captured in a bottle once, on the evening of February 3, 2013. The San Francisco 49ers and the Baltimore Ravens had just kicked off the second half of their Super Bowl matchup when a power outage hit the stadium. Fans went crazy on Twitter — had Beyoncé rocked the halftime show so hard that she blew a fuse? And a few canny companies capitalized on the mania, including Oreo:
It was perfect — funny, sweet, timely, on-brand, apropos. It went viral, with a suit at Oreo’s parent company declaring that the tweet “not only shows the power of real-time engagement, but also the sheer importance of understanding the overall media ecosystem.”
People retweeted it. They wrote about it. They talked about it. But I doubt that they purchased or consumed more cookies because of it. And I doubt that they thought more positively of the Oreo brand, either.
Spammers took to Tinder soon after the matchmaking app went mainstream in 2013, setting up automated accounts to message lonely bachelors with ads for porn and webcam strip shows, according to reports from security firm Symantec.
“It’s usually, ‘Hey, if you want to talk further, go to this link on this website, and you can see all my pictures there,’” Satnam Narang, a senior security response manager at Symantec who’s written about the phenomenon, told me.
But lately, many Tinder spammers’ approaches have grown subtler. They’ve migrated from lewd photos and explicit language to more plausible, girl-next-door-style pictures. And they’ve programmed their bots to try to mimic a normal conversation.
“Social media will always be an incongruous and gross place for brands to mingle, because a company does not have feelings. It will never love you,” Biddle writes.
But how far away is the point where… we can’t tell the difference?
“Spend some time to make your bot more personal,” Melendez quotes a user named cygon from a marketing forum where spammers trade tips for steering clear of Tinder’s spam detection systems and not raising users’ suspicions. “Your conversions will skyrocket. Once a guy gets feels a little emotionally involved he will go above and beyond to get a date. Remember—most your leads/conversions will be from beta guys who are desperate to get their dicks wet.”
But how long will it take before branded social media experiences are created by programs overseen by linguists and mathematicians and programmers writing AI code? How long before a major tech vendor sells in an artificial intelligence operating system to Coke?
How long until people are actually having relationships like the one depicted in the movie Her… with brands?
Anyway, back to getting approval for that social media editorial calendar.
“The kids are doing the normcore,” my friend Quang said, trying out the new phrase with a deliberate, old fart dialect.
Only a few moments earlier I had tossed off the word like common parlance.
“‘Normcore?’” he had repeated, making sure he’d heard correctly.
“Yeah,” I explained, “It’s exactly what you think it is. It’s us, now.”
A shockingly pleasant March afternoon had arrived in Boston that day, on the heels of a cold that had felt like osteoporosis. A decade in LA had turned me into a wimp. I had forgotten how I’d ever managed to live through this in my youth.
But that day in Boston, in 2014, hanging out with friends who had come up through the rave, circus, and goth subcultures, you could hardly tell where any of us had been. What we wore now was nondescript. Non-affiliated. Normal.
The week before, at a craft beer tasting party at an indie advertising agency in Silver Lake, a sculpture artist was remarking about recently looking through photos of style choices from the aughts. “What was I thinking?” she said in bewilderment. That evening she was wearing a black tank top, and, like, pants. Maybe three-quarter length? Or not? Maybe black jeans? Or not-jean pants? I couldn’t recall. Perhaps, I thought, this was just a symptom of getting older. There was some kind of sartorial giving-a-shit phase that we had all grown out of. But it turned out this, too, was a trend. Kids, too young to have grown out of anything, were dressing this way.
“By late 2013, it wasn’t uncommon to spot the Downtown chicks you’d expect to have closets full of Acne and Isabel Marant wearing nondescript half-zip pullovers and anonymous denim,” wrote Fiona Duncan, in a February New York Magazine article titled, “Normcore: Fashion for Those Who Realize They’re One in 7 Billion:”
I realized that, from behind, I could no longer tell if my fellow Soho pedestrians were art kids or middle-aged, middle-American tourists. Clad in stonewash jeans, fleece, and comfortable sneakers, both types looked like they might’ve just stepped off an R-train after shopping in Times Square. When I texted my friend Brad (an artist whose summer uniform consisted of Adidas barefoot trainers, mesh shorts and plain cotton tees) for his take on the latest urban camouflage, I got an immediate reply: “lol normcore.”
Normcore—it was funny, but it also effectively captured the self-aware, stylized blandness I’d been noticing. Brad’s source for the term was the trend forecasting collective (and fellow artists) K-Hole. They had been using it in a slightly different sense, not to describe a particular look but a general attitude: embracing sameness deliberately as a new way of being cool, rather than striving for “difference” or “authenticity.”
Oh my god, I thought reading this: this is me.
In Nation of Rebels: Why Counterculture Became Consumer Culture, published in 2004, cultural critics Joseph Heath and Andrew Potter examined the inherent contradiction in the idea that counterculture was an opposition to mass consumer culture. Not only were the two not opposed, Heath and Potter explained, they weren’t even separate. Alternative culture’s obsession with being different — expressing that difference through prescribed fashion products and subcultural artifacts — had, in fact, helped to create the very mass consumer society the counterculture believed itself to be the alternative to.
“To me, Nike’s famous swoosh logo had long been the mark of the manipulated,” wrote Rob Walker, author of 2008’s Buying In: The Secret Dialogue Between What We Buy And Who We Are, “a symbol for suckers who take its ‘Just Do It’ bullying at face value. It’s long been, in my view, a brand for followers. On the other hand, the Converse Chuck Taylor All Star had been a mainstay sneaker for me since I was a teenager back in the 1980s, and I stuck with it well into my thirties. Converse was the no-bullshit yin to Nike’s all-style-and-image yang. It’s what my outsider heroes from Joey Ramone to Kurt Cobain wore. So I found [Nike’s] buyout [of Converse] disheartening… but why, really, did I feel so strongly about a brand of sneaker — any brand of sneaker?”
In response to Buying In, I’d written, “Whether we’re choosing to wear Nikes, Converse, Timberlands, Doc Martens, or some obscure Japanese brand that doesn’t even exist in the US, we’re deliberately saying something about ourselves with the choice. And regardless of how “counter” to whatever culture we think we are, getting to express that differentiation about our selves requires buying something.”
But that was five years ago. A funny thing happened on the way to the mid-twenty-teens. The digital era ushered in an unprecedented flood of availability — of both information and products. This constant, ubiquitous access to everything — what Chris Anderson dubbed the “Long Tail” in his 2006 book of the same name — had changed the cultural equation. We had evolved, as Anderson predicted, “from an ‘Or’ era of hits or niches (mainstream culture vs. subcultures) to an ‘AND’ era.” With the widespread proliferation of internet access, mass culture got less mass, and niche culture got less obscure. We became what Anderson called a “massively parallel culture: millions of microcultures coexisting and interacting in a baffling array of ways.” On this new, flattened landscape, what was there to be counter to?
“Jeremy Lewis, the founder/editor of Garmento and a freelance stylist and fashion writer, calls normcore ‘one facet of a growing anti-fashion sentiment,’” Duncan writes in New York Magazine. “His personal style is (in the words of Andre Walker, a designer Lewis featured in the magazine’s last issue) ‘exhaustingly plain’—this winter, that’s meant a North Face fleece, khakis, and New Balances. Lewis says his ‘look of nothing’ is about absolving oneself from fashion.”
That is how normcore happened to me, too. When I quit the circus, leaving behind its sartorial regulations, I realized that difference wasn’t an expression of identity: it was a rat race.
“Fashion has become very overwhelming and popular,” Lewis explains in New York Magazine. “Right now a lot of people use fashion as a means to buy rather than discover an identity and they end up obscured and defeated. I’m getting cues from people like Steve Jobs and Jerry Seinfeld. It’s a very flat look, conspicuously unpretentious, maybe even endearingly awkward. It’s a lot of cliché style taboos, but it’s not the irony I love, it’s rather practical and no-nonsense, which to me, right now, seems sexy. I like the idea that one doesn’t need their clothes to make a statement.”
“Magazines, too,” Duncan writes, “have picked up the look:”
The enduring appeal of the Patagonia fleece [was] displayed on Patrik Ervell and Marc Jacobs’s runways. Edie Campbell slid into Birkenstocks (or the Céline version thereof) in Vogue Paris. Adidas trackies layered under Louis Vuitton cashmere in Self Service. A bucket hat and Nike slippers framed an Alexander McQueen coveralls in Twin. Smaller, younger magazines like London’s Hot and Cool and New York’s Sex, were interested in even more genuinely average ensembles, skipping high-low blends for the purity of head-to-toe normcore.
One of the first stylists I started bookmarking for her normcore looks was the London-based Alice Goddard. She was assembling this new mainstream minimalism in the magazine she co-founded, Hot and Cool, as early as 2011. For Goddard, the appeal of normal clothes was nothing new. One standout editorial from Hot and Cool no. 5 (Spring 2013) was composed entirely of screenshots of people from Google Maps’ Street View. Goddard had stumbled upon “this tiny town in America” on Maps and thought the plainly-dressed people there looked amazing. The editorial she designed was a parody of contemporary street style photography—“the main point of difference,” she says, “being that people who are photographed by street style photographers are generally people who have made a huge effort with their clothing, and the resulting images often feel a bit over fussed and over precious—the subject is completely aware of the outcome; whereas the people we were finding on Google Maps obviously had no idea they were being photographed, and yet their outfits were, to me, more interesting.”
New media has changed our relation to information, and, with it, fashion. Reverse Google Image Search and tools like Polyvore make discovering the source of any garment as simple as a few clicks. Online shopping—from eBay through the Outnet—makes each season available for resale almost as soon as it goes on sale. As Natasha Stagg, the Online Editor of V Magazine and a regular contributor at DIS (where she recently wrote a normcore-esque essay about the queer appropriation of mall favorite Abercrombie & Fitch), put it: “Everyone is a researcher and a statistician now, knowing accidentally the popularity of every image they are presented with, and what gets its own life as a trend or meme.” The cycles of fashion are so fast and so vast, it’s impossible to stay current; in fact, there is no one current.
Emily Segal of K-HOLE insists that normcore isn’t about one specific aesthetic. “It’s not about being simple or forfeiting individuality to become a bland, uniform mass,” she explains. Rather, it’s about welcoming the possibility of being recognizable, of looking like other people—and “seeing that as an opportunity for connection.”
K-HOLE describes normcore as a theory rather than a look; but in practice, the contemporary normcore styles I’ve seen have their clear aesthetic precedent in the nineties. The editorials in Hot and Cool look a lot like Corinne Day styling newcomer Kate Moss in Birkenstocks in 1990, or like Art Club 2000′s appropriation of madras from the Gap, like grunge-lite and Calvin Klein minimalism. But while (in their original incarnation) those styles reflected anxiety around “selling out,” today’s version is more ambivalent toward its market reality.
In a post-Hot Topic world, where Forever21 serves up fast fashion in processed flavors like Occupy:
and Burning Man:
we’re realizing that alternativeness, as a means for authentic self expression, is futile. “Normcore isn’t about rebelling against or giving into the status quo,” Duncan concludes, “It’s about letting go of the need to look distinctive.”
In our all-access, always connected, globalized world, obscurity is scarce. When everything is accessible, nothing is alternative.
“In the 21st century,” Rob Walker wrote back in 2008, not recognizing the quickly approaching end of counterculture, “we still grapple with the eternal dilemma of wanting to feel like individuals and to feel as though we’re a part of something bigger than ourselves. We all seek ways to resolve this fundamental tension of modern life.”
In 2014, normcore is one solution we’ve found to resolve it.
In the year since our app launched, our users have created over 5 million images. By now you’ve seen this mirrored selfie trend all over Instagram, not to mention throughout the greater popular culture.
But it’s the selfies — mirrored or otherwise — that have been on my mind a lot lately.
Selfies.
Right now, there are 50 million images on Instagram with the hashtag #selfie, and nearly 140 million tagged #me.
“Selfies,” Elizabeth Day reports in the Guardian, “have become a global phenomenon. Images tagged as #selfie began appearing on the photo-sharing website Flickr as early as 2004. But it was the introduction of smartphones – most crucially the iPhone 4, which came along in 2010 with a front-facing camera – that made the selfie go viral.”
A recent survey of more than 800 American teenagers by the Pew Research Center found that 91% posted photos of themselves online – up from 79% in 2006.
But the selfie isn’t just a self-portrait, it is a self-object.
“Again and again, you offer yourself up for public consumption,” Day writes. “Your image is retweeted and tagged and shared. Your screen fills with thumbs-up signs and heart-shaped emoticons. Soon, you repeat the whole process, trying out a different pose.”
“The selfie is about continuously rewriting yourself,” says Dr. Mariann Hardey, a lecturer in marketing at Durham University who specializes in digital social networks. “It’s an extension of our natural construction of self.”
But what is it we are constructing our selves into?
Porn.
Before we go any further, let’s get this out of the way: unless you are a teenager right now, you do not understand what it means to grow up in a world where porn and Facebook are equidistant — in case you don’t know, that proximity is one click away, and apart. If you’re curious to understand what, in fact, this experience is like — in teenagers’ own words — you should read Nancy Jo Sales’ recent Vanity Fair article, “Friends Without Benefits.” But not until after you’ve finished reading this one because I’ll be drawing on it quite a bit.
If you are, at this moment, older than your mid-20s, put away whatever analog-adolescence frame of reference you think lets you relate to 2013, because it is not a parallel to what is happening right now. What is happening, according to Gail Dines, the author of Pornland: How Porn Has Hijacked Our Sexuality, is “a massive social experiment.” Here are some results from that experiment so far:
93% of boys and 62% of girls have seen internet porn
83% of boys and 57% of girls have seen group sex online
18% of boys and 10% of girls have seen rape or sexual violence
But that was five iPhone versions ago at this point, so, you do the math.
“In the absence of credible, long-term research, we simply don’t know where the age of insta-porn is taking us,” writes Peggy Drexler on The Daily Beast — but that we are in it, and that it is pervasive, is undeniable.
“What does this do to teenagers?” Sales asks in Vanity Fair. “And to children? How does it affect boys’ attitudes toward girls? How does it affect girls’ self-esteem and feeling of well-being? And how is this affecting the way that children and teenagers are communicating on these new technologies?”
In the Guardian, Day describes one typical answer to that last question: “The pouting mouth, the pressed-together cleavage, the rumpled bedclothes in the background hinting at opportunity — a lot of female selfie aficionados take their visual vernacular directly from pornography (unwittingly or otherwise).”
“Because of porn culture,” says Dines, “women have internalised that image of themselves. They self-objectify.”
“The girls I interviewed,” says Sales, “even if they’re not doing it themselves, it’s in their faces: their friends posting really provocative pictures of themselves on Facebook and Instagram, sending nude pictures on Snapchat. Why are they doing this? Is this sexual liberation? Is it good for them? Girls know the issues, and yet some of them still can’t resist objectifying themselves, as they even talk about [themselves]. As the girl I call ‘Greta’ says, ‘more provocative equals more likes.’ To be popular, which is what high school is all about, you have to get ‘likes’ on your social-media pics.”
Flattening the hierarchies that separate trash from art, porn from erotica, and moral justice from exploitation by any means necessary, Spring Breakers… embraces and elaborates upon the prevalent suspicion that nobody lives on the stable side of reality any more.
“Pretend you’re in a videogame,” says one of the film’s female anti-heroines as they begin their spree of rampant self-abuse and crime. That’s what Miley Cyrus does, trying on new aspects of performance and sexual self-expression in her new persona. It’s also how the vulnerable models that Robin Thicke ogles [in the music video for his song, Blurred Lines] make it through the gauntlet that the video’s scene creates.
The childlike goofiness Katy Perry expressed with California Gurls in 2010, or the sweet hope of Carly Rae Jepsen’s smash of last year, Call Me Maybe, has intensified into something more unsettling. In this strange summer of too much heat, so many precariously excessive songs and videos now play on that line between healthy catharsis and chaos.
Summer.
The summer would get stranger still, punctuated in its final days by what may be the most controversial MTV Video Music Awards performance of all time: a duet by Cyrus and Thicke.
From its very first steps, Cyrus’s performance felt, unmistakably, like watching a GIF happen in real-time. The act was speaking the native tongue — stuck all the way out — of the digital age, its direct appeal to meme culture as blatant and aggressive as the display of sexuality. The source material and its inevitable meme-ification appeared to be happening simultaneously. The Internet was inherently integrated within the performance. It was no longer a “second” screen; it was the same damn screen. All the performances before it had been made for TV. This show changed that.
What I learned from the 2013 VMAs is that owning your sexuality is passé, but owning meme culture by exploiting your sexuality is now. Whatever you think of it, Cyrus’s performance was a deliberate reflection of where we are as a culture.
A burner had been blindly left on. Something invisible and pervasive had accumulated. Watching the VMAs, we saw a giant fireball explode in our faces.
We were unprepared.
This, ultimately, would be why everyone freaked out. Cyrus became a highly visible target for embodying this shift on a mainstream stage, and exploiting it to increase her fame and drive her record to #1, but all she was doing was deftly surfing the cultural current.
By the end of August, she was exposing us to the new normal.
Fall.
“In news that’s not at all surprising, yet another tech event was disrupted by a sexist joke,” Lauren Orsini wrote on ReadWriteWeb, within days of the VMAs:
“Titstare” was the first presentation of the TechCrunch Disrupt 2013 hackathon. Created by Australians Jethro Batts and David Boulton, the joke app is based on the “science” of how sneaking a peek at cleavage helps men live healthier lives.
The opening salvo cast an ugly shadow over the event, reminding attendees that, just like at PyCon and other technology conferences, “brogrammer” culture is still the norm.
Perhaps most disconcerting is the fact that Batts and Boulton presented immediately before Adria Richards, a programmer who rose to the national spotlight after she witnessed sexist jokes at PyCon 2013. Her gall to disapprove of the offensive jokes earned her death threats.
In the wake of the VMA article, I kept tweeting over and over, “Everything is changing….but into whatttttt?” By the early days of Fall, the culture had undeniably shifted. I kept seeing an escalating, atavistic gender warfare. Why is this happening, I thought.
That week I was approached to speak at a women’s startup conference and felt, reflexively, offended. The idea that there should be segregated events seemed insulting and damaging — to everyone. I began to feel self-conscious that I had an app startup with a male business partner. I texted him, “What is happening???” and “Can’t we all just get along?” We laughed, but we began to feel like an anomaly.
Pretend you’re in a videogame.
“When we listeners find ourselves taking pleasure in these familiar but enticingly refreshed acts of transgression,” Powers writes, “echoing the Michael Jackson-style whoops that Pharrell makes in Blurred Lines, or nodding along to the stoned, melancholy chorus of Cyrus’s arrestingly sad party anthem, We Can’t Stop, are we compromising ourselves? Or is it okay, because after all, it’s just pretend?”
And when the technology that I, you, and everyone we know use on a daily basis gets developed to the sound of this same, blurry, pop culture soundtrack (figuratively or literally), what happens then? How are the creators of objectifying technology supposed to know it isn’t cool — if all of our technology is used for objectification?
In Vanity Fair, Sales talks to Jill Bauer and Ronna Gradus, co-directors of Sexy Baby, a documentary about girls and women in the age of porn. “We saw these girls embracing this idea that ‘If I want to be like a porn star, it’s so liberating,’” Gradus said. “We were skeptical. But it was such a broad concept. We asked, ‘What is this shift in our sexual attitudes, and how do we define this?’ I guess the common thread we saw that is creating this is technology. Technology being so available made every girl or woman capable of being a porn star, or thinking they’re a porn star. They’re objectifying themselves. The thinking is: ‘If I’m in control of it, then I’m not objectified.’”
In October, Sinead O’Connor — whose video for Nothing Compares 2 U inspired Cyrus’s look in her video for Wrecking Ball — wrote an “open letter” to Cyrus, beautifully capturing, “in the spirit of motherliness and with love,” the generational disconnect at the heart of the cultural shift. “The message you keep sending is that it’s somehow cool to be prostituted… it’s so not cool Miley. Don’t let the music business make a prostitute out of you,” O’Connor wrote, not getting it.
The familiar, analog, 20th-century relationship between objectification and commercialization has eroded. In its place, a new, post-Empire dynamic has arrived, built on a natively digital experience that O’Connor, and an entire population still able to remember and relate to a world before the internet and mobile technology, can’t wrap their heads around.
“The blurred messages Thicke, Cyrus and others are now sending fit a time when people think of themselves as products, more than ever before,” Powers writes.
In the attention economy, self-exploitation is self-empowerment. We are all objects. We are all products. We are all selfies.
And we can’t stop.
“Social media is destroying our lives,” Sales quotes a girl in Vanity Fair.
“So why don’t you go off it?” Sales asks.
“Because then we would have no life.”
The ubiquity of digital cameras, and of the social media platforms that share their instant output, has not only turned the idea that objectification is violation into an anachronism; self-objectification is now, as Powers writes, “part of today’s ritual of romance.” Nearly one in three teenagers is sending nude photos, after all.
Like the girls in Sales’ article, who tell her that “presenting themselves in this way is making them anxious and depressed,” but continue to do it anyway, we do not self-objectify because we’re in control. We self-objectify because it is the norm.
We self-objectify to rationalize, to placebo-ize that we had control in the first place.
We Can’t Stop.
“Both young women and young men are seriously unhappy with the way things are,” says Donna Freitas, a former professor at Hofstra and Boston Universities, who studies hook-up culture on college campuses in her new book, The End of Sex (which, Sales suggests, “might as well be called The End of Love”).
Sales writes:
Much has been written about hook-up culture lately, notably Hanna Rosin’s The End of Men (2012) and a July New York Times article, “Sex on Campus: She Can Play That Game Too,” both of which attributed the trend to feminism and ambitious young women’s desire not to be tied down by relationships.
But Freitas’s research, conducted over a year on seven college campuses, tells a different story.
She describes the sex life of the average college kid as “Mad Men sex, boring and ambivalent. Sex is something you’re not to care about. They drink to drown out what is really going on with them. The reason for hooking up is less about pleasure and fun than performance and gossip—it’s being able to update [on social media] about it. Social media is fostering a very unthinking and unfeeling culture.”
College kids, both male and female, also routinely rate each other’s sexual performance on social media, often derisively, causing anxiety for everyone.
And researchers are now seeing an increase in erectile dysfunction among college-age men—related, Freitas believes, to their performance anxiety from watching pornography: “The mainstreaming of porn is tremendously affecting what’s expected of them.”
Porn has killed our imaginations. We sit and try to fantasize. We shut our eyes tight and think, “Wait, what did I use to masturbate about before porn? What image is going to turn me on right now?” But your brain gets tired and your genitalia isn’t used to working this hard, so you open your reliable go-to porno and get off in two minutes. Later, you have trouble maintaining an erection during actual sex because your partner doesn’t look like a blow-up doll from the Valley.
Our sex lives are having less and less to do with actual sex. Intimacy has morphed into something entirely more narcissistic. What used to be about making each other feel good and connecting is now about validation.
When sex does happen, when we finally make it through the endless hoops of text messaging, planning a date and actually sticking to it and you discover that you like this person (or could like them for an evening), it feels like an old faded photograph that’s been sitting in a shoebox at the bottom of your closet. “This orgasm feels like a vintage ball gown! Is this how people used to do it in the olden days?!” It’s terrifying!
In 2013, our phones are getting to have all the fun. They’re getting laid constantly while we lie naked in the dark, rubbing our skin, trying pathetically to get turned on by the feel of our own touch. We scroll through our camera rolls and see a buffet of anonymous naked photos we’ve collected over the last few months for us to jack off to. Somehow, this has become enough for us. Getting off has become like fast food: accessible, cheap, and most likely going to make us feel like shit after.
We are actively participating in the things that keep us from what we want. Feel good now, feel bad forever later. Stomachache stomachache, junk food junk food.
In a pervasively mediated culture, where porn primes our perception of ourselves and others, and our technology reduces us to selfies, objectification is inevitable.
And the trouble is — it doesn’t matter how you treat objects…. It’s not like they’re people.
What people want today is “to hurt one another” and “get back at the people that hurt them,” Hunter Moore, the founder of IsAnyoneUp.com, told Rolling Stone last October.
In a September article on The Verge titled “The End of Kindness,” Greg Sandoval writes:
And Moore ought to know. He’s one of the pioneers of revenge porn, the practice of posting nude photos to the web of a former lover in an attempt to embarrass, defame, and terrorize.
While minorities and homosexuals are often targeted, experts say no group is more abused online than women. Danielle Citron, a law professor at the University of Maryland, lays out some of the numbers in her upcoming book, Hatred 3.0. The US National Violence Against Women Survey reports 60% of cyberstalking victims are women. A group called Working to Halt Online Abuse studied 3,787 cases of cyberharassment, and found that 72.5% were female, 22.5% were male and 5% unknown. A study of Internet Relay Chat showed male users receive only four abusive or threatening messages for every 100 received by women.
Moore has sold his site, but scores of wannabes are cropping up. A check of these sites shows that victims are almost always women. At Myex.com, more than 1,000 nude photos have been posted, and new pictures are added nearly every day. Each post typically includes the name of the person photographed, their age, and the city they live in. The posts come with titles like, “Manipulative Bitch,” “Cheater,” “Has genital warts,” “Drunk,” “Meth User,” “This girl slept with so many other guys,” and “Filthy Pig.”
The Verge contacted several women found on some of these sites, including Myex.com. While all of them declined to be interviewed, they did acknowledge that the photos were posted without permission by an ex-boyfriend or lover. One woman said that she was trying to get the pictures pulled down and had successfully removed them from other sites because she was not yet 18 years old when they were taken (if her claim is accurate it would make the snapshots child pornography). She pleaded that we not use her name and asked that we not contact her again.
If the woman was upset and afraid, she has a right to be, says Holly Jacobs, 30, who has started a nonprofit organization dedicated to ending revenge porn and supporting its victims. Jacobs knows firsthand that these sites are killers of reputations and relationships. Three years ago, Jacobs was studying for her PhD in industrial organizational psychology and working as a consultant at a university when a former boyfriend began posting nude photos of her online. The embarrassment and terror were just the beginning. Jacobs’ ex sent copies of the photos to her boss and suggested she was sexually preying on students. Jacobs’ employers, fearing bad press, asked her to prove she didn’t upload the photos herself. She finally felt compelled to change her name (Jacobs is the new name).
In July The Washington Post published a story about men who post phony ads to make it appear as if their ex-wives or girlfriends are soliciting sex. One man, Michael Johnson II of Hyattsville, Maryland, published an ad titled “Rape Me and My Daughters” and included his ex-wife’s home address. More than 50 men showed up to the victim’s house. One man tried to break in and another tried to undress her daughter. Johnson was sentenced to 85 years in prison. His victim was physically unharmed but these ads can be lethal. In December 2009, a Wyoming woman was raped with a knife sharpener in her home after an ex-boyfriend assumed her identity and posted a Craigslist ad that read, “Need an aggressive man with no concern or regard for women.” Her ex and the man who raped her are both serving long prison sentences.
Winter.
While people, trapped as we are by our digital avatars, are increasingly being reduced to objects, our technology seems to be benefitting from a transference of humanity.
Spike Jonze’s new movie, Her, due out in December, is being called “science fiction,” but the “future” depicted in the trailer looks essentially indistinguishable from the reality we all find ourselves in today. In it, a melancholy man, played by Joaquin Phoenix, and a Turing test-approved virtual assistant program, voiced by Scarlett Johansson, fall in love.
“Unlike the science fiction of yesteryear,” writes David Plumb on Salon.com, “Her is not about the evolving relationship between humans and artificial intelligence. Instead, Samantha appears to be essentially a human being trapped in a computer. Her thus appears to be about programming the perfect woman who fits in your pocket, manages your life, doesn’t have a body (and thus free will), and has an off switch.”