The Future of the Sun

The story of the biggest transformation of our time has a marketing problem: no one knows it’s happening.

There were many important events that happened in 2016. Some were deafening, trumpeting the seemingly inexplicable ascent of backwards-facing forces. But one event of great historical significance went largely unremarked upon.

In 2016 solar power became the cheapest form of new electricity on the planet and for the first time in history installed more new electric capacity than any other energy source.

Amid the sepia haze oozing from the past’s rusting, orange pipeline, humanity was placing a serious bet on a new kind of future. And you didn’t even know about it.

That’s a problem.

 

Powering Disruption

It was a bit like if you had a source of whale blubber in the 1840s and it could be used as fuel. Before gas came along, if you traded in whale blubber, you were the richest man on Earth. Then gas came along and you’d be stuck with your whale blubber. Sorry mate — history’s moving along.
— Brian Eno

“The beginning of the end for fossil fuels,” according to Bloomberg, occurred in 2013. “The world is now adding more capacity for renewable power each year than coal, natural gas, and oil combined. And there’s no going back…. The shift will continue to accelerate, and by 2030 more than four times as much renewable capacity will be added.”

The International Energy Agency’s Executive Director, Fatih Birol, said, “We are witnessing a transformation of global power markets led by renewables.”

“While solar was bound to fall below wind eventually, given its steeper price declines, few predicted it would happen this soon,” notes Bloomberg.

In the United States, as coal production fell an estimated 17% in 2016, continuing an eight-year decline, the solar market nearly doubled, breaking all records and beating oil, coal, and natural gas as the country’s biggest source of new electric generating capacity. While it’s still just a tiny fraction of the domestic electricity mix, in 2016 40% of all new capacity additions came from solar. 2016 was also the fourth consecutive year that US solar jobs grew by more than 20%, according to the Solar Foundation. One out of every 50 new jobs added in the United States in 2016 was created by the solar industry, which now employs more people than oil, coal, and gas extraction combined.

“Solar investment has gone from nothing — literally nothing — like five years ago to quite a lot,” said Ethan Zindler, head of U.S. policy analysis at BNEF. “A huge part of this story is China, which has been rapidly deploying solar” and, Bloomberg notes, helping other countries finance their own projects.

Between 2008 and 2013, solar panel costs dropped by 80% worldwide thanks to the accelerant of Chinese manufacturing. As Scientific American writes:

China leapfrogged from nursing a tiny, rural-oriented solar program in the 1990s to become the globe’s leader in what may soon be the world’s largest renewable energy source.

According to DOE, the [Chinese] federal government was willing to chip in as much as $47 billion to help build its solar manufacturing into what it calls a “strategic industry.”

In building up the world’s largest solar manufacturing industry, one that became the price leader in most aspects of the world’s market — beginning with cheaper solar panels — China helped create a worldwide glut. [In 2013] there were roughly two panels being made for every one being ordered by an overseas customer.

By 2015, China’s domestic market bypassed Germany’s to be the largest in the world…. [Today] China dominates the solar market in PV installation as well as total installed capacity, with the United States a distant third and fourth, respectively.

“If there was ever a situation where the Chinese have put their whole governmental system behind manufacturing, it’s got to be solar modules,” [Ken] Zweibel [30-year veteran of the U.S. solar industry and DOE] said.

“They fundamentally changed the economics of solar all over the world,” said Amit Ronen, director of the Solar Institute of George Washington University.

The impact of cheap, abundant solar technology has begun to ripple out across a planet where, as Scientific American notes, “the mathematics have long shown that solar power is the most abundant energy resource.”

In 2016 Sweden committed to becoming 100% renewable by 2040. India broke a world record with a solar farm spanning six square miles, and set its sights on doubling its solar power capacity. In January, Saudi Arabia, the world’s biggest crude oil exporter, announced a plan to invest $163 billion in renewables to support 50% of the country’s energy needs by 2050. Although admittedly its goal is to free up more oil for export in the short term, the country’s fate, The Atlantic reports, “may now depend on its investment in renewable energy.”

Before 2016 came to a close, China announced plans to cancel over 100 coal plants in development and to create 13 million jobs in renewables over the next 4 years. (For context, the US clean energy industry is just over 3 million jobs. The entire US tech industry is 6.7 million jobs.)

“The question is now no longer if the world will transition to cleaner energy,” Fast Company writes, “but how long it will take.”

According to the International Energy Agency, while solar makes up less than 1% of the electricity market today, it could be the world’s biggest single source by 2050.

Already, The Wall Street Journal reports, energy companies are beginning to confront the “crude reality… that some fossil-fuel resources will remain in the ground indefinitely.”

“A Goldman Sachs report last year forecast solar and wind will generate more new energy capacity in the next five years than the shale-oil revolution did in the last five,” writes David Bank, of ImpactAlpha.

Bloomberg predicts peak fossil-fuel use for electricity may be reached within the next decade, and peak gasoline demand by 2021.

“For over 100 years, the oil industry and its stakeholders have believed that the market for their products will continue to grow ad infinitum without competitive challenges,” energy economist Peter Tertzakian wrote last month. “Never in my 35-year career following energy markets has there been so much widespread disagreement about future demand for oil.”

From the water’s edge of 2017 we can see out onto the horizon. When the future history books are written, 2016 will be the year the tide turned.

Why didn’t we realize it?

 

Powering Denial

“For as I detest the doorways of Death, I detest that man, who hides one thing in the depths of his heart, and speaks forth another.”
— Achilles, The Iliad

“We live in the Stone Age in regard to renewable power,” Florida state Rep. Dwight Dudley said last year in Rolling Stone’s exposé on the war entrenched utilities are waging on solar energy. “The power companies hold sway here, and the consumers are at their mercy.”

Rate hikes and punishing fees for homeowners who turn to solar power [have] darkened green-energy prospects in could-be solar superpowers like Arizona and Nevada. But nowhere has the solar industry been more eclipsed than in Florida, where the utilities’ powers of obstruction are unrivaled…. The solar industry in Florida has been boxed out by investor-owned utilities (IOUs) that reap massive profits from natural gas and coal…. These IOUs wield outsize political power in the state capital of Tallahassee, and flex it to protect their absolute monopoly on electricity sales.

The rise of distributed solar power poses a triple threat to these monopol[ies]. First: When homeowners install their own solar panels it means the utilities build fewer power plants, and investors miss out on a chance to profit. Second: Solar homes buy less electricity from the grid; utilities lose out on recurring profits from power sales. Third: Under “net metering” laws, most utilities have to pay rooftop solar producers for the excess power they feed onto the grid. In short, rooftop solar transforms a utility’s traditional consumers into business rivals.

The utility trade group Edison Electric Institute (EEI) warns that rooftop solar could do to the utility industry what digital photography did to Kodak, bringing potentially “irreparable damages to revenues and growth prospects.”

Few industries are worse equipped to deal with disruption than power utilities. Their profits depend on infrastructure investments that pay off over a generation or more. “Utilities are structured to be in stasis,” says Zach Lyman, partner at Reluminati, an energy consultancy in Washington, D.C. “When you get fully disrupted, you’ve got to find a new model. But utilities are not designed to move to new models; they never were. So they play an obstructionist role.”
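The net-metering threat described in the excerpt above comes down to simple arithmetic. Here is a minimal sketch, using made-up consumption figures and a hypothetical flat 12¢/kWh rate, and assuming the full-retail-credit form of net metering (real tariffs vary widely by state and utility):

```python
# Toy net-metering bill, in integer cents so the arithmetic stays exact.
# All figures are illustrative assumptions, not numbers from the article
# or from any real utility tariff.

def monthly_bill_cents(kwh_used: int, kwh_generated: int, rate_cents: int = 12) -> int:
    """Under net metering the meter effectively runs backwards: the
    customer pays the retail rate only on *net* consumption, and a
    negative result is a credit the utility owes the customer."""
    net_kwh = kwh_used - kwh_generated
    return net_kwh * rate_cents

# A home that uses 900 kWh but generates 600 kWh pays for just 300 kWh.
print(monthly_bill_cents(900, 600))   # 3600 cents, i.e. $36.00
# Generate more than you use, and the utility owes you a credit.
print(monthly_bill_cents(500, 600))   # -1200 cents, i.e. a $12.00 credit
```

Every kilowatt-hour on the right side of that subtraction is revenue the utility never sees, which is exactly why rooftop solar turns customers into competitors.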

Obstruction plays out in the State Houses, but it also plays out in hearts and minds. Here’s what obstruction looks like as a messaging strategy:

 

Bills percolating through state legislatures across the U.S. are giving the education fight a new flavor, by encompassing climate change denial and serving it up as academic freedom.

Newsweek

 

 

 

And so on.

But behind the petroleum-jellied lens of blurry obstructionism, “Freedom” is just a marketing gimmick when you’ve got nothing left to lose except your entire whale blubber fortune.

As futurist Alex Steffen explains:

There is no long game in high-carbon industries. Their owners know this. They don’t need a long game, though…. All they need is the perception of the inevitability of future profit, today. That’s what keeps valuations high…. The Carbon Bubble will pop not when high-carbon practices become impossible, but when their profits cease to be seen as reliable.

For high-carbon industries to continue to be attractive investments, they must spin a tale of future growth. As it becomes clear that these assets will not produce profit in the future, their valuations will drop — even if the businesses that own them continue to function for years. The value of oil companies will collapse long before the last barrel of oil is burned.

Put another way: The pop comes when people understand that growth in these industries is over and that, in fact, these industries are now going to contract. That’s when investors start pulling out and looking for safer bets. As investors begin to flee these companies, others realize more devaluation is on the way, so they want to get out before the drop: a trickle of divestment becomes a flood and the price collapses. What triggers the drop is investors ceasing to believe the company has a strong future.

“Gridlock is the greatest friend a global warming skeptic has,” a spokesman for an Oklahoma senator says in Jane Mayer’s Dark Money: The Hidden History of the Billionaires Behind the Rise of the Radical Right. “That’s all you really want. There’s no legislation we’re championing. We’re the negative force. We are just trying to stop stuff.”

The energy generated by the obstruction force of the most powerful industry that has ever existed on the face of the Earth has created such friction it has ground our sense of the future to a halt. The lights keep dimming on us and we don’t know why. The gears of culture groan precariously against grinding, backwards momentum. The crude, snake oil slogans peddling past glory are so bleakly recursive they erase the very idea of future. And that is no coincidence. Carbon has a stake in wiping the future out of our imaginations. Because in the Future the world pivots.

 

Powering Destiny

One of these mornings
It won’t be very long
They will look for me
And I’ll be gone.
— Patti LaBelle

This past October, Liu Zhenya, the former chairman of China’s state-owned power company, State Grid Corp., came to the United Nations to present a vision for what a post-carbon Future would look like. He described a global power grid that could transmit 80% renewable energy by 2050, Scientific American reports.

His speech invited U.N. support for a new international group to plan and build the grid. It’s called the Global Energy Interconnection Development and Cooperation Organization (GEIDCO), and China has named Liu its chairman. [This] global grid would transmit solar, wind and hydroelectric-generated power from places on Earth where they are abundant to major population centers, where they are often not.

His grid’s development would take shape in three phases. First, Liu explained, individual nations would redesign their own electric power grids. He noted that China’s effort is already underway, generating 140 GW of wind power and 70 GW of solar power, “more than that of any country of the world.” By completing a network of long-distance, high-voltage direct-current power lines to move renewable power from the north to the south and from the east to the west, China could finish its new grid by 2025, he predicted.

The second phase, Liu described, would be an international effort to build regional grids that would be able to transmit substantially more power across national borders in Northeast and Southeast Asia, between Africa and Eurasia, and between nations in both North and South America. The third phase would build power lines and undersea cables that would connect the regional grids. The upshot would create what he called a “win-win situation.”

There would be plenty of work for “all global players” to coordinate the effort, to share and innovate new technology, and to develop global standards and rules for cooperation, Liu promised. He closed his U.N. presentation with a glimpse of a future world where a combination of renewable energy, a network of high-voltage direct-current transmission lines and “smart grid” operating systems can serve the planet the way the human “blood-vascular system” serves the human body.

When the global grid is completed, “the world will turn into a peaceful and harmonious global village with sufficient energy, green lands and blue sky,” he predicted.

Is it kooky that the Chinese would be talking about World Peace on the eve of 2017? Sure. But the thing is — that could have been us.

In the 1960s, American architect and systems theorist Buckminster Fuller created the World Game project. An alternative take on the war game simulations that dominated the Cold War era, the World Game requires participants to solve the following problem: “Make the world work for 100% of humanity in the shortest possible time through spontaneous cooperation without ecological damage or disadvantage to anyone.”

“The global energy grid is the World Game’s highest priority objective,” Fuller wrote.

Half a century after this idea of a distributed energy Future first emerged out of the American counterculture, a representative from the world’s second largest economy just presented it at the United Nations as the vision for his country’s energy ambitions.

“We argue so much about the silly politics of climate change and fail to recognize the gargantuan economic opportunity that this presents,” says Gregory Wilson, co-director of DOE’s National Center for Photovoltaics. “The energy system is going to get re-engineered, and someone is going to do it. The Chinese seem to have recognized the significance of this opportunity.”

Last year China invested a record $32 billion in overseas renewable energy projects, a 60% increase from the year before. Over the next 4 years the Chinese plan to invest $360 billion in renewables domestically to boost capacity by 500%. The rapidly accelerating innovation that this kind of financing unleashes creates global market forces that may have their own momentum. From radically reimagined (and profoundly cheaper) battery technologies to printable solar panels that could transform “nearly any surface into a power generator” to electric buses that can go 350 miles on a single charge, new pieces of this vast puzzle seem to be emerging almost daily.

“Eventually,” Vox’s David Roberts writes, “power generation and storage will become ambient, something that simply happens, throughout the urban infrastructure. With that will come more and more sophisticated software for managing, sharing, and economizing all that power. [This] will one day change the world as much as the internet has.”

Indeed, as TechCrunch writes: Energy is the new Internet.

An undeniable, distributed energy Future powered by solar and other renewable sources is emerging. It is perhaps the only Future on offer in the 21st century thus far that presents a bright vision worth striving for.

And yet — from a consumer perspective, solar energy seems to have no idea what it’s really selling.

 

Powering Desire

“If you don’t like what is being said, then change the conversation.”
— Don Draper

“In 2008, when fracking was still just a tiny thing, Davos was crowning it as the start of a new world order,” clean energy entrepreneur Jigar Shah said recently on the Energy Gang Podcast. Yet solar is still considered “just a tech thing,” he lamented. “We’re not Earth shattering.”

Where does the momentum for a movement come from, Shah wondered? Why is solar perceived as just some sort of… appliance? Why, despite breaking records and reshaping trend-lines the world over in 2016, isn’t solar getting the kind of buzz befitting one of the biggest stories of our time?

Because messaging.

Here is how SolarCity, the largest residential solar installer, positions its product:

“Our solar panels not only generate energy on your roof, they can also generate cash in your pocket. That’s because when you go solar you can save on your monthly utility bill and secure lower fixed energy rates for years to come. The savings over time add up and allow you to plan for your future. See how quality, savings and affordability make going solar the right choice.”

Solar is the Future but you’d never know it from the way it’s marketed. And this commoditized framing is reflective of the industry as a whole. The retail model for solar hasn’t changed from what it was a decade ago. But the world has. The internet is on our doorstep but solar is still selling people on the value prop of word processors.

Compounding the messaging problem, solar is still positioned as an “alternative.” Droga5’s campaign for NRG Home Solar still presents renewable energy as an option relative to fossil fuels. (Perhaps inevitable given the nature of NRG’s legacy-fuel masters.)

Both of these misguided approaches are a drag on the industry’s true potential. Solar isn’t a gadget, or an alternative lifestyle — it is an entry point to the new Future. In 2017, the territory of a desirable Future is totally unclaimed white space in consumer consciousness and the solar industry is uniquely positioned to own it.

Here’s how:

 

– Industry-level Messaging Platform –

In the 1990s the Got Milk? campaign gave a commoditized product the status of a cultural icon. Executed at a trade level by American dairy farmers, the industry-wide platform created a bigger impact than could have been possible for any dairy producer individually.

Like milk, energy is not a sphere with recognizable consumer brands that are part of the larger cultural conversation. The one notable exception, of course, is Tesla, which dropped the “Motors” from its name when it acquired SolarCity at the end of 2016. Analysts insisted that this acquisition is an “unneeded distraction,” and that Tesla ought to be “singularly focused on becoming a mass automobile manufacturer,” but that is a shortsighted view for a company that now makes solar panels and energy storage products. When it comes to Tesla’s true ambitions, as CEO Elon Musk puts it, “We need a revolt against the fossil fuel industry.”

Everything Tesla does unequivocally insists, desirable Future, but there is enough shine for the entire industry to own. At the end of the day, it’s solar itself that consumers have an affinity for — 

Even among the majority of all political affiliations, no less—

It doesn’t take a moonshot PR campaign to capitalize on this abundance of positive consumer sentiment. Just a cohesive voice with which to claim the message and consistently speak it into the culture.

 

– Expand the Target Audience –

Everyone in solar is targeting the same homeowner and business audience. Strategic ways to engage literally everyone else remain a vastly unexplored area.

In 2008 I wrote about Toyota’s integration with Whyville, an online virtual world for tweens:

Pretty much the coolest thing you can buy in Whyville is a Scion, and its added bonus is that then you can drive all your other friends around in it in the game. The most fascinating thing about this whole strategy, however, is that the Tween demographic is between 8–12 years old. It’s gonna be a while before they even have a driver’s license, let alone be in a position to be buying a car in the real world, but when they are, they will already have a virtual experience to draw on when making the purchase decision.

As the Massachusetts Clean Energy Center shows with its Clean Energy Activity Day program for elementary and middle school students, this approach doesn’t need to just be virtual.

From group purchasing at a community level to modular options for renters to innovative uses for incentive programs and student grants, and more, what are the actionable and scalable strategies for expanding the target audience and the bottom line across the solar industry?

 

– Sell the Experience –

Most people don’t really want to think about energy. We flip a switch and the electricity is just there. We interact with electricity literally all day, and never think about it. The narrative of distributed, storable, smart-gridded, clean energy is so profoundly different from what most people know, or know how to think about, that for them to understand it — or even want to — requires a transformative shift in the way it is communicated.

When Apple first marketed the iPod, it didn’t sell the product; it sold the end result — the experience of music:

Later campaigns for the iPhone didn’t even show the product at all:

The product became the conduit to the experience. And the experience that solar has to sell is Future.

 

– Claim the Narrative of Future –

Two decades ago — back when it was still possible to talk about the future as anything but dystopia — a series of ads painted a striking vision of how that future was going to unfold. “Have you ever borrowed a book from thousands of miles away?” asked the ad voice. “Crossed the country without stopping to ask for directions? Or watched the movie you wanted to, the minute you wanted to?”

“You will,” said the voice, “and the company that will bring it to you: AT&T.”

Today I use a device to do basically 90% of what those ads predicted. (OK, I’ve never sent a fax from the beach, or tucked a baby in from a phone-booth, but you can’t get the Future 100% right). All of these things are so obvious and mundane now we barely even remember — some of us never knew — there was a time before. But, indeed, there was a point when this fantastical world was the future, and the future still seemed like a fantastical world.

There are no grand visions for the future now, no scenarios for humanity that don’t fill us with dread. A dying oligarchy tells us dissolution is freedom; regression is hope. It has disfigured our understanding of what’s happening in our world. The result is a gaping void in our collective vision when we look ahead. Seventeen years into our new century there is a desperate hunger for a bright vision for the future, and at the moment arguably no one outside the world of clean energy has a legitimate claim to one. In the end, it’s not about utility bills or net metering laws or even solar panels for that matter. It’s about a vision of a Future worth demanding. Solar has the opportunity to be the voice of that vision for decades to come with a simple, cohesive, culture-focused messaging strategy.

    









UX Cruelty

Don’t blame it on the algorithm — assuming you’re designing experiences for “happy, upbeat, good-life users” might make you a terrible person.

 

My friend is going through a divorce. Like nearly 5 million other Americans. And recently Facebook greeted her with this careless user experience:

 

When this UX intrusion happened to her, it reminded me of a similar, psychological violation I’d read about four months earlier. That post, by Eric Meyer, had begun:

I didn’t go looking for grief this afternoon, but it found me anyway, and I have designers and programmers to thank for it. In this case, the designers and programmers are somewhere at Facebook.

I know they’re probably pretty proud of the work that went into the “Year in Review” app they designed and developed, and deservedly so—a lot of people have used it to share the highlights of their years. Knowing what kind of year I’d had, though, I avoided making one of my own. I kept seeing them pop up in my feed, created by others, almost all of them with the default caption, “It’s been a great year! Thanks for being a part of it.”  Which was, by itself, jarring enough, the idea that any year I was part of could be described as great.

Still, they were easy enough to pass over, and I did.  Until today, when I got this in my feed, exhorting me to create one of my own.  “Eric, here’s what your year looked like!”

 

A picture of my daughter, who is dead.  Who died this year.

Yes, my year looked like that.  True enough.  My year looked like the now-absent face of my little girl.  It was still unkind to remind me so forcefully.

I remember first reading this post the day it was published, Christmas Eve 2014. When I went to look it up after my friend’s own violation by a Facebook app module I was surprised to (re)discover that it had been titled, generously, “Inadvertent algorithmic cruelty:”

And I know, of course, that this is not a deliberate assault.  This inadvertent algorithmic cruelty is the result of code that works in the overwhelming majority of cases, reminding people of the awesomeness of their years, showing them selfies at a party or whale spouts from sailing boats or the marina outside their vacation house.

But for those of us who lived through the death of loved ones, or spent extended time in the hospital, or were hit by divorce or losing a job or any one of a hundred crises, we might not want another look at this past year.

To show me Rebecca’s face and say “Here’s what your year looked like!” is jarring.  It feels wrong, and coming from an actual person, it would be wrong. Coming from code, it’s just unfortunate.

 

But of course, it did come from an actual person. “[The app] was awesome for a lot of people,” the product manager for Facebook’s Year in Review app, Jonathan Gheller, later told The Washington Post. Like all the digital experiences with, and within, which we all increasingly live our lives, an actual person — in fact a whole team of people — was responsible for concepting, designing, building, testing, and iterating this experience. No doubt, the responsibility for the rollout of this particular app featured prominently in a number of Facebook employees’ job performance reviews. From start to finish, this experience was crafted by people (not code). Calling its end result “inadvertent algorithmic cruelty” is like describing a drunk driving accident as “inadvertent gasoline cruelty.” For sure, it could have been avoided with an empty gas tank, but is that really the most accurate way to ascribe accountability in this situation? (Don’t blame it on the algohol).

“In creating this Year in Review app, there wasn’t enough thought given to cases like mine, or anyone who had a bad year,” Meyer wrote. “If I could fix one thing about our industry, just one thing, it would be that: to increase awareness of and consideration for the failure modes, the edge cases, the worst-case scenarios.”

If I could fix one thing about our industry, it would be to destroy the idea that these scenarios are edge cases.

Last year in the US, 2.6 million people died, leaving behind untold numbers of Facebook users who mourn the absence of their loved ones.

Right now 8.5 million people can’t find a job;

14.5 million people have cancer;

16 million people suffer from depression;

23.5 million people are addicted to alcohol and drugs;

45 million people live below the poverty line (including 16 million children).

These are not “edge cases.” These are not “worst case scenarios.” These are all people who use Facebook. And that’s not even counting your run-of-the-mill disappointments, broken hearts, and inevitable wrongs and slights and meannesses that are, basically, life.

“The design [of the Year in Review app] is for the ideal user, the happy, upbeat, good-life user,” Meyer wrote. But if you are a product manager or UX designer creating experiences that will affect hundreds of millions of people and you are only designing for an “ideal user”… at best that’s just lazy, and at worst — it’s creating LITERAL suffering.

Put another way:


As Oliver Burkeman writes in The Guardian:

The world, obviously, is a manifestly unjust place: people are always meeting fates they didn’t deserve, or not receiving rewards they did deserve for hard work or virtuous behaviour. Yet several decades of research have established that our need to believe otherwise runs deep.

Confronted with an atrocity they otherwise can’t explain, people become slightly more likely, on average, to believe that the victims must have brought it on themselves. Hence the finding, in a 2009 study, that Holocaust memorials can increase antisemitism. Or that reading about the eye-popping state of economic inequality could make you less likely to support politicians who want to do something about it. These are among numerous unsettling implications of the “just-world hypothesis”, a psychological bias explored in a new essay by Nicholas Hune-Brown at Hazlitt.

If we didn’t all believe that [things happen for a reason] to some degree, life would be an intolerably chaotic and terrifying nightmare in which effort and payback were utterly unrelated, and there was no point planning for the future, saving money for retirement or doing anything else in hope of eventual reward. We’d go mad.

Yet, ironically, this desire to believe that things happen for a reason leads to the kinds of positions that help entrench injustice instead of reducing it.

Much in the same way that the “just world” cognitive bias can actually lead us to make crueler decisions, designing product features with the “happy, upbeat, good-life” ideal user bias can lead us to create crueler user experiences.

“To shield ourselves psychologically from the terrifying thought that the world is full of innocent people suffering,” Burkeman writes, we, as humans, “endorse policies more likely to make that suffering worse.” And by denying the full spectrum of the realities of people’s lives, awesome and tragic, we, as experience designers, do the same. Except we’re the ones with the power to actually do something about it.

“Just to pick two obvious fixes,” Meyer wrote at the end of his post, “First, don’t pre-fill a picture until you’re sure the user actually wants to see pictures from their year.  And second, instead of pushing the app at people, maybe ask them if they’d like to try a preview—just a simple yes or no.  If they say no, ask if they want to be asked again later, or never again. And then, of course, honor their choices. This is… designing for crisis, or maybe a better term is empathetic design.”

Or how about just, you know, design.

In the wake of Meyer’s post, the product manager for Facebook’s Year in Review app told The Washington Post, “We can do better.”

But four months later, Facebook’s photo collage assault on my friend suggests perhaps they don’t really think they can.

“Faced with evidence of injustice, we’ll certainly try to alleviate it if we can,” Burkeman wrote, “But, if we feel powerless to make things right, we’ll do the next best thing, psychologically speaking: we’ll convince ourselves that the world isn’t so unjust after all.”

    



Subscribe for more like this.








Hardcore Norm

Because dressing different is such a cliché.

 


art by Curtis Mead

 

“The kids are doing the normcore,” my friend Quang said, trying out the new phrase with a deliberate, old fart dialect.

Only a few moments earlier I had tossed off the word like common parlance.

“‘Normcore?’” he had repeated, making sure he’d heard correctly.

“Yeah,” I explained, “It’s exactly what you think it is. It’s us, now.”

A shockingly pleasant March afternoon had arrived in Boston that day, on the heels of a cold that had felt like osteoporosis. A decade in LA had turned me into a wimp. I had forgotten how I’d ever managed to live through this in my youth.

I had grown up here. In high school I discovered raves. By college I was throwing them in 20,000 square foot warehouses in Dumbo. After that, I moved out to the west coast and managed a vaudeville circus troupe, produced electronic music festivals, and worked with a bunch of bands, among other things. In the span of the past decade I saw the niche “electronica” genre evolve into mainstream “EDM;” I saw the circus subculture infiltrate pop performance acts, and I saw the signature, post-apocalyptic tribal fashion aesthetic that originated within the Burning Man community become a major fashion trend.

But that day in Boston, in 2014, hanging out with friends who had come up through the rave, circus, and goth subcultures, you could hardly tell where any of us had been. What we wore now was nondescript. Non-affiliated. Normal.

The week before, at a craft beer tasting party at an indie advertising agency in Silver Lake, a sculpture artist was remarking about recently looking through photos of style choices from the aughts. “What was I thinking,” she said in bewilderment. That evening she was wearing a black tank top, and, like, pants. Maybe three quarter length? Or not? Maybe black jeans? Or not-jean pants? I couldn’t recall. Perhaps, I thought, this was just a symptom of getting older. There was some kind of sartorial giving a shit phase that we had all grown out of. But it turned out this, too, was a trend. Kids, too young to have grown out of anything, were dressing this way.

“By late 2013, it wasn’t uncommon to spot the Downtown chicks you’d expect to have closets full of Acne and Isabel Marant wearing nondescript half-zip pullovers and anonymous denim,” wrote Fiona Duncan, in a February New York Magazine article titled, “Normcore: Fashion for Those Who Realize They’re One in 7 Billion:”

I realized that, from behind, I could no longer tell if my fellow Soho pedestrians were art kids or middle-aged, middle-American tourists. Clad in stonewash jeans, fleece, and comfortable sneakers, both types looked like they might’ve just stepped off an R-train after shopping in Times Square. When I texted my friend Brad (an artist whose summer uniform consisted of Adidas barefoot trainers, mesh shorts and plain cotton tees) for his take on the latest urban camouflage, I got an immediate reply: “lol normcore.”

Normcore—it was funny, but it also effectively captured the self-aware, stylized blandness I’d been noticing. Brad’s source for the term was the trend forecasting collective (and fellow artists) K-Hole. They had been using it in a slightly different sense, not to describe a particular look but a general attitude: embracing sameness deliberately as a new way of being cool, rather than striving for “difference” or “authenticity.”

Oh my god, I thought reading this: this is me.

In Nation of Rebels: Why Counterculture Became Consumer Culture, published in 2004, cultural critics Joseph Heath and Andrew Potter examined the inherent contradiction in the idea that counterculture was an opposition to mass consumer culture. Not only were the two not opposed, Heath and Potter explained, they weren’t even separate. Alternative culture’s obsession with being different — expressing that difference through prescribed fashion products and subcultural artifacts — had, in fact, helped to create the very mass consumer society the counterculture believed itself to be the alternative to.

“To me, Nike’s famous swoosh logo had long been the mark of the manipulated,” wrote Rob Walker, author of 2008’s Buying In: The Secret Dialogue Between What We Buy And Who We Are, “A symbol for suckers who take its ‘Just Do It’ bullying at face value. It’s long been, in my view, a brand for followers. On the other hand, the Converse Chuck Taylor All Star had been a mainstay sneaker for me since I was a teenager back in the 1980s, and I stuck with it well into my thirties. Converse was the no-bullshit yin to Nike’s all-style-and-image yang. It’s what my outsider heroes from Joey Ramone to Kurt Cobain wore. So I found [Nike’s] buyout [of Converse] disheartening…. but why, really, did I feel so strongly about a brand of sneaker — any brand of sneaker?”

In response to Buying In, I’d written, “Whether we’re choosing to wear Nikes, Converse, Timberlands, Doc Martens, or some obscure Japanese brand that doesn’t even exist in the US, we’re deliberately saying something about ourselves with the choice. And regardless of how “counter” to whatever culture we think we are, getting to express that differentiation about our selves requires buying something.”

But that was five years ago. A funny thing happened on the way to the mid twenty-teens. The digital era ushered in an unprecedented flood of availability — of both information and products. This constant, ubiquitous access to everything — what Chris Anderson dubbed the “Long Tail” in his 2006 book of the same name — had changed the cultural equation. We had evolved, as Anderson predicted, “from an ‘Or’ era of hits or niches (mainstream culture vs. subcultures) to an ‘And’ era.” With the widespread proliferation of internet access, mass culture got less mass, and niche culture got less obscure. We became what Anderson called a “massively parallel culture: millions of microcultures coexisting and interacting in a baffling array of ways.” On this new, flattened landscape, what was there to be counter to?

“Jeremy Lewis, the founder/editor of Garmento and a freelance stylist and fashion writer, calls normcore ‘one facet of a growing anti-fashion sentiment,’” Duncan writes in New York Magazine. “His personal style is (in the words of Andre Walker, a designer Lewis featured in the magazine’s last issue) ‘exhaustingly plain’ — this winter, that’s meant a North Face fleece, khakis, and New Balances. Lewis says his ‘look of nothing’ is about absolving oneself from fashion.”

That is how normcore happened to me, too. When I quit the circus, leaving behind its sartorial regulations, I realized that difference wasn’t an expression of identity: it was a rat race.

“Fashion has become very overwhelming and popular,” Lewis explains in New York Magazine. “Right now a lot of people use fashion as a means to buy rather than discover an identity and they end up obscured and defeated. I’m getting cues from people like Steve Jobs and Jerry Seinfeld. It’s a very flat look, conspicuously unpretentious, maybe even endearingly awkward. It’s a lot of cliché style taboos, but it’s not the irony I love, it’s rather practical and no-nonsense, which to me, right now, seems sexy. I like the idea that one doesn’t need their clothes to make a statement.”

“Magazines, too,” Duncan writes, “have picked up the look:”

The enduring appeal of the Patagonia fleece [was] displayed on Patrik Ervell and Marc Jacobs’s runways. Edie Campbell slid into Birkenstocks (or the Céline version thereof) in Vogue Paris. Adidas trackies layered under Louis Vuitton cashmere in Self Service. A bucket hat and Nike slippers framed Alexander McQueen coveralls in Twin. Smaller, younger magazines like London’s Hot and Cool and New York’s Sex were interested in even more genuinely average ensembles, skipping high-low blends for the purity of head-to-toe normcore.

One of the first stylists I started bookmarking for her normcore looks was the London-based Alice Goddard. She was assembling this new mainstream minimalism in the magazine she co-founded, Hot and Cool, as early as 2011. For Goddard, the appeal of normal clothes was the latest thing. One standout editorial from Hot and Cool no. 5 (Spring 2013) was composed entirely of screenshots of people from Google Maps’ Street View. Goddard had stumbled upon “this tiny town in America” on Maps and thought the plainly-dressed people there looked amazing. The editorial she designed was a parody of contemporary street style photography — “the main point of difference,” she says, “being that people who are photographed by street style photographers are generally people who have made a huge effort with their clothing, and the resulting images often feel a bit over fussed and over precious — the subject is completely aware of the outcome; whereas the people we were finding on Google Maps obviously had no idea they were being photographed, and yet their outfits were, to me, more interesting.”

New media has changed our relation to information, and, with it, fashion. Reverse Google Image Search and tools like Polyvore make discovering the source of any garment as simple as a few clicks. Online shopping—from eBay through the Outnet—makes each season available for resale almost as soon as it goes on sale. As Natasha Stagg, the Online Editor of V Magazine and a regular contributor at DIS (where she recently wrote a normcore-esque essay about the queer appropriation of mall favorite Abercrombie & Fitch), put it: “Everyone is a researcher and a statistician now, knowing accidentally the popularity of every image they are presented with, and what gets its own life as a trend or meme.” The cycles of fashion are so fast and so vast, it’s impossible to stay current; in fact, there is no one current.

Emily Segal of K-HOLE insists that normcore isn’t about one specific aesthetic. “It’s not about being simple or forfeiting individuality to become a bland, uniform mass,” she explains. Rather, it’s about welcoming the possibility of being recognizable, of looking like other people—and “seeing that as an opportunity for connection.”

K-HOLE describes normcore as a theory rather than a look; but in practice, the contemporary normcore styles I’ve seen have their clear aesthetic precedent in the nineties. The editorials in Hot and Cool look a lot like Corinne Day styling newcomer Kate Moss in Birkenstocks in 1990, or like Art Club 2000′s appropriation of madras from the Gap, like grunge-lite and Calvin Klein minimalism. But while (in their original incarnation) those styles reflected anxiety around “selling out,” today’s version is more ambivalent toward its market reality.

In a post-Hot Topic world, where Forever 21 serves up fast fashion in processed flavors like Occupy and Burning Man, we’re realizing that alternativeness, as a means of authentic self-expression, is futile. “Normcore isn’t about rebelling against or giving into the status quo,” Duncan concludes. “It’s about letting go of the need to look distinctive.”

In our all-access, always connected, globalized world, obscurity is scarce. When everything is accessible, nothing is alternative.

“In the 21st century,” Rob Walker wrote back in 2008, not recognizing the quickly approaching end of counterculture, “We still grapple with the eternal dilemma of wanting to feel like individuals and to feel as though we’re a part of something bigger than ourselves. We all seek ways to resolve this fundamental tension of modern life.”

In 2014, normcore is one solution we’ve found to resolve it.

    









Objectionable

Our technology is turning us all into objects. And it doesn’t matter how you treat objects, does it?

 

I have been responsible for more selfies than most people.

I didn’t take them. They’re not of me. But I launched an app which allows users to easily create mirrored images, and the leap from ordinary selfie to mirrored selfie was almost instant.

In the year since our app launched, our users have created over 5 million images. By now you’ve seen this mirrored selfie trend all over Instagram, not to mention throughout the greater popular culture.

To be fair, mirrored selfie-grams are far from the only way people engage with the app. They also use it to create stunningly beautiful, painstakingly crafted, kaleidoscopic works of abstract art.

 

But it’s the selfies — mirrored or otherwise — that have been on my mind a lot lately.

 

Selfies.

Right now, there are 50 million images on Instagram with the hashtag #selfie, and nearly 140 million tagged #me.

“Selfies,” Elizabeth Day reports in the Guardian, “Have become a global phenomenon. Images tagged as #selfie began appearing on the photo-sharing website Flickr as early as 2004. But it was the introduction of smartphones – most crucially the iPhone 4, which came along in 2010 with a front-facing camera – that made the selfie go viral.”

A recent survey of more than 800 American teenagers by the Pew Research Center found that 91% posted photos of themselves online — up from 79% in 2006.

But the selfie isn’t just a self-portrait, it is a self-object.

“Again and again, you offer yourself up for public consumption,” Day writes. “Your image is retweeted and tagged and shared. Your screen fills with thumbs-up signs and heart-shaped emoticons. Soon, you repeat the whole process, trying out a different pose.”

“The selfie is about continuously rewriting yourself,” says Dr. Mariann Hardey, a lecturer in marketing at Durham University who specializes in digital social networks. “It’s an extension of our natural construction of self.”

But what is it we are constructing our selves into?

 

Porn.

Before we go any further, let’s get this out of the way: unless you are a teenager right now, you do not understand what it means to grow up in a world where porn and Facebook are equidistant — in case you don’t know, each is one click away, and one click apart. If you’re curious to understand what, in fact, this experience is like — in teenagers’ own words — you should read Nancy Jo Sales’ recent Vanity Fair article, “Friends Without Benefits.” But not until after you’ve finished reading this one, because I’ll be drawing on it quite a bit.

If you are, at this moment, older than your mid-20s, whatever you think you can draw on to relate to 2013 from an analog adolescence frame of reference, just put it away, because it is not a parallel to what is happening right now. What is happening, according to Gail Dines, the author of Pornland: How Porn Has Hijacked Our Sexuality, is “a massive social experiment.” Here are some results from that experiment so far:

According to a 2008 CyberPsychology & Behavior study:

  • 93% of boys and 62% of girls have seen internet porn
  • 83% of boys and 57% of girls have seen group sex online
  • 18% of boys and 10% of girls have seen rape or sexual violence

But that was five iPhone versions ago at this point, so, you do the math.

“In the absence of credible, long-term research, we simply don’t know where the age of insta-porn is taking us,” writes Peggy Drexler on The Daily Beast. But that we are in it, and that it is pervasive, is undeniable.

“What does this do to teenagers?” Sales asks in Vanity Fair. “And to children? How does it affect boys’ attitudes toward girls? How does it affect girls’ self-esteem and feeling of well-being? And how is this affecting the way that children and teenagers are communicating on these new technologies?”

In the Guardian, Day describes one typical answer to that last question: “The pouting mouth, the pressed-together cleavage, the rumpled bedclothes in the background hinting at opportunity — a lot of female selfie aficionados take their visual vernacular directly from pornography (unwittingly or otherwise).”

“Because of porn culture,” says Dines, “Women have internalised that image of themselves. They self-objectify.”

“The girls I interviewed,” says Sales, “Even if they’re not doing it themselves, it’s in their faces: their friends posting really provocative pictures of themselves on Facebook and Instagram, sending nude pictures on Snapchat. Why are they doing this? Is this sexual liberation? Is it good for them? Girls know the issues, and yet some of them still can’t resist objectifying themselves, as they even talk about [themselves]. As the girl I call ‘Greta’ says, ‘more provocative equals more likes.’ To be popular, which is what high school is all about, you have to get ‘likes’ on your social-media pics.”

 

Spring. 


Harmony Korine’s Spring Breakers, originally released in March 2013, “Horrifies and entices in equal measure,” wrote NPR music critic Ann Powers:

Flattening the hierarchies that separate trash from art, porn from erotica, and moral justice from exploitation by any means necessary, Spring Breakers… embraces and elaborates upon the prevalent suspicion that nobody lives on the stable side of reality any more.

“Pretend you’re in a videogame,” says one of the film’s female anti-heroines as they begin their spree of rampant self-abuse and crime. That’s what Miley Cyrus does, trying on new aspects of performance and sexual self-expression in her new persona. It’s also how the vulnerable models that Robin Thicke ogles [in the music video for his song, Blurred Lines] make it through the gauntlet that the video’s scene creates.

The childlike goofiness Katy Perry expressed with California Gurls in 2010, or the sweet hope of Carly Rae Jepsen’s smash of last year, Call Me Maybe, have intensified into something more unsettling. In this strange summer of too much heat, so many precariously excessive songs and videos now play on that line between healthy catharsis and chaos.

 

Summer.

The summer would get stranger still, punctuated in its final days by what may just be the most controversial MTV Video Music Awards performance of all time, featuring a duet by Cyrus and Thicke.

I would write about it:


From its very first steps, Cyrus’s performance felt, unmistakably, like watching a GIF happen in real-time. The act was speaking the native tongue — stuck all the way out — of the digital age, its direct appeal to meme culture as blatant and aggressive as the display of sexuality. The source material and its inevitable meme-ification appeared to be happening simultaneously. The Internet was inherently integrated within the performance. It was no longer a “second” screen; it was the same damn screen. All the performances before it had been made for TV. This show changed that.

What I learned from the 2013 VMAs is that owning your sexuality is passé, but owning meme culture by exploiting your sexuality is now. Whatever you think of it, Cyrus’s performance was a deliberate reflection of where we are as a culture.

 

A burner had been left blindly on. Something invisible and pervasive had accumulated. Watching the VMAs, a giant fireball exploded in our faces.

We were unprepared.

This, ultimately, would be why everyone freaked out. Cyrus became a highly visible target for embodying this shift on a mainstream stage, and exploiting it to increase her fame and drive her record to #1, but all she was doing was deftly surfing the cultural current.

By the end of August, she was exposing us to the new normal.

 

Fall.

“In news that’s not at all surprising, yet another tech event was disrupted by a sexist joke,” Lauren Orsini wrote on ReadWriteWeb, within days of the VMAs:

“Titstare” was the first presentation of the TechCrunch Disrupt 2013 hackathon. Created by Australians Jethro Batts and David Boulton, the joke app is based on the “science” of how sneaking a peek at cleavage helps men live healthier lives.

The opening salvo cast an ugly shadow over the event, reminding attendees that, just like at PyCon and other technology conferences, “brogrammer” culture is still the norm.

Perhaps most disconcerting is the fact that Batts and Boulton presented immediately before Adria Richards, a programmer who rose to the national spotlight after she witnessed sexist jokes at PyCon 2013. Her gall to disapprove of the offensive jokes earned her death threats.

 

In the wake of the VMA article, I kept tweeting over and over, “Everything is changing… but into whatttttt?” By the early days of Fall, the culture had undeniably shifted. I kept seeing an escalating, atavistic gender warfare. Why is this happening, I thought.

Why is this happening?


 

Why is THIS happening?


Why is THIS happening??


That all happened in one day.

That week I was approached to speak at a women’s startup conference and felt, reflexively, offended. The idea that there should be segregated events seemed insulting and damaging — to everyone. I began to feel self-conscious that I had an app startup with a male business partner. I texted him, “What is happening???” and “Can’t we all just get along?” We laughed, but we began to feel like an anomaly.

 

Pretend you’re in a videogame.

“When we listeners find ourselves taking pleasure in these familiar but enticingly refreshed acts of transgression,” Powers writes, “Echoing the Michael Jackson-style whoops that Pharrell makes in Blurred Lines, or nodding along to the stoned, melancholy chorus of Cyrus’s arrestingly sad party anthem, We Can’t Stop, are we compromising ourselves? Or is it okay, because after all, it’s just pretend?”

And when the technology that I, you, and everyone we know use on a daily basis gets developed to the sound of this same, blurry, pop culture soundtrack (figuratively or literally), what happens then? How are the creators of objectifying technology supposed to know it isn’t cool — if all of our technology is used for objectification?

In Vanity Fair, Sales talks to Jill Bauer and Ronna Gradus, co-directors of Sexy Baby, a documentary about girls and women in the age of porn. “We saw these girls embracing this idea that ‘If I want to be like a porn star, it’s so liberating,’” Gradus said. “We were skeptical. But it was such a broad concept. We asked, ‘What is this shift in our sexual attitudes, and how do we define this?’ I guess the common thread we saw that is creating this is technology. Technology being so available made every girl or woman capable of being a porn star, or thinking they’re a porn star. They’re objectifying themselves. The thinking is: ‘If I’m in control of it, then I’m not objectified.’”

In October, Sinead O’Connor — whose video for Nothing Compares 2 U inspired Cyrus’s look in her video for Wrecking Ball — wrote an “open letter” to Cyrus, beautifully capturing, “in the spirit of motherliness and with love,” the generational disconnect at the heart of the cultural shift. “The message you keep sending is that it’s somehow cool to be prostituted… it’s so not cool Miley. Don’t let the music business make a prostitute out of you,” O’Connor wrote, not getting it.

The familiar, analog, 20th-century relationship between objectification and commercialization has eroded. In its place, a new, post-Empire dynamic has arrived, built on a natively digital experience that O’Connor, and an entire population still able to remember and relate to a world before the internet and mobile technology, can’t wrap their heads around.

“The blurred messages Thicke, Cyrus and others are now sending fit a time when people think of themselves as products, more than ever before,” Powers writes.

In the attention economy, self-exploitation is self-empowerment. We are all objects. We are all products. We are all selfies.

And we can’t stop.

“Social media is destroying our lives,” Sales quotes a girl in Vanity Fair.

“So why don’t you go off it?” Sales asks.

“Because then we would have no life.”

The ubiquitousness of digital cameras, and of the social media platforms that share their instant output, has turned the idea that objectification is violation into an anachronism; self-objectification is now, as Powers writes, “part of today’s ritual of romance.” Nearly one in three teenagers is sending nude photos, after all.

Like the girls in Sales’ article, who tell her that “presenting themselves in this way is making them anxious and depressed,” but continue to do it anyway, we do not self-objectify because we’re in control. We self-objectify because it is the norm.

We self-objectify to rationalize, to placebo-ize that we had control in the first place.

 

We Can’t Stop.

“Both young women and young men are seriously unhappy with the way things are,” says Donna Freitas, a former professor at Hofstra and Boston Universities, who studies hook-up culture on college campuses in her new book, The End of Sex (which, Sales suggests, “might as well be called The End of Love”).

Sales writes:

Much has been written about hook-up culture lately, notably Hanna Rosin’s The End of Men (2012) and a July New York Times article, “Sex on Campus: She Can Play That Game Too,” both of which attributed the trend to feminism and ambitious young women’s desire not to be tied down by relationships.

But Freitas’s research, conducted over a year on seven college campuses, tells a different story.

She describes the sex life of the average college kid as “Mad Men sex, boring and ambivalent. Sex is something you’re not to care about. They drink to drown out what is really going on with them. The reason for hooking up is less about pleasure and fun than performance and gossip—it’s being able to update [on social media] about it. Social media is fostering a very unthinking and unfeeling culture.”

College kids, both male and female, also routinely rate each other’s sexual performance on social media, often derisively, causing anxiety for everyone.

And researchers are now seeing an increase in erectile dysfunction among college-age men—related, Freitas believes, to their performance anxiety from watching pornography: “The mainstreaming of porn is tremendously affecting what’s expected of them.”

 

Or as Thought Catalog writer Ryan O’Connell (oh, hey, sup, a dude) put it, “This is how we have sex now:”

Porn has killed our imaginations. We sit and try to fantasize. We shut our eyes tight and think, “Wait, what did I used to masturbate about before porn? What image is going to turn me on right now?” But your brain gets tired and your genitalia isn’t used to working this hard so you open your reliable go-to porno and get off in two minutes. Later, you have trouble maintaining an erection during actual sex because your partner doesn’t look like a blow up doll from the Valley.

Our sex lives are having less and less to do with actual sex. Intimacy has morphed into something entirely more narcissistic. What used to be about making each other feel good and connecting is now about validation.

When sex does happen, when we finally make it through the endless hoops of text messaging, planning a date and actually sticking to it and you discover that you like this person (or could like them for an evening), it feels like an old faded photograph that’s been sitting in a shoebox at the bottom of your closet. “This orgasm feels like a vintage ball gown! Is this how people used to do it in the olden days?!” It’s terrifying!

In 2013, our phones are getting to have all the fun. They’re getting laid constantly while we lay naked in the dark, rubbing our skin, trying pathetically to get turned on by the feel of our own touch. We scroll through our camera and see a buffet of anonymous naked photos we’ve collected over the last few months for us to jack off to. Somehow, this has become enough for us. Getting off has become like fast food. It’s accessible, cheap, and most likely going to make us feel like shit after.

We are actively participating in the things that keep us from what we want. Feel good now, feel bad forever later. Stomachache stomachache, junk food junk food.

 

In a pervasively mediated culture, where porn primes our perception of ourselves and others, and our technology reduces us to selfies, objectification is inevitable.

And the trouble is — it doesn’t matter how you treat objects…. It’s not like they’re people.

What people want today is “to hurt one another” and “get back at the people that hurt them,” Hunter Moore, the founder of IsAnyoneUp.com, told Rolling Stone last October.

In a September article on The Verge titled “The End of Kindness,” Greg Sandoval writes:

And Moore ought to know. He’s one of the pioneers of revenge porn, the practice of posting nude photos to the web of a former lover in an attempt to embarrass, defame, and terrorize.

While minorities and homosexuals are often targeted, experts say no group is more abused online than women. Danielle Citron, a law professor at the University of Maryland, lays out some of the numbers in her upcoming book, Hatred 3.0. The US National Violence Against Women Survey reports 60% of cyberstalking victims are women. A group called Working to Halt Online Abuse studied 3,787 cases of cyberharassment, and found that 72.5% of victims were female, 22.5% were male and 5% unknown. A study of Internet Relay Chat showed male users receive only four abusive or threatening messages for every 100 received by women.

Moore has sold his site, but scores of wannabes are cropping up. A check of these sites shows that victims are almost always women. Myex.com hosts over 1,000 nude photos, and new pictures are added nearly every day. Each post typically includes the name of the person photographed, their age, and the city they live in. The posts come with titles like, “Manipulative Bitch,” “Cheater,” “Has genital warts,” “Drunk,” “Meth User,” “This girl slept with so many other guys,” and “Filthy Pig.”

The Verge contacted several women found on some of these sites, including Myex.com. While all of them declined to be interviewed, they did acknowledge that the photos were posted without permission by an ex-boyfriend or lover. One woman said that she was trying to get the pictures pulled down and had successfully removed them from other sites because she was not yet 18 years old when they were taken (if her claim is accurate it would make the snapshots child pornography). She pleaded that we not use her name and asked that we not contact her again.

If the woman was upset and afraid, she has a right to be, says Holly Jacobs, 30, who has started a nonprofit organization dedicated to ending revenge porn and supporting its victims. Jacobs knows firsthand that these sites are killers of reputations and relationships. Three years ago, Jacobs was studying for her PhD in industrial organizational psychology and working as a consultant at a university when a former boyfriend began posting nude photos of her online. The embarrassment and terror was just the beginning. Jacobs’ ex sent copies of the photos to her boss and suggested she was sexually preying on students. Jacobs’ employers, fearing bad press, asked her to prove she didn’t upload the photos herself. She finally felt compelled to change her name (Jacobs is the new name).

In July The Washington Post published a story about men who post phony ads to make it appear as if their ex-wives or girlfriends are soliciting sex. One man, Michael Johnson II of Hyattsville, Maryland, published an ad titled “Rape Me and My Daughters” and included his ex-wife’s home address. More than 50 men showed up to the victim’s house. One man tried to break in and another tried to undress her daughter. Johnson was sentenced to 85 years in prison. His victim was physically unharmed but these ads can be lethal. In December 2009, a Wyoming woman was raped with a knife sharpener in her home after an ex-boyfriend assumed her identity and posted a Craigslist ad that read, “Need an aggressive man with no concern or regard for women.” Her ex and the man who raped her are both serving long prison sentences.

 

Winter.

While people, trapped as we are by our digital avatars, are increasingly being reduced to objects, our technology seems to be benefitting from a transference of humanity.

Spike Jonze’s new movie, Her, due out in December, is being called “science fiction,” but the “future” depicted in the trailer looks essentially indistinguishable from the reality we all find ourselves in today. In it, a melancholy man, played by Joaquin Phoenix, and a Turing test-approved virtual assistant program, voiced by Scarlett Johansson, fall in love.

“Unlike the science fiction of yesteryear,” writes David Plumb on Salon.com, “Her is not about the evolving relationship between humans and artificial intelligence. Instead, Samantha appears to be essentially a human being trapped in a computer. Her thus appears to be about programming the perfect woman who fits in your pocket, manages your life, doesn’t have a body (and thus free will), and has an off switch.”

 

Pretend you’re in a videogame.

 

    


