The Algorithm Defense Frontier

Algorithms are the perfect tool for delivering individualized exploitation to billions. They may yet have potential for mobilizing collective power at scale.

 

Buy Me a Sandwich

Out in California, my friend is working on a new kind of energy storage startup. Here’s what they do: they buy electricity from the grid when it’s cheap (in the middle of the night) and store it in a battery in your house for you to use during the day, when demand is high. Participants will be able to save 20% to 30% on their energy bills. And the best part is you don’t even have to do anything. The whole thing runs on algorithms.
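
For the curious, the core of a system like that is just price arbitrage on a schedule. Here’s a minimal sketch, with made-up prices, battery specs, and threshold logic; none of it comes from my friend’s actual product:

```python
# Toy battery arbitrage: charge when grid power is cheap, discharge when it's expensive.
# All prices and capacities are invented for illustration.

HOURLY_PRICE = {h: (0.08 if 0 <= h < 6 else 0.32 if 17 <= h < 21 else 0.18)
                for h in range(24)}  # $/kWh by hour of day

def plan_day(capacity_kwh=10.0, rate_kw=2.0):
    """Return a charge/discharge schedule for one day."""
    cheap = sorted(HOURLY_PRICE, key=HOURLY_PRICE.get)                    # cheapest hours first
    expensive = sorted(HOURLY_PRICE, key=HOURLY_PRICE.get, reverse=True)  # priciest hours first
    hours_needed = int(capacity_kwh / rate_kw)   # hours to fill (or empty) the battery
    schedule = {h: "idle" for h in range(24)}
    for h in cheap[:hours_needed]:
        schedule[h] = "charge"          # buy cheap overnight power
    for h in expensive[:hours_needed]:
        if schedule[h] == "idle":
            schedule[h] = "discharge"   # serve the house from the battery at peak prices
    return schedule

if __name__ == "__main__":
    for hour, action in sorted(plan_day().items()):
        print(f"{hour:02d}:00  {action}")
```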

I was thinking about this as I read Ben Tarnoff’s article in The Guardian, which begins:

What if a cold drink cost more on a hot day?

Customers in the UK will soon find out. Recent reports suggest that three of the country’s largest supermarket chains are rolling out surge pricing in select stores. This means that prices will rise and fall over the course of the day in response to demand. Buying lunch at lunchtime will be like ordering an Uber at rush hour.

But what if an algorithm bought it for you at midnight instead?

Amid the infinite churn of natural and political terrors— juxtaposed with photos of babies made by people I love — blowing up my Facebook feed, I keep seeing a hipster hostage video selling me something called Paribus.

“Stores change their prices all the time — and many of them have price guarantees,” explains Paribus’ App Store description. “Paribus gets you money when prices drop. So if you buy something online from one of these retailers and it goes on sale later, Paribus automatically works to get you the price adjustment.”
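
The loop behind a service like this is simple enough to sketch. Here’s a hypothetical version; the fake price table stands in for whatever scraping or retailer integrations Paribus actually uses:

```python
# Hypothetical price-drop watcher in the spirit of Paribus.
# CURRENT_PRICES is fake data standing in for scraping or retailer price lookups.

from dataclasses import dataclass

CURRENT_PRICES = {("acme.com", "noise-canceling headphones"): 79.00,
                  ("acme.com", "desk lamp"): 24.00}

@dataclass
class Purchase:
    retailer: str
    item: str
    paid: float

def find_refund_claims(purchases, min_drop=1.00):
    """Yield (purchase, refund) pairs wherever the listed price dropped below what you paid."""
    for p in purchases:
        current = CURRENT_PRICES.get((p.retailer, p.item))
        if current is None:
            continue
        drop = p.paid - current
        if drop >= min_drop:
            yield p, round(drop, 2)   # worth filing a price-adjustment claim

if __name__ == "__main__":
    history = [Purchase("acme.com", "noise-canceling headphones", 99.00)]
    for purchase, refund in find_refund_claims(history):
        print(f"{purchase.item}: claim ${refund:.2f} back from {purchase.retailer}")
```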

Most likely the reason Facebook expects I’d be interested in this service (more on this later) is because I already have an app duking it out with Comcast to lower my bill.

It sends me updates via Facebook Messenger to let me know its progress:

And I message it back choose-your-own-adventure-style responses:

This feature is part of a service called Trim, which positions itself as “a personal assistant that saves money for you.” (It makes money by taking a cut of the recovered charges.)

Trim calls its Comcast negotiator bot, knowingly, “Comcast Defense.”

And, indeed — we should all be thinking about the possibilities for algorithm defense.

 

Buy Me A Monopoly

The day Amazon announced its purchase of Whole Foods, shares in both Kroger and Walmart — which generates more than half its revenue from grocery sales — went into free-fall. Kroger’s in particular fell 8% in a matter of hours.

On the day the sale closed, Whole Foods’ new management cut prices by up to 43%.

“Amazon has demonstrated that it is willing to invest to dominate the categories that it decides to compete in,” said Mark Baum, a senior vice president at the Food Marketing Institute. But the way Amazon “decides to compete” is to actually make the category uncompetitive, driving other players out until it has become the category.

Previously on Amazon: book stores.

Tarnoff writes:

Amazon isn’t abandoning online retail for brick-and-mortar. Rather, it’s planning to fuse the two. It’s going to digitize our daily lives in ways that make surge-pricing your groceries look primitive by comparison. It’s going to expand Silicon Valley’s surveillance-based business model into physical space, and make money from monitoring everything we do.

Concepts like Paribus or Trim’s “Comcast Defense” are still primitive now, too, but extrapolate the possibilities out at scale. Imagine hundreds of thousands of people on a negotiation platform like this. That becomes the ability to exert real power: to pit Comcast and Verizon and AT&T against each other for who will offer the best deal, and to have the leverage to switch all your members over en masse.
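
What might that leverage look like in code? A toy sketch of a collective reverse auction, where providers bid on the whole membership at once; the providers, bids, and member counts here are all invented:

```python
# Toy collective bargaining: invite providers to bid on an entire bloc of members at once,
# then switch everyone to the cheapest offer. Providers and numbers are invented.

def run_group_auction(members, bids):
    """bids: {provider: monthly price per member}. Returns the winner and total monthly savings."""
    current_total = sum(m["current_bill"] for m in members)
    winner = min(bids, key=bids.get)          # lowest per-member price wins the whole bloc
    new_total = bids[winner] * len(members)
    return winner, current_total - new_total

if __name__ == "__main__":
    members = [{"name": f"member{i}", "current_bill": 89.0} for i in range(100_000)]
    bids = {"Comcast": 61.0, "Verizon": 58.0, "AT&T": 64.0}
    winner, savings = run_group_auction(members, bids)
    print(f"Switch all {len(members):,} members to {winner}, saving ${savings:,.0f}/month")
```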

Reading Trim’s about page, it seems possible the thought could have crossed their minds:

So far, we’ve saved our users millions of dollars by automating little pieces of their day-to-day finances.

Now we’re starting to work on the hard stuff. Will I have enough money to retire someday? Which credit card should I get? Can my car insurance switch automatically to a cheaper provider?

Maybe Trim’s already thinking of the power of collective negotiating. Or maybe someone else will. The idea isn’t even all that new. It’s essentially how group insurance plans work. And unions. We’ve come up with it before. But we’ve never really tried it with code.

 

Buy Me A Pen


art by Curtis Mead
 

“When mass culture breaks apart,” Chris Anderson wrote a decade ago, in The Long Tail, “it doesn’t re-form into a different mass. Instead it turns into millions of microcultures, which coexist and interact in a baffling array of ways.”

On this new landscape of “massively parallel culture,” as Anderson called it, hyper-segmentation has become our manifest destiny.

Now we atomize the universal, dividing into ever nicher niches. We invent new market subsegments where none previously existed, or ever needed to. We splash pink onto pens to create “differentiated” product lines:

Why?

Because segmentation sells.

Those pink pens “For Her”? They cost up to 70% more than Bic’s otherwise identical ones for everyone.

And the algorithms got this memo: there’s incentive to create ever more segmented “filter bubbles.”

It’s lucrative. And effective.

As John Lanchester writes in the London Review of Books:

Facebook knows your phone ID and can add it to your Facebook ID. It puts that together with the rest of your online activity: not just every site you’ve ever visited, but every click you’ve ever made. Facebook sees you, everywhere. Now, thanks to its partnerships with the old-school credit firms, Facebook knew who everybody was, where they lived, and everything they’d ever bought with plastic in a real-world offline shop. All this information is used for a purpose which is, in the final analysis, profoundly bathetic. It is to sell you things via online ads.

 

Buy Me Clicks

All day long, Facebook’s News Feed algorithm “is mapping your brain, seeking patterns of engagement,” writes Tobias Rose. “It can predict what you’ll click on better than anyone you know. It shows you stories, tracks your responses, and filters out the ones that you are least likely to respond to. It follows the videos you watch, the photos you hover over, and every link you click on.”
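
Strip away the scale and the logic Rose describes is just ranking by predicted engagement. A deliberately crude sketch, with made-up click probabilities standing in for whatever models Facebook actually runs:

```python
# Crude engagement-ranked feed: show the stories you're most likely to click,
# quietly drop the ones you're least likely to respond to. Probabilities are made up.

def rank_feed(stories, predict_click, keep=10):
    """Sort candidate stories by predicted click probability and keep only the top few."""
    scored = sorted(stories, key=predict_click, reverse=True)
    return scored[:keep]

if __name__ == "__main__":
    stories = [{"id": 1, "topic": "baby photos"},
               {"id": 2, "topic": "politics"},
               {"id": 3, "topic": "hipster hostage video ad"}]
    fake_model = {"baby photos": 0.62, "politics": 0.35, "hipster hostage video ad": 0.48}
    feed = rank_feed(stories, lambda s: fake_model[s["topic"]], keep=2)
    print([s["topic"] for s in feed])   # everything else gets filtered out
```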

And Facebook follows you even when you’re not on Facebook. “Because every website you’ve ever visited (more or less) has planted a cookie on your web browser,” writes Lanchester, “when you go to a new site, there is a real-time auction, in millionths of a second, to decide what your eyeballs are worth and what ads should be served to them, based on what your interests, and income level and whatnot, are known to be.”
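
The “real-time auction” Lanchester mentions is often described as a second-price auction among advertisers bidding on your profile. A simplified, hypothetical sketch, with invented bidders and prices:

```python
# Simplified real-time bidding: advertisers bid on a user profile; the highest bid wins
# but pays the runner-up's price (a second-price auction). Bidders are invented.

def run_auction(profile, bidders):
    """bidders: {advertiser: bid_function(profile) -> dollars per impression}."""
    bids = {name: bid(profile) for name, bid in bidders.items()}
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    clearing_price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, clearing_price

if __name__ == "__main__":
    profile = {"income": "high", "recently_searched": "headphones"}
    bidders = {
        "HeadphoneCo":   lambda p: 4.50 if p.get("recently_searched") == "headphones" else 0.10,
        "LuxuryAirline": lambda p: 2.25 if p.get("income") == "high" else 0.05,
        "GenericAd":     lambda p: 0.50,
    }
    winner, price = run_auction(profile, bidders)
    print(f"{winner} wins your eyeballs at ${price:.2f} per impression")
```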

Facebook’s algorithms can create not only a private, personal pipeline of media, but an entirely individualized reality where information is repackaged for you — like pink pens — dynamically, in real-time, to whatever color, shape, or price point will extract the most value out of you specifically.

Example:

Four researchers based in Spain creat[ed] automated [shopper] personas to behave as if, in one case, ‘budget conscious’ and in another ‘affluent’, and then check[ed] to see if their different behaviour led to different prices. It did: a search for headphones returned a set of results which were on average four times more expensive for the affluent persona. An airline-ticket discount site charged higher fares to the affluent consumer. In general, the location of the searcher caused prices to vary by as much as 166 per cent.
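
The researchers’ method amounts to an A/B audit: send the same query from two trained personas and compare what comes back. A minimal sketch with fabricated results; search_prices is a stand-in for their instrumented browsers, not anyone’s real API:

```python
# Hypothetical price-discrimination audit: the same query issued by a "budget" persona
# and an "affluent" persona. search_prices() stands in for an instrumented browser session,
# and FAKE_RESULTS is fabricated data for illustration only.

FAKE_RESULTS = {
    ("headphones", "budget"):   [19.99, 24.99, 29.99],
    ("headphones", "affluent"): [89.99, 119.99, 149.99],
}

def search_prices(query, persona):
    """Placeholder for driving a real browser with the persona's cookies and history."""
    return FAKE_RESULTS[(query, persona)]

def audit(query):
    """Return the ratio of average prices shown to the affluent vs. budget persona."""
    budget = search_prices(query, "budget")
    affluent = search_prices(query, "affluent")
    avg = lambda xs: sum(xs) / len(xs)
    return avg(affluent) / avg(budget)   # >1 means the affluent persona sees higher prices

if __name__ == "__main__":
    ratio = audit("headphones")
    print(f"Affluent persona pays {ratio:.1f}x the budget persona's average price")
```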

Another instance, from 2016:

An anti-Clinton ad repeating a notorious speech she made in 1996 on the subject of ‘super-predators’… was sent to African-American voters in areas where the Republicans were trying, successfully as it turned out, to suppress the Democrat vote. Nobody else saw the ads.

“As consumers,” Tarnoff writes, “we’re nearly powerless, but as citizens, we can demand more democratic control of our data. The only solution is political.”

I’ve had the same thought. In an article last year about how technology has gotten so good at degrading us that even some of its creators are starting to say they’ve had enough, I wrote, “We take it for granted now, that cars have seat-belts to keep the squishy humans inside from flying through a meat-grinder of glass and metal during a collision. But they didn’t always. How did we ever get so clever? Regulation.”

And I still believe it. But we must also realize: we find ourselves now in a climate of hostility toward consumer and citizen protection. (Has anyone heard from the EPA lately?) And the breakneck speed with which technology is charging exponentially ahead of borders’ or regulation’s ability to keep pace isn’t about to relent. Hell, even the Council on Foreign Relations is out here like, “Democratic governments concerned about new digital threats need to find better algorithms to defend democratic values in the global digital ecosystem.”

Oof.

One way or another, when it comes to defense from algorithmic forces being deployed against us… we’re gonna need a bigger bot.

 

Buy Me Power

We’ve been classified and stereotyped and divided and conquered by algorithms. Lines of code deliver custom-targeted exploitation to billions of earthlings at once.

Can our individual fragments of power be scaled towards something bigger by them as well?

“Addressing our biggest issues as a species — from climate change, to pandemics, to poverty —” (to Jesus Christ, have you ever tried canceling your Comcast account?) “— requires us to have a common narrative of the honest problems we face,” writes Rose. “Without this, we are undermining our greatest strength — our unique ability to cooperate and share the careful and important burdens of being human.”

As individuals we are indeed basically powerless, and algorithms have proven a stunningly effective tool for extracting ever greater value out of our atomization. We have perhaps yet to imagine the potential of what algorithms deployed to concentrate our individual power into a group force can achieve at a global scale.

One thing is for certain — we need to start thinking about defense.

Oh, and if you’re a homeowner in California (or have a cool landlord) and want to lower your energy bill (and “inadvertently” accelerate the adoption of renewables onto the grid), go sign up to be a beta tester for my friend’s energy startup! They’ll install a battery for free at your home to reduce your electric bill if you let them train their algorithms to predict when you’re going to need power.

    









UX Cruelty

Don’t blame it on the algorithm — assuming you’re designing experiences for “happy, upbeat, good-life users” might make you a terrible person.

 

My friend is going through a divorce. Like nearly 5 million other Americans. And recently Facebook greeted her with this careless user experience:

 

When this UX intrusion happened to her, it reminded me of a similar psychological violation I’d read about four months earlier. That post, by Eric Meyer, had begun:

I didn’t go looking for grief this afternoon, but it found me anyway, and I have designers and programmers to thank for it. In this case, the designers and programmers are somewhere at Facebook.

I know they’re probably pretty proud of the work that went into the “Year in Review” app they designed and developed, and deservedly so—a lot of people have used it to share the highlights of their years. Knowing what kind of year I’d had, though, I avoided making one of my own. I kept seeing them pop up in my feed, created by others, almost all of them with the default caption, “It’s been a great year! Thanks for being a part of it.”  Which was, by itself, jarring enough, the idea that any year I was part of could be described as great.

Still, they were easy enough to pass over, and I did.  Until today, when I got this in my feed, exhorting me to create one of my own.  “Eric, here’s what your year looked like!”

 

A picture of my daughter, who is dead.  Who died this year.

Yes, my year looked like that.  True enough.  My year looked like the now-absent face of my little girl.  It was still unkind to remind me so forcefully.

I remember first reading this post the day it was published, Christmas Eve 2014. When I went to look it up after my friend’s own violation by a Facebook app module, I was surprised to (re)discover that it had been titled, generously, “Inadvertent Algorithmic Cruelty”:

And I know, of course, that this is not a deliberate assault.  This inadvertent algorithmic cruelty is the result of code that works in the overwhelming majority of cases, reminding people of the awesomeness of their years, showing them selfies at a party or whale spouts from sailing boats or the marina outside their vacation house.

But for those of us who lived through the death of loved ones, or spent extended time in the hospital, or were hit by divorce or losing a job or any one of a hundred crises, we might not want another look at this past year.

To show me Rebecca’s face and say “Here’s what your year looked like!” is jarring.  It feels wrong, and coming from an actual person, it would be wrong. Coming from code, it’s just unfortunate.

 

But of course, it did come from an actual person. “[The app] was awesome for a lot of people,” the product manager for Facebook’s Year in Review app, Jonathan Gheller, later told The Washington Post. Like all the digital experiences with, and within, which we all increasingly live our lives, an actual person — in fact a whole team of people — was responsible for concepting, designing, building, testing, and iterating this experience. No doubt, the responsibility for the rollout of this particular app featured prominently in a number of Facebook employees’ job performance reviews. From start to finish, this experience was crafted by people (not code). Calling its end result “inadvertent algorithmic cruelty” is like describing a drunk driving accident as “inadvertent gasoline cruelty.” For sure, it could have been avoided with an empty gas tank, but is that really the most accurate way to ascribe accountability in this situation? (Don’t blame it on the algohol).

“In creating this Year in Review app, there wasn’t enough thought given to cases like mine, or anyone who had a bad year,” Meyer wrote. “If I could fix one thing about our industry, just one thing, it would be that: to increase awareness of and consideration for the failure modes, the edge cases, the worst-case scenarios.”

If I could fix one thing about our industry, it would be to destroy the idea that these scenarios are edge cases.

Last year in the US, 2.6 million people died, leaving behind untold numbers of Facebook users who mourn the absence of their loved ones.

Right now 8.5 million people can’t find a job;

14.5 million people have cancer;

16 million people suffer from depression;

23.5 million people are addicted to alcohol and drugs;

45 million people live below the poverty line (including 16 million children).

These are not “edge cases.” These are not “worst-case scenarios.” These are all people who use Facebook. And that’s not even counting your run-of-the-mill disappointments, broken hearts, and inevitable wrongs and slights and meannesses that are, basically, life.

“The design [of the Year in Review app] is for the ideal user, the happy, upbeat, good-life user,” Meyer wrote. But if you are a product manager or UX designer creating experiences that will affect hundreds of millions of people and you are only designing for an “ideal user”… at best that’s just lazy, and at worst — it’s creating LITERAL suffering.

Put another way:


As Oliver Burkeman writes in The Guardian:

The world, obviously, is a manifestly unjust place: people are always meeting fates they didn’t deserve, or not receiving rewards they did deserve for hard work or virtuous behaviour. Yet several decades of research have established that our need to believe otherwise runs deep.

Confronted with an atrocity they otherwise can’t explain, people become slightly more likely, on average, to believe that the victims must have brought it on themselves. Hence the finding, in a 2009 study, that Holocaust memorials can increase antisemitism. Or that reading about the eye-popping state of economic inequality could make you less likely to support politicians who want to do something about it. These are among numerous unsettling implications of the “just-world hypothesis”, a psychological bias explored in a new essay by Nicholas Hune-Brown at Hazlitt.

If we didn’t all believe that [things happen for a reason] to some degree, life would be an intolerably chaotic and terrifying nightmare in which effort and payback were utterly unrelated, and there was no point planning for the future, saving money for retirement or doing anything else in hope of eventual reward. We’d go mad.

Yet, ironically, this desire to believe that things happen for a reason leads to the kinds of positions that help entrench injustice instead of reducing it.

Much in the same way that the “just world” cognitive bias can actually lead us to make crueler decisions, designing product features with the “happy, upbeat, good-life” ideal user bias can lead us to create crueler user experiences.

“To shield ourselves psychologically from the terrifying thought that the world is full of innocent people suffering,” Burkeman writes, we, as humans, “endorse policies more likely to make that suffering worse.” And by denying the full spectrum of the realities of people’s lives, awesome and tragic, we, as experience designers, do the same. Except we’re the ones with the power to actually do something about it.

“Just to pick two obvious fixes,” Meyer wrote at the end of his post, “First, don’t pre-fill a picture until you’re sure the user actually wants to see pictures from their year.  And second, instead of pushing the app at people, maybe ask them if they’d like to try a preview—just a simple yes or no.  If they say no, ask if they want to be asked again later, or never again. And then, of course, honor their choices. This is… designing for crisis, or maybe a better term is empathetic design.”

Or how about just, you know, design.
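
Meyer’s two fixes boil down to a consent gate with memory. Here’s a minimal sketch, with hypothetical helpers for asking the user and storing the answer; none of this is Facebook’s actual code:

```python
# Consent-first "Year in Review" flow, per Meyer's two fixes:
# 1) don't pre-fill photos until the user opts in; 2) ask first, and remember "never again."
# ask_user() and the prefs store are hypothetical stand-ins.

def offer_year_in_review(user, prefs, ask_user):
    choice = prefs.get(user, "unasked")
    if choice == "never":
        return None                      # honor the user's earlier answer, forever
    answer = ask_user("Would you like to preview your Year in Review?")  # yes / later / never
    if answer == "never":
        prefs[user] = "never"
        return None
    if answer == "later":
        return None                      # ask again some other time
    prefs[user] = "yes"
    return build_preview(user)           # only now do we touch the user's photos

def build_preview(user):
    return f"<preview for {user}>"       # placeholder

if __name__ == "__main__":
    prefs = {}
    print(offer_year_in_review("eric", prefs, lambda q: "never"))  # None
    print(offer_year_in_review("eric", prefs, lambda q: "yes"))    # still None: never asked again
```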

In the wake of Meyer’s post, the product manager for Facebook’s Year in Review app told The Washington Post, “We can do better.”

But four months later, Facebook’s photo collage assault on my friend suggests perhaps they don’t really think they can.

“Faced with evidence of injustice, we’ll certainly try to alleviate it if we can,” Burkeman wrote. “But, if we feel powerless to make things right, we’ll do the next best thing, psychologically speaking: we’ll convince ourselves that the world isn’t so unjust after all.”

    









SXSW 2015: The End of Techno-Joy

 

Art by: Sue Zola

 

I used to be pretty excited about technology. I’ve worked in social media since before the term existed; I co-created an app, I wrote pretty rah-rah-tech essays that people liked, like “Why Iron Man is the First 21st Century Superhero” (hint: his relationship to tech). I, like you, side-eyed wet blankets like Evgeny Morozov and Sherry Turkle and Jaron Lanier, like, sucks to be them; glad I don’t have their problem. Until, one day, I did. It started when I was writing Objectionable, and it never really went away. But perhaps nothing has made it feel quite as immersive as going to South By Southwest Interactive 2015.

The first time I went to SXSW was 8 years ago. The social web was a wild west where new and interesting things were emerging, and, unless you worked in music at the time, you probably didn’t give a shit. (Myspace was still the social network running everything, so don’t even). Twitter didn’t take off until, in fact, SXSW that year, and Facebook wouldn’t do anything at all even remotely relevant to brands until a few months later. At the time this was all a ghetto called “new media;” I had a title with the words “electronic marketing.” Since then, the tech startup industry has become a major, entrenched, cultural establishment, disrupting and colonizing other culture industries like entertainment and music. The SXSW festival, because it spans Music, Film, and Interactive technology, has come to occupy a unique position on the venn diagram of these 3 main influences of contemporary culture. So here’s a few snapshots from where we are in 2015.

 

MUSIC.

Pretty much all the highlights of my third SXSW trip that are fit to print involve music, so we might as well just get the fun out of the way first.

Before I even got to Austin, the coolest flight I’ve ever had happened to me. I was seated next to an awesome come-up rapper from Miami called Enoch da Prophet, whom you should listen to immediately:

We talked for a bit about how the new generation of hip-hop (J. Cole, Kendrick, etc.) is rebelling against the violence, materialism, and other stereotypical “bullshit” of what’s become the established hip-hop mainstream in order to define themselves and their own, new sound and vibe. If this is what the future of hip-hop sounds like, I am soooo into it.

The moment I landed in Austin, my major evaluative criterion for which otherwise indistinguishable tech-sponsored parties to attend quickly turned out to be based entirely on music. There were parties with performances and/or DJ sets by Sir Mix-a-Lot, Busta Rhymes, Nas, The Flaming Lips, and more — all as part of Interactive. You could tell the difference between Interactive parties and Music parties because the former all had nostalgia-wave acts with name recognition among middle-aged marketing executives. By contrast, if the people on stage and mostly everyone else are in their 20s and more interested in dancing than networking, and the majority of the visible badges people are wearing around their necks have the word “STAFF” on them, and the air smells like Swisher Sweets and hash, you can easily tell you’re at Music.

Mostly, the event that lived up to and exceeded the anticipation I had for it was Odesza at the Spotify House. It was their first time playing SXSW, and they were super adorable and excited and totally rocked the shit out of everything and got everyone goin’ up at 5 pm on a Tuesday.

Also fun was the party at the W with The Jane Doze, where my best friend, Jason, was managing a bunch of mermaids. Jason lives in Austin and co-runs Sirenalia, which creates custom, high-end silicone mermaid tails. The startup-sponsored party had hired a few of the mermaid performers to hang out in the pool and be generally Instagrammable.

IMG_3134

IMG_3143

Jason and I spent a lot of time talking about parenting in the age of social media, since Jason had just become a father 8 days prior. There’s a lot to consider now, like what your general philosophy is going to be about how you treat technology and content sharing, when the subject of said content is your progeny. “Kids don’t get to choose to be online now,” Jason says, “any more than they got to choose to have a mailing address.”

In retrospect, Music (and also music) turned out to be a welcome, transportive reprieve from the relentless grind of Interactive.

 

INTERACTIVE.

On the plus side of what I got to see as part of the official Interactive programming was a talk by Martin Harrison, Planning Director at Huge, entitled,  “The Empathy Gap: Why Stalin Nailed Big Data.”

“One death,” Harrison quoted Stalin, “is a tragedy; one million is just statistics.” Basically, at a certain point the scope of violation becomes so massive that our minds kind of break at trying to comprehend it or calculate a just recourse. We literally can’t even. For example, in an experiment, people gave shorter jail sentences to food company executives who knowingly poisoned 20 people (4.2 years) than to executives who poisoned 2 people (5.8 years).

One of the most notable things about this talk was that it was so good even the questions asked by the audience at the end were legitimately interesting and extended the conversation (a phenomenon as unheard of as sitting next to someone relevant on an airplane). One of the questions was how we can institutionalize empathy within risk-averse organizations reliant on the dehumanizing safety blanket of data. Harrison had some interesting thoughts about this, namely to do with having diversity among the decision-makers.

On the other side of the spectrum, I also went to a panel called Culture Clash: When Marketing and Product Converge, which I had actually been really interested in, not only personally, as this is the exact intersection of disciplines I’ve found myself straddling since launching Mirrorgram / SparkMode, but also from a macro, inter- and intra-industry perspective. For marketing, this is the next big step in the evolution of the agency model — as Deutsch’s VP, Invention Director, Christine Outram says, “from ads that are designed to die, to ads that are designed to live” as products that people use every day. And for the product side, marketing is a critical capability to understand and embrace. What the ad world (at its best) has done for the length of its existence is seek to understand and leverage insights about human behavior. (“Happiness is a billboard on the side of the road that screams with reassurance that whatever you’re doing… it’s okay. You are okay.”) This human competency is necessary to surf the culture currents and capture lifestyle opportunities in a way that features alone just don’t.

Spoiler alert: the above paragraph is not what the panel was about at all. At least not the first 20 minutes of it, after which my friend Rachel Rutherford, the co-CEO of the fashion startup Pose, said we had to go. It was hard, she said, to listen to what marketing considered to be product successes.

 


 

“Welcome to how the other half lives,” I told her.

But that’s kind of how the whole premise of SXSW is flawed, though, isn’t it? Because so few things really are the kind of marketing or product successes we’re all claiming they are — in one day I managed to attend two different presentations that both referenced the now half-a-decade-old Old Spice Guy campaign. But you can’t sell a $1200 festival pass on the reality that most of what attendees are going to hear is aggrandized to sound amazing, to look epic, to seem important, to be Instagram-worthy. SXSW Interactive has a mass hallucination to uphold. And you’ve got an expense report to justify. One talk for real included a slide titled, “So what do I need to tell my boss I learned from your presentation so my expense report gets approved?”


Ppl photo’d the shit out of that.

 

CULTURE.

One of the most jarring things that happened at SXSW was at the start of the Flaming Lips show sponsored by Spredfast, which was super cool-looking and also had the feel of being deliberately reverse-engineered for Instagram. At one point, Wayne Coyne literally stopped a song and restarted it because the audience participation on the lyrics call-and-response wasn’t up to par for the optics “for YouTube.” Later, when I relayed this story to my friends, they insisted that Coyne’s way too punk rock for the whole thing not to have been a joke, and maybe they’re right, but here’s the thing… no one in the audience got it. This is 2015. We do as many takes as it takes to get something share-worthy. It’s not a joke; it’s where we are now as a culture.

Everything feels inescapably more cynical now. One night at a party at the Handlebar, I was talking to a couple of guys from San Francisco. I mentioned that I used to live there, and they asked me the standard follow-up, “Where?” But I shook my head and said, “The question isn’t ‘where,’ it’s ‘when?’”

I lived in San Francisco in 2000. It was a totally different city then than it is now, 15 years later. One of the guys from San Francisco was working at a new mobile search app, or whatever. (The goal being to take away even just .5% of Google’s market share. #Innovation!) He was describing the environment in San Francisco now. “You go out to cafes or anywhere, and it’s just” — he hunched over, smushing his arms against his chest like a Tyrannosaurus, fingers manically typing from flaccid wrists. “Meep, meep, meep,” he said in a robotic voice, completing the pantomime. Then he lowered his hands and confessed, “I’m shitting on the situation, but at the same time, I work in this industry.” He shook his head, sighed into his drink, “I’m part of the problem.”

Earlier, at the W party with the mermaids, Jason was wearing a sailor hat to complement the aquatic motif. A guy walked up to him, his eyes darting back and forth shiftily, his voice so conspiratorially low I could barely make out what he was saying; a question that seemed too absurd and sketchy to be real. Jason smiled carefully, and shook his head.

“Did he just ask to buy your hat off of you?” I said as we walked away.

“Yes.”

“Jesus. I thought he was looking for drugs.”

“I know.”

There was a relentless, transactional quality to SXSW Interactive interactions. You could imagine people picturing price tags floating above everyone’s heads like Sims character diamonds. Is that for sale? Is he for sale? Are they for sale?

I remembered an article I’d read a few months ago in Fortune, The Age of Unicorns, that began, “Stewart Butterfield had one objective when he set out to raise money for his startup last fall: a billion dollars or nothing. If he couldn’t reach a $1 billion valuation for Slack, his San Francisco business software company, he wouldn’t bother. It wasn’t long ago that the idea of a pre-IPO tech startup with a $1 billion market value was a fantasy. Google was never worth $1 billion as a private company. Neither was Amazon nor any other alumnus of the original dotcom class. Today the technology industry is crowded with billion-dollar startups. When Cowboy Ventures founder Aileen Lee coined the term unicorn as a label for such corporate creatures in a November 2013 TechCrunch blog post, just 39 of the past decade’s VC-backed U.S. software startups had topped the $1 billion valuation mark. Now, casting a wider net, Fortune counts more than 80 startups that have been valued at $1 billion or more by venture capitalists. And given that these companies are privately held, a few are sure to have escaped our detection. The rise of the unicorn has occurred rapidly and without much warning, and it’s starting to freak some people out.”

On my last day in Austin I heard about Jumpolin, a local piñata and bouncy-house store that was torn down to make space for parking for a South by Southwest tech party:

The morning of February 12, 2015, Austinite Sergio Lejarazu was driving past his small business, Jumpolin at 1401 E. Cesar Chavez Street, on his way to drop his daughter off at school. That’s when he noticed something strange. Jumpolin wasn’t there anymore. He pulled over and quickly learned that his new landlords, Jordan French and Darius Fisher, operating as F&F Real Estate Ventures, had demolished the building that Jumpolin occupied for eight years. The building still had all the inventory, cash registers and some personal property inside. Sergio and his wife Monica say they were given no prior warning and were up-to-date on their rent with a lease good until 2017.

In the end, the sponsor wound up moving the party to a different venue anyway due to the controversy. (Although not before one of the landlords managed to make an analogy to cockroaches in regard to his tenants.)

Reading about this happening — for a festival, for a party for all of the entitled, out-of-towner assholes like me and you and everyone we know in our badge-holder echo chamber — I felt gross. We are all sighing into our free drinks now; we’re all part of the problem.

Beyond the impact of its output, undoubtedly the most pathological effect technology has already had on our culture is economic. The increasingly stratified division between the people who make a living in some technology-adjacent field, and everyone else. And worse — the way people in technology treat “everyone else.”

When you ask people if they’re from Austin, the real locals consistently add the phrase “born ‘n raised.” My best friend is one of these people. He moved back to Austin after a stint in San Francisco came to an end when he was no longer able to afford to live there. Now he sees “the Google glass people” moving to his hometown, “and they have nothing to contribute to the culture except money.”

It’s beyond a cliche now to talk about how San Francisco has changed. Living in LA (where we don’t have a non-exploitable culture anyway, ha ha ha), I’ve heard the conversation about San Francisco turning into Monaco humming away up north in the distance. But in Austin, it felt very real and present and metastatic — it felt like everywhere else would be next.

If he couldn’t reach a $1 billion valuation, he wouldn’t bother… How much for your hat?…  Is he for sale?…  20 people… 4.2 years….

One city gone is a tragedy. The rest is just statistics.

    











The Data Is The Story… And Also, My Proposed SXSW Interactive 2012 Panel

A few months ago I started noticing a proliferation of editorial content using data as the narrative foundation. The first place I noticed it was the OkTrends blog, which publishes research compiled from hundreds of millions of OkCupid user interactions. Their insight opuses on “The REAL ‘Stuff White People Like,’” and “Gay Sex vs. Straight Sex,” for example, are some of the best reads on the internet. Then I began to see it in other places. Slate.com published an article on “What Rotten Tomatoes data tell us about the best, worst, and most bizarre Hollywood trajectories;” The New York Times teamed up with OkCupid to publish a story about “the sexual availability index” — aka, what’s the best night of the week to meet someone at a bar (spoiler: Wednesday). And on it went. Once I started paying attention, these stories were everywhere. And these weren’t simply infographics — statistics visualized in fun, creative layouts — which are, themselves, already ubiquitous; these were narratives, journalistic reportage… driven by data.

In a June Media Shift article, Nicholas White, co-founder and CEO of The Daily Dot, which bills itself as “The hometown newspaper of the World Wide Web,” called data “a new kind of source.”

The news industry is built on the assumption that if you give a reporter a notebook and a few days to ramp up, he can write authoritatively on any subject. That’s not enough anymore. In today’s information-rich world, reporters need to bring more to the table.

The old skills still matter. In some sense they’re more precious than ever. But they aren’t enough. Data needs to be interpreted well, and we need people who can use technology in highly advanced ways to produce the insight readers crave. We need to ask the data the same tough questions we ask experts and other sources. We’ve enlisted sophisticated mathematicians in the cause of journalism. We’ve hired an editor that loves to geek out over data. There’s a lot of nuance and expertise in this process.

The article was titled, “The Necessity of Data Journalism in the New Digital Community.”

The data had become the story.

One night I dropped the bon mot, “the data is the story,” over wine with Hilary Read, co-founder of HUMAN, a live communications agency, and next thing I know she’s taken the idea and run with it, and I’m part of a proposed panel for SXSW Interactive 2012:

Data is the New Creative. Let’s Debate!

We’ve hit our tipping point. Where creative once was king, it now takes its marching orders from data. The question is: will it stick, and where has all the good creative gone? Come join HUMAN as they take on two savvy digital strategists to debate the merits, the pitfalls, and ultimately the humanity of data-dominated creative. This session will be a mixture of theatrics, metrics, and live data-generated artwork that is sure to entice even the most cynical enthusiasts. We won’t know how it will end until we get there. Come help us decide: Is data really the new creative?

You should vote for this panel at SXSW 2012: HERE.

And in the meantime, you can rep for your team — Data vs. Creative — HERE

    


