UX In Advertising

This is where we are now:
In contrast, here’s what it looks like when technology ads rely on pushing the lingua franca of features instead of the native tongue of *experience*:

The technology pervading our lives has brought with it a new colonizing language. Even the term “UX” has become mainstream enough within the cultural lexicon that it can now be referenced explicitly, as in the new MySpace ad. But more importantly, we have evolved a shared vocabulary for technology that goes beyond the rudimentary terms of features and specifications. In the years since Apple first pioneered and perfected this approach, we have all become fluent in technology’s emotional language.

This split between the emotional and the rational appeals, between the experience and the specs, comes at an interesting time. As Millennials notoriously buy fewer cars (“Even the proportion of teenagers with a [driver’s] license fell by 28 percent between 1998 and 2008”), the new technology that keeps us connected is now being sold like automobiles.
(Thanks to @ThomPulliam for pointing out the common theme.)

Subscribe for more like this.






“Web Design” Is Dead


One of my favorite Steve Jobs quotes — if not THE favorite — is this:

People think it’s this veneer — that the designers are handed this box and told, ‘Make it look good!’ That’s not what we think design is. It’s not just what it looks like and feels like. Design is how it works.

It’s a sentiment echoed in the philosophy behind Apple’s completely rethought new design language for the forthcoming iOS 7, just announced this week:

Nothing we’ve ever created has been designed just to look beautiful. That’s approaching the opportunity from the wrong end. Instead, as we reconsidered iOS, our purpose was to create an experience that was simpler, more useful, and more enjoyable — while building on the things people love about iOS. Ultimately, redesigning the way it works led us to redesign the way it looks. Because good design is design that’s in service of the experience.

This is not only a standard that Apple holds itself to; it now extends to all those who develop on the iOS platform, with comprehensive guidelines on how third-party developers should design for iOS 7 to match Apple’s own style. As TechCrunch puts it, “Developers will have to adapt their apps to match the rest of the operating system if they don’t want them to look antiquated.”

Here are Apple’s three main themes for developing for iOS 7:

Deference. The UI helps users understand and interact with the content, but never competes with it.

Clarity. Text is legible at every size, icons are precise and lucid, adornments are subtle and appropriate, and a sharpened focus on functionality motivates the design.

Depth. Visual layers and realistic motion heighten users’ delight and understanding.

On Apple’s list of things app developers should do to get ready for iOS 7 are instructions like: “Revisit the use of drop shadows, gradients, and bezels. Because the iOS 7 aesthetic is smooth and layered — with much less emphasis on using visual effects to make UI elements look physical — you may want to rethink these effects.”

This kind of “smooth” digital design aesthetic, which rejects the skeuomorphism of making icons on a flat screen look like 3-dimensional, analog objects, has a name: “flat design.” And Apple was not even the first to adopt it. (They were the last holdout, in fact.) Microsoft and Google got there first. Back in 2011, when Microsoft unveiled its “Metro” design language, now simply referred to as “Windows 8,” its design principles were:

Clean, Light, Open and Fast

We took an approach that we call “Fierce Reduction” to remove any elements in the UI that we felt were unnecessary; both visual elements and feature bloat. It allows us to shine a focus on the primary tasks of the UI, and makes the UI feel smart, open, fast, and responsive.

Alive in Motion

The transitions between screens in a UI are as important as the design of the screens themselves. Motion gives character to a UI, but also communicates the navigation system, which helps to improve usability.

Celebrate Typography

Our design inspiration is very typographic, and it felt like it was time for User Interfaces to be uncompromising about type as well. Type is information, type is beautiful.

Content, Not Chrome

It’s the content on the phone that people want, not the buttons. Reducing the visuals on the phone that aren’t content will help you create a more open UI, and it also promotes direct interaction with the content.

Authentically Digital

Finally, we believe in honesty in design. A user interface is created of pixels, so in Metro we try to avoid using the skeuomorphic shading and glossiness used in some UIs that try to mimic real world materials and objects.


With Apple joining the tech trifecta (“Leading by following”), flat design has been transformed from a trend into a manifesto. It is a fundamental philosophical shift in what design’s role is in the natively digital experience.

Living in L.A., it’s very easy for all metaphors and analogies to get reduced to the automotive experience. So here we go — for a long time on the web, we had the same people, approaching in the same way, the design of something like this:


 

And something like this:


I mean, we just didn’t know any better. We didn’t really grasp that we needed completely different types of design philosophies. All we knew was that things on the web had to…. look….. like something….. Maybe … something pretty?…. Or… Cool? Anyway, we had to get designers. To design them. And what visual arts genius was going to want to create a digital masterpiece that looks like they were barely there in the first place?

 


Orite…. Apple.

For many a designer — sadly, still — whatever the reason a user has come to the destination they are designing, it is second to the privilege of being exposed to the designer’s creative brilliance and superior taste.

Apple is saying, this is where we are as a culture: we’re past that now. Apple wants “designers” to get out the way. Users are not here to marvel at your “design.” They’re here to get to the shit your design is jumping up and down, waving its hands frantically trying to get attention, getting in the way of. Apple wants to make it very clear that the UI — that layer between the human, and the content that this human is trying to access, aka the “design” layer — is not the star. It should, quote, “play a supporting role.”

Commuters don’t care about your “creative vision.” They are just trying to get fucking home.

If you want to be Dali, you should probably not be a freeway designer. But if you want to design freeways — or iOS experiences — then your art is about making something sublimely useful, usable, and effective. This is what the companies defining the way we access the digital world all stand for now. This is what they believe makes for a beautiful experience on their devices and on their operating systems.

“Web design” is dead because everywhere the “design layer” of the web is being sandblasted off, the interface reduced down to its barest essence. This is why the new, natively digital design disciplines are found deep beyond the surface of aesthetics, in user experience design, in information architecture, in interaction design.

More than ever, Jobs’ words are true: design is now a fundamentally inextricable part of how it works.

There’s simply not much room left for anything else.

And if you think you’ll at least get to choose the colors based on your personal design taste…. like Apple says, “you may want to rethink” that as well.

From Fast Company’s “The Science Behind Colors in Marketing“:

“Green connotes ideas like “natural” and “environment,” and given its wide use in traffic lights, suggests the idea of “go” or forward movement. The color red, on the other hand, is often thought to communicate excitement, passion, blood, and warning. It is also used as the color for stopping at traffic lights. Red is also known to be eye-catching.”

So, clearly, an A/B test between green and red would result in green, the more friendly color. At least, that was the experimenters’ guess. Here is what their experiment looked like:


The red button outperformed the green button by 21%.

What’s most important to consider is that nothing else was changed at all: 21% more people clicked on the red button than on the green button. Everything else on the pages was the same, so it was only the button color that made this difference.
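The article doesn’t report the experiment’s actual sample sizes, so here is a minimal sketch of how a result like this gets checked, using made-up visitor and click counts chosen to reproduce a 21% relative uplift (all numbers are hypothetical), plus a standard two-proportion z-test to ask whether the difference could be noise:

```python
from math import sqrt, erf

def two_proportion_ztest(clicks_a, n_a, clicks_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)            # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))  # standard error of the difference
    z = (clicks_b / n_b - clicks_a / n_a) / se
    phi = lambda x: 0.5 * (1 + erf(x / sqrt(2)))            # standard normal CDF
    p_value = 2 * (1 - phi(abs(z)))
    return z, p_value

# Hypothetical numbers: green converts at 10.0%, red at 12.1%.
green_clicks, green_n = 100, 1000
red_clicks, red_n = 121, 1000

uplift = (red_clicks / red_n) / (green_clicks / green_n) - 1
z, p = two_proportion_ztest(green_clicks, green_n, red_clicks, red_n)
print(f"relative uplift: {uplift:.0%}, z = {z:.2f}, p = {p:.3f}")
```

Worth noting: at these invented sample sizes, a 21% relative lift is not yet statistically significant (p is around 0.13), which is why the raw percentage alone doesn’t settle a test; the real experiment would have needed enough traffic for the difference to clear the usual p < 0.05 bar.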







The Last Exit To The Millennium

“Those of us who watched Kids as adolescents,” writes Caroline Rothstein, in her Narrative.ly piece Legends Never Die, “growing up in an era before iPhones, Facebook, and Tiger Moms, had our minds blown from wherever we were watching–whether it was the Angelika Film Center on the Lower East Side or our parents’ Midwestern basements. We were captivated by the entirely unsupervised teens smoking blunts, drinking forties, hooking up, running amok and reckless through the New York City streets…. Two decades after [the] film turned Washington Square skaters into international celebrities, the kids from ‘Kids’ struggle with lost lives, distant friendships, and the fine art of growing up.”

If you came up in the ’90s, you remember Kids. But I’d hardly given it a backward glance in ages. Had it really been two decades? It seemed somehow inconceivable. The cast, none of them professional actors, all plucked from the very streets they skated on, had become fixed in my mind as eternal teenagers, immortalizing a hyperbolized — and yet, not entirely foreign — experience. Kids was grotesque and dirty and self-indulgent and unignorable, and so was high school. Which is where I, and my friends, were at the time. The movie had become internalized. I had entirely forgotten that this was where Chloë Sevigny and Rosario Dawson had come from. Like a rite of passage, it seemed to carry a kind of continuity, like it was something everyone goes through. It seemed disconnected from any kind of evolving timeline.

And yet time had passed. Revisiting the lives of the cast 20 years later, Rothstein writes, “Justin Pierce, who played Casper, took his life in July 2000, the first of several tragedies for the kids. Harold, who played himself in the film and is best remembered for swinging his dick around in the pool scene—he was that kid who wasn’t afraid, who radiated a magnetic and infectious energy both on and off screen—is gone too. He died in February 2006 from a drug-induced heart attack.” Sevigny and Dawson have become successful actors. Others tied to the crew have gone on to lead the skate brand Zoo York, and start a foundation that aims to “use skateboarding as a vehicle to provide inner-city youth with valuable life experiences that nurture individual creativity, resourcefulness and the development of life skills.” The most striking story for me, however, was of what happened over the past 20 years to the movie’s most profoundly central character:

“I think that Kids is probably the last time you see New York City for what it was on film,” [says Jon “Jonny Boy” Abrahams]. “That is to me a seminal moment in New York history because right after that came the complete gentrification of Manhattan.”

Kids immortalizes a moment in New York City when worlds collided–“the end of lawless New York,” Eli [Morgan, co-founder of Zoo York] says–before skateboarding was hip, before Giuliani cleaned up, suited up, and wealthy-ed up Manhattan.

“I don’t think anyone else could have ever made that movie,” says Leo [Fitzpatrick, who played the main character, Telly]. “If you made that movie a year before or after it was made, it wouldn’t be the same movie.”

Kids’ low-budget grit and amateur acting gave it a strange ambivalence. It was neither fully fictional nor fully real. It blurred the line between the two in a way that it itself did not quite fully understand — it was the very, very beginning of “post-Empire,” when such ambiguities would become common — and neither did we. Detached from the confines of the real and the fictional, it had a sense of also being out of time. But it turns out it was in fact the opposite. Kids was a time capsule. As Jessica [Forsyth] says in the article: “It’s almost like Kids was the dying breath of the old New York.”

It’s a strange thing. One day you wake up and discover that culture has become history. In the end it wasn’t a dramatic disaster or radical new technology that changed the narrative in an instant. It was a transition that happened gradually. The place stands still, and time revolves around it, changing it the way wind changes the topography of dunes.

Just a few days after Rothstein’s piece, I read these truly chilling words in The New York Times:

“The mean streets of the borough that rappers like the Notorious B.I.G. crowed about are now hipster havens, where cupcakes and organic kale rule.”

For current real estate purposes, the block where the Brooklyn rapper Notorious B.I.G., whose real name was Christopher Wallace, once sold crack is now well within the boundaries of swiftly gentrifying Clinton Hill, though it was at the edge of Bedford-Stuyvesant when he was growing up. Biggie, who was killed under still-mysterious circumstances in 1997, was just one of the many rappers to emerge from Brooklyn’s streets in the ’80s and ’90s. Including successful hardcore rappers, alternative hip-hop M.C.s, respected but obscure underground groups and some — like KRS-One and Gang Starr — who were arguably all of the above, the then-mean streets gave birth to an explosion of hip hop. Among the artists who lived in or hung out in this now gentrified corner of the borough: Not only Jay-Z, but also the Beastie Boys, Foxy Brown, Talib Kweli, Big Daddy Kane, Mos Def and L’il Kim.

For many, the word “Brooklyn” now evokes artisanal cheese rather than rap artists. The disconnect between brownstone Brooklyn’s past and present is jarring in the places where rappers grew up and boasted about surviving shootouts, but where cupcakes now reign. If you look hard enough, the rougher past might still be visible under the more recently applied gloss. And if you want to buy a piece of the action, Biggie’s childhood apartment, a three-bedroom walk-up, was recently listed by a division of Sotheby’s International Realty. Asking price: $725,000.

When we imagine the world of the future, it is invariably a world of science fiction. It’s always, “Here’s what Los Angeles might look like in seven years: swamped by a four-foot rise in sea level, California’s megalopolis of the future will be crisscrossed with a thousand miles of rail transportation. Abandoned freeways will function as waterslides while train passengers watch movies whiz by in a succession of horizontally synchronized digital screens. Foodies will imbibe 3-D-printed protein sculptures extruded by science-minded chefs.”

It’s always impersonal. The future, even one just seven years away, seems always inhabited entirely by future-people. It’s not a place where we actually imagine….ourselves. Who will we be when the music that speaks to us now becomes “Classic” (Attention deficit break: “Elders react to Skrillex“); when the movies or TV shows or — let’s be real, it’s most likely going to be — web content that captures the spirit of this moment becomes a time capsule instead of a reflection? When once counter-cultural expressions — like skating, or hip hop — become mainstream? Who will we be when there is no longer a mainstream, or a counter-culture, for that matter? And who will the teenagers of this future be when the culture of their youth ages?

The past isn’t a foreign country. It’s our hometown. It’s the place we left, that has become immortalized in our memory the way it was back then. We return one day to discover new buildings have sprung up in empty lots, new people have moved in and displaced the original residents. Some from the old neighborhood didn’t make it out alive. The past has moved while we weren’t looking. It’s no longer where it was at all.

“In the ’80s and ’90s–as strange as it may seem to say this–we had such luxury of stability,” William Gibson, the once science-fiction writer who popularized the word “cyberspace,” and turned realist novelist in the 21st century, said in a 2007 interview. “Things weren’t changing quite so quickly in the ’80s and ’90s. And when things are changing too quickly you don’t have any place to stand from which to imagine a very elaborate future.”

Yet this week, it seems to me the more mysterious our future, the more the past becomes a moving target.

Then again, perhaps it always was.

Strange memories on this nervous night in Las Vegas. Five years later? Six? It seems like a lifetime, or at least a Main Era—the kind of peak that never comes again. San Francisco in the middle sixties was a very special time and place to be a part of. Maybe it meant something. Maybe not, in the long run… but no explanation, no mix of words or music or memories can touch that sense of knowing that you were there and alive in that corner of time and the world. Whatever it meant.…

History is hard to know, because of all the hired bullshit, but even without being sure of “history” it seems entirely reasonable to think that every now and then the energy of a whole generation comes to a head in a long fine flash, for reasons that nobody really understands at the time—and which never explain, in retrospect, what actually happened.

There was madness in any direction, at any hour. You could strike sparks anywhere. There was a fantastic universal sense that whatever we were doing was right, that we were winning.… We had all the momentum; we were riding the crest of a high and beautiful wave.…

So now, you can go up on a steep hill in Las Vegas and look West, and with the right kind of eyes you can almost see the high-water mark—that place where the wave finally broke and rolled back.

– Hunter S. Thompson

Map of New York City showing the remnants of the 6ft high water line from Hurricane Sandy.
Crom Martial Training, Rockaway Beach. (Source)







The Search For Stark

First of all, do yourself a favor and watch this 2 minutes and 44 seconds of utter awesomeness above.

Then recall the ending of Iron Man 3. In fact, recall the entire 130 minutes of its insulting, technology guilt-laden self-hatred.

Or better yet, don’t do that.

If you’ve been here since 2010, you know that I have had a special place in my heart for the character I called “The First 21st Century Superhero.” Tony Stark — as reimagined by Jon Favreau, and reincarnated by Robert Downey Jr. — and I have had an unexpectedly personal relationship these past 3 years. Ever since Favreau retweeted my post, it took on a life of its own and became the most popular thing I’d ever written. From the intimacy of Tony Stark’s relationship with his gadgets, to his eschewal of a secret identity in favor of that uniquely post-digital virtue of radical transparency, to his narcissism, Favreau’s Iron Man reflected a radical departure from the tropes that defined the 20th century superhero.

I could tell you about how Shane Black, who directed this third installment in the Iron Man franchise, tried his best to undo all that. How deliberately he went after the things that not only made Tony Stark so brilliantly modern, but also lay at the very heart of his character. I could tell you about the relentless “techno fear” that ran like an electromagnetic current through the entire movie from start — on New Year’s Eve 1999, ground zero of the Y2K paranoia — to finish — with Stark throwing his arc reactor heart into the ocean like he’s an old lady letting go of a luminescent, blue burden at the end of fucking Titanic. Or some shit.

I could tell you how this conflicted, 20th century relationship to technology, wielded with all the subtlety of Catholic guilt, bashed all of us over the head like a blunt instrument the first time we saw Pepper and Tony on screen together — but wait! That’s not actually Tony. It’s a Siri-powered autonomous-driving Iron Man suit, and it’s just asked Pepper to, quote, “Kiss me on my mouth slit.”

(I seriously feel like I need to go wash my hands with soap now after typing those words.)

And yet, under Favreau’s direction, Pepper kissing Tony’s helmet in Iron Man 2 was most likely one of the sexiest moments Gwyneth Paltrow has ever had on film:

 

iron_man_2_alternate_opening_movie_image_slice_01

 

I could tell you how Black drove Tony Stark into hiding (while Favreau celebrated his coming out) and stripped him of his suit and access to his technology, making him fight his battles in the flesh for most of the film. We’re to believe Stark built a more advanced suit while a POW in a cave in fucking Afghanistan than he could on his credit limit in Tennessee??

 

tumblr_inline_mlmdtfwRqa1qz4rgp

 

I could tell you how the thing I was thinking about the most as I walked out of the theater — even more than that Black got thisclose to turning Pepper into a legitimate superhero in her own right, which would have been practically the only 21st-century compliant move he’d have made in the whole movie, but then, of course Tony had to “fix” her back to normal — was:

THANK GOD STEVE JOBS DID NOT LIVE TO SEE TONY STARK THROW HIS HEART INTO THE FUCKING OCEAN.

Do you remember the love that the first Iron Man movie, and the Tony Stark in it, had for his first suit? The one he made in captivity. The painstaking, terrifying labor that birthed this child of necessity? The metal manifestation of the power of ingenuity and creativity and talent that won him his freedom? Remember his second suit? The one he built once he got back home. The hotter, cooler, younger sibling of the scrap heap he’d left in the desert. The first real Iron Man suit. How much fun he had making it, tweaking it, perfecting it, and how much fun we had going along on the joyride? Tony Stark fought a custody battle against the American government for the suit in Iron Man 2. He said no one else could have it. He said the suit he created was a part of him, that he and it were one. And we all intimately understood exactly what he meant. Because even if the rest of us don’t actually literally plug our gadgets into our chest cavities, 80% of us go to sleep with our phone by our bedside.

I could tell you how Shane Black changed all that for Tony, replaced his passion for innovation with a 20th century irreconcilability. His suits, once so precious the greatest military superpower in the world couldn’t force him to part with just one, have been rendered as meaningless as disposable cups. For Black’s Iron Man, technology still has friction. He can “disconnect,” can “unplug.” This feels like a “real” thing to do. As if there is still a world that isn’t part of the digital world. It’s not just an anachronistic, Gen X misunderstanding of the Millennial reality, it kills what makes Tony Stark, Tony Stark.

“We create our own demons” are the first words we hear as the movie begins. Stark is speaking in voiceover, and this becomes his ongoing refrain throughout the movie. We create our own demons. We create our own demons. By the end, when Stark destroys all of his dozens of indistinguishable suits — because they are “distractions” (the actual word he uses, twice), because we create our own demons and these are his creations, because (and this is the most fucked up part of all) he thinks this is what will make Pepper happy — it is the moment that Black destroys the soul of this character.

proof that tony stark has a heart

Imagine Steve Jobs throwing the iPhone prototype into the ocean and walking away.

Imagine Elon Musk, after whom Favreau modeled his interpretation of the modern-day tech genius inventor, driving a fleet of Teslas off a cliff.

I could tell you how Shane Black imagined it.

Speaking to an audience at Stanford in the wake of The Social Network, Mark Zuckerberg said, “The framing [of the movie] is that the whole reason for making Facebook is because I wanted to get girls, or wanted to get into clubs…. They just can’t wrap their head around the idea that someone might build something because they like building things.”

This is why Tony Stark builds things. Because he likes building things. Technology is not a “distraction” from something realer, it is a part of what IS real.  The digital and the analog worlds aren’t binary. They are inextricably intertwined. Technology is as much a part of us now as it has always been for Tony Stark — corporeally and philosophically. And there is no going back. Texting is not a distraction from the “realness” of the telephone — itself, a completely unnatural, manufactured, awkward medium that we all learned to take communication through for granted. Electricity is not a distraction from the “realness” of candle-light. Driving a car is not a distraction from the “realness” of riding a horse.

Which brings us back to this impeccably clever Audi commercial.

It features the two actors who’ve played Spock, himself an embodiment of hybridity, in a battle that starts out via iPad chess, doubles down over the phone, escalates by car, and culminates with the finishing touch of a Vulcan nerve pinch. It makes the depiction of the permeable membrane between the digital and the analog, of the seamless absorption of a “fictional” personality into the “real” self, and of unapologetic techno-joy look effortlessly cool.

This is the Audi ad Iron Man USED TO BE!

In 2010, I wrote:

The first 21st century superhero is a hedonistic, narcissistic, even nihilistic, adrenaline junkie, billionaire entrepreneur do-gooder. If Peter Parker’s life lesson is that “with great power comes great responsibility,” Tony Stark’s is that with great power comes a shit-ton of fun.

You can’t get any more Gen Y than that.

Three Mays later, Tony Stark has changed. He’s entirely forgotten how to have fun. He doesn’t even get joy out of building things anymore — hell, he was having a better time when he had a terminal illness, back when Favreau was at the helm. Under Black’s direction, Stark doesn’t seem excited about anything. He’s on Xanax for his panic attacks, I’m assuming, since there isn’t a single thing that fills him with anywhere near the kind of fascination Leonard Nimoy and Zachary Quinto express as they watch a self-driving Audi pull out of a golf club driveway. As Black sees it, to embrace the technological innovation that is in Tony Stark’s blood — both figuratively and literally — to create something that isn’t a demon, to want to build things because he likes building things, all of that would somehow make Stark less human.

But as the mixed-race Spock always knew — what makes us human can’t be measured in degrees.

Oh well.

thanks for keeping the seat warm gen x we'll take it from here sincerely gen y

 

After all….. It’s only logical.







Lose My Number


One of the 20-somethings I’ve been working with over the past few weeks sent me a summary, put together by another Millennial colleague, of Sherry Turkle’s book, Alone Together: Why We Expect More from Technology and Less from Each Other. An MIT technology and society specialist, Turkle argues that our relentless modern connectivity leads to a new disconnect. As technology ramps up, our emotional lives ramp down. Hell in a handbasket, yadda yadda.

The Millennial reviewer called it “a fascinating and highly depressing affirmation of many of the problems that face my sister and I’s generation.” The summary includes such gems as:

– Young adults often cite the awkwardness of attempting to steer a phone call to its conclusion as one of the top reasons they avoid phone calls. This skill is not one that teens seem open to learning as a quick “got to go, bye” is easier than determining a natural way to break off a conversation.
 
– Teens avoid making phone calls for fear that they reveal too much. Texting is easier and “more efficient” than a human voice. Things that occur in “realtime” take far too much time. Even adults and academics admit that they would rather leave a voicemail or send an email than talk face-to-face.
 
– We used to live in an era when teenagers would race to the ringing phone after suppertime; now teens are content with receiving fewer calls in favor of texts or Facebook messages.

But here’s what I want to know:

Why is telephone-based behavior the benchmark of communication proficiency?

The telephone isn’t part of our biology. It is, itself, a completely unnatural, manufactured, utterly awkward medium that we all learned to take communication through for granted.  

It could be lamented that “kids these days” don’t know the etiquette of visiting-card communication either — which is what people had to resort to before the phone:

Visiting cards became an indispensable tool of etiquette, with sophisticated rules governing their use. The essential convention was that one person would not expect to see another person in her own home (unless invited or introduced) without first leaving his visiting card for the person at her home. Upon leaving the card, he would not expect to be admitted at first, but might receive a card at his own home in response. This would serve as a signal that a personal visit and meeting at home would not be unwelcome. On the other hand, if no card were forthcoming, or if a card were sent in an envelope, a personal visit was thereby discouraged. As an adoption from French and English etiquette, visiting cards became common amongst the aristocracy of Europe, and also in the United States. The whole procedure depended upon there being servants to open the door and receive the cards and it was, therefore, confined to the social classes which employed servants. If a card was left with a turned corner it indicated that the card had been left in person rather than by a servant.

I mean, oy. You know?

The summary goes on:

For young adults electronic media “levels the playing field” between outgoing people and shy people, removing the barrier of awkward conversations and body language. Texts allow a two-minute pause to become normal and acceptable while the recipient thinks of an adequate response. The same is not possible in face-to-face conversations. Screens offer a place to reflect, retype and edit.

The assumption here being that two minutes is NOT acceptable? For the thousands of years when all we had was the paper operating system, it could take two months, or TWO YEARS, to get a reply. The drama of the entire oeuvre of Jane Austen and the Brontë sisters hinges on telecommunications not having been invented, just as much as the drama of Ferris Bueller’s Day Off hinges on cell phones not having been invented.

So why are we lamenting the telephone and all its associated UI becoming irrelevant and obsolete? (Current unlistened-to voicemail count: 49. You?) It was never meant to last forever. The telephone is like whale blubber. Brian Eno knows what I’m talking about:

“I think records were just a little bubble through time and those who made a living from them for a while were lucky. I always knew it would run out sooner or later. It couldn’t last, and now it’s running out. I don’t particularly care that it is and like the way things are going. The record age was just a blip. It was a bit like if you had a source of whale blubber in the 1840s and it could be used as fuel. Before gas came along, if you traded in whale blubber, you were the richest man on Earth. Then gas came along and you’d be stuck with your whale blubber. Sorry mate – history’s moving along. Recorded music equals whale blubber. Eventually, something else will replace it.”

Music existed before the plastic disc, and it will continue to exist after the MP3. Communication existed before the telephone, and it will continue to exist after the text message.

It just takes a frame of reference broader than what one generation takes for granted — or finds foreign — to see it.
