Why We Need to Pick Up Alvin Toffler’s Torch

This article was published in The New York Times in July, in Farhad Manjoo's column, a week after Alvin Toffler died.


More than 40 years ago, Alvin Toffler, a writer who had fashioned himself into one of the first futurists, warned that the accelerating pace of technological change would soon make us all sick. He called the sickness “future shock,” which he described in his totemic book of the same name, published in 1970.

In Mr. Toffler’s coinage, future shock wasn’t simply a metaphor for our difficulties in dealing with new things. It was a real psychological malady, the “dizzying disorientation brought on by the premature arrival of the future.” And “unless intelligent steps are taken to combat it,” he warned, “millions of human beings will find themselves increasingly disoriented, progressively incompetent to deal rationally with their environments.”

Mr. Toffler, who collaborated on “Future Shock” and many of his other books with his wife, Heidi, died last week at 87. It is fitting that his death occurred in a period of weeks characterized by one example of madness after another — a geopolitical paroxysm marked by ISIS bombings, “Brexit,” rumors of Mike Tyson taking the stage at a national political convention and a computer-piloted Tesla crashing into an old-fashioned tractor-trailer. It would be facile to attribute any one of these events to future shock.

Yet in rereading Mr. Toffler’s book, as I did last week, it seems clear that his diagnosis has largely panned out, with local and global crises arising daily from our collective inability to deal with ever-faster change.

All around, technology is altering the world: Social media is subsuming journalism, politics and even terrorist organizations. Inequality, driven in part by techno-abetted globalization, has created economic panic across much of the Western world. National governments are in a slow-moving war for dominance with a handful of the most powerful corporations the world has ever seen — all of which happen to be tech companies.

But even though these and bigger changes are just getting started — here come artificial intelligence, gene editing, drones, better virtual reality and a battery-powered transportation system — futurism has fallen out of favor. Even as the pace of technology keeps increasing, we haven’t developed many good ways, as a society, to think about long-term change.

Look at the news: Politics has become frustratingly small-minded and shortsighted. We aren’t any better at recognizing threats and opportunities that we see emerging beyond the horizon of the next election. While roads, bridges, broadband networks and other vital pieces of infrastructure are breaking down, governments, especially ours, have become derelict at rebuilding things — “a near-total failure of our political institutions to invest for the future,” as the writer Elizabeth Drew put it recently.

In many large ways, it’s almost as if we have collectively stopped planning for the future. Instead, we all just sort of bounce along in the present, caught in the headlights of a tomorrow pushed by a few large corporations and shaped by the inescapable logic of hyper-efficiency — a future heading straight for us. It’s not just future shock; we now have future blindness.

“I don’t know of many people anymore whose day-to-day pursuit is the academic study of the future,” said Amy Webb, a futurist who founded the Future Today Institute.


It didn’t have to come to this. In the 1950s, 1960s and 1970s, as the American government began to spend huge sums in the Cold War, futurists became the high priests of the coming age. Forecasting became institutionalized; research institutes like RAND, SRI and MITRE worked on long-range projections about technology, global politics and weaponry, and world leaders and businesses took their forecasts as seriously as news of the present day.

In 1972, the federal government even blessed the emerging field of futurism with a new research agency, the congressional Office of Technology Assessment, which reviewed proposed legislation for its long-term effects. Futurists were optimistic about lawmakers’ new interest in the long term.

“Congressmen and their staffs are searching for ways to make government more anticipatory,” Edward Cornish, president of the World Future Society, said in 1978. “They’re beginning to realize that legislation will remain on the books for 25 or 50 years before it’s reviewed, and they want to be sure that what they do now won’t have an adverse impact years from today.”

But since the 1980s, futurism has fallen from grace. For one thing, it was taken over by marketers.

“‘Futurist’ always sounded like this weird, made-up, science-fiction term,” Ms. Webb said, even though in its early years, people were doing deep, nuanced research on how various tech and social movements would shape the world.

Futurism’s reputation for hucksterism became self-fulfilling as people who called themselves futurists made and sold predictions about products, and went on the conference circuit to push them. Long-term thinking became associated with the sort of new-agey “thinkfluencers” who hung out at TED and Davos, and who went by names like Shingy and Faith Popcorn. Futurism became a joke, not a science.

The end of the Cold War and a rise in partisan political interests also changed how lawmakers saw the utility of looking at the future. In the Reagan years, many on the right began to see the government as the cause of most of the nation’s ills. The idea that the government could do something as difficult as predict the future came to be considered a ridiculous waste of money.

Newt Gingrich has long been enamored of science fiction — he wants to build a moon base. But when Mr. Gingrich, a Georgia Republican, became speaker of the House in 1995, he quickly shut down the Office of Technology Assessment. The government no longer had any place for futurists, and every decision about the future was viewed through the unforgiving lens of partisan politics.

Of course, the future doesn’t stop coming just because you stop planning for it. Technological change has only sped up since the 1990s. Notwithstanding questions about its impact on the economy, there seems no debate that advances in hardware, software and biomedicine have led to seismic changes in how most of the world lives and works — and will continue to do so.

Yet without soliciting advice from a class of professionals charged with thinking systematically about the future, we risk rushing into tomorrow headlong, without a plan.

“It is ridiculous that the United States is one of the only nations of our size and scope in the world that no longer has an office that is dedicated to rigorous, nonpartisan research about the future,” Ms. Webb said. “The fact that we don’t do that is insane.”

Or, as Mr. Toffler put it in “Future Shock,” “Change is avalanching upon our heads and most people are grotesquely unprepared to cope with it.”

 

Tastemade

Who said YouTube is only for videos of frisky kittens and semi-hysterical dogs? Indeed, the video platform that made intelligent channels possible (such as SciShow and the Intelligent Channel) has also arrived in the kitchen. Tastemade, which just received a capital injection of 10 million dollars, won't take long to become the Food Network of the digital era. Tastemade has 100 channels built with extremely attentive curation, devoted entirely to recipe videos. The network even offers culinary-arts training to its video creators, which is how they guarantee the aesthetic standard.

Showrunners, the movie


Showrunners: A Documentary Film is the first-ever feature-length documentary to explore the fascinating world of US television showrunners and the creative forces aligned around them. These people are responsible for creating, writing and overseeing every element of production on one of the United States' biggest exports – television drama and comedy series. The film takes audiences behind the scenes of the chaotic world of the showrunner to reveal the incredible amount of work that goes into making sure their favorite TV series airs on time, as well as the challenges that showrunners must overcome to ensure a new series makes it onto the schedule at all.

The premise is simple: Director Des Doyle and crew interview some of the biggest showrunners in Hollywood, giving the audience an inside look at how their favorite shows are made. It’s the kind of behind-the-scenes documentary film fans eat up.

Some of the biggest names in American television appear in Showrunners: A Documentary Film.

The Socially Deficient

Digital connections have reached a point of mental and emotional exhaustion… Have they hit the ceiling? I find it hard to believe they will grow any further, or evolve in a way that makes our connections more productive or fulfilling. Maybe this is the Mayan prophecy after all. The global village is full. Here's a story by David Carr, published today in The New York Times, under the title “My Dinner with Clay Shirky….”

Last week, I had dinner at Clay Shirky’s house along with a group of journalists and academics, all of whom are very active on the Web. Mr. Shirky is the oft-quoted thinker who wrote “Here Comes Everybody” and “Cognitive Surplus.” I have no idea why he was having the dinner or why I was invited, only that it sounded like a fun, smart bunch. We all “know” each other, either through a direct online connection or by digital reputation, but I am more familiar with them as digital avatars than as people.

As a group, we have some things in common, like an obsession with the future of journalism, and I’ve engaged in vigorous digital debate with people at the dinner over the willingness of people to pay for journalism.

But, funny enough, we didn’t talk about that much in person. We talked about Twitter, the Facebook I.P.O. and the limits and glories of Storify. But we also talked about “Homeland,” school politics, New York provincialism and, as it turned out, bread.

Before dinner, Mr. Shirky set out some bread. It was warm on the inside, crusty on the outside, little French loaves of goodness. I went into the kitchen to ask where Mr. Shirky had found such treasure and he pointed to the oven and the loaf pans on top.

As it turns out, Mr. Shirky became very good at bread eating at a young age, so his mother decided that he should also be good at bread making. We all chewed on the bread as Mr. Shirky told the story of learning how to make bread as a 10-year-old.

Now, he could have told that story in a blog post or in an e-mail chain, but it became a very different story because we were tasting what he talked about. The connection in an online conversation may seem real and intimate, but you never get to taste the bread. To people who lead a less-than-wired existence, that may seem like a bit of a “duh,” but I spend so much time interacting with people on the Web that I have become a little socially deficient.

In her book “Alone Together,” Sherry Turkle has written about the lack of nutrition in what seem like significant online relationships.

After an evening of avatar-to-avatar talk in a networked game, we feel, at one moment, in possession of a full social life and, in the next, curiously isolated, in tenuous complicity with strangers. We build a following on Facebook or MySpace and wonder to what degree our followers are friends. We recreate ourselves as online personae and give ourselves new bodies, homes, jobs and romances. Yet, suddenly, in the half-light of virtual community, we may feel utterly alone. As we distribute ourselves, we may abandon ourselves. Sometimes people experience no sense of having communicated after hours of connection.

I left Mr. Shirky’s apartment with a full belly, but even more filled up by what happened around the perimeter of the bread. No one tweeted, no one texted, everyone talked. I’ve noticed more and more that when I go to gatherings, people are walking around in their own customized world defined by what is on their smartphone, not by who is sitting next to them at dinner. The serendipity of the offline world has been increasingly replaced by the nice, orderly online world where people only follow whom they want to and opt in to conversations that seem interesting.

The funny thing is, the user is not always the one who is doing the deciding. As the author Eli Pariser has written in “The Filter Bubble,” Facebook, Google and Yahoo are deciding what we want to know, even though we are the ones doing the searching.

We end up in what seems like a self-selected informational ghetto, finding out about what is most “relevant” to us, but not finding out much of anything new. Google would never know that I wanted to bake because I didn’t know it either. If someone had Google Plus-ed Mr. Shirky’s recipe for bread or provided a link on Twitter, I would have never clicked on it.

But because I had tasted the actual bread out in the actual world, I wanted to try to make it myself. I got online — yes, I stipulate to the irony — and goaded Mr. Shirky back into sharing the recipe. It might as well have been a formula for cold fusion, what with its two separate pauses to let the dough rise and its daunting list of tips, but the memory of the smell, of the taste, compelled me to try to make the bread.

(The writer and educator Zeynep Tufekci would point out that I never would have gotten around to eating that bread with those people unless I had had a digital connection to them. She has observed that so-called weak ties often lead to strong ones.)

I circulated a picture of my lumpy but fundamentally sound loaves to the ad hoc group that was formed around that dinner and we had some laughs. Yes, we did that online, but it was reprising something that had actually happened when we were together.

In addition to asking Mr. Shirky for the bread recipe, I asked him about the ingredients of communication in a wired era. “When people talk to one another long enough, they want to meet,” he said, “and when they’ve been in one another’s presence, they want to keep in touch.” In other words, we will probably break bread together as a group again.

All of which is a way of saying something that is probably obvious to others who are less digitally obsessed: you can follow someone on Twitter, friend them on Facebook, quote or be quoted by them in a newspaper article, but until you taste their bread, you don’t really know them.

Fragmentation and Aggregation

Growing up, I always heard expressions I couldn't quite understand because of their paradoxical fatalism-slash-positive twist. Whenever someone made a prophetic statement such as "It's in times of crisis that the big fortunes are made," people would just smile or nod in agreement and I would just look puzzled… I had no clue what exactly was meant. There was also "less is more"; "out of the box"; "small is beautiful, limitation is your friend…"; and "tighter budgets, bigger ideas."

Alas, here is television at its all-time low, with the cynics announcing the end of broadcast television, myself included. The truth is that it is still vibrant and creative, and I, for my part, am being as creative as ever, looking at the glass half full and trying to figure out a new business model for myself and for the medium, trying to foresee the new era and its new financing models. Development execs are tightening budgets and rethinking the overrated prime-time slots. I love it. Leno's 10 p.m. slot will end the pilot money hemorrhage. The Screenings will offer titles as good as ever. How to monetize the viewing migration? By creating new business models. Content is still king, and nothing replaces a good idea. But television is becoming democratic, and social media is changing the face of broadcast. Embrace fragmentation, await aggregation. Those are my new fancy words… Aggregation of channels and bundling of themes will make my viewing experience unique. Can't wait for it.