Tom caused a ruckus with his interview by Kara Swisher on the Recode Decode podcast. Tune in and judge for yourself:
The podcast focuses on The Excellence Dividend, but Tom also tells the story of his beginnings and of In Search of Excellence.
Maybe not to you, but to me these DAILY stats came as a shock:
154.6 billion emails
400 million tweets
16 billion words on Facebook
52 TRILLION words on email and social media*
(*equivalent to 520 million books)
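A quick back-of-envelope check on that footnote (a sketch using only the figures quoted above): 52 trillion words divided by 520 million books works out to 100,000 words per book, roughly the length of a typical novel.

```python
# Back-of-envelope check on the Wired stats quoted above.
daily_words = 52_000_000_000_000   # 52 trillion words per day on email & social media
book_equivalents = 520_000_000     # "equivalent to 520 million books" per day

words_per_book = daily_words // book_equivalents
print(words_per_book)  # 100000 -- about the length of a typical novel
```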
Said stats appeared in "Thinking Out Loud: How Successful Networks Nurture Good Ideas," Clive Thompson's article in the October issue of Wired. I was captivated from start to finish. I admit a positive bias toward the value of social media, gaming, etc. On my lengthy list of recent reads you'll find at the top: Steven Johnson's Everything Bad Is Good For You: How Today's Popular Culture Is Actually Making Us Smarter and Jane McGonigal's Reality Is Broken: Why Games Make Us Better and How They Can Change the World. Thompson suggests that our social publication mania also yields extraordinary benefits. Here are a few quotes (which, of course, I also turned into a micro-PowerPoint presentation):
"Before the Internet, most people rarely wrote for pleasure or intellectual satisfaction after graduating from high school or college. ... The fact that so many of us are writing—sharing our ideas, good and bad, for the world to see—has changed the way we think. Just as we now live in public, so do we think in public. And that is accelerating the creation of new ideas and the advancement of global knowledge."
"Having an audience can clarify thinking. It's easy to win an argument inside your head. But when you face a real audience, you have to be truly convincing. ... Studies have found that the effort of communicating to someone else forces you to pay more attention and learn more."
"Brenda Clark Gray, an instructor at Douglas College in British Columbia, had her English students create Wikipedia entries on Canadian writers to see if it would get them to take the assignment more seriously. She was stunned at how well it worked. 'Often they're handing in these essays without any citations, but with Wikipedia they suddenly were staying up till 2 a.m. honing and writing the entries and carefully sourcing everything,' she tells me. The reason, the students explained to her, was that their audience—the Wikipedia community—was quite gimlet-eyed and critical. They were harder 'graders' than Gray herself."
"Once thinking is public, connections take over. Anyone who's Googled a favorite hobby, food, or political subject has discovered some teeming site devoted to servicing the infinitesimal fraction of the public that shares their otherwise obscure obsession. (Mine: guitar pedals, modular origami, and the 1970s anime show Battle of the Planets.) Propelled by the hyperlink, the Internet is a connection-making machine. And making connections is a big deal in the history of thought. ..."
Tom was tweeting about Big Data & Gamification & Algorithmic determinism this morning. Though the thread here is not 100% transparent, we thought you might be amused.
As I dig deeper into big data/algorithmic determinism/gamification I am appropriately impressed but feel as if the world is being sterilized.
Read big data/gamification/algorithmic gurus and wonder where the human beings/humanity have gone. Exabyte/zettabyte/yottabyte heaven awaits.
Loyalty 3.0 and The Gamification Revolution are my two latest Amazon acquisitions. I have no idea what I think.
"Why" [questions of causation] may become obsolete & insurance be denied because you had a Zoroastrian college roommate, but human chaos will continue to reign.
But will your sociology department be a hotbed of revolutionary thought with faculty who were selected by a big data-derived recruitment algorithm?
But artist will be uninsured because data/cameras show he once inadvertently sat across a bus aisle from convicted pedophile.
I'm a trained behavioral scientist who loves nothing more than wallowing in data, but some big-data-/algorithmized-world implications unsettle.
On the other hand: As a 40-yr student of DKahneman (e.g., Thinking, Fast & Slow), I'm frightfully aware of how routinely our instincts suck.
Kurzweil's The Singularity Is Near sits on my bedside table, and I wish I'd wake up one morning and discover it missing.
As someone who's lived for 50 years with a religious belief in the primacy of data, it's an "Oh-shit-my-dream's-come-true" nightmare moment.
Religiously believing in the Supreme Power of Data was fine ... as long as your data sucked.
Data uber alles. There is no God but Correlation. Is the Googleplex Heaven? Or Hell?
My problem is the more I study, the less idea I have of what I think. And I know for sure that's either a good thing or a bad thing.
What is the mathematical relationship between a yottabyte and yadda yadda yadda?
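For the record, the byte ladder in those tweets climbs by factors of 1,000: an exabyte is 10^18 bytes, a zettabyte 10^21, and a yottabyte 10^24. A one-line sanity check (nothing more than the SI prefixes written out):

```python
# SI byte units: each step up the ladder is a factor of 1,000.
units = {"exabyte": 10**18, "zettabyte": 10**21, "yottabyte": 10**24}

# A yottabyte is 1,000 zettabytes, which is a million exabytes.
assert units["yottabyte"] == 1000 * units["zettabyte"] == 10**6 * units["exabyte"]
```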
A couple of years ago, Wired editor-in-chief Chris Anderson authored a cover story titled "The Petabyte Age." The use of "big data" (more or less everything, not a sample) and the attendant primacy of correlation over causation as the basis for discovery was described thus: "The data deluge makes the scientific method obsolete." He also called the phenomenon "the end of theory."
I was outraged—correct word choice. But that was then and this is now. I still haven't swallowed the whole pitcher of Kool-Aid, but I have moved to the point of open-mindedness. Recently, I have read and re-read two books. One on Big Data. One on the looming takeover of pretty much everything by algorithms—yes, I do exaggerate.
Mostly, assuming you're not a full-fledged expert, I urge you to give yourself some space—beach reading?—and take a deep dive into both.
To perhaps titillate, but not summarize, I am providing a handful of quotes from each of the two.
Big Data: A Revolution That Will Transform How We Live, Work, and Think, by Viktor Mayer-Schonberger and Kenneth Cukier
"As humans, we have been conditioned to look for causes, even though searching for causality is often difficult and may lead us down the wrong paths. In a big data world, by contrast, we won't have to be fixated on causality; instead, we can discover patterns and correlations in the data that offer us novel and invaluable insights. The correlations may not tell us precisely why something is happening, but they alert us that it is happening. And in many situations, this is good enough. If millions of electronic medical records reveal that cancer sufferers who take a certain combination of aspirin and orange juice see their disease go into remission, then the exact cause for the remission in health may be less important than the fact that they lived."
"Correlations let us analyze a phenomenon not by shedding light on its inner workings, but by identifying a useful proxy for it."
"Predictions based on correlations lie at the heart of big data."
"There is a philosophical debate going back centuries over whether causality even exists."
"Unfortunately, Kahneman argues [Nobel laureate Daniel Kahneman's masterpiece Thinking, Fast and Slow], very often our brain is too lazy to think slowly and methodically. Instead, we let the fast way of thinking take over. As a consequence, we often 'see' imaginary causalities, and thus fundamentally misunderstand the world."
Walmart: "[Using big data], the company noticed that prior to a hurricane, not only did sales of flashlights increase, but so did sales of Pop-Tarts. ... Walmart stocked boxes of Pop-Tarts at the front of the store [and dramatically boosted sales]."
"Aviva, a large insurance firm, has studied the idea of using credit reports and consumer-marketing data as proxies for the analysis of blood and urine samples for certain applicants. The intent is to identify those who may be at higher risk of illnesses like high blood pressure, diabetes, or depression. The method uses lifestyle data that includes hundreds of variables such as hobbies, the websites people visit, and the amount of television they watch, as well as estimates of their income. Aviva's predictive model, developed by Deloitte Consulting, was considered successful at identifying health risks."
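The thread running through these excerpts, correlation as a useful proxy with no causal story attached, can be sketched in a few lines. The numbers below are synthetic and purely illustrative (not from the book); the point is that the computation measures only that two series move together, never *why*.

```python
# Illustrative sketch (synthetic numbers): correlation as a proxy.
# We quantify that two series move together without modeling causation.
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# e.g., hypothetical weekly flashlight vs. Pop-Tart sales around a storm
flashlights = [10, 12, 15, 40, 80, 20]
pop_tarts = [30, 33, 38, 90, 160, 45]

r = pearson(flashlights, pop_tarts)
print(round(r, 2))  # close to 1.0: strong correlation, no causal story needed
```

That is the whole Walmart Pop-Tarts move in miniature: act on the pattern, skip the "why."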
Bonus: On the topic of causation and incomplete models, I offer this wonderful commentary by pollster Daniel Yankelovich, which appeared in Jack Bogle's stellar book Enough! To wit:
"The first step is to measure what can easily be measured. This is okay as far as it goes. The second step is to disregard that which cannot be measured, or give it an arbitrary quantitative value. This is artificial and misleading. The third step is to presume that what cannot be measured is not very important. This is blindness. The fourth step is to say that what cannot be measured does not really exist. This is suicide."
Automate This: How Algorithms Came to Rule Our World, by Christopher Steiner
"Algorithms have already written symphonies as moving as those composed by Beethoven, picked through legalese with the deftness of a senior law partner, diagnosed patients with more accuracy than a doctor, written news articles with the smooth hand of a seasoned reporter, and driven vehicles on urban highways with far better control than a human driver."
"... The audience then voted on the identity of each composition.* [Music theory professor and contest organizer] Larson's pride took a ding when his piece was fingered as that belonging to the computer. When the crowd decided that [algorithm] Emmy's piece was the true product of the late musician, Larson winced." (*There were three possible composers: Bach/Larson/Emmy-the-algorithm.)
"When Emmy [algorithm] produced orchestral pieces so impressive that some music scholars failed to identify them as the work of a machine, [Prof. David] Cope instantly created legions of enemies. ... At an academic conference in Germany, one of his peers walked up to him and whacked him on the nose. ..."
"... Which haiku are human writing and which are from a group of bits? Sampling centuries of haiku, devising rules, spotting patterns, and inventing ways to inject originality, Annie [algorithm] took to the short Japanese sets of prose the same way all of [Prof David] Cope's algorithms tackled classical music. 'In the end, it's just layers and layers of binary math,' he says. ... Cope says Annie's penchant for tasteful originality could push her past most human composers who simply build on work of the past, which, in turn, was built on older works. ..."
"When you ask [Cloudera founder Jeff] Hammerbacher what he sees as the most promising field that could be hacked by people like himself, he responds with two words: 'Medical diagnostics.' And clearly doctors should be watching their backs, but they should be extra vigilant knowing that the smartest guys of our generation—people like Hammerbacher—are gunning for them. The targets on their backs will only grow larger as their complication rates, their test results, and their practices are scrutinized by the unyielding eye of algorithms built by smart engineers. Doctors aren't going away, but those who want to ensure their employment in the future should find ways to be exceptional. Bots can handle the grunt work, the work that falls to our average practitioners."
We've included more from Steiner's book, and some related other stuff in an attached PowerPoint mini-presentation.
Happy International Women's Day! This may not be a traditional gift-giving holiday, but I thought I'd give your curiosity a gift today. Patricia Martin (disclosure: she's a client of mine, not Tom's) has pulled together a list of Top Women (you've probably never heard of) Shaping Digital Culture. These are some truly fascinating women doing remarkable things. From a creator of an open-source EPUB reader to researchers reporting on sociological images to the Chief of Technology at the Brooklyn Museum, these women are leading projects that are sure to spark your creativity if not your interest. Check them out.
The buzz around Google Authorship is gaining serious momentum. Why should you pay attention?
Google Authorship gives credit where credit is due. For all those writers whose writing can be found in various places on the Web, it's the long-overdue feature that links your identity to your writing in a standard way, regardless of the publication. The link is essentially your Google+ profile, and establishing yourself this way will add to your credibility:
That's right, he said "higher rankings." This affects businesses directly since Google+ features the ability to offer social feedback by giving +1 ratings. Google is strengthening its search results by adding a social component, not just relying on the bots to make the ranking decisions. Read Monica Romeri's The Lowdown on Google+ for Business to get a better feel for how this impacts businesses.
Google is the uncontested monarch of search. Its name is now listed in the dictionary as the equivalent of search. Adding Google Authorship to Google+ increases Google+'s relevancy for businesses, enhancing its attraction as a social network. Furthermore, Dave Lloren's Fast Company article today goes into detail about several other tools Google is bringing to bear while quickly becoming a powerful hub of business solutions we'll soon not be able to live without.
Giving a speech is [for me] a primal act.
It is the ultimate in being purely "alive."
At its end I die.
The exhaustion leads to odd thought patterns.
I've been thinking about Kurzweil's singularity a lot.
And juxtaposing it with my work.
I have no idea what the singularity is.
But I'm simultaneously clear about what it is—in the primal part of my brain.
I think [GOD HELP ME] in tweets these days.
Take notes in tweet form.
Herewith a set that emerged from my keyboard during DL494 Santo Domingo-JFK:
The more things change the more they stay the same. Not true circa 2011. The more things change the more things change.
Pretty sure I agree with Kurzweil on the meaning of the "singularity" except for timing—perhaps it's already occurred????
Intensity of giving a speech—each speech always leads to copious tears when I return to hotel room and adrenaline evaporates. It's a form of dying.
I start "going weird" about 72 hours before a speech. I stay weird for about 48 hours after a speech.
Kurzweil says "singularity." I say The Great Flip. We labor and wear our fingertips to the bone to feed and clothe and educate our computers-networks.
We now [ALREADY] work for our computers-networks more than they work for us??!!
We barely have time to eat because it takes so much time to feed our computers-networks.
If computers-networks could laugh spontaneously [WILL THEY SOMEDAY SOON?] they'd laugh hysterically at "user communities."
Business goes a lot faster these days not because of our needs or even our wants—but because the computers-networks require us to go ever faster.
Who's in charge: JIT-driven computer networks determine [COMMAND] the moment-to-moment behavior of millions/tens-of-millions of workers worldwide.
1985: Speech prep 2 or 3 hours—shuffling my (almost) static 500 glass-mount slide set. 2011: I work for PowerPoint—50 to 100 hours prep-per-speech.
Soon we may not be needed.
I talk ceaselessly about the "eternal basics." Perhaps I'm wrong.
Are we already living in the matrix?
If Thos. Pink quit making shirts I'd probably quit giving speeches.
Twixt '97-'05 back pain treatment up 65% to $90B. No improvement in back health per population self-reports. (Source: Forbes 03.27.08)
Is fusion surgery a pure and simple racket? (Rehab-exercise program just as effective per numerous studies.) (Source: Forbes 03.27.08)
On the evening of May 26, I made my first "presentation" (an informal talk) on social media. The affair, called "Sweets & Tweets," was held in Georgetown and hosted by corporate social media consultant Debbie Weil. I participate in social media somewhat myself, but in no way, shape, or form am I an expert. Moreover, I did not spend an enormous amount of time preparing—the talk was intended to be "off the cuff." But with my obsessive penchant for lists (ah, engineers), I did jot a few things down which I shall simply call "musings from an incredibly old guy and unadulterated amateur" on social media:
[Our guest blogger is John O'Leary. It seems Erik called him upon receiving spam from John's email address. The conversation led to this idea. The post is a re-blog from John's website.]
One nice thing about being repeatedly hacked in your email and social networking accounts is hearing back from old friends and business colleagues you haven't been in touch with for years! I'm sure you can relate. In my case I can't say that everyone on my spammed contact list has been entirely pleased to hear from me—or who they thought was me—but amazingly many of them have taken the bait. It appears that hundreds of folks are now wondering how I've been able to start so many multi-million-dollar home businesses this year AND successfully sell cheap meds on the side (while maintaining a consulting practice). Well, I've decided to exploit this opportunity and share my trade secrets in a new book I'm working on: How *YOU* Can Make Millions From Getting Hacked & Spammed in Your Spare Time. (The first step is: Don't give up that AOL account.) Subtitle: Business Lessons From Viagra.