"When we are born we weep that we are come to this great stage of fools." - William Shakespeare
"To me the meanest flower that blows can give thoughts that do often lie too deep for tears." - William Wordsworth

Monday, January 30, 2012
Create a Meaningful Life Through Meaningful Work
I've sometimes picked on Harvard in this blog, and though I'm sure I haven't harmed their stature or endowment, I think I owe it to them and myself to commend them sometimes as well. This blog piece, which I first read on Zite, is a true gem -- anyone who reads it and has to give a commencement speech will probably steal it (I hope with proper attribution to Mr. Haque).

Sunday, January 29, 2012
It Seemed Like a Good Idea at the Time
In Stumbling on Happiness, psychologist Daniel Gilbert wryly observes that “psychologists…take a vow, promising that, at some point in their professional lives they will publish a book, a chapter, or at least an article that contains this sentence: ‘The human being is the only animal that…’ We are allowed to finish the sentence any way we like, but it has to start with those eight words.” Noting several refuted hypotheses (e.g., uses tools or language), he goes on to say that “it is for good reason that most psychologists put off completing The Sentence for as long as they can, hoping that if they wait long enough, they just might die in time to avoid being publicly humiliated by a monkey.”
Psychologists aren't the only social scientists who run the risk of premature generalization, as this post will prove.
Lately I’ve been reading several books that lie along the border of psychology and economics – Daniel Kahneman’s Thinking, Fast and Slow, Dan Ariely’s Predictably Irrational, Nassim Nicholas Taleb’s Fooled by Randomness – all of which provide a lot of evidence that our abilities to predict the future, and even to act rationally in the present, are a lot poorer than we imagine. Yet this never prevents stock market “experts” and management theorists from making a killing by confidently labeling businesses and their leaders successful or promising, flawed or doomed. The danger with all these judgments, of course, comes when you make them before the fat lady has sung – because contrary to F. Scott Fitzgerald, there are plenty of second acts in American lives, and often Act II turns the expectations of Act I on their heads.
Take Charismatic Leadership in Organizations, by Jay A. Conger and Rabindra N. Kanungo, published in 1998. Chapter 7, “The Shadow Side of Charisma,” begins, “Although we have emphasized throughout this volume the positive face of charismatic leadership, it has at times produced disastrous outcomes for both followers and organizations.” True enough. In fact, only three years later, Jim Collins’s Good to Great swung the pendulum of popular thinking decisively against charisma, showing that in the 11 “great” companies he and his investigators had found, one common factor was leaders who were “quiet, humble, modest, reserved, shy, gracious, mild-mannered, understated…and so forth.” Collins also noted that “in two thirds of the comparison cases, we noted the presence of a gargantuan personal ego that contributed to the demise or continued mediocrity of the company.”
Unfortunately, as the old newsreels used to say, “Time Marches On.” With time, Collins’s list of great companies has come under severe scrutiny; by 2006, the average Good to Great company no longer ranked in the Fortune Top 200.
Likewise, Conger and Kanungo went on to describe “positive” and “negative” charismatic leaders. One of their key definitions was that “The negative charismatic leaders point their efforts toward achieving the goal of self-aggrandizement, whereas the positive charismatic leaders develop self-discipline to endure the personal risk or cost of benefitting others.” Negative charismatic leaders create “goals that are largely self-serving,” and do not accurately estimate resources, support, or the larger (market) environment. Finally, only the most positive leaders “recognize their own ‘organizational mortality’” and prepare properly for a succession of leadership.
In the abstract, all these points are well taken. However, Conger and Kanungo were betrayed by the need to concretize the principles with examples. Here they run into the problem captured in the saying often credited to Einstein (and, oddly, attributed to Yogi Berra by over 10% of those who quote it): “In theory, theory and practice are the same. In practice, they are not.”
The Internet, of course, gives a reader an unfair leg up on published writers, because any one of us can bring the discussion up to date with a few clicks. After noting one case of a premature death notice, which I will discuss momentarily, I began to look up other examples to see how the leader and/or company fared after Conger and Kanungo’s judgment.
One such example was Lucent’s Henry Schacht, described as a charismatic but instrumental leader who was highly successful in planning for and mentoring a successor. During Schacht’s brief tenure as head of the newly formed technology company, he named his successor immediately and “assumed the principal role of teaching and coaching, helping [his designated successor, Rich] McGinn and his team to be more effective in building the senior team’s collective identity.” Starting with his accession in October 1997, McGinn then built his team with a further succession in mind.
But life, and markets, are what happens when you’re busy making other plans. Exactly three years later, McGinn was fired by the board, and Schacht returned in an effort to save the sinking company. Lucent’s stock, which had hit a high of $103 a share a few months after McGinn’s succession, was now trading at $27. Two Octobers later it was a penny stock, at 55 cents a share, and the company had admitted a $125 million accounting error and several other dubious accounting and sales practices. It was later acquired by the French firm Alcatel. (Sources: NY Times, 2/28/1998; CNET News, October 23, 2000; CFO.com, December 18, 2002; Kiplinger’s, May 2003.)
On the opposite side of the coin, Conger and Kanungo singled out one CEO above all as a model of the negative charismatic leader. They labeled this CEO both charismatic and narcissistic, a “particularly potent and dangerous” combination. They cite another author as saying the man “has a tendency to surround himself with people who, though talented, aren’t likely to question his vision.” Like other narcissistic leaders (John DeLorean and Lee Iacocca are mentioned), he often claimed credit for others’ ideas. This person’s “visions became increasingly a reflection of personal obsessions rather than what the marketplace was seeking.” After an initial success, his company’s market share collapsed, and he blundered into expensive failure after expensive failure because he “could not defy the laws of the marketplace or ignore the dictates of the business he was in, no matter how passionately he viewed himself as being above such dictates.”
This paragon of egomania and market-blindness? One “Steven” Jobs, as the authors called him, who had just returned to Apple as interim CEO and who had nearly 15 years of unparalleled marketing and creative success ahead of him. (A few weeks before his death, Apple had a market capitalization larger than Google and Microsoft combined, and the quarter that began with his death was Apple’s best ever. Close the books on Mr. Jobs’s career, with a pretty positive balance sheet.)
This is not to say that Messrs. Conger and Kanungo are unusual in their inability to judge the future by the past. They’re just like the rest of us, this writer included. But their story, like so many others, shows the common tendency to persuade ourselves that our theory can account for all contingencies (what Daniel Kahneman calls WYSIATI, the “what you see is all there is” fallacy) and to assume that past performance is a guarantee of future results.
Thursday, January 12, 2012
The Not-So-Smart World of Harvard Business School
Ever since my graduate school days, as I pored over Renaissance love poems and Anglican sermons in pursuit of a Ph.D. in English, I have had a voyeur’s interest in the Harvard Business School, that megalith across the Charles River from the Graduate School of Arts and Sciences, the Divinity School, and other more ivory-clad Cambridge towers that are among the fiefdoms of the World’s Greatest University. Any school that can charge over $57,000 in tuition and fees and $11,000 for a week-long seminar, can get top price for its Review and then recycle the articles into thin $25 paperbacks, and can count Michael Bloomberg, Mitt Romney, Meg Whitman, and Robert McNamara among its alumni is surely good at what it does.
Fortunately, HBS drops some crumbs for gleaners who follow in the wake of its giant harvesters. One of these is the HBR IdeaCast, a podcast that features 10-15 minute interviews with the authors of new Review articles. You can listen to the gist of a hot new article, or a reflection by a master of the business universe, for free while working out or driving. Some of the ones I’ve listened to have been very impressive: Martin Seligman, Edgar Schein, Warren Bennis, Sherry Turkle, Oliver Sacks, and even Francis Ford Coppola have appeared.
Unfortunately, Ecclesiastes’ observation that of the making of many books there is no end is all the more true in the publish-or-perish world, and sometimes what passes for business wisdom falls very short.
Case in point, and only the most egregious of many such lapses: one Richard Ogle, whose book Smart World: Breakthrough Creativity and the New Science of Ideas was actually published by the Business School in 2007. The interviewer began by explaining Ogle’s thesis: “creative breakthroughs and great innovations don’t simply emanate from the minds of individual geniuses” but “are born from intelligent networks” that “access something you call idea spaces.” I don’t know about you, but I only get the “don’t” part of that statement.
At any rate, as the interview progressed, I found myself surprised by the examples Ogle gave: Rupert Murdoch’s brilliantly intuitive acquisition of MySpace in 2005; David Wallerstein’s supersizing idea, first used for popcorn and soda at the movies, then brought over to McDonald’s; and Ruth Handler’s creation of the Barbie doll. Oddly, it seemed that all the ideas did emanate from the minds of individual “geniuses,” but let that go for the moment.
Murdoch’s brilliant decision, made twenty months after Facebook had launched, ultimately resulted in News Corp’s selling MySpace at more than a half-billion-dollar loss, on top of an additional half billion in losses before the sale. Anytime you sell something for 6% of what you paid for it, your genius score obviously takes a huge hit.
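(To check that 6% figure against the prices reported at the time – numbers I am supplying here, not Ogle or the IdeaCast – News Corp paid about $580 million for MySpace in 2005 and sold it in 2011 for roughly $35 million, and $35M ÷ $580M ≈ 0.06: six cents on the dollar.)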
Of course, no one is immune from financial blunders, or from praising eventual blunderers, but considering the whole Murdoch enterprise and its current fairly predictable humiliations, was this the best choice of examples? (Murdoch is now being wonderfully evoked in the person of Sir Richard Carlisle, an early 20th-century British tabloid mogul based on the real Alfred Harmsworth, in Downton Abbey. Bet he won’t be half as villainous as Murdoch himself.)
The supersizing example is even more remarkable, not only for its choice but for the arguments behind it. After all, Ogle was interviewed in 2007, three years after Morgan Spurlock’s “Super Size Me” had exposed the effects of the McDonald’s regimen and had caused the company to phase out supersizing in favor of an “Eat Smart, Be Active” campaign.
Ogle’s explanation of the original brilliance of supersizing is even less persuasive than his evaluation of it. According to Ogle, movie theaters wanted people to go back for second helpings, but Christian ethical prohibitions against greed inhibited customers from doing so. Wallerstein realized that offering one huge portion avoided the stigma of greed and thus increased sales.
Assuming that Ogle’s argument even passes what Alan Dershowitz called “the giggle test,” it’s hard to know where to begin challenging it. Do people who go back for seconds, at buffets for example, look more greedy than people who heap their plates to overflowing the first time around? How profoundly do Christian views of greed affect the population of the most Christian and most obese country in the world? And didn’t Ogle ever go to the movies? Most of us don’t go back for seconds because we don’t want to miss anything, just as we all hang on and rush to the rest rooms when the credits come up.
Regarding Christian ethics and gluttony (which is really the term Ogle should have used), last year Northwestern's Feinberg School of Medicine published a study reporting that religious people were 50 percent more likely to become obese. Meanwhile, the Gallup Poll’s 10 most religious states are all among the 20 most obese states (including #1, #2, #4, and #5), while the 10 least religious states include 8 of the 20 least obese.
Finally, isn’t there something distressing about Ogle’s apparent definition of brilliance? To choose, out of all the possibilities available, businesses that contribute to so many cultural ills – from the sexualization of childhood to numerous physical ailments to the incivility of public discourse – suggests that the only criterion for genius that Ogle and his publishers recognize is pecuniary. Why not, for example, single out the creation of microloans for developing countries, the use of multi-drug therapies for HIV, the invention of adaptive technologies for people with disabilities, online fundraising for charities, and innumerable other recent innovations that have actually done good while usually both making money and making people more productive and happy?
Of course I’m only judging the book by what Ogle chose to cover in his interview. But if the coming attractions seem awful to you, how bad is the movie going to be?