Image credit: William Heath, “Dutch steamers on the frozen Zuyder Zee,” etching, 1829, Rijksmuseum
 

By Jason Zweig | June 16, 2015 9:19 p.m. ET

This speech, which I gave as the keynote address at the Morningstar Investment Conference in 2001, is hard to find online, so I’m making it available again here.

In it, I argued that the future outperformance of stocks is far from the sure thing that many financial advisors pretend that it is, that advisors are dangerously naive in the way they market their abilities, that their fees need to come down, and that the financial-planning industry is full of people busily identifying market “patterns” that aren’t even there. I also warned financial advisors not to extrapolate the past into the future and, above all, to acknowledge the inevitability of surprise.

The speech was tailored to the lessons of the recent bursting of the Internet bubble, but I think it still has some relevance to today’s markets as well.

A few excerpts (the full text follows below):

  • “…the real lesson of the late 90s mania and the 2000 massacre is that certainty is every investor’s worst enemy. The only universal truth that the past offers about the markets is that they will surprise us in the future. And the corollary to that law is that the markets will most brutally surprise those who are most certain what the future holds.”
  • “Being right is the enemy of staying right—partly because it makes you overconfident, even more importantly because it leads you to forget the way the world works.”
  • “If you have centered your practice on your ability to forecast markets…then you have positioned yourself on thin ice in a world of fat tails.”

“Fat Tails, Thin Ice”

Jason Zweig

Morningstar Investment Conference

Chicago, Ill.

June 27, 2001

“What can no longer be imagined must happen, for if one could imagine it, it would not happen.”

     –Karl Kraus

Everyone knows the four most dangerous words on Wall Street are “this time it’s different.” But those aren’t the only four dangerous words. “Studies have shown that” are four words that typically introduce the opinions of people who have never read those studies. “Our proprietary computer models” are four words that should send you scurrying into the nearest bomb shelter. And I’d add that four of the most dangerous words of all are simply: “I told you so.”

 

I know how tempting it is to say “I told you so” right now. You want to say it to the clients who deserted you because you refused to buy stocks with names like Dork.com when they were trading at 500 times their projected revenues for the year 2525. You want to say it to the clients who didn’t desert you but constantly whined about why you didn’t put more tech stocks into their portfolios. You want to say it to the so-called experts who ridiculed diversification, declared that value investing was dead, and made getting rich quick seem easier than getting out of bed.

 

Don’t say “I told you so.” Don’t say it. Don’t you dare say it.

 

Why not? Because the real lesson of the late 90s mania and the 2000 massacre is that certainty is every investor’s worst enemy. The only universal truth that the past offers about the markets is that they will surprise us in the future. And the corollary to that law is that the markets will most brutally surprise those who are most certain what the future holds.

 

As soon as you declare, “I told you so,” you’ve painted a gigantic bullseye on your own back. That’s because the risk of being wrong is at its highest precisely when you seem most certain to be right. Just when you think you’ve got everything totally figured out, the system goes haywire on you.

 

In case you’ve been taking an 18-month nap, let’s take a quick look back at the end of 1999 and the beginning of 2000:

 

  • The chairman of Pres. Clinton’s Council of Economic Advisers, Dr. Martin Baily, declared, “There’s no reason why [the nation’s economic expansion] cannot continue indefinitely.”
  • CEOs like Cisco’s John Chambers were insisting that they’d keep growing at 60% no matter what. No one dared to doubt it: Cisco was far and away the No. 1 holding of large-cap growth funds.
  • Jeffrey Applegate, chief U.S. investment strategist for Lehman Brothers, declared that “technology stocks are the growth stocks of our era, period.” Then he asked rhetorically: “Is the stock market riskier today than two years ago simply because prices are higher? The answer is no.”
  • Robert Froelich, chief investment strategist at the Kemper Funds, told The Wall Street Journal, “We see people discard all the right companies with all the right people with the right vision because their stock price is too high—that’s the worst mistake an investor can make.”

 

What’s happened since? A mere eight months after Dr. Baily said the economic expansion could continue indefinitely, the U.S. began teetering into recession. This spring, Cisco took a $1.2 billion restructuring charge and wrote off $2.2 billion in excess inventory—in one fell swoop wiping out almost a third of all the profits it had earned over the course of its entire life as a public company. That means that the people who paid up to 184.7 times Cisco’s earnings in late 1999 were not only wrong about its future growth; the growth that they thought they were getting all along was itself largely an illusion. No wonder Cisco has lost roughly $400 billion in market value. Anyone who put 60% of his portfolio into tech stocks, as Jeff Applegate urged, has lost roughly a third of his money. Finally, paying attention to valuation no longer seems like “the worst mistake an investor can make,” and that’s putting it mildly.

 

So why did such smart people make such public fools of themselves? Precisely because they had been so right for so long that they had forgotten they could be wrong. Likewise, what has happened in the past year has not proven you right. It has just made you look right—for now. Before you say “I told you so,” remember that that’s just what all these people were doing then, and that’s how history is likely to remember them. Being right is the enemy of staying right—partly because it makes you overconfident, even more importantly because it leads you to forget the way the world works.

 

Investing is based on the belief that the world is normal and that risk can be managed, if not eliminated, with careful planning. Rare events are so rare that you can see them coming, if they’re worth bothering with at all. So, since stocks have never lost money over any 30-year period, Jim Glassman and Kevin Hassett can insist that the Dow should reprice itself immediately to 36,000, since it’s high time for the market to admit that it has become risk-free.

 

But the belief that the markets are built on a bedrock of normality is nothing but sand. Less than three years ago the Nobel Laureates at Long-Term Capital Management nearly destroyed the global financial system by assuming that the global financial system was normal. The world’s brainiest hedge fund, whose partners had calculated that it could never lose more than $35 million in a day, lost $553 million—16% of its value—on August 21, 1998. Myron Scholes could only describe the events that overtook them as a practical impossibility.[i]

 

And back in 1994, Piper Jaffray Institutional Government Income was the safest and most lucrative bond fund in existence. This mortgage-derivative fund had outperformed 99% of all bond funds over the previous one, three, and five years. It had a beta of .98. It had a standard deviation only a hair higher than Vanguard Total Bond Market Index Fund. It had gone up smoothly, steadily, predictably for its entire life, as if it were a high-yield CD. Then came February, 1994, when the Fed began raising interest rates like mad. By year-end, this fund—with its five stars from Morningstar, its top Lipper ranking, even a glowing write-up from the know-it-all who was then Forbes’ mutual funds editor—had lost 28% of its value.

 

Even stranger things have happened lately. By late 1999, the spread between the Russell 2500 Growth and Value Indexes had widened to 5,336 basis points. According to the folks at Barr Rosenberg, that spread was 6.8 standard deviations from the mean, an event so rare that it should occur only once every 285 billion years. Betting that such an abnormality could not possibly persist, Julian Robertson steered his giant hedge fund right into oblivion. Great value managers like Robert Sanborn and David Schafer were hounded out of the business in disgrace.
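
To see just how improbable a 6.8-standard-deviation event is supposed to be under the bell curve, here is a rough back-of-the-envelope sketch. It is an illustration only, not the Barr Rosenberg calculation itself; the precise “once every 285 billion years” figure depends on assumptions about sampling frequency and tail convention that the speech does not spell out.

```python
# Sketch: how improbable a 6.8-sigma event is under a normal distribution.
# Illustration only; not the Barr Rosenberg methodology.
from scipy.stats import norm

sigma = 6.8
p_one_tail = norm.sf(sigma)                # P(Z > 6.8), roughly 5e-12

# If the spread were observed once a year, the implied waiting time is:
years_between_events = 1.0 / p_one_tail
print(f"P(Z > {sigma}) = {p_one_tail:.2e}")
print(f"Implied recurrence: about once every {years_between_events:,.0f} years")
# The answer is on the order of hundreds of billions of years --
# vastly longer than the age of the universe, which is the point.
```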

 

Back in 1996, the S&P 500 moved by at least 1% of its value only 9.5% of the time. By last year, that number had risen to 26.6%, and the market moved by at least 3% on nearly one in ten trading days.[ii] You can see this on the first five pages of your handouts. Five years ago, the daily moves of the S&P looked normally distributed. They don’t anymore. Could it be that the markets are trying to tell us something? Could it be that the normal distribution no longer holds across the board?
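
Tallying how often an index moves by more than a given threshold takes only a few lines. Here is a minimal sketch; the simulated return series is a stand-in, not the S&P data behind the handout figures.

```python
import numpy as np

def move_frequencies(daily_returns, thresholds=(0.01, 0.03)):
    """Share of trading days on which the index moved by at least
    each threshold, in absolute value."""
    r = np.asarray(daily_returns)
    return {t: float(np.mean(np.abs(r) >= t)) for t in thresholds}

# Placeholder data: a fat-tailed toy series standing in for a real daily
# return series (this is NOT the S&P data cited in the speech).
rng = np.random.default_rng(0)
toy_returns = rng.standard_t(df=3, size=2520) * 0.01
print(move_frequencies(toy_returns))
```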

 

It seems that the unimaginable has become normal. The rare has become abundant. Events that ought to be practical impossibilities, as Myron Scholes called them, have become an everyday part of financial life. Meanwhile, scientists in a variety of fields have been tentatively redrawing the bell curve in the past two years. From the turbulence of magnetized particles and the distribution of endangered species, to the frequency of earthquakes, avalanches, and forest fires, researchers are finding that rare events are much more common than a traditional model would have predicted.[iii] You can see what the proposed new curve looks like in your handouts—wide, with fat tails.

 

The scientists I’ve contacted don’t know whether their findings are relevant to investing. Nor do they know why the normal distribution no longer seems adequate to describe the natural world. But I think these reconsiderations of the bell curve have obvious implications. First of all, they suggest that the unpredictable is inevitable. If your next draw from the deck stands a high chance of coming not from the tall, skinny mean of the distribution but from those surprisingly fat tails, you face vastly higher odds of being surprised. And if the unprecedented is lurking out there, just waiting to show itself at the most improbable—if not unimaginable—moment, then where does that leave your optimizers? What good are your Monte Carlo simulations? A geophysicist at UCLA, Didier Sornette, has even found that seeking to limit small risks within a complex system increases the danger of encountering large risks.[iv]

 

I’m not surprised that scientists have begun to speculate that the bell curve needs an overhaul. Humans are pattern-seeking animals, and we have a terribly hard time accepting how random the world is—and how frequently the unexpected can occur. If you set up an experiment in which you flash a green light 80% of the time and a red one 20% of the time, but keep the exact sequencing of the flashes random, a rat or a pigeon will soon learn to pick green every time, since it comes up four times as often. But humans will insist on trying to guess whether the next flash will be green or red, even when you tell them the flashes are totally random. There is only one sure way to stop people from thinking random sequences are predictable: You can surgically slice their brains in half.[v]
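
The arithmetic behind the animals’ advantage is worth making concrete. The short simulation below sketches the classic probability-matching result (the logic, not any particular experiment’s protocol): always picking the more frequent color wins about 80% of the time, while guessing in proportion to the frequencies wins only about 68%.

```python
import random

random.seed(42)
P_GREEN = 0.8
TRIALS = 100_000

flashes = ["green" if random.random() < P_GREEN else "red" for _ in range(TRIALS)]

# Pigeon/rat strategy: always choose the more frequent color.
always_green_hits = sum(f == "green" for f in flashes)

# Human strategy ("probability matching"): guess green 80% of the time and
# red 20% of the time, even though the sequence is random.
matching_hits = sum(
    f == ("green" if random.random() < P_GREEN else "red") for f in flashes
)

print(f"Always-green accuracy:         {always_green_hits / TRIALS:.1%}")  # about 80%
print(f"Probability-matching accuracy: {matching_hits / TRIALS:.1%}")      # about 68% (0.8^2 + 0.2^2)
```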

 

Thus, it appears that seeking patterns in random data is hard-wired into the structure of the human brain. If you show people a long random number, they will see it as a series that contains patterns of alternation and repetition. The great philosopher of science, Karl Popper, even wrote a formula to create numbers that look random—an algorithm so intricate that after three iterations it generates a number 65,536 digits long. Paradoxically, humans will believe something is random only if you very carefully design it to look that way; if it’s actually random, they’ll think it’s predictable.[vi] Gordon Hester, formerly a psychologist at Carnegie Mellon, found that people tended to bet on tails if a coin had come up heads several times in a row—but if he let the coin “rest” for a while before flipping it again, people would bet on heads. The passage of time evidently made people feel that the odds of getting heads again had reverted to 50-50.[vii]

 

I’m afraid that financial advisors are all too similar to those people in the laboratory experiments watching coins being flipped. You see trends and reversals and predictability everywhere you look: This manager will beat the market, that stock is due for a fall, this style of investing is proven superior. Most of the time, I’m sad to say, you’re just imagining. You are “finding” nonexistent patterns in random data. In short, you’re kidding yourselves. And, unfortunately, it’s your clients who pay the price for your delusions.

 

All by itself, the way you extrapolate from history is enough to give me the heebie-jeebies. History supposedly “proves” that the risk of owning stocks declines, more or less to zero, as the holding period lengthens. But Jeremy Siegel’s famous data series is an object lesson in survivorship bias. In the early years—nearly a third of his total period—Prof. Siegel links together the returns on a maximum of 27 stocks, every one of them either a bank, an insurance company or a railroad, all traded either on the New York, Boston, or Philadelphia exchanges.

 

In 1810, among the biggest stock issuers in the country were turnpikes; in the 1820s, canals; in the 1830s, small-town banks; in the 1850s, coal mines; in the 1860s, petroleum. By 1865, the over-the-counter markets in mining stocks in New York alone were capitalized at $800 million—$10 billion in today’s money. Nearly every penny of it went down the drain, and not one penny of that loss is reflected in Prof. Siegel’s data. We don’t know, we simply can’t know, the compound return on U.S. stocks in the 19th century. I thought you’d like to see just a smattering of the stocks whose returns are absent from the so-called historical record; they’re in your handouts on the following pages.

 

To say that stocks have never lost money over a 30-year period means almost nothing. If good data don’t begin until the 20th century, then we have a grand total of three non-overlapping 30-year periods to base that judgment on. So far, stocks are batting 3-for-3—to which an objective observer can only say, Whoopie pickle. Any grand conclusion from that sample is a giant gamble on fragmentary evidence.

 

Now let me ask: How many of you believe that you can identify managers who will outperform their benchmarks? And how many of you, when selecting funds, have taken the sensible step of seeking out active managers, like Numeric Investors and the Bridgeway Funds and John Bogle Jr., who have the integrity to charge a performance incentive fee that rewards them for beating their benchmark and penalizes them for trailing it?

 

Now how many of you are prepared to take your own fees based on whether you succeed in picking benchmark-beating managers? If you believe you can identify superior funds, why don’t you charge accordingly? If you rightly admire managers for taking this courageous step, why won’t you do it yourself? I’m bewildered by this inconsistency, and the investing public is slowly getting wise to it.

 

One thing we can forecast clearly is that, if the past is any guide, the future won’t resemble it. For your reading pleasure, I’ve included in the handout some asset allocations, and forecasts of their returns, from a CFA conference held 15 years ago. As you can see, oil & gas partnerships were near the top of everyone’s list. And let’s not forget precious metals—you just had to have your 5% there. As you are today, asset allocators then were using history as their guide, and that’s where history led them to put their clients’ money…and that’s why most of them are history.

 

There’s a lesson here. Whenever I hear the word “optimizer,” I think of the philosopher Ludwig Wittgenstein, whose life’s work was to try to establish what it truly means to know something. He once imagined someone who would “buy several copies of the morning paper to assure himself that what it said was true.” No matter how many times you reshuffle the past in your Monte Carlo machinery, you’re still drawing from the past to predict the future. You are reading from multiple copies of the same newspaper and pretending that each time you read it you have gotten another, independent confirmation of what you already knew.

 

If you have centered your practice on your ability to forecast markets—from the performance of a specific fund to the return of stocks overall—then you have positioned yourself on thin ice in a world of fat tails. That’s a suicidally narrow business model. Instead, your practice should be designed to create a community of people with shared beliefs.

 

Let’s look at the next page in my handout, which probably describes the experience of all too many people in this room over the past year or two….

 

At first, it may seem obvious why clients desert financial advisors under these circumstances. You let them down, they dumped you. But it’s not obvious at all. In fact, it’s very puzzling. People can believe things that are far more ridiculous than the notion that your firm will beat the market for them.

 

We’ve all heard the term “cognitive dissonance.” It was coined by the psychologist Leon Festinger in the 1950s to describe the human tendency to disregard evidence that something we believe in is false. Festinger developed the concept after studying a strange cult in a town he called Lake City, led by a woman he named Marian Keech. Mrs. Keech believed that Sananda, a new incarnation of Christ, had visited her in a flying saucer that he had flown from the Planet Clarion. He told her that other lifeforms called Guardians would come from Venus and usher in the Last Judgment. Sananda also told Mrs. Keech that Lake City would soon be destroyed by a flood. But no Guardians ever landed; no flood ever came.

 

And now the story really gets interesting. Mrs. Keech’s followers, who called themselves the Seekers, did not abandon their faith in her prophecies. Instead of admitting that Mrs. Keech was a fraud, the Seekers declared that their fervent prayers had saved the world from the Last Judgment she had prophesied! That’s what “cognitive dissonance” means: People can easily keep believing something is true even after it has clearly been proven false. In fact, as you can see from the next page in the handout, the Seekers were acting much as other believers have done throughout history.

 

Now you see why I say it’s surprising that you’ve lost clients. If people can remain loyal to someone who talks about saviors in flying saucers, why is their faith in you so perishable? Is your investment skill really harder to believe in than Sananda the savior from Planet Clarion?

 

I think the answer is clear: You’ve failed to turn clients into converts. They don’t believe in you, and they won’t believe in you, because you’ve sold yourself mainly on your ability to capture investment outperformance. And they know full well that it’s as perishable as unrefrigerated fish.

 

Now let’s look at the next handout, which shows what enduring spiritual movements have in common….

 

In this overnetworked Internet world, this is your ultimate challenge: to make your clients believe in you, to make them feel that they are part of a community. Unfortunately, the psychological aspects of financial planning are probably the hardest. When you warned your clients not to day-trade Qualcomm in December 1999 because it was “too risky,” they probably didn’t listen—because you were talking at cross purposes. When you talked about risk, you might have meant the standard deviation of the expected return of day-trading—something few clients would understand. Or you might have meant the chances of losing every dollar invested—something that the evidence before your clients’ own eyes clearly disproved at the time. You might as well have been speaking Greek—or whatever they speak on Planet Clarion. To your client, “risk” meant the chance that you would stop him from getting rich quick. It meant the certainty that he would be humiliated when that damn chiropractor Larry showed up at his next barbecue and bragged, again, about his profits from day-trading Pets.com.

 

Communicating risk is notoriously tricky. In March, 1993, one of the worst storms of the century hit the Eastern Seaboard—and the Weather Channel called it perfectly. Never before had its computers so accurately predicted a major storm; never before had the channel so clearly and confidently warned of a storm’s severity. Even so, hundreds of people died from car accidents, heart attacks, collapsing trees, even hypothermia. The Weather Channel hired Baruch Fischhoff, a psychologist at Carnegie Mellon, to study where it went wrong. Fischhoff found that the users of forecasts interpret and apply them in drastically different ways—that they speak, most of the time, in mutually incomprehensible terms.[viii]

 

So you need to define your clients’ goals, and the means you propose to help them get there, not only in terms you both can understand, but in terms that mean the same thing to you and to your clients. In Fischhoff’s words, “Competent professionals…have an obligation to determine what their clients think they are talking about.” That means investing huge amounts of time in asking personal questions about what investing means to your clients; it also means telling them what money and risk and other fundamental concepts mean to you, both professionally and personally. All too many of you still think you can determine your clients’ risk tolerance primarily with a multiple-choice questionnaire. That’s just not enough.

 

Another vital step is to make sure that what you’re forecasting is forecastable in the first place. Start off by reviewing your own track record. (If I were you, I’d do this in secret!) How many of you have measured all your manager selections and asset-allocation changes, then compared them against appropriate benchmarks, for the life of your practice? How many of you have analyzed those results and broken them down into periods of success and disappointment so you can figure out when, and why, you went wrong?

 

If you’re not ashamed of these numbers, then share them with your clients, taking care to point out where you have added value and where you haven’t. Show your holding period for each investment and display its return against that of a fair benchmark for the same holding period. Tell them which market conditions seem to help and hurt your results the most; tell them how you monitor your own results and how you define success; and, most of all, talk about your worst mistakes and what you’ve learned from them. Ask them whether they would be comfortable if you answer “I don’t know” when they ask you to predict the markets. Some clients will never think you’re worth what they have to pay you if you can’t tell them what the market is about to do, and educating these people may be a lost cause. But confessing your own limits is a key part of building up a community of trust and shared beliefs.

 

Next, help your clients track their own forecasts. With all that whiz-bang software of yours, offer to set up a kind of virtual paper portfolio for them. The trick is to make sure someone can input each and every investment idea your clients get; naturally, the more ideas they have, the worse their returns will be. By showing them that, you will do them a real service.
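
The machinery such a log requires is minimal. Here is one hypothetical sketch; every name, price, and field in it is illustrative, and the only real requirement is that each idea is recorded when it is made and scored against a benchmark over the same holding period.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class PaperTrade:
    """One client idea: logged when it is made, scored when it is closed."""
    ticker: str
    idea_date: date
    entry_price: float
    benchmark_entry: float                    # benchmark level on the idea date
    exit_price: Optional[float] = None
    benchmark_exit: Optional[float] = None

    def excess_return(self) -> Optional[float]:
        """Idea return minus benchmark return over the same holding period."""
        if self.exit_price is None or self.benchmark_exit is None:
            return None                       # idea not yet closed out
        idea_ret = self.exit_price / self.entry_price - 1
        bench_ret = self.benchmark_exit / self.benchmark_entry - 1
        return idea_ret - bench_ret

# Hypothetical usage (made-up prices): log the idea when the client phones it
# in, fill in the exit later.
trade = PaperTrade("QCOM", date(1999, 12, 15), entry_price=100.0, benchmark_entry=1400.0)
trade.exit_price, trade.benchmark_exit = 60.0, 1250.0
print(f"Excess return vs. benchmark: {trade.excess_return():.1%}")
```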

 

Then make a habit of estimating your odds of being wrong, as weather forecasters do with each prediction. Speak not in certainties, but in probabilities. Maybe you should also estimate how long it might take before you turn out to be right. When, in May 1999, I wrote that “Internet stocks and the funds that buy them have no more chance of living up to their hype than Mike Tyson has of winning the Nobel Peace Prize,” I also stated that for months to come, perhaps even a year or two, that would seem “like one of the stupidest investing columns ever written.” I ended up being right—on both counts.
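
One simple way to keep score on probability statements is the Brier score, a yardstick borrowed from the weather forecasters themselves (the speech does not name it; this is an added illustration): record the probability you assigned to each call and whether it came true, then average the squared errors. Lower is better, and always saying “50-50” scores 0.25.

```python
def brier_score(forecasts):
    """Average squared gap between the probability stated and what happened.
    forecasts: list of (probability_assigned, event_happened) pairs.
    0.0 is perfect; always saying 50-50 scores 0.25."""
    return sum((p - float(happened)) ** 2 for p, happened in forecasts) / len(forecasts)

# Hypothetical track record (made-up numbers): the probability assigned to
# each call, and whether the call came true.
my_calls = [(0.9, True), (0.7, False), (0.8, True), (0.6, True), (0.95, False)]
print(f"Brier score: {brier_score(my_calls):.3f}")
```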

 

Why was I able to make that forecast so confidently? I spent my first five or six years in financial journalism laboring under the false notion that investing is driven by economics. Finally I learned that it’s driven, instead, by psychology and history. And the most important thing psychology tells us is that people learn almost nothing from history.

 

Back in 1999, the people who said this was a new era—and the people who said there’s no such thing as a new era—were equally wrong. History shows that every era is new. It also shows that the Internet is hardly unprecedented.

 

Just look back to the 1860s. On July 27, 1866, a telegraph cable was successfully laid across the floor of the Atlantic Ocean, linking the U.S. and Europe with instantaneous communications for the first time. And on May 10, 1869, the Golden Spike was hammered home at Promontory Summit, Utah, belting the United States together with a miraculous steel railway.

 

The contrast in life before and after these events is so great it’s almost impossible to grasp. On February 10, 1825, a young man named Samuel sent a letter from Washington, DC, to his ailing wife Lucretia: “I long to hear from you,” he wrote plaintively. The next day, Samuel received word that his wife had died the day before he mailed his letter. By the time he got home to New Haven, Lucretia had been buried for three days. The man’s full name was Samuel Finley Breese Morse; he eliminated the possibility that such a tragic irony would ever darken anyone else’s life by inventing the telegraph in 1844.

 

And on July 14, 1846, a young U.S. Army captain was posted from Charleston, S.C., to a new base at Yerba Buena in the Alta California territory. How long do you think it took him to arrive in what we now call San Francisco, as fast as the U.S. Army could muster? The trip took six and a half… months. The captain and his wife wrote letters to each other every day. In April, 1847, he finally got his first letter from her; she had written it in October, 1846. When this soldier, William Tecumseh Sherman, became famous for his March to the Sea in 1864, his two priorities were to destroy the railroads in the Southeast and cut the telegraph lines. He knew exactly what he was doing.

 

In these three breathtaking years from 1866 to 1869, travel time from the East Coast to California dropped from six months to roughly two weeks—and nearly everyone who crossed the continent survived. Suddenly food and medicine could traverse immense distances in time to save lives. Suddenly today’s New York Times described what happened in Europe yesterday, instead of what had happened two or three weeks earlier. Suddenly people could learn that it was a matter of life and death for them to get somewhere immediately; and they could actually get there.

 

Now how could anyone claim, as one venture capitalist did in early 1999, that the Internet is “the greatest invention in the history of the world”? It’s simply an incremental improvement in the high speed at which we already share information by phone and fax and FedEx. It’s a big deal, but the telegraph and the railroad were at least as big.

 

That’s the way history can be useful to you. It has nothing to do with spreadsheets or time series of returns. Those things aren’t history at all. They are merely old numbers. Instead, it’s the human lessons of history that can help vaccinate your common sense against the plagues of stupidity that ripple through the stock market every few years.

 

So where does all this leave you? I’d ask you to stop building your practice entirely around the doomed effort to predict what the markets will do, and to devote less of your energy to designing portfolios that might prosper if those predictions somehow happened to come true. Instead, build your practice around understanding your clients better and explaining to them exactly what is knowable and what is not. You shouldn’t just admit your limitations; you should embrace them. You are in the blessed position of being able to understand your clients better than they may be able to understand themselves—as long as you listen to them and speak the truth to them, including the truth about your own blind spots, in terms they can understand. By forgetting about forecasting, you can try to control the things that are controllable.

 

G.K. Chesterton put it best in his great satire, The Napoleon of Notting Hill, which he wrote in 1904:

“[O]ne of the games to which [the human race] is most attached is called, ‘Keep tomorrow dark,’ and…is also named…’Cheat the Prophet.’ The players listen very carefully and respectfully to all that the clever men have to say about what is to happen in the next generation. The players then wait until all the clever men are dead, and bury them nicely. They then go and do something else. That is all. For a race of simple tastes, however, it is great fun.”[ix]

 

Thank you.

 

[i] Roger Lowenstein, When Genius Failed: The Rise and Fall of Long-Term Capital Management (New York: Random House, 2000), p. 146.

[ii] “Standard & Poor’s U.S. Indices: 2000 Summary and Statistics,” pp. 5-6.

[iii] Steven T. Bramwell et al., “The Probability Density Function for Magnetic Fluctuations in the Classical XY Model: the Origin of an Exponential Tail in a Complex System.”

Steven T. Bramwell et al., “Universal Fluctuations in Correlated Systems.”

 

John Harte, Ann Kinzig and Jessica Green, “Self-Similarity in the Distribution and Abundance of Species,” Science, Vol. 284 (Apr. 9, 1999), pp. 334-336.

 

[iv] Didier Sornette, J.V. Andersen and P. Simonetti, “Portfolio Theory for ‘Fat Tails’: Minimizing Volatility Increases Large Risks,” International Journal of Theoretical and Applied Finance, Vol. 3, No. 3 (2000), pp. 523-535.

[v] George Wolford, Michael B. Miller, and Michael Gazzaniga, “The Left Hemisphere’s Role in Hypothesis Formation,” The Journal of Neuroscience, Vol. 20 (2000), RC64, pp. 1-4.  See also Jason Zweig, “The Trouble with Humans,” Money, November, 2000, pp. 67-70.

[vi] Karl R. Popper, The Logic of Scientific Discovery (Routledge, London and New York, 1992 ed.), pp. 292-293. See also Maya Bar-Hillel and Willem A. Wagenaar, “The Perception of Randomness,” Advances in Applied Mathematics, Vol. 12 (1991), pp. 428-454.

[vii] E. Gold and Gordon Hester, “The Gambler’s Fallacy and the Coin’s Memory,” working paper, Carnegie-Mellon University, 1989.

[viii] Baruch Fischhoff, “What Forecasts (Seem to) Mean,” International Journal of Forecasting, Vol. 10 (1994), pp. 387-403. See also Baruch Fischhoff, Ann Bostrom, and Marilyn Jacobs Quadrel, “Risk Perception and Communication,” in Oxford Textbook of Public Health, Oxford University Press, New York, 1997, Vol. II (The Methods of Public Health), pp. 987-1002.

[ix] G.K. Chesterton, The Napoleon of Notting Hill (Wordsworth Classics, Ware, Hertfordshire, UK, 1996), p. 3.