brainroads-toward-tomorrows mental patterns

pyramid to dna

The Frontiers of Management

by Peter Drucker — his other books

 

line

 

#Note the number of books about Drucker ↓

[image: collage of books about Drucker]

Inside Drucker's Brain ::: The World According to Drucker

My life as a knowledge worker

Drucker: a political or social ecologist ↑ ↓

 

“I am not a ‘theoretician’;

through my consulting practice

I am in daily touch with

the concrete opportunities and problems

of a fairly large number of institutions,

foremost among them businesses

but also hospitals, government agencies

and public-service institutions

such as museums and universities.

 

And I am working with such institutions

on several continents:

North America, including Canada and Mexico;

Latin America; Europe;

Japan and South East Asia.

 

Still, a consultant is at one remove

from the day-to-day practice —

that is both his strength

and his weakness.

And so my viewpoint

tends more to be that of an outsider.”

broad worldview ↑ ↓

 

Most mistakes in thinking ↑ seeing only part of the picture

[image: Peter Drucker timescape]

#pdw larger ↑ ::: Books by Peter Drucker ::: Rick Warren + Drucker

Peter Drucker's work

Books by Bob Buford and Walter Wriston

Global Peter Drucker Forum ::: Charles Handy — Starting small fires

Post-capitalist executive ↑ T. George Harris

[image: evidence wall and time line]

harvest and implement

Learning to Learn (ecological awareness ::: operacy)

The MEMO “they” don’t want you to SEE

 

line

 

See rlaexp.com initial bread-crumb trail — toward the end of this page — for a site “overview”

 

Frontiers of Management

Amazon link: The Frontiers of Management: Where Tomorrow's Decisions Are Being Shaped Today (Drucker Library)

See about management

 

line

 

 

line

 

Preface: The Future Is Being Shaped Today

To predict the future is futile.

But to look searchingly at new and unexpected developments in the present and to ask what they might foretell is the way to prepare for the future.

And this is what The Frontiers of Management—and every chapter in this volume—tried to do.

 

Many, many years ago, when I was a green beginner, a wise old editor said to me: “You’ll never make a first-rate journalist; you always think of next month instead of next morning.”

He was right; I do not look at a “story” as tomorrow’s headline.

I rather look at it as the harbinger of headlines a year or two hence.

 

There is risk in this—so many of the sensations of today are only tomorrow’s tired fads.

But one can learn to distinguish between the two.

There are a few—a very few—events of today which first indicate important and long-run changes, first sound a new note, first signal new issues.

It is on these that a successful business policy and a successful business strategy have to be based.

 

I have divided my writings for many years into two categories.

There are the big books—most of them quite a few years in the making.

They present one major subject in depth and are written to be the definitive text on a major area, if not to found a new discipline.

My 1954 book, The Practice of Management, for instance, is still being used all over the world as both the basic introduction to the subject for the management student and beginner, and as a reference work for the experienced manager and executive.

 

And then there are essays and articles—such as those assembled in this volume—which analyze today’s events in order to reach out, to anticipate, to divine tomorrow’s new opportunities and tomorrow’s new challenges.

They are, so to speak, “reconnaissances in force.”

 

Frontiers of Management presents five years of such “reconnaissance” essays and articles, written between 1982 and 1986.

The book was actually planned all along—from the day in 1982 when the first of the essays was written—to bring together in one volume the best and most durable of these analyses of then current events likely to become tomorrow’s big issues.

It has lived up to this ambition.

For this book—now more than ten years old—continues still to be in so much demand that the publisher has decided to bring it out in this new paperback edition.

It is an unrevised edition—purposefully so; not one word of the original has been changed.

Readers can therefore decide for themselves where the author got it wrong.

The only change is the addition of this new introduction to the book.

It comments on the essays in each part from the vantage point of 1997.

 

I trust and hope that readers will find that these essays—precisely because they were written when a topic first emerged—tell them as much about crucial issues of 1997 or 2000 as some of the voluminous treatises now being written about these topics.

Above all I hope that this republication of Frontiers of Management will induce readers to ask the right questions.

 

line

 

Interview: A Talk With A Wide-Ranging Mind



Q: The last book of yours was the one in which you wrote about the deliberateness of the innovation process. Has there been any deliberateness in your own life? Was there a plan for Peter Drucker?

A: In retrospect, my life makes sense, but not in prospect, no.

I was probably thirty before I had the foggiest notion where I belonged.

For ten or twelve years before that I had experimented, not by design but by accident.

I knew, even as a little boy, that I didn’t want to stay in Austria, and I knew that I didn’t want to waste four years going to a university.

So I had my father get me a job as far away as one could go and as far away from anything that I was eventually headed for.

I was an apprentice clerk in an export house.

Then I worked in a small bank in Frankfurt.

It’s a job I got because I was bilingual in English and German.

That was October 1929.

The stock market crashed, and I was the last in and the first out.

I needed a job and got one at the local newspaper.

It was a good education, I must say.

 

In retrospect, the one thing I was good at was looking at phenomena and asking what they meant.

I knew in 1933 how Hitler would end, and I then began my first book, The End of Economic Man, which could not be published until 1939, because no publisher was willing to accept such horrible insights.

It was very clear to me that Hitler would end up killing the Jews.

And it was also very clear that he would end up with a treaty with Stalin.

 

I had been quite active in German conservative politics even though I had a foreign passport, and so I knew that Hitler was not for me.

I left and went first to England and then, four years later, to this country.

I worked in London for an insurance company as a securities analyst and as an investment banker.

If I had wanted to be a rich man I would have stayed there, but it bored me to tears.




Q: Would you define entrepreneur?

A: The definition is very old.

It is somebody who endows resources with new wealth-producing capacity.

That’s all.

(But not in the way you’d expect. Get a Kindle version of Management, Revised Edition and do a keyword search on entrepreneur, then entrepreneurship. — bobembry)




Q: You make the point that small business and entrepreneurial business are not necessarily the same thing.

A: The great majority of small businesses are incapable of innovation, partly because they don’t have the resources, but a lot more because they don’t have the time and they don’t have the ambition.

I’m not even talking of the corner cigar store.

Look at the typical small business.

It’s grotesquely understaffed.

It doesn’t have the resources and the cash flow.

Maybe the boss doesn’t sweep the store anymore, but he’s not that far away.

He’s basically fighting the daily battle.

He doesn’t have, by and large, the discipline.

He doesn’t have the background.

The most successful of the young entrepreneurs today are people who have spent five to eight years in a big organization.




Q: What does that do for them?

A: They learn.

They get tools.

They learn how to do a cash-flow analysis and how one trains people and how one delegates and how one builds a team.

The ones without that background are the entrepreneurs who, no matter how great their success, are being pushed out.

For example, if you ask me what’s wrong with [Apple Computer Inc. cofounders] Wozniak and Jobs …




Q: That’s exactly what I was going to ask …

A: They don’t have the discipline.

They don’t have the tools, the knowledge.




Q: But that’s the company that we’ve looked to for the past five or six years as being prototypical of entrepreneurial success.

A: I am on record as saying that those two young men would not survive.

The Lord was singularly unkind to them.




Q: Really?

A: By giving them too much success too soon.

If the Lord wants to destroy, He does what He did to those two.

They never got their noses rubbed in the dirt.

They never had to dig.

It came too easy.

Success made them arrogant.

They don’t know the simple elements.

They’re like an architect who doesn’t know how one drives a nail or what a stud is.

A great strength is to have five to ten years of, call it management, under your belt before you start.

If you don’t have it, then you make these elementary mistakes.




Q: People who haven’t had this big-company experience you prescribe: would you tell them that they shouldn’t attempt their own enterprise?

A: No, I would say read my entrepreneurial book, because that’s what it’s written for.

We have reached the point [in entrepreneurial management] where we know what the practice is, and it’s not waiting around for the muse to kiss you.

The muse is very, very choosy, not only in whom she kisses but in where she kisses them.

And so one can’t wait.

In high tech, we have the old casualty rate among young companies, eight out of ten, or seven out of ten. But outside of high tech, the rate is so much lower.




Q: Because?

A: Because they have the competence to manage their enterprises and to manage themselves.

That’s the most difficult thing for the person who starts his own business, to redefine his own role in the business.




Q: You make it sound so easy in the book.

A: It is simple, but not easy.

What you have to do and how you do it are incredibly simple.

Are you willing to do it?

That is another matter.

You have to ask the question.

 

There is a young man I know who starts businesses.

He is on his fifth.

He develops them to the point that they are past the baby diseases and then sells out.

He’s a nanny.

You know, when I grew up there were still nannies around, and most of them used to give notice on the day their child spoke its first word.

Then it was no longer a baby.

That’s what this particular fellow is, a baby nurse.

When his companies reach twenty-nine employees he says, “Out!” I ask why and he says, “Once I get to thirty people, including myself, then I have to manage them, and I’m simply not going to do anything that stupid.”




Q: That example would tend to confirm the conventional wisdom, which holds that there are entrepreneurs and there are managers, but that the two are not the same.

A: Yes and no.

You see, there is entrepreneurial work and there is managerial work, and the two are not the same.

But you can’t be a successful entrepreneur unless you manage, and if you try to manage without some entrepreneurship, you are in danger of becoming a bureaucrat.

Yes, the work is different, but that’s not so unusual.

 

Look at entrepreneurial businesses today.

A lot of them are built around somebody in his fifties who came straight out of college, engineering school, and went to work for GE.

Thirty years later, he is in charge of market research for the small condenser department and is very nicely paid.

His mortgage is paid up and his pension is vested, the kids are grown, and he enjoys the work and likes GE, but he knows he’s never going to be general manager of the department, let alone of the division.

And that’s when he takes early retirement, and three weeks later, he’s working for one of the companies around Route 128 in Boston.

This morning I talked to one of those men.

He had been in market planning and market research for a du Pont division—specialty chemicals—and he said, “You know, I was in my early fifties, and I enjoyed it, but they wanted to transfer me.”

Do I have to finish the story?

So now he’s a vice-president for marketing on Route 128 in a company where he is badly needed, an eight-year-old company of engineers that has grown very fast and has outgrown its marketing.

But he knows how one does it.

At du Pont, was he an entrepreneur or was he a manager?

He knows more about how one finds new markets than the boys at his new company do.

He’s been doing it for thirty years.

It’s routine.

You come out with something, and it works fine in the market, but then you see what other markets there are that you never heard of.

There are lots of markets that have nothing to do with the treatment of effluents, or whatever that company does, but they didn’t know how to find them until this fellow came along.

There is entrepreneurial work and there is managerial work, and most people can do both.

But not everybody is attracted to them equally.

The young man I told you about who starts companies, he asked himself the question, and his answer was, “I don’t want to run a business.”




Q: Isn’t there some irony in the fact that you who study organizations aren’t part of one?

A: I couldn’t work in a large organization. They bore me to tears.




Q: Aren’t you being very hard on the Route 128 and Silicon Valley people? You’ve called them arrogant, immature.

A: High tech is living in the nineteenth century, the pre-management world.

They believe that people pay for technology.

They have a romance with technology.

But people don’t pay for technology: they pay for what they get out of technology.

 

If you look at the successful companies, they are the ones who either learn management or bring it in.

In the really successful high-tech companies, the originator isn’t usually there five years later.

He may be on the board; he may be honorary chairman; but he is out, and usually with bitterness.

The Apple story is different only in its dimensions.

Steve Jobs lacked the discipline.

I don’t mean the self-discipline.

I mean the basic knowledge and the willingness to apply it.

 

High tech, precisely because it has all the glamour, is prone to arrogance far more than any other.

But it’s not confined to high tech.




Q: Where else?

A: Finance.

There’s a different kind of egomaniac there, but still an egomaniac.

Partly for the same reason.

They make too much money too soon.

It spoils you, you know, to get $450,000 in stock options at age twenty-three.

It’s a very dangerous thing.

It’s too much excitement.




Q: This entrepreneurial society that you write about in the book, how did it develop? And are you absolutely persuaded that it’s not just a fad?

A: Certainly, demographics have had a lot to do with it.

You go back thirty years, twenty-five years, and the able graduates of, let’s say, Harvard Business School all wanted to go into big business.

And it was a rational, intelligent thing to do because the career opportunities were there.

But now, you see, because of the baby boom, the pipelines are full.

 

Another reason we have an entrepreneurial society, and it’s an important reason, is that high tech has made it respectable.

The great role of high tech is in creating the climate for entrepreneurs, the vision.

And it has also created the sources of capital.

When you go to the venture capitalists, you know, most of them are no longer emphasizing high tech.

But all of them began in high tech.

It was high tech that created the capital flow.

And how recent this is, is very hard to imagine.

In 1976, I published a book on pension funds in which I said that one of the great problems in making capital formation institutional is that there won’t be any money for new businesses.

That was only ten years ago and at that time what I said was obvious.

Today it would be silly.

 

The third thing promoting the entrepreneurial society perhaps is the most important, although I’m not sure whether I’m talking chicken or egg.

There’s been a fundamental change in basic perception over the past, make it, fifty years.

The trend was toward centralization—in business, in government, and in health care.

At the same time, when we came out of World War II we had discovered management.

But management was something that we thought could work only in large, centralized institutions.

In the early 1950s, I helped start what became the Presidents’ Course given by the American Management Associations.

In the first years, until 1970, for every hundred people they invited, eighty wrote back and said: “This is very interesting, but I’m not GE.

What would I need management for?”

And the same was true when I first started to work with the American College of Hospital Administrators, which gave a seminar in management.

Hospital administrators needed it, but invariably we got the answer, “We have only ninety beds; we can’t afford management.”

This has all changed now.

Don’t ask me how and when.

But nowadays, the only place left where you still have the cult of bigness is in Japan.

There, bigger is better and biggest is best.

 

So, in part, the entrepreneurial society came about because we all “learned” how to manage.

It’s become part of the general culture.

Look, Harper & Row—who is the publisher for [Tom] Peters and [Bob] Waterman—half of the 2 or 3 million books they sold were graduation presents for high school graduates.




Q: Your book or In Search of Excellence?

A: Oh, no, no.

Not my book.

My book would be hopeless.

They couldn’t read it, much less master it.

The great virtue of the Peters and Waterman book is its extreme simplicity, maybe oversimplification.

But when Aunt Mary has to give that nephew of hers a high school graduation present and she gives him In Search of Excellence, you know that management has become part of the general culture.




Q: Does the arrival of the entrepreneurial society mean that we should be rejoicing now because our national economic future is assured?

A: No. It’s bringing tremendous change to a lot of vast institutions, and if they can’t learn, the changes will be socially unbearable.




Q: Has any of them started to change?

A: My God, yes.

The new companies are the least of it, historically.

The more important part is what goes on in existing institutions.

What is far more important is that the American railroad has become innovative with a vengeance in the last thirty years.

When I first knew the railroads in the late 1940s, there was no hope for them.

I was quite sure that they would all have to be nationalized.

Now, even Conrail, the government-owned railroad, makes money.

 

What has happened in finance is even more dramatic.

In, make it, 1960, some smart cookies at General Electric Credit Corporation realized that commercial paper is a commercial loan, not legally, but economically.

Legally, in this country, it’s a security, so the commercial banks have a hard time using it.

Our number-two bank is not Chase and not Bank of America.

It’s General Electric Credit.

 

The most robotized plant in the world is probably the GE locomotive plant in Erie, Pennsylvania.

Twenty years ago, GE didn’t make a single locomotive in this country.

It was much too expensive.

They were all made by GE Brazil.

Now, the U.S. plant is far more automated than anything you could possibly find in Japan or Korea.

 

That’s where the innovation has been, and that’s where we need it, because if we don’t get the changes in there we will have one corpse after another, with enormous social danger.




Q: Is that why you wrote Innovation and Entrepreneurship?

A: I wrote the book because I felt the time had come to be a little more serious about the topic than most of the prevailing work was and also in part because, bluntly, most of the things you read or hear seem to me, on the basis of thirty years of work and experience, to be misunderstandings.

The entrepreneur—the person with George Gilder’s entrepreneurial personality—yes, there are such people, but they are rarely successful.

On the other hand, people whom Gilder would never accept as entrepreneurs are often very successful.

Entrepreneurship is not a romantic subject.

It’s hard work.

I wanted to dislodge the nineteenth-century folklore that holds that entrepreneurship is all about small business and new business.

Entrepreneurs range from the likes of Citibank (which nobody has accused of being new or small) or General Electric Credit, to Edward D. Jones & Co. in St. Louis, the fastest-growing American financial-services company.

 

But there’s another reason.

When I published Practice of Management thirty years ago, that book made it possible for people to learn how to manage, something that up to then only a few geniuses seemed able to do, and nobody could replicate it.

I sat down and made a discipline of it.

This book does the same with innovation and entrepreneurship.




Q: Well, you didn’t invent the stuff.

A: In a large part, yes.




Q: You didn’t invent the strategies. They were around before you wrote them down.

A: Not really.




Q: No? What I’m trying to say is that people were doing these things—finding market niches, promoting entrepreneurial behavior in their employees—before your book came out.

A: Yes, and everybody thought it required genius and that it could not be replicated.

Look, if you can’t replicate something because you don’t understand it, then it really hasn’t been invented; it’s only been done.

 

When I came into management, a lot of it had come out of engineering.

And a lot of it came out of accounting.

And some of it came out of psychology.

And some more came out of labor relations.

Each of those was considered separate, and each of them, by itself, was ineffectual.

You can’t do carpentry, you know, if you have only a saw, or only a hammer, or you never heard of a pair of pliers.

It’s when you put all those tools into one kit that you invent.

That’s what I did in large part in this book.




Q: You’re certainly one of the most accessible of the serious writers on management topics.

A: Well, I’m a professional writer, and I do not believe that obscurity is a virtue.




Q: Why do you work alone? No staff?

A: I don’t enjoy having to do work to keep other people busy.

I want to do the work I want to do and not the work I have to do because I have to pay them or they have to eat.

I’m a solo performer.

I’ve never been interested in building a firm.

I’m also not interested in managing people.

It bores me stiff.




Q: Do clients come to you now?

A: With one exception I don’t do any consulting elsewhere.




Q: Why are you interested in business? If your overarching interest is in organizations, why not study other kinds? Why not political organizations?

A: My consulting practice is now fifty / fifty profit-nonprofit.

But I didn’t come out of business.

I came out of political journalism.

In my second book, The Future of Industrial Man, I came to the conclusion that the integrating principle of modern society had become the large organization.

At that time, however, there was only the business organization around.

In this country, the business enterprise was the first of the modern institutions to emerge.

I decided that I needed to be inside, to really study a big company from the inside: as a human, social, political organization—as an integrating mechanism.

I tried to get inside, and I had met quite a few people as a journalist and as an investment banker.

They all turned me down.

The chairman of Westinghouse was very nice to me when I came to see him, but when I told him what I wanted he not only threw me out, he gave instructions to his staff not to allow me near the building because I was a Bolshevik.

This was 1940.

 

By 1942, I was doing quite a bit of work for the government.

I had given up looking for a company to study when one day the telephone rang and the fellow said, “My name is Paul Garrett, and I am vice-president of public relations for General Motors Corp.

My vice-chairman has asked me to call you to ask whether you would be willing and available to make a study of our top management structure.”

Since then, nobody at General Motors has ever admitted to having been responsible for this, but that is how I got into business.




Q: You look at business from a position that is unique. You’re neither an academic …

A: Though I’ve been teaching for fifty years.




Q: But you don’t consider yourself an academic. And you certainly don’t write like an academic.

A: That is, you know, a slur on academics.

It is only in the last twenty or thirty years that being incomprehensible has become a virtue in academia.




Q: Nor are you an operations person.

A: No, I’m no good at operations.




Q: So you don’t get down there in the mud with your clients.

A: Oh yes, a little.

Look, whatever problem a client has is my problem.

Here is that mutual fund company that sees that the market is shifting from sales commissions to no front-end loads.

A terrific problem, they said.

I said, no, it’s an opportunity.

The salesman has to get his commission right away, and the customer has to pay it over five years.

That’s a tax shelter.

Make it into something that, if you hold it for five years, you pay not income tax but capital gains tax on it.

Then you’ll have a new product.

It’s been the greatest success in the mutual fund industry.

That’s nitty-gritty enough, isn’t it?

I let him do all the work, and he consults the lawyers.

I could do it, too, but he doesn’t need me to sit down with his lawyers and write out the prospectus for the SEC.




Q: Do you read the new management books that come out?

A: I look at a great deal of them.

Once in a while you get a book by a practitioner, like [Intel Corp. president] Andy Grove’s book, High Output Management, on how one maintains the entrepreneurial spirit in a very big and rapidly growing company.

I think that’s a beautiful book and very important.

But in order to get a book like that, I plow through a lot of zeros.

Fortunately, the body processes cellulose very rapidly.




Q: It doesn’t bother you that Tom Peters and Bob Waterman got rich and famous for a book built on ideas, Peters has said, that you had already written about?

A: No. The strength of the Peters book is that it forces you to look at the fundamentals.

The book’s great weakness—which is a strength from the point of view of its success—is that it makes managing sound so incredibly easy.

All you have to do is put that book under your pillow, and it’ll get done.




Q: What do you do with your leisure time?

A: What leisure time?




Q: Maybe I should have asked if you have any?

A: On my seventieth birthday, I gave myself two presents.

One was finishing my first novel and the other was a second professorship, this one in Japanese art.

For fifty years, I’ve been interested in Oriental art, and now I’ve reached a point where I’m considered, especially in Japan, to be the expert in certain narrow areas—advising museums, helping collectors.

That takes a fair amount of time.

 

Also, I don’t know how many hundreds or thousands of people there are now all over the world who were either clients of mine or students and who take the telephone and call to hear themselves talk and for advice.

 

I swim a great deal and I walk a great deal.

But leisure time in the sense of going bowling, no.




Q: How do you write?

A: Unsystematically.

It’s a compulsion neurosis.

There’s no pattern.




Q: Do you use a typewriter?

A: Sometimes. It depends.

And I never know in advance how it’s going to work.




Q: How long, for example, does it take you to write a Wall Street Journal column?

A: To write it, not very long—a day.

To do it, much longer.

They’re only fourteen hundred, fifteen hundred words.

I recently did a huge piece on the hostile takeover wave—six thousand, seven thousand words—for The Public Interest.

I had to change quite a bit.

It suddenly hit me that I know what a hostile takeover is, but how many readers do?

That had to be explained.

I totally changed the structure, and that takes a long time for me.

Once I understand it, though, I can do it very fast.

See, I’ve made my living as a journalist since I was twenty.

My first job was on a paper that published almost as much copy as The Boston Globe.

The Globe has 350 editorial employees; we were 14 employees fifty years ago, which is much healthier.

 

On my first day—I was barely twenty—I was expected to write two editorials.




Q: If the business of America is business, and all that sort of thing, why don’t businesspeople have a better popular image than they do?

A: The nice thing about this country is that nobody’s popular except people who don’t matter.

This is very safe.

Rock stars are popular, because no rock star has ever lasted for more than a few years.

Rock stars are therefore harmless.

The wonderful thing about this country is the universal persecution mania.

Every group feels that it is being despised and persecuted.

Have you ever heard the doctors talk about how nobody appreciates how much they bleed for the good of humanity?

Everybody is persecuted.

Everybody feels terribly sorry for himself.

You sit down with university professors, and it is unbelievable how terrible their lot is.

The businessman feels unloved, misunderstood, and neglected.

And have you ever sat down with labor leaders?

 

They are all of them right.

It’s all true.

This is not a country that has great respect, and this is one of its great safeguards against tyranny.

We save our adulation for people who will never become a menace—for baseball players and rock stars and movie idols who are completely innocuous.

We have respect for accomplishment, but not for status.

There is no status in this country.

There’s respect for the office of the president, but no respect for the president.

As a consequence, here, everybody feels persecuted and misunderstood, not appreciated, which I think is wonderful.




Q: Would you like to say something disrespectful about economists?

A: Yes.

Economists never know anything until twenty years later.

There are no slower learners than economists.

There is no greater obstacle to learning than to be the prisoner of totally invalid but dogmatic theories.

The economists are where the theologians were in 1300: prematurely dogmatic.

 

Until fifty years ago, economists had been becomingly humble and said all the time, “We don’t know.”

Before 1929, nobody believed that government had any responsibility for the economy.

Economists said, “Since we don’t know, the only policy with a chance for success is no policy.

Keep expenditures low, productivity high, and pray.”

 

But after 1929, government took charge of the economy and economists were forced to become dogmatic, because suddenly they were policymakers.

They began asserting, Keynes first, that they had the answers, and what’s more the answers were pleasant.

It was like a doctor telling you that you have inoperable liver cancer, but it will be cured if you go to bed with a beautiful seventeen-year-old.

Keynes said there’s no problem that can’t be cured if only you keep purchasing power high.

What could be nicer?

The monetarist treatment is even easier: There’s nothing that won’t be cured if you just increase the money supply by 3 percent per year, which is also increasing incomes.

The supply-siders are more pleasant still: There’s no disease that can’t be cured by cutting taxes.

 

We have no economic theory today.

But we have as many economists as the year 1300 had theologians.

Not one of them, however, will ever be sainted.

By 1300, the age of saints was over, more or less, and there is nothing worse than the theologian who no longer has faith.

That’s what our economists are today.




Q: What about government? Do you see any signs that the entrepreneurial society has penetrated government as an organization?

A: The basic problem of American government today is that it no longer attracts good people.

They know that nothing can be done; government is a dead-end street.

Partly it’s because, as in business, all the pipelines are full, but also because nobody has belief in government.

Fifty years ago, even twenty years ago, government was the place where the ideas were, the innovation, the new things.

Japan is the only country where government is still respected and where government service still attracts the top people.




Q: So there’s nothing for government to do, in your view?

A: Oh, no, no.

The days of the welfare state are over, but we are not going to abolish it.

We have to find its limits.

What are the limits?

At what point does welfare do damage?

This is the real question, and it’s brought up by the success of the welfare state.

The problems of success, I think, are the basic issues ahead of us, and the only thing I can tell you is that they don’t fit the political alignments of the nineteenth and early twentieth centuries.

They do not fit liberal and conservative and socialist.

The traditional parties make absolutely no sense whatever to anybody of age thirty.

And yet, what else is there?




Q: Is the Reagan administration promoting or inhibiting this entrepreneurial society of yours?

A: It’s a very interesting administration: totally schizophrenic.

When you look at its deeds, it hasn’t done one darn thing Mr. Carter wouldn’t have done.

And probably he wouldn’t have done it any worse either, or any better.

The words, however, are different.

 

This is a very clear symptom, I think, that there has been an irrevocable shift in the last ten years.

No matter who is in power, he would no longer believe in big government and would preach cutting expenses and would end up doing nothing about it.

This is because we, the American people, are at that interesting point where we are all in favor of cutting the deficit—at somebody else’s expense.

It’s a very typical stage in alcoholism, you know, where you know you have to stop—tomorrow.




Q: Do you think we will?

A: Alcoholics usually don’t reform until they’re in the gutter.

Maybe we won’t wait that long.

Three years ago, to do anything about Social Security would have been unspeakable.

Now it’s speakable.

It’s not doable yet, but I think we’re inching toward solutions.




Q: You’re not too worried about the future then?

A: Well, one can get awfully pessimistic about the world.

It’s clearly not in good shape, but it probably never has been, not in my lifetime.

One of my very early childhood memories is the outbreak of World War I. My father and his brother-in-law, who was a very famous lawyer, jurist, and philosopher, and my father’s close friend at the time, Tomás Masaryk, the founder of Czechoslovakia, a great historian and much older, of course …

… I still remember our house.

Hot-air heating pipes carry sound beautifully.

Our bathroom was above my father’s study.

I was not quite five, and I listened at the hot-air register to my father and my uncle Hans and Masaryk saying, “This is the end not just of Austria, but of civilization.”

That is the first thing that I can remember clearly.

And then I remember the endless obituaries in the newspaper.

That’s the world I grew up in and was very conscious of, the last days of anything that had any value.

And it hasn’t changed since.

So it’s awfully easy for me to be pessimistic, but what’s the use of it?

Lots of things worry me.

On the other hand, we have survived against all odds.




Q: It’s hard to place you, politically …

A: I’m an old—not a neo—conservative.

The neoconservatives started out on the Left, and now they are basically old-fashioned liberals, which is respectable, but I’ve never been one.

For instance, although I believe in the free market, I have serious reservations about capitalism.

Any system that makes one value absolute is wrong.

Basically, the question is not what are our rights, but what are our responsibilities.

These are very old conservative approaches, and I raised them in the first book I wrote, The End of Economic Man, when I was in my twenties, so I have not changed.




Q: Were you ever tempted to go into politics?

A: No, I realized very early that I was apolitical, in the sense that I hadn’t the slightest interest in power for myself.

And if you have no interest in power, you are basically a misfit in politics.

On the other hand, give me a piece of paper and pencil, and I start to enjoy myself.




Q: What other things cheer you?

A: I am very much impressed by the young people.

First, that most of the things you hear about them are nonsense, for example, complaints that they don’t work.

I think that basically they’re workaholics.

And there is a sense of achievement there.

But I’m glad that I’m not twenty-five years old.

It’s a very harsh world, a terribly harsh world for young people.

[1985]

This interview was conducted by senior writer Tom Richman and appeared in the October 1985 issue of Inc.

 

line

 

The Changed World Economy

There is a lot of talk today of the changing world economy.

But—and this is the point of this chapter—the world economy is not changing.

It has already changed in its foundations and in its structure, and irreversibly so in all probability.




Within the last ten or fifteen years, three fundamental changes have occurred in the very fabric of the world’s economy:

1. The primary-products economy has come “uncoupled” from the industrial economy;

2. In the industrial economy itself, production has come uncoupled from employment;

3. Capital movements rather than trade in goods and services have become the engines and driving force of the world economy.

The two (trade and capital movements) have not, perhaps, become uncoupled.

But the link has become quite loose, and worse, quite unpredictable.




These changes are permanent rather than cyclical.

We may never understand what caused them—the causes of economic change are rarely simple.

It may be a long time before economic theorists accept that there have been fundamental changes, and longer still before they adapt their theories to account for them.

They will surely be most reluctant, above all, to accept that the world economy is in control rather than the macroeconomics of the national state, on which most economic theory still exclusively focuses.

Yet this is the clear lesson of the success stories of the last twenty years: of Japan and South Korea; of West Germany, actually a more impressive though far less flamboyant performance than Japan; and of the one great success within the United States, the turnaround and rapid rise of an industrial New England that, only twenty years ago, was widely considered moribund.




But practitioners, whether in government or in business, cannot wait till there is a new theory, however badly needed.

They have to act.

And then their actions will be the more likely to succeed the more they are based on the new realities of a changed world economy.

The Primary-Products Economy

The collapse in nonoil commodity prices began in 1977 and has continued, interrupted only once, right after the 1979 petroleum panic, by a speculative burst that lasted less than six months and was followed by the fastest drop in commodity prices ever recorded.




In early 1986, overall, raw-materials prices (other than petroleum) were at the lowest level in recorded history in relation to the prices of manufactured goods and services—as low as in 1932, and in some cases (lead and copper) lower than at the depths of the Great Depression.




The collapse of raw-materials prices and the slowdown of raw-materials demand are in startling contrast to what was confidently predicted.

Ten years ago The Report of the Club of Rome predicted that desperate shortages of all raw materials were an absolute certainty by the year 1985.

Even more recently, in 1980 the Global 2000 Report of President Carter’s administration concluded that world demand for food would increase steadily for at least twenty years; that food production worldwide would go down except in developed countries; and that real food prices would double.

This forecast largely explains why American farmers bought up whatever farmland was available, thus loading on themselves the debt burden that now threatens so many of them.




But contrary to all these predictions, agricultural output in the world actually rose almost a full third between 1972 and 1985 to reach an all-time high.

And it rose the fastest in less developed countries.

Similarly, production of practically all forest products, metals, and minerals has been going up between 20 and 35 percent in these last ten years, again with production rising the fastest in less developed countries.

And there is not the slightest reason to believe that the growth rates will be slackening, despite the collapse of prices.

Indeed, as far as farm products are concerned, the biggest increase, at an almost exponential rate of growth, may still be ahead.




But perhaps even more amazing than the contrast between what everybody expected and what happened is that the collapse in the raw-materials economy seems to have had almost no impact on the industrial economy of the world.

Yet, if there was one thing that was “known”—and considered “proved” without doubt—in business cycle theory, it was that a sharp and prolonged drop in raw-materials prices inevitably, and within eighteen months to two and a half years, brings on a worldwide depression in the industrial economy.

The industrial economy of the world is surely not normal by any definition of the term.

But it is also surely not in a worldwide depression.

Indeed, industrial production in the developed noncommunist countries has continued to grow steadily, albeit at a somewhat slower rate, especially in Western Europe.




Of course the depression in the industrial economy may only have been postponed and may still be triggered, for instance, by a banking crisis caused by massive defaults on the part of commodity-producing debtors, whether in the Third World or in Iowa.

But for almost ten years, the industrial world has run as though there were no raw-materials crisis at all.




The only explanation is that for the developed countries—excepting only the Soviet Union—the primary-products sector has become marginal where it had always been central before.




In the late 1920s, before the Great Depression, farmers still constituted nearly one-third of the U.S. population, and farm income accounted for almost a quarter of the gross national product (GNP).

Today they account for about one-twentieth of both the population and the GNP.

Even adding the contribution that foreign raw-materials and farm producers make to the American economy through their purchases of American industrial goods, the total contribution of the raw-materials and food-producing economies of the world to the American GNP is, at most, one-eighth.

In most other developed countries, the share of the raw-materials sector is even lower than in the United States.

Only in the Soviet Union is the farm still a major employer, with almost a quarter of the labor force working on the land.




The raw-materials economy has thus come uncoupled from the industrial economy.

This is a major structural change in the world economy, with tremendous implications for economic and social policy and economic theory, in developed and developing countries alike.




For example, if the ratio between the prices of manufactured goods and the prices of primary products (other than petroleum)—that is, of foods, forest products, metals, and minerals—had been the same in 1985 as it had been in 1973, or even in 1979, the U.S. trade deficit in 1985 might have been a full third less, $100 billion as against an actual $150 billion.

Even the U.S. trade deficit with Japan might have been almost a third lower, some $35 billion as against $50 billion.

American farm exports would have brought almost twice as much.

And our industrial exports to one of our major customers, Latin America, would have held; their near-collapse alone accounts for a full one-sixth of the deterioration in U.S. foreign trade.

If primary-products prices had not collapsed, America’s balance of payments might even have shown a substantial surplus.




Conversely, Japan’s trade surplus with the world might have been a full one-fifth lower.

And Brazil in the last few years would have had an export surplus almost 50 percent higher than its actual one.

Brazil would then have had little difficulty meeting the interest on its foreign debt and would not have had to endanger its economic growth by drastically curtailing imports as it did.

Altogether, if raw-materials prices in relationship to manufactured goods prices had remained at the 1973 or even the 1979 level, there would be no crisis for most debtor countries, especially in Latin America.
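
A quick back-of-envelope check of these ratios, using only the figures cited above (the numbers are Drucker’s; the worked arithmetic is an editorial illustration, not part of the original text):

\[
1 - \frac{\$100\ \text{billion}}{\$150\ \text{billion}} = \frac{1}{3},
\qquad
1 - \frac{\$35\ \text{billion}}{\$50\ \text{billion}} = 0.30
\]

That is, the overall U.S. trade deficit would indeed have been “a full third less,” and the deficit with Japan 30 percent, or “almost a third,” lower.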




What has happened?

And what is the outlook?




Demand for food has actually grown almost as fast as the Club of Rome and the Global 2000 Report anticipated.

But the supply has been growing much faster.

It not only has kept pace with population growth; it has steadily outrun it.

One cause of this, paradoxically, is surely the fear of worldwide food shortages, if not of world famine.

It resulted in tremendous efforts to increase food output.

The United States led the parade with a farm policy successfully aiming (except in one year: 1983) at subsidizing increased food production.

The European Common Market followed suit, and even more successfully.

The greatest increases, both in absolute and in relative terms, have, however, been in developing countries: in India, in post-Mao China, and in the rice-growing countries of Southeast Asia.




And then there is also the tremendous cut in waste.

Twenty-five years ago, up to 80 percent of the grain harvest of India fed rats and insects rather than human beings.

Today in most parts of India the wastage is down to 20 percent, the result of such unspectacular but effective infrastructure innovations as small concrete storage bins, insecticides, or three-wheeled motorized carts that take the harvest straight to a processing plant instead of letting it sit in the open for weeks on end.




And it is not too fanciful to expect that the true revolution on the farm is still ahead.

Vast tracts of land that hitherto were practically barren are being made fertile, either through new methods of cultivation or through adding trace minerals to the soil: the sour clays in the Brazilian highlands, for instance, or aluminum-contaminated soils in neighboring Peru, which never produced anything before and which now produce substantial quantities of high-quality rice.

Even greater advances are registered in biotechnology, both in preventing diseases of plants and animals and in increasing yields.




In other words, just as the population growth of the world is slowing down, and in many parts quite dramatically, food production is likely to increase sharply.




But import markets for food have all but disappeared.

As a result of its agricultural drive, Western Europe has become a substantial food exporter plagued increasingly by unsalable surpluses of all kinds of foods, from dairy products to wine and from wheat to beef.

China, some observers now predict, will have become a food exporter by the year 2000.

India has already reached that stage, especially in respect to wheat and coarse grains.

Of all major noncommunist countries only Japan is still a substantial food importer, buying abroad about one-third of her food needs.

Today most of this comes from the United States.

Within five or ten years, however, South Korea, Thailand, and Indonesia—low-cost producers that are increasing food output fast—will compete with the United States to become Japan’s major suppliers.

The only remaining major world-market food buyer may then be the Soviet Union, and Russia’s food needs are likely to grow.

However, the food surpluses in the world are so large, maybe five to eight times what Russia would ever need to buy, that the Russian food needs are not by themselves enough to put upward pressure on world prices.

On the contrary, the competition for access to the Russian market among the surplus producers—the United States, Europe, Argentina, Australia, New Zealand (and, probably within a few years, India as well)—is already so intense as to knock down world food prices.




For practically all nonfarm commodities, whether forest products, minerals, or metals, world demand itself—in sharp contrast to what the Club of Rome so confidently predicted—is shrinking.

Indeed, the amount of raw materials needed for a given unit of economic output has been dropping for the entire century, except in wartime.

A recent study by the International Monetary Fund calculates the decline as being at the rate of one and a quarter percent a year (compound) ever since 1900.

That would mean that the amount of industrial raw materials needed for one unit of industrial production is now no more than two-fifths of what it was in 1900, and the decline is accelerating.

Even more startling are recent Japanese developments.

In 1984, Japan, for every unit of industrial production, consumed only 60 percent of the raw materials she had consumed for the same amount of industrial production in 1973, only eleven years earlier.
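
The two figures fit together with a little compound-rate arithmetic (the rates are from the text; the calculation is an editorial illustration, not Drucker’s). A decline of one and a quarter percent a year, compounded over the 85 years from 1900 to 1985, gives

\[
(1 - 0.0125)^{85} \approx 0.34,
\]

roughly a third of the 1900 level, consistent with the “no more than two-fifths” cited above. And Japan’s drop to 60 percent of the 1973 level over eleven years implies an annual rate of

\[
1 - 0.60^{1/11} \approx 0.045,
\]

about 4.5 percent a year, more than three times the century-long average, which is what it means to say the decline is accelerating.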




Why this decline?

It is not that industrial production is becoming less important, a common myth for which, as we shall see shortly, there is not the slightest evidence.

What is happening is much more important.

Industrial production is steadily switching from heavily material-intensive to far less material-intensive products and processes.

One reason for this is the emergence of the new and especially the high-tech industries.

The raw materials in a semiconductor microchip account for 1 to 3 percent of total cost; in an automobile their share is 40 percent; and in pots and pans, 60 percent.

But the same scaling down of raw-material needs goes on in old industries, and with respect to old products as well as new ones.

Fifty to one hundred pounds of fiberglass cable transmits as many telephone messages as does one ton of copper wire, if not more.




This steady drop in the raw-material intensity of manufacturing processes and manufacturing products extends to energy as well, and especially to petroleum.

To produce one hundred pounds of fiberglass cable requires no more than one-twentieth of the energy needed to mine and smelt enough copper ore to produce one ton of copper and then to draw it out into copper wire.

Similarly plastics, which are increasingly replacing steel in automobile bodies, represent a raw-materials cost, including energy, of less than half that of steel.




And if copper prices were to double—and that would still mean a fairly low price by historical standards—we would soon start to “mine” the world’s largest copper deposits, which are not the mines of Chile or of Utah, but the millions of tons of telephone cable under the streets of our large cities.

It would then pay us to replace the underground copper cables with fiberglass.




Thus it is quite unlikely that raw-materials prices will rise substantially compared to the prices of manufactured goods (or of high-knowledge services such as information, education, or health care) except in the event of a major prolonged war.




One implication of this sharp shift in the terms of trade of primary products concerns the developed countries, whether major raw-materials exporters like the United States or major raw-materials importers such as Japan.

The United States for two centuries has seen maintenance of open markets for its farm products and raw materials as central to its international trade policy.

This is in effect what is meant in the United States by an “open world economy” and by “free trade.”

Does this still make sense?

Or does the United States instead have to accept that foreign markets for its foodstuffs and raw materials are in long-term and irreversible decline?

But also, does it still make sense for Japan to base its international economic policy on the need to earn enough foreign exchange to pay for imports of raw materials and foodstuffs?

Since Japan opened herself to the outside world 120 years ago, preoccupation, amounting almost to a national obsession, with this dependence on raw-materials and food imports has been the driving force of Japan’s policy, and not in economics alone.

But now Japan might well start out with the assumption, a far more realistic one in today’s world, that foodstuffs and raw materials are in permanent oversupply.




Taken to their logical conclusion, these developments might mean that some variant of the traditional Japanese policy—highly “mercantilist” with strong deemphasis of domestic consumption and equally strong emphasis on capital formation, and with protection of “infant” industries—might suit the United States better than its own traditions.

Conversely the Japanese might be better served by some variant of America’s traditional policies, and especially by shifting from favoring savings and capital formation to favoring consumption.

But is such a radical break with a hundred years and more of political convictions and commitments likely?

Still, from now on the fundamentals of economic policy are certain to come under increasing criticism in these two countries, and in all other developed countries as well.




They will also, however, come under increasing scrutiny in major Third World nations.

For if primary products are becoming of marginal importance to the economics of the developed world, traditional development theories and traditional development policies are losing their foundations.

All of them are based on the assumption, historically a perfectly valid one, that developing countries pay for imports of capital goods by exporting primary materials—farm and forest products, minerals, metals.

All development theories, however much they differ otherwise, further assume that raw-materials purchases on the part of the industrially developed countries must rise at least as fast as industrial production in these countries.

This then implies that, over any extended period of time, any raw-materials producer becomes a better credit risk and shows a more favorable balance of trade.

But this has become highly doubtful.

On what foundation, then, can economic development be based, especially in countries that do not have a large enough population to develop an industrial economy based on the home market?

And, as we shall presently see, economic development of these countries can also no longer be based on low labor costs.

What “De-Industrialization” Means

The second major change in the world economy is the uncoupling of manufacturing production from manufacturing employment.

To increase manufacturing production in developed countries has actually come to mean decreasing blue-collar employment.

As a consequence, labor costs are becoming less and less important as a “comparative cost” and as a factor in competition.




There is a great deal of talk these days about the “deindustrialization” of America.

But in fact, manufacturing production has gone up steadily in absolute volume and has not gone down at all as a percentage of the total economy.

Ever since the end of the Korean War, that is, for more than thirty years, it has held steady at around 23 to 24 percent of America’s total GNP.

It has similarly remained at its traditional level in all of the major industrial countries.




It is not even true that American industry is doing poorly as an exporter.

To be sure, this country is importing far more manufactured goods than it ever did from both Japan and Germany.

But it is also exporting more than ever before—despite the heavy disadvantage in 1983, 1984, and most of 1985 of a very expensive dollar, of wage increases larger than our main competitors had, and of the near-collapse of one of our main industrial markets, Latin America.

In 1984, the year the dollar soared, exports of American manufactured goods rose by 8.3 percent, and they went up again in 1985.

The share of U.S.-manufactured exports in world exports was 17 percent in 1978.

By 1985 it had risen to 20 percent, with West Germany accounting for 18 percent and Japan for 16 (the three countries together thus accounting for more than half of the total).




Thus it is not the American economy that is being “deindustrialized.”

It is the American labor force.




Between 1973 and 1985, manufacturing production in the United States actually rose by almost 40 percent.

Yet manufacturing employment during that period went down steadily.

There are now 5 million fewer people employed in blue-collar work in the American manufacturing industry than there were in 1975.




Yet in the last twelve years total employment in the United States grew faster than at any time in the peacetime history of any country—from 82 to 110 million between 1973 and 1985, that is, by a full third.

The entire growth, however, was in nonmanufacturing, and especially in non-blue-collar jobs.




The trend itself is not new.

In the 1920s, one out of every three Americans in the labor force was a blue-collar worker in manufacturing.

In the 1950s, the figure was still one in every four.

It now is down to one in every six—and dropping.




But although the trend has been running for a long time, it has lately accelerated to the point where, in peacetime at least, no increase in manufacturing production, no matter how large, is likely to reverse the long-term decline in the number of blue-collar jobs in manufacturing or in their proportion of the labor force.




And the trend is the same in all developed countries and is, indeed, even more pronounced in Japan.

It is therefore highly probable that developed countries such as the United States or Japan will, by the year 2010, employ no larger a proportion of the labor force in manufacturing than developed countries now employ in farming—at most, one-tenth.

Today the United States employs around 18 million people in blue-collar jobs in the manufacturing industry.

Twenty-five years hence the number is likely to be 10—at most, 12—million.

In some major industries the drop will be even sharper.

It is quite unrealistic, for instance, to expect the American automobile industry to employ, twenty-five years hence, more than one-third of its present blue-collar force, even though production might be 50 percent higher.




If a company, an industry, or a country does not succeed in the next quarter century in sharply increasing manufacturing production, while sharply reducing the blue-collar work force, it cannot hope to remain competitive, or even to remain “developed.”

It would decline fairly fast.

Great Britain has been in industrial decline these last twenty-five years, largely because the number of blue-collar workers per unit of manufacturing production went down far more slowly than in all other noncommunist developed countries.

Yet Britain has the highest unemployment rate among noncommunist developed countries: more than 13 percent.




The British example indicates a new but critical economic equation: A country, an industry, or a company that puts the preservation of blue-collar manufacturing jobs ahead of being internationally competitive (and that implies steady shrinkage of such jobs) will soon have neither production nor steady jobs.

The attempt to preserve industrial blue-collar jobs is actually a prescription for unemployment.




On the national level, this is accepted only in Japan so far.

Indeed, Japanese planners, whether those of the government or those of private business, start out with the assumption of a doubling of production within fifteen or twenty years based on a cut in blue-collar employment of 25 to 40 percent.

And a good many large American companies such as IBM, General Electric, or the big automobile companies forecast parallel development.

Implicit in this is also the paradoxical fact that the faster a country shrinks blue-collar employment in manufacturing, the less general unemployment it will have.




But this is not a conclusion that politicians, labor leaders, or indeed the general public can easily understand or accept.




What will confuse the issue even more is that we are experiencing several separate and different shifts in the manufacturing economy.




One is the acceleration of the substitution of knowledge and capital for manual labor.

Where we spoke of mechanization a few decades ago, we now speak of robotization or automation.

This is actually more a change in terminology than a change in reality.

When Henry Ford introduced the assembly line in 1909, he cut the number of man-hours required to produce a motorcar by some 80 percent in two or three years: far more than anybody expects to happen as a result even of the most complete robotization.

But there is no doubt that we are facing a new, sharp acceleration in the replacement of manual workers by machines, that is, by the products of knowledge.




A second development—and in the long run it may be fully as important if not more important—is the shift from industries that are primarily labor-intensive to industries that, from the beginning, are primarily knowledge-intensive.

The costs of the semiconductor microchip are about 70 percent knowledge and no more than 12 percent labor.

Similarly, of the manufacturing costs of prescription drugs, “labor” represents no more than 10 or 15 percent, with knowledge—research, development, and clinical testing—representing almost 50 percent.

By contrast, in the most fully robotized automobile plant labor would still account for 20 or 25 percent of the costs.




Another, and highly confusing, development in manufacturing is the reversal of the dynamics of size.

Since the early years of this century, the trend in all developed countries has been toward larger and ever larger manufacturing plants.

The “economies of scale” greatly favored them.

Perhaps equally important, what one might call the economies of management favored them.

Up until recently, modern management seemed to be applicable only to fairly large units.




This has been reversed with a vengeance the last fifteen to twenty years.

The entire shrinkage in manufacturing jobs in the United States has been in large companies, beginning with the giants in steel and automobiles.

Small and especially medium-size manufacturers have either held their own or actually added people.

In respect to market standing, exports, and profitability too, smaller and especially middle-size businesses have done remarkably better than the big ones.

The same reversal of the dynamics of size is occurring in the other developed countries as well, even in Japan, where bigger was always better and biggest meant best!

The trend has reversed itself even in old industries.

The most profitable automobile company these last years has not been one of the giants, but a medium-size manufacturer in Germany: BMW.

The only profitable steel companies worldwide have been medium-size makers of specialty products, such as oil-drilling pipe, whether in the United States, in Sweden, or in Japan.




In part, especially in the United States, this is a result of a resurgence of entrepreneurship.

But perhaps equally important, we have learned in the last thirty years how to manage the small and medium-size enterprise—to the point that the advantages of smaller size, for example, ease of communications and nearness to market and customer, increasingly outweigh what had been forbidding management limitations.

Thus in the United States, but increasingly in the other leading manufacturing nations such as Japan and West Germany, the dynamism in the economy has shifted from the very big companies that dominated the world’s industrial economy for thirty years after World War II to companies that, while much smaller, are still professionally managed and, largely, publicly financed.




But also there are emerging two distinct kinds of “manufacturing industry”: one group that is materials-based, the industries that provided economic growth in the first three-quarters of this century; and another group that is information- and knowledge-based: pharmaceuticals, telecommunications, analytical instruments, information processing such as computers, and so on.

And increasingly it is the information-based manufacturing industries in which growth has come to center.




These two groups differ in their economic characteristics and especially in respect to their position in the international economy.

The products of materials-based industries have to be exported or imported as products.

They appear in the balance of trade.

The products of information-based industries can be exported or imported both as products and as services.




An old example is the printed book.

For one major scientific publishing company, “foreign earnings” account for two-thirds of total revenues.

Yet the company exports few books, if any; books are heavy.

It sells “rights.”

Similarly, the most profitable computer “export sale” may actually show up in the statistics as an “import.”

It is the fee some of the world’s leading banks, some of the big multinationals, and some Japanese trading companies get for processing in their home offices data sent in electronically from their branches or their customers anywhere in the world.




In all developed countries, knowledge workers have already become the center of gravity of the labor force, even in numbers.

Even in manufacturing they will outnumber blue-collar workers within fewer than ten years.

And then, exporting knowledge so that it produces license income, service fees, and royalties may actually create substantially more jobs than exporting goods.




This then requires, as official Washington has apparently already realized, far greater emphasis in trade policy on “invisible trade” and on abolishing the barriers, mostly of the non-tariff kind, to the trade in services, such as information, finance and insurance, retailing, patents, and even health care.

Indeed, within twenty years the income from invisible trade might easily be larger, for major developed countries, than the income from the export of goods.

Traditionally, invisible trade has been treated as a stepchild, if it received any attention at all.

Increasingly, it will become central.




Another implication of the uncoupling of manufacturing production from manufacturing employment is, however, that the choice between an industrial policy that favors industrial production and one that favors industrial employment is going to be a singularly contentious political issue for the rest of this century.

Historically these have always been considered two sides of the same coin.

From now on, however, the two will increasingly pull in different directions and are indeed becoming alternatives, if not incompatible.




“Benevolent neglect”—the policy of the Reagan administration these last few years—may be the best policy one can hope for, and the only one with a chance of success.

It is not an accident, perhaps, that the United States has, next to Japan, by far the lowest unemployment rate of any industrially developed country.

Still, there is surely need also for systematic efforts to retrain and to replace redundant blue-collar workers—something that no one as yet knows how to do successfully.




«§§§»


Finally, low labor costs are likely to become less and less of an advantage in international trade, simply because in the developed countries they are going to account for less and less of total costs.

But also, the total costs of automated processes are lower than even those of traditional plants with low labor costs, mainly because automation eliminates the hidden but very high costs of “not working,” such as the costs of poor quality and of rejects, and the costs of shutting down the machinery to change from one model of a product to another.




Examples are two automated U.S. producers of television receivers, Motorola and RCA.

Both were almost driven out of the market by imports from countries with much lower labor costs.

Both then automated, with the result that their American-made products successfully compete with foreign imports.

Similarly, some highly automated textile mills in the Carolinas can underbid imports from countries with very low labor costs, for example, Thailand.

Conversely, in producing semiconductors, some American companies have low labor costs because they do the labor-intensive work offshore, for instance, in West Africa.

Yet they are the high-cost producers, with the heavily automated Japanese easily underbidding them, despite much higher labor costs.




The cost of capital will thus become increasingly important in international competition.

And it is the cost in respect to which the United States has become, in the last ten years, the highest-cost country—and Japan the lowest-cost one.

A reversal of the U.S. policy of high interest rates and of high cost of equity capital should thus be a priority of American policymakers, the direct opposite of what has been U.S. policy for the past five years.

But this, of course, demands that cutting the government deficit rather than high interest rates becomes our defense against inflation.




For developed countries, and especially for the United States, the steady downgrading of labor costs as a major competitive factor could be a positive development.

For the Third World, and especially for the rapidly industrializing countries—Brazil, for instance, or South Korea or Mexico—it is, however, bad news.

Of the rapidly industrializing countries of the nineteenth century, one, Japan, developed herself by exporting raw materials, mainly silk and tea, at steadily rising prices.

One, Germany, developed by “leapfrogging” into the “high-tech” industries of its time, mainly electricity, chemicals, and optics.

The third rapidly industrializing country of the nineteenth century, the United States, did both.

Both ways are blocked for the present rapidly industrializing countries: the first one because of the deterioration of the terms of trade for primary products, the second one because it requires an “infrastructure” of knowledge and education far beyond the reach of a poor country (although South Korea is reaching for it!).

Competition based on lower labor costs seemed to be the way out.

Is this way going to be blocked too?

From “Real” to “Symbol” Economy

The third major change is the emergence of the symbol economy—capital movements, exchange rates, and credit flows—as the flywheel of the world economy, in place of the real economy—the flow of goods and services—and largely independent of it.

It is both the most visible and yet the least understood of the changes.




World trade in goods is larger, much larger, than it has ever been before.

And so is the invisible trade, the trade in services.

Together, the two amount to around $2.5 to $3 trillion a year.

But the London Eurodollar market, in which the world’s financial institutions borrow from and lend to each other, turns over $300 billion each working day, or $75 trillion a year, that is, at least twenty-five times the volume of world trade.




In addition, there are the (largely separate) foreign-exchange transactions in the world’s main money centers, in which one currency is traded against another (for example, U.S. dollars against the Japanese yen).

These run around $150 billion a day, or about $35 trillion a year: twelve times the worldwide trade in goods and services.
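
The multiples in these last two paragraphs reduce to simple arithmetic. Here is a minimal back-of-the-envelope check in Python; the only assumption not in the text is a year of roughly 250 working days:

world_trade = 3.0e12       # upper end of the $2.5 to $3 trillion estimate above
working_days = 250         # assumed working days per year

eurodollar_year = 300e9 * working_days   # $75 trillion a year, as stated
fx_year = 150e9 * working_days           # $37.5 trillion; the text rounds down to about $35 trillion

print(eurodollar_year / world_trade)     # 25.0 -- "at least twenty-five times" world trade
print(fx_year / world_trade)             # 12.5 -- roughly the "twelve times" cited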




No matter how many of these Eurodollars, or yen, or Swiss francs are just being moved from one pocket into another and thus counted more than once, there is only one explanation for the discrepancy between the volume of international money transactions and the trade in goods and services: capital movements unconnected to, and indeed largely independent of, trade greatly exceed trade finance.




There is no one explanation for this explosion of international—or more accurately, transnational—money flows.

The shift from fixed to “floating” exchange rates in 1971 may have given the initial impetus (though, ironically, it was meant to do the exact opposite).

It invited currency speculation.

The surge in liquid funds flowing to Arab petroleum producers after the two “oil shocks” of 1973 and 1979 was surely a major factor.

But there can be little doubt that the American government deficit also plays a big role.

It sucks in liquid funds from all over into the “Black Hole” that the American budget has become and thus has already made the United States into the world’s major debtor country.

Indeed, it can be argued that it is the budget deficit which underlies the American trade and payments deficit.

A trade and payments deficit is, in effect, a loan from the seller of goods and services to the buyer, that is, to the United States.

Without it the administration could not possibly finance its budget deficit, or at least not without the risk of explosive inflation.




Altogether, the extent to which major countries have learned to use the international economy to avoid tackling disagreeable domestic problems is unprecedented: the United States, for example, by using high interest rates to attract foreign capital and thus avoiding facing up to its domestic deficit, or the Japanese through pushing exports to maintain employment despite a sluggish domestic economy.

And this “politicization” of the international economy is surely also a factor in the extreme volatility and instability of capital flows and exchange rates.




Whatever the causes, they have produced a basic change: In the world economy, the real economy of goods and services and the symbol economy of money, credit and capital are no longer tightly bound to each other, and are, indeed, moving further and further apart.




Traditional international economic theory is still neoclassical and holds that trade in goods and services determines international capital flows and foreign-exchange rates.

Capital flows and foreign-exchange rates these last ten or fifteen years have, however, moved quite independently of foreign trade and indeed (for instance, in the rise of the dollar in 1984/85) have run counter to it.




But the world economy also does not fit the Keynesian model in which the symbol economy determines the real economy.

And the relationship between the turbulences in the world economy and the domestic economies has become quite obscure.

Despite its unprecedented trade deficit, the United States has, for instance, had no deflation and has barely been able to keep inflation in check.

Despite its trade deficit, the United States also has the lowest unemployment rate of any major industrial country, next to Japan.

The U.S. rate is lower, for instance, than that of West Germany, whose exports of manufactured goods and trade surpluses have been growing as fast as those of Japan.

Conversely, despite the exponential growth of Japanese exports and an unprecedented Japanese trade surplus, the Japanese domestic economy is not booming but has remained remarkably sluggish and is not generating any new jobs.




What is the outcome likely to be?

Economists take it for granted that the two, the real economy and the symbol economy, must come together again.

They do disagree, however—and quite sharply—about whether they will do so in a “soft landing” or in a head-on collision.




The soft-landing scenario—the Reagan administration is committed to it, as are the governments of most of the other developed countries—expects the U.S. government deficit and the U.S. trade deficit to go down together until both attain surplus, or at least balance, sometime in the early 1990s.

And then capital flows and exchange rates would both stabilize, with production and employment high and inflation low in major developed countries.




In sharp contrast to this is the “hard-landing” scenario.

With every deficit year the indebtedness of the U.S. government goes up, and with it the interest charges on the U.S. budget, which in turn raises the deficit even further.

Sooner or later, the argument goes, this then must undermine foreign confidence in America and the American dollar: some authorities consider this practically imminent.

Then foreigners stop lending money to the United States.

Indeed, they try to convert the dollars they hold into other currencies.

The resulting “flight from the dollar” brings the dollar’s exchange rates crashing down.

It also creates an extreme credit crunch, if not a “liquidity crisis,” in the United States.

The only question is whether the result will be a deflationary depression in the United States, a renewed outbreak of severe inflation, or, the most dreaded affliction, stagflation, that is, both a deflationary, stagnant economy and an inflationary currency.




There is, however, also a totally different “hard-landing” scenario, one in which it is Japan rather than the United States that faces a hard—a very hard—landing.

For the first time in peacetime history the major debtor, the United States, owes its foreign debt in its own currency.

To get out of its debt it does not need to repudiate, to declare a moratorium, or to negotiate a rollover.

All it has to do is to devalue its currency, and the foreign creditor has effectively been expropriated.




For foreign creditor read Japan.

The Japanese by now hold about half of the dollars the United States owes foreigners.

In addition, practically all their other claims on the outside world are in dollars, largely because the Japanese have so far resisted all attempts to make the yen an international trading currency lest the government lose control over it.

Altogether, the Japanese banks now hold more international assets than do the banks of any other country, including the United States.

And practically all these assets are in U.S. dollars—640 billion of them!

A devaluation of the U.S. dollar thus falls most heavily on the Japanese and immediately expropriates them.




But also, the Japanese might be the main sufferers of a hard landing in their trade and their domestic economy.

By far the largest part of Japan’s exports go to the United States.

If there is a hard landing, the United States might well turn protectionist almost overnight; it is unlikely that we would let in large volumes of imported goods were our unemployment rate to soar.

But this would immediately cause severe unemployment in Tokyo and Nagoya and Hiroshima and might indeed set off a true depression in Japan.




There is still another hard-landing scenario.

In it neither the United States nor Japan—nor the industrial economies altogether—experiences the hard landing; this will be suffered by the already depressed primary-products producers.

Practically all primary materials are traded in dollars; thus, their prices may not go up at all should the dollar be devalued.

They actually went down when the dollar plunged by 30 percent between June 1985 and January 1986.

Japan may thus be practically unaffected by a dollar devaluation; all she needs her dollar balances for, after all, is to pay for primary-products imports, as she buys little else on the outside and has no foreign debt.

The United States, too, may not suffer, and may even benefit as American industrial exports become more competitive.

But while the primary producers sell mainly in dollars, they have to pay in other developed-nations currencies for a large part of their industrial imports.

The United States, after all, although the world’s leading exporter of industrial goods, still accounts for one-fifth only of the industrial goods on the world market.

Four-fifths are furnished by others—the Germans, the Japanese, the French, the British, and so on.

Their prices in U.S. dollars are likely to go up.

This then might bring on a further deterioration in the terms of trade of the already depressed primary producers.

Some estimates of the possible drop go as high as 10 percent, which would entail considerable hardship for metal mines in South America and Rhodesia, and also for farmers in Canada, Kansas, or Brazil.




There is, however, one more possible scenario.

And it involves no “landings,” whether soft or hard.

What if the economists were wrong and both American budget deficit and American trade deficit could go on and on, albeit perhaps at lower levels than in recent years?

This would happen if the outside world’s willingness to put its money into the United States were based on other than purely economic considerations—on their own internal domestic politics, for instance, or simply on escaping political risks at home that appear to be far worse than a U.S. devaluation.




Actually, this is the only scenario that is so far supported by hard facts rather than by theory.

Indeed, it is already being played out.




The U.S. government forced down the dollar by a full third (from a rate of 250 to a rate of 180 yen to the dollar) between June 1985 and February 1986—one of the most massive devaluations ever of a major currency, though called a readjustment.
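
The size of that move can be checked from the yen rates just quoted. A minimal sketch; note that the percentage depends on which currency is taken as the yardstick:

before, after = 250.0, 180.0             # yen per dollar, June 1985 and February 1986
dollar_drop = (before - after) / before  # the dollar's fall measured in yen: 28 percent
yen_rise = before / after - 1.0          # the yen's rise measured in dollars: about 39 percent

print(f"{dollar_drop:.0%} {yen_rise:.0%}")   # 28% 39% -- "a full third" sits between the two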

America’s creditors unanimously supported this devaluation and indeed demanded it.

More amazing still, they have since increased their loans to the United States, and substantially so.

There is agreement, apparently, among international bankers that the United States is the more creditworthy the more the lender stands to lose by lending to it!




And a major reason for this Alice in Wonderland attitude is that our biggest creditors, the Japanese, clearly prefer even very heavy losses on their dollar holdings to domestic unemployment.

For without the exports to the United States, Japan might have unemployment close to that of Western Europe, that is, at a rate of 9 to 11 percent, and concentrated in the politically most sensitive smokestack industries in which Japan is becoming increasingly vulnerable to competition by newcomers, such as South Korea.




Similarly, economic conditions alone will not induce the Hong Kong Chinese to withdraw the money they have transferred to American banks in anticipation of Hong Kong’s “return” to Red China in 1997—and these deposits amount to billions.

The even larger amounts, at least several hundred billions, of “flight capital” from Latin America that have found refuge in the U.S. dollar, will also not be lured away by purely economic incentives, such as higher interest rates.




The sum needed from the outside to keep both a huge U.S. budget deficit and a huge U.S. trade deficit going would be far too big to make this scenario more than a possibility.

Still, if political factors are in control, then the symbol economy is indeed truly uncoupled from the real economy, at least in the international sphere.




And whichever scenario proves right, none promises a return to “normality” of any kind.




«§§§»


One implication of the drifting apart of symbol and real economy is that from now on the exchange rates between major currencies will have to be treated in economic theory and business policy alike as a “comparative-advantage” factor, and as a major one to boot.




Economic theory teaches that the comparative-advantage factors of the real economy—comparative labor costs and labor productivity, raw-materials costs, energy costs, transportation costs, and the like—determine exchange rates.

And practically all businesses base their policies on this theorem.

Increasingly, however, exchange rates decide how labor costs in country A compare to labor costs in country B. Increasingly, exchange rates are a major comparative cost and one totally beyond business control.

And then, any firm at all exposed to the international economy has to realize that it is in two businesses at the same time.

It is both a maker of goods (or a supplier of services) and a financial business.

It cannot disregard either.

Specifically, the business that sells abroad—whether as an exporter or through subsidiaries in foreign countries—will have to protect itself against foreign-exchange exposure in respect to all three: proceeds from sales, working capital devoted to manufacturing for overseas markets, and investments abroad.

This will have to be done whether the business expects the value of its own currency to go up or to go down.

Businesses that buy abroad will have to do the same.

Indeed, even purely domestic businesses that face foreign competition in their home market will have to learn to hedge against the currency in which their main competitors produce.

If American businesses had been run that way during the years of the overvalued dollar, that is, from 1982 through 1985, most of the losses in market standing abroad and in foreign earnings might have been prevented.

These were management failures rather than acts of God.

Surely stockholders, but also the public in general, have every right to expect managements to do better the next time around.
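
The mechanics of such hedging can be made concrete with a minimal sketch. Every number below is hypothetical; the point is only how a forward contract converts an uncertain exchange rate into a known one:

proceeds_dm = 1_000_000      # hypothetical export sale, payable in D-Marks in 90 days
forward_rate = 0.39          # hypothetical dollars per D-Mark obtainable today for 90-day delivery
spot_at_settlement = 0.33    # hypothetical spot rate when payment actually arrives

unhedged_usd = proceeds_dm * spot_at_settlement  # $330,000 -- the exporter absorbs the currency move
hedged_usd = proceeds_dm * forward_rate          # $390,000 -- locked in when the sale was booked

print(unhedged_usd, hedged_usd)

Had the home currency moved the other way, the hedge would have forgone a windfall instead; predictable earnings, not speculation, are what the hedge buys.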




«§§§»


In respect to government policy there is one conclusion: Don’t be clever.

It is tempting to exploit the ambiguity, instability, and uncertainty of the world economy to gain short-term advantages and to duck unpopular political decisions.

But it does not work.

Indeed—and this is the lesson of all three of the attempts made so far—disaster is a more likely outcome than success.




The Carter administration pushed down the U.S. dollar to artificial lows to stimulate the American economy through the promotion of American exports.

American exports did indeed go up—spectacularly so.

But far from stimulating the domestic economy, this depressed it and resulted in simultaneous record unemployment and accelerated inflation, the worst of all possible outcomes.




Mr. Reagan then, a few years later, pushed up interest rates to stop inflation and also pushed up the dollar.

This did indeed stop inflation.

It also triggered massive inflows of capital.

But it so overvalued the dollar as to create a surge of foreign imports.

As a result, the Reagan policy exposed the most vulnerable of the old smokestack industries, such as steel and automotive, to competition they could not possibly meet with a dollar exchange rate of 250 yen to the dollar (or a D Mark rate of three to the dollar).

And it deprived them of the earnings they needed to modernize themselves.

Also, the policy seriously damaged, perhaps irreversibly, the competitive position of American farm products in the world markets, and at the worst possible time.

Worse still, his “cleverness” defeated Mr. Reagan’s major purpose: the reduction of the U.S. government deficit.

Because of the losses to foreign competition, domestic industry did not grow enough to produce higher tax revenues.

Yet the easy and almost unlimited availability of foreign money enabled the Congress (and the administration) to postpone again and again action to cut the deficit.




The Japanese, too, may have been too clever in their attempt to exploit the disjunction between the international symbol economy and the international real economy.

Exploiting an undervalued yen, the Japanese have been pushing exports, a policy quite reminiscent of America under the Carter administration.

But, as earlier in America, the Japanese policy failed to stimulate the domestic economy; it has been barely growing these last few years, despite the export boom.

As a result, the Japanese, as mentioned earlier, have become dangerously overdependent on one customer, the United States.

And this has forced them to invest huge sums in American dollars, even though every thoughtful Japanese (including, of course, the Japanese government and the Japanese Central Bank) knew all along that these claims would end up being severely devalued.




Surely these three lessons should have taught us that government policies in the world economy will succeed to the extent to which they try to harmonize the needs of the two economies, rather than to the extent to which they try to exploit the disharmony between them.

Or to repeat very old wisdom: “In finance don’t be clever; be simple and conscientious.”

But, I am afraid, this is advice that governments are not likely to heed soon.

Conclusion

It is much too early even to guess what the world economy of tomorrow will look like.

Will major countries, for instance, succumb to the traditional fear reaction—that is, retreat into protectionism—or will they see a changed world economy as an opportunity?




Some of the main agenda items are, however, pretty clear by now.




High among them will be the formulation of new development concepts and new development policies, especially on the part of the rapidly industrializing countries such as Mexico or Brazil.

They can no longer hope to finance their development by raw-materials exports, for example, Mexican petroleum.

But it is also becoming unrealistic for them to believe that their low labor costs will enable them to export large quantities of finished goods to the developed countries—which is what the Brazilians, for instance, still expect.

They would do much better to go into production sharing, that is, to use their labor advantage to become subcontractors to developed-country manufacturers for highly labor-intensive work that cannot be automated—some assembly operation, for instance, or parts and components needed in relatively small quantities only.

Developed countries simply do not have the labor anymore to do such work.

Yet even with the most thorough automation it will still account for 15 or 20 percent of manufacturing work.




Such production sharing is, of course, how the noncommunist Chinese of Southeast Asia—Singapore, Hong Kong, Taiwan—bootstrapped their development.

Yet in Latin America production sharing is still politically quite unacceptable and, indeed, anathema.

Mexico, for instance, has been deeply committed—since its beginnings as a modern nation in the early years of this century—to making its economy less dependent on, and less integrated with, that of its big neighbor to the north.

That this policy has been a total failure for eighty years has only strengthened its emotional and political appeal.




But even if production sharing is used to the fullest, it would not by itself provide enough income to fuel development, especially of countries so much larger than Chinese city-states.

We thus need a new model and new policies.

Can we, for instance, learn something from India?

Everyone knows, of course, of India’s problems—and they are legion.

Few people seem to know, however, that India, since independence, has done a better development job than almost any other Third World country: 

the fastest increase in farm production and farm yields; 

a growth rate in manufacturing production equal to that of Brazil, and perhaps even of South Korea (India now has a bigger industrial economy than any but a handful of developed countries!); 

the emergence of a large and highly entrepreneurial middle class; 

and, arguably the greatest achievement, progress in providing both schooling and health care in the villages.

Yet the Indians followed none of the established models.

They did not, like Stalin, Mao, and so many of the Africans, despoil the peasants to produce capital for industrial development.

They did not export raw materials.

And they did not export the products of cheap labor.

But ever since Nehru’s death in 1964 India has encouraged and rewarded farm productivity and sponsored consumer-goods production and local entrepreneurs.

India and her achievement are bound to get far more attention from now on than they have received.




The developed countries, too, need to think through their policies in respect to the Third World—and especially in respect to the hopes of the Third World, the rapidly industrializing countries.

There are some beginnings: the new U.S. proposals for the debts of the primary-products countries that U.S. Treasury Secretary Baker recently put forth, or the new lending criteria which the World Bank recently announced and under which loans to Third World countries from now on will be made conditional on a country’s overall development policies rather than based mainly on the soundness of individual projects.

But these proposals are so far aimed more at correcting past mistakes than at developing new policies.




The other major agenda item is, inevitably, going to be the international monetary system.

Since the Bretton Woods Conference at the end of World War II, it has been based on the U.S. dollar as the “reserve currency.”

This clearly does not work anymore.

The reserve-currency country must be willing to subordinate its domestic policies to the needs of the international economy, for instance, risk domestic unemployment to keep currency rates stable.

And when it came to the crunch, the United States refused to do so, as Keynes, by the way, predicted forty years ago.




The stability the reserve currency was supposed to supply could be established today only if the major trading countries—at a minimum the United States, West Germany, and Japan—agreed to coordinate their economic, fiscal, and monetary policies, if not to subordinate them to joint, and that would mean supernational, decision making.

Is such a development even conceivable, except perhaps in the event of worldwide financial collapse?

The European experience with the far more modest European Currency Unit (ECU) is not encouraging; so far, no European government has been willing to yield an inch for the sake of the ECU.

But what else could be done?

Or have we come to the end of the 300-year-old attempt to regulate and stabilize money on which, in the last analysis, both the modern national state and the international system are largely based?




Finally, there is one conclusion: Economic dynamics have decisively shifted to the world economy.




Prevailing economic theory—whether Keynesian, monetarist, or supply-side—considers the national economy, especially that of the large developed countries, to be autonomous and the unit of both economic analysis and economic policy.

The international economy may be a restraint and a limitation, but it is not central, let alone determining.

This “macroeconomic axiom” of the modern economist has become increasingly shaky.

The two major developed countries that fully subscribe to it in their economic policies, Great Britain and the United States, have done least well economically in the last thirty years and have also had the most economic instability.

West Germany and Japan never accepted the macroeconomic axiom.

Their universities teach it, of course.

But their policymakers, both in government and in business, reject it.

Instead, both have all along based their economic policies on the world economy, have systematically tried to anticipate its trends, and to exploit its changes as opportunities.

Above all, both make the country’s competitive position in the world economy the first priority in their policies—economic, fiscal, monetary, and largely even social—to which domestic considerations are normally subordinated.

And these two countries have, of course, done far better, both economically and socially, than Great Britain and the United States these last thirty years.

In fact, their focus on the world economy and the priority they give it may be the real “secret” of their success.




Similarly the secret of successful businesses in the developed world—the Japanese, the German carmakers like Mercedes and BMW, ASEA and Ericsson in Sweden, IBM and Citibank in the United States, but equally of a host of medium-size specialists in manufacturing and in all kinds of services—has been that they base their plans and their policies on exploiting the world economy’s changes as opportunities.




From now on any country—but also any business, especially a large one—that wants to do well economically will have to accept that it is the world economy that leads and that domestic economic policies will succeed only if they strengthen, or at least not impair, the country’s international competitive position.




This may be the most important—it surely is the most striking—feature of the changed world economy.

[1986]

 

line

 

Modern Prophets: Schumpeter or Keynes?

The two greatest economists of this century, Joseph A. Schumpeter and John Maynard Keynes, were born, only a few months apart, a hundred years ago: Schumpeter on February 8, 1883, in a provincial Austrian town; Keynes on June 5, 1883, in Cambridge, England.

(And they died only four years apart—Schumpeter in Connecticut on January 8, 1950, Keynes in southern England on April 21, 1946.)

The centenary of Keynes’s birth is being celebrated with a host of books, articles, conferences, and speeches.

If the centenary of Schumpeter’s birth were noticed at all, it would be in a small doctoral seminar.

And yet it is becoming increasingly clear that it is Schumpeter who will shape the thinking and inform the questions on economic theory and economic policy for the rest of this century, if not for the next thirty or fifty years.




«§§§»


The two men were not antagonists.

Both challenged longstanding assumptions.

The opponents of Keynes were the very “Austrians” Schumpeter himself had broken away from as a student, the neoclassical economists of the Austrian School.

And although Schumpeter considered all of Keynes’s answers wrong, or at least misleading, he was a sympathetic critic.

Indeed, it was Schumpeter who established Keynes in America.

When Keynes’s masterpiece, The General Theory of Employment, Interest and Money, came out in 1936, Schumpeter, by then the senior member of the Harvard economics faculty, told his students to read the book and told them also that Keynes’s work had totally superseded his own earlier writings on money.




Keynes, in turn, considered Schumpeter one of the few contemporary economists worthy of his respect.

In his lectures he again and again referred to the works Schumpeter had published during World War I, and especially to Schumpeter’s essay on the Rechenpfennige (that is, money of account) as the initial stimulus for his own thoughts on money.

Keynes’s most successful policy initiative, the proposal that Britain and the United States finance World War II by taxes rather than by borrowing, came directly out of Schumpeter’s 1918 warning of the disastrous consequences of the debt financing of World War I.




Schumpeter and Keynes are often contrasted politically, with Schumpeter being portrayed as the “conservative” and Keynes the “radical.”

The opposite is more nearly right.

Politically Keynes’s views were quite similar to what we now call “neoconservative.”

His theory had its origins in his passionate attachment to the free market and in his desire to keep politicians and governments out of it.

Schumpeter, by contrast, had serious doubts about the free market.

He thought that an “intelligent monopoly”—the American Bell Telephone system, for instance—had a great deal to recommend itself.

It could afford to take the long view instead of being driven from transaction to transaction by short-term expediency.

His closest friend for many years was the most radical and most doctrinaire of Europe’s left-wing socialists, the Austrian Otto Bauer, who, though staunchly anticommunist, was even more anticapitalist.

And Schumpeter, although never even close to being a socialist himself, served during 1919 as minister of finance in Austria’s only socialist government between the wars.

Schumpeter always maintained that Marx had been dead wrong in every one of his answers.

But he still considered himself a son of Marx and held him in greater esteem than any other economist.

At least, so he argued, Marx asked the right questions, and to Schumpeter questions were always more important than answers.




The differences between Schumpeter and Keynes go much deeper than economic theorems or political views.

The two saw different economic realities, were concerned with different problems, and defined economics quite differently.

These differences are highly important to an understanding of today’s economic world.




Keynes, for all that he broke with classical economics, operated entirely within its framework.

He was a heretic rather than an infidel.

Economics, for Keynes, was the equilibrium economics of Ricardo’s 1810 theories, which dominated the nineteenth century.

This economics deals with a closed system and a static one.

Keynes’s key question was the same question the nineteenth-century economists had asked: “How can one maintain an economy in balance and stasis?”




For Keynes, the main problems of economics are the relationship between the “real economy” of goods and services and the “symbol economy” of money and credit; the relationship between individuals and businesses and the “macroeconomy” of the nation-state; and finally, whether production (that is, supply) or consumption (that is, demand) provides the driving force of the economy.

In this sense Keynes was in a direct line with Ricardo, John Stuart Mill, the “Austrians,” and Alfred Marshall.

However much they differed otherwise, most of these nineteenth-century economists, and that includes Marx, had given the same answers to these questions: the “real economy” controls, and money is only the “veil of things”; the micro-economy of individuals and businesses determines, and government can, at best, correct minor discrepancies and, at worst, create dislocations; and supply controls, with demand a function of it.




«§§§»


Keynes asked the same questions that Ricardo, Mill, Marx, the “Austrians,” and Marshall had asked but, with unprecedented audacity, turned every one of the answers upside down.

In the Keynesian system, the “symbol economy” of money and credit is “real,” with goods and services dependent on it and, indeed, its shadows.

The macroeconomy—the economy of the nation-state—is everything, with individuals and firms having neither power to influence, let alone to direct, the economy nor the ability to make effective decisions counter to the forces of the macroeconomy.

And economic phenomena—capital formation, productivity, and employment—are functions of demand.




By now we know, as Schumpeter knew fifty years ago, that every one of these Keynesian answers is the wrong answer.

At least they are valid only for special cases and within fairly narrow ranges.

Take, for instance, Keynes’s key theorem: that monetary events—government deficits, interest rates, credit volume, and volume of money in circulation—determine demand and with it economic conditions.

This assumes, as Keynes himself stressed, that the turnover velocity of money is constant and not capable of being changed over the short term by individuals or firms.

Schumpeter pointed out fifty years ago that all evidence negates this assumption.

And indeed, whenever tried, Keynesian economic policies, whether in the original Keynesian or in the modified Friedman version, have been defeated by the micro-economy of business and individuals, unpredictably and without warning, changing the turnover velocity of money almost overnight.




«§§§»


When the Keynesian prescriptions were initially tried—in the United States in the early New Deal days—they seemed at first to work.

But then, around 1935 or so, consumers and businesses suddenly sharply reduced the turnover velocity of money within a few short months, which aborted a recovery based on government deficit spending and brought about a second collapse of the stock market in 1937.

The best example, however, is what happened in this country in 1981 and 1982.

The Federal Reserve’s purposeful attempt to control the economy by controlling the money supply was largely defeated by consumers and businesses who suddenly and almost violently shifted deposits from thrifts into money-market funds and from long-term investments into liquid assets—that is, from low-velocity into high-velocity money—to the point where no one could really tell anymore what the money supply is or even what the term means.

Individuals and businesses seeking to optimize their self-interest and guided by their perception of economic reality will always find a way to beat the “system”—whether, as in the Soviet bloc, through converting the entire economy into one gigantic black market or, as in the United States in 1981 and 1982, through transforming the financial system overnight despite laws, regulations, or economists.




This does not mean that economics is likely to return to pre-Keynesian neoclassicism.

Keynes’s critique of the neoclassic answers is as definitive as Schumpeter’s critique of Keynes.

But because we now know that individuals can and will defeat the system, we have lost the certainty which Keynes imposed on economics and which has made the Keynesian system the lodestar of economic theory and economic policy for fifty years.

Both Friedman’s monetarism and supply-side economics are desperate attempts to patch up the Keynesian system of equilibrium economics.

But it is unlikely that either can restore the self-contained, self-confident equilibrium economics, let alone an economic theory or an economic policy in which one factor, whether government spending, interest rates, money supply, or tax cuts, controls the economy predictably and with near-certainty.




That the Keynesian answers were not going to prove any more valid than the pre-Keynesian ones that they replaced was clear to Schumpeter from the beginning.

But to him this was much less important than that the Keynesian questions—the questions of Keynes’s predecessors as well—were not, Schumpeter thought, the important questions at all.

To him the basic fallacy was the very assumption that the healthy, the “normal,” economy is an economy in static equilibrium.

Schumpeter, from his student days on, held that a modern economy is always in dynamic disequilibrium.

Schumpeter’s economy is not a closed system like Newton’s universe—or Keynes’s macroeconomy.

It is forever growing and changing and is biological rather than mechanistic in nature.

If Keynes was a “heretic,” Schumpeter was an “infidel.”




Schumpeter was himself a student of the great men of Austrian economics, at a time when Vienna was the world capital of economic theory.

He held his teachers in lifelong affection.

But his doctoral dissertation—it became the earliest of his great books, The Theory of Economic Development (which in its original German version came out in 1911, when Schumpeter was only twenty-eight years old)—starts out with the assertion that the central problem of economics is not equilibrium but structural change.

This then led to Schumpeter’s famous theorem of the innovator as the true subject of economics.




«§§§»


Classical economics considered innovation to be outside the system, as Keynes did, too.

Innovation belonged in the category of “outside catastrophes” like earthquakes, climate, or war, which, everybody knew, have profound influence on the economy but are not part of economics.

Schumpeter insisted that, on the contrary, innovation—that is, entrepreneurship that moves resources from old and obsolescent to new and more productive employments—is the very essence of economics and most certainly of a modern economy.




He derived this notion, as he was the first to admit, from Marx.

But he used it to disprove Marx.

Schumpeter’s Economic Development does what neither the classical economists nor Marx nor Keynes was able to do: It makes profit fulfill an economic function.

In the economy of change and innovation, profit, in contrast to Marx and his theory, is not a Mehrwert, a “surplus value” stolen from the workers.

On the contrary, it is the only source of jobs for workers and of labor income.

The theory of economic development shows that no one except the innovator makes a genuine “profit”; and the innovator’s profit is always quite short-lived.

But innovation, in Schumpeter’s famous phrase, is also “creative destruction.”

It makes obsolete yesterday’s capital equipment and capital investment.

The more an economy progresses, the more capital formation will it therefore need.

Thus what the classical economist—or the accountant or the stock exchange—considers “profit” is a genuine cost, the cost of staying in business, the cost of a future in which nothing is predictable except that today’s profitable business will become tomorrow’s white elephant.

Thus, capital formation and productivity are needed to maintain the wealth-producing capacity of the economy and, above all, to maintain today’s jobs and to create tomorrow’s jobs.




«§§§»


Schumpeter’s “innovator” with his “creative destruction” is the only theory so far to explain why there is something we call “profit.”

The classical economists very well knew that their theory did not give any rationale for profit.

Indeed, in the equilibrium economics of a closed economic system there is no place for profit, no justification for it, no explanation of it.

If profit is, however, a genuine cost, and especially if profit is the only way to maintain jobs and to create new ones, then capitalism becomes again a moral system.




Morality and profits: The classical economists had pointed out that profit is needed as the incentive for the risk taker.

But is this not really a bribe and thus impossible to justify morally?

This dilemma had driven the most brilliant of the nineteenth-century economists, John Stuart Mill, to embrace socialism in his later years.

It had made it easy for Marx to fuse dispassionate analysis of the “system” with the moral revulsion of an Old Testament prophet against the exploiters.

The weakness on moral grounds of the profit incentive enabled Marx at once to condemn the capitalist as wicked and immoral and assert “scientifically” that he serves no function and that his speedy demise is “inevitable.”

As soon, however, as one shifts from the axiom of an unchanging, self-contained, closed economy to Schumpeter’s dynamic, growing, moving, changing economy, what is called profit is no longer immoral.

It becomes a moral imperative.

Indeed, the question then is no longer the question that agitated the classicists and still agitated Keynes: How can the economy be structured to minimize the bribe of the functionless surplus called profit that has to be handed over to the capitalist to keep the economy going?

The question in Schumpeter’s economics is always, Is there sufficient profit?

Is there adequate capital formation to provide for the costs of the future, the costs of staying in business, the costs of “creative destruction”?




«§§§»


This alone makes Schumpeter’s economic model the only one that can serve as the starting point for the economic policies we need.

Clearly the Keynesian—or classicist—treatment of innovation as being “outside,” and in fact peripheral to, the economy and with minimum impact on it, can no longer be maintained (if it ever could have been).

The basic question of economic theory and economic policy, especially in highly developed countries, is clearly: How can capital formation and productivity be maintained so that rapid technological change as well as employment can be sustained?

What is the minimum profit needed to defray the costs of the future?

What is the minimum profit needed, above all, to maintain jobs and to create new ones?




Schumpeter gave no answer; he did not much believe in answers.

But seventy years ago, as a very young man, he asked what is clearly going to be the central question of economic theory and economic policy in the years to come.




And then, during World War I, Schumpeter realized, long before anyone else—and a good ten years before Keynes did—that economic reality was changing.

He realized that World War I had brought about the monetarization of the economies of all belligerents.

Country after country, including his own still fairly backward Austria-Hungary, had succeeded during the war in mobilizing the entire liquid wealth of the community, partly through taxation but mainly through borrowing.

Money and credit, rather than goods and services, had become the “real economy.”




In a brilliant essay published in a German economic journal in July 1918—when the world Schumpeter had grown up in and had known was crashing down around his ears—he argued that, from now on, money and credit would be the lever of control.

What he argued was that neither supply of goods, as the classicists had argued, nor demand for goods, as some of the earlier dissenters had maintained, was going to be controlling anymore.

Monetary factors—deficits, money, credit, taxes—were going to be the determinants of economic activity and of the allocation of resources.




This is, of course, the same insight on which Keynes later built his General Theory.

But Schumpeter’s conclusions were radically different from those Keynes reached.

Keynes came to the conclusion that the emergence of the symbol economy of money and credit made possible the “economist-king,” the scientific economist, who by playing on a few simple monetary keys—government spending, the interest rate, the volume of credit, or the amount of money in circulation—would maintain permanent equilibrium with full employment, prosperity, and stability.

But Schumpeter’s conclusion was that the emergence of the symbol economy as the dominant economy opened the door to tyranny and, in fact, invited tyranny.

That the economist now proclaimed himself infallible, he considered pure hubris.

But, above all, he saw that it was not going to be economists who would exercise the power, but politicians and generals.




And then, in the same year, just before World War I ended, Schumpeter published The Tax State (“The Fiscal State” would be a better translation).

Again, the insight is the same one Keynes reached fifteen years later (and, as he often acknowledged, thanks to Schumpeter): The modern state, through the mechanisms of taxation and borrowing, has acquired the power to shift income and, through “transfer payments,” to control the distribution of the national product.

To Keynes this power was a magic wand to achieve both social justice and economic progress, and both economic stability and fiscal responsibility.

To Schumpeter—perhaps because he, unlike Keynes, was a student of both Marx and history—this power was an invitation to political irresponsibility, because it eliminated all economic safeguards against inflation.

In the past the inability of the state to tax more than a very small proportion of the gross national product, or to borrow more than a very small part of the country’s wealth, had made inflation self-limiting.

Now the only safeguard against inflation would be political, that is, self-discipline.

And Schumpeter was not very sanguine about the politician’s capacity for self-discipline.




Schumpeter’s work as an economist after World War I is of great importance to economic theory.

He became one of the fathers of business cycle theory.




«§§§»


But Schumpeter’s real contribution during the thirty-two years between the end of World War I and his death in 1950 was as a political economist.

In 1942, when everyone was scared of a worldwide deflationary depression, Schumpeter published his best-known book, Capitalism, Socialism and Democracy, still, and deservedly, read widely.

In this book he argued that capitalism would be destroyed by its own success.

This would breed what we would now call the new class: bureaucrats, intellectuals, professors, lawyers, journalists, all of them beneficiaries of capitalism’s economic fruits and, in fact, parasitical on them, and yet all of them opposed to the ethos of wealth production, of saving, and of allocating resources to economic productivity.

The forty years since this book appeared have surely proved Schumpeter to be a major prophet.




And then he proceeded to argue that capitalism would be destroyed by the very democracy it had helped create and made possible.

For in a democracy, to be popular, government would increasingly shift income from producer to nonproducer, would increasingly move income from where it would be saved and become capital for tomorrow to where it would be consumed.

Government in a democracy would thus be under increasing inflationary pressure.

Eventually, he prophesied, inflation would destroy both democracy and capitalism.




When he wrote this in 1942, almost everybody laughed.

Nothing seemed less likely than an inflation based on economic success.

Now, forty years later, this has emerged as the central problem of democracy and of a free-market economy alike, just as Schumpeter had prophesied.




The Keynesians in the 1940s ushered in their “promised land,” in which the economist-king would guarantee the perfect equilibrium of an eternally stable economy through control of money, credit, spending, and taxes.

Schumpeter, however, increasingly concerned himself with the question of how the public sector could be controlled and limited so as to maintain political freedom and an economy capable of performance, growth, and change.

When death overtook him at his desk, he was revising the presidential address he had given to the American Economic Association only a few days earlier.

The last sentence he wrote was “The stagnationists are wrong in their diagnosis of the reason the capitalist process should stagnate; they may still turn out to be right in their prognosis that it will stagnate—with sufficient help from the public sector.”




Keynes’s best-known saying is surely “In the long run we are all dead.”

This is one of the most fatuous remarks ever made.

Of course, in the long run we are all dead.

But Keynes in a wiser moment remarked that the deeds of today’s politicians are usually based on the theorems of long-dead economists.

And it is a total fallacy that, as Keynes implies, optimizing the short term creates the right long-term future.

Keynes is in large measure responsible for the extreme short-term focus of modern politics, of modern economics, and of modern business—the short-term focus that is now, with considerable justice, considered a major weakness of American policymakers, both in government and in business.




«§§§»


Schumpeter also knew that policies have to fit the short term.

He learned this lesson the hard way—as minister of finance in the newly formed Austrian republic in which he, totally unsuccessfully, tried to stop inflation before it got out of hand.

He knew that he had failed because his measures were not acceptable in the short term—the very measures that, two years later, a noneconomist, a politician and professor of moral theology, did apply to stop the inflation, but only after it had all but destroyed Austria’s economy and middle class.




But Schumpeter also knew that today’s short-term measures have long-term impacts.

They irrevocably make the future.

Not to think through the futurity of short-term decisions and their impact long after “we are all dead” is irresponsible.

It also leads to the wrong decisions.

It is this constant emphasis in Schumpeter on thinking through the long-term consequences of the expedient, the popular, the clever, and the brilliant that makes him a great economist and the appropriate guide for today, when short-run, clever, brilliant economics—and short-run, clever, brilliant politics—have become bankrupt.




In some ways, Keynes and Schumpeter replayed the best-known confrontation of philosophers in the Western tradition—the Platonic dialogue between Parmenides, the brilliant, clever, irresistible sophist, and the slow-moving and ugly, but wise Socrates.

No one in the interwar years was more brilliant, more clever than Keynes.

Schumpeter, by contrast, appeared pedestrian—but he had wisdom.

Cleverness carries the day.

But wisdom endureth.

[1983]

 

line

 

The Information-Based Organization

The “office of the future” is still largely speculation.

But the organization of the future is rapidly becoming reality—a structure in which information serves as the axis and as the central structural support.

A number of businesses—Citibank, for instance, in the United States; Massey-Ferguson, the Canadian multinational tractor maker; and some of the large Japanese trading companies—are busily reshaping their managerial structure around the flow of information.

And wherever we have been moving into genuine automation of manufacturing production, as in the Erie, Pennsylvania, locomotive plant of General Electric, we are finding that we have to restructure management and redesign it as an information-based organization.

 

The organization chart of an information-based system may look perfectly conventional.

Yet such an organization behaves quite differently and requires different behavior from its members.

 

The information-based structure is flat, with far fewer levels of management than conventional ones require.

When a large multinational manufacturer restructured itself around information and its flow, it found that seven of its twelve levels of management could be cut out.

Similarly, in automated plants, for example, the Nissan auto assembly plant outside of Yokohama, Japan, and the GE locomotive plant in Erie, most of the traditional management layers between first-line supervisor and plant manager have disappeared.

 

These levels, it turns out, were not levels of authority, of decision making, or even of supervision.

They were relays for information, similar in function to the boosters on a telephone cable, which collect, amplify, repackage, and send on information—all tasks that an impersonal “information system” can do better.

This pertains in particular to management levels that “coordinate” rather than “do”—group executives, or assistants to, or regional sales managers.

But such levels of management as remain in information-based organizations find themselves with far bigger, far more demanding, and far more responsible jobs.

This is true particularly in respect to the first-level supervisor in the automated plant.

 

The information-based structure makes irrelevant the famous principle of the span of control, according to which the number of subordinates who can report to one superior is strictly limited, with five or six being the upper limit.

Its place is taken by a new principle—I call it the span of communications: The number of people reporting to one boss is limited only by the subordinates’ willingness to take responsibility for their own communications and relationships, upward, sideways, and downward.

“Control,” it turns out, is the ability to obtain information.

And an information system provides that in depth, and with greater speed and accuracy than reporting to the boss can possibly do.

 

The information-based organization does not actually require advanced “information technology.”

All it requires is willingness to ask, Who requires what information, when and where?

 

The British asked those questions in India two hundred years ago and came up with the world’s flattest organization structure, in which four levels of management, staffed by fewer than a thousand Britons—most of them youngsters barely out of their teens and “lower-middle management”—efficiently ruled a subcontinent.

 

But when a company builds its organization around modern information technology, it must ask these questions.

And then management positions and management layers whose main duty it has been to report rather than to do can be scrapped.

 

At the same time, however, the information-based structure permits, indeed it often requires, far more “soloists” with far more and different specializations in all areas, from technical and research people to service professionals taking care of special groups of customers.

Citibank, for instance, recently appointed a senior vice-president in New York headquarters to take care of the bank’s major Japanese customers and their financial needs anyplace in the world.

This man is not the “boss” of the bank’s large branches in Japan.

But he is not “service” staff either.

He is very definitely “line.”

He is a soloist and expected to function somewhat the way the pianist playing a Beethoven concerto is expected to function.

And both he and the “orchestra” around him, that is, the rest of the bank, can function only because both “know the score.”

It is information rather than authority that enables them to support one another.

 

Automated manufacturing plants have likewise found that they need a good many quality-assurance specialists.

These people, though very senior, hold no rank.

They are not in the chain of command.

Yet they take over as a kind of “pinch-hitting” super-boss whenever any process within the plant runs into quality problems.

 

The information-based system also allows for far greater diversity.

It makes it possible, for instance, to have within the same corporate structure purely managerial units, charged with optimizing what exists, and entrepreneurial units, charged with making obsolete what exists and with creating a different tomorrow.

 

Traditional organization basically rests on command authority.

The flow is from the top down.

Information-based organization rests on responsibility.

The flow is circular from the bottom up and then down again.

The information-based system can therefore function only if each individual and each unit accepts responsibility: for their goals and their priorities, for their relationships, and for their communications.

Each has to ask, What should the company expect of me and hold me accountable for in terms of performance and contribution?

Who in the organization has to know and understand what I am trying to do so that both they and I can do the work?

On whom in the organization do I depend for what information, knowledge, specialized skill?

And who in turn depends on me for what information, knowledge, specialized skill?

Whom do I have to support and to whom, in turn, do I look for support?

 

The conventional organization of business was modeled after the military.

The information-based system much more closely resembles the symphony orchestra.

All instruments play the same score.

But each plays a different part.

They play together, but they rarely play in unison.

There are more violins, but the first violin is not the boss of the horns; indeed, the first violin is not even the boss of the other violins.

And the same orchestra can, within the short span of an evening, play five pieces of music, each completely different in its style, its scoring, and its solo instruments.

 

In the orchestra, however, the score is given to both players and conductor.

In business the score is being written as it is being played.

To know what the score is, everyone in the information-based organization has to manage by objectives that are agreed upon in advance and clearly understood.

Management by objectives and self-control is, of necessity, the integrating principle of the information-based structure.

 

The information-based organization thus requires high self-discipline.

This in turn makes possible fast decisions and quick response.

It permits both great flexibility and considerable diversity.

 

These advantages will be obtained only if there are understanding, shared values and, above all, mutual respect.

This probably rules out the finance-based diversification of the conglomerate.

If every player has to know the score, there has to be a common language, a common core of unity.

And this, experience has shown, is supplied only by a common market (for example, health-care providers or the housewife), or by a common technology.

Even with a traditional command-based system, diversification that rests primarily on financial control, as it does in the typical conglomerate, has never outlasted the tenure of its founder, whether ITT’s Harold Geneen or Gulf & Western’s Charles Bluhdorn.

But if the organization is information-based, diversification in which financial control is the only common language is bound to collapse in the confusion of the Tower of Babel.

 

The information-based organization is not permissive: it is disciplined.

It requires strong, decisive leadership; first-rate orchestra conductors are, without exception, unspeakably demanding perfectionists.

What makes a first-rate conductor is, however, the ability to make even the most junior instrument at the last desk way back play as if the performance of the whole depended on how each one of those instruments renders its small supporting part.

What the information-based organization requires, in other words, is leadership that respects performance but demands self-discipline and upward responsibility from the first-level supervisor all the way to top management.

 

line

 

Management: The Problems of Success

The best-kept secret in management is that the first systematic applications of management theory and management principles did not take place in business enterprise.

They occurred in the public sector.

The first systematic and deliberate application of management principles in the United States—undertaken with full consciousness of its being an application of management—was the reorganization of the U.S. Army by Elihu Root, Teddy Roosevelt’s secretary of war.

Only a few years later, in 1908, came the first “city manager” (in Staunton, Virginia), the result of a conscious application of such then-brand-new management principles as the separation of “policy” (lodged in an elected and politically accountable city council) from “management” (lodged in a nonpolitical professional, accountable managerially).

The city manager, by the way, was the first senior executive anyplace called a manager; in business, this title was still quite unknown.

Frederick W. Taylor, for instance, in his famous 1911 testimony before the U.S. Congress never used the term but spoke of “the owners and their helpers.”

And when Taylor was asked to name an organization that truly practiced “Scientific Management,” he did not name a business but the Mayo Clinic.




Thirty years after the first city manager, Luther Gulick applied management and management principles to the organization of a federal government that had grown out of control in the New Deal years.

It was not until 1950 and 1951, that is, more than ten years later, that similar management concepts and principles were systematically applied in a business enterprise to a similar task: the reorganization of the General Electric Company after it had outgrown its earlier, purely functional organization structure.




Today, surely, there is as much management outside of business as there is in business—maybe more.

The most management-conscious of our present institutions are probably the military, followed closely by hospitals.

Forty years ago the then-new management consultants considered only business enterprises as potential clients.

Today half of the clients of a typical management consulting firm are nonbusiness: government agencies, the military, schools and universities, hospitals, museums, professional associations, and community agencies like the Boy Scouts and the Red Cross.




And increasingly, holders of the advanced degree in Business Administration, the MBA, are the preferred recruits for careers in city management, in art museums, and in the federal government’s Office of Management and Budget.




Yet most people still hear the words business management when they hear or read management.

Management books often outsell all other nonfiction books on the bestseller lists; yet they are normally reviewed on the business page.

One “graduate business school” after another renames itself “School of Management.”

But the degree it awards has remained the MBA, the Master of Business Administration.

Management books, whether textbooks for use in college classes or books for the general reader, deal mainly with business and use business examples or business cases.




That we hear and read business management when the word management is spoken or printed has a simple explanation.

The business enterprise was not the first of the managed institutions.

The modern university and the modern army each antedate the modern business enterprise by a half century.

They emerged during and shortly after the Napoleonic Wars.

Indeed, the first “CEO” of a modern institution was the chief of staff of the post-Napoleonic Prussian army, an office developed between 1820 and 1840.

In spirit as well as in structure, both the new university and the new army represented a sharp break with their predecessors.

But both concealed this—deliberately—by using the old titles, many of the old rites and ceremonies and, especially, by maintaining the social position of the institution and of its leaders.




No one could, however, have mistaken the new business enterprise, as it arose in the third quarter of the nineteenth century, for a direct continuation of the old and traditional “business firm”—the “counting house” consisting of two elderly brothers and one clerk that figures so prominently in Charles Dickens’s popular books published in the 1850s and 1860s, and in so many other nineteenth-century novels, down to Thomas Mann’s Buddenbrooks published in 1901.




For one, the new business enterprises—the long-distance railroad as it developed in the United States after the Civil War, the Universal Bank as it developed on the European Continent, and the trusts such as United States Steel, which J. P. Morgan forged in the United States at the turn of the twentieth century—were not run by the “owners.”

Indeed, they had no owners, they had “shareholders.”

Legally, the new university or the new army was the same institution it had been since time immemorial, however much its character and function had changed.

But to accommodate the new business enterprise, a new and different legal persona had to be invented, the “corporation.”

A much more accurate term is the French Société Anonyme, the anonymous collective owned by no one and open to investment by everyone.

In the corporation, shares become a claim to profits rather than to property.

Share ownership is, of necessity, separate from control and management, and easily divorced from both.

And in the new corporation capital is provided by large, often by very large, numbers of outsiders, with each of them holding only a minute fraction and with none of them necessarily having an interest in, or—a total novelty—any liability for, the conduct of the business.




This new “corporation,” this new “Société Anonyme,” this new “Aktiengesellschaft,” could not be explained away as a reform, which is how the new army, the new university, and the new hospital presented themselves.

It clearly was a genuine innovation.

And this innovation soon came to provide the new jobs—at first, for the rapidly growing urban proletariat, but increasingly also for educated people.

It soon came to dominate the economy.

What in the older institutions could be explained as different procedures, different rules, or different regulations very soon became, in the new institution, a new function—management—and a new kind of work.

And this then invited study; it invited attention and controversy.




But even more extraordinary and unprecedented was the position of this newcomer in society.

It was the first new autonomous institution in hundreds of years, the first to create a power center that was within society yet independent of the central government of the national state.

This was an offense, a violation of everything the nineteenth century (and the twentieth-century political scientists still) considered the “law of history,” and frankly a scandal.




Around 1860 one of the leading social scientists of the time, the Englishman Sir Henry Maine, coined in his book Ancient Law the phrase that the progress of history is “from status to contract.”

Few phrases ever have become as popular and as widely accepted as this one.




And yet, at the very time at which Maine proclaimed that the law of history demands the elimination of all autonomous power centers within society, the business enterprise arose.

And from the beginning it was clearly a power center within society and clearly autonomous.




To many contemporaries it was, and understandably so, a totally unnatural development and one that bespoke a monstrous conspiracy.

The first great social historian America produced, Henry Adams, clearly saw it this way.

His important novel, Democracy, which he wrote during the Grant administration, portrays the new economic power as itself corrupt and, in turn, as corrupting the political process, government, and society.

Henry’s brother, Brooks Adams, a few decades later, further elaborated on this theme in one of the most popular political books ever published in the United States, The Degradation of the Democratic Dogma.




Similarly, the Wisconsin economist, John R. Commons—the brain behind the progressive movement in Wisconsin, the father of most of the “reforms” that later became the social and political innovations of the New Deal, and, last but not least, commonly considered the father of America’s “business unionism”—took very much the same tack.

He blamed business enterprise on a lawyers’ conspiracy leading to a misinterpretation of the Fourteenth Amendment to the Constitution by which the corporation was endowed with the same “legal personality” as the individual.




Across the Atlantic in Germany, Walter Rathenau—himself the successful chief executive of one of the very large new “corporations” (and later on to become one of the earliest victims of Nazi terror when he was assassinated in 1922 while serving as foreign minister of the new Weimar Republic)—similarly felt that the business enterprise was something radically new, something quite incompatible with prevailing political and social theories, and indeed a severe social problem.




In Japan, Shibusawa Eiichi, who had left a promising government career in the 1870s to construct a modern Japan through building businesses, also saw in the business enterprise something quite new and distinctly challenging.

He tried to tame it by infusing it with the Confucian ethic; and Japanese big business as it developed after World War II is very largely made in Shibusawa’s image.




Everyplace else, the new business enterprise was equally seen as a radical and dangerous innovation.

In Austria, for instance, Karl Lueger, the founding father of the “Christian” parties that still dominate politics in Continental Europe, was elected lord mayor of Vienna in 1897 on a platform that defended the honest and honorable small businessman—the shopkeeper and the craftsman—against the evil and illegitimate corporation.

A few years later, an obscure Italian journalist, Benito Mussolini, rose to national prominence by denouncing “the soulless corporation.”




And thus quite naturally, perhaps even inevitably, concern with management, whether hostile to it or friendly, concentrated on the business enterprise.

No matter how much management was being applied to other institutions, it was the business enterprise that was visible, prominent, controversial, and above all, new, and therefore significant.




By now, however, almost a hundred years after management arose in the early large business enterprises of the 1870s, it is clear that management pertains to every single social institution.

In the last hundred years every major social function has become lodged in a large and managed organization.

The hospital of 1870 was still the place where the poor went to die.

By 1950 the hospital had become one of the most complex organizations, requiring management of extraordinary competence.

The labor union in developed countries is run today by a paid managerial staff, rather than by the politicians who are nominally at the head.

Even the very large university of 1900 (and the largest then had only five thousand students) was still simple, with a faculty of, at most, a few hundred, each professor teaching his own specialty.

It has by now become increasingly complex—including undergraduate, graduate, and postgraduate students—with research institutes and research grants from government and industry and, increasingly, with a large administrative superstructure.

And in the modern military, the basic question is the extent to which management is needed and the extent to which it interferes with leadership—with management apparently winning out.




The identification of management with business can thus no longer be maintained.

Even though our textbooks and our studies still focus heavily on what goes on in a business—and typically, magazines having the word management in their title (for example, Britain’s Management Today or Germany’s Management Magazin) concern themselves primarily if not exclusively with what goes on in business enterprises—management has become the pervasive, the universal organ of a modern society.




For modern society has become a “society of organizations.”

The individual who conforms to what political and social theorists still consider the norm has become a small minority: the individual who stands in society directly and on his own, with no intermediary institution of which he is a member and an employee between himself and the sovereign government.

The overwhelming majority of all people in developed societies are employees of an organization; they derive their livelihood from the collective income of an organization, see their opportunity for career and success primarily as opportunity within an organization, and define their social status largely through their position within the ranks of an organization.

Increasingly, especially in the United States, the only way in which the individual can amass a little property is through the pension fund, that is, through membership in an organization.




And each of these organizations, in turn, depends for its functioning on management.

Management makes an organization out of what otherwise would be a mob.

It is the effective, integrating, life-giving organ.




In a society of organizations, managing becomes a key social function and management the constitutive, the determining, the differential organ of society.

The New Pluralism

The dogma of the “liberal state” is still taught in our university departments of government and in our law schools.

According to it, all organized power is vested in one central government.

But the society of organizations is a pluralist society.

In open defiance of the prevailing dogma, it contains a diversity of organizations and power centers.

And each has to have a management and has to be managed.

The business enterprise is only one; there are the labor unions and the farm organizations, the health-care institutions and the schools and universities, not to mention the media.

Indeed, even government is increasingly becoming a pluralist congeries of near-autonomous power centers, very different indeed from the branches of government of the American Constitution.

There is the civil service, for instance.

The last president of the United States who had effective control of the civil service was Franklin D. Roosevelt fifty years ago; in England it was Winston Churchill; in Russia, Stalin.

Since their time the civil service in all major countries has become an establishment in its own right.

And so, increasingly, has the military.




In the nineteenth century the “liberal state” had to admit the parties, though it did so grudgingly and with dire misgivings.

But the purpose of the parties was the conquest of government.

They were, so to speak, gears in the governmental machine and had neither existence nor justification outside of it.




No such purpose animates the institutions of the new pluralism.




The institutions of the old pluralism, that is, of medieval Europe or of medieval Japan (the princes and the feudal barons, the free cities, the artisans, the bishoprics and abbeys) were themselves governments.

Each indeed tried to annex as much of the plenitude of governmental power as it could get away with.

Each levied taxes and collected customs duties.

Each strove to be granted the right to make laws, and to establish and run its own law courts.

Each tried to confer knighthoods, patents of nobility, or titles of citizenship.

And each tried to obtain the most coveted right of them all, the right to mint its own coins.




But the purpose of today’s pluralist institution is nongovernmental: to make and to sell goods and services, to protect jobs and wages, to heal the sick, to teach the young, and so on.

Each only exists to do something that is different from what government does or, indeed, to do something so that government need not do it.




The institutions of the old pluralism also saw themselves as total communities.

Even the craft guild, the powerful woolen weavers of Florence, for instance, organized itself primarily to control its members.

Of course, weavers got paid for selling woolen goods to other people.

But their guild tried as hard as possible to insulate the members against economic impacts from the outside by severely restricting what could be made, how much of it, and how and at what price it could be sold, and by whom.

Every guild gathered its members into its own quarter in the city, over which it exerted governmental control.

Every one immediately built its own church with its own patron saint.

Every one immediately built its own school; there is still “Merchant Taylors’” in London.

Every one controlled access to membership in the guild.

If the institutions of the old pluralism had to deal with the outside at all, they did so as “foreign relations” through formal pacts, alliances, feuds, and, often enough, open war.

The outsider was a foreigner.




The institutions of the new pluralism have no purpose except outside of themselves.

They exist in contemplation of a “customer” or a “market.”

Achievement in the hospital is not a satisfied nurse, but a cured former patient.

Achievement in business is not a happy work force, however desirable it may be; it is a satisfied customer who reorders the product.




All institutions of the new pluralism, unlike those of the old, are single-purpose institutions.

They are tools of society to supply one specific social need, whether making or selling cars, giving telephone service, curing the sick, teaching children to read, or providing benefit checks to unemployed workers.

To make this single, specific contribution, they themselves need a considerable measure of autonomy, however.

They need to be organized in perpetuity, or at least for long periods of time.

They need to dispose of a considerable amount of society’s resources, of land, raw materials, and money, but above all of people, and especially of the scarcest resource of them all, highly trained and highly educated people.

And they need a considerable amount of power over people, and coercive power at that.

It is only too easy to forget that in the not-so-distant past, only slaves, servants, and convicts had to be at the job at a time set for them by someone else.




Each such institution has—and has to have—power to bestow or to withhold social recognition and economic rewards.

Whichever method we use to select people for assignments and promotions—appointment from above, selection by one’s peers, even rotation among jobs—it is always a power decision made for the individual rather than by him, and on the basis of impersonal criteria that are related to the organization’s purpose rather than to the individual’s purpose.

The individual is thus, of necessity, subjected to a power grounded in the value system of whatever specific social purpose the institution has been created to satisfy.




And the organ through which this power is exercised in the institution is the organ we call management.




This is new and quite unprecedented.

We have neither political nor social theory for it as yet.




This new pluralism immediately raises the question, Who takes care of the commonweal when society is organized in individual power centers, each concerned with a specific goal rather than with the common good?




Each institution in a pluralist society sees its own purpose as the central and the most important one.

Indeed, it cannot do otherwise.

The school, for instance, or the university could not function unless they saw teaching and research as what makes a good society and what makes a good citizen.

Surely nobody chooses to go into hospital administration or into nursing unless he or she believes in health as an absolute value.

And as countless failed mergers and acquisitions attest, no management will do a good job running a company unless it believes in the product or service the company supplies, and unless it respects the company’s customers and their values.




Charles E. Wilson, GM’s president (later President Eisenhower’s secretary of defense), never said, “What is good for General Motors is good for the country.”

What he actually said was “What is good for the country is good for General Motors, and vice versa.”

But that Wilson was misquoted is quite irrelevant.

What matters is that everybody believed that he not only said what he was misquoted to have said, but that he actually believed it.

And indeed no one could run General Motors—or Harvard University, or Misericordia Hospital, or the Bricklayers Union, or the Marine Corps—unless he believed that what is good for GM, or Harvard, or Misericordia, or the Bricklayers, or the Marines is indeed good for the country and is indeed a “mission,” that if not divinely ordained, is still essential to society.




Yet each of these missions is one and only one dimension of the common good—important yes, indispensable perhaps, and yet a relative rather than an absolute good.

As such, it must be limited, weighed in the balance with, and often subordinated to, other considerations.

Somehow the common good must be made to emerge out of the clash and clamor of special interests.




The old pluralism never solved this problem.

This explains why suppressing it became the “progressive cause” and the one with which the moral philosophers of the modern age (that is, of the sixteenth through the nineteenth centuries) aligned themselves.




Can the new pluralism do any better?

One solution is, of course, to suppress the pluralist institutions.

This is the answer given by totalitarianism and is indeed its true essence.

The totalitarian state, whether it calls itself Fascist, Nazi, Stalinist, or Maoist, makes all institutions subservient to and extensions of the state (or of the omnipotent party).

This saves the “state” of modern political theory, but at the sacrifice of individual freedom, of free thought and free expression, and of any limitation on power altogether.

The state (or the party) is then indeed the only power center, as traditional theory preaches.

But it can maintain its monopoly on power only by being based on naked terror, as Lenin was the first to realize.

And even at that horrible price, it does not really work.

As we now know—and the experience of all totalitarian regimes is exactly the same, whether they call themselves Right or Left—the pluralist institutions persist behind the monolithic facade.

They can be deprived of their autonomy only if they, and society altogether, are rendered unable to perform, for instance, through Stalin’s purges or Mao’s Cultural Revolution.

What the totalitarian regimes have proved is that modern society has to be a “society of organizations,” and that means a pluralist society.

About un-centralizing in
The Age of Discontinuity: Guidelines To Our Changing Society

The only choice is whether individual freedom is being maintained or is being suppressed and destroyed, albeit to no purpose other than naked power.




The opposite approach to that of the totalitarian is the American one.

The United States, alone among modern nations, never fully accepted the dogma of the liberal state.

It opposed to it, quite early in its history, a pluralist political theory, that of John C. Calhoun’s “concurrent majority.”

In the way in which Calhoun presented his theory in the 1830s and 1840s, that is, as a pluralism exercised through the individual states and intended to prevent the breakup of the Union over slavery, the “concurrent majority” did not survive the Civil War.

But thirty years later, Mark Hanna, the founder of the modern Republican party and of modern American politics altogether, reformulated Calhoun’s pluralism as a concurrent majority of the major “interests”: farmers, workers, business.

Each of these three “estates of the realm” can effectively veto the majority.

It must not impose its will on the others.

But it must be able to prevent the others from imposing their will on it.

Another thirty years later, Franklin D. Roosevelt made this the basic political creed of the New Deal.

In Roosevelt’s system government became the arbiter whose job it is to make sure that no one interest gets too powerful.

When Roosevelt came in, “capital”—business as a term came later, and management later still—appeared to be far too powerful.

Farmers and workers were thus organized to offset the business power.

And then, not so many years later, when the labor power seemed to become too great, farmers and business were organized to offset and balance labor power, and so on.




Each of the “interests” is free to pursue its own goals regardless of the common good; it is indeed expected to do so.

In the darkest days of World War II, in 1943 when American troops still lacked arms and ammunition, John L. Lewis, the founder of the Congress of Industrial Organizations (that is, of modern American unionism) and the powerful head of the coal miners’ union, called a coal strike to get higher wages for his men, defying national wage controls.

President Roosevelt attacked him publicly for endangering the nation’s survival.

Lewis retorted: “The President of the United States is paid to look after the nation’s survival.

I am paid to look after the interests of the coal miners.”

And while the newspapers attacked Lewis harshly, public opinion apparently felt that Lewis had only said aloud what the Roosevelt administration had practiced all along.

It gave Lewis enough support to win the strike.




This example, however, shows that the American pluralist doctrine is hardly adequate.

Indeed, just as the old pluralism did, it has given birth to so many vested interests and pressure groups that it is almost impossible to conduct the business of government, let alone to conduct it for the common good.




In 1984-85 practically everyone in the United States agreed that the country needed a drastic tax reform to replace an increasingly complicated and irrational tax code, with a few tax rates and with exemptions eliminated.

But no such code could be enacted.

Every single exemption became the sacred cause of a vested interest.

And even though some of them represented only a few hundred or a few thousand voters, each of them could and did block tax reform.




Is there a way out?

The Japanese seem to be the only ones so far able to reconcile a society of organizations with the pursuit of the common good.

It is expected of the major Japanese interests that they take their cue from “what is good for the country”; then they are expected to fit what is good for themselves into the framework of a public policy designed to serve the national interest.




It is doubtful, however, whether even Japan can long maintain this approach.

It reflects a past in which Japan saw herself as isolated in a hostile and alien world—so that all of Japan, regardless of immediate interests, had to hang together lest it hang separately.

Will this attitude survive Japan’s success?

And could such an approach have a chance in the West, where interests are expected to behave as interests?




Is this, it will be asked, a problem of management?

Is it not a problem of politics, of government, or of political philosophy?

But if management does not tackle it, then political solutions will almost inevitably be imposed.

When, for instance, the health-care institutions in America, the hospitals and the medical profession, did not take responsibility for spiraling health-care costs, government imposed restrictions on them, for example, the Medicare restrictions on the care of the aged in hospitals.

These rules clearly are not concerned with health care at all and may even be detrimental to it.

They are designed to serve short-run fiscal concerns of government and employers, that is, designed to substitute a different but equally one-sided approach for the one-sided, self-centered approach of the health-care “interests.”




This must be the outcome unless the managements of the institutions of the new pluralism see it as their job to reconcile concern for the common good with the pursuit of the special mission for the sake of which their institution exists.

The Legitimacy of Management

Power has to be legitimate.

Otherwise it has only force and no authority, is only might and never right.

To be legitimate, power has to be grounded outside of it in something transcending it that is accepted as a genuine value, if not as a true absolute, by those subject to the power—whether descent from the gods or apostolic succession; divine institution or its modern, totalitarian counterpart, the scientific laws of history; the consent of the governed; popular election; or, as in so much of modern society, the magic of the advanced degree.

If power is an end in itself, it becomes despotism and both illegitimate and tyrannical.




Management has to have power to do its job, whatever the organization.

In that respect there is little difference between the Catholic diocese, the university, the hospital, the labor union, and the business enterprise.

And because the governing organ of each of these institutions has to have power, it has to have legitimacy.




And here we encounter a puzzle.

The management of the key institutions of our society of organizations is by and large accepted as legitimate.

The single exception is the management of the business enterprise.

Business enterprise is seen as necessary and accepted as such.

Indeed, society is often more concerned with the survival of a large business or an industry than it is with that of any other single institution.

If a major business is in trouble, there is a crisis and desperate attempts to salvage the company.

But at the same time, business management is suspect.

And any exercise of management power is denounced as usurpation, with cries from all sides for legislation or for judicial action to curb if not to suppress managerial power altogether.




One common explanation is that the large business enterprise wields more power than any other institution.

But this simply does not hold water.

Not only is business enterprise hemmed in on all sides in its power—by government and government regulations, by labor unions, and so on.

The power of even the largest and wealthiest business enterprise is insignificant next to that of the university now that a college degree has become a prerequisite for access to any but the most menial jobs.

The university and its management are often criticized, but their legitimacy is rarely questioned.




The large labor union in Western Europe and in American mass-production industries surely has more power than any single business enterprise in its country or industry.

Indeed in Western Europe, both in Britain and on the Continent, the large labor union became society’s most powerful institution in the period after World War II, more powerful sometimes than the nation’s government.

The unions’ exercise of their power during this period was only too often self-serving, if not irresponsible.

But even their bitterest critics in Western Europe and in the United States rarely questioned the unions’ legitimacy.




Another explanation—the prevalent one these days—is that the managements of all other institutions are altruistic, whereas business is profit-seeking and therefore out for itself and materialistic.

But even if it is accepted that for many people nonprofit is virtuous, and profit dubious, if not outright sinful, the explanation that profit undermines the legitimacy of business management is hardly adequate.

In all Western countries the legitimacy of owners, that is, of real capitalists, and their profits is generally accepted without much question.

That of a professional management is not, yet professional management obtains profits for other people rather than for itself—and its main beneficiaries today are the pension funds of employees.




And then there is the situation in Japan.

In no other country, not even in France or in Sweden, was the intellectual climate of the postwar period as hostile to “profit” as in Japan, at least until 1975 or so.

The left-wing intelligentsia of Japan in the universities or the newspapers might have wanted to nationalize Japan’s big businesses.

But it never occurred even to the purest Marxist among them to question the necessity of management or its legitimacy.




The explanation clearly lies in the image which Japanese management has of itself and which it presents to its society.

In Japanese law, as in American and European law, management is the servant of the stockholders.

But this the Japanese treat as pure fiction.

The reality which is seen as guiding the behavior of Japanese big-business management (even in companies that are family-owned and family-managed like Toyota) is management as an organ of the business itself.

Management is the servant of the going concern, which brings together in a common interest a number of constituencies: employees first, then customers, then creditors, and finally suppliers.

Stockholders are only a special group of creditors, rather than “the owners” for whose sake the enterprise exists.

As their performance shows, Japanese businesses are not run as philanthropies and know how to obtain economic results.

In fact, the Japanese banks, which are the real powers in the Japanese economy, watch economic performance closely and move in on a poorly performing or lackluster top management much faster than do the boards of Western publicly held companies.

But the Japanese have institutionalized the going concern and its values through lifetime employment, under which the employees’ claim to job and income comes first—unless the survival of the enterprise itself is endangered.




The Japanese formulation presents very real problems, especially at a time of rapid structural change in technology and economy when labor mobility is badly needed.

Still, the Japanese example indicates why management legitimacy is a problem in the West.

Business management in the West (and in particular business management in the United States) has not yet faced up to the fact that our society has become a society of organizations of which management is the critical organ.




Thirty years ago or so, when the serious study of management began, Ralph Cordiner, then CEO of the General Electric Company, tried to reformulate the responsibility of corporate top management.

He spoke of its being the “trustee for the balanced best interest of stockholders, employees, customers, suppliers and plant communities”—the groups which would now be called stakeholders or constituencies.

As a slogan this caught on fast.

Countless other American companies wrote it into their Corporate Philosophy statement.

But neither Mr. Cordiner nor any of the other chairmen and presidents who embraced his rhetoric did what the Japanese have done: institutionalize their professions.

They did not think through what the best-balanced interest of these different stakeholders would mean, how to judge performance against such an objective, and how to create accountability for it.

The statement remained good intentions.

And good intentions are not enough to make power legitimate.

In fact, good intentions as the grounds for power characterize the “enlightened despot.”

And enlightened despotism never works.




The term enlightened despot was coined in the eighteenth century—with Voltaire probably its greatest and most enthusiastic exponent—when the divine right of princes was no longer generally accepted as a ground of legitimate power.

The prince with the best intentions among eighteenth-century enlightened despots and the very model of the progressive, the enlightened liberal, was the Austrian emperor Joseph II (reigned 1765-90).

Every one of the reforms that he pioneered was a step in the right direction—the abolition of torture; religious toleration for Protestants, Jews, and even atheists; universal free education and public hospitals in every county; abolition of serfdom; codification of the laws; and so on.

Yet his subjects, and especially his subjects in the most advanced parts of his empire, the Austrian Netherlands, rose against him in revolt.

And when, a few years later, the French Revolution broke out, the enlightened despots of Europe toppled like ninepins.

They had no constituency to support them.




Because Ralph Cordiner and his contemporaries never even tried to ground management power in institutional arrangements, their assertion very rapidly became enlightened despotism.

In the 1950s and 1960s it became corporate capitalism, in which an enlightened “professional” management has absolute power within its corporation, controlled only by itself and irremovable except in the event of catastrophe.

“Stock ownership,” it was argued, had come to be so widely dispersed that shareholders no longer could interfere, let alone exercise control.




But this is hubris: arrogance and sinful pride, which always rides before a fall.

Within ten years after it had announced the independence of management in the large, publicly owned corporation, “corporate capitalism” began to collapse.

For one, stock ownership came to be concentrated again, in the hands of the pension funds.




And then inflation distorted values, as it always does, so that stock prices, which are based on earnings expectations, came to appear far lower than book values and liquidation values.

The result was the wave of hostile takeovers that has been inundating the American economy these last years and is spilling over into Europe now.

Underlying it is the assertion that the business enterprise exists solely for the sake of stockholder profits—and short-run, immediate profits at that.




By now it has become accepted widely—except on Wall Street and among Wall Street lawyers—that the hostile takeover is deleterious and in fact one of the major causes of the loss of America’s competitive position in the world economy.

One way or another, the hostile takeover will be stopped (on this see also Chapter 28 of this volume).

It may be through a “crash”; speculative booms always collapse in the end.

It may be through such changes as switching to different classes of common stock, with the shares owned by the outside public having a fraction of the voting power of the insiders’ shares, or by giving up voting rights for publicly held common shares altogether.

(I owe this suggestion to Mr. Walter Wriston, the chairman emeritus of New York’s Citibank.)




No matter how the hostile takeover boom is finally stopped, it will have made certain that the problem of management legitimacy has to be tackled.

We know some of the specifications for the solution.

There have to be proper safeguards of the economic performance of a business: its market standing, the quality of its products or services, and its performance as an innovator.

There has to be emphasis on, and control of, financial performance.

If the takeover boom has taught us one thing, it is that management must not be allowed substandard financial performance.




But somehow the various “stakeholders” also have to be brought into the management process (for example, through the company’s pension plan as a representative of the company’s employees for whom the pension plan is the trustee).

And somehow the maintenance of the wealth-producing and the job-producing capacity of the enterprise, that is, the maintenance of the going concern, needs to be built into our legal and institutional arrangements.

It should not be too difficult.

After all, we built the preservation of the going concern into our bankruptcy laws all of ninety years ago when we gave it priority over all other claims, including the claims of the creditors.

But whatever the specifics, business management has to attain legitimacy; its power has to be grounded in a justification outside and beyond it and has to be given the “constitutional” sanction it still largely lacks.




Closely connected to the problem of the legitimacy of management is management’s compensation.




Management, to be legitimate, must be accepted as “professional.”

Professionals have always been paid well and deserve to be paid well.

But it has always been considered unprofessional to put money ahead of professional responsibility and professional standards.

This means that there have to be limitations on managerial incomes.

It is surely not professional for a chief executive officer to give himself a bonus of several millions at the very time at which the pay of the company’s other employees is cut by 30 percent, as the chief executive officer of Chrysler did a few years ago.

It is surely not professional altogether for people who are employees and not “owners” to pay themselves salaries and bonuses greatly in excess of what their own colleagues, that is, other members of management, receive.

And it is not professional to pay oneself salaries and bonuses that are so far above the norm as to create social tension, envy, and resentment.

Indeed there is no economic justification for very large executive incomes.

German and Japanese top managers surely do as good a job as American top managers—perhaps, judging by results, an even better one.

Yet their incomes are, at the most, half of what American chief executives of companies in similar industries and of similar size are sometimes being paid.




But there is also work to be done on the preparation, testing, and selection of, and on the succession to, the top-management jobs in the large business enterprises; on the structure of top management; and on performance standards for top management and the institutional arrangements for monitoring and enforcing them.




Business management is not yet fully accepted as legitimate in the West because it has not yet realized the full implications of its success.

Individual executives, even those of the biggest company, are largely anonymous.

They only make asses of themselves if they try to behave as if they were aristocrats.

They are hired hands like the rest of us.

On the day on which they retire and move out of the executive suite they become “nonpersons” even in their old company.

But while in office they represent: individually almost faceless, collectively they constitute a governing group.

As such their behavior is seen as representative.

What is private peccadillo for ordinary mortals becomes reprehensible misconduct and indeed betrayal if done by a leader.

For not only is the leader visible; it is his duty to set an example.




But then there is also the big question of what is now being called the “social responsibility” of management.

It is not, despite all rhetoric to the contrary, a social responsibility of business but of all institutions—otherwise we would hardly have all the malpractice suits against American hospitals or all the suits alleging discrimination against American colleges and universities.

But business is surely one of the key institutions of a society of organizations and as such needs to determine what its social responsibilities are—and what they are not.




Surely business, like anyone else, is responsible for its impacts: responsibility for one’s impacts is, after all, one of the oldest tenets of the law.

And surely, business, like anyone else, is in violation of its responsibilities if it allows itself impacts beyond those necessary to, and implicit in, its social purpose, for example, producing goods and services.

To overstep these limits constitutes a tort, that is, a violation.




But what about problems that do not result from an impact or any other activity of business and yet constitute grave social ills?

Clearly it is not a responsibility of business, or of any organization, to act where it lacks competence; to do so is not responsibility but irresponsibility.

Thus when a former mayor of New York City in the 1960s called for “General Electric and the other big corporations of New York City to help solve the problem of the Black Ghetto by making sure that there is a man and father in the home of every Black Welfare Mother,” he was not only ridiculous.

He demanded irresponsibility.




But also management must not accept “responsibility” if by doing so it harms and impedes what is its first duty: the economic performance of the enterprise.

This is equally irresponsible.




But beyond these caveats there is a no-man’s-land where we do not even fully understand what the right questions are.

The problems of New York, for instance, are in no way caused by business.

They were largely caused by public policies business had warned against and fought against: primarily by rent control, which, as it always does, destroys the very housing the poor need, that is, decent, well-maintained older housing; by demagogic welfare policies; and by equally demagogic labor-relations policies.

And yet when New York City was on the verge of self-destruction, in the late 1960s and early 1970s, a small group of senior executives of major New York business enterprises mobilized the business community to reverse the downward slide and to renew New York City—people like Austin Tobin of the Port of New York Authority; David Rockefeller of the Chase Manhattan Bank; Walter Wriston and William Spencer of Citibank; Felix Rohatyn of Lazard Frères, the private bankers; the top management of Pfizer, a pharmaceutical company; and several others.

They did this not by “taking responsibility” for things they lacked competence in, for example, the problems of the black ghetto.

They did it by doing what they were highly competent to do: they started and led the most dramatic architectural development of any major city since Napoleon III had created a new Paris and Francis Joseph a new Vienna a hundred years earlier.

The black ghetto is still there, and so are all the ills associated with it, for example, crime on the streets.

But the city has been revitalized.




And this did not happen because these businesses and their managements needed the city; excepting only the Port of New York Authority, they could all have moved out, as a good many of their colleagues—IBM, for instance, or General Electric, or Union Carbide—were doing.

These businesses and their top managements acted because the city needed them, though, of course, they benefited in the end if only because a business—and any other institution—does better in a healthy rather than a diseased social environment.




Is there a lesson in this?

There surely is a challenge.




Altogether, for management of the big business to attain full legitimacy, it will have to accept that to remain “private” it discharges a social, and that means a “public,” function.

The Job as Property Right

When, in 1985, a fair-size Japanese company found itself suddenly threatened by a hostile takeover bid made by a group of American and British “raiders”—the first such bid in recent Japanese history—the company’s management asserted that the real owners of the business, and the only ones who could possibly sell it, were not the stockholders, but the employees.

This was considerable exaggeration, to be sure.

The real owners of a major Japanese company are the banks, as has already been said.

But it is true that the rights of the employees to their jobs are the first and overriding claim in a large Japanese company, except when the business faces a crisis so severe that its very survival is at stake.




To Western ears the Japanese company statement sounded very strange.

But actually the United States—and the West in general—may be as far along as Japan in making the employees the dominant interest in business enterprise, and not only in the large one.

All along, of course, the employees’ share of the revenues of a business, almost regardless of size, exceeds what the “owners” can possibly hope to get: ranging from being four times as large (that is, 7 percent for after-tax profits, as against 25 percent for wages and salaries) to being twelve times as large (that is, 5 percent for profits versus 60 percent of revenues for wages and salaries).
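
A quick check of that arithmetic, as a minimal sketch in Python (the percentages are the round figures above, not data from any particular company):

    # Illustrative arithmetic for the revenue shares cited above.
    # The percentages are the essay's round figures, not company data.

    def share_ratio(wage_share_pct: float, profit_share_pct: float) -> float:
        """How many times the employees' share of revenue exceeds
        the after-tax profit available to the 'owners'."""
        return wage_share_pct / profit_share_pct

    print(round(share_ratio(25, 7), 1))   # 3.6: roughly "four times as large"
    print(round(share_ratio(60, 5), 1))   # 12.0: "twelve times as large"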

The pension fund not only greatly increased the share of the revenues that go into the “wage fund,” to the point that in poor years the pension fund may claim the entire profit and more.

American law now also gives the pension fund priority over the stockholders and their property rights in a company’s liquidation, way beyond anything Japanese law and Japanese custom give to the Japanese worker.




Above all, the West, with the United States in the lead, is rapidly converting the individual employee’s job into a new property right and, paradoxically, at the very time at which the absolute primacy of stockholder short-term rights is being asserted in and by the hostile takeover.




The vehicle for this transformation in the United States is not the union contract or laws mandating severance pay as in many European countries.

The vehicle is the lawsuit.

First came the suit alleging discrimination, whether in hiring an employee, in firing, in promotion, in pay, or in job assignment: discrimination on grounds of race or sex or age or handicap.

But increasingly these suits do not even allege discrimination, but violation of “due process.”

They claim that the employer has to treat the employee’s job, including the employee’s expectations for pay and promotion, as something the enjoyment of which and of its fruits can be diminished or taken away only on the basis of preset and objective standards and through an established process which includes an impartial review and the right to appeal.

But these are the features that characterize “property” in the history of the law.

In fact, they are the only features a right must possess to be called property in the Western legal tradition.




And as few managements yet seem to realize, in practically every such suit the plaintiff wins and the employer loses.




This development was predictable.

Indeed, it was inevitable.

And it is irreversible.

It is also not “novel” or “radical.”

What gives access to a society’s productive resources—gives access thereby to a livelihood and to social function and status and constitutes a major, if not the major, avenue to economic independence, however modest—has always become a “property right” in Western society.

And this is what the job has become, and especially the knowledge worker’s job as a manager or a professional.




We still call land “real” property.

For until quite recently it was land alone that gave to the great majority of mankind—95 percent or more—what “property” gives: access to, and control over, society’s productive resources; access to a livelihood and to social status and function; and finally a chance at an estate (the term itself meant, at first, a landholding) and with it economic independence.




In today’s developed societies, however, the overwhelming majority—all but 5 or 10 percent of the population—find access to and control over productive resources and access to a livelihood and to social status and function through being employees of organizations, that is, through their jobs.

For highly educated people the job is practically the only access route.

Ninety-five percent, or more, of all people with college degrees will spend their entire working lives as employees of an organization.

Modern organization is the first, and so far the only, place where we can put large numbers of highly educated people to productive work and pay them for applying knowledge.




For the great majority of Americans, moreover, the pension fund at their place of employment is their only access to an “estate,” that is to a little economic independence.

By the time the main breadwinner in the American family, white collar or blue collar, is forty-five years old, the claim to the pension fund is likely to be the family’s largest asset, far exceeding in value the equity in the home or the family’s personal belongings, for example, their automobiles.




Thus the job had to become a property right—the only question is in what form and how fast.




Working things like this out through lawsuits may be “as American as apple pie,” but is hardly as wholesome.

There is still a chance for management to take the initiative in this development and to shape the new property rights in the job so that they equally serve the employee, the company, and the economy.

We need to maintain flexibility of employment.

We need to make it possible for a company to hire new people and to increase its employment.

And this means that we must avoid the noose the Europeans have put around their neck: the severance pay which the law of so many Continental countries mandates makes it so expensive to lay off anybody that companies simply do not hire people.

That Belgium and Holland have such extraordinarily high unemployment is almost entirely the result of these countries’ severance pay laws.

But whichever way we structure the new property rights which the job embodies there will be several requirements which every employer, that is, every organization, will have to satisfy.

First, there must be objective and equal performance standards for everyone performing a given job, regardless of race, color, sex, or age.

Secondly, to satisfy the requirements of due process, the appraisal against these standards of performance has to be reviewed by somebody who is truly disinterested.

Finally, due process demands a right of appeal—something which, by the way, as “authoritarian” a company as IBM has had for more than half a century.




The evolution of the job into a “property right” changes the position of the individual within the organization.

It will change equally, if not more, the position of the organization in society.

For it will make clear what at present is still nebulous: organized and managed institutions have increasingly become the organs of opportunity, of achievement, and of fulfillment for the individual in the society of organizations.

Conclusion

There is still important work ahead—and a great deal of it—in areas that are conventionally considered “management” in the schools of management, in management journals, and by practicing managers themselves.

But the major challenges are new ones, and well beyond the field of management as we commonly define it.

Indeed, it will be argued that the challenges I have been discussing are not management at all, but belong in political and social theory and public law.




Precisely.

The success of management has not changed the work of management.

But it has greatly changed management’s meaning.

Its success has made management the general, the pervasive function, and the distinct organ of our society of organizations.

As such, management inevitably has become “affected with the public interest.”

To work out what this means for management theory and management practice will constitute the “management problems” of the next fifty years.

[1986]

 

line

 

Management as a Liberal Art

also Drucker’s Lost Art of Management

Three foreigners—all Americans—are thought by the Japanese to be mainly responsible for the economic recovery of their country after World War II and for its emergence as a leading economic power.

On Japan

Edwards Deming taught the Japanese statistical quality control and introduced the “quality circle.”

Joseph M. Juran taught them how to organize production in the factory and how to train and manage people at work.

What is now the “latest” import from Japan and the “hottest management discovery”—the “just-in-time” inventory delivery system (the Japanese word for it is Kanban)—was introduced to Japan by Juran, who had been instrumental in developing it for America’s World War II production effort.




I am the third of these American teachers.

My contribution, or so the Japanese see it, was to educate them about management and marketing.

I taught them that people are a resource (you have to see this concept to grasp its power) rather than a cost, and that people therefore have to be managed to take responsibility for their own as well as for the group’s objectives and productivity.

I taught them that communication has to be upward if it is to work at all.

I taught them the importance of structure but also that structure has to follow strategy.

I taught them that top management is a function and a responsibility rather than a rank and a privilege.

And I also taught them that the purpose of a business is to create a customer, and that a business only exists in contemplation of the market.




All these things the Japanese could have learned from my books and they have, indeed, been my most avid readers—some of my management books have sold proportionately many more copies in Japan than they have in the United States.

But my real impact in Japan was through the three- to four-week seminars that I ran in Japan every other year from the late 1950s to the mid-1980s for top people in government and business.

My effectiveness in these seminars did not, however, rest on my knowledge of management techniques.

It rested squarely on my interest in Japanese art and my acquaintance with Japanese history.




This interest of mine began as a result of a purely accidental visit to a Japanese art exhibition way back in 1934 when I was a young bank economist in London.

My fascination with Japanese art, which resulted from this visit, led me to ask, What in their history, society, and culture explains the ability of the Japanese to have anticipated, sometimes centuries earlier, the most recent trends in Western modern art, beginning with impressionism and progressing through expressionism and cubism to abstract art?




Thus I found myself soon face-to-face with a mystery, a still largely unexplained mystery: How did the Japanese, alone of all non-Western people, manage to build a modern nation and a modern economy on technology and institutions imported from the West, and yet, at the same time, maintain their basic national identity and integrity?

At first glance, nothing that the Japanese did in the late nineteenth century appeared different from what anybody else did at the time.

The new kingdoms in the Balkans such as Bulgaria, the South American republics, or Persia similarly imported a broad range of Western institutions—a parliament and a navy modeled after the British, an army modeled after Prussia, a constitutional monarchy and government ministries modeled after Germany, universal education (again on the German model), universities modeled after America, banking modeled after France and Germany, and legal codes copied from the Germans, Swiss, and French.

Yet only in Japan did these foreign imports “take.”

Moreover, they flourished as effective modern institutions and, at the same time, served to maintain a Japan as distinct, as clearly identified, and as cohesive as it had been when it was totally isolated from intercourse with the foreign world.




I have always been attracted to the unexpected success; in my experience, it holds the key to understanding.

It occurred to me that there had been no more unexpected or more unique success than that of the Japanese after the Meiji Restoration of 1867.

But I soon realized that this had not been the first such Japanese achievement.

The Japanese had had a very similar success twelve hundred years earlier when they adopted the institutions and the religions of what was then the world’s most advanced civilization, the China of the T’ang Dynasty, and used them to create a totally different and uniquely Japanese government, society, culture, religious life, and art.

And they repeated this success on a lesser scale several times during their subsequent history.

The more I explored the issue, the more mystified I became.

What did, however, become increasingly clear was that the Japanese achievement rested on a unique ability to use imported tools, whether social institutions or material techniques, to embody Japanese values and to achieve Japanese objectives.




And so, when I first found myself working with senior executives in Japanese government and business, it came naturally to me to lead off with the question “How can your values, your traditions, your culture and its beliefs be used to perform the objective, impersonal tasks of a modern economy and to harness modern technology for social and economic performance?”

In my earlier books, I had pointed out that although unemployment insurance in the West, originally a British invention, protected the worker’s income, it did not satisfy the worker’s need for psychological and social security.

This, I had argued, required employment security as well.

I had argued further that the need for security required gearing wage and employment policies to the family life cycle and its needs.

Finally, however, I had pointed out that flexibility in labor costs was equally essential.




Thus, because I knew a little Japanese history, I was able to help the Japanese leaders who participated in my seminars to work out the combination of high employment security, high labor-force flexibility, and a wage structure in tune with the family cycle and its needs—the combination that has since become known in the West as lifetime employment and which, for thirty years, has given Japan (in all its earlier history until World War II a country of violent class wars and bloody worker revolts) unprecedented industrial cooperation and harmony.

Similarly, the one reason that the “marketing concept” I presented in my seminars has “taken” in Japan—whereas in the United States, the country of its birth, it is still being preached rather than practiced—is surely that marketing as a technique could be embedded in the deeply rooted Confucian ethics of mutual relationships.

A sale to a customer thus creates a “relationship,” and with it a permanent commitment.




«§§§»


These days I am always being asked to explain the success of the Japanese, especially as compared with the apparent malperformance of American business in recent years.

One reason for the difference is not, as is widely believed, that the Japanese are not profit conscious or that Japanese businesses operate at a lower profit margin.

This is pure myth.

In fact, when measured against the cost of capital—the only valid measurement for the adequacy of a company’s profit—large Japanese companies have, as a rule, earned more in the last ten or fifteen years than comparable American companies.

And that is the main reason the Japanese have had the funds to invest in global distribution of their products.

Also, in sharp contrast to governmental behavior in the West, Japan’s government, especially the powerful Ministry of International Trade and Industry (MITI), is constantly pushing for higher industrial profits to ensure an adequate supply of funds for investment in the future—in jobs, in research, in new products, and in market development.




One of the principal reasons for the success of Japanese business is that Japanese managers do not start out with a desired profit, that is, with a financial objective in mind.

Rather, they start out with business objectives and especially with market objectives.

They begin by asking “How much market standing do we need to have leadership?”

“What new products do we need for this?”

“How much do we need to spend to train and develop people, to build distribution, to provide the required service?”

Only then do they ask “And how much profit is necessary to accomplish these business objectives?”

And the resulting profit requirement is usually a good deal higher than the profit goal of the Westerner.
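
A minimal sketch of this planning sequence, in Python; every category and figure below is a hypothetical assumption, not Drucker’s data:

    # Backward profit planning as described above: fix the spending the
    # business objectives demand, then derive the profit required to fund it.
    # All names and numbers here are hypothetical.

    required_spending = {
        "new products":             4.0,  # $ millions, assumed
        "training and development": 1.5,
        "distribution buildup":     2.5,
        "required service":         1.0,
    }
    cost_of_capital = 0.12    # assumed rate
    invested_capital = 50.0   # $ millions, assumed

    # Profit is derived from the objectives, not set as a starting target:
    required_profit = (sum(required_spending.values())
                       + cost_of_capital * invested_capital)
    print(f"Profit required to fund the objectives: ${required_profit:.1f} million")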




Second, Japanese businesses—perhaps as a long-term result of my management seminars twenty and thirty years ago—have come to accept what they originally thought was a very strange doctrine.

They have come to accept my position that the end of business is not “to make money.”

Making money is a necessity of survival.

It is also a result of performance and a measurement thereof.

But in itself it is not performance.

As I mentioned earlier, the purpose of a business is to create a customer and to satisfy a customer.

That is performance and that is what a business is being paid for.

The job and function of management as the leader, decision maker, and value setter of the organization, and, indeed, the purpose and rationale of an organization altogether, is to make human beings productive so that the skills, expectations, and beliefs of the individual lead to achievement in joint performance.




These were the things which, almost thirty years ago, Ed Deming, Joe Juran, and I tried to teach the Japanese.

Even then, every American management text preached them.

The Japanese, however, have been practicing them ever since.




I have never slighted techniques in my teaching, writing, and consulting.

Techniques are tools; without tools, there is no “practice,” only preaching.

In fact, I have designed, or at least formulated, a good many of today’s management tools, such as management by objectives, decentralization as a principle of organizational structure, and the whole concept of “business strategy,” including the classification of products and markets.




My seminars in Japan also dealt heavily with tools and techniques.

In the summer of 1985, during my most recent trip to Japan, one of the surviving members of the early seminars reminded me that the first week of the very first seminar I ran opened with a question by a Japanese participant, “What are the most useful techniques of analysis we can learn from the West?”

We then spent several days of intensive work on breakeven analysis and cash-flow analysis: two techniques that had been developed in the West in the years immediately before and after World War II, and that were still unknown in Japan.




Similarly, I have always emphasized in my writing, in my teaching, and in my consulting the importance of financial measurements and financial results.

Indeed, most businesses do not earn enough.

What they consider profits are, in effect, true costs.

One of my central theses for almost forty years has been that one cannot even speak of a profit unless one has earned the true cost of capital.

And, in most cases, the cost of capital is far higher than what businesses, especially American businesses, tend to consider as “record profits.”
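
A worked example of this test, with hypothetical figures:

    # The cost-of-capital test described above. Figures are hypothetical.

    reported_profit = 8.0      # $ millions after tax, reported as a "record profit"
    invested_capital = 100.0   # $ millions
    cost_of_capital = 0.10     # assumed 10 percent rate

    capital_charge = invested_capital * cost_of_capital   # 10.0
    economic_profit = reported_profit - capital_charge
    print(economic_profit)     # -2.0: by this test the business earned no profit at all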

I have also always maintained—often to the scandal of liberal readers—that the first social responsibility of a business is to produce an adequate surplus.

Without a surplus, it steals from the commonwealth and deprives society and the economy of the capital needed to provide jobs for tomorrow.




Further, for more years than I care to remember, I have maintained that there is no virtue in being nonprofit and that, indeed, any activity that could produce a profit and does not do so is antisocial.

Professional schools are my favorite example.

There was a time when such activities were so marginal that their being subsidized by society could be justified.

Today, they constitute such a large sector that they have to contribute to the capital formation of an economy in which capital to finance tomorrow’s jobs may well be the central economic requirement, and even a survival need.




But central to my writing, my teaching, and my consulting has been the thesis that the modern business enterprise is a human and a social organization.

Management as a discipline and as a practice deals with human and social values.

To be sure, the organization exists for an end beyond itself.

In the case of the business enterprise, the end is economic (whatever this term might mean); in the case of the hospital, it is the care of the patient and his or her recovery; in the case of the university, it is teaching, learning, and research.

To achieve these ends, the peculiar modern invention we call management organizes human beings for joint performance and creates a social organization.

But only when management succeeds in making the human resources of the organization productive is it able to attain the desired outside objectives and results.




I came to this thesis naturally, for my interest in management did not start with business.

In fact, it started when I decided to become a writer and teacher rather than continue a promising career as an investment banker.

My interest in modern organization, in business and management, began with an analysis of modern society and with my conclusion, reached around the time World War II began, that the modern organization and especially the large business corporation was fast becoming the new vehicle of social integration.

It was the new community, the new order of a society in which the traditional vehicles of integration—whether small town, craft guild, or church—had disintegrated.

So I began to study management with an awareness of economic results, to be sure, but also searching for principles of structure and organization, for constitutional principles, and for values, commitments, and beliefs.




«§§§»


There is a good deal of talk these days of the “culture” of a company.

But my book, The Practice of Management, published more than thirty years ago, ends with a chapter on the “spirit” of an organization, which says everything to be found in such current best-sellers as In Search of Excellence.

From the beginning I wrote, taught, and advised that management has to be both outside-focused on its mission and on the results of the organization, and inside-focused on the structure, values, and relationships that enable the individual to achieve.




For this reason, I have held from the beginning that management has to be a discipline, an organized body of knowledge that can be learned and, perhaps, even taught.

All of my major books, beginning with Concept of the Corporation (1946) and Practice of Management (1954) and progressing through my most recent one, Innovation and Entrepreneurship (1985), have tried to establish such a discipline.

Management, Revised Edition

Management is not, and never will be, a science as that word is understood in the United States today.

Management is no more a science than is medicine: both are practices.

A practice feeds off a large body of true sciences.

Just as medicine feeds off biology, chemistry, physics, and a host of other natural sciences, so management feeds off economics, psychology, mathematics, political theory, history, and philosophy.

But, like medicine, management is also a discipline in its own right, with its own assumptions, its own aims, its own tools, and its own performance goals and measurements.

And as a separate discipline in its own right management is what the Germans used to call a Geisteswissenschaft—though “moral science” is probably a better translation of that elusive term than the modern “social science.”

Indeed, the old-fashioned term liberal art may be the best term of all.

[1985]

The memo they don’t want you to see

 

Concept of the Corporation

 

The Management Revolution

 

line

 

The Innovative Organization

It is widely believed that large companies cannot innovate.

This is simply not true: Merck, Citibank, and 3M are but three examples of highly innovative corporate giants.

But it is true that to innovate successfully, a company has to be run differently from the typical “well-managed” business, whether large or small.

 

The innovative company understands that innovation starts with an idea.

Ideas are somewhat like babies—they are born small, immature, and shapeless.

They are promise rather than fulfillment.

In the innovative company executives do not say, “This is a damn-fool idea.”

Instead they ask, “What would be needed to make this embryonic, half-baked, foolish idea into something that makes sense, that is feasible, that is an opportunity for us?”

 

But an innovative company also knows that the great majority of ideas will turn out not to make sense.

Innovative ideas are like frogs’ eggs: of a thousand hatched, only one or two survive to maturity.

Executives in innovative organizations therefore demand that people with ideas think through the work needed to turn an idea into a product, a process, a business, or a technology.

They ask, “What work would we have to do and what would we have to find out and learn before we can commit the company to this idea of yours?”

 

These executives know that it is as difficult and risky to convert a small idea into successful reality as it is to make a major innovation.

They do not aim at “improvements” or “modifications” in products or technology.

They aim at innovating a new business.

And they know that innovation is not a term of the scientist or technologist.

It is a term of the businessman.

 

For innovation means the creation of new value and new satisfaction for the customer.

Organizations therefore measure innovations not by their scientific or technological importance but by what they contribute to market and customer.

They consider social innovation as important as technological innovation.

Installment selling may have had a greater impact on economics and markets than most of the great advances in technology in this century.

 

Innovative companies know that the largest market for a successful new idea is usually unexpected.

In developing dynamite, Alfred Nobel was trying to find a better military explosive.

But dynamite is too unstable to be used in bombs and shells; instead it was used for removing rock and replaced the pick and shovel in mining, railroad building, and construction.

IBM built its dominance of the large-computer market by realizing that the greatest demand for computers would come not from science and defense—the two uses for which the computer had been designed—but from such mundane applications as payroll, billing, and inventory control.

 

Innovative companies do not start out with a “research budget.”

They end with one.

They start out by determining how much innovation will be needed for the business to stay even.

They assume that all existing products, services, processes, and markets are becoming obsolete—and pretty fast at that.

They try to assess the probable speed of decay of whatever exists and then determine the “gap” that innovation has to fill for the company not to go downhill.

They know that their program for innovation must include promises several times the size of the innovation gap, for not more than a third of such promises, if that many, ever becomes reality.

And then they know how much of an innovative effort—and how large an innovative budget—they need as the very minimum.

 

“But,” says the chief executive of a highly successful innovative company, “then I double the size of the effort and of the budget.

After all, the competition is no dumber than we are and may be luckier.”
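
Read as arithmetic, the procedure looks roughly like this (a sketch; the figures are hypothetical, while the one-third success ratio and the doubling come from the passage itself):

    # The innovation-budget arithmetic sketched above; all figures hypothetical.

    current_revenue = 200.0    # $ millions, assumed
    annual_decay_rate = 0.08   # assumed speed of obsolescence of whatever exists
    success_ratio = 1 / 3      # "not more than a third of such promises ever becomes reality"

    innovation_gap = current_revenue * annual_decay_rate   # 16.0: what innovation must fill
    promises_needed = innovation_gap / success_ratio       # 48.0: several times the gap
    minimum_effort = promises_needed                       # the very minimum program
    budget = 2 * minimum_effort                            # "then I double the size of the effort"
    print(innovation_gap, promises_needed, budget)         # 16.0 48.0 96.0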

 

Smart companies know that money does not produce innovation; people do.

They know that in innovative work, quality counts far more than quantity.

They do not spend a penny unless there is a first-rate person to do the work.

Successful innovations rarely require a great deal of money in the early and crucial stages.

But they do require a few highly competent people, dedicated to the task, driven by it, working full time and very hard.

Such companies will always back a person or a team rather than a “project” until the innovating idea has been proved out.

 

But these organizations also know that the majority of innovative ideas, however brilliant, never bring “results.”

 

So they treat innovative work quite differently from the existing, ongoing business in respect to planning, budgets, expectations, and controls.

 

Typically innovative companies have two separate budgets: an operating budget and an innovation budget.

The operating budget contains everything that is already being done.

The innovation budget contains the things that are to be done differently and the different things to be worked on.

The operating budget runs to hundreds of pages, even in a middle-size company.

The innovation budget even in the giant business rarely runs to more than forty or fifty pages.

But top management spends as much time and attention on the fifty pages of the innovation budget as on the five hundred of the operating budget—and usually more.

 

Top management asks different questions about each budget.

On operations it asks, “What is the least effort needed to keep things from caving in?”

And “What is the least effort needed to give the best ratio between effort and results?

What, in other words, is the optimization point?”

But for innovations, top management asks, “Is this the right opportunity?”

And if the answer is yes, top management asks, “What is the most this opportunity can absorb by way of resources at this stage?”

 

Innovative companies know that returns on innovation behave radically differently from returns in the ongoing business.

For long periods, years in many cases, innovations have no “returns”; they have only costs.

But then returns should increase exponentially: An innovation is unsuccessful if it does not return the investment several hundredfold, for the risks are simply too great to justify a lower return.

 

To expect from innovative efforts the steady 10 percent rate of return and 10 percent growth rate—the yardstick of the “sound financial manager”—is foolishness.

It is expecting both too much and too little.

Innovative companies therefore keep such efforts out of the return-on-investment figures for ongoing businesses and do not use these figures to measure the soundness of an innovative idea, its progress, or the compensation of the people working on it.

The oldest rule, probably formulated by du Pont sixty years ago, is that the new will not be included in the figures for the ongoing business until its end result—the new product or service—has been on the market for two or three years and is past its infancy.

 

Yet, innovative companies closely control these efforts.

One never hears talk of “creativity” in innovative companies—creativity is the buzzword of those who don’t innovate.

Innovative companies talk of work and self-discipline.

They ask, “What is the next point at which we should review this project?

What results should we expect by then?

And how soon?”

When an idea fails to meet the targets two or three times in a row, the innovative company does not say, “Let’s redouble our efforts.”

It says, “Isn’t it time we did something else?”

 

Above all the innovative company organizes itself to abandon the old, the obsolete, the no longer productive.

It never says, “There will always be a market for a well-made buggy whip.”

It knows that whatever human beings have created becomes obsolete sooner or later—usually sooner.

And it prefers to abandon its obsolete products itself rather than have them made obsolete by the competition.

 

Every three years or so, the innovative company therefore puts on trial for its life every product, process, technology, service, and market.

It asks, “Knowing what we now know, would we now go into this product or service?”

And if the answer is no, the company does not say, “Let’s make another study.”

It says, “How do we get out?”

 

One way might be to stop putting additional resources in and to keep the product or service only as long as it still gives a yield—I coined the term cash cow for this twenty years ago.

Or (the Japanese are past masters at this) one finds uses and markets where the old technology or the old product is still genuinely new and confers competitive advantage.

Or one abandons.

But one does not pour good money after bad.

Organized abandonment of the obsolete is the one sure way for an organization to focus the vision and energies of its people on innovation.

 

We clearly face a period in which the demands and opportunities for innovation will be greater than at any time in living memory—as great perhaps as in the fifty years preceding World War I, during which new technical or social inventions, almost immediately spawning new industries, emerged on average every eighteen months.

 

Telecommunications, automation of manufacturing processes around the microprocessor, the “automated office,” rapid changes in banking and finance, medicine, biogenetics, bioengineering, and biophysics—these are only a few of the areas where change and innovation are already proceeding at high speed.

To compete in this environment, companies will need to muster large sums of money to boost their research budgets, even in a severe depression.

But what will be required above all are the attitudes, policies, and practices of the innovative organization.


 

line

 

Social Needs and Business Opportunities

In the early years of this century, two Americans, independently and, in all probability, without knowing of each other, were among the first businessmen to initiate major community reforms.

Andrew Carnegie preached and financed the free public library.

Julius Rosenwald fathered the county farm agent system and adopted the infant 4-H Clubs.

Carnegie was already retired from business as one of the world’s richest men.

Rosenwald, who had recently bought a near-bankrupt mail-order firm called Sears, Roebuck and Company, was only beginning to build both his business and his fortune.

Both men were radical innovators.




The monuments that earlier businessmen had erected for themselves were cultural: museums, opera houses, universities.

In Carnegie’s and Rosenwald’s own time leading American businessmen, A. Leland Stanford, Henry E. Huntington, J. P. Morgan, Henry C. Frick, and a little later, Andrew Mellon, still followed this tradition.

Carnegie and Rosenwald instead built communities and citizens—their performance, their competence, and their productivity.




But there the similarity ends.

The two held basically different philosophies.

Carnegie shouted his from the housetops: The sole purpose of being rich is to give away money.

God, Carnegie asserted, wants us to do well so that we can do good.

Rosenwald, modest, publicity shy, unassuming, never preached, but his deeds spoke louder than his words.

“You have to be able to do good to do well” was Julius Rosenwald’s credo, a far more radical one than that of the anarchist steelmaster from Pittsburgh.

Carnegie believed in the social responsibility of wealth.

Rosenwald believed in the social responsibility of business.




Rosenwald saw the need to develop the competence, productivity, and income of the still desperately poor and backward American farmer.

To accomplish this it was necessary to make available the enormous fund of scientific farming knowledge and farming skills that had been amassed in decades of systematic study of agronomy and farm marketing, but that, in 1900 or 1910, were still largely theoretical and inaccessible to all but a tiny minority of more affluent agriculturalists.

Although his motives were partially philanthropic, he also saw that Sears, Roebuck’s prosperity was linked to the prosperity of its main customer, the farmer, which in turn depended on his productivity.

The county farm agent—and Sears, Roebuck for almost a decade single-handedly supported this innovation of Rosenwald’s until the U.S. government finally took it over—and the 4-H Clubs were clearly philanthropy.

But they were also Sears, Roebuck’s corporate advertising, public relations, and above all market and customer development.

Their success partially explains how the near-bankrupt Sears, Roebuck became within ten years the country’s first truly national retailer and one of its most profitable and fastest-growing enterprises.




After World War II, another American businessman developed yet another approach to social responsibility.

William C. Norris, the founder (in 1957) and, until his retirement in 1986, chairman of Control Data Corporation, saw the solution of social problems and the satisfaction of social needs as opportunities for profitable business.

He too was a philanthropist motivated by concern for his fellowman.

He picked his projects (skill training and employment in the inner-city ghetto, rehabilitation and training of prisoners, teaching problem learners) by social need rather than by market demand.

But he directed his investment and his corporation’s human resources where information handling and data processing, his company’s expertise, could create a business that, while solving a problem, would become self-sustaining and profitable.




Like Carnegie’s philanthropy and Rosenwald’s community development, Norris’s investments in social needs aimed at creating human capital in the form of individuals capable of performance and of a healthy community able to help itself.

But Norris’s social enterprises also aimed at creating economic capital.

Carnegie’s public libraries were strictly philanthropies, though they did create opportunities for individual self-development.

Rosenwald’s community projects were not business ventures.

However much they benefited Sears, Roebuck, they did so indirectly.

They were good business, farsighted investments in market development, but not themselves business.

Norris’s good works or excursions into social problem solving were, in a stricter sense, capital investments in new profit-making businesses.

He was an entrepreneur.




In its view of social responsibility much of American business and the American public still follow Carnegie.

They accept, as he did, that wealth and economic power entail responsibility for the community.

The rich man as social reformer, Carnegie’s innovation, established a uniquely American institution: the foundation.

One after the other of the superrich, from Rockefeller to Ford, followed Carnegie’s example.

And Carnegie also set the tone for what is now known as the social responsibility of business, a phrase that has become exceedingly popular.




Julius Rosenwald has had far fewer followers.

The best known is probably Rosenwald’s own successor as head of Sears, Roebuck, General Robert E. Wood.

Even greater perhaps was the impact of James Couzens, cofounder of the Ford Motor Company, for ten years Henry Ford’s partner as the company’s financial and administrative head, then mayor of Detroit and finally, from 1922 to 1936, U.S. senator from Michigan and, though nominally a Republican, one of the intellectual fathers of the New Deal.

Couzens introduced skill training into American industry as a social responsibility of business.

A few years later, in 1914, he established, over Henry Ford’s strenuous objections, the famous five-dollar-a-day wage—both out of deep compassion for the suffering of an exploited work force and as a highly successful and indeed immediately profitable cure for high rates of absenteeism and turnover that threatened Ford’s competitive position.




In our own time J. Irwin Miller of the Cummins Engine Company in Columbus, Indiana, has systematically used corporate funds to create a healthy community that, at the same time, is a direct though intangible investment in a healthy environment for his company.

Miller specifically aimed at endowing his small industrial town with the quality of life that would attract to it the managerial and technical people on whom a big high-technology business depends.




The thesis of this essay is that in the years to come the most needed and effective approach to corporate social responsibilities will be that exemplified by William Norris and Control Data Corporation.

Only if business learns how to convert the major social challenges facing developed societies today into novel and profitable business opportunities can we hope to surmount these challenges in the future.

Government, the agency looked to in recent decades to solve these problems, cannot be depended on.

The demands on government are increasingly outrunning the resources it can realistically hope to raise through taxes.

Social needs can be solved only if their solution in itself creates new capital, profits, that can then be tapped to initiate the solution for new social needs.




Fundamental changes in technology and society have changed the nature of social needs.

Today we are very conscious of technological change.

Few people realize that what actually is changing are not technologies but the very concept of technology.

For three hundred years technology has had for its ultimate model the mechanical phenomena inside a star such as the sun.

This development reached its climax with a technology that replicates the mechanical processes inside the sun, that is, with nuclear fission and fusion.

Now the dynamics of technology are switching to what might be called an organic model, organized around information rather than around mechanical energy.




Fossil-fuel energy has been a mature, if not declining, industry since 1950, well before OPEC and the energy crisis.

In all developed countries the ratio of energy usage to gross domestic product has been falling steadily and rapidly since then.

Even in sectors that until then still showed an incremental energy growth—private automobiles; aviation, both civilian and military; and residential lighting, heating, and air conditioning—energy consumption per unit of output has been declining since well before 1973 and is almost certain to continue to do so, almost irrespective of cost.




Biological processes progress in terms of information content.

The specific energy of biological systems is information.

Mechanical systems are organized by the laws of physics; they express forces.

Biological systems obey the laws of physics, of course.

But they are not organized by forces but by information (for example, the genetic code).




As a consequence, the shift from the mechanical to the biological model calls for a shift in the resource that constitutes capital.

Before the mechanical age, animal energy, that is physical exertion, constituted capital.

Skill was of course highly prized.

But there was so little market for it that it had to be organized as a monopoly, with access strictly controlled through apprenticeship programs and guild regulations.

Skill beyond a minimum was simply not employable; there was no market for it.

And knowledge was pure luxury.




In the age of the mechanical model, in the last three hundred years, human skill increasingly became the productive resource—one of the greatest advances in human history.

This development reached its culmination in this century when mass production converted the laborer into the semiskilled worker.

But in an age in which information is becoming the organizing energy the capital resource is knowledge.




This shift in the meaning of technology that is now well underway represents a far more important change than any technological change, no matter how rapid or how spectacular, and deserves even more attention than it gets.




Demographic changes may be even more important.

Fortunately the educational explosion of the last fifty years in all developed countries coincided with the shift in technology.

In the developed countries now about half of all young people undergo formal schooling beyond secondary school, developing the human resources needed to make the new technology operational, productive, and beneficial.

In turn the new technology creates the employment opportunities for the new work force of the developed countries.

Which is chicken and which is egg no one, I daresay, could determine.




These changes create major discontinuities and problems.

First, in the developed countries there is the transition problem for a labor force trained to operate in the age of the mechanical model and left stranded in the shift to the technology of the biological model.

And the remnants of what today we would call preindustrial society—for example, those in the inner-city ghettos or Chicano immigrants fleeing the destitution of overpopulated Mexico—prepared only to use physical strength as the resource they are getting paid for, are problems in today’s developed countries.




Second, between the developed and the poorest countries there is a new and dangerous discontinuity.

Up to three hundred years ago there were no “poor countries.”

There were rich people in every country—not very many—and there were vast hordes of poor people in every country.

One hundred years later, that is, by 1700, when the new technology of the mechanical model first began to make a difference, the world began to split into rich countries and poor countries.

By 1900 average per capita income in the then-developed countries was as much as three times as high as per capita income in the developing countries.

By now the gap has widened to an unprecedented and probably unsustainable ten to one, or worse.

Today the poorest proletarian in developed countries has a higher standard of living than all but a minute minority of the rich in the poorest countries.

The class conflict of earlier times has become a north-south cleavage, if not a source of racial conflict.

There is another discrepancy between developed countries, that is, countries with a high standard of formal learning, and thus with access to the opportunities of the biological model, and countries that at best can begin to form human skill capital.

One-third of humanity, in the developed countries, is ready to exploit the opportunities of the biological model, while two-thirds, in the developing countries, are just entering the stage in which their human resources are prepared for the opportunities of the mechanical model.




Just as the technology of the mechanical model requires a skill base, which is slowly and painfully being built in some of the developing countries, so does the technology of the biological model require a broad knowledge base.

This, we now know, cannot be improvised but requires a long period of hard work and above all a capital investment far beyond the means of any but already highly developed countries.

Thus for the foreseeable future the world will remain divided into societies with the knowledge base to convert the new technology into major economic and social opportunities and those without the broad base of schooled people on which the technology of the biological model rests and with a surplus of people equipped only for the technologies of the mechanical model.




It is the conjunction of the shifts in technology and demographics that creates the social needs business will have to learn to transform into opportunities.




Developed countries are facing a situation for which there is no parallel in recent economic history.

We will have growing labor shortages and at the same time growing unemployment.

A large and growing share of the new entrants into the labor force will have sat in school too long to be available for traditional manual, blue-collar work.

By 1982 the proportion of Americans who entered the civilian labor force with only an elementary school education was down to about 3 percent.

The proportion entering with only a high school education was down to about 50 percent.

And the trend is most unlikely to be reversed.




This means that the basic employment problem of the United States and of every other developed country is to create challenging, satisfying, and well-paid jobs for people with so much schooling that they are qualified only for putting knowledge to work.

It also means that demand for capital formation in the developed countries will go up rapidly.

In particular, jobs for which capital requirements were traditionally lowest, that is in clerical and service areas, will be transformed.

Whatever the office of the future will look like, it will be capital-intensive, with capital investment per worker going from a meager $3,000 at present to something like $20,000 or $30,000 within ten years or so.

Knowledge jobs, on the average, require a multiple of the capital that manual jobs, on the average, require.

They require a high and growing investment in schooling before the individual can begin to contribute, and, increasingly, substantial investment in continuing or refresher education.

In other words they require an investment in human resources at least matching that in physical capital.




At the same time there will be redundancies of workers in traditional blue-collar employment.

In developed countries traditional blue-collar manual labor will simply not be economical.

This is in part because work based on information, whether this be called automation or data processing, will have so much greater value added per unit of effort.

Whatever processes can be automated—that is, shifted to an information base—must be automated.

Otherwise industry cannot compete, especially with the very large and abundant low-cost labor resources of the Third World.

It is almost certain that by the year 2010, that is, within twenty-five years, the proportion of the labor force in the developed countries that is engaged in traditional blue-collar work in manufacturing will be down to what it is now in our most highly scientific and most capital-intensive industry, modern agriculture.

Manufacturing blue-collar labor accounts for almost one-fifth of the labor force in all developed countries.

But the proportion employed in modern agriculture is about one out of every twenty or less.




For the transition period, the next twenty-five years, there will be highly visible and highly concentrated populations of traditional blue-collar workers who are being made redundant and now have nothing to offer except skills or, more often, semiskills.

That there will at the same time be shortages in certain places of manual, blue-collar workers, because so many entrants into the labor force will have too much education to be interested in blue-collar jobs, will not help these redundant workers.

They will not be where the shortages are and will, usually, not have the skills the available jobs demand.




The blue-collar workers who are being made redundant by the shift of manufacturing from work requiring brawn and skill to knowledge-intensive work are typically found in high-wage jobs in the mass-production industries.

For the last fifty years these groups have been among the favored groups in industrial society, the groups that have gained the most in economic and social position with the least increase in their actual capacity to perform.

They are likely to be older people; younger people move before an industry decays.

They are highly concentrated in a very small number of metropolitan areas and thus both visible and politically potent.

Eight hundred thousand automobile workers, for instance, are concentrated mostly in twenty counties in the Midwest, from Milwaukee to Dayton and Cleveland, and in only four states.

And they tend to be unionized and to act collectively rather than as individuals.




Paradoxically, the labor shortages will be as real as the redundancies.

What is needed to bring the two together?

Is it training?

Is it organized placement?

Is it moving industries in need of traditional labor into the areas where the redundancies will occur?

Above all, there is need to anticipate redundancies and to organize the systematic placement of individuals in new jobs.




Labor shortages in manufacturing and unemployment in manufacturing may coexist even within the same geographic area.

But the gap between the two will be particularly sharp between different sections of the same country, between different industries, and between different wage levels.

Unless we succeed in bridging this gap, we will be in grave danger.

Instead of promoting the new information-based industries and their employment, which fit the needs and qualifications of the young population, economic policy will focus on maintaining yesterday’s employment.

We will, in other words, be sorely tempted to follow the example of Great Britain and sacrifice tomorrow on the altar of yesterday—to no avail, of course.




«§§§»


Government cannot tackle this problem, let alone solve it.

It is a problem for the entrepreneur who sees in the available labor surplus an opportunity.

Government can provide money; the best examples are probably the retraining grants of West Germany, which now amount to 2 percent of West German GNP but, according to some German estimates (for example, those of the West German Arbeitsministerium), save as much as four times the amount in unemployment and welfare benefits.
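The cost-benefit arithmetic behind that estimate is worth restating. In the sketch below, only the 2 percent share and the fourfold multiplier come from the text; the GNP figure is an assumption made purely for illustration.

```python
# Illustrative cost-benefit arithmetic for retraining grants.
# The GNP figure is ASSUMED; the two ratios come from the text.
gnp = 800e9              # ASSUMED West German GNP, in dollars
grant_share = 0.02       # retraining grants: 2 percent of GNP
savings_multiple = 4     # grants save "as much as four times the amount"

grants = gnp * grant_share
avoided_benefits = grants * savings_multiple
net_saving = avoided_benefits - grants

print(f"Grants paid:       ${grants / 1e9:.0f} billion")
print(f"Benefits avoided:  ${avoided_benefits / 1e9:.0f} billion")
print(f"Net public saving: ${net_saving / 1e9:.0f} billion ({net_saving / gnp:.0%} of GNP)")
```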

But the actual training, to be effective, has to be focused on a specific job the individual can be assured of getting once he reaches the required skill level.

It has to be individual rather than general, and it has to be integrated with placement.

Government, we have learned in sixty years of work on “distressed industries” and “distressed areas,” going back to Lloyd George’s first post-World War I cabinet in Great Britain, cannot do either.

By its very nature government focuses on large groups rather than on this person with his or her specific skills, background, and needs.




Also, the new jobs are likely to be in small and local businesses rather than in big, national ones.

Since about 1960, unprecedented growth in the American labor force and employment has occurred.

The great majority of all new jobs (between two-thirds and three-quarters) has been created in the private sector, not in large, let alone giant, companies, but in businesses employing twenty employees or fewer.

During this period employment in the Fortune 500 companies actually declined by 5 percent.

And since 1970 the former rapid increase in government employment, federal, state, and local, has leveled off in all developed countries.




Finding workers about to become redundant, identifying their strengths, finding new jobs for them, and retraining them as needed (and often the new skills needed are social rather than technical) are tasks to be done locally and for this reason are business opportunities.

But unless redundancy is seen systematically as an opportunity, and above all by existing businesses with the knowledge and capital to act, we will suffer an ever-worsening problem that threatens the future of any developed economy and especially of the American economy.




Several other severe social problem areas, which offer business opportunities, are of particular interest.

Within every developed country, and particularly in the United States, there is the problem of the preindustrial population, which in an American context means primarily racial minorities and, above all, the blacks.

Only a minority of blacks by now have not been able to acquire the competence needed to become productive in an economy in which brawn is not adequate to provide the kind of living developed societies consider standard.

Yet few of the many attempts to educate these groups have lived up to expectations.

Part of this failure is due to the fact that training and education succeed only where there is a vision of the future.

It is the lack of vision, grounded in decades, if not centuries, of frustration, failure, and discrimination, that prevents education and training from being converted into confidence and motivation.




But we also know that these people work well if the opportunity is provided for them.

Until the job is there, there is, however, no motivation for training, no belief that it will lead to a permanent change, and a conviction that this effort too will fail.

There is thus a major task of putting human resources to work and developing their competence.

Opportunities exist in all kinds of services, if only because the supply of people willing and able to do the work will fall far below the demand, whether in hospitals, in maintenance, or in repair and services of all kinds.




One company that has turned this social problem into an opportunity is based in Denmark.

It operates in some fifty countries of the world, mostly developed ones.

It systematically finds, trains, and employs preindustrial people for maintenance of office buildings and plants, at good incomes, with a minimum of turnover, and with only one problem: It cannot find enough people to satisfy the demand.

It does not “train” people.

It employs them, making high demands on performance that then create self-respect and workmanship.

It provides career opportunities for advancement within the competence of the individual.

This company, which now has sales well in excess of a half billion dollars, started with one small crew less than twenty years ago.

The opportunities are there—but is the vision?




Then there is the unprecedented problem of the earthquake fault between the developed countries, with their large supply of highly educated people and shortages of people qualified and prepared for traditional manual work, and the Third World countries in which, in the next fifteen years, unprecedentedly large numbers of young people will reach adulthood prepared and qualified only for traditional blue-collar manual work.

These young blue-collar workers will find employment opportunities only if labor-intensive stages of production are moved to where the labor supply is, that is, to the developing countries.

Production sharing is the economic integration ahead of us.

If it cannot be developed as a successful business opportunity, we face both fast-decreasing standards of living in the developed countries, where traditional manufacturing work cannot be performed both because there is an absolute shortage of labor and because the price of manual work will become totally noncompetitive, and social catastrophe on a massive scale in the Third World.

No society, no matter what its political or social system, whether capitalist or communist, can hope to survive the strains of 40 or 50 percent unemployment among young, able-bodied people prepared for work, and willing to do work, and familiar, if only through television and radio, with the way the rich countries of the world live.




«§§§»


Why shouldn’t government do these tasks and tackle these problems?

Governments have had to concern themselves with social problems since time immemorial.

There were the reforms of the Gracchi in Republican Rome in the second century B.C. and the Poor Laws of Elizabethan England.

But as part of a systematic political theory the idea that the solution of social problems is permanently a task of government and one for which no other social institution is fitted dates back only two hundred years.

It is a child of the Enlightenment of the eighteenth century; it presupposes a modern civil service and a modern fiscal system.

It was first expressed and practiced in the most enlightened of the enlightened despotisms and, so to speak, their development lab, the Hapsburg Grand Duchy of Florence, where, between 1760 and 1790, the first countrywide hospital system, the first countrywide public health planning, and—first in Europe—a countrywide system of free compulsory schooling were established.




The nineteenth century saw the blossoming of this new idea.

From the British Factory Acts of 1844 to Bismarck’s social security legislation in the 1880s, one social problem after the other was tackled by governments—and solved triumphantly.




The twentieth century and especially the last fifty years saw this idea elevated to an article of faith, to the point where a great many people consider it practically immoral and certainly futile for a social need to be tackled any way other than by a government program, and where a substantial majority, only a few years ago, in the heady Kennedy and Johnson years, was convinced that any social problem would almost immediately yield to attack by a government program.

But the years since then have brought increasing disenchantment.

There is now no developed country, whether free enterprise or communist, in which people still expect government programs to succeed.




One reason is surely that government is doing far too many things.

By itself a social program accomplishes nothing except the expenditure of money.

To have any impact at all such a program requires above all the hard work and dedication of a small number of first-rate people.

First-rate people are always in short supply.

There may be enough for a very few social programs at any one time (though the two most successful entrepreneurs of social programs with whom I have discussed this, the late Arthur Altmeyer, the father of America’s Social Security program, and the late David Lilienthal, the builder of the Tennessee Valley Authority [TVA], both said—and independently—that in their experience there were at most enough first-rate people available at any one time in any one country to launch one major social program).

But under the Johnson administration the United States in four short years tried to launch a half dozen—in addition to fighting a major overseas war!




One might also say that government is congenitally unsuited to the time dimensions of social programs.

Government needs immediate results, especially in a democracy where every other year is an election year.

The growth curve of social programs is the hyperbola: very small, almost imperceptible results for long hard years, followed, if the program is successful, by years of exponential growth.

It took eighty years before America’s program of agricultural education and research began to revolutionize American farming and farm productivity.

It took twenty years before every American at work was covered by Social Security.

Would the American electorate have waited twenty, let alone eighty, years before seeing major results from President Johnson’s War on Poverty?

And yet we know that learning has a long lead time before it shows massive results.

Individuals, not classes, learn, and there has to be built up, one by one, a large stock of individuals who have learned, who serve as examples, as leaders, who give encouragement.




Paradoxically, government that finds it hard to start small and to be patient finds it even harder to abandon.

Every program immediately creates its own constituency, if only the people who are employed by it.

It is easy, all too easy, for modern government to give.

It is all but impossible for it to take away.

The rule for failures is therefore not to bury them but to redouble the budget and to divert to them the able people who might, if employed on more promising opportunities, produce results.




Furthermore, it is all but impossible for government to experiment.

Everything it now does has to be nationwide from the start, and everything has to be finite.

But that, in anything new, is a guarantee of failure.

It is no coincidence that practically all successful New Deal programs had been pilot tested as small-scale experiments in states and cities over the preceding twenty years—in Wisconsin, New York State, New York City, or by one of the Chicago reform administrations.

The two total New Deal failures, the National Recovery Administration (NRA) and the Works Progress Administration (WPA), were also the only genuine inventions without prior experiment at the state or local level.




Surely William Norris was right when he spoke of his company’s social business enterprises as research and development.

Long lead times, willingness to experiment, and to abandon in case of nonresults are precisely the characteristics of research and development work.

But R & D is, we now know, not done well by government, for a variety of well-studied reasons.

It is done best in autonomous institutions, whether university laboratory, individual hospital, or business laboratory, although the provider or source of the funds might well be government.




Equally important as an explanation for the inability of government to tackle successfully the kind of social problems we face is that they are hard problems.

A hard problem is one in which there are so many constituencies that it is difficult, if not impossible, to set specific goals and targets.

It is perhaps here that the social problems of the mid-twentieth century differ most fundamentally from those of the eighteenth and nineteenth centuries.

But the problems we face in the decades ahead will be even harder than those we now handle so poorly.

Each of them has powerful constituencies with radically different, indeed mutually exclusive, goals and values, which practically guarantee that government could not succeed in solving them.




“Reindustrializing America,” for instance, means to the labor union preserving traditional blue-collar jobs in traditional industries in central cities or at least slowing the shrinkage of traditional jobs.

However, if reindustrializing America means restoring the country’s capacity to increase the output of manufactured goods and to compete internationally, it unambiguously means the fastest possible automation of traditional processes and in all probability a shift to new and decentralized locations.

It means liquidating Big Steel in Pittsburgh and Chicago and shifting to minimills near customers.

The first definition is politically acceptable for a short time.

But it can lead only to failure, as the British and the Polish examples show.

But can any government program embrace the second definition?

Even the Japanese, who reportedly invest in winners and starve losers (at least according to a currently popular American myth), are finding that it cannot be done politically.

Indeed the Japanese have found that they cannot give up support of a retail distribution system that everyone in Japan knows to be obsolete and frightfully expensive but that is the only social security for a fairly small group of older people.




Nongovernmental institutions, whether businesses or institutions of the rapidly growing nonprofit third sector, can, however, direct themselves to a single objective.

They can break down hard problems into several easy problems, each capable of solution or, at least, of alleviation.

And because nongovernmental institutions can and do compete with each other, they can develop alternate approaches.

They can experiment.




The increasing inability of government to tackle effectively the social needs of contemporary society creates a major opportunity for nongovernmental institutions and especially for the most flexible and most diverse of nongovernmental institutions: business.

Increasingly, even countries organized on what are proclaimed to be socialist principles will have to “privatize.”

It will be necessary, in other words, to create conditions under which a task is outlined by government and the means to perform the task are provided for either by government (as for instance in the case of the rapidly growing private health-care insurance in Britain, which is reimbursed by the National Health Service) or by third-party payors, but in which a task is actually performed by nongovernmental institutions, especially business, locally and on a competitive basis.




A good example is the American communication system, in which increasingly the tasks exclusively done fifty years ago by the post office are now carried out by a host of agencies competing with one another and with the postal service.

Quite clearly, garbage removal, health care, and many other services will become privatized in such a way that the service itself is grounded in public policy and law (if only through tax advantages), while the performance is the task of competitive private business enterprises.




The true mixed economy of the future will consist of three parts.

There will be a private sector in which government limits itself to protection against fraud, extreme exploitation, collusion, unsafe working conditions, and deprivation of civil rights.

There will be a true public sector, for defense (excluding procurement) and justice, in which government will both specify the job and do it.

And there will be a mixed sector, the best example being the American hospital system, which is primarily a private system.

Nonprofit community hospitals, church-affiliated hospitals, and proprietary (for-profit) hospitals are increasingly organized in large and growing chains.

All then compete for patients.

Yet most of their income is public money, whether it comes directly from the government via the tax system or through compulsory private health insurance plans.

Another well-known example is defense procurement.




«§§§»


In most discussions of the social responsibility of business it is assumed that making a profit is fundamentally incompatible with social responsibility or is at least irrelevant to it.

Business is seen as the rich man who should, if only for the good of his soul, give alms to the less fortunate.




Most people who discuss social responsibility, including its opponents, would be exceedingly suspicious of any business that asserts, as does for instance William Norris, that it is the purpose of business to do well by doing good.

To those hostile to business, who believe that profit is a “rip-off,” this would appear the grossest hypocrisy.

But even to those who are pro-business and who then, as did Andrew Carnegie, demand that business, the rich man, give alms and become a philanthropist, doing good in order to do well would not be acceptable.

It would convert what is seen as virtue into self-interest.

And for those who counsel business to stick to its last and to leave social problems and issues to the proper authorities, which in fact means to government (this is where Milton Friedman stands), the self-interest of business and the public good are seen as two quite separate spheres.

But in the next decade it will become increasingly important to stress that business can discharge its social responsibilities only if it converts them into self-interest—that is, into business opportunities.




The first social responsibility of business in the next decade will be one not mentioned in the discussion of the social responsibilities of business today.

It is the increasingly important responsibility for creating the capital that alone can finance tomorrow’s jobs.

The shift from the mechanical model of technology to the organic model will require substantial increase in capital investment per worker.

The demand for capital formation will be as great as the demand was a hundred years ago when today’s modern industries emerged; and there will be equal need for a surplus to pay for the R&D needed when technology, as well as the world economy and society, is rapidly changing.




We have been in a phase in which existing technologies were extended and modified with fairly low marginal costs, as a result of which there was a fairly low need for capital formation.

Now we are well past that stage.

To be sure, old industries are still declining or are being restructured, but more important, new industries are exploding: information, communication, biochemistry, bioengineering, and genetic medicine, for example.

And with them emerge other new industries, such as the continuing education of already well-educated adults, which may well be the major growth industry of the next ten years and which increasingly is in the hands of entrepreneurs.




The early growth stages make the greatest demands on capital formation.

But what does capital formation actually mean, especially in a modern society in which the traditional incentives to personal savings have largely been eliminated?

Savings rates in all countries tend to go down with two factors: one, an increase in the proportion of the population past retirement age, who as a rule do not tend to save but who primarily consume; and two, the degree to which Social Security takes care of the risks and contingencies for which individuals traditionally save.

One example is the United States, where savings rates have gone down in direct proportion to both the aging of the population and the extension of social services to cover such risks as retirement, illness, and unemployment.

Another is Japan.

In the last ten years the savings rate in Japan has been going down steadily, although it is still high.




Furthermore we now have conclusive proof that rising income levels for wage-earning families do not materially increase the savings rate.

We know that new consumer needs, rather than investment, take over.

As a result, in a modern economy the main source of capital formation is business profits.

Indeed we now know that the term profit is a misunderstanding.

There are only costs—costs of the past and costs of the future; the costs of economic, social, and technical change; and the costs of tomorrow’s jobs.

Present revenues must cover both, and both costs are likely to go up sharply in the next twenty years.




The first social responsibility of business thus is to make enough profit to cover the costs of the future.

If this social responsibility is not met, no other social responsibility can be met.

Decaying businesses in a decaying economy are unlikely to be good neighbors, good employers, or socially responsible in any way.

When the demand for capital grows rapidly, surplus business revenues available for noneconomic purposes, especially for philanthropy, cannot possibly go up.

They are almost certain to shrink.




This argument will not satisfy those who believe that today’s businessman should become the successor to yesterday’s prince, a delusion to which businessmen unfortunately are themselves only too susceptible.

But princes were able to be benefactors because they first took it away, and, of course, mostly from the poor.




There are also those, again especially among businessmen, who feel that to convert problems into business opportunities is prosaic and not particularly romantic.

They see business as the dragon slayer and themselves as St. Georges on white chargers.




But the proper social responsibility of business is to tame the dragon—that is, to turn a social problem into economic opportunity and economic benefit, into productive capacity, into human competence, into well-paid jobs, and into wealth.

[1984]

 

line

 

Social Innovation — Management’s New Dimension

Are we overemphasizing science and technology as this century’s change agents? Social innovations—few of them owing anything to science or technology—may have had even profounder impacts on society and economy, and indeed profound impacts even on science and technology themselves. And management is increasingly becoming the agent of social innovation.

Here are five examples—five among many:

  • the research lab;
  • Eurodollar and commercial paper; 
  • mass and mass movement;
  • the farm agent; and 
  • management itself as an organized function and discipline.

The Research Lab

The research lab as we now know it dates back to 1905.

It was conceived and built for the General Electric Company in Schenectady, New York, by one of the earliest “research managers,” the German-American physicist Charles Proteus Steinmetz.

Steinmetz had two clear objectives from the start: 

  • to organize science and scientific work for purposeful technological invention and 
  • to build continuous self-renewal through innovation into that new social phenomenon—the big corporation.




Steinmetz took two of the features of his new lab from nineteenth-century predecessors.

From the German engineer, Hefner-Alteneck, he took the idea of setting up within a company a separate group of scientifically and technically trained people to devote themselves exclusively to scientific and technical work—something Hefner-Alteneck had pioneered in 1872 in the Siemens Company in Berlin five years after he had joined it as the first college-trained engineer to be hired anywhere by an industrial company.

From Thomas Alva Edison, Steinmetz took the research project: the systematic organization of research, beginning with a clear definition of the expected end result and identification of the steps in the process and of their sequence.




But Steinmetz then added three features of his own.

First, his researchers were to work in teams.

Hefner-Alteneck’s “designers”—the term researcher came much later—had worked the way scientists worked in the nineteenth-century university, each in his own lab with a helper or two who ran errands for the “boss,” looked up things for him, or, at most, carried out experiments the boss had specified.

In Steinmetz’s lab there were seniors and juniors rather than bosses and helpers.

They worked as colleagues, each making his own contribution to a common effort.

Steinmetz’s teams thus required a research director to assign researchers to projects and projects to researchers.




Second, Steinmetz brought together on the same team people of diverse skills and disciplines—engineers, physicists, mathematicians, chemists, even biologists.

This was brand-new and heretical.

Indeed, it violated the most sacred principle of nineteenth-century scientific organization: the principle of maximum specialization.

But the first Nobel Prize awarded to a scientist in industrial research was awarded in 1932 to a chemist, Irving Langmuir, who worked in Steinmetz’s electrotechnical lab.




Finally, Steinmetz’s lab radically redefined the relationship between science and technology in research.

In setting the goals of his project, Steinmetz identified the new theoretical science needed to accomplish the desired technological results and then organized the appropriate “pure” research to obtain the needed new knowledge.

Steinmetz himself was originally a theoretical physicist; a recent U.S. postage stamp honors him for his “contributions to electrical theory.”

But every one of his “contributions” was the result of research he had planned and specified as part of a project to design and to develop a new product line, for example, fractional horsepower motors.

Technology, traditional wisdom held and still widely holds, is “applied science.”

In Steinmetz’s lab science—including the purest of “pure research”—is technology-driven, that is, a means to a technological end.




Ten years after Steinmetz completed the General Electric lab, the famed Bell Labs were established on the same pattern.

A little later du Pont followed suit, and then IBM.

In developing what eventually became nylon, du Pont worked out a good deal of the pure science for polymer chemistry.

In the 1930s when IBM started to develop what eventually became the computer, it included from the beginning research in switching theory, solid-state physics, and computer logic in its engineering project.




Steinmetz’s innovation also led to the “lab without walls,” which is America’s specific, and major, contribution to very large scientific and technological programs.

The first of these, conceived and managed by President Franklin D. Roosevelt’s former law partner, Basil O’Connor, was the National Foundation for Infantile Paralysis (March of Dimes), which tackled polio in the early 1930s.

This project continued for more than twenty-five years and brought together in a planned, step-by-step effort a large number of scientists from half a dozen disciplines, in a dozen different locations across the country, each working on his own project but within a central strategy and under overall direction.

This then established the pattern for the great World War II projects: the RADAR lab, the Lincoln Laboratory and, most massive of them all, the Manhattan Project for atomic energy.

Similarly, NASA organized a “research lab without walls” when this country decided, after Sputnik, to put a man on the moon.

Steinmetz’s technology-driven science is still highly controversial, is indeed anathema to many academic scientists.

Still, it is the organization we immediately reach for whenever a new scientific problem emerges, for example, when AIDS suddenly became a major medical problem in 1984-85.

Eurodollar and Commercial Paper

In fewer than twenty years, the financial systems of the world have changed more perhaps than in the preceding two hundred.

The change agents were two social innovations: the Eurodollar and the use of commercial paper as a new form of “commercial loan.”

The first created a new world economy, dominated by the “symbol” economy of capital flows, exchange rates, and credits.

The second triggered the “financial revolution” in the United States.

It has replaced the old, and seemingly perennial, segmentation of financial institutions into insurance companies, savings banks, commercial banks, stock brokers, and so on, by “financial supermarkets,” each focused on whatever financial services the market needs rather than on specific financial products.

And this financial revolution is now spreading worldwide.




Neither the Eurodollar nor commercial paper was designed as “revolutionary.”

The Eurodollar was invented by the Soviet Union’s State Bank when General Eisenhower was elected president of the United States in 1952, in the middle of the Korean War.

The Russians feared that the new president would embargo their dollar deposits in American banks to force them to stop supporting the North Koreans.

They thus hurriedly withdrew these deposits from American banks.

Yet they wanted to keep their money in American dollars.

The solution they found was the Eurodollar: a deposit denominated in U.S. currency but kept in a bank outside the United States.

And this then created, within twenty years, a new supranational money and capital market.

It is outside and beyond the control of national central banks, and indeed totally unregulated.

Yet in its totality—and there are now Euroyen and Euro-Swiss-francs and Euromark in addition to Eurodollars—it is larger in both deposits and turnover than the deposits and turnover of the banking and credit systems of all major trading nations taken together.

Indeed, without this innovation on the part of the overseas executives of the Soviet State Bank—every one undoubtedly a good Communist—capitalism might not have survived.

It made possible the enormous expansion of world trade, which has been the engine of economic growth and prosperity in the developed free enterprise countries these last thirty years.




At about the same time, perhaps a little later, two American financial institutions—one a brokerage house, Goldman Sachs; the other a finance company, General Electric Credit Corporation (founded to provide credit to the buyers of General Electric’s electrical machinery)—hit on the idea of using an old but totally obscure financial instrument, the “commercial paper,” as a new form of commercial loan, that is, as a substitute for bank credit.

Neither institution is allowed under American financial regulations to make commercial loans—only banks are.

But commercial paper, essentially simply a promise to pay a certain amount at a certain date, is not considered a loan in American law but a security, and this, in turn, banks are not permitted to issue.

Economically, however, there is not the slightest difference between the two—something which nobody had, however, seen earlier.

By exploiting this legal technicality these two firms, and dozens of others following them in short order, managed to outflank the seemingly impregnable lending monopoly of the commercial banks, especially as credit based on commercial paper can be provided at substantially lower interest rates than banks can lend money against customers’ deposits.

The banks at first dismissed commercial paper as a mere gimmick.

But within fifteen years it had abolished most, if not all, of the demarcation lines and barriers between all kinds of credits and investments in the American economy to the point that, today, practically every financial institution and every financial instrument competes with every other institution and every other financial instrument.
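For readers unfamiliar with the instrument: commercial paper is conventionally quoted on a bank-discount basis, so a promise to pay a fixed amount at a fixed date translates directly into an effective borrowing rate. The short sketch below illustrates that standard arithmetic; all of the numbers are assumptions for the example, not figures from the text.

```python
# Pricing commercial paper on the conventional bank-discount basis.
# All numbers are ASSUMED for illustration.
face_value = 1_000_000    # the promise: pay this amount at maturity
days_to_maturity = 90
discount_rate = 0.08      # quoted annualized discount rate (ASSUMED)

# The buyer pays less than face value; the difference is the lender's return.
price = face_value * (1 - discount_rate * days_to_maturity / 360)

# Effective annualized yield to the holder (bond-equivalent basis).
bond_equivalent_yield = (face_value - price) / price * 365 / days_to_maturity

print(f"Issuer receives today: ${price:,.0f}")
print(f"Holder's effective annual yield: {bond_equivalent_yield:.2%}")
```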




For almost two hundred years economists have considered the financial and credit system to be the central core of an economy, and its most important feature.

In every country it is hedged in by laws, rules, and regulations, all designed to preserve the system and to prevent any changes in it.

And nowhere was the financial system more highly structured and more carefully guarded than in the United States.

Commercial paper—little but a change in a term and almost insignificant as an innovation—has broken through all safeguards of law, regulation, and custom and has subverted the American financial system.

We still rank the country’s banks.

But although New York’s Citibank is surely the country’s largest bank—and altogether the country’s largest “financial institution”—the “number-two bank” is probably not a bank at all, but General Electric Credit Corporation.

And Walter Wriston, the long-time chairman of Citibank, points out that Citibank’s biggest competitor in banking and finance is not a financial institution at all but Sears, Roebuck, the country’s largest department store chain, which now gives more consumer credit than any credit institution.

Mass and Mass Movement

A third social innovation of this century is the mass and the mass movement.

The mass is a collective.

It has a behavior of its own and an identity of its own.

It is not irrational; on the contrary, it is highly predictable.

But its dynamics are what in an individual we would call “the subconscious.”




The essence of the mass movement is concentration.

Its individual “molecules,” the individuals who comprise it, are what a chemist calls highly organized and highly charged.

They all point in the same direction, all carry the same charge.

In the nuclear physicist’s terms, the mass is a critical mass, that is, the smallest fraction big enough to alter the nature and behavior of the whole.




The mass was first invented—for it was an invention and not just a discovery—in the closing years of the nineteenth century when, exploiting the then brand-new literacy, two Americans, Joseph Pulitzer and William Randolph Hearst, created the first mass medium, the cheap, mass-circulation newspaper.

Until then a newspaper was meant to be “written by gentlemen for gentlemen,” as the masthead of one of the London newspapers proudly proclaimed for many years.

Pulitzer’s and Hearst’s “yellow press” by contrast was sneered at as “being written by pimps for guttersnipes.”

But it created a mass readership and a mass following.




These two men and their newspapers then created and led the first modern political mass movement, the campaign to force the United States into the Spanish-American War of 1898.

The tactics that these two men developed have since become standard for all mass movements.

They did not even try to win over the majority as earlier American movements—the abolitionists or the Free Soilers, for instance—had done.

On the contrary, they tried to organize a minority of true believers: they probably never attracted more than 10 percent of the electorate.

But they organized this following as a disciplined “shock troop” around the single cause of fighting Spain.

They urged their readers in every issue to clip out and mail a postcard to their congressman demanding that America declare war on Spain.

And they made a candidate’s willingness to commit himself on the war issue the sole criterion for endorsing or opposing him regardless of his position on any other issue.

Thus, their small minority had the “swing vote” and could claim control of the political future of the candidates.

In the end it imposed its will on the great majority, even though almost every opinion maker in the country was opposed.

A mass movement is powerful precisely because the majority has a diversity of interests all over the lot and is thus lukewarm in regard to all of them and zealous in respect to none.

The single cause gives the mass movement its discipline and its willingness to follow a leader.

It thus makes it stand out and appear much bigger than it really is.

It enables a single cause to dominate the news and, indeed, largely to determine what is news.

And because it makes its support of parties and candidates totally dependent on their willingness or unwillingness to commit themselves to the single cause, it may cast the deciding vote.




The first to apply what Pulitzer and Hearst had invented to a permanent “crusade” was the temperance movement.

For almost a century such temperance groups as the Anti-Saloon League and the Women’s Christian Temperance Union had struggled and campaigned without much success.

Around 1900 their support was probably at its lowest level since the Civil War.

And then they adopted the new tactics of the mass movement.

The Women’s Christian Temperance Union even hired several of Pulitzer’s and Hearst’s editors.

The “true believers” in Prohibition never numbered more than 5 or 10 percent of the electorate.

But in less than twenty years they had Prohibition written into the Constitution.




Since then, single causes—the environment, automobile safety, nuclear disarmament, gay rights, the Moral Majority—have become commonplace.

But we are only now beginning to realize how profoundly the single-cause mass movement has changed the politics of all democratic countries.




And outside of the United States the social innovation of the mass has had even greater impacts.

The great tyrannies of this century—Lenin’s and Stalin’s Bolsheviks, Mussolini’s Fascism, Hitler’s Nazism, even Maoism—are all applications of the mass movement, the highly disciplined single-cause minority of true believers, to the ultimate political goal of gaining and holding power.




Surely no discovery or invention of this century has had greater impact than the social innovation of mass and mass movement.

Yet none is less understood.




Indeed, in respect to the mass we are today pretty much where we were in respect to the psychodynamics of the individual a hundred years ago.

Of course we knew of the “passions.”

But they were something one could only explain away as part of “animal nature.”

They lay outside the rational, that is, outside prediction, analysis, and understanding.

All one could do was to suppress them.

And then, beginning a hundred years ago, Freud showed that the passions have their reasons, that indeed, in Pascal’s famous phrase, “the heart has its reasons of which Reason knows nothing.”

Freud showed that the subconscious is as strictly rational as the conscious, that it has its own logic and its own mechanisms.

And although not all psychologists today—indeed, not even the majority—accept the specific causative factors of Freudian psychoanalysis, all accept Freud’s psychodynamics of the individual.




But so far we still lack a Sigmund Freud of the mass.

The Farm Agent

The single, most important economic event of this century is surely the almost exponential rise in farm production and farm productivity worldwide (except, of course, in the Soviet Union).

It was brought about primarily through a social innovation of the early years of this century: the farm agent.




Karl Marx, a hundred years ago, had good reason for his dismissal of the “peasant” as hopelessly ignorant and hopelessly unproductive.

Indeed, practically every nineteenth-century observer shared Marx’s contempt.

By 1880, serious, systematic scientific work on agricultural methods and agricultural technology had been going on for two hundred years.

Even the systematic training of farmers and agronomists in an “agricultural university” had been started one hundred years earlier.

Yet only a very small number of large landowners were paying any attention.

The vast majority of farmers—practically all American farmers, for instance—did not, in 1880, farm any differently, farm any better, or produce any more than their ancestors had done for centuries.

And twenty years later, around 1900, things still had not changed.




And then, suddenly, around the time of World War I—maybe a little later—things changed drastically.

The change began in the United States.

By now it has spread everywhere; indeed, the surge in farm production and farm productivity has become most pronounced in Third World countries such as India.




What happened was not that farmers suddenly changed their spots.

What happened was a social innovation that put the new agricultural knowledge within farmers’ reach.

Julius Rosenwald, the chief executive of a mail-order company, Sears, Roebuck, himself a Chicago clothing merchant and the purest of “city slickers,” invented the farm agent (and for ten years paid farm agents out of his own pocket until the U.S. government took over the Farm Extension Service).

He did not do this out of philanthropy alone, but primarily to create purchasing power among his company’s customers, the American farmers.

The farm agent provided what had hitherto been lacking: a conduit from the steadily increasing pool of agricultural knowledge and information to the practitioners on the farm.

And within a few short years the “ignorant, reactionary, tradition-steeped peasant” of Marx’s time had become the “farm technologist” of the “scientific revolution on the farm.”

Management

My last example of a social innovation is management.

“Managers,” of course, have been around a long time.

The term itself is, however, of twentieth-century coinage.

And it is only in this century, and largely within the last fifty years, that management has emerged as a generic function of society, as a distinct kind of work, and as a discipline.

A century ago most major tasks, including the economic task we call business, were done mainly in and by the family or by family-run enterprises such as the artisan’s small workshop.

By now all of them have become organized in institutions: government agency and university, hospital, business enterprise, Red Cross, labor union, and so on.

And all of them have to be managed.

Management is thus the specific function of today’s “society of organizations.”

It is the specific practice that converts a mob into an effective, purposeful, and productive group.




Management and organization have become global rather than Western or capitalist.

The Japanese introduced management as an organized discipline in the early 1950s, and it became the foundation for their spectacular economic and social successes.

Management is a very hot topic in the Soviet Union.

And one of the first moves of the Chinese after the retreat from Maoism was to set up an Enterprise Management Agency in the prime minister’s office and import an American-style management school.




The essence of modern organization is to make individual strengths and knowledge productive and to make individual weaknesses irrelevant.

In traditional organizations—the ones that built the pyramids and the Gothic cathedrals, or in the armies of the eighteenth and nineteenth centuries—everybody did exactly the same unskilled jobs in which brute strength was the main contribution.

Such knowledge as existed was concentrated at the top and in very few heads.




In modern organizations everybody has specialized and fairly advanced knowledge and skill.

In the modern organization there are the metallurgist and the Red Cross disaster specialist, the trainer and the tool designer, the fund raiser and the physical therapist, the budget analyst and the computer programmer, all doing their work, all contributing their knowledge, but all working toward a joint end.

The little each knows matters; the infinity each doesn’t know, does not.




The two cultures today may not be those of the humanist and the scientist as C. P. Snow, the English physicist turned novelist, proclaimed thirty years ago.

They may be the cultures of what might be called the literati and the managers: The one sees reality as ideas and symbols; the other sees reality as performance and people.




Management and organization are still quite primitive.

As is common in a rapidly evolving discipline—as was true, for instance, in medicine until fairly recently—the gap between the leading practitioners and the great majority is enormously wide and is closing but slowly.

Far too few, even of the most accomplished of today’s managers in all organizations, realize that management is defined by responsibility and not by power.

Far too few fight the debilitating disease of bureaucracy: the belief that big budgets and a huge staff are accomplishments rather than incompetence.




Still, the impact has been enormous.

Management and its emergence have, for instance, rendered equally obsolete both social theories that dominated the nineteenth century and its political rhetoric: 

  • the Jeffersonian creed that sees society moving toward a crystalline structure of independent small owners—the yeoman on his forty acres, the artisan in his workshop, the shopkeeper who owns his own store, the independent professional—and 
  • the Marxist theorem of a society inexorably turning into an amorphous gas of equally impoverished, equally disenfranchised proletarians.

Instead, organization has created an employee society.

In the employee society, blue-collar workers are a steadily shrinking minority.

Knowledge workers are the new and growing majority—both the main cost and the main resource of all developed societies.

And although knowledge workers are employees, they are not proletarians but, through their pension funds, the only capitalists and, collectively, the owners of the means of production.




It is management which in large measure accounts for this century’s most extraordinary social phenomenon: the educational explosion.

The more highly schooled people are, the more dependent they then become on organizations.

Practically all people with schooling beyond high school, in all developed countries—in the United States the figure is 90 percent plus—will spend all their working lives as employees of managed organizations and could not make their living without them.

Neither could their teachers.

Conclusion

If this were a history of social innovation in the twentieth century, I would have to cite and to discuss scores of additional examples.

But this is not the purpose of this essay.

The purpose is not even to show the importance of social innovation.

It is, above all, to show that social innovation in the twentieth century has largely become the task of the manager.




This was not always the case; on the contrary, it is quite new.




The act that, so to speak, ushered in the nineteenth century was an innovation: the American Constitution.

Constitutions were, of course, nothing new; they go back to ancient Greece.

But the American Constitution was different: It was the first expressly to provide a process for its own change.

Every earlier constitution had been presumed unchangeable and an “eternal verity.”

And then the Americans created in the Supreme Court a mechanism to adapt the Constitution to new conditions and demands.

These two innovations explain why the American Constitution has survived where all earlier ones perished after a short life of total frustration.




A hundred years later, Prince Bismarck in Germany created, without any precedent, the social innovations we now call Social Security—health insurance, old-age pensions, and workmen’s compensation insurance against industrial accidents, which were followed a little later by unemployment compensation.

Bismarck knew exactly what he was trying to do: defuse a “class war” that threatened to tear asunder the very fabric of society.

And he succeeded.

Within fifteen years, socialism in Western and Northern Europe had ceased to be “revolutionary” and had become “revisionist.”




Outside of the West, the nineteenth century produced the tremendous social innovations of the Meiji Restoration in Japan, which enabled the least Western and most isolated of all contemporary countries both to become a thoroughly “modern” state and nation and to maintain its social and cultural identity.




The nineteenth century was thus a period of very great social innovation.

But, with only a few exceptions, social innovations in the nineteenth century were made by governments.

Invention, that is, technical discovery, the nineteenth century left to the private sector.

Social innovation was governmental and a political act.




Somehow, in this century, government seems to have lost its ability to do effective social innovation.

It could still do it in America’s New Deal in the 1930s, although the only New Deal innovations that worked were things that had been designed and thoroughly tested earlier, often before World War I, in the large-scale “pilot experiments” conducted by individual states such as Wisconsin, New York, and California.

Since then, however, we have had very few governmental innovations in any developed country that have produced the results for which they were designed—very few indeed that have produced any results except massive costs.




Instead, social innovation has been taken over by the private, nongovernmental sector.

From being a political act it has become a “managerial task.”

We still have little methodology for it, though we now do possess a “discipline of innovation.”

Few social innovators in management yet know what they intend to accomplish the way the American Founding Fathers, Bismarck, and the statesmen of Meiji did—though the example of Rosenwald’s farm agent would indicate that it can be done.

Still, social innovation has clearly become management’s new dimension.

[1986]

 

line

 

The following ↓ is a condensed strategic brainscape that can be explored and modified to fit a user’s needs

 

The concepts and links below ↓ are …

major foundations ↓ for future directed decisionS

aimed at navigating

a world constantly moving toward unimagined futureS

history-of-the-world-in-two-hours-03-pict-600

YouTube: The History of the World in Two Hours
— beginning with the industrial revolution ↑ ↓

Management and the World’s Work

↑ In less than 150 years, management ↑ has transformed
the social and economic fabric of the world’s developed countries …

 

“Your thinking, choices, decisions are determined by
what you have seen” — edb

radar_limited-pict-no-reflect-400

Take responsibility for yourself and
don’t depend on any one organization ↑ ↓ (bread-crumb trailS below)

We can only work on the thingS on our mental radar at a point in time

About time The future that has already happened

radar-differences-pict-400

The economic and social health of our world
depends on
our capacity to navigate unimagined futureS
(and not be prisoners of the past)

 

The assumption that tomorrow is going to be
an extrapolation of yesterday sabotages the future — an
organization’s, a community’s and a nation’s future.

The assumption ↑ sabotages future generations — your children’s,
your grandchildren’s and your great grandchildren’s — in
spite of what the politicians say …

The vast majority of organization and political power structures
are engaged in this ↑ futile mind-set
while rationalizing the evidence

 

The future is unpredictable and that means
it ain’t going to be like today
(which was designed & produced yesterday)

 

The capacity to navigate is governed by what’s between our ears ↑ ↓

 

line

 

When we are involved in doing something ↑

it is extremely difficult to navigate

and very easy to become a prisoner of the past.

 

We need to maintain a pre-thought ↓

systematic approach to work and work approach

Click on either side of the image below to see a larger view

Harvest to action

Harvesting and implementing Work

based on reality

the non-linearity of time and events

and the unpredictability of the future

with its unimagined natureS. ↓ ↑

 

(It’s just a matter of time before we can’t get to the future
from where we are presently)

Foundations and opportunities ::: larger view

foundations-and-opportunities-2016-pict-400

Intelligence and behavior ↑ ↓ ← Niccolò Machiavelli ↑ ↓

Political ecologists believe that the traditional disciplines define fairly narrow and limited tools rather than meaningful and self-contained areas of knowledge, action, and events … continue

❡ ❡ ❡

Foundational ↑ Books → The Lessons of History — unfolding realities (The New Pluralism → in Landmarks of Tomorrow ::: in Frontiers of Management ::: How Can Government Function? ::: the need for a political and social theory ::: toward a theory of organizations then un-centralizing plus victims of success) ::: The Essential Drucker — your horizons? ::: Textbook of Wisdom — conceptual vision and imagination tools ::: The Daily Drucker — conceptual breadth ::: Management Cases (Revised Edition) see chapter titles for examples of “named” situations …

foundational-books-cropped-pict-600

What do these ideas, concepts, horizons mean for me? continue

 

picture-technology-pict-no-reflect-400

Society of Organizations

“Corporations once built to last like pyramids
are now more like tents.

Tomorrow they’re gone or in turmoil.”

sound-players-pict-600

“The failure to understand the nature, function, and
purpose of business enterprise” Chapter 9, Management Revised Edition

“The customer never buys ↑ what you think you sell.
And you don’t know it.

That’s why it’s so difficult to differentiate yourself.” Druckerism

 

“People in any organization are always attached to the obsolete
the things that should have worked but did not,
the things that once were productive and no longer are.” Druckerism

 

What Everybody Knows Is Frequently Wrong ::: If You Keep Doing What Worked in the Past You’re Going to Fail ::: Approach Problems with Your Ignorance — Not Your Experience ::: Develop Expertise Outside Your Field to Be an Effective Manager ::: Outstanding Performance Is Inconsistent with Fear of Failure ::: You Must Know Your People to Lead Them ::: People Have No Limits, Even After Failure ::: Base Your Strategy on the #Situation, Not on a Formula — A Class With Drucker: The Lost Lessons of the World's Greatest Management Teacher

 

Why Peter Drucker Distrusted Facts (HBR blog) and here

 

Best people working on the wrong things continue

 

Conditions for survival

 

Going outside

 

Making the future — a chance for survival

 

“For what should America’s new owners, the pension funds,
hold corporate management accountable?” and
“Rather, they maximize the wealth-producing capacity of the enterprise”
Search for the quotes above here

 

Successful careerS are not planned ↑ here and

 

What do these issues, these challenges mean for me & … — an alternative

 

Exploration paths → The memo they don’t want you to see ::: Peter Drucker — top of the food chain ::: Work life foundations (links to Managing Oneself) ::: A century of social transformation ::: Post-capitalist executive interview ::: Allocating your life ::: What executives should remember ::: What makes an effective executive? ::: Innovation ::: Patriotism is not enough → citizenship is needed ::: Drucker’s “Time” and “Toward tomorrowS” books ::: Concepts (a WIP) ::: Site map a.k.a. brainscape, thoughtscape, timescape

 

Just reading ↑ is not enough; harvesting and action thinking are needed. continue

Information ↑ is not enough, thinking ↓ is needed: first then next + critical thinking

thinking-principles-taskcard-400

Larger view of thinking principles ↑ Text version ↑ :::
Always be constructive ::: What additional thinking is needed?

 

Initially and absolutely needed: the willingness and capacity to
regularly look outside of current mental involvements continue

bread-crumb trail end

 

line

 

Peter Drucker: Conceptual Resources

The Über Mentor

A political / social ecologist
a different way of seeing and thinking about
the big picture
— led to his top-of-the-food-chain reputation

drucker business week

about Management (a shock to the system)

 

“I am not a ‘theoretician’; through my consulting practice I am in daily touch with the concrete opportunities and problems of a fairly large number of institutions, foremost among them businesses but also hospitals, government agencies and public-service institutions such as museums and universities.

And I am working with such institutions on several continents: North America, including Canada and Mexico; Latin America; Europe; Japan and South East Asia.” — PFD

 

line

 

List of his books

 

Large combined outline of Drucker’s books — useful for topic searching.

 

line

 

“High tech is living in the nineteenth century,
the pre-management world.
They believe that people pay for technology.
They have a romance with technology.
But people don’t pay for technology:
they pay for what they get out of technology.” —
The Frontiers of Management

TLN Keywords: tlnkwdruckerbook

 

“The greatest danger in times of turbulence is not turbulence;

it is to act with yesterday’s logic.” — Peter Drucker

 

 

The shift from manual workers
who do as they are being told
either by the task or by the boss —

TO knowledge workers
who have to manage themselves

profoundly challenges social structure

 

“Managing Oneself (PDF) is a REVOLUTION in human affairs.” …

“It also requires an almost 180-degree change in the knowledge workers’ thoughts and actions from what most of us—even of the younger generation—still take for granted as the way to think and the way to act.” …

… “Managing Oneself is based on the very opposite realities:
Workers are likely to outlive organizations (and therefore, employers can’t be depended on for designing your life),

and the knowledge worker has mobility.” ← in a context

 

 

“More than anything else,

the individual
has to take more responsibility
for himself or herself,
rather than depend on the company.”
continue

 

“Making a living is no longer enough.
‘Work’ has to make a life.” continue

finding and selecting the pieces of the puzzle

 

The Second Curve

 

line

 

These pages are attention directing tools for navigating a world moving relentlessly toward unimagined futures.

 

evidence-wall-and-time-line-pict-600

What’s the next effective action on the road ahead?

 

stages-simple-horizons-pict-t

 

It’s up to you to figure out what to harvest and calendarize,
working something out in time (1915, 1940, 1970 … 2040 … the outer limit of your concern). Nobody is going to do it for you.

It may be a step forward to actively reject something (rather than just passively ignoring it) and then to work out a plan for coping with what you’ve rejected.

Your future is between your ears and our future is between our collective ears — it can’t be otherwise.

A site exploration: The memo THEY don't want you to see

 

Google

To create a rlaexp.com site search, go to Google’s site ↓

Type the following in their search box ↓

your search text site:rlaexp.com
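
For readers who would rather run this as a repeatable one-liner than retype it, here is a minimal sketch in Python (assuming Python 3 and a configured default browser; the search_rlaexp name and the example query are illustrative only, not part of the site):

import urllib.parse
import webbrowser

def search_rlaexp(query: str) -> None:
    # Append Google's site: operator exactly as described above,
    # then URL-encode the combined text for use in a search URL.
    q = urllib.parse.quote_plus(query + " site:rlaexp.com")
    webbrowser.open("https://www.google.com/search?q=" + q)

search_rlaexp("managing oneself")  # opens the site-restricted search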

intelligence-instructions

 

What needs doing?

 

contact

 



Copyright 1985 through 2022 © All rights reserved | bobembry bobembryusa bobembry.usa | bob embry robert embry | “time life navigation” © #TimeLifeNavigation | “life TIME investment system” © #LifeTimeInvestmentSystem | “career evolution” © #CareerEvolution | “life design” © #LifeDesign | “organization evolution” © #OrganizationEvolution | “brainroads toward tomorrows” © #BrainroadsTowardTomorrows | “foundations for future directed decisions” © #FoundationsForFutureDirectedDecisions | #rlaexpdotcom © | rlaexpdotcom ©

#rlaexp.com = rla + exp = real life adventures + exploration or explored

exploration leads to explored

Examples ↑ can be found through web searches, Wikipedia,
Pinterest and the daily news

 

As an Amazon Associate I earn from qualifying purchases