When Did the Innovation Shortfall Start?

I’m responding to the posts by Arnold Kling and Bryan Caplan critiquing Tyler’s The Great Stagnation. Let me just throw out some thoughts, from the perspective of someone who thinks that The Great Stagnation is a terrific book.

1. I agree wholeheartedly with Tyler that the current crisis is a supply-side rather than a demand-side problem. That explains why the economy has responded relatively weakly to demand-side intervention.

2. From my perspective, the innovation slowdown started in 1998 or 2000, rather than 1973 (sorry, Tyler). The slowdown was mainly concentrated in the biosciences, reflected in statistics like a slowdown in new drug approvals, slow or no improvement in death rates for many age groups (see my post here), and low or negative productivity in healthcare (see David Cutler on this and my post here). This is a chart I ran in January 2010 (the 2007 death rate has been revised up a bit since then): it shows a steady decline in the death rate for Americans aged 45-54 until the late 1990s. (A quick sketch of the arithmetic behind this kind of age-specific death rate appears just after this list.)

The innovation slowdown was also reflected in the slow job growth in innovative industries and the sharp decline in real wages for young college graduates (see my post here). (Young college grads, because they have no investment in legacy sectors, inevitably flock to the dynamic and innovative industries in the economy. If their real wages are falling, it’s because the innovative industries are few and far between.)

3. The apparent productivity gains over the past ten years have been in large part a statistical fluke, caused by the inability of our statistical system to cope with globalization: the lack of any direct price comparisons between imported and comparable domestic goods and services, systematic biases in the import price statistics (see Houseman et al here, for example), and no tracking of knowledge capital flows. I’ve got several posts coming on this soon. (A toy numerical example of the import-price problem also follows this list.)

4. I agree with Tyler that regulation of innovation is a big problem. That’s why I’ve suggested a new process, a Regulatory Improvement Commission, for reforming selected regulations.

5. I’m of the view that we may be close to another wave of innovation, centered in the biosciences, that will drive growth and job creation over the medium run. If we want growth and rising living standards, we need to avoid piling on well-meaning regulations that drive up the cost of innovation.
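
On point 2: the chart tracks an age-specific death rate, that is, deaths per 100,000 people in the age group in a given year. Here is a minimal sketch of that arithmetic; the counts are deliberately round, made-up figures rather than actual CDC data.

```python
# Minimal sketch of the statistic behind the chart in point 2: an
# age-specific death rate, expressed as deaths per 100,000 population.
# The counts below are hypothetical round numbers, not actual CDC data.

def death_rate_per_100k(deaths: int, population: int) -> float:
    """Deaths per 100,000 people in the group during the year."""
    return deaths / population * 100_000

deaths_45_54 = 180_000          # hypothetical annual deaths, ages 45-54
population_45_54 = 43_000_000   # hypothetical mid-year population, ages 45-54

rate = death_rate_per_100k(deaths_45_54, population_45_54)
print(f"Death rate, ages 45-54: {rate:.1f} per 100,000")  # -> 418.6 per 100,000
```

The argument in the post is about the trend in this number: it fell steadily until the late 1990s and then largely stopped improving.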
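
On point 3: here is a toy numerical illustration (all figures invented) of the offshoring bias that Houseman et al. describe. When production shifts to a cheaper foreign supplier and the import price index never registers the price drop, deflated input costs look too small, so real value added, and with it measured labor productivity, is overstated.

```python
# Toy illustration (invented numbers) of the import-price bias in point 3.
# If the import price index misses the price drop from switching to a cheaper
# foreign supplier, deflated input costs are understated and real value added
# per worker -- measured productivity -- shows a phantom gain.

def real_value_added(nominal_output: float, nominal_inputs: float,
                     input_deflator: float) -> float:
    """Real value added: output minus deflated input costs (output prices
    are assumed unchanged here, so output itself needs no deflation)."""
    return nominal_output - nominal_inputs / input_deflator

nominal_output = 100.0   # value of output, the same in both years
domestic_parts = 60.0    # year 1: parts bought domestically
imported_parts = 40.0    # year 2: identical parts imported more cheaply

# A correct input deflator would register the one-third price decline...
true_deflator = imported_parts / domestic_parts   # about 0.67
# ...but if the supplier switch is never sampled, the index shows no change.
measured_deflator = 1.0

true_va = real_value_added(nominal_output, imported_parts, true_deflator)          # 40
measured_va = real_value_added(nominal_output, imported_parts, measured_deflator)  # 60

print(f"Real value added, correctly deflated: {true_va:.0f}")
print(f"Real value added, as measured:        {measured_va:.0f}")
print(f"Phantom productivity gain:            {measured_va / true_va - 1:.0%}")
```

Nothing about the physical production or the workforce changes between the two years, yet measured output per worker jumps by half.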

Comments

  1. The death trends should certainly be a lagging phenomenon. When “innovation slows”, or whatever other life-expectancy-enhancing mechanism gives out, people will not start dropping dead the next day. There should be at least a multi-year, if not multi-decade, lag. So the causes are not to be sought in the late 1990s.

    Also, from a demographic perspective, the peak of the boomers was around 1955, so the people born in the boomer peak were in the 45-55 age bracket between 2000 and 2010.

    • Mike Mandel says:

      Good point, but I’m looking at death rates and not life expectancy. The death rate should react much more quickly to the advent of a new treatment for cancer, say.

      • Well, but certainly most people’s death (unless deliberate or by accident) is a consequence of their life history, and particularly their health (and healthcare) history. There are probably not many diseases or health-related conditions that lead to rapid death and where a “miracle cure” will prolong life significantly. In many cases people get sick, are put on some treatment or maintenance regime, and die eventually, either from their illness or from “old age”. What kind of treatment people can get depends on circumstances, not least their financial wherewithal and what kind of healthcare they have access to. With managed care and other healthcare rationing, many people probably don’t get the best care possible. How much of a difference that makes in the end is anybody’s guess.

      • cm is right – health innovations aren’t necessarily end-of-life interventions.

        How about healthcare innovations that have a negligible effect on life expectancy? I’m thinking of an alternative to injecting insulin for diabetics, which may improve quality of life but not necessarily duration of life (especially not in the short term), or a new method of hernia repair.

      • Mike Mandel says:

        I’m going to draw an analogy to crime statistics. Burglaries, assaults, and rapes are serious crimes, but they are subject to reporting biases for various reasons (e.g., people living in more dangerous neighborhoods may be less likely to report crimes because of distrust of the police, and the police themselves may have incentives to misreport). The murder rate, though, is pretty reliable because you have a dead body.

        Same thing for death rates versus other measures of healthcare innovation. The death rate is countable, while other measures of quality of life are tough to measure and suffer from reporting biases.

      • I think we are getting closer to something, though probably not the subject of the post. Our technocratic society (and frame of mind) tends to overattribute significance to “measurable” things to the point that only things that can be “objectively” or at least “definitely” measured are considered significant. And death is certainly something very definite and binary.

        On a related line of thought, do you have the trend for infant mortality and does it show a similar pattern? Many advances in medicine, as in all other areas of life, have been made by observing key statistics and tracking progress by said statistics. As the declared goal was to improve the statistic, it is not surprising that the respective statistic would be on a downtrend as long as its improvement is a primary policy goal.

        To put on my cynical hat, maybe the state of public health is considered good enough, like the state of road repair, and our management doesn’t invest in continued improvement anymore as there are other priorities. This would also explain the “innovation shortfall”. Perhaps recent history (2+ decades) can provide some clues.

  2. If you really want to say it didn’t occur earlier, you need earlier data. The rate may have worsened then before becoming even worse in the last decade.

  3. I suspect a biotech boom will forever be just over the horizon, but then that’s what people probably think of people like me who’ve been saying for the last 8 years that a real internet boom is on the horizon. ;) The difference is that internet tech is a much larger and growing market today than biotech, so there’s certainly more evidence of its value. What do knowledge capital flows have to do with productivity measurement? Your argument is presumably that we will face stiffer competition down the road because of that knowledge capital export, but that has little to do with current productivity measurement. Sounds like you just lumped one of your hobby horses in. ;)

  4. Does this show up in cause of death? Is this medical innovation or more modern cars, safer workplaces, better food safety, better imaging diagnostics? Is medical practice even important?

  5. I try to keep up with biotech. The current predictions sound eerily similar to the predictions made when I was studying biotech in the 1980s.

    I think you mistake more research for better research. Not much progress has been made on understanding/solving the truly hard problems in biotech in the last 30 years. A lot of the “major advances” are trivial intellectually.

    There seem to be few incentives in research to take on truly hard problems because the risk of failure is so high. Instead scientists study safe problems where results are pretty much guaranteed. They then hype the results.

    It’s sad.

  6. To NormD: I don’t think the problem is that people aren’t trying to solve the big problems; it’s that we have only figured out how to solve them in mice, dogs, etc., or in cell cultures. New cancer cures based on mouse models appear frequently and are never heard from again because humans are more complicated than mice. In vivo is always harder than in vitro.

  7. The dot-con era unleashed unprecedented malinvestment in startups with no plans for profits. The dot-cons were given market valuations based on how much more money they could raise amid the gold-rush mentality.

    A side effect is that tons of money was poured into seed that went bad. A decade later, the crop looks pretty cruddy.

    It will take time for the newer seeds to produce new crops. Until then, there seems to be a famine of useful R&D output.

  8. Incidentally, that slowdown in the rise of life expectancy is an American thing. In Australia, life expectancy has risen at a steady three months per year since 1900.

    • Mike Mandel says:

      Actually, my read of this chart suggests that Australia shows a pattern similar to the U.S.: relatively little improvement in death rates for middle-aged men and women. (I use death rates rather than life expectancy because the death rate is something happening right now, while life expectancy is a projection into the future.)

  9. mike shupp says:

    1. Beginning about 1970, Richard Nixon steadily cut back on federal funding for physical science research and beefed up spending on medical (“the War on Cancer”) and biological research. A LOT of biology that most scientists had expected to emerge as a field for study during the 21st century actually got worked on during the 20th century. (Around 1980, of course, AIDS became a problem — and nobody since has seen reason to criticize Nixon’s judgement call.)

    2. In 1971, Congress passed something called the Mansfield Amendment, which prohibited DOD funding of science and technology without immediate and clear military significance.

    3. Up to this point, about 3/4 of R&D spending in the USA was paid for (and directed) by the US government, with industry and private concerns making up the rest. After the early 1970s, federal spending dropped to about 1/3 of all R&D, with industry taking up the slack. Overall, R&D spending didn’t drop significantly, but there’s general agreement that industry spending has been focused on near-term development rather than long-term research; it’s been good for profits, but blue-sky projects have languished.

    4. Several consequential actions upset the commercial R&D world in the mid-1970s. One was IBM’s development of the System/360 computer architecture, which pretty well obsoleted computer mainframe manufacturing at Burroughs, Univac, NCR, Control Data, Honeywell, and no doubt others — helped along by the rise of Data General, Digital Equipment Corporation, and other “mini-computer” firms. (RCA dropped $650 million in one year on its computer operations about this time — setting a record for dropping income.) Another was the consent decree that severed UNIX from AT&T and eventually led to the demise of AT&T’s Bell Labs.

    5. There was a major contraction in aerospace R&D spending about this time. Cutbacks at NASA were pretty severe, but there was also much less funding for general and military aerospace as the Vietnam War wound down. There were several Nixon Administration efforts aimed at persuading aviation firms to spend more of their own funds on independent R&D — my recollection is that the results were dismal. Fewer planes for operations, fewer prototypes… there was a lot of consolidation in the airplane-building racket during the 1970s. Failures in the commercial aircraft business sank Douglas Aircraft, for example, and almost took down Lockheed and McDonnell.

    6. Need I mention that after Three Mile Island, the commercial nuclear business in the USA basically went down the tubes?

    ——-

    I’ll skip elaboration, and not try to bring things up to the present day. The point is, the American research and development establishment was altered considerably during the early to mid 1970s. On the plus side, biomedical research received a hearty push from the federal government. On the negative side, the chemical and physical sciences and engineering that had driven commercial product development for most of the 20th century tapered off or were drastically curtailed — presumably with some impact. The “innovation shortfall” began in the 1970s.

    • What you describe is a shortfall, if not in funding, then certainly in focus on anything outside of what can be charitably described as “applied research”.

      Aside from that, your presentation stops before the computer revolution even took off, and before the large advances in semiconductors and software that enabled exponentially progressing advances in manufacturing technology and logistics management. (And I’m not using “exponentially” as hyperbole here.)

      I would rather argue there was no innovation shortfall at all. What people with a financial/business focus probably mean is that the promised high growth rates of financial returns fell short.

      At any point in time there have been more promising, or at least plausible, innovations than could or would be commercially realized. There is credible evidence of innovations, or their commercial success, being nipped in the bud because they threatened established businesses, or established lines of business within the same company. In the latter category there were cases of R&D results being kept in the proverbial drawer internally because the current state-of-the-art technology wasn’t played out yet and was destined to be milked for a bit more, instead of being “needlessly” obsoleted by the new designs. Probably more than one innovator had their career potential stunted by it.

      • CM —

        The original question here was whether some sort of “innovation shortfall” began in the 1970s, as Tyler Cowen would have it, or in the 1990s, as Mr. Mandel views things. I don’t have the time, the patience, or the access to sources to bat out a full history of late 20th-century technological development in America; I do think I can point to a sea change in the early 1970s which supports Cowen’s opinion. Progress in computer hardware and software, the growth of USENET and the World Wide Web, cultural innovations such as open-source software development and social networking, and so on are indeed interesting topics, but they aren’t germane to the initial issue.

        As for your later point about innovation slowdowns due to “milking” R&D, my gut feeling is that this isn’t so — it’s dangerous for a firm to rely on continued progress in obsolescent technology when rivals may be leaping forward with newer approaches. A specific example: the Zilog Z80 CPU, a mainstay of CP/M microcomputers in the early-to-mid 1980s, had the ability to address multiple 64-kilobyte memory spaces — a significant improvement over the capability of Intel’s 8080, but generally slighted because manufacturers wanted to maintain 8080 compatibility. It’s quite conceivable that in some alternative world the good folks at Digital Research would have developed a CP/M version 3.0 which took advantage of this sort of bank switching and led us on into a wondrous world of ever more miraculous 8-bit computing. Alas, alack, IBM was seduced by 16-bit technology and Intel’s bastard 8088 chip (a 16-bit processor hobbled by an 8-bit bus), and to make things worse ran into a slick salesman named Gates, and the computing world has been in steady decline ever since.

        I could probably do a song-and-dance routine about Microsoft’s abandonment of OS/2 and IBM’s perseverance, and compose lyric operas about the maturation of time-sharing operating systems in the 1960s and the emergence of Linux from UNIX in the past 20 years. One could probably write lengthy textbooks about the hardware and software improvements embedded in modern hard drives. One could discuss at great length the provisions for PCI graphics cards, AGP graphics, and integrated graphics capabilities in contemporary computer motherboards. And so on and so on. (What I’m saying is that I don’t see real prolonged stagnation in computer hardware, so much as an attempt to maintain commonality for a wide range of users.)

        The flip side of this, from my perspective, is that the market has brought forth extraordinary developments in computer hardware and software in recent years, driving a great quantity of often-ignored social change. Courting couples now “meet” upon the internet, the federal government listens in to our long-distance calls with supercomputers, mathematicians finally prove some century-old theorems, real live boys and girls will disport themselves sexually for our viewing pleasure on a 24/7 basis at the drop of a credit card… Oh, how wonderful! It’s titillating, and it’s profitable. Who could ask for anything more?

        But neither government nor industry is pushing chemistry along at the rate it took in the late 19th century. People have been talking about the promise of nanotechnology for almost forty years now — and we’re still stuck at discussing promises. Space programs, manned or unmanned, move at glacial rates. Ditto for particle physics. Superconductors haven’t advanced much in thirty years. For all our modern mathematical sophistication, it’s difficult to pull significance out of masses of climate data — and totally impossible for those who believe they see significance to convince millions of skeptics. And so on.

        What’s going wrong? Cowen thinks we’ve done the easy part of developing science and that the stuff I’ve just pointed to happens to be “hard.” He’s wrong, I think: it’s not that this kind of technology is “hard”; it’s just that we lost interest in it. It’s easier to make money by devising clever games in banking, and we’ve set up our society to facilitate banking games. High tech ain’t us anymore, any more than Zoroastrianism or Mithraism. We’re edging out of the science-engineering mindset. Maybe the coming clever people in China and India will do that kind of thing at the end of the century; maybe clever people in Brazil and Indonesia will want to play with technology in the 22nd century.

        Oh well. Thanks for your interest.

      • Thanks for responding. But I think your conclusion is not so different from what I pointed out – a shift of focus from researching and developing new paradigms to financial engineering.

