Global interest rates and soft power

Since the GFC (Global Financial Crisis), there have been serious changes in how the American economy functions. Estimates of two key macroeconomic variables – the neutral interest rate and the natural unemployment rate – have been consistently lower since 2008. Recently, Richard Clarida reported that the decline in the neutral real interest rate is a global phenomenon. The fall in the neutral real rate means that the corresponding federal funds rate is lower than was anticipated after the GFC: about 2.5% in nominal terms, consistent with the 2% inflation target. That puts a cap on Fed rates even if the Fed carries its tightening cycle to the end rather than stopping after two or three hikes. The natural unemployment rate consistent with the 2% inflation target is also lower: more jobs and lower interest rates with the same inflation. In any case, that was the situation before the pandemic, and it calls for a long-term view: real interest rates have been trending downward for centuries.

Figure: Global real interest rate (GDP-weighted), trend decline, 1317–2018.


Given the severity of post-pandemic scars and all the structural change and supply-chain restructuring they entail – not to mention the Russian invasion of Ukraine, which will have serious long-term economic consequences – is there a possibility that global rates will remain subdued for another five to ten years after 2023? Will they return to their new, lower neutral levels after the Fed's hiking cycle ends? That is what happened in the seven years after Lehman. Another argument purports to explain the persistence of low interest rates in the aftermath of a deep financial crisis that drags down the real economy. It is an old idea, but it was revived by many economists in the context of climate change after the Stern Review of 2007: what is the correct discount rate for evaluating a long-term project? A paper by Martin Weitzman elevated this issue to a more general context and related it to asset prices, the stock-bond disconnect, and so on. It reads roughly as follows: in the long run, risk perception – captured by the discount rate – lies somewhere between the risk-free rate and the overall market rate. If the market rate is represented by average real equity returns over a sufficiently long period, and if the short-run bond rate is a sufficient statistic for the risk-free rate, U.S. data would yield 7% for the general rate and just 1% for the risk-free rate. The difference is huge: over 150 years, discounting at 1% rather than 7% per annum yields a present discounted value about 8,000 times higher. This elaborates the mathematical-cum-philosophical argument put forward by the brilliant Cambridge mathematician and logician Frank Ramsey in 1928. Governments shied away from the Stern Review's consequences for 14 years because it implied that a large sum, 3% of global GDP, should be spent on keeping climate change in check; they opted for relatively modest spending instead.
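The arithmetic behind that "huge difference" is easy to verify. A minimal sketch (my own illustration, using continuous discounting; the exact factor depends on the compounding convention):

```python
import math

def present_value(cash_flow, rate, years):
    """Present value of a single future cash flow under continuous discounting."""
    return cash_flow * math.exp(-rate * years)

years = 150
pv_low = present_value(1.0, 0.01, years)   # discounted at the risk-free rate, 1%
pv_high = present_value(1.0, 0.07, years)  # discounted at the equity return, 7%

ratio = pv_low / pv_high  # equals e^((0.07 - 0.01) * 150) = e^9
print(round(ratio))       # about 8100 -- the "8,000 times" in the text
```

With annual rather than continuous compounding the factor comes out somewhat lower (around 5,800), so the headline number is sensitive to the convention, but the order of magnitude is the same.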


That kind of logic is best applied to monumental events such as climate change or large long-horizon public projects. However, this deceptively simple idea can also deliver an insight: after tail-risk events such as Lehman and COVID-19, if the market does not believe the event will recur, risk perception may decline so steeply that the risk-free rate suddenly seems an adequate discount rate for the whole of market risk. The lower bound might come to stand in for all risk, and stock prices might climb along with other asset prices while bond rates remain very low for an extended period. This is not the same as the secular stagnation hypothesis, which related such persistence to a chronic deficiency of aggregate demand, to supply-side factors such as demographics, or to rising inequality in the wealth and income distribution; this argument connects the persistence of low rates directly to the perception of risk. Whether the weak but ever-latent presence of yet another tail risk was masked over the last decade by successive rounds of quantitative easing is another question. Yet the argument suggests a path whereby Fed rates rise in the immediate aftermath of the pandemic, say in 2022, then return to lower levels after the "scars" fade and stay low through the long years of adaptation to the brave new world. It is true, after all, that low rates provide a seedbed for innovation, as they did in the decades preceding the first Industrial Revolution. I tend to think a new industrial revolution is in the making, given current levels of education, job-market change, work ethic, working modes, and the sectoral reallocation of investment. Maybe lower neutral real interest rates and lower natural unemployment rates aren't that surprising. The world isn't where it was in 2008.


In the mid-1990s, there were two established opinions on the nature of industrial revolutions. Early conjectures claimed that industrial revolutions, especially the Industrial Revolution in England, had accomplished deep and all-encompassing structural change. This strand of thought was advocated by such great names as Marx, Toynbee, Ashton, and Landes. There was, however, a second opinion, supported by a newer generation of economic historians such as Crafts and Harley, that transformed the classical approach and radically altered the concept of an industrial revolution. Peter Temin tested both approaches. Ashton describes an age of innovation sweeping across agriculture, coal, iron, textiles, and finance: "A wave of gadgets swept over England." He holds that innovation was not confined to iron and textiles but was widespread. Yet even Ashton admits that the revolution, though broader-based than the usual account, did not spread across all sectors. In his classic 1969 book, David Landes elaborates the view that mechanized manufacturing was of the essence: while other sectors showed rather slow technological change, tool-making built on manufacturing-driven innovation ensured that all other sectors benefited from its advances. At the turn of the century, Joel Mokyr revived this all-encompassing, multi-focused thesis of technological innovation.

Consider some firm-based models of growth, where growth is innovation-led and microeconomic in nature. There is the innovation-based growth model exemplified by Paul Romer, which asserts that total productivity is a function of the spectrum of products. In this version, a firm with monopolistic tendencies – a rather large one – innovates by expanding the variety of its products without necessarily enhancing their quality; the setup builds on the Dixit-Stiglitz-Ethier model of monopolistic competition. Here innovation, or industrial revolution, rests on the industrial organization of the market from which firm behavior derives. There is also Schumpeterian growth. The spectrum of products can be normalized (to one, for instance), but each intermediate input admits a productivity parameter A(i). There is no proliferation of products; instead, each sector is monopolized and produces at constant marginal cost, facing a demand for the intermediate input i it supplies. Here the quality of each product is enhanced, but not the number or variety of products. This is quite the opposite of the Romer model, in which all firms act as if they were R&D firms – SMEs included – even when they hold monopoly power emanating from instant technological innovation, patents, and more. There is a built-in tendency to monopolize: if a firm cannot raise the quality of its product, it exits the market. Schumpeter called this creative destruction.
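The contrast between the two models can be written schematically, in standard textbook notation (my own summary, not from the original):

```latex
% Romer: output rises with the number of varieties N (quality fixed)
Y = L^{1-\alpha}\int_{0}^{N} x(i)^{\alpha}\,di
% Schumpeterian: varieties fixed (normalized to one), quality A(i) rises
Y = L^{1-\alpha}\int_{0}^{1} A(i)^{1-\alpha}\,x(i)^{\alpha}\,di
```

In the first, growth comes from raising N, the measure of products; in the second, from raising the quality parameters A(i) through creative destruction.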


There is also an update to Schumpeter advanced by Jones in 2014. According to Jones' theory, if firm revenue grows exponentially over exponentially distributed spells – so that its logarithm grows linearly until a competitor arrives – the level of revenue follows a Pareto distribution. This simply means that if the innovator/entrepreneur/pioneer firm invests and sufficient time passes before a competitor appears, the pioneer reaps all the rent accruing from innovation, and its revenue skyrockets. Schumpeterian creative destruction stops this process, or shortens the time to first entry, so that pioneer firms cannot accumulate that much revenue. Did neoliberalism create that kind of effect? Did the dispersion of innovation returns increase after 1980? Did the rate of creative destruction increase as well? In all these models the emphasis is on innovation, whether confined to a single product or extended to the whole spectrum of products. Creating EBITDA is not the main focus; the focus is to drastically change the productive capacity of society.
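The Pareto result can be illustrated by simulation. A minimal sketch (my own illustration, with arbitrary parameter values): revenue grows exponentially at rate g until creative destruction arrives at Poisson rate delta, so run lengths are exponential and revenue levels come out Pareto with tail exponent delta/g.

```python
import math
import random

random.seed(0)

g = 0.06       # growth rate of a pioneer firm's revenue
delta = 0.12   # arrival rate of creative destruction (competitor entry)
n = 100_000

# Each firm's revenue is exp(g * T), where the run length T until a
# competitor appears is exponentially distributed with mean 1/delta.
revenues = [math.exp(g * random.expovariate(delta)) for _ in range(n)]

# The levels are then Pareto: P(R > r) = r^(-delta/g), here r^(-2).
share_above_2 = sum(r > 2 for r in revenues) / n
print(share_above_2)  # close to 2**-2 = 0.25
```

A lower delta (weaker creative destruction) fattens the tail, which is the mechanism behind the question about post-1980 dynamics: longer uncontested runs let pioneer revenues pile up at the top.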

Perhaps the global economy (the U.S., Europe, and China, and maybe India) will deliver what science promises, leading to an unprecedented technological revolution. Or perhaps such a view is an illusion. Two corollaries follow from that crossroads. First, if technological innovation is to be sustained, global neutral real (equilibrium) interest rates should stay low for at least another decade. Second, if it is to happen at all, it will require soft power, not war. Maybe the 'real' 21st century is beginning now. In which direction will we go: wars, climbing military expenditures, and hard power; or peace, technological innovation, and soft power?
