Systems Theory of Macroeconomics
Working Paper
by
November 26, 2001, First Draft
Abstract
In spite of elaborate descriptive and correlational studies, the most ubiquitous phenomenon in economics, namely inflation, has remained unexplained in terms of its mathematical origins. Keynes attempted to relate inflation to a mechanism of "sticky wages and prices". Hitherto, such theories of inflation have remained disputable and unproven. Recently, during the so-called "New Economy" era, characterized by the spread of electronic transactions and Internet commerce, inflation eluded many central banks in a different way, this time by displaying a paradoxically slow rate, 2.5–3.5% since 1994, despite rapid expansion of the U.S. economy. The Systems Theory of Macroeconomics (STM) points to an erroneous construction in Say's Identity and Walras's Law, as well as in the derived concepts associated with the equilibrium theories that have been fundamental to all schools of thought in economics. Subsequently, STM relates inflation to a state function, namely entropy, and derives it as an irreversible process. The part of STM that equates inflation with cost pressure stemming from entropy constitutes the Special Theory. The General Theory attempts to quantify and unify the effects of fiscal, monetary, and social policies through corrections made to the Quantity Theory, modeling the short- and long-term effects of these policies singly or combined, in effect serving to predict the transmission dynamics of monetary policy more accurately.
Keywords
Systems theory, systems approach, policy, complex system, Information Theory, graph, topology, mathematical logic, predicate calculus, semantics, entropy, inflation, input-output, Say's Equality, Walrasian equilibrium
Part One
-Basis-
Milestones: Barter, bullion and coins, paper money, e-transactions
The earliest coins, dating the end of the barter era and the beginning of money economies, were introduced by the Lydians. Following the Lydians, the Phoenicians spread the use of money as they sailed through the Mediterranean ports. While the Roman Empire made extensive use of copper coins, it was only after William the Conqueror and the establishment of the European monarchies that a full transition to money was accomplished, replacing the corn and wheat previously used as media of exchange. The next advancement was Gutenberg’s movable-type press, which facilitated the wider use of bills of exchange and eventually paper money. For a long period of time silver and gold remained the reference basis for all transactions. After the Great Depression the United States moved away from the gold standard, but it was not until 1973 that the gold standard was totally abandoned and the Federal Reserve Board’s policy actions came to constitute the value of money. The departure from gold-backed money also marks the beginning of the computational era in economics. To a certain extent, the birth of information economies was facilitated by reference-free monetary systems that depended mainly on the quantity of money, as well as by the emerging computer and communications technologies.
The ubiquitous, stubborn and elusive nature of inflation throughout ages
The price revolution that reigned between 1500 and 1650 in England was followed by a period of stable prices, which coincided with a rapid spread of bills of exchange and bankers' notes. These bills and notes, as prototypes of paper money, were instrumental in increasing the quantity of money in the form of what is referred to in current terms as M1. Around the 1670s, it would still take three months for a bill issued in Scotland to clear in London. (By comparison, at present a credit or debit card transaction for an item bought on the Internet anywhere around the globe clears in no more than 24 hours. The dynamics of the increased speed of transaction is by all measures both an asset and a challenge for policy administrators and central banks.) As the prototypes of paper money followed Gutenberg’s printing revolution, the first breakthrough in economic theory, John Locke’s quantity theory, was also born in England. Although, paradoxically, quantity was increasing when inflation had bottomed after 1649.
Earlier, confronted by budgetary shortfalls, Henry VIII had secretly reduced the silver content of the coins. The automatic reflex of the economic system was a general rise of prices. While the ethical and egotistical shortfalls of Henry VIII were no matter of debate, in terms of causality we have no indisputable evidence that inflation during that era was really the result of the monarch's manipulation of the silver content during coinage. There was unequivocally an association between the high inflation rate and the reduced silver in coins: they happened at or around the same time. Monarchs and governments had frequently run out of cash, and whenever given a chance they preferred debasement over debt financing. Nevertheless, in spite of excellent records, the cause of the great inflation in England between 1515 and 1650 remains mysterious and debatable, as evidently cost pressure was already present at about the same time Henry VIII began debasing coins. Among the possible factors explaining the price revolution, the following are frequently listed (3):
Yet, there has been no conclusive evidence pointing to a causal relationship between the rising prices of commodities and any of the factors listed. A causal relationship requires at least a temporal order, meaning the cause precedes the effect, and we fail to find such an order. For this reason, a relationship between the rising prices of commodities and the above factors cannot be established beyond associations and correlations.(7)
Thus, England went through a constant state of economic unrest between 1515 and 1649, partly because the rate of inflation was not homogeneous among classes and sectors. While all of the people were afflicted, they were not equally affected. It is a bit ironic that John Locke would formulate and publish his quantity theory not during the inflationary era but four decades later, when stable prices were finally observed. As bills of exchange and bankers' notes became widely used, the volume of money kept increasing. Yet, paradoxically, during this era prices were not increasing noticeably.
The confounding cycle of relationships between output level, cost pressure and demand for money remains a fertile field of statistical analysis.(9) Nevertheless, such statistics of an associative nature have revealed conflicting data between various studies, as well as poor if any indication of causality. Whether the chicken or the egg comes first is still the target of intensive statistics, and the three cardinal indicators of economic activity, output, costs and money demand, yield no evidence as to which comes first. Reinforced models involving "rigidity", "elasticity" and "stickiness" have not solved the riddle either. The old joke that "economists are scholars who spend their time investigating whether an idea that works in practice also works in theory" (2) simply reflects the elusive nature of this ubiquitous and stubborn phenomenon we have come to accept, at some level of rising prices, as the unchanging law of the land.
Another era of "great inflation" in history occurred during the fall of the Roman Empire. While that era also coincided with debasement of Roman coins, evidence is lacking to establish whether debasement was a reaction to inflation or a cause of it. Some of the factors listed by historians to account for the inflationary era during the fall of the Roman Empire are as follows:
While historians have extensively documented that some or all of the above did happen, a causal relationship in the sense of a clear sequential line-up of events does not exist. We do not know for sure whether debasement happened as a result of a creeping inflation or was its underlying cause. (Not surprisingly, two of the greatest physicists of all time were employed to assess and monitor the value of money during the most famous of inflationary eras: Archimedes was hired to check the purity of bullion, and Isaac Newton was appointed warden of the mint after the stormy era following the price revolution, the run on the Tower, and the policies of Charles I and II.)
Silver-based coins in England were eventually replaced by gold-backed coins.(3) Later, in the U.S., the gold standard was also abolished; inflation nevertheless remained the biggest concern of all central banks. Eventually the term ‘debasement’ was replaced by the softer term ‘devaluation’. As for the rate of inflation, the best score in recent history is 2–4% annually in the United States as well as in the euro-zone countries. Yet even at such low rates prices double roughly every 18 to 35 years, and there are no guarantees that the rate of inflation will be confined as successfully in the future.
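The doubling-time arithmetic follows from compound growth: at a constant annual rate r, prices double after ln 2 / ln(1 + r) years. A minimal sketch, for illustration only:

```python
import math

def doubling_time(rate):
    """Years until prices double at a constant annual inflation rate:
    solve (1 + rate)**t == 2 for t."""
    return math.log(2) / math.log(1 + rate)

print(f"{doubling_time(0.02):.1f}")  # ~35 years at 2% annual inflation
```

The same function shows how quickly the doubling horizon shortens as the rate creeps up, which is one reason even "low" inflation remains a standing policy concern.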
During the last two decades, a culture of fiscal discipline has gradually emerged both in the United States and in the European Union. Budget deficits are a serious concern to modern macroeconomists chiefly when the resulting debt is not financed. Modern practice in the case of a budgetary shortfall is for the Federal Reserve to issue notes and treasuries to finance deficit spending. Such practice is not believed to cause direct inflation, at least in the short term. It is when the government issues currency instead of financing the debt that there is concern about a cause of inflation. In other words, issuing currency, as many developing countries have secretly attempted, has become the modern mode of currency debasement. Hyperinflation in Germany after the First World War, and throughout developing countries during the seventies and eighties, was predominantly the result of this sort of debasement, essentially the "photocopying" of worthless paper money in government mints by unaccountable administrations. Such practice is greatly discouraged by the I.M.F. wherever it has power.
Accentuated inflationary eras have been prevalent across the globe and cultures since the advent of money. Each case study would be very provocative and the least to say about social implications and political ramifications of inflation is that inflation has influenced and shaped governments along with political systems and people, dramatically and quite often insidiously.
Policy-making and Decision Criteria of Central Banks
The strictly defined aim of this paper is to suggest a thesis outlining a clear but hitherto undeclared mechanism of inflation, and in particular a ‘systems theory’ unifying and quantifying various concepts in macroeconomics, at instances borrowing concepts from physics and mathematical logic through abstractions in the style of systems science. As such, a critique of the Federal Reserve’s past policies is not part of that aim. Any reference to recent events in the bank’s policy is to point to the elusiveness of concepts about inflation. Such references only aim to indicate the lack of a theoretical framework that would have made it easier to contemplate the events preceding the collapse of the great expansion experienced between 1994 and 2001. Central banks work according to known and widely accepted principles. It was not realistic to expect the Federal Reserve Board, or any other central bank for that matter, to subscribe to novel concepts before they are institutionally verified and uniformly accepted, at least as a school of economic thought. Hopes of a less interventionist discretionary policy by the bank did not prove viable even in the light of the era of unprecedented technological advancement observed during the last decade. Likewise, the Federal Open Market Committee worked according to settled principles and decided that the over-speeding U.S. economy had to be slowed down. If all had gone well, the slowing would have been regarded as a soft landing, but the landing was not soft. To sum up, the Federal Reserve ostensibly decided that the economy was over-speeding. Speculative reasons as to why over-speeding became the agenda then may be summarized as follows:
We can also point out some additional likely concerns. Mainly, implicit challenges were developing against the Federal Reserve's position as the sole authority to issue currency and administer monetary policy. These challenges stemmed from the booming stock market. Reportedly, the Federal Reserve Board did not target the stock market, but the chairman had expressed complaints about "excessive" gains in his famous 1996 speech regarding "irrational exuberance". The board’s concern, as it appeared, was directed more at the wealth effect of the stock market as a source of a potential demand shock. It must be stressed that the notion of a challenge to the authority of money issuance is purely speculative, though one worth considering.
One such challenge to the authority of the Federal Reserve came from companies such as Microsoft that had begun paying their employees’ salaries with their stock. It is a known practice to offer stock options to employees, but what was new was replacing the national currency with company stock for monthly pay and salaries. When widespread, such practice amounts to increasing liquidity without the approval of a central bank and therefore may constitute an inadvertent challenge to the primary authority of money issuance. If seen that way, no central bank would tolerate it.
Yet there were other sources of pressure, one from overseas companies that were not happy about cash flowing into U.S. stock markets and away from their local businesses and factories. Neither the overseas institutions’ complaints regarding cash inflows into the U.S. out of their countries nor the government securities traders can be said to have influenced the Federal Reserve to aggressively burst the "bubble". However, they may have contributed to the public opinion that the "bubble" had to be burst. As a result, more and more editorials appeared asking not whether but when the "bubble" would burst, somewhat reminiscent of the South Sea Bubble of 1720.(3)
John Locke, a founding father of the English constitutional tradition as well as of macroeconomics, formulated an epoch-making theory of prices in 1691. Oddly, both during the forty years preceding his work and later during the eighteenth century, money supply, in terms of both M1 and M2, increased to a great extent, paradoxically without significant inflation. On the other hand, between 1515 and 1650, accompanying a relative shortage of liquidity, there were catastrophic levels of price increases. STM holds that debasement of currency by the monarchs was apparently not the cause but one of the results of inflation, as well as of unchecked powers. Like everyone else, monarchs could not cope with cost pressures and rising prices, so they secretly debased and virtually stole from the precious-metal constituents of coins, in Britain and elsewhere.
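The paradox of a rising quantity of money with flat prices can be framed with the equation of exchange, MV = PY, which underlies quantity theory in its modern form: if velocity V falls, or output Y grows, while M rises, the price level P need not move. A sketch with purely hypothetical numbers:

```python
def price_level(M, V, Y):
    """Equation of exchange, M*V = P*Y, solved for the price level P."""
    return M * V / Y

# Illustrative (hypothetical) numbers: money supply doubles while
# velocity halves -- the price level is unchanged, echoing the
# post-1649 paradox of rising quantity without rising prices.
p0 = price_level(M=100, V=4.0, Y=400)
p1 = price_level(M=200, V=2.0, Y=400)
print(p0, p1)  # both 1.0
```

The numbers are not historical data; the block only shows why quantity alone does not determine prices in this framework.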
In modern times, a gap between aggregate demand and supply is considered a major boost for price increases in the form of inflation. Between 1998 and 2001, a highly interventionist Federal Reserve Board policy was in effect. During the Asian crisis the FRB lowered interest rates and kept them low until measures of debt and commercial paper seemed to accelerate rapidly. At that moment, intending to slow down the economy or perhaps to burst the "bubble", the board raised interest rates in an almost unforeseen sequence of hikes. Possibly realizing the landing was not as soft as desired, the Federal Reserve Board then lowered interest rates in another unforeseen sequence of continued rate cuts. Finally, following the attack on the Twin Towers, the board further reduced the federal funds rate and the prime rate to levels not seen in forty years.
Although the media and the public like to personify Federal Reserve policies in the interesting and revered persona of the chairman, Alan Greenspan, in reality the decisions of the central bank are a committee action. During the last three episodes of rate changes since 1998, votes cast at the FOMC have been near unanimous, and outside the board they have been accompanied by the concurrence of many, even if not all, academics and financial institutions. In all three cases, during the Asian crisis, during the burst of the "bubble" and currently during attempts to avert a recession, the board and a majority of institutions viewed the FRB actions as unavoidable and necessary. Although Congress questioned the board's actions increasingly aggressively at the biannual meetings with the chairman, a large portion of the community of economists regarded the actions of the last three years as a "no-brainer". Thus, dissent seemed confined to a small minority opinion that was not clamorous. On the other hand, political resentment from the White House towards board actions has always been around since the inception of the FRB as an independent policy-maker. Most economists do not regard it as worthy of mention, because it is believed that administrations would always vote for a rate cut if they could vote on the board.
Common points in these events separated by epochs can be recapitulated as follows:
Many economists may prefer deflation to inflation any time: deflation may be temporary, while inflation is considered a chronic and stubborn ailment. Such paradoxes, deflation arising when inflation is expected and vice versa, can be explained in modern macroeconomics as discrepancies between the short- and long-term effects of monetary policies. Since the fifties, numerous research articles have been devoted to the fine tracing and analysis of the short- and long-term effects of monetary policy. Although inconclusive, we can cite such works as prototype attempts of a "systems approach" in macroeconomics. Systems behavior displaying capacitance, inductance, alternating current, hysteresis and oscillatory changes has occasionally been modeled and applied to forecasting macroeconomic behavior, in order to explain the complex nature of the aggregate parameters presiding over economics. Nevertheless, the complexity and transdisciplinary nature of macroeconomic realities have rendered even such attempts less than successful in dealing with the complexities involved, and they have thus failed to accurately predict the behavior of economic systems.
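The short- versus long-term distinction can be caricatured, under purely illustrative assumptions not drawn from this paper, as a first-order lagged adjustment: each period a variable closes only a fraction of the gap to its long-run level, so the immediate and eventual responses to the same policy step differ.

```python
def policy_response(shock, lam, steps):
    """First-order lag: each period the variable closes a fraction
    lam of the remaining gap to its long-run level `shock`."""
    x, path = 0.0, []
    for _ in range(steps):
        x += lam * (shock - x)
        path.append(x)
    return path

# Hypothetical adjustment speed lam=0.3: the short run delivers only
# 30% of the effect; the long run converges toward the full shock.
path = policy_response(shock=1.0, lam=0.3, steps=20)
print(round(path[0], 2), round(path[-1], 3))
```

The sketch only illustrates why a single policy action can look weak in the short term yet fully transmitted in the long term, the kind of discrepancy the paragraph above describes.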
In the broadest of terms, central banks and policy-makers have one single objective: to provide stability in the marketplace by achieving equilibrium. For classical, neoclassical, monetarist, Keynesian and New Keynesian schools alike, that means one thing: to control and eliminate inflation.
"We needn’t wait patiently for a utopian future to unfold at its own leisurely pace, because there are things we can do immediately that will speed things up enormously; or things that we could do, if only we could effectively control inflation independently of other economic goals." (4)
This can represent the Keynesian attitude, but other schools are no less vigilant or proactive in recognizing inflation as the main instigator of economic chaos. The other common denominator, the unemployment rate, immediately follows inflation as a concern to all schools of thought: whether as a Keynesian factor or a Phillips-curve concern, it ranks right after inflation as a policy priority.
Further elaboration of policy-making and evaluation brings into attention short-term, medium-range and long-term effects, and how the course of equilibrium unfolds over the time domain. Unemployment arises as an immediate concern to economists not because of a mysterious cause such as has surrounded the concept of inflation, but because unemployment levels are closely associated with indicators of social peace such as crime rates, strikes and lockouts. Consequently, the use of monetary and fiscal policies to combat inflation, and therefore unemployment, has been the battleground of the major schools of thought. As for the monetarist school, its standing as the most reluctant to intervene is burdened by a paradox that can explain why the monetarist viewpoint is revered but never followed. One cannot agree more that the least intervention directed at a very complex phenomenon made of business cycles, rising prices, effects of new technology and changing demographics is the best policy. The monetarist proposition is to forecast how much the economy will grow in the long term and to increase liquidity at a fixed pace based on that long-term growth prospect. If the forecasts about growth prospects fail to match reality, so much for that fixed pace: logically, monetarists would always end up resorting to discretionary "corrections" of the "fixed pace". It should be noted that the monetarist viewpoint emerged in the late fifties, concurrently as the systems approach was developing. There are hints that monetarists have, somewhat vaguely, been influenced by the culture of systems concepts in dealing with fluctuations and decision analysis, in a way more akin to the lines of systems thinking than the other schools of economic thought.
During the last decades the Federal Reserve Board gave its highest priority to inflation confinement and price stability. It may be claimed that the board overdid it; nevertheless the chairman admitted at a congressional testimony that they would rather err on the safe side. This summarizes the overwhelming importance modern central banks attribute to confining inflation over fighting recession. There are no valid mechanisms offered by the major economic schools, yet agreement is universal: inflation is the number one concern of a policy-maker, regardless of the rate at which it may be encountered.
STM considers deflation one of the paradoxical reactions to rising cost pressure during an early phase of the accumulated effects of hidden inflation. According to STM, barring significant structural changes and changes in trade patterns, the actual and pure inflation rate is fairly fixed and resilient to the effects of the major factors listed by current economic schools. The implications and ramifications of this viewpoint are overwhelming for the historical development of macroeconomics and therefore for central bank policies. STM warns that monetary as well as fiscal and social policies will not reduce or confine inflation but rather will postpone or displace it, with rebound effects recurring stronger than initially encountered. STM offers several openings for inflation control: isobaric expansion, reminiscent of the thermodynamic concept of isothermal expansion, is one of them; input-output restructuring with process synchronization and coupling is another among many, all of which depend on the utilization of information technology methods one way or another. As the trends are indisputably towards the proliferation of information economies, even if STM were to offer no extra remedies for preventing economic instability, its role in avoiding wrong policies stemming from erroneous concepts about price rises and business cycles is one way to look at the doctrinal aspects of STM.
From Gutenberg to Internet, the New Economy, emerging complexities and how to keep it simple
The ability to hammer coins marked the onset of money economies, and the subsequent advancements in trade changed the course of cultural history. The age of paper money did not take off immediately with Gutenberg’s movable printing type; paper currency was not secured by law until 1704.(3) The age of bullion and coins had gone unchallenged for three thousand years until then. The age of paper currency, now about three hundred years old, faces a powerful challenge from electronic currency in the form of electronic and Internet transactions. The age of paper money that spanned three centuries can be characterized by Locke’s influential quantity theory. As each age is associated with a central theory of macroeconomics and understanding of trade and money, the new age of the Internet seems to have its own ground-breaking rules and economic theory. The characteristics of this era are global range, transaction speed, volatility and the associated complexity of trade and industrial relationships. The systems approach, which began in the forties and was marked by Bertalanffy’s influential General Systems Theory, is a transdisciplinary doctrine that can be used to probe the future of economics as a powerful modeling and abstraction tool in the realm of complex systems.
Throughout this working paper, the term ‘complex’ may indicate some level of combinatorial complexity that poses a computational burden due to an abundance of conditional probabilities in defining a particular state of a system under observation. Nevertheless, the term complex will ultimately be used to denote more than a combinatorial or stochastic level of computational burden. The systems theory of macroeconomics emphasizes semantics over stochastic complexity in determining the state of a system. A complex system in this sense is one that has become context-sensitive: the burden of computation is not in fact combinatorial in nature but one of contextual understanding. To highlight the difference between the two senses of complex, compare a computer program that plays chess by actively evaluating all possible moves with a human player who evaluates only a limited set of moves. The human player, somewhat intuitively, through strategic assessment as opposed to combinatorial evaluation, nevertheless arrives at successful states without computing all possible moves.
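The gap between combinatorial and strategic evaluation can be made concrete with a rough count of game-tree nodes. The branching factors below are illustrative assumptions (roughly 35 legal moves per chess position for exhaustive search, 3 candidate moves for a selective human-like player):

```python
def tree_size(branching, depth):
    """Total nodes in a full game tree with uniform branching factor,
    counting the root (depth 0) through the given depth."""
    return sum(branching ** d for d in range(depth + 1))

# Exhaustive search to 6 plies vs. a selective search of 3 candidates.
full = tree_size(35, 6)
pruned = tree_size(3, 6)
print(full, pruned, full // pruned)
```

Even at a modest depth of six half-moves, the exhaustive tree is roughly a million times larger than the selective one, which is the sense in which the burden is combinatorial rather than semantic.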
As introduced in this working paper, the systems theory of macroeconomics deals with the issues of complexity management and broad policy formation in the general theory section. In the special theory section, STM concentrates on a price theory and inflation model built upon a statistical treatment of Boltzmannian microstates of the marketplace. While this is where macroeconomics meets the newly emerging science of complexity, the acid test of any macroeconomic theory is in the domain of forecasting. When modern economics meets these two emerging areas, complexity and forecasting, we may anticipate an overhaul of most of the fundamental principles governing modern macroeconomics. To that end, we may anticipate an era of "instrumental macroeconomics" offering support to decision-makers much like the support weather forecasters receive from satellites and computer models. Systems theory in this regard is likely to offer the theoretical foundations of a new, reliable and accurate forecasting methodology.
Any issue of forecasting has always drawn many philosophical questions along with it. Whether the future is knowable has preoccupied most of twentieth-century science. At the atomic scale, both statistical and quantum mechanics encountered complexities in the microcosmos from the point of view of probabilistic uncertainty. Another form of probabilistic complexity arose in the domain of physical chemistry, in dealing with gas models and thermodynamics. Thermodynamics later also formed a basis for Shannon’s information theory. From Clausius to Maxwell and Boltzmann, and then to Shannon, the concepts in thermodynamics that link energy to probability, and then probability to information, constituted abstract achievements with robust results in applied science and implications for forecasting.
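The Clausius-Boltzmann-Shannon lineage can be illustrated numerically: for W equally likely microstates, Boltzmann's S = k ln W and Shannon's H = -Σ p log₂ p count the same uncertainty in different units. A minimal sketch:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def boltzmann_entropy(W, k=1.0):
    """Boltzmann entropy S = k * ln(W) for W equally likely microstates."""
    return k * math.log(W)

# A uniform distribution over 8 microstates: H = 3 bits,
# while S/k = ln 8 counts the same states in natural-log units.
print(shannon_entropy([1 / 8] * 8))
print(boltzmann_entropy(8))
```

The two formulas differ only by the logarithm base and the constant k, which is the probability-to-information bridge the paragraph describes.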
The same question of whether the future is knowable, or in particular whether the validity of theorems is knowable, resulted in the greatest achievements of mathematical logic. The branch of mathematical logic established by Kurt Godel was in fact studying complexity from a semantic approach, as he formalized axiomatic systems in the form of strings constructed through ‘Godel numbering’. While the complexities Godel formalized using his numbering system did involve combinatorial complexities, the crowning achievement of Godel’s proof systems lies in the recursive embedding of strings used to formalize semantics. This form of recursiveness paralleled the efforts of Alan Turing, who would later lay the foundations of computer science.
Godel’s work consisted of developing a formal language for embedding mathematics and statements about mathematics, that is, a two-layered coding of a within-system and an about-system language in one formal expression. In the whole domain of science and philosophy, Godel’s mathematical logic was indisputably the most abstract system ever developed. An attempt to bring godelization into macroeconomics may be perceived as an unusual application. Luckily, in macroeconomics there is a gateway where the "ultimate" in abstraction and the "ultimate" in concrete thinking meet in an easy-to-understand manner. This crossroads is Say’s Equality and Walrasian equilibrium. The systems theory of macroeconomics is based on a synthesis of godelization, which allows the dissection of complex multi-layer systems, and Boltzmann’s statistical calculation of entropy.
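Godel numbering itself is concrete enough to sketch: a sequence of symbol codes (c1, c2, ..., cn) maps to the single integer 2^c1 · 3^c2 · ... · pn^cn, and unique prime factorization recovers the sequence. The symbol codes below are arbitrary illustrations, not Godel's actual coding scheme:

```python
def primes(n):
    """First n primes, by trial division."""
    out, k = [], 2
    while len(out) < n:
        if all(k % p for p in out):
            out.append(k)
        k += 1
    return out

def godel_encode(codes):
    """Map a code sequence (c1, c2, ...) to 2**c1 * 3**c2 * 5**c3 * ..."""
    num = 1
    for p, c in zip(primes(len(codes)), codes):
        num *= p ** c
    return num

def godel_decode(num, length):
    """Recover the code sequence from the exponents in the factorization."""
    codes = []
    for p in primes(length):
        c = 0
        while num % p == 0:
            num //= p
            c += 1
        codes.append(c)
    return codes

seq = [2, 5, 3]            # arbitrary codes for three symbols of a formula
n = godel_encode(seq)      # 2**2 * 3**5 * 5**3 = 121500
print(n, godel_decode(n, 3))
```

The point of the exercise is that an entire string of a formal language becomes one integer, so statements about strings become statements about numbers, the two-layered coding described above.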
Establishing an "instrumental macroeconomics" as a strategic alternative for dealing with combinatorial complexities is one avenue where macroeconomics meets Kurt Godel, although not the primary one. What Godel’s work tells us is that in any formal system, a theorem, a theory, a doctrine, a computer program, there will be propositions whose validity cannot be decided. In a system of axioms with a set of rules for construction, there will be statements, commands and actions that will not be determined and will not be predicted by that system. This is related to the idea of recursiveness that preoccupied Cantor, Russell and many others studying antinomies and paradoxes since the ancient Greeks. According to the systems theory of macroeconomics, it is such inherent recursion within systems of production that causes the inflationary phenomena. Digressing from matters of forecasting into matters of current prices, this is where systems theory meets inflation theories. Just as a knowledge of atmospheric pressure is crucial for weather forecasting, a sound understanding of inflationary pressure is the most fundamental instrument of economic forecasting. The inflationary phenomenon may express itself in varied forms, cost pressure or even recession, debasement, a paradoxical deflation and a host of oscillatory cycles, and this variety of expression accounts for its elusiveness. Yet according to STM, inflation and all of these confounding after-effects stem from a basic systems parameter known as ‘entropy’. Instead of chasing inflation and its many faces, STM breaks the cycle by first studying entropy in the realm of economics.
Inflation and Cyclicality of Output
The prophetic coining of the term ‘entropy’ by Clausius, which in Greek roughly means "I turn into myself", and Godel’s formalization of recursion suggest synonymous concepts in very distant fields of study. Entropy, as Clausius coined it, in one sense means recursion, and according to STM the two in action cause inflationary pressures. One of the after-effects of such price movements is cycles of economic activity, separate from business cycles. While business cycles are more like engine cycles, the other, longer-range type of cycle is seemingly not predictable and arises as recessionary periods known as boom and bust cycles. It is important to separate business cycles from boom-and-bust cycles. Business cycles are due to the timely delivery of produce to the marketplace: there is a time for crops, a time for back-to-school shopping, for new car sales, for tax payments and returns, and so on. Even though the interactions of hundreds of such cycles, as well as sophisticated inventory movements, may form a complex pattern, these are nevertheless anticipated and to a large extent predictable "refill" patterns.
Likewise in nature, cyclic events support life forms and allow extraction of energy from the potential gradients that develop. For example the lunar cycles, tides, seasons and diurnal variations have been significant helpers for the evolution of life on earth. The common denominator of all the "helpful" cycles is predictability that comes from well-defined periodicity. The mild nature of such periodicity has allowed adaptive mechanisms to evolve and retain their adaptive usefulness. Even "prime events" that have disrupted such periodicity occasionally have come with enough lapse of time as not to have a total destructive effect on life.
On the other hand, the types of boom and bust cycles we encounter in economics are not due to cyclic events but to "fuse off" states. In ancient times, an ominous periodicity of agricultural produce would result in periodic famines and illnesses. Farmers learned to rotate crops only during the twelfth century. Until then, the land would be depleted of minerals and organic resources; once rock bottom was hit, crops would no longer grow. After sufficient time had passed to replenish the nutrients, the land would begin yielding crops again. In the meantime, famine would cause vitamin and other deficiencies, and diseases and epidemics due to malnutrition. The resulting infections would emerge with "mysterious" cyclic patterns. Every seven years or so, "a mystical fuse" would go off, the agricultural output would plunge, and until the next boom, famine and poverty would prevail.
Mechanisms that result in cyclic variations of demand and supply are numerous.(12) In spite of a modern understanding of production and distribution cycles, and a well-developed postmodern industry, the disruptive forces of boom and bust cycles are by no means lessened; if anything, they have increased. The increased conductivity of events is due to the lifting of barriers that had previously confined capital and trade movements to local and regional markets. In a global economy, during the current transitional period, the risks of accentuated boom and bust events are not lessened but increased. Additionally, the cyclic variations inherent in each economy can accumulate in the same manner as the constructive and destructive interference patterns observed in wave phenomena, creating stronger and bigger harmonics of variation that augment the magnitude of boom and bust cycles.
Recapitulation of the above statements on macroeconomic boom-bust cycles is as follows:
The systems view is that production cycles are as vital as engine cycles, whereas boom and bust cycles are nothing but exigent failure states. When business cycles are confused with the oscillatory cycles known as boom and bust periods, then, given the behavioral and psychological contributors to any investment rationale, anticipatory bust periods may indeed develop, as false expectations of receding activity would irrationally remove investments from the marketplace.
Although seasonal and other cyclic variations of production offer precise periodicity, the complex nature of inventories and supply-demand variations may reduce the deterministic occurrence of useful cycles. Superimposition of harmonics and cross-interactions of various overlapping cycles may further disrupt and change such deterministic behavior.
Various modeling methods have been used, some purely mathematical, some derived from electronic circuit theory and solid-state elements: for example, modeling inventories with capacitors, flow with resistors, and analyzing complex cycles and alternating demand-supply waves with analogous electrical currents. At this point a crucial difference in approach between such models and systems theory emerges. Differentiating between cyclic and recursive events is key to understanding the new approach of systems theory.
Cyclic events are essentially iterations. Events modeled by traditional mathematics and electrical circuit theory are iterative in nature, while events modeled by the systems theory of macroeconomics are inspired by and derived from mathematical logic. Iterative events may be very complex, as the number of contributing elements can be made arbitrarily large. Yet no matter how complex, deterministic behavior will develop from an iterative system, or likewise from an economic model inspired by circuit theory and solid-state components.
Computer programmers are familiar with the crucial difference between recursive and iterative processes. The following example will serve to clarify a fundamental aspect of systems theory of macroeconomics:
The universal "for" statement of many computer languages constitutes one way of writing iterations. Such "loops" are overt: looking at source code, one can immediately recognize their action. In mathematics the sigma sign (Σ) denotes an iterative addition. The basic models attempting to capture macroeconomic realities are iterative, and are therefore vulnerable to the shortfalls inherent in iteratively modeling a process that should instead be modeled recursively.
In recursive cycles, as in a recursive algorithm, the cycle is nested within itself. A good example is the sorting algorithms that computer science students are taught in their freshman year. A sort can be written using a "for" loop, which is the iterative way of repeating the operations. Alternatively, the program can be written recursively, with a function calling itself until all the swapping and sorting is done. Typically, a recursive sort algorithm is more elegant and shorter in lines of code; the iterative algorithm takes more lines, and its worker function is called from within the loop rather than from within itself. A recursive sort algorithm, however elegant, is nevertheless unsafe: it incurs a hidden cost upon the stack space allocated to the code, and there is no clear indicator of when the computer will run out of that stack space.
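The contrast can be made concrete in a few lines. The sketch below, in Python (an illustrative choice of language, not one named in this paper), computes the same total both ways; the recursive version exposes the hidden stack cost described above:

```python
def total_iterative(amounts):
    """Iterative summation: the loop is overt and uses no extra stack."""
    total = 0
    for a in amounts:          # the universal "for" loop
        total += a
    return total

def total_recursive(amounts):
    """Recursive summation: the cycle is nested within itself."""
    if not amounts:
        return 0
    return amounts[0] + total_recursive(amounts[1:])

payments = [3, 1, 4, 1, 5]
print(total_iterative(payments), total_recursive(payments))   # 14 14

# The hidden cost of recursion: every self-call consumes stack space,
# and nothing in the source code warns when the limit will be reached.
try:
    total_recursive(list(range(100_000)))
except RecursionError:
    print("ran out of stack space")
```

As with the recursive sort, the recursive version is shorter and more elegant, yet it fails on a large enough input while the iterative one does not.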
A review of current price theories in comparison with the systems approach
While economies do consist of various iterative processes which can be modeled by conventional mathematical and electrical circuit models, they also contain plenty of recursive processes that cannot be modeled by iterative or conventional cyclic models. Systems theory of macroeconomics asserts that the presence of underlying recursive topologies in the real world accounts for the failure of conventional forecasting models that are based on iterative models. While the marketplace has both iterative and recursive elements, ultimately the flow of goods and services and the exchange of money is a recursive process. This recurring, as opposed to cyclic, nature of economic activity accounts for inflationary pressures and, in many cases, for deteriorating business activity following a period of growth.
Current economic theories explain inflation with one or more of the following:
These mechanisms, together with the observed reality that, given sufficient time (deflation or not, bankruptcies or not, layoffs or not), the general level of prices always rises, lead to one observation: in the long run, demand and supply equations do not yield symmetric solutions. The observed asymmetry results in increased prices. That would require aggregate demand in the long run to be greater than aggregate supply, which would be a matter of dispute even if it were decidable. The question of whether a state of equilibrium exists, or whether the economy is instead in a state of constant disequilibrium, has also been offered by many economists as an observational basis for a possible mechanism. Since Léon Walras addressed the equilibrium state, this issue has remained central to the discussion of how prices change in the long run.
A tendency towards monopolization, through price-fixing activities, has been suggested as a potential mechanism. Nevertheless, as a cause for a general rise in prices, it has never remained in effect long enough, nor been as universal and obstinate, as inflation itself.
The political pressure for full employment is a concept closely related to the Phillips curve, which links employment to ever higher wages. The idea is that if governments try to achieve full employment through fiscal policy or other means in order to provide social peace, any scarcity of workforce causes wages to increase. Immigration, corporate bankruptcies, merger-related layoffs and other large-scale layoffs have frequently occurred, in effect increasing the supply of labor, yet without at all changing the basic reality that prices and wages have kept their upward course, at some rate large or small, regardless of occasional peaks in unemployment.
The sticky wages and prices theory was put forward by John Maynard Keynes; ratcheting action is a slight variation of the same notion. Both claim that prices move up more easily than they move down. Anticipation of price rises causes franchises to raise their prices in order to be able to replenish their inventories when new supplies of the same type are expected to cost more. The wealth effect is a temporary rise in prices due to increased demand facilitated by wealth, especially from stocks and dividends. In unison with ratcheting action, the wealth effect is believed to cause prices to remain high even after demand, once the extra income is depleted, ceases to sustain its level.
The validation and quantification of the Keynesian explanation, including the factors affecting "stickiness", has been the subject of intensive research, not infrequently with conflicting conclusions.
An interesting conclusion of one such study (5) is: "Demand increases appear less likely to prompt price changes than demand decreases; but cost increases are more likely to prompt price changes than costs decreases."
This is in line with the STM idea that entropy causes increased cost pressure along with an unavailability of money, whereupon price increases begin and translate into increases in the general level of prices known as inflation. Unavailability of money reduces demand, which may result in deflation, if inventories and cash balances allow, owing to the increased demand for, and competition for, money. Once inventories are depleted, price cuts can no longer be sustained, and cost pressure will take over and be reflected as rises in prices. At this stage, bankruptcies, layoffs and buyouts are typical of the landscape. It must be stressed that this sequence holds under neoclassical, New Keynesian and monetarist conditions; in other words, the systems approach (STM) may provide a fix and break the vicious cycle summarized here.
PART TWO
-Fundamental Derivations-
Entropy and price "stickiness" ; Systems approach based on Clausius’ macrostate treatment
Rudolf Clausius introduced entropy as a concept in thermodynamics in 1854. Of all the uses of entropy in thermodynamics, particularly in physical chemistry, predicting whether a reaction will take place under given conditions of heat and temperature may be the most crucial and fruitful. While Clausius defined entropy from a macroscopic point of view, Boltzmann gave a microscopic description of entropy in 1877. The two apparently different formulations of entropy have confused many scientists since.
ΔS = ΔQ / T
where S is entropy, Q is heat and T is temperature. Notice the Clausius equality is only concerned with changes in entropy.
S = k log A
where k is Boltzmann's constant and A is the number of energy states in the system. Notice that here entropy is expressed in absolute terms.
If economic systems were as simple, we could use a corresponding formula in which economic entropy would be represented as:
ΔS = ΔM / P, where M is the money stock and P is the price level.
STM emphasizes cost pressure, due to increases in the unavailable energy of the system, over other asymmetries and skews of supply and demand. STM rejects the view that the asymmetry (in the sense of why prices move more easily in one direction than the other) has much to do with factors such as the burden of changing price labels, trade unions or long-term contracts, because layoffs and other mechanisms nevertheless take effect sooner or later. STM likewise rejects all forms of demand-supply rigidity, as well as demand-supply shocks, as a considerable cause of the asymmetries. Instead, STM holds that these asymmetries (price stickiness, differing responses to demand and supply shocks, and others) exist and have developed because the ground is not level: it is tilted towards price increases by the cost pressures instigated by entropic processes. It must be added that, owing to an elementary arithmetic effect, the change in available energy when some amount of liquidity is added to the system, (C+x)/M, will not be the same as when the same amount is withdrawn, (C-x)/M. This follows from the first and second laws of thermodynamics.
The second law expressed for an irreversible process is:
ΔSsystem + ΔSsurroundings > 0
Translated into monetary language, this corresponds to saying that money added into one system from another increases the total entropy (unavailable money) of the combined system. The fact that heat flows from objects of higher temperature to objects of lower temperature is the key to why entropy increases.
The corresponding concept in economics, for the direction of heat flow, was phrased by Adam Smith as well as by David Hume.(18) In two adjacent markets, say in France and Britain, if the price of wheat is cheaper in France than in Britain, the commodity will flow towards Britain and money will move towards France. As a result, the price of wheat in Britain will decrease while the price of wheat in France will increase. Applying the above notation, the change in entropy for each wheat market is:
ΔS = ΔM / P, where M is the amount of money received, corresponding to heat transferred, and P is the price level, corresponding to temperature in the Clausius equation.
Ph = price of wheat, 100 tokens in Britain (h for "hot" or high)
Pc = price of wheat, 60 tokens in France (c for "cold" or low)
ΔMh = -240 tokens ; money Britain remits to France (negative indicates flow out of the system)
ΔMc = +240 tokens ; money France receives from Britain (positive indicates flow into the system)
ΔSh = ΔMh / Ph ; change in entropy in Britain after purchasing wheat from France
= -240 / 100
= -2.4
ΔSc = ΔMc / Pc ; change in entropy in France after selling wheat to Britain
= 240 / 60
= 4.0
Notice the asymmetry (a fall of 2.4 versus a rise of 4.0) and the total entropy change for the two markets: -2.4 + 4.0 = 1.6. This is the amount of entropy which can be used to calculate the amount of money made unavailable to the system at the final price level, which in most cases will be roughly the average of the two markets, 80 in this example. (ΔS) x (Pfinal) = 1.6 x 80 = 128 tokens, and that is the amount of tokens that has become unavailable to the economy as a whole as a result of this trade.
If the total money stock is 10,000 tokens, following the wheat trade the money stock in real terms becomes 10,000 - 128 = 9,872 tokens. In other words, as a result of the expansion of the markets, what used to be nominally 10,000 tokens now drops to 9,872 in real terms. The mint will start hammering coins and increase liquidity by 128 tokens, at a would-be inflation rate of 1.28 percent.
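The bookkeeping of the two-market example can be replayed in a short Python sketch (illustrative only; note that -240/100 = -2.4 and 240/60 = 4.0, so the entropy surplus is 1.6 and the money made unavailable is 1.6 x 80 = 128 tokens):

```python
def entropy_change(delta_money, price_level):
    """Clausius-style bookkeeping for one market: dS = dM / P."""
    return delta_money / price_level

P_h, P_c = 100, 60        # price of wheat in Britain ("hot") and France ("cold")
remittance = 240          # tokens Britain pays France for the wheat

dS_britain = entropy_change(-remittance, P_h)   # -240/100 = -2.4 (money flows out)
dS_france = entropy_change(+remittance, P_c)    #  240/60  =  4.0 (money flows in)
dS_total = dS_britain + dS_france               #  1.6 > 0: the trade is irreversible

P_final = (P_h + P_c) / 2                       # crude final price level: 80
unavailable = dS_total * P_final                # 1.6 * 80 = 128 tokens

money_stock = 10_000
real_stock = money_stock - unavailable          # 9,872 tokens in real terms
would_be_inflation = unavailable / money_stock  # 0.0128, i.e. 1.28 percent
print(dS_total, unavailable, real_stock)
```

Note that the positive total entropy change is exactly the second-law inequality of the preceding section expressed in tokens.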
Entropy and price stickiness, Boltzmannian approach
As opposed to Clausius's top-down macrostate treatment, Ludwig Boltzmann derived the statistical form of the entropy equation from the ground up, in an effort to relate the concept to atomic models. There is good news for those economists who have no interest whatsoever in quantum statistical atomic models and the like: Boltzmann's statistical distribution of energy levels maps one-to-one onto income or capital distribution in a marketplace. However, this has only intermediary importance to the special systems theory of prices (SSTM). SSTM will not use capital distribution at all to explain the inflationary mechanism and the formation of prices; instead, a modified Boltzmannian treatment will later focus upon input-output staging. For the moment, let us see the direct Boltzmannian method, calculated from permutations of the capital distribution.
Conditions of the statistical mechanics can be summarized as follows:
Suppose there are 7 traders in the whole marketplace and the total money issued and available to them is 7 tokens. The tokens can be distributed among the traders only in discrete amounts, because we will not allow clipping. Trader no. 1 may have all 7 tokens, leaving the rest zero; each trader may have one token; or the first trader may have 2 tokens, the second 4, the third 1 and the rest zero. This is how Boltzmann derived the distribution numbers. For 7 traders and a total of 7 indivisible tokens, the possible number of such distributions is 15.
Distribution numbers
One trader can have all the money, leaving the remaining six traders zero tokens. (From all we know about human nature and the lessons of history, that is a very unstable possibility.) Likewise, each trader may have one token, as in a perfectly uniform distribution; from Boltzmann's statistics, this appears to be the second most unstable possibility. Alternatively, one trader may have 4 tokens, a second 2 and a third 1 token. The number of such possible distributions gives us the distribution numbers.
Microstates
Each of the 15 distribution numbers, for 7 traders and a total of 7 tokens free in the marketplace, has a number of different possibilities named microstates. That is, for the 4-2-1-0-0-0-0 way of distributing the money among traders, we can distribute the tokens in many different ways: trader (or company) no. 1 can get the 4 tokens, or alternatively the four tokens can go to no. 5, for that matter.
Each distribution number may have a different number of microstates; thus each state has a different probability and a differing level of stability. The distribution with the highest number of microstates represents the equilibrium, or maximum entropy, state, and the system will tend to find it. The number of microstates (and hence the relative probability) of each distribution is given by:
P=N! / (n0! n1! n2!... nk!) (15)
When the number of atoms or traders, as well as the number of discrete levels of capital, is very large, the above formula can be approximated using Stirling's approximation: (15)
ln ni! ≈ ni ln ni - ni
so that
ln P = ln N! - ln n1! - ln n2! - ln n3! - ... - ln nk!
We will not go any further in this section, as this is the essence of Boltzmann's statistical derivation of entropy for a very large number of atoms (in our case, traders). Since the number of traders and the money stock can be large, the variables can be taken as continuous instead of discrete.
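The counts used above can be verified mechanically. The Python sketch below (the helper names `partitions` and `microstates` are ours, invented for the illustration) enumerates the distribution numbers for 7 traders and 7 tokens and applies the factorial formula for microstates:

```python
from math import factorial
from collections import Counter

def partitions(total, max_parts):
    """Distribution numbers: ways of splitting `total` indivisible tokens
    among at most `max_parts` traders, ignoring which trader holds what."""
    def gen(remaining, parts_left, cap):
        if remaining == 0:
            yield ()
            return
        if parts_left == 0:
            return
        for first in range(min(remaining, cap), 0, -1):
            for rest in gen(remaining - first, parts_left - 1, first):
                yield (first,) + rest
    return list(gen(total, max_parts, total))

def microstates(distribution, n_traders):
    """W = N! / (n0! n1! ... nk!), where n_i is the number of traders
    holding exactly i tokens."""
    counts = Counter(distribution)
    counts[0] = n_traders - len(distribution)   # traders left with zero tokens
    w = factorial(n_traders)
    for n_i in counts.values():
        w //= factorial(n_i)
    return w

print(len(partitions(7, 7)))        # 15 distribution numbers, as in the text
print(microstates((4, 2, 1), 7))    # 210 microstates for 4-2-1-0-0-0-0
print(microstates((1,) * 7, 7))     # 1: the perfectly uniform distribution
print(microstates((7,), 7))         # 7: one trader holds all the money
```

The most probable distributions are those with the most microstates; the perfectly uniform distribution has only a single microstate, consistent with the text's remark about its instability.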
Input-Output Analyses: From Quesnay’s "Tableau" to Systems Approach in Macroeconomics
Systems theory of macroeconomics suggests that inflationary mechanisms are recursive as much as they are consequences of entropy. In other words, a bridge needs to be constructed from Boltzmann's statistical approach to Gödel's formalization by string-numbers. According to Boltzmann's version of entropy, the distribution, i.e., where and how the components of a system are placed, results in a certain amount of entropy and an equivalent amount of unavailable energy. Likewise, in a macroeconomic system, how the components are put together essentially means who buys from whom, in the broadest sense. Determining who buys from whom is also known as input-output analysis in economics. However, constructing a table of who buys from whom is not as straightforward as it may seem. Like all elaborate systems, economic systems cannot be reduced to a table of a few columns; in fact, they cannot be reduced to a tabular relationship of any number of columns. The nature of transactions is inherently nonlinear and presents topologies that are essentially made of graph structures. In this regard, STM's input-output analysis methods differ from Quesnay's historical approach, as well as from the linear programming techniques developed earlier in the twentieth century.
Ripple effects and nesting of economic activity complicate policy-making: STM focuses on the principal factor
If we were to formalize economic activity in an effort to apply input-output analysis and retrieve the microstates of the distribution of constituents, we would uncover the recursive topology. Having said that, we can take a closer look at how the formation of prices and the uniform pattern of rising prices occur. In a sense, entropy translates into a quantity of unavailable energy; in reaction, the system may raise prices in the form of inflation. However, if during this process the demand for money becomes significantly higher than the demand for goods, and if inventories are large enough, then paradoxically deflation rather than inflation may be encountered. Once inventories are depleted, inflation will take over from the temporary deflation, provided entropy has remained unchanged during this period.
The inflation rate is released monthly at an annualized core rate, which excludes volatile prices; the crude rate includes food and energy prices as well. Both, however, are sampled averages over a basket of goods. STM's estimate of the entropic portion of the core rate may be a fraction or a multiple of the core rate. In most instances, under stable trade patterns, the entropic portion is smaller than the core rate; however, changes in the entropic portion precede changes in the core rate. The core rate forms as a result of the transmission of the entropic fraction, which becomes magnified as it is conducted through economic activity.
In this respect, the dynamics of inflation as described by the systems approach resembles that of a small current applied to the base of a transistor, where a very small quantity controls a much bigger quantity passing between the emitter and the collector. Such is the manner in which even small amounts of inflation result in drastic effects: boom and bust cycles, and varied phenomena such as sequences of recession, deflation, stagflation and devaluation. This variety of expression is frequently compounded further by the policy actions of central banks. The many different ways inflation and its complications are expressed have been documented extensively; creeping inflation, runaway inflation and stagflation are some of them. The 1947 inflation, for instance, is regarded as the bouncing back of prices after being forced down during the Second World War. The OPEC shocks of the seventies caused an immediate rise in prices because of the heightened cost of industrial input; the following recessions brought the price of crude back down, and eventually, in 1996, crude hit a bottom of 9 dollars a barrel. In such oscillatory behavior, ripple effects and "wave mechanics" add further to the effects.(11)
As happened with the rises in the price of crude oil, when a new product arrives rapidly in the marketplace with a large capacitance to channel in money and investments, ripple effects develop and widespread price movements are observed. Cellular telephone companies and Internet service providers constitute examples of such new products and business activities. The resultant effect is that the pie of profits will be divided into more pieces than before, unless central banks increase the quantity of money to make room for new producers such as these. However, a central bank will typically hesitate, perhaps justifiably, owing to the uncertainty of determining whether the change is real and lasting, and will then lag in action. The lag may cause some degree of slowdown or even recession. When the central bank eventually decides to ease rates after determining that the increase in GDP is real, it will possibly overshoot, because by then natural adjustment mechanisms will already have been activated outside central bank activities, and the superposition of monetary and market forces will multiply the policy's effect. On the other hand, if the lagged increase in the quantity of money coincides with an already developed recession, then the addition of money will inadvertently cause the demand for money to decrease beyond targets. Eventually these price swings will settle even if central banks do not adjust the volume of money in accordance with the new realities. Such has been the case: each time OPEC initiated a shock by cutting production, crude oil prices, after hitting highs, later slid back from their peaks to new nadirs. Depending on whether the change is gradual or abrupt, and on whether psychological factors (pessimism or optimism) accompany the change, differing effects on the course of inflation are encountered. When the ripple effects settle, the remaining net rise in the general price level is the amount of inflation induced by the change thus introduced.
While ripple effects develop as confounding factors, the systems approach at this stage concentrates only on the original cause of inflation. Naturally, whether a particular effect is causal or a subsequent ripple effect is of great interest to the systems approach. The special theory of macroeconomics deals only with the entropic portion of inflation. The general theory of macroeconomics, on the other hand, deals with the transmission of the entropic portion into the core rate, along with the transmission of policy into price levels and, altogether, with how boom-bust cycles and various ripple effects arise, using elaborate computer models based on graph topologies and the new theory.
Prudently, some economists have suggested that central banks increase the quantity of money slowly at a steady rate and not pursue interventionist policies, an issue covered extensively by Milton Friedman. However, one can always claim to be a monetarist in the Friedmanian sense and still maintain a highly interventionist policy in the name of discovering what that steady rate should be. The whole issue here is that we simply do not know what that steady rate is, or how steady it should be.
Lately, the European Central Bank attempted to pursue a constant-rate policy to achieve a steady rate of increase of money, as Milton Friedman had suggested many decades ago. Although modern central banks are independent of governmental pressure, they are not immune to public and political pressure. Another difficulty comes from the fact that central bank policies are "communicable": as investments and capital flows have attained a freedom of movement unprecedented in history, interest rates in one country have begun to affect the flow of investments globally in no time at all. Therefore, the reluctance of the ECB to diverge from that policy did not last long under the institutional and political pressures that followed the rate cuts of the Federal Reserve Board.
Central bank policies affecting the quantity of money have been identified under three categories: discretional, automatic and rule-based. In the United States, policy is discretional, and the FOMC determines the quantity through a servo-control approach, monitoring market data. In Japan, policy is largely automatic, as the country rests on a huge current-account surplus and sets the value of the yen in ways and manners favoring the continuation of the trade surplus. To this day, no rule-based policy is in effect anywhere. Systems theory of macroeconomics should not be misconstrued as offering such a system of presumably computer-based decision making; that, in its own right, is the opposite of what the theory is about. However, through a systems theory, many unsolved issues of how prices and the quantity of money affect each other under complex circumstances could be brought to light. More importantly, errors inherent in current models could be corrected, offering significantly more accurate decision-support tools and thus somewhat resembling a rule-based approach.
Broadly, what we know from thermodynamics suggests that an expansion can be conceived as either isobaric or isometric: in the former, pressure is held constant; in the latter, volume is kept constant. In economics, central banks decide whether the volume of money should be kept constant or increased, although this is co-dependent on producers achieving economies of scale. Under constant-volume conditions, when new technology and new goods and services arrive in the marketplace, isometric conditions will cause prices to increase; with the increased use of mainly cash transactions, the demand for money will grow. Thus it may appear that a crucial factor in deciding the rate at which the quantity of money should be increased is obtaining more accurate and timely estimates of GDP or output. In theory, the money stock must increase by an amount paralleling the increase in GDP. The problem is that GDP, the general level of prices, and hence monetary policy do not have a clearly separable causal relationship; they affect each other recursively, and which comes first is anyone's guess, much like the chicken and the egg.
A Preliminary comparison of Quantity Theory and the Systems Approach
A simplified working example:
Island models in the style of Daniel Defoe have often provided shelter for those who seek refuge from overly abstract notation. In this introductory chapter we can resort to the same device and suppose that there are three islanders: Bill sells bread, Charles cheese, and Arthur is the accountant. Arthur issues 30 tokens. Each day everyone needs a loaf of bread and some cheese, and every transaction needs to be recorded. Each islander buys one unit of the others' products: Bill spends two tokens for purchases from Charles and Arthur, Charles spends two for purchases from Arthur and Bill, and Arthur spends two tokens on Bill and Charles. Each day, six tokens are exchanged, and at the end of each day everyone still has 10 tokens. There are virtually no differences between this island economy and a barter economy, despite the assignment of tokens as the medium of exchange.
Meanwhile, Irving Fisher's version of John Locke's quantity theory states MV = PT, where M is the money stock, V the velocity of turnover, P the price level, and T the number of times trade or transactions happen. This is one of the most fundamental equations of economic theory. We can calculate the basic parameters as follows:
M; money stock is 30 tokens.
Gross Domestic Product is 6 tokens a day x 360 days = 2,160 tokens.
Income per capita is 2 tokens a day x 360 days = 720 tokens.
Average price of goods: one token
Average price of services: one token
T; number of transactions per year = 360 x 6 = 2,160
Velocity = GDP / money stock ; (this is actually the Cambridge equation version of Fisher's formula)
= 2160/30 = 72 (velocity is interpreted as each token being used 72 times a year)
MV = PT
30 x 72 = 1 x 2160;
2160 = 2160
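Arthur's books can be checked in a few lines; a minimal sketch, taking the 360-day year stated above, so that 6 tokens a day yields 2,160 tokens of output and a velocity of 2,160 / 30 = 72:

```python
# Three islanders, 30 tokens, six one-token transactions per day.
M = 30                     # money stock in tokens
days = 360
daily_transactions = 6
P = 1                      # every unit of bread, cheese or bookkeeping costs 1 token

GDP = daily_transactions * P * days   # 2,160 tokens of output per year
T = daily_transactions * days         # 2,160 transactions per year
V = GDP / M                           # Cambridge-equation velocity: 72 turnovers

print(GDP, T, V)
assert M * V == P * T                 # Fisher's MV = PT balances
```

Since prices are fixed at one token and every transaction moves exactly one token, T and GDP coincide here, which is why the quantity equation closes exactly.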
The notion that Charles buys his own cheese, Bill his own bread, and Arthur charges himself for his own records sounds overly meticulous. Yet it happens in the real world all the time, concealed in the form of recursive transactions. We have not yet introduced demand-supply curves or the concept of profits and losses. So far, Arthur's bookkeeping displays no differences between barter and money economies. Somewhat resembling predicate calculus notation mixed with one of Alan Turing's computer programs, this is what Arthur's log looks like:
01 buy(Charles, bread, 1)
02 buy(Charles, cheese, 1)
03 buy(Bill, cheese, 1)
04 buy(Bill, bread, 1)
05 wrt(Charles, bread, 1)
06 wrt(Bill, cheese, 1)
07 buy(Arthur, bread, 1)
08 buy(Arthur, cheese, 1)
09 wrt(Arthur, bread, 1)
0a wrt(Arthur, cheese, 1)
0b wrt(Charles, cheese, 1)
0c wrt(Bill, bread, 1)
The log book of this island economy displays nothing more than a formalized barter economy; in other words, Arthur has developed a way of expressing economic activity in a formal language. Notice that even though tokens are in use, there are no waits between purchases, and no amount other than one item per transaction is observed. There is no storing of commodity or value either. In effect, in spite of the usage of tokens, this is still a barter economy.
Then Walter arrives on the Defoe island and begins selling water. Subscribing to Adam Smith's division-of-labor concepts, Arthur, Bill and Charles think this is a good idea, since if they can save the time they spend fetching water they can produce more food. Later, Walter goes into the soda and beverages business. Although there are only four islanders, everything gets more complex. Walter is the first to calculate his costs and set his sales price at a profit. As Walter's cash balance increases, the others decide that they too will sell their goods at a profit. Arthur invents half tokens and quarter tokens, and thus the formalized barter economy is no more.
At this point peculiar things started happening. After a number of transactions, the extra tokens each islander possessed, which Arthur had issued provisionally, began disappearing. Then the prices of unit items started increasing unless the items were sold at a loss. Later the islanders felt urged to cut their profits because the buyers seemed to have run out of money with which to purchase the goods. After all, money was a medium of exchange, and as the number of transactions increased, the medium of exchange was used more often. Eventually, a shortage of tokens arose along with an increase in the demand for money. Recall that at this point, for simplicity, money as a store of value did not exist.
Arthur suspends all transactions and starts debugging the economy, which is headed for a crash. He finds out how Bill calculates his sales price: as a valid rationale, in order to produce bread, Bill needs food and water, which he buys from the others, and he pays Arthur for his bookkeeping services. Each person had followed suit and calculated his sales price as follows:
PriceOf(bread) = AmountOf(cheese) + AmountOf(water) + AmountOf(bookkeeping) + MarginOf(profit)
PriceOf(cheese) = AmountOf(bread) + AmountOf(water) + AmountOf(bookkeeping) + MarginOf(profit)
PriceOf(water) = AmountOf(cheese) + AmountOf(bread) + AmountOf(bookkeeping) + MarginOf(profit)
PriceOf(bookkeeping) = AmountOf(cheese) + AmountOf(water) + AmountOf(bread) + MarginOf(profit)
Expressed algebraically, where ‘a’ denotes amount, ‘P’ price, and ‘M’ profit margin:
P1 = a11·P2 + a12·P3 + a13·P4 + M1
P2 = a21·P3 + a22·P4 + a23·P1 + M2
P3 = a31·P4 + a32·P1 + a33·P2 + M3
P4 = a41·P1 + a42·P2 + a43·P3 + M4
The format of the four equations resembles a frequently encountered general form: four independent equations from which we can calculate the unknowns and arrive at fixed price levels for the goods produced. A more careful look reveals, however, that each price is recursively dependent on the others. For example, while P1 is calculated in terms of P2 and the other prices, P2 recursively includes P1. While this explains why prices kept increasing on Defoe Island and ultimately caused the monetary system to crash, the extent of dependence is not the same in the real world. Nevertheless, however small it may be, dependence between input prices and final goods produced will not cease to exist. The more open the economy, the less dependence the system will have between its input-output stages. Opening the economy by complying with free market conditions and exercising free trade practices will reduce the dependence circuitry, but only up to a point. Eventually a limit is reached beyond which, no matter how distantly input and output processes are separated, the degree of freedom from recursive cycles cannot be increased. Consequently, the general level of prices will keep increasing in the form of inflation.
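The runaway behavior of the recursive price system can be illustrated numerically. The sketch below uses hypothetical coefficients and iterates P = A·P + M over business cycles; when each row of input coefficients sums above 1, prices diverge, whereas rows summing below 1 would settle at the fixed point (I − A)⁻¹M:

```python
import numpy as np

# Defoe-island pricing: each price is a weighted sum of the other
# prices plus a profit margin, i.e. P = A @ P + M.
# The coefficients are hypothetical; each row sums to 1.2.
A = 0.4 * (np.ones((4, 4)) - np.eye(4))  # every producer draws on the other three
M = np.full(4, 1.0)                      # unit profit margins

P = np.ones(4)
for _ in range(50):                      # 50 business cycles
    P = A @ P + M                        # re-price from current input prices
print(P[0])                              # ~5.5e4 and still climbing
```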
In reality, such total dependence of every price on every other is virtually impossible. In the Defoe Island example, each of the four prices depends on the remaining three; that is, the degree of freedom was zero. The general distribution model of recursiveness, which is related to the degree of freedom or, conversely, the dependence of prices, consists of a total number of commodities and a distribution of the number of constituents for each commodity. In other words, in a market with n = 1000 commodities and an average number of ingredients i = 30, the "gross" degree of freedom may be calculated as n − i = 970.
In classical and neoclassical equations, the degree of freedom is implicitly presumed at a maximum of n−1. As unlikely as Defoe Island’s total-dependence, maximum-recursion situation is, Say’s Equality, the Walras Law, and the resultant equilibrium theories set the stage for the opposite and equally unlikely situation, one where dependence on other products is absent. The degree of freedom, and how it is distributed among sectors of the economy and further down among processes within a sector, essentially determines the ‘state functions’ that we will define and measure as entropy. How the dynamics vary across topologies and time is a fairly complex process, and that is where formalization methods derived from mathematical logic will assist. The patterns of dependence are an intelligent process in that intelligent agents affect and determine that process, resulting in a grammar of action (rules of trade) and a semantics of results (objectives of trade).
The traversing of transactions across the whole topology can be modeled by the travel path of money across traders. Given sufficient time, each node (vertex) symbolizing a trader will likely be visited at least once, approximating the topology to a Hamiltonian graph; the resulting market will have arrived at the limit for the degree of freedom, and every price will reflect the change in entropy as a change in price levels at the very next cycle of traversal. (According to Godelian principles, sufficient time may not be finite, which implies that the graph cannot be Hamiltonian. In the final analysis it can be shown that the majority of produce will display dependence of some degree, a smaller number of nodes or commodities will be independent, and a minority of cases will not be decidable.)
The systems theory accepts the validity of Fisher’s quantity equation, as well as John Locke’s emphasis on the effects of the quantity of money on the level of prices. However, the systems theory of macroeconomics dismisses ‘quantity’ as a principal factor or causal mechanism in explaining inflation. Inappropriate issuance of money, as well as extended periods of interest rates below the annual rate of inflation, does prop up (absolute) prices. In such a process, absolute prices increase and, to a lesser extent, relative prices are affected in proportion to the increase in money relative to the current level of prices (ΔS = ΔM/P), according to the proposed systems approach. Nevertheless, the dominant and primary effect is on absolute prices, not on relative prices. Such has been the essence of many examples of hyperinflation in the Southern Hemisphere and in many developing countries, where current account deficits have induced increased emissions, devaluation, and double- or triple-digit price increases. (Further filtering of absolute prices from the pure effects of relative price changes was brought to attention by Patinkin. Patinkin’s remarks about the dichotomization of the pricing process, in which he criticized classical as well as neoclassical economists, are helpful in this regard, though Patinkin had criticized the use of relative prices for commodities and absolute prices for money markets in Say’s Equality.)
The systems approach is further critical of the process in that "Say’s Equality" does not algebraically exist and the process is continuously recursive. Thus, the resulting instances of any such equation, as formulated by classical or neoclassical economists, are indeterminate. Apparently baffled but astutely observant, some economists have called the process a constant state of disequilibrium.
Isolated cases of inflation without an accompanying devaluation or debasement are more often the case in the U.S. and E.U. economies, as evidenced by low and well-sustained rates of what can be called "purified inflation". In this form, price level increases are solely due to structural and topological factors stemming from instances of input-output coupling. Being a state function of the economy, such "pure inflation" is measurable as entropy in the system. This type of actual inflation determines the amount of money made unavailable to the system and lost in the form of entropy to all participants of the marketplace: employers and employees, buyers and sellers. Evidently, the factors affecting such primary inflation are in essence topological and not dependent on monetary policy.
Say’s Identity and Walras Law
Leaving Defoe Island to see what happens in actual economies, we arrive at our first stop: Say’s Identity, also known as the Walras Law. Below are two forms of Say’s Identity: the first covers the n−1 commodities of a barter economy, while the second includes money as the nth good, treating it as a store of value as well as a medium of exchange:
Σ(i=1..n−1) pi·Di = Σ(i=1..n−1) pi·Si   (barter form: money excluded)

Σ(i=1..n) pi·Di = Σ(i=1..n) pi·Si   (money included as the nth good)
If we compare Say’s Identity, in either form, but for simplicity the barter-economy type, with the Defoe Island price equations, we notice two differences:
On Defoe Island, we had no demand or supply quantification, as each of the four islanders produced fixed quantities of goods and services synchronously (e.g., Bill baked four loaves of bread a day) and neither money nor goods were stored. For simplicity, goods and money were used only for exchange purposes, not for storing.
Say’s Equality makes room for the store-of-value role of both money and commodities. Prices of goods are recorded post facto as multipliers of demand on the left and supply on the right. No effort is made to calculate prices or to derive the demand-supply curve. We can recapitulate Say’s Identity and the simplified Defoe islanders’ equation:
On Defoe Island, the price of each item is separately calculated in terms of the amounts and prices of its "ingredients":
P1 = a11·P2 + a12·P3 + a13·P4 + M1
Say’s Identity formulates aggregate demand or supply:
Total Demand = p1·D1 + p2·D2 + p3·D3 + ... + pn−1·Dn−1
On the island, before the fourth person arrived, total demand and supply were equal, much as in Say’s Identity. The bookkeeper kept track of it in a form resembling GDP, although this is not an accurate analogy because GDP represents final products in the real world. After the arrival of the fourth person we introduced the concept of profit, named it ‘M’, and assigned ‘M’ the task of embodying and encapsulating the effects of demand and supply in a black box called the margin of profit.
Say’s Identity ignores the fact that prices are interdependent. It does not attempt to solve the equation for price levels, leaving that work to Leon Walras. In other words, Say’s Identity is not about individual prices but about the totality of them in the form of aggregate demand and supply.
According to Say’s Identity, aggregate demand always equals aggregate supply. On the other hand, Say’s Equality has room for a gap between aggregate demand and aggregate supply. Thus, Say’s Equality is the respected form of expression that present-day central bank policy-makers follow, especially if Patinkin’s dichotomization charge is addressed as well.
In none of the historical treatments leading into today’s decision-making criteria has the interdependence of prices on one another, as recursive input-output stages, been taken into consideration as a factor in the formation of prices. Walras Law assumes that given n goods there are n−1 independent prices and n−2 independent relative prices. Classical and neoclassical economists as well as monetarists have assumed n−1 independent variables and concentrated on issues stemming from dichotomization, gaps between aggregate demand and supply, the Cantillon effect, the transactions motive, and competition among stock markets, money markets, and government bond prices. The fundamental fact that the Walrasian equation cannot be arranged and solved as a set of n equations with n−1 independent variables, because of the recursive and dependent nature of those n goods, has been ignored. This may be because the increased range and speed of trade were never issues in the past, so the interdependence of variables could safely be ignored. One may also suspect that Say and Walras simply attempted to simplify the model by deliberately ignoring the dependence of the variables. However, Russell and Hilbert had parallel efforts in another branch of mathematics similar to those of Walras, pointing more towards an erroneous assumption than an effort to simplify the model.
Perhaps the most stunning effect of not recognizing that the basic Walras Law is built on an assumption that does not conform to the real world is its misleading role in central bank policies. While Say’s Identity and the Walrasian equation have served the historical development of fundamental concepts, they are increasingly misleading in an age of global electronic transactions and unforeseen increases in the velocity figures of the famous Locke-Fisher quantity equation. The increased pace of transactions, the velocity V in the Fisher equation, in effect causes the interdependence to become more and more non-negligible. Even under the slow-velocity, slow-transaction conditions of ancient world trade, if input-output coupling was extensively encountered, prices would shoot up rapidly. This may well have been the case during the Roman Empire as well as during the English price revolution. Historically, the traversing of a few well-established trade paths may have been the significant factor in such a coupling of input-output stages, in the sense of directly feeding output into input, perhaps with a few intermediary products interspersed in between.
A further ramification of the erroneous construction of Walras Law (Say’s Equality) is that excessive importance may be attributed to gaps between aggregate demand and supply. Most fluctuations in the aggregate gap are equilibrium-seeking on their own, without the need for any policy action. This includes fluctuations in the demand for money, the demand for workers (Phillips Curve), and the demand for goods and services. If every such gap is evaluated as a threat of inflation, the complications of policy action can be more harmful than the fluctuation itself, since additional ripple factors will stem from the policy action. While Say’s Identity is extremely helpful in developing a concept of aggregate statistics, the systems approach can show that the two sides of the equation will never catch up with each other except when economic activity ceases to exist.
Entropy and price stickiness, STM approach
In a Boltzmannian model, the exchange of money, and therefore trade, is rendered by a simple transfer of momentum in the form of atomic and molecular collisions. As atoms collide with each other, energy is transferred in the form of momentum change. Restrictions are many, including that there be no molecular reactions and that collisions be perfectly elastic. A fundamental restriction is that the total energy available to the players (atoms/traders) be kept constant, in compliance with the first law of thermodynamics. Corresponding models in the domain of economics would be much more elaborate, and for advanced systems, biological or industrial, the applicable transformational dynamics of energy would be energetics rather than thermodynamics.
Advanced systems are endowed with entropy-reversal mechanisms. In this regard, neither biological nor economic systems are irreversible with respect to changes in entropy, and thus the 2nd Law is twisted and turned over many times. The distribution model summarized above has significant biases, and probability would not translate into stability levels in either biological or economic applications. A highly probable state as calculated may have low stability, and a low-probability state may have relatively high stability in advanced systems. Low stability means costly to render; high stability means affordable in terms of energy consumption.
The distribution statistics of systems theory depend upon the clustering of input products into intermediary and final products. The higher-order question in STM is not how capital is distributed but how products are formed. In other words, if we have 7 producers, the number of different ways each producer can compile his product from the products of the other producers is the issue of importance for STM. This is not a total rejection of the effects of capital distribution on entropy; rather, it indicates that such distribution has a lower-order effect on total entropy, and that the principal factor for STM is an analysis of input clusters, as follows. We must mention that any produce falls into three categories; notice that the conventional classification of produce into raw, intermediary, and final product categories is almost eliminated here:
In reality "true raw" products do not existent. From the standpoint of STM this is very important. Petroleum and wheat may be considered raw produce. But oil depends on steel, construction, electronics and transportation industries as well as labor force before it can ever be brought to the marketplace and they collectively form some portion of the costs and therefore some fraction of the price of crude.
Value-added products are the most common type, while compiled products are also quite prevalent. For example, Dell computers are "compiled" equipment in the sense that none of the components is produced by the personal computer manufacturer; they are all bought from electronics factories and software suppliers. However, in the days when IBM marketed OS/2-based PCs, the operating system was fully owned by IBM as a value added to the hardware, some of which came from IBM while other components were bought from external companies.
If we go back to our example of seven producers and assume that producer no.1 compiles his product from the six others, buying equal amounts of input materials from each, we would express the anagram as 0-1-1-1-1-1-1. A new anagram, 1-0-0-3-2-2-0, means he puts in one unit of his own produce and mixes it with 3 units from no.4, 2 units from no.5, 2 units from no.6, and none from producers no.2, no.3, or no.7.
At first it may seem that, just as Boltzmann distributed a fixed amount of energy into seven placeholders, we are trying to find clusters of input products through permutations of similar types, known as anagrams. This is not exactly the case: the process in Boltzmann’s distribution of seven energy levels to seven atoms is perfectly cyclic and finite unless the cycle is run without a time limit. The process in the STM distribution, on the other hand, is recursive and yields no end in finite attempts, although the total entropy still converges as it would in the Boltzmannian computation of entropy.
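For contrast, the Boltzmannian side of the comparison is a closed, finite computation. A minimal sketch, with an arbitrary illustrative occupation pattern: for N atoms distributed over the energy levels, the multiplicity is W = N!/∏nᵢ!, and the entropy is S = ln W (Boltzmann’s constant set to 1):

```python
from math import factorial, log

# Number of atoms occupying each of seven energy levels: 7 atoms total.
# The pattern is an arbitrary illustration, not a derived distribution.
occupancy = [3, 2, 1, 1, 0, 0, 0]

N = sum(occupancy)           # 7 atoms
W = factorial(N)             # multiplicity W = N! / prod(n_i!)
for n in occupancy:
    W //= factorial(n)

S = log(W)                   # Boltzmann entropy with k = 1
print(W, round(S, 2))        # 420 6.04
```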
In the Boltzmannian sample, at some point in time we would find the total list of possible anagrams; from then on the list would simply repeat deterministically, and the system would be in equilibrium. In the anagrams produced by STM, the full list of possible anagrams would never be completed, just as in number theory a full list of prime numbers can never be obtained, because soon there would come a new prime number. How does this happen? It happens in exactly the manner Godel had foreseen when he organized his formalization techniques. Before looking into how this may arise, let us take a break here to ask a meta-analytical question:
Why are we doing this? The complexity of industrial input-output analysis is anyone’s guess. While such information is, in theory, retrievable and traceable, is that the aim? Are we out to create a Domesday Book of who buys from whom?
The aim is not to monitor economic activity with a modern-day version of the Domesday Book. We are not in search of the absolute values of entropy, inflation, or any other parameter in economics. Rather, we want to know how these parameters influence each other and what factors are involved. An electronic Domesday Book would probably not be feasible, politically or financially. It is not as crucial to know the exact figures for leading indicators as it is to know which factors influence them and by how much.
Going back to the producer model, and reducing the number of producers from seven to four to make it easier to visualize, we can have anagrams like the following:
1-3-2-0 means producer no.1 adds 1 unit of his own material, 3 units from no.2, 2 units from no.3, and none from no.4.
2-1-0-3 means, in turn, that producer no.2 adds 2 units from no.1, uses 1 unit of his own, none from no.3, and 3 units from no.4.
Let 1-3-2-0 and 1-1-3-2 be the ingredient formulas of the other two producers. Let us see what happens after the first business cycle:
Producer 1, whose ingredient formula is 1-3-2-0, will now buy according to this formula from the other producers. The formula orders him to put in one unit of his own produce, 1[1320] (we will represent final products with bracketed numbers and formulas with dash sequences), then buy three units from the second producer, 3[2103], then 2 units from the third and none from the fourth. At the end of the first business cycle we have:
1[1320] 3[2103] 2[1320] 0[1132]
The bracketed anagrams are the ingredient formulas of each producer, while the numeric prefixes, put together, spell out the ingredient formula of producer one. In explicit form, formula one is a ‘formula of formulas’. If we were to go ahead and write down the second producer’s ingredient formula, the explicit form becomes:
Since the ingredient formula for no.2 is 2-1-0-3:
2[1[1320] 3[2103] 2[1320] 0[1132]] 1[2103] 0[1320] 3[1132]
The ingredient formula for no.3 is 1-3-2-0:
1[1[1320] 3[2103] 2[1320] 0[1132]]3[2[1[1320] 3[2103] 2[1320] 0[1132]] 1[2103] 0[1320] 3[1132]]2[1320]0[1132]
Let us do the last one more explicitly. We first fetch the master formula for no.4, which is 1-1-3-2, then put it on a "product assembly line" as follows:
1[...]1[...]3[...]2[...]
Then we fill each bracket with the corresponding extended product formula, one by one, by a cut-and-paste operation:
1[1[1320] 3[2103] 2[1320] 0[1132]]1[2[1[1320] 3[2103] 2[1320] 0[1132]] 1[2103] 0[1320] 3[1132]]3[1[1[1320] 3[2103] 2[1320] 0[1132]]3[2[1[1320] 3[2103] 2[1320] 0[1132]] 1[2103] 0[1320] 3[1132]]2[1320]0[1132]]2[1[1[1320] 3[2103] 2[1320] 0[1132]]3[2[1[1320] 3[2103] 2[1320] 0[1132]] 1[2103] 0[1320] 3[1132]]2[1320]0[1132]]
In summary:
The above points constitute the concerns that need to be addressed in addition to Boltzmann’s derivations. In the Boltzmannian model we do not end up in recursive conditions; over the course of time everything is cyclic. If we modeled the real marketplace’s industrial sectors and labor in a large-scale model, with, say, 10,000 components instead of four, we would see the following topological distribution of activities:
Given sufficient time, or infinite business cycles, the majority of nodes (traders/producers) will be connected. In such a case the graph topology will roughly resemble a Hamiltonian graph, in the sense that, given enough time, a token (money) will visit each node at least once; nevertheless, there will still be nodes beyond reach.
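This claim can be illustrated with a small simulation. The topology below is a random sketch, not a calibrated model: a token performing a random walk over a sparse trade graph reaches most, but not all, vertices within a bounded number of transactions:

```python
import random

random.seed(0)

n = 1000                                  # traders (vertices)
# Hypothetical sparse topology: each trader passes money on to 5 partners.
links = {v: random.sample(range(n), 5) for v in range(n)}

node, visited = 0, {0}
for _ in range(2000):                     # a bounded number of business cycles
    node = random.choice(links[node])     # the token changes hands
    visited.add(node)

print(len(visited), "of", n, "traders reached")
```

With more cycles the coverage approaches, but for a bounded run does not reach, the full vertex set, mirroring the "nodes beyond reach" above.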
Three additional corrections conclude the basic treatment:
Step 1:
1[1320P] 3[2103P] 2[1320P] 0[1132P]
Step 2:
2[1[1320P] 3[2103P] 2[1320P] 0[1132P]] 1[2103P] 0[1320P] 3[1132P]
Step 3:
1[1[1320P]3[2103P]2[1320P]0[1132P]]3[2[1[1320P]3[2103P]2[1320P]0[1132P]]1[2103P]0[1320P]3[1132P]]
2[1320P]0[1132P]
Step 4:
1[1[1320P]3[2103P]2[1320P]0[1132P]]1[2[1[1320P]3[2103P]2[1320P]0[1132P]]1[2103P]0[1320P]3[1132P]]
3[1[1[1320P]3[2103P]2[1320P]0[1132P]]3[2[1[1320P]3[2103P]2[1320P]0[1132P]]1[2103P]0[1320P]3[1132P]]
2[1320P]0[1132P]]2[1[1[1320P]3[2103P]2[1320P]0[1132P]]3[2[1[1320P]3[2103P]2[1320P]0[1132P]]1[2103P]0[1320P]3[1132P]]2[1320P]0[1132P]]
Notice that at the end of step 4, while we would expect the four producers to have four units of profit, 37 profit markers are observed. This false profit boom occurred due to the recursive cycle. Naturally, had we looked into all possible distribution numbers and associated microstates, we would have seen that not all trades end in such short-circuit situations. Nevertheless, the godelization process will show that a considerable number of such recursive boom cycles will occur and in effect force profits to be adjusted, or else the 1st Law of thermodynamics would have to be violated. Since in real life formulas and profit markers are not inscribed and made visible, traders cannot know the source of the pseudo-boom and cannot easily distinguish recurring profits from legitimate ones. As a result they have to cut profits broadly, resulting in the profit-recession process we are all too familiar with.
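The stepwise expansion and its profit-marker count can be reproduced mechanically. The sketch below follows the worked example, including its choice of filling producer 4’s own slot with producer 3’s expansion; the final count depends on that choice:

```python
# Ingredient formulas ("anagrams") for the four producers.
formulas = {1: "1320", 2: "2103", 3: "1320", 4: "1132"}

def raw(p):
    """Raw formula string carrying one profit marker, e.g. '1320P'."""
    return formulas[p] + "P"

# Step k: producer k compiles its product. Slots of already-expanded
# producers receive their expansions; later slots receive raw formulas.
# Producer 4's own slot receives producer 3's expansion, as in the text.
exp = {}
for k in (1, 2, 3, 4):
    parts = []
    for slot, coeff in enumerate(formulas[k], start=1):
        if slot < k:
            inner = exp[slot]
        elif k == 4 and slot == 4:
            inner = exp[3]
        else:
            inner = raw(slot)
        parts.append(coeff + "[" + inner + "]")
    exp[k] = "".join(parts)
    print("step", k, ":", exp[k].count("P"), "profit markers")
```

Each bracket carries one marker P regardless of its coefficient, so the counts grow as 4, 7, 13, 37 across the four steps: the recursive substitution, not any real sale, multiplies the apparent profits.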
In the four-producer example, proliferation happened so fast that we could refer to it as a chain reaction. The velocity in the Fisher formula partially determines the speed of such proliferation. Policy-makers thus intuitively attempt to slow down the economy at such times, although that amounts to smothering the problem rather than solving it. More essentially, continuous changes in trade patterns, as well as the very large number of traders and producers, limit the extent to which such short-circuit conditions are encountered. From within the system, recursion is unavoidable and constitutes an inherent systems attribute. To a meta-system observer, however, its effects can be predicted to a great extent and monitored, barring effects due to prime nodes and unpredictable "anagrams" such as technological advances.
It is quite tempting to turn to e-commerce as a meta-system tool for dealing with entropy and its destabilizing effects on monetary policy. At this introductory stage, this paper will not go to that extent.
The Special and General STM: Methods and Different Domains
The special theory (SSTM) derives the inflation rate from entropy computed in a graph topology model of a marketplace where commodities and money travel in different directions but are identically subject to the laws of demand and supply. The marketplace is subdivided into sectors (subsystems) with boundaries and the related kinetics and dynamics of movement between them. Each node (vertex) is connected to others, to and from which transactions, i.e., transfers of money, occur. The pathways that money, in all its forms (M1/M2/M3), traverses are the active graph links. Temporal and topological patterns constitute the verbs and nouns of the graph, while goal-seeking changes in the overall topology constitute the semantics of the graph system. Associated with each node is a host of attributes such as its descriptor (person/entity/bank/employer, etc.), source/sink capacitance, reactance, and elasticity functions identified by and derived from the General Systems Theory of Bertalanffy and others. Nevertheless, the special theory is ultimately about the rate of inflation and methods of computing entropy as a state function of the topology that describes the totality of the market.
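A node of such a graph might be sketched as a data structure as follows. This is a minimal illustration: the attribute names follow the text, while the types and the flow representation are assumptions:

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class Node:
    """One trader (vertex) in the SSTM market topology."""
    descriptor: str                      # person / entity / bank / employer ...
    capacitance: float = 0.0             # source/sink capacity for money
    reactance: float = 0.0               # resistance to changes in money flow
    elasticity: float = 1.0              # stub for an elasticity function
    links: Dict[str, float] = field(default_factory=dict)  # partner -> flow

# Active graph links are the pathways money actually traverses:
bank = Node("bank")
baker = Node("baker", links={"bank": 12.5})
print([p for p, flow in baker.links.items() if flow > 0])  # ['bank']
```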
The general theory (GSTM) has broader objectives: it attempts to quantify and unify the effects of fiscal, monetary, and social policies through corrections made to the Quantity Theory.
While this concludes the introductory section, the following articles will be based on subsections of this working paper. In parallel, collateral work is in progress in the form of a software methodology and associated program, essentially a library of functions with an Application Programming Interface (API), that handles complexity management, graph topology routines, and the godelization process in front of a forecasting model base. Nicknamed CSM by the author, the Complex Systems Management toolkit will be described in detail in the upcoming sections.
_________________________________________________________________________________
References:
(1) Proposed and developed by the biologist Ludwig von Bertalanffy in 1968, General Systems Theory has its origins in the 1940s. Bertalanffy and Ashby developed a unifying concept of systems derived from abstractions of various systems encountered in different domains such as physics, biology and electronics. As a transdisciplinary study, systems theory has been greatly influenced by cybernetics. Systems theory also was a reaction against reductionism and adopted the holist viewpoint that the whole is more than a simplistic sum of its parts. The essential method of systems theory is abstraction of subsystems and their relationships with each other and with respect to the whole. This approach is unifying in its nature, and the process of abstraction serves to uncover attributes of one type of system from the known features of different disciplines and branches of science. Please visit http://pespmc1.vub.ac.be/systheor.html for a guiding article.
(2) Small, Ian and Yates, Tony, "What makes prices sticky? Some survey evidence for the United Kingdom", Bank of England Quarterly Bulletin, August 1999, p. 268.
(3) Mayhew, N.J., Sterling: The History of a Currency, John Wiley & Sons, Inc., 1999, p. 87.
(4) McLoed, A.N., "A critique of neoclassicism and New Keynesianism", Eastern Economic Journal, Vol. 23, No. 1, Winter 1997, pp. 101-112.
(5) Small, Ian and Yates, Tony, "What makes prices sticky? Some survey evidence for the United Kingdom", Bank of England Quarterly Bulletin, August 1999, pp. 262-268.
(6) Kiley, Michael T., "Staggered Price Setting and Real Rigidities", Division of Research and Statistics, Federal Reserve Board, August 1997.
(7) Kiley, Michael T., "The Lead of Output Over Inflation in Sticky Price Models", Division of Research and Statistics, Federal Reserve Board, August 15, 1996.
(8) Engel, Charles and Rogers, John H., "Violating the Law of One Price: Should We Make a Federal Case Out of It?", Division of Research and Statistics, Federal Reserve Board.
(9) Kiley, Michael T., "Endogenous Price Stickiness and Business Cycle Persistence", Division of Research and Statistics, Federal Reserve Board, July 18, 1996.
(10) Erceg, Christopher J., "Nominal Wage Rigidities and the Propagation of Monetary Disturbances", Board of Governors of the Federal Reserve System, International Finance Discussion Papers, No. 590, September 1997.
(11) Hooker, Mark A., "Exploring the Robustness of the Oil Price-Macroeconomy Relationship", Division of Research and Statistics, Federal Reserve Board, revised October 1997.
(12) Fleischman, Charles A., "The Causes of Business Cycles and the Cyclicality of Real Wages", Division of Research and Statistics, Federal Reserve Board, October 1999.
(13) Bordo, Michael D., Erceg, Christopher J. and Evans, Charles L., "Money, Sticky Wages and the Great Depression", Board of Governors of the Federal Reserve System, International Finance Discussion Papers, No. 591, September 1997.
(14) Blaug, Mark, Economic Theory in Retrospect, Cambridge University Press, fifth edition.
(15) Dugdale, J.S., Entropy and Its Physical Meaning, Taylor and Francis Inc., 1996.
(16) Nagel, Ernest and Newman, James R., Godel's Proof, New York University Press, 1986.
(17) Godel, Kurt, On Formally Undecidable Propositions of Principia Mathematica and Related Systems, translated by B. Meltzer, Dover Publications Inc., 1992.
(18) Smith, Adam, An Inquiry into the Nature and Causes of The Wealth of Nations, The Modern Library, New York, 1994.
Adrian Gunnar Perison, a.k.a. Guner Gulyasar, consultant-programmer, is currently on sabbatical and was previously with IBM T.J. Watson Research Center. He has a research background in biomedical and cognitive sciences, as well as distribution statistics and digital signal processing, and can be contacted at agguly55@hotmail.com.