A Systems Theoretical Approach

Uncertainty, Indeterminacy and Shannon's Derivation of Entropy: Implications for Policy Administration

 

 

Working Paper

by 

Adrian Gunnar Perison M.D.

December 5, 2001

 

Abstract

 

Most prices and interest rates display fluctuating levels that embody extractable energy and, equivalently, extractable money. Such fluctuations also carry varying degrees of uncertainty. Shannon's derivations of spectral entropy and information content offer computational techniques for unraveling the portion of useful economic activity resident in such cycles, a portion that otherwise goes unaccounted for as a significant source of missing data in the composition of the overall economy. Shannon's concept of spectral entropy can also be exploited to quantify the amount of uncertainty that presides over expectations, whether adaptive or rational. Apart from Clausius-type and Boltzmannian derivations, Shannon's derivation of entropy, which follows Boltzmann's lead, can be applied to price and rate fluctuations, thus unraveling a higher-order generator mechanism for cost pressure and inflation. In this working paper the basis and fundamentals of the Shannon-Weaver formulas are discussed in their raw forms, prior to application samples that will follow in a separate article.

 

Keywords

Adaptive expectations, rational expectations, inflation targeting, decision analytical process, intelligent agents, seigniorage, active policy, inactive policy, exogenous uncertainty, nominal interest rate pegging, spectral analysis, relative power, spectral band, within-band entropy, between-band entropy, information percentage.

See also: Systems Theory of Macroeconomics, Introduction to

 

Part One

-Basis-

In looking at the above chart one notices the apparent cyclicality as well as a specific trend that is ostensibly downward. The absence of a scale precludes attaching much context: the time domain could be hours, weeks or decades, and in each case an observer would draw different conclusions. If the scale were hours, we could conclude it shows a day's activity in some stock or commodity market, ending the day in a downward trend. If the scale were decades, the chart could as well relate to a central bank's adjustment of interest rates.

In reality the trend is not downward at all. It is simply a phase-locked mixture of 4 Hz, 3 Hz and 6 Hz sinusoid signals. Had the mixture been "enriched" with additional phase differences, random noise and a nonstationary source shifting the frequencies, the original three-factor event would approach a forbidding level of complexity despite comprising no more than five to eight factors.
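The three-factor mixture above can be reproduced and unmasked with a few lines of code. The following is a minimal sketch, not taken from the paper: the sampling rate, amplitudes and the naive discrete Fourier transform are illustrative assumptions, chosen so that 3, 4 and 6 Hz fall on exact frequency bins.

```python
import cmath
import math

FS = 32  # assumed sampling rate (Hz), comfortably above twice the 6 Hz maximum
N = FS   # one second of samples gives a 1 Hz frequency resolution

# Phase-locked mixture of the 4 Hz, 3 Hz and 6 Hz sinusoids from the text
signal = [math.sin(2 * math.pi * 4 * t / FS)
          + math.sin(2 * math.pi * 3 * t / FS)
          + math.sin(2 * math.pi * 6 * t / FS)
          for t in range(N)]

def power_spectrum(x):
    """Naive DFT power at each positive-frequency bin (illustration only)."""
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) ** 2 / n
            for k in range(n // 2 + 1)]

# The apparent "trend" resolves into exactly three spectral lines.
spectrum = power_spectrum(signal)
peaks = [k for k, p in enumerate(spectrum) if p > 1.0]  # -> [3, 4, 6]
```

However complex the waveform looks in the time domain, the spectrum exposes its three underlying factors directly, which is the sense in which spectral content acts as a factorial analysis in this paper.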

Trend analysis of recorded data, whether as part of an adaptive or a rational expectations process, is subject to similar but far more profound and subtle misrepresentations of the underlying factors, owing to the limitations of extrapolation techniques. Exogenous as well as endogenous uncertainty often results in indeterminacy of some traced attribute such as prices or interest rates. Forecasting frequently involves hierarchic decision trees such as acyclic graphs, and such complex decision circuitry is often itself composed of trends of this kind. This approach contrasts with the simple tautological models constructed during the sixties. Software techniques enabling advanced graph structures, along with neural computing alternatives, offer support that was not yet developed when most of the postmodern decision-analytical advances in macroeconomics were made.

Tautological models have yielded invaluable insight for policy making, but they present a dilemma: price levels will be indeterminate if adaptive expectations are assumed, while if rational expectations form the basis of policy and investment strategies, uncertainty due to a multitude of endogenous and exogenous factors will emerge.

The fundamental approach of economists during the postmodern era has been to apply decision-analytical techniques on a case-by-case basis to presumed models and their monetary consequences, eventually building a tautological table of each behavior and its consequences for price levels and budgetary deficits. In other words, the macroeconomics of the postmodern era was characterized by a synthesis of behavioral sciences, game theory and decision-analytical approaches. Precious conclusions have nevertheless been drawn from this approach, slowly leading to a body of rules regarding various strategies: the effects of interest rate pegging, policy activism, inactive policy, fixed interest rate policies, tradeoffs between output and price stability, and so on.

The monetarist approach is to accept uncertainty as the main factor and to refrain from activism accordingly, especially since the effects of monetary policy take time to materialize, so that activism can destabilize the economy as an additional factor out of sync with the targeted ones. Secondly, the monetarist approach suggests that the policy maker refrain from feeding adaptive expectations, since these can lead to a loss of control over nominal prices. A third way has also emerged as monetarists and Keynesians have found common ground in somewhat interchangeable periods of administering fiscal deficits and periods of interest-rate-based adjustments. Whichever choice prevails, a steady increase of money volume has been the broadest choice of Friedman's school, whether accomplished through IS or LM approaches, as long as extensive fluctuations in volumetric growth are not permitted.

The systems theoretical approach to macroeconomics treats uncertainty and indeterminacy from a non-geometrical standpoint. Conventional measures of data dispersion, as used by correlational and regressional tests, are not the immediate methods of preference for STM. Such methods, based on measuring the separation of data points from a trend or within it, such as autoregression and autocorrelation, do not reflect cyclical variations well. With the cyclicality of data being one reason, and a more directed effort toward quantifying randomness and information content being the other, STM utilizes methods better suited to alternating, cyclical trends. That said, most time-series data encountered in macroeconomics over a uniform period is not lengthy enough for full utilization of digital signal processing methods either. From this perspective, the accuracy of statistical measurements is not as crucial as the power of probing into the underlying factorial representation of macroeconomic systems. Thus spectral content takes over from measures of dispersion in its power to examine factorial as well as cyclical relations. Shannon's information theory, with its quantification of the entropy and information embodied in a signal, can be useful both as a factorial analysis tool and as a measure of the uncertainty associated with trends and data sets.

 

Spectral content as a mechanism of money generation

 

Fluctuation of the price level is an important determinant of the store value of money and, conversely, of goods. By analogy, the price level corresponds to the level of water in a container whose bottom area corresponds to the capacitance of the container. Multiplying the water level by the area estimates the amount of water held in the container. Likewise a transaction has two arguments, price and amount. One can introduce the time domain as a third parameter and derive capacitance formulas modeling the broader RCL circuit elements of alternating-current electrical systems.

From a semantics standpoint any signal can be said to have three components, a carrier, an encoder and the message, which also represents the fundamental statement of information theory. Likewise, if we plot the price level over time, such a trend can have three components. In other words the price level can become the carrier, and fluctuations can embody further value on top of that fluctuating level. Current accounting practices, whether regarding factors affecting the money supply or measurements of output such as GDP, focus only on the (carrier) price level and ignore this higher order of money generation. A typical example of this sort of money generation is of great interest to day traders. Likewise, second-order money generation can be an important instrument of seigniorage, since fluctuating interest rates can also generate a second-order mechanism of money emission or, conversely, money removal.

From this perspective a flat line of absolutely stable commodity prices is of little interest to a trader for transaction purposes, though it may offer invaluable store value. Just as worthless for transaction purposes is a totally random change in price levels, which defies all prediction and thus cannot be harnessed for the extraction of any quantity of money. It is within the gentle swing of a price level over the time domain that information as well as energy is embodied.

The elementary graph on the left can help visualize this. At the onset of the trace, a trader may somehow have obtained the future course of prices, which for simplicity mimics a sinusoidal wave. Armed with information against uncertainty, he would sell at peaks and buy at troughs, accumulating gains proportional to the frequency of the waveform and its displacement amplitude. Notice this does not come without a price, because he will have to carry an amount of money equal to the average price of the commodity. In observance of Clausius-type entropy, ΔS = ΔQ/T, or in economics, as adapted in the previous article, the corresponding ΔS = ΔM/P, the higher the offset of the carrier, the less efficient the trade turns out to be. Such efficiency considerations also apply to stock price levels, as day traders know all too well.
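The trader's gains and the efficiency penalty of a high carrier price can be put into a back-of-the-envelope sketch. All figures and variable names below are hypothetical illustration values, not data from the paper.

```python
# Hypothetical parameters for a perfectly foreseen sinusoidal price
AMPLITUDE = 2.0   # price swing around the mean
OFFSET = 50.0     # mean ("carrier") price the trader must carry as capital
FREQ = 4          # price cycles per unit time
T = 10            # trading horizon in the same time unit

# With perfect foresight the trader buys one unit at every trough and sells
# at the following peak, pocketing the peak-to-trough spread of 2*AMPLITUDE.
cycles = FREQ * T
gain = 2 * AMPLITUDE * cycles          # proportional to frequency and amplitude

# In the spirit of the Clausius analogy (dS = dM/P): the same swing riding on
# a higher carrier price ties up more capital, so the trade is less efficient.
efficiency = (gain / T) / OFFSET       # gain per unit time per unit of capital carried
```

Doubling the frequency or the amplitude doubles the gain, while doubling the carrier offset halves the efficiency, which is the trade-off the paragraph above describes.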

According to Shannon's version of entropy, the sinusoidal fluctuations in this example present minimum entropy and yield maximum information. Claude Shannon introduced the term 'bit' (binary digit) in "A Mathematical Theory of Communication" (1948), republished with Weaver in 1949. In Shannon's derivation, entropy is measured in bits.

In contrast to the above example with unifactorial price swings, we can introduce a second, unknown factor causing the trader's expectations to fail on occasion, as below:

Notice that the first and fourth peaks are somewhat accentuated and the second and third peaks have been flattened out after the superimposition of a 3 Hz signal on top of the original 4 Hz sinusoid. The resultant trend will reduce the trader's gains by at least half. In STM's version of Shannon's entropy we will follow a course similar to the previous article, importing some concepts from Gödelization, and we will emphasize spectral components as the factors in our version of factorial analysis. (Fundamental frequencies, and not their harmonics, count as prime factors, in a way similar to Gödel's use of prime numbers.)

At this instance entropy has become at least twice what it was previously, and the trader's profits have slumped by at least fifty percent.

 

In this example a third factor (sinusoid waveform) is introduced. Informational value and predictability have thus fallen by a third, and entropy has risen to at least three times that of the original 4 Hz signal. Further superimposition of random noise increases entropy and uncertainty, and with lessened information content the trader's profits slump even more.
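The claimed rise in entropy from one factor to three can be checked directly on the normalized power spectrum. This is a minimal sketch under assumed parameters (32 Hz sampling, unit amplitudes, a naive DFT); none of the numeric choices come from the paper.

```python
import cmath
import math

FS = 32  # assumed sampling rate (Hz); one second of data gives 1 Hz bins

def spectral_entropy(x):
    """Shannon entropy (bits) of the normalized DFT power spectrum."""
    n = len(x)
    power = [abs(sum(x[t] * cmath.exp(-2j * math.pi * k * t / n)
                     for t in range(n))) ** 2
             for k in range(1, n // 2 + 1)]   # positive-frequency bins
    total = sum(power)
    probs = [p / total for p in power if p / total > 1e-12]
    return -sum(p * math.log2(p) for p in probs)

one_factor = [math.sin(2 * math.pi * 4 * t / FS) for t in range(FS)]
three_factor = [sum(math.sin(2 * math.pi * f * t / FS) for f in (3, 4, 6))
                for t in range(FS)]

h1 = spectral_entropy(one_factor)    # all power in one line: 0 bits
h3 = spectral_entropy(three_factor)  # three equal lines: log2(3) ≈ 1.585 bits
```

A single spectral line carries zero entropy (maximum predictability), while three equal lines carry log2(3) bits, matching the paper's point that each added factor raises entropy and erodes the trader's informational edge.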

While the trader seems to be resorting to adaptive expectations, that is beside the point. Whether the trader forms expectations adaptively or rationally, whether he is an intelligent agent employing some investment algorithm or a stochastics practitioner, so long as entropy increases, the informational advantages of whatever tools are at his disposal will fail to meet expectations, as forecasts will miss actual reality.

At this point, as far as systems theory is concerned, the method of forecasting is irrelevant. STM seeks a quantification of information, and hence of uncertainty, regardless of the forecasting method employed. The studies of economists through the sixties and seventies bore much fruit in highlighting the paths likely to be traversed when each method of expectation forms the basis of fiscal or monetary policy. By the same token, today no one is naive enough to form adaptive expectations as an investment basis, yet it can also be said that no one is smart enough to conduct impeccable rational expectations either. Shannon's entropy, as we shall see, helps economists first and foremost find out how much juice is available for successful forecasting, even before deciding upon the forecasting method of choice.

The alternating nature of prices meanwhile forms the basis of, and an essential resource-extraction mechanism for, economic activity. Such price fluctuations resemble AC circuits, except that economic systems are fundamentally open systems whereas electrical circuitry by definition is a closed system. Similarly, price fluctuations in a market yield extra money into circulation that is not possible with flat prices. Alternatively, such a mechanism can also cause money removal from the system.

Suppose the trend on the left is a graph of central bank interest rates over six years (or 12 six-month epochs), and suppose for this example that the overall trend is indeed downward. During this period, as interest rates drop, debtors pay less in discount than was originally contracted. The difference between expected and observed rates on borrowings from the central bank will equal the amount of money created. Bear in mind, however, that if the trend is reversed and interest rates rise in a long and mild train of raises, the resultant trend will cause removal of money from the system. Whether during money creation or, conversely, money removal, some quantity will be lost to Clausius-type entropy, as explained in the previous article, and the net loss will be incurred as cost pressure posing as an inflationist addendum. Creation of money through swings in interest rates in this manner is presumably very tricky. While it happens as a side effect of policy administration, in the coming articles we shall see how it can also be successfully exploited.
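The creation side of this mechanism can be illustrated with a toy calculation. Every figure below is a hypothetical assumption (a 10% starting rate drifting down half a basis step per epoch over a fixed rolled-over principal); only the accounting rule, money created equals the expected-minus-observed rate gap applied to the borrowing, comes from the text.

```python
# Hypothetical 12 six-month epochs of central-bank rates, drifting downward.
expected = [0.10] * 12                             # rate debtors priced in at borrowing
observed = [0.10 - 0.005 * k for k in range(12)]   # mildly falling actual rates

PRINCIPAL = 1_000_000  # stock of borrowing rolled over each epoch (hypothetical)

# Money created per epoch: the gap between expected and observed discount
# on the same principal. A mirror-image rising-rate path would produce
# negative entries here, i.e. money removed from the system.
created = [(e - o) * PRINCIPAL for e, o in zip(expected, observed)]
total_created = sum(created)
```

Note that the paper's Clausius-type entropy loss would shave some fraction off `total_created` before it circulates; the sketch deliberately omits that term since no loss coefficient is given.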

 

 

Part Two

-Fundamental Derivations-

 

Claude Shannon's derivation of entropy for a power spectrum follows the course set by Boltzmann. While the original calculations were made for atoms with discrete energy levels, in Shannon's version the distribution numbers and microstates are identified similarly for discrete spectral lines, each with some energy level, the total being normalized to yield a sum of 1. In other words, we replace Boltzmann's "n atoms" and "m associated discrete energy levels" with the corresponding "n spectral lines" and "m discrete energy levels". Shannon also introduced two additional concepts, the within-band and between-band entropies, which appear in the steps below.

Before calculations are made, the Fourier transform is applied to the trend data, yielding the power spectrum. For best results the data must be sampled at no less than twice the highest frequency of interest, the Nyquist frequency. The sequence of calculations is as follows (6):

1) The power spectrum is normalized. The power levels obtained from the Fourier transform are summed over all frequencies, Σ p(frq), giving the total power. Each frequency's power level is then divided by the total power, so that in the end Σ p(frq) = 1. This way different data sets with differing amplitudes can be standardized. Normalized power is also called relative power.

2) The total entropy of the whole frequency range (up to half the sampling frequency) is computed by multiplying the power at each frequency by the logarithm of that power, summing, and multiplying the result by -1: -Σ p(frq) × log p(frq). The base of the logarithm is 2.

3) Within-band entropy (WBE) is calculated. (Here Shannon also introduced a very useful approach for general systems theory, apart from his intended work. In Boltzmann's calculations, n atoms with k energy levels occupy the same chamber. For multicompartmental systems, Shannon's approach through WBE represents a transition away from this simplistic single-chamber picture. Although energy transfers and interactions between frequency bands are not allowed in Shannon's derivation, it is nevertheless a step closer to multi-chamber environments.) To calculate the within-band entropy, first the relative band power (RBP) is obtained by summing the relative power of every frequency in that band. In other words, RBP is the proportion of power a band contributes to the total normalized power of the whole frequency range up to the Nyquist frequency. Then the entropy for that band is computed by dividing each frequency's relative power by the RBP, multiplying by the logarithm of the same quantity, summing within the band, and negating: WBE = -Σ p(frq)/RBP × log p(frq)/RBP.

4) Between-band entropy (BBE) is calculated. The relative band powers are normalized: each RBP is divided by the sum of all RBPs so that Σ RBP = 1. Once each band power is normalized, BBE is computed:

BBE = -Σ (RBP) × (log RBP) ; summation is performed over the bands

5) The Shannon-Weaver information decomposition is obtained:

Total Entropy = BBE + Σ (RBP) × (WBE) ; summed over the bands

6) The information contribution of each band is:

I = -(RBP) × (log RBP) + (RBP) × (WBE), where the percentage contribution is:

P = 100 × I / (Total Entropy)
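The six steps above can be sketched as a small routine. This is an illustrative implementation: the function name, the toy four-line spectrum and the two-band layout in the usage line are assumptions, not values from the source.

```python
import math

def shannon_decomposition(power, bands):
    """Shannon-Weaver entropy decomposition of a power spectrum, following
    steps 1-6 above. `power` maps frequency -> raw power; `bands` maps a
    band name -> list of the frequencies belonging to that band."""
    # 1) normalize to relative power so that sum of p(frq) = 1
    total_power = sum(power.values())
    p = {f: v / total_power for f, v in power.items()}

    # 2) total entropy over the whole range, in bits (base-2 log, negated)
    total_entropy = -sum(v * math.log2(v) for v in p.values() if v > 0)

    rbp, wbe = {}, {}
    for name, freqs in bands.items():
        # 3) relative band power and within-band entropy
        rbp[name] = sum(p[f] for f in freqs)
        wbe[name] = -sum((p[f] / rbp[name]) * math.log2(p[f] / rbp[name])
                         for f in freqs if p[f] > 0)

    # 4) between-band entropy; the RBPs already sum to 1 here because the
    #    bands cover every frequency, so the normalization step is implicit
    bbe = -sum(r * math.log2(r) for r in rbp.values() if r > 0)

    # 5)-6) decomposition: Total Entropy = BBE + sum RBP*WBE, and each
    #       band's information contribution as a percentage of the total
    info = {name: -rbp[name] * math.log2(rbp[name]) + rbp[name] * wbe[name]
            for name in bands}
    pct = {name: 100 * info[name] / total_entropy for name in bands}
    return total_entropy, bbe, rbp, wbe, pct

# Hypothetical usage: four spectral lines split into two bands.
te, bbe, rbp, wbe, pct = shannon_decomposition(
    {1: 2.0, 2: 2.0, 3: 4.0, 4: 8.0},
    {"low": [1, 2], "high": [3, 4]})
```

For this toy spectrum the decomposition identity of step 5 can be verified by hand: the total entropy is 1.75 bits and equals BBE plus the RBP-weighted within-band entropies, with the band percentages summing to 100.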

 

Conclusion and Applications for Macroeconomic Analyses

 

While the theoretical fundamentals of Shannon-Weaver entropy are very strong, applying the concept without refinements poses difficulties. These stem from error and tolerance limits arising from aliasing, sampling rate, bit resolution and other standardization matters, including restrictions imposed on the signal, all of which need to be addressed before Shannon's formulas can be applied reliably to economic data in a standard way that allows meaningful comparison. However, the strength of the notion that a second-order economic activity is present within fluctuations, and that its associated entropy is a cause of inflationary pressure, is not affected in principle by problems of isomorphism between signal and spectrum, which are inherent in the Fourier transform and are handed over to Shannon's expressions of entropy.

As part of an effort to standardize entropy measurements from the Shannon-Weaver formulas, computer models of market activity and a study of the resultant entropy generation can be helpful. In the next article, through actual software models, we will discuss and study the factors affecting the accurate quantification of entropy, and how changes in entropy and endogenous uncertainty might result in unavailable money within the economy, thus adding to cost pressures.

In other words, while the conceptual background of Shannon's entropy is very strong, in applications we are likely to face signal standardization and accuracy problems. Previous experience with entropy in other areas of signal processing, such as biomedical applications and electroencephalography research, shows that entropy and information percentage are already more refined than plain spectral analysis, though issues of repeatability remain. By the same token, spectral analysis itself is far more reliable than autoregression and related tests, as well as tests of data dispersion, when the underlying events are cyclical and take place over a time domain. The preliminary impression is that the Shannon-Weaver derivations can also be advanced into factorial analysis techniques, in place of current FA techniques, when the underlying data sets are time-series records displaying the cyclicality of several superimposed processes, each with a different period.

 

_________________________________________________________________________________

  

References:

(1) Shannon, C. E. (1948). A Mathematical Theory of Communication. Bell System Technical Journal, 27.

(2) Shannon, C. E. and Weaver, W. (1949). The Mathematical Theory of Communication. Univ. of Illinois Press.

(3) Blanchard, Olivier and Fischer, Stanley, Lectures on Macroeconomics, The MIT Press, Cambridge, Massachusetts and London, England, 11th printing, 1998, pp. 567-617.

(4) Blaug, Mark, Economic Theory in Retrospect, Cambridge University Press, fifth edition.

(5) Dugdale, J. S., Entropy and Its Physical Meaning, Taylor and Francis, 1996.

(6) Inouye, T., Shinosaki, K. et al., Quantification of EEG irregularity by use of the entropy of the power spectrum, Electroencephalography and Clinical Neurophysiology, 79 (1991) 204-210, Elsevier Scientific Publishers Ireland Ltd.

 

 

Adrian Gunnar Perison, a.k.a. Guner Gulyasar, consultant-programmer, is currently on sabbatical and was previously with IBM T.J. Watson Research Center. He has a research background in biomedical and cognitive sciences, as well as distribution statistics and digital signal processing. He can be contacted at agguly55@hotmail.com