Saturday, December 1, 2012

The Rule of 72 is used to estimate the time it takes for an investment to double.  You take 72 and divide it by the growth rate expressed as a whole-number percentage to get an estimate of the doubling time.  For example, if you have a growth rate of 3%, 72/3 = 24, so with 3% growth your investment will double in approximately 24 years.

But the rule also works for growth rates other than investment returns.  If the population is growing by 3% per year, the population will double in 24 years.  Another example is medical spending in the U.S., which is growing at approximately 9% per year, meaning medical spending doubles every 8 years.  The rule is only an approximation, but what isn't happening is a rational discussion about the natural consequences of these growth rates.  If 9% per year translates into a doubling every 8 years, the discussion should be about the core issue, the growth rate, and not about the side issues.
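For anyone who wants to check the arithmetic, here is a small Python sketch comparing the Rule of 72 estimate against the exact doubling time implied by compound growth; the function names are just for illustration.

```python
import math

def doubling_time_rule_of_72(rate_percent):
    """Approximate years to double: 72 divided by the growth rate in percent."""
    return 72 / rate_percent

def doubling_time_exact(rate_percent):
    """Exact years to double, solving (1 + r)^t = 2 for t."""
    return math.log(2) / math.log(1 + rate_percent / 100)

for rate in (3, 9):
    print(f"{rate}%: rule of 72 gives {doubling_time_rule_of_72(rate):.1f} years, "
          f"exact value is {doubling_time_exact(rate):.1f} years")
```

At 3% the rule gives 24 years against an exact value of about 23.4; at 9% it gives 8 years against about 8.0, which is close enough for a back-of-the-envelope discussion.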

Friday, November 23, 2012

Gross Domestic Product (GDP) is one of the most commonly known and accepted measures of an economy's health.  GDP is defined as:

GDP = consumption + investment + government spending + (exports - imports)

or

GDP = C + I + G + (X - M)
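As a quick sanity check on the formula, here is a minimal sketch in Python; the component figures are hypothetical, chosen only to show how the expenditure approach adds up.

```python
def gdp(consumption, investment, government, exports, imports):
    """GDP by the expenditure approach: C + I + G + (X - M)."""
    return consumption + investment + government + (exports - imports)

# Hypothetical figures in trillions of dollars, for illustration only.
print(gdp(consumption=11.0, investment=2.5, government=3.2, exports=2.2, imports=2.7))
# 16.2
```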

There are many arguments against using GDP as a valid measure and they can be fodder for future posts, but right now I want to focus on the government component (G).

For the past four years, the government has been spending around $3.8 trillion per year, which directly adds to GDP.  At the same time, approximately $1.1 trillion per year has been added to the debt.  The problems with this are fairly simple.  All expenditures by government, no matter how trivial or wasteful, directly add to the G component and therefore to GDP.  So if the government pays people to move dirt from one place to another, this counts as an addition to GDP.  When, not if, the government reduces its deficit from $1.1 trillion to something less, that amount will directly reduce GDP, and knock-on effects will cause even further reductions, such as base closures that have translated into two job losses for every direct job lost.

When people crow about GDP growth of 2% to 3%, it is never mentioned that roughly 1/15th of GDP is simply deficit spending, which completely dwarfs that supposed growth.
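The arithmetic behind that 1/15th figure is simple enough to sketch; the GDP total below is assumed from the post's own ratio rather than taken from any official source.

```python
annual_deficit = 1.1   # trillions of dollars per year, from the figures above
gdp_total = 16.5       # assumed: roughly 15 times the deficit, as implied above

deficit_share = annual_deficit / gdp_total
print(f"Deficit as a share of GDP: {deficit_share:.1%}")  # about 6.7%, versus 2-3% growth
```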

Sunday, November 18, 2012

I have been following Nassim Taleb, the author of The Black Swan: Second Edition: The Impact of the Highly Improbable.  I enjoy reading his books and now his essays.  He recently wrote an essay in the Wall Street Journal, "Learning to Love Volatility," that presents five rules.  I am wondering about his third rule, however: "Small is beautiful, but it is also efficient."

The example used is that of the elephant and the mouse.  The elephant would break its leg on the slightest fall, while the mouse can fall several feet uninjured.  He also explains that "we need to distribute decisions and projects across as many units as possible, which reinforces the system by spreading errors across a wider range of sources."

My wondering comes from thinking about the efficiency of a small, decentralized system versus a large, centralized one.  I believe that the bottom-up mechanisms he mentions, such as the cantons in Switzerland, are definitely more stable and can incorporate more feedback into decisions than authoritarian regimes.  In political systems, though, I think efficiency is actually not desired.  When a political system is highly efficient, bad decisions can be quickly made and implemented.  Looking at the Constitution of the United States, it seems clear that many of its subsystems were designed to be small, but also inefficient.  The inefficiency was built in deliberately to slow down bad decisions.

Saturday, November 17, 2012

Signal to Noise Ratio

The signal-to-noise ratio (SNR) compares the level of a signal to the level of background noise.  The ratio is the power of the signal divided by the power of the noise, with a ratio greater than 1:1 indicating more signal than noise.  While the SNR was originally used in the Shannon-Hartley theorem to calculate the maximum rate of data transmission in the presence of noise, it can also be a very useful construct when trying to understand current events.
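In its original engineering sense the ratio is easy to state.  Here is a minimal Python sketch, with arbitrary power values and a Shannon-Hartley capacity included for reference.

```python
import math

def snr(signal_power, noise_power):
    """Signal-to-noise ratio: power of the signal divided by power of the noise."""
    return signal_power / noise_power

def snr_db(signal_power, noise_power):
    """The same ratio expressed in decibels."""
    return 10 * math.log10(snr(signal_power, noise_power))

def channel_capacity(bandwidth_hz, signal_power, noise_power):
    """Shannon-Hartley limit: C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr(signal_power, noise_power))

print(snr(4.0, 1.0))                             # 4.0 -> more signal than noise
print(round(snr_db(4.0, 1.0), 2))                # 6.02 dB
print(round(channel_capacity(3000, 4.0, 1.0)))   # ~6966 bits per second
```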

For example, every day there is a tremendous amount of "signal" being generated or promulgated through the media.  But the question is always: is this really signal, or noise?  One particularly prominent "signal" is the monthly employment report from the Bureau of Labor Statistics.  Each month a report is released with the number of jobs created and a corresponding unemployment rate.  These reports are nearly always treated as if they were clean signals carrying high-quality information.  Other "signals" could be recent events from overseas, scandals in the White House, and on and on.

One approach is to measure each signal against the three heuristics from my previous post: common sense, simplicity, and who benefits.  With these in mind, most so-called "signals" can immediately be recognized as noise that crowds out the real signals in the environment.  The trick then becomes filtering out all of the noise masquerading as signal and concentrating on finding and evaluating the true signals.

Friday, November 16, 2012

Yesterday's post was perhaps more cathartic than informational, but it should set the stage for more thought and discussion.  Today, let's put some more pieces in place.

Models are used everywhere.  Indeed, since the human brain consists of a very large but still finite number of neurons, it can be said that the brain of necessity constructs models of reality from its nearly infinite sources of input.  My concern with models isn't that they are bad, but rather that most people blindly accept them without questioning the validity of the model itself, the completeness and validity of the input, and the accuracy of the output.

I want to cover three heuristics (guiding rules or principles) that seem useful in constructing and analyzing models.  First, plain old common sense.  As said before, the human brain has been constructing models since birth, and most likely those models will prove useful as the person strives to continue existing.  Common sense is a vastly underused tool for understanding new models and problems.

Second, simplicity.  Occam's razor is a valuable principle stating that among competing models, the one with the fewest assumptions should be selected.  Simplicity allows the user to more accurately define the cost functions in the model.  The more complex the model, by definition, the more time and energy will be needed to work out all of the decision paths.

Finally, cui bono ("who benefits?").  In nearly every decision, there are multiple actors with their own vested interests and biases.  When analyzing a situation, an understanding of who benefits from the decision will quickly illuminate the underlying model and bring clarity to the decision.

Thursday, November 15, 2012

Now it's time for the first definition.  (Warning: probably boring stuff here.)  Here, I am going to define a model to represent thought.  From the outset, I need to be clear about what a model is and what it is intended to do.  A model is a mathematical construct of a complex system that will be used to verify current observations and predict future observations.  Models are used all the time to make sense of a sometimes infinitely complex environment.  My point is that a model is by definition a simplifying set of assumptions and expressions that will lose completeness but should still be able to validate observations to within a negligible degree of error.  The important characteristics are the workings of the model, the validity and completeness of the input, and the accuracy of the output.

Well, if that makes sense, let's talk about the model.  For thought, let's use a graph.  A graph consists of vertices (points) and edges connecting those points.  Here, let's call the graph G with vertices V and edges E.  We will label one vertex v0 and call it the root vertex; the other vertices we will give integer-based names such as v1, v2, and so forth.  The root vertex needs to have at least two edges going out of it.  For each edge, we will need a function that generates a non-negative number in the range from 0 to 1.  This number is the cost of the edge.  The purpose of the model is to provide a minimization function for each vertex.

Wow, is this dry.  Anyway, let's try a specific example.  Suppose you want to buy a car.  At this point, you have two possible decisions, or edges in the terminology above.  The buy decision will snake out from v0 to v1 and the other edges in its decision tree.  The do-not-buy decision will go out from v0 to v2 and the edges in its tree.  Each edge should then have some cost function that we will try to minimize to arrive at the best possible decision based on our model.  The buy edge will include the cost of the car, certainly, but other inputs should also enter into this edge's function, such as maintenance costs, opportunity costs, taxes, fees, etc.
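Here is a minimal Python sketch of that decision tree under the assumptions above; the vertex names follow the post, but the cost numbers and the extra buy-side choices (say, new versus used) are made up purely for illustration.

```python
# Rooted decision graph: each vertex lists its children, and each edge carries
# a cost already normalized to the range [0, 1].
graph = {
    "v0": ["v1", "v2"],   # root: buy (v1) versus do not buy (v2)
    "v1": ["v3", "v4"],   # buy: hypothetical follow-on choices, e.g. new vs. used
    "v2": [],
    "v3": [],
    "v4": [],
}

edge_cost = {
    ("v0", "v1"): 0.6,    # purchase price, taxes, fees rolled into one number
    ("v0", "v2"): 0.2,    # opportunity cost of not having the car
    ("v1", "v3"): 0.8,
    ("v1", "v4"): 0.4,
}

def min_cost(vertex):
    """Minimum total cost from this vertex down to any leaf of its subtree."""
    children = graph[vertex]
    if not children:
        return 0.0
    return min(edge_cost[(vertex, child)] + min_cost(child) for child in children)

# The "decision" is the child of the root that minimizes total cost.
best = min(graph["v0"], key=lambda child: edge_cost[("v0", child)] + min_cost(child))
print(best)  # "v2" with these made-up numbers, i.e. do not buy
```

The interesting work, of course, is not the minimization itself but deciding which inputs belong in each edge's cost function, which is exactly the point of the next paragraph.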

The point I want to make about models is that a model measures only what it measures and is only as complete and valid as its inputs.  Many times the model itself may be good, but the inputs are not complete.  Good examples are Euclid's Elements, Newton's Laws of Mechanics, and Einstein's Laws of Relativity.  Euclid defined geometry for over two thousand years with just points, lines, curves, and simple constructions.  This model proved remarkably successful at representing the world.  When it came time to model planetary motion, however, it failed miserably, along with the models that placed the Earth at the center of the universe.  When Newton formulated his laws, they were able to predict the motion of the planets and the Moon with remarkable accuracy, and they still do so to this day.  It was only near the end of the 1800s that people could see that the model was unable to predict certain behaviors, such as the orbit of Mercury, accurately enough.  When Einstein formulated his laws, they were able to predict all the behaviors that Newton's laws had, and then predict new behaviors, such as the apparent positions of stars when the Sun moved in front of them.  So for now, let's try looking at thought as a minimization problem over a decision tree, with emphasis on identifying all of the necessary inputs and cost functions to provide accurate predictions.

Wednesday, November 14, 2012

This is my (perhaps somewhat inappropriately misnamed) blog.  It is written with the intent of trying to write a few coherent sentences about the complete lack of adult thought that I see throughout the world.  My audience will primarily be me, but you are all invited along for the ride.  I will start by laying out definitions in the first few entries and will then move in a pseudo-random walk from there.  My aspiration is that through writing I will somehow be able to better understand what is going on in the world, and perhaps others will even join in.  Welcome aboard.