8/29/2020 ☼ posts

A very loose summary for myself. See Wikipedia for a more technical answer.

A farmer expects to grow 500 apples and sell them for $5 each. But when the harvest comes, there are fewer apples: only 400. (Supply is low.) However, demand is still high, so he gets to sell the 400 apples at, say, $6 each.

The apple harvest was low, but this allowed him to charge more.

Next year, the apple farmer remembers the high price of last year’s harvest and plants more apples, say, 600. However, now that there are lots of apples, there isn’t enough demand at that price. He must sell the additional 100 apples for a lower price.

Next year, the apple farmer remembers the low price, plants fewer apples, and gets to sell them for more.

Next year, the apple farmer remembers the high price, plants more apples, and has to sell them for less.

So on and so forth.

This is the Cobweb Model. If you plotted each year’s price and quantity on a supply-and-demand graph, the path would trace a cobweb-like set of lines.

In that example, the farmer hopefully gets closer and closer to the true demand for apples by remembering not only last year’s harvest, but all of the harvests before then.

This is called convergence.

There’s another possible outcome of the cobweb model: divergence, where the fluctuations grow larger each year instead of settling down. Either way, the swings are driven by expectations formed in previous years.
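The back-and-forth above can be sketched as a tiny simulation. This is a minimal illustration, not anything standard: the linear demand and supply curves, the coefficients, and the function name are all made up for the example. Each year the farmer plants based on last year’s price, and this year’s price is whatever clears that harvest.

```python
# Minimal cobweb-model sketch (all numbers and names are illustrative).
# Demand: price that clears a harvest of q apples:    p = a - b*q
# Supply: quantity planted after seeing price p_prev: q = c + d*p_prev

def simulate_cobweb(a=10.0, b=1.0, c=0.0, d=0.5, p0=8.0, years=10):
    """Return the yearly prices when planting is based on last year's price."""
    prices = [p0]
    for _ in range(years):
        q = c + d * prices[-1]   # this year's harvest, planned from last year's price
        p = a - b * q            # price at which the whole harvest sells
        prices.append(p)
    return prices

# Convergent case: supply reacts less strongly than demand (b*d < 1),
# so prices overshoot the equilibrium (a - b*c) / (1 + b*d) by less each year.
converging = simulate_cobweb(d=0.5)

# Divergent case: supply overreacts (b*d > 1), so the swings grow each year.
diverging = simulate_cobweb(d=1.2, p0=5.0)
```

Printing `converging` shows the price landing above the equilibrium one year and below it the next, but closer each time; `diverging` overshoots by more each year, which is the other outcome described above.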

Takeaways from this idea?

Not sure.

Don’t be shortsighted?