14 Comments

Now if evolution wipes you out for being all-in on a bet (i.e. growth-fueled capitalism), good luck to the next species I guess? Hope you probability match a bit better.


I really like your framing here!

At the same time, it reminds me of an essay I haven't been able to get out of my head:

https://www.palladiummag.com/2023/11/03/make-yourself-human-again/

It makes the case that capitalism maps directly to evolution itself:

> Instead, capitalism is an open-ended spiraling evolution of evolution itself, an irreversible advent, not a neat closed-circle rise and fall of a particular extended tribe of monkeys.

Basically proposing that capitalism is so successful because it's so close to a natural truth about the world. Kind of like how Kris was hesitant to frame nature as an "agent" - nature is nature, and we are just discovering its inherent behaviors.


IIRC the key point of Taleb's Skin in the Game was less about alignment (for example, fund managers having their net worth in the fund) and more about how a system cannot adapt (another word for learn) if behavior and outcome are divorced. It was a much more evolutionary argument than the more boring alignment one.


> "This whole culture already has baked in the manifest desire to replace humans and human agency with cheaper, more powerful, more obedient, and more legible machine capital. Accelerationism understands itself as the nascent self-awareness of this process."

Great quote from the essay you linked.

Nature only cares that something increases entropy/dissipates free energy/grows by consuming energy. It selects those which consume energy, survive and multiply - the faster the better.

Humans are good at it (they reduce entropy internally and increase it externally at a faster rate); systems like capitalism are better; machines, since they can work and consume all day, are probably best (think of all the data center heat from crypto mining!).

Working in large organizations, I have seen how the most inefficient parts of the system are almost always the people. Capitalism - resource-efficient as it is - seeks to root out all inefficiency. When all the people go, what are we left with?

Some related essays:

- Thermodynamic evolution: https://knowm.org/thermodynamic-computing/

- Fourth law of thermodynamics: http://theoildrum.com/node/7147#:~:text=This%20%E2%80%9CMaximum%20Power%20Principle%E2%80%9D%20which%20was%20referred%20to%20as%20the%20Fourth%20Law%20of%20Thermodynamics%20by%20H.T.%20Odum%20states


Great read. Probability matching seems rational in non-ergodic systems.


I think in general there's an "irrational", a "rational" and a "metarational" case to be made for many things.

The irrational case is that, relative to a rational actor, you are deviating from the optimal decision. Here, the irrational argument is that you're trying to chase winners ("see heads 75% of the time, try to catch it; see tails the remaining 25%, try to catch it"), when this rationally has worse odds than simply guessing the majority class each time. One potential explanation is that we are intuitively bad at probability but are short-term winner chasers, so we'll follow that strategy even if it's suboptimal.
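To make that parenthetical concrete, here is a minimal simulation (a sketch of mine, assuming a coin that lands heads 75% of the time, not anything from the original post): always guessing the majority class is right about 75% of the time, while probability matching is right about 0.75^2 + 0.25^2 = 62.5% of the time.

```python
import random

# Minimal sketch (assumed setup): a biased coin lands heads 75% of the time.
# Compare two guessing strategies over many flips:
#   - "maximizing": always guess heads (the majority class)
#   - "probability matching": guess heads 75% of the time, tails 25% of the time
random.seed(0)
P_HEADS = 0.75
N = 100_000

flips = [random.random() < P_HEADS for _ in range(N)]      # True means heads
always_heads = [True] * N                                   # maximizing strategy
matched = [random.random() < P_HEADS for _ in range(N)]     # probability matching

def accuracy(guesses):
    return sum(g == f for g, f in zip(guesses, flips)) / N

print(f"always guess majority: {accuracy(always_heads):.3f}")  # ~0.750
print(f"probability matching:  {accuracy(matched):.3f}")       # ~0.625
```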

The rational case needs no explanation: we calculate what nets the highest expected value and pursue that strategy. This is systematic, predictable and - most germane to our discussion - exploitable, the opposite of antifragile.

The metarational case is that in a competitive game with other rational actors, you almost _always_ need to drip a bit of randomization into your decision strategy. In addition, a rational strategy is necessarily calculated from all information known at the time. However, we can be incorrect about our information, or we can enter a "regime shift" where all the probability distributions flip around, and then our once rational strategy instead looks like a gratuitously overlevered bet. Best to hedge, not go all in 100% on the "obvious winning strategy" because, the fact is, things change.
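To illustrate the randomization point with a toy game that is not in the original comment (matching pennies, where I win a point when the coins match and lose one when they don't): any pure or lopsided strategy gets exploited by an opponent who best-responds, while the 50/50 mix cannot be.

```python
# Illustration with an assumed game (matching pennies): a pure strategy is
# exploitable, the 50/50 mixed strategy is not.

def best_response_payoff(my_prob_heads: float) -> float:
    """My expected payoff per round (+1 if the pennies match, -1 otherwise)
    when the opponent plays a best response to my mixed strategy."""
    # The opponent prefers to mismatch: if I lean heads, they play tails, and vice versa.
    opp_heads = 0.0 if my_prob_heads > 0.5 else 1.0 if my_prob_heads < 0.5 else 0.5
    p_match = my_prob_heads * opp_heads + (1 - my_prob_heads) * (1 - opp_heads)
    return p_match * 1 + (1 - p_match) * (-1)

print(best_response_payoff(1.0))   # pure heads: -1.0, fully exploited
print(best_response_payoff(0.75))  # leaning heads: -0.5
print(best_response_payoff(0.5))   # 50/50 mix: 0.0, unexploitable
```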

Adaptation seems particularly good at the last case (perhaps by necessity; otherwise you wouldn't live for the rest of us to see it!).


One explanation is relative status.

Let utility = your-percentile-of-relative-status.

Suppose you know 70% of people will bet red and 30% will bet green.

Your options are

* Bet on red and have a 70% chance of your status rising from 50th percentile to 65th percentile (average of the top 70 percentiles)

* Bet on green and have a 30% chance of your status rising from 50th percentile to 85th percentile (average of the top 30 percentiles)

Note: the EV is the same iff the population is using probability matching. In other words, probability matching is simply the Nash equilibrium.
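As a quick check of the arithmetic above (a sketch of mine, using this comment's assumptions that everyone starts at the 50th percentile, winners occupy the top of the status distribution, and utility counts the expected rise in percentile):

```python
# Sketch of the status argument above. Assumptions: everyone starts at the 50th
# percentile, winners occupy the top of the distribution after the draw, and
# utility is the expected rise in percentile.

def expected_rise(p_win: float, frac_same_bet: float) -> float:
    """Expected percentile gain from betting a color that wins with probability
    p_win when frac_same_bet of the population bets that same color."""
    winner_avg_percentile = 100 - 50 * frac_same_bet  # average of the top frac_same_bet
    return p_win * (winner_avg_percentile - 50)

p_red = 0.70   # probability red is drawn
f_red = 0.70   # fraction of the population betting red (probability matching)

print(expected_rise(p_red, f_red))          # bet red:   0.7 * (65 - 50) = 10.5
print(expected_rise(1 - p_red, 1 - f_red))  # bet green: 0.3 * (85 - 50) = 10.5
# The two are equal exactly when f_red == p_red, i.e. when the population
# probability matches, which is the Nash-equilibrium claim in the note above.
```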


This is really fascinating


Always learn something. Population/collective cognition is wild.


How do you think it would play out when there is skin in the game? 70/30 odds, but bet $100 of your own money: if you win, you get $100; if you lose, you pay $100 of your own money. No Monopoly money allowed.

My guess is some would still go with the low-odds bet, but more would shift to the higher chance of winning. Thought experiments and play money are just a starting point for understanding how we actually behave.
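For reference, the expected dollar values under those stakes are straightforward (a quick sketch assuming a single 70/30 draw with the symmetric $100 payoff described above):

```python
# Quick sketch of the stakes described above: one 70/30 draw, win +$100 / lose -$100.
p_favorite = 0.70
stake = 100

ev_favorite = p_favorite * stake - (1 - p_favorite) * stake   # 0.7*100 - 0.3*100 = +$40
ev_longshot = (1 - p_favorite) * stake - p_favorite * stake   # 0.3*100 - 0.7*100 = -$40
print(ev_favorite, ev_longshot)
```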


This is a good question, but the replication studies were done with SITG, plus the animals absolutely had SITG (food, rewards).

But yeah, you could see how people might not trust that the odds are what I say. (In the experiments, the odds aren't necessarily disclosed, and of course the animals can't be told. The random process just unfolds, and after a while you notice that the coin is effectively biased, but the population doesn't just bet on the biased side as it probably should once it gains enough confidence. If a coin is 75% biased, it won't take too many flips to be 95% sure of that, which goes back to the math in this post: https://moontower.substack.com/p/mr-my-way)
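On the "won't take too many flips" point, here is a rough sketch (my setup, not the linked post's math): compare a fair coin against a 75% coin with a simple two-hypothesis Bayesian update and count how many flips of a truly 75% coin it takes for the posterior on "biased" to clear 95%. It typically takes a couple dozen flips.

```python
import random

# Rough sketch (assumed setup): two hypotheses, "fair coin" (p = 0.5) vs
# "biased coin" (p = 0.75), with even prior odds. Flip a truly 75% coin and
# count flips until the posterior probability of "biased" exceeds 95%.

def flips_to_confidence(p_true=0.75, p_biased=0.75, p_fair=0.5,
                        target=0.95, max_flips=1000):
    odds = 1.0  # prior odds of biased:fair = 1:1
    for n in range(1, max_flips + 1):
        heads = random.random() < p_true
        odds *= (p_biased / p_fair) if heads else ((1 - p_biased) / (1 - p_fair))
        if odds / (1 + odds) >= target:
            return n
    return max_flips

random.seed(1)
trials = sorted(flips_to_confidence() for _ in range(10_000))
print("median flips to 95% posterior:", trials[len(trials) // 2])  # roughly 20-25 flips
```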

And even if someone isn't working out the math, with just a few more flips I think it would be intuitively clear that the coin is biased.

Still, I can see your next point being someone thinking, "Hey, as soon as we all figure out the pattern, the proctor is going to switch out the coins."


Context matters. You've done this before where you give us a really simple question and I overthink it. If these problems were on "Magic School Bus", I would get an A+. Instead, you ask me an easy question and I think: a really clever guy is asking me a trivially simple question, so I spend 30 seconds wondering what the trick is. I still give the obvious answer, but I do so with low confidence, since I'm still expecting a trick. Who knows, maybe you'll tell us that your observation of 7 red and 3 green balls only held for one quantum of time and now I am in an alternate mirror universe? I don't have this problem with Ms. Frizzle, but you get me every time.


This question has been replicated a bunch on Twitter, and Lo's team has studied it in a proper study design, so I doubt the effect of "my audience" is messing with it too much. If it were, I'd expect the contrarian choice to have been more heavily mirrored, but in fact it was under-mirrored if anything.
