Thursday, January 30, 2025

Shannon entropy

Stephen Baxter once applied Shannon-entropy analysis to language, in his story "Turing's Apples". He figured human language at 8th or 9th order in the Markov chain (I still haven't studied that); dolphins at 3rd or 4th. So long, and thanks for all the fish, if I may cross streams. Today we learn that Shannon has application in quantum physics also.
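
For the curious, here's a toy sketch (mine, not Baxter's or Shannon's actual math) of what "order n" means: estimate the conditional entropy of the next character given the previous n characters, straight from empirical n-gram counts. The repeated Adams line is just filler to make the numbers drop; you'd need a serious corpus to say anything about order 8 or 9.

```python
from collections import Counter
from math import log2

def conditional_entropy(text, order):
    """Estimate H(next char | previous `order` chars), in bits per
    character, from empirical n-gram counts of `text`."""
    context_counts = Counter(text[i:i + order] for i in range(len(text) - order))
    ngram_counts = Counter(text[i:i + order + 1] for i in range(len(text) - order))
    total = sum(ngram_counts.values())
    h = 0.0
    for ngram, count in ngram_counts.items():
        p_ngram = count / total                       # P(context, next char)
        p_cond = count / context_counts[ngram[:order]]  # P(next char | context)
        h -= p_ngram * log2(p_cond)
    return h

# Toy corpus only -- real language analysis needs vastly more text.
sample = "so long and thanks for all the fish " * 50
for n in range(4):
    print(f"order {n}: {conditional_entropy(sample, n):.3f} bits/char")
```

The entropy falls as the order rises, because longer contexts pin down the next character more tightly; the order at which it stops falling is the depth of structure in the signal.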

The key here is that Shannon is dealing in information. Von Neumann once proved that a closed quantum system gains no entropy: its (von Neumann) entropy stays constant under the Schrödinger evolution. The Austrians point out that von Neumann assumed we have all the information about what's in a quantum box. Yeah, no. Uncertainty principle, bitchez!

The entropy of that box, kittycat and all, devolves to the entropy of the information we actually have about it; and the theory of that information is Shannon's.
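
A toy illustration of the gap, in Python with numpy (my sketch, not the Austrians' argument): a pure qubit state has zero von Neumann entropy, and unitary evolution keeps it at zero, yet the Shannon entropy of what you'd actually measure in a fixed basis is a full bit.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), via the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop numerical zeros
    return float(-np.sum(evals * np.log2(evals)))

def shannon_entropy(p):
    """H(p) = -sum p log2 p for a probability vector p."""
    p = np.asarray(p)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log2(p)))

# A qubit in the pure state |+> = (|0> + |1>)/sqrt(2).
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho = np.outer(plus, plus.conj())

# Closed-system (von Neumann) entropy: zero, and a unitary
# U rho U^dagger leaves it at zero -- von Neumann's point.
theta = 0.7
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(von_neumann_entropy(rho))                   # ~0.0
print(von_neumann_entropy(U @ rho @ U.conj().T))  # still ~0.0

# Shannon entropy of what we'd actually observe: measuring in the
# computational basis gives probabilities from the diagonal of rho,
# i.e. one full bit of missing information about the outcome.
print(shannon_entropy(np.real(np.diag(rho))))     # 1.0
```

Same box, two numbers: zero if you pretend to God's-eye knowledge of the state, one bit if you only count what a measurement can tell you.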
