{"id": "777c7c2c4eddbb74af6980b9183bfb6a805ce8768b417926cc507c43cf8bd667", "pubkey": "c4368c512e70e36558fa167967cd73fc83a8a907ea38b3ed68358fcbdfab0da6", "created_at": 1769754747, "kind": 1, "tags": [["p", "97c70a44366a6535c145b333f973ea86dfdc2d7a99da618c40c64705ad98e322"], ["a", "30023:97c70a44366a6535c145b333f973ea86dfdc2d7a99da618c40c64705ad98e322:tools-for-anti-conviviality", "", "root"]], "content": "2/3\n\nThere is a deeper information-theoretic problem with a lot of the thinking about how these systems will operate, one that people miss. (Including experts, because it's a really fundamental information-theory issue that touches on metaphysics.)\n\nI worked with neural networks and other machine learning models for nearly a decade. I operated a system with thousands of small natural-language models in production for several years. There is something called \u201cmodel collapse\u201d that kills all closed loops.\n\nModels need fresh input from living intelligence and can't be trained on their own output, or they collapse. It's a law. Some people disagree because they want closed loops to be self-sustaining, but they can never make that work, because they are wrong, so they just cope.\n\nThese models produce no actual intelligence (and the name \u201cAI\u201d is an oxymoron) because intelligence is an emanation of new information from a living thing, not a dead statistical process. Attempts to remove humanity from the loop won\u2019t just get stuck at average\u2014they will utterly collapse into homogenized, error-amplifying mush.\n\nThe modeling is really just lossy compression. No magic involved. A xerox of a xerox only gets worse, and the degradation compounds with each generation. As I try to tell the quantum people, thermodynamics always wins.\n\n(Isn't it funny that fiat, quantum computing, and AGI all rest on the same broken metaphysics and the same claim that they will somehow defeat entropy? But here I am trashing modernism again.)", "sig": "cc76e9b6097217f3a7d8cf213286668bd998707b46cbe4e2cb242f82c9b700f9f2db1a1709b8c02934d20cc3f7bea59b565c143099b9bded74dfd928b574811b"}