It’s a failure of our education systems that people don’t know what a computer is, something they interact with every day.
While the Sapir-Whorf hypothesis might be bunk, I’m convinced that if you go up one level in language structure, a version of it is true: treating words as if they don’t need a consistent definition melts your brain. For the same reason that explaining a problem to someone else helps you solve it, doing the opposite, untethering your thoughts from self-consistent explanations, stops you from explaining them even to yourself, and so harms your ability to think.
I wonder whether this plays some part in how ChatGPT use apparently makes people dumber: not only because they grow accustomed to not having to think, but because they become conditioned to accept text that is essentially void of consistent meaning.