GPT-4 doesn't understand rhyming but can solve rhyme-based puzzles (twofergoofer.com)

GPT-4 doesn't actually comprehend anything. It simply reproduces patterns from its training data via a statistical model. It's a sophisticated Markov chain.
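To make the analogy concrete: a literal (first-order) Markov chain over words just samples the next word from observed bigram counts. A minimal Python sketch of that idea (the toy corpus and function names are mine, and of course nothing here is GPT-4's actual mechanism):

    import random
    from collections import defaultdict

    def build_bigram_table(text):
        """Count, for each word, how often each next word follows it."""
        table = defaultdict(lambda: defaultdict(int))
        words = text.split()
        for current, following in zip(words, words[1:]):
            table[current][following] += 1
        return table

    def sample_next(table, word):
        """Pick the next word in proportion to observed bigram counts."""
        choices, weights = zip(*table[word].items())
        return random.choices(choices, weights=weights)[0]

    def generate(table, start, length=10):
        out = [start]
        for _ in range(length):
            if not table[out[-1]]:
                break  # dead end: this word was never followed by anything
            out.append(sample_next(table, out[-1]))
        return " ".join(out)

    corpus = "the cat sat on the mat the cat ate the rat"
    table = build_bigram_table(corpus)
    print(generate(table, "the"))

Each run prints a plausible-looking string assembled purely from local word-to-word statistics.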
Well, in some ways the human brain is also just a sophisticated system that translates input into state, such as synaptic weights. How could mere matter "comprehend" anything? It just sends chemical and electrical signals here and there through glorified pipes.
Perhaps comprehension is not something special, but an illusion that emerges once a system stores enough information, adjusts enough weights, and runs the "circuit" in a feedback loop. While GPT-4 is much simpler than a human brain (not just in scale but also in mechanism, being a "mere" statistical model), it can still exhibit an emergent property analogous to comprehension.
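As a toy illustration of "adjusting weights in a feedback loop", here is a single artificial neuron learning logical AND by nudging its weights after every mistake (the numbers and setup are purely illustrative, nothing like either the brain or GPT-4):

    # A toy feedback loop: one neuron adjusts its weights whenever it errs.
    data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]  # logical AND
    w = [0.0, 0.0]
    b = 0.0
    lr = 0.1

    for epoch in range(20):
        for (x1, x2), target in data:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out           # feedback signal
            w[0] += lr * err * x1        # adjust the weights...
            w[1] += lr * err * x2
            b += lr * err                # ...and the bias

    for (x1, x2), _ in data:
        print((x1, x2), 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0)

After a few passes the stored weights reproduce AND for all four inputs.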
At some level even a switch might be said to have a unit of comprehension, encoded in the open or closed state.