Robotistry
@Robotistry@fediscience.org
Moved from mstdn.ca and sciencemastodon.com. Disabled by #LongCovid and semi-retired, currently active in #Robotics via Robotistry Consulting & Research Services and in long Covid via the Patient-Led Research Collaborative. Chair, IEEE standards development working group P2817. Located in the Canadian Maritimes.
fediscience.org
@Robotistry@fediscience.org · Apr 11, 2026
@audioflyer79 @davidaugust @alisynthesis This is actually very important.
LLMs do not "forget" the way humans do. Humans have memory lapses and difficulty recalling facts they know; computers, by their nature, remember perfectly. LLMs exist because they remember perfectly and look for patterns in that memory.
If an LLM is to be useful at interpreting human commands and understanding human expectations (see the "now anyone can code" PR), it needs to encode concepts, not characters.
People are making big claims that these machines are somehow conscious or intelligent and are able to understand abstract concepts.
If an inference machine with perfect memory states unequivocally at one point that unripe bananas are yellow, and then later states unequivocally that unripe bananas are green, then it is not storing and retrieving conceptual information.
Whatever generalization it does achieve cannot involve concepts of this kind.
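The banana test above amounts to a simple consistency check: ask a model the same factual question repeatedly and flag contradictory unequivocal answers. A minimal sketch, assuming a hypothetical `query_model` function standing in for any real LLM API (here stubbed with a deliberately inconsistent toy model so the check itself is visible):

```python
def query_model(prompt: str) -> str:
    # Hypothetical stub: a real test would call an actual model here.
    # This toy model contradicts itself across calls, like the banana example.
    query_model.calls = getattr(query_model, "calls", 0) + 1
    return "yellow" if query_model.calls % 2 else "green"

def is_conceptually_consistent(prompt: str, trials: int = 4) -> bool:
    # A model storing the fact as a retrievable concept should give the
    # same answer on every trial, so the set of answers has exactly one member.
    answers = {query_model(prompt) for _ in range(trials)}
    return len(answers) == 1

print(is_conceptually_consistent("What color are unripe bananas?"))  # False for this stub
```

A real run would also need to control for sampling randomness (e.g. temperature), since stochastic decoding alone can produce varied wording without implying the underlying representation is inconsistent.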
In the battle between Cog and Cyc, LLMs are Cyc writ large.
Cog: https://en.wikipedia.org/wiki/Cog_(project)
Cyc: https://en.wikipedia.org/wiki/Cyc