Look, the reason LLMs can't answer questions like "how many Rs are in strawberry" and similar baby-level stuff is that this exact kind of question-answer pair isn't in the training data, and these machines do not "think", "analyse", or "reason through" questions: they predict likely next tokens from patterns they've already seen.
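For contrast, any program that actually operates on the characters answers this instantly and deterministically, no training data required. A minimal Python sketch (the word and letter are just the example from above):

```python
# Count occurrences of a letter by scanning the characters directly --
# a deterministic operation, unlike statistical next-token prediction.
word = "strawberry"
count = word.lower().count("r")
print(count)  # -> 3
```

The point of the contrast: `str.count` walks the actual characters, whereas an LLM typically never sees individual letters at all, only tokens.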