Compelling thoughts and ideas about what makes interacting with AI so hard. Essentially, the difficulty is that AI ignores the basics of what language represents: a massive set of broadly shared and easily recalled concepts and agreements about the world around us.
Furthermore, as you can see in the examples, even what we think of as simple statements carry layers and layers of meaning based on context and experience. It's not enough to know that people get sick, for example.
You have to know and understand 1) what "people" and "get" and "sick" represent, concretely and abstractly; 2) what "sick" means contextually ("bodily ill" versus "culturally cool"); and 3) that "get" represents an always-possible condition, not a present reality.
We learn so much as little humans, and then assume it's all present from birth. We have to teach!