Yes. Unlike intelligent people, and like some unintelligent people, LLMs seem incapable of saying "I don't know" or even "I'm not sure." Instead, when they lack knowledge that can be independently verified against multiple reliable sources, they seem designed to simply make something up.