This tracks with what I know about ChatGPT. It will often correctly pull information and give it to you, but other times it will outright fabricate things in an attempt to answer your question. There was a story last year about a lawyer who didn't understand that ChatGPT does this, used it to try to find relevant legal rulings, and ended up citing a ruling ChatGPT had made up in an actual court filing. The judge did not take kindly to that.