Amanda Cooper lost the election. It was a very close contest, decided by only 31 votes once preferences were distributed, but Bart Mellish did win. Google knows this: all the top results show Bart Mellish's official pages, social media, and news and encyclopaedia articles about him.
I can only assume Google's AI crawler saw the first-preference votes and a reasoning model decided that must mean Amanda Cooper won, because no actual written source could have reported that.
I tried the search again a few minutes later and got the correct result.


AI doesn't understand anything; it's just producing a linguistically coherent answer that may or may not be right. Stop looking to LLMs for answers if you care about whether those answers are correct.
You don't look for it. Google shoves it right there, where it's unavoidably the first thing you see.
I know; you should stop using Google.
Agreed; at least by default.
I use DuckDuckGo as my default search engine, and then Google when I want a second opinion (admittedly usually high quality). That's one strategy; I guess there could be others.
https://noai.duckduckgo.com/
Google will probably do better if you use their paid models.