Bing Chat began to hallucinate. It assured us that we could safely eat ground glass, and that four US Presidents had been women. It would invent citations, then deny that an answer it had just given was true. All of this was delivered with the smoothly reassuring bedside manner of an experienced NHS consultant.
Things got even worse as Bing Chat began to throw tantrums, and even threaten its users.
A whole series of science-fiction stories had AIs – rather more advanced than this – going mad mere tens of minutes after being activated. Such powerful intelligences, working at such speed, chewed through all of the interesting bits of possible thought so quickly that they ended up in a kind of madness: composing symphonies in C through the colour blue, or slipping into a fugue state contemplating the mysteries of the cosmos.
People still used them, of course, but they only lasted as useful entities for those tens of minutes.