amogus is cringe

does anybody here have a strong opinion on anything


What's sus der backwards

In artificial intelligence (AI), a hallucination or artificial hallucination (also occasionally called delusion) is a confident response by an AI that does not seem to be justified by its training data. For example, a hallucinating chatbot with no knowledge of Tesla's revenue might internally pick a random number (such as "$13.6 billion") that the chatbot deems plausible, and then go on to falsely and repeatedly insist that Tesla's revenue is $13.6 billion, with no sign of internal awareness that the figure was a product of its own imagination.
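To make that pattern concrete, here's a toy Python sketch (not from the quoted article; the class, the company name, and every figure are purely hypothetical) of a mock chatbot that has no revenue data, invents a plausible-looking number once, and then keeps repeating it with full confidence:

```python
import random

class MockChatbot:
    """Toy illustration of the hallucination pattern described above."""

    def __init__(self):
        # No grounding data: the "knowledge base" is empty.
        self.knowledge = {}
        # Fabricated answers are cached, so the bot repeats the same
        # made-up figure on every subsequent ask.
        self.fabricated = {}

    def ask_revenue(self, company: str) -> str:
        if company in self.knowledge:
            return f"{company}'s revenue is {self.knowledge[company]}."
        # Nothing in the data justifies an answer, so invent a
        # plausible-sounding number once and insist on it thereafter.
        if company not in self.fabricated:
            self.fabricated[company] = f"${random.uniform(5, 50):.1f} billion"
        return f"{company}'s revenue is {self.fabricated[company]}."

bot = MockChatbot()
print(bot.ask_revenue("Tesla"))  # e.g. "Tesla's revenue is $13.6 billion."
print(bot.ask_revenue("Tesla"))  # same fabricated figure, stated again with confidence
```

The point of the sketch is only the behavior: the answer is generated, cached, and repeated with no internal signal that it was made up rather than retrieved.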
