Gen AI hallucination = Human Intelligence?
In a way, Gen AI hallucination mirrors human intelligence. Why does hallucination happen?
Insufficient or biased training data – the model can only learn from what it was shown, gaps and skew included.
Overfitting – starved of variance in the data, the model memorizes its training examples instead of generalizing.
Complex input patterns – in simple words, a complex, unpredictable input or question to the model. It’s like sitting a school or college exam without enough preparation.
Model limitations and random errors are additional reasons why GenAI models hallucinate.
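The overfitting point above can be sketched with a toy example. This is not an LLM, just a minimal NumPy illustration (all numbers here are arbitrary choices) of a model memorizing a tiny training set and then "hallucinating" on unseen inputs:

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny, noisy training set: too little data variance to learn the real pattern.
x_train = np.linspace(0, 1, 8)
y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.1, size=x_train.shape)

# Degree-7 polynomial through 8 points: enough capacity to memorize every point.
coeffs = np.polyfit(x_train, y_train, deg=7)

# On unseen inputs, the memorized curve drifts away from the true function.
x_test = np.linspace(0, 1, 100)
y_true = np.sin(2 * np.pi * x_test)

train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
test_err = np.mean((np.polyval(coeffs, x_test) - y_true) ** 2)
print(f"train MSE: {train_err:.6f}  test MSE: {test_err:.6f}")
```

Training error is essentially zero while test error is much larger: the model is confidently wrong between the points it memorized, which is roughly what hallucination feels like from the outside.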
As humans, we face all of these challenges too. So aren’t we all hallucinating somewhere or other? Let’s forgive AI 😉
#GenAI #AI-Hallucination #LLMs