I've been wondering why AI models have been hallucinating so much more, though the results show it. In late 2024, there was roughly a 16% chance of a hallucination.
In 2025, with the updates, it rose to a 33% to 48% chance.
Results can be even worse when the model doesn't know how to respond at all: instead of admitting that, it just takes a guess.
Fluffy~.

