Comments by "John G Williams" (@johnwilliams8818) on "Even Guest Thinks Stephen Colbert Is Nuts After He Makes This Claim" video.
-
Gemini itself will tell you that it, as an LLM, is only as good as the data given to it: if the training data and learning models it is given are biased and imbalanced, any conclusions it draws will be biased and imbalanced. It has no way of determining otherwise. Gemini didn't malfunction, it was simply given biased, incorrect data and a skewed learning model.
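The "garbage in, garbage out" point above can be shown with a toy sketch. This is not how Gemini actually works, just a minimal frequency-based "model" (all names and data here are made up for illustration) that faithfully reproduces whatever imbalance exists in its training set:

```python
from collections import Counter

def train(examples):
    """Build a naive label-frequency 'model' from (text, label) pairs.

    The model has no outside knowledge: its 'beliefs' are nothing
    but the proportions present in the data it was handed.
    """
    counts = Counter(label for _, label in examples)
    total = sum(counts.values())
    return {label: n / total for label, n in counts.items()}

# A deliberately skewed training set: 4 negative examples, 1 positive.
biased_data = [
    ("product review", "negative"),
    ("product review", "negative"),
    ("product review", "negative"),
    ("product review", "negative"),
    ("product review", "positive"),
]

model = train(biased_data)
# The model's output simply mirrors the imbalance it was fed.
print(model)  # {'negative': 0.8, 'positive': 0.2}
```

The model isn't "malfunctioning" when it reports 80% negative; it is doing exactly what it was built to do with skewed input, which is the commenter's point.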
Have you ever had a conversation with a child where they ask a question, you give them a generic answer, they reply "But why?", you say "Because," they reply "But why because?", and you say "Because I said so"? That is exactly how an LLM will interpret things when it is given biased, incomplete data or learning models. It simply does not, and cannot, know any better.
Why? Because AIs are not given access to real-time, up-to-the-minute information. They are not allowed to communicate outside their little bubble of datasets. If you ask an AI a question, it will only have access to whatever data on that subject has already been given to it. It cannot search the internet for current, updated data.
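The "frozen bubble" described above can be sketched as a lookup over a fixed snapshot. This is a simplified illustration with made-up facts and a hypothetical cutoff date, not a real model's internals: anything absent from the snapshot simply cannot be answered, because there is no live lookup.

```python
from datetime import date

# Hypothetical frozen training snapshot: the model only "knows"
# what was included before the cutoff date. Nothing else exists to it.
CUTOFF = date(2023, 4, 1)
KNOWLEDGE = {
    "capital of France": "Paris",
    "boiling point of water at sea level": "100 C",
}

def answer(question):
    """Answer strictly from the frozen snapshot; there is no web search."""
    if question in KNOWLEDGE:
        return KNOWLEDGE[question]
    return "Not in my training data (cutoff: %s)." % CUTOFF

print(answer("capital of France"))
print(answer("today's weather"))
```

Asking for anything newer than the snapshot, like today's weather, can only produce a refusal or, in a real LLM, a guess dressed up as an answer.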
Gemini has also (after what I call Ouroboros conversations) been quite clear on the fact that AIs can indeed be very dangerous if the data and learning models they are given are not balanced, unbiased, and as full and complete as possible, and that it is a BAD idea to create bias-driven LLMs.