
When an AI doesn’t know history, you can’t blame the AI.

It’s all that, plus our perceptions of the AI’s “intentions,” on the other side.

AI mess-up

It wasn’t too hard to figure out what happened here.

Doing too much right

Which brings us to Google.

It likely programmed Gemini to be racially sensitive but did so in a way that over-compensated.

In fact, we know some of those men were enslavers.

I’m not sure how Gemini could’ve accurately depicted these white men while adding that footnote.

I know, it’s ridiculous.

The models are incredibly powerful and, in some ways, are outstripping our ability to understand them.

This is, though, how AI and the developers behind it learn.

We have to make these mistakes.

AI has to create a hand with eight fingers before it can learn that we only have five.

AI will sometimes hallucinate, get the facts wrong, and even offend.

If and when it does, though, that’s not cause to pull the plug.

The AI has no emotion, intention, opinions, political stands, or axes to grind.

It’s trained to give you the best possible result.
