“A kid born today will never be smarter than AI, ever.” Nope. Not true.

“A kid born today will never be smarter than AI, ever,” says Sam Altman, CEO of OpenAI. He is wrong, but let’s come back to that after we first lean into a quote and question from Jensen Huang, CEO of Nvidia.

In the interview where Altman makes the statement about kids and AI (Cleo Abram: “Sam Altman Shows Me GPT 5... And What’s Next”), Jensen Huang poses the following statement and question to Sam:

“Fact is what is. Truth is what it means, so facts are objective.  Truths are personal.  They depend on perspective, culture, values, beliefs and context.  One AI can learn the facts, but how does one AI know the truth for everyone in every country and every background.”

I believe Jensen is a bit off in his statement about truth, and I offer the following as a more concrete definition: truth is knowledge of things as they are, and as they were, and as they are to come. This is particularly relevant to AI: discovering what is real or fake, and why the truth even matters.

The idea that truth is personal makes it all too convenient to ignore the laws of every society. It makes it possible to feel good about taking what is not yours. It encourages the notion that anything goes, and that it doesn’t matter what I say or who I say it to. It means I can call anything and everything truth as long as it fits my personal agenda. That doesn’t work very well over the long haul.

In our society today, it’s critical to know and trust whether you are chatting with AI or a live person. It’s necessary to know whether the phone call you are receiving from mom or dad, your husband, wife or child is real or not. It’s critical that a State Governor in the United States knows if the phone call from Marco Rubio is real or fake. And it’s absolutely necessary to know whether you are flirting with a man or a woman when engaged in online dating. The truth is not simply a matter of personal opinion, or what Jensen calls “values, beliefs and context”.

Years ago, I was given the following nugget, and I had it written on a large whiteboard in my office for a very long time: Cash is a fact. Profit is an opinion. Is that true? Is it okay for Jensen, or any CEO of a publicly traded company, to arrange the finances of their company according to their own values, beliefs and context, and portray their financial reporting as truthful, no matter what the numbers say? In other words, if “truths are personal”, the truth about their operations and how they track the numbers can have nothing to do with financial accounting standards or any other standards or laws; it only needs to fit their own personal truth and values. I think not.

Truth must be far more reliable, steady and dependable if it is to have any meaning over time and generations.

One of the interesting things Sam Altman mentioned in his interview with Cleo Abram is that he will, on occasion, use the public GPT model instead of his own GPT. This is noteworthy because it reinforces the idea of subjectivity and runs counter to what I’m saying about truth. His GPT companion, the one that learns from him, understands him best, and becomes integrated into his life, acts differently than the public GPT model. Again, this is significant. As AI evolves, given the personal nature of how it learns and the data it is exposed to, the answers will gradually change and become meaningfully different depending on who is asking the question and how much the AI knows about that person. And we want to call this truth? No, these are opinions, and they are unanchored and change with time.

Unanchored, ever-changing opinions may be fine in the world of media, news and scientific discovery, but they are not fine when talking about things that are true. Here are two examples from science to drive home the point:

•  The earth is flat. This was believed by the Mesopotamians, the ancient Egyptians, Homer, Hesiod and Thales of Miletus. Magellan’s expedition sailed around the earth, and satellites in the sky today prove that the earth is round, kind of like an orb.

•  The earth is at the center of the universe. Claudius Ptolemy’s model (2nd century CE) claimed the earth was the stationary center of the universe, with planets and stars revolving in complex orbits. We know today that the earth orbits the sun; Galileo’s observations first supported this, and Friedrich Bessel’s later measurement of stellar parallax confirmed the earth’s motion. Today we use modern tools in aircraft navigation that depend upon the earth orbiting the sun.

So, let’s return to the headline statement: “A kid born today will never be smarter than AI, ever.” Sounds impressive, right? No, it is false, and here is why:

Dogmatic statements that predict the future have a long history of falling apart. They collapse under the weight of innovation, creativity and progress. I’m betting this one will too. In two hundred years we will be dead and gone, and future generations will have to answer the question about AI and kids. In the meantime, I can simply say that Sam is wrong.

