While AI’s inner workings remain mysterious even to researchers, companies amplify that mystique to market each new model as a breakthrough.
By CODY DELISTRATY
There’s a word that Sam Altman likes to use when talking about artificial intelligence: magic. Last year, he called a version of ChatGPT “magic intelligence in the sky.” In February, he referred to “magic unified intelligence.” He later posted that a recent update has “a magic to it i haven’t felt before.”
At times, A.I. can indeed feel magical. But treating it as anything other than a mere machine can have serious consequences. How many people pose their deepest questions to chatbots, as if to an omniscient oracle? They ask Claude or ChatGPT: What should I do about this relationship? This job? This problem? Technology’s supposed promise of salvation — whether it’s Mars colonization, eternal life or achieving the A.I. “singularity” — has become a kind of secular religion, a mix of utopian beliefs that borders on the mystical.
Part of A.I.’s mystique comes from the fact that its inner workings aren’t entirely understood, even by its creators. Researchers are “very frequently surprised by how models behave once you build them,” Sam Bowman, a researcher at Anthropic and a professor at New York University’s Center for Data Science, told me. Researchers can examine the inputs and outputs of A.I., but the actual process unfolding inside is not yet well understood.
That mystery — and the vast potential it portends — allows companies to build hype. Before OpenAI released GPT-2 in 2019, it cautioned that it might be too dangerous to use. With GPT-4 in 2023, Mr. Altman said he was “a little bit scared” of its power. This year, with GPT-5, Mr. Altman went on Theo Von’s podcast to compare the product to the Manhattan Project. “What have we done?” he asked.
Much of this is savvy marketing. This kind of constant upping of the stakes has itself become something of a magic trick. We can’t yet see the promised breakthrough, but we are told repeatedly that it’s just around the corner.