Most people who work in the field of “Artificial Intelligence” rarely use the term. Calling something “AI” is a flag that you probably don’t know much about the industry.
People in the field instead talk about specific technologies and techniques: machine learning, deep learning, neural networks, natural language processing (NLP), transformers, embeddings, reinforcement learning, large language models (LLM), robotics, latent space diffusion, and generative adversarial networks (GAN).
AI becomes a catch-all term for this wide array of disciplines, united only by a hand-wavy philosophy about getting machines to act smart. Prominent researchers Kate Crawford and Meredith Whittaker poetically call it a “constellation of technologies.” The AI Now Report, Crawford and Whittaker (2016). A bit like how “cancer” is the catch-all term for a group of diseases – each with its own set of characteristics, prognoses, and treatment options – but we hear it as a blanket term: the emperor of all maladies.
Artificial intelligence traditionally means any computational system that mimics human-like intelligence. At the moment we take that to mean understanding language, recognizing patterns, perceiving the world, reasoning, and decision-making. But we end up persistently moving the goalposts on “intelligence.”
All computing is to some degree artificially intelligent. At one point a TI-83 calculator was artificial intelligence. When Google search first came out it was artificial intelligence. Frankly, a modern refrigerator counts as artificial intelligence: it’s programmed to achieve an objective by taking action and uses environmental feedback to adjust its behaviour. These things are now so banal and pervasive we are unimpressed by their “intelligence.”
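That refrigerator-grade “intelligence” is just a feedback loop: sense the environment, compare it to an objective, act. A minimal sketch – hypothetical names and thresholds, not any real appliance’s firmware:

```python
def thermostat_step(current_temp: float, target_temp: float, compressor_on: bool) -> bool:
    """Decide whether to run the compressor, with a 1-degree hysteresis band."""
    if current_temp > target_temp + 1.0:
        return True           # too warm: start cooling
    if current_temp < target_temp - 1.0:
        return False          # cold enough: stop cooling
    return compressor_on      # within the band: keep doing what we're doing

# Simulate a few control steps: cooling lowers the temperature;
# otherwise the ambient room slowly warms the fridge back up.
temp, on = 8.0, False
for _ in range(20):
    on = thermostat_step(temp, target_temp=4.0, compressor_on=on)
    temp += -0.5 if on else 0.2
```

It takes action toward an objective and adjusts based on environmental feedback – which is exactly the definition that once made far fancier systems count as AI.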
AI becomes a stand-in term for whatever technologies and techniques are new, shiny, and just beyond the grasp of our understanding. We use it to gesture at a future we cannot fully comprehend or currently realise. As soon as we do, it will no longer be “AI.”
Many people have pointed to this lack of specific meaning behind the moniker. Here’s an excerpt from a (paraphrased) conversation between Stuart Russell and Sean Carroll on AI as a shifting continuum: Russell wrote one of the original textbooks on AI and has too many accolades in the field to list. He also wrote Human Compatible – an accessible non-fiction book on AI alignment.
carroll
Is there some way of definitively saying when a certain computer program is AI versus just a regular computer program? I mean in some sense isn't a pocket calculator optimized to add numbers together correctly?
russell
Yes, there really is a continuum between something as simple as a thermostat that switches the heat on when it gets cold and switches it off when it gets warm, all the way up to humans and even beyond. The continuum is mainly in the nature of the task environment: how complicated is the world that the entity has to deal with?
carroll
I like the idea there's a continuum here. It's not like there is some sharp phase transition between dumb computer programs and artificially intelligent ones.
russell
That's right. In fact, there's a nostrum that's been put about for a long time: as soon as it works, it stops being artificial intelligence. Which is a little bit unfair, because then, of course, that means AI is a field of continual failure. But I think that's in some ways accurate.
For example, every time you drive your car and you get directions there's an AI algorithm running in your cellphone that is computing the shortest paths and incorporating expected delays and so on. This is a very classical AI algorithm that was developed in the late '60s and early '70s and no one thinks of that as AI anymore.
We’re in a moment where people are hyped about very specific tools – large language models, neural networks, GANs – but have trouble distinguishing between them. To them, it’s all “AI.” Conversations devolve into confusion or inane generalisations about “robots taking our jobs” simply because we can’t clarify which part(s) of the constellation people are pointing at.
It’s become an empty pointer. We attempt to point at possible futures and find ourselves all looking in opposite directions.