There's long been a divide between what people call hard vs. soft AI, or strong vs. weak, or general vs. narrow. The definitions are a bit fuzzy, but generally a hard or strong AI would be able to think for itself, develop its own strategies and skills, and maybe have a sense of self. Soft AI, in contrast, is a mere tool: you put something in and get something out.
Now some people don't like using the term AI for soft/weak/narrow AI, because it's a fleeting definition, mostly applied to things that are novel and that we didn't think computers could do. Playing chess used to be considered AI, but shortly after a computer beat the human world chess champion, it no longer was. Buy a chess computer capable of beating Magnus Carlsen today and it's considered a clever algorithm, not AI. You can see the same thing playing out in real time with LLMs, which are going from AI to "just algorithms" in record time.