> The AI effect occurs when onlookers discount the behavior of an artificial intelligence program as not "real" intelligence.[1]

> Author Pamela McCorduck writes: "It's part of the history of the field of artificial intelligence that every time somebody figured out how to make a computer do something—play good checkers, solve simple but relatively informal problems—there was a chorus of critics to say, 'that's not thinking'."[2] Researcher Rodney Brooks complains: "Every time we figure out a piece of it, it stops being magical; we say, 'Oh, that's just a computation.'"[3]

https://en.wikipedia.org/wiki/AI_effect

I don't know how many times I've posted this now, or how many more times I'll have to keep posting it in the future, because it's a very real psychological phenomenon that I can observe in real time among people, including the author of this article.

It would have been better if the term used had been "Prediction Machine" rather than "Artificial Intelligence".

One may argue: what's in a name? AI is a catchier term and provides the umbrella under which things like statistics and prediction, ML, NLP, AGI, etc. can be grouped.

* As an engineer, the term makes sense, since the ultimate aim should be to reach AGI.

* As a salesperson, it makes the technology easier to sell.

* As for the rest of the world, the term's meaning varies, limited only by the individual's imagination.

  - My wife, who has a PhD in biotechnology, believes AI to be more like a sentient being.

  - My sales director feels AI should be able to act like a real salesperson: taking vague instructions, doing research, and churning out presentations that hit the customer's heart.

The author is right here - the term is misused and misunderstood by most folks, and we should expose the system for what it actually is rather than what it could be.