It seems to me that people think AI is somehow magic. Recently I led a product demo. The conversation went something like this:

End users (at my company) - Can your AI system look at numbers and find differences and generate a text description?

Pre-sales - (trying to clarify) For our system to generate text, it would be better if you give it some live examples so that it understands what kind of text to generate.

End users - But there is supporting data (metadata) around the numbers. Can't your AI system just generate text?

Pre-sales - It can, but you need to provide context and examples. Otherwise it is going to generate generic text like "there is x difference".

End user - You mean I need to write comments manually first? That is too much work.

Now these users have a call with another product - MS Copilot.

Well, you hear a lot about how AI will "empower" employees and generate new "insights" based on data for analysts and execs. In reality, most executives aren't really interested in that. They'd like it, sure, but what they really want is automation. They want "efficiencies"; they want cost cutting.

Anyone who's been involved in data science roles in corporate environments knows that "the data" is usually forced into an exec's pre-existing understanding of a phenomenon. With AI, execs are really excited about "cutting out the middlemen", when the middlemen in the equation are very often their own paid employees. That's all fine and dandy in an abstract economic view, but it's sure something they won't say publicly (at least most won't).

In terms of potential cost cutting, it's probably the most recent "new magic". You used to have to pay a consultant; now you can "ask AI".

This is a very common sentiment I see everywhere, and it really highlights how uneducated most people are about technology in general. Most folks seem to expect things to work magically and perform physics-breaking feats, and it honestly baffles me. I would expect this attitude from the younger generations who grew up only as users of technology like tablets and smartphones, but I honestly never expected millennials to be in the same camp - yet they are just as ignorant. And I find myself thinking: did I grow up differently? Were my friends not using the same Nintendo cartridges, VCRs, camcorders, and all the other tech that you had no choice but to learn at least the basic fundamentals to use? Apparently most people never delved deeper than surface level on how to use these things, and everything else went right over their heads...

Vonnegut, in On Writing Science Fiction, reflected on Player Piano being labeled sci-fi because it involved machines: "The feeling persists that no one can simultaneously be a respectable writer and understand how a refrigerator works, just as no gentleman wears a brown suit in the city."

> Apparently most people never delved deeper than surface level on how to use these things, and everything else went right over their heads...

This is really the truth of all things in life.

Plenty of people have a story of managers asking them to do impossible or nonsensical things. It should be unsurprising that people do the same with a machine.

> Most folks seem to expect things to work magically and perform physics-breaking feats, and it honestly baffles me

This is how it is being marketed, and I guess people are silly enough to believe marketing, so it's not too surprising.

The MS Copilot pre-sales person responded, "Oh, there is metadata? Then yes, it will discover that and generate a text description, no problem."

> It seems to me that people think AI is somehow magic.

That's because it is marketed as magic. It's marketed as magic so people will adopt the thing before knowing its shortcomings.

https://pbfcomics.com/comics/the-masculator/

TBF, synthetic data generation exists for exactly this reason. I do understand why a lot of companies go with the "safe" choice (Copilot) even though it's crap.
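To sketch what I mean (nothing more than a toy - `call_llm`, the metric names, and the value ranges here are all made up): instead of asking end users to hand-write example comments, you invent plausible numeric diffs, have a model write the commentary for each one, and use those pairs as the few-shot examples the pre-sales folks were asking for.

```python
import random

# Made-up metadata for the sketch; in practice this comes from the real system.
METADATA = {"metric": "monthly revenue", "unit": "USD", "segment": "EMEA"}

def call_llm(prompt: str) -> str:
    # Placeholder: a real version would call whatever model API you actually use.
    # Returns a templated string here just so the sketch runs end to end.
    return "Commentary for: " + prompt.splitlines()[1]

def make_synthetic_example() -> dict:
    """Invent a numeric diff, then have the model write the commentary for it."""
    before = random.randint(50_000, 200_000)
    after = int(before * random.uniform(0.7, 1.3))
    prompt = (
        f"Metric: {METADATA['metric']} ({METADATA['unit']}), segment {METADATA['segment']}.\n"
        f"Previous value: {before}, current value: {after}.\n"
        "Write one sentence of analyst-style commentary on this change."
    )
    return {"before": before, "after": after, "commentary": call_llm(prompt)}

# A handful of these can seed the few-shot prompt instead of hand-written comments.
examples = [make_synthetic_example() for _ in range(5)]
```

Someone still has to skim the synthetic examples before trusting them, but it's a lot less work than writing them from scratch.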

Pray, Mr Babbage, etc