If I want to know about the law, I'll ask a lawyer (ok, not just any lawyer, but it's a useful first-pass filter). If I want to know about plumbing, I'll ask a plumber. If I want to learn about writing, I'll ask one or more writers. And so on. Experts in a field are far better at it than 95% of the population, whom you could ask instead but probably shouldn't.
There are many hundreds of professions, and most of them take a significant fraction of a lifetime to master; even then there is usually a daily stream of new insights. You can't just toss all of that information into a bucket and expect it to outperform the < 1% of people who have studied the subject extensively.
When Idiocracy came out I thought it was a hilarious movie. I'm no longer laughing: we're really putting the idiots in charge now, and somehow we think that quantity of output trumps quality of output. I wonder how many scientific papers published this year will contain AI-generated slop complete with mistakes. I'll bet that number is >> 0.
Surely you don't call up and pay for a lawyer every time you have a question about the law; you google it. In what world do you have the time, money and interest to consult a professional about every single thing you want more information on?
I've done small plumbing jobs after asking AI whether it was safe, and I've written the legal formalia nonsense the government wanted with AI's help. It was faster and cheaper, and I didn't bother anyone with the most basic of questions.
In some evaluations it is already outperforming doctors on text-based medical questions and lawyers on legal questions. I'd rather trust ChatGPT than a doctor who is barely listening, and the data seems to back this up.
The problem is that you don't know which evaluations those were, and you are not qualified to judge the answers yourself. By the time you are that qualified, you no longer need the AI.
Try asking ChatGPT, or whatever your favorite AI supplier is, something difficult in a subject you are an expert in, on par with the kind of evaluation you'd expect a qualified doctor or legal professional to perform. Then check the answer it gives, and extrapolate to fields you are clueless about.
That's the funny thing to me about these criticisms. Obviously it is an important caveat that many clueless people need to be made aware of, but it's still funny.
AI will just make stuff up instead of saying it doesn't know, huh? Have you talked to real people recently? They do the same thing.
Yeah... you're supposed to ask the 5%.
If you have a habit of asking random laypersons for technical advice, I can see why an idiot chatbot would seem like an upgrade.
Surely if you have access to a technical expert with the time to answer your question, you aren't asking an AI instead.
Books exist
chatGPT exists
(I'm not saying not to read books, but seriously: there are shortcuts)
...and is unreliable, hence the origin of this thread.
Indeed. The level of intellectual dishonesty on this page is staggering.
Sure, so long as the question is rather shallow. But how is this any better than search?