You realize that when typing into a calculator, you probably hit a wrong key more than 1% of the time? Which is why you always type important calculations twice?
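As a rough back-of-the-envelope (the 1% per-keystroke slip rate and the keystroke count are assumptions, purely for illustration), here's why entering an important calculation twice helps:

```python
# Back-of-the-envelope sketch: why double entry catches most slips.
# Both numbers below are assumptions for illustration, not measured rates.
slip_rate = 0.01      # assumed chance of hitting a wrong key
keystrokes = 12       # a modest calculation, about a dozen key presses

# Probability a single entry contains at least one wrong key
p_one_entry_wrong = 1 - (1 - slip_rate) ** keystrokes

# Probability two independent entries are BOTH wrong (and even then the
# two wrong results will usually disagree, flagging the mistake anyway)
p_both_wrong = p_one_entry_wrong ** 2

print(f"single entry wrong: {p_one_entry_wrong:.1%}")   # ~11.4%
print(f"both entries wrong: {p_both_wrong:.1%}")        # ~1.3%
```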
I've been stunned by how many smart people so casually claim that because LLMs aren't perfect, they have no value. Do they just forget that nothing in the world is perfect, and that the value of things is measured in degrees?
There’s a big difference between mistyping 1% of the time yourself (human error) and a calculator failing 1% of the time (machine error), and I’m willing to bet there isn’t a company out there (maybe a handful of less scrupulous ones) that has knowingly shipped a calculator that got it wrong 1% of the time. Especially in previous decades, when countless people were using a dedicated calculator dozens of times a day, it’s hard to imagine a 1% margin of error would have been acceptable.
Not to mention that now you have the compounded problem of your mistakes plus the calculator’s mistakes.
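To put rough numbers on that compounding (both rates below are assumptions, purely for illustration):

```python
# Sketch of how two independent error sources compound.
# Both rates are assumed for illustration, not measured figures.
p_human = 0.01      # assumed chance you mis-key the calculation
p_machine = 0.01    # a hypothetical calculator that is wrong 1% of the time

# Probability that at least one of the two gets it wrong, assuming independence
p_any_error = 1 - (1 - p_human) * (1 - p_machine)

print(f"chance of a wrong result: {p_any_error:.2%}")   # 1.99%, nearly double
```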
The computer on your desk encounters errors just holding values in memory; bit flips in RAM are a known phenomenon.
Yes, it's not 1%, but the argument is about them being imperfect devices. It's not a horrible thing to start with the presumption that calculators are not perfect.
Yes, but I don’t depend on the output of my computer’s memory in such explicit terms, and it doesn’t have lasting consequences. If my calculator literally gives me the wrong answer 1% of the time, that’s a bigger problem.
There isn't a difference in the big picture. Error is error. Even when we have incredibly reliable things, there's error when they interface with humans. Humans have error interfacing with each other.
But you seem to have missed the main point I was making. See? Another error. They're everywhere! ;)
> But you seem to have missed the main point I was making. See? Another error. They're everywhere! ;)
Ah, but whose error? ;)
> But you seem to have missed the main point I was making. See? Another error. They're everywhere! ;)
You really could’ve done without this bit.