Just define "general" as "as general as allowed by math, physics, and practical limitations." Or use a conventional reading of AGI as a human-level intelligence (which we, naturally, have a working example of).
Yeah, but if you do that, you then have to turn around and look at how all the goalposts keep moving. That is what I was (originally) trying to get at, and why I phrased it the way I did. If we truly had actual (artificial) general intelligence (or were close to it), we would already have a solid definition/benchmark (and it... probably wouldn't be what you said, but something a lot more detailed/thorough). Right now both AGI and ASI are just... whatever. "It earns a hundred billion dollars in revenue," "It can do anything a general human can do" (ignoring the sheer amount of ambiguity in that alone), "It can do most tasks a human can do" (again, ambiguous: which human, which tasks, on and on and on).