I've found it's less about specificity and more about reducing the number of critical assumptions it needs to make. Being too specific can be a hindrance in its own right.
And that's also a decent barometer for what it's good at: the more critical assumptions AI needs to make, the less likely it is to make good ones.
For instance, when building a heat map, I don't have to get specific at all because the number of consequential assumptions it needs to make is slim. I don't care about, or can easily change, the colors and the label placement.
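To make the heat map point concrete, here's a minimal sketch (matplotlib is an assumption; the comment doesn't name a library). The low-stakes choices, colormap and labels, are one-line tweaks, so whatever the AI picks by default is easy to change afterward:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so this runs headless
import matplotlib.pyplot as plt

# Placeholder data; the real data source is whatever you're plotting.
data = np.random.default_rng(0).random((8, 8))

fig, ax = plt.subplots()
im = ax.imshow(data, cmap="viridis")  # swap the colormap freely; nothing breaks
fig.colorbar(im, ax=ax)
ax.set_title("heat map")  # label placement is equally cheap to adjust
fig.savefig("heatmap.png")
```

None of these defaults is a critical assumption: changing any of them is a trivial edit, not a rewrite.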