The article says,

> Under the hood, Uncertain<T> models GPS uncertainty using a Rayleigh distribution.

And a Rayleigh distribution is clearly not just an interval with uniform probability in between. Plain interval arithmetic isn't useful here because that implied uniform distribution isn't a good model of the real world at all.
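
To see the difference, here's a quick Python sketch of what a Rayleigh-shaped error looks like compared to a flat interval; the sigma and sample count are made up for illustration, not taken from the article:

```python
import random

# A Rayleigh variable is the length of a 2-D vector whose x and y components
# are independent zero-mean Gaussians, a common model for GPS distance error.
def rayleigh(sigma):
    return (random.gauss(0, sigma) ** 2 + random.gauss(0, sigma) ** 2) ** 0.5

random.seed(0)
sigma = 5.0  # made-up "5 m accuracy" figure, purely illustrative
errors = sorted(rayleigh(sigma) for _ in range(100_000))

# The error is one-sided, skewed, and has no hard upper bound; the quantiles
# keep growing, which no fixed interval with uniform mass inside it captures.
for q in (0.50, 0.95, 0.999):
    print(f"{q:.1%} quantile: {errors[int(q * len(errors))]:.1f} m")
```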

Take for example that Boost library you linked. Ask it to compute (-2,2)*(-2,2). It will give (-4,4). A more sensible result might be something like (-2.35, 2.35). The -4 lower bound is only attainable when the multiplicands are -2 and 2, i.e. at the extremes of the interval; probabilistically, if we assume these are independent random variables, then both of them hitting their extreme values simultaneously should have an even lower probability.
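
To make that concrete, here's a rough Python sketch: not Boost itself, just the same endpoint rule, plus a Monte Carlo check under the assumption that the inputs are independent uniforms on (-2, 2):

```python
import random

# Plain interval multiplication: the result's bounds are the min/max of the
# four endpoint products, so (-2, 2) * (-2, 2) comes out as (-4, 4).
def interval_mul(a, b):
    p = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(p), max(p))

print(interval_mul((-2, 2), (-2, 2)))  # (-4, 4)

# Now treat the same inputs as independent uniforms on (-2, 2) and look at
# where the bulk of the products actually lands.
random.seed(0)
samples = sorted(random.uniform(-2, 2) * random.uniform(-2, 2)
                 for _ in range(1_000_000))
lo = samples[int(0.05 * len(samples))]
hi = samples[int(0.95 * len(samples))]
print(lo, hi)  # roughly (-2.35, 2.35)
```

Empirically about 90% of the products land within roughly +/- 2.35, which appears to be where that figure comes from.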

While it does sound like GP missed a distinction, I don't see how (-2.35, 2.35) would be sensible. The extremes can happen (or else they wouldn't be part of the input intervals) and the code has to sensibly deal with that event in order to be correct.

The reason is that the uniform distribution is very rare. There's almost no real-world scenario where something is equally likely to be 2, 0, or -2, yet literally impossible to be -2.01. Such cases exist, but they're not the normal case.

In the noisy-sensor case there's some arbitrarily low probability of the readings being wildly wrong. If you go by true 10^-10 outlier bounds, they'll be useless for any practical purpose, while the 99% confidence range is comparatively small.

More often you want some other distribution, where (-2, 2) is the 90% interval rather than absolute bounds: 0 is more likely than -2, and -3 is possible but rare. These aren't bounds; you can ask your model for the 99th or 99.9th percentile, or whatever tolerance you want, and get something outside (-2, 2).
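
A minimal sketch of that reading, using a zero-mean normal purely as a stand-in for "some other distribution":

```python
from statistics import NormalDist

# Treat (-2, 2) as the central 90% interval of a zero-mean normal error,
# not as hard bounds; that fixes sigma.
sigma = 2 / NormalDist().inv_cdf(0.95)   # about 1.22
model = NormalDist(0, sigma)

# The same model answers any tolerance you ask for; wider coverage gives
# values outside (-2, 2), each with an explicit probability attached.
for coverage in (0.90, 0.99, 0.999):
    half_width = model.inv_cdf((1 + coverage) / 2)
    print(f"central {coverage:.1%} interval: +/- {half_width:.2f}")
```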

Interval arithmetic isn't useful because it only tells you the extreme values, not how likely those values are. So you have to interpret them as uniformly random. But operations like multiplication change the shape of these distributions, so uniform random isn't applicable any more. Interval arithmetic therefore has an essentially undefined underlying distribution that can change easily without being tracked.
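
A quick illustration of how far off the "still uniform" assumption gets after a single multiplication, using the same independent-uniform inputs as the (-2, 2) example above:

```python
import random

random.seed(0)
n = 1_000_000
products = [random.uniform(-2, 2) * random.uniform(-2, 2) for _ in range(n)]

# Probability that the product exceeds 3 in magnitude under the actual
# (non-uniform) product distribution: roughly 3%...
p_actual = sum(abs(p) > 3 for p in products) / n

# ...versus 25% if you treat the interval-arithmetic result (-4, 4) as if
# it were still uniformly distributed.
p_if_uniform = 2 / 8

print(f"actual: {p_actual:.3f}   assuming uniform on (-4, 4): {p_if_uniform:.3f}")
```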

> Operations like multiplication change the shape of these distributions, so uniform random isn't applicable any more.

Doesn't addition as well? Like if you roll d6+d6, the output range is 2-12, but the distribution is nothing like rolling d11+1.
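
For instance, the exact distributions (quick Python check):

```python
from collections import Counter
from itertools import product

# 2d6 and d11+1 share the range 2..12 but have very different shapes
# (triangular vs flat).
two_d6 = Counter(a + b for a, b in product(range(1, 7), repeat=2))
d11_plus_1 = Counter(r + 1 for r in range(1, 12))

for total in range(2, 13):
    print(f"{total:2d}: 2d6 = {two_d6[total]}/36   d11+1 = {d11_plus_1[total]}/11")
```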

Yes that's true! I used multiplication because that was my original example.

Okay, thanks :-). I was just trying to make sure I was understanding what I was reading.

-2 and 2 were not the extremes to begin with.