Linear time in the length of the sequence, yes. But is the sequence length linear in the dimension of the vectors, or in the number of moves given? That's what is interesting.
Linear in the size of the witness, however many bits it takes to express it.
This applies to any computable problem though, no? At minimum the verifier has to read the witness. If we ignore PCPs and such. The point here is that the witness grows very fast in terms of vector dimensionality and/or move set size.
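A minimal sketch of what that verifier looks like (the helper name `verify_witness` is mine, not from the thread): given the goal vector, the displacement vectors, and a witness, i.e. the proposed sequence of moves, the check is a single pass over the witness, linear in its length.

```python
def verify_witness(goal, moves, witness):
    """Check that applying `witness` (a sequence of move indices) from the
    origin reaches `goal`. Runs in O(len(witness) * dimension) time."""
    position = [0] * len(goal)
    for move_index in witness:
        for i, delta in enumerate(moves[move_index]):
            position[i] += delta
    return position == list(goal)

# The example from downthread: reaching (10, 10, 10) takes 1210 + 110 + 10 = 1330 moves.
moves = [(1, 0, 0), (-10, 1, 0), (-10, -10, 1)]
witness = [0] * 1210 + [1] * 110 + [2] * 10
print(verify_witness((10, 10, 10), moves, witness))  # True
```

(Depending on the exact problem variant, the verifier might also have to check that intermediate coordinates never go negative; that doesn't change the linear bound.)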
It doesn’t need to be linear though. Polynomial is enough.
But I guess it can be proven that the shortest possible sequence grows faster than any polynomial.
Proving it's superpolynomial is pretty easy: just make the goal vector `(10, 10, 10)` and the displacement vectors `{(1, 0, 0), (-10, 1, 0), (-10, -10, 1)}`. It takes ~1000 steps (1330 exactly) to reach the goal, and each added dimension (with an analogous extra move) multiplies the step count by roughly 10.
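To sanity-check that count (a sketch; `min_sequence_length` is my own helper, not from the article): because move i is the only one that increases coordinate i, the number of times each move gets used is forced, so the minimal length comes from solving that triangular system from the last coordinate backwards.

```python
def min_sequence_length(dim, goal=10, penalty=10):
    """Minimal number of moves to reach (goal, ..., goal) when move i adds +1
    to coordinate i and subtracts `penalty` from every earlier coordinate."""
    counts = [0] * dim
    for i in reversed(range(dim)):
        # Every later move knocks `penalty` off coordinate i, so move i must
        # cover that deficit on top of reaching `goal`.
        counts[i] = goal + penalty * sum(counts[i + 1:])
    return sum(counts)

for dim in range(3, 7):
    print(dim, min_sequence_length(dim))
# 3 1330
# 4 14640
# 5 161050
# 6 1771560
```

Each extra dimension multiplies the length by roughly 11, so the shortest witness is exponential in the dimension even though every number in the input is tiny.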
So it grows, at least, exponentially. Back in 1976, Lipton proved the problem is EXPSPACE-hard, and his construction already forces shortest sequences of doubly exponential length. It was conjectured to be EXPSPACE-complete for the longest time, but it turned out to be significantly harder.
I think they proved it grows like the Ackermann function.
Sounds implausible… The number of computational steps needed to find the shortest sequence maybe grows like the Ackermann function. But the length of the shortest sequence itself?
I think you are right. I just read this article linked in the OP: https://www.quantamagazine.org/an-easy-sounding-problem-yiel...