As an example, parts of the C++ standard library (though none of the core language, I believe) are covered by complexity requirements, but implementations can still vary widely. For instance, std::sort is required to be linearithmic, yet someone could still implement a very slow version without it being UB; even a quadratic implementation wouldn't be UB, it just wouldn't be standards-conforming.
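To make the distinction concrete, here is a minimal sketch (slow_sort is a hypothetical stand-in, not anything from a real stdlib): every operation it performs is well-defined and the result is correct, so calling it is never UB. A library shipping this as std::sort would merely violate the complexity requirement.

    #include <algorithm>
    #include <cassert>
    #include <vector>

    // Hypothetical O(n^2) selection sort. Behaviorally it is fully
    // defined: every read and write is valid and the output is sorted.
    // A stdlib shipping this as std::sort would be non-conforming
    // (it breaks the O(n log n) requirement) but would not cause UB.
    template <typename It>
    void slow_sort(It first, It last) {
        for (It i = first; i != last; ++i)
            std::iter_swap(i, std::min_element(i, last));
    }

    int main() {
        std::vector<int> v{3, 1, 4, 1, 5, 9, 2, 6};
        slow_sort(v.begin(), v.end());
        assert(std::is_sorted(v.begin(), v.end()));
    }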
UB is really about the observable behavior of the abstract machine, which is limited to reads and writes of volatile data and I/O library calls [1].
[1] http://open-std.org/jtc1/sc22/open/n2356/intro.html
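To illustrate what "observable" buys the optimizer, a small sketch of the as-if rule (assuming any conforming compiler with optimizations enabled): nothing observable depends on the plain loop below, so it may be deleted entirely, while the volatile store must be emitted.

    int sum = 0;             // ordinary object: not observable by itself
    volatile int status = 0; // volatile object: every access is observable

    int main() {
        for (int i = 0; i < 1000; ++i)
            sum += i;        // no observable effect; the optimizer may
                             // legally remove the whole loop
        status = 1;          // volatile write: observable behavior of the
                             // abstract machine, so it must actually happen
    }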
Edit: to clarify the example
I understand why Alexander Stepanov thought the complexity requirements were a good idea, but I am not convinced that in practice they deliver value. Worse, I don't see much sign that C++ programmers care.
You mentioned the C++ unstable sort, std::sort, in particular. Famously, although C++11 finally guarantees O(n log n) worst-case complexity, the libc++ stdlib didn't conform: it shipped a std::sort with an O(n^2) worst case instead.
The bug report saying essentially "Hey, your sort is defective" was opened in 2014, by Orson Peters. It took until 2021 to fix it.
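For context, the usual way to meet the C++11 worst-case guarantee is introsort: run quicksort, but once the recursion depth exceeds roughly 2*log2(n), fall back to heapsort, which is O(n log n) in the worst case. A rough sketch (illustrative names and cutoffs, not libc++'s actual code):

    #include <algorithm>
    #include <cmath>
    #include <iterator>

    // Quicksort with a depth budget; when the budget runs out, switch
    // to heapsort so no adversarial input can force O(n^2).
    template <typename It>
    void intro_sort(It first, It last, int depth) {
        while (last - first > 16) {             // leave small ranges for insertion sort
            if (depth-- == 0) {                 // partitioning is degenerating: bail out
                std::make_heap(first, last);
                std::sort_heap(first, last);    // O(n log n) worst case
                return;
            }
            It pivot = std::partition(first, last,
                [p = *std::next(first, (last - first) / 2)](const auto& x) { return x < p; });
            intro_sort(pivot, last, depth);     // recurse on one side...
            last = pivot;                       // ...and loop on the other
        }
        for (It i = first; i != last; ++i)      // insertion sort for the small remainder
            std::rotate(std::upper_bound(first, i, *i), i, std::next(i));
    }

    template <typename It>
    void intro_sort(It first, It last) {
        intro_sort(first, last, 2 * static_cast<int>(std::log2(last - first + 1)));
    }

The depth cutoff is the whole trick: it turns quicksort's average-case bound into a worst-case guarantee while keeping its typical speed.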