Pascal strings have the issue that you need to agree on an integer size for the length prefix to cross an ABI boundary (unless you want to limit all strings to 255 characters), and what the prefix means is ambiguous if you have variable-length characters (e.g. Unicode). These problems were severe enough that Pascal derivatives all added null-terminated strings.
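
A minimal sketch in C of what a length-prefixed string might look like; the struct and helper here are hypothetical, just to show that both sides of an ABI boundary have to agree on the width of the prefix and on what it counts.

```c
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical length-prefixed ("Pascal-style") string. Both sides of an ABI
 * boundary must agree on the width of `len`: uint8_t caps strings at 255
 * bytes, while uint32_t or size_t changes the layout entirely. */
struct pstring {
    uint32_t len;   /* length in... bytes? code points? The prefix alone doesn't say. */
    char     data[];
};

struct pstring *pstring_from_cstr(const char *s) {
    size_t n = strlen(s);
    struct pstring *p = malloc(sizeof *p + n);
    if (!p) return NULL;
    p->len = (uint32_t)n;      /* silently truncates if n exceeds UINT32_MAX */
    memcpy(p->data, s, n);
    return p;
}
```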

It took a while for languages to develop the distinction between a string's length in characters and its length in bytes that lets us make this work today. In that time, C derivatives took over the world.
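
For illustration, a small C sketch (the helper name is mine) of that distinction for UTF-8: strlen() reports bytes, while counting code points means skipping continuation bytes, which start with the bit pattern 10xxxxxx.

```c
#include <stdio.h>
#include <string.h>

/* Count code points in a (assumed valid) UTF-8 string by skipping
 * continuation bytes. */
static size_t utf8_codepoints(const char *s) {
    size_t count = 0;
    for (; *s; s++)
        if (((unsigned char)*s & 0xC0) != 0x80)  /* not a continuation byte */
            count++;
    return count;
}

int main(void) {
    const char *s = "na\xC3\xAFve";                     /* "naïve": 'ï' is two bytes */
    printf("bytes: %zu\n", strlen(s));                  /* 6 */
    printf("code points: %zu\n", utf8_codepoints(s));   /* 5 */
    return 0;
}
```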

If we're specifying the size of a buffer, we obviously work in bytes rather than some arbitrary larger unit.

Agreed that passing between otherwise incompatible ABIs is likely what drove the adoption of null termination. The only other option that comes to mind is a bigint length prefix, but that would be at odds with the rest of the language in most cases.

It wasn't obvious to everyone at the time that a string's size in bytes and its length in characters are often different. It was very common to find code that treated the byte size as the character count for things like indexing, and vice versa.
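
To make that concrete, here's the kind of bug that pattern produces (a hypothetical snippet, assuming UTF-8): "truncating to N characters" by taking N bytes cuts a multi-byte character in half.

```c
#include <stdio.h>
#include <string.h>

int main(void) {
    const char *s = "na\xC3\xAFve";   /* "naïve": 6 bytes, 5 characters */
    /* Treating bytes as characters: take "the first 3 characters" by
     * copying 3 bytes, which splits the two-byte 'ï' sequence. */
    char buf[8];
    memcpy(buf, s, 3);
    buf[3] = '\0';
    printf("%s\n", buf);   /* "na" plus a lone 0xC3 lead byte: mojibake */
    return 0;
}
```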