If we're specifying the size of a buffer, we obviously work in bytes rather than some larger, variable-width unit.
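As a rough illustration of that point (a minimal C sketch; the particular string and buffer size are just examples, and a UTF-8 source encoding is assumed): sizing and truncation APIs such as snprintf take their limit as a byte count, regardless of how many characters those bytes happen to encode.

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        char buf[16];                          /* capacity declared in bytes */
        /* snprintf's size argument is a byte count, not a character count:
           "こんにちは" is 5 characters but 15 UTF-8 bytes, plus the terminating NUL. */
        snprintf(buf, sizeof buf, "%s", "こんにちは");
        printf("%zu bytes used\n", strlen(buf));   /* prints 15 */
        return 0;
    }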
Agreed that passing strings between otherwise incompatible ABIs is likely what drove the adoption of null termination. The only other option that comes to mind is storing the length up front, which for arbitrary sizes would need something like a bigint, but that would be at odds with the rest of the language in most cases.
It wasn't obvious to everyone at the time that a string's size in bytes and its length in characters are often different. It was very common to find code that treated the byte size as the character count for things like indexing, and vice versa.
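To make that concrete, here's a minimal C sketch (the utf8_codepoints helper is hypothetical, written only for illustration, and it assumes well-formed UTF-8): strlen reports bytes, counting code points gives the character count, and a byte offset can land in the middle of a multi-byte sequence.

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical helper: count UTF-8 code points by counting bytes that
       are not continuation bytes (0b10xxxxxx). Assumes well-formed UTF-8
       and ignores combining characters / grapheme clusters. */
    static size_t utf8_codepoints(const char *s) {
        size_t n = 0;
        for (; *s; s++)
            if (((unsigned char)*s & 0xC0) != 0x80)
                n++;
        return n;
    }

    int main(void) {
        const char *s = "naïve";                          /* 'ï' is two bytes in UTF-8 */
        printf("bytes: %zu\n", strlen(s));                /* 6 */
        printf("code points: %zu\n", utf8_codepoints(s)); /* 5 */
        /* s[2] is the first byte of 'ï', not the third character --
           treating byte offsets as character indexes goes wrong here. */
        return 0;
    }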