I've gotten a lot of gains in this area in the past by just - not memcpy'ing. A good percentage of the time, somebody assumes that they need to copy something somewhere when in fact, the original never gets referenced. I can often get away with reading a buffer off the wire, inserting null terminators to turn bits of the buffer into proper C-style strings and just using them in-place.
That is really good advice; copying data everywhere only makes sense if the data will be mutated. I only wonder why C-style strings were invented with 0 termination instead of a varint length prefix. Knowing the string length up front would have saved so much copying and so many bugs.
That reminds me of one of my favorite vulnerabilities. A security researcher named Moxie Marlinspike managed to register an SSL cert for .com by submitting a certificate request for the domain .com\0mygooddomain.com. The CA looked at the (length-prefixed) ASN.1 subject name, saw that it contained a legitimate domain, and accepted it, but most implementations treated the subject name as a null-terminated C string and stopped parsing at the null byte.
Pascal strings have the issue that you need to agree on an int size to cross an ABI boundary (unless you want to limit all strings to 255 characters), and what the prefix means is ambiguous if you have variable-length characters (e.g. Unicode). These problems were severe enough that Pascal derivatives all added null-terminated strings.
Took a bit for languages to develop the distinction between string length in characters and bytes that allows us to make it work today. In that time C derivatives took over the world.
If we're specifying the size of a buffer we obviously work in bytes as opposed to some arbitrary larger unit.
Agreed that passing between otherwise incompatible ABIs is likely what drove the adoption of null termination. The only other option that comes to mind is a bigint implementation, but that would be at odds with the rest of the language in most cases.
It wasn't obvious to everyone at the time that string size in bytes and characters were often different. It was very common to find code that would treat the byte size as the character count for things like indexing and vice versa.
I'm not here to defend zero-terminated strings, but I'll note that prefixed strings would be equally bad for OP's goal, or even worse, since you would need to inject int prefixes instead of zero bytes.