MPEG-2 TS is a container. H.264 is a coding specification. They are totally different things.
One can find MPEG-2 TS in surprising places (see: DOCSIS encapsulating Ethernet frames into TS packets).
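For anyone curious, the container/codec split is easy to see at the byte level: a TS file is just a train of fixed-size 188-byte packets, each with a sync byte and a 13-bit PID saying which stream the payload belongs to (H.264 video, audio, or even Ethernet frames in the DOCSIS case). A rough Python sketch, with helper names that are mine, not from any spec:

```python
# Minimal sketch: pull the PID out of each MPEG-2 TS packet header.
# Packets are a fixed 188 bytes: sync byte 0x47, then 3 header bytes
# carrying flags and the 13-bit PID, then adaptation field / payload.

TS_PACKET_SIZE = 188
SYNC_BYTE = 0x47

def ts_pids(data: bytes) -> list[int]:
    """Return the PID of each 188-byte TS packet in `data`."""
    pids = []
    for off in range(0, len(data) - TS_PACKET_SIZE + 1, TS_PACKET_SIZE):
        pkt = data[off:off + TS_PACKET_SIZE]
        if pkt[0] != SYNC_BYTE:
            raise ValueError(f"lost sync at offset {off}")
        # PID = low 5 bits of header byte 1 + all of byte 2 (13 bits).
        pids.append(((pkt[1] & 0x1F) << 8) | pkt[2])
    return pids

# Two fake packets: PID 0x0000 (the PAT) and PID 0x0100.
pat = bytes([0x47, 0x40, 0x00, 0x10]) + bytes(184)
vid = bytes([0x47, 0x41, 0x00, 0x10]) + bytes(184)
print(ts_pids(pat + vid))  # [0, 256]
```

The fixed packet size and self-synchronizing sync byte are a big part of why TS survives lossy links and dumb hardware so well.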
If I had to guess why MPEG-2 TS, it'd probably be for the fact that it's a well-supported streaming format in both hardware and software. If you tried using QuickTime or MPEG-4 containers, you'd have to rely on hacks like ensuring the moov atom precedes mdat.
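To make the moov/mdat point concrete: an MP4 file is a sequence of length-prefixed boxes, and a player can only start decoding once it has seen moov, so "fast start" just means moov was written before the (large) mdat. A rough sketch of checking that, assuming simple 32-bit box sizes (the helper names are made up for illustration):

```python
# Sketch: walk the top-level boxes of an MP4/QuickTime file and check
# whether 'moov' precedes 'mdat'. Ignores 64-bit and to-EOF box sizes.
import struct

def top_level_boxes(data: bytes) -> list[str]:
    """Return the 4-char types of the top-level boxes, in file order."""
    types, off = [], 0
    while off + 8 <= len(data):
        size, = struct.unpack_from(">I", data, off)
        types.append(data[off + 4:off + 8].decode("ascii"))
        if size < 8:  # size 0/1 (to-EOF / 64-bit) not handled here
            break
        off += size
    return types

def fast_start(data: bytes) -> bool:
    boxes = top_level_boxes(data)
    return boxes.index("moov") < boxes.index("mdat")

def box(kind: str, payload: bytes = b"") -> bytes:
    return struct.pack(">I", 8 + len(payload)) + kind.encode("ascii") + payload

# Tiny synthetic file: ftyp, then mdat, then moov -> not fast-start.
slow = box("ftyp", b"isom") + box("mdat", b"\x00" * 16) + box("moov")
print(fast_start(slow))  # False
```

Tools like ffmpeg's `-movflags faststart` do essentially this reordering as a second pass after encoding, which is exactly the kind of hack TS never needs.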
Matroska may be worth considering (especially the subset used by WebM to make it stream-friendly and quicker to seek), but no idea how widespread hardware support is for (de)muxing that.
Before ProRes, we captured HD content as 100 Mbps MPEG-2 video with PCM audio wrapped in a SMPTE 302M stream, muxed together in an MP2TS wrapper. The 302M made it even more difficult, as not all MP2TS tools could handle it correctly, and some would not allow for custom program stream IDs, so the output needed to be remuxed by other tools as a post-process to allow for custom PIDs.
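These days I believe ffmpeg's `-streamid` output option covers most of the custom-PID use case, but the underlying operation is just a bit-level rewrite of each packet header. A rough sketch of what a PID remap does (real remuxers also rewrite the PAT/PMT tables to match; this only touches packet headers):

```python
# Sketch of a PID remap at the byte level. The 13-bit PID spans the
# low 5 bits of header byte 1 and all of byte 2; the 3 high bits of
# byte 1 are flags (TEI, PUSI, priority) and must be preserved.
def remap_pid(packet: bytes, mapping: dict[int, int]) -> bytes:
    assert len(packet) == 188 and packet[0] == 0x47
    pid = ((packet[1] & 0x1F) << 8) | packet[2]
    new = mapping.get(pid, pid)
    b1 = (packet[1] & 0xE0) | (new >> 8)  # keep the 3 flag bits
    return bytes([packet[0], b1, new & 0xFF]) + packet[3:]

pkt = bytes([0x47, 0x41, 0x00, 0x10]) + bytes(184)  # PID 0x100
out = remap_pid(pkt, {0x100: 0x1FF})
print(hex(((out[1] & 0x1F) << 8) | out[2]))  # 0x1ff
```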
But seeing how many uses people came up with for MP2TS just shows its flexibility and resilience.
I don't think MKV is better suited for low-latency streaming production (as opposed to consumption) than MP4. That's really more of the domain of MPEG-TS, RTP etc., and presumably this is a reliable transport alternative to that.
HLS, MPEG-DASH etc. do successfully work around much of that, but they're really mostly that – workarounds to present stream-like semantics over an HTTP + CDN based delivery mechanism.
There are significant gaps on the production/distribution side of things, i.e., everything that happens before the CDN (and for very low latency even beyond), and I suppose this is an attempt at filling those.
Hardware support does not matter for TS, as containers are handled in software.
Fast start is irrelevant because MoQ normally uses fragmented MP4, not progressive.
Sure, but there are a lot of media appliances that don't really have upgradeable software. I suppose it can be pretty useful to just (un)wrap MoQ and feed the result over a local interface into something that just expects M2TS, and vice versa.
MPEG-TS is used to contain H.264 chunks for HLS. MPEG-DASH and the new CMAF standard use fMP4 containers instead. My personal take is that Media over QUIC (MoQ) should support both.
HLS also supports fMP4 now, and no one is making new services that use TS (there are some old ones still around with too much friction to switch).
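For reference, fMP4 in HLS just means the playlist points at an init segment via EXT-X-MAP and at .m4s media segments instead of .ts files, something like this (filenames are made up):

```
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-TARGETDURATION:4
#EXT-X-MAP:URI="init.mp4"
#EXTINF:4.0,
seg1.m4s
#EXTINF:4.0,
seg2.m4s
#EXT-X-ENDLIST
```

A DASH MPD can then reference the same init.mp4 and .m4s segments, which is what makes the single-copy-at-the-CDN setup possible.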
The nice thing about fMP4 is that it's supported by both HLS and MPEG-DASH, so you can save yourself the effort of duplicating all data at the CDN level (or using a CDN that can just-in-time assemble fMP4 and M2TS files from the same underlying source).
No, but under it ;)