Touché — when the person who rebuilt the pipeline says it's not raw, it's not raw :)

Is optical-flow interpolation a step too far for outreach, or fair game? I'm tempted to motion-interpolate the daily MP4s (ffmpeg's minterpolate filter) up to 60 fps for Lumara: it looks gorgeous, but the in-between frames are synthesized rather than observed. You're totally right about "raw", though; I suppose I meant more like "straight from the NASA APIs".
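For concreteness, this is roughly the invocation I had in mind, just a sketch: the filenames are placeholders, and the minterpolate options are the motion-compensated settings I'd reach for, not a tuned pipeline.

```python
# Sketch: build an ffmpeg minterpolate command for 60 fps motion interpolation.
# Filenames are placeholders. mi_mode=mci does motion-compensated
# interpolation, i.e. the in-between frames are synthesized, not observed.
def minterpolate_cmd(src, dst, fps=60):
    vf = f"minterpolate=fps={fps}:mi_mode=mci:mc_mode=aobmc:me_mode=bidir"
    return ["ffmpeg", "-i", src, "-vf", vf,
            "-c:v", "libx264", "-pix_fmt", "yuv420p", dst]

cmd = minterpolate_cmd("daily.mp4", "daily_60fps.mp4")
print(" ".join(cmd))
```

(Building the argv in Python rather than a shell one-liner just makes it easy to batch over a directory of daily files.)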

I personally would rather show actual data than interpolated data, but I don't know how many unphysical interpolation artifacts you're getting, or whether that really matters at the public-outreach level.

That said, if you felt like processing it yourself: the L1 files are available every 12-24 seconds, and preprocessed images are available at roughly one per minute. The synoptic version you're using is about one frame every 3 minutes, played at 20 fps, so you could triple the framerate without needing any interpolation, or string the images together yourself.
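Back-of-envelope on why the tripling works out (assuming a 24-hour day of frames; the cadences are the ones above):

```python
# Playback duration is frame_count / playback_fps, so tripling both the
# source cadence and the playback rate keeps the same real-time speed.
def playback(cadence_min, hours=24, fps=20):
    frames = int(hours * 60 / cadence_min)
    return frames, frames / fps

# Synoptic movie: 1 frame / 3 min, played at 20 fps.
syn_frames, syn_secs = playback(3, fps=20)
# Same day built from the ~1/min preprocessed images, played at 60 fps:
# identical playback duration, but every frame is real data.
img_frames, img_secs = playback(1, fps=60)
print(syn_frames, syn_secs, img_frames, img_secs)  # 480 24.0 1440 24.0
```

So a day compresses to the same 24 seconds of video either way; the one-per-minute route just fills in the extra frames with observations instead of synthesized ones.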