> In part, because Vulkan is a graphics API, not a GPGPU framework like CUDA. They're entirely different beasts.

Tbf, the distinction between rendering and compute has been disappearing for quite a while now. Apart from texture sampling, there isn't much reason to have hardware dedicated to rendering tasks on GPUs, and when there's hardly any dedicated rendering hardware left, why still have dedicated rendering APIs?

And, mesh shading in particular is basically "what if we just deleted all that vertex specification crap and made you write a compute shader"
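To illustrate the point: a mesh shader really does look like a compute shader that happens to emit triangles. A minimal sketch using the cross-vendor GL_EXT_mesh_shader GLSL extension (counts and indices here are placeholders, not from the comment above):

```glsl
#version 450
#extension GL_EXT_mesh_shader : require

// A compute-style workgroup that outputs geometry directly —
// no vertex buffers, no input assembly stage.
layout(local_size_x = 32) in;
layout(triangles, max_vertices = 64, max_primitives = 32) out;

void main() {
    // Declare how many vertices/primitives this workgroup emits.
    SetMeshOutputsEXT(3, 1);
    if (gl_LocalInvocationIndex == 0) {
        gl_MeshVerticesEXT[0].gl_Position = vec4(-1.0, -1.0, 0.0, 1.0);
        gl_MeshVerticesEXT[1].gl_Position = vec4( 3.0, -1.0, 0.0, 1.0);
        gl_MeshVerticesEXT[2].gl_Position = vec4(-1.0,  3.0, 0.0, 1.0);
        gl_PrimitiveTriangleIndicesEXT[0] = uvec3(0, 1, 2);
    }
}
```

The shader decides for itself where vertices come from (procedurally, from an SSBO, etc.), which is exactly the "deleted all that vertex specification crap" part.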

Note that it's not always better. Task shaders are quite hardware-specific, and it makes sense to ship defaults inside the driver.

Yes, I predict we will eventually be back at software rendering, with the difference that now it will be hardware-accelerated, since it runs on compute hardware.

This is not a statement on the hardware, it's a statement on what the APIs are trying to achieve. In this regard, they are remarkably different.

The point is that a (self-declared) low-level API like Vulkan should just be a thin interface to GPU hardware features. For instance, the entire machinery for defining a vertex layout in the PSO is pretty much obsolete today: vertex pulling is much more flexible and requires less API surface. And that's just one example of the "disappearing 3D API".
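For anyone unfamiliar with the term, "vertex pulling" means the vertex shader fetches its own attributes from a plain storage buffer instead of relying on the fixed-function vertex input state. A sketch (the `Vertex` struct and binding number are made up for illustration):

```glsl
#version 450

// Attribute layout lives in shader code, not in the PSO's
// VkPipelineVertexInputStateCreateInfo (which can be left empty).
struct Vertex {
    vec4 position;
    vec4 color;
};

layout(std430, binding = 0) readonly buffer Vertices {
    Vertex verts[];
};

layout(location = 0) out vec4 vColor;

void main() {
    // Pull the attributes ourselves, indexed by the built-in vertex ID.
    Vertex v = verts[gl_VertexIndex];
    gl_Position = v.position;
    vColor     = v.color;
}
```

Changing the vertex format then means changing a struct in the shader, not rebuilding pipeline state objects, which is the flexibility win being described.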

More traditional rendering APIs can then be built on top of such a "compute-first" API, but that shouldn't be Khronos's job.