> As for MIDI controllers, the host is supposed to convert them to parameter changes

And nearly everyone except Steinberg considers this to be a mistake. MIDI messages (CCs, pitch bend, and so on) are _not_ parameters.

As I understand it, this is better because it gives more freedom in routing parameter values. A parameter value might come not only directly from a MIDI controller knob, but also from an automation curve, from another plugin's output, or from an automation track's output. For example, you might combine two automation curves through a multiplication plugin and route the result to a plugin's parameter input.

Or you could have an automation curve that produces a sine wave, use a MIDI knob to modulate its amplitude, and send the result to an LPF's "cutoff frequency" input, so that the knob controls the amount of modulation.
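Under that routing, the combined signal chain could be sketched like this (all names here are hypothetical, not from any real plugin API):

```cpp
#include <algorithm>
#include <cmath>

// Hypothetical sketch of the routing above: an automation curve produces a
// sine wave, a MIDI knob (normalized to 0..1) scales its amplitude, and the
// result offsets a filter's base cutoff.
double modulated_cutoff(double base_hz, double knob,  // knob in [0, 1]
                        double lfo_rate_hz, double t_seconds,
                        double max_depth_hz) {
    const double two_pi = 6.283185307179586;
    double lfo   = std::sin(two_pi * lfo_rate_hz * t_seconds);  // -1..1
    double depth = knob * max_depth_hz;  // the knob controls modulation amount
    return std::clamp(base_hz + depth * lfo, 20.0, 20000.0);
}
```

With the knob at zero the cutoff stays at its base value; turning it up increases the swing of the sine modulation.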

So VST3 (and CLAP) treat each parameter as an additional plugin input, which can be routed arbitrarily.

If a plugin instead had a single MIDI input and extracted controller changes from the MIDI stream, the scenario above would be harder to implement (you would need to convert the output of the multiplication plugin into a sequence of MIDI controller changes and merge it with the other MIDI events).

Am I missing something here? Never developed a single plugin.

So there are a couple of problems here.

The first is that VST3 does not exist in a vacuum. Plugin developers need to ship AU (and previously VST2 concurrently with VST3) plugins. MIDI is still the basic abstraction used by higher-level wrappers for sending events to plugins, so in practice all VST3 events get converted to MIDI anyway (for the majority of plugins).

The second thing is that parameter values are special. You do not actually want the modulation features you are describing to touch the parameter value itself; you want them to act as modulation sources on top of the parameter value. Most synths, for example, treat MIDI CC and pitch bend as modulation sources for voice pitch or other parameters (programmable by the user), not as direct controls of parameters. Keep in mind that parameter values are typically serialized and saved in presets.
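A minimal sketch of that separation (hypothetical names; the point is that the preset only ever sees `base`, while modulation is layered on top):

```cpp
#include <algorithm>

// Sketch: the saved/serialized parameter is the base value; MIDI CC,
// pitch bend, etc. are modulation sources layered on top. Modulation
// never overwrites what the preset stores.
struct ModulatedParam {
    double base = 0.5;        // what the preset saves and the knob shows
    double mod_offset = 0.0;  // sum of active modulation sources

    double effective() const {
        return std::clamp(base + mod_offset, 0.0, 1.0);
    }
};
```

Serializing `base` while computing with `effective()` is what keeps a wiggling mod wheel from silently rewriting the user's preset.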

The third thing is that parameters are, in practice, not allowed to be dynamic. So if you want to use a MIDI controller as a modulation source, you need a dedicated parameter for it; you cannot add and remove it dynamically based on user input.

As an example:

> Or you could have an automation curve that produces a sine wave, and have a MIDI knob to modulate its amplitude, and send the result to an LPF "cutoff frequency" input,

This is not possible in VST3, in any host I'm aware of. Until 3.8 you could not output parameters from one plugin and route them to another, and I highly doubt anyone will support this.

VST3 is less flexible and more surprising in how it handles MIDI, which is why, historically, VST3 plugins have had very poor MIDI support.

I think you can find something similar in a modular synth: you can have an envelope generator that generates control voltage, and connect it to a filter cutoff frequency input.

I don't care much how it is done in other DAWs; I was just trying to design a routing system for a DAW on paper. I needed a concept that would be simple yet powerful. So I thought we could define 3 types of data - audio data, MIDI data, and automation (parameter) data - and every plugin may have inputs/outputs of these types. So if a plugin has 20 parameters, we can count them as 20 inputs, and we can connect inputs and outputs however we want (as long as the types match). Of course, we also have 3 types of clips - audio, MIDI, and automation curve clips. And obviously we can process this data with plugins - so we can take a MIDI clip, connect it to a plugin that generates envelopes for incoming MIDI notes, and connect its output to the cutoff frequency of a filter. Why not?
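The three-port-type idea could be sketched as, for instance (hypothetical types, not any real DAW's API):

```cpp
#include <string>

// Sketch of the three-data-type routing model described above.
enum class PortType { Audio, Midi, Automation };

struct Port {
    std::string name;
    PortType type;
};

// A connection is legal iff the port types match. A filter with 20
// parameters simply exposes 20 Automation-typed inputs alongside its
// Audio and Midi ports.
bool can_connect(const Port& out, const Port& in) {
    return out.type == in.type;
}
```

So an envelope-generator plugin would have a Midi input and an Automation output, and that output can legally patch into a filter's "cutoff" Automation input.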

Technically it is possible to process parameter data; we just have to deal with converting between different formats - some plugins might have "control voltage" inputs/outputs, while others allow changing parameters at sample- or block-precise points. And here VST3, which has a defined model for parameter interpolation, is easier to deal with than plugin formats that do not define an exact interpolation formula.
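The interpolation model in question boils down to linear interpolation between (sampleOffset, value) points within a block. A toy version (conceptually mirroring VST3's IParamValueQueue, but with hypothetical types) might look like:

```cpp
#include <utility>
#include <vector>

// Sketch of sample-accurate parameter interpolation: a queue of
// (sampleOffset, value) points; values between points are linearly
// interpolated, and the value holds after the last point.
double value_at(const std::vector<std::pair<int, double>>& queue,
                int sample, double last_block_value) {
    double prev_v = last_block_value;  // value at the end of the previous block
    int prev_s = 0;
    for (auto [s, v] : queue) {
        if (sample <= s) {
            if (s == prev_s) return v;  // point lands exactly here
            double t = double(sample - prev_s) / double(s - prev_s);
            return prev_v + t * (v - prev_v);
        }
        prev_s = s;
        prev_v = v;
    }
    return prev_v;  // past the last point: hold
}
```

A host that only has, say, a coarse "control voltage" stream could resample it into such point queues, and vice versa.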

By the way, I just noticed a serious weakness in my model - it doesn't support per-note parameters/controllers; all parameters are global in my concept. Guess I need to think more.

Your point about modulating parameters is valid; however, I am not sure whether it is better to implement modulation in the host and have full control over it (what do we do if a user moves a knob while automation is playing?) or to have every plugin developer implement it themselves (as in CLAP, which supports parameter modulation).

> This is not possible in VST3,

I think it is possible - the plugin gets a sequence of parameter changes from the host and doesn't care where they come from. As I remember, plugins may also have output parameters, so it should be possible to process parameter data using plugins.

So your paper design is most similar in spirit to CLAP. I would say that the actual audio/event processing bits are the "easy" part of the API.

> And here VST3, which has a defined model for parameter interpolation, is easier to deal with than plugin formats that do not define exact interpolation formula.

So I'll just reiterate that this is not true for either plugin or host developers and that's not a minority opinion. The parameter value queue abstraction is harder to implement on both sides of the API, has worse performance, and doesn't provide much in benefit over sending a sparse list of time-stamped events and delegating smoothing to the plugin.
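The alternative described here - sparse time-stamped events with smoothing delegated to the plugin - can be as simple as a one-pole smoother per parameter (hypothetical names, just to illustrate the division of labor):

```cpp
// Sketch: the host sends a sparse, time-stamped parameter event; the
// plugin stores the target and smooths toward it itself, one sample at
// a time, inside its process callback.
struct SmoothedParam {
    double current = 0.0;
    double target  = 0.0;
    double coeff   = 0.01;  // per-sample smoothing amount

    void on_event(double value) { target = value; }  // cheap: just store it

    double next_sample() {  // called once per sample during processing
        current += coeff * (target - current);
        return current;
    }
};
```

The host-side work shrinks to delivering one event; the plugin picks a smoothing strategy appropriate to the parameter (or none at all for stepped parameters), instead of both sides negotiating an interpolation queue.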

> As I remember, plugins may also have output parameters, so it is possible to process parameter data using plugins.

The host forwards those output parameters back to that plugin's editor, not to other plugins. You can use this as a hack to support metering, although in practice, since this is a VST3 quirk, few people do. Until 3.8.0, which added the IMidiLearn2 interface, there was no way to annotate MIDI mappings for output parameters, which caused hosts to swallow MIDI messages even when they should be forwarded to all plugins. I doubt that the new interface will be implemented consistently by hosts, and now there's a problem where old plugins may do the wrong thing in new versions of hosts that expect plugins to be updated (this is catastrophic for audio plugins - you never want a version update to change how they behave, because it breaks old sessions). There's also no good way to consistently send what are effectively automation clips out of plugins, since the plugin does not have a view into the sequencer.

And most importantly - plugins aren't aware of other plugins. If one plugin outputs a parameter change, it is meaningless to another plugin. Maybe if both plugins implement IMidiMapping2 the host can translate the output parameter change into a MIDI event and then into another parameter change. That sounds a lot stupider than just sending discrete MIDI events.

Essentially, the design of parameters in VST3 is fragile and bad.

You are right, though, that many DAWs do not allow routing and processing parameter data (control voltages). I notice that some plugins implement this internally, especially synths, so you can draw curves and process them inside the plugin - but I think it would be better if you didn't need every plugin developer to add a curve editor and modulators.