Really appreciate you sharing your workflow honestly.
I have a few follow-up questions. When you're using SonoBus, are your musicians running local synths/instruments, or are you working with audio streams? And are you on macOS? I ask because you can actually use Contrapunk with SonoBus today: route your guitar through Contrapunk, send the harmony MIDI to a synth via an IAC bus, and pipe the synth's audio into SonoBus through BlackHole. Your session hears the harmonized output, and you can mute/unmute the Contrapunk voices like any other musician. That said, it's clearly an extra step you shouldn't have to take. Longer term, we can look at native SonoBus integration via the AOO protocol, so Contrapunk would show up as a peer in your session directly. What do you think? Would love your input on what that should look like.
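To make the middle of that chain concrete, here's a toy sketch of the kind of note-to-note mapping whose output would be sent as harmony MIDI down an IAC bus. Everything here (the key of C major, a "diatonic third above" voicing, the function name) is an illustrative assumption, not Contrapunk's actual algorithm:

```python
# Toy "diatonic third above" harmonizer (illustration only; the scale,
# voicing, and names are assumptions, not Contrapunk's real algorithm).

C_MAJOR = [0, 2, 4, 5, 7, 9, 11]  # pitch classes of the C major scale

def harmony_third_above(note: int) -> int:
    """Map a MIDI note to the scale tone two diatonic steps above it.

    Non-scale input notes are snapped down to the nearest scale tone first.
    """
    octave, pc = divmod(note, 12)
    # snap to the highest scale degree at or below this pitch class
    degree = max(i for i, s in enumerate(C_MAJOR) if s <= pc)
    target = degree + 2  # a diatonic third = two scale steps up
    target_octave, target_degree = divmod(target, len(C_MAJOR))
    return (octave + target_octave) * 12 + C_MAJOR[target_degree]

print(harmony_third_above(60))  # C4 -> E4 (64)
print(harmony_third_above(71))  # B4 -> D5 (74)
```

In the workaround above, notes like these would land on a synth listening to the IAC bus, and the synth's audio would then reach the session through BlackHole.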