Yes, you absolutely should have the right to install (or uninstall) whatever software you want on any of those, assuming it contains writable program memory. The alternative is a nightmarish dystopian future where your washing machine company is selling its estimate of your political inclinations, sexual activities, and risk aversion to your car insurance company, your ex-husband, your trade union representative, and your homeowners' association.
I thought I held this line, but then I imagined that if my credit card had writable program memory, I'd be fine with a third party preventing me from using it for its intended purpose if the card weren't trusted there. There must be some purpose, for my own good, behind preventing me from writing to my own program memory, and I should be able to forgo that protection if I deem it worth it.
Likewise, I'd be fine with banking apps on phones requiring some level of trust, but that requirement shouldn't affect how the rest of my phone works so drastically.
Why would your credit card need to act against your interests? The only thing it should be doing is signing transactions to signal that you approve. The credit card company has their own computers that can be asked whether they approve a transaction. They don't need one in your pocket. They can rent a rack in a data center. It's not that expensive.
Similarly, the banking app on your phone should be representing your interests, 100%. It may need to keep secrets, such as a private transaction signing key, from your bank or from your boyfriend, but not from you. And it definitely should not be collecting information on your phone against your will or without your knowledge. But that is currently common practice.
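To make that concrete, here's a minimal sketch in Python (using the cryptography package) of the only job such a device needs to do: hold a private key that never leaves the device, sign the transactions you approve, and let the issuer verify the signature server-side with the matching public key. The function names and transaction format here are invented for illustration; real card networks use different schemes.

    # Sketch: the device holds the private key and does nothing but sign.
    # The issuer keeps only the public key and approves transactions
    # server-side. Names and message format are invented for illustration.
    import os
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
    )

    # --- on the device: generated once, never revealed to anyone ---
    device_key = Ed25519PrivateKey.generate()

    def sign_transaction(merchant: str, amount_cents: int, nonce: bytes) -> bytes:
        """Signal approval of exactly this transaction, nothing more."""
        message = f"{merchant}:{amount_cents}:".encode() + nonce
        return device_key.sign(message)

    # --- at the issuer: a rack in a data center holding the public key ---
    issuer_public_key = device_key.public_key()

    def issuer_approves(merchant: str, amount_cents: int,
                        nonce: bytes, signature: bytes) -> bool:
        message = f"{merchant}:{amount_cents}:".encode() + nonce
        try:
            issuer_public_key.verify(signature, message)
        except InvalidSignature:
            return False
        return True  # plus whatever balance and fraud checks the issuer runs

    # A $42.00 charge: the device signs, the issuer verifies.
    nonce = os.urandom(16)  # fresh per transaction, so an old approval can't be replayed
    sig = sign_transaction("corner-store", 4200, nonce)
    assert issuer_approves("corner-store", 4200, nonce, sig)

Nothing in this picture requires the issuer to control your device: if you overwrite the firmware and lose the key, the card simply stops producing valid signatures, which harms nobody but you.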
Why?
My washing machine could be programmed to do all of those things you're worried about without any writable memory. Why do the parts the manufacturer puts into it turn it from an appliance that washes my clothes into a computer that I have a right to install custom code on?
The principle is that the owner should have full control of their own device, because that's what defines private property. In particular, everything that the maker can make the device do must be something that the owner can make the device do. If the device is simply incapable of doing a certain thing, that might be bad for the owner, but it's not an abrogation of their right to their own property, and it doesn't create an ongoing opportunity for exploitation by the maker.
Maybe in theory your washing machine could be programmed to do those things without writable program memory. Like, if you fabricated custom large ROM chips with the malicious code? And custom Harvard-architecture microcontrollers with separate off-chip program and data buses? But then the functionality would, in theory, be detectable at purchase time (unlike, for example, Samsung's new advertising functionality: https://news.ycombinator.com/item?id=45737338), and you could avoid it by buying an older model that didn't have the malicious code. This would greatly reduce the maker's incentives to incorporate such features, even if it were possible. In practice, I don't think you could implement those features at all without writable program memory, even with the custom silicon designs I've posited here.
If you insist that manufacturers must not prevent owners from changing the code on their devices, you're insisting that they must not use any ROM, for any purpose, including things like the PLA that the 6502 used to decode instructions. It's far more viable, and probably sufficient, to insist that owners must be able to change any code on their devices that manufacturers could change.