> not once, ever, in the video speak of ethics
On the contrary, I dislike premature ethics discussion, where you end up wildly speculating about what the tech might become and riffing off that, greatly padding whatever actual technical content you had. I don't want every technical paper to turn into that. Ethics should be treated as a higher-level overview of concerns in a field, with a dedicated study of that field's ethical concerns (by domain-specific ethics specialists).
Is your concern weapons automation, or animal rights?
My concern is creating literal sentience in a box. I don't personally think that concern is unfounded, given that we're growing masses of human neurons and teaching them to perform tasks.
I'm not going to start campaigning against it or changing my life. But it still makes me deeply uncomfortable, and that's allowed.
> and that's allowed
In what sense, and as opposed to what? Why wouldn't you be allowed to feel irrationally uncomfortable, or baselessly concerned?