My concern is creating literal sentience in a box. I don't, personally, think it's unfounded for me to have that concern, given that we're growing masses of human neurons and teaching them to perform tasks.

I'm not going to start campaigning against it or changing my life. But it still makes me deeply uncomfortable, and that's allowed.

> and that's allowed

In what sense, and as opposed to what? When aren't you allowed to feel irrationally uncomfortable, or baselessly concerned?