All opinions are my own:

The whole point of the CNNs is to act like an autoencoder on the input side and a decoder on the output side. The only reason this is done in the first place is that the number of electrodes in the dish is pitiful and has no chance of describing something as complex as Doom. They are there to create a latent space that can be fed through 60-odd electrodes, and to decode the neurons' latent space into button presses.
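A minimal sketch of that encode/decode split, with toy random linear maps standing in for the two CNNs. The electrode count, button names, and weight matrices here are all illustrative assumptions, not the actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for the two CNNs: a learned encoder would map a
# 320x200 game frame down to one stimulation value per electrode,
# and a decoder would map recorded spike counts back to button presses.
N_ELECTRODES = 64                        # assumption: "60-odd" electrodes
frame = rng.random((200, 320))           # fake Doom frame, 64,000 pixels

W_enc = rng.standard_normal((N_ELECTRODES, frame.size)) * 0.01
stim = W_enc @ frame.ravel()             # latent code: one value per electrode

BUTTONS = ["forward", "left", "right", "shoot"]
spikes = rng.poisson(5.0, N_ELECTRODES)  # fake recorded spike counts
W_dec = rng.standard_normal((len(BUTTONS), N_ELECTRODES)) * 0.1
action = BUTTONS[int(np.argmax(W_dec @ spikes))]

print(stim.shape, action)
```

The point is the shape of the bottleneck: 64,000 pixels in, 64 stimulation values through the dish, a handful of button presses out.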

The pong version of the game was the proof of concept that neurons can learn without a latent space intermediate in either direction. Both the world state and neuronal control were raw signals: https://pubmed.ncbi.nlm.nih.gov/36228614/

What I wanted to do after dish brain pong, but never had the budget for, was using live animals as the computational substrate. Use the visual cortex of one as the input, send the neural spikes to a second animal's frontal lobe for computation, and finally send those signals to a third animal's motor cortex to physically press buttons. It's a shame we never raised enough, because it wouldn't have cost more than $15m to build the hardware and do the biological proof of concept.

> using live animals as the computational substrate. Use the visual cortex of one as the input, send the neural spikes to a second animal's frontal lobe for computation, and finally send those signals to a third animal's motor cortex to physically press buttons.

That sounds terrifying.

It does, but most of what we do to animals is terrifying. I can see why getting funding for this idea might not have been easy, though. "I want to mind-control three animals to play Doom" is certainly a pitch.

That is the fallacy of relative privation. The fact that most of what we do to animals is terrifying should be the motivating factor to NOT do more of it, such as the atrocity described above.

> The only reason why this is done in the first place is because the number of electrodes in the dish is pitiful and has no chance of describing something as complex as Doom.

This sounds a bit suspicious though. If we're confident that the neurons aren't complex enough to understand Doom, how can they be said to be complex enough to play it? Playing a game is a loose term, but it seems difficult to say that something is playing a game it can't comprehend or interact with. By analogy, if there were a CNN between me and a game of Doom, people would say "roenxi is cheating with an AI aim-bot", not "roenxi is playing Doom".

The whole thing is still pretty cool though. Hopefully the neurons are having fun, I'm sure we all wish them what happiness they can muster.

There aren't enough input electrodes to encode a Doom frame into the multi-electrode array without compression.

That's all the artificial neural networks are doing.

If we could have gotten an MEA with 320x200 electrodes we wouldn't have used any encoding and just let the neurons figure it out. Instead it is an 8x8 grid.
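To make the bottleneck concrete: a naive way to squeeze a 320x200 frame onto an 8x8 grid is block averaging. This is purely illustrative (the real encoder was a CNN, not a mean-pool), but it shows the scale of the compression the electrode count forces:

```python
import numpy as np

# Block-average a 320x200 frame (200 rows x 320 cols) down to the
# 8x8 grid an MEA allows: 25x40-pixel blocks, one mean per electrode.
frame = np.arange(200 * 320, dtype=float).reshape(200, 320)
grid = frame.reshape(8, 25, 8, 40).mean(axis=(1, 3))   # shape (8, 8)

ratio = frame.size / grid.size
print(grid.shape, ratio)   # 64,000 pixels -> 64 values: 1000x compression
```

Whatever the encoder actually learns, it has to throw away roughly 99.9% of the raw pixel information before the dish ever sees it.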

We've got LLMs that seem to be smarter than anyone I'm talking to day-to-day, and one useful model of them is just "compression". Compression is turning out to be a pretty key operation in intelligence and understanding (in fact, it seems to be intelligence and understanding in key ways). If we compress Doom down to "shoot" and "press the buttons in the way most favourable to the player", then good compression could let a fair coin play Doom well if someone flips it fast enough.

I mean, maybe the ANN is just sampling the screen, in which case I'm not sure why we're talking about it as a "network". But the type of compression seems critical.

Have I watched any of the videos or read the code? No I have not.

> What I wanted to do after dish brain pong, but never had the budget for, was using live animals as the computational substrate.

This is horrific. What’s your end goal? Prisoners as data centers?

I hope you rethink this.

> What I wanted to do after dish brain pong, but never had the budget for, was using live animals

Torture, so casually mentioned. For what, I wonder.

When we can turn off distress and pain in farm animals, we will have done more to improve well-being in the world than anyone alive today. Factory farms stop being an efficient evil and become the only moral way to produce meat.

And as a side effect we also get super intelligence on a substrate that is 10 orders of magnitude more energy efficient than silicon.

Do you mean we'd feed their brains an alternate reality, or just remove their pain and distress signals?

Regardless of the answer, lab meat still seems more ethical: it was never a sentient being in the first place.


This sounds nightmarish. Maybe we build a human centipede if we can get the VC funding next?

Or a Torment Nexus!!

Great idea, intense pain does provide a stronger response in the neuronal substrate. The prisoners, or uhh, “research subjects”, won’t mind. It’s for science. /s

I would have been quite happy to use my own brain as the computational substrate and I had more than a few other people keen to be the input and output parts of the system.

It's rather unfortunate that in the West it is impossible to get elective brain surgery. The countries that will do it have at best a spotty record. I talked to someone who had it done in Brazil and their electrodes became dislodged after a few months.

There is nothing new or horrifying about self experimentation. Newton for one did it in conditions that were far more dangerous: https://psmag.com/social-justice/newtons-needle-scientific-s...

I'm totally fine with consensual human experimentation that somehow threads the needle around exploitation of the poor - just not sure how we do the latter part short of requiring experimentees to pass a minimum net-worth threshold?

I think the closest would be: if anyone involved ever complains to authorities at all, everyone involved gets in trouble. If no one ever complains, no trouble. Everyone involved is forever at the mercy of everyone else involved.

I see perverse incentives to ablate complaint origination to expression pathways, or complaint system dependencies.

Or not so perverse, as this makes running these ventures much safer. Safety first!

Complaining to the authorities needs to come with a cost, otherwise people who don't believe in the research, or who are looking for a payday, will join just to complain.

A competitor would pay a former customer to complain.

Or...at the mercy of a scary man with a big wrench. Every single post you've put in this thread is a volatile mix of idealistic, naive, and sociopathic. So, obviously you'll be a tech CEO in 10 years.

??? This is the only post I made in this thread?

Reminds me of the head transplant experiments. The stuff of nightmares but also fascinating.

Gosh, it's been years, but I think they did the dual-animal experiment with rats about a decade ago. I'm likely misremembering, but they tickled a rat in Japan, fed the impulses over the internet, and had another rat in maybe Brazil move its tail in response. From what I recall it did potentiate over time, implying learning at the reflex level. Sorry I can't find the link though!

Hahaha I love how you made something that wouldn’t be harmful sound like a nightmare horror show.

Edit: sweet Jesus, never mind, I misread it.

Yes... quite a shame that we never made an amalgamated cyborg horror out of parts and pieces of several different animals. That's definitely not the plot of every sci-fi horror movie.

>What I wanted to do after dish brain pong, but never had the budget for, was using live animals as the computational substrate.

What does the ethical due diligence process look like, for something like this?

Haha, you made me laugh quite a bit, as if ethical due diligence was even a blip in the mental model of someone who talks like that about sentient life forms.

I figure I'd get some mice fed to pythons.

You know, the ones swallowed head first and alive that then drown in stomach acid, the ones you can buy in any pet store?

Yeah, "nature is brutal, therefore what gives if i raise the bar in the suffering we bring to this world", great logic right there mate, specially when we all know such experiments are without the slightest shred of doubt aimed to end up using humans neurons, because those are the most powerful.
