To ruin it for everyone: They're also patented :) https://patents.google.com/patent/WO2023143707A1/en?inventor...

What's the innovation here?

Using logic operators? Picking something from a range of options with SoftMax? Having a distribution to pick from?

I remember reading about adaptive boolean logic networks in the 90's. I remember a paper about them using the phrase "Just say no to backpropagation". It probably goes back considerably earlier.

Fuzzy logic was all the rage in the 90's too, almost at the level of marketers sticking the label on everything the way "AI" is today. Most of that was just 'may contain traces of stochasticity', but the academic field used actual, well-defined logical operators on truth values interpolated between zero and one.
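To make that concrete, here is a tiny sketch of the classic Zadeh-style operators (min/max/complement, which is one common convention among several) acting on truth values in [0, 1]:

```python
# Classic fuzzy-logic operators on truth values in [0, 1].
# Minimal illustration only; min/max is one common choice of t-norm/t-conorm.
def fuzzy_and(a: float, b: float) -> float:
    return min(a, b)

def fuzzy_or(a: float, b: float) -> float:
    return max(a, b)

def fuzzy_not(a: float) -> float:
    return 1.0 - a

# Example: "somewhat hot" AND NOT "very humid"
hot, humid = 0.7, 0.2
print(fuzzy_and(hot, fuzzy_not(humid)))  # -> 0.7
```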

A quick look for prior work on picking from a selection turned up https://psycnet.apa.org/record/1960-03588-000 but these days softmax is just about ubiquitous.

> What's the innovation here?
> Having a distribution to pick from?

As I understand it, it's exactly this. Specifically, each neuron in the network is represented by a probability distribution over logic gates; the distribution's parameters are trained with ordinary gradient descent, and afterwards each distribution is collapsed to its most probable gate. The author has a few more details in their thesis:

https://arxiv.org/abs/2209.00616
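Roughly: each neuron holds learnable logits over a set of candidate gates, training mixes the gates' real-valued relaxations with softmax weights, and inference keeps only the argmax gate. A minimal sketch of that idea in PyTorch (illustrative only; the gate set, names, and training loop here are mine, not the author's difflogic code):

```python
# Sketch of one differentiable logic-gate "neuron" (not the difflogic library).
import torch
import torch.nn.functional as F

# Real-valued relaxations of a few two-input Boolean gates on inputs in [0, 1].
GATES = [
    lambda a, b: a * b,              # AND
    lambda a, b: a + b - a * b,      # OR
    lambda a, b: a + b - 2 * a * b,  # XOR
    lambda a, b: 1 - a * b,          # NAND
    lambda a, b: a,                  # pass-through A
    lambda a, b: 1 - a,              # NOT A
]

class LogicNeuron(torch.nn.Module):
    def __init__(self):
        super().__init__()
        # One learnable logit per candidate gate.
        self.logits = torch.nn.Parameter(torch.zeros(len(GATES)))

    def forward(self, a, b):
        # Training-time output: probability-weighted mixture of all gates.
        probs = F.softmax(self.logits, dim=0)
        outs = torch.stack([g(a, b) for g in GATES])
        return (probs[:, None] * outs).sum(dim=0)

    def hard(self, a, b):
        # Inference-time collapse: keep only the most probable gate.
        return GATES[int(self.logits.argmax())](a, b)

# Toy usage: learn XOR from data with plain gradient descent.
neuron = LogicNeuron()
opt = torch.optim.Adam(neuron.parameters(), lr=0.1)
a = torch.tensor([0., 0., 1., 1.])
b = torch.tensor([0., 1., 0., 1.])
target = torch.tensor([0., 1., 1., 0.])
for _ in range(200):
    opt.zero_grad()
    loss = F.mse_loss(neuron(a, b), target)
    loss.backward()
    opt.step()
print(neuron.hard(a, b))  # once the XOR gate dominates: tensor([0., 1., 1., 0.])
```

After training, the soft mixture is discarded and what remains is a fixed wiring of discrete gates, which (as I understand it) is where the promised inference-speed gains would come from.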

Specifically, it's the training approach that's patented. I'm glad to see that people are trying to improve on the method, so the patent will likely become irrelevant as better methods emerge.

The author also published an approach to applying their idea to convolutional kernels in CNNs:

https://arxiv.org/abs/2411.04732

In the paper they promise to update their difflogic library with the resulting code, but they seem to have conveniently forgotten to do so.

I also think their patent is too broad, but I suppose it speaks well of the entire ML community that we haven't seen more patents in this area. I could also imagine that, given that the approach promises some very impressive performance improvements, they're somewhat afraid it will be used for embedded military applications.

Liquid NNs are also able to generate decision trees.

My Zojirushi rice cooker says "fuzzy logic" on it, and it's 15 years old, so that phrase was still being marketed 15 years after "inception".