This paper is about the limits of current AI systems.

AI currently has trouble seeing what's missing. Seeing the negative space.

When dealing with a complex codebase you're newly exposed to, you tackle an issue from multiple angles: you look at the data structures, the code execution paths. Basically, humans clearly have some pressure to go, "fuck, I think I lost the plot," and then approach it from another paradigm, or try to narrow scope, or, based on the increased information, isolate the core place where edits need to be made.

Basically, it's the ability to say "this has stopped making sense" and stop or change approach.
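If you squashed that reflex into code, it might look something like this. Everything here (Attempt, step, score, solved) is hypothetical scaffolding I'm making up for the sketch, not a real agent API; the point is just the control flow of noticing a stall and switching paradigms.

```python
# Sketch: abandon an approach once it visibly stops making progress.
# All names here are invented for illustration.
from typing import Optional, Protocol

class Attempt(Protocol):
    def step(self) -> None: ...    # do one unit of work
    def score(self) -> float: ...  # rough progress estimate, higher is better
    def solved(self) -> bool: ...

def solve(attempts: list[Attempt], patience: int = 3,
          max_steps: int = 100) -> Optional[Attempt]:
    """Run each approach until it solves the problem or loses the plot."""
    for attempt in attempts:
        best, stalls = float("-inf"), 0
        for _ in range(max_steps):
            attempt.step()
            if attempt.solved():
                return attempt
            if attempt.score() > best:
                best, stalls = attempt.score(), 0
            else:
                stalls += 1            # no new progress this step
            if stalls >= patience:     # "fuck, I think I lost the plot"
                break                  # abandon this paradigm, try the next
    return None
```

The interesting part isn't the loop; it's that the stall counter is a stand-in for the feeling humans get for free.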

Also, we clearly do path exploration and semantic compression in our sleep.

We also have the ability to transliterate data between representations: semantic structures to visual ones, time series, lightweight algorithms (but not exponential ones; we have a known blind spot there).
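A toy instance of that kind of transliteration, with names I'm inventing for the sketch: take a numeric time series and re-render it as a visual structure (an ASCII sparkline), so the shape of the data hits the eye instead of the arithmetic.

```python
# Sketch: numeric time series -> visual structure.
def sparkline(values: list[float]) -> str:
    """Map each value onto a vertical bar glyph, scaled to the series range."""
    bars = "▁▂▃▄▅▆▇█"
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero on a flat series
    return "".join(bars[int((v - lo) / span * (len(bars) - 1))] for v in values)

print(sparkline([1, 3, 2, 8, 5, 13, 4]))  # prints ▁▂▁▅▃█▂
```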

Humans are better at seeing what's missing, better at resisting premature closure, and better at reducing scope using many different approaches. And because we operate in linear time and there are a lot of very different agents, we collectively nibble away at complex problems over time.
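One mechanical stand-in for "reducing scope" is bisection, git-bisect style: keep halving the search space until the problem has nowhere left to hide. The is_bad predicate below is a placeholder you'd supply yourself (e.g. "does this commit fail the test?"); the sketch assumes the history flips from good to bad exactly once.

```python
# Sketch: isolate the first bad element in O(log n) checks.
from typing import Callable, Sequence, TypeVar

T = TypeVar("T")

def first_bad(items: Sequence[T], is_bad: Callable[[T], bool]) -> T:
    """Assumes items run good...good bad...bad; finds the boundary."""
    lo, hi = 0, len(items) - 1
    while lo < hi:
        mid = (lo + hi) // 2
        if is_bad(items[mid]):
            hi = mid          # the boundary is at mid or earlier
        else:
            lo = mid + 1      # everything up to mid is still good
    return items[lo]

commits = list(range(100))                     # pretend these are commit ids
print(first_bad(commits, lambda c: c >= 37))   # -> 37
```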

I mean, compared on a telomere-to-telomere basis, structural differences mean two people can be as little as ~93% genetically similar.

We also have different brain structures, and I assume they don't all run on a single algorithmic substrate: visual reasoning about words, semantic reasoning about colors, synesthesia, the weird handoff between hemispheres, parts of the brain that handle logic better, parts that handle illogic better. We can introspect on our own semantic saturation; we can introspect that we've lost the plot. We get weird feelings when something seems logically missing, and we can dive on that part and then zoom back out.
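Here's a crude, made-up proxy for that saturation check: track how many of each new chunk's word bigrams you've already seen, and flag saturation when novelty collapses. The bigram trick and the 0.2 threshold are arbitrary choices for the sketch, not a real metric.

```python
# Sketch: flag when incoming text stops carrying new information.
def novelty(chunk: str, seen: set[tuple[str, str]]) -> float:
    """Fraction of this chunk's word bigrams we haven't seen before."""
    words = chunk.lower().split()
    grams = list(zip(words, words[1:]))
    if not grams:
        return 0.0
    fresh = sum(1 for g in grams if g not in seen)
    seen.update(grams)
    return fresh / len(grams)

seen: set[tuple[str, str]] = set()
for chunk in ["the cache misses on write",
              "the cache misses on write",       # pure repetition
              "unless the key is prefetched"]:
    n = novelty(chunk, seen)
    print(f"{n:.2f}  {'saturated, zoom out' if n < 0.2 else 'still learning'}")
```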

There's a whole bunch of shit the brain does because it has a plurality of structures for handling different types of data processing. And even then, the message format seems flexible enough that you can shove word data into a visual processing part and see what falls out, and this happens without us thinking about it explicitly.
