The claim that everything is there does not imply infinite, or even large capacity.
Consider an exponentially weighted moving average: you can keep feeding in new data forever, and the memory requirement stays constant.
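A minimal sketch of that point in plain Python (the smoothing factor `alpha = 0.1` is an arbitrary illustrative choice, not anything from the discussion): the running average is a single float no matter how many samples go in, and any one sample's contribution decays geometrically without ever reaching exactly zero.

```python
# Minimal sketch of an exponentially weighted moving average (EWMA).
# alpha = 0.1 is an arbitrary smoothing factor chosen for illustration.

def ewma(samples, alpha=0.1):
    """Fold a stream of any length into a single running value: constant memory."""
    avg = None
    for x in samples:
        avg = x if avg is None else alpha * x + (1 - alpha) * avg
    return avg

# A million data points go in, but the stored state is still one float.
print(ewma(i % 7 for i in range(1_000_000)))

# The very first sample enters with weight 1; after n further updates its
# remaining weight is (1 - alpha)**n: vanishingly small, but never zero.
for n in (10, 100, 1000):
    print(n, 0.9 ** n)
```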
The brain stores information as a weighted graph, which acts as a form of lossy compression. When you gain new information, the graph weights are updated, further compressing what was already there. Eventually what you can recall becomes useless, which is what we would consider forgotten, and the contribution of any single data point becomes insignificant, but it never reaches zero.
> The claim that everything is there does not imply infinite, or even large capacity.
It implies enough capacity to store everything. But what you describe is not storing everything.
> lossy compression
Which means you're not storing all the information. You're not storing everything.
> When you gain more information, graph weights are updated, essentially compressing what was already in there further.
In other words, each time you store a new memory, you throw some old information away, which is exactly what the person I was responding to said does not happen.
And this description is based on what?