This article is like a quick street rap: lots of rhythm, not much thesis. Big on tone, light on analysis... no actual thesis beyond a feel-good factor. I want these 5 min back.
> I want these 5 min back.
Tell me, what is it you plan to do
with your five wild and precious minutes?
On the other hand, as somebody not well-read in AI, I found it a rather intuitive explanation of why pruning helps avoid the overfitting I learned about when I first touched neural networks in the '10s.
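For anyone who hasn't seen pruning in practice, here's a rough PyTorch sketch (my own illustration, not from the article) of magnitude pruning: zero out the smallest weights so the network ends up with fewer effective parameters, which can act as a regularizer. The layer sizes and the 30% amount are made up for the example.

```python
# Minimal sketch of magnitude (L1) pruning in PyTorch.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Hypothetical tiny network; sizes chosen only for illustration.
model = nn.Sequential(
    nn.Linear(64, 128),
    nn.ReLU(),
    nn.Linear(128, 10),
)

# Zero the 30% smallest-magnitude weights in each Linear layer.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.3)

# Fraction of weights now zero, as a rough sparsity measure.
linears = [m for m in model.modules() if isinstance(m, nn.Linear)]
zeros = sum((m.weight == 0).sum().item() for m in linears)
total = sum(m.weight.numel() for m in linears)
print(f"sparsity: {zeros / total:.2f}")
```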
Sure, this could’ve been a paragraph, but it wasn’t. I don’t think it’s particularly offensive for that.
Do you think a GPT that has already been trained on something "feels" the same way when reading it a second time?