Something is lost as well if you do 'research' by just asking an LLM. On the path to finding your answer in an encyclopedia or academic paper, you discover so many things you weren't specifically looking for. Even if you don't fully absorb everything, there's a good chance the memory will be triggered later when needed: "Didn't I read about this somewhere?"
Yep, this is why I just don’t enjoy or get much value from exploring new topics with LLMs. Living in the Reddit factoid/listicle/TikTok explainer internet age, my goal for years (going back well before ChatGPT hit the scene) has been to seek out high-quality literature or academic papers on the subjects I’m interested in.
I find it so much more intellectually stimulating than most of what I find online. Reading, say, a 600-page book about some specific historical event gives me so much more perspective, and exposure to aspects I never would have thought to ask about on my own, or that would have been elided in a few-sentence summary.
I have gotten some value out of asking LLMs for book recommendations, mostly as a starting point: I can prune a list of 10 books down to 2 or 3 after doing my own research on each suggestion. But talking to a chatbot to learn about a subject just doesn’t do anything for me beyond basic Q&A, where I simply need a (hopefully) correct answer and nothing more.
LLMs hallucinate far too often for me to put any trust in them as research aids.