What use cases are people using local LLMs for? Have you created any practical tools that actually increase your efficiency? I've been experimenting a bit but find it hard to get inspiration for useful applications
I have a signal tracer that evaluates unusual trading volumes. Given those signals, my local agent pulls news items through an API and makes an assessment of what's happening. This helps me tremendously. If I did this through a remote app, I'd be spending several dollars a day, so I run it on existing hardware instead.
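Roughly, the assessment step looks like this (a minimal sketch against a local Ollama endpoint; my actual setup differs in the details, and the model name is just an example):

```python
# Minimal sketch, not the actual implementation: assumes an Ollama server on
# localhost and a ticker already flagged by the volume tracer. The model name
# and prompt wording are placeholders.
import requests

def assess_signal(ticker: str, headlines: list[str]) -> str:
    prompt = (
        f"Unusual trading volume was detected for {ticker}.\n"
        "Recent headlines:\n"
        + "\n".join(f"- {h}" for h in headlines)
        + "\n\nIn two or three sentences, what is the most likely explanation?"
    )
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "llama3.1:8b", "prompt": prompt, "stream": False},
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["response"]
```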
Thank you, this is a great example!
Do you want to share it?
Anyone who does not want to leak their data? I am actually surprised that people are ok with trusting their secrets to a random foreign company.
But what do you do with these secrets? Like tagging emails, summarizing documents?
A document management system is an easy example. Let's say medical, legal, and tax documents.
Thank you, but what do you use the LLM for? Writing new documents based on previous ones? Tagging/categorization/summarization/lookup? RAG? Extracting structured data from them?
Me personally, I'm using paperless-ngx to manage documents.
I use Ollama to generate a document title of 8 words or less (sketch below), then go through and make any manual edits at my leisure. Saves me time, which I appreciate!
Paperless-ngx already does a pretty good job auto-tagging; I think it uses some built-in classifiers? Not 100% sure.
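The title step is basically one prompt. A rough sketch, assuming a local Ollama server; the model name is a placeholder, and how you pull the document text out of paperless-ngx (export, API, etc.) is up to you:

```python
# Rough sketch of the title-suggestion step; assumes an Ollama server on
# localhost. The model name is illustrative, and doc_text is expected to be
# the document's extracted text, obtained elsewhere.
import requests

def suggest_title(doc_text: str) -> str:
    prompt = (
        "Suggest a document title of 8 words or less for the following "
        "document. Reply with the title only.\n\n" + doc_text[:4000]
    )
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "llama3.1:8b", "prompt": prompt, "stream": False},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["response"].strip()
```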
A random foreign company is far better than a big Five Eyes country, which siphons everything to the NSA and can use it against you.
The Chinese intelligence agencies, by contrast, have little power over you.
No one cares about your 'secrets' as much as you think. They're only potentially valuable if you're doing unpatented research or if they can be tied back to you as an individual. The rest is paranoia.
Having said that, I'm paranoid too. But if I wasn't they'd have got me by now.
Step back for a bit. Some people actually work with sensitive documents as part of their JOB. Like accountants, lawyers, people in the medical industry, etc.
Sending a document with a social security number to OpenAI is just a dumb idea. As an example.
I do a lot of data cleaning as part of my job, and I've found that small models can be very useful for that, particularly in the face of somewhat messy data.
You can, for instance, use them to extract information such as postal codes from strings, or to translate and standardize country names written in various languages (e.g. Spanish, Italian, and French to English); see the sketch below.
I'm sure people will have more advanced use cases, but I've found them useful for that.
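As a toy example of the country-name case (a sketch assuming a small model served by a local Ollama instance; the model name and prompt are illustrative, not a recommendation):

```python
# Toy sketch: normalize messy country names to standard English names with a
# small local model. Assumes an Ollama server on localhost; the model name is
# illustrative only.
import requests

def standardize_country(raw: str) -> str:
    prompt = (
        "Return only the standard English name of the country referred to "
        f"by this string, nothing else: {raw!r}"
    )
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={"model": "qwen2.5:3b", "prompt": prompt, "stream": False},
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()["response"].strip()

# e.g. standardize_country("Allemagne") -> "Germany"
#      standardize_country("Estados Unidos") -> "United States"
```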
Also worth it for the speed of AI autocomplete in coding tools; the round trip to my graphics card is much faster than going out over the network.
Anyone actually doing this? DeepSeek-R1 32B on Ollama can't run on an RTX 4090, and the 17B is nowhere near as good at coding as the OpenAI or Claude models.
I specified autocomplete; I'm not running a whole model, asking it to build something, and awaiting an output.
DeepSeek-Coder-V2 is fine for this. I occasionally use a smaller Qwen3 (I forget exactly which at the moment... set and forget) for some larger queries about code. Given my fairly light use cases and pretty small contexts, it works well enough for me.
Any company with any type of sensitive data will love to have anything LLM-related done locally.
A recent example: a law firm hired this person [0] to build a private AI system for document summarization and Q&A.
[0] https://xcancel.com/glitchphoton/status/1927682018772672950
I use the local LLM-based autocomplete built into PyCharm and I'm pretty happy with it