On a related note, has anyone found a good local LLM option for working with Excel files?

Here's my use case: I have a set of responses from a survey and want to perform sentiment analysis on them, classify them, etc. Ideally, I'd like to feed them one at a time to a local LLM with a prompt like: "Classify this survey response as positive, negative, or off-topic...etc".

If I dump the whole spreadsheet into ChatGPT, I've found that because of the context window it can get "lazy"; with a local LLM, I could literally prompt it one row at a time to accomplish my goal, even if that costs a little more GPU and wall-clock time.

However, I can't find anything that works off the shelf like this. It seems like a prime use case for local models.
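In case it helps, here's a minimal sketch of the row-at-a-time loop against Ollama's REST API. Assumptions on my part: Ollama is running on its default port with some model pulled (I used the name "llama3" as a stand-in), and the survey responses are in the first column of a CSV export of the spreadsheet called responses.csv.

```python
# Sketch: classify survey responses one row at a time with a local model
# via Ollama's REST API. Stdlib only, no extra packages needed.
import csv
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint
MODEL = "llama3"  # assumption: substitute whatever model you've pulled

def build_prompt(response_text: str) -> str:
    """One row per request keeps each prompt tiny, so no context-window 'laziness'."""
    return (
        "Classify this survey response as positive, negative, or off-topic. "
        "Answer with exactly one word.\n\n"
        f"Response: {response_text}"
    )

def classify(response_text: str) -> str:
    payload = json.dumps({
        "model": MODEL,
        "prompt": build_prompt(response_text),
        "stream": False,  # return one JSON object instead of a token stream
    }).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"].strip().lower()

if __name__ == "__main__":
    with open("responses.csv", newline="") as f:
        for row in csv.reader(f):
            print(row[0][:40], "->", classify(row[0]))
```

It's slow compared to batching, but each call is independent, so you can rerun just the rows that came back malformed.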

Cellm + Ollama?

https://docs.getcellm.com/models/local-models

That looks like a great fit! Not sure how I missed it, but I appreciate the link.

Don't know about Excel, but for Google Sheets you can ask ChatGPT to write you an Apps Script custom function, e.g. CALL_OPENAI, then pass variables into it: =CALL_OPENAI("Classify this survey response as positive, negative, or off-topic: "&A1)

Sheets also has an `AI` formula now that you can use to invoke Gemini models directly.

When I tried the Gemini/AI formula it didn't work very well. gpt-5 mini or nano are cheap and generally do what you want if you're asking something straightforward about a piece of content you give them. You can also supply a JSON schema to make the results more deterministic.