I got tired of the overhead required to run even a simple data analysis - cloud setup, ETL pipelines, orchestration, cost monitoring - so I built a fully local data stack/IDE where I can write SQL or Python, run it, see results, and iterate quickly and interactively.

You get a data-lake-style catalog, zero ETL, lineage, versioning, and analytics running entirely on your machine. You can import from a database, webpage, CSV, etc., and query in natural language or do your own work in SQL/PySpark. Connect to local models like Gemma or cloud LLMs like Claude for querying and analysis. You don't have to set up local LLMs yourself; that support comes built in.
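To illustrate the import-then-query loop described above, here is a minimal stand-in using Python's built-in sqlite3 module. This is not Nile's actual engine or API (which also supports PySpark); the table and data are made up for the example.

```python
# Hypothetical sketch of the workflow: load a CSV into a local SQL
# engine, then iterate with plain SQL. Uses stdlib sqlite3 as a
# stand-in for the tool's own engine.
import csv
import io
import sqlite3

# Stand-in for an imported CSV file.
csv_data = io.StringIO("city,sales\nOslo,120\nLima,95\nOslo,40\n")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (city TEXT, amount INTEGER)")
reader = csv.reader(csv_data)
next(reader)  # skip the header row
conn.executemany("INSERT INTO sales VALUES (?, ?)", reader)

# Query interactively and inspect results.
for city, total in conn.execute(
    "SELECT city, SUM(amount) FROM sales GROUP BY city ORDER BY city"
):
    print(city, total)
```

The point is the tight loop - import, query, look at results, refine the SQL - without any cloud setup in between.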

This is completely free. No cloud account required.

Download the software - https://getnile.ai/downloads

Watch a demo - https://www.youtube.com/watch?v=C6qSFLylryk

Check the code repo - https://github.com/NileData/local

This is still early and I'd genuinely love your feedback on what's broken, what's missing, and if you find this useful for your data and analytics work.

What's the difference between this and asking Claude to do data analysis?

Two things:

1. You may not want to expose bits and pieces of your data and metadata to an LLM, and you don't want your data used for training. If you use an LLM running on your machine, as you can here, you're covered.

2. Claude can do a lot of stuff, but doing multi-step analysis consistently and reliably is not guaranteed due to the non-deterministic nature of LLMs - it may take a different route every time. Nile local offers a set of data primitives like query, build-pipe, discover, etc. that reduce the non-determinism and bring reliability and transparency (how the answer was derived) to the data analysis.

Can I run it on my MacBook? Do I need to set up an LLM myself?

Yes. I'd recommend a machine with at least 16 GB of RAM, though it did run on an 8 GB MacBook Air - the LLM assist just lagged.

You don't need to set up an LLM locally; the tool handles that. You can choose which model to use - Gemma and Qwen are supported now.
