"Usually"? I'm not saying there are literally no computers in existence that might have this much space on a single filesystem, but...has there ever been a known case of someone hitting this limit with a single SQLite file?
"Usually"? I'm not saying there are literally no computers in existence that might have this much space on a single filesystem, but...has there ever been a known case of someone hitting this limit with a single SQLite file?
That's just ten 30TB HDDs. Throw in two more for redundancy and pool them in a single ZFS raidz2 (a fancy RAID6). At about $600 per drive that's just $7,200, or half that if you go with 28TB refurbished drives (throw in another drive to make up for the lost capacity). That's well within the realm of lots of people's hobby projects (mostly the people who end up on /r/datahoarder). If you aren't into home-built NAS hardware, you can even do this with stock Synology or QNAP devices.
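Back-of-the-envelope, those numbers check out. A quick sketch (drive count, size, and price taken from above; raidz2 modeled simply as two drives' worth of parity per vdev):

    # Rough raidz2 capacity/cost check (ignores TB-vs-TiB and filesystem overhead)
    drives, size_tb, price_usd = 12, 30, 600
    usable_tb = (drives - 2) * size_tb       # raidz2 keeps two drives for parity
    print(f"{usable_tb} TB usable for ${drives * price_usd}")  # 300 TB for $7200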
The limit is more about how much data you want to keep in sqlite before switching to a "proper" DBMS.
Also, the limit above assumes someone had the foresight to know their database would be huge and picked the maximum 64KiB page size up front. In practice most SQLite files use the default page size of 4096 bytes, or 1024 if the file was created before SQLite 3.12.0 (2016), when the default changed. That caps the file at 17.6TB or 4.4TB respectively.
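For the curious, a minimal sketch of where those numbers come from, using Python's built-in sqlite3 module ("example.db" is just a placeholder path; 4294967294 is SQLite's documented hard ceiling on page count):

    import sqlite3

    MAX_PAGE_COUNT = 4_294_967_294  # SQLite's hard upper limit on pages per file

    conn = sqlite3.connect("example.db")
    page_size = conn.execute("PRAGMA page_size").fetchone()[0]
    max_bytes = page_size * MAX_PAGE_COUNT
    print(f"page_size={page_size}, theoretical max ~{max_bytes / 1e12:.1f} TB")
    # 4096 -> ~17.6 TB, 1024 -> ~4.4 TB, 65536 -> ~281.5 TB
    conn.close()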
Last week I threw together an 840TB system to do a data migration: a $1,500 used 36-bay 4U chassis and 36 refurbished Exos X28 drives in 3×12 raidz2, i.e. 3 vdevs × 10 data drives × 28TB = 840TB usable. About $15,000 all in.
Where did you source the drives?
Never underestimate the ability of an organization to throw money at hardware and run things _far_ past their engineered scale, as long as performance stays good enough that nobody has to make the critical infrastructure changes that, while necessary, would take real engineering.
Though to be fair to those organizations, it's amazing the performance someone can get out of a quarter million dollars of off-the-shelf server gear. Just imagine how much RAM and enterprise-grade flash that budget buys on top of AMD's or Intel's highest-bin CPUs!
Poking around for only a minute, the largest SQLite file I could find is 600GB https://www.reddit.com/r/learnpython/comments/1j8wt4l/workin...
The largest filesystems I could find are ~1EB and 700PB at Oak Ridge.
FWIW, I took the ‘usually’ to mean that the theoretical file size limit on a machine is usually smaller than the theoretical SQLite limit. It doesn’t necessarily imply that anyone’s hit the limit.
Wondered the same thing. That's a lot of data for just one file!
Did a full-day deep dive into SQLite a while back; funny how one tiny database engine runs the whole world: phones, AI, your fridge, your face... and, like, five people keep it alive.
Blows my mind.
> I'm not saying there are literally no computers in existence that might have this much space on a single filesystem
I don't use it for sqlite, but having multi-petabyte filesystems, in 2025, is not rare.
With block-level compression you might manage it, but you'd have to be trying for it specifically.