It might not be common in typical software shops. I work in manufacturing and our database has multiple tables with hundreds of columns.
I’m working on migrating an IBM Maximo database from the late 90s to a SQL Server deployment on my current project. I’m also charged with updating the schema to a more maintainable and extensible design. In the manufacturing and refurbishing domain, 200+ column tables are the norm. Very demoralizing.
What's in them?
Data from measurement tools. Everything about the tool configuration, time of measurement, operator ID, usually a bunch of electrical data (we make laser diodes) like current, potential, power, and a bunch of emission related data.
I think I'd rather work with very wide tables than the entity-attribute-value approach - which seems like a good idea but rapidly becomes a mess...
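To make the comparison concrete, here's a minimal sketch of the entity-attribute-value (EAV) pattern being described, using in-memory SQLite; the table and column names are illustrative, not from any real schema:

```python
import sqlite3

# Hypothetical EAV layout: one narrow table holds every attribute of every
# measurement as (entity, attribute, value) rows instead of wide columns.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE measurement (id INTEGER PRIMARY KEY, tool TEXT, taken_at TEXT);
CREATE TABLE measurement_attr (
    measurement_id INTEGER REFERENCES measurement(id),
    attr  TEXT,   -- e.g. 'current_mA', 'power_mW'
    value TEXT    -- everything is stored as text: a classic EAV weakness
);
""")
conn.execute("INSERT INTO measurement VALUES (1, 'probe-7', '2024-01-15')")
conn.executemany(
    "INSERT INTO measurement_attr VALUES (?, ?, ?)",
    [(1, "current_mA", "35.2"), (1, "power_mW", "4.8")],
)
# Reassembling one logical record means pivoting attribute rows back
# into columns, which is where the mess starts at query time.
rows = conn.execute(
    "SELECT attr, value FROM measurement_attr WHERE measurement_id = 1"
).fetchall()
print(dict(rows))  # {'current_mA': '35.2', 'power_mW': '4.8'}
```

The flexibility is real (new attributes need no schema change), but you lose column types, constraints, and simple queries, which is why it "rapidly becomes a mess".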
Columns named Property1 through Property20 (or more) are one example. There are better ways to do it, but I have seen columns designated for storing ‘anything’.
Sounds like a generic form of single table inheritance. I honestly don't see any other way to do it (punting to a JSON field is effectively the same thing) when you have potentially thousands of parts, each with its own super specific relevant attributes.
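The "punt to a JSON field" variant mentioned above can be sketched like this: fixed columns for the attributes every part shares, plus one TEXT column of JSON for the part-specific extras. Again a hypothetical layout, with made-up names and values:

```python
import json
import sqlite3

# Hypothetical single-table-inheritance-style layout: shared columns plus
# a JSON blob ("extra_attrs") for each part type's specific attributes.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE part (id INTEGER PRIMARY KEY, name TEXT, extra_attrs TEXT)"
)
conn.execute(
    "INSERT INTO part VALUES (1, ?, ?)",
    ("laser-diode-x", json.dumps({"wavelength_nm": 808, "threshold_mA": 22})),
)
# The application decodes the blob; the database itself knows nothing
# about these keys, so there are no types or constraints on them.
(blob,) = conn.execute(
    "SELECT extra_attrs FROM part WHERE id = 1"
).fetchone()
attrs = json.loads(blob)
print(attrs["wavelength_nm"])  # 808
```

Engines with JSON support (SQL Server, Postgres, SQLite's json1) can index and query inside the blob, which softens the trade-off, but schema enforcement for those attributes still lives in application code.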
I've worked on multiple products that had a concept of "custom fields" and did it this way too.