A few years ago I stumbled upon a database design approach, but for the life of me I can't find any information on it.
Its main feature was normalizing tables so that every "property" of an entity was stored in its own table. E.g. a user entity, normally stored in a single table with an id, email and other fields, was split into a table per field. So a user's email address would have been stored in a table containing the current email address, a datetime (when the address was set or last updated) and a user id or similar. The model tracked every change to a user's email address by inserting new records into this table with updated timestamps. (Not sure if this makes sense.)
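Something like this, a minimal sketch using SQLite with made-up table and column names, just to illustrate the shape (the tool I'm remembering generated MSSQL schemas, not SQLite):

```python
import sqlite3

# Sketch of the attribute-per-table design described above, with
# hypothetical names. Each "property" of the user entity lives in its
# own table with a timestamp; history is kept by inserting new rows
# rather than updating existing ones.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE user_anchor (
    user_id INTEGER PRIMARY KEY
);
CREATE TABLE user_email (
    user_id    INTEGER NOT NULL REFERENCES user_anchor(user_id),
    email      TEXT    NOT NULL,
    changed_at TEXT    NOT NULL,  -- when this value was set/updated
    PRIMARY KEY (user_id, changed_at)
);
""")

# A change of email address is a new row, not an UPDATE.
conn.execute("INSERT INTO user_anchor (user_id) VALUES (1)")
conn.execute("INSERT INTO user_email VALUES (1, 'old@example.com', '2020-01-01')")
conn.execute("INSERT INTO user_email VALUES (1, 'new@example.com', '2021-06-15')")

# The current email is simply the row with the latest timestamp.
current = conn.execute("""
    SELECT email FROM user_email
    WHERE user_id = 1
    ORDER BY changed_at DESC LIMIT 1
""").fetchone()[0]
print(current)  # new@example.com
```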
This approach was documented online, and there was an online tool to visualize such a model (as a graph) and to export the visually created model as an SQL schema (with full support for MSSQL).
Anyone have an idea about the name of this design approach?
Edit: I remember that the author of this approach (or of the visual modeller that is/was available online) mentioned that the tool could produce a database schema (or "optimized JOINs") for MSSQL, since it has (had) the ability to efficiently "prune tables".
Edit: By "prune tables" I meant "eliminate tables".
Best Answer
Your description could be of SQL Server Columnstore indexes.