SQLite database table with thousands of columns

database-design, index, performance, sqlite

I am using sqlite3 in an application to conduct an analysis of distances between sets of points. Distances between these points have been precomputed using Dijkstra's algorithm, leaving me with a table like:

    point | p0    p1    p2  ...  p1500
    ------|--------------------------
    p0    | 0     100   15  ...  X
    p1    | 100   0     10  ...  Y
    p2    | 15    10    0   ...  Z
    p1500 | X     Y     Z   ...  0

I will be doing lots of I/O on this table. For tens of thousands of records I need to look up a set of points, combine the distances with other factors, and determine which point is best for that record, then repeat.

My question is… in terms of performance for a database (and SQLite in particular), would it be better to leave the table as is, or to transform it into a long format?

    aPoint bPoint | dist
    --------------|-----
    p0     p0     | 0
    p0     p1     | 100
    p0     p2     | 15
    p0     p1500  | X
    p1     p0     | 100
    p1     p1     | 0
    p1     p2     | 10
    p1     p1500  | Y
    p2     p0     | 15
    p2     p1     | 10
    p2     p2     | 0
    p2     p1500  | Z
    p1500  p0     | X
    p1500  p1     | Y
    p1500  p2     | Z
    p1500  p1500  | 0
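A minimal sketch of the long format in SQLite, with illustrative table and column names; the composite primary key means each point-pair lookup is an index seek rather than a scan:

```python
import sqlite3

# In-memory database for illustration; a file path would be used in practice.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE distances (
        a_point TEXT NOT NULL,
        b_point TEXT NOT NULL,
        dist    REAL NOT NULL,
        PRIMARY KEY (a_point, b_point)
    ) WITHOUT ROWID
""")

# Toy data mirroring the example above.
pairs = [
    ("p0", "p0", 0), ("p0", "p1", 100), ("p0", "p2", 15),
    ("p1", "p0", 100), ("p1", "p1", 0), ("p1", "p2", 10),
    ("p2", "p0", 15), ("p2", "p1", 10), ("p2", "p2", 0),
]
conn.executemany("INSERT INTO distances VALUES (?, ?, ?)", pairs)

# Fetch all distances from one point to a candidate set in a single query.
rows = conn.execute(
    "SELECT b_point, dist FROM distances "
    "WHERE a_point = ? AND b_point IN (?, ?) ORDER BY dist",
    ("p0", "p1", "p2"),
).fetchall()
print(rows)  # [('p2', 15.0), ('p1', 100.0)]
```

`WITHOUT ROWID` stores the table clustered on the `(a_point, b_point)` key itself, which suits a table that is only ever queried by that pair.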

Best Answer

Regardless of the database you are using, you normally don't add thousands of fields to a table. Your second model is more normalized (and easy to index) and should perform better (I'm not an expert, though). My opinion is based on the usual limits of databases, which are not designed to work that way: you're going to hit some internal limit of the engine, or misuse the engine relative to what it was designed for. SQLite in particular defaults to a maximum of 2000 columns per table (SQLITE_MAX_COLUMN), so a 1500-column matrix is already close to that ceiling.
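As a quick check of the indexing claim, `EXPLAIN QUERY PLAN` on a pair lookup shows SQLite seeking through the composite primary-key index instead of scanning the table (names here are illustrative, matching the long format in the question):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE distances (
        a_point TEXT NOT NULL,
        b_point TEXT NOT NULL,
        dist    REAL NOT NULL,
        PRIMARY KEY (a_point, b_point)
    )
""")

plan = conn.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT dist FROM distances WHERE a_point = ? AND b_point = ?",
    ("p0", "p1"),
).fetchall()
for row in plan:
    # Exact wording varies by SQLite version, but the plan reports a
    # SEARCH using the automatic primary-key index on (a_point, b_point).
    print(row[-1])
```

If the plan said `SCAN` instead of `SEARCH`, the query would be reading every row; that is exactly the degradation the long format with a proper key avoids.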

You could also explore the functions and data types of more modern versions of MySQL, MariaDB, and others, which have built-in types for points and similar spatial data.