Today, someone was helping me write a function in PostgreSQL. I wanted the function to return two columns of arrays of 100+ geometries each, instead of delimited text, to make it easier to understand what is being returned. His response was that arrays would use more resources than text, and that I should focus on my programming instead of wasting resources.
I don't know enough about PostgreSQL to know whether the use of arrays in this case would be a concern.
My questions are: do PostgreSQL arrays use more memory than text? And if so, is that a significant enough concern to drop arrays in large queries?
Best Answer
Evidence?
If he's arguing performance against what'd otherwise be a clarity and correctness improvement, the onus is on him to provide evidence.
Delimited text is usually the wrong way to do it - see e.g. this related question.
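To illustrate why (a minimal sketch; the table and column names are hypothetical, not from your schema):

```sql
-- Hypothetical table standing in for your geometry data.
CREATE TABLE regions (id int, label text);

-- Delimited text: element boundaries are gone, and a comma inside
-- a value silently corrupts the result.
SELECT string_agg(label, ',') FROM regions;

-- Array: each element keeps its type and its boundary,
-- and unnest() recovers the individual rows cleanly.
SELECT array_agg(label) FROM regions;
SELECT unnest(array_agg(label)) FROM regions;
```

With delimited text the client has to re-parse the string and guess at escaping; with an array the structure survives the round trip.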
It depends on how you're constructing the arrays. Try to create them in a single pass using `array_agg` or an `ARRAY[]` constructor or similar. Iteratively building them with concatenation in a loop tends to be very slow, because at the SQL level arrays are immutable, so they have to be copied to append to them. Of course, the same is true of `text` at the SQL level, and the same principles apply.

It depends on the underlying type. In general, if t1 uses more memory than t2, then an array of t1 uses more memory than an array of t2.
The internal storage of arrays depends a little on how they're created, but in general they add little or no memory overhead over the base type.
Yes, but no more of a concern than using delimited text.
Try to return result sets, not huge single results.
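For example, rather than one row carrying huge arrays, a set-returning function lets the client consume rows incrementally (a hedged sketch; `regions` and its columns are assumed, and `geometry` presumes PostGIS is installed):

```sql
-- Instead of: RETURNS TABLE(ids int[], geoms geometry[])
-- returning one enormous row, prefer a set of ordinary rows:
CREATE FUNCTION get_geoms()
RETURNS TABLE(id int, geom geometry)
LANGUAGE sql AS $$
    SELECT id, geom FROM regions;
$$;

SELECT * FROM get_geoms();
```

The planner can stream and the client can fetch in batches, neither of which is possible when everything is packed into a single array value.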