When developing software for large data sets (billions of records, terabytes in size), the way you store your data in memory is critical – and you want your data in memory if you want to analyse it quickly (in minutes rather than days).
Any data structure that stores a pointer for each data element quickly becomes unworkable due to the overhead of those pointers. On a 64-bit system each pointer is 8 bytes, so one pointer per element across a billion records burns nearly 8 GB of memory on pointers alone.
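A minimal sketch of that arithmetic (the `Node` type here is hypothetical, not from the original post): a pointer-linked node per record carries an 8-byte pointer, and alignment padding typically doubles the waste for a small payload.

```cpp
#include <cstdint>
#include <cstdio>

struct Node {            // pointer-chased layout: one next-pointer per record
    int32_t value;       // 4-byte payload
    Node*   next;        // 8 bytes on a 64-bit system (plus 4 bytes padding)
};

int main() {
    const uint64_t records   = 1'000'000'000ULL;          // one billion records
    const uint64_t ptr_bytes = records * sizeof(Node*);   // pointers only
    std::printf("pointer overhead alone: %.1f GB\n", ptr_bytes / 1e9); // ~8.0 GB
    std::printf("sizeof(Node) = %zu bytes for a 4-byte payload\n",
                sizeof(Node));                            // 16 bytes: 4x the data
    return 0;
}
```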
Thus there is a need for compact data structures that still offer fast access.
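As one hedged illustration of that trade-off (my example, not the post's): a contiguous array keeps O(1) random access while spending zero bytes on per-element pointers, because each element's address is computed from its index.

```cpp
#include <cstdint>
#include <vector>

int main() {
    // Scaled down for a demo; the layout is identical at a billion records,
    // where this column would occupy ~4 GB (4 bytes/record) rather than
    // ~16 GB for one pointer-linked node per record.
    std::vector<int32_t> values(10'000'000);
    values[1'234'567] = 42;        // element i lives at base + i * 4 bytes: O(1)
    return values[1'234'567] == 42 ? 0 : 1;
}
```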
via siganakis.com