ricardogg's blog

By ricardogg, history, 3 months ago, In English

Hello everyone,

Recently I was trying to implement a working persistent segment tree, and I managed to solve a couple of problems with it.

However, for one problem I needed a persistent segment tree with $$$100000$$$ values in the leaves and two long long values in each node, while the memory limit is only $$$64$$$ megabytes. At first I wrote a version that uses pointers; later I replaced the pointers with integer indices for the nodes, but neither version fits in the memory limit.

What is the best way to implement such a tree? Are there any tricks to reduce the memory usage?
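For reference, a common memory-lean layout (a sketch with illustrative names, not any particular submission) stores nodes in flat global arrays indexed by int, with index $$$0$$$ serving as a shared null node, so untouched subtrees never allocate anything. This version keeps one long long aggregate per node; a second one would be added the same way.

```cpp
// Sketch of an index-based persistent segment tree. Nodes live in flat
// global arrays; index 0 is a shared "null" node whose sum is 0, so no
// node is ever allocated for untouched subtrees.
const int MAXNODES = 100000 * 18 + 5; // ~ updates * ceil(log2 n) new nodes
int lc[MAXNODES], rc[MAXNODES];       // child indices, 0 = null
long long sum[MAXNODES];              // aggregate stored in each node
int nodeCnt = 0;

int newNode(int l, int r, long long s) {
    int id = ++nodeCnt;
    lc[id] = l; rc[id] = r; sum[id] = s;
    return id;
}

// Returns the root of a new version where position pos gains val;
// only O(log n) fresh nodes are created, everything else is shared.
int update(int prev, int lo, int hi, int pos, long long val) {
    if (lo == hi) return newNode(0, 0, sum[prev] + val);
    int mid = (lo + hi) / 2;
    if (pos <= mid)
        return newNode(update(lc[prev], lo, mid, pos, val), rc[prev],
                       sum[prev] + val);
    return newNode(lc[prev], update(rc[prev], mid + 1, hi, pos, val),
                   sum[prev] + val);
}

long long query(int node, int lo, int hi, int l, int r) {
    if (node == 0 || r < lo || hi < l) return 0;
    if (l <= lo && hi <= r) return sum[node];
    int mid = (lo + hi) / 2;
    return query(lc[node], lo, mid, l, r) +
           query(rc[node], mid + 1, hi, l, r);
}
```

Each node here costs $$$4 + 4 + 8 = 16$$$ bytes; with a second long long it would be $$$24$$$, which is the figure used in the discussion below.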


3 months ago, # |

You are creating the segment tree nodes on a "when required" basis, right? Don't create unnecessary nodes. Also, maybe try "compressing" straight chains in your segment tree. It's an ugly implementation, though. (Just so you know: 4nlogn = 64mb precisely when n = 1e5.)

3 months ago, # |

How many updates are there?

If there are also $$$10^5$$$ queries, then I don't see how this breaks the memory limit. Each node contains 4 + 4 + 8 + 8 bytes (two ints and two long longs), and each query creates $$$\log n = 17$$$ new nodes. This is $$$10^5 \cdot 17 \cdot 24 / 2^{20} \approx 39$$$ megabytes. Are you sure your problem isn't elsewhere? Do you have other large arrays?

Anyway, you can try to pack the data closer together. The node indices are almost small enough to fit into a short; maybe you can use unused bits of your long longs to encode the node indices? Also, there's something called __attribute__((packed)) that can be useful in such hacking.
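One caveat with __attribute__((packed)) (a GCC/Clang extension): a node laid out as two ints followed by two long longs already has no padding, so packing only helps when the field order or mixed sizes leave gaps. A toy demonstration, with sizes assuming a typical 64-bit GCC/Clang target:

```cpp
#include <cstdint>

// Field order matters: a 4-byte index wedged between 8-byte values
// forces alignment padding.
struct Loose {
    std::int64_t a;
    std::int32_t left;   // 4 bytes of padding follow here
    std::int64_t b;
    std::int32_t right;  // 4 bytes of tail padding
};                       // sizeof == 32 on a typical 64-bit target

// packed removes the padding, at the cost of unaligned member access.
struct __attribute__((packed)) Tight {
    std::int64_t a;
    std::int32_t left;
    std::int64_t b;
    std::int32_t right;
};                       // sizeof == 24

// Simply ordering fields largest-first gets the same size with no
// unaligned-access penalty.
struct Reordered {
    std::int64_t a, b;
    std::int32_t left, right;
};                       // sizeof == 24
```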

  • »
    3 months ago, # ^ |

    There are no updates other than the ones I use to build the persistent segment trees, so there are $$$100000$$$ of them. There are more queries, but queries don't create new nodes.