It turned out that the current implementation of the --bounds option causes problems when you compile a
tile that covers a huge area but contains relatively little data for that area. In this case the area covers western Europe:
(37.880859375,-20.0830078125) to (60.556640625,2.98828125)
The problem is that mkgmap tries to load the data from the precompiled bounds for the full area to fill in the address information.
This takes a very long time and consumes large amounts of memory, so I am surprised that it worked with only 6800M heap.
A first improvement is to calculate the "real" bounds of the input data, that is, the intersection of the area
given by the (first) bounds statement in the input file and the bounding box of all nodes (r3861).
I've used Bernhard's data for an area.list and called splitter with the --split-file option and an input file
that contains a merge of Belgium + Netherlands.
The result is a 43100001.o5m file of ~17 MB, and the nodes in it cover the area
(49.49697017669678,-5.859103202819824) to (58.18458080291748,7.282497882843018)
The nodes in the west are from ferry lines and sea cables.
Now, the intersection of these two bounds is
(49.49697017669678,-5.859103202819824) to (58.18458080291748,2.98828125)
This is still large, but much smaller than the first one, and it loads within a reasonable time,
so the LocationHook finishes within 20 seconds.
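The intersection step from r3861 can be sketched as follows. This is a minimal, self-contained illustration using a hypothetical BBox class, not mkgmap's actual bounds classes; it reproduces the calculation above with the declared tile bounds and the bounding box of the nodes in the .o5m file:

```java
public class BoundsIntersection {
    /** Simple lat/lon bounding box: min/max latitude and longitude in degrees. */
    static final class BBox {
        final double minLat, minLon, maxLat, maxLon;

        BBox(double minLat, double minLon, double maxLat, double maxLon) {
            this.minLat = minLat; this.minLon = minLon;
            this.maxLat = maxLat; this.maxLon = maxLon;
        }

        /** Returns the intersection of two boxes, or null if they don't overlap. */
        BBox intersect(BBox o) {
            double loLat = Math.max(minLat, o.minLat);
            double loLon = Math.max(minLon, o.minLon);
            double hiLat = Math.min(maxLat, o.maxLat);
            double hiLon = Math.min(maxLon, o.maxLon);
            if (loLat >= hiLat || loLon >= hiLon)
                return null; // no overlap
            return new BBox(loLat, loLon, hiLat, hiLon);
        }
    }

    public static void main(String[] args) {
        // Declared tile bounds from the western Europe example above
        BBox declared = new BBox(37.880859375, -20.0830078125,
                                 60.556640625, 2.98828125);
        // Bounding box of the nodes actually present in the 43100001.o5m file
        BBox nodes = new BBox(49.49697017669678, -5.859103202819824,
                              58.18458080291748, 7.282497882843018);

        // Load precompiled bounds only for this smaller effective area
        BBox effective = declared.intersect(nodes);
        System.out.println("(" + effective.minLat + "," + effective.minLon
                + ") to (" + effective.maxLat + "," + effective.maxLon + ")");
    }
}
```

Running this prints the same effective area quoted above, (49.49697017669678,-5.859103202819824) to (58.18458080291748,2.98828125): the western and northern edges come from the node data, while the eastern edge is still clipped by the declared tile bounds.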
Of course this workaround doesn't help when the input file really contains (a few) nodes that span such a huge
bounding box, so I am working on a better algorithm now.
From: mkgmap-dev <[hidden email]> on behalf of svn commit <[hidden email]>
Sent: Wednesday, 22 March 2017 07:30:10
To: [hidden email]; [hidden email]
Subject: [mkgmap-dev] Commit r3861: improve performance of LocationHook when input file bounds data is huge.
Version mkgmap-r3861 was committed by gerd on Wed, 22 Mar 2017
improve performance of LocationHook when input file bounds data is huge.
Calculate the intersection of the bounds and the bounding box of all nodes
instead of using the bounds directly.