overall (in map.rb): 22.598376
overall (measured at client including transfer): 23.004521
So this means that almost all of the time (78%, i.e. 17.77 s of 22.60 s) is spent
converting the data to XML output. The time to transfer the data is practically
irrelevant (2%). Note that I didn't even use the gzip feature.
WHAT A SURPRISE! (Really, I didn't expect this... :-| )
So it's time to either use another XML library, use another platform (no
Ruby :-( ), or reactivate our discussion about CSV.
As for the CSV idea: we already have an object model that supports a fairly
linear expression of the data, so defining a transport mechanism should
not be that hard. And by the way: Ruby supports CSV in one of its standard
libraries.
Any volunteer for a transport format specification? (I'll do it if you
don't want to ;)
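For concreteness, here is a minimal sketch of what such a CSV transport could look like, using Ruby's standard csv library. The record layout (a type tag in the first column, then id/lat/lon for nodes and id/endpoints for lines) is purely an assumption on my part, not a spec:

```ruby
require "csv"

# Sketch only: the field layout is hypothetical, not taken from map.rb.
# Each row starts with a record-type tag: "N" = node, "L" = line segment.
nodes = [[1, 51.50, -0.10], [2, 51.60, -0.20]]  # [id, lat, lon]
lines = [[10, 1, 2]]                            # [id, from_node, to_node]

payload = CSV.generate do |out|
  nodes.each { |id, lat, lon| out << ["N", id, lat, lon] }
  lines.each { |id, a, b|     out << ["L", id, a, b] }
end

# The client splits the rows back apart by their type tag
# (CSV.parse yields all fields as strings):
rows = CSV.parse(payload)
parsed_nodes = rows.select { |r| r[0] == "N" }
parsed_lines = rows.select { |r| r[0] == "L" }
```

The single type-tag column keeps everything in one stream, so the server can emit rows in whatever order the query returns them.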
Any other comments?
> while I was bored waiting for the big party to start, I did some
> performance measurements of the code we currently use to retrieve map
> data. For this, I wrote a small Ruby script to generate a dataset for the
> test. You will find it in sql/make-random-data.rb
> Using this data, I tested the speed of the various steps within map.rb.
> This turned out as follows (the numbers are in seconds):
> dao.getnodes: 0.202771
> dao.getlines: 1.514142
> ox.to_s_pretty: 17.765789
> gc: 0.242098
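(As an aside: per-step numbers like the quoted ones can be collected with Ruby's stdlib Benchmark module. The sketch below uses a dummy workload in place of the real dao/ox calls, which I don't have in front of me:)

```ruby
require "benchmark"

# Sketch: time one named step and print it in the same style as the
# quoted output. The block body is a dummy workload standing in for
# e.g. dao.getnodes or ox.to_s_pretty.
def time_step(label)
  elapsed = Benchmark.realtime { yield }
  printf("%s: %f\n", label, elapsed)
  elapsed
end

t = time_step("dummy step") { 100_000.times { |i| i * i } }
```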
This isn't averaged, is it? Is this for 10 calls?
ox.to_s_pretty only gets called for browsers without gzip support. I don't
expect it is, but is plain to_s any faster?
> So it's time to either use another XML library, use another platform (no
> Ruby :-( ), or reactivate our discussion about CSV.
Yes, this seems like a good argument for either moving to CSV or generating
the XML with print statements...
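A rough sketch of the print-statement approach, assuming node records of [id, lat, lon]; the element and attribute names here are illustrative, not necessarily what map.rb actually emits:

```ruby
# Sketch of hand-rolled XML output: plain string interpolation instead
# of a document builder. Element/attribute names are assumptions.
def xml_escape(s)
  s.to_s.gsub("&", "&amp;").gsub("<", "&lt;")
        .gsub(">", "&gt;").gsub('"', "&quot;")
end

def nodes_to_xml(nodes)
  out = +"<map>\n"
  nodes.each do |id, lat, lon|
    out << %(  <node id="#{xml_escape(id)}" lat="#{lat}" lon="#{lon}"/>\n)
  end
  out << "</map>\n"
end

xml = nodes_to_xml([[1, 51.5, -0.1]])
```

Since the records are flat, nothing in the tree-building step is actually needed; escaping attribute values is the only part that has to be done carefully.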