I think a very good compromise between speed and size is still bzip2, so I will stick with bzip2. It seems to sit in the middle on compression time, decompression time, memory use, and output size.
Note that there is a huge difference between compression levels in xz (LZMA), while there is almost no difference between bzip2 levels. This benchmark uses the slowest 9th level. If you try lower levels, e.g. xz -3, it may give a better compression ratio and better compression speed than bzip2 and, being an LZ77-family algorithm, much better decompression speed. Here's an example with different levels: http://pokecraft.first-world.info/wiki/Quick_Benchmark:_Gzip...
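A rough sketch of that kind of side-by-side test (big.xml is just a placeholder for whatever you actually compress):

    # compare a fast xz level against bzip2's maximum level
    time xz -3 -k big.xml          # writes big.xml.xz; -k keeps the input file
    time bzip2 -9 -k big.xml       # writes big.xml.bz2
    ls -l big.xml.xz big.xml.bz2   # compare the resulting sizes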
Also, I recommend using parallel versions (pbzip2 or pxz) if you have more than one CPU core: compression is highly parallelizable, so if you have 2 cores, they can roughly halve the compression time.
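For example, with pbzip2 (assuming it's installed; it's meant as a drop-in replacement for bzip2):

    # uses all available cores by default
    pbzip2 -9 big.xml
    # or pin it to a specific number of processors
    pbzip2 -p2 -9 big.xml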
I don't know whether it makes a big difference, but you should also (and, frankly, so should I: I tend to use bzip2 or 7z without much testing) try to get data for your representative load. How often do you compress 100-megabyte or even one-gigabyte XML files?
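A quick-and-dirty way to test on your own data (sample.xml stands in for a file you actually work with; writing to stdout with -c works across old and new versions of all three tools):

    for tool in gzip bzip2 xz; do
        echo "== $tool =="
        time $tool -9 -c sample.xml > sample.xml.$tool
    done
    ls -l sample.xml sample.xml.*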
Also, if you are compressing lots of small files and won't often need to extract them individually, you could use what rar and 7z call solid archives: all files concatenated and then compressed as one stream (equivalent to .tar.gz or .tar.bz2).
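For example (logs/ is a made-up directory of small files):

    # solid style: one tar stream compressed as a whole, so the compressor
    # can exploit redundancy across files
    tar cf - logs/ | bzip2 -9 > logs.tar.bz2
    # the non-solid alternative compresses every file in isolation:
    # bzip2 -9 logs/*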
1) What's so bad about a log-linear graph? My rule of thumb is to use a scale where the graph looks like a line. Lines are easy to understand, and it's easy to apply the inverse transformation to recover the true form of the curve and an approximate formula. On a linear graph it's difficult to distinguish the positive parts of 1/x, 1/x^2, ..., e^{-x}, ...
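To spell out the inverse-transformation trick: if y = e^{-x}, then on a log-linear plot you get log y = -x, a straight line with slope -1; if y = 1/x^2, it only straightens on a log-log plot, where log y = -2 log x, a line with slope -2. So the scale on which the data turns into a line tells you which family the curve belongs to.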
I don't see the problem. It's obvious what the chart conveys if you stop to think for a moment (speaking for myself, at least: I'm used to looking at graphs with a logarithmic time scale, so I didn't think twice about it). In fact, it's reasonable that a linear improvement in compressed size should require an exponential increase in time and memory.
http://mattmahoney.net/dc/rationale.html
Also, his book "Data Compression Explained" was a great help and an eye-opener for me:
http://mattmahoney.net/dc/dce.html