Hi Mathieu,
Thank you for the info.
> It's mostly just OSX's malloc that is obscenely expensive beyond a certain size. But that threshold is more like 16k or so. On Linux, it's 128k instead, but even if both thresholds were the same, you'd see that Linux takes this change well, whereas OSX does not.
Is this something you would learn only from studying the Linux source, or is it a fact that is discussed fairly often?
I would appreciate it if you could point me to any online resources where this is mentioned (regarding Linux; OSX is out of my scope for the moment).
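For context, my working assumption (please correct me if this is not what you meant) is that the Linux-side 128k figure is glibc's default M_MMAP_THRESHOLD, above which malloc() serves requests with mmap() instead of the heap. If so, mallopt(3) can lower it to match the 16k OSX figure for an apples-to-apples test; a minimal sketch:

#include <malloc.h>   /* mallopt(), M_MMAP_THRESHOLD -- glibc-specific */
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
	/* Lower glibc's mmap threshold from its 128k default to the 16k
	 * figure quoted for OSX, so both allocators switch to page-level
	 * allocation at the same size. mallopt() returns nonzero on success. */
	if (mallopt(M_MMAP_THRESHOLD, 16 * 1024) == 0) {
		fprintf(stderr, "mallopt(M_MMAP_THRESHOLD) failed\n");
		return 1;
	}

	/* This 64k block is now served by mmap() on glibc as well; running
	 * the program under strace should show mmap/munmap instead of brk. */
	char *buf = malloc(64 * 1024);
	if (!buf)
		return 1;
	free(buf);
	return 0;
}

(glibc also honors the MALLOC_MMAP_THRESHOLD_ environment variable, which would allow the same experiment without recompiling.)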
-- David Shimamoto