Thanks for your question. We are aware of the related blog post and disagree with the conclusions the author reaches. Specifically, the elephant analogy, while visually provocative, is inaccurate. It implies that browser users want to drive their browsers on a racetrack all day (i.e., run the benchmarks). Browser users do want high performance, but not on small benchmarks that don't correspond to their daily experience. A better analogy is driving your browser down a city street in traffic. In that case, which we argue is the common case, the fact that your browser performs like a Porsche on a racetrack isn't very meaningful; what the user cares about is how it handles in traffic. We are presenting this research at the WebApps 2010 conference in Boston on June 23, 2010 (http://www.usenix.org/events/webapps10/).
Thanks for the question. We did not measure the number of simultaneous XMLHttpRequests in our current study, although that's an interesting question. There are a number of additional measures like this, including how typical web applications interact with the DOM, that we are also interested in knowing. While we do not have concrete plans right now for collecting such data, if we do get it, we'll make it available from our project website.
Computer hardware often provides a small number of memory watchpoints, which are typically used for debugging. There have been proposals for hardware that supports many small, independent memory protection regions (for example, Mondriaan Memory Protection by Emmett Witchel). He has proposed using such a mechanism to detect illegal stores; however, no current commercial hardware supports it. There are also a number of papers describing either hardware or software mechanisms for detecting out-of-bounds reads and writes, and existing tools, such as BoundsChecker, can also be used.