Where I work, we often have to deal with data sets containing many rows. As standard users we have no access to the underlying queries, so we often have to run a standard report that returns hundreds of thousands of rows, which makes the 64K row limit a real problem. I know this has to do with memory management, but if I could choose how the cell budget is split between rows and columns, I could use the same amount of memory with limits that make more sense for my data.
I mean, how many times have you had to work with more than 32 columns? Unless it's a full application with protected cells etc., in which case you probably won't need 64K rows either. On the other hand, the inverse could be true: someone might need more columns and fewer rows.
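To make the tradeoff concrete, here is a rough sketch of the idea, assuming the classic 65,536 × 256 grid and assuming (for illustration only; real spreadsheets allocate cells sparsely) that memory scales with the total cell count. Keeping that cell budget fixed, fewer columns would buy proportionally more rows:

```python
# Illustrative sketch, not actual spreadsheet internals.
# Classic grid limit: 65,536 rows x 256 columns = 16,777,216 cells.
CLASSIC_ROWS = 65_536
CLASSIC_COLS = 256
CELL_BUDGET = CLASSIC_ROWS * CLASSIC_COLS  # 16,777,216 cells

def rows_for(columns: int) -> int:
    """Rows available if the fixed cell budget is split over `columns` columns."""
    return CELL_BUDGET // columns

for cols in (256, 64, 32, 16):
    print(f"{cols:>3} columns -> {rows_for(cols):>9,} rows")
```

So a 32-column split would allow over half a million rows within the same cell budget, which would cover the report sizes described above.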
It would be interesting to get some feedback on this.