It's not as unrealistic as you think. Databases, for example, do pretty much exactly this. If you don't need the database to survive a reboot (which is often the case with search indexers, for example), then it would work better with the database on a RAM disk, avoiding the random-access writes to disk after every database access.
If you don't like that example, take the example of storing Temporary Internet Files or the system TEMP directory on a RAM disk.
*snip similar examples*
If an application tells Windows that a file is temporary, Windows actually goes out of its way to avoid ever writing the file to disk unless it really, really has to. So for something like a search indexer that doesn't want to preserve its index across a reboot (though I'm not sure why it wouldn't), it would make far more sense to mark its files as temporary. And forcing commits to disk when the overall consistency of the file on the filesystem doesn't matter seems less like a real-world scenario and more like finding an example that happens to work.
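The hint in question is the FILE_ATTRIBUTE_TEMPORARY attribute (often combined with FILE_FLAG_DELETE_ON_CLOSE) passed to the Win32 CreateFile call. As a rough, cross-platform sketch of the same idea — the flag mapping here comes from Python's tempfile module, not from anything in this thread:

```python
import os
import tempfile

# tempfile.TemporaryFile asks the OS for temporary-file treatment; on
# Windows it opens the file with os.O_TEMPORARY (delete-on-close), the
# CRT-level counterpart of the CreateFile hints mentioned above.
with tempfile.TemporaryFile() as f:
    f.write(b"scratch index data")
    f.seek(0)
    data = f.read()
    # Forcing a commit to disk anyway would look like this, and would
    # defeat the point of marking the file temporary:
    f.flush()
    os.fsync(f.fileno())
# The file is deleted automatically on close; nothing to clean up.
print(data)
```

The point of the delete-on-close/temporary combination is that the cache manager knows the data is scratch, so it can keep it in memory and skip the writeback entirely when there's no memory pressure.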
Or for disk-less computers where all data is stored on network drives
Which is the one and only scenario that the Windows RAM disk was ever actually created for. And the only one which makes sense.
It's not possible to upgrade from any 32-bit version of Windows to a 64-bit one without a complete reinstall. However, it's probably a much better long-term solution than trying to use some third-party RAM disk to improve performance, even if that RAM disk only uses memory that is otherwise inaccessible.
While you're at it, upgrading beyond XP makes sense too, not only because XP x64 is something of a compatibility nightmare and frankly something of an abortion, but because XP's memory-management algorithms do a poor job with more than about 1GB of RAM, often zeroing pages overly aggressively, since they were designed around very different expectations of memory availability.