Tech Off Thread

43 posts

Microsoft's XP RAM Disk Driver

  • AndyC

    , evildictaitor wrote

    *snip*

    It's not as unrealistic as you think. Databases, for example, do pretty much exactly this. If you don't need the database to survive a reboot (which is often the case with search indexers, for example), then the database would work better on a ram-disk, avoiding the random-access writes to disk after every database access.

    If you don't like that example, consider storing Temporary Internet Files or the system TEMP directory on a ramdisk.

    *snip similar examples*

    If an application tells Windows that a file is temporary, Windows actually goes out of its way to avoid ever writing the file to disk unless it really, really has to. And for something like a search indexer that doesn't want to preserve its index after a reboot (though I'm not sure why), it would make far more sense to indicate that its file is temporary. Forcing commits to disk when the overall consistency of the file on the filesystem is unimportant also seems rather more like finding an example that works than a real-world scenario.

    Or for disk-less computers where all data is stored on network drives

    Which is the one and only scenario that the Windows RAM disk was ever actually created for. And the only one which makes sense.

    , evildictaitor wrote

    *snip*

    Not really. You should be able to upgrade to Windows-XP 64 without reinstalling many of your programs, so long as they don't do too many registry shenanigans.

    It's not possible to upgrade from any 32-bit version of Windows to a 64-bit one without a complete reinstall. However, it's probably a much better long-term solution than trying to use some third-party RAM disk to improve performance, even if that disk only uses normally inaccessible memory.

    While you're at it, upgrading beyond XP makes sense too, not only because XP x64 is something of a compatibility nightmare (and, frankly, an abortion), but because XP's memory management algorithms do a poor job with more than about 1GB of RAM, often zeroing pages in an overly aggressive manner, since it was designed around different expectations of memory availability.

  • evildictaitor

    , AndyC wrote

    If an application tells Windows that a file is temporary, Windows actually goes out of its way to avoid ever writing the file to disk unless it really, really has to.

    That's a pretty big if you have there. Files created with either the FILE_DELETE_ON_CLOSE flag or the FILE_ATTRIBUTE_TEMPORARY attribute (on NTFS) are stored in paged pool rather than on disk (although they appear to be on disk, because NTFS lies about it). This is effectively NTFS implementing a ramdisk for you in precisely the case where a ramdisk would previously have meant a speedup. Putting files already marked temporary on a ramdisk when NTFS does this just adds inefficiency; but putting files which aren't marked temporary on a ramdisk, when you want them to behave as if they were temporary, does give you a speedup.
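
    For the curious, here is a minimal sketch of what that looks like at the Win32 level (the scratch.tmp name is just an example; FILE_FLAG_DELETE_ON_CLOSE is the CreateFile spelling of the native FILE_DELETE_ON_CLOSE option):

        #include <windows.h>
        #include <stdio.h>

        int main(void)
        {
            /* FILE_ATTRIBUTE_TEMPORARY hints that this is scratch data the
               cache manager should avoid flushing to disk, and
               FILE_FLAG_DELETE_ON_CLOSE deletes the file once the last
               handle to it is closed. */
            HANDLE h = CreateFileW(L"scratch.tmp",
                                   GENERIC_READ | GENERIC_WRITE,
                                   0,              /* no sharing */
                                   NULL,           /* default security */
                                   CREATE_ALWAYS,
                                   FILE_ATTRIBUTE_TEMPORARY | FILE_FLAG_DELETE_ON_CLOSE,
                                   NULL);
            if (h == INVALID_HANDLE_VALUE) {
                fprintf(stderr, "CreateFileW failed: %lu\n", GetLastError());
                return 1;
            }

            /* Writes land in the cache; unless memory pressure forces a
               flush, the bytes may never reach the disk at all. */
            DWORD written = 0;
            WriteFile(h, "scratch data", 12, &written, NULL);

            CloseHandle(h);  /* the file disappears here */
            return 0;
        }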

    And for something like a search indexer that doesn't want to preserve its index after a reboot (though I'm not sure why), it would make far more sense to indicate that its file is temporary. Forcing commits to disk when the overall consistency of the file on the filesystem is unimportant also seems rather more like finding an example that works than a real-world scenario.

    If this is a general-purpose database program (that you don't have source for), it will assume that on-disk consistency is important. If you happen to be using that general-purpose database program to do very large numbers of operations that don't need on-disk consistency, you could either:

    a) Build (or shim) your own database program that is identical, but uses temporary files instead of the normal database files (a sketch of this follows the list),

    b) Use the general-purpose database, but put its database on a ramdisk, or

    c) Suffer the large number of disk faults caused by each and every commit (which is what most people do).
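
    To make option (a) concrete, here is a minimal sketch of the shim idea, assuming the database opens its files via CreateFileW and that you have some way of interposing on that call (IAT patching, a detour library, or relinking; none of that plumbing is shown). The my_CreateFileW name and the .db extension check are hypothetical illustration, not anything from a real database:

        #include <windows.h>
        #include <string.h>

        /* Hypothetical shim: route the database's file opens through this
           wrapper so that its data files pick up the "temporary" hint. */
        static HANDLE WINAPI my_CreateFileW(LPCWSTR name, DWORD access,
                                            DWORD share,
                                            LPSECURITY_ATTRIBUTES sa,
                                            DWORD disposition, DWORD flags,
                                            HANDLE template_file)
        {
            /* Assumed convention: the database keeps its data in *.db files. */
            size_t len = wcslen(name);
            if (len >= 3 && _wcsicmp(name + len - 3, L".db") == 0) {
                /* Mark it as scratch data so the cache manager avoids
                   flushing it to disk unless memory pressure forces it to. */
                flags |= FILE_ATTRIBUTE_TEMPORARY;
            }
            return CreateFileW(name, access, share, sa, disposition, flags,
                               template_file);
        }

    The effect is roughly what the ramdisk in option (b) buys you, but without permanently reserving the memory; whether a given database still behaves correctly with that hint on its files is something you'd have to verify.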

  • golnazal

    open

  • ProphetZarquon

    Never advise anyone to switch to XP 64-bit. Just avoid that. XP 32-bit, Server, Win7, Linux Mint LXDE, but not XP64. That OS is an abandonware poster child.

    Also... Has it occurred to you that millions of people are still using 32-bit hardware? Look at the date on this message; scary, isn't it? But it's true.

    Back to topic: I found (& necrobumped) this because I am interested in establishing a RAM drive for security, and also because the Win8 USB system degrades heavily when running from non-USB 3.0 devices. I may try a BartPE, WinPE, or TinyXP distro booted from USB to a RAM drive next.

    It's a recovery thing. Linux is great, but sometimes XP would be better, and the current USB distros I've found rely completely on USB flash.

  • nehresmann

    AndyC, there are indeed cases where you will get performance improvements by using a RAM disk rather than relying on the OS's caching mechanisms.

    One such case: compiling software. If the files produced during compilation are stored on a RAM disk and the final exe is copied out when the compile is done, the entire compilation and linking process goes much faster.

  • AndyC

    @nehresmann: Only if your OS is awful at caching or your compiler is doing a very poor job of flagging files that shouldn't be cached. All "having a RAM disk" does is add an unnecessary copy step between bits of memory and reduce the amount of memory available for caching in the first place.

  • evildictaitor

    , nehresmann wrote

    One such case: compiling software. If the files produced during compilation are stored on a RAM disk and the final exe is copied out when the compile is done, the entire compilation and linking process goes much faster.

    Actually, compiling code is a great counter-example. Compiling requires lots of burst RAM when building trees for each individual file, and almost all languages are heavily memory-limited rather than disk-access-time-limited, meaning that each compilation unit puts a fair whack of pressure on the pagefile if you don't have enough RAM available for it.

    By installing a RAM disk driver, you artificially reduce the amount of RAM available to your machine (since lots of RAM is now parked holding temporary files that you're not looking at), causing the programs dealing with the trees you are looking at to violently thrash the pagefile to free up memory for the compilation process.

    Even if that weren't the case, the filesystem driver keeps recently written files in memory anyway, so the performance benefit you expect because "those obj files are in memory" is one you get without the RAM disk.

    As a general rule, RAM disks don't help you more than they hurt you. There are cases where they work, but they're few and far between, and generally pretty contrived.
