Coffeehouse Thread

28 posts

Anyone using SSD for development work only?

  • Dr Herbie

    Trying to think of ways to speed up the development process at work: we have some moderately large projects that take a few minutes to compile which can slow down development work when making some changes and recompiling.

    9ers that use SSDs for development work:  

    How are you set up (are you using the SSD as the boot drive, or just to store the code and bin directories)?

    What size SSD do you use?

    How much faster is compiling to the SSD than to the HDD?

    Any downsides (apart from the cost)?


    Thanks

    Herbie

  • PeterF

    For my part, I don't regret having an SSD in my XPS 17, though I don't have comparison material to quantify the speed increase: my timings on the old hardware simply weren't representative (too many significant hardware differences).

    Granting (actually, agreeing) that compiling is IO-heavy, with memory and disk as the bottlenecks, increasing disk IO will certainly give you a potential speed advantage. How much of an advantage will of course vary with each project type. If you are in the luxury position of being able to go out and buy an SSD, you can clone your existing disk to it, do side-by-side comparisons on otherwise similar setups, and weigh cost versus time saved to convince management to pony up some dough ;)

    Cheers,
    Peter
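
    PeterF's side-by-side comparison is easy to script. Below is a minimal sketch in Python; the `msbuild` command and the drive paths are hypothetical stand-ins for clones of the same solution on each drive, not anything from this thread:

    ```python
    import statistics
    import subprocess
    import time

    def time_build(command, runs=3):
        """Run a build command several times and return the mean wall-clock seconds."""
        timings = []
        for _ in range(runs):
            start = time.perf_counter()
            subprocess.run(command, shell=True, check=True)
            timings.append(time.perf_counter() - start)
        return statistics.mean(timings)

    # Hypothetical: the same solution cloned onto the HDD (D:) and the SSD (C:).
    # hdd = time_build(r'msbuild D:\src\MySolution.sln /t:Rebuild')
    # ssd = time_build(r'msbuild C:\src\MySolution.sln /t:Rebuild')
    # print(f"SSD build is {hdd / ssd:.1f}x faster")
    ```

    Averaging several runs matters here, since the first build after a clone tends to be slower while the OS file cache is cold.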

  • Proton2

    I haven't used an SSD yet, but I have set up RAID to speed things up.

  • myefforts2

    I have an SSD and a larger spinning-rust disk in my dev machine. The SSD is the boot volume, and the source code repositories live there too.

  • Dr Herbie

    @myefforts2: Can you tell how much faster it is to compile the same projects on the SSD versus the HDD?

    Herbie

  • Dr Herbie

    Ok, with more Googling I found several people saying it was great to have an SSD for running Windows and VS, but that it didn't make much of a difference for compiling (SSDs aren't as good at writing as they are at reading, so they're more effective for loading data that doesn't change much, like executable files).

    Seems CPU speed and amount of RAM are still the main bottlenecks for compiling.

    Herbie

  • Proton2

    Do you know what RAID is, Dr Herbie?

  • PerfectPhase

    I have a 300+ project solution that we benchmarked on SSD and spinning rust. The SSD saved us about 10 seconds of a 200-second compile. The bottleneck in our solution was the Silverlight projects; nothing we could do would make them compile faster! We even tried RAM disks after that and the compile time didn't drop noticeably. This was on a top-of-the-range i7 with buckets of RAM and an SSD for the OS.

    The biggest win for us, though, was in opening the solution and in source control operations; the SSD made a real difference in usability there.

  • fabian

    I do. I have an Azure solution. On the old drive it took around 1 minute from when I pressed F6 until the solution was deployed, the browser had started, and the website was displayed. After moving to SSD, the same operation takes less than 5 seconds.

    I have replaced all disks with SSDs.

  • Dr Herbie

    @Proton2: Yes, I do know about the striped RAID option: my last dev PC used striped RAID across two disks, but we didn't find it noticeably faster, and the disks ended up getting disk errors two or three times over the three years I used them, so we dropped that from the spec of my most recent dev machine.

    Herbie

  • blowdart

    For me the SSDs are boot drives and VS drives, not source control. If you were worried about disk errors with RAID, then SSD isn't for you (unless you have a concrete backup solution). I can't honestly say I notice much difference between compiling Katana on SSDs and HDDs. The little improvement I do see I put down to my home machine having 32 GB of RAM while my work machine has only 24 GB plus some VMs running in Hyper-V. What is noticeable is boot speed and the loading of VS: 3 seconds at home on SSD, 10-15 seconds at work.

  • MasterPi

    My SSD (240 GB) is boot/apps. I almost wish my secondary/file storage were an SSD as well, as its latency is now noticeable in comparison.

    How much faster? Can't say... I had it before I started working on a new project, so I really don't have anything for reference. It doesn't seem like anything changed, probably because my source code is on the non-SSD drive.

  • Dr Herbie

    @blowdart: We use TFS on a separate server, so disk failures on a secondary drive aren't really much of a problem (as long as we don't have to reinstall our tools, we can happily swap out the secondary drive without any great loss).

    The upshot seems to be that there is no quick PC upgrade to speed up compilation; a major upgrade to CPU and memory will make the most difference.

    I guess the next approach will be to split our projects up more (they generally tend to be Big Balls of Mud anyway) so that VS can rebuild only the projects that need to be built rather than having to compile the whole ball of mud.

    Herbie
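
    The gain Herbie is after comes from up-to-date checks: a build tool skips a project whose output is newer than all of its inputs, so a finer project split means more skippable work. A toy sketch of that timestamp rule in Python (the file names are hypothetical):

    ```python
    import os

    def needs_rebuild(sources, output):
        """A project is up to date when its output exists and is newer than every source."""
        if not os.path.exists(output):
            return True
        out_mtime = os.path.getmtime(output)
        return any(os.path.getmtime(src) > out_mtime for src in sources)

    # Hypothetical usage:
    # if needs_rebuild(["Orders.cs", "Invoices.cs"], "Billing.dll"):
    #     run_compiler()
    ```

    This is only the timestamp half of the story; real build systems also track project-to-project references so that a rebuilt dependency dirties its dependents.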

  • androidi

    Somewhere during the 2010/2012 VS releases, the overheads in compiling grew significantly. I still have 2008 on an HDD Windows installation, while 2012 is on my SSD installation. And yes: 2008 is several times faster when compiling small changes to small console projects, and it would be an order of magnitude faster if it were on the SSD.

    If I make a trivial change to 1 line of code, one would think the background compiler would compile that in the background and then just insert the compiled code in place when F6 is pressed for debug builds, resulting in practically no compile time at all for trivial changes. Why should compiling 1 small change take the same time as large changes?

    Now, if this issue only affected new features and syntactic sugar, that would be OK, but it affects things like modifying the text of a comment and nothing else, or fixing a typo in a string, etc. Those things should not be 5-10x slower in 2012 versus 2008; they should be faster now that we have more background compiling ahead of F6-time.


  • bondsbw

    @androidi: How many projects are in your solution? I don't know if this has anything to do with the difference between versions of Visual Studio, but it has been documented that build times are noticeably slower when you break your solution up into multiple assemblies. It would be interesting to know whether Microsoft has updated the C# compiler to optimize for one case to the detriment of the other.

    If I make a trivial change to 1 line of code, one would think the background compiler would compile that in the background and then just insert that compiled code in place when F6 is pressed for debug builds, resulting in practically no compile time at all for trivial changes. Why should compiling 1 small change take same time as large changes? 

    NCrunch actually does background compiling, except it is built only for running unit tests and displaying their results immediately inline. It isn't built for having your assembly ready for debugging, but in theory that could be made to happen. There could probably even be optimizations to update individual classes, methods, properties, fields, etc., and just the signature dependencies of those items, instead of recompiling the entire project.
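
    The core a background compiler needs is cheap change detection: snapshot the source tree, and whenever a file's timestamp moves, recompile just that file's project. A minimal polling sketch of that idea in Python (not how NCrunch or VS actually implement it, which use file-system notifications):

    ```python
    import os

    def snapshot(root, exts=(".cs",)):
        """Map each matching file under root to its last-modified time."""
        state = {}
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                if name.endswith(exts):
                    path = os.path.join(dirpath, name)
                    state[path] = os.path.getmtime(path)
        return state

    def changed_files(before, after):
        """Files that are new, or whose mtime moved, since the last snapshot."""
        return [p for p, m in after.items() if before.get(p) != m]

    # Hypothetical loop: poll, recompile only what moved.
    # state = snapshot(r"C:\src\MySolution")
    # while True:
    #     new_state = snapshot(r"C:\src\MySolution")
    #     for path in changed_files(state, new_state):
    #         recompile_project_of(path)   # hypothetical helper
    #     state = new_state
    ```

    Timestamps alone can't tell a comment edit from a signature change, which is why androidi's trivial edits still trigger a full project compile; doing better requires the compiler itself to diff at the symbol level.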

  • Dirtbagg

    I have an old Dell Latitude D630 in which I replaced the hard drive with a 120 GB SSD. It works well; I save bigger files to a larger drive.

  • Bass

    I compile to a dynamically generated RAM disk that only exists while I'm compiling something. Not for performance reasons, actually, but to reduce wear on the SSD.

  • cheong

    @Dr Herbie: I always use a RAM disk for building large projects. Fortunately our biggest project is only around 94X MB (OBJ files etc. included), and the performance is satisfactory.

    The only drawback is that you have to remember to check in / shelve your changes to source control at the end of every day, or you'll risk data loss.
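
    Besides checking in, the RAM-disk data-loss risk can also be softened with a periodic copy onto persistent storage. A rough sketch, with hypothetical paths, copying the working tree wholesale:

    ```python
    import os
    import shutil

    def sync_to_disk(ramdisk_src, persistent_dst):
        """Copy the working tree off the RAM disk so a power cut doesn't lose it."""
        if os.path.exists(persistent_dst):
            shutil.rmtree(persistent_dst)      # replace the previous backup wholesale
        shutil.copytree(ramdisk_src, persistent_dst)

    # Hypothetical: run this from a scheduled task every hour.
    # sync_to_disk(r"R:\work", r"D:\ramdisk-backup\work")
    ```

    A delete-and-recopy is crude but simple; a sync tool that copies only changed files would do the same job faster on a large tree.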

