DaveNoderer

Niner since 2004

CEO / President and founder of Computer Ways, Inc., a Microsoft Partner. Mr. Noderer holds a BS in Electrical Engineering from Rochester Institute of Technology with graduate and continuing education courses from University of Rochester and Northeastern University. He spent over 20 years designing computers, writing microcode and managing projects before starting Computer Ways, Inc. in 1994.

Dave is a Microsoft MVP and is very active in .NET user group communities. He spent 3 years as an offi...


  • Dashboards Made Easy With Reporting Services

    Nice demo!!


Adding some value labels to the pie charts would be a good exercise for the viewer.


  • Joe Duffy, Huseyin Yildiz, Daan Leijen, Stephen Toub - Parallel Extensions: Inside the Task Parallel

Starting to fool around with the TPL, I wrote a small program to investigate I/O. It's somewhat realistic in that I'm working on a purge of old files (.eml's) for a customer.

I'm copying ~5000 files to a directory and then deleting them. Not very interesting, but close to what the customer needs done.

I found that the sequential loop (there are other ways in System.IO to do this besides a loop!) took ~11 seconds and the parallel loop took ~6 seconds on my dual-core laptop.

I don't have a 4-core machine, but I suspect that adding a third core would not help much. I'm assuming that the first thread blocks on I/O, then the second thread can queue up another I/O, and so on back and forth, but having 3 or 4 cores would not necessarily do any better.

    The copy takes most of the time.

For Each FI In DirSrc.GetFiles()
  FI.CopyTo(Path.Combine(Me.FilePath, FI.Name))
Next


Parallel.ForEach(DirSrc.GetFiles(), Sub(Fix) Fix.CopyTo(Path.Combine(Me.FilePath, Fix.Name)))
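A minimal self-contained sketch of the timing comparison described above. The Stopwatch wrapping, module name, and source/destination paths are my own assumptions for illustration, not from the original program:

    ```vb
    Imports System.IO
    Imports System.Diagnostics
    Imports System.Threading.Tasks

    Module CopyTiming
        Sub Main()
            ' Assumed paths for illustration only.
            Dim DirSrc As New DirectoryInfo("C:\Temp\EmlSource")
            Dim DestPath As String = "C:\Temp\EmlDest"

            ' Time the sequential copy.
            Dim sw = Stopwatch.StartNew()
            For Each FI In DirSrc.GetFiles()
                FI.CopyTo(Path.Combine(DestPath, FI.Name), True)
            Next
            Console.WriteLine("Sequential: " & sw.ElapsedMilliseconds & " ms")

            ' Time the parallel copy of the same files (overwrite = True).
            sw.Restart()
            Parallel.ForEach(DirSrc.GetFiles(),
                             Sub(FI) FI.CopyTo(Path.Combine(DestPath, FI.Name), True))
            Console.WriteLine("Parallel:   " & sw.ElapsedMilliseconds & " ms")
        End Sub
    End Module
    ```

    On an I/O-bound workload like this, the parallel version mainly wins by overlapping one thread's blocking wait with another thread's queued I/O, which is consistent with the ~11 s vs ~6 s numbers above.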

    Which brings up the subject of good tools to see what is really happening...

  • Gordon Bell and Jim Gemmell - A look into Microsoft's Bay Area Research Center, Part I

    Good to see that Gordon Bell is still active and thriving!

Besides the terabytes of data... there is the problem of keeping that ever-growing set of information moving into future tools and technologies, while keeping the formats and links in place, at a price affordable to the general public, over hundreds of years.