eagle wrote: Was that green apple some sort of subliminal advertising?
What about the Heineken? MS must be a cooler place to work than I originally thought.
Looks great... a few questions...
1. It looks like the build agent is essentially a slave machine, and it is actually TFS that monitors check-ins and tells a build agent to run a build when one is needed. Is this the case? If so, can you have a pool of build agents rather than specifying a specific agent for a specific build?
2. The build status UI looked passive... in other words, you had to actually open the build status screen to see if a build failed. Is there any type of notification that pops up when a build fails, sort of like CCTray? CCTray makes things so easy: you can basically ignore it, yet a quick glance, without opening any UI dialog, tells you the status via its green/yellow/red indicators.
3. Is there any labeling in source control that corresponds to a build? This could be useful because a build is compiled as Debug for testing, but when we decide to release that build we want to rebuild from the same source to create a Release/install package. Or is this just part of the build project file itself? (Sorry, we haven't worked with TFS build at all yet.)
4. How easy is it to add additional build steps, like running FxCop or Simian or FitNesse tests? Can the reports from those third-party tools be integrated into the data warehouse?
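From what I've read (an assumption on my part, since we haven't used Team Build yet), extra steps are added by overriding one of the extensibility targets in TFSBuild.proj. Something like this sketch, where the FxCopCmd path and the MyApp.dll name are made-up placeholders:

```xml
<!-- Sketch only: hook FxCop in via the AfterCompile extensibility target.
     The FxCopCmd.exe path and MyApp.dll name are hypothetical placeholders. -->
<Target Name="AfterCompile">
  <Exec Command="&quot;C:\Program Files\Microsoft FxCop 1.35\FxCopCmd.exe&quot; /file:&quot;$(OutDir)MyApp.dll&quot; /out:&quot;$(OutDir)FxCopReport.xml&quot;"
        ContinueOnError="true" />
</Target>
```

Is that roughly the intended way, and does anything from that report make it into the warehouse automatically?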
Feb 26, 2007 at 8:15 PM
OK, I admit I am a little confused. Does this take snapshots at predefined intervals, or does it version a file every time it is saved? So if I save my document at 10:00 AM, then make a change and save it at 10:10 AM, do I get both versions? Or is that considered one change because both saves happened between snapshot intervals?
Also, any chance NTFS will support the recycle bin? At this time (XP) it seems to be a shell feature, because I only get files in the recycle bin if I delete using Explorer. If I use the command line or some other app, those files aren't put in the recycle bin. I would love to see the recycle bin be a file system feature.
What are you guys thinking? A rich native app that runs on both Windows and Mac (what, no Linux version?)
Don't you read Gartner and listen to Ray Ozzie? The Windows platform is DEAD! Web 2.0 is taking over the world. No one is buying traditional PC apps anymore.
Get with the program!
I don't understand why this would/should be limited to Vista... It seems like the caching should be controlled in the drive firmware rather than the OS.
I would expect that you will soon see cache management built right into the drive firmware, so you get the same benefits from any flash-cached hard drive regardless of OS.
This is not much different from SQL Server's cache. The main difference is that you are using non-volatile RAM rather than DRAM, so on power-off the cache survives and can be used for booting as well. But personally, I leave my PC on all the time, so boot-up time isn't a big deal.
Something that wasn't addressed, and that I kept expecting the interviewer to ask, is how the Visual Studio version relates to the .NetFx version.
Currently (correct me if I am wrong), if you want to target a .NetFx 1.0 app you MUST use VS.NET. If you want to target .NetFx 1.1 you MUST use VS.NET 2003.
Now that VS 2005 is out, it allows you to target .NetFx 2.0 and 3.0. But you can't target .NetFx 1.1 with VS 2005. (Yes, I know you can use MSBee to compile for 1.1 from VS 2005, but there is nothing in the IDE to ensure that you only use/reference DLLs and features of the 1.1 framework.)
So now you say .NetFx 3.5 (for example) is the next "version" of the framework. Will I be able to target that with VS 2005, or will I be required to upgrade to VS.Next?
Also, will VS.Next be able to target .NetFx 3.0 and 2.0 as well as 3.5 while still giving me the new IDE features of VS.Next?
In other words, are there plans to decouple the VS (IDE) version from the framework version and allow tools in the IDE to target a specific .NetFx version?
I know sometimes the new IDE requires the new Fx version to run. That was the excuse for why you can't use VS 2005 to target .NetFx 1.1 apps, but it didn't make sense. I have no problem installing .NetFx 2.0 to get the VS 2005 features while still targeting .NetFx 1.1.
Can you tell I hate having to drop back to VS.NET 2003 to work on my .NetFx 1.1 apps?!
schrepfler wrote: Looks very promising. What I object to is that it's still data-driven. You generate a new set of classes that abstract a database (which might be useful), but why generate new classes if you already have your own model of the domain in question? I believe a coherent model-driven solution should also exist, and I guess with entities the true ORM nature should allow us to use our domain objects directly. At this point we can see a convergence between the Java and .Net approaches to persistence and object-relational mapping, except that .Net integrates the queries directly into the language.
Sounds like you need to read the SQL Server manual. Since SQL Server 7.0 there has been no performance advantage to running a query from a stored proc over sending the same query over the connection.
As a matter of fact, there can be a disadvantage: the first time you run the stored proc, the execution plan is optimized for the specific parameters you pass in. A later call to that proc might do better with a different plan, but the cached plan from the first execution is reused because it came from an already-compiled proc.
Also, dynamic queries' execution plans are cached too, so if the same query comes in again the previous execution plan is reused.
You may want to look at some of Kim Tripp's webcasts on SQL Server performance optimization, where she demonstrates that even using dynamic SQL inside a stored proc can dramatically improve performance.
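To make the two plan-caching points above concrete, here is a minimal T-SQL sketch (the table, proc, and parameter names are made up for illustration):

```sql
-- Hypothetical table/proc names, just to illustrate the points above.

-- 1) Parameter sniffing: the plan is compiled for the first @CustomerId
--    passed in and then reused, even if a later value would do better
--    with a different plan.
CREATE PROCEDURE dbo.GetOrders @CustomerId INT AS
    SELECT OrderId, OrderDate
    FROM dbo.Orders
    WHERE CustomerId = @CustomerId;
GO

-- Force a fresh plan for a call where the cached plan is a bad fit:
EXEC dbo.GetOrders @CustomerId = 42 WITH RECOMPILE;

-- 2) Dynamic SQL plans are cached too: a parameterized batch sent via
--    sp_executesql gets a reusable cached plan, much like a stored proc's.
EXEC sp_executesql
    N'SELECT OrderId, OrderDate FROM dbo.Orders WHERE CustomerId = @cid',
    N'@cid INT',
    @cid = 42;
```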
Well, it seems ObjectSpaces isn't dead after all.
Anyway... do you see this framework simply as a DAL-tier feature? Are the generated entities basically containers for metadata? Or can the entities be used as business objects? In other words, can the entities contain business rules, or are they more like data transfer objects (DTOs) with no encapsulated code?