There are three straightforward ways to write 32-bit managed code in 64-bit Windows:
1) Use VS2003. VS2003 installs on 64-bit Windows and will generate code that always runs on the 32-bit CLR.
2) When using VS2005, you can specify that the assembly you're generating should be 32-bit only (the default is "Any CPU", which will run on the 64-bit CLR if one is available).
3) Use the 32-bit C++ compilers (C++ has different compilers for targeting each platform). The 32-bit compiler runs on 64-bit Windows and will generate code that runs as 32-bit unless the user specifies /clr:safe.
Check out, I believe, section 4.5 of the C++ Standard, on integral promotions.
I have the K&R C book in front of me, so it's easier to quote from the reference section of that text: "If an int can represent all the values of the original type, then the value is converted to an int; otherwise the value is converted to an unsigned int. This process is called integral promotion."
No, we actually didn't do this to get the extra bit. It's because the C/C++ standards say that (signed + unsigned) is unsigned, and thus when we add an unsigned value to a pointer we don't sign-extend (as the value is not negative), but we would sign-extend if the type were signed.
Now maybe what you mean is that we should just always treat 64-bit integral values as signed. The problem with that is that there are 64-bit types besides pointers (such as __int64), and you want to make sure that the properties people expect of unsigned values are preserved.
Such as: for every bit representation x, (x != 0) implies (x > 0).
The crazy thing is that these incremental changes have huge ramifications. For example, there was a time when people were talking about the impact of CD-ROM, and how you could send CD-ROMs to people and they could do virtual walkthroughs of a house or see a city map.
A couple of years later the web had caught on, and shipping a CD-ROM in the mail for that type of digital content almost seemed foolish to have considered.
When I first saw the web in ~1993, the feedback from a lot of people was: why would I use this when I have FTP, archie, gopher, etc.? This web thing doesn't seem to add much value, plus there's WAY more content over FTP. How many people today even know what gopher, archie, and veronica are (much less have used them)?
I personally think GPS (or location tracking in general) is going to change the world in a drastic way in the next decade. As will 64-bit computing.
Beer, this actually isn't a function of the chip architecture, but rather of the high-level language. In a different language the result of (signed + unsigned) may have a different type, in which case the result would be different, even on the same machine.
Sven, some of your points are valid, but let me have a chance to justify some of my statements.
1) You say it's no surprise that CLR code runs fine, and you justify it by saying that .NET is the platform, not x64/Itanium. All correct, but nevertheless not necessarily obvious. This requires some understanding of what is in metadata, and what is not.
2) I do hope people watching Channel9 know better, but you'd be surprised. In the same way, there are a lot of developers who don't understand that 64-bit code could potentially make your app slower.
3) I'd have to look back at what I meant by that comment, but I imagine it was meant to refer to assembly code, where reading x64 assembly is much like reading x86 assembly. If you're looking at C, then you're looking at C. Of course, if you're reading Itanium code, it's a whole different ballgame -- or do you think most x86 wizards could parse Itanium code with the same efficiency?
Why am I so hyped about 64-bit? Well, from the developer side of the world it's absolutely going to be critical. LTCG is the way of the future, and the main issue with LTCG is memory pressure. I'd say that in less than a few years, having 8GB of RAM will be a must-have for devs (unless you don't plan on ever doing LTCG builds).
And with respect to development, the only reason you ever free memory is that you're going to run out of it. Let's take memory to the extreme and say you have 2^64 bits of memory. Based on my current usage (let's say I consume 32TB/day, which is a LOT), I could turn on my computer today and never have to free memory over the course of my lifetime. Forget about garbage collection -- the garbage man disappears altogether.
On the desktop, the fact that I ever have to shut down an application is annoying, and that is largely a memory issue. Another example is something as simple as gaming. Right now physics is totally faked, but if you want to do real finite-element analysis over complex meshes, you'll likely suck up as much memory as there is in the system (can you accurately simulate physics any faster than the real world -- and how many bits of data do you need to do it accurately -- every bit in the universe?). Another example is what is now passive media, such as TV. We currently watch streaming video, but it's much cooler to send models (as in games), do the rendering in real time, and create a world where you control the camera (and where, in a show like "24", you can watch whatever part of the world you want).
I'm certainly not a visionary, so I don't claim these are necessarily great ideas, just things that I would be excited about.
Anyways, I'd love to chat more, and I'll attempt to be less vapid.
Hi Derek. The good news is that in VS2005 (including the recent CTPs and the soon to be released Beta2!) we have 64-bit devtools as part of it. Just change your platform configuration and now you're targeting Itanium or x64.
The bad news is that I don't know of any emulator to run the code in. You need a 64-bit machine to run 64-bit code.
The good news, though, is that we're just about to ship 64-bit Windows for x64 (as in, within days), and x64 machines cost pretty much the same as 32-bit machines (and you can dual-boot 32-bit and 64-bit OSes, plus the 64-bit OS will run virtually all of your existing 32-bit applications).
I'm neither Zoe nor Gretchen, but I'm a PM who does do interviews at Microsoft, and I will say that we do ask coding questions. I asked one this morning, in fact. It probably depends on the team. I'm a PM on the C++ team, so our product has a lot to do with code.
If you have an interview, good luck!
But doesn't that mean that any proprietary algorithms are pretty exposed in the MSIL?
Yes and no. You can always obfuscate the generated MSIL. Also, remember that obfuscation of any sort only keeps honest people honest. The people who want to rip you off (if they're good engineers) will be able to do so. There's also an interesting paper on code obfuscation which proves that it's impossible to do a really good job of it (from a theoretical perspective).
The original C++ code is "unsafe" and is not under the control of the GC?
Correct. This does not make unsafe code become safe, nor does it put that code under the control of the GC.
But, it does allow you to start directly using managed DirectX in your current C++ code base.
So it is not worth mentioning. It is still C++. As a matter of fact, every C++ compiler should accept C code with minor modifications. The whole message is: "The .NET C++ compiler behaves in some ways like a C++ compiler."
Hoz, I think you misunderstand the point. Moving legacy code to .NET or the JVM is one of the big pain points for anyone with a big app. In .NET, with C or C++ code, you add the /clr switch and now your app is a .NET application.
I think you're under the incorrect belief that this compile of Quake resulted in just normal native code. No, the result of the recompile with the /clr option was MSIL (similar in some ways to Java bytecode). You can run ILDASM on the .exe to look at the MSIL.
Most importantly, though, you can now seamlessly add managed code to the Quake2 application. Recompiling for the sake of recompiling means nothing. The reason you go to .NET or the JVM is that you can now leverage the benefits of the platform. In the Quake2 demo they added a WinForms radar directly into the application, just as you would have done in a C# or Java application -- but this was the same code you started with, not a complete rewrite of the project.
What if I could take Doom3... the original code, throw the /jvm switch, and now it was running on the JVM at 90% of the native code speed? And then I added some Java control to the game -- all easily. Now wouldn't you agree that this would be big news? Well,
that's what you just saw with .NET.