
Route 64 - Kang Su Gatlin talks about 64-bit



Over the next few days we'll bring you video from the Route64 Training Tour when it visited Redmond.

In this first part, you meet Kang Su Gatlin (again, since he's been on Channel 9 before). Kang Su is a program manager on the Visual C++ compiler team.

He knows 64-bit better than most other people because his team is building the compilers to take advantage of the new 64-bit processors that are now hitting the market.

Our tech guy, Charles Torre, gets into the 64-bit world with Kang Su.


Follow the discussion

  • Why didn't he let go of the Xbox controller during the interview? Hauhauhauhau. Caught the guy in his break time.
    Actually I was kickin' major butt in Project Gotham II Smiley

  • That is what I call major skill! Kicking butt while giving an interview about 64-bit!
    I can't even talk about 64-bit Big Smile

    Congrats for the great interview !
  • Need better questions.  Is there some way to suggest questions for upcoming interviews?
  • Charles Welcome Change
    Ask away, Sparky. Kang's around.

    We don't have the bandwidth to ask for user questions before every interview.

  • derekvs I'd rather be flying
    For people like myself, who are really interested in getting a jump on this stuff before it's mainstream, what developer tools are available to begin exploring?  Is a 64-bit emulator available?  I remember back on Windows 3.1 we could install Win32 to run 32-bit applications.  It would be nice to use my new P4, at least for a few months, before throwing it away!   Smiley
  • Buddhike.de.Silva Buddhike de Silva
    Hey!!!! KSG, You Rock dude! Big Smile
    Hi Derek.  The good news is that in VS2005 (including the recent CTPs and the soon to be released Beta2!) we have 64-bit devtools as part of it.  Just change your platform configuration and now you're targeting Itanium or x64.

    The bad news is that I don't know of any emulator to run the code in.  You need a 64-bit machine to run 64-bit code.

    The good news though is that we're just about to ship 64-bit Windows for x64 (as in days), and x64 machines cost pretty much the same as 32-bit machines (and you can dual boot 32-bit and 64-bit OSes, plus the 64-bit OS will run virtually all of your existing 32-bit apps).

    Hope that helps.

    Kang Su Gatlin
    Visual C++ Program Manager
    Thanks Buddhike, you rock!  Cool glasses... they're like mine -- when I'm not wearing my contacts.  Smiley

  • Bah, that was so uninformative I stopped halfway through part 2. And I was looking forward to this, him being a VC++ PM and all.

    The questions were bad. And some of the comments ... vapid.

    1. He mentions that CLR code will run fine, and this surprises who? It's the whole point of the VM, as the x64/Itanium isn't the platform, but .NET is. Big whoop.

    2. He mentions that some developers think (I want to see a show of hands here) that going 64-bit will give twice the speed.

    3. And this comment "If you can read 32 bit code, you can read 64 bit code", WOW! Considering that even C abstracts the bitsize of the architecture away, this will surprise who?

    And then he seems WAAY overenthusiastic about going 64-bit. How on earth does 32 -> 64 compare to the invention of the web? Few apps need an address space larger than 4 GB. Sure, 16-bit was not a lot, but really, what has changed things is not the larger address space, but the architecture of chips and operating systems. That you get more memory is really not where it's at.

    I'm a little surprised. The interview (or first half at least) gave no real reasons why 64 bit code will rock my world, so it all feels a bit hyped, which is a shame. What desktop machine does data mining (which cannot be accomplished in four gigs?), and how does media processing/files benefit from 64 bit?

    Let me ask people in another way:

    What would be seriously hampered by the desktop market staying on 32-bit for another ... five years (let's ignore installed base, etc.)? If Intel and AMD kept making 32-bit chips, and nothing else, for five more years, what would be the consequences?

    Edit: Well, not to sound too sour Wink
    What I want to know is WHY does he sound so hyped about 64-bit? Data mining is nice, but really, who does that on a desktop (at a scale which cannot be accomplished today)? And media files? How do they benefit? Tell us what's exciting about it! We know that it won't affect actual development; the architecture, languages, and tools were made that way, to make the impact as small as possible.

  • Charles Welcome Change
    SvendTofte wrote:

    The questions were bad.

    Sorry 'bout that.


    Sven, some of your points are valid, but let me have a chance to justify some of my statements  Smiley

    1) You say that CLR code will run fine is no surprise, and you justify it by saying that .NET is the platform and not x64/Itanium.  All correct, but nevertheless not necessarily obvious.  This requires some understanding of what is in metadata, and what is not. 

    2) I do hope people watching Channel9 know better, but you'd be surprised.  In the same way that there are a lot of developers that don't understand that 64-bit code could potentially make your app slower.

    3) I'd have to look back at what I meant by that comment, but I imagine it was meant to refer to assembly code, where reading x64 assembly is like reading x86 assembly.  If you're looking at C, then you're looking at C.  Of course if you're reading Itanium code, it's a whole different ballgame -- or do you think most x86 wizards could parse Itanium code with the same efficiency?

    Why am I so hyped about 64-bit?  Well, from the developer side of the world it's absolutely going to be critical.  LTCG is the way of the future, and the main issue with LTCG is memory pressure.  I'd say in less than a few years having 8GB of RAM for devs will be a must-have (unless you don't plan on ever doing LTCG builds).

    And with respect to development, the only reason you free memory is that you're going to run out of it.  Let's take memory to the extreme and say you have 2^64 bits of memory.  Based on my current usage (let's say I consume 32TB/day, which is a LOT), I could turn on my computer today and never have to free memory over the course of my lifetime.  Forget about garbage collection -- the garbage man disappears altogether.

    On the desktop, the fact that I ever have to shut down an application is annoying, and is largely a memory issue.  Another example is something as simple as gaming.  Right now physics are totally faked, but if you want to do real finite-element analysis over complex meshes, well, you'll likely suck up as much memory as there is in the system (can you accurately simulate physics any faster than the real world -- so how many bits of data do you need to do it accurately -- every bit in the universe?).  Another example is what is now passive media, such as TV.  We currently watch streaming video, but it's much cooler to send models (as in games), do the rendering in real-time, and create a world where you control the camera (and where in a show like "24" you can watch whatever part of the world you want).

    I'm certainly not a visionary, so I don't claim these to necessarily be great ideas, but things that I would be excited about.

    Anyways, I'd love to chat more, and attempt to be less vapid  Smiley

  • Charles Welcome Change
    I have a hard time asking questions and running the camera at the same time. (Sort of like reading and walking, maybe). When I have a camera person I'm able to think better.

    I do think we should open things up a bit and let you know when we are going to interview somebody interesting and give you an opportunity to put forth some questions that we should ask. In the end, Scoble and I just don't know everything about our technologies. We're learning too, in realtime, just like many of you. That said, you'll find I do much better when the topic is closer to my level of knowledge and/or personal interest (read CLR, Windows, Security, competitive platforms, bourbon, wine, cigars, oh...)


  • rhm
    Some people have been crying out for 64-bit for years - mostly database people, as Kang points out in the vid, although there are CFD and FEA apps as well - anything that can use a lot of memory ought to be able to now that memory is so cheap. Thing is, most apps don't use a lot of memory. My PC has 1GB of RAM and I can count on the fingers of one hand the number of times Windows has actually *had* to swap, as opposed to just dumping inactive pages so it can pointlessly increase the size of the disk cache (I've moaned about this extensively elsewhere Smiley )

    Of course the people that really needed 64bit have been able to get it for years in the form of Alpha and RS6000/PowerPC workstations. The fact that most new PCs will be 64bit though has more to do with AMD's battle to gain the technological high ground in its war with Intel than it does with market demand for 64bit. The AthlonXP is now discontinued and so all new AMD machines are 64bit and Intel have responded by making newer P4 and Xeon chips compatible (how that must have pained them to be copying AMD features for a change!).

    It will be nice when 64-bit Windows XP arrives - for users of Athlon64 machines anyway - as they will be able to make use of mass amounts of RAM (yay, 5-gig disk caches!). As for applications, though, I think a few vendors whose apps need large amounts of RAM will ship 64-bit versions, but most won't. Not because making a 64-bit port is difficult, but because of the management headache of distributing two versions of the same app. This won't matter to the end user of course, as Win64 will run 32-bit apps just as well. In fact they'll be even better, as they'll have access to 4 gigs of RAM as pointed out in the vid.

    So, 64bit is IMO mostly hype to get people to upgrade. Most people don't need it, but the ones who do will sure be glad they can now get it at commodity x86 prices.
  • Charles Welcome Change
    Good points, rhm.

    I have to disagree about the hype, though. Sure, statements like "the 64-bit revolution" just reek of marketing hyperbole, but I'd argue that the species of new applications that 64-bit computing will give rise to is really mostly unknown at this point.

    That's pretty exciting isn't it?

  • rhm
    Making predictions about the end of progress always seems natural but is nearly always wrong in retrospect. However, I do see problems, at least in the short to medium term. One major problem is that disks aren't really getting any faster. Apps that need lots of scratch RAM for calculations will use all that memory that 64-bit allows access to, but what about apps that just use it for storage? It takes forever to save and restore even hundreds of megs of RAM, never mind gigs. Ask anyone who's suspended a virtual machine under VMware or made the mistake of leaving large crash dumps turned on on a machine with 1GB or more of RAM.

    In the long term we hope to get something to replace the impressive-that-it-works-at-all technology known as hard disks, and hope that it is also faster as well as having increased capacity limits. In the meantime I won't be angling for a machine with a TB of RAM unless it has a few hundred disks to feed it in parallel - oh, and enough bus bandwidth to shift all that data about.
  • derekvs I'd rather be flying
    Great, that'll be enough to get started experimenting.  Once I'm ramped up enough, it'll be time to upgrade to an x64 PC anyway!

    As for the comment I read about not needing more than 4GB of RAM, I've supported many servers running applications that could benefit, especially SQL Server.  I think that MSMQ could really gain a nice boost since it seems to suck the brains out of our current servers.

    At the PC level, I constantly have to wait for my desktop to come back from swapping with 1 GB RAM after a few hours of HL2:Counter-Strike.  Video game performance has always been the driving force behind every PC purchase or upgrade that I've ever considered.  I'm sure that I'm not alone in this, so it's my bet that the majority of first owners of x64 PCs will be game junkies like me.

  • derekvs wrote:
    so it's my bet that the majority of first owners of x64 PCs will be game junkies like me

    You're a gamer? I'm surprised then that you don't have an x64 PC already!  Almost all the gamers that I know bought their Athlon64 machines last year already!

    Personally I'm NOT a gamer, but I already had an Athlon64 Socket754 machine in 2004, and since the start of this year I've worked on an Athlon64 S939 (dual channel DDR) machine!
  • Didn't Bill Gates himself once say no one will ever need more than 640KB of RAM?  That's not a typo; that is KB as in kilo, not mega.  I'm sure many people here are too young to remember the days of DOS, and trying to make 640KB work for you.  People are spoiled; 2GB will vanish in a short time.  I doubt 4GB will make it past a decade.

    By 2020, I can see 16GB (or more) of RAM as a minimum standard on every PC/Media Center sold.  (In 2020 we won't have PCs anymore, it will be something much more involved and require far more resources).

    10 years ago I bought 8 MB of RAM for nearly $200, doubling my PC's capacity, and I thought I was riding on top of the world.

    7 years ago 16MB of RAM was the minimum; today anything less than 512MB is a joke.

    As resources grow, developers will create applications to use the resources.  It's a cycle, that has no end in sight.
  • rhm
    Puck wrote:
    Didn't Bill Gates himself once say no one will ever need more than 640KB of RAM?

    Apparently not.
    The crazy thing is that these incremental changes have these huge ramifications.  For example there was a time that people were talking about the impact of CD-ROM, and how you could send CD-ROMs to people and they could do virtual walkthroughs of a house or see a city map. 

    A couple of years later the web had caught on, and shipping a CD-ROM in the mail for this type of digital content almost seems foolish to have considered.

    When I first saw the web in ~1993 the feedback from a lot of people was, why would I use this when I have FTP, archie, gopher, etc.?  This web thing doesn't seem to add much value, plus there's WAY more content over FTP.  How many people even know what gopher, archie, and veronica are today (much less have used them)?

    I personally think GPS (or location tracking in general) is going to change the world in a drastic way in the next decade.  As well as 64-bit  Smiley

    Kang Su Gatlin
    Visual C++ Program Manager
  • nice vid... btw, is MS doing any research on quantum computing?
  • Wil
    It doesn't seem to me that either Part 1 or Part 2 of this video made a very compelling case for why Mom and Dad should want a 64-bit computer.  That's of course because in fact they don't - E-mail, Word, Excel, Intuit, Photoshop, etc., all work fine now in 32 bits.  I think the speakers got closer to the truth in urging developers to climb the mountain of 64 bits simply because, in effect, "it's there".  Intel and AMD will sell mainly 64-bit CPUs starting within the next year or so, so it becomes a moot point whether they're needed.  It won't really change the world - after all, lots of us have been using 64-bit SPARCstations etc. as our main work platform for years and years.  What **may** have a big impact, though, is 64-bit multi-core, multi-processor, distributed, and eventually grid computing.  I'm not saying Mom and Dad will have a Beowulf cluster in the basement to run Intuit and Photoshop faster, but the "utility computing", "Web tone", etc., model of computers everywhere (and seen nowhere) could come to pass, and developers (who are already thinking in terms of 'Can I make this run on a cell phone?' when they plan an app) need to be thinking of 'What if this app had access to arbitrarily many CPUs?' when they do their design.  How about some Channel 9 videos, and a Route64 road show, on 64-bit C++ using OpenMP?
  • leighsword LeighSword
    To application programmers, it doesn't matter how wide the CPU is; only the compiler (and systems programmers) will consider it.
  • Will I eventually be able to write managed 32-bit code while booted into 64-bit mode in Windows? I don't seem to recall losing the ability to write 16-bit applications when I switched from Windows 3.1 to Windows 95, but when I install the 64-bit version of Windows, I can no longer create 32-bit managed code. That in itself meant the machine got flattened. I know I could dual boot, but that isn't exactly an efficient option.

    There are three straightforward ways to write 32-bit managed code in 64-bit Windows:

    1) Use VS2003.  VS2003 installs on 64-bit Windows and will generate code that always runs on the 32-bit CLR.

    2) When using VS2005 you can select that you want the assembly you're generating to be 32-bit only (the default is "ANY CPU", which will float to the 64-bit CLR if one is available).

    3) Use the 32-bit C++ compilers (C++ has different compilers for targeting each platform).  The 32-bit compiler runs on 64-bit Windows and will generate code that runs as 32-bit unless the user specifies /clr:safe.

    Hope that helps,

    Kang Su Gatlin
    Visual C++ Program Manager
  • I should clarify somewhat. The apps I have run into issues with were ASP.NET, IIS on Windows 2003 Server x64 would not run them for me. I posted in the MSDN managed groups and was told that was expected behavior. I obviously need to research further. I will see if I can find my post in the MSDN managed newsgroups, I had a very specific error and was told there would be no 32 bit managed development in the 64 bit OS, and nothing before the 2.0 framework.
  • Here is my original post

    Is ASP.NET 1.1 available on the 64 bit extended version of Windows 2003 
    Server? When I install VS.NET 2003 I then get Service Unavailable from IIS 
    when navigating to the main under construction site (http://localhost) and I 
    get error 'HTTP/1.1 503 Service Unavailable' when trying to start a new 
    ASP.NET project. I uninstalled IIS and reinstalled IIS, and the under 
    construction default page comes up normally, I ran aspnet_regiis -i and then 
    get the same service unavailable error. I get the following errors in the 
    event log.
    A process serving application pool 'DefaultAppPool' reported a failure. The 
    process id was '1876'.  The data field contains the error number.
    A process serving application pool 'DefaultAppPool' reported a failure. The 
    process id was '2904'.  The data field contains the error number.
    A process serving application pool 'DefaultAppPool' reported a failure. The 
    process id was '820'.  The data field contains the error number.
    A process serving application pool 'DefaultAppPool' reported a failure. The 
    process id was '204'.  The data field contains the error number.
    A process serving application pool 'DefaultAppPool' reported a failure. The 
    process id was '2884'.  The data field contains the error number.
    Application pool 'DefaultAppPool' is being automatically disabled due to a 
    series of failures in the process(es) serving that application pool.

    and the reply was that I needed VS 2005 and the 64-bit SDK, and that I could only write 64-bit managed code. Granted, the reply was from an MVP and not an MS person, though no MS person answered my question.

    Thanks for the followup, I am going to wander around and see if maybe there is something I have missed.

  • Not sure this is exactly what you need -- it mentions Windows x64, not Windows Server 2003 x64, but it seems to address the issue:

  • Very nice find. I'll have to throw in a spare hard drive, install the 64-bit version, and give it a whirl...
  • After reading this thread, one trend I can see where 64-bit is necessary for client-side workstations in the next few years is voice-enabled applications. Sure, right now they are available - but I can see another "revolution" with these applications having capabilities that work as well as the Microsoft handwriting recognition right out of the box.

    I would love to be able to start up my computer - or leave it on at all times and just walk into my office and say to my machine, "Computer, start word" and begin to efficiently dictate and when I want to add media to my documents, I can describe what to do - for example:

    "computer add picture from folder My Documents Meeting from last tuesday"

    Or "Computer, please add a picture showing a red BMW and fit into frame"

    These kinds of productivity enhancements require immense processing and memory - 64-bit is going to help quite a bit indeed.


  • 64-bit processors have absolutely nothing to do with expanded memory. It's true that many 64-bit processors support more memory... but this is not a function of their being "64-bit". 64-bit processors have 64-bit-wide general-purpose registers and thus can operate on 64-bit chunks of data. 36-bit memory addressing, which has allowed in excess of 4GB of memory, has been around since the Pentium Pro. In fact, supporting in excess of 4GB of memory is not even new to operating systems... Linux has supported this for quite some time, and Windows 2000 Advanced Server allows it, and that was released long before any x64 processor. The 64-bit craze is highly overrated... especially in terms of performance. In reality it is not likely you will receive a performance increase (just look at the recent benchmarks of the 64-bit version of Half-Life 2, which underperforms its 32-bit counterpart), which is why Microsoft is not hyping 64-bit performance. Also, because pointers are bigger in 64-bit, programs recompiled for 64-bit are going to be larger, eating up more hard drive and memory space.
  • The forgotten problems of 64-bit programs development. Big Smile
  • Why 64-bit? More registers.

    I usually see a 5-15% speedup going 64-bit with time-sensitive code.
    (even with a data set only a few megs in size)

    Quick note: for people who write high-efficiency/performance code, it's not a good idea to ignore the underlying architecture.
    From the number of bits per register to the instruction set extensions... it all matters at the programmer level, not just the compiler.

Comments closed

Comments have been closed since this content was published more than 30 days ago, but if you'd like to continue the conversation, please create a new thread in our Forums, or Contact Us and let us know.