The Design-by-Contract support (Code Contracts) is of course very Eiffel-like; even the old-value semantics is there. The syntax is quite bloated by comparison [as leriksen71 says], but having the semantics at all is a first step. Tail recursion optimization is also a welcome improvement.
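To illustrate what "old-value semantics" means in Design-by-Contract: a postcondition can refer to a value captured before the method ran (Eiffel's `old`, Code Contracts' `Contract.OldValue`). A minimal sketch of the idea in Python, using a hypothetical `contract` decorator (the decorator, the `Counter` class, and all names here are invented for illustration, not any library's API):

```python
import functools

def contract(pre=None, old=None, post=None):
    """Hypothetical Eiffel-style contract decorator:
    'old' snapshots state before the call, 'post' checks against it."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(self, *args, **kwargs):
            if pre is not None:
                assert pre(self, *args), "precondition violated"
            # capture the 'old' value before the body runs
            snapshot = old(self) if old is not None else None
            result = fn(self, *args, **kwargs)
            if post is not None:
                assert post(self, snapshot), "postcondition violated"
            return result
        return inner
    return wrap

class Counter:
    def __init__(self):
        self.count = 0

    @contract(pre=lambda self: self.count >= 0,
              old=lambda self: self.count,
              post=lambda self, old_count: self.count == old_count + 1)
    def increment(self):
        self.count += 1

c = Counter()
c.increment()
print(c.count)  # 1, and the postcondition verified count == old(count) + 1
```

The real Code Contracts syntax is heavier than this (hence the "bloated" complaint), but the runtime shape is the same: snapshot, run the body, check the postcondition against the snapshot.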
Flash can play back dynamically generated audio. Can I do that with Silverlight 2.0?
I've always wanted to do an IL-to-IL optimizer. While the JIT does a good job, it doesn't have time for in-depth analysis, and there are a number of things you can do upstream to boost performance.
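As a toy example of the kind of upstream work a time-pressed JIT might skip: a peephole pass that folds constants in a stack-machine instruction stream. This is a sketch on a made-up instruction encoding (tuples like `("ldc", 2)`), not real CIL, and `fold_constants` is an invented name:

```python
def fold_constants(il):
    """Toy IL-to-IL peephole pass: rewrite 'ldc a; ldc b; add'
    into a single constant load, repeatedly, in one linear scan."""
    out = []
    for op in il:
        if (op == ("add",) and len(out) >= 2
                and out[-1][0] == "ldc" and out[-2][0] == "ldc"):
            b = out.pop()[1]
            a = out.pop()[1]
            out.append(("ldc", a + b))   # fold the addition at compile time
        else:
            out.append(op)
    return out

prog = [("ldc", 2), ("ldc", 3), ("add",), ("ldc", 4), ("add",), ("ret",)]
print(fold_constants(prog))  # [('ldc', 9), ('ret',)]
```

Because folded results land back on `out`, a chain of additions collapses in a single pass; a real IL optimizer would of course have to respect evaluation-order and overflow semantics before doing this.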
And that's what all optimization freaks crave: conceptual simplicity without sacrificing too much performance [declarative programmer, imperative compiler].
I haven't studied IL and IR, but would the IR actually be a better IL? It sounds like it could be.
If IL is at a lower level than IR, and an IL-to-IL optimizer may have to abstract back up to IR anyway, wouldn't it be better to just stay with IR, depending on the effort required to go IR->IL?
I wonder if the TCPA could be used to secure highly optimized snapshots of compiled code [in encrypted files], so the JITter could effectively be relieved of a lot of up-front work. Of course, there's NGen, which might do some optimizations up-front.

AndyA wrote:
As far as what all those cores will be doing -- I expect we will find good ways to employ them to directly address user problems. Phoenix itself can profitably use 6-8 cores, and with a bit more work we should be able to scale even higher.
Sounds great.

AndyA wrote:
It may be that the world of code is more dynamic in the future, but I thought that 10 years ago, when I worked on a big static compiler, and things haven't really changed that much.
I didn't mean in the sense of dynamic languages (necessarily), more in the sense of JIT'd bytecode.
Having processor makers produce plugins for Phoenix sounds quite compelling, for Phoenix itself, Microsoft, the processor makers and the users.
And it would be great to have more static IL optimizations. On the other hand, in the parallel world of the future there should be enough cores to continuously GC, profile, and analyze code, so one wonders how far Phoenix can adapt to dynamic compilation and analysis demands.
Anyway, cool stuff.
Brian Beckman: Project Quark - A New Beginning for Quantum Computing Rises from the Ashes of Theoret
Apr 01, 2008 at 3:47PM
I find your lack of faith disturbing...
Expert to Expert: Erik Meijer and Bertrand Meyer - Objects, Contracts, Concurrency, Sleeping Barbers
Mar 27, 2008 at 11:48AM
HOLY CRAP! I have to watch this!
Feb 13, 2008 at 11:44AM
I think most of us managed to get the meaning. Anyhow - excellent video!