I am sorry, I don't like it: this Silverlight-based implementation is awfully slow and eats all my computer's resources. I don't see the point of this sort of implementation when the user faces slow performance and, in general, a bad experience. Sure, it looks pretty, but it is slow, slow, slow.
OK, it's a beta, but even if they manage to make it perform faster, Silverlight will keep sucking up my computer's resources like hell, as it does for everything else. Silverlight is like Flash, a web technology that should not exist because it is just a pain to use. Instead of pushing this proprietary technology, Microsoft should join Apple, Google, Mozilla, and everyone else besides Adobe to develop and promote open standard web technologies.
"with things like Silverlight you could develop an app that runs on all these platforms, but there isn't a good distribution channel, at least not across all the platforms.."
HTML5, the new HTML standard being developed by the WHATWG and the W3C, is what they want.
"things like the Office addin gallery, Expression and VS addins could also be integrated in such a place. it would eclipse the app store in like a week"
This is blah, blah. I mean, look at it: Apple announced yesterday that they have sold more than 1.5 billion apps in one year (yes, 1.5 billion!!!), they have more than 65,000 apps in one year, and they have more than 100,000 developers building apps for the iPhone platform in one year.
And what do you say? You are talking about Office addins, Expression or VS addins; those are toys. How long has Windows Mobile been on the market? Years, and Microsoft could not figure out how to sell apps to Windows Mobile users; in the first place, they did not figure out how to make a platform attractive to users and developers. So now, once someone else has done it and proved that people are willing to buy and use apps on their phones, Microsoft, again late to the game, can do no more than rip off Apple's App Store. I mean, look at the Marketplace app: where are the new ideas, the innovation? Even the interface is similar to what Apple came up with in their app. So what?
Call me an Apple fanboy or a Microsoft hater if you want; the fact of the matter is that some blind Microsoft fanboys (who rarely look outside their closed Microsoft world) should see the truth, the reality: Microsoft can't come up with original ideas that they can sell to people, ideas that can be successful, that people love, along with the products that come from the innovation. And I have to say that I am sad about that; a multi-billion-dollar company that keeps selling a crappy system for phones is just disappointing.
Like it or not, Apple got 100,000 developers on its platform because it is innovative, modern, fun, and easy to use. What is Windows Mobile today? Just crap, it is just crap!! Be honest; everyone with a minimum of honesty comes to the same conclusion. Windows Mobile is old, boring, lacks modern technologies, and is simply behind the competition. Even Palm, with far fewer resources than Microsoft, managed to build an attractive platform with webOS. It just blows Windows Mobile away. Sure, it is heavily inspired by the iPhone, but they did something unique for the specific segment of the mobile market they want to go after.
People outside and inside Microsoft have to accept it: they have a crappy OS for phones, and as long as they pretend that everything is fine, we will never see anything good coming from Microsoft in the mobile space. And oh yes, there is Windows Mobile 6.5, and so what? It is awful; it is just a quick hack to keep Microsoft's head above water in the mobile space. And this Marketplace is nothing; it is just running behind the competition, just so Microsoft can say it does it too.
"There is this notion of server GC and workstation GC, ...... this distinction came from the good old days when only servers were multi-core; no one had a multi-core box as a client"
This is plain wrong, really wrong. Workstations, which of course are not servers, have had multiple processors (basically multiple cores, but on separate chips) for years, back in the nineties. You could get boxes (they were terribly expensive) from Digital, Sun, HP, SGI, and IBM with Alpha, SPARC, PA-RISC, MIPS, and POWER processors respectively, configured for 2-way or 4-way processing. Basically, when those processors became capable of multiprocessing, they made their way into servers and high-end workstations at around the same time. And by the way, all those processors were 64-bit.
You could even get cheap dual-G4 machines from Apple back in 2001, when no one else was talking about multiprocessing in the personal computer world (the workstation market was considered a high-end market distinct from the personal computer market). This was the time when Apple was trying to convince the PC industry (and Intel) that chasing gigahertz was pointless and would have to come to an end. It eventually happened....
Taking advantage of multiprocessing is not a new problem; it is a more important problem now because the number of customers and developers affected by it is larger, as the whole industry has embraced multiprocessing as the main path to higher performance. So we will see more and more cores that we need to use.
Before, the problem was limited to people doing rocket science who needed to reduce the computing time of large data sets, to develop complicated visualization apps, or anything else involving high performance. They had (and still have) to write concurrent code for 4 or 16 cores. From now on, we will have to write concurrent code for 64 cores, and for even more in the not-too-distant future, making the problem even bigger.
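To make the scaling point concrete, here is a minimal sketch of spreading independent work across however many cores a machine reports, using Python's multiprocessing module as a neutral illustration (the per-cell computation is made up for the example):

```python
import multiprocessing as mp

def simulate_cell(pressure):
    # Toy stand-in for an independent per-cell computation
    # (a real fluid model would do far more work here).
    return pressure * 0.98 + 1.0

if __name__ == "__main__":
    cells = [float(i) for i in range(100_000)]
    # Use every core the machine reports, whether that is 4 or 64.
    with mp.Pool(processes=mp.cpu_count()) as pool:
        results = pool.map(simulate_cell, cells)
    print(len(results))
```

The same code runs unchanged on a 4-core or a 64-core box; the hard part, as discussed above, is structuring real workloads so they decompose this cleanly.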
But again, saying that multiprocessing was limited to servers is totally wrong; it appeared on the client many years ago, back when multiprocessing was not a big player in the Windows world.
Some Microsoft people should really go out sometimes, really...
WebSlices a new feature? You are kidding, right? I mean, it would be nice if the folks at Channel 9 went out a little to see what is going on. WebSlices is nothing more than a rip-off of Web Clip, a feature introduced in Safari for Mac OS X 10.5. At least the IE folks could be honest and clearly admit that they took the idea from Safari and ripped it off. That won't kill them....
I do respect the work that Rashid did when he was at CMU; he is a big expert in OS development. But pretending that he is the main inventor of Mach is really not honest, and the idea that Apple magically used his "work" to build Mac OS X is quite ridiculous. Mac OS X's father and architect, Avie Tevanian, is also a principal architect of Mach and is actually known to be its principal coder. Just look at the Mach source code used in Mac OS X (the Mac OS X kernel is open source); the files refer to Avie Tevanian as the programmer. It is also not surprising that Avie decided to use Mach as the kernel foundation for Mac OS X; he knew where he was going and what he was using. In other words, Rashid deserves less credit for the development of Mac OS X than he claims, even if he was involved in Mach. Sorry, I think the truth has to be told. Yes, Rashid is an important person in the development of Mach, but he failed to say that he is far from the only one; he should have mentioned at least Avie. I mean, history should be told properly or not at all........
The Mac a closed platform? Well, OS X supports all the OPEN and STANDARD video and audio codecs, which Windows does not; Safari has broad support for web standards, which IE does not (remember SVG, for example!!! or the Acid test!!!); the kernel of OS X is OPEN SOURCE AND FREE (I don't need to mention Windows here, do I?); any kind of Unix/Linux application can run natively on OS X, which of course Windows cannot do; and any kind of Windows application can run on the Mac (think about virtualization solutions)....... and this guy is calling the Mac a closed platform....... Hmmm, I smell BS....
I would like to correct something. It is absolutely wrong to say that processes can just quit or disappear on the Mac without the user knowing what is happening. This is plain wrong, and the way you say it in the video sounds just like a big lie. I don't know about Linux, but I do know that OS X detects any process that has crashed or did not terminate properly and notifies the user. Not only does it notify the user that a given application has terminated abnormally, it also allows the user to send Apple a complete bug report for the application. So no, please don't assert things that you don't know; not only is it incorrect, but don't try to make people believe that Vista is the first OS to do this. I mean, the Mac has been doing such things since Panther, way back in 2003!!! Give credit for it....... be fair.
I don't really think that those two guys understand how user accounts work on the Mac. Pretending that the Mac has a simple log in/log out that puts you in an administrator account with full privileges is just a big lie to the face of the camera. Those guys do not seem to know that OS X is a Unix-like system, and for this reason it uses exactly the same model for user accounts. It works as follows: OS X, like Unix or Linux, uses three different levels of permissions:
- The super user, or root. If a user logs in as root, he has full privileges, full power to modify anything in the OS. He can modify vital OS files or directories without any prompt. Well, full power!!! Root is BY DEFAULT deactivated in OS X, Linux, and any other Unix. The user needs to activate the root account manually by providing the admin password. Most users on the Mac don't even know that such an account exists; only Unix users know how to activate it.
- The administrator account. This is the owner's account. When people install a new version of OS X or Linux, or buy a new Mac, this is the default account created by the system. Why? These are multi-user OSes, so the system needs to create at least one administrator account so that the owner can manage the machine. Of course, the owner can drop the admin privileges if he/she wishes. However, the admin account works quite differently from Windows admin accounts. On Unix, an admin user can indeed manage the system, set preferences, etc., but he does not have full freedom to modify the system. If an admin user tries to modify any vital OS files or directories, he will be prompted first. The idea is that you get the power to change things, as you hold the password to administer the machine, but the system does not give you full freedom to do anything you want before making sure that it is really what you want to do. If you try to install an application that puts files in protected directories, an admin will also be prompted before doing so. The difference from Windows is that the admin account on Unix does not open all doors as it does on Windows; no vital change can be made without entering a password, even if you are logged in as admin. The admin account on Windows is more similar to the super user on Unix. That means a worm or virus will not be able to modify any protected files or directories without the approval of the user, even in an admin account; if it tries to do so, the system will prompt the user. On Windows, under an admin account, it just goes through without the user noticing that something is changing the system. It also means that the admin account under OS X (Unix) is more secure than the one in Windows because, again, yes, you are logged in as an admin, but the system will still ask you to enter a password if you try to do something dangerous to the system.
- The non-admin account. This is the default kind of account created besides the initial admin account. Any further account created on OS X is by default a non-admin account, i.e., with the smallest privileges. So I don't get why one of those guys says that user accounts on the Mac are by default admin accounts; no, they are not. Only the original account created after installing the OS, or when starting up the Mac for the first time, is an admin account, for the reason I explained. Other accounts are created non-admin by default, with the smallest privileges. That means a user in such an account cannot configure the system or change any directories shared between users, like the Applications directory; he can only change what is inside his home directory.
This is quite a quick explanation of how it works, but man!! this is basic Unix. I just cannot understand why those two guys seem to know so little about how accounts work on Unix, and particularly on the Mac. Again, OS X uses the Unix model that I laid out. Trying to make people believe that logging in as an admin on the Mac is the same as on Windows just shows that he really doesn't know what he is talking about. Not surprising that UAC is quite badly implemented.
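The permission check behind all of this can be sketched roughly like so; a toy illustration of the Unix owner/group/other model in Python (it ignores supplementary groups and ACLs, and the function name is made up for the example):

```python
import os
import stat

def can_modify(path):
    """Rough sketch: may the current user write to path, following
    the standard Unix owner/group/other permission bits?
    (Ignores supplementary groups and ACLs.)"""
    st = os.stat(path)
    uid, gid = os.geteuid(), os.getegid()
    if uid == 0:                          # root bypasses permission bits
        return True
    if st.st_uid == uid:                  # owner bits apply
        return bool(st.st_mode & stat.S_IWUSR)
    if st.st_gid == gid:                  # group bits apply
        return bool(st.st_mode & stat.S_IWGRP)
    return bool(st.st_mode & stat.S_IWOTH)  # everyone else

# /etc/hosts is typically owned by root and writable only by root,
# so for an ordinary (non-root) user this typically prints False.
print(can_modify("/etc/hosts"))
```

That last check is exactly why a worm running in a non-root account cannot touch system files without first getting past an authentication prompt.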
If I look at this video, I cannot stop myself from thinking that there is nothing new here, and I am quite surprised to see that Microsoft is so late in this. Yes, Apple (now I am sure that many Windows fanboys will call me a Mac troll, but anyway!!!!) has been doing a lot of work on data parallelism for many years now. I mean, Apple has been working for years on APIs for SIMD programming that provide data parallelism for image processing, scientific applications, signal processing, math computing, etc..... This API is called the Accelerate framework, and it just does all the work for the developer. No need to worry about which architecture your program will run on (PowerPC or Intel); the API does the optimization for you, the vectorizing for you, and the architecture-dependent optimization for you. No need to worry about data alignment, vector instructions, etc... It just provides the whole abstraction, and this is certainly why SIMD computing has been far more widespread on the Mac than on Windows. On the PC you could use Intel's vectorizing tools, but that's expensive, and still the level of abstraction is not as high as a developer would like it to be.
Now, talking about GPU processing, I cannot see anything impressive in this video. Apple (yes, Apple again, sorry!!) is already offering TODAY (not as a research project) an object-oriented API for high-end applications and data-parallel computing. CoreImage and CoreVideo do just that. They provide an abstraction model that gives developers a model for GPU programming. CoreImage uses OpenGL and the OpenGL Shading Language and works on programmable GPUs. Developers do not need to know how the GPU works or how OpenGL works; CoreImage and CoreVideo provide all the abstraction with an object-oriented programming model built with Cocoa. You don't need to know about graphics programming or computer graphics mathematics either; CoreImage/Video abstract all of these. Moreover, CoreImage/Video do the optimization on the fly for a given application, depending on the architecture the program runs on. They optimize and scale performance depending on the resources you have. In other words, they optimize for the GPU if the hardware allows it; otherwise they optimize for AltiVec (SIMD computing) on the G4/G5 or for SSE on Intel. They will also optimize for multi-processor or multi-core machines if they need to and can do so. CoreImage/Video also provide a set of built-in Image Units that perform common graphical effects: blur, distortion, morphology, you name it, all running on the GPU. CoreImage/Video use a non-destructive mechanism and 32-bit floating point numbers. The architecture is completely modular; any developer can build his own Image Unit. Anyone can download a test application named "FunHouse" in the Apple development tools that performs REAL TIME image processing using the GPU. Much more impressive than their demo, I would say. And more importantly, high-end applications like Motion and Final Cut Pro 5's Dynamic RT technology leverage CoreImage and CoreVideo; you get real-time graphics and video processing!!! So I don't really think that what is shown in this video is new or a breakthrough (sorry!!!!), particularly when it is still a research project while CoreImage and CoreVideo already do even more and have been available for more than a year now.
I would really advise people interested in Accelerator to have a look at CoreImage and CoreVideo too; they will find state-of-the-art GPU-based data processing and data-parallelism technology. It's not the future, it's now....
Last point: there is something in the video that I don't agree with. One of the guys said that scientific computing could be done on GPUs. I don't really think so, or at least it depends on your needs. I am a geophysicist, specializing in fluid modelling and continuum mechanics. In most (if not all) scientific modelling work, double precision math is required to achieve acceptable precision in the results. The problem is that GPUs do not support double precision floating point numbers in their execution units. They provide only (so far!!) single precision math, as that is enough for 3D modelling and games. What I mean is that the vector units in the GPU (yes, GPUs use a SIMD model for their execution units; that's why they can achieve such a high order of parallelism in data processing) only support single precision floating point numbers. This is not enough for most scientific applications today. Now, there is a lot of research out there on how to use GPUs for non-graphical calculations involving large data sets, but so far nothing really usable for scientific computing. Apple had a similar problem with AltiVec because it does not support double precision floating point vectors, which prevented the G4 from providing vector computing for double precision floating point numbers. Some of the Accelerate APIs can do some double precision operations on AltiVec, but they were limited to specific operations like the double precision Fourier transform. The GPUs therefore have a similar problem: they do not scale well for double precision floating point computing, which limits their use in scientific computing. On the other hand, this does not mean that interesting work cannot be done with GPUs outside of the graphics world. There are proposals for taking advantage of GPU power to encode or decode MP3 files, MPEG-4 files, etc... Some ATI cards do some H.264 decoding in hardware, but we could imagine using the GPU to also encode H.264. Another application is, of course, animation. Animation requires a lot of data-parallel computing, and GPUs can help a lot with that. Leopard's Core Animation is a good example of what can be done.
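The single- vs. double-precision gap is easy to demonstrate. A quick sketch, simulating single-precision storage with Python's struct module (pure illustration of rounding behaviour, not of how a GPU actually computes):

```python
import struct

def to_single(x):
    # Round-trip a Python float (64-bit) through a 32-bit float,
    # mimicking what single-precision-only hardware would keep.
    return struct.unpack("f", struct.pack("f", x))[0]

x = 1.0 + 1e-9   # the kind of small perturbation a fluid model cares about
print(x - 1.0)             # double precision keeps a ~1e-9 difference
print(to_single(x) - 1.0)  # prints 0.0: single precision rounds it away
```

Single precision carries roughly 7 decimal digits against roughly 16 for double, so a perturbation at the 1e-9 level simply does not survive a trip through a 32-bit float; that is exactly why it is not enough for most scientific modelling.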