Alan Kay

Alan Kay (alan.nemo)

Niner since 2009

  • Jason Olson: Composing Programming Languages, F# and OO

    Good question. I'm not a big fan of any of them, including the ones I've designed.

    I think we (the programming language field) have yet to get to a sweet spot that matches up with the scales, aspirations, and life cycles we want to deal with.

    That sweet spot yet to come will certainly have a very strong and dynamic notion of modules, and one part of the notion will almost certainly involve both protection and intermodule communication. But I don't think we need to be religious about the past (even the parts of the past that were done better than today).

    My main complaint about most language use today (including the languages you mentioned) is that (a) the very weak "data structure and procedure" style is retained (via getters and setters, etc.), and (b) the CPU is used to set the notion of "time passing", so that bad synchronization schemes have to be employed. This leads to fragile, very large, inscrutable, and intractable systems.
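
    A minimal Python sketch of the contrast being criticized here (the account example and its method names are illustrative assumptions, not anything specified in the comment):

        # "Data structure and procedure" style: state is exposed through
        # getters/setters and the logic lives in whoever calls them.
        class AccountRecord:
            def get_balance(self): return self._balance
            def set_balance(self, b): self._balance = b

        # Message/behavior style: the caller asks for what it wants done
        # and the object keeps its own invariants.
        class Account:
            def __init__(self, balance): self._balance = balance
            def withdraw(self, amount):
                if amount > self._balance:
                    raise ValueError("insufficient funds")
                self._balance -= amount
                return amount

        record = AccountRecord()
        record.set_balance(100)
        record.set_balance(record.get_balance() - 30)   # caller manipulates raw data

        account = Account(100)
        account.withdraw(30)                            # caller sends a request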

    It was known in the 70s how to eliminate both these problems. (But not all the problems involved with scaling and very high level expression.)

    So, to me, the issue is much more "how should we design and program systems that must live in an ever-growing 'ecology' of ongoing systems?"

    Besides using the good 70s solutions to the two problems alluded to above, we could also think about the idea of protected modules which don't send explicit messages but use some form of efficient and general publish and subscribe.

    Imagine that each module in "a space" exhibits two kinds of "bulletin boards": (a) one that tells the space what the module needs as resources to do its job, and (b) one that exhibits what the module has produced. A "super-Linda" kind of matcher can be used to automatically do the brokering. This is more late bound and has more possibilities for graceful scaling (because objects are not dealing with targets but with needs).
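
    A rough Python sketch of the "bulletin board" idea above (the Space class, the tag-based matching rule, and the thermostat example are illustrative assumptions; a real "super-Linda" matcher would broker on much richer descriptions than a single tag):

        class Space:
            """Modules post needs and offers here; a matcher brokers them."""
            def __init__(self):
                self.offers = {}   # tag -> values a module has produced
                self.needs = {}    # tag -> callbacks awaiting a resource

            def offer(self, tag, value):
                # Bulletin board (b): exhibit what the module has produced.
                self.offers.setdefault(tag, []).append(value)
                self._match(tag)

            def need(self, tag, callback):
                # Bulletin board (a): exhibit what the module needs to do its job.
                self.needs.setdefault(tag, []).append(callback)
                self._match(tag)

            def _match(self, tag):
                # Brokering: producers and consumers never name each other.
                while self.offers.get(tag) and self.needs.get(tag):
                    value = self.offers[tag].pop(0)
                    waiting = self.needs[tag].pop(0)
                    waiting(value)

        space = Space()
        space.need("temperature", lambda t: print("controller got", t))
        space.offer("temperature", 21.5)   # prints: controller got 21.5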

    Similar mechanisms can be used to completely separate "meanings" from "optimizations" so that semantic debugging, testing and validation can be done independently of attempts to make things go faster.
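
    A minimal Python sketch of one way to keep a "meaning" separate from its "optimization" (the sorting example and the equivalence check are illustrative assumptions, not a description of the mechanisms meant above):

        def sorted_meaning(xs):
            # The meaning: an obviously correct definition, written for
            # clarity and semantic debugging, not for speed.
            xs, out = list(xs), []
            while xs:
                m = min(xs)
                xs.remove(m)
                out.append(m)
            return out

        def sorted_optimized(xs):
            # The optimization: free to change as long as it matches the meaning.
            return sorted(xs)

        def check(xs):
            # Testing and validation happen against the meaning alone.
            assert sorted_optimized(xs) == sorted_meaning(xs)

        check([3, 1, 2, 2])
        check([])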

    One of the mottos from Xerox PARC was "Math Wins!", and I think this is still the case. By "math" I mean mathematical thinking and making up new mathematics when it will help problem solving. It is possible to make meta systems in which the new maths can be turned into programming systems and run. This is somewhat equivalent to "escaping from bricks to inventing arches" when things start to get too complex. This is one of the most powerful properties of computers, but it is rarely used to the extent needed and possible in most of today's systems.

    And so on. It's not that this stuff is easy or solved, it's that too many people in the field are simply trying to *cope*, where what we need are lots of people trying to make *real progress*!

    Cheers,

    Alan

  • Jason Olson: Composing Programming Languages, F# and OO

    I couldn't agree more that credit should be given where it is due. Therefore, it should be incumbent on those who believe this to actually do the work to track down just where the credit should be given. I did this when I was asked to write "The Early History of Smalltalk" for the ACM "History of Programming Languages" (HOPL II), which is available online and as a book chapter.

    As I mentioned in that history, parts of the object idea had been around for quite a while (and the class and instance idea could be traced back to Plato's "Ideals"). As far as I could tell, this idea was used in a recognizable but somewhat weak way on the US Air Force Burroughs 220 file system for Air Training Command as early as 1960 or 1961; much more powerfully in Bob Barton's Burroughs B5000, first described publicly at the WJCC in 1961 (the paper is online); and in Ivan Sutherland's Sketchpad system done in 1962 (which, like the B5000, also introduced several more deeply seminal ideas).

    These ideas predated the first Simula (ca. 1965) and its first paper in English (1966), which did not reference any of the above, and also predated the next Simula (67), which had a stronger notion of "class".

    Also, it is important to realize that the two Simula guys were completely different personalities with completely different attitudes about "objects". Dahl was very conservative and didn't like the idea of objects (he preferred data structures, and his use of Simula classes was primarily to make what are now called Abstract Data Types). Nygaard, on the other hand, was a wonderful infinite-horizon creative thinker who not only loved the idea of objects, but was the most enthusiastic person we ever demoed Smalltalk to, later in the 70s.

    As I mentioned in the history, my epiphany came in Nov 1966 after encountering Sketchpad and Simula in my first week of grad school. This combusted with my background in mathematics and biology, and with ARPA's plans to make a distributed ARPAnet, to provide a glimmer of a universal structuring idea that was simply to recapitulate and recur on the idea of "computer" itself.

    Our contributions to this already years old research field were to generalize and scale and liberate: everything could and should be an object (so classes should be objects, elementary entities should be objects, objects should be more biological, the Internet that was being worked on should be completely made from encapsulated entities, the underlying object system should be made from objects, all should be late-bound like the Internet was going to be, and so forth).

    And this allowed bad old ideas like operating systems, applications, and even programming languages to be dispensed with. This worked very powerfully and compactly at Xerox PARC, but is not a style that has been adopted in the commercialization of the Xerox PARC ideas since 1980.

    Best wishes,

    Alan Kay