We can actually do this pretty easily with OO today. You can derive all objects from a class that has an in and out queue and a worker thread, then send messages to it and let it process all messages in local scope if it wants. However, I am
not sure this is better than some library (e.g. CCR, PFX) that takes lambdas and does the work inside a closure, with knobs on the library for thread min/max, wait times, throttles, perf mon, etc. The former could be a good design for a
pipeline server (e.g. SQL), but I think the latter is better for general work and things like joins, choice, futures, etc. Room for both, I think.
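The queue-per-object idea above can be sketched roughly like this (a minimal Python sketch of my own, not from any library; class and method names are invented for illustration):

```python
import queue
import threading

class Actor:
    """Base class: each instance owns an inbox, an outbox, and a worker thread."""
    def __init__(self):
        self.inbox = queue.Queue()
        self.outbox = queue.Queue()
        self._worker = threading.Thread(target=self._run, daemon=True)
        self._worker.start()

    def _run(self):
        # The worker drains the inbox; state is touched only on this thread,
        # so handle() needs no locks -- messages serialize all access.
        while True:
            msg = self.inbox.get()
            if msg is None:          # sentinel: shut down
                break
            self.outbox.put(self.handle(msg))

    def handle(self, msg):
        raise NotImplementedError

    def stop(self):
        self.inbox.put(None)
        self._worker.join()

class Doubler(Actor):
    def handle(self, msg):
        return msg * 2

d = Doubler()
d.inbox.put(21)
print(d.outbox.get())  # 42
d.stop()
```

The point of the base class is that subclasses only write `handle()`; the threading and queuing discipline lives in one place, which is exactly what a library with tuning knobs would also give you.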
I would qualify this:
A "process" in something like Erlang is more lightweight and transparent than a Thread in a traditional OO language/library such as C# -- you can create millions, and distribute them across networks easily. Libraries can do similar things but the distribution
might be difficult without integrated support as in Erlang.
There is nothing in C# or the CLR to prevent modifying encapsulated state in an object from the outside: to truly prevent it, every value returned from a function, and every property getter, would need to return something with value semantics, not reference semantics.
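To illustrate the reference-vs-value point (a toy example of my own, in Python rather than C#): returning an internal collection by reference lets callers mutate the "encapsulated" state, while returning a copy gives value semantics.

```python
class Counter:
    def __init__(self):
        self._history = []

    def record(self, n):
        self._history.append(n)

    def history_by_reference(self):
        return self._history          # leaks the internal list

    def history_by_value(self):
        return list(self._history)    # defensive copy: value semantics

c = Counter()
c.record(1)
c.history_by_reference().append(999)  # mutates encapsulated state!
print(c.history_by_reference())       # [1, 999]
c.history_by_value().append(42)       # only mutates the copy
print(c.history_by_reference())       # still [1, 999]
```

The same thing happens in C# whenever a property getter returns a `List<T>` or any other mutable reference type.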
In Erlang at least, the flow of control is generally turned inside out, with the "objects" waiting for a message to arrive rather than sending a message and waiting for an object to return. The .NET framework libraries are generally not designed around this inverted, message-driven style.
This was a great video, thanks!!! While people are going to have to start giving up the shared state model, it does not mean OO will go away completely -- "objects" will be more like independent, encapsulated processes than one giant intertwined memory
structure in a single process. This is a different way of programming but closer to the original idea of object oriented programming.
Thanks, Patrick, Charles, for the great discussion. If Patrick is going to answer any questions in the future, I would be interested in how the CLR and GC will change to work in a world of hundreds of CPU cores, which is not so far away.
The original idea of object oriented programming, as expressed by Kay, was that each object was a computer in miniature. I am interested in how .net will evolve to reduce visible shared state between objects so that concurrent programming can really take off.
To the interviewer, regarding rapid trend changes: it is important to distinguish between "hype" and real trends. Hype would be the marketing buzzwords, or trendy blog topics. A true trend is a steady, long term change in the industry. The hype comes and
goes, and doesn't necessarily address any real need. The trends do address real needs. So true trends rarely change quickly, as I am sure Tony is aware. Blogs are not trends at all.
Whoo-eee! This thread gave me a good laugh several times. It is quite funny to hear the young whippersnappers give it to the old cooters -- and the cooters snap right back.
It is not about age or conservatism. It is about listening to the customer instead of the geeky developer playing video games. The customer often does not give a hoot about the fancy features the developer has planned, and the developer does not feel
like working on the code his predecessor left behind. So there is a fundamental conflict. At MS, I would argue that for development tools, the developer, not the customer, often wins these days.
The customer is more important, always. Avalon has a horrible API, eats far more memory than it should, and is essentially useless on older machines or small devices. Every time someone points out a problem there it is explained away as a "feature" or a "revolution".
This is a general trend I think at Microsoft. Sorry to bash but... there are dull, boring tasks that need to be done (for the customer) and someone has to do them.
It is too bad over-engineered APIs like Avalon are fully baked before hitting the usability lab for developers. At that point it is too late to fix the usability problems; they are embedded in the architecture.
Give all the developers machines with very limited memory and CPU power, and suddenly all the new libraries and development tools would be scalable and easy to learn. Visual Studio would run speedily; Avalon would actually be usable with 100K elements. That is
the best usability tool for getting rid of gold-plating and inefficient code.
What a nice guy! I hope the Visual Studio team sees this, and starts thinking harder about accessibility. Blind programmers can be as productive as sighted ones, since most programming and debugging is text based. Only the tools are lacking.
I would like to hear Christopher expand on this transactional programming idea for application programming (not stored procedures) for simplifying hard exception handling, such as OOM. What sort of concepts would the developer be dealing with? How would
the application be structured for "rip-down" componentization?
How would the conflict between declarative and imperative development styles be resolved?
This sounds great, but what exactly is this non-traditional execution Christopher is speaking of? AOP or a derivative?
I see there is a great deal of power at that level, but I am not sure what might be done there.
One thing I would like to see is method tracing. Selectively turn on a trace-log using regular expressions matching member names, and type/namespace filtering, with indenting according to call depth, and possibly logging argument values as well. I am used to
using this type of tracing in older object oriented environments.