Well. Don't be put off if this interview starts out a little dry, or abstracted too far from software engineering: there are some amazing gems in there for anyone who has ever had to manage or schedule a team of developers to deliver a task - or has worked in such an environment. I'd say that's about 100% of us!
Very interesting. I particularly like the strong type inference. It still feels like scripting, but with the advantages of static typing for tooling (if not performance). Nice lightweight type syntax too, unlike Closure. It will be interesting to see how it measures up against Dart, but it's certainly more ambitious than CoffeeScript.
Messy, all over the place, poorly structured. Fabulous energy and great enthusiasm. Inspirational really.
Just sad it took me so long to get around to watching this. It's just so fantastic to see these guys sparking off each other - two towers of Channel 9. Titans! Far better than what we normally have these days.
Great interview. It's been too long since Anders has been on C9.
Someone recently asked me to describe Scala... I said it was like a C# version of Java.
It was somewhat tongue-in-cheek, but the thing is that for the most part C# code looks very clean, as does Scala code, and the addition of lambdas and closures to the language has made a huge difference to the way people write production code. Look at most of the modern libraries being written, like the TPL etc., and see the heavy use of lambdas. All those design patterns built around single-method interfaces, which took loads of boilerplate code (so they were rarely used), now boil down to simple functional patterns. In some ways, some of the async examples that get shown look old-fashioned by comparison, because async is often demonstrated in combination with things like for loops... which really look like a cumbersome way to iterate these days: hardly the poster child for imperative programming, non-local returns etc. notwithstanding.
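To make the single-method-interface point concrete - sketched here in Python rather than C#, and purely illustrative - this is the before/after: a named strategy class for one method versus a lambda (or any plain function) doing the same job.

```python
# Illustrative sketch: a single-method "strategy" interface versus a lambda.
from abc import ABC, abstractmethod
import functools

class Comparer(ABC):                      # the boilerplate-heavy version
    @abstractmethod
    def compare(self, a, b): ...

class ByLength(Comparer):                 # a whole named class for one method
    def compare(self, a, b):
        return len(a) - len(b)

words = ["pear", "fig", "banana"]

# Old style: instantiate the strategy object and adapt it for sorting.
old = sorted(words, key=functools.cmp_to_key(ByLength().compare))

# New style: the function *is* the single-method interface.
new = sorted(words, key=len)

assert old == new == ["fig", "pear", "banana"]
```

Once the interface is down to one method, the class ceremony adds nothing - which is exactly why these patterns finally see heavy use now that lambdas are cheap to write.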
Another good article in the series. I need to go back and look more closely at the fixed-point operator, as I got a bit out of my depth when we got onto implementing recursion in the interpreter... but that's the whole point I guess: if I didn't get out of my depth there would be nothing to learn!
Talking of the fixed-point operator/combinator, it reminded me of Bart De Smet's post on implementing the trampoline for safe recursion in C#:
where he uses a fixed-point combinator. I'm not sure I'd want to implement that in production code, as none of my colleagues would understand how it worked! I'd have to bow to simplicity and introduce local mutation, using an explicit stack rather than the function call stack. It's a very interesting article though.
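For what it's worth, the trampoline idea itself doesn't need the fixed-point combinator - a minimal sketch (in Python here, not De Smet's C#): the recursive call is returned as a thunk instead of being made directly, so the work runs in a loop on the heap rather than on the call stack.

```python
# Illustrative sketch of a trampoline for stack-safe recursion.
import sys

def trampoline(f, *args):
    result = f(*args)
    while callable(result):       # keep bouncing until we get a real value
        result = result()
    return result

def countdown(n):
    if n == 0:
        return "done"
    return lambda: countdown(n - 1)   # a thunk, not a direct recursive call

# Far deeper than the default recursion limit, with no stack overflow:
assert trampoline(countdown, 10 * sys.getrecursionlimit()) == "done"
```

This is the "bow to simplicity" version: an explicit loop doing the work the function stack would otherwise do.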
I used to find maths interesting. At some level I liked the mechanical process of putting together simple building blocks to get more interesting results. However, at some point I lost this interest and started to find maths pretty boring, as there was a big gap between abstract maths and interesting effects... and therefore feedback! The result was that I focused more on the application of maths (electrical engineering) and then moved into computing.
In computing I completely regained my joy of building bigger systems out of smaller building blocks, and solving things from first principles. However, despite computing's deeply rooted history in maths... it's almost as if the maths is invisible at times. The processes are very similar, but I sometimes wonder how all the maths in computing has been hidden so successfully by the programming languages we use. In many cases almost by design! (like COBOL or BASIC)
It's a shame, really, that making programming accessible seemed incompatible with moving programming closer to maths. I wonder whether that tells us more about programming, or about the state of maths education... or the fact that ultimately so many people find maths boring even though they are good at it.
Of course, building big systems these days is more like managing a big building construction site, hence the term architect, so even at that level there's a conceptual gap between programming and systems.
Thanks Brian for another very interesting, and entertaining interview.
Very interesting. Good to see more of Erik behind the scenes. I must say he's an inspirational character.
My career has gone in the opposite direction... historically I was only interested in theory with applications, and theory that was therefore just good enough for the problem space, i.e. a typical engineer. These days I'm finding I missed out on so much theory that way and went too quickly to the solution. I've always drummed into developers to step back from the problem and take time to think: take things up a level or two before coming back down. Ask how I would like to solve this problem before deciding how to solve the problem. All this time I was not applying that myself to the higher-level concepts in programming. Still, I'd never like to be purely academic: I like the feedback of seeing things actually doing something.
With all the new posts going up on Channel 9 these days, I find many of them bland... I confess to filtering by Charles' posts and then looking through those for ones worth watching!
On the delegate D D(D d);
There was one of the Functional Programming episodes where Erik mentioned that all programs (should that be expressions?) could be compiled into the SKI combinators. However, he then said that really you need only one, but frustratingly never elaborated.
Is this function of the untyped lambda calculus the ONE?
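For what it's worth, my understanding is that the delegate D D(D d) is the C# type of untyped lambda terms (a function from D to D), not itself the single combinator. S and K alone already suffice (I = S K K), and there is indeed a known single combinator - iota, defined as iota x = x S K - from which both S and K can be recovered. A sketch of this, with the combinators encoded as curried Python functions:

```python
# Illustrative sketch: the S, K, I combinators as curried functions.
S = lambda x: lambda y: lambda z: x(z)(y(z))   # S x y z = x z (y z)
K = lambda x: lambda y: x                       # K x y = x
I = lambda x: x                                 # I x = x

# I is definable from S and K alone: S K K behaves as the identity.
assert S(K)(K)(42) == 42

# iota: a single combinator sufficient on its own; iota f = f S K.
iota = lambda f: f(S)(K)

# iota applied to itself behaves as the identity (iota iota = I),
# and S and K can likewise be rebuilt from iota alone.
assert iota(iota)(7) == 7
```

So "the ONE" is usually presented as iota (or similar one-point bases), built on top of exactly the D -> D shape that the delegate captures.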
This is wonderful stuff. I agree with the other commenter who said that they no longer watch TV since discovering this type of material on Channel 9; me too!
This really shows the similarity between applied mathematics and programming. What I hadn't realised until relatively recently (since I started looking at functional programming) was that with FP there is a strong relationship between programming and pure maths. Reasoning about programs in algebraic form (equational reasoning) is something I find amazing. I also find it amazing that I didn't realise you could do this until so recently... when doing electrical engineering before moving into IT (going back 10 years now), I always wrote programs imperatively, so when writing simulations there was a great impedance mismatch between what I was trying to model and the code I had to write to do it; much more than you would expect for scenarios where I was just trying to move from a continuous mathematical model to a discrete one.
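A tiny concrete example of the kind of equational reasoning I mean (sketched in Python, purely illustrative): the map-fusion law, map f . map g = map (f . g), lets you replace two traversals with one and know, algebraically, that the result is unchanged.

```python
# Illustrative check of the map-fusion law: map f (map g xs) == map (f . g) xs
def compose(f, g):
    return lambda x: f(g(x))

f = lambda x: x + 1
g = lambda x: x * 2
xs = [1, 2, 3]

lhs = list(map(f, map(g, xs)))       # two traversals
rhs = list(map(compose(f, g), xs))   # one traversal, same answer by law
assert lhs == rhs == [3, 5, 7]
```

The point is that the rewrite is justified by algebra on the program itself, not by testing - the test here is just a sanity check of the law on one input.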
As an aside, Brian, are you familiar with the ideas in Wolfram's A New Kind of Science, where he talks about the fundamental building blocks of the universe being better modelled as executing computations (programs) rather than as high-level mathematics? On this view, the high-level mathematics of "physical laws" is basically a shortcut describing behaviour that emerges from the complex interactions of the simple computations as they execute over time. So something like the speed of light is just an emergent property of very simple computations (in the mathematical sense) interacting wildly over time. I'm probably butchering his hypothesis here, but hopefully he isn't reading this.
Anyway, that would suggest that executing computations are more fundamental than the current laws of physics. Do you have any views on that? To me this sounds beautiful, and beguilingly simple - though some of the computations are irreducible when they execute, so it only gets us so far in trying to infer things from them: you end up having to run the simulation to see what happens... which is what the universe does, but it means we can't predict the outcome directly from knowing the simple laws.
Something about that theory smells right to me, but I'm not a physicist.
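The poster child for this in Wolfram's book is Rule 30, a one-dimensional cellular automaton: the update rule is a one-liner, yet the pattern it produces is complex enough that (in his terms) it's computationally irreducible - you have to run it to see it. A minimal sketch:

```python
# Illustrative sketch: Wolfram's Rule 30 cellular automaton.
# New cell state = left XOR (centre OR right), on a ring of cells.
def rule30_step(cells):
    n = len(cells)
    return [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n])
            for i in range(n)]

width = 31
row = [0] * width
row[width // 2] = 1           # start from a single live cell

for _ in range(15):           # print a few generations as ASCII art
    print("".join("#" if c else "." for c in row))
    row = rule30_step(row)
```

One line of logic, and the triangle of output it prints is already chaotic enough to be used as a randomness source - which is the "simple computations, emergent complexity" claim in miniature.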
A truly amazing series, and hopefully more to come in this format from Channel 9. I can't say I've been educated as much, and enjoyed it at the same time, in many, many years. I've already watched most of these lectures a few times, and each time I learn something new. I can't recommend this series enough, both as an introduction to functional programming and Haskell, and as a solid grounding in many general programming concepts. Fantastic.
Interesting point, I'll have to think about that quite a bit.
I'm not sure I fully get Monads yet.
How do they relate to the concept of an object type? A package of state and behaviour... is a Monad just an object that obeys a certain behaviour pattern? I've seen them described as amplified types, and clearly if you have your type obey the behaviour pattern (the monad laws) you get compositional types. I don't quite see yet how this differs from a type that implements an interface with Bind and Return/Unit methods and is thus supported by certain syntax. Suddenly everything starts to look like a Monad of some sort if it encapsulates data and computations.
I think I'm missing some key insight that differentiates a Monad from an object type: they look very similar to me; both represent state and behaviour that can be applied to that state. Is it just that a Monad has something more specific to say about the behaviours that are applied? I'm guessing this is what gives them the mathematical properties that allow you to reason about them in a certain way...
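Answering my own question in part: as I understand it, a Monad really is just a type with Unit/Return and Bind - the extra ingredient is that the laws (left identity, right identity, associativity) must hold, and an ordinary interface can't express or enforce laws. A Maybe/Option sketch (in Python, purely illustrative), with one law checked by hand:

```python
# Illustrative sketch: a Maybe/Option type with unit (Return) and bind.
class Maybe:
    def __init__(self, value, has_value):
        self.value, self.has_value = value, has_value

    @staticmethod
    def unit(value):              # aka Return: wrap a plain value
        return Maybe(value, True)

    @staticmethod
    def nothing():                # the failure case
        return Maybe(None, False)

    def bind(self, f):            # f: plain value -> Maybe
        return f(self.value) if self.has_value else self

safe_div = lambda x: Maybe.unit(10 / x) if x != 0 else Maybe.nothing()

# Composition short-circuits on failure, with no explicit null checks:
assert Maybe.unit(5).bind(safe_div).value == 2.0
assert not Maybe.nothing().bind(safe_div).has_value

# Left identity law: unit(a).bind(f) must equal f(a).
assert Maybe.unit(5).bind(safe_div).value == safe_div(5).value
```

So the difference from "an object with a Bind method" is exactly the laws: they are what guarantee that chains of binds compose associatively, which is what makes the equational reasoning work.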