## Comments

## Brian Beckman: Hidden Markov Models, Viterbi Algorithm, LINQ, Rx and Higgs Boson

@Simon: that will work great if you have explicit data that you can represent inline in the code (and it so happens I do in my Wikipedia example). But I don't see how to take advantage of these collection initializers when reading data from an external source. For an application like that, I can still Aggregate over a composable dictionary, as in lines 168 through 174 in my unit test DictionaryExtensionsTests.cs, like this:

const int range = 1280;
var kvps = Enumerable
    .Range(0, range)
    .Select(i => new KeyValuePair<int, string>(i, Convert.ToChar(i).ToString()));
dict = kvps.Aggregate(dict, (d, kvp) => d.AddUnconditionally(kvp));

## Brian Beckman: Hidden Markov Models, Viterbi Algorithm, LINQ, Rx and Higgs Boson

@exoteric: the blackboard style is a cheat; try this (on Win7):

Windows key & "+" -- turn on built-in magnifier

Windows key & "-" -- shrink back down

Control-Alt "i" -- inverse video your whole world

## YOW! 2011: Joe Albahari - LINQ, LINQPad, and .NET Async (and a little Rx, too)

@exoteric: Even without NuGet, it still pays off big-time. I have some LinqPad scripts where I have integrated OpenCV (via P/Invoke), QuickGraph (see CodePlex), the SQL Server Geometry and Geography data types, LINQ to Objects, LINQ to XML, Rx, and the Task library, all in a single script (not just to use them for sophomoric show-off fun, but because they were the most economical way to get the job done!). I took the effort to pull this all into Visual Studio (not really hard, mostly just rationalizing the namespaces), and by gum, it all just works. <3 + + + +

## Brian Beckman: Hidden Markov Models, Viterbi Algorithm, LINQ, Rx and Higgs Boson

@bryanedds: I've had a couple of good experiences with F# (simulations of card and dice games, just for fun), but I need some more "seat time" with it before weighing in. I'm particularly interested in particle filters (/slaps forehead), as well as Kalman filters (Unscented, Extended, and others) in reactive form. I can hammer them out with confidence in C#, or take the deep dive and try them in F#. We'll see.

My first job out of grad school involved "heavy industrial" Kalman filtering in Fortran II at the Jet Propulsion Laboratory, where we used it to track everything we could track in the solar system. Daily runs had around 1,000 parameters and 25 million observations, but that was decades ago, and that kind of thing can easily be done on a Windows Phone now (it required liquid nitrogen back then). Imagine the applications!

## YOW! 2011: Joe Albahari - LINQ, LINQPad, and .NET Async (and a little Rx, too)

One of my colleagues said, "With ordinary tools, you code and then test; with LinqPad, you test and then code." Use it in concert with the Visual Studio Unit Test Framework (VSUTF) and you will be writing bulletproof code with unbelievable speed.

Technique: get your code working in LinqPad, where the code-test-look cycle is fast and frictionless (LinqPad's Dump(), the charting tools in System.Windows.Forms.DataVisualization, and the Sho visualization libraries are worth their weight in diamonds for code-test-look speed!). Then copy all your test stuff into VSUTF and the target code into your VS projects and BAKE 'EM. Really great!
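As a minimal sketch of that hand-off (the computation and names here are invented for illustration, not taken from the scripts mentioned above): a snippet prototyped in LinqPad, with its final Dump() call swapped for an assertion, drops straight into a VSUTF test class:

```csharp
using System.Linq;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class PrototypeTests
{
    // This body started life as a LinqPad script; the trailing
    // .Dump() was replaced by the assertion below when it moved
    // into the Visual Studio Unit Test Framework.
    [TestMethod]
    public void RangeSum_MatchesClosedForm()
    {
        const int n = 100;
        var sum = Enumerable.Range(1, n).Sum();
        Assert.AreEqual(n * (n + 1) / 2, sum); // 5050
    }
}
```

The point is that the exploratory code and the regression test are the same code; only the "look" step changes from eyeballing a Dump() to a mechanical assertion.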

## E2E: Whiteboard Jam Session with Brian Beckman and Greg Meredith - Monads and Coordinate Systems

You've got some interesting thoughts there. You're talking about complex-valued functions on the complex plane. The domain of such functions is an unbroken plane, a simple manifold, but the co-domains form very interesting shapes: manifolds in their own right, and composing such functions brings up all kinds of interesting phenomena. Your color-mapping idea is great. The link below has some variations on this idea that you might find useful:

http://www.kfunigraz.ac.at/imawww/vqm/pages/complex/index.html

and the following is a good book on the analysis of complex functions:

http://usf.usfca.edu/vca//

## E2E: Brian Beckman and Erik Meijer - Co/Contravariance in Physics and Programming, 3 of n

N2Cheval -- yes, you're right: spotting the "signature" of contravariance is totally sensitive to the directions of the arrows. We will have to do better to nail that down. Too squishy so far. But free jazz is like that sometimes.

## E2E: Brian Beckman and Erik Meijer - Co/Contravariance in Physics and Programming, 3 of n

Kyrae -- this is a good one and fits my hunch that monads and coordinate systems are similar, that is, the coordinate transforms from {X} to {Y} and back are similar to the monad transformations from IEnumerable to IObservable and back. I'll think about this some more.
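A minimal sketch of that round trip, using the Rx conversion operators ToObservable and ToEnumerable (this assumes the System.Reactive package is referenced; the analogy to coordinate transforms is the hunch above, not an established result):

```csharp
using System;
using System.Linq;
using System.Reactive.Linq; // from the System.Reactive NuGet package

class MonadRoundTrip
{
    static void Main()
    {
        // Start in the pull world: IEnumerable<int>.
        var pull = Enumerable.Range(0, 5);

        // "Change coordinates" into the push world: IObservable<int>.
        IObservable<int> push = pull.ToObservable();

        // Work in push coordinates...
        var doubled = push.Select(x => 2 * x);

        // ...then transform back to pull coordinates.
        int[] back = doubled.ToEnumerable().ToArray();

        Console.WriteLine(string.Join(",", back)); // 0,2,4,6,8
    }
}
```

Like a change of coordinates, the round trip preserves the underlying values; what changes is who drives the iteration (the consumer pulls from IEnumerable, the producer pushes through IObservable).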

## Brian Beckman: On Analog Computing, Beckman History and Life in the Universe Redux

Yes, I'm still here, HeavensRevenge. BTW, loved the YouTube video on COCONUT -- very, very cool!

## Brian Beckman: On Analog Computing, Beckman History and Life in the Universe Redux

F# would probably be the best way to go, now.

## Brian Beckman: On Analog Computing, Beckman History and Life in the Universe Redux

I think there is a bi-directional correspondence between computation and energy transfers. If you think of computation as manipulations of symbols in the lambda calculus or in the pi calculus, then that involves clearing and storing "memory cells," usually represented as states of switches in a network. Ed Fredkin showed that you can't change states of memories without energy transfers (and the entropy growth that goes along with them, by the second law of thermodynamics!), so it's not possible to do computations without spending energy and growing the heat in the Universe!
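For a concrete number on that minimum energy, Landauer's bound is the standard quantification: erasing one bit at temperature T dissipates at least

```latex
E_{\min} = k_B T \ln 2
\approx 1.38\times10^{-23}\,\mathrm{J/K} \times 300\,\mathrm{K} \times 0.693
\approx 2.9\times10^{-21}\,\mathrm{J}
```

per bit at room temperature -- tiny, but strictly greater than zero, which is the whole point.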

## Brian Beckman: On Analog Computing, Beckman History and Life in the Universe Redux

Hi Akopacsi -- The rough idea on time is this: Consider a path -- a 1-dimensional curve -- passing through points in space-time. Every point along that curve has a particular set of 4 coordinates: 3 space coordinates and 1 time coordinate, for any reasonable choice of coordinate systems. Now, parameterize that curve by the incremental distance along the curve: as you move from one point to another, you go a certain "distance" in 4-space, a distance measured by the "metric tensor," which is a generalization of the Pythagorean or Euclidean distance. Locally, that incremental distance is sqrt(dx^2 + dy^2 + dz^2 - dt^2) (notice the minus sign!). This distance measure is unique for a choice of metric tensor and is called the "proper time." It's a kind of cosmological average of proper times over the Hubble motion of galaxies along their curves that measures the age of the Universe backwards 13 or 16 billion years. Very rough idea, but hope that adds some clarity.
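To pin down the sign conventions above (relativists use both metric signatures, so the minus sign lands on different terms in different texts): with c = 1 and the (+, +, +, -) signature used in the comment, the squared interval along a path is

```latex
ds^2 = dx^2 + dy^2 + dz^2 - dt^2,
\qquad
\tau = \int \sqrt{-\,ds^2} = \int \sqrt{dt^2 - dx^2 - dy^2 - dz^2}
```

For a timelike curve, ds^2 < 0, and a clock carried along the curve reads exactly the proper time tau.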

I'll take another look at "Rigs of Rods," one of my all-time favorite pieces of software!