# Brian Beckman: On Analog Computing, Beckman History and Life in the Universe Redux

- Posted: Mar 11, 2010 at 1:04 PM
- 34,775 Views
- 22 Comments



- To download, right-click the file type you would like and pick "Save target as…" or "Save link as…" — an easy way to save videos locally so you can watch them offline, or just grab the audio as an MP3.

- If you want to view the video on your PC, Xbox or Media Center, download the High Quality MP4 file (this is the highest quality version available).
- If you'd like a lower bitrate version, to reduce the download time or cost, choose the Medium Quality MP4 file.
- If you have a Windows Phone, iPhone, iPad, or Android device, choose the low or medium quality MP4 file.
- If you just want to hear the audio of the video, choose the MP3 file.

It's been *far* too long since we've chatted with the great Brian Beckman, an astrophysicist, software architect, and
Channel 9 icon. Some of you may know him as the wizard who appears out of
thin air whenever the word Monad is said three times in succession. :->

A few weeks ago, Erik Meijer sent Brian an email with a
link to some videos about the use of analog computers in the US Navy in the 1950s. This got Brian thinking and reflecting on his past. It turns out Brian's father was
a famous Hollywood actor who also produced training movies for the US Navy. I was added to the email thread, and a few days later we taped the conversation in this video.

It's always a pleasure to embark on an unscripted chat with Dr. Beckman. There are always great nuggets of wisdom and insight around every corner. Here, you'll learn about some of Brian's personal history, some insights on analog computing, and even some discussion
on the Drake equation, N = N\* · fp · ne · fl · fi · fc · fL, which attempts to formalize the probability of intelligent life in the universe.
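As a rough illustration of the equation's structure — the parameter values below are placeholders, not estimates from the video — the Drake-style product can be sketched in a few lines of Python:

```python
def drake(n_stars, f_p, n_e, f_l, f_i, f_c, f_L):
    """Sagan-style Drake equation: N = N* · fp · ne · fl · fi · fc · fL.

    n_stars: number of stars in the galaxy
    f_p:     fraction of stars with planetary systems
    n_e:     habitable planets per such system
    f_l:     fraction of those on which life arises
    f_i:     fraction of those developing intelligence
    f_c:     fraction of those developing interstellar communication
    f_L:     fraction of a planet's lifetime during which a
             communicating civilization exists
    """
    return n_stars * f_p * n_e * f_l * f_i * f_c * f_L

# Purely illustrative inputs — every factor is hotly debated:
N = drake(4e11, 0.5, 2, 0.5, 0.1, 0.1, 1e-7)
print(f"Estimated communicating civilizations: {N:.0f}")
```

The point of the formalization is less the final number than making each uncertainty explicit as its own factor.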

Sit back, relax, and enjoy.

Comments have been closed since this content was published more than 30 days ago, but if you'd like to continue the conversation, please create a new thread in our Forums, or Contact Us and let us know.

## Follow the Discussion


Awesome video

Maybe it has been far too long since Brian was on Channel 9... but he sure makes it worth waiting for.

Agreed. Brian is a hero to many of us. Thanks, Brian!

C

The analog computer videos are amazing. It was so interesting to see and hear them describe finding the right shape of the surface, to solve a particular problem.

Someone told me that QM and GR were incompatible mathematically, but I didn't understand how that could be (after all, isn't all mathematics, at the root, based on the same underlying axioms and theorems?). He thought I wouldn't understand if he tried to explain, but I think I get the idea now. Thanks, Brian.

Brian is definitely a programming hero of mine. The dude is a modern-day Feynman, and has been responsible for much pleasant head-scratching followed by much reading and some little learning on my part.

Every time I hear Brian, he makes me feel I am just a 'carbon based life form'...... Excellent video guys and thanks Charles for these ETI videos

Ah, great video. A cosmology related question popped up in mind while I was watching this... Maybe Brian or somebody else here can answer it. If Time is not a fixed variable, but it can be distorted, how can we know if the Universe is 13.7 billion years old... Or even, does it make any sense to talk about its age? If Time is "expanding" or is being distorted, maybe the Universe was born just before the moment we call Now... (Sorry, if it's a stupid question.)

Something else: Brian, you talked about a truck simulator in another video some years ago, and you said that maybe you would never see its source code, but that the basic idea behind the simulation of metal pieces was quite obvious. It's now open source software, so you can check it out if you want.

Brian and Erik really have an interesting past. Thanks for sharing.

Hi Akopacsi -- The rough idea on time is this: Consider a path -- a 1-dimensional curve -- passing through points in space-time. Every point along that curve has a particular set of 4 coordinates: 3 space coordinates and 1 time coordinate, for any reasonable choice of coordinate systems. Now, parameterize that curve by the incremental distance along the curve: as you move from one point to another, you go a certain "distance" in 4-space, a distance measured by the "metric tensor," which is a generalization of the Pythagorean or Euclidean distance. Locally, that incremental distance is sqrt(dx^2 + dy^2 + dz^2 - dt^2) (notice the minus sign!). This distance measure is unique for a choice of metric tensor and is called the "proper time." It's a kind of cosmological average of proper times over the Hubble motion of galaxies along their curves that measures the age of the Universe backwards 13 or 16 billion years. Very rough idea, but hope that adds some clarity.
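For the curious, the proper-time idea Brian describes can be sketched numerically. This is a minimal illustration (not from the video): it uses the sign-flipped, timelike form dτ² = dt² − dx² − dy² − dz² of the interval Brian quotes (units where c = 1), so the radicand is positive for a moving clock, and sums it along a polygonal worldline:

```python
import math

def proper_time(events):
    """Sum the proper time along a polygonal worldline.

    events: list of (t, x, y, z) tuples in units where c = 1.
    Each segment contributes sqrt(dt^2 - dx^2 - dy^2 - dz^2),
    and is assumed to be timelike (a physically traversable path).
    """
    tau = 0.0
    for (t0, x0, y0, z0), (t1, x1, y1, z1) in zip(events, events[1:]):
        dt, dx, dy, dz = t1 - t0, x1 - x0, y1 - y0, z1 - z0
        ds2 = dt**2 - dx**2 - dy**2 - dz**2
        if ds2 < 0:
            raise ValueError("segment is spacelike; no proper time")
        tau += math.sqrt(ds2)
    return tau

# A traveler moving at 0.6c for 10 units of coordinate time
# ages sqrt(1 - 0.36) * 10 = 8 units of proper time:
print(proper_time([(0, 0, 0, 0), (10, 6, 0, 0)]))  # → 8.0
```

The cosmological "age of the Universe" is the same kind of integral, taken along the worldlines of galaxies in the Hubble flow.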

I'll take another look at "Rigs of Rods," one of my all-time favorite pieces of software!

I was thinking about the analog fire control computer today, and protein folding and neurotransmitters, and how measurement of all electrical activity in the brain as a function of time would map back to the dynamic surface of the molecular interaction. I haven't ever heard the idea that the brain could be an analog computer in that sense, but after seeing that fire control video it really is making me wonder. Of course, the chemical interaction is just one part of the entire process, but often I have assumed that the "intelligence" lies in the electrical state of the system, but that might just be an artifact ... energy transfer and not computation at all. Of course I have no idea if any of this is close.

I think there is a bi-directional correspondence between computation and energy transfers. If you think of computation as manipulations of symbols in the lambda calculus or in the pi calculus, then that involves clearing and storing "memory cells," usually represented as states of switches in a network. Ed Fredkin showed that you can't change states of memories without energy transfers (and the entropy growth that goes along with them, by the second law of thermodynamics!), so it's not possible to do computations without spending energy and growing the heat in the Universe!
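As a back-of-the-envelope illustration of that energy cost: Landauer's bound (a result from the same reversible-computing tradition as Fredkin's work mentioned above) says erasing one bit must dissipate at least kT ln 2 of heat. A quick check at room temperature:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact by SI definition)

def landauer_limit(temp_kelvin):
    """Minimum energy in joules to erase one bit at temperature T."""
    return K_B * temp_kelvin * math.log(2)

# At room temperature (~300 K), erasing a bit costs about 2.9e-21 J —
# tiny, but strictly nonzero, which is the point of the argument above.
print(landauer_limit(300.0))
```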

Interesting. That reminds me of photosynthesis as quantum computation. (http://www.scientificamerican.com/article.cfm?id=when-it-comes-to-photosynthesis-plants-perform-quantum-computation, and for non-quantum computation, there is another paper here http://www.minds.may.ie/~xebedee/papers/MWN2005m.pdf).

What's your favourite ".NET language" for expressing mathematics? I remember an old video where you showed a pretty generic mathematics library written in VB.NET. I find writing truly generic abstract libraries for .NET, at least in C# and I would guess VB.NET as well, not very... easy.

F# would probably be the best way to go, now.

I have been wondering whether this equation I created a while back is true. I heard of that renormalization technique, which I didn't know formally, but I used it many months ago in a thinking spree where I tried to show how a black hole both creates and destroys itself at the same time, using principles such as: time stops at infinite gravity, and at infinite speed there is infinite gravity.

The black arrow shows our current "moment", where the systems are in reasonable balance, and where speed (which I suppose could be taken to imply movement in space) and time intersect for equalization.

So this picture might look odd, but I used it as a way to create the little equation. Newton's equation shows when the green text, as a black hole's effect, is arranged to be negated/renormalized.

So Brian may I get your professional opinion on if anything jumps out at you other than the manually drawn curves, and the lack of Béziers in OneNote?

I used to find maths interesting. At some level I liked the mechanical process of putting together simple building blocks to get more interesting results. However at some point I lost this interest and started to find maths pretty boring, as there was a big gap between abstract maths and interesting effects..... and therefore feedback! The result of this was that I focused more on application of maths (ee-eng) and then I moved into computing.

In computing I completely regained my joy of building bigger systems out of smaller building blocks, and solving things from first principles. However despite the deeply rooted history of computing coming from maths.... it's almost like the maths is invisible at times. The processes are very similar, but I sometimes wonder how all the maths in computing has been hidden so successfully by the programming languages we use. In many cases almost by design!(like Cobol or Basic)

It's a shame, really, that making programming accessible seemed incompatible with moving programming closer to maths. I wonder whether that tells us more about programming or about the state of maths education... or about the fact that ultimately so many people find maths boring even though they are good at it.

Of course, building big systems these days is more like managing a big construction site (hence the term architect), so even at that level there's a conceptual gap between programming and systems.

Thanks Brian for another very interesting, and entertaining interview.

I think it's the problems we were given to solve, using the maths we were taught, that were boring. I used to work with my friends to solve our own made-up problems, like how high exactly Michael Jordan had to jump to dunk a basketball from the foul line: what is the function that describes the relationship between velocity, angle of take-off, height, etc., and what does its curve look like? That sort of thing was fun. Repetitive problems that go over the same concept 100 times were not. You don't understand via repetition, you just memorize, so that's all most people can do, and then they forget everything very soon afterwards. So I think it's because teachers are, in general, poor at teaching mathematics. Some are good, of course, but most are not, and the really good mathematicians are certainly not the ones teaching in elementary and high school. Not that a good mathematician necessarily makes a good teacher, but there are too few people really good at both.
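That dunk problem is classic projectile motion. As a minimal sketch of the velocity/angle/height relationship the commenter describes (treating the jump as a simple ballistic arc and ignoring details like arm reach — the numbers below are made up for illustration):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def peak_height(v, angle_deg):
    """Maximum height (m) reached when taking off at speed v (m/s)."""
    vy = v * math.sin(math.radians(angle_deg))
    return vy**2 / (2 * G)

def horizontal_range(v, angle_deg):
    """Horizontal distance (m) covered before returning to takeoff height."""
    return v**2 * math.sin(2 * math.radians(angle_deg)) / G

# Example: take off at 8 m/s and 60 degrees — plot these over a grid of
# speeds and angles to get the kind of curve the comment mentions.
print(peak_height(8, 60), horizontal_range(8, 60))
```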

I wonder if I need to trigger a reply by replying; I can't really ask you directly about that graph/equation I spuriously thought up.

But I would agree as well that using F# is the best way to go for almost anything, unless all you are doing is bit-fiddling like crazy.

On a side note, I have recently acquired a PlayStation 3, now allowing me to use the "MASS" physics engine in the PS3 SDK, which is Haskell-based and actually faster than the C-based physics engine at inter-core communication, thus preventing socket bandwidth saturation. Finally I get to program a Cell Broadband Engine, in Haskell no less! Google Tech Talk here for reference: http://www.youtube.com/watch?v=yHd0u6zuWdw

I get excited just thinking about 8 cores of power like this! Especially creating my own execution VM that changes a loop invariant, making the loop "mapped" +8 per pipeline and "reduced" by the load computation controller/tracker. Something I have planned so as not to be limited by the x++ increment-by-one problem that tends to prevent true work balancing — the thing that makes the many-core problem so unmanageable with line-by-line assembly instruction binaries, as opposed to coarse-grained groups of lambdas in functional languages.

Yet another random question... (Charles's question about the z axis of his garden made me less shy about my questions.) So here is mine: how do pocket calculators calculate the square root? I understand that it may be easy to implement Newton's method in a circuit, but it doesn't seem like an effective way to find the square root of perfect squares. It gets closer and closer, but it will produce a result like x.99999...
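For what it's worth, Newton's (Heron's) method actually does land exactly on perfect squares in binary floating point: the exactly representable root is a fixed point of the iteration, so you get 5.0 rather than 4.9999... A minimal sketch (assuming n ≥ 1; real calculators typically combine a small lookup table with a few such refinement steps):

```python
def heron_sqrt(n):
    """Approximate sqrt(n) by Newton's method on f(x) = x^2 - n.

    Iterates x <- (x + n/x) / 2, which in exact arithmetic decreases
    monotonically toward sqrt(n) from above. We stop when the iterate
    no longer decreases; for perfect squares the representable root
    is a fixed point, so the answer is exact.  Assumes n >= 1.
    """
    x = float(n)
    while True:
        nxt = (x + n / x) / 2
        if nxt >= x:       # no further progress in floating point
            return x
        x = nxt

print(heron_sqrt(25.0))  # → 5.0 exactly, not 4.9999...
```

The apparent x.99999... problem comes from decimal display and fixed iteration counts, not from the method itself.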

So is Brian still watching this post??

Yes, I'm still here, HeavensRevenge. BTW, loved the youtube on COCONUT -- very, very cool!

Have fun, here is The Flake Equation:

http://xkcd.com/718/

I would like to contact Brian. I knew his father Henry Beckman very well but lost contact when I went to live in Ireland.

Cheers,

Jessica, www.jessicaetaylor.org
