
Discussions

Auxon Richard.Hein Read it: http://bitcoin.org/bitcoin.pdf
  • Subliminal Advertising for Channel 9

    Nice Ian. Smiley  I don't have a Windows Phone yet. 

  • Scientists prove Global Warming caused by the sun, not humans.

    @cbae:  I don't know. Smiley  I'm really busy at work, and I don't really have time to try to clarify/expand/correct myself right now.  Plus, I should just stick to what I know ... "stay on target".

  • Scientists prove Global Warming caused by the sun, not humans.

    cbae wrote

    *snip*

    Yes, you did. You attacked my doctors/leeches example by stating it would be revisionist to say that "use of leeches by 19th century doctors wasn't as prevalent as we think it was today".

    *snip*

    No, I didn't.  You're reading everything through your own filter and revising my comment.  I said, "Yes, but writing today that use of leeches by 19th century doctors wasn't as prevalent as we think it was today, is quite possibly revisionist - although perhaps most doctors were against leeches being used in medicine, I have no idea."  There's a big difference.  I certainly never attacked; I'm just saying I don't believe everything I read, and that's the summation of it all.  The rest is just making conversation.

  • Scientists prove Global Warming caused by the sun, not humans.

    @cbae:  I didn't miss your point.  I just don't take someone's word for it, especially since it goes against my memory.  I would have to look at all the research history to know, and I don't plan on doing that today.  I don't use past mistakes as proof that current research is wrong, but I do consider past mistakes a warning that we have to be more vigilant.

  • Scientists prove Global Warming caused by the sun, not humans.

    @cbae: Yes, but writing today that use of leeches by 19th century doctors wasn't as prevalent as we think it was today, is quite possibly revisionist - although perhaps most doctors were against leeches being used in medicine, I have no idea.  (However, leeches are a bad example because they have some valuable use, even today, to clean wounds and encourage blood clotting.  We just found other ways of doing the same kind of thing.  Bloodletting would be a better example; although there are possible reasons why it might help in some cases, in almost all cases it's a terrible idea.)

    All I know for sure without literally going through papers from the 70s, is that "another ice age is coming" was a common belief and debated as widely as global warming in the public sphere.

  • Scientists prove Global Warming caused by the sun, not humans.

    @cbae:  That's not a myth.  When I was a kid, it was all about the "next ice age is coming" and "acid rain will kill all life".  Perhaps it was a myth in terms of the overall, real, scientific consensus of the time, but it was certainly played up in the media and believed by most everyone.

    After reading the article I can't help but think this is rewriting history.  In 20 years they will be saying scientific consensus did not support global warming; it supported "climate change".

  • Paradigm shift? The quCPU is here.

    evildictaitor wrote

    People don't change how they program to accommodate new computers, they change how the new computer behaves to make it behave like an old computer, or use new programming languages so they can interact with it in the same way as before (people can write Visual Basic in any language, and if we get a quantum computer, they'll write Visual Basic on that too).

    Look at the past ten years of computing development - your data no longer goes along bus lines in the north-bridge, it goes along the QPR bridge. You no longer have RAM, you have flash-RAM. You no longer have a single platter disk, you have a multiplatter magnetic RAID disk that uses sector striping to improve performance and redundancy, and you don't have one CPU, you have four, all connected along a hybrid bus along the outside of the CPU - and has any of this changed how we program? Not one jot.

    If and when quantum computers come along, they'll be abused for specialist applications like encryption, sure, but if they ever reach the desktop (and that's a long way off) then Microsoft and Linux will write an OS that sits on top that just takes the "quantum" as a speed improvement over other computers and throw the rest of the magic physics away, because programmers don't care about physics, they care about getting their product to the maximum number of users in the minimum amount of effort.

    Although in some respects, abstractions will be made to hide certain kinds of complexity, in the big scheme of things, I don't agree, for a few reasons.  First of all, we are changing the way we program for multiple CPUs and distributed systems.  I am certainly doing so for the Azure project I am on.  It uses the same languages, but the design is very different.  Parallel and asynchronous programming idioms like C# Async, the TPL and Rx provide an abstraction to make it simpler, but it's still quite different, and results in a different design.  Rx code looks very little like "ordinary" C#.
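
The point that Rx-style code reads very differently from ordinary imperative code can be sketched with a toy push-based Observable. This is written in TypeScript purely for illustration (the post is about C#'s Rx and IObservable); the Observable class and its map/filter operators here are a minimal stand-in, not the real Rx API.

```typescript
// Toy push-based Observable with map/filter, to show how the declarative
// pipeline style differs from an imperative loop. Illustrative only; the
// real Rx library has a much richer contract (errors, completion, disposal).

type Observer<T> = (value: T) => void;

class Observable<T> {
  constructor(private producer: (observe: Observer<T>) => void) {}

  subscribe(observer: Observer<T>): void {
    this.producer(observer);
  }

  map<U>(f: (v: T) => U): Observable<U> {
    return new Observable<U>(observe => this.subscribe(v => observe(f(v))));
  }

  filter(pred: (v: T) => boolean): Observable<T> {
    return new Observable<T>(observe =>
      this.subscribe(v => { if (pred(v)) observe(v); }));
  }
}

// A source that pushes 1..5 to each subscriber.
const numbers = new Observable<number>(observe => {
  for (let i = 1; i <= 5; i++) observe(i);
});

const results: number[] = [];
numbers
  .filter(n => n % 2 === 1)  // keep odd values
  .map(n => n * n)           // square them
  .subscribe(n => results.push(n));

console.log(results); // [1, 9, 25]
```

The subscriber never writes the loop; it declares a pipeline over pushed values, which is exactly the design shift described above.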

    Also, the fundamental algorithms for solving problems on a quantum computer are drastically different from classical algorithms.  They are so different that simply understanding them requires at least a working knowledge of quantum mechanics.  There may be a group of programmers that don't get into that depth - and they may be the majority - but there will still be a lot who must, just as today there are still a lot of programmers who need to program to the machine (C++ is still very much alive).  So, as part of a decent computer science education, students will at least have to learn the basics, just as I had to learn all about memory management, even though I rarely work on code that deals with its own memory management any more.
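
One concrete way to see why quantum algorithms differ from classical ones is amplitude interference. The sketch below (TypeScript, real amplitudes only, single qubit, my own toy representation) applies a Hadamard gate twice: the first application gives a 50/50 superposition, the second cancels the amplitudes back to |0⟩, something classical probabilities cannot do.

```typescript
// Toy single-qubit simulation: a state is [amp0, amp1], and measurement
// probabilities are the squared amplitudes. Real amplitudes suffice for
// the Hadamard gate; a full simulator would need complex numbers.

type Qubit = [number, number];

// Hadamard gate: H|0> = (|0>+|1>)/sqrt(2), H|1> = (|0>-|1>)/sqrt(2)
const H = (q: Qubit): Qubit => {
  const s = Math.SQRT1_2; // 1/sqrt(2)
  return [s * (q[0] + q[1]), s * (q[0] - q[1])];
};

const zero: Qubit = [1, 0];  // |0>
const superposed = H(zero);  // [0.707..., 0.707...] -> 50/50 on measurement
const back = H(superposed);  // the minus sign cancels amp1: back to [1, 0]

console.log(superposed.map(a => a * a)); // ~[0.5, 0.5]
console.log(back.map(a => a * a));       // ~[1, 0]
```

Flipping a fair coin twice still leaves 50/50; here the signed amplitudes interfere destructively, which is the kind of behaviour that forces the working knowledge of quantum mechanics mentioned above.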

    But, who knows.  We'll see.  It's just the beginning. Big Smile


  • Erik Meijer: The World According to LINQ

    @head.in.the.box:  Great! Big Smile  Thanks!

  • Erik Meijer: The World According to LINQ

    Thanks for the link.  The constructor for class Pie is called Chart, but it should be Pie.  Just proving I read it. Wink  Also, the conclusion says, "We have also shown how to implement custom LINQ providers that can run in memory and over SQL and CoSQL databases, and we have presented LINQ-friendly APIs over Web services.", but there is no example for CoSQL in the paper.  I eagerly await Wink part 2, which I hope corrects that oversight and also covers Rx in the same context.

    On the topic itself:  This is important and recommended reading for all programmers.  The paper shows how LINQ can be used to create internal DSLs for any data source, or any API, because we can treat code as data through expression trees.  Creating a query provider - or comprehension provider, to stress the more generalized capabilities of LINQ - allows us to think of any input, asynchronous or otherwise, as a data source.  We get strong typing and IntelliSense for free, while maintaining a coherent, generic abstraction over a fluent interface to the DSL we create by implementing LINQ operators.  Erik Meijer shows which operators you have to implement, and that by implementing cross apply, one can implement all the other required operators.  This surfaces the meaning of a program (or computation) by providing explicit details of the types being created and a DSL for what is being done to them, composed together into a consistent abstraction.
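
The claim that cross apply (C#'s SelectMany, i.e. monadic bind) is enough to derive the other operators can be sketched quickly. This is TypeScript over plain arrays for brevity, an assumption standing in for an arbitrary LINQ source; the function names mirror the C# operators but are my own illustrative definitions.

```typescript
// Sketch: deriving Select (map) and Where (filter) from cross apply
// (SelectMany) alone, using arrays as a stand-in for any query source.

function selectMany<T, U>(src: T[], f: (t: T) => U[]): U[] {
  const out: U[] = [];
  for (const t of src) out.push(...f(t));
  return out;
}

// Select derived from selectMany: wrap each mapped value in a singleton.
const select = <T, U>(src: T[], f: (t: T) => U): U[] =>
  selectMany(src, t => [f(t)]);

// Where derived from selectMany: a singleton when the predicate holds,
// the empty collection when it doesn't.
const where = <T>(src: T[], p: (t: T) => boolean): T[] =>
  selectMany(src, t => (p(t) ? [t] : []));

const evensDoubled = select(where([1, 2, 3, 4], n => n % 2 === 0), n => n * 2);
console.log(evensDoubled); // [4, 8]
```

Because both derived operators only produce singleton or empty collections for cross apply to flatten, any source that supports cross apply gets the rest of the operator vocabulary for free, which is the paper's point.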


  • Paradigm shift? The quCPU is here.

    @ScanIAm:  Yes.