14 posts

## Energy Usage for Math = Dark Energy?

Back to Forum: Coffeehouse
• OK, so I was thinking last night about how much warmer my room is than the rest of the house.  I've got a few computers in there, and in the winter time, that's awesome.  So it got me thinking: Why does it take energy to do computations in a computer?  And, how much of that energy is due to the inherent cost of moving electrons around, and how much (if any) of the cost is due to the math that is being done?

In other words, does 1 + 1 have a cost beyond that of the system in which the equation is evaluated?

And, here's the funny part:  What if it does?  Even if that cost is imperceptibly small, the math involved in managing every particle in the universe would be tremendous, so perhaps that cost is met by what astrophysicists call 'dark energy'?

Feel free to poke holes (and fun).

• Sort of like a Heisenberg's uncertainty principle for all of mathematics?

So when you write down a mathematical equation to calculate the mass of the universe, the mass of the universe changes as a result of you writing the equation down?  Have you been watching too much Dr Who?

Herbie

• I can hear W3bbo pushing pencil to paper, already on page 145 of the master calculation. Page 123 had some string theory stuff, but he scratched that out...

• Probably

That and I listened to a poor explanation of Hawking radiation yesterday so the whole cosmology thing was stuck in my head.

• @ScanIAm: Math stores energy as mass. This can be readily proven by observing that an old well-worn laptop weighs a lot more than a brand new one.

• , ScanIAm wrote

In other words, does 1 + 1 have a cost beyond that of the system in which the equation is evaluated?

Feel free to poke holes (and fun).

This is all covered by the First and Second Laws of thermodynamics.  Cost is a human term that simply does not exist, except for the person charging for electricity and the one paying for it.

• , vesuvius wrote

*snip*

This is all covered by the First and Second Laws of thermodynamics.  Cost is a human term that simply does not exist, except for the person charging for electricity and the one paying for it.

I understand.  Cost might have been the wrong word to use, but it gets the point across.

I guess it would depend on whether there is anything gained by computing 1 + 1.  If there is, then using thermodynamics, what provided the building blocks for the finished product?  If you compute 1 + 1 to be 2, has 1 + 1 been consumed?

• Are you talking about the transistors in the processor that have to switch on and off to do your calculation, and the heat generated by them to do so? And then what happens to the heat thereafter?

• Isn't heat directly related to how many cycles an operation takes to complete? Isn't heat the result of electrons moving through a medium?

let's say an ADD takes 1 cycle, and a DRAW_A_VIDEO_FRAME takes 1000 cycles.

So, the latter op generates roughly 1,000 times the heat.

But I don't question it too much. Instead, I open up the case, remove the fan cuz it's cold up here in the winter time.
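The "heat scales with cycles" idea above can be sketched in a few lines of Python. The per-cycle energy figure here is a made-up placeholder, not a measured value, and real chips don't dissipate a fixed amount per cycle, but it shows the proportionality the post is describing:

```python
# Rough sketch of the "heat scales with cycles" idea.
# ENERGY_PER_CYCLE_J is a hypothetical placeholder, not a measured value.
ENERGY_PER_CYCLE_J = 1e-10  # assumed energy dissipated per clock cycle, in Joules

def heat_for_op(cycles):
    """Estimate heat (in Joules) dissipated by an operation taking `cycles` cycles."""
    return cycles * ENERGY_PER_CYCLE_J

add_heat = heat_for_op(1)      # e.g. an ADD taking 1 cycle
frame_heat = heat_for_op(1000) # e.g. drawing a video frame taking 1000 cycles
print(frame_heat / add_heat)   # the 1000-cycle op dissipates 1000x the heat
```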

• Information and energy have an equivalence relation, just as mass and energy do, and this was theorized long ago (1929).  In fact, recent experiments have indeed converted information into energy.

One bit of information equals kT ln 2 Joules, where k is Boltzmann's constant and T is the temperature.  At room temperature, that's about 2.8×10⁻²¹ J/bit, which is incredibly tiny relative to the heat a real chip generates.  So, in a way, it takes 2.8×10⁻²¹ J to irreversibly process one bit, to overcome entropy, and/or one bit of information can be converted into 2.8×10⁻²¹ Joules.  At least that's the way I understand it.

So, it's any information (although, yes, that includes math).  To create one bit of information takes 2.8×10⁻²¹ Joules, at room temperature.
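The kT ln 2 figure above is easy to check numerically. This is just that formula evaluated in Python; the 300 K "room temperature" is an assumption:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K
T = 300.0           # assumed room temperature, K

# Minimum energy associated with one bit of information at temperature T
landauer_limit = k_B * T * math.log(2)
print(f"{landauer_limit:.2e} J/bit")  # ≈ 2.87e-21 J/bit
```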

• Oh, and to answer the question in the title:  No.  Definitely not.  We know where the energy comes from and goes during computation in classical computers.  No need to invoke dark energy.

You may also find this interesting: Information Processing and Thermodynamic Entropy

• This reminds me of Programming with Bits and Atoms (wait a while for the interesting part).

Speaking of information - does science actually know the approximate storage capacity of the brain? Is there even a possible answer to this question?

• , exoteric wrote

This reminds me of Programming with Bits and Atoms (wait a while for the interesting part).

Speaking of information - does science actually know the approximate storage capacity of the brain? Is there even a possible answer to this question?

Raw storage does not consider data compression. But each person has their own unique data compression algorithm, so knowing the raw storage is not useful when you don't know the approximate efficiency of the compression algorithm per person.
