3 hours ago, Sven Groot wrote
If an object receives more energy than it radiates, it heats up, yes?
You put a kettle of water on the stove, and it heats up and starts to boil because it's receiving more energy from the stove than it can radiate away in the same amount of time.
The earth's atmosphere reduces the amount of energy the earth radiates into space.
Wrong. The earth's atmosphere distributes the energy around the globe via convection. It thus cools by transporting energy around the planet, and it is always radiating to space at its maximum rate. The earth is in thermal balance with the sun: if the sun's output increases, so does our thermal radiation. If that weren't the case, we couldn't live here.
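To put numbers on what "thermal balance" means: at equilibrium the power the planet absorbs from the sun equals the power it radiates to space, and that balance fixes an effective radiating temperature. A minimal sketch of that bookkeeping, using round-number assumed values for the solar constant and albedo:

```python
# Minimal energy-balance sketch: absorbed solar power = radiated power at equilibrium.
# The solar constant and albedo below are rounded, assumed values for illustration.

SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W / (m^2 K^4)
S0 = 1361.0          # assumed solar constant at Earth's distance, W/m^2
ALBEDO = 0.3         # assumed fraction of sunlight reflected straight back to space

# Averaged over the whole sphere, the absorbed flux per square metre is:
absorbed = S0 * (1.0 - ALBEDO) / 4.0

# Radiated flux is sigma * T^4, so the balance temperature is:
t_balance = (absorbed / SIGMA) ** 0.25

print(f"absorbed flux       : {absorbed:.1f} W/m^2")
print(f"balance temperature : {t_balance:.1f} K ({t_balance - 273.15:.1f} C)")
```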
If that weren't the case, the Earth would cool off very rapidly at night, similar to the moon (where it's 100 °C in daytime and -150 °C at night).
Agreed, the atmosphere creates a lag for warming and cooling. It's a mass, and mass is slow; otherwise the extremes would be higher and lower. But the surface temperature is determined by pressure, distance and the output of the sun, not by the greenhouse effect, as demonstrated by Mars, Venus, etc.
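The distance-and-solar-output part of that can be written down directly: the flux arriving at a planet falls off with the square of its distance from the sun, and the radiative balance temperature follows from that flux. A sketch with rounded, assumed values for solar luminosity, orbital distances and albedos (it covers only the distance and output terms, nothing else):

```python
import math

# Sketch: flux at a planet falls off as 1/r^2; the balance temperature follows from it.
# Luminosity, distances, and albedos below are rounded, assumed values for illustration.

SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W / (m^2 K^4)
L_SUN = 3.8e26           # assumed solar luminosity, W
AU = 1.496e11            # metres per astronomical unit

def balance_temp(distance_m: float, albedo: float) -> float:
    flux = L_SUN / (4.0 * math.pi * distance_m ** 2)   # W/m^2 arriving at the planet
    absorbed = flux * (1.0 - albedo) / 4.0             # averaged over the sphere
    return (absorbed / SIGMA) ** 0.25                  # balance temperature, K

for name, dist_au, albedo in [("Earth", 1.0, 0.30), ("Mars", 1.52, 0.25)]:
    t = balance_temp(dist_au * AU, albedo)
    print(f"{name:5s}: {t:.0f} K ({t - 273.15:.0f} C)")
```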
How much the atmosphere reduces that radiation depends on its properties, one of which is composition. It has been shown as early as 150 years ago that carbon dioxide will trap and re-emit radiation in all directions. Therefore, increasing the amount of carbon dioxide in the atmosphere reduces the amount of energy the earth can radiate into space in a given time, while it still receives the same amount of energy from the sun. Therefore, it heats up.
Argument from history? The properties of CO2 demonstrated in a lab 150 years ago are no justification for its behavior in the atmosphere. And I'm not disputing its properties; sure, it can block certain IR frequencies. But that doesn't mean it radiates this energy back to the planet. That would still violate the second law: there has to be additional work done in the atmosphere to force the radiation back to its origin. The models predicted a 'hotspot' in the troposphere, and it isn't there.
So, the models are wrong. They can't predict anything and violate basic laws. These should be put on the same footing as other crackpot ideas.
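For the record, the claim being disputed here boils down to a simple piece of arithmetic: keep the absorbed sunlight fixed, lower the effective emissivity to space, and the balance temperature comes out higher. A sketch with made-up emissivity values; whether the real atmosphere actually behaves this way is exactly what is in dispute:

```python
# Sketch of the disputed claim: same absorbed sunlight, lower effective emissivity
# to space, higher balance temperature.  Emissivity values are made up for illustration.

SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W / (m^2 K^4)
ABSORBED = 240.0         # assumed absorbed solar flux, W/m^2

def balance_temp(emissivity: float) -> float:
    # Outgoing flux = emissivity * sigma * T^4; solve for T with outgoing = absorbed.
    return (ABSORBED / (emissivity * SIGMA)) ** 0.25

for eps in (1.00, 0.95, 0.90):
    print(f"effective emissivity {eps:.2f} -> balance temperature {balance_temp(eps):.1f} K")
```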
Consider this: if your mirror or ice cube analogy were correct, then it would be completely impossible for the atmosphere to heat the planet at all.
The whole point of the atmosphere is to put a lag on the temperature changes and distribute the warming around the planet.
Why then isn't the earth a barren wasteland like the moon? Those analogies are entirely irrelevant because they utterly misrepresent the actual process at work.
The deeper you go, the higher the pressure and the higher the temperature.
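That pressure-temperature relationship can be put in numbers with the dry adiabatic lapse rate, g/c_p: in a convecting column, temperature rises by roughly 10 K for every kilometre you descend. A sketch using rounded values for gravity and the heat capacity of air:

```python
# Dry adiabatic lapse rate sketch: in a convecting column, temperature changes with
# altitude at roughly -g / c_p.  Values below are rounded, assumed figures.

G = 9.81        # m/s^2, gravitational acceleration
CP = 1005.0     # J/(kg K), specific heat of dry air at constant pressure

lapse_rate = G / CP * 1000.0   # K per km
print(f"dry adiabatic lapse rate: {lapse_rate:.1f} K/km")

# Example: descending 5 km along the dry adiabat warms the air by about:
print(f"5 km of descent: about {lapse_rate * 5:.0f} K warmer")
```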
And why does my experiment fail to represent the process? The greenhouse effect, as claimed, is (summed up numerically in the sketch after this list):
- Light emitted by the sun gets absorbed by the surface. (This is the desk lamp.)
- The surface does work (heats up), and a lower frequency of light (IR) is emitted from it. (This is the area on the desk where the light shines.)
- This longwave radiation is then reflected by CO2 and other greenhouse gases. (This is the mirror.)
- This radiation gets absorbed by the surface and generates additional warming. (This is the brighter spot on the desk.)
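Taken at face value, those four steps form a feedback loop you can sum as a geometric series: if a fraction f of what the surface emits is returned and re-absorbed, the surface ends up emitting S/(1-f) in order to push S past the mirror. A sketch with made-up values for the absorbed flux and the return fraction; this is just the claimed mechanism written out as arithmetic:

```python
# Sketch of the four steps above as a feedback loop.  S is the absorbed lamp/sun flux,
# f the fraction of surface emission returned by the "mirror".  Both values are made up.

SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W / (m^2 K^4)
S = 240.0         # assumed absorbed flux, W/m^2
f = 0.4           # assumed fraction of emitted IR returned to the surface

# Summing the loop (S + f*S + f^2*S + ...) gives a steady-state surface emission of:
emitted = S / (1.0 - f)
t_surface = (emitted / SIGMA) ** 0.25

print(f"steady-state surface emission: {emitted:.0f} W/m^2")
print(f"implied surface temperature  : {t_surface:.0f} K")
```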
In order for this effect to occur, you would have to violate the second law of thermodynamics. A source cannot heat up (light up) further from its own emitted photons, because there is no difference in energy between the two. Current (electrons or photons) won't flow between two points at the same potential. How does the photon know this? It doesn't; it pushes against a stream of equal force, they cancel each other out, and the net result is zero. This is basic physics.
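In radiative terms, the net exchange between two surfaces goes as the difference of their fourth-power temperatures, so between two bodies at the same temperature it is zero. A sketch of that arithmetic, with made-up example temperatures:

```python
# Net radiative exchange between two black surfaces: sigma * (T1^4 - T2^4).
# It is zero when the temperatures are equal.  Temperatures below are made-up examples.

SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W / (m^2 K^4)

def net_flux(t1: float, t2: float) -> float:
    return SIGMA * (t1 ** 4 - t2 ** 4)   # W/m^2, positive means net flow from 1 to 2

print(f"288 K vs 288 K: {net_flux(288.0, 288.0):+.1f} W/m^2")
print(f"288 K vs 255 K: {net_flux(288.0, 255.0):+.1f} W/m^2")
```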