Talk:Temperature/Archive 3
This is an archive of past discussions about Temperature. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current talk page.
Mean temperature
Some climate denialists are opposed to the very notion of mean temperature.
The idea was advanced by Christopher Essex, Ross McKitrick, and Bjarne Andresen in the February 2007 issue of the Journal of Non-Equilibrium Thermodynamics: http://www.uoguelph.ca/%7Ermckitri/research/globaltemp/GlobTemp.JNET.pdf
It appears the paper (and the idea) doesn't have any merit:
http://www.realclimate.org/index.php/archives/2007/03/does-a-global-temperature-exist/
http://rabett.blogspot.com/2007/03/once-more-dear-prof.html
http://scienceblogs.com/deltoid/2004/05/mckitrick3.php —Preceding unsigned comment added by Mihaiam (talk • contribs) 18:04, 10 May 2010 (UTC)
- "Some climate denialists are oposed to the very notion of mean temperature." Mihaiam, you are in the wrong place. Read what it says at the beginning of the article -->
- This article is about the thermodynamic property. For other uses, see Temperature (disambiguation). Temperature is an intensive thermodynamic property, meaning it is localised, i.e. it cannot be averaged (see Intensive and extensive properties). Please refrain from using terms like denialist which may be considered abusive. --Damorbel (talk) 18:40, 10 May 2010 (UTC)
I wasn't trying to be abusive, just framing the context that generated the humorous idea that a mean temperature cannot be defined or even be useful. On the contrary, intensive thermodynamic properties are often averaged, at least for practical purposes, the above paper notwithstanding. —Preceding unsigned comment added by Mihaiam (talk • contribs) 20:10, 10 May 2010 (UTC)
- "intensive thermodynamic properties are often averaged" Before making remarks like that you should read the link I gave you. You clearly do not understand what is involved. You can see examples of intensive properties in the article here [1] If you try to make an 'average' temperature you have to put it in another, perhaps climate related, place.
- In thermodynamics, e.g. this article, if there is a need for something like 'an average temperature' it means that there is disequilibrium of some sort, i.e. the temperature is not uniform, the entropy is not at a maximum and there will be energy flow of some sort. As I have pointed out before, the 'Greenhouse Effect' requires heat transfer from a cold troposphere to an Earth's surface that is already many degrees warmer. This is a massive breach of the 2nd law of thermodynamics, which says that heat always flows from a hot place (the Earth's surface) to a cold place (the troposphere). Claims for heat to go against a thermal gradient are the 'business' of perpetual motion inventors.
- What do you mean by 'practical purposes'? This article should be about the well-established science of thermodynamics. I have the strong impression you are not familiar with this branch of science, a rather difficult but extremely practical one.
- Referring to other contributors as denialists means quite plainly that you define their contribution as worthless, as you do here; this is clearly abuse. Please avoid these personal attacks. --Damorbel (talk) 21:09, 10 May 2010 (UTC)
- Oh good grief, enough with the long words, try reading the article: Since thermodynamics deals entirely with macroscopic measurements, the thermodynamic definition of temperature, first stated by Lord Kelvin, is stated entirely in empirical, measurable variables - any *measurement* of temperature is inevitably an average, even if over a very short time and small region William M. Connolley (talk) 21:44, 10 May 2010 (UTC)
- William, you are clearly right out of your depth. To grasp these matters properly you need to get a grip on kinetic theory and (preferably) statistical mechanics. Kinetic theory is the fundamental mechanism of heat in atoms and molecules; from there you will get a basic understanding of heat transport at the atomic level. Without this you are totally lost when dealing with matters seemingly so simple as temperature. Sorry you find the words a problem; the concepts are far from obvious but quite essential for a competent analysis. You take a very strong line on climate matters; how this is possible when you appear to have only a passing acquaintance with thermal physics I simply do not know.--Damorbel (talk) 11:40, 11 May 2010 (UTC)
- I'll ignore the fallacious appeals to expertise William M. Connolley (talk) 11:49, 11 May 2010 (UTC)
- Then I'll explain. A mean temperature only exists in an ensemble of molecules when the average momentum of the molecules is the same throughout and there is a characteristic Maxwell–Boltzmann distribution of velocities (or momenta for inhomogeneous ensembles). If an ensemble has a non-uniform average, i.e. there is a (macroscopic) difference in averages between the various parts (aka a temperature difference), then the concept of an average temperature is inapplicable. For a start the condition is unstable, because heat will transfer from the warmer part to the cooler. The effect on the temperatures of this transfer cannot be predicted without detailed knowledge of the thermal properties of the various parts, so, in this case, any figure given as 'an average temperature' is meaningless. The misunderstanding that commonly arises comes with thermometry: for practical reasons molecular temperatures are not measurable, so thermometers measure a 'bulk' temperature which averages and cannot readily detect small temperature differences; that does not mean that the concept of temperature does not extend to the microscopic level. It is similar with infrared thermometers: a mottled infrared image will always create some kind of 'temperature' reading that may be satisfactory for some purposes but scientifically it is probably worthless. --Damorbel (talk) 13:12, 11 May 2010 (UTC)
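For reference, here is a minimal numerical sketch of the point that a mixture of two equilibrium ensembles is not itself an ensemble at the mean temperature. The helium mass, the two temperatures and the speed grid are illustrative assumptions, not values from the discussion above:

    import numpy as np

    k_B = 1.380649e-23          # J/K, Boltzmann constant
    m = 6.646e-27               # kg, approximate mass of a helium-4 atom (illustrative choice)

    def maxwell_speed_pdf(v, T):
        """Maxwell-Boltzmann speed distribution at a single temperature."""
        a = m / (2.0 * k_B * T)
        return 4.0 * np.pi * (a / np.pi) ** 1.5 * v ** 2 * np.exp(-a * v ** 2)

    v = np.linspace(0.0, 4000.0, 2000)                       # m/s
    mixture = 0.5 * (maxwell_speed_pdf(v, 200.0) + maxwell_speed_pdf(v, 400.0))
    single = maxwell_speed_pdf(v, 300.0)

    # The 50/50 mixture of ensembles at 200 K and 400 K has the same mean
    # kinetic energy as a single ensemble at 300 K, yet its speed
    # distribution is not the Maxwell-Boltzmann distribution for 300 K.
    print("max pointwise difference:", np.max(np.abs(mixture - single)))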
Damorbel is right in the sense that a system that is not in thermal equilibrium but which consists of parts that are (to a good approximation) in thermal equilibrium at different temperatures, is not equivalent to a system in thermal equilibrium at some average temperature. But this (well known) issue has been abused by some people to argue against Global Warming. If we want to mention something about this, I propose we stick to the relevant physics. E.g. one can mention that temperature gradients give rise to transport phenomena (heat conduction etc.). You can then delve deeper and explain that this means that a gas that is in dynamical equilibrium where you maintain a temperature gradient cannot be described by a velocity distribution function obtained by inserting the local position dependent "temperature" in the Maxwell distribution. It is in fact the deviation from this that gives rise to the heat transport. So, in a sense this can be taken to mean that even the concept of a local thermal equilibrium breaks down.
Now, we actually all use this fact when solving heat conduction problems. We don't have to bother about the fact that the local speed distribution is not precisely Maxwellian; you only need to consider that if you also want to compute the heat conduction coefficient from first principles. But you can just as well take the heat conduction coefficient as a given, and then the local state at some point is specified by a local temperature, albeit that the distribution is not Maxwellian at the local temperature. But the point is that whatever it is, is now implicitly fixed.
This fact makes it possible to do thermodynamics in practice where things are not in thermal equilibrium. You can make models of the Earth's atmosphere and describe the time evolution using differential equations. Those equations contain transport coefficients like the viscosity, heat conduction coefficient etc. Such a description arises precisely from a perturbation away from exact global thermal equilibrium and is valid if you are close to global thermal equilibrium. But here "close" means that the deviations from the Maxwell distribution are not large. This will only break down in extremely violent processes, certainly not when you model the Earth's atmosphere. Count Iblis (talk) 14:56, 11 May 2010 (UTC)
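As an aside, the kind of near-equilibrium description sketched above (take the transport coefficient as given and evolve a local temperature field) looks, in its simplest one-dimensional form, something like this Python sketch; the diffusivity, grid and boundary temperatures are illustrative assumptions:

    import numpy as np

    # One-dimensional heat conduction, dT/dt = alpha * d2T/dx2, with the
    # thermal diffusivity alpha treated as a given transport coefficient.
    alpha = 1.0e-5                      # m^2/s, illustrative value
    L, N = 1.0, 101                     # rod length (m) and number of grid points
    dx = L / (N - 1)
    dt = 0.4 * dx ** 2 / alpha          # stable explicit time step

    T = np.full(N, 300.0)               # K, initially uniform
    T[0], T[-1] = 400.0, 300.0          # maintained temperature difference at the ends

    for _ in range(50000):
        T[1:-1] += alpha * dt / dx ** 2 * (T[2:] - 2.0 * T[1:-1] + T[:-2])

    # After a few diffusion times the profile relaxes to a nearly linear
    # steady state between the two held boundary temperatures.
    print(T[::20])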
- A recent edit by User:89.43.152.15 restored the global temperature map with the remark that the consensus in this thread is "clear enough". I see no statement here that says "okay, let's put in the temperature map". I thank Count Iblis for his/her insightful remarks, but I don't see how they address the issue of whether to have the map.
- My opinion is this: Temperature is a very fundamental property. To explain it, we should stick to simple, tangible examples. The planet Earth is arguably as far from this ideal as we could get. It would be like discussing childhood obesity in the mass article; it's an important topic, but it doesn't belong. The map does not add to the article. The text makes no reference to it. It is clutter that can be removed with no negative impact. Spiel496 (talk) 18:58, 12 May 2010 (UTC)
- I disagree with your argument, but at least it is plausible. The original argument (this article can't talk about average temperatures) clearly wasn't plausible William M. Connolley (talk) 19:09, 12 May 2010 (UTC)
- I disagree too. If we want to stick to a narrow and simple treatment we should remove the vacuum and negative temperature paragraphs, too, as they are neither fundamental nor tangible. The map illustrates the use of temperature in other sciences, which is the scope of the respective paragraph.
- The very reason given for the initial removal of the map (the alleged inappropriate use of the term in most natural sciences) is itself an example of its importance in this article, as the term is universally used nevertheless (with credible scientific motivation, IMHO).
- I don't see a vote here for removing the map, either; user:79.113.7.217 removed it without even initiating a discussion, with a contentious reason. —Preceding unsigned comment added by Mihaiam (talk • contribs) 05:55, 14 May 2010 (UTC)
- So, to paraphrase, I said "the Earth is too complex a system" and you said "then so is the vacuum". Earth---Vacuum; Everything---Nothing. They're practically opposites. I want to hear from a different editor on this matter, because I don't follow Mihaiam's logic. Spiel496 (talk) 19:27, 15 May 2010 (UTC)
- There is no logic from Mihaiam. He has something with 'denialists', and his weapons are a web page and a couple of blogs against a published paper, ad hominem attacks (denialists, contentious reason = he's a telepath?), and apparently he didn't understand several clear replies about averaging the temperature, which were given to him. As for me, I decided that I have no business with this page anymore. If the editors want to mislead people with that map, keep it here. You should also add maps on averaging intensive quantities on all pages describing them, to keep things consistent. —Preceding unsigned comment added by 79.113.2.40 (talk) 12:08, 20 May 2010 (UTC)
- Mean temperature is a useful enough concept to appear in many published papers to this very day, as a quick search may reveal:
http://arxiv.org/find/all/1/all:+EXACT+mean_temperature/0/1/0/all/0/1 —Preceding unsigned comment added by Mihaiam (talk • contribs) 06:18, 21 May 2010 (UTC)
- Thank you for proving once again my point. For your information, 'useful' is not a valid logical argument. Just because some statistical figure which isn't a thermodynamic value is used in a lot of papers doesn't mean that it should be pushed into thermodynamics.
- The presence of a map showing locations with different temperatures is only justified if it illustrates the confusion that can arise in a faulty thermodynamic analysis. The property "temperature" is quite independent of size. By assigning a temperature to a particular (macroscopic and larger) object, by definition all the parts of it must have the same temperature. In the case of a gas, Maxwell and Boltzmann recognised that the heat transmission process described by kinetic theory meant that the various particles comprising a mass of gas (macroscopic level) had different (translational) kinetic energies at the microscopic level, even at the "degree of freedom" level. What they did was to show that, when there was thermal equilibrium (no heat transport taking place at the macroscopic level), the distribution of the microscopic energies, i.e. the energies of the individual particles, followed a statistical distribution now called the Maxwell-Boltzmann distribution; it is these statistics that connect the definition of temperature at the atomic level with the bulk temperature measured by a thermometer.
- But this is as far as you can go when assigning a temperature to a multiplicity of locations. When assigning an average temperature to a large object (it doesn't have to be as big as a planet!) that is not in thermal equilibrium, i.e. one that has parts definitely at different temperatures with heat transport taking place between them according to the 2nd Law, any figure produced will be meaningless. What actually happens when defining an "average temperature" this way is that the temperatures of the different parts are added together in an undefined way, perhaps to give a publishable figure. It is fairly easy to understand why this is unreliable; should the measurement be made using infrared radiation, then two locations with the same temperature but differing emissivities will appear to have quite different temperatures. A good example is data for the Sun: its temperature is often given to four figures by measuring its radiance. But as seen from the Earth its surface is far from uniformly bright (because of limb darkening) and its spectrum is only vaguely black body, all of which makes nonsense of four-figure accuracy for the actual temperature. --Damorbel (talk) 09:47, 21 May 2010 (UTC)
- Somewhat imprecise doesn't mean useless. In many cases it's useful to define a baseline (however imprecise) in order to study the evolution of a system. —Preceding unsigned comment added by Mihaiam (talk • contribs) 11:38, 21 May 2010 (UTC)
Overview section overhaul
I've looked at this page for the first time today and the Overview section is in need of a major overhaul. The first sentence wasn't even complete. When I looked through the history to see what was deleted, I noticed that much of the information in there is incoherent and/or incorrect. I will change a few things today to fix the incomplete sentences and I will hopefully revisit this page to give it some help. What do other people think? What should be in the Overview section anyway? The stuff that is currently in there isn't really an overview of anything. Should it just be deleted? Sirsparksalot (talk) 00:39, 13 May 2010 (UTC)
Contradiction
The following is a quotation from Temperature of vacuum "If a thermometer orbiting the Earth is exposed to sunlight, then it equilibrates at the temperature at which power received by the thermometer from the Sun is exactly equal to the power radiated away by thermal radiation of the thermometer. For a black body this equilibrium temperature is about 281 K (+8 °C)." This is correct but the "black body" reservation is misleading because the equilibrium temperature is in fact independent of the optical properties of the body.
The contribution goes on to say "Since Earth has an albedo of 30%, average temperature as seen from space is lower than for a black body, 254 K, while the surface temperature is considerably higher due to the greenhouse effect." Introducing temperature as a function of the reflected radiation (the albedo) is a contradiction, since the first quoted statement makes no reference to the reflectivity of the thermometer. The first quotation remains correct because, in these conditions (no other sources or sinks of thermal energy), the absorption and emission of radiation equilibrate at the same temperature for all materials, independent of their reflectivity and transparency.--Damorbel (talk) 08:20, 21 May 2010 (UTC)
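For reference, the two figures quoted above can be checked with the usual textbook balance for a sphere at Earth's distance from the Sun: sunlight is absorbed over the cross-section and thermal radiation is emitted over the whole surface. The solar constant and the grey-body assumption in this Python sketch are assumptions of the sketch, not taken from the quoted text:

    # Radiative balance for a sphere at Earth's distance from the Sun:
    # absorbed = (1 - albedo) * S * pi * r^2, emitted = 4 * pi * r^2 * eps * sigma * T^4.
    sigma = 5.670374419e-8     # W m^-2 K^-4, Stefan-Boltzmann constant
    S = 1361.0                 # W m^-2, approximate solar constant

    def equilibrium_temperature(albedo, emissivity=1.0):
        return ((1.0 - albedo) * S / (4.0 * emissivity * sigma)) ** 0.25

    print(equilibrium_temperature(albedo=0.0))   # ~278 K, in the ballpark of the "about 281 K" quoted above
    print(equilibrium_temperature(albedo=0.3))   # ~255 K, close to the "254 K" quoted for an Earth-like albedo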
- I'm afraid the equilibrium temperature actually depends on the optical properties of the body, being lower for bodies with high reflectivity (or transparency for that matter). —Preceding unsigned comment added by Mihaiam (talk • contribs) 09:50, 21 May 2010 (UTC)
- "I'm afraid the equilibrium temperature actually depends on the optical properties" Care to justify this? It's a popular fallacy stated many many times but that doesn't make it true. I take it you mean the temperature in a thermal radiation field.
- Tell me, what would be the temperature of a 1 m spinning ball plated so that it reflects 99% of the incident radiation, in the same orbit as Earth but far away from it?--Damorbel (talk) 10:49, 21 May 2010 (UTC)
- Given some simplifying assumptions, I'll say about 89 K —Preceding unsigned comment added by Mihaiam (talk • contribs) 11:19, 21 May 2010 (UTC)
- 89 K? Ok. Now if you changed one half to carbon black (emissivity 0.99), the black side would, I imagine, be about 190 K warmer, giving you a nice thermal gradient which could be used for generating electricity, maybe 500 W/m2? Which, when scaled up a bit and sent to Earth, could save the planet from AGW. Cheers! (You can patent it too!)
- PS Nearly as good as cold fusion! --Damorbel (talk) 12:06, 21 May 2010 (UTC)
- And your point is .... ?--Mihaiam (talk) 18:41, 21 May 2010 (UTC)
- Haven't you noticed my point? If your idea that the equilibrium temperature is dependent on reflectivity etc. was true, you could be getting energy from nothing; you would have a perpetual motion scheme (or scam). The sad truth is, the equilibrium temperature of any object in a uniform radiation field is independent of its reflectivity, its colour or its transparency. History is full of such perpetual motion schemes to "get energy from nothing". Pons and Fleischmann's cold fusion scam is ever so slightly different in that they just had no way to get the energy output to exceed the input. --Damorbel (talk) 19:33, 21 May 2010 (UTC)
- The reflectivity simply doesn't play a role in all this. It just reduces the amount of incident radiation to be absorbed by the body. You do have a cold sink, as the body will radiate according to the Stefan–Boltzmann law. —Preceding unsigned comment added by Mihaiam (talk • contribs) 19:56, 21 May 2010 (UTC)
You have to make your mind up. A while ago you wrote about a body with an albedo (reflectivity) of 99% having an equilibrium temperature of 89 K; that is the role I am talking about. You refer to the Stefan-Boltzmann law, fair enough, but a body that reflects 99% of the incident radiation has an emissivity of 1%, so what emissivity did you use in your calculation? --Damorbel (talk) 20:09, 21 May 2010 (UTC)
- You are right if the emissivity of the object is the same at all wavelengths (gray body). One of the assumptions I've made is that the object has a reflectivity of 99% for the solar spectrum (UV, visible and near infrared), but 0% in the far infrared. Although not really warranted from your premises, this assumption does not contradict them, since you only specified the reflection for incident solar radiation (albedo). As it happens, most materials exhibit low infrared reflectivity:
http://minerals.gps.caltech.edu/FILES/Infrared_Reflectance/index.htm FWIW, albedo is a more specific form of reflectivity, over the incident light wavelengths. —Preceding unsigned comment added by Mihaiam (talk • contribs) 23:26, 21 May 2010 (UTC)
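For reference, the figure discussed above follows from the same radiative balance once the solar-band absorptivity and the thermal-infrared emissivity are allowed to differ. The numbers in this Python sketch (solar constant, the 1%/100% split) are illustrative assumptions matching the scenario described, not measured values:

    sigma = 5.670374419e-8     # W m^-2 K^-4, Stefan-Boltzmann constant
    S = 1361.0                 # W m^-2, approximate solar constant at 1 AU

    def equilibrium_temperature(alpha_solar, eps_ir):
        """Sphere absorbing a fraction alpha_solar of sunlight over its
        cross-section and emitting with emissivity eps_ir over its whole surface."""
        return (alpha_solar * S / (4.0 * eps_ir * sigma)) ** 0.25

    # 99% solar reflectivity but near-total far-infrared emissivity: ~88 K.
    print(equilibrium_temperature(alpha_solar=0.01, eps_ir=1.0))
    # A gray body (absorptivity equal to emissivity at all wavelengths) stays near ~278 K.
    print(equilibrium_temperature(alpha_solar=0.01, eps_ir=0.01))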
- Please stop the discussion and think. IF you have a thermodynamic heat bath, that is a radiation gas, and IF you talk about thermodynamic EQUILIBRIUM, then you MUST have the object in equilibrium with the radiation. That means its temperature is independent of the optical properties. If it weren't so, you could easily make a perpetuum mobile of the 2nd kind. Now in the Earth's case, that is not the situation. The radiation coming from the Sun is not at equilibrium; it would have to be at 2.725 K to be so. Or it would have to come from all around as the one from the Sun, that is to be isotropic and homogeneous, which would mean the Sun's "temperature" at Earth level. In the case of NON-equilibrium, in a simple system which does not rotate, you have to take into account the following: the albedo of the exposed part, for both incoming and outgoing radiation; the albedo of the shadowed part, for outgoing radiation; and the thermal properties of the body, that is its thermal conductivity. You would have in the body a thermal gradient, which means a thermodynamic non-equilibrium situation. For reasons already explained, it is meaningless in a physical sense to talk about 'the average temperature' of the body. —Preceding unsigned comment added by 79.113.15.75 (talk) 06:41, 22 May 2010 (UTC)
Equilibrium temperature in a way does have to do with reflectivity, and in a much more meaningful way does not. As described, I gather that while equilibrium temperature can appear to be AFFECTED by the reflectivity of various energies, it is however not a FUNCTION of reflectivity. In improper testing which does not take into account the reflectivity properties of the material used in the experiment, the equilibrium temperature of the material would appear higher than it actually is.
I.e., equilibrium temperature is the POINT at which the energy absorbed by the material is exactly equal to the energy released by the material. However a material may reflect a certain percentage of energy, never absorbing it in the first place, thereby artificially inflating the equilibrium temperature in improper observations.
However the equilibrium temperature is actually the point at which the energy ABSORBED is the same as the energy RELEASED by the substance, so you can subtract the amount of radiation REFLECTED as irrelevant to the equation.
That is NOT to say that a material which is reflective will not need more energy expended to reach the equilibrium temperature, or to say that an object cannot be higher or lower than its equilibrium temperature. —Preceding unsigned comment added by QSquared (talk • contribs) 05:57, 22 May 2010 (UTC)
Metrology
What are we to make of the last four words of this: 'Fahrenheit's scale is still in use in the USA, with the Celsius scale in use in the rest of the world and the Kelvin scale.'? 213.122.28.238 (talk) 16:10, 24 June 2010 (UTC)
- Yeah, that phrase was pretty garbled. I shortened the sentence to eliminate it. At that place in the article, there's no need to go into depth about where each unit is used. Spiel496 (talk) 21:24, 24 June 2010 (UTC)
Introduction
The opening statement "In physics, temperature is the average energy in each degree of freedom of the particles in a thermodynamic system" is complete nonsense, if it were true temperature would be measured in Joules, not K.--Damorbel (talk) 15:25, 7 January 2010 (UTC)
- Absolutely right. I have changed the introduction to not only reflect this, but also to state that the scientific definition of temperature is in the realm of thermodynamics, not statistical physics. It was defined by Lord Kelvin, before the advent of statistical physics. Statistical physics provides an explanation and a deeper understanding of temperature rather than a definition. I'm just worried that the introduction now sounds too technical in the beginning. PAR (talk) 16:45, 7 January 2010 (UTC)
- Thanks for your rapid response. I now have second thoughts about the nonsense bit and I am looking at your revision with a great deal of interest. --Damorbel (talk) 20:17, 7 January 2010 (UTC)
The opening statement "In physics, temperature is the average energy in each degree of freedom of the particles in a thermodynamic system" is complete nonsense, if it were true temperature would be measured in Joules, not K.
- Your statement is simply false as K is a unit of measure denoting average kinetic energy. Joules measure energy, K measures average energy. Therefore, the statement, "Temperature is the measure of the average kinetic energy of the particles in a substance," is absolutely correct. I have changed the first sentence to reflect this. Happy editing, hajatvrc with WikiLove @ 16:44, 3 July 2010 (UTC)
- I'm going to step in and agree with Damorbel and PAR. First, K is not a unit denoting average kinetic energy, as K is not a unit of energy. Second, consider that two completely different systems with the same temperature can have different average kinetic energies. One simple example of this would be two separate containers, one filled with iodine gas and one filled with an equal amount of hydrogen gas. Both molecules are homo-nuclear diatomics and can be treated similarly. They should each have the same average kinetic energy of translation and rotation. However, due to the different vibrational frequencies of hydrogen and iodine molecules, they will have drastically different amounts of vibrational energy. This means that the two containers, despite being at the same temperature, have very different average energies. The average energy of each system will scale with the temperature, but that does not mean that temperature is a direct measure of the average energy. To measure the average energy of the system, the heat capacity of the system (which relates the average energy to the temperature) must be known. While it may seem like semantics, the opening statement should be treated rather delicately. Personally, my vote would be that it should stress that temperature is related to the average kinetic energy and is not a direct measure of it. --Sirsparksalot (talk) 17:30, 27 July 2010 (UTC)
- No, the mean kinetic energy is only a function of temperature, not the type of gas. Kbrose (talk) 19:05, 12 September 2010 (UTC)
Temperature is the frequency of the average molecular vibration. If temperature were the measure of energy, why then does a pound of steam at 212 degrees F and atmospheric pressure have much more energy than a pound of water at 212 degrees F and atmospheric pressure? The molecules in the steam have a much higher velocity than the molecules in the water; however, the average frequency of vibration of the molecules in the steam and water is the same. The temperature of colors is also a frequency. Geweber (talk) 16:58, 12 September 2010 (UTC)
- Amusing! The choice of units for temperature and kinetic energy is merely a historical artifact. The mean kinetic energy is only a function of temperature, not the type of gas. This is why it can be used as a definition of temperature. In fact, when using natural units, the two only differ by a factor of 2: E=T/2. History could have equally decided to call the kinetic energy itself the temperature, but instead we have the correlation through the Boltzmann constant. Kbrose (talk) 19:05, 12 September 2010 (UTC)
- The mean kinetic energy is most definitely dependent upon the identity of the substance. If it were only dependent upon the temperature, then all substances would have the same heat capacity! Also, what about phase changes? Consider a solid substance with a temperature that is just an infinitesimal amount below the melting point and compare that to the same substance as a liquid with a temperature an infinitesimal amount above the melting point. For all intents and purposes, the temperature of both systems is the same (only infinitesimally different), yet the internal energy of the substances is COMPLETELY different. This is because immense amounts of energy had to be put into the substance so that it could undergo a phase transition from solid to liquid. Here we have two systems with the same composition and mass with practically identical temperatures that have drastically different energies. The mean kinetic energy is not a function of only the temperature. Sirsparksalot (talk) 22:38, 16 September 2010 (UTC)
- You're confusing different forms of energy. Kinetic energy: E = 3/2 kT; there is no dependency on anything but T. The article states it too, as does any thermal physics text book. You couldn't construct a useful thermometer if this weren't so. It's the zeroth law of thermodynamics. Your model of heat capacity has similar flaws, and internal energy is not just kinetic energy. Kbrose (talk) 23:30, 16 September 2010 (UTC)
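For reference, a short Python illustration of the E = 3/2 kT statement for translational motion; the molecular masses are standard values and the temperature is an arbitrary example (the quantum caveats about vibrations raised elsewhere in this thread are not addressed here):

    from math import sqrt

    k_B = 1.380649e-23      # J/K, Boltzmann constant
    T = 300.0               # K, arbitrary example temperature

    # Mean translational kinetic energy per molecule: the same for every
    # ideal gas at this temperature, regardless of the molecular mass.
    print("mean translational KE:", 1.5 * k_B * T, "J")

    # The rms speed, by contrast, does depend on the molecular mass.
    for name, m in [("H2", 3.35e-27), ("I2", 4.21e-25)]:      # kg, approximate masses
        print(name, "v_rms =", round(sqrt(3.0 * k_B * T / m)), "m/s")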
- I don't even understand what you are saying. If you are going to disagree with something, please give reasonable explanations. How does the model of heat capacity have similar flaws? Also, don't refer to the fact that the "article states it" because the fact that the article states it is the precise reason this debate is taking place. Finally, please provide a citation for one of these books. I would be more than happy to look it up to try to better understand the argument you are making. Sirsparksalot (talk) 01:58, 17 September 2010 (UTC)
I'm going to apologize for beating a dead horse here, but I am once again going to state that the kinetic energy is dependent upon the identity of the substance. Maybe my argument above was a bit unclear because I used the internal energy, but the ideas are still in order. The problem that I have with saying that the average kinetic energy is E = 3/2 kT, where 1/2 kT comes from each degree of freedom, is that it ignores quantum effects. While I do not disagree with this formulation, it has marked limitations that must be considered. This result is an application of the equipartition theorem and is shown quite clearly when considering kinetic molecular theory. The result E = 3/2 kT only takes into account the classical motion of particles and will only work for quantum systems when kT is much larger than the energy spacing that arises from quantization. Examples of this include translational and rotational degrees of freedom, which is why E = 3/2 kT works for monatomic ideal gases, and a few extra 1/2 kT can be included to account for rotational energy; the number of 1/2 kT that are included depends on the number of non-equivalent axes of rotation. When vibrations are considered, however, this approximation breaks down, because the spacing between adjacent vibrational energy levels can typically be much higher than kT. What this means is that the population distribution of molecules in each of the energy levels is weighted toward states of lower energies according to the relevant Boltzmann factors. Taking these Boltzmann factors into account when calculating the average kinetic energy gives a value that can be much different from the classical result of E = 3/2 kT.
Using the example I used above that compared the kinetic energy of hydrogen and iodine gases, this effect is extremely apparent. Rotation and translation of the two species can be treated classically, offering 1/2 kT to the average kinetic energy for each degree of freedom. This means that the rotational and translational energies of each gas should be the same. Vibrations, on the other hand, must be treated quantum mechanically. Hydrogen gas has a vibrational frequency of approximately 4000 cm-1. This value is far greater than kT at room temperature, which is approximately 200 cm-1. The fractional population of molecules in the first excited vibrational level will be negligible when compared to the population of molecules in the ground vibrational state (very close to unity!) and will thus contribute very little to the average kinetic energy. This case is much different from iodine, which has a vibrational frequency of approximately 200 cm-1, on the order of the 200 cm-1 of kT available at room temperature. This means that the relative populations of the ground and first excited vibrational energy levels will be approximately 3:1, which are comparable to one another (but still different enough that E cannot be approximated classically) and will contribute much more to the average kinetic energy than did the first excited state of hydrogen. This example illustrates that the two different gases, hydrogen and iodine, have drastically different energies due to the quantum mechanical treatment of vibrations.
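For reference, a small numerical check of the comparison above, using approximate vibrational wavenumbers and harmonic-oscillator Boltzmann factors; the exact frequencies and the room-temperature value of kT are illustrative assumptions:

    from math import exp

    kT_cm = 208.5            # k_B*T near 300 K, expressed in cm^-1

    def vibrational_stats(nu_cm):
        """Harmonic-oscillator Boltzmann statistics at room temperature.
        Returns the n1/n0 population ratio and the mean vibrational energy
        above the zero point, in cm^-1."""
        x = nu_cm / kT_cm
        return exp(-x), nu_cm / (exp(x) - 1.0)

    for name, nu in [("H2", 4160.0), ("I2", 214.0)]:   # approximate vibrational wavenumbers, cm^-1
        ratio, e_vib = vibrational_stats(nu)
        print(name, "n1/n0 =", round(ratio, 4), " <E_vib> =", round(e_vib, 1), "cm^-1")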
Another consideration that should be made is the effect of making the assumption that E = 3/2 kT. This is one of the underlying assumptions made in the derivation of the heat capacities of solids (Dulong-Petit). While this law did hold at higher temperatures, it utterly failed at low temperatures. This is because it did not take into account energy quantization. By considering these effects, the later models of Einstein and Debye did a much better job at explaining the temperature dependence of the heat capacity. A similar case is how the classical treatment fails to reproduce blackbody emission spectra; quantization had to be taken into account.
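For reference, the temperature dependence described above is easy to reproduce with the Einstein model; the Einstein temperature in this Python sketch is an illustrative assumption, not the value for any particular solid:

    from math import exp

    R = 8.314                 # J mol^-1 K^-1, gas constant

    def einstein_heat_capacity(T, theta_E):
        """Molar heat capacity in the Einstein model: it approaches the
        Dulong-Petit value 3R at high T and falls to zero as T -> 0."""
        x = theta_E / T
        return 3.0 * R * x ** 2 * exp(x) / (exp(x) - 1.0) ** 2

    theta_E = 300.0           # K, illustrative Einstein temperature
    for T in (10.0, 100.0, 300.0, 1000.0):
        print(T, "K:", round(einstein_heat_capacity(T, theta_E), 2), "J/(mol K)")
    print("Dulong-Petit limit:", round(3.0 * R, 2), "J/(mol K)")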
Hopefully this does a good job at showing how temperature is not simply equal to the average kinetic energy of the system in question. Including quantization is a necessary step and it cannot be overlooked. While I am not opposed to putting E = 3/2 kT into the article, I feel that it needs to have a bit of a clause attached to it. It needs to be made clear that this result is the consequence of treating the system classically and will only hold at high temperatures where kT is much larger than the energy spacing that arises from including quantization. At low temperatures, or in systems where quantum effects are strong, this result breaks down and other methods are required to describe the average kinetic energy of a material. Sirsparksalot (talk) 16:36, 21 September 2010 (UTC)
- I agree with Sirsparksalot. Temperature is not kinetic energy. Kinetic energy is e = 1/2 m v^2; where is the "T"? A heat source has a temperature level to it, but it is not energy. In some systems the velocity varies with the temperature, but even then velocity by itself is not energy; there has to be some mass. Since absolute zero temperature is defined as no molecular motion, you could say temperature is related to the motion of molecules. It is not the speed of the molecules, since the speed of the molecules of water at saturation is different between the liquid and the gas yet the temperature is the same. The remaining characteristic of molecular motion is the frequency of vibration. So it seems temperature is the frequency of the molecular vibration.Geweber (talk) 02:05, 22 September 2010 (UTC)
- I doubt Sirsparksalot agrees with you, however. A monatomic gas has a temperature, yet it does not vibrate. A collection of identical harmonic oscillators all vibrate at a fixed frequency, yet that system can have an arbitrary temperature. Spiel496 (talk) 05:55, 22 September 2010 (UTC)
- I don't have a good answer for you, Sirsparksalot, but the kinetic energy statement confuses me as well. I'm not saying it's incorrect, but I would like to read a clarification that goes beyond "it's in a physics text book". Spiel496 (talk) 21:16, 22 September 2010 (UTC)
Definition of temperature in Lead
The definition of temperature in the lead section gets reworded from time to time, but it is generally a variation of "the measure of the average kinetic energy of the particles". I'm uncomfortable with that wording for two reasons. First, that it doesn't address the random nature of thermal energy. Second, that it ignores the potential energy in solids.
The randomness of the motion seems important if we consider, for example, a liquid-helium-cooled bullet propelled at 100 m/s. Although the individual lead (Pb) atoms have an average kinetic energy that is consistent with 300 Kelvin, one would say the temperature is really more like 4 Kelvin.
Second, the potential energy is important in solids. I remember reading that half the thermal energy of a solid is in the form of potential energy. It makes sense, considering that the energy of a harmonic oscillator cycles between all kinetic and all potential. Now, we wouldn't want to count the static energy of chemical bonds as thermal energy, so I would make the distinction that it's the component of potential energy that is rapidly being transformed from and into kinetic energy.
Taking these two points together, I would propose, "Temperature is a measure of the energy associated with the random motion of the individual particles in matter." I say "associated with" to attempt to include the transient potential energy, but not the static energy of the chemical bonds. And I specify "random" to rule out the component due to center-of-mass motion, or some weird oscillation where all the atoms move together. Does this make sense? Spiel496 (talk) 00:53, 14 September 2010 (UTC)
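For what it's worth, the half-kinetic, half-potential split mentioned above is easy to check numerically for a single classical harmonic oscillator; the mass, spring constant and amplitude in this Python sketch are arbitrary illustrative values:

    from math import cos, sin, pi

    # One classical harmonic oscillator, x(t) = A*cos(w*t). Averaged over a
    # full period, kinetic and potential energy each carry half the total.
    m, k, A = 1.0, 4.0, 0.5          # arbitrary mass, spring constant, amplitude
    w = (k / m) ** 0.5
    period = 2.0 * pi / w
    n = 100000
    ke = pe = 0.0
    for i in range(n):
        t = period * i / n
        ke += 0.5 * m * (A * w * sin(w * t)) ** 2
        pe += 0.5 * k * (A * cos(w * t)) ** 2
    print("<KE> =", ke / n, " <PE> =", pe / n, " total E =", 0.5 * k * A ** 2)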
- No, sorry. An oscillator constantly exchanges potential energy into kinetic energy and back at its characteristic frequency. So potential energy is a driving force, but it is only the kinetic energy that is temperature, by definition. While thermal energy is 'generated' from the stored potential energy of every contributing oscillator, it is the kinetic energy (=thermal energy) that is transferred and stored as another form of potential energy in a neighboring system. The randomness is expressed in the fact that it is the mean kinetic energy of an ensemble; that term implies a statistical sampling. Your example is no different from an atom flying through a gas: internally it has oscillators and perhaps a characteristic temperature. The temperature of the bullet is one thing: it is an ensemble of oscillators that have a mean kinetic energy, and the bullet therefore has a temperature internally, but the bullet's speed is another story. A single flying bullet doesn't have a 'temperature', even if you can calculate one with a formula, unless it collides with something. If you want to consider a gas of such bullets, then your container, the system, has a temperature as a whole, and the situation is identical to a container of gas. The definition of temperature is based on kinetic energy, not internal energy, which is the sum you're talking about. This is described in every elementary physics book. Today's definition of temperature is merely a consequence of scientific history. It could have happened that they decided to state: mean kinetic energy = temperature, and the world of physics would likely still be whole. The use of natural units comes close to this ideal, E=T/2. Kbrose (talk) 02:12, 14 September 2010 (UTC)
- I appreciate you taking the time to reply to my post. But rather than explain why my wording is wrong, you have simply repeated the statements that I called into question. For example "it is only the kinetic energy that is temperature". Well, maybe not. I'm hoping there's more to go on than a definition. And throughout your paragraph, you make some statements that are either completely false or, at best, ambiguous:
- "potential energy is a driving force" -- No. Energy is not force.
- "The term [mean] implies a statistical sampling" -- No. It is a sum of terms, divided by the number of terms.
- "A flying bullet doesn't have a temperature" -- ??. You contradict this yourself in the preceding sentence.
- "internal energy ... is the sum you're talking about" -- No, internal energy includes the energy of chemical bonds, which I specifically omitted.
- "This is described in every elementary physics book." -- There are plenty of physics books which make no mention of thermodynamics.
- Pardon me for being so nit-picky, but even if my proposed definition of temperature is lacking, it is not as wrong as some of the statements you've used to dispute it. I think I get your point about the bullets, though; their internal temperatures could differ from the temperature inferred by the CM motion of the larger bodies. It's too artificial an example.
- Maybe this would be a quicker way to get to the truth: can someone come up with a thought experiment where I get the wrong answer if I define temperature to include the potential energy? For example, what if we had two systems exchanging thermal energy: one is a monatomic gas, in which potential energy plays no role; the other is an ensemble of harmonic oscillators (masses on springs), in which potential energy plays an important role. Transfer some thermal energy Q from the monatomic gas to the harmonic oscillators. The total kinetic energy of the gas has decreased by Q. Is it true that the kinetic energy of the harmonic oscillators has increased by Q? Spiel496 (talk) 06:25, 14 September 2010 (UTC)
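For what it's worth, under classical equipartition the answer to the question above works out to "no": the oscillators' kinetic energy rises by only Q/2, the other half going into potential energy. A minimal Python sketch, with particle numbers and starting temperatures chosen arbitrarily for illustration:

    k_B = 1.380649e-23                 # J/K, Boltzmann constant
    N_gas, N_osc = 1.0e22, 1.0e22      # arbitrary particle numbers
    T_gas, T_osc = 400.0, 200.0        # arbitrary starting temperatures, K

    # Classical heat capacities: 3/2 k per gas atom (all kinetic),
    # 1 k per one-dimensional oscillator (half kinetic, half potential).
    C_gas = 1.5 * k_B * N_gas
    C_osc = 1.0 * k_B * N_osc

    T_final = (C_gas * T_gas + C_osc * T_osc) / (C_gas + C_osc)
    Q = C_gas * (T_gas - T_final)                       # heat leaving the gas
    dKE_osc = 0.5 * k_B * N_osc * (T_final - T_osc)     # kinetic part of the oscillators' gain

    print("final temperature:", T_final, "K")
    print("Q:", Q, "J   oscillator kinetic-energy gain:", dKE_osc, "J")   # gain is Q/2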
- This is not a forum to philosophize about the merits of established physics. If you want to do that you can do that elsewhere. If you don't accept the definition of temperature and want to propose a new one, Wikipedia is not the place for original research, we report established knowledge. Take a physics book that has a thermodynamics chapter and find out the definition. It is already stated in this article, albeit not in great presentation. If you did notice that the definition here always involves kinetic energy, then perhaps you should take that as a signal. If you like a different but equivalent definition from state function approach, then the article also provides the differential of internal energy wrt entropy. Kbrose (talk) 14:34, 14 September 2010 (UTC)
- Please understand, I'm not trying to rewrite physics; I wanted merely to direct the conversation to a more concrete example. If it was too rambling or philosophical, then just ignore it.
- Here's my point more concisely: When a solid is heated, half of the thermal energy ends up as potential energy. Yet temperature is said to be a measure of just the kinetic energy. That sounds enough like a contradiction that I suspected that "kinetic" was being used loosely. Apparently it is literal, which surprises me.
- Regarding the issue of the randomness (of the motion associated with temperature) it was a mistake to bring it up in the same thread. If no one has anything useful to say about it, I'll bring it up in a separate section. Spiel496 (talk) 23:50, 14 September 2010 (UTC)
- It is quite mistaken to consider temperature as a measure of energy. I pointed this out here http://en.wikipedia.org/wiki/Talk:Temperature#Introduction but it seems that active editors want it this way, even though few report 'temperature' as 273 J; practitioners of thermodynamics much prefer 273 K, and 0 °C is another popular way of saying the same thing. But this is Wikipedia and, in practice, anybody can write what they like about physics. 'Temperature' is inextricably bound up with the Boltzmann constant, which isn't even mentioned in the article. I would suggest that those editing or accepting the definitions in the current version of this entry should reconcile their ideas with the Boltzmann constant, otherwise they will end by making a mess like the present one.--Damorbel (talk) 07:30, 16 September 2010 (UTC)
- The state of the article leaves a lot to be desired, indeed, but this is partially the result of following advice like yours. The practice of expressing temperature in units of energy is commonplace in physics; see for example the wide-spread use of the unit electron volt in plasma physics. In the history of thermodynamics, temperature was originally treated as synonymous with heat (caloric). When using natural units, k=1, temperature has the unit of energy, and this expresses the natural physics of temperature without human construct; many physicists don't accept the notion that it should be a separate physical base property in a system of units. The involvement of the Boltzmann constant is only an artifact of the choice of system of units. Kbrose (talk) 19:16, 16 September 2010 (UTC)
- "the result of following advice like yours". Hmmm! You say "The practice of expressing temperature in units of energy is common-place". You can read what it says about using the electronvolt as a measure of temperature here: [[2]] where it says "In certain fields, such as plasma physics, it is convenient to use the electronvolt as a unit of temperature. The conversion to kelvins (symbol: uppercase K) is defined by using kB, the Boltzmann constant". Please note multiplication by the Boltzmann constant which puts the temperature proportional to the energy per degree of freedom.
- Temperature is an intensive property; that is why energy flows from regions where the intensity is high to where it is low, and vice versa.
- An energy measure (joules, ergs) is not intensive; it is just a general statement about the energy in a system. It says nothing about the size of the system, in contrast to "energy per degree of freedom", which severely limits the size of the 'system'. Similarly, a measurement of energy says nothing at all about the tendency of the energy to flow in one direction or another.--Damorbel (talk) 15:28, 26 September 2010 (UTC)
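For reference, the electronvolt-to-kelvin conversion mentioned above is just a division by the Boltzmann constant; a short Python check, where the 300 K example is an arbitrary illustration:

    k_B = 1.380649e-23       # J/K, Boltzmann constant
    eV = 1.602176634e-19     # J per electronvolt

    def electronvolt_to_kelvin(E_eV):
        """Temperature corresponding to a given energy via E = k_B * T."""
        return E_eV * eV / k_B

    print(electronvolt_to_kelvin(1.0))     # ~11600 K: "1 eV" quoted as a plasma temperature
    print(k_B * 300.0 / eV)                # ~0.026 eV: room temperature expressed as an energy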
- I understand the point on natural units, but I think the purpose of natural units is being confused. Natural units are meant to normalize physical variables to one another. This is where the k=1 comes in. It simply normalizes the magnitude of k to 1. This does not mean that the units of one quantity are the same as the units of another; it just means that the magnitude of the units is the same. Think of them as being unit vectors. Just because unit vectors all have the same magnitude doesn't mean they all point in the same direction (which they don't - they're orthogonal). Please see the comments I made above that address how temperature is not a direct measure of the average energy of a system. (http://en.wikipedia.org/wiki/Talk:Temperature#Introduction) Sirsparksalot (talk) 00:11, 17 September 2010 (UTC)
- Well, apparently you don't understand the natural units. An equation in physics is always an equation with units and all. Natural units bear out the natural physics of things and simplify dimensional analysis. Kbrose (talk) 22:55, 24 September 2010 (UTC)
- It is still a matter of convention to assign incompatible dimensions to different physical quantities. Of course, temperature is not the same as energy, but then potential energy is not the same as kinetic energy either. Yet we don't measure potential energy in different units from those we use for kinetic energy. Nothing would stop one from doing that, though (you would then get a dimensionful conversion factor in the energy conservation law). Count Iblis (talk) 00:23, 17 September 2010 (UTC)
- Damorbel, can you clarify a bit? It's not clear to me what wording you would propose. Would the phrase, "temperature is proportional to the average kinetic energy" be better? (Just consider the Kelvin scale for the moment.) Spiel496 (talk) 00:38, 17 September 2010 (UTC)
- "clarify" - "Temperature at the microscopic level is the measure of the energy per degree of freedom (the energy in a degree of freedom)." In a macroscopic system "temperature is the mean energy per degree of freedom". This is where the Maxwell-Boltzmann distribution comes is, it recognises that in a kinetic system of freely interacting particles, possible with different numbers of degrees of freedom, they all will have different energies at any given instant but the average over time will represent the temperature just as if all degrees of freedom had the same energy. --Damorbel (talk) 15:28, 26 September 2010 (UTC)
- Maybe point wasn't as clear as I had hoped. I was essentially trying to suggest that using dimensionless units isn't a good way of equating energy and temperature. Count Iblis makes an excellent point. Just because two quantities have the same units, doesn't mean they are the same thing. Sirsparksalot (talk) 01:25, 17 September 2010 (UTC)
The concept of equilibrium and the idea that a large number of particles is involved are both vital to the definition of temperature. Without equilibrium, temperature is undefined. The fewer the number of particles, the less useful the idea of temperature becomes.
Neglecting quantum effects, at equilibrium, for a sufficient number of particles (N), whose center of gravity is at rest, each degree of freedom will have an average energy of NkT/2. For example, if you have a monatomic gas (three degrees of freedom) and a bunch of linear harmonic oscillators (one degree of freedom) that have equilibrated, then they will all be at the same temperature and each degree of freedom will have, on average, an energy of NkT/2 where N is the number of particles, or oscillators, as the case may be. The energy of a linear oscillator is the sum of its potential and kinetic energy. Statistically, half of that total energy will be kinetic, half potential.
In the case of a solid, if the molecules are essentially point particles vibrating about a center of attraction, they will have three degrees of freedom, each degree of freedom having total energy NkT/2. At equilibrium, statistically, half of that energy will be potential, half kinetic. Again, NkT/2 will be the total energy per degree of freedom, NkT/4 will be the kinetic energy per degree of freedom.
Just as in the case of entropy, there is a thermodynamic definition of temperature, and then there is the statistical mechanical explanation of temperature. The thermodynamic definition involves the experimental fact that all dilute gases behave as ideal gases, etc. Statistical mechanics provides an explanation of temperature by showing that, at equilibrium, the average energy (kinetic plus potential) per particle of a body with a large number of particles is equal to fkT/2, where f is the number of degrees of freedom available to that particle, k is Boltzmann's constant, and T is temperature. PAR (talk) 17:14, 25 September 2010 (UTC)
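For reference, the kT/2-per-degree-of-freedom statement is easy to check numerically for a single translational degree of freedom by sampling the equilibrium (Gaussian) velocity distribution; the helium mass and the 300 K temperature in this Python sketch are illustrative assumptions:

    import numpy as np

    k_B = 1.380649e-23          # J/K, Boltzmann constant
    T = 300.0                   # K, illustrative temperature
    m = 6.646e-27               # kg, approximate mass of a helium atom (illustrative)
    rng = np.random.default_rng(0)

    # Sample one velocity component of many particles from the equilibrium
    # (Gaussian) distribution; its mean kinetic energy should be k_B*T/2.
    vx = rng.normal(0.0, np.sqrt(k_B * T / m), size=1_000_000)
    print(np.mean(0.5 * m * vx ** 2), "vs", 0.5 * k_B * T)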
- PAR, perhaps I misunderstand you when you write "kinetic plus potential" (energy) in relation to the temperature of a particle. I do not think potential energy directly affects the temperature of particles. For example, gas moving "upwards" in a gravitational field will cool because part of its thermal (kinetic) energy is converted to potential energy; it cools because there is less energy in the 'kinetic' energy component. --Damorbel (talk) 15:28, 26 September 2010 (UTC)
- Hmm - I wish I could give a good answer for that off the top of my head. But it shows why equilibrium is important. For a gas of point particles in a gravitational field, at equilibrium, I believe the temperature will be the same at any height, while the density (and therefore pressure) drops off as you go higher. The internal energy density will be higher, closer to the "ground" because of the potential, but the temperature will be the same, which means the average kinetic energy per molecule will be the same at any height. But if the molecules are point particles, they still only have three degrees of freedom. This is not the same as the case where, for example, the molecules are like two point atoms connected by a spring. Let's say, for simplicity, all the springs are oriented in the same direction. Then each molecule has four degrees of freedom, three for the center of mass, and one to describe their separation. Again, the kinetic energy of the molecular centers of mass will reflect the temperature, each of the three degrees of freedom of the center of mass will have energy kT/2 per molecule. But the molecules will also be vibrating about their center of mass, and this degree of freedom will also have energy kT/2, part kinetic, part potential. In other words, the kinetic energy of the center of mass will not be equal to the sum of the kinetic energies of each atom in the molecule. The extra kinetic energy is part of the fourth degree of freedom energy, which is the sum of the kinetic energies relative to the center of mass plus the potential energy due to the stretching or compression of the spring. I know, this is a description, rather than an explanation, but I guess I will have to think about it some more in order to give a good explanation. PAR (talk) 17:22, 28 September 2010 (UTC)
- PAR you wrote "I believe the temperature will be the same at any height" Really? I thought it was common experience that the temperature in the troposphere falls with height at a 'Lapse Rate'. Most passenger flights experience outside temperatures below -50C. --Damorbel (talk) 20:31, 2 October 2010 (UTC)
- In an idealized case where there is simply a gas at equilibrium in a gravitational field, I think the temperature will be the same. Once you shine the sun on it, everything changes. The ground absorbs the sun's light and heats up, heating the gas at ground level. The gas absorbs heat from the sun, there is convection, with hot gases rising, and cooling as their density decreases, etc. etc. For the first (ideal) case, I haven't really seen the analysis, I'm only saying the temperature is the same from the idea that there can never be temperature gradients in an isolated system in equilibrium. The first case is isolated, the second case, with the sun shining on it, is not. I have a suspicion this might not be so simple for low density gases, where the molecules travel so far between collisions that their change in potential energy between collisions is of the same order as their kinetic energy. But this is straying from the subject - I still believe that the temperature of a gas is proportional to the kinetic energy of the center of mass of the molecule, but not to the kinetic energies of the individual atoms in the molecule, and this distinction should be made. PAR (talk) 23:09, 2 October 2010 (UTC)
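A small sketch of the idealized isothermal column PAR describes (ideal gas of point particles, uniform gravity, no sun): the barometric relation n(h) = n(0)*exp(-mgh/kT) lets density and pressure fall with height while the temperature stays uniform. The molecular mass and temperature below are assumed example values (Python):
import math

k = 1.380649e-23   # Boltzmann's constant, J/K
m = 4.65e-26       # kg, roughly the mass of an N2 molecule (assumed)
g = 9.81           # m/s^2
T = 288.0          # K, uniform in the idealized equilibrium case

def relative_density(h):
    # number density at height h relative to the ground, isothermal ideal gas
    return math.exp(-m * g * h / (k * T))

for h in (0.0, 1000.0, 5000.0):
    print(h, relative_density(h))   # density drops with height; T does not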
- Count Iblis, above you write:"It is still a matter of convention to assign incompatible dimensions to different physical quantities." Which leaves me completely baffled. It is highly likely that I do not grasp your point, can you help me? --Damorbel (talk) 20:31, 2 October 2010 (UTC)
The point I'm making is that one can do without units in physics, provided one has enough knowledge about (and access to) the fundamental relations between quantities. E.g. a classical physicist will lack knowledge about the fundamental equations that relate energy to mass, distances to time intervals, mass to length etc. Even if one has knowledge of these relations, to do accurate measurements one may need to resort to physical representations of units that cannot be related to each other. In that case, what happens is that some relations that express identities come with conversion factors that will have experimental errors. You can define a set of units where you set all these conversion factors equal to 1, but that may not yield the most accurate measurement system. Count Iblis (talk) 23:01, 2 October 2010 (UTC)
Definition with Second Law of Thermodynamics
I think we should remove that section. That definition is of the thermodynamical scale rather than of temperature itself. For example, the old Celsius scale, defined by setting water's freezing point at 0 and boiling point at 100 with the assumption that mercury expands linearly with temperature, is indeed a scale of temperature, but apparently in disagreement with the thermodynamical scale.--Netheril96 (talk) 08:50, 5 October 2010 (UTC)
Definition of temperature in Lead (2)
I have opened this section because there is at least one real problem with the whole article and to maintain any sort of intellectual respectability it should be resolved. Temperature is not a measure of energy nor is it a general statement about energy. Energy is measured in joules, ergs, electron volts, Btu, calories, kWh whereas temperature is measured in K, °C etc., etc.
Ultimately the article should be corrected but I have no intention of starting an edit war with so many editors convinced that temperature is a measure of energy. Lack of agreement and consequent edit warring is one of the depressing features of Wikipedia and I would like to find some way to avoid it. So if you are convinced that temperature is a measure of energy please state your case here and at the same time explain why K, °C etc., etc. can be replaced by J, cal, erg etc. without introducing error.--Damorbel (talk) 06:33, 7 October 2010 (UTC)
- It has already been explained to you and others (above) that this is the language of physics and that even a unit of energy is an acceptable unit for measurement of temperature and the use of today's temperature scales is merely a historical artifact introducing a proportionality constant. You have been shown the equivalence of energy and temperature in natural units, the fundamental expression of the underlying physics of this aspect. Even the equation E=1/2 kT makes this clear, as there is no other physical concept entering but the equivalence of kinetic energy and temperature with a constant conversion factor that does nothing but reformulate the system of units. If you still don't believe this, then why don't you go and do your own private research, which you should have done when this was first explained to you, before expecting other editors to give you a personal physics lesson here. There are myriads of books that explain it in simple language and you will find the terminology used here in every physics book. This is not a discussion forum to find an audience for your views or to get a personal lesson. Kbrose (talk) 18:39, 7 October 2010 (UTC)
- Well, I hate edit wars too. A "measure" of a physical quantity is something which has a one-to-one relationship with that quantity (A bijection) for the given situation. The two quantities need not have the same physical dimensions. In the case of the Kelvin scale, at equilibrium, kT/2 is equal to the energy per degree of freedom (for any degree of freedom that is not "frozen out" by quantum considerations). The temperature T is therefore certainly a "measure" of this energy. For situations not even close to equilibrium, temperature is not defined. For a non-absolute scale like Celsius, temperature is proportional to the energy per degree of freedom plus some constant, which is also a "measure" of energy. PAR (talk) 17:50, 7 October 2010 (UTC)
- I would like to better understand your concern, Damorbel. I have a similar statement I would like your comment on: "The height of a column of mercury is a measure of the atmospheric pressure." I consider that statement to be analogous to the one "temperature is a measure of the energy per degree of freedom". Temperature, like millimeters-of-mercury, is easily measured; microscopic energy, like pressure, is a well-defined physical concept that is not directly observable. Do you consider the analogy valid? If you can explain why not, then I think I'll be closer to understanding your objection. Spiel496 (talk) 17:26, 7 October 2010 (UTC)
Kbrose, I think it would be useful if the definitions of the Boltzmann constant and of temperature were made consistent. The Boltzmann constant is not a simple conversion ratio like that relating two temperature scales such as °C and °F (forgive me if I have not understood what you mean here). The Boltzmann constant gives a number denoting how much energy is contained in the Degrees of freedom (physics and chemistry). For example an atom of a monatomic gas (e.g. He) at 1 K, having 3 degrees of freedom, possesses about 3 x 1.38x10^-23 = 4.14x10^-23 J, whereas a diatomic gas molecule having 5 degrees of freedom (e.g. O2) would possess 67% more energy at 1 K. Thus two gas molecules at the same temperature contain different amounts of energy; molecules of helium contain 60% of the energy of oxygen at the same temperature. This simple example illustrates why temperature cannot be defined as energy.--Damorbel (talk) 20:49, 7 October 2010 (UTC)
- No, the Boltzmann constant has nothing much to do directly with degrees of freedom, and the concept of degrees of freedom doesn't make any statements about energy. The article Boltzmann constant gets along just fine defining itself without referencing degrees of freedom. 'Degrees of freedom' is a concept of multiplicity of oscillators, independent directions of movement etc, just like the term 'number of atoms' reflects the multiplicity of atomic entities. Naturally, if counting energy, one sums over all available particles and over all available degrees of freedom accessible. The Boltzmann constant is the conversion factor to relate microscopic properties to macroscopic properties. By virtue of this, it is plausible that it introduces a change in units. To understand the pure physics of it all, set k = 1. Kbrose (talk) 21:59, 7 October 2010 (UTC)
- Kbrose, you write "the Boltzmann constant has nothing much to do directly with degrees of freedom" but the article on the Boltzmann constant mentions 'degrees of freedom' at least six times in different contexts, e.g. here [3] where it says "Monatomic ideal gases possess three degrees of freedom per atom, corresponding to the three spatial directions, which means a thermal energy of 1.5kT per atom". This simple statement makes it clear that the temperature of an atom is a function of the energy of that atom, or more accurately of the energy in the available degrees of freedom of that atom. Diatomic molecules such as H2 and O2 with the same amount of energy as monatomic molecules with 3 degrees of freedom have a lower temperature because the energy is spread over more (5) degrees of freedom. If He and O2 with equal energy per molecule are mixed, the molecules of each type will exchange energy by colliding according to kinetic theory until, on average, they all have the same temperature; the O2 molecules will then have, on average, about 67% more energy than the He because they have 5 degrees of freedom, but they will, on average, have the same temperature. --Damorbel (talk) 15:17, 8 October 2010 (UTC)
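A quick arithmetic check of the mixture example above, assuming the usual classical equipartition values of (3/2)kT per He atom and (5/2)kT per O2 molecule at a common temperature (Python; the 300 K value is just an assumed example):
k = 1.380649e-23  # J/K
T = 300.0         # assumed common temperature, K
E_He = 3 / 2 * k * T   # three translational degrees of freedom
E_O2 = 5 / 2 * k * T   # three translational plus two rotational degrees of freedom
print(E_He, E_O2, E_O2 / E_He)  # the ratio is 5/3, i.e. the O2 molecule carries about 67% more energy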
- This is truly maddening and a waste of time. Perhaps you can stay on focus only once and not constantly slide into your favorite energy calculations. The article defines the Boltzmann constant without reference to degrees of freedom. Is that so hard to discover? Of course, in any energy calculation that relates to microscopic properties the constant will be involved, since our system of units demands it, that's the whole purpose of the constant. Kbrose (talk) 16:05, 8 October 2010 (UTC)
- WP:Waste of Time :) . Count Iblis (talk) 16:39, 8 October 2010 (UTC)
- Kbrose you will find written in the article here [4] "temperature is a measure of the thermal energy held by a body of matter." (first sentence). You should be aware that this contradicts the introduction to the article where it says "the thermodynamic definition of temperature can be interpreted as a direct measure of the average energy in each degree of freedom of the particles in a thermodynamic system" (last paragraph), after this somebody has added a note 'dubious, discuss' which is what I am doing. As part of the invited discussion I suggest you say if you also consider these statements to be contradictory and why. --Damorbel (talk) 08:53, 9 October 2010 (UTC)
- Damorbel, you're saying the same thing as the article: Temperature is a measure of the kinetic energy per degree of freedom. The O2 molecule and He atom both have 1.38x10^-23 J per degree of freedom. Does the article contradict this somewhere? If so, we should fix it. Spiel496 (talk) 21:37, 7 October 2010 (UTC)
- Spiel496, the article contradicts itself, see the opening statement and here [5] --Damorbel (talk) 15:17, 8 October 2010 (UTC)
- Yeah, Damorbel you have a good point about Temperature#Overview. It leads with "temperature is a measure of the thermal energy held by a body". That's wrong, because it doesn't say "per degree of freedom". For example, because of the two rotational modes, oxygen will have more thermal energy than argon at the same temperature. In fact, the statement doesn't even say "per mole", so if taken literally, it implies temperature is proportional to quantity! There is at least one other place where the prose doesn't specifically normalize by the degrees of freedom. Damorbel (and others), would you consider the article more accurate if it consistently said "temperature is a measure of the average kinetic energy per degree of freedom per particle"? (That's horrible prose, but humor me.) Or is the problem deeper than that? Spiel496 (talk) 21:48, 8 October 2010 (UTC)
- Spiel496 - I think it's better to say energy per translational degree of freedom (i.e. kinetic energy of the whole molecule as a result of its motion). The other degrees of freedom will have kT/2 only if they are not in the quantum regime, where their energy is contained in a few energy levels, or maybe the zero level if they are frozen out. That practically never happens for the translational degrees of freedom. PAR (talk) 22:37, 8 October 2010 (UTC)
- I agree that limiting the statement to translational KE avoids quantum effects, and indeed it avoids mention of the other degrees of freedom. But then can the statement be made to apply to solids and liquids? (Assume high enough temperatures, for the moment.) Spiel496 (talk) 00:10, 9 October 2010 (UTC)
- Spiel496, PAR, the crucial factor is the 'per degree of freedom'; the translational degree of freedom is the most important, not just because 'monatomics' don't have any other (!) but also because 'particles' of gas can only exchange energy by collision. Liquids and solids exchange energy along the bonds that keep them together (as liquids, crystals, amorphous solids etc.). These bonds also contain energy, which is seen as latent heat, heat of fusion etc. The Wiki article on heat capacity seems a good ref to me; it also discusses quantum effects extensively, but I haven't checked it through. You should also be aware that, by explaining apparent anomalies in various specific heats, quantum theory achieved a major triumph. --Damorbel (talk) 08:53, 9 October 2010 (UTC)
Spiel496, the relationship between atmospheric pressure and the height of a column of mercury is a simple ratio of the effect of gravity on a column of mercury and a column of gas. The matter becomes far more complicated when considering the actual height of the two columns, because mercury has a density almost independent of pressure, so the force exerted by a column of mercury is very closely related to its height, whereas the height of a column of gas (should you be able to decide what its height is in the first place) depends strongly on temperature, not so with mercury. --Damorbel (talk) 20:49, 7 October 2010 (UTC)
- Whoa! Stop after your first sentence. I'm talking about a simple mercury barometer -- one column of mercury, no column of gas, no temperature stuff. The point I was going to make is that your objection about units sounds equivalent to someone saying "the height of a column of mercury cannot be a measure of pressure, because the height is in meters, whereas the pressure is in Pascals". I expect your objection goes deeper than just the units of Kelvin vs Joules. Does it? Spiel496 (talk) 21:20, 7 October 2010 (UTC)
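For the barometer analogy, the height-to-pressure conversion is just P = rho*g*h; a quick check with assumed round numbers (Python):
rho_hg = 13595.0   # kg/m^3, approximate density of mercury near 0 °C
g = 9.80665        # m/s^2
h = 0.760          # m, a typical mercury column height (assumed)
print(rho_hg * g * h)   # about 1.013e5 Pa, roughly one standard atmosphere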
You can also take this perspective. Thermodynamical quantities like temperature stay relevant in the macroscopic realm. If you take the universe and scale it down and down until the microworld vanishes from view, you can still readily measure temperature, heat flow etc. etc. The directly observable effects of this being due to statistical effects at the molecular level (e.g. Brownian motion) can be made arbitrarily small while scaling things down.
Observers living close to the scaling limit will likely discover thermodynamics without the proper theoretical foundations. Temperature then cannot be related in a universal way directly to the known mechanical quantities, so it will be assigned a new unit that cannot be expressed in terms of the known units. When people later do discover the fundamental relations, they will tend to stick to the old units, which means that the constant relating temperature to energy will have to cancel out the previously assigned unit for temperature.
This is why kb = 1.38...*10^(-23) Joule/Kelvin
The Kelvin cancels out the unit we've decided to give temperature. And the 1.38...*10^(-23) is small because we are pretty far removed from the microworld. We have chosen a unit for energy such that typical daily life phenomena will have energies within a few orders of magnitude of 1. In the exact scaling limit, the unit for energy would be sent to infinity and the magnitude of the Boltzmann constant would thus become zero. So exactly at the scaling limit, you lose the relation between temperature and (renormalized) energy. At that scaling limit, you really have to define a unit for temperature. The reason why we have an independent unit for temperature is because, being so close to the scaling limit, our ancestors could not readily observe the effects that demonstrate that we are not precisely at the scaling limit.
See here for a similar argument about the speed of light. Count Iblis (talk) 23:30, 7 October 2010 (UTC)
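To illustrate the conversion-factor role of k described above, a temperature can be quoted directly in energy units (as is routinely done in plasma physics, where temperatures are given in electronvolts); a rough Python sketch with an assumed 300 K example:
k = 1.380649e-23      # J/K
e = 1.602176634e-19   # J per electronvolt
T_kelvin = 300.0
T_joule = k * T_kelvin   # about 4.14e-21 J: the same temperature with k set to 1
T_eV = T_joule / e       # about 0.026 eV, the familiar "room temperature is roughly 1/40 eV"
print(T_joule, T_eV)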
- Count Iblis, if you mean that temperature is an intensive property i.e. it is defined locally (it can change markedly over small distances) and discretely (per degree of freedom), I do not see how it can 'lose the relation between temperature and (renormalized) energy' --Damorbel (talk) 15:17, 8 October 2010 (UTC)
- If e is some energy difference in some microscopic process (e.g. needed to excite a vibrational mode), then e/(your units of energy) ---> 0, as you let "your units for energy" -----> infinity. The larger you are, the larger the unit for mass you would use. Giants of size x will use a giant version of the kilogram of mass proportional to x^3, and the energy needed to heat one of their kilograms of water from freezing point to boiling point scales as x^3. But that energy will be renormalized to a new unit. So, as you increase x, that amount of energy will stay constant, so
e/(your units of energy) will scale as x^(-3). Count Iblis (talk) 16:47, 8 October 2010 (UTC)
Definition of temperature in Lead (3)
I quickly scanned your discussion, and I think no one realized that the energy, in general, has no fixed relationship with temperature. Most of your formulas are only applicable to ideal gas, or even only monatomic ideal gas, but temperature is a universal property that any macroscopic body has. The only way to define it is through the zeroth law.--Netheril96 (talk) 01:09, 8 October 2010 (UTC)
- Temperature is directly proportional to the energy per (available) degree of freedom at equilibrium. That sounds very fixed to me. Just because the units of temperature are different from the units of energy does not mean that temperature is not a measure of energy. Because of the fact that temperature is proportional (not equal, proportional via the constant k/2!) to the energy per degree of freedom, it is therefore a "measure" of that energy. This follows from the definition of "measure". Why is this so hard to understand? Why do the people who argue against temperature as a measure of energy refuse to confront and deal with these simple facts? PAR (talk) 01:28, 8 October 2010 (UTC)
- Because your "facts" are oversimplified. Again, that relationship only applies to monatomic idea gas. Even triatomic idea gas like thin water vapor deviates from the theory, not to mention real gas (liquid, solid) whose molecules interact with each other. Or maybe you can tell me how to measure the energy of water on a particular degree of freedom. I only can measure U,H,G,F,S which are functions of both T and V and sometimes other variables (for real substance).
- Even in natural units when k=1 and temperature has the same dimension as energy, that does not mean they are one thing.--Netheril96 (talk) 03:40, 8 October 2010 (UTC)
- I answered the non-monatomic gas question in the discussion that you "quickly scanned". If you disagree with something I posted, quote me and point out the error. The kT/2 formula applies to everything, unless someone brings up a counterexample or, heaven forbid, a reliable reference. Spiel496 (talk) 06:31, 8 October 2010 (UTC)
- I searched all your words in this page and didn't think that you had answered my question. Let's set it aside for now and (probably) come back later. Now I give you the mass, amount of substance, U,H,F,G of some water vapor, and you may take it as a van der Waals gas for simplicity; please tell me T. Or can you directly measure the average kinetic energy or the energy distributed on a particular degree of freedom?--Netheril96 (talk) 09:27, 8 October 2010 (UTC)
- I assume you include in your data other thermodynamic data like the specific heats of the substance. In that case, the internal energy is equal to pV/2 times the effective number of degrees of freedom available per molecule, and the effective number of degrees of freedom is equal to twice the molar specific heat at constant volume divided by the gas constant. (Definitely take a look at specific heat, particularly the specific heat#Energy storage mode "freeze-out" temperatures section.) I'm sure you will note that the "effective degrees of freedom" is not a fixed number except for certain limiting cases. For a thin monatomic gas at high enough temperature, it is essentially 3. For a diatomic gas at high enough temperature, it approaches 5. The reason for this fuzziness is that the energy is quantized in every degree of freedom, and when the quantum levels become too coarse in a particular degree of freedom, it will not hold an energy of kT/2 exactly. At cold enough temperatures, it will hold practically none and that degree of freedom gets frozen out. Even for a monatomic ideal gas of point particles, if you contain it in a box with one dimension much shorter than the others, at low enough temperature, that short dimension (short thermal wavelength, high thermal frequency) degree of freedom will freeze out, and the kinetic energy of the particles will not be 3/2 kT per particle, but rather 2/2 kT per particle. In short, I agree, if you take quantum considerations into account, temperature is, strictly speaking, not a good measure of energy per degree of freedom under all conditions. But I think the point that people are making is that the kinetic energy of translational motion is almost always negligibly affected by quantum considerations. The case of a small box with one side smaller than the other at extremely cold temperatures is hardly ever part of the problem. Neglecting this bizarre case, the temperature is an excellent measure of this translational kinetic energy per degree of freedom. PAR (talk) 13:51, 8 October 2010 (UTC)
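A small sketch of the bookkeeping PAR outlines, taking the effective number of degrees of freedom as twice the molar Cv divided by R and the internal energy as that number times pV/2; the gas, pressure and volume below are assumed example values (Python):
R = 8.314              # gas constant, J/(mol K)
cv_molar = 5 / 2 * R   # assumed: a diatomic gas above its rotational freeze-out, below its vibrational one
f_eff = 2 * cv_molar / R   # effective degrees of freedom per molecule, here 5
p, V = 1.0e5, 0.0248   # Pa and m^3, roughly one mole near room temperature (assumed)
U = f_eff * p * V / 2  # same as n*Cv*T for an ideal gas
print(f_eff, U)        # about 5 and about 6.2 kJ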
- No, I didn't intend to tell you the specific heat. Without defining temperature itself beforehand, how could you get specific heat as heat absorbed per temperature rise? You said a lot, and the only thing I learned from it is that translational kinetic energy per degree of freedom is only a rough estimate for temperature, not a definition. Besides, you still don't answer my question: how to measure the average translational energy when there is definitely interference from potential energy.
- And I think we may have got lost about what we want. I don't object to taking temperature as a rough measure of translational energy. --Netheril96 (talk) 01:17, 9 October 2010 (UTC)
- Oh that's right, using specific heat makes it a circular argument. I will think more about that. But why do you say rough? If you say that it's a measure, then it's a very precise measure; the constant of proportionality between translational energy and temperature is 3k/2 to within parts in billions or more for most cases. That's not rough. PAR (talk) 01:47, 9 October 2010 (UTC)
- this is more rigorous, but I haven't finished writing this up (as it stands the text so far is a bit misleading, I have to invoke ergodicity and other subtle points). Then, from the working in the canonical ensemble, you can derive the equipartition theorem in the classical limit. Count Iblis (talk) 01:30, 9 October 2010 (UTC)
- What you give me IS a rigorous definition. But that is totally different from what we are talking about. And how many times must I ask before any of you will answer me how to measure the translational energy?--Netheril96 (talk) 04:52, 9 October 2010 (UTC)
PAR, your contribution, the second paragraph in this section: Temperature is a measure of specific energy, not energy in general; that is what needs to be made clear. Temperature is somewhat comparable with specific gravity (SG), which is kg/m3; you would (I hope!) never quote or use SG as kilograms (unit of mass). Temperature has a big brother called Entropy, which is a measure of the distribution of energy and is by contrast an extensive property, which is defined in the theory of statistical mechanics. Temperature does not need statistical mechanics to define it; it is one of the assumptions on which statistical mechanics is based. --Damorbel (talk) 09:15, 9 October 2010 (UTC)
- Damorbel - I WOULD use the weight (in kilograms) of a 1-inch cube of material as a "measure" of its specific gravity. If the weight is w kilograms, then the specific gravity is Kw where K = 6.102 x 10^1 (about 61). The weight of the 1-inch cube is not EQUAL to its specific gravity, but it is a MEASURE of its specific gravity. If something is a "measure" of something else, the two things do not need to have the same dimensions. If I know the weight of the one inch cube, then I can calculate the specific gravity. That makes the weight of the one inch cube a "measure" of its specific gravity. I would NEVER quote the specific gravity as the weight of the cube. I WOULD say that the weight of the cube in kilograms is a "measure" of the specific gravity. By the way, entropy is not defined in statistical mechanics. It too has a thermodynamic definition which does not depend on molecules and microstates, etc. Entropy was defined by Clausius, long before Boltzmann explained it using statistical mechanics. Statistical mechanics gives an explanation of entropy, not a definition. I agree that temperature too, is not defined by statistical mechanics, but is rather explained by statistical mechanics. This means that the idea that temperature is a measure of the energy per translational degree of freedom is not true by definition, it is true as a result of the assumptions of statistical mechanics. PAR (talk) 13:18, 9 October 2010 (UTC)
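A quick arithmetic check of the one-inch-cube example, assuming the cube's mass w is in kilograms and water's density is 1 g/cm^3 (Python):
cube_volume_cm3 = 2.54 ** 3              # a one-inch cube is about 16.39 cm^3
water_mass_kg = cube_volume_cm3 / 1000.0 # mass of the same cube of water, about 0.0164 kg
K = 1.0 / water_mass_kg                  # about 61.02, so specific gravity = K * w
print(K)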
- Netheril96 - regarding the measurement of energy per degree of freedom. I think that it cannot be measured thermodynamically, using the parameters you have given. The idea that temperature is proportional (via Boltzmann's constant) to energy per degree of freedom is a statistical mechanical idea, not a thermodynamic idea, and so cannot be measured thermodynamically. But does this mean it is not true? The situation is the same for entropy. In statistical mechanics, entropy is proportional (via Boltzmann's constant) to the log of the number of microstates that could possibly yield the given macrostate, or thermodynamic state. Given the same thermodynamic parameters you have provided, there is no way that they can be used to calculate the number of microstates available. To be consistent, you must also say that entropy is not a measure of the number of microstates. Are you also saying this? PAR (talk) 13:38, 9 October 2010 (UTC)
- (I was editing at the same time as PAR and duplicated some of his points.) Netheril96, I think the answer is that one cannot directly measure the kinetic energy per degree of freedom or infer it from U,H,F and G. With a monatomic ideal gas, everything is proportional to temperature and it's easy. But with a non-ideal gas or a liquid, where potential energies between particles are important, there isn't any kind of simple relationship between internal energy and temperature. Is this the point you're making? I'll assume "yes" and forge ahead: Some of the energy is in chemical bonds, some of it is in the potential energy associated with van der Waals and hydrogen-bonding forces, and some of it is in the various K.E. components. I think what Kbrose and others are saying is that, while all of that makes it very difficult to calculate the heat capacity, one relationship remains very simple: The amount of the energy in each K.E. component is proportional to that thing that becomes equal when two systems are in thermal contact -- the temperature. Spiel496 (talk) 13:40, 9 October 2010 (UTC)
- The Boltzmann relationship is different from the equipartition theorem in that the predictions of the former have been tested precisely while the latter can only give approximate results (I mean results that can be experimentally tested). For example, it predicts that all diatomic gases have a molar specific heat capacity of 2.5R at room temperature, but in fact the molar specific heat capacities of H2, NO and Cl2 are 2.53R, 2.57R and 3.02R respectively. And before we continue, I think we are consenting to specify in the article that your definition would only be valid when quantum effects are negligible.--Netheril96 (talk) 02:11, 10 October 2010 (UTC)
- Yes, it predicts 2.5 if quantum effects are neglected, and the variations from 2.5 are due to quantum effects, none of which affect the translational degrees of freedom. Also, yes, the question remains, how do you measure the energy per translational degree of freedom? You say that the Boltzmann relationship relating entropy to the log of the number of microstates has been verified experimentally. Please explain how that has been done. Temperature and entropy are conjugate variables - their product is a measurable energy. You can say it's T dS, or you can say T = 2ε/k where ε is the translational energy per molecule per degree of freedom, and you can say S = k ln(f) where f is the number of microstates available, and then k cancels and you get T dS = 2ε d(ln f). If you are saying that ε cannot be measured, then I would say neither can f, because the energy T dS CAN be measured. Or, to put it another way, the energy per degree of freedom is ε = T dS / (2 d(ln f)), which is measurable, if, as you say, "The Boltzmann relationship has been tested precisely", which means d(ln f) is measurable. PAR (talk) 18:05, 10 October 2010 (UTC)
- OK, I admit I was wrong. But for your information, the deviation is not due to quantum effects; it is because of potential energy.--Netheril96 (talk) 05:38, 11 October 2010 (UTC)
- There's no deviation due to potential energy, and you both have the heat capacity incorrect. The prediction is 3 d.o.f. for the translational KE, 2 for the rotational modes, 1 for the vibrational KE and 1 for the vibrational potential energy. That's 7 total, so CV is 7/2*R=3.5R. That is the prediction for a diatomic gas, but quantization of the vibrational mode reduces it, especially for hydrogen. And so what? The energy in a non-frozen d.o.f. is still kT/2. Spiel496 (talk) 13:45, 11 October 2010 (UTC)
- The total number of degrees of freedom is 6, not 7. That the vibrational degree of freedom counts double in the energy (because of the potential energy term) doesn't mean that it should count twice as a degree of freedom. The whole point of d.o.f. is to describe in how many independent ways the molecule can move. In the case of a two-atom molecule in which the two atoms can move independently, that is clearly 6. Count Iblis (talk) 14:45, 11 October 2010 (UTC)
- That cannot be right. If it's one degree of freedom, then it gets energy kT/2, end of story. Off the top of my head, I believe the potential energy and kinetic energy form the total energy kT/2 for the one vibrational degree of freedom, but this is at odds with Specific heat#Diatomic gas. PAR (talk) 14:53, 11 October 2010 (UTC)
- Any quadratic term in the Hamiltonian gets a kT/2 contribution to the internal energy in the classical limit by the equipartition theorem. But d.o.f. is a kinematic concept that you want to use as a bookkeeping device. So, you want to say that a two-atom system has 6 d.o.f. because the two atoms can move independently, and that you can also decompose the motion in terms of center of mass movement, rotation and vibration, and then you get 6 = 3 + 2 + 1. Then the fact that the 1 vibrational d.o.f. counts double when the two atoms are bound is due to the potential energy term, but a potential energy term is always present, even if the two atoms are not bound.
- So awarding it an extra degree of freedom cannot be done merely because there is a potential energy term; you would have to consider the precise form of the effective potential energy function. If this were an x^n potential, the contribution to the internal energy would be kT/n, so in the end it wouldn't even satisfy the desire to make the internal energy in the classical limit equal to d.o.f./2 kT in all cases. But to even consider the form of the effective potential energy function would be strange, because that would already in principle be based on a thermodynamic calculation (the potential is not precisely a harmonic potential, you can approximate it like that but with a cut-off; the molecule will dissociate at high temperatures). Count Iblis (talk) 15:25, 11 October 2010 (UTC)
- Count Iblis, you're right, of course, even though the PE contributes to Cv, it is not a d.o.f. Spiel496 (talk) 16:59, 11 October 2010 (UTC)
- Well, clearly I should stop talking off the top of my head and revisit this subject. But if what you are saying is true, then the statement that each molecular degree of freedom contains an energy of kT/2 per molecule is false. That's what you are saying, right? PAR (talk) 20:58, 11 October 2010 (UTC)
- Hopefully my wording won't shoot me in the foot, but I believe that this would mean that each molecular degree of freedom "contains" a 'kinetic' energy of kT/2. Consideration of potential energy as well adds an additional amount of kT/2 for each degree of freedom. As Count Iblis mentioned, the amount should depend on the nature of the potential. In the case of a diatomic ideal gas, the only dof that is associated with a potential is the vibration. Since rotations and translations (for ideal gases at least) don't have an appreciable potential, only their kinetic energy contributes to the heat capacity and thus to the total energy. Another example would be to look at a solid, where the classical heat capacity (before Einstein and Debye - or at high temperatures) should be equal to 3Nk, N being the number of particles in the solid. This result appears to have a heat capacity of k for each degree of freedom and a total energy of 3NkT, which can, in principle, be interpreted as kT/2 worth of kinetic energy and kT/2 worth of potential energy for each dof. Sirsparksalot (talk) 21:59, 11 October 2010 (UTC)
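A short sketch of the classical (Dulong-Petit) limit mentioned above: three vibrational degrees of freedom per atom in a solid, each holding kT/2 of kinetic and kT/2 of potential energy, giving a molar heat capacity of 3R; the 300 K value is just an assumed example (Python):
k = 1.380649e-23      # Boltzmann's constant, J/K
N_A = 6.02214076e23   # atoms per mole
dof = 3               # vibrational degrees of freedom per atom

def energy_per_atom(T):
    # kinetic plus potential energy per atom in the classical limit
    return dof * (k * T / 2 + k * T / 2)

C_molar = N_A * dof * k   # = 3R, about 24.9 J/(mol K)
print(C_molar, N_A * energy_per_atom(300.0))   # molar heat capacity, and internal energy per mole at 300 K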
To Netheril96 - ok, but wrong about what? I think you are wrong that the number of microstates can be directly measured, but that still leaves your original question: How do you measure translational energy per degree of freedom, and if you can't, what does the concept really mean? For this article, I would be in favor of a statement something like "If quantum effects are negligible, temperature is a measure of the energy per molecule per degree of freedom of a substance. In any macroscopic situation, the translational degrees of freedom are practically unaffected by quantum considerations, and therefore the temperature is a very accurate measure of the translational kinetic energy per molecule of a substance." But I still wonder what that means if you cannot measure it. PAR (talk) 14:48, 11 October 2010 (UTC)
- In a word, I am confused. I had thought the equipartition theorem only held precisely for gases whose molecules don't interact, before I read some books on statistical mechanics and found that it holds under any circumstances in classical statistical mechanics. The microstates in fact cannot be measured, but the corollaries derived from it work perfectly, which is why I said the Boltzmann relationship had been experimentally verified. But since the equipartition theorem is derived from the statistical mechanics postulates, I think maybe it no longer needs experimental test. So far this is what I have in mind on this topic. When I actually finish my course of statistical mechanics (maybe next year) I will have a more definite opinion.--Netheril96 (talk) 15:52, 12 October 2010 (UTC)
---
Wow! A lot has been said in the last week since I've been able to check in on the discussion. But I am glad that this debate has finally opened up, as it is LONG overdue. I agree that the article has to be heavily edited, particularly with regard to the statement relating temperature and energy. My main issue is that so many people are making so many over-generalized statements. While it is true that one can determine the mean translational kinetic energy from the temperature (and vice versa), it is only one specific case that has a simple (i.e. direct) proportionality between temperature and energy. Yes, quantum effects are negligible for translational degrees of freedom, but unless we are talking about a monatomic ideal gas (which we almost never do when considering the number of thermodynamic systems of interest!) then quantum effects are VERY important and absolutely cannot be overlooked. It is perfectly OK to say that there is a relationship between energy and temperature, but there are far too many details to ever be able to put into a wiki article such as this. As nice as it would be to be comprehensive, I think it is more important to get the main points across. Rather than bicker about the details of how they are related, why can't we just say that they are related and move on? We could put in a few examples of these relationships for a few specific cases (e.g. the average translational kinetic energy of a gas molecule is Ekin = 3/2 kT, or, since ideal monatomic gases only have translational degrees of freedom, E = 3/2 kT can be used for the average total energy per molecule of an ideal monatomic gas). What do people think about this approach? Sirsparksalot (talk) 19:41, 11 October 2010 (UTC)
- I agree with everything you say, except I hope we can craft a statement for the Lead section that would encompass a broader class of systems than monatomic gases. Saying the temperature is a measure of the translational kinetic energy of the molecules is accurate enough for molecular gases. This statement remains true even when there are significant inter-molecular forces. For solids, yes, one must bring up a bunch of caveats for quantum effects plus the fact that the motion is solely vibrational. I say leave that level of detail out of the Lead. Spiel496 (talk) 00:51, 12 October 2010 (UTC)
- Sorry for my miscommunication; the intention of putting the monatomic gas stuff in there was to give an example of what we could list. I didn't intend it to be exclusive. What about putting a new section in the article? We could clean up a lot of the stuff that is spread all over and lump it into one section like "relationship between energy and temperature" and put in a few examples for different systems. Sirsparksalot (talk) 16:19, 12 October 2010 (UTC)
Reorganize definitions under the zeroth law
The zeroth law is the principal and defining property of temperature, and it should be addressed in the article. The various kinds of definition are in fact just different realizations of it. Another problem is that the current definitions are only different approaches to defining the thermodynamic scale; the old Celsius scale, which was based on mercury's properties instead of the universal properties of matter as employed in defining thermodynamic temperature, should also have its place.--Netheril96 (talk) 16:01, 12 October 2010 (UTC)
- Temperature can be defined through both the 0th and the 2nd law. It was rigorously defined before the 0th law existed. However, the correctness and reason of measuring temperature with a thermometer was not rigorously established until the zeroth law was formulated, hence its strange numbering. The article mentions both already. Which one is emphasized first is often a matter of taste and teaching preference, but in general the 0th law provides a more accessible basis. I don't know what your criticism aims at, as the article already discusses both, in apparently the sequence you also prefer. Kbrose (talk) 19:44, 12 October 2010 (UTC)
By the way, a lot of people still think Celsius is defined as 0° at freezing point and 100° at boiling point of water. Maybe we can say something in this article.--Netheril96 (talk) 16:07, 12 October 2010 (UTC)
- Netheril96, would you care to show how the 0th Law of thermodynamics enables you to calculate temperature? Kbrose, you might do the same for the 2nd Law of thermodynamics.--Damorbel (talk) 07:53, 13 October 2010 (UTC)
- The zeroth law doesn't numerically define temperature. That is called a scale of temperature. So basically I mean to introduce the concept, or the well-definedness, of temperature in a section about the zeroth law, and put all the scales of temperature, like the Celsius scale and the thermodynamic scale, under that as subsections.--Netheril96 (talk) 10:48, 13 October 2010 (UTC)
- Scale? Scale is unimportant, you can choose any scale you like for temperature, the scale does not change the physics, the scale is no more than a name. What is important in temperature is what it measures. Why does temperature difference drive the transfer of energy from a warm place to a cold one? What I am trying to draw to your attention is that temperature is not a measure of energy but a measure of the concentration of energy. In mechanics this can be a function of the velocity of a particle e.g. an atom of gas, or the resonant energy of an atom (mass) bound by a chemical bond (spring) to a molecule. --Damorbel (talk) 11:07, 13 October 2010 (UTC)
- Clearly you don't understand what the scale of temperature is. What you say is strictly thermodynamic scale; the old Celsius scale, still a valid scale of temperature, is not proportional to energy as thermodynamic temperature.--Netheril96 (talk) 14:03, 13 October 2010 (UTC)
- I am referring to scales such as Kelvin, Celsius, Fahrenheit etc. Have I misunderstood the meaning of 'scale'? --Damorbel (talk) 16:09, 13 October 2010 (UTC)
- Damorbel - you have not. All are linearly related to translational energy "concentration", i.e. translational energy per molecule, and are thus linear "scales". Absolute scales like Kelvin are not just linear, but proportional. The 0th and 2nd law help in the definition of temperature, but do not, as such, define it. It may be defined as follows: If you have an ideal gas (the limiting case of a gas whose density approaches zero), then for a linear temperature scale, temperature may be defined as linear in PV for a fixed quantity of gas. P is measurable, V is measurable, and so a temperature T=k1*PV+k2 of the gas is measurable to within a pair of constants, k1 and k2. If k2=0, it's an absolute scale. This is one way of experimentally defining a linear thermodynamic temperature scale with two undetermined constants. Two defining measurements are needed to set those constants. The second law says that when equilibrium is achieved (no change after a long time), the temperature of two thermally connected bodies (e.g. the thermometer and the system measured) will be the same. This defines the temperature of the measured body. Roughly speaking, thermally connect your ideal gas thermometer with water at the triple point, wait for equilibrium, call that zero. Put it at the boiling point of water at atmospheric, call that 100, and you have the Celsius scale. Call those temperatures 32 and 212, you have Fahrenheit. Put it at the triple point, call that 273.16, and set k2=0 and you have the Kelvin scale. The zeroth law says that if the temperatures of two bodies so measured are equal, and they are then put in thermal contact, their temperatures will not change. This tells you that a second ideal gas thermometer can be constructed and, when calibrated in the same way as the first, will always agree with the first. The PV definition, along with the 0th and 2nd laws, nails down the idea of temperature. This is a rough picture that does not include all the caveats about how the systems must be thermally isolated from the universe, etc. Statistical mechanics then tells you that the translational energy per molecule is proportional to an absolute temperature. PAR (talk) 00:21, 14 October 2010 (UTC)
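A minimal sketch of the two-point calibration PAR outlines: read PV (for a fixed quantity of dilute gas) at two reference states, solve for k1 and k2, and the linear thermometer is defined. The PV readings below are made-up numbers, roughly in the ideal-gas ratio of the steam and ice points (Python):
pv_ice, pv_steam = 100.0, 136.6   # hypothetical PV readings at the ice and steam points (arbitrary units)

def linear_scale(pv, t_low=0.0, t_high=100.0):
    # Celsius-like scale T = k1*PV + k2, fixed by the two calibration points
    k1 = (t_high - t_low) / (pv_steam - pv_ice)
    k2 = t_low - k1 * pv_ice
    return k1 * pv + k2

print(linear_scale(118.3))   # an intermediate reading maps to about 50 on this scale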
- You both misunderstood the meaning of scale. It is now just a name because they are all redefined in terms of the kelvin. The old definition of Celsius, taking the relationship between the volume and temperature of mercury as linear, is not precisely compatible with the present definition, i.e., T+273.15. In fact, the melting point of ice is now known not to be precisely 0°C[1]. Whatever, I am gonna write a separate article dedicated to the concept of scale.--Netheril96 (talk) 01:03, 14 October 2010 (UTC)
References
- ^ The ice point of purified water has been measured to be 0.000 089(10) degrees Celsius - see Magnum, B.W. (1995). "Reproducibility of the Temperature of the Ice Point in Routine Measurements" (PDF). NIST Technical Note 1411. Archived from the original (PDF) on March 7, 2007. Retrieved 11 February 2007.
- The above was intended to give the idea of how temperature is defined, not paying intense attention to the numbers. To be precise, if you use the ideal gas definition of temperature, the Kelvin scale is set to be proportional to PV and to have a value of 273.16 at the triple point of water (triple point, not ice point). The Celsius scale is set to have a value of 0.01 at the triple point of water. (It used to be 0.0 at the ice point; then they realized the triple point is more stable, and they thought the difference between the two was 0.01 °C, so they made it 0.01 at the triple point. Now the ice point has changed, but the scale is nevertheless as stable as the triple point, so it ain't broke, so don't fix it.) If Fahrenheit is now based on Kelvin, then it's a different scale than the one defined as 32 at the ice point, 212 at the boiling point. You might want to reference Temperature Conversion#Comparison of temperature scales when writing this article. Note that Kelvin, Celsius, Fahrenheit, etc. are all referred to as temperature scales. Any one-to-one relationship (bijection) between temperature and PV could be used as a temperature scale. We could have a temperature scale in which T=12.3*(PV)^13.7 + 33.2 for 1 mole of gas. The ideal gas law would read PV=n((T-33.2)/12.3)^(1/13.7) and the energy per translational degree of freedom per molecule would be (1/2)((T-33.2)/12.3)^(1/13.7)/A where A is Avogadro's number. Any scale which is just PV=QT for one mole, where Q is some constant, is an absolute scale and is the simplest choice. PV=Q(T+T0) is the next simplest, a linear scale. Celsius and Fahrenheit are linear scales. PAR (talk) 04:16, 14 October 2010 (UTC)
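A quick check of the invertibility point above: any bijection of PV serves as a scale, and even the deliberately odd scale converts back to PV (and hence to energy per translational degree of freedom) without ambiguity. Illustrative Python using the made-up constants from the comment, for one mole of ideal gas:
N_A = 6.02214076e23   # molecules per mole

def weird_scale(pv):
    return 12.3 * pv ** 13.7 + 33.2          # the example scale above, for 1 mole

def pv_from_weird(t):
    return ((t - 33.2) / 12.3) ** (1 / 13.7)

def energy_per_translational_dof(t):
    return 0.5 * pv_from_weird(t) / N_A      # per molecule, as stated in the comment

pv = 2478.0   # J, roughly RT for one mole at 298 K (assumed)
t = weird_scale(pv)
print(t, pv_from_weird(t), energy_per_translational_dof(t))   # recovers pv and about 2.06e-21 J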
I finalized my article on scale of temperature. You may read it to get a hint of how a scale of temperature differs from just a name.--Netheril96 (talk) 04:32, 19 October 2010 (UTC)
Empirical temperature
I suggest that the edit introducing the concept of 'empirical temperature' [6] should have been discussed before posting; its meaning is far from clear, and a Google search for 'empirical temperature' yields results connected with chemistry. --Damorbel (talk) 13:26, 13 October 2010 (UTC)
PAR's edit
PAR, in your edit [7] you wrote "This means that bringing a thermometer into thermodynamic equilibrium with another system, and allowing equilibrium to occur," . I suggest that "bringing a thermometer into thermodynamic equilibrium with another system" is enough and there is no need to continue with "allowing equilibrium to occur" since you have just said that with slightly different words.
- Right - bad wording on my part, I will fix it.
Further, you also write "The second law of thermodynamics then defines absolute temperature through the use of the concept of reversible heat engines operating between systems at different temperatures, but only to within a calibration constant. This allows the actual construction of a thermometer".
Since in simple terms the 2nd Law defines the direction of heat transfer, even in relation to thermometers and temperature measurement, it only defines whether the temperature of the thermometer will rise or fall when thermally connected to the place of measurement. I do not see how the 2nd Law defines absolute zero, since it only relates to the direction of energy transfer and not the cessation of energy transfer at absolute zero. --Damorbel (talk) 07:20, 19 October 2010 (UTC)
- Check out Thermodynamic temperature#Definition of thermodynamic temperature. This is the bottom line on the definition of temperature. I don't know why there are two separate articles, what other kind of temperature is there?
- There is a really good book by Enrico Fermi entitled "Thermodynamics", and it's one of the best books on the subject. The guy knows his stuff, and he doesn't waste time. I measured it, the book is a total of 3/8 of an inch thick. The description in the above article is essentially the same as in this book, but with less detail. Anyway, to answer your question, the second law describes, or at least puts boundaries on, the behavior of reversible heat engines. In his book, Fermi uses these reversible engines along with the first and second law as really versatile conceptual tools to prove all sorts of things, without ever resorting to statistical mechanics to make an argument. In other words, pure thermodynamics. The above definition of temperature is just one of those results. The simplest reversible engine is a cylinder with a piston containing an ideal gas. The work it does is PdV, a fully measurable quantity. Fermi shows that, as a result of the definition of temperature, you can construct a thermometer out of the above engine, and define temperature as PV times some constant (that holds only for that thermometer). When PV goes to zero, you have absolute zero. PAR (talk) 12:02, 19 October 2010 (UTC)
- PS - you don't need it to be an ideal gas to build a thermometer by this method - but then you have to measure the work done during the cycles of the reversible engine in order to calculate temperature. An ideal gas is convenient because you can set the temperature to some constant times PV and get the same answer. PAR (talk) 20:02, 19 October 2010 (UTC)
- While I do like what the edit is saying, is it really a clear enough "definition" of temperature to warrant putting in the introduction? Could this be something that could go into the Temperature measurement section? Sirsparksalot (talk) 01:05, 20 October 2010 (UTC)
- Well, it's not intended to be a clear definition, just to tell the reader where to look for a detailed definition of temperature. I'm not sure where it should go, but I think it should be made clear that the bottom-line definition of temperature is a thermodynamic definition, after which comes the statistical explanation of temperature. PAR (talk) 01:41, 20 October 2010 (UTC)
Opening Statement 3rd Paragraph
This paragraph is vague to the point of inaccuracy. The 3rd sentence of this para., "Microscopically, temperature is the measure of the average energy of motion, called kinetic energy, of particles of matter.", is incorrect. First, it doesn't distinguish between the different numbers of degrees of freedom a particle can have. Second, it should be clear that it is the kinetic energy of microscopic motions, i.e. not macroscopic translational motion. Third, in rather unclear English, it says that "...temperature... it is also referred to as thermal energy"; I have yet to see a coherent argument for this. Fourth, it says "the transfer of thermal energy is commonly referred to as heat"; it may be common to say this but it is confusing and inaccurate. Temperature is the driving force behind the second law of thermodynamics, but temperature is not a measure of energy; it is a measure of specific energy, thus it is heat that is measured by temperature, not energy. That is, a system in equilibrium (at maximum entropy or uniform temperature) is not in a condition of homogeneous energy distribution: e.g. in a system comprising monatomic and diatomic gases (mixed or not) in thermal equilibrium, in general the diatomic gases will have more energy than the monatomic gases; that is, the diatomics have a higher heat capacity than the monatomics. --Damorbel (talk) 20:30, 11 October 2010 (UTC)
- While I am a bit confused as to all of the points you are making, I do agree that the paragraph you're referring to is a mess. I think a lot of the stuff in it is addressed further below in the article and, seeing as though this paragraph is in the introduction, it should only discuss topics in a general sense and avoid as many specifics as possible. Could we keep the first two sentences and get rid of the rest of the paragraph? It might even be appropriate to then merge it with the following paragraph, although I'm not entirely sure that it would be. I'd also like to see the 5th paragraph of the introduction go. As with the 3rd paragraph, it contains too many details that are discussed further below and I wouldn't consider what it says of utmost importance for an introduction. Sirsparksalot (talk) 21:34, 11 October 2010 (UTC)
- Damorbel, regarding the phrase "it is also referred to as thermal energy", the "it" refers not to temperature, but to kinetic energy. Spiel496 (talk) 22:54, 11 October 2010 (UTC)
I think the key to this is the definition of heat: heat is the intensity of energy. Looking at it this way, the 2nd law becomes a doddle, since it is not difficult to understand that the energy associated with heat will tend to spread from zones where the energy intensity is high to where it is lower - by diffusion, radiation, convection etc. This doesn't depart from the definition of thermal energy as that due to molecular motions. It also makes it easier to comb the article, sorting the good from the bad.
The Wiki article on heat is in a very poor state because of the confusion between thermal energy and heat --Damorbel (talk) 07:48, 12 October 2010 (UTC)
- This 'discussion' is ridiculous. Please read some physics books before commenting here. 'Energy intensity'? Where are you getting this stuff? There is not one physics book that uses such a term. Intensity is usually a flux density, i.e. a quantity per time and per area. By today's definition in physics, heat is thermal 'energy in motion', with a simple dimension of energy, but sometimes it is used for the process of movement, rather than the 'moving energy' itself. Heat is what flows due to a difference in temperature. And this is why in modern physics matter never 'contains' heat. Heat is only exchanged between closed or open systems in thermal contact. In engineering the term heat often has a somewhat broader scope; engineers don't restrict themselves to the same narrow definition and often include e.m. radiation as heat, but from a physics viewpoint that is 'work' done on matter, not heat. From a physics point of view, convection is also not heat, but the flow of fluids, i.e. mass exchange between regions of different temperatures created from pressure differences. The field of heat transfer engineering generally uses a very loose definition of what they call heat. When reading a lot of books about heat one finds that the term has a very broad scope of usage, but physics uses a very narrow one, which is counter-intuitive to what the average person associates with heat. What lay persons use as the notion of heat is actually better described in physics as entropy. Kbrose (talk) 12:30, 12 October 2010 (UTC)
- I would tend to agree with a lot of what Kbrose just stated. The word "heat" should be thought of as a verb, although most people unfortunately tend to use it as a noun. The one part I would disagree with is the final sentence, as I feel that lay persons typically confuse the notion of heat with thermal energy, not entropy. That being said, I think this discussion is starting to get off the original topic. What do people think about the suggestions that I made in this thread earlier? Could we get rid of a lot of this stuff from the introduction? Most of it is addressed further below in the article, and I think keeping things simple in the introduction would help out the article a ton. Sirsparksalot (talk) 15:43, 12 October 2010 (UTC)
- Kbrose, I am not at all sure what you mean by 'energy in motion'. If energy is transmitted from one place to another, surely the measure is power, in joules/sec or watts. You write 'Heat is what flows due to a difference in temperature'. How can a modern definition of heat refer to 'heat flowing'? Fluids flow. The 'fluid' concept of heat was caloric; recognition that energy is conserved (1st law of thermodynamics) ended the acceptance of fluid (material) heat. --Damorbel (talk) 19:03, 12 October 2010 (UTC)
- Sirsparksalot, I certainly agree to a clean-up, but not at the expense of heat as a measure of energy. I have yet to find a logical explanation that temperature is anything more (or less) than the measure of heat; to put it any other way would make nonsense of the 2nd law of thermodynamics. --Damorbel (talk) 19:03, 12 October 2010 (UTC)
- I suppose my issue with the section in question is that it goes into far too much detail (masses, velocities, etc.) for what it is trying to accomplish. But, that being said, I do think that discussing heat and energy this early has a few major disadvantages. As stated by Netheril96 below, the only principle necessary to define the temperature is the zeroth law of thermodynamics. All of the other "definitions" are not actually true definitions, just ways for us to get a conceptual understanding of what it represents. They tell us how temperature relates to other thermodynamic quantities, but they neither define the temperature nor does the temperature define those properties. While heat and temperature are intricately related to one another, neither one can define the other. Personally, I think that this is one of the most common misconceptions of thermodynamics (it certainly was mine when I first took a course in thermo, and it took me quite some time and a large number of discussions with experts to fix!). Thermodynamics does not strive to define or derive any one thermodynamic variable. It merely derives relationships among them.
- Taking all this into consideration, removal of the parts of the introduction that I suggested would leave an outline as follows... 1) It would give a very basic definition of temperature that is hopefully accessible to the lay person (i.e. speaks in terms of "hotness" and "coldness" that most people have experience with). 2) It mentions that temperature is a quantitative measure of these experiences. 3) It addresses how people typically measure temperature. 4) It mentions the zeroth law which, in principle, is the closest thing we have to a true "definition" of temperature. 5) It mentions the historical developments that have been used to conceptualize what the temperature is in terms of other thermodynamic variables. Here is the place where we can mention that there exists a relationship between temperature and energy, temperature and heat, but I don't think that we should do anything to quantify what that exact relationship is. This is because the quantitative relationship between them is highly dependent upon the system in question. Rather than address this here, I would like to see a short section somewhere else in the article that lists a few of these relationships. Sirsparksalot (talk) 19:59, 12 October 2010 (UTC)
- I stand partially corrected. Kbrose added some discussion below that leads me to change my mind slightly. However, I would still stick with my current suggestions for cleaning up the introduction. Sirsparksalot (talk) 20:04, 12 October 2010 (UTC)
- Does anyone have objections to Sirsparksalot's proposed outline? It's fairly similar to the current article, though (if I'm reading it correctly) more concise. Some of my own thoughts: I would like the Lead to (continue to) relate temperature to the kinetic energy contained in the random motions of the particles -- and I think saying "random" or "disordered" explicitly is important. We should remove some of the detail about kinetic energy. As it reads now, it implies that materials consisting of more massive particles tend to be warmer -- I know, that's not what it says, but it might come off that way to the kind of person that would look up "temperature" in Wikipedia. Spiel496 (talk) 21:33, 15 October 2010 (UTC)
Since relative temperature defines the direction of energy flow, the basic definition of temperature must include the Boltzmann constant. --Damorbel (talk) 20:36, 12 October 2010 (UTC)
- The Boltzmann constant is necessary to relate the microscopic density of states to the classical thermodynamic entropy. It is essentially a way to connect the microscopic and macroscopic worlds. While I respectfully disagree that it is necessary to the basic definition of temperature, I do agree that it should be included, albeit briefly, in the introduction. Maybe directly after the sentence on the statistical picture it could mention something about the necessity of the Boltzmann constant for connecting the two pictures. Sirsparksalot (talk) 21:13, 12 October 2010 (UTC)
Sirsparksalot, would you care to explain what part of the Boltzmann constant is "a way to connect the microscopic and macroscopic worlds"? It is just a number giving the ratio between the energy in one degree of freedom and the temperature. The number is different for some of the various temperature scales that have different step sizes; in natural units the Boltzmann constant can be 1. It is interesting to note that the natural units article explains: "Some physicists do not recognize temperature as a fundamental physical quantity, since it simply expresses the energy per degree of freedom of a particle, which can be expressed in terms of energy (or mass, length, and time)"; this is consistent with kinetic theory, in which particles exchange momentum/energy by elastic collision. Despite the assertion that "Some physicists... etc.", the fact is that it is temperature, i.e. energy per degree of freedom, that drives the 2nd law of thermodynamics. --Damorbel (talk) 07:42, 13 October 2010 (UTC)
- Sirsparksalot has it right. There are TWO distinct disciplines, TWO. Thermodynamics and Statistical Mechanics. Thermodynamics is a macroscopic theory, in which temperature, pressure, entropy, etc. are experimentally DEFINED, without assuming that there are molecules, atoms, or anything like that; these are all microscopic (statistical mechanical) concepts. The basic definitions of thermodynamic quantities like temperature and entropy have nothing to do with atoms, degrees of freedom, number of microstates, etc. Thermodynamics as a discipline was largely in place before Boltzmann came along. Arguments were still going on as to whether matter was a continuum or composed of particles, but there was no pondering about how to measure temperature, entropy, etc. Boltzmann is the heavy lifter, if not the founder, of statistical mechanics. Statistical mechanics EXPLAINS thermodynamics in terms of molecules, degrees of freedom, etc. It does not DEFINE thermodynamic quantities - they have already been defined in thermodynamics. The fundamental link between thermodynamics and statistical mechanics is the Boltzmann constant. Temperature (a thermo concept) is proportional to the translational energy per degree of freedom (a stat mech concept). The constant of proportionality is the Boltzmann constant. Entropy (a thermo concept) is proportional to the log of the number of microstates available to a system (a stat mech concept). The constant of proportionality is again the Boltzmann constant. The term TdS in the second law is an energy, expressed in thermodynamic variables. T=2f/k where f is the energy per degree of freedom. S = k log(W) where W is the number of microstates available. In stat mech terms, TdS=2f d(log(W)). Expressed thermodynamically, there is no Boltzmann constant, no need for one. Expressed in stat mech terms, there is no Boltzmann constant, no need for one. Again, the Boltzmann constant is the link between thermodynamics and statistical mechanics, the macro and the micro. PAR (talk) 22:42, 15 October 2010 (UTC)
- P.S. - I didn't mean to slight Avogadro's contribution k=R/A where R is the (thermodynamic) gas constant, and A is Avogadro's number, which relates the thermodynamic (i.e. macroscopic) concept of mole to the microscopic concept of the number of particles. PAR (talk) 00:23, 16 October 2010 (UTC)
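To put numbers on the link described in the two comments above, here is a short Python sketch computing k = R/A and the corresponding translational energy per degree of freedom at room temperature. The constants are the usual textbook values, not figures taken from the thread.

    # k = R/A connects the molar (thermodynamic) and per-molecule (statistical) pictures
    R = 8.314          # J/(mol K) - gas constant (macroscopic)
    A = 6.022e23       # 1/mol     - Avogadro's number
    k = R / A          # J/K       - Boltzmann constant, about 1.38e-23

    T = 300.0          # K
    energy_per_translational_dof = 0.5 * k * T    # about 2.07e-21 J
    energy_per_molecule_monatomic = 1.5 * k * T   # three translational degrees of freedom
    print(k, energy_per_translational_dof, energy_per_molecule_monatomic)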
- Sorry for not getting back sooner but I think that PAR explained essentially everything that I would have said anyway. The only place where I would emphasize caution is the statement that relates the temperature to the energy per degree of freedom. While I acknowledge a relationship between the two ideas, I have major reservations with stating such a definite relationship. As many people have mentioned on this page, the exact functional form of this relationship is highly dependent upon the system in question. Other than that, I think PAR summed things up rather nicely. But, while the Boltzmann constant is not necessary to define temperature, I do think that it should be briefly included in the introduction. Sirsparksalot (talk) 00:41, 20 October 2010 (UTC)
- Yes, I should have specified translational energy per translational degree of freedom. PAR (talk) 01:44, 20 October 2010 (UTC)
- Due to equipartition, the accessible degrees of freedom of a molecule, vibrational as well as translational, contain the same energy. This is the reason why molecules with different configurations (numbers of degrees of freedom), e.g. He, N2 and CO2, contain different amounts of energy when at the same temperature, i.e. He, N2 and CO2 have different Cv specific heats. From this one comes to the conclusion that the parameter called temperature is not a measure of the energy in a molecule but the measure of energy in a degree of freedom. --Damorbel (talk) 06:26, 20 October 2010 (UTC)
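A rough numerical illustration of the differing heat capacities mentioned above, using the equipartition estimate Cv = (f/2)R and assuming that only translational and rotational degrees of freedom are active at room temperature (a reasonable approximation for He and N2; CO2 is less clear-cut because its bending vibration is partly active):

    # Equipartition estimate of molar heat capacity Cv = (dof/2)*R, counting
    # only translational and rotational motions (vibrations assumed frozen out)
    R = 8.314                       # J/(mol K)
    dof = {"He": 3,                 # monatomic: 3 translational
           "N2": 3 + 2}             # diatomic: 3 translational + 2 rotational
    for gas, f in dof.items():
        print(gas, f / 2 * R)       # He: ~12.5, N2: ~20.8 J/(mol K)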
- Therefore, what? Spiel496 (talk) 13:22, 20 October 2010 (UTC)
- Therefore temperature is not a measure of energy. --Damorbel (talk) 13:35, 20 October 2010 (UTC)
- For crying out loud, Damorbel, you yourself use the phrase "the measure of energy in a degree of freedom". Please make your comments address the wording of the article. Spiel496 (talk) 17:49, 20 October 2010 (UTC)
- Spiel496, the crucial difference is in the expression "a [per] degree of freedom". Just saying something is a "measure of energy" places no limit on where this energy is to be found, thus it is not a measure of the concentration of energy, which is what temperature is all about. I have tried to show that a molecule is not a standard measure of heat capacity either, because not all molecules have the same number of degrees of freedom. Using the energy in a single degree of freedom relates this energy to temperature via the Boltzmann constant; the equipartition of energy and the Maxwell-Boltzmann distribution take care of the temperature in large ensembles of particles. --Damorbel (talk) 21:57, 20 October 2010 (UTC)
- Again, therefore, what? It's not enough to make true statements. This forum is to discuss improvements to the article. Spiel496 (talk) 04:09, 21 October 2010 (UTC)
- Spiel496, in the overview [[8]] the article states "Macroscopically, temperature is a measure of the thermal energy held by a body of matter"; this is incorrect: thermal energy, like all other forms of energy, is measured in joules, but temperature is measured in 'degrees' of one sort or another (see absolute temperature scales). This is a defect in the article, correction of which would be an improvement, or don't you agree?
- Perhaps the requirement that temperature is "energy per degree of freedom" is the problem. But this definition corresponds with other definitions such as speed in miles per hour (mph) or meters per second (m/s). Neither mph nor m/s measures distance, even though 'miles' and 'meters' appear in the definition, so when temperature is defined as 'joules per degree of freedom' it doesn't mean that temperature is a measure of energy but a measure of energy density. --Damorbel (talk) 08:44, 21 October 2010 (UTC)
- That's right. Temperature is not a measure of total energy or energy per molecule, but it is a measure of energy per degree of freedom. As long as those degrees of freedom are not quantized very much. If a degree of freedom has only a few levels to hold that energy, it won't hold an energy of kT/2 and temperature won't be a good measure of that energy. But the translational degrees of freedom are practically never quantized enough for this to matter. Therefore temperature is an excellent measure of the translational energy per degree of freedom. PAR (talk) 17:54, 20 October 2010 (UTC)
- Again I'd like to point out the quantization issue. PAR mentioned it in the last post. If quantization is considered, then each degree of freedom will not contain a total energy of kT/2. I recognize that translational degrees of freedom have energy spacings small enough to treat classically and that the statement is absolutely correct for systems that have only translational degrees of freedom. The problem is that the only system for which this is the case is a monatomic ideal gas. This system can only be encountered 6 times in nature, once for each of the rare gasses. This is far too small a number of cases to use the kT/2 argument as an umbrella definition that applies to every system in nature. If it is not true for every system, it is not true enough to be used in a generalized manner. Statements of this type can only be made in conjunction with statements of the approximations that are being made. Can we please move away from using this in the article??? It is a condition that is only true under very specific circumstances and needs a lot of detail to fully explain. Putting in all the details or making an over-generalized statement will just confuse people who read the article or, perhaps even worse, give them the wrong idea. Sirsparksalot (talk) 16:51, 21 October 2010 (UTC)
- Just to clarify - the statement that the temperature is a measure of translational energy per (translational) degree of freedom is true for all gases, not just monatomic ideal gases. PAR (talk) 17:18, 21 October 2010 (UTC)
- Sorry, I must have misread that part. But what about other systems that don't have translational degrees of freedom (ignoring COM of course) such as solids or liquids? The relation won't even be applicable. I just think that it is too specific of a statement to rely so heavily upon, as it is in the article. It certainly has its pedagogic benefits of drawing a relationship between energy and temperature, but using it without all the necessary details would just sacrifice accuracy. Sirsparksalot (talk) 20:03, 21 October 2010 (UTC)
- In solids the gaseous 'elastic collision' aspect of energy transfer is replaced by intermolecular forces, e.g. the kind of force that holds a crystal together; this force, together with the mass of the atoms, forms a resonant system that stores energy. It is the same energy storage method that gives diatomic gases a higher specific heat than monatomics. Forces between molecules of a liquid also store energy; water molecules have very high forces acting between them because of the very great difference between the size of the two H atoms and the one O atom; this great force gives water a very high latent heat of evaporation, since the water can only become a gas when the H2O molecules have too much energy to coalesce. --Damorbel (talk) 21:02, 21 October 2010 (UTC)
- That is precisely my point and why I have such reservations about using the expression Etr=kT/2 per degree of freedom. This equation is only useful for certain situations and I feel that it is currently being overused and relied upon too heavily in the article. Sirsparksalot (talk) 21:18, 21 October 2010 (UTC)
- I think you need three things for kT/2 to represent energy per degree of freedom.
- You need the molecules to be able to exchange energy randomly. This can happen by collisions, or intermolecular forces.
- You need to wait a long enough time for equilibrium to occur - the distribution of energies settles down and becomes unchanging.
- You need the energy levels of a particular degree of freedom to be practically continuous over the energies of interest. In other words, negligible quantization effects.
- If you have these three conditions, the energies will be distributed according to a Maxwell distribution, they will have some temperature T, and the energy for each such degree of freedom will be kT/2. The translational motion of the molecules in a gas, liquid, and, I think, most solids fulfills all of these requirements, and so each molecule's translational degrees of freedom will hold an energy of kT/2. In other words, kT/2 is good for the translational degrees of freedom of gases, liquids and solids. PAR (talk) 22:44, 21 October 2010 (UTC)
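As a sanity check of the kT/2 statement, here is a small Monte Carlo sketch. It assumes a Maxwell-Boltzmann velocity distribution (each velocity component Gaussian with variance kT/m) and an argon-like mass; both choices are illustrative, not taken from the discussion.

    # Numerical check: mean kinetic energy per translational degree of freedom ~ kT/2
    import numpy as np

    k = 1.380649e-23          # J/K
    T = 300.0                 # K
    m = 6.63e-26              # kg (roughly one argon atom)

    vx = np.random.normal(0.0, np.sqrt(k * T / m), size=1_000_000)
    mean_energy_per_dof = 0.5 * m * np.mean(vx**2)
    print(mean_energy_per_dof, 0.5 * k * T)   # both about 2.07e-21 J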
- But in gasses, there are 3N translational degrees of freedom, where N is the number of molecules. In solids and liquids, there are only 3 translational degrees of freedom. These would correspond to the x,y and z translational motions of the center of mass for the entire solid, i.e. the center of mass of a rock or a glass full of water. The other degrees of freedom are taken up by vibrational (and in the case of liquids librational as well) motions which cannot be simply attributed to kT/2 per dof. If we were to stick with the expression Etr=kT/2, this would mean that a single rock with a temperature of 300 K would have to have a well defined average velocity. But the velocity often has very little to do with the temperature of the rock. Sirsparksalot (talk) 23:56, 21 October 2010 (UTC)
Ah ha! I see what you are saying. In other words, you do not disagree with the conditions I wrote above, but with the characterization of the motion of the molecules of a solid as being translational in nature. This is a semantic argument now, which is easier. I agree with you that solid motions are better described as vibrational. I still don't believe it for a liquid, though. Bottom line is, I agree, for solids at least, that "translational" is a questionable terminology. PAR (talk) 03:20, 22 October 2010 (UTC)
- Exactly. I don't disagree with your conditions at all. I guess it comes down to an issue of semantics. Unfortunately, the devil is in the details and you can't really go about this stuff without dealing with the semantics at some point. As for the liquids, they are an entirely different beast altogether. Dealing with gases and solids is comparatively easy, since separating degrees of freedom is (more or less) trivial. As you mentioned, gases can be broken up into translation, rotation and vibration, and solids can be reasonably well approximated by just vibration. Liquids are horrible. While, in principle, molecules in a liquid can translate, rotate and vibrate, they are all extremely hindered by the presence of other surrounding solvent molecules. For example, people often characterize the frustrated rotations as "librations." While they can be experimentally measured, they are hard to characterize due to the large number of variables and the extremely complicated potential energy surfaces present. Sirsparksalot (talk) 17:29, 22 October 2010 (UTC)
- I suggest "translational" applies only to gases since it originates in the Kinetic theory (of gases). The general consideration is that molecules are so far separated from each other that momentum exchange is the only exchange that takes place in a collision. At first sight the explanation given here [[9]] is good. In the real world where some molecules such have very high polar moments we see that deviations from the PV=RT approximation become very large and liquifaction sets in as the temperature drops (and sometimes as the pressure rises). The exact PV characteristics of all gases are extremely informative about intermolecular forces, the simplest example being the separation of liquifaction and solidification temperatures, large for diatomics like hydrogen and small for helium. --Damorbel (talk) 08:39, 22 October 2010 (UTC)
- I agree. When many textbooks mention the Etr=kT/2 relation, it is almost always stated in terms of an ideal gas. They never extend it to any other type of system. Come to think of it, I don't know of any book that states it without being specifically in the context of an ideal gas. If the gas isn't ideal, the potential energy due to intermolecular forces starts to come into play, and energy transfer, as you put it Damorbel, is not solely due to momentum transfer. The Etr=kT/2 expression starts to become less important. Sirsparksalot (talk) 17:29, 22 October 2010 (UTC)
I think one has to choose between a fundamental definition of temperature provided by statistical mechanics and the phenomenological one provided by thermodynamics. If we choose the latter, as consensus here seems to indicate, then you also have to forget about relating the temperature of an object to the microscopic description of the object. What you can do is appeal to the notion of thermodynamic equilibrium. An object at temperature T will be in thermal equilibrium with an ideal gas at the same temperature when brought in thermal contact with it. This means that you only have to define temperature for an ideal gas. Count Iblis (talk) 17:44, 22 October 2010 (UTC)
- The fundamental definition of temperature is the thermodynamic definition. If you confine yourself to thermodynamics only, then yes, you are not concerned with the microscopic description of the object. Statistical mechanics provides a detailed explanation of thermodynamic temperature in terms of the microscopic description of the object, but it does not provide an alternative definition of temperature. Stat mech proves that the thermodynamic temperature of a gas is equal to 2/3 the translational energy per molecule divided by the Boltzmann constant, for example. This article addresses the subject of temperature, and that means thermodynamic temperature. There is no reason it cannot then go on to explain the insights into thermodynamic temperature that statistical mechanics provides. PAR (talk) 19:33, 22 October 2010 (UTC)
- I completely agree. I think that the Intro has slowly evolved over the last weeks and is in a slightly better state than it was. I definitely think that it should start with the thermodynamic formulation of temperature since that is what came first. It can then describe how statistical mechanics provides an interpretation based on microscopic phenomena. One problem that I think comes up based on PAR's last post (not due to PAR) is the topic of the "thermodynamic temperature" which, for some reason has its own article. I don't understand why there should be two different articles. Maybe someone can explain this to me. Sirsparksalot (talk) 22:28, 22 October 2010 (UTC)
A measure of the average energy
It is not clear to me whether the last paragraph of the lead ("The statistical approach ...") can be understood in the sense that the temperature is proportional to the mean energy of an "energetically available degree of freedom", thus implying that the specific heat (at constant volume) is constant. Perhaps one could explain that there are degrees of freedom which are, say, "gradually available" for quantum reasons, which is the case for vibrations in solids, explicitly recalled in the paragraph. Or it should be specified that the statement is valid at temperatures much higher than the quantum energy spacing of the various degrees of freedom (or much smaller, in which case that mode is not available), and virtually always for translational motions (which have very small energy spacings in sizable volumes). --GianniG46 (talk) 23:29, 22 October 2010 (UTC)
- Why not edit it the way you think it should be and then go from there? PAR (talk) 03:37, 23 October 2010 (UTC)
Thank you for your very kind reply, but I did not dare to put the pen on the article, after the long discussion I see above, nor do I dare now. Nevertheless, I think one cannot simply say that T is a "measure of the average energy of motion (kinetic energy) in each energetically available degree of freedom" without also saying that there are some specifications to consider, as already remarked by Sirsparksalot in the first discussion ("Introduction") above. T is monotonically increasing with energy, and for many gases is simply proportional to it, but the latter statement is macroscopically false for most solids even at ordinary temperatures. Now I can go from here. Best regards. --GianniG46 (talk) 07:56, 23 October 2010 (UTC)
- Well then you can see from the above discussion that I need to learn more about the specific heat of solids, but some of the stuff in this article was just wrong. Please watch this article and give an opinion, with explanation, if you see something you think is wrong. PAR (talk) 19:50, 23 October 2010 (UTC)
The matter in itself would be simple: the relation which is always valid between energy and T is the factor exp(-E/kT) in the populations of each degree of freedom. The average energy is proportional to the integral of this factor over all energies. If there is a continuum of possible energies this integral is proportional to T, so average energy is proportional to T.
In the quantum case, instead, we have discrete levels, and the integral becomes a sum over discrete energies, which is not at all proportional to T unless the spacing between levels is much smaller than kT. And, at low T, when kT is substantially lower than the energy of the first level above the ground state, that degree of freedom cannot be populated and becomes thermally inactive, and its specific heat tends to zero. This is the case for solids, according to the Einstein-Debye theory of specific heat. It is true in general for condensed matter and also for polyatomic gases, and makes the specific heat go to zero at absolute zero.
This is simple to say here, less simple to use in the last paragraph of the lead, to specify in which sense T is connected to energy. I have to find a way to say this in a few words. --GianniG46 (talk) 22:39, 24 October 2010 (UTC)
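One compact way to see the freeze-out described above is the Einstein-model expression for the mean thermal energy of a single quantized harmonic degree of freedom. The sketch below compares it with the classical value kT at a few temperatures; the level spacing chosen is an arbitrary illustrative value, not one from the discussion.

    # Mean thermal energy of a quantized harmonic degree of freedom (Einstein model):
    # hbar*w/(exp(hbar*w/kT)-1).  It approaches the classical kT only when kT >> hbar*w
    # and goes to zero when kT << hbar*w (the degree of freedom "freezes out").
    import math

    k = 1.380649e-23        # J/K
    hw = 4.0e-21            # J, energy quantum hbar*omega (arbitrary, roughly 290 K worth)

    def mean_energy(T):
        return hw / math.expm1(hw / (k * T))   # zero-point energy omitted

    for T in (30.0, 300.0, 3000.0):
        print(T, mean_energy(T), k * T)        # the two columns agree only at high T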
Definition of temperature in lead (4)
The description of temperature as being derived from the zeroth law is described with references, so I cannot in good conscience remove or dispute these statements until I have read the quoted references, which I will do. But rather than quote competing references, can we just stop and THINK about the situation. Take the set of simple thermodynamic systems - the zeroth law says that you can collect all those that are in thermal equilibrium with each other and form a new set - an equivalence class - an element of the set consists of all those systems which are in thermal equilibrium with each other; different elements are not. And that's all. There is no ordering of this set. I can assign a number which corresponds to the thermodynamic temperature to each element of the set. Fine, that works - equal values of thermodynamic temperature will imply thermodynamic equilibrium, and unequal values imply the lack thereof. I can also assign a number as follows: Take the decimal expression for the thermodynamic temperature, and reverse every group of three digits, beginning at the decimal point. For example, 273.150 becomes 372.051. The zeroth law will have no problem with this. The numbering is unique - equal values of this new numbering of the equivalence classes will imply thermodynamic equilibrium, and unequal values imply the lack thereof. But the crucial concept of "hotter" and "colder" is gone. There are an infinite number of ways to assign "numbers" to these equivalence classes while obeying the zeroth law, some of which are even more bizarre. The zeroth law alone cannot be used to justify and design what is commonly known as a thermometer, or an empirical temperature scale. If you put forth some object and propose that it be used as a thermometer, you must show that its mechanical parameters (e.g. pressure and volume) bear a unique relationship to the thermodynamic state of the thermometer and that it can be used to specify a unique real number for each of the equivalence classes. The minute you introduce the idea of "mechanical parameters" and their relationship to the thermodynamic state, you must invoke the first and second laws. This is the only way to provide the needed order relationship that is vital to the definition of temperature. To quote Boyling, J.B. (1972). "An Axiomatic Approach to Classical Thermodynamics". Proc. R. Soc. Lond. A (329): 35–70.
The zeroth law as it stands is not strong enough to guarantee the existence of an empirical temperature scale, i.e. of a (1-1) correspondence between isothermals and real numbers. It has to be supplemented by auxiliary assumptions involving a special kind of simple system called a "thermometer".
Of course I don't agree with this fully. The "reverse every group of three digits" recipe provides a 1-1 correspondence between isotherms and real numbers, but does not qualify as a temperature scale. But the larger point is valid. The zeroth law provides no order relationship, no sense of "hotter" and "colder", which is vital to the definition of temperature. The concept of a thermometer is outside the realm of the zeroth law. The second law can be used to provide this ordering relationship when it states that if heat (a measurable quantity by the first law) flows from one system to another, the first is at a higher temperature than the other. This has nothing to do with thermodynamic temperature; it holds for any temperature scale, absolute or empirical. You can then go even further and use the second law to define thermodynamic temperature (to within a scaling factor) by saying that the ratio of thermodynamic temperatures is equal to the ratio of heats exchanged during the power cycles of a reversible heat engine. PAR (talk) 14:36, 2 November 2010 (UTC)
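A toy implementation of the digit-reversal relabelling may make the point concrete: the relabelled numbers still distinguish the isotherms (so the zeroth-law equivalence classes are respected), but the ordering that "hotter" and "colder" rely on is scrambled. The zero-padding width and the example temperatures are arbitrary choices for the illustration.

    # Reverse each group of three digits on either side of the decimal point.
    def relabel(t):
        whole, frac = f"{t:09.3f}".split(".")              # e.g. 273.150 -> "00273", "150"
        rev = lambda s: "".join(s[i:i+3][::-1] for i in range(0, len(s), 3))
        return float(rev(whole.zfill(6)) + "." + rev(frac))

    print(relabel(273.150))                # 372.051, as in the example above
    print(relabel(299.0), relabel(300.0))  # 992.0 and 3.0: 300 K is hotter than 299 K,
                                           # but the relabelled numbers say the opposite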
- You are describing your point of view, but it is incorrect, and this is not a forum to teach thermodynamics. The 0th law does that just fine. The principles of equilibrium, and the mathematical framework of this, that the law is based on were known explicitly even before its concise statement. Caratheodory laid it out early on and describes the mathematical conditions and framework for the 0th law. You don't need the second law to scale and measure temperature successfully in the laboratory, as history is proof. Your quotation of Boyling above is taken out of context and cannot be used for your general argument. Boyling doesn't invalidate the concept; he was referring to a specific problem in the choice of a simple system used as a thermometer, which others have disputed or corrected, neither of which can be treated in an encyclopedia. Boyling's paper overall is pretty much just a reformulation or rediscovery of Caratheodory. A good textbook that also relies on this approach is that by Buchdahl. The 0th law formalism relies on a mathematical model for the representative space of thermodynamic systems in equilibrium. The temperature is one of the space coordinates, as is the thermometric parameter that is used to measure it, i.e. the pressure or the volume of a gas thermometer. One last comment: the 0th law statement only exists because people recognized (late) that it was necessary for making a temperature and thermometer definition at all; it was defined as a law for the sole purpose of defining empirical temperature scales and the construction of a thermometer. If this wasn't possible, as is your opinion, it would have never been defined as a law. We don't need it for anything else; it did not create new fundamental relationships, and the formalism of thermodynamic equilibrium was already known with the same detail. It was only overlooked that it formed a logical requirement for temperature measurement. Kbrose (talk) 15:51, 2 November 2010 (UTC)
- Ok, bottom line - give a quick outline of how to build a thermometer without recourse to the first or second law. This is an honest request, made out of curiosity.
- I give a reasoned argument, you declare it wrong. Proof - the authority of Caratheodory and "history". I quote the Boyling reference, you declare it wrong. Proof - the authority of unnamed "others". Nowhere do you actually confront the reasoning given. Please explain the sense in which Boyling's statement is not to be taken at face value and please explain where the above reasoning goes astray. Again, an honest request.
- You say "The 0th law formalism relies on a mathematical model for the representative space of thermodynamic systems in equilibrium. The temperature is one of the space coordinates as is the thermometric parameter that is used to measure it, i.e. the pressure or the volume of a gas thermometer." - The representative space does not exist until it is defined by the first and second laws. Circular reasoning.
- Buchdahl? Is this the same Buchdahl who wrote "On the redundancy of the zeroth law of thermodynamics " H A Buchdahl 1986 J. Phys. A: Math. Gen. 19 L561 in which he states:
Expositions of classical thermodynamics frequently include the so-called zeroth law amongst its 'fundamental principles'. It is shown here that, given only the first law and the second law (the latter in a formulation manifestly free of any explicit or implicit reference to temperature), the transitivity of the relation 'is in diathermic equilibrium with' can be deduced. The zeroth law which is an assertion of just this transitivity is therefore redundant. The existence of the absolute temperature function of course emerges directly, i.e. without appeal to a prior empirical temperature.
- Even I am not prepared to go that far. PAR (talk) 17:50, 2 November 2010 (UTC)
- Putting the zeroth law debate aside for the moment, I just want to point out that the Lead is getting really long and really technical. Please step back and consider the audience. The Lead is for the curious reader asking "what is temperature really?". This reader knows about molecules, but not entropy. The answer should be something tangible, like "the amount of random motion of the particles", maybe introducing kinetic energy. We don't want the curious reader to encounter paragraph 2, which says "Science is scary and inaccessible. Go away." The zeroth law, entropy, dE/dS... save it for the body of the article. Just because something is true doesn't mean it belongs in the Lead section. I propose a shorter Lead that mirrors the structure of the article:
- Temperature quantifies the sensations of cold and hot (current 1st paragraph is good)
- Historically there have been various definitions, but roughly speaking it corresponds to the mean kinetic energy of the particles; don't go overboard on degrees of freedom discussion; maybe just disclaim that liquids and solids are more complicated
- Temperature scales; introduce absolute zero;
- Importance in science -- temperature-dependence of various phenomena
- Maybe another topic I missed?
- Does this seem reasonable? Spiel496 (talk) 20:56, 2 November 2010 (UTC)
- Please use the word random as in
- the mean random kinetic energy of the particles
- If a rocket is going to the moon, its translational kinetic energy has nothing to do with its temperature. However, its internal random kinetic energy does. Q Science (talk) 05:27, 3 November 2010 (UTC)
- "but roughly speaking it corresponds to the mean kinetic energy of the particles". The main reason for not putting it this way is because it is incorrect, doing so is just misleading. The worst aspect of the article is that it is contradictory, small wonder it is 'scary', how are people at any level of expertise supposed to profit from such stuff? --Damorbel (talk) 09:21, 3 November 2010 (UTC)
- Damorbel, it is inconsiderate to throw out a "nope, that's wrong" statement without elaborating. This sounds like the same issue you brought up before. After about 10,000 words about units and Boltzmann's constant that led everyone in different directions, it turned out your only gripe was the lack of the phrase "per degree of freedom". Don't waste our time. Spiel496 (talk) 16:24, 3 November 2010 (UTC)
- "but roughly speaking it corresponds to the mean kinetic energy of the particles". The main reason for not putting it this way is because it is incorrect, doing so is just misleading. The worst aspect of the article is that it is contradictory, small wonder it is 'scary', how are people at any level of expertise supposed to profit from such stuff? --Damorbel (talk) 09:21, 3 November 2010 (UTC)
- I agree it is necessary to add a word such as "random". Perhaps it would be more correct to say something like "kinetic energy with respect to the center of gravity frame", but this would be needlessly complicated for the lead. After all, in gases the movements can be effectively described as random, and I agree with Spiel496 that we should speak "historically", making reference mainly to the properties of gases. I think a phrase like "Temperature is a measure of the kinetic energies of the microscopic, random movements of molecules of matter. The higher the temperature, the higher the kinetic energy. In simple cases (perfect gases) temperature is simply proportional to the average kinetic energy of molecules" would be adequate. Please note the plural "energies", so there is no statement about mean energy. --GianniG46 (talk) 11:16, 3 November 2010 (UTC)
- I second that: "random" needs to be in there. Spiel496 (talk) 16:25, 3 November 2010 (UTC)