We usually think of temperature as being related to the motion of atoms: at lower temperatures, atomic motions slow down. Absolute zero, defined as zero kelvin or −273.15 degrees Celsius, is then the point where all atomic motion stops. But what lies beyond that? Does something like a negative absolute temperature exist? Indeed it does, as Ulrich Schneider and colleagues from Munich have now demonstrated impressively in this week’s issue of Science. [...]
Archive | Thermodynamics
December 27, 2010
The past year has been a great one for science, with major advances in several areas and too many exciting results to mention here. Instead, to reflect on the past year I have chosen a representative paper for each month, one that I hope can serve as an example of the great science going on across a number of research fields. Of course, this is a highly subjective and personal collection, and there are certainly others worth mentioning. But the aim was also to provide a balanced overview of the year, covering a variety of topics.
Of course, if you have an exciting paper to add, please feel free to use the comments section below to let us know!
Anyway, enough said, here are some of my highlights from the past year:
JANUARY – iron-based superconductors
Since they were discovered in 2008, iron-based superconductors, the pnictides, have been one of the hottest topics in condensed matter physics. Part of their appeal stems from the fact that they are based on iron, a magnetic element, and normally magnetism and superconductivity exclude each other.
The iron-based compounds have a crystal structure similar to that of the so-called cuprates, the materials with the highest known superconducting transition temperatures. The mechanism behind high-temperature superconductivity is unknown, and studying the iron-based superconductors may therefore also be relevant to understanding the cuprates.
This paper, published in Science, shows for the first time that the electrons in the iron-based superconductors adopt a periodic arrangement that differs from the periodicity of the atoms in the crystal. Similar observations have been made in the cuprates, where understanding them is considered important for the mechanism of high-temperature superconductivity.
Chuang, T., Allan, M., Lee, J., Xie, Y., Ni, N., Bud’ko, S., Boebinger, G., Canfield, P., & Davis, J. (2010). Nematic Electronic Structure in the “Parent” State of the Iron-Based Superconductor Ca(Fe1−xCox)2As2. Science, 327 (5962), 181-184. DOI: 10.1126/science.1181083
November 14, 2010
Your desk at work: is it as chaotic as mine, or clean and ordered? If the latter, I salute you, because it takes work to keep a desk tidy; otherwise, chaos will soon reign. And while I admit that I should keep my desk cleaner (and no, I won’t share photos here), I have an excellent excuse: it is a fundamental law of nature that disorder and chaos are always increasing.
A measure of this disorder is a quantity called entropy. Cleaning your desk takes work, which creates heat, energy that is dissipated into the environment. So even though the entropy of the desk is reduced, the total entropy increases. And this is the second law of thermodynamics: in any process, the total entropy of the universe never decreases.
Well, asked James Clerk Maxwell back in 1867, what if you have a box filled with gas at a certain temperature? The box is separated into two compartments by a wall with a small door. The door is controlled by a tiny ‘demon’ that lets fast-moving gas molecules pass into the right half and keeps the slow ones in the left. The left half would cool down and the right one heat up, and overall the box would be more ordered than before. If the demon itself doesn’t use up any energy (which in principle can be arranged), entropy would decrease, right? But according to the second law of thermodynamics, creating order requires energy, and the demon wouldn’t use any. Is this then a violation of the second law?
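The demon’s sorting trick is easy to see in a toy numerical model. The sketch below is purely illustrative (all names and the energy threshold are my own choices, not anything from Maxwell): molecules are represented by kinetic energies drawn from a Maxwell-Boltzmann-like distribution, and the demon simply sends above-median energies to the right compartment. The mean kinetic energy of each half then serves as a stand-in for its temperature.

```python
import random

random.seed(1)  # fixed seed so the sketch is reproducible

# Toy molecules: kinetic energy ~ sum of three squared Gaussian
# velocity components (a Maxwell-Boltzmann-like distribution).
N = 10_000
energies = [
    random.gauss(0, 1) ** 2 + random.gauss(0, 1) ** 2 + random.gauss(0, 1) ** 2
    for _ in range(N)
]

# The 'demon' sorts by a threshold (here, the median energy):
threshold = sorted(energies)[N // 2]
left = [e for e in energies if e <= threshold]   # slow molecules stay left
right = [e for e in energies if e > threshold]   # fast molecules go right

def mean(xs):
    return sum(xs) / len(xs)

# Mean kinetic energy is a proxy for temperature.
print(f"whole box: {mean(energies):.2f}")
print(f"left half (cooled):  {mean(left):.2f}")
print(f"right half (heated): {mean(right):.2f}")
```

The point of the sketch is only that sorting by speed really does produce a cold half and a hot half without any work being done on the gas itself; the entropy bookkeeping that rescues the second law happens in the demon’s memory, as discussed next.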
Well, actually not. The reason is that there is energy in information. To store a bit of information, a system such as a computer memory needs to be put into a defined state, either a '1' or a '0', which reduces its entropy. The same is true for the two compartments of the box: we use the information about whether a molecule is fast or slow to separate them. The energy stored in that information is what pays for the reduction in the entropy of the system. So we are fine; the second law is not violated.
The energy contained in the information is tiny. For a single bit the lower limit, known as the Landauer limit, is kT ln 2, which at room temperature is on the order of 10⁻²¹ joules. In comparison, one calorie, the old energy unit often used for food, corresponds to 4.184 joules.
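That order-of-magnitude estimate is a one-line calculation. A minimal sketch, using the standard value of the Boltzmann constant and assuming room temperature of 300 K:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)
T = 300.0           # assumed room temperature, K

# Landauer limit: minimum energy associated with one bit of information
E_bit = k_B * T * math.log(2)
print(f"energy per bit: {E_bit:.2e} J")   # ~2.87e-21 J

# For scale: how many bits' worth of energy is one calorie (4.184 J)?
bits_per_calorie = 4.184 / E_bit
print(f"bits per calorie: {bits_per_calorie:.2e}")
```

So one food-label calorie corresponds to roughly 10²¹ bits at the Landauer limit, which is why the demon’s information bookkeeping is utterly invisible on everyday energy scales.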