Thermodynamics of learning

By Marcus Wilson 27/11/2013

Last week I attended a conference on Emergent Learning and Threshold Concepts, here at the University of Waikato. It was a very interesting couple of days. As far as academic conferences go, it was unusual in that it was really cross-disciplinary. We had engineers mixing with physiotherapists, and management consultants with dancers. It certainly was interesting to hear about how other disciplines approach educating their students. A challenge faced by everyone presenting, me included, was to make the presentations accessible to someone with no expertise in the area whatsoever. It was a job that was surprisingly well done.

I'm not going to mention here what I talked about (you can find it on the ELTC website if you are that interested). Rather, I'll talk about what my colleague Jonathan Scott presented. He's been looking at Threshold Concepts and Learning for a while now and had some observations to make, which he couched in terms of thermodynamics. Jonathan had to keep it pretty maths-light for those in the audience who weren't mathematically inclined (probably most of them), and I think he did a good job. Here's a potted summary.

When we learn something 'thresholdy', things get more ordered in our brain. Pieces of information fit together better. We can see how concepts work, rather than just holding them as isolated pieces of knowledge. In thermodynamics, order is associated with a quantity called entropy. Specifically, something well ordered has low entropy; something with little order has high entropy. Ice has less entropy than water (since its molecules sit in an ordered crystal structure), and water has less entropy than steam (since even in liquid water there is some degree of ordering among the molecules). We give entropy the symbol 'S'. (Actually, I've never stopped to think why it's 'S' for entropy – does anyone know?)

Another key quantity in thermodynamics is heat. Heat is a form of energy. Practically, however, it's not always the best quantity to work with. That's because if we do experiments at constant pressure (the usual laboratory condition), gases and liquids expand when they heat up. A more useful quantity to work with is enthalpy. It's like heat energy, but it takes into account the fact that things can expand and contract, so the amount of stuff in, say, a 1 litre volume changes. When ice melts into water, for example, there is a change in enthalpy of the system. We need to put energy into the ice to melt it, which means that the enthalpy of the water is higher than that of the ice. We give enthalpy the symbol 'H'.

We can combine the effects of a change in enthalpy and a change in entropy in something called the Gibbs free energy. We give it the symbol 'G'. Specifically, it's the enthalpy minus the product of temperature (T) and entropy – in maths terms, G = H – TS. Now, here's the neat bit. For a system to change its state spontaneously (e.g. ice into water), the change in Gibbs free energy needs to be negative. For ice turning to water, the change in entropy is positive (more disorder). The change in enthalpy is also positive. To get the change in G to be negative, we need the temperature T to be large enough. At atmospheric pressure, if T > 0 degrees C, melting happens. If not, it doesn't.
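The ice-to-water argument can be checked numerically in a few lines. A minimal sketch in Python, using the textbook molar enthalpy of fusion of ice (about 6010 J/mol) and the entropy change it implies at the melting point (the numbers, not the code, come from standard tables):

```python
# Sign of the Gibbs free energy change for melting ice at 1 atm.
# At the melting point the phases are in equilibrium, so dS = dH / T_melt.
dH = 6010.0          # J/mol, enthalpy change on melting (textbook value)
T_melt = 273.15      # K, melting point of ice at 1 atm
dS = dH / T_melt     # ~22 J/(mol K), entropy change on melting

def dG(T):
    """Change in Gibbs free energy: since G = H - TS, dG = dH - T * dS."""
    return dH - T * dS

for T_celsius in (-10, 0, 10):
    T = T_celsius + 273.15
    g = dG(T)
    if abs(g) < 1e-6:
        verdict = "equilibrium"
    elif g < 0:
        verdict = "melts"
    else:
        verdict = "stays ice"
    print(f"{T_celsius:+3d} C: dG = {g:+8.1f} J/mol -> {verdict}")
```

Below 0 degrees C the entropy term loses to the enthalpy term and dG is positive (no melting); above it, dG goes negative and melting is spontaneous, exactly as the paragraph above says.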

What has that got to do with learning? Well, here's Jonathan's analogy. To learn a threshold concept, we need a move to more order in our thinking: a negative change in entropy. But a large, negative change in entropy means the -TΔS term is strongly positive, so if the change is to happen we need to make the change in H strongly negative. In other words, we need to 'take the heat out' of the system. If the system is 'the student', then this equates to getting the student to do lots of work. (Remember the first law of thermodynamics: heat and work are both ways of transferring energy.) If a system does lots of work on something else, it loses energy and cools. A good example is gas from a pressurized bottle doing work as it expands down to atmospheric pressure – the nozzle of the bottle gets cold. The bigger the ordering required in one's thoughts, the bigger the amount of work the student needs to do. The process is assisted by a lowering of the temperature – a 'cool' environment (as opposed to a hot one with too much going on) helps the student learn.
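The cold-nozzle example has a standard idealized form: a gas expanding reversibly and adiabatically (doing work, exchanging no heat) cools according to T2 = T1 (P2/P1)^((γ−1)/γ). A real bottle nozzle is messier than this, but a quick sketch shows the size of the effect for air:

```python
# Idealized 'cold nozzle': reversible adiabatic expansion of an ideal gas.
# The gas does work on its surroundings, exchanges no heat, and so cools:
#   T2 = T1 * (P2 / P1) ** ((gamma - 1) / gamma)
gamma = 1.4  # heat-capacity ratio for air (diatomic gas)

def expansion_temperature(T1, P1, P2):
    """Final temperature after reversible adiabatic expansion from P1 to P2."""
    return T1 * (P2 / P1) ** ((gamma - 1) / gamma)

T1 = 293.0  # K, roughly room temperature
T2 = expansion_temperature(T1, P1=10.0, P2=1.0)  # 10 atm bottle to 1 atm
print(f"Expanding 10 atm air to 1 atm: {T1:.0f} K -> {T2:.0f} K")
```

The idealized drop is dramatic (well over a hundred kelvin for a 10:1 expansion); real nozzles don't get that cold because the expansion is neither reversible nor perfectly insulated, but the direction of the effect is the same: do work, lose energy, cool down.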

Perhaps all this is taking a physics analogy a bit too far. If we think of the message as being "to get thoughts to order together is actually quite difficult" then it's got merit – that is really what the Threshold Concept environment is about.

Finally, it's been noted that Threshold Concepts are, indeed, a threshold concept. So if you struggle to see what I'm getting at, you need to do some more work 😉

Responses to “Thermodynamics of learning”

• ross says:

Mr C was German. ‘e was on holiday in San Tropez. ‘e had a French secretary at the time and ‘e was dictating ‘is paper to ‘er…. She (it has to be a she) thought that every time ‘e said “Entropy” she thought he was saying “Saint Tropez”. You can imagine Mr C’s reaction. After all, his wife was not in San Tropez and the after dinner activities had to continue so he did not admonish her whimsical mistake. But. In ‘onour of the mistake, ‘e took the S from the “Saint” (and after all, she was one) and used it in his treatise for Entropy.

Such are the thrills of illicit love.

Now funnily enough, another heat-sensitive symbol emerged from the Laws of Thermodynamics. A woman was involved. This time a chook.

One of the interesting facts about chooks is that they don’t pee.

“Why?” is the wonderful question you are asking.

God was discovering all about the second law, and while he contemplated, the chooks were peeing all around his throne and upsetting his concentration. At some point he finally got really, really peed off and pronounced for all time: “No ‘en thal pee”!!

Later, someone was reading a lost book of the bible and stumbled across the exclamation. Being a man of science who had discovered the futility of perpetual motion (he had correctly deduced God’s will), he corrected the misspelling of “Hen”.

The hen had cosiness and warmth. Ahah!!!!

“Haitch it shall be, I’ll shall call this warmth Henthalpy!!”

And decades later another Frenchman (or woman) translated it again from that filthy German and the Haitch got lost……

• Haha good one Ross

My question, since we are talking about entropy, starts with a story (it could be true).
There once was a man who came to prominence in WW2: because he was quite an intelligent man, he was coerced to work for the Nazis. He had studied nature by looking at how trout behaved in a stream. He had figured out that water behaves a certain way when it flows, and he studied its vortices.
So the legend goes, he invented a machine that worked on the spiralling motions of water as it fell. He noticed that it behaved in a manner that was opposite to entropy; he had discovered an implosive force that created levity.
This man’s name was Viktor Schauberger, and his machine of legend is called the Repulsine. Now it is rumoured the Nazis took this machine and made strange circular discs that could carry some people off into the sky (the Haunebu, Vril).
Proof: none, but it sounds better than any fiction I have heard. Anyway, my question is…
Since every action has an equal and opposite reaction, what is the negative of entropy called, and is it written into the textbooks (not Viktor Schauberger, but -ve entropy)? Or is entropy like magic that doesn’t apply to this rule?

• Bruce Hamilton says:

” We give entropy the symbol ‘S’. (Actually, I’ve never stopped to think why it’s ‘S’ for entropy – Does anyone know?)”

Nobody knows, but… my vague recollection is that the symbol “S” for entropy was chosen by Rudolf Clausius to honour Sadi Carnot. Clausius developed the Second Law of Thermodynamics based on the earlier heat engine work of Sadi Carnot ( that produced the Carnot Cycle for heat engines ).

A quick Google finds the current Wikipedia article on History of Entropy which also gives the following. No fowls were harmed….

” In 1865, Clausius gave irreversible heat loss, or what he had previously been calling “equivalence-value”, a name:[7][8]

“ I propose to name the quantity S the entropy of the system, after the Greek word [τροπη trope], the transformation. I have deliberately chosen the word entropy to be as similar as possible to the word energy: the two quantities to be named by these words are so closely related in physical significance that a certain similarity in their names appears to be appropriate. ”

Although Clausius did not specify why he chose the symbol “S” to represent entropy, it is arguable that Clausius chose “S” in honor of Sadi Carnot, to whose 1824 article Clausius devoted over 15 years of work and research.

On the first page of his original 1850 article “On the Motive Power of Heat, and on the Laws which can be Deduced from it for the Theory of Heat”, Clausius calls Carnot the most important of the researchers in the theory of heat.[9] ”

Additional useless information, Sadi Carnot died of cholera when only 36. As was typical for victims of contagious diseases, all of his personal effects ( including scientific papers ) were burned or buried with him. Much of his work was lost, and it was decades later before others ( like Rudolf Diesel ) appreciated his insights into thermal engine design.

• Marcus Wilson says:

‘Q’ for heat, ‘U’ for internal energy, ‘H’ for enthalpy, ‘S’ for entropy. Once you’ve grasped this the rest of thermodynamics is a breeze.

• Marcus Wilson says:

All this makes me think that teaching students a little more science history (and philosophy) would be useful.