Ambiguity in the definition of entropy
The entropy $S$ of a system is defined as $$S = k\ln\Omega.$$ What precisely is $\Omega$? It is said to be "the number of microstates" of the system, but is this the number of all accessible microstates, or just the number of microstates corresponding to the system's current macrostate? Or is it something else that eludes me?
statistical-mechanics entropy definition
edited 2 days ago by Qmechanic♦
asked Apr 3 at 0:46 by PiKindOfGuy
Uh, ambiguity is the definition of entropy. – Hot Licks, yesterday
3 Answers
Entropy is a property of a macrostate, not a system. So $\Omega$ is the number of microstates that correspond to the macrostate in question.
Putting aside quantization, it might appear that there are infinitely many microstates, and thus that the entropy is infinite, but for any level of resolution the number is finite. Changing the level of resolution simply multiplies the number of microstates by a constant factor. Since it is almost always the change in entropy, not the absolute entropy, that matters, and we're taking the log of $\Omega$, it doesn't matter that the definition of $S$ is ambiguous up to a constant multiplicative factor in $\Omega$: that factor cancels when we take $dS$. So with a little hand-waving (aka "normalization"), we can ignore the apparent infinity of the entropy.
– Acccumulation (answered 2 days ago, edited 2 days ago)
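The cancellation this answer describes is easy to check numerically: multiplying $\Omega$ by any resolution-dependent constant shifts $S$ by an additive constant, which drops out of any entropy *difference*. A minimal sketch (the microstate counts `1e10` and `1e20` and the resolution factor are made-up illustrative numbers):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def entropy(omega, resolution=1.0):
    # A coarser or finer resolution multiplies the microstate count
    # by a constant factor; S = k ln(resolution * omega).
    return k * math.log(resolution * omega)

# The entropy *change* between two macrostates is independent of that factor:
dS_coarse = entropy(1e20, resolution=1.0) - entropy(1e10, resolution=1.0)
dS_fine   = entropy(1e20, resolution=1e6) - entropy(1e10, resolution=1e6)
print(math.isclose(dS_coarse, dS_fine))  # True
```

The log turns the multiplicative ambiguity into an additive one, and differences kill additive constants.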
+1; you may add for completeness that $\Omega$ is really the phase-space volume occupied by all the possible microstates corresponding to the given macrostate. I always found this formulation clearer: since both positions and momenta are continuous variables, the number of microstates available would otherwise be infinite. – Run like hell, 2 days ago
@Runlikehell I was trying to get at the infinity problem with my last paragraph, but I think I'll make that a bit more clear. – Acccumulation, 2 days ago
+1. "Entropy is a property of a macrostate, not a system." is completely clear, but I have never heard that before. It clarified my thinking a lot. – M. Winter, yesterday
@M.Winter In physics, the macrostate "in question" is the state in which the system exists, or more accurately in which it spends most of its time, so we (or the system) must choose the macrostate with the highest statistical weight, i.e. the highest $\Omega$. Analogy: the extremum of a function (entropy) is a property of the function (system), not of a particular function value at a particular argument. – Aleksey Druggist, yesterday
Entropy is a logarithmic measure of the number of microscopic states corresponding to some specific macroscopically observable state, not to the system as a whole. Put another way: systems that have not yet found their equilibrium state increase their entropy when left alone. This would not be possible if the system had the same entropy in every macrostate.
Indeed, the driving principle of entropy in modern statistical mechanics is that we have some uncertainty about the underlying microscopic state of the system, and that from a certain perspective (basically, the one where every macroscopic quantity we can determine is conserved) we can treat nature as simply choosing a microstate uniformly at random. (We have to tread carefully about what exactly "uniformly" means here, but an "obvious" choice seems to replicate certain nice features, like metals having specific heats that look like $3R$, where $R$ is the gas constant; a result that I want to say is due to Einstein, but I am not 100% sure.)
As a result of this principle of nature picking microstates at random, the equilibrium state is the macrostate containing the most microstates, and regression to equilibrium is a process of macrostates getting larger and larger.
– CR Drost (answered 2 days ago, edited 2 days ago)
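The claim that "equilibrium is the macrostate containing the most microstates" can be sketched with the standard toy model of $N$ coins (or two-level spins), where a microstate is the full sequence and a macrostate is the number of heads. This model is my illustration, not one used in the answer:

```python
import math

N = 100  # coins; microstate = full head/tail sequence, macrostate = head count

# Number of microstates in macrostate "n heads" is the binomial coefficient C(N, n)
omega = {n: math.comb(N, n) for n in range(N + 1)}

# Picking a microstate uniformly at random makes the largest macrostate
# by far the most likely one: equilibrium sits at n = N/2.
most_likely = max(omega, key=omega.get)
print(most_likely)        # 50
print(omega[50] / 2**N)   # ~0.0796: even the biggest single macrostate is small,
                          # but macrostates near n = 50 dominate collectively
```

For large $N$ the distribution concentrates sharply around $N/2$, which is why macroscopic systems look like they deterministically relax to equilibrium.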
Entropy is a matter of perspective.
You pick a way to describe a system at large scales. This effectively subdivides the system's configurations into macrostates, or "macroscopic states".
Each of these macrostates corresponds to a number of "microstates": different configurations of the system that are clumped together into one macrostate.
If, for each macrostate, you take the log of the number of microstates in it, the principle of entropy is that, whatever macrostate the system is in, it will almost certainly move towards macrostates with a higher value.
A system can move to a lower entropy value only by increasing the entropy of another system; this basically consists of merging the two systems into one and applying the first rule.
Microstate counts multiply when systems are combined: if we have two systems $A$ and $B$ with macrostates $A_0$ and $B_0$ containing 7 and 10 microstates apiece, the combined system $A+B$ in macrostate $A_0+B_0$ has $7 \cdot 10 = 70$ microstates.
Taking the log of the number of microstates simply lets us use addition instead of multiplication: the entropies $\log(7)$ and $\log(10)$ add to $\log(7)+\log(10) = \log(7 \cdot 10)$. Any function with the property $f(ab) = f(a) + f(b)$ would do just as well, which is why we don't care what the base of the logarithm is.
The fun part is that this applies regardless of how you clump the microstates into macrostates, so long as you do the clumping before the experiment. So we pick sensible macrostates that correspond to things we care about, and the result holds. Crazy choices of macrostates don't actually help us: the vast majority of the possible configuration space of any system is completely useless chaos; only a ridiculously small fraction of the configuration space is going to be "useful", and no matter how we label it, that region is going to contain very few microstates.
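The 7-and-10-microstate example above can be checked directly: counts multiply, logs add, and the base is irrelevant to the additivity property:

```python
import math

# Two independent systems with macrostates A_0 and B_0 (counts from the answer)
omega_A, omega_B = 7, 10

# Microstate counts multiply for the combined macrostate A_0 + B_0
omega_AB = omega_A * omega_B  # 70

# The log turns that product into a sum, in any base
for base in (2, math.e, 10):
    S_A  = math.log(omega_A, base)
    S_B  = math.log(omega_B, base)
    S_AB = math.log(omega_AB, base)
    print(math.isclose(S_A + S_B, S_AB))  # True for every base
```

Changing the base just rescales all entropies by the same constant, which is exactly why physics can pin the scale with Boltzmann's constant $k$ while information theory uses base 2.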
$begingroup$
Entropy is a property of a macrostate, not a system. So $Omega$ is the number of microstates that correspond to the macrostate in question.
Putting aside quantization, it might appear that there are an infinite number of microstates, and thus the entropy is infinite, but for any level of resolution, the number is finite. And changing the level of resolution simply multiplies the number of microstates by a constant amount. Since it is almost always the change in entropy, not the absolute entropy, that is considered, and we're taking the log of $Omega$, it actually doesn't matter if the definition of S is ambiguous up to a constant multiplicative factor, as that will cancel out when we take dS. So with a little hand waving (aka "normalization"), we can ignore the apparent infinity of entropy.
$endgroup$
$begingroup$
+1; You may add for completeness that $Omega$ is actually the volume occupied by all the possible microstates corresponding to the given macrostate, in phase space. I Always found this expression clearer, since both positions and momentum are continuous variables, the number of microstates available would be infinite.
$endgroup$
– Run like hell
2 days ago
1
$begingroup$
@Runlikehell I was trying to get at the infinity problem with my last paragraph, but I think I'll make that a bit more clear.
$endgroup$
– Acccumulation
2 days ago
$begingroup$
+1. "Entropy is a property of a macrostate, not a system." is completely clear, but I have never heard that before. It clarified my thinking a lot.
$endgroup$
– M. Winter
yesterday
$begingroup$
@M. Winter, in physics, the macrostate " in question " is the state in which system exists or more accurately spend most of the time, so we (or the system) must choose the macrostate with the highest stat. weight, that is, highest $ Omega$. Analogy: the extremum of the function (entropy) is a property of the function (system) and not a particular function value of particular argument value
$endgroup$
– Aleksey Druggist
yesterday
add a comment |
$begingroup$
Entropy is a property of a macrostate, not a system. So $Omega$ is the number of microstates that correspond to the macrostate in question.
Putting aside quantization, it might appear that there are an infinite number of microstates, and thus the entropy is infinite, but for any level of resolution, the number is finite. And changing the level of resolution simply multiplies the number of microstates by a constant amount. Since it is almost always the change in entropy, not the absolute entropy, that is considered, and we're taking the log of $Omega$, it actually doesn't matter if the definition of S is ambiguous up to a constant multiplicative factor, as that will cancel out when we take dS. So with a little hand waving (aka "normalization"), we can ignore the apparent infinity of entropy.
$endgroup$
$begingroup$
+1; You may add for completeness that $Omega$ is actually the volume occupied by all the possible microstates corresponding to the given macrostate, in phase space. I Always found this expression clearer, since both positions and momentum are continuous variables, the number of microstates available would be infinite.
$endgroup$
– Run like hell
2 days ago
1
$begingroup$
@Runlikehell I was trying to get at the infinity problem with my last paragraph, but I think I'll make that a bit more clear.
$endgroup$
– Acccumulation
2 days ago
$begingroup$
+1. "Entropy is a property of a macrostate, not a system." is completely clear, but I have never heard that before. It clarified my thinking a lot.
$endgroup$
– M. Winter
yesterday
$begingroup$
@M. Winter, in physics, the macrostate " in question " is the state in which system exists or more accurately spend most of the time, so we (or the system) must choose the macrostate with the highest stat. weight, that is, highest $ Omega$. Analogy: the extremum of the function (entropy) is a property of the function (system) and not a particular function value of particular argument value
$endgroup$
– Aleksey Druggist
yesterday
add a comment |
$begingroup$
Entropy is a property of a macrostate, not a system. So $Omega$ is the number of microstates that correspond to the macrostate in question.
Putting aside quantization, it might appear that there are an infinite number of microstates, and thus the entropy is infinite, but for any level of resolution, the number is finite. And changing the level of resolution simply multiplies the number of microstates by a constant amount. Since it is almost always the change in entropy, not the absolute entropy, that is considered, and we're taking the log of $Omega$, it actually doesn't matter if the definition of S is ambiguous up to a constant multiplicative factor, as that will cancel out when we take dS. So with a little hand waving (aka "normalization"), we can ignore the apparent infinity of entropy.
$endgroup$
Entropy is a property of a macrostate, not a system. So $Omega$ is the number of microstates that correspond to the macrostate in question.
Putting aside quantization, it might appear that there are an infinite number of microstates, and thus the entropy is infinite, but for any level of resolution, the number is finite. And changing the level of resolution simply multiplies the number of microstates by a constant amount. Since it is almost always the change in entropy, not the absolute entropy, that is considered, and we're taking the log of $Omega$, it actually doesn't matter if the definition of S is ambiguous up to a constant multiplicative factor, as that will cancel out when we take dS. So with a little hand waving (aka "normalization"), we can ignore the apparent infinity of entropy.
edited 2 days ago
answered 2 days ago
AcccumulationAcccumulation
3,014514
3,014514
$begingroup$
+1; You may add for completeness that $Omega$ is actually the volume occupied by all the possible microstates corresponding to the given macrostate, in phase space. I Always found this expression clearer, since both positions and momentum are continuous variables, the number of microstates available would be infinite.
$endgroup$
– Run like hell
2 days ago
1
$begingroup$
@Runlikehell I was trying to get at the infinity problem with my last paragraph, but I think I'll make that a bit more clear.
$endgroup$
– Acccumulation
2 days ago
$begingroup$
+1. "Entropy is a property of a macrostate, not a system." is completely clear, but I have never heard that before. It clarified my thinking a lot.
$endgroup$
– M. Winter
yesterday
$begingroup$
@M. Winter, in physics, the macrostate " in question " is the state in which system exists or more accurately spend most of the time, so we (or the system) must choose the macrostate with the highest stat. weight, that is, highest $ Omega$. Analogy: the extremum of the function (entropy) is a property of the function (system) and not a particular function value of particular argument value
$endgroup$
– Aleksey Druggist
yesterday
add a comment |
$begingroup$
+1; You may add for completeness that $Omega$ is actually the volume occupied by all the possible microstates corresponding to the given macrostate, in phase space. I Always found this expression clearer, since both positions and momentum are continuous variables, the number of microstates available would be infinite.
$endgroup$
– Run like hell
2 days ago
1
$begingroup$
@Runlikehell I was trying to get at the infinity problem with my last paragraph, but I think I'll make that a bit more clear.
$endgroup$
– Acccumulation
2 days ago
$begingroup$
+1. "Entropy is a property of a macrostate, not a system." is completely clear, but I have never heard that before. It clarified my thinking a lot.
$endgroup$
– M. Winter
yesterday
$begingroup$
@M. Winter, in physics, the macrostate " in question " is the state in which system exists or more accurately spend most of the time, so we (or the system) must choose the macrostate with the highest stat. weight, that is, highest $ Omega$. Analogy: the extremum of the function (entropy) is a property of the function (system) and not a particular function value of particular argument value
$endgroup$
– Aleksey Druggist
yesterday
$begingroup$
+1; You may add for completeness that $Omega$ is actually the volume occupied by all the possible microstates corresponding to the given macrostate, in phase space. I Always found this expression clearer, since both positions and momentum are continuous variables, the number of microstates available would be infinite.
$endgroup$
– Run like hell
2 days ago
$begingroup$
+1; You may add for completeness that $Omega$ is actually the volume occupied by all the possible microstates corresponding to the given macrostate, in phase space. I Always found this expression clearer, since both positions and momentum are continuous variables, the number of microstates available would be infinite.
$endgroup$
– Run like hell
2 days ago
1
1
$begingroup$
@Runlikehell I was trying to get at the infinity problem with my last paragraph, but I think I'll make that a bit more clear.
$endgroup$
– Acccumulation
2 days ago
$begingroup$
@Runlikehell I was trying to get at the infinity problem with my last paragraph, but I think I'll make that a bit more clear.
$endgroup$
– Acccumulation
2 days ago
$begingroup$
+1. "Entropy is a property of a macrostate, not a system." is completely clear, but I have never heard that before. It clarified my thinking a lot.
$endgroup$
– M. Winter
yesterday
$begingroup$
+1. "Entropy is a property of a macrostate, not a system." is completely clear, but I have never heard that before. It clarified my thinking a lot.
$endgroup$
– M. Winter
yesterday
$begingroup$
@M. Winter, in physics, the macrostate " in question " is the state in which system exists or more accurately spend most of the time, so we (or the system) must choose the macrostate with the highest stat. weight, that is, highest $ Omega$. Analogy: the extremum of the function (entropy) is a property of the function (system) and not a particular function value of particular argument value
$endgroup$
– Aleksey Druggist
yesterday
$begingroup$
@M. Winter, in physics, the macrostate " in question " is the state in which system exists or more accurately spend most of the time, so we (or the system) must choose the macrostate with the highest stat. weight, that is, highest $ Omega$. Analogy: the extremum of the function (entropy) is a property of the function (system) and not a particular function value of particular argument value
$endgroup$
– Aleksey Druggist
yesterday
add a comment |
$begingroup$
Entropy logarithmically measure of the number of microscopic states corresponding to some specific macroscopically-observable state, not the system as a whole. Put another way: systems that have not yet found their equilibrium state, when left alone, increase their entropy. This would not be possible if the system had the same entropy for all macrostates.
Indeed, the driving principle of entropy in modern stat-mech says that we have some uncertainty about the underlying microscopic state of the system and that from a certain perspective (basically, the one where every macroscopic quantity we can determine is conserved) we can treat nature as simply choosing a microstate uniformly at random. (We have to tread carefully about what exactly uniformly means here but an “obvious” choice seems to replicate certain nice features, like that metals will have specific heats that look like $3R$ where $R$ is the gas constant—a result that I want to say is due to Einstein but I am not 100% sure.)
As a result of this principle of nature picking microstates at random, our equilibrium state is the macrostate which contains the most microstates, and our regression to equilibrium is a process of macrostates getting larger and larger.
$endgroup$
add a comment |
$begingroup$
Entropy logarithmically measure of the number of microscopic states corresponding to some specific macroscopically-observable state, not the system as a whole. Put another way: systems that have not yet found their equilibrium state, when left alone, increase their entropy. This would not be possible if the system had the same entropy for all macrostates.
Indeed, the driving principle of entropy in modern stat-mech says that we have some uncertainty about the underlying microscopic state of the system and that from a certain perspective (basically, the one where every macroscopic quantity we can determine is conserved) we can treat nature as simply choosing a microstate uniformly at random. (We have to tread carefully about what exactly uniformly means here but an “obvious” choice seems to replicate certain nice features, like that metals will have specific heats that look like $3R$ where $R$ is the gas constant—a result that I want to say is due to Einstein but I am not 100% sure.)
As a result of this principle of nature picking microstates at random, our equilibrium state is the macrostate which contains the most microstates, and our regression to equilibrium is a process of macrostates getting larger and larger.
$endgroup$
add a comment |
$begingroup$
Entropy logarithmically measure of the number of microscopic states corresponding to some specific macroscopically-observable state, not the system as a whole. Put another way: systems that have not yet found their equilibrium state, when left alone, increase their entropy. This would not be possible if the system had the same entropy for all macrostates.
Indeed, the driving principle of entropy in modern stat-mech says that we have some uncertainty about the underlying microscopic state of the system and that from a certain perspective (basically, the one where every macroscopic quantity we can determine is conserved) we can treat nature as simply choosing a microstate uniformly at random. (We have to tread carefully about what exactly uniformly means here but an “obvious” choice seems to replicate certain nice features, like that metals will have specific heats that look like $3R$ where $R$ is the gas constant—a result that I want to say is due to Einstein but I am not 100% sure.)
As a result of this principle of nature picking microstates at random, our equilibrium state is the macrostate which contains the most microstates, and our regression to equilibrium is a process of macrostates getting larger and larger.
$endgroup$
Entropy logarithmically measure of the number of microscopic states corresponding to some specific macroscopically-observable state, not the system as a whole. Put another way: systems that have not yet found their equilibrium state, when left alone, increase their entropy. This would not be possible if the system had the same entropy for all macrostates.
Indeed, the driving principle of entropy in modern stat-mech says that we have some uncertainty about the underlying microscopic state of the system and that from a certain perspective (basically, the one where every macroscopic quantity we can determine is conserved) we can treat nature as simply choosing a microstate uniformly at random. (We have to tread carefully about what exactly uniformly means here but an “obvious” choice seems to replicate certain nice features, like that metals will have specific heats that look like $3R$ where $R$ is the gas constant—a result that I want to say is due to Einstein but I am not 100% sure.)
As a result of this principle of nature picking microstates at random, our equilibrium state is the macrostate which contains the most microstates, and our regression to equilibrium is a process of macrostates getting larger and larger.
edited 2 days ago
answered 2 days ago
CR DrostCR Drost
22.7k11962
22.7k11962
add a comment |
add a comment |
$begingroup$
Entropy is a matter of perspective.
You pick a way to describe a system at large scales. This effectively subdivides the system into macrostates, or "macroscopic states".
Each of these macroscopic states corresponds to a number of "microstates"; different configurations of the system that are clumped together in one macrostate.
If, for each macrostate, you take the log of the number of microstates in it, the principle of Entropy is that whatever macrostate it is in, it will move towards macrostates with a higher value almost certainly.
Now you can move to a lower Entropy value only by increasing the Entropy of another system. This basically consists of merging the two systems into one and applying the first rule.
The number of microstates multiply when they are combined; if we have two systems A and B, and they have macrostates A_0 and B_0 with 7 and 10 microstates apiece, the system A+B with macrostate A_0+B_0 has 70 microstates (7*10).
Taking the log of the number of microstates simply allows us to use addition instead of multiplication; the entropy of $log(7)$ and $log(10)$ add to $log(7)+log(10)$ = $log(7*10)$.
Any function that has the property that $f(a*b)=f(a)+f(b)$ will do just as well, which is why we don't care what the base of our logarithm is.
The fun part is that this applies regardless of how you clump the microstates into macrostates so long as you do the clumping before the experiment. So we go and pick sensible macrostates that correspond to things we care about, and the result holds. Crazy choice of macrostates don't actually help us; the vast majority of the possible configuration space of any system is completely useless chaos, only a ridiculously small fraction of the system configuration space is going to be "useful", and no matter how we label it that space is going to have very few microstates in it.
$endgroup$
add a comment |
$begingroup$
Entropy is a matter of perspective.
You pick a way to describe a system at large scales. This effectively subdivides the system into macrostates, or "macroscopic states".
Each of these macroscopic states corresponds to a number of "microstates"; different configurations of the system that are clumped together in one macrostate.
If, for each macrostate, you take the log of the number of microstates in it, the principle of Entropy is that whatever macrostate it is in, it will move towards macrostates with a higher value almost certainly.
Now you can move to a lower Entropy value only by increasing the Entropy of another system. This basically consists of merging the two systems into one and applying the first rule.
The number of microstates multiply when they are combined; if we have two systems A and B, and they have macrostates A_0 and B_0 with 7 and 10 microstates apiece, the system A+B with macrostate A_0+B_0 has 70 microstates (7*10).
Taking the log of the number of microstates simply allows us to use addition instead of multiplication; the entropy of $log(7)$ and $log(10)$ add to $log(7)+log(10)$ = $log(7*10)$.
Any function that has the property that $f(a*b)=f(a)+f(b)$ will do just as well, which is why we don't care what the base of our logarithm is.
The fun part is that this applies regardless of how you clump the microstates into macrostates so long as you do the clumping before the experiment. So we go and pick sensible macrostates that correspond to things we care about, and the result holds. Crazy choice of macrostates don't actually help us; the vast majority of the possible configuration space of any system is completely useless chaos, only a ridiculously small fraction of the system configuration space is going to be "useful", and no matter how we label it that space is going to have very few microstates in it.
answered 2 days ago
Yakk
Thanks for contributing an answer to Physics Stack Exchange!
Uh, ambiguity is the definition of entropy.
– Hot Licks, yesterday