Entropy and its Differential Treatment in the System vs Surroundings



I have had this question for years, and despite posting in chemistry stack exchange about it, have never received a response which closes the issue for me.



My confusion arises from the differential treatment of entropy when focusing on systems and surroundings. It has always been my understanding that, because entropy is a state function, we are able to compute its maximum value for any process by summing over a reversible path. In a reversible process, where the system and surroundings are in equilibrium at each stage, this results in equal and opposite values for $\Delta S_{syst}$ and $\Delta S_{surr}$, and hence we get the limiting value of Clausius' inequality: $\Delta S_{total} = 0$.



My confusion is that, during an irreversible process, we apply the already known fact that entropy is a state function only to the system. This leads to differing values of $\Delta S_{syst}$ and $\Delta S_{surr}$, such that all non-reversible processes follow the non-limiting case of Clausius: $\Delta S_{total} > 0$.



My question is: Because entropy is a state function, we should be able to calculate $\Delta S_{surr}$ along a reversible path in all of these non-limiting cases, such that $\Delta S_{total}$ is always zero. Why is this not done?



To help understand my confusion, I have received these responses before, and am not satisfied with my understanding of them at present. If they are in fact the answer, I would appreciate clarification on my confusion:



a.) We treat the surroundings (the rest of the universe) as an infinite heat and work sink. This appears to me to be useful in practice, but in theory it cannot be true by the first law, no? If, for instance, more heat flows into/out of the surroundings, we are removing/adding that finite energy (which can neither be created nor destroyed) from/to the rest of the universe. I intuitively understand the argument that this energy is "a drop in the ocean that is the universe." But that to me is not satisfactory in an axiomatic or rigorous sense. If this is the answer, please explain.



b.) I have also been told that we cannot compute the reversible path of the surroundings because we lack information on its final variables, insofar as we would need to know the parameters V, T, P of the end state as well as the initial state in order to compute the path, and so we must fall back on computing the irreversible path. This makes sense practically, but again, just because we lack the state variables does not mean they don't exist. In principle, each of these irreversible processes has a final state, and so a reversible path exists, along which the entropy calculated would always be at a maximum; would we not then have $\Delta S_{total} = 0$ again?



Six years into chemistry, and I still struggle with this, although I see many people on these sites argue as though it is intuitive and obvious. I have seen the derivation of Clausius' inequality, and I have been taught about partition functions. The use of entropy in practical problems does not bother me; it is the underlying theory that seems just out of my grasp. I'd appreciate any and all help.



(PS I did not include an example problem because the last time I did so with this question it hindered more than it helped. Earnest people responding seemed to get lost in my understanding/interpretation of the problem in question instead of the underlying issue).



























thermodynamics entropy
















asked Jan 31 at 10:39









Yajibromine



























What do you mean by "its [entropy's] maximum value for a reversible process"? I don't see where the maximum comes in.
– user1476176, Feb 1 at 1:03










4 Answers




















> Because entropy is a state function, we should be able to calculate $\Delta S_{surr}$ along a reversible path in all of these non-limiting cases, such that $\Delta S_{total}$ is always zero. Why is this not done?




In special cases the environment can be treated as just another thermodynamic system, with possible equilibrium states, a temperature, and an entropy. Then what you're suggesting would be possible. For example, if the environment of a gas in a cylinder is another gas in a separate cylinder, then both can be assigned entropy in the way you suggest.



However, in general (and usually) the environment is more complicated; usually it is not in equilibrium at all. Consider the atmosphere as the environment of the gas in a cylinder. The atmosphere locally has a single temperature, so we can treat it as a reservoir with a single temperature for the purpose of analysing the main system (the gas in the cylinder), but this does not actually mean that the atmosphere is in thermodynamic equilibrium or has a well-defined entropy.



Entropy is a function of a thermodynamic equilibrium state, or (as a generalized entropy) an integral of a local entropy density where the medium is locally in thermodynamic equilibrium.



In general, a physical system together with its surroundings is in neither of the above two states, and one cannot assign entropy to the combined system at all. Then, provided the temperature $T_r$ still makes sense and has a single value at the boundary of the main studied system, the (actual, original) Clausius inequality says



$$
\oint \frac{dQ}{T_r} \leq 0
$$

where the integral is over any closed path in the state space of the main system. But this implies nothing about the entropy of the environment, or the total entropy, because such an entropy is not generally even defined.
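To make the inequality concrete, here is a minimal numerical sketch (my own illustration, not part of the original answer): a heat engine cycling between two reservoirs, with an assumed efficiency below the Carnot limit so that the cycle is irreversible. The Clausius sum over the cycle then comes out negative.

```python
# Minimal sketch (illustrative numbers): evaluate the Clausius sum
# sum(Q_i / T_r,i) over one cycle of an irreversible two-reservoir engine.
# T_r is the reservoir temperature at the boundary where each heat exchange occurs.

T_hot, T_cold = 500.0, 300.0        # reservoir temperatures [K] (assumed)
Q_in = 1000.0                       # heat absorbed from the hot reservoir [J] (assumed)

eta_carnot = 1.0 - T_cold / T_hot   # reversible (Carnot) efficiency = 0.4
eta_actual = 0.2                    # assumed efficiency of the irreversible engine

W = eta_actual * Q_in               # work done per cycle
Q_out = Q_in - W                    # heat rejected to the cold reservoir

# Clausius sum over the cycle: heat counted positive when it enters the system.
clausius_sum = Q_in / T_hot - Q_out / T_cold

print(f"Carnot efficiency: {eta_carnot:.2f}, actual: {eta_actual:.2f}")
print(f"Clausius sum = {clausius_sum:.3f} J/K  (<= 0, zero only for a reversible cycle)")
```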






answered Jan 31 at 21:36 – Ján Lalinský












  • Your answer, I think, most directly touches on my confusion. I will need to do some more digging into the topics you touch on, such as the non-equilibrium of the surroundings and the related lack of an entropy definition for it. Is it then correct to extrapolate from the Clausius inequality the law that the entropy of the universe must increase? Or is this erroneous (as we are limited by the surroundings which we can measure)?
    – Yajibromine, Feb 1 at 16:56











  • The extrapolation from the Clausius inequality to the statement "the universe has entropy and it cannot decrease" was already made by Clausius in his witty statement of the 2nd law, but I think it was unfortunate that other people then took it as the word of god. We do not have a way to experiment on the environment or the universe as a whole, such as making heat exchanges the way we do with a closed system in the laboratory. So I believe the extrapolation is unwarranted.
    – Ján Lalinský, Feb 1 at 21:29




















> My question is: Because entropy is a state function, we should be able to calculate $\Delta S_{surr}$ along a reversible path in all of these non-limiting cases, such that $\Delta S_{total}$ is always zero. Why is this not done?



I believe the short answer would be, we don’t care.



The choice of the “system” in thermodynamics is completely arbitrary. We define it based on what it is we wish to study, and then focus on that. One person’s system can be another’s surroundings and vice versa. So there is no inherent reason for not calculating the entropy change of the surroundings.



I like to use the following as a definition of thermodynamics:



The study of energy transfer, between a system and its surroundings, in relation to the properties of substances.



A key part of this definition is “the properties of substances”, because whatever it is that we wish to study, we need information on its properties. So if we want to change our focus from the currently defined system to its current surroundings, we then need to redefine the surroundings as the system and study the effects of energy transfers to and from it in relation to the properties of its substances. Without knowledge of the properties of the surroundings, we can only come to some general conclusions about the effects of energy transfers to and from it.



For example, with regard to energy, we know that if a system does work on, or transfers heat to, the surroundings, there is an increase in the internal energy of the surroundings equal to the decrease in internal energy of the system. In other words, the first law applies equally to both the system and the surroundings. However, without information on the properties of the surroundings we cannot determine the effect of the transfers on, say, the temperature of the surroundings.



With regard to entropy, we know that if heat $Q$ is transferred isothermally from the surroundings to the system, there will be a decrease in entropy of the surroundings of



$$\Delta S_{surr}=-\frac{Q}{T_{surr}}$$



And an increase in entropy of the system of



$$\Delta S_{sys}=+\frac{Q}{T_{sys}}$$



For a total entropy change of



$$\Delta S_{Tot}= +\frac{Q}{T_{sys}}-\frac{Q}{T_{surr}}$$



Therefore, for all $T_{surr}>T_{sys}$, $\Delta S_{Tot}>0$; $\Delta S_{Tot}=0$ only for the reversible path where $T_{surr}\to T_{sys}$. In this example we needed properties of the surroundings, namely its temperature and the fact that it can be considered a thermal reservoir.
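As a quick numerical check of these formulas, here is a minimal sketch (my own, with illustrative numbers, not part of the original answer):

```python
# Sketch: entropy bookkeeping for heat Q passed isothermally from the
# surroundings (a reservoir at T_surr) to the system (held at T_sys).
# All numbers below are illustrative assumptions.

def total_entropy_change(Q, T_sys, T_surr):
    dS_sys = +Q / T_sys      # system gains entropy
    dS_surr = -Q / T_surr    # surroundings lose entropy
    return dS_sys + dS_surr

Q = 100.0  # J
for T_surr in (400.0, 350.0, 310.0, 300.001):
    print(T_surr, total_entropy_change(Q, T_sys=300.0, T_surr=T_surr))

# The total is positive whenever T_surr > T_sys, and tends to zero as
# T_surr -> T_sys, which is the reversible limit described above.
```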



Hope this helps.






answered Jan 31 at 13:03 – Bob D












  • > "we know that if heat Q is transferred isothermally": in general, it is sufficient that the temperature of the system $T_{sys}$ is defined at all times; then the formula above applies in a differential sense.
    – Ján Lalinský, Jan 31 at 21:22





















> Because entropy is a state function, we should be able to calculate $\Delta S_{surr}$ along a reversible path in all of these non-limiting cases, such that $\Delta S_{total}$ is always zero.




Actually, whenever you're calculating the entropy change in a process you are using a reversible path, since $S$ is only defined for quasi-static reversible processes. So for an irreversible process you calculate the entropy changes of the system and of its surroundings by assuming each one underwent a separate quasi-static reversible process connecting its initial and final states. And if you add the entropy changes of each, the sum will not equal zero.



Let's look at an example: suppose body $(1)$ is our system with temperature $T^{(1)}$ and body $(2)$ is the 'surroundings' of the first, with temperature $T^{(2)}$ such that $T^{(1)} < T^{(2)}$. Now we bring the two bodies into contact with each other, allowing them only to exchange heat. We know from observation that heat will be transferred from body $(2)$ to body $(1)$ and not the other way around, so this process is irreversible.



How can we calculate the entropy change of the two bodies? We assume that body $(1)$ underwent a quasi-static reversible process from its temperature $T^{(1)}$ to the equilibrium temperature $T_\text{eq}$ and calculate the entropy change along that path. We do the same for body $(2)$: take a reversible process connecting $T^{(2)}$ and $T_\text{eq}$ and calculate the entropy change. The change in entropy of the composite system $(1) + (2)$ in an infinitesimal step will be
$$
dS = dS^{(1)} + dS^{(2)} = \frac{\delta Q^{(1)}}{T^{(1)}} + \frac{\delta Q^{(2)}}{T^{(2)}}
$$

but since energy is conserved, $\delta Q^{(2)} + \delta Q^{(1)} = 0$, so
$$
dS = \delta Q^{(1)} \left( \frac{1}{T^{(1)}} - \frac{1}{T^{(2)}} \right) > 0
$$

at each step, since $T^{(2)}$ is always greater than $T^{(1)}$ until $T_\text{eq}$ is reached and $\delta Q^{(1)}$ is positive. Integrating then yields $\Delta S > 0$ for the whole process.
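Here is a minimal numerical sketch of this example (my own, assuming two bodies with equal, constant heat capacity; all values are illustrative): the heat exchange is stepped quasi-statically, the entropy of each body is accumulated along its own path, and the total agrees with the closed form $C\ln\left(T_\text{eq}^2/(T^{(1)}T^{(2)})\right) > 0$.

```python
import math

# Sketch: two finite bodies with equal, constant heat capacity C exchange heat
# until they reach a common temperature T_eq. The entropy change of each body
# is accumulated along its own quasi-static path. Values are assumed.

C = 10.0                    # heat capacity of each body [J/K]
T1_0, T2_0 = 300.0, 400.0   # initial temperatures, body (1) colder than body (2)
T_eq = 0.5 * (T1_0 + T2_0)  # equal heat capacities -> arithmetic mean

T1, T2 = T1_0, T2_0
dS_total = 0.0
steps = 100_000
dT = (T_eq - T1_0) / steps      # temperature increment of body (1) per step
for _ in range(steps):
    dQ = C * dT                 # heat entering body (1) and leaving body (2)
    dS_total += dQ / T1 - dQ / T2
    T1 += dT
    T2 -= dT

analytic = C * math.log(T_eq**2 / (T1_0 * T2_0))
print(dS_total, analytic)       # both ~ +0.206 J/K, i.e. Delta S_total > 0
```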














  • "assuming each one went in a separate quasi-static path". Separate is, I think, the key word which the OP was missing here. The fact that a given reversible path will bring the system to the correct final state does not imply that the same path will also bring the surroundings to the correct final state.
    – By Symmetry, Jan 31 at 14:07










  • It is a common error to think that only reversible or quasi-static processes can be used to calculate entropy change. Bridgman and Eckart showed nearly eighty years ago that irreversible entropy generation can be usefully calculated in completely irreversible processes without the need to postulate some equilibrium endpoints and a reversible path connecting them.
    – hyportnex, Feb 2 at 13:46



















The issue is the entropy change of the surroundings. In thermodynamics, we typically treat the surroundings as an "ideal constant-temperature reservoir." Such a reservoir has the capacity to receive or deliver heat to a system without the reservoir temperature changing significantly (from the value $T_{surr}$). An example of such a reservoir would be an ice bath at $T_{surr}=0\ ^\circ\mathrm{C}$. So the enthalpy change of the reservoir is (perhaps) the result of a phase change at constant temperature: $\Delta H_{surr}=Q_{surr}$.



In addition to this, we assume that all the entropy generation in the irreversible process takes place within the system (and none takes place in the ideal surroundings). As such, there are never any temperature gradients in the ideal surroundings reservoir, and it always presents the same temperature $T_{surr}$ to the system at their interface. Furthermore, since there are no temperature gradients within the reservoir, the amount of entropy generation in the reservoir during the irreversible process is zero. The only entropy change that takes place for the reservoir is due to the transfer of entropy through its interface with the system: $$\Delta S_{surr}=\frac{Q_{surr}}{T_{surr}}\tag{1}$$ This equation applies irrespective of whether the process path is reversible or irreversible.



The other gap in your understanding is in assuming that the reversible path used to calculate the change in entropy for the irreversible process has to be the same reversible path for both the system and the surroundings. Since the irreversible process results in an increase in entropy, there is no reversible path for both the system and surroundings (in combination) that will yield the same final state for both, since, as you indicated, all combined reversible paths will yield zero change in entropy. To correctly get the change in entropy for the system and the surroundings, we must completely separate them from one another and devise a separate reversible path for each of them which takes each from the initial state to the final state that it experienced in the irreversible process. In the case of the surroundings, to end up in the same final state as in the irreversible process, it must receive the same amount of heat in its reversible path as it did in the irreversible path: $Q_{surr,rev}=-Q_{sys,irrev}$. So the entropy change of the surroundings is: $$\Delta S_{surr}=-\frac{Q_{sys,irrev}}{T_{surr}}$$ And the change in entropy for the system and surroundings together in the irreversible process is then:
$$\Delta S_{irrev}=\int\frac{dQ_{rev,sys}}{T_{rev,sys}}-\frac{Q_{sys,irrev}}{T_{surr}}$$
For more on separating system from surroundings (and even separating different parts of the same system) in devising reversible paths, see the following reference: https://www.physicsforums.com/insights/grandpa-chets-entropy-recipe/
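As a worked instance of this recipe (my own illustrative example, not taken from the answer: a solid block with an assumed constant heat capacity equilibrating with an ideal reservoir), the system's entropy change is computed along a separate reversible heating path while the reservoir's follows Eq. (1):

```python
import math

# Sketch: a block (the system) with constant heat capacity C, initially at T_i,
# equilibrates irreversibly with an ideal reservoir (the surroundings) at T_surr.
# The heat capacity and temperatures below are assumed, illustrative values.

C = 5.0           # heat capacity of the block [J/K]
T_i = 300.0       # initial block temperature [K]
T_surr = 400.0    # reservoir temperature [K]

# System: separate reversible path from T_i to T_surr (integral of C dT / T).
dS_sys = C * math.log(T_surr / T_i)

# Surroundings: ideal reservoir, Eq. (1); it delivers Q_sys,irrev = C (T_surr - T_i).
Q_sys_irrev = C * (T_surr - T_i)
dS_surr = -Q_sys_irrev / T_surr

print(dS_sys, dS_surr, dS_sys + dS_surr)   # total > 0, and -> 0 as T_i -> T_surr
```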


















    Your Answer





    StackExchange.ifUsing("editor", function ()
    return StackExchange.using("mathjaxEditing", function ()
    StackExchange.MarkdownEditor.creationCallbacks.add(function (editor, postfix)
    StackExchange.mathjaxEditing.prepareWmdForMathJax(editor, postfix, [["$", "$"], ["\\(","\\)"]]);
    );
    );
    , "mathjax-editing");

    StackExchange.ready(function()
    var channelOptions =
    tags: "".split(" "),
    id: "151"
    ;
    initTagRenderer("".split(" "), "".split(" "), channelOptions);

    StackExchange.using("externalEditor", function()
    // Have to fire editor after snippets, if snippets enabled
    if (StackExchange.settings.snippets.snippetsEnabled)
    StackExchange.using("snippets", function()
    createEditor();
    );

    else
    createEditor();

    );

    function createEditor()
    StackExchange.prepareEditor(
    heartbeatType: 'answer',
    autoActivateHeartbeat: false,
    convertImagesToLinks: false,
    noModals: true,
    showLowRepImageUploadWarning: true,
    reputationToPostImages: null,
    bindNavPrevention: true,
    postfix: "",
    imageUploader:
    brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
    contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
    allowUrls: true
    ,
    noCode: true, onDemand: true,
    discardSelector: ".discard-answer"
    ,immediatelyShowMarkdownHelp:true
    );



    );













    draft saved

    draft discarded


















    StackExchange.ready(
    function ()
    StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fphysics.stackexchange.com%2fquestions%2f458003%2fentropy-and-its-differential-treatment-in-the-system-vs-surroundings%23new-answer', 'question_page');

    );

    Post as a guest















    Required, but never shown

























    4 Answers
    4






    active

    oldest

    votes








    4 Answers
    4






    active

    oldest

    votes









    active

    oldest

    votes






    active

    oldest

    votes









    1












    $begingroup$


    Because entropy is a state function, we should be able to calculate ΔSsurr along a reversible path in all of these non-limiting cases, such that ΔStotal is always zero. Why is this not done?




    In special cases the environment can be treated as just another thermodynamic system, having possible equilibrium states, with temperature and entropy. Then what you're suggesting would be possible. For example, if environment of a gas in a cylinder is another gas in separate cylinder, then both can be assigned entropy in the way you suggest.



    However, in general (and usually) the environment is more complicated, usually it is not in equilibrium at all. Consider atmosphere as environment of the gas in a cylinder. The atmosphere has locally single temperature so we can treat it as reservoir with single temperature for the purpose of analysing the main system (gas in a cylinder), but this does not actually mean that the atmosphere is in thermodynamic equilibrium or has entropy.



    Entropy is a function of thermodynamic equilibrium state, or (generalized entropy) integral of local entropy density where locally the medium is in thermodynamic equilibrium.



    In general, physical system together with its surroundings is not in either of the above two states and one cannot assign entropy to the combined system at all. Then, provided temperature $T_r$ still makes sense and has single value at the boundary of the main studied system, the (actual, original) Clausius inequality says



    $$
    oint fracdQT_r leq 0
    $$

    where the integral is over any closed path in state space of the main system, but this implies nothing about entropy of the environment, or total entropy, because such entropy is not generally even defined.






    share|cite|improve this answer









    $endgroup$












    • $begingroup$
      Your answer, I think, most directly touches on my confusion. I will need to do some more digging into the topics you touch on, such as the non equalibrium of surroundings and related lack of an entropy definition for this. Is it then correct to extrapolate from the Clausius inequality the law that the entropy of the universe must increase? Or is this erroneous (as we are limited by the surroundings which we can measure)?
      $endgroup$
      – Yajibromine
      Feb 1 at 16:56











    • $begingroup$
      The extrapolation from the Clausius inequality to the statement "universe has entropy and it cannot decrease" was made already by Clausius in his witty statement of the 2nd law, but I think it was unfortunate that other people then took it as a word of god. We do not have a way to experiment on the environment or the universe as a whole, such as make heat exchanges the way we do with a closed system in laboratory. So I believe the extrapolation is unwarranted.
      $endgroup$
      – Ján Lalinský
      Feb 1 at 21:29
















    1












    $begingroup$


    Because entropy is a state function, we should be able to calculate ΔSsurr along a reversible path in all of these non-limiting cases, such that ΔStotal is always zero. Why is this not done?




    In special cases the environment can be treated as just another thermodynamic system, having possible equilibrium states, with temperature and entropy. Then what you're suggesting would be possible. For example, if environment of a gas in a cylinder is another gas in separate cylinder, then both can be assigned entropy in the way you suggest.



    However, in general (and usually) the environment is more complicated, usually it is not in equilibrium at all. Consider atmosphere as environment of the gas in a cylinder. The atmosphere has locally single temperature so we can treat it as reservoir with single temperature for the purpose of analysing the main system (gas in a cylinder), but this does not actually mean that the atmosphere is in thermodynamic equilibrium or has entropy.



    Entropy is a function of thermodynamic equilibrium state, or (generalized entropy) integral of local entropy density where locally the medium is in thermodynamic equilibrium.



    In general, physical system together with its surroundings is not in either of the above two states and one cannot assign entropy to the combined system at all. Then, provided temperature $T_r$ still makes sense and has single value at the boundary of the main studied system, the (actual, original) Clausius inequality says



    $$
    oint fracdQT_r leq 0
    $$

    where the integral is over any closed path in state space of the main system, but this implies nothing about entropy of the environment, or total entropy, because such entropy is not generally even defined.






    share|cite|improve this answer









    $endgroup$












    • $begingroup$
      Your answer, I think, most directly touches on my confusion. I will need to do some more digging into the topics you touch on, such as the non equalibrium of surroundings and related lack of an entropy definition for this. Is it then correct to extrapolate from the Clausius inequality the law that the entropy of the universe must increase? Or is this erroneous (as we are limited by the surroundings which we can measure)?
      $endgroup$
      – Yajibromine
      Feb 1 at 16:56











    • $begingroup$
      The extrapolation from the Clausius inequality to the statement "universe has entropy and it cannot decrease" was made already by Clausius in his witty statement of the 2nd law, but I think it was unfortunate that other people then took it as a word of god. We do not have a way to experiment on the environment or the universe as a whole, such as make heat exchanges the way we do with a closed system in laboratory. So I believe the extrapolation is unwarranted.
      $endgroup$
      – Ján Lalinský
      Feb 1 at 21:29














    1












    1








    1





    $begingroup$


    Because entropy is a state function, we should be able to calculate ΔSsurr along a reversible path in all of these non-limiting cases, such that ΔStotal is always zero. Why is this not done?




    In special cases the environment can be treated as just another thermodynamic system, having possible equilibrium states, with temperature and entropy. Then what you're suggesting would be possible. For example, if environment of a gas in a cylinder is another gas in separate cylinder, then both can be assigned entropy in the way you suggest.



    However, in general (and usually) the environment is more complicated, usually it is not in equilibrium at all. Consider atmosphere as environment of the gas in a cylinder. The atmosphere has locally single temperature so we can treat it as reservoir with single temperature for the purpose of analysing the main system (gas in a cylinder), but this does not actually mean that the atmosphere is in thermodynamic equilibrium or has entropy.



    Entropy is a function of thermodynamic equilibrium state, or (generalized entropy) integral of local entropy density where locally the medium is in thermodynamic equilibrium.



    In general, physical system together with its surroundings is not in either of the above two states and one cannot assign entropy to the combined system at all. Then, provided temperature $T_r$ still makes sense and has single value at the boundary of the main studied system, the (actual, original) Clausius inequality says



    $$
    oint fracdQT_r leq 0
    $$

    where the integral is over any closed path in state space of the main system, but this implies nothing about entropy of the environment, or total entropy, because such entropy is not generally even defined.






    share|cite|improve this answer









    $endgroup$




    Because entropy is a state function, we should be able to calculate ΔSsurr along a reversible path in all of these non-limiting cases, such that ΔStotal is always zero. Why is this not done?




    In special cases the environment can be treated as just another thermodynamic system, having possible equilibrium states, with temperature and entropy. Then what you're suggesting would be possible. For example, if environment of a gas in a cylinder is another gas in separate cylinder, then both can be assigned entropy in the way you suggest.



    However, in general (and usually) the environment is more complicated, usually it is not in equilibrium at all. Consider atmosphere as environment of the gas in a cylinder. The atmosphere has locally single temperature so we can treat it as reservoir with single temperature for the purpose of analysing the main system (gas in a cylinder), but this does not actually mean that the atmosphere is in thermodynamic equilibrium or has entropy.



    Entropy is a function of thermodynamic equilibrium state, or (generalized entropy) integral of local entropy density where locally the medium is in thermodynamic equilibrium.



    In general, physical system together with its surroundings is not in either of the above two states and one cannot assign entropy to the combined system at all. Then, provided temperature $T_r$ still makes sense and has single value at the boundary of the main studied system, the (actual, original) Clausius inequality says



    $$
    oint fracdQT_r leq 0
    $$

    where the integral is over any closed path in state space of the main system, but this implies nothing about entropy of the environment, or total entropy, because such entropy is not generally even defined.







    share|cite|improve this answer












    share|cite|improve this answer



    share|cite|improve this answer










    answered Jan 31 at 21:36









    Ján LalinskýJán Lalinský

    15.2k1335




    15.2k1335











    • $begingroup$
      Your answer, I think, most directly touches on my confusion. I will need to do some more digging into the topics you touch on, such as the non equalibrium of surroundings and related lack of an entropy definition for this. Is it then correct to extrapolate from the Clausius inequality the law that the entropy of the universe must increase? Or is this erroneous (as we are limited by the surroundings which we can measure)?
      $endgroup$
      – Yajibromine
      Feb 1 at 16:56











    • $begingroup$
      The extrapolation from the Clausius inequality to the statement "universe has entropy and it cannot decrease" was made already by Clausius in his witty statement of the 2nd law, but I think it was unfortunate that other people then took it as a word of god. We do not have a way to experiment on the environment or the universe as a whole, such as make heat exchanges the way we do with a closed system in laboratory. So I believe the extrapolation is unwarranted.
      $endgroup$
      – Ján Lalinský
      Feb 1 at 21:29

















    • $begingroup$
      Your answer, I think, most directly touches on my confusion. I will need to do some more digging into the topics you touch on, such as the non equalibrium of surroundings and related lack of an entropy definition for this. Is it then correct to extrapolate from the Clausius inequality the law that the entropy of the universe must increase? Or is this erroneous (as we are limited by the surroundings which we can measure)?
      $endgroup$
      – Yajibromine
      Feb 1 at 16:56











    • $begingroup$
      The extrapolation from the Clausius inequality to the statement "universe has entropy and it cannot decrease" was made already by Clausius in his witty statement of the 2nd law, but I think it was unfortunate that other people then took it as a word of god. We do not have a way to experiment on the environment or the universe as a whole, such as make heat exchanges the way we do with a closed system in laboratory. So I believe the extrapolation is unwarranted.
      $endgroup$
      – Ján Lalinský
      Feb 1 at 21:29
















    $begingroup$
    Your answer, I think, most directly touches on my confusion. I will need to do some more digging into the topics you touch on, such as the non equalibrium of surroundings and related lack of an entropy definition for this. Is it then correct to extrapolate from the Clausius inequality the law that the entropy of the universe must increase? Or is this erroneous (as we are limited by the surroundings which we can measure)?
    $endgroup$
    – Yajibromine
    Feb 1 at 16:56





    $begingroup$
    Your answer, I think, most directly touches on my confusion. I will need to do some more digging into the topics you touch on, such as the non equalibrium of surroundings and related lack of an entropy definition for this. Is it then correct to extrapolate from the Clausius inequality the law that the entropy of the universe must increase? Or is this erroneous (as we are limited by the surroundings which we can measure)?
    $endgroup$
    – Yajibromine
    Feb 1 at 16:56













    $begingroup$
    The extrapolation from the Clausius inequality to the statement "universe has entropy and it cannot decrease" was made already by Clausius in his witty statement of the 2nd law, but I think it was unfortunate that other people then took it as a word of god. We do not have a way to experiment on the environment or the universe as a whole, such as make heat exchanges the way we do with a closed system in laboratory. So I believe the extrapolation is unwarranted.
    $endgroup$
    – Ján Lalinský
    Feb 1 at 21:29





    $begingroup$
    The extrapolation from the Clausius inequality to the statement "universe has entropy and it cannot decrease" was made already by Clausius in his witty statement of the 2nd law, but I think it was unfortunate that other people then took it as a word of god. We do not have a way to experiment on the environment or the universe as a whole, such as make heat exchanges the way we do with a closed system in laboratory. So I believe the extrapolation is unwarranted.
    $endgroup$
    – Ján Lalinský
    Feb 1 at 21:29












    4












    $begingroup$

    My question is: Because entropy is a state function, we should be able to calculate $Delta S_surr$ along a reversible path in all of these non-limiting cases, such that $Delta S_surr$ is always zero. Why is this not done?



    I believe the short answer would be, we don’t care.



    The choice of the “system” in thermodynamics is completely arbitrary. We define it based on what it is we wish to study, and then focus on that. One person’s system can be another’s surroundings and vice versa. So there is no inherent reason for not calculating the entropy change of the surroundings.



    I like to use the following as a definition of thermodynamics (key terms bold faced):



    The study of energy transfer, between a system and its surroundings, in relation to the properties of substances.



    A key part of this definition is “the properties of substances”, because whatever it is that we wish to study, we need information on its properties. So if we want to change our focus from the currently defined system to its current surroundings, we then need to redefine the surroundings as the system and study the effects of energy transfers to and from it in relation to the properties of its substances. Without knowledge of the properties of the surroundings, we can only come to some general conclusions about the effects of energy transfers to and from it.



    For example with, regard to energy, we know that if a system does work on, or transfers heat to, the surroundings, there is an increase in the internal energy of the surroundings equal to the decrease in internal energy of the system. In other words, the first law applies equally to both the system and surroundings. However, without information on the properties of the surroundings we cannot determine the effect of the transfers on, say, the temperature of the surroundings.



    With regard to entropy, we know that if heat $Q$ is transferred isothermally from the surroundings to the system, there will be a decrease in entropy of the surroundings of



    $$Delta S_surr=-fracQT_surr$$



    And an increase in entropy of the system of



    $$Delta S_sys=+fracQT_sys$$



    For a total entropy change of



    $$Delta S_Tot= +fracQT_sys-fracQT_surr$$



    And therefore for all $T_surr>T_sys$, $Delta S_Tot>0$. $Delta S_Tot=0$ only for the reversible path where $T_surrto T_sys$. In this example we needed properties of the surroundings, namely its temperature and that it can be considered a thermal reservoir.



    Hope this helps.






    share|cite|improve this answer









    $endgroup$












    • $begingroup$
      > "we know that if heat Q is transferred isothermally" in general, it is sufficient that temperature of the system $T_sys~$ is defined at all times, then the formula above applies in differential sense.
      $endgroup$
      – Ján Lalinský
      Jan 31 at 21:22
















    4












    $begingroup$

    My question is: Because entropy is a state function, we should be able to calculate $Delta S_surr$ along a reversible path in all of these non-limiting cases, such that $Delta S_surr$ is always zero. Why is this not done?



    I believe the short answer would be, we don’t care.



    The choice of the “system” in thermodynamics is completely arbitrary. We define it based on what it is we wish to study, and then focus on that. One person’s system can be another’s surroundings and vice versa. So there is no inherent reason for not calculating the entropy change of the surroundings.



    I like to use the following as a definition of thermodynamics (key terms bold faced):



    The study of energy transfer, between a system and its surroundings, in relation to the properties of substances.



    A key part of this definition is “the properties of substances”, because whatever it is that we wish to study, we need information on its properties. So if we want to change our focus from the currently defined system to its current surroundings, we then need to redefine the surroundings as the system and study the effects of energy transfers to and from it in relation to the properties of its substances. Without knowledge of the properties of the surroundings, we can only come to some general conclusions about the effects of energy transfers to and from it.



    For example with, regard to energy, we know that if a system does work on, or transfers heat to, the surroundings, there is an increase in the internal energy of the surroundings equal to the decrease in internal energy of the system. In other words, the first law applies equally to both the system and surroundings. However, without information on the properties of the surroundings we cannot determine the effect of the transfers on, say, the temperature of the surroundings.



    With regard to entropy, we know that if heat $Q$ is transferred isothermally from the surroundings to the system, there will be a decrease in entropy of the surroundings of



    $$Delta S_surr=-fracQT_surr$$



    And an increase in entropy of the system of



    $$Delta S_sys=+fracQT_sys$$



    For a total entropy change of



    $$Delta S_Tot= +fracQT_sys-fracQT_surr$$



    And therefore for all $T_surr>T_sys$, $Delta S_Tot>0$. $Delta S_Tot=0$ only for the reversible path where $T_surrto T_sys$. In this example we needed properties of the surroundings, namely its temperature and that it can be considered a thermal reservoir.



    Hope this helps.






    share|cite|improve this answer









    $endgroup$












    • $begingroup$
      > "we know that if heat Q is transferred isothermally" in general, it is sufficient that temperature of the system $T_sys~$ is defined at all times, then the formula above applies in differential sense.
      $endgroup$
      – Ján Lalinský
      Jan 31 at 21:22














    4












    4








    4





    $begingroup$

    My question is: Because entropy is a state function, we should be able to calculate $Delta S_surr$ along a reversible path in all of these non-limiting cases, such that $Delta S_surr$ is always zero. Why is this not done?



    I believe the short answer would be, we don’t care.



    The choice of the “system” in thermodynamics is completely arbitrary. We define it based on what it is we wish to study, and then focus on that. One person’s system can be another’s surroundings and vice versa. So there is no inherent reason for not calculating the entropy change of the surroundings.



    I like to use the following as a definition of thermodynamics (key terms bold faced):



    The study of energy transfer, between a system and its surroundings, in relation to the properties of substances.



    A key part of this definition is “the properties of substances”, because whatever it is that we wish to study, we need information on its properties. So if we want to change our focus from the currently defined system to its current surroundings, we then need to redefine the surroundings as the system and study the effects of energy transfers to and from it in relation to the properties of its substances. Without knowledge of the properties of the surroundings, we can only come to some general conclusions about the effects of energy transfers to and from it.



    For example with, regard to energy, we know that if a system does work on, or transfers heat to, the surroundings, there is an increase in the internal energy of the surroundings equal to the decrease in internal energy of the system. In other words, the first law applies equally to both the system and surroundings. However, without information on the properties of the surroundings we cannot determine the effect of the transfers on, say, the temperature of the surroundings.



    With regard to entropy, we know that if heat $Q$ is transferred isothermally from the surroundings to the system, there will be a decrease in entropy of the surroundings of



    $$Delta S_surr=-fracQT_surr$$



    And an increase in entropy of the system of



    $$Delta S_sys=+fracQT_sys$$



    For a total entropy change of



    $$Delta S_Tot= +fracQT_sys-fracQT_surr$$



    And therefore for all $T_surr>T_sys$, $Delta S_Tot>0$. $Delta S_Tot=0$ only for the reversible path where $T_surrto T_sys$. In this example we needed properties of the surroundings, namely its temperature and that it can be considered a thermal reservoir.



    Hope this helps.






    share|cite|improve this answer









    $endgroup$



    My question is: Because entropy is a state function, we should be able to calculate $Delta S_surr$ along a reversible path in all of these non-limiting cases, such that $Delta S_surr$ is always zero. Why is this not done?



    I believe the short answer would be, we don’t care.



    The choice of the “system” in thermodynamics is completely arbitrary. We define it based on what it is we wish to study, and then focus on that. One person’s system can be another’s surroundings and vice versa. So there is no inherent reason for not calculating the entropy change of the surroundings.



    I like to use the following as a definition of thermodynamics (key terms bold faced):



    The study of energy transfer, between a system and its surroundings, in relation to the properties of substances.



    A key part of this definition is “the properties of substances”, because whatever it is that we wish to study, we need information on its properties. So if we want to change our focus from the currently defined system to its current surroundings, we then need to redefine the surroundings as the system and study the effects of energy transfers to and from it in relation to the properties of its substances. Without knowledge of the properties of the surroundings, we can only come to some general conclusions about the effects of energy transfers to and from it.



    For example with, regard to energy, we know that if a system does work on, or transfers heat to, the surroundings, there is an increase in the internal energy of the surroundings equal to the decrease in internal energy of the system. In other words, the first law applies equally to both the system and surroundings. However, without information on the properties of the surroundings we cannot determine the effect of the transfers on, say, the temperature of the surroundings.



    With regard to entropy, we know that if heat $Q$ is transferred isothermally from the surroundings to the system, there will be a decrease in entropy of the surroundings of



    $$Delta S_surr=-fracQT_surr$$



    And an increase in entropy of the system of



    $$Delta S_sys=+fracQT_sys$$



    For a total entropy change of



    $$Delta S_Tot= +fracQT_sys-fracQT_surr$$



    And therefore for all $T_surr>T_sys$, $Delta S_Tot>0$. $Delta S_Tot=0$ only for the reversible path where $T_surrto T_sys$. In this example we needed properties of the surroundings, namely its temperature and that it can be considered a thermal reservoir.



    Hope this helps.







    share|cite|improve this answer












    share|cite|improve this answer



    share|cite|improve this answer










    answered Jan 31 at 13:03









    Bob DBob D

    3,0002214




    3,0002214











    • $begingroup$
      > "we know that if heat Q is transferred isothermally" in general, it is sufficient that temperature of the system $T_sys~$ is defined at all times, then the formula above applies in differential sense.
      $endgroup$
      – Ján Lalinský
      Jan 31 at 21:22

















    • $begingroup$
      > "we know that if heat Q is transferred isothermally" in general, it is sufficient that temperature of the system $T_sys~$ is defined at all times, then the formula above applies in differential sense.
      $endgroup$
      – Ján Lalinský
      Jan 31 at 21:22
















    $begingroup$
    > "we know that if heat Q is transferred isothermally" in general, it is sufficient that temperature of the system $T_sys~$ is defined at all times, then the formula above applies in differential sense.
    $endgroup$
    – Ján Lalinský
    Jan 31 at 21:22





    $begingroup$
    > "we know that if heat Q is transferred isothermally" in general, it is sufficient that temperature of the system $T_sys~$ is defined at all times, then the formula above applies in differential sense.
    $endgroup$
    – Ján Lalinský
    Jan 31 at 21:22












    3












    $begingroup$


    Because entropy is a state function, we should be able to calculate $ΔS_surr$ along a reversible path in all of these non-limiting cases, such that $ΔS_total$ is always zero.




    Actually, whenever you're calculating the entropy change in a process you are using a reversible path, since $S$ is only defined for quasi-static reversible processes. So in an irreversible process you will calculate the entropy changes of the system and its surroundings assuming each one went in a separate quasi-static reversible process connecting the initial and final states. And if you add the entropy changes for each, it'll not equal zero.



    Let's look at an example: suppose body $(1)$ is our system with temperature $T^(1)$ and body $(2)$ is the 'surroundings' of the first with temperature $T^(2)$ such that $T^(1) < T^(2)$. Now we take the two bodies into contact with each other, allowing them to only exchange heat. We know from observation that heat will be transferred from body $(2)$ to body $(1)$ and not the other way arround, so this process is irreversible.



    How can we calculate the entropy change of the two bodies? We assume that body $(1)$ underwent a quasi-static reversible process from its temperature $T^{(1)}$ to the equilibrium temperature $T_{\text{eq}}$ and calculate the entropy change along that path. We do the same for body $(2)$: take a reversible process connecting $T^{(2)}$ and $T_{\text{eq}}$ and calculate the entropy change. The change in entropy of the composite system $(1) + (2)$ in an infinitesimal step will be
    $$
    dS = dS^{(1)} + dS^{(2)} = \frac{\delta Q^{(1)}}{T^{(1)}} + \frac{\delta Q^{(2)}}{T^{(2)}}
    $$

    but since energy is conserved, $\delta Q^{(2)} + \delta Q^{(1)} = 0$, so
    $$
    dS = \delta Q^{(1)} \left( \frac{1}{T^{(1)}} - \frac{1}{T^{(2)}} \right) > 0
    $$

    at every step, since $T^{(2)}$ remains greater than $T^{(1)}$ until $T_{\text{eq}}$ is reached and $\delta Q^{(1)}$ is positive. So integrating yields $\Delta S > 0$ for the whole process.
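
    To make the sum concrete, here is a small Python sketch with my own illustrative assumption that both bodies have finite, constant heat capacities; each entropy change is evaluated along its own reversible path, and the total is positive whenever the initial temperatures differ.

        import math

        # Each body's entropy change is taken along its own reversible path:
        # dS = C dT / T, so Delta S = C * ln(T_eq / T_initial).
        C1, T1 = 10.0, 300.0   # body (1), the system: heat capacity [J/K], initial T [K]
        C2, T2 = 10.0, 400.0   # body (2), the hotter 'surroundings'

        # Energy conservation (Q1 + Q2 = 0) fixes the common final temperature.
        T_eq = (C1 * T1 + C2 * T2) / (C1 + C2)

        dS1 = C1 * math.log(T_eq / T1)   # > 0: body (1) warms up
        dS2 = C2 * math.log(T_eq / T2)   # < 0: body (2) cools down

        print(T_eq, dS1, dS2, dS1 + dS2)  # the sum is strictly positive

    Bringing the two initial temperatures closer together drives the total towards zero, recovering the reversible limit.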






    answered Jan 31 at 13:35, edited Jan 31 at 17:11, by ErickShock







    • "assuming each one went in a separate quasi-static path". Separate is, I think, the key word which the OP was missing here. The fact that a given reversible path will bring the system to the correct final state does not imply that the same path will also bring the surroundings to the correct final state.
      – By Symmetry, Jan 31 at 14:07










    • It is a common error to think that only reversible or quasi-static processes can be used to calculate entropy change. Bridgman and Eckart showed nearly eighty years ago that irreversible entropy generation can be usefully calculated in completely irreversible processes without the need to postulate equilibrium endpoints and a reversible path connecting them.
      – hyportnex, Feb 2 at 13:46












    The issue is the entropy change of the surroundings. In thermodynamics, we typically treat the surroundings as an "ideal constant-temperature reservoir." Such a reservoir has the capacity to receive or deliver heat to a system without the reservoir temperature changing significantly (from the value $T_{surr}$). An example of such a reservoir would be an ice bath at $T_{surr}=0\ ^\circ\mathrm{C}$. So the enthalpy change of the reservoir is (perhaps) the result of a phase change at constant temperature: $\Delta H_{surr}=Q_{surr}$.



    In addition to this, we assume that all the entropy generation in the irreversible process takes place within the system (and none takes place in the ideal surroundings). As such, there are never any temperature gradients in the ideal surroundings reservoir, and it always presents the same temperature $T_{surr}$ to the system at their interface. Furthermore, since there are no temperature gradients within the reservoir, the amount of entropy generation in the reservoir during the irreversible process is zero. The only entropy change that takes place for the reservoir is due to the transfer of entropy through its interface with the system: $$\Delta S_{surr}=\frac{Q_{surr}}{T_{surr}}\tag{1}$$ This equation applies irrespective of whether the process path is reversible or irreversible.



    The other gap in your understanding is in assuming that the reversible path used to calculate the change in entropy for the irreversible process has to be the same reversible path for both the system and the surroundings. Since the irreversible process results in an increase in entropy, there is no reversible path for both the system and surroundings (in combination) that will yield the same final state for both, since, as you indicated, all combined reversible paths yield zero change in entropy. To correctly get the change in entropy for the system and the surroundings, we must completely separate them from one another and devise a separate reversible path for each of them, taking each from the initial state to the final state that it experienced in the irreversible process. In the case of the surroundings, to end up in the same final state as in the irreversible process, it must receive the same amount of heat in its reversible path as it did in the irreversible path: $Q_{surr,rev}=-Q_{sys,irrev}$. So the entropy change of the surroundings is $$\Delta S_{surr}=-\frac{Q_{sys,irrev}}{T_{surr}}$$ and the change in entropy for the system plus surroundings in the irreversible process is then
    $$\Delta S_{irrev}=\int\frac{dQ_{rev,sys}}{T_{rev,sys}}-\frac{Q_{sys,irrev}}{T_{surr}}$$
    For more on separating system from surroundings (and even separating different parts of the same system) in devising reversible paths, see the following reference: https://www.physicsforums.com/insights/grandpa-chets-entropy-recipe/
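
    As a worked illustration of this recipe, here is a short Python sketch with hypothetical numbers of my own (a copper block quenched in an ice bath; none of these values come from the answer): the system's entropy change is computed along its own reversible cooling path, the reservoir's from Eq. (1), and the sum comes out positive, as expected for an irreversible process.

        import math

        m, c = 1.0, 385.0            # block mass [kg] and specific heat [J/(kg K)]
        T_i, T_surr = 373.0, 273.0   # initial block temperature; reservoir temperature

        # System: separate reversible cooling path from T_i to T_surr, dS = m c dT / T.
        dS_sys = m * c * math.log(T_surr / T_i)       # negative: the block cools

        # Surroundings: ideal reservoir, Eq. (1), using the heat it actually received.
        Q_surr = m * c * (T_i - T_surr)               # heat given up by the block, > 0
        dS_surr = Q_surr / T_surr

        print(dS_sys, dS_surr, dS_sys + dS_surr)      # total entropy change > 0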






    answered Jan 31 at 13:41, edited Jan 31 at 15:27, by Chester Miller


























