Two blocks of the same metal, having the same mass and at temperatures ${T_1}$ and ${T_2}$, respectively, are brought into contact with each other and allowed to attain thermal equilibrium at constant pressure. The change in entropy, $\Delta S$, for this process is:
A. $2{C_p}\ln \left( {\dfrac{{{T_1} + {T_2}}}{{4{T_1}{T_2}}}} \right)$
B. $2{C_p}\ln \left( {\dfrac{{{{({T_1} + {T_2})}^{\dfrac{1}{2}}}}}{{{T_1}{T_2}}}} \right)$
C. ${C_p}\ln \left( {\dfrac{{{{({T_1} + {T_2})}^2}}}{{4{T_1}{T_2}}}} \right)$
D. $2{C_p}\ln \left( {\dfrac{{{T_1} + {T_2}}}{{2{T_1}{T_2}}}} \right)$

Answer
Hint: Entropy is a state function, so the change in entropy of a system is determined only by its initial and final states. In the idealization of a reversible process the total entropy does not change, while irreversible processes always increase the total entropy. Entropy can thus be thought of as a measure of the randomness of a system.

Complete step by step answer:
According to the question, the two blocks are placed in contact with each other. As there is a difference in temperature between the two blocks, heat will be transferred from the higher-temperature block to the lower-temperature block until thermal equilibrium is reached. Thus, the final temperature (${T_f}$) of the system will be:
${T_f} = \dfrac{{{T_1} + {T_2}}}{2}$
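To see why this is simply the arithmetic mean: the blocks have equal masses and, being made of the same metal, equal heat capacities, so the heat lost by the hotter block equals the heat gained by the colder one, $mc\left( {{T_1} - {T_f}} \right) = mc\left( {{T_f} - {T_2}} \right)$ (taking ${T_1} > {T_2}$), which on solving gives ${T_f} = \dfrac{{{T_1} + {T_2}}}{2}$.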
Now, as we know from the definition of the change in entropy,
$\Delta {S_{system}} = \int {\dfrac{{d{q_{rev}}}}{T}} = n{C_p}\int {\dfrac{{dT}}{T}} $
Where, $d{q_{rev}} = $ infinitesimal amount of heat exchanged reversibly
${C_p} = $ molar heat capacity at constant pressure
$dT = $ infinitesimal change in temperature
Thus, the change in entropy of the system for the first block will be given as:
$\Delta {S_1} = n{C_p}\int\limits_{{T_1}}^{{T_f}} {\dfrac{{dT}}{T}} = n{C_p}\ln \dfrac{{{T_f}}}{{{T_1}}}$
Similarly, the change in entropy of the system for the second block will be given as:
$\Delta {S_2} = n{C_p}\int\limits_{{T_2}}^{{T_f}} {\dfrac{{dT}}{T}} = n{C_p}\ln \dfrac{{{T_f}}}{{{T_2}}}$
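As an aside (not part of the original solution), here is a minimal numerical sketch using hypothetical values ($n = 1$ mol, ${C_p} = 25{\text{ J mo}}{{\text{l}}^{ - 1}}{{\text{K}}^{ - 1}}$, ${T_1} = 300{\text{ K}}$, ${T_2} = 400{\text{ K}}$): it approximates the integral $\int {\dfrac{{dT}}{T}}$ with a Riemann sum and checks that it matches the closed forms above.

```python
import math

# Hypothetical values, not given in the problem:
n, Cp = 1.0, 25.0          # moles and molar heat capacity, J/(mol*K)
T1, T2 = 300.0, 400.0      # initial temperatures of the two blocks, K
Tf = (T1 + T2) / 2         # final common temperature, K

def dS_numeric(T_start, T_end, steps=100_000):
    """Approximate n*Cp * integral of dT/T from T_start to T_end (midpoint rule)."""
    dT = (T_end - T_start) / steps
    return n * Cp * sum(dT / (T_start + (i + 0.5) * dT) for i in range(steps))

# Numerical integration vs the closed forms n*Cp*ln(Tf/T1) and n*Cp*ln(Tf/T2)
print(dS_numeric(T1, Tf), n * Cp * math.log(Tf / T1))  # block 1: values agree
print(dS_numeric(T2, Tf), n * Cp * math.log(Tf / T2))  # block 2: values agree
```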
Now, the total change in entropy is the sum of the entropy changes of the individual blocks, which is given by:
$\Delta {S_1} + \Delta {S_2} = n{C_p}\ln \dfrac{{T_f^2}}{{{T_1}{T_2}}}$
But, we know that, ${T_f} = \dfrac{{{T_1} + {T_2}}}{2}$
Thus, substituting this value of ${T_f}$ into the above equation, we obtain the total change in entropy of the combined system as:
$\Delta {S_1} + \Delta {S_2} = n{C_p}\ln \dfrac{{{{\left( {\dfrac{{\left( {{T_1} + {T_2}} \right)}}{2}} \right)}^2}}}{{{T_1}{T_2}}}$
Thus, the total change in entropy of the system is $\Delta S = n{C_p}\ln \left( {\dfrac{{{{({T_1} + {T_2})}^2}}}{{4{T_1}{T_2}}}} \right)$
For $n = 1$, the change in entropy becomes $\Delta S = {C_p}\ln \left( {\dfrac{{{{({T_1} + {T_2})}^2}}}{{4{T_1}{T_2}}}} \right)$

So, the correct answer is Option C.
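For completeness, a short check with the same hypothetical numbers as above confirms that the sum of the two per-block terms equals the Option C expression and is positive, as expected for an irreversible heat exchange:

```python
import math

Cp = 25.0                  # hypothetical molar heat capacity, J/(mol*K), with n = 1
T1, T2 = 300.0, 400.0      # hypothetical initial temperatures, K
Tf = (T1 + T2) / 2

dS_total = Cp * math.log(Tf / T1) + Cp * math.log(Tf / T2)
dS_option_C = Cp * math.log((T1 + T2) ** 2 / (4 * T1 * T2))

print(dS_total, dS_option_C)   # the two expressions give the same value
print(dS_total > 0)            # True: the total entropy increases
```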

Note:
Because it is determined by the number of accessible microstates, entropy is related to the amount of additional information needed to specify the exact physical state of a system, given its macroscopic specification. For this reason, it is often said that entropy is an expression of the disorder, or randomness, of a system.