*Entropy is dynamic - something which static scenes don't reflect.*

Thermodynamics deals with the relation between that small part of the universe in which we are interested - the system - and the rest of the universe - the surroundings. Thermodynamic properties depend on the current state of the system but not on its previous history, and they are either extensive - their values depend on the amount of substance comprising the system, eg volume - or intensive - their values are independent of the amount of substance making up the system, eg temperature and pressure. Entropy is a thermodynamic property, like temperature, pressure and volume but, unlike them, it cannot easily be visualised.

The concept of entropy emerged from the mid-19th century discussion of the efficiency of heat engines. Generations of students struggled with Carnot's cycle and various types of expansion of ideal and real gases, and never really understood why they were doing so. Many earlier textbooks took the approach of defining a change in entropy, ΔS, via the equation

ΔS = Q/T    (i)

where Q is the quantity of heat supplied and T the thermodynamic temperature.
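To see equation (i) at work, consider a familiar case: melting one mole of ice at 273 K absorbs about 6.0 kJ of heat, so the entropy change is roughly ΔS = 6000 J mol⁻¹ / 273 K ≈ 22 J K⁻¹ mol⁻¹.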
However, it is more common today to find entropy explained in terms of the degree of disorder in the system, and to define the entropy change, ΔS, by an equation, (ii), that relates it to that degree of disorder. This more modern approach has two disadvantages. First, the units of entropy are joules per kelvin, but the degree of disorder has no units. Secondly, equation (ii), defining the entropy change, does not recognise that the system has to be at equilibrium for it to be valid. We prefer to consider that the entropy of a system corresponds to the distribution of its molecular energy among the available energy levels, and that systems tend to adopt the broadest possible distribution.

Alongside this it is important to bear in mind the three laws of thermodynamics. The first law deals with the conservation of energy, the second law is concerned with the direction in which spontaneous reactions go, and the third law establishes that at absolute zero all pure substances have the same entropy, which is taken, by convention, to be zero. In classical terms, a system at absolute zero has no energy and its atoms or molecules would be close packed together. As energy, in any form, is supplied to a system, its molecules begin to rotate, vibrate and translate, which is observable as a rise in temperature. The rise in temperature caused by a given quantity of heat differs from substance to substance and depends on the heat capacity of the substance. Since different substances have different heat capacities, and because some compounds will have melted or vaporised by the time they reach their standard states at 298 K, their standard entropies will be different. For example, the standard entropy of graphite is 6 J K⁻¹ mol⁻¹, whereas that of water is 70 J K⁻¹ mol⁻¹ and that of nitrogen is 192 J K⁻¹ mol⁻¹.

To obtain some idea of what entropy is, it is helpful to imagine what happens when a small quantity of energy is supplied to a very small system. Suppose there are 20 units of energy to be supplied to a system of just 10 identical particles. The average energy is easily calculated to be two units per particle, but this average describes an actual state of the system only if every particle really does have two units of energy. If all the energy were concentrated in one particle, any one of the 10 particles could take it up, so this state of the system is 10 times more probable than every particle having two units each. If the energy is shared equally between two particles, half of it could be given to any one of the 10 particles and the other half to any one of the remaining nine, giving 90 ways in which this state of the system could be achieved. However, since the particles are indistinguishable, there would be only 45 distinct ways in which this state can be achieved. Sharing the energy equally between four particles increases the number of possible ways the energy can be distributed to 210. This kind of calculation quickly becomes very tedious, even for a very small system of only 10 particles, and completely impracticable for systems where the number of particles approaches the number of atoms or molecules in even a very small fraction of one mole.
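These counts are quick to reproduce by machine. The following Python sketch (added here as an illustration; the particle and unit numbers are those of the example above) counts the ways of choosing which particles carry the energy in each case, and then the total number of ways of sharing out all 20 units:

```python
from math import comb

PARTICLES = 10   # identical particles in the toy system
UNITS = 20       # indivisible units of energy to share out

# All 20 units on one particle: choose which particle carries them.
print(comb(PARTICLES, 1))    # 10

# 10 units each on two particles: 10 * 9 = 90 ordered choices, but each
# pair is counted twice that way, leaving 45 distinct arrangements.
print(comb(PARTICLES, 2))    # 45

# 5 units each on four particles: choose the four carriers.
print(comb(PARTICLES, 4))    # 210

# Total number of distinct ways to distribute all 20 units among the
# 10 particles ("stars and bars"): C(UNITS + PARTICLES - 1, PARTICLES - 1).
print(comb(UNITS + PARTICLES - 1, PARTICLES - 1))    # 10015005
```

Even this toy system admits over ten million distinct distributions in total, which is exactly the tedium referred to above; for anything approaching a mole of particles the count becomes astronomically large.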