Entropy, my nemesis, is actually sort of a confusing concept.  My favorite re-statement of the three (four) laws of thermodynamics is:

0. Everyone must play. (Systems that interact reach thermal equilibrium.)
1. You can’t win. (Energy cannot be created or destroyed.)
2. You can’t break even. (Work done increases heat and heat cannot be converted completely into work.)
3. You can’t leave the game. (Entropy approaches a constant minimum as temperature approaches absolute zero, but absolute zero itself is unreachable.)

Entropy is a measure of the number of ways a system can be arranged (formally, it’s proportional to the logarithm of that count).  The higher it is, the more “disorder” a system has.  You can’t reduce this disorder overall, because forcing the system back into a smaller number of possible arrangements takes work, and that work dissipates heat into the surroundings (which is itself an increase in entropy).
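To make “counting arrangements” concrete, here’s a minimal sketch (my own illustration, not from the post) using Boltzmann’s formula, S = k·ln(W), where W is the number of microstates consistent with what we observe. The toy system of coin flips is my own choice of example:

```python
import math

# Boltzmann's formula: S = k_B * ln(W), where W is the number of
# microstates (distinct arrangements) consistent with a macrostate.
# Working in units where k_B = 1, so entropy is just ln(W).

def microstates(n_coins: int, n_heads: int) -> int:
    """Number of distinct arrangements of n_coins with exactly n_heads heads."""
    return math.comb(n_coins, n_heads)

def entropy(n_coins: int, n_heads: int) -> float:
    """Entropy (in units of k_B) of the macrostate 'n_heads heads'."""
    return math.log(microstates(n_coins, n_heads))

# A macrostate near 50/50 has vastly more arrangements -- higher entropy --
# than the perfectly ordered all-heads state, which has exactly one
# arrangement and therefore entropy ln(1) = 0.
print(entropy(100, 50))   # disordered macrostate: high entropy
print(entropy(100, 100))  # ordered macrostate: entropy is exactly 0
```

The point of the toy model: “disorder” just means “many arrangements look the same from the outside,” and the ordered state is rare not because it is forbidden but because it corresponds to so few arrangements.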

Entropy tending to increase has two major physical consequences: it forbids perpetual motion (no process that extracts work is completely reversible) and it gives time a direction (time’s arrow points the way entropy increases).

Alright, with those definitions out of the way, I had an intriguing thought: over time, relationships between people become more complicated.  Logic tells me that this is because of shared history – as the number of experiences you have with a person increases, your expectations and internal analysis of their behavior become more refined.  When I first meet a person, the number of things we have to talk about is small.  When I have known a person for years, the number is much larger.

Is the number of potential (plausible? likely?) interactions between people in some sense related to the entropy of a nonliving system?  That is, is the number of “possible person-to-person interactions” analogous to the number of “possible arrangements of a system”?  And if so, can we formulate Laws of Human Dynamics that are extensions of the Laws of Thermodynamics?
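One toy way to probe the analogy (entirely my own construction, and every name in it is hypothetical): treat each shared experience as something a conversation may or may not draw on. The number of possible “conversation states” then grows exponentially with shared history, and its logarithm, the entropy analogue, grows linearly:

```python
import math

# Toy model, not a claim from the post: if two people share n experiences,
# and any subset of those experiences could come up in a conversation, the
# number of possible "conversation states" is 2**n -- the same kind of
# state-counting used for microstates in thermodynamics.

def conversation_states(n_shared_experiences: int) -> int:
    """Number of distinct subsets of shared experiences a conversation could draw on."""
    return 2 ** n_shared_experiences

def relationship_entropy(n_shared_experiences: int) -> float:
    """Log of the state count -- the analogue of S = ln(W)."""
    return math.log(conversation_states(n_shared_experiences))

# Strangers (few shared experiences) vs. old friends (many): the raw state
# count explodes, while the entropy analogue climbs steadily.
for n in (1, 10, 100):
    print(n, relationship_entropy(n))
```

Whether subsets of shared experiences are the right “microstates” for a relationship is exactly the open question here; the sketch only shows that if they are, the state count behaves the way entropy does.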

I expect that a more formal revisit of this topic (by which I mean, more logically structured rather than stream of consciousness) is necessary.