No one laughs at me more than me, partially because I’m the only one who thinks I’m funny. Plus, I consider myself little more than a well-trained talking monkey (ape, for you purists). I often have moments where I visualize chimpanzee-me running my errands or doing my chores, a mental image which makes it difficult to take anything I do seriously.
Chimpanzees (and bonobos, genetically as close to humans as chimps are) share a common ancestor with us and evolved within social contexts similar to those humans experienced. The prevalent Social Brain hypothesis proposes that the evolution of our “large” brains was driven largely by the complexity of functioning within these social environments.
A simple example of the types of complex social decisions humans deal with is the famous Prisoner’s Dilemma:
If you confess and your accomplice remains silent I will drop all charges against you and use your testimony to ensure that your accomplice does serious time. Likewise, if your accomplice confesses while you remain silent, they will go free while you do the time. If you both confess I get two convictions, but I’ll see to it that you both get early parole. If you both remain silent, I’ll have to settle for token sentences on firearms possession charges.
In the Prisoner’s Dilemma, the “safest” bet is to confess; remaining silent is more beneficial only if the other prisoner also remains silent. In this scenario, a cost/reward calculation mixes inextricably with social contexts (do you really trust the other prisoner?) to create a “no right answer” situation.
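That cost/reward calculation can be made concrete. Here's a minimal sketch of the dilemma in Python; the specific payoff numbers (years of freedom, out of five) are my own illustrative assumptions, since only their ordering matters:

```python
# Prisoner's Dilemma with illustrative payoffs (years of freedom
# gained, 0-5). The exact values are assumptions; what makes it a
# dilemma is the ordering: free > token sentence > parole > serious time.
PAYOFFS = {
    # (my_move, their_move): my payoff
    ("confess", "silent"):  5,   # I go free on their testimony
    ("silent",  "confess"): 0,   # I do serious time
    ("confess", "confess"): 2,   # both convicted, early parole
    ("silent",  "silent"):  4,   # token firearms sentences
}

def best_response(their_move):
    """My payoff-maximizing move, given the other prisoner's move."""
    return max(("confess", "silent"),
               key=lambda mine: PAYOFFS[(mine, their_move)])

# Confessing is the "safest" bet: it's the best response no matter
# what the other prisoner does...
assert best_response("confess") == "confess"
assert best_response("silent") == "confess"

# ...and yet mutual silence beats mutual confession for both players,
# which is why trust makes this a "no right answer" situation.
assert PAYOFFS[("silent", "silent")] > PAYOFFS[("confess", "confess")]
```

The assertions capture the bind: each prisoner individually does better by confessing, but both confessing leaves them worse off than mutual silence would have.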
We face complex social calculations all the time, though we would seldom use the term “calculation” to describe our interactions. Naturally, creatures with larger brains (with specialized regions, such as the orbitofrontal cortex, for handling social calculations) would manage these interactions more effectively and win more of the attention of socially-conscious mates.
The results of this evolution include common (often irrational) behaviors which promote cooperation and punish those who threaten social balance, such as Altruistic Punishment:
You give a group of people some money–$20, say–and a set of rules. Players can contribute any amount to a common pool with the promise of a modest return, or they can “free ride,” pocketing their initial stake plus a share in the group profits. If all of the players cooperate fully, everyone comes out ahead. But if one player acts selfishly, he’ll do even better. You don’t have to be a psychologist to guess how soon the whole system breaks down. Allow honest players to punish cheaters with a fine, however, and most will jump at the chance–even if doing so costs a significant portion of their profits.
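The dynamics of that experiment are easy to sketch. The $20 stake comes from the description above; the pool multiplier (1.6), the size of the fine ($8 per punisher), and the cost of punishing ($2) are illustrative assumptions I've chosen, not the actual experimental parameters:

```python
# Toy model of the public-goods game described above. The $20 stake
# is from the quote; MULTIPLIER, FINE, and COST are assumed values
# chosen only to illustrate the incentive structure.
STAKE, MULTIPLIER = 20.0, 1.6
FINE, COST = 8.0, 2.0    # fine levied per punisher; cost to punish

def payoffs(contributions):
    """Each player's earnings: what they kept plus an equal share
    of the multiplied common pool."""
    pool = sum(contributions) * MULTIPLIER
    share = pool / len(contributions)
    return [STAKE - c + share for c in contributions]

# If all four players cooperate fully, everyone comes out ahead...
all_in = payoffs([20, 20, 20, 20])
assert all(p > STAKE for p in all_in)

# ...but a lone free rider does even better than the cooperators,
# which is why the system breaks down without enforcement.
mixed = payoffs([20, 20, 20, 0])
assert mixed[3] > mixed[0]

# Altruistic punishment: the three cooperators each pay $2 to fine
# the free rider $8 apiece, erasing the free rider's advantage even
# though punishing cuts into their own profits.
punished = [p - COST for p in mixed[:3]] + [mixed[3] - 3 * FINE]
assert punished[3] < punished[0]
```

The last assertion is the “irrational” part: punishing costs the honest players real money for no direct gain, yet it's exactly what keeps free riding from paying off.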
In a modern context, I consider the “Little Brother” nature of the internet something of an amplifier for these built-in behaviors. I’m not convinced there’s anything new going on, but there is definitely more of it.