Back in college we had a long talk in English 101 about the symbolism associated with the natural world. We talked about how the sun is traditionally presented as male, and the earth as female; we talked about the seasons, and how autumn was the season of death and decay and endings, while spring was the season of life, birth, and rebirth...
And I... just... no. No. I realize this places me completely at odds with most of Western Culture™, but that is simply wrong. Autumn is cool and clean, the season when I feel awake and alive and focused. The relentless heat of summer is finally letting go, the nights are getting longer, and I have my energy back.
Spring is the season of death. Spring is the time when I cannot breathe, when I walk around feeling as if someone has driven a nail into my forehead, when, basically, the air itself is trying to kill me.
Western Civilization has it backwards.