This Is What Nature Does To Your Body
Scientists have found that just being around nature can benefit our health in major ways. What exactly does nature do to our bodies?
Psychologists have long known that nature can have a beneficial effect on the mind. Multiple studies have shown that regular exposure to green spaces – a long walk in the woods, say – can reduce stress, counter depression and anxiety, and improve mood and cognition.
But there's also a growing body of evidence suggesting that natural surroundings can directly improve physical health in significant and oddly specific ways. Green spaces have been shown to speed patient recovery from major surgeries and illnesses, and in some cases, exposure to nature appears to lower the risk of coronary heart disease and stroke. The moral of the story? Go take a hike. Jules Suzdaltsev has the details in today's DNews report.
Huffington Post: The Unlikely Health Benefits Of Walking In The Woods
Europe PMC: A Review of the Health Benefits of Greenness