How Nature Benefits Human Health


Understanding that we’re part of nature, and acting on that understanding, makes us healthier and happier and encourages us to care for the natural systems around us. A growing body of science confirms this, including two recent studies that explore the ways nature benefits human health.

From our very good friends over at EcoWatch.
