Gardens not only make the world a more beautiful place; they're also good for you. In fact, plants make life healthier. Researchers have found that people who spend time outdoors basking in the beauty of plants — in gardens, parks and other green spaces — tend to be more active, which leads to improved physical health, reduced stress and lower health care costs.