Nature and Health

A growing body of research documents the health benefits of nature, suggesting that access to it may be vital to our health and well-being. Integrative medicine is leading healthcare’s shift toward recognizing nature’s healing properties and the central role that all healthcare providers can play in facilitating nature-based interventions and community-based collaboration as a counterpoint to urban living.