from Harvard Health Blog, 3 years ago

Plant-based diets have taken root in American culture in recent years, thanks largely to a growing awareness of the health benefits of this eating pattern.

Read at Harvard Health Blog