Do Vitamins Really Improve Health?

Vitamin supplements are everywhere, marketed on the claim that a normal diet doesn't provide all the nutrition we need, and that supplementing with vitamins will make us healthier and prolong our lives. But is this actually true? And could we be doing more harm than good by swallowing all these extracts each day?

Published on 11/17/2013, 8:00 AM EST