The discovery of the various vitamins – essential micronutrients whose deficiency causes disease – was one of the great advances of modern scientific medicine. This knowledge also led to several highly successful public health campaigns, such as vitamin D supplementation to prevent rickets.
Today vitamins have a deserved reputation for being an important part of overall health. However, their reputation has outgrown the science and taken on almost mythical proportions. Perhaps this is due to aggressive marketing from the supplement industry, or perhaps recent generations have grown up being told by their parents, thousands of times, how important it is to take their vitamins or eat vitamin-rich food. Culture also plays a role – Popeye eating spinach to make himself super strong is one example of this pervasive message.
Regardless of the cause, the general feeling is that vitamins are all good – they are not only important for health, they actively promote health. Many people take vitamin supplements in the belief that more is better, or as nutritional “insurance” to make sure they are getting enough of every vitamin.