Anyone who eats ice cream, eggs, mango or sweet potato is getting some of their necessary intake of Vitamin A. Even pumpkin pie contains Vitamin A. Because Vitamin A is found in many different foods, either naturally or through fortification, the consequences of not getting enough are rarely a topic of discussion.
According to the World Health Organization, Vitamin A deficiency is a leading cause of preventable blindness in children and increases the risk of death from common childhood infections, such as diarrheal disease and measles; it remains a public health problem in more than half of all countries. In pregnant women, Vitamin A deficiency also heightens the risk of night blindness and maternal mortality.
One hundred years ago, Elmer Verner McCollum co-authored a revolutionary paper with Marguerite Davis on the discovery of what he called “Factor A,” now known as Vitamin A. Though Vitamin A was discovered in 1913, the concept of the vitamin had been taking shape for decades.
According to an article written by Kenneth J. Carpenter of the Department of Nutritional Sciences at the University of California-Berkeley, Thomas Christie, a physician working in Sri Lanka in 1804, noted that beriberi, a disease that affects the nervous system, must be caused by something that was lacking from the diet. Because citrus fruits, which are used in treating scurvy, had no effect on beriberi, he supposed that there must be some other nutritional compound involved.
Other similar observations, in various parts of the world, were made throughout the 19th century, though not much came of any of them until the spark was ignited in Madison, Wis., at the beginning of the 20th century.
In 1907, McCollum became involved in exploring the nutritional value of livestock feed when he was recruited by E. B. Hart, a recently appointed professor of agricultural chemistry at UW-Madison. Hart’s “single-grain experiment,” conducted from 1907 to 1911, was designed to compare the health and performance of four groups of heifers—young cows that had not yet given birth to their first calf. The groups were fed corn, wheat, oats or a mixture of the three. The goal was to determine the best feed for livestock, as well as to identify the unknown substance that was key to long-term nutritional health.
Of the four groups, the corn-fed cows fared the best by far: they had smooth coats, were full in the chest and appeared healthy, while the cows in the other groups had rough coats and appeared emaciated.
The most notable difference among the groups was in their reproductive performance. Calves born to corn-fed heifers were strong and all survived. The calves born to those in the other groups either died shortly after birth or were incredibly weak.
This study proved that the nutritional value of a food could not be determined simply from measurements of its digestible nutrients, and that the amount of protein alone did not account for the differences in performance.
At a minisymposium in 1996, UW-Madison professor of biochemistry and nutritional sciences Alfred E. Harper noted that Hart and his colleagues had stated at the beginning of the century, “We have no adequate explanation of our results.”
Although there was no explanation for the outcome at the time, it led McCollum to perform further experiments to isolate whatever it was that was lacking from most feed rations.
Along with Davis, McCollum went on to feed rats an assortment of diets containing varying percentages of casein, lard, lactose, minerals and starch.
They found that rats fed these mixes would grow well initially, but the growth would then come to a halt. Neither olive oil nor cottonseed oil had the growth-boosting effect; only if egg or butterfat was added would growth resume.
When butterfat was treated with a base, a process called saponification, and the resulting mixture of water-soluble and fat-soluble components was shaken with olive oil, the olive oil became capable of restarting growth. Because the previously inert olive oil could now promote growth like butterfat, Davis and McCollum concluded the active agent was fat soluble. This was soon confirmed by a group in Connecticut that reported the high effectiveness of cod liver oil.
From these results, Davis and McCollum identified “Factor A,” later known as Vitamin A. As Davis and McCollum demonstrated in rats, deficiency of “Factor A” resulted in severe ophthalmia, or inflammation of the eye, as well as night blindness and xerophthalmia, a condition in which the eye fails to produce tears.
The word vitamin was derived from “vitamine,” a term coined by the Polish biochemist Casimir Funk. The “e” was dropped once it was realized that not all of the necessary nutritional factors were amines.
The idea of a necessary nutritional substance emerged in the early 19th century. At the beginning of the 20th century, that notion grew into what became known as the “single-grain experiment,” whose results led Davis and McCollum to test separated components of butterfat and their effects on rats. This was done primarily in the interest of developing a well-balanced feed for livestock, yet the implications for the nutritional health of both animals and humans turned out to be more significant than anyone at the time could have predicted.
An achievement such as this, especially on the 100th anniversary of the discovery, should be highlighted and made known to those who are proud to be part of the legacy that is UW-Madison.
David Nelson, a professor in the biochemistry department at UW-Madison, has made it very apparent how much goes into an endeavor such as the nutritional breakthrough of Vitamin A. The hard work and dedication of McCollum and Davis opened the road to identifying all of the vitamins known today.