21 Aug Iron Awareness Week
Bioavailability of dietary iron varies among different foods, but this difference has not always been accounted for in dietary studies. Based on our DELTA Model® 2.1, a nutrient gap for iron emerges by 2040 if current food production is maintained, underscoring the importance of addressing iron shortfalls in our diets, especially in populations with higher requirements.
Dietary Iron and its Bioavailability
The bioavailability of iron refers to the proportion of iron within a food or supplement that can be absorbed and used by the body. In a previous Thought for Food, we explained how different iron compounds, such as haem and non-haem iron, can have different bioavailabilities.
Different dietary patterns can have a significant impact on iron absorption and, consequently, on body iron levels. For example, in a mixed diet of meat and vegetables, the presence of meat, fish or poultry, which are good sources of highly bioavailable iron, counteracts the poor absorption of non-haem iron from vegetables. Conversely, in restrictive plant-based diets, the absence of highly bioavailable iron and the presence of iron absorption inhibitors result in poorer iron uptake, making it harder to compensate for monthly iron losses. When iron deficiency is severe, it can manifest as iron deficiency anaemia (IDA), a condition most prevalent among women of reproductive age.
A comprehensive study, based on levels of iron stores in the body, suggests that iron deficiency is common among vegetarian females across both developed and developing countries, while results for vegans were more inconsistent. A common way of assessing iron status is by measuring the concentration of ferritin in blood serum (known as serum ferritin (SF)). However, SF levels may be raised by infection or disease, giving a false indication of iron status. Therefore, using more than one method to assess iron status is important.
To improve estimations of iron adequacy from different dietary patterns, it is important to use diet-specific calculations. Since iron bioavailability differs between animal-sourced and plant-sourced foods, applying iron absorption measurements from omnivorous diets to vegan and vegetarian diets overestimates iron bioavailability from plant-based diets. A recent study demonstrated the importance of considering absorption enhancers and inhibitors when calculating iron uptake from meals: estimated iron absorption was lower when the effects of enhancers and inhibitors were included in the calculations. This is particularly important because the levels of enhancers and inhibitors differ between omnivorous and plant-based diets.
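To make the idea of a diet-specific calculation concrete, the sketch below estimates absorbed iron for a single meal. The absorption fractions and the enhancer/inhibitor adjustment factors are hypothetical placeholders chosen for illustration only, not values from the studies cited above.

```python
# Minimal sketch of a diet-specific iron absorption estimate.
# All fractions and adjustment factors are hypothetical placeholders
# for illustration, not values from any cited study.

def estimate_absorbed_iron(
    haem_mg: float,
    non_haem_mg: float,
    haem_absorption: float = 0.25,      # assumed baseline fraction for haem iron
    non_haem_absorption: float = 0.10,  # assumed baseline fraction for non-haem iron
    enhancer_factor: float = 1.0,       # >1 for enhancers such as vitamin C
    inhibitor_factor: float = 1.0,      # <1 for inhibitors such as phytates
) -> float:
    """Return estimated absorbed iron (mg) for a single meal.

    Enhancers and inhibitors mainly affect non-haem iron, so the
    adjustment factors are applied to that fraction only.
    """
    absorbed_haem = haem_mg * haem_absorption
    absorbed_non_haem = (
        non_haem_mg * non_haem_absorption * enhancer_factor * inhibitor_factor
    )
    return absorbed_haem + absorbed_non_haem


# An omnivorous meal vs. a plant-based meal high in phytates:
omnivore = estimate_absorbed_iron(haem_mg=1.0, non_haem_mg=3.0)
plant_based = estimate_absorbed_iron(
    haem_mg=0.0, non_haem_mg=4.0, inhibitor_factor=0.5
)
print(f"Omnivorous meal:  ~{omnivore:.2f} mg absorbed")
print(f"Plant-based meal: ~{plant_based:.2f} mg absorbed")
```

Even with identical total iron content, the estimate for the plant-based meal falls well below that for the omnivorous meal once the inhibitor factor is applied, which is why applying omnivorous absorption figures to plant-based diets overstates iron adequacy.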
To overcome insufficient dietary iron supply, two strategies are predominantly used: fortification and supplementation. The benefits and limitations of each are discussed below.
Iron Fortification Measures
Fortification is the practice of adding one or more nutrients to foodstuffs in order to correct or prevent nutrient deficiencies. Fortification programmes have helped overcome many nationwide micronutrient deficiencies over the years, and iron fortification strategies have played an important role in addressing deficiencies and reducing IDA in many countries around the world. As of 2020, iron enrichment of wheat flour is mandatory in 81 countries, including low- and middle-income countries.
One of the important factors affecting the effectiveness of iron fortification is the type of iron compound used. Several iron compounds could be used for food fortification, but only a few are considered suitable, based on their bioavailability, cost and interaction with the fortification vehicle. In other words, the ideal iron compounds for fortification cause no changes in taste, colour or texture of the original food item while providing adequate bioavailability at minimum cost. Ferrous sulphate, ferrous fumarate and ferrous gluconate are among the few iron compounds commonly used for fortification.
The relative absorption of iron depends on the ability of the iron compound to dissolve in the stomach. Water-soluble iron compounds such as ferrous sulphate are generally more bioavailable than water-insoluble ones. However, water-soluble iron causes more adverse changes to the taste of the fortified food, potentially putting off consumers. Ferrous sulphate, for example, is known for causing colour changes and an undesirable metallic taste, especially in liquid foods. In contrast, ferrous fumarate, a water-insoluble compound, has less impact on taste. Although poorly soluble in water, ferrous fumarate dissolves readily in the acidic environment of the stomach and hence provides iron bioavailability as high as that of ferrous sulphate.
Fortification can also play a key role in facilitating the transition towards more environmentally friendly foods while maintaining nutrient adequacy. In a recent study published in Nutrients, various scenarios were simulated to investigate the potential of fortifying plant-based alternatives and commonly consumed foods with essential nutrients to facilitate a shift towards more plant-based diets. The results showed that fortifying foods with critical micronutrients, iron among them, allows for smaller shifts from current diets when optimising for greenhouse gas emissions and nutrient adequacy, and hence a higher likelihood that the optimised diet will be adopted by consumers. In another study, recently published in the European Journal of Nutrition, the role of fortification with zinc and iron in developing dietary shift scenarios was investigated. The study demonstrated that when plant-based alternatives are fortified with these essential nutrients, they become viable selections in optimised diets, which would not have been the case without fortification. As a result, the modelled diets featured a greater reduction in red meat consumption with less deviation from the baseline diet.
Despite the immediate and long-term benefits associated with fortified foods, some consumers remain sceptical about including them in their diets. A consumer study commissioned by Food Standards Australia New Zealand revealed that many consumers perceive fortified products as artificially processed and are reluctant to adopt them. Promoting awareness of nutrient deficiencies, and of how fortification may help ameliorate them, is necessary to ensure the success of any fortification strategy.
Iron supplementation and the associated challenges
Oral iron supplementation, in the form of iron tablets or liquid iron, may be used to increase dietary intake of iron and treat IDA. Ferrous salts, containing divalent iron ions (Fe2+), are the most common form of iron supplementation due to their higher solubility and bioavailability compared with ferric iron (Fe3+). Examples include ferrous fumarate, ferrous sulphate, ferrous gluconate and ferrous bisglycinate. Oral supplementation is, however, complicated by challenges associated with dosage and dosing frequency, as well as variations in the preparations of iron salts.
Iron levels in the body are regulated by hepcidin, a liver-derived hormone. Hepcidin concentrations increase when the body has enough iron. This prevents further iron absorption. When the body is iron deficient, hepcidin concentrations are reduced, and more iron is absorbed into the blood.
Hepcidin follows a circadian rhythm, with concentrations generally increasing over the day, suggesting that an iron dose taken in the morning may be most effective. Trials in women with iron deficiency concluded that twice-daily dosing should be avoided, because hepcidin remains elevated for 24 hours after the first dose, reducing absorption of subsequent iron. This elevation usually subsides within 48 hours, so doses of ≥60 mg should be taken 48 hours apart to maximise fractional iron absorption. Such a dosing strategy may also reduce the gut discomfort associated with iron supplements, which is crucial for improving adherence among individuals with IDA. However, it may also slow total iron absorption (measured as iron incorporation into red blood cells) and the time taken to attain iron repletion, since the dosage per unit time is halved. Iron fortification may therefore be a more practical strategy for increasing dietary iron intake and preventing iron deficiency among women, especially in lower-income countries where cost is a major barrier to supplementation.
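The trade-off between fractional and total absorption can be illustrated with a toy calculation. The sketch below compares daily and alternate-day 60 mg dosing over two weeks; the absorption fractions are hypothetical stand-ins for the hepcidin effect described above, not clinical measurements.

```python
# Toy comparison of daily vs. alternate-day oral iron dosing.
# Fractional absorption values are hypothetical illustrations of the
# hepcidin effect described above, not clinical measurements.

DOSE_MG = 60.0
BASE_ABSORPTION = 0.20        # assumed fraction absorbed when hepcidin is low
SUPPRESSED_ABSORPTION = 0.12  # assumed fraction while hepcidin is still elevated

def total_absorbed(days: int, every_other_day: bool) -> float:
    """Sum absorbed iron (mg) over the period, suppressing absorption
    on any dosing day that follows a dose within the previous 48 h."""
    absorbed = 0.0
    last_dose_day = None
    for day in range(days):
        if every_other_day and day % 2 == 1:
            continue  # rest day, no dose taken
        if last_dose_day is not None and day - last_dose_day < 2:
            absorbed += DOSE_MG * SUPPRESSED_ABSORPTION
        else:
            absorbed += DOSE_MG * BASE_ABSORPTION
        last_dose_day = day
    return absorbed

daily = total_absorbed(14, every_other_day=False)
alternate = total_absorbed(14, every_other_day=True)
print(f"Daily dosing, 14 days:  {daily:.0f} mg absorbed from {14 * DOSE_MG:.0f} mg taken")
print(f"Alternate-day dosing:   {alternate:.0f} mg absorbed from {7 * DOSE_MG:.0f} mg taken")
```

Under these assumed numbers, alternate-day dosing absorbs a higher fraction of each dose, yet delivers less total iron over the fortnight because only half as many doses are taken, mirroring the caveat above about slower repletion.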
Take home message
Iron deficiency is a common nutritional problem around the world, especially among women of reproductive age and pregnant women in the third trimester. Iron bioavailability from diets high in plant foods needs to be examined to determine true iron adequacy. The value of fortified foods as a future means of narrowing the gap between iron deficiency and repletion should also be examined.
This Thought for Food was written by Patricia Soh and Dr Mahya Tavan, with the support of the SNi team.
Photo by Bit245 from Canva Pro.