ORIGINAL ARTICLE
Jinran Yu1, Hongying Zheng2, Peipei Zhang1, Lixia Zhang1 and Yongye Sun1*
1Department of Nutrition and Food Hygiene, School of Public Health, Qingdao University, Qingdao, China; 2Clinical Laboratory, The Affiliated Hospital of Qingdao University, Qingdao, China
Background: Currently available evidence on the association between dietary iron intake and hyperuricemia is limited and inconsistent.
Objective: This study aimed to examine the associations of animal-derived dietary iron (ADDI) intake, plant-derived dietary iron (PDDI) intake, and the PDDI:ADDI intake ratio with hyperuricemia risk among US adults.
Design: Data from the National Health and Nutrition Examination Survey (NHANES) 2009–2014 were used. Iron intake from diet was assessed through two 24-h dietary recalls. Logistic regression models and restricted cubic spline models were used to investigate the associations between dietary iron intake from different sources and hyperuricemia risk.
Results: A total of 12,869 participants aged ≥20 years were enrolled in the study. After adjustment for multiple confounders, relative to the lowest quartile, the odds ratios (ORs) with 95% confidence intervals (CIs) of hyperuricemia for the highest quartile of ADDI intake, PDDI intake, and the PDDI:ADDI intake ratio were 1.11 (0.90–1.38), 0.69 (0.55–0.87), and 0.85 (0.67–1.07), respectively. Dose–response analysis revealed that the risk of hyperuricemia was negatively associated with PDDI intake in a linear manner.
Conclusion: PDDI intake was inversely associated with hyperuricemia in US adults.
Keywords: animal-derived iron; plant-derived iron; dietary intake; hyperuricemia; NHANES
Citation: Food & Nutrition Research 2020, 64: 3641 - http://dx.doi.org/10.29219/fnr.v64.3641
Copyright: © 2020 Jinran Yu et al. This is an Open Access article distributed under the terms of the Creative Commons Attribution 4.0 International License, allowing third parties to copy and redistribute the material in any medium or format and to remix, transform, and build upon the material for any purpose, even commercially, provided the original work is properly cited and states its license.
Received: 16 July 2019; Revised: 17 February 2020; Accepted: 27 March 2020; Published: 11 May 2020
Competing interests and funding: The authors declare no potential conflicts of interest. This research was supported by the Natural Science Foundation of China (grant number 81703206).
*Yongye Sun, Department of Nutrition and Food Hygiene, School of Public Health, Qingdao University, No. 308 Ningxia Road, Laoshan District, Qingdao Shandong 266071, China. Email: yongye.sun@126.com
To access the supplementary material, please visit the article landing page
Hyperuricemia is a risk factor for gout, metabolic syndrome (1, 2), cardiovascular diseases (2, 3), acute kidney injury (4), and type 2 diabetes mellitus (5). The prevalence of hyperuricemia is increasing (6, 7), with reported estimates ranging from 8.4 to 21.6% (6, 8–10). Several mechanisms of hyperuricemia have been proposed, including uric acid overproduction in the liver and/or decreased uric acid excretion through the kidney and gut (11). Several dietary factors have been linked to uric acid metabolism: for example, meat, seafood, beer, liquor, and sugar-sweetened foods have been associated with high serum uric acid levels (12–14), whereas several micronutrients, such as vitamin C, folate, magnesium, and calcium, have been reported to protect against hyperuricemia (15–18).
As an essential micronutrient in humans, iron plays a pivotal role in oxygen transport and energy production. Both in vivo and in vitro studies have demonstrated that xanthine oxidase, the enzyme that produces uric acid, shows increased activity after iron exposure (19–21). Studies have also indicated a positive association between serum ferritin levels and uric acid concentrations (22–24), and increased transferrin and hemoglobin levels have been associated with an increased risk of hyperuricemia (24). Recently, several studies have explored the association of dietary iron intake with hyperuricemia. A Korean study of 9,010 health-examination participants demonstrated an inverse relationship between dietary iron intake and hyperuricemia (25). In addition, a study of two large Caucasian cohorts (in Australia and Norway) found that higher dietary iron intake was related to lower serum uric acid levels in the Australian participants but not in the Norwegian participants, whose dietary iron consumption was lower (26). However, the relationship between dietary iron intake and hyperuricemia was not significant in either men or women in the Nutrition and Health Survey in Taiwan (NAHSIT) (16).
Clearly, the aforementioned studies have shown inconsistent results. Because dietary iron from plant foods and animal foods differs in absorption and metabolism (27), the two types of dietary iron may also differ in their effects on uric acid metabolism. To date, no known studies have evaluated the associations between dietary iron intake from different sources and the risk of hyperuricemia. Therefore, we explored the associations between animal-derived dietary iron (ADDI) intake, plant-derived dietary iron (PDDI) intake, and the PDDI:ADDI intake ratio and hyperuricemia among adults, using data from the National Health and Nutrition Examination Survey (NHANES).
Data were merged from three 2-year cycles (2009–2010, 2011–2012, and 2013–2014) of the NHANES (https://www.cdc.gov/nchs/nhanes/), giving 30,468 individuals in total. We excluded participants who were younger than 20 years (n = 12,921) and those who were pregnant (n = 149). Participants with unreliable or incomplete 24-h recall data (n = 1,629) or missing information on dietary iron intake or serum uric acid levels (n = 2,762) were also excluded. Moreover, participants whose total daily energy intake was greater than the mean + 3 standard deviations (SDs) (4,506 kcal) or less than the mean − 3 SDs (0 kcal) were excluded (n = 138). Finally, 12,869 adults (6,158 men and 6,711 women) were included in our analysis (Fig. 1). The Research Ethics Review Board of the National Center for Health Statistics approved NHANES, and all participants provided informed consent.
Fig. 1. Flow chart showing the process for the selection of eligible participants.
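For readers who wish to reproduce the selection cascade, a minimal Python/pandas sketch is given below. The DEMO_F/G/H file names and the RIDAGEYR and RIDEXPRG variables follow real NHANES conventions, but the merged analytic dataframe and the recall-quality, intake, and uric acid columns are hypothetical placeholders, not the authors' actual code.

```python
import pandas as pd

# DEMO_F/G/H.XPT are the demographic files for the 2009-2010, 2011-2012,
# and 2013-2014 NHANES cycles; RIDAGEYR (age in years) and RIDEXPRG
# (pregnancy status, 1 = pregnant) are real NHANES variables. The
# recall-quality flag and the intake/uric-acid columns are hypothetical
# placeholders for a merged analytic dataset.
df = pd.concat(
    pd.read_sas(f, format="xport") for f in ["DEMO_F.XPT", "DEMO_G.XPT", "DEMO_H.XPT"]
)

df = df[df["RIDAGEYR"] >= 20]                # exclude participants aged <20 years
df = df[df["RIDEXPRG"] != 1]                 # exclude pregnant participants
df = df[df["recall_reliable"] == 1]          # exclude unreliable/incomplete 24-h recalls
df = df.dropna(subset=["iron_mg", "serum_uric_acid"])  # exclude missing exposure/outcome

# Trim implausible total energy intakes outside mean +/- 3 SD
m, s = df["energy_kcal"].mean(), df["energy_kcal"].std()
df = df[df["energy_kcal"].between(m - 3 * s, m + 3 * s)]
```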
Dietary iron intake was assessed using two 24-h dietary recall interviews conducted by trained dietitians. The first interview was conducted in person in the mobile examination center, and the second was conducted by telephone 3–10 days later. The average dietary iron intake across the two recalls was used. Further details of the dietary recall interviews have been published elsewhere (15). Dietary iron from different sources was identified using food codes: ADDI from meat, poultry, and fish; eggs; and dairy products, and PDDI from cereals, beans, vegetables, and fruits. ADDI intake, PDDI intake, and the PDDI:ADDI intake ratio were the primary exposures.
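The split of total iron into ADDI and PDDI can be illustrated as follows. This is a simplified sketch relying on the USDA convention that the first digit of a food code indicates the major food group; the groupings and helper functions are our own illustration, not the authors' actual processing code.

```python
# First digit of a USDA food code indicates the major food group; the
# groupings below are a simplification for illustration.
ANIMAL_GROUPS = {"1", "2", "3"}       # dairy; meat/poultry/fish; eggs
PLANT_GROUPS = {"4", "5", "6", "7"}   # legumes/nuts; grains; fruits; vegetables

def iron_by_source(food_items):
    """food_items: iterable of (food_code, iron_mg) pairs for one recall day."""
    addi = sum(fe for code, fe in food_items if str(code)[0] in ANIMAL_GROUPS)
    pddi = sum(fe for code, fe in food_items if str(code)[0] in PLANT_GROUPS)
    return addi, pddi

def exposures(day1_items, day2_items):
    """Average the two recall days and form the three exposures."""
    a1, p1 = iron_by_source(day1_items)
    a2, p2 = iron_by_source(day2_items)
    addi, pddi = (a1 + a2) / 2, (p1 + p2) / 2
    ratio = pddi / addi if addi > 0 else float("nan")
    return addi, pddi, ratio
```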
Serum uric acid concentrations were measured using Beckman Synchron LX20 and Beckman UniCel® DxC800 Synchron analyzers and reported in mg/dL. Hyperuricemia was defined as a serum uric acid level >7.0 mg/dL in men and >6.0 mg/dL in women (28).
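This case definition maps directly to a sex-specific threshold; a minimal sketch:

```python
def hyperuricemic(uric_acid_mg_dl: float, is_male: bool) -> bool:
    """Definition used above: >7.0 mg/dL in men, >6.0 mg/dL in women."""
    return uric_acid_mg_dl > (7.0 if is_male else 6.0)
```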
Factors previously shown to be correlated with dietary iron intake and hyperuricemia were included in the regression models to control for potential confounding. These factors included age (20–49 years and ≥50 years), gender, race (Mexican American, other Hispanic, non-Hispanic White, non-Hispanic Black, and other race), body mass index (BMI), total cholesterol (TC), educational level (below high school, high school, and above high school), alcohol consumption (≥12 and <12 alcoholic drinks/year), smoking status (≥100 and <100 cigarettes smoked in their lifetime), physical activity, daily total energy intake, dietary vitamin C intake, dietary fiber intake, dietary magnesium intake, hypertension, and diabetes. The definitions of hypertension and diabetes have been published elsewhere (26, 29, 30).
Student’s t-test (for normally distributed continuous variables) or the Mann–Whitney U test (for non-normally distributed continuous variables) was used to compare individuals with and without hyperuricemia. The chi-square test was used to compare the distributions of categorical variables between groups. Pearson’s or Spearman’s correlation analysis was performed between PDDI intake and inflammatory biomarkers associated with hyperuricemia.
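These bivariate comparisons are standard; a brief SciPy sketch under the assumption that `df` is the analytic dataset from the earlier sketch (column names hypothetical):

```python
import pandas as pd
from scipy import stats

hyper = df[df["hyperuricemia"] == 1]
normo = df[df["hyperuricemia"] == 0]

t, p = stats.ttest_ind(hyper["tc_mg_dl"], normo["tc_mg_dl"])    # normally distributed variable
u, p = stats.mannwhitneyu(hyper["pddi_mg"], normo["pddi_mg"])   # non-normally distributed variable
chi2, p, dof, _ = stats.chi2_contingency(pd.crosstab(df["gender"], df["hyperuricemia"]))
rho, p = stats.spearmanr(df["pddi_mg"], df["crp_mg_l"])         # PDDI vs. an inflammatory biomarker
```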
The three dietary exposures (ADDI intake, PDDI intake, and the PDDI:ADDI intake ratio) were categorized into quartiles (quartile 1: <25th percentile; quartile 2: 25th to <50th percentile; quartile 3: 50th to <75th percentile; and quartile 4: ≥75th percentile). Logistic regression models were used to examine the association between each dietary exposure and hyperuricemia risk, with quartile 1 as the reference category. The nutrient residual model was used to remove the variation caused by total energy intake before logistic regression analysis (31). Linear regression models were also used to examine the association between each dietary exposure and serum uric acid. In multivariate regression models, model 1 was adjusted for age and gender, and model 2 was further adjusted for race, BMI, TC, educational level, alcohol consumption, smoking status, physical activity, daily total energy intake, dietary vitamin C intake, dietary fiber intake, dietary magnesium intake, hypertension, and diabetes. Stratified analyses by age and gender were then conducted separately. Odds ratios (ORs) with 95% confidence intervals (CIs) were calculated from the logistic regression analyses, and β-coefficients with 95% CIs from the linear regression analyses.
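A condensed sketch of these modeling steps with statsmodels is shown below. It reuses the hypothetical columns from the earlier sketches, abbreviates the covariate set, and omits NHANES survey weighting for brevity; the energy adjustment follows our reading of the nutrient residual method (31), that is, regressing the nutrient on total energy and carrying the residuals forward as the energy-adjusted exposure.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Quartile the exposure (hypothetical column names; survey weights omitted).
df["pddi_q"] = pd.qcut(df["pddi_mg"], 4, labels=["Q1", "Q2", "Q3", "Q4"])

# Nutrient residual method: residuals from regressing the nutrient on energy.
df["pddi_resid"] = smf.ols("pddi_mg ~ energy_kcal", data=df).fit().resid

# Model 2, with the covariate list abbreviated to a few examples.
fit = smf.logit(
    "hyperuricemia ~ C(pddi_q, Treatment('Q1')) + age + C(gender) + bmi + tc",
    data=df,
).fit()
print(np.exp(fit.params))      # ORs per quartile vs. Q1
print(np.exp(fit.conf_int()))  # 95% CIs
```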
After values below the 1st percentile and above the 99th percentile of the exposure distribution were excluded, dose–response relationships were evaluated by binary logistic regression using restricted cubic spline functions with three knots located at the 5th, 50th, and 95th percentiles of the exposure distribution, in the fully adjusted model 2. The P-value for nonlinearity was calculated by testing the null hypothesis that the coefficient of the second spline term was equal to zero. All P-values were two-sided, and P ≤ 0.05 was considered significant. All statistical analyses were performed using Stata version 15.0.
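With three knots, a restricted cubic spline has one linear term and one nonlinear term, so the nonlinearity test described above reduces to a Wald test on the single spline coefficient. Below is a sketch using Harrell's truncated-power basis; this is our own implementation for illustration, not the authors' Stata code, and the exposure column is hypothetical.

```python
import numpy as np
import statsmodels.api as sm

def rcs_term(x, knots):
    """Nonlinear basis term of a 3-knot restricted cubic spline (truncated-power form)."""
    t1, t2, t3 = knots
    cube = lambda v: np.clip(v, 0, None) ** 3
    return cube(x - t1) - cube(x - t2) * (t3 - t1) / (t3 - t2) + cube(x - t3) * (t2 - t1) / (t3 - t2)

x = df["pddi_mg"].to_numpy()                  # hypothetical exposure column
knots = np.percentile(x, [5, 50, 95])         # knots at the 5th, 50th, and 95th percentiles
X = sm.add_constant(np.column_stack([x, rcs_term(x, knots)]))

fit = sm.Logit(df["hyperuricemia"].astype(int), X).fit()
p_nonlinearity = fit.pvalues[2]               # Wald test: spline coefficient == 0
```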
Baseline characteristics of the participants are summarized in Table 1. The overall prevalence of hyperuricemia was 19.35% (22.32% in men and 16.67% in women). Both men and women with hyperuricemia tended to be non-Hispanic Black, to have higher BMI and serum TC levels, and to have hypertension. Vigorous recreational activity, daily total energy intake, dietary intakes of vitamin C, fiber, and magnesium, PDDI intake, and the PDDI:ADDI intake ratio were significantly lower in participants with hyperuricemia than in those without.
The weighted ORs (95% CIs) of hyperuricemia according to quartiles of the three dietary exposures for all participants are shown in Table 2. In univariate logistic regression analyses, compared with the lowest quartile, the ORs (95% CIs) of hyperuricemia for the highest quartile indicated that PDDI intake (0.60 [0.50–0.72]) and the PDDI:ADDI intake ratio (0.60 [0.50–0.71]) were negatively related to hyperuricemia, whereas ADDI intake was positively associated with hyperuricemia (1.36 [1.16–1.60]). After adjustment for age and gender (model 1), the associations were similar to those in the unadjusted model. After further adjustment for race, BMI, TC, educational level, alcohol consumption, smoking status, physical activity, daily total energy intake, dietary vitamin C intake, dietary fiber intake, dietary magnesium intake, hypertension, and diabetes (model 2), PDDI intake (0.69 [0.55–0.87]) remained significantly negatively associated with hyperuricemia, whereas the associations of ADDI intake and the PDDI:ADDI intake ratio with hyperuricemia were no longer statistically significant. With the nutrient residual model, the associations of ADDI intake and PDDI intake with hyperuricemia were substantially unchanged in model 2, whereas an inverse association emerged between the PDDI:ADDI intake ratio (quartile 3 vs. quartile 1) and hyperuricemia (Supplementary Table S1). Supplementary Table S2 suggested no collinearity between PDDI intake and the PDDI:ADDI intake ratio. When both PDDI intake and the PDDI:ADDI intake ratio were fitted in the same model (Supplementary Table S3), the results were similar to those in Table 2.
Table 2. Weighted ORs (95% CIs) of hyperuricemia according to quartiles of the three dietary exposures among all participants.

| | Crude OR (95% CI) | Model 1 OR (95% CI) | Model 2 OR (95% CI) |
|---|---|---|---|
| Animal-derived iron (mg/day) | | | |
| <1.69 | 1.00 (Ref.) | 1.00 (Ref.) | 1.00 (Ref.) |
| 1.69 to <2.90 | 1.19 (0.99–1.43) | 1.15 (0.96–1.39) | 1.10 (0.88–1.36) |
| 2.90 to <4.57 | 1.23 (1.04–1.45)* | 1.17 (0.99–1.39) | 1.05 (0.85–1.30) |
| ≥4.57 | 1.36 (1.16–1.60)** | 1.29 (1.08–1.53)** | 1.11 (0.90–1.38) |
| Plant-derived iron (mg/day) | | | |
| <6.66 | 1.00 (Ref.) | 1.00 (Ref.) | 1.00 (Ref.) |
| 6.66 to <9.77 | 0.79 (0.65–0.95)* | 0.75 (0.62–0.91)** | 0.83 (0.69–1.00) |
| 9.77 to <14.14 | 0.65 (0.56–0.77)** | 0.61 (0.52–0.71)** | 0.71 (0.57–0.87)** |
| ≥14.14 | 0.60 (0.50–0.72)** | 0.53 (0.44–0.63)** | 0.69 (0.55–0.87)** |
| Plant-derived iron:animal-derived iron intake ratio | | | |
| <1.84 | 1.00 (Ref.) | 1.00 (Ref.) | 1.00 (Ref.) |
| 1.84 to <3.43 | 0.84 (0.72–0.97)* | 0.82 (0.71–0.96)* | 0.96 (0.81–1.13) |
| 3.43 to <6.66 | 0.73 (0.62–0.86)** | 0.71 (0.61–0.84)** | 0.86 (0.69–1.06) |
| ≥6.66 | 0.60 (0.50–0.71)** | 0.59 (0.49–0.71)** | 0.85 (0.67–1.07) |

Model 1 adjusted for age and gender. Model 2 adjusted for age, gender, race, BMI, TC, educational level, alcohol consumption, smoking status, physical activity, daily total energy intake, dietary vitamin C intake, dietary fiber intake, dietary magnesium intake, hypertension, and diabetes. The lowest quartile of each exposure was used as the reference group. Results are survey-weighted. *P < 0.05; **P < 0.01.
The associations between the three dietary exposures and serum uric acid concentrations are shown in Supplementary Table S4. Each additional 10 mg of PDDI intake was associated with a 0.074 mg/dL decrease in serum uric acid (95% CI: -0.118 to -0.029 mg/dL) in the fully adjusted model. When the intakes of PDDI and ADDI were expressed together as a ratio, each 10-unit increase in the PDDI:ADDI intake ratio was associated with a 0.004 mg/dL decrease in serum uric acid (95% CI: -0.007 to -0.0005 mg/dL) in model 2.
The associations between each of the three dietary exposures and hyperuricemia in analyses stratified by gender are displayed in Table 3. Comparing the highest with the lowest quartile in the multivariate-adjusted model (model 2), the OR (95% CI) of hyperuricemia in men was 1.10 (0.84–1.45) for ADDI intake, 0.64 (0.48–0.85) for PDDI intake, and 0.82 (0.61–1.09) for the PDDI:ADDI intake ratio. In women, in model 2, the OR (95% CI) of hyperuricemia for quartile 3 (compared with quartile 1) of PDDI intake was 0.73 (0.56–0.95), whereas no significant associations were found between ADDI intake or the PDDI:ADDI intake ratio and hyperuricemia risk across quartiles 2–4 compared with quartile 1.
Table 3. ORs (95% CIs) of hyperuricemia according to quartiles of the three dietary exposures, stratified by gender.

| | Crude OR (95% CI) | Model 1 OR (95% CI) | Model 2 OR (95% CI) |
|---|---|---|---|
| Men | | | |
| Animal-derived iron (mg/day) | | | |
| <1.69 | 1.00 (Ref.) | 1.00 (Ref.) | 1.00 (Ref.) |
| 1.69 to <2.90 | 1.25 (0.95–1.66) | 1.26 (0.95–1.66) | 1.21 (0.86–1.70) |
| 2.90 to <4.57 | 1.24 (0.96–1.61) | 1.24 (0.96–1.61) | 1.18 (0.85–1.64) |
| ≥4.57 | 1.19 (0.96–1.47) | 1.18 (0.96–1.46) | 1.10 (0.84–1.45) |
| Plant-derived iron (mg/day) | | | |
| <6.66 | 1.00 (Ref.) | 1.00 (Ref.) | 1.00 (Ref.) |
| 6.66 to <9.77 | 0.66 (0.50–0.86)** | 0.66 (0.50–0.86)** | 0.77 (0.60–0.99)* |
| 9.77 to <14.14 | 0.57 (0.45–0.72)** | 0.57 (0.45–0.72)** | 0.66 (0.50–0.87)** |
| ≥14.14 | 0.49 (0.40–0.61)** | 0.49 (0.40–0.61)** | 0.64 (0.48–0.85)** |
| Plant-derived iron:animal-derived iron intake ratio | | | |
| <1.84 | 1.00 (Ref.) | 1.00 (Ref.) | 1.00 (Ref.) |
| 1.84 to <3.43 | 0.86 (0.70–1.05) | 0.86 (0.70–1.05) | 0.95 (0.78–1.16) |
| 3.43 to <6.66 | 0.79 (0.65–0.97)* | 0.79 (0.65–0.97)* | 0.93 (0.72–1.19) |
| ≥6.66 | 0.62 (0.51–0.76)** | 0.63 (0.51–0.76)** | 0.82 (0.61–1.09) |
| Women | | | |
| Animal-derived iron (mg/day) | | | |
| <1.69 | 1.00 (Ref.) | 1.00 (Ref.) | 1.00 (Ref.) |
| 1.69 to <2.90 | 1.10 (0.89–1.37) | 1.11 (0.90–1.37) | 1.08 (0.87–1.35) |
| 2.90 to <4.57 | 1.07 (0.84–1.37) | 1.11 (0.89–1.40) | 1.00 (0.76–1.30) |
| ≥4.57 | 1.33 (0.98–1.80) | 1.61 (1.20–2.16)** | 1.36 (0.94–1.95) |
| Plant-derived iron (mg/day) | | | |
| <6.66 | 1.00 (Ref.) | 1.00 (Ref.) | 1.00 (Ref.) |
| 6.66 to <9.77 | 0.86 (0.67–1.11) | 0.83 (0.64–1.07) | 0.87 (0.65–1.16) |
| 9.77 to <14.14 | 0.65 (0.53–0.79)** | 0.62 (0.51–0.75)** | 0.73 (0.56–0.95)* |
| ≥14.14 | 0.56 (0.43–0.74)** | 0.56 (0.43–0.73)** | 0.75 (0.55–1.01) |
| Plant-derived iron:animal-derived iron intake ratio | | | |
| <1.84 | 1.00 (Ref.) | 1.00 (Ref.) | 1.00 (Ref.) |
| 1.84 to <3.43 | 0.85 (0.66–1.08) | 0.76 (0.59–0.98)* | 0.95 (0.70–1.28) |
| 3.43 to <6.66 | 0.71 (0.56–0.90)** | 0.61 (0.49–0.77)** | 0.77 (0.57–1.05) |
| ≥6.66 | 0.61 (0.46–0.82)** | 0.54 (0.41–0.72)** | 0.86 (0.61–1.21) |

Model 1 adjusted for age. Model 2 adjusted for age, race, BMI, TC, educational level, alcohol consumption, smoking status, physical activity, daily total energy intake, dietary vitamin C intake, dietary fiber intake, dietary magnesium intake, hypertension, and diabetes. The lowest quartile of each exposure was used as the reference group. Results are survey-weighted. *P < 0.05; **P < 0.01.
The associations between each of the three dietary exposures and hyperuricemia in analyses stratified by age are shown in Table 4. For participants aged 20–49 years, PDDI intake (0.57 [0.42–0.76]) and the PDDI:ADDI intake ratio (0.72 [0.52–0.98]) (highest vs. lowest quartile) were inversely associated with hyperuricemia in model 2, and the corresponding OR (95% CI) for ADDI intake was 1.08 (0.80–1.45). In the ≥50 years group, relative to quartile 1, the ORs (95% CIs) of hyperuricemia for quartile 4 of ADDI intake, PDDI intake, and the PDDI:ADDI intake ratio were 1.22 (0.93–1.59), 0.79 (0.60–1.05), and 0.95 (0.70–1.29), respectively.
Table 4. ORs (95% CIs) of hyperuricemia according to quartiles of the three dietary exposures, stratified by age.

| | Crude OR (95% CI) | Model 1 OR (95% CI) | Model 2 OR (95% CI) |
|---|---|---|---|
| 20–49 years | | | |
| Animal-derived iron (mg/day) | | | |
| <1.69 | 1.00 (Ref.) | 1.00 (Ref.) | 1.00 (Ref.) |
| 1.69 to <2.90 | 1.18 (0.86–1.62) | 1.12 (0.81–1.54) | 1.01 (0.69–1.48) |
| 2.90 to <4.57 | 1.59 (1.22–2.07)** | 1.30 (0.99–1.70) | 1.18 (0.85–1.63) |
| ≥4.57 | 1.70 (1.33–2.18)** | 1.22 (0.95–1.57) | 1.08 (0.80–1.45) |
| Plant-derived iron (mg/day) | | | |
| <6.66 | 1.00 (Ref.) | 1.00 (Ref.) | 1.00 (Ref.) |
| 6.66 to <9.77 | 0.85 (0.68–1.07) | 0.77 (0.61–0.96)* | 0.82 (0.66–1.03) |
| 9.77 to <14.14 | 0.67 (0.57–0.80)** | 0.55 (0.46–0.65)** | 0.56 (0.45–0.70)** |
| ≥14.14 | 0.72 (0.57–0.92)** | 0.53 (0.42–0.66)** | 0.57 (0.42–0.76)** |
| Plant-derived iron:animal-derived iron intake ratio | | | |
| <1.84 | 1.00 (Ref.) | 1.00 (Ref.) | 1.00 (Ref.) |
| 1.84 to <3.43 | 0.79 (0.63–0.99)* | 0.82 (0.65–1.04) | 0.92 (0.73–1.14) |
| 3.43 to <6.66 | 0.72 (0.56–0.91)** | 0.77 (0.61–0.97)* | 0.86 (0.65–1.13) |
| ≥6.66 | 0.51 (0.40–0.65)** | 0.57 (0.44–0.73)** | 0.72 (0.52–0.98)* |
| ≥50 years | | | |
| Animal-derived iron (mg/day) | | | |
| <1.69 | 1.00 (Ref.) | 1.00 (Ref.) | 1.00 (Ref.) |
| 1.69 to <2.90 | 1.17 (0.97–1.41) | 1.19 (0.98–1.44) | 1.23 (1.00–1.51) |
| 2.90 to <4.57 | 1.01 (0.83–1.22) | 1.05 (0.85–1.28) | 0.98 (0.77–1.24) |
| ≥4.57 | 1.21 (0.99–1.48) | 1.33 (1.06–1.66)* | 1.22 (0.93–1.59) |
| Plant-derived iron (mg/day) | | | |
| <6.66 | 1.00 (Ref.) | 1.00 (Ref.) | 1.00 (Ref.) |
| 6.66 to <9.77 | 0.72 (0.57–0.90)** | 0.72 (0.57–0.91)** | 0.84 (0.65–1.08) |
| 9.77 to <14.14 | 0.64 (0.51–0.79)** | 0.64 (0.52–0.80)** | 0.83 (0.61–1.11) |
| ≥14.14 | 0.51 (0.41–0.63)** | 0.52 (0.42–0.65)** | 0.79 (0.60–1.05) |
| Plant-derived iron:animal-derived iron intake ratio | | | |
| <1.84 | 1.00 (Ref.) | 1.00 (Ref.) | 1.00 (Ref.) |
| 1.84 to <3.43 | 0.85 (0.66–1.10) | 0.84 (0.65–1.07) | 1.01 (0.76–1.33) |
| 3.43 to <6.66 | 0.72 (0.57–0.89)** | 0.70 (0.56–0.87)** | 0.87 (0.66–1.13) |
| ≥6.66 | 0.64 (0.50–0.82)** | 0.62 (0.49–0.79)** | 0.95 (0.70–1.29) |

Model 1 adjusted for gender. Model 2 adjusted for gender, race, BMI, TC, educational level, alcohol consumption, smoking status, physical activity, daily total energy intake, dietary vitamin C intake, dietary fiber intake, dietary magnesium intake, hypertension, and diabetes. The lowest quartile of each exposure was used as the reference group. Results are survey-weighted. *P < 0.05; **P < 0.01.
The dose–response relationship between PDDI intake and hyperuricemia is presented in Fig. 2. PDDI intake was negatively associated with hyperuricemia in a linear manner (P for nonlinearity = 0.136). The association became protective once PDDI intake reached 3 mg/day (OR: 0.95; 95% CI: 0.92–0.99). With the nutrient residual model, the relationship was substantially unchanged (Supplementary Fig. S1). In addition, the result was similar to that of Fig. 2 when both PDDI intake and the PDDI:ADDI intake ratio were fitted in the same model (Supplementary Fig. S2).
Fig. 2. Dose–response relationship between plant-derived iron intake and hyperuricemia examined with a restricted cubic spline model. The lowest level of plant-derived iron intake (1.89 mg/day) was used as the reference. The model was adjusted for age, gender, race, BMI, TC, educational level, alcohol consumption, smoking status, physical activity, daily total energy intake, dietary vitamin C intake, dietary fiber intake, dietary magnesium intake, hypertension, and diabetes. The solid line and dashed lines represent the estimated ORs and the corresponding 95% CIs, respectively. OR, odds ratio; CI, confidence interval.
C-reactive protein (CRP) data (n = 4,690) were available in the 2009–2010 cycle, and white blood cell (WBC) counts (n = 12,848) were available in the 2009–2010, 2011–2012, and 2013–2014 cycles of the NHANES. As shown in Supplementary Table S5, CRP and WBC levels were significantly higher in participants with hyperuricemia than in those without. The correlations between PDDI intake and these two inflammatory biomarkers are displayed in Supplementary Table S6: the levels of CRP and WBC were negatively correlated with PDDI intake.
In the present study, we found that PDDI intake was inversely associated with hyperuricemia. When stratified by gender and age, this association remained in men, in women, and in the 20–49 years group. An inverse relationship between the PDDI:ADDI intake ratio and hyperuricemia was observed among participants aged 20–49 years. We also found a linear inverse relationship between PDDI intake and hyperuricemia.
Three studies have explored the association between total dietary iron intake and hyperuricemia, with conflicting results. A study by Ryu et al. demonstrated that dietary iron intake was negatively associated with hyperuricemia (25). Another study involving two Caucasian populations showed that high dietary iron consumption was associated with low serum uric acid in the Australian cohort but not in the Norwegian cohort (26). Moreover, the NAHSIT did not report a significant relationship between dietary iron intake and hyperuricemia (16); that survey included a total of 2,176 participants aged 4–96 years and defined hyperuricemia as serum urate levels >7.7 mg/dL in men and >6.6 mg/dL in women. To our knowledge, our study is the first to explore the associations between dietary iron intake from different sources and the risk of hyperuricemia in a nationally representative sample of US adults, and it found that the associations of ADDI intake and PDDI intake with hyperuricemia differed.
Currently available studies on the associations between different sources of dietary iron intake and hyperuricemia are very limited. Our findings indicated that PDDI intake was inversely associated with hyperuricemia. The mechanisms underlying this association remain undetermined because of the cross-sectional design of our study, but they may be partly related to inflammation. Our study showed that PDDI intake was inversely correlated with the levels of CRP and WBC, which are biomarkers of inflammation (32, 33). Moreover, many studies have reported that CRP and WBC are positively associated with hyperuricemia (32–34), and we likewise found that CRP and WBC levels were significantly higher in participants with hyperuricemia than in those without. Therefore, the negative relationship between PDDI intake and hyperuricemia may reflect a reduction of the inflammatory response associated with PDDI intake. In addition, we determined PDDI consumption by calculating iron mainly from beans, vegetables, and fruits, which are rich in fiber, vitamin C, and magnesium; previous studies have shown that dietary vitamin C, magnesium, and fiber may protect against hyperuricemia (15, 18, 35). Moreover, an alkaline diet (rich in vegetables and fruits) has been reported to accelerate the excretion of uric acid by alkalizing urine (36). Based on published studies, the exact explanation for the inverse association between PDDI intake and hyperuricemia risk remains unclear, and further studies are needed to clarify the underlying mechanisms.
Our study has several strengths. First, the associations between dietary iron intake from different sources (ADDI and PDDI) and hyperuricemia were analyzed for the first time. Second, we explored the dose–response relationship between PDDI intake and hyperuricemia. Third, the large population-based sample increased the statistical power of our analyses.
Our study also has some limitations. First, causality cannot be established because of the cross-sectional design; additional prospective longitudinal studies are needed to establish a causal relationship between dietary iron intake and hyperuricemia. Second, 24-h dietary recall interviews are subject to recall bias and may underestimate food intake by approximately 10% (37). Moreover, we cannot exclude the possibility of residual confounding by other factors.
PDDI intake was inversely associated with the risk of hyperuricemia in US adults.
The authors thank all staff at the National Center for Health Statistics of the Centers for Disease Control and Prevention who were responsible for planning and administering NHANES and for making its data sets available online. In addition, this article was edited by Wallace Academic Editing.
JY and YS conceived and designed the study. JY, HZ, PZ and LZ analyzed the data. JY wrote the first draft of the manuscript. YS reviewed the manuscript and had primary responsibility for the final content. All authors read and approved the final manuscript.