{"title":"海报摘要。","authors":"","doi":"10.1002/jpen.2735","DOIUrl":null,"url":null,"abstract":"<p><b>P1–P34 Parenteral Nutrition Therapy</b></p><p><b>P35–P52 Enteral Nutrition Therapy</b></p><p><b>P53–P83 Malnutrition and Nutrition Assessment</b></p><p><b>P84–P103 Critical Care and Critical Health Issues</b></p><p><b>P104–P131 GI, Obesity, Metabolic, and Other Nutrition Related Concepts</b></p><p><b>P132–P165 Pediatric, Neonatal, Pregnancy, and Lactation</b></p><p><b>Parenteral Nutrition Therapy</b></p><p>Sarah Williams, MD, CNSC<sup>1</sup>; Angela Zimmerman, RD, CNSC<sup>2</sup>; Denise Jezerski, RD, CNSC<sup>2</sup>; Ashley Bestgen, RD, CNSC<sup>2</sup></p><p><sup>1</sup>Cleveland Clinic Foundation, Parma, OH; <sup>2</sup>Cleveland Clinic Foundation, Cleveland, OH</p><p><b>Financial Support:</b> Morrison Healthcare.</p><p><b>Background:</b> Essential fatty acid deficiency (EFAD) is a rare disorder among the general population but can be a concern in patients reliant on home parenteral nutrition (HPN), particularly those who are not receiving intravenous lipid emulsions (ILE). In the US, the only ILE available until 2016 was soybean oil based (SO-ILE), which contains more than adequate amounts of essential fatty acids, including alpha-linolenic acid (ALA, an omega-3 fatty acid) and linoleic acid (LA, an omega-6 fatty acid). In 2016, a mixed ILE containing soybean oil, medium chain triglycerides, olive oil and fish oil, became available (SO, MCT, OO, FO-ILE). However, it contains a lower concentration of essential fatty acids compared to SO-ILE, raising theoretical concerns for development of EFAD if not administered in adequate amounts. Liver dysfunction is a common complication in HPN patients that can occur with soybean based ILE use due to their pro-inflammatory properties. Short-term studies and case reports in patients receiving SO, MCT, OO, FO-ILE have shown improvements in liver dysfunction for some patients. 
Our study evaluates the long-term impact of SO, MCT, OO, FO-ILE in our HPN patient population.</p><p><b>Methods:</b> This single-center, retrospective cohort study was conducted at the Cleveland Clinic Center for Human Nutrition using data from 2017 to 2020. It involved adult patients who received HPN with SO, MCT, OO, FO-ILE for a minimum of one year. The study assessed changes in essential fatty acid profiles, including triene-tetraene ratios (TTRs) and liver function tests (LFTs) over the year. Data were described as mean and standard deviation for normally distributed continuous variables, median and interquartile range for non-normally distributed continuous variables, and frequency for categorical variables. The Wilcoxon signed rank test was used to compare the baseline and follow-up TTR values (mixed time points). The Wilcoxon signed rank test with pairwise comparisons was used to compare the LFTs at different time points and to determine which time groups were different. P-values were adjusted using Bonferroni corrections. Ordinal logistic regression was used to assess the association between lipid dosing and follow-up TTR level. Analyses were performed using R software, and a significance level of 0.05 was assumed for all tests.</p><p><b>Results:</b> Out of 110 patients screened, 26 met the inclusion criteria of having baseline and follow-up TTRs. None of the patients developed EFAD, and there was no significant difference in the distribution of TTR values between baseline and follow-up. Additionally, 5.5% of patients reported adverse GI symptoms while receiving SO, MCT, OO, FO-ILE. A separate subgroup of 14 patients who had abnormal LFTs, including bilirubin, alkaline phosphatase (AP), aspartate aminotransferase (AST) or alanine aminotransferase (ALT), was evaluated.
There were statistically significant improvements in AST and ALT, while decreases in bilirubin and AP were not statistically significant.</p><p><b>Conclusion:</b> We found that using SO, MCT, OO, FO-ILE as the primary lipid source did not result in EFAD in any of our subset of 26 patients, and TTRs remained statistically unchanged after introduction of SO, MCT, OO, FO-ILE. Additionally, there was a statistically significant decrease in AST and ALT following the start of SO, MCT, OO, FO-ILE. While liver dysfunction from PN is multifactorial, the use of fish oil-based lipids has been shown to improve LFT results due to a reduction of phytosterol content as well as less pro-inflammatory omega-6 content when compared to SO-ILEs. A significant limitation was the difficulty in obtaining TTR measurements by home health nursing in the outpatient setting, which considerably reduced the number of patients who could be analyzed for EFAD.</p><p><b>Table 1.</b> Summary Descriptive Statistics of 26 Patients With Baseline and Follow Up TTR.</p><p></p><p><b>Table 2.</b> Change in LFTs From Baseline Levels Compared to 3 Months, 6 Months, 9 Months and 12 Months.</p><p></p><p>Wendy Raissle, RD, CNSC<sup>1</sup>; Hannah Welch, MS, RD<sup>2</sup>; Jan Nguyen, PharmD<sup>3</sup></p><p><sup>1</sup>Optum Infusion Pharmacy, Buckeye, AZ; <sup>2</sup>Optum Infusion Pharmacy, Phoenix, AZ; <sup>3</sup>Optum Infusion Pharmacy, Mesa, AZ</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Aluminum is a non-nutrient contaminant of parenteral nutrition (PN) solutions. The additive effects of PN components can contribute to toxicity, causing central nervous system issues and contributing to metabolic bone disease, as observed in adults with osteomalacia. When renal function and gastrointestinal mechanisms are impaired, aluminum can accumulate in the body. Aluminum toxicity can result in anemia, dementia, bone disease and encephalopathy.
Symptoms of aluminum toxicity may include mental status change, bone pain, muscle weakness, nonhealing fractures and premature osteoporosis. In July 2004, the U.S. Food and Drug Administration (FDA) mandated labeling of aluminum content with a goal to limit exposure to less than 5 mcg/kg/day. Adult and pediatric dialysis patients, as well as patients of all ages receiving PN support, have an increased risk of high aluminum exposure. Reducing PN additives high in aluminum is the most effective way to decrease aluminum exposure and risk of toxicity. This abstract presents a unique case where antiperspirant use contributed to an accumulation of aluminum in an adult PN patient.</p><p><b>Methods:</b> A patient on long-term PN (Table 1) often had low ionized calcium results of < 3 mg/dL, leading to consideration of other contributing factors. In addition, the patient was taking very high doses of oral vitamin D (50,000 IU 6 days/week) to stay in the normal range. Risk factors for developing metabolic bone disease include imbalances of calcium, magnesium, phosphorus, and vitamin D; corticosteroid use; long-term PN use; and aluminum toxicity (Table 2). The patient, who had a known diagnosis of osteoporosis, had two stress fractures in the left lower leg. Aluminum testing was completed in order to identify other factors that may be contributing to low ionized calcium values and osteoporosis. During patient discussion, the patient revealed they used an aluminum-containing antiperspirant once daily. The range of aluminum content in antiperspirants is unknown, but studies show that minimal absorption may be possible, especially in populations with kidney insufficiency.</p><p><b>Results:</b> After an elevated aluminum value resulted on July 3, 2023 (Figure 1), the patient switched to a non-aluminum-containing antiperspirant. Aluminum values were rechecked at 3 and 7 months.
Results indicate that the patient's antiperspirant choice may have been contributing to aluminum content through skin absorption. Antiperspirant choice may not lead to aluminum toxicity but can contribute to an increased total daily aluminum content.</p><p><b>Conclusion:</b> Preventing aluminum accumulation is vital for patients receiving long-term PN support due to heightened risk of aluminum toxicity. Other potential sources of contamination outside of PN include dialysis, processed food, aluminum foil, cosmetic products (antiperspirants, deodorant, toothpaste), medications (antacids), vaccinations, work environments with aluminum welding, and certain processing industry plants. Aluminum content of medications and PN additives varies based on brand and amount. Clinicians should review all potential aluminum-containing sources and assess ways to reduce aluminum exposure and prevent potential aluminum toxicity in long-term PN patients.</p><p><b>Table 1.</b> Patient Demographics.</p><p></p><p><b>Table 2.</b> Aluminum Content in PN Prescription.</p><p></p><p></p><p><b>Figure 1.</b> Aluminum Lab Value Result.</p><p>Haruka Takayama, RD, PhD<sup>1</sup>; Kazuhiko Fukatsu, MD, PhD<sup>2</sup>; Midori Noguchi, BA<sup>3</sup>; Nana Matsumoto, RD, MS<sup>2</sup>; Tomonori Narita, MD<sup>4</sup>; Reo Inoue, MD, PhD<sup>3</sup>; Satoshi Murakoshi, MD, PhD<sup>5</sup></p><p><sup>1</sup>St. Luke's International Hospital, Chuo-ku, Tokyo; <sup>2</sup>The University of Tokyo, Bunkyo-ku, Tokyo; <sup>3</sup>The University of Tokyo Hospital, Bunkyo-ku, Tokyo; <sup>4</sup>The University of Tokyo, Chuo-City, Tokyo; <sup>5</sup>Kanagawa University of Human Services, Yokosuka-city, Kanagawa</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Our previous study demonstrated that beta-hydroxy-beta-methylbutyrate (HMB)-supplemented total parenteral nutrition (TPN) partially restores the gut-associated lymphoid tissue (GALT) atrophy observed in standard TPN-fed mice.
Oral intake of HMB is now popular among bodybuilders and athletes. Herein, we examined whether oral supplementation of HMB could increase GALT mass in mice eating dietary chow ad libitum.</p><p><b>Methods:</b> Six-week-old male Institute of Cancer Research (ICR) mice were divided into the Control (n = 9), H600 (n = 9), and H2000 (n = 9) groups. All mice were allowed to take chow and water ad libitum for 7 days. The H600 and H2000 mice were given water containing Ca-HMB at 3 mg/mL or 10 mg/mL, respectively, while the Controls drank normal tap water. Because these mice drank approximately 6-7 mL of water per day, the H600 and H2000 groups received approximately 600 and 2000 mg/kg of Ca-HMB per day, respectively. After 7 days of this regimen, all mice were killed by cardiac puncture under general anesthesia, and the whole small intestine was harvested for GALT cell isolation. GALT cell numbers were evaluated in each tissue (Peyer's patches, PPs; intraepithelial spaces, IE; and lamina propria, LP). Nasal washings, bronchoalveolar lavage fluid (BALF), and intestinal washings were also collected for IgA level measurement by ELISA. The Kruskal-Wallis test was used for all parameter analyses, and the significance level was set at less than 5%.</p><p><b>Results:</b> There were no significant differences in the number of GALT cells at any site among the 3 groups (Table 1). Likewise, mucosal IgA levels did not differ between any two groups (Table 2).</p><p><b>Conclusion:</b> Oral intake of HMB does not affect GALT cell number or mucosal IgA levels when mice are fed a normal diet orally. It appears that the beneficial effects of HMB on the GALT are expected only in parenterally fed mice. The influence of intravenous HMB in an orally fed model should be examined in a future study.</p><p><b>Table 1.</b> GALT Cell Number (x10<sup>7</sup>/body).</p><p></p><p><b>Table 2.</b> IgA Levels.</p><p></p><p>Median (interquartile range). Kruskal-Wallis test.
n: Control = 9, H600 = 8, H2000 = 9.</p><p>Nahoki Hayashi, MS<sup>1</sup>; Yoshikuni Kawaguchi, MD, PhD, MPH, MMA<sup>2</sup>; Kenta Murotani, PhD<sup>3</sup>; Satoru Kamoshita, BA<sup>1</sup></p><p><sup>1</sup>Medical Affairs Department, Research and Development Center, Chiyoda-ku, Tokyo; <sup>2</sup>Hepato-Biliary-Pancreatic Surgery Division, Bunkyo-ku, Tokyo; <sup>3</sup>School of Medical Technology, Kurume, Fukuoka</p><p><b>Financial Support:</b> Otsuka Pharmaceutical Factory, Inc.</p><p><b>Background:</b> The American Society for Parenteral and Enteral Nutrition guideline recommends a target energy intake of 20 to 30 kcal/kg/day in patients undergoing surgery. Infectious complications reportedly decreased when the target energy and protein intake were achieved in the early period after gastrointestinal cancer surgery. However, no studies have investigated the association of prescribed parenteral energy doses with clinical outcomes in patients who did not receive oral/tube feeding in the early period after gastrointestinal cancer surgery.</p><p><b>Methods:</b> Data of patients who underwent gastrointestinal cancer surgery during 2011–2022 and fasted for 7 days or longer after surgery were extracted from a nationwide medical claims database. The patients were divided into 3 groups based on the mean prescribed parenteral energy doses during 7 days after surgery as follows: the very low group (<10 kcal/kg/day), the low group (10–20 kcal/kg/day), and the moderate group (≥20 kcal/kg/day). Multivariable logistic regression model analysis was performed using in-hospital mortality, postoperative complications, length of hospital stay, and total in-hospital medical cost as the objective variables and the 3 groups and confounding factors as the explanatory variables.</p><p><b>Results:</b> Of the 18,294 study patients, the number of patients in the very low, low, and moderate groups was 6,727, 9,760, and 1,807, respectively.
The median prescribed energy doses on the 7th day after surgery were 9.2 kcal/kg, 16 kcal/kg, and 27 kcal/kg in the very low, low, and moderate groups, respectively. The adjusted odds ratio (95% confidence interval) for in-hospital mortality with reference to the very low group was 1.060 (1.057–1.062) for the low group and 1.281 (1.275–1.287) for the moderate group. That of postoperative complications was 1.030 (0.940–1.128) and 0.982 (0.842–1.144) for the low and moderate groups, respectively. The partial regression coefficient (95% confidence interval) for length of hospital stay (day) with reference to the very low group was 2.0 (0.7–3.3) and 3.2 (1.0–5.5), and that of total in-hospital medical cost (US$) was 1,220 (705–1,735) and 2,000 (1,136–2,864), for the low and moderate groups, respectively.</p><p><b>Conclusion:</b> Contrary to the guideline recommendation, prescribed energy doses of ≥10 kcal/kg/day were associated with increased in-hospital mortality, length of hospital stay, and total in-hospital medical cost. Our findings question the efficacy of the guideline-recommended energy intake for patients during the first 7 days after gastrointestinal cancer surgery.</p><p>Jayme Scali, BS<sup>1</sup>; Gaby Luna, BS<sup>2</sup>; Kristi Griggs, MSN, FNP-C, CRNI<sup>3</sup>; Kristie Jesionek, MPS, RDN, LDN<sup>4</sup>; Christina Ritchey, MS, RD, LD, CNSC, FASPEN, FNHIA<sup>5</sup></p><p><sup>1</sup>Optum Infusion Pharmacy, Thornton, PA; <sup>2</sup>Optum Infusion Pharmacy, Milford, MA; <sup>3</sup>Optum Infusion Pharmacy, Murphy, NC; <sup>4</sup>Optum Infusion Pharmacy, Franklin, TN; <sup>5</sup>Optum Infusion Pharmacy, Bulverde, TX</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Home parenteral nutrition (HPN) is a life-sustaining nutrition support therapy that is administered through a central venous access device (CVAD).
Caregivers of children on HPN are initially trained to provide CVAD care and therapy administration by their clinical team or home infusion nurse. Often there is a gap in training when the patient is ready to assume responsibility for CVAD care. Without proper training, patients are at significant risk of complications such as bloodstream infections and catheter occlusions. The purpose of this study was twofold: 1) to explore caregivers' perspectives on current and future CVAD training practices and 2) to evaluate the need for a proactive formalized CVAD training program when care is transitioned from caregiver to patient.</p><p><b>Methods:</b> An 8-question survey was created using an online software tool. The target audience included caregivers of children receiving HPN. A link to the survey was sent via email and posted on various social media platforms that support the HPN community. The survey was conducted from June 17 to July 18, 2024. Respondents who were not caregivers of a child receiving HPN via a CVAD were excluded.</p><p><b>Results:</b> The survey received 114 responses, but only 86 were included in the analysis after applying the exclusion criteria. The distribution of children with a CVAD receiving HPN was evenly weighted between 0 and 18 years of age. The majority of the time, initial training regarding HPN therapy and CVAD care was conducted by the HPN clinic/hospital resource/learning center or home infusion pharmacy (Table 1). Forty-eight percent of respondents indicated their HPN team never offers reeducation or shares best practices (Figure 1). Most respondents selected the caregiver as the best individual to train their child on CVAD care and safety (Figure 2).
In addition, 60% of respondents indicated they would want their child to participate in CVAD training if it were offered (Figure 3).</p><p><b>Conclusion:</b> This survey confirms that most caregivers anticipate training their child to perform CVAD care when it is determined the child is ready for this responsibility. One challenge to this provision of training is that almost half of the respondents in this survey stated they never receive reeducation or best practice recommendations from their team. This finding demonstrates a need for a formalized training program to assist caregivers when transitioning CVAD care to the patient. Since most respondents reported relying on their intestinal rehab or GI/motility clinic for CVAD-related concerns, these centers would be the best place to establish a transition training program. A limitation of the study is that the survey was only distributed via select social platforms, so users outside of these platforms were not captured. Additional studies would be beneficial in helping to determine the best sequence and cadence for content training.</p><p><b>Table 1.</b> Central Venous Access Device (CVAD) Training and Support Practices.</p><p></p><p></p><p><b>Figure 1.</b> How Often Does Your HPN Team Offer Reeducation or Share Best Practices?</p><p></p><p><b>Figure 2.</b> Who is Best to Train Your Child on CVAD Care Management and Safety?</p><p></p><p><b>Figure 3.</b> If Formalized CVAD Training is Offered, Would You Want Your Child to Participate?</p><p>Laryssa Grguric, MS, RDN, LDN, CNSC<sup>1</sup>; Elena Stoyanova, MSN, RN<sup>2</sup>; Crystal Wilkinson, PharmD<sup>3</sup>; Emma Tillman, PharmD, PhD<sup>4</sup></p><p><sup>1</sup>Nutrishare, Tamarac, FL; <sup>2</sup>Nutrishare, Kansas City, MO; <sup>3</sup>Nutrishare, San Diego, CA; <sup>4</sup>Indiana University, Carmel, IN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Long-term parenteral nutrition (LTPN) within the home is a lifeline for many patients
throughout the United States. Patients utilize central venous access devices (CVAD) to administer LTPN. Central line-associated bloodstream infection (CLABSI) is a serious risk for patients who require LTPN. Rates of CLABSI in LTPN populations range from 0.9 to 1.1 per 1000 catheter days. The aim of this study was to determine the incidence of CLABSI in a cohort of patients serviced by a national home infusion provider specializing in LTPN and identify variables associated with an increased incidence of CLABSI.</p><p><b>Methods:</b> Electronic medical records of LTPN patients with intestinal failure were retrospectively queried from March 2023 to May 2024 for patient demographics, anthropometric data, nursing utilization, parenteral nutrition prescription including lipid type, length of therapy use, geographic distribution, prescriber specialty, history of CLABSI, blood culture results as available, and use of ethanol lock. Patient zip codes were used to determine rural health areas, as defined by the US Department of Health & Human Services. Patients were divided into two groups: 1) patients who had at least one CLABSI and 2) patients with no CLABSI during the study period. Demographic and clinical variables were compared between the two groups. Nominal data were analyzed by Fisher's exact test; continuous data were analyzed with Student's t-test if normally distributed and the Mann-Whitney U-test if non-normally distributed.</p><p><b>Results:</b> We identified 198 persons who were maintained on LTPN during the study period. The overall CLABSI rate for this cohort during the study period was 0.49 per 1000 catheter days. Forty-four persons with LTPN had one or more CLABSI and 154 persons with LTPN did not have a CLABSI during the study period.
Persons who experienced CLABSI weighed significantly more, had fewer days of infusing injectable lipid emulsions (ILE), and had a shorter catheter dwell duration compared to those who did not have a CLABSI (Table 1). There was no significant difference between the CLABSI and no CLABSI groups in the length of time on LTPN, location of consumer (rural versus non-rural), utilization of home health services, number of days parenteral nutrition (PN) was infused, or use of ethanol locks (Table 1).</p><p><b>Conclusion:</b> In this retrospective cohort study, we report a CLABSI rate of 0.49 per 1000 catheter days, which is lower than previously published CLABSI rates for similar patient populations. Patient weight, days of infusing ILE, and catheter dwell duration were significantly different between those who did and did not have a CLABSI during the study period. Yet, variables such as use of ethanol lock and proximity to care providers, which had previously been reported to impact CLABSI, were not significantly different in this cohort. An expanded study with more LTPN patients or a longer study duration may be necessary to confirm these results and their impact on CLABSI rates.</p><p><b>Table 1.</b> Long Term Parenteral Nutrition (LTPN) Characteristics.</p><p></p><p>Silvia Figueiroa, MS, RD, CNSC<sup>1</sup>; Stacie Townsend, MS, RD, CNSC<sup>2</sup></p><p><sup>1</sup>MedStar Washington Hospital Center, Bethesda, MD; <sup>2</sup>National Institutes of Health, Bethesda, MD</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> In hospitalized patients, lipid emulsions constitute an essential component of balanced parenteral nutrition (PN). Soy oil-based lipid injectable emulsions (SO-ILE) were traditionally administered as part of PN formulations primarily as a source of energy and for prevention of essential fatty acid deficiency.
Contemporary practice has evolved to incorporate mixtures of different lipid emulsions, including a combination of soy, MCT, olive, and fish oils (SO, MCT, OO, FO-ILE). Evidence suggests that the use of SO, MCT, OO, FO-ILEs may alter essential fatty acid profiles, impacting hepatic metabolism and other processes associated with clinical benefits. The aim of this project was to compare essential fatty acid profile (EFAP), triglycerides (TGL), liver function tests (LFTs), and total bilirubin (TB) levels in adult patients receiving parenteral nutrition with SO-ILE or SO, MCT, OO, FO-ILE in a unique hospital entirely dedicated to conducting clinical research.</p><p><b>Methods:</b> This was a retrospective chart review from 1/1/2019 to 12/31/2023 of adult patients in our hospital who received PN with SO-ILE or SO, MCT, OO, FO-ILE and had EFAP assessed after 7 days of receiving ILE. Data included demographic, clinical, and nutritional parameters. Patients with no laboratory markers, those on propofol, and those who received both ILE products in the 7 days prior to collection of EFAP were excluded. Data were statistically analyzed using Fisher's exact tests and Mann-Whitney U tests as appropriate.</p><p><b>Results:</b> A total of 42 patient charts were included (14 SO-ILE; 28 SO, MCT, OO, FO-ILE). Group characteristics can be found in Table 1. Patients on SO-ILE received more ILE (0.84 vs 0.79 g/kg/day, p < 0.0001). TGL levels changed significantly after the start of ILE (p < 0.0001). LFTs were found to be elevated in 57% of patients in the SO-ILE group and 60% in the SO, MCT, OO, FO-ILE group, while TB was increased in 21% and 40% of the patients, respectively (Figure 1). Further analysis showed no significant differences in LFTs and TB between the two groups. Assessment of EFAP revealed a significant difference in the levels of DHA, docosenoic acid, and EPA, which were found to be higher in the group receiving SO, MCT, OO, FO-ILE.
Conversely, significant differences were also observed in the levels of linoleic acid, homo-γ-linolenic acid, and total omega-6 fatty acids, with those being higher in patients administered SO-ILE (Figure 2). No differences were observed between groups regarding the presence of essential fatty acid deficiency, as indicated by the triene:tetraene ratio.</p><p><b>Conclusion:</b> In our sample analysis, LFTs and TB levels did not differ significantly between the SO-ILE and SO, MCT, OO, FO-ILE groups. Increased levels of DHA, docosenoic acid, and EPA were found in the SO, MCT, OO, FO-ILE group, while linoleic acid, homo-γ-linolenic acid, and total omega-6 fatty acids tended to be higher in the SO-ILE group. Although in our sample the SO, MCT, OO, FO-ILE group received a lower dosage of ILE/kg/day, there were no differences in the rate of essential fatty acid deficiency between groups.</p><p><b>Table 1.</b> General Characteristics (N = 42).</p><p></p><p></p><p><b>Figure 1.</b> Liver Function Tests (N = 39).</p><p></p><p><b>Figure 2.</b> Essential Fatty Acid Profile (N = 42).</p><p>Kassandra Samuel, MD, MA<sup>1</sup>; Jody (Lind) Payne, RD, CNSC<sup>2</sup>; Karey Schutte, RD<sup>3</sup>; Kristen Horner, RDN, CNSC<sup>3</sup>; Daniel Yeh, MD, MHPE, FACS, FCCM, FASPEN, CNSC<sup>3</sup></p><p><sup>1</sup>Denver Health, St. Joseph Hospital, Denver, CO; <sup>2</sup>Denver Health, Parker, CO; <sup>3</sup>Denver Health, Denver, CO</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Early nutritional support in hospitalized patients has long been established as a key intervention to improve overall patient outcomes. Patients admitted to the hospital often have barriers to receiving adequate nutrition via the enteral route and may be candidates for parenteral nutrition (PN). Central parenteral nutrition (CPN) requires central access, which has historically led to concerns for central line-associated bloodstream infection (CLABSI).
Obtaining central access can be resource-intensive and may result in treatment delays while awaiting access. Conversely, peripheral parenteral nutrition (PPN) can be delivered without central access. In this quality improvement project, we sought to characterize our PPN utilization at a large urban tertiary hospital.</p><p><b>Methods:</b> We performed a retrospective review of adult inpatients receiving PN at our facility from 1/1/23–12/31/23. Patients were excluded from review if they had PN initiated prior to hospitalization. Demographic information, duration of treatment, timely administration status, and information regarding formula nutrition composition were collected.</p><p><b>Results:</b> A total of 128 inpatients received PN for a total of 1302 PN days. The mean age of these patients was 53.8 years (SD: 17.9) and 65 (50%) were male. Twenty-six (20%) patients received only PPN for a median [IQR] length of 3 [2–4] days, and 61 (48%) patients received only CPN for a median length of 6 [3–10] days. Thirty-nine (30%) patients were started on PPN, with a median time to transition to CPN of 1 [1–3] day and a median total duration of CPN of 8 [5–15.5] days. A small minority of patients received CPN and then transitioned to PPN (2%).</p><p><b>Conclusion:</b> At our institution, PPN is utilized in more than 50% of all inpatient PN, most commonly at PN initiation, with patients eventually transitioning to CPN for a relatively short duration of one to two weeks.
Additional research is required to identify those patients who might avoid central access by increasing PPN volume and macronutrients to provide adequate nutrition therapy.</p><p>Nicole Halton, NP, CNSC<sup>1</sup>; Marion Winkler, PhD, RD, LDN, CNSC, FASPEN<sup>2</sup>; Elizabeth Colgan, MS, RD<sup>3</sup>; Benjamin Hall, MD<sup>4</sup></p><p><sup>1</sup>Brown Surgical Associates, Providence, RI; <sup>2</sup>Department of Surgery and Nutritional Support at Rhode Island Hospital, Providence, RI; <sup>3</sup>Rhode Island Hospital, Providence, RI; <sup>4</sup>Brown Surgical Associates, Brown University School of Medicine, Providence, RI</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Parenteral nutrition (PN) provides adequate nutrition and fluids to patients with impaired gastrointestinal function who cannot meet their nutritional needs orally or enterally. PN requires a venous access device, which carries risks including infection, and the therapy itself is associated with metabolic abnormalities. Monitoring of PN therapy in the hospital setting involves regular blood work, yet improperly collected samples can lead to abnormal laboratory results and unnecessary medical interventions.</p><p><b>Methods:</b> An IRB-exempt quality improvement study was conducted at Rhode Island Hospital by the Surgical Nutrition Service, which manages all adult PN. The purpose of the study was to quantify the occurrence of contaminated blood samples among PN patients between January 1, 2024, and August 31, 2024. Demographic data, venous access device, and PN-related diagnoses were collected. Quantification of contaminated blood specimens was determined per patient, per hospital unit, and adjusted for total PN days. Comparisons were made between serum glucose, potassium, and phosphorus levels from contaminated and redrawn blood samples.
Descriptive data are reported.</p><p><b>Results:</b> 138 patients received PN for a total of 1840 days with a median length of PN therapy of 8 days (IQR 9, range 2–84). The most common vascular access device was a dual-lumen peripherally inserted central catheter. The majority (63%) of patients were referred by surgery teams and received care on surgical floors or critical care units. The most frequent PN-related diagnoses were ileus, gastric or small bowel obstruction, and short bowel syndrome. There were 74 contaminated blood specimens among 42 (30%) patients receiving TPN, a rate of 4% of total patient days. Of 25 nursing units, 64% had at least one occurrence of contaminated blood specimens among TPN patients on that unit. Contaminated samples showed significantly different serum glucose, potassium, and phosphorus compared to redrawn samples (p < 0.001); glucose in contaminated vs redrawn samples was 922 ± 491 vs 129 ± 44 mg/dL; potassium 6.1 ± 1.6 vs 3.9 ± 0.5 mEq/L; phosphorus 4.9 ± 1.2 vs 3.3 ± 0.6 mg/dL. The average time delay between repeated blood samples was 3 hours.</p><p><b>Conclusion:</b> Contaminated blood samples can lead to delays in patient care, discomfort from multiple blood draws, unnecessary medical interventions (insulin administration; discontinuation of PN), delay in placement of timely PN orders, and increased infection risk. Nursing re-education on proper blood sampling techniques is critical for reducing contamination occurrences. All policies and procedures will be reviewed, and an educational program will be implemented. Following this, occurrences of blood contamination during PN will be reassessed.</p><p>Hassan Dashti, PhD, RD<sup>1</sup>; Priyasahi Saravana<sup>1</sup>; Meghan Lau<sup>1</sup></p><p><sup>1</sup>Massachusetts General Hospital, Boston, MA</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> ASN Nutrition 2024.</p><p><b>Publication:</b> Saravana P, Lau M, Dashti HS.
Continuous glucose monitoring in adults with short bowel syndrome receiving overnight infusions of home parenteral nutrition. Eur J Clin Nutr. 2024 Nov 23. doi: 10.1038/s41430-024-01548-z. Online ahead of print. PMID: 39580544.</p><p><b>Financial Support:</b> ASPEN Rhoads Research Foundation.</p><p>Maria Romanova, MD<sup>1</sup>; Azadeh Lankarani-Fard, MD<sup>2</sup></p><p><sup>1</sup>VA Greater Los Angeles Healthcare System, Oak Park, CA; <sup>2</sup>VA Greater Los Angeles Healthcare System, Los Angeles, CA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition is a serious complication of the hospital stay. Parenteral nutrition (PN) is the most sophisticated way of addressing it but requires ongoing monitoring. In our medical center, PN provision is guided by the interdisciplinary Nutrition Support Team (NST). In 2024, we began creating a dashboard to monitor safety and utilization of PN at the Greater Los Angeles VA. Here we discuss the collaborative process of developing the dashboard and its first use.</p><p><b>Methods:</b> A dashboard was constructed using data from the VA electronic health record. The dashboard used Microsoft Power BI technology to customize data visualization. The NST group worked closely with the Data Analytics team at the facility to modify and validate the dashboard to accommodate the needs of the group. The dashboard was maintained behind a VA firewall and only accessible to members of the NST. The dashboard presented patient-level data for patients for whom a Nutrition Support consult had been placed within the last 2 years.
The variables included the date of admission, date of consult request, the treating specialty at the time of request, demographics, admission diagnosis, discharge diagnosis, number of orders for PPN/TPN, number of blood sugars >200 mg/dL after admission, number of serum phosphorus values < 2.5 mg/dL, number of serum potassium values < 3.5 mmol/L, any discharge diagnosis of refeeding (ICD 10 E87.8), micronutrient levels during admission, and any discharge diagnosis of infection. The ICD10 codes used to capture infection were for: bacteremia (R78.81), sepsis (A41.*), or catheter associated line infection (ICD10 = T80.211*). The asterisk (*) denotes any number in that ICD10 classification. The dashboard was updated once a week. The NST reviewed the dashboard to validate the information and refine it as needed.</p><p><b>Results:</b> The initial data extraction noted duplicate consult requests as patients changed treating specialties during the same admission, and duplicate orders for PPN/TPN as the formulations were frequently modified before administration. The Data Analytics team worked to reduce these duplicates. The NST also collaborated with the Data Analytics team to modify their existing documentation to better capture the data needed going forward. Dashboard data was verified by direct chart review. Between April 2022 and April 2024, 68 consults were placed from the acute care setting and 58 patients received PPN or TPN during this time period. Thirty-five patients experienced hyperglycemia. Two patients were deemed to have experienced refeeding at the time of discharge. Fourteen episodes of infection were noted in those who received PPN/TPN, but the etiology was unclear from the dashboard alone and required additional chart review.</p><p><b>Conclusion:</b> A dashboard can facilitate monitoring of Nutrition Support services in the hospital.
Refinement of the dashboard requires collaboration between the clinical team and the data analytics team to ensure validity and workload capture.</p><p>Michael Fourkas, MS<sup>1</sup>; Julia Rasooly, MS<sup>1</sup>; Gregory Schears, MD<sup>2</sup></p><p><sup>1</sup>PuraCath Medical Inc., Newark, CA; <sup>2</sup>Department of Anesthesiology and Perioperative Medicine, Mayo Clinic, Rochester, MN</p><p><b>Financial Support:</b> Funding of the study has been provided by Puracath Medical.</p><p><b>Background:</b> Intravenous catheters can provide venous access for drug and nutrition delivery in patients for extended periods of time, but risk the occurrence of central line associated bloodstream infections (CLABSI) due to inadequate asepsis. Needleless connectors (NC), which provide access for injection of medications, are known to be one of the major sources of contamination. Studies demonstrate that current methods of disinfecting connectors such as a 15 second antiseptic wipe do not guarantee complete disinfection inside of connectors. With the rise of superbugs such as Candida auris, there is an urgent need for better aseptic technique compliance and non-antibiotic disinfection methods. Ultraviolet light-C (UV-C) is an established technology that is commonly used in hospital settings for disinfection of equipment and rooms. In this study, we investigate the efficacy of our novel UV-C light disinfection device on UV-C light-transmissive NCs inoculated with common CLABSI-associated organisms.</p><p><b>Methods:</b> Staphylococcus aureus (ATCC #6538), Candida albicans (ATCC #10231), Candida auris (CDC B11903), Escherichia coli (ATCC #8739), Pseudomonas aeruginosa (ATCC #9027), and Staphylococcus epidermidis (ATCC #12228) were used as test organisms for this study. A total of 29 NC samples were tested for each organism with 3 positive controls and 1 negative control. 
Each UV-C light-transmissive NC was inoculated with 10 µl of cultured inoculum (7.00-7.66 log) and exposed to an average of 48 mW/cm2 of UV light for 1 second using Firefly™, our in-house UV light disinfection device. After UV disinfection, 10 mL of 0.9% saline solution was flushed through the NC and filtered through a 0.45 µm membrane. The membrane filter was plated onto an agar medium matched to the organism and was incubated overnight at 37°C for S. aureus, E. coli, S. epidermidis, and P. aeruginosa, and for two days at room temperature for C. albicans and C. auris. Positive controls followed the same procedure without exposure to UV light and were diluted 100x before being spread onto agar plates in triplicate. The negative controls followed the same procedure without inoculation. After the plates were incubated, the number of colonies on each plate was counted and recorded. Log reduction was calculated as the log10 of the positive control concentration divided by the sample concentration in cfu/mL. A value of 1 cfu/10 mL was used in calculations for complete kills.</p><p><b>Results:</b> Using our UV light generating device, we achieved an average log reduction greater than 4 and complete kills for all test organisms. The log reductions for S. aureus, C. albicans, C. auris, E. coli, P. aeruginosa, and S. epidermidis were 5.29, 5.73, 5.05, 5.24, 5.10, and 5.19, respectively.</p><p><b>Conclusion:</b> We demonstrated greater than 4-log reduction in common CLABSI-associated organisms using our UV light disinfection device and UV-C transmissive NCs. By injecting inoculum directly inside the NC, we demonstrated that disinfection inside NCs can be achieved, which is not possible with conventional scrubbing methods.
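The log-reduction arithmetic described in the Methods can be sketched as follows (a minimal illustration; the function name and example concentrations are ours, not the authors', and the 1 cfu/10 mL floor for complete kills follows the abstract):

```python
import math

def log_reduction(control_cfu_per_ml, sample_cfu_per_ml, floor_cfu_per_ml=0.1):
    """Log10 reduction: positive-control concentration over sample concentration.

    For complete kills (no recovered colonies), the sample concentration is
    floored at 1 cfu per 10 mL flush (0.1 cfu/mL), as described in the Methods.
    """
    sample = max(sample_cfu_per_ml, floor_cfu_per_ml)
    return math.log10(control_cfu_per_ml / sample)

# Hypothetical example: positive control at 1.5e4 cfu/mL, complete kill in sample
print(round(log_reduction(1.5e4, 0.0), 2))  # → 5.18
```

With the floor in place, a complete kill yields a finite log reduction determined by the positive-control concentration, which is why higher inoculum titers permit demonstration of larger reductions.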
A one-second NC disinfection time will allow less disruption of hospital workflow, particularly in intensive care units, where highly effective and efficient disinfection is essential for adoption of the technology.</p><p><b>Table 1.</b> Log Reduction of Tested Organisms After Exposure to 48 mW/cm2 UV-C for 1 Second.</p><p></p><p>Yaiseli Figueredo, PharmD<sup>1</sup></p><p><sup>1</sup>University of Miami Hospital, Miami, FL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Octreotide belongs to the somatostatin analog class. It is used off-label for malignant bowel obstructions (MBO). Somatostatin analogs (SSA) inhibit the release and action of multiple hormones, reducing gastric secretions, peristalsis, and splanchnic blood flow while enhancing water and electrolyte absorption. National Comprehensive Cancer Network (NCCN) guidelines recommend octreotide 100-300 mcg subcutaneous two to three times a day or 10-40 mcg/hour continuous infusion for the management of malignant bowel obstructions; if prognosis is greater than 8 weeks, consider long-acting release (LAR) or depot injection. Using octreotide as an additive to parenteral nutrition solutions has been debated due to concerns about formation of a glycosyl octreotide conjugate that may decrease octreotide's efficacy. However, other compatibility studies have found little octreotide loss over 48 hours in TPN solutions at room temperature in ambient room light. At the University of Miami Hospital, it is standard practice to use octreotide as an additive to total parenteral nutrition (TPN) solutions to reduce gastrointestinal secretions in patients with malignant bowel obstructions. The starting dose is 300 mcg, and the dose is increased in 300 mcg increments to a maximum dose of 900 mcg if output remains uncontrolled/elevated.
The objective of this study is to evaluate the effectiveness and safety of octreotide as an additive to TPN in hospitalized patients with malignant bowel obstructions.</p><p><b>Methods:</b> A three-year retrospective chart review (June 2021-June 2024) was conducted to evaluate the effectiveness and safety of octreotide as an additive to TPN in hospitalized patients with an MBO diagnosis at UMH. The following information was obtained from chart review: age, gender, oncologic diagnosis, TPN indication, TPN dependency, octreotide doses used, baseline and final gastrointestinal secretion output recorded, type of venting gastrostomy in place, length of hospital stay, and baseline and final hepatic function tests.</p><p><b>Results:</b> A total of 27 patients were identified with malignant bowel obstruction requiring TPN with octreotide as an additive. All patients were started on octreotide 300 mcg/day added to a 2-in-1 TPN solution. Gastrointestinal secretion output was reduced on average by 65% among all patients, with a final average daily output of 540 mL recorded. The baseline average output recorded was 1,518 mL/day. The average length of treatment as an inpatient was 23 days (range 3-98 days). Liver function tests (LFTs) were assessed at baseline and at the last inpatient value available for the admission. Four of the 27 patients (15%) reviewed were observed to have a significant rise in liver enzymes, greater than three times the upper limit of normal.</p><p><b>Conclusion:</b> Octreotide represents a valuable addition to the limited pharmacological options for managing malignant bowel obstruction. Its ability to reduce gastrointestinal secretions by 65% on average, as observed in this retrospective chart review, can significantly alleviate symptoms and improve patient care.
Using octreotide as an additive to TPN solutions for patients with malignant bowel obstructions who are TPN dependent reduces the number of infusions or subcutaneous injections patients receive per day. According to octreotide's package insert, the incidence of hepato-biliary complications is up to 63%. The finding that 15% of patients in this retrospective chart review had significant liver enzyme elevations remains an important monitoring parameter to evaluate.</p><p>Pavel Tesinsky, Assoc. Prof., MUDr.<sup>1</sup>; Jan Gojda, Prof., MUDr, PhD<sup>2</sup>; Petr Wohl, MUDr, PhD<sup>3</sup>; Katerina Koudelkova, MUDr<sup>4</sup></p><p><sup>1</sup>Department of Medicine, Prague, Hlavni mesto Praha; <sup>2</sup>Department of Medicine, University Hospital, 3rd Faculty of Medicine Charles University in Prague, Praha, Hlavni mesto Praha; <sup>3</sup>Institute for Clinical and Experimental Medicine, Prague, Hlavni mesto Praha; <sup>4</sup>Department of Medicine, University Hospital and 3rd Faculty of Medicine in Prague, Prague, Hlavni mesto Praha</p><p><b>Financial Support:</b> The Registry was supported by Takeda and Baxter scientific grants.</p><p><b>Background:</b> We report trends in indications, syndromes, performance, weaning, and complications of patients on total HPN, based on an updated 30-year analysis and stratification of patients on home parenteral nutrition (HPN) in the Czech Republic.</p><p><b>Methods:</b> Records from the HPN National Registry were analysed for the time period 2007 – 2023, based on the data from the HPN centers. Catheter related sepsis (CRS), catheter occlusions, and thrombotic complications were analyzed for time–to–event using the competing-risks regression (Fine and Gray) model. Other data are presented as median or mean with 95% CI (p < 0.05 as significant).</p><p><b>Results:</b> The incidence rate of HPN is 1.98 per 100,000 inhabitants (population 10.5 million).
Lifetime dependency is expected in 20% of patients, potential weaning in 40%, and 40% of patients are palliative. Out of 1838 records representing almost 1.5 million catheter days, short bowel syndrome was present in 672 patients (36.6%), intestinal obstruction in 531 patients (28.9%), malabsorption in 274 patients (14.9%), and the remaining 361 patients (19.6%) were split among fistulas, dysphagia, or remained unspecified. The majority of SBS were type I (57.8%) and II (20.8%). Mean length of residual intestine was 104.3 cm (35.9 - 173.4 cm), with longer remnants in type I SBS. Dominant indications for HPN were pseudoobstruction (35.8%), non-malignant surgical conditions (8.9%), Crohn disease (7.3%), and mesenteric occlusion (6.8%). Mobility for a substantial part of the day was reported by 77.8% of HPN patients, and economic activity and independence by 162 (24.8%) of 653 patients with work potential. A tunneled catheter was primarily used in 49.1%, PICC in 24.3%, and IV port in 19.8% of patients. Commercially prepared bags were used in 69.7%, and pharmacy-prepared admixtures in 24.7% of patients. A total of 66.9% of patients were administered 1 bag per day, 7 days a week. The sepsis ratio per 1000 catheter days decreased from 0.84 in 2013 to 0.15 in 2022. The catheter occlusion ratio decreased from 0.152 to 0.10 per 1000 catheter days, and the thrombotic complication ratio from 0.05 to 0.04. Prevalence of metabolic bone disease is 15.6%, and prevalence of PNALD is 22.3%. In the first 12 months, 28% of patients achieved intestinal autonomy, increasing to 45% after 5 years. Patient survival is 62% in the first year, 45% at 5 years, and 35% at the 10-year mark. Teduglutide has been indicated in 36 patients to date, with reduction of the daily HPN volume to 60.3% on average.</p><p><b>Conclusion:</b> The prevalence of HPN patients in the Czech Republic has been increasing over the past ten years, corresponding to the incidence rate.
The majority of patients are expected to terminate HPN within the first year. The risk of CRS decreased significantly in the past five years and remains low, while catheter occlusion and thrombotic complications have a stable trend. Teduglutide significantly reduced the required IV volume.</p><p></p><p><b>Figure 1.</b> Per-Year Prevalence of HPN Patients and Average Number of Catheter Days Per Patient (2007 - 2022).</p><p></p><p><b>Figure 2.</b> Annual Incidence of HPN Patients (2007 - 2022).</p><p></p><p><b>Figure 3.</b> Catheter Related Bloodstream Infections (Events per 1,000 Catheter-Days).</p><p>Jill Murphree, MS, RD, CNSC, LDN<sup>1</sup>; Anne Ammons, RD, LDN, CNSC<sup>2</sup>; Vanessa Kumpf, PharmD, BCNSP, FASPEN<sup>2</sup>; Dawn Adams, MD, MS, CNSC<sup>2</sup></p><p><sup>1</sup>Vanderbilt University Medical Center, Nashville, TN; <sup>2</sup>Vanderbilt University Medical Center, Nashville, TN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Determining macronutrient goals in patients requiring home parenteral nutrition (HPN) can be difficult due to various factors. While indirect calorimetry is the gold standard for measuring energy expenditure, it is not readily available in the outpatient setting. Therefore, clinicians typically rely on less accurate weight-based equations for assessment of protein and energy requirements. Energy goals are also impacted by the targeted desire for weight loss, weight gain, or weight maintenance. Patients receiving HPN may consume some oral dietary intake and experience variable degrees of macronutrient absorption. These factors, as well as underlying clinical conditions, can significantly impact protein and energy requirements and may change over the course of HPN therapy.
The purpose of this study was to evaluate the range of protein and energy doses prescribed in patients receiving HPN managed by an interdisciplinary intestinal failure clinic at a large academic medical center.</p><p><b>Methods:</b> Patient demographics, including age, gender, and PN indication/diagnosis, were retrospectively obtained for all patients discharged home with PN between May 2021 and May 2023 utilizing an HPN patient database. Additional information was extracted from the electronic medical record at the start of HPN, then at 2-week, 2 to 3 month, and 6-month intervals following discharge home, and included height, actual weight, target weight, HPN energy dose, HPN protein dose, and whether the patient was eating. Data collection ended at completion of HPN therapy or at up to 6 months of HPN. All data was entered and stored in an electronic database.</p><p><b>Results:</b> During the study period, 248 patients were started on HPN and 56 of these patients received HPN for at least 6 months. Patient demographics are included in Table 1. At the start of HPN, prescribed energy doses ranged from 344 to 2805 kcal/d (6 kcal/kg/d to 45 kcal/kg/d) and prescribed protein doses ranged from 35 to 190 g/d (0.6 g/kg/d to 2.1 g/kg/d). There continued to be a broad range of prescribed energy and protein doses at the 2-week, 2 to 3 month, and 6-month intervals of HPN. Figures 1 and 2 provide the prescribed energy and protein doses for all patients and for those who were eating and not eating. For patients not eating, the prescribed range for energy was 970 to 2791 kcal/d (8 kcal/kg/d to 45 kcal/kg/d) and for protein was 40 to 190 g/d (0.6 g/kg/d to 2.0 g/kg/d) at the start of PN therapy. The difference between actual weight and target weight was assessed at each study interval.
Over the study period, patients demonstrated a decrease in the difference between actual and target weight, suggesting improvement in reaching target weight (Figure 3).</p><p><b>Conclusion:</b> The results of this study demonstrate a wide range of energy and protein doses prescribed in patients receiving HPN. This differs from use of PN in the inpatient setting, where weight-based macronutrient goals tend to be more defined. Macronutrient adjustments may be necessary in the long-term setting when patients are consuming oral intake, for achievement/maintenance of target weight, or for changes in underlying conditions. Patients receiving HPN require an individualized approach to care that can be provided by interdisciplinary nutrition support teams specializing in intestinal failure.</p><p><b>Table 1.</b> Patient Demographics Over 6-Month Study Period.</p><p></p><p></p><p><b>Figure 1.</b> Parenteral Nutrition (PN) Energy Range.</p><p></p><p><b>Figure 2.</b> Parenteral Nutrition (PN) Protein Range.</p><p></p><p><b>Figure 3.</b> Difference Between Actual Weight and Target Weight.</p><p>Jennifer Lachnicht, RD, CNSC<sup>1</sup>; Christine Miller, PharmD<sup>2</sup>; Jessica Younkman, RD, CNSC<sup>2</sup></p><p><sup>1</sup>Soleo Home Infusion, Frisco, TX; <sup>2</sup>Soleo Health, Frisco, TX</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Parenteral nutrition (PN) has been initiated at home since the 1990s, though some clinicians prefer hospital initiation due to risks like refeeding syndrome (RS). A key factor in successful home PN initiation is careful evaluation by an experienced nutrition support clinician, particularly assessing RS risk. In 2020, ASPEN published consensus recommendations for identifying patients at risk for RS and guidelines for initiating and advancing nutrition.
Home PN initiation offers advantages, including avoiding hospitalization, reducing healthcare costs, minimizing hospital-acquired infections, and improving quality of life. Literature suggests a savings of $2,000 per day when PN is started at home. This study aims to determine the risk and incidence of RS in patients who began PN at home, based on the 2020 ASPEN Consensus Recommendations for RS.</p><p><b>Methods:</b> A national home infusion provider's nutrition support service reviewed medical records for 27 adult patients who initiated home PN between September 2022 and June 2024. Patients were evaluated for RS risk before PN initiation and the actual incidence of RS based on pre- and post-feeding phosphorus, potassium, and magnesium levels using ASPEN 2020 criteria. Initial lab work was obtained before and after PN initiation. Refeeding risk was categorized as mild, moderate, moderate to severe, or severe based on initial nutrition assessment, including BMI, weight loss history, recent caloric intake, pre-feeding lab abnormalities, fat/muscle wasting, and high-risk comorbidities. The percent change in phosphorus, potassium, and magnesium was evaluated and categorized as mild, moderate, or severe if levels decreased after therapy start. Initial PN prescriptions included multivitamins and supplemental thiamin per provider policy and consensus recommendations.</p><p><b>Results:</b> The average baseline BMI for the study population was 19.6 kg/m² (range 12.7-31.8, median 18.9). Weight loss was reported in 88.9% of patients, averaging 22%. Little to no oral intake at least 5-10 days before assessment was reported in 92.3% of patients. Initial lab work was obtained within 5 days of therapy start in 96.2% of cases, with 18.5% showing low prefeeding electrolytes. 100% had high-risk comorbidities. RS risk was categorized as mild (4%), moderate (48%), moderate to severe (11%), and severe (37%). Home PN was successfully initiated in 25 patients (93%). 
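The percent-decrease grading applied to post-initiation electrolytes (decreases ≥10% flagged; 10-20% mild, 20-30% moderate, >30% severe) can be sketched as a small helper. This is an illustrative sketch only, not the authors' tooling; the function name and example lab values are assumptions:

```python
def refeeding_drop_severity(baseline, post):
    """Grade an electrolyte decrease after PN initiation, per the categories
    reported: <10% drop is not flagged (returns None); 10-20% is mild,
    20-30% moderate, >30% severe. Illustrative helper, not the authors' code."""
    if baseline <= 0:
        raise ValueError("baseline must be positive")
    drop_pct = (baseline - post) / baseline * 100
    if drop_pct < 10:
        return None
    if drop_pct < 20:
        return "mild"
    if drop_pct < 30:
        return "moderate"
    return "severe"

# Hypothetical labs: phosphorus falling from 3.6 to 2.4 mg/dL (a 33% drop)
print(refeeding_drop_severity(3.6, 2.4))  # → severe
```

Grading on relative change rather than absolute thresholds flags refeeding shifts even when the follow-up value is still within the reference range.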
Two patients could not start PN at home: one due to persistently low pre-PN electrolyte levels despite IV repletion, and one due to not meeting home care admission criteria. Starting dextrose averaged 87.2 g/d (range: 50-120, median 100). Average total starting calories were 730 kcal/d, representing 12.5 kcal/kg (range: 9-20, median 12). Initial PN formula electrolyte content included potassium (average 55.4 mEq/d, range: 15-69, median 60), magnesium (average 11.6 mEq/d, range: 4-16, median 12), and phosphorus (average 15.6 mmol/d, range: 8-30, median 15). Labs were drawn on average 4.3 days after therapy start. Potassium, magnesium, and phosphorus levels were monitored for decreases ≥10% of baseline to detect RS. Decreases in magnesium and in potassium were each classified as mild (10-20%) and each occurred in 4% of patients. Eight patients (32%) had a ≥10% decrease in phosphorus: 4 mild (10-20%), 2 moderate (20-30%), and 2 severe (>30%).</p><p><b>Conclusion:</b> Home initiation of PN can be safely implemented with careful monitoring and evaluation of RS risk. This review showed a low incidence of RS based on ASPEN criteria, even in patients at moderate to severe risk prior to home PN initiation. Close monitoring of labs and patient status, along with adherence to initial prescription recommendations, resulted in successful PN initiation at home in 93% of patients.</p><p>Dana Finke, MS, RD, CNSC<sup>1</sup>; Christine Miller, PharmD<sup>1</sup>; Paige Paswaters, RD, CNSC<sup>1</sup>; Jessica Younkman, RD, CNSC<sup>1</sup></p><p><sup>1</sup>Soleo Health, Frisco, TX</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Iron deficiency anemia is common in home parenteral nutrition (HPN) patients (Hwa et al., 2016). Post iron infusion, hypophosphatemia due to fibroblast growth factor 23 is a known side effect of some IV iron formulations (Wolf et al., 2019).
Ferric carboxymaltose can cause serum phosphorus levels to drop below 2 mg/dL in 27% of patients (Onken et al., 2014). Hypophosphatemia (< 2.7 mg/dL) can lead to neurological, neuromuscular, cardiopulmonary, and hematologic issues, and long-term effects like osteopenia and osteoporosis (Langley et al., 2017). This case series reviews the occurrence and clinical implications of transient serum phosphorus drops in patients receiving ferric carboxymaltose and HPN.</p><p><b>Methods:</b> A retrospective case series review was performed for three patients who were administered ferric carboxymaltose while on HPN therapy. Serum phosphorus levels were measured at baseline prior to initial iron dose, within 1 week post injection, and at subsequent time intervals up to 4 months post initial injection. Data was collected on the timing, magnitude, and duration of any phosphorus decreases, and any associated clinical symptoms or complications. Patient records were also reviewed to evaluate any correlations with HPN composition or phosphorus dosing.</p><p><b>Results:</b> Among the three patients reviewed, all exhibited short-term drops in serum phosphorus levels following ferric carboxymaltose injection (Table 1). All patients had baseline serum phosphorus levels within normal limits prior to the initial dose of ferric carboxymaltose. All cases involved multiple doses of ferric carboxymaltose, which contributed to the fluctuations in phosphorus levels. The average drop in serum phosphorus from baseline to the lowest point was 50.3%. The lowest recorded phosphorus level among these three patients was 1.4 mg/dL, and this was in a patient who received more than two doses of ferric carboxymaltose. In two cases, increases were made in HPN phosphorus in response to serum levels, and in one case no HPN changes were made. However, all serum phosphorus levels returned to normal despite varied interventions. 
Despite the low phosphorus levels, none of the patients reported significant symptoms of hypophosphatemia during the monitoring periods. Ferric carboxymaltose significantly impacts serum phosphorus in HPN patients, consistent with existing literature. The need for vigilant monitoring is highlighted: patients receiving HPN are closely monitored by a trained nutrition support team with frequent lab monitoring, whereas lab monitoring in patients receiving ferric carboxymaltose who are not on HPN may be less common. The lowest level recorded was 1.4 mg/dL, indicating potential severity. Despite significant drops, no clinical symptoms were observed, suggesting that subclinical hypophosphatemia may be common. In two of the reviewed cases, hypophosphatemia was addressed by making incremental increases in the patient's HPN formulas. Note that there are limitations to phosphorus management in HPN due to compatibility and stability issues, and alternative means of supplementation may be necessary depending upon the patient's individual formula.</p><p><b>Conclusion:</b> Three HPN patients receiving ferric carboxymaltose experienced transient, generally mild reductions in serum phosphorus. Monitoring is crucial, but the results from this case series suggest that clinical complications are rare. Adjustments to HPN or additional supplementation may be needed based on individual patient needs, with some cases self-correcting over time.</p><p><b>Table 1.</b> Timeline of Iron Injections and the Resulting Serum Phosphorus Levels and HPN Formula Adjustments.</p><p></p><p>Danial Nadeem, MD<sup>1</sup>; Stephen Adams, MS, RPh, BCNSP<sup>2</sup>; Bryan Snook<sup>2</sup></p><p><sup>1</sup>Geisinger Wyoming Valley, Bloomsburg, PA; <sup>2</sup>Geisinger, Danville, PA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Ferric carboxymaltose (FC) is a widely used intravenous iron formulation, primarily employed in the treatment of iron deficiency.
It offers significant benefits, particularly in cases where oral iron supplementation proves ineffective or is not well-tolerated. However, an important potential adverse effect associated with FC use is hypophosphatemia. This condition has been observed in multiple patients following treatment with FC. This paper discusses the potential mechanisms leading to this adverse effect and its significant implications for patient care.</p><p><b>Methods:</b> A middle-aged female with a history of malnutrition and iron deficiency receiving parenteral nutrition at home had received multiple doses of Venofer in the past, with the last dose given in 2017, to which the patient developed an anaphylactic reaction. She was therefore switched to ferric carboxymaltose (FCM) therapy. However, upon receiving multiple doses of FCM in 2018, the patient developed significant hypophosphatemia. As hypophosphatemia was noted, adjustments were made to the patient's total parenteral nutrition (TPN) regimen to increase the total phosphorus content in an effort to treat the low phosphate levels. The patient also received continued doses of FCM in subsequent years, with persistent hypophosphatemia despite repletion.</p><p><b>Results:</b> Ferric carboxymaltose (FC) is a widely used intravenous iron therapy for the treatment of iron deficiency. It is particularly beneficial in cases where oral iron supplementation is ineffective or not tolerated. FC works by delivering iron directly to the macrophages of the reticuloendothelial system. The iron is then released slowly for use by the body, primarily for the production of hemoglobin. However, recent studies have highlighted the potential adverse effect of hypophosphatemia associated with the use of FC. Hypophosphatemia induced by FC is thought to be caused by an increase in the secretion of the hormone fibroblast growth factor 23 (FGF23). FGF23 is a hormone that regulates phosphate homeostasis.
When FGF23 levels rise, the kidneys increase the excretion of phosphate, leading to lower levels of phosphate in the blood. Hypophosphatemia has many implications for patient care. Symptoms of hypophosphatemia can include muscle weakness, fatigue, bone pain, and confusion. In severe cases, persistent hypophosphatemia can lead to serious complications such as rhabdomyolysis, hemolysis, respiratory failure, and even death. Therefore, it is crucial for clinicians to be aware of the potential risk of hypophosphatemia when administering FC. Regular monitoring of serum phosphate levels is recommended in patients receiving repeated doses of FC. Further research is needed to fully understand the mechanisms underlying this adverse effect and to develop strategies for its prevention and management.</p><p><b>Conclusion:</b> In conclusion, while FC is an effective treatment for iron deficiency, it is important for clinicians to be aware of the potential risk of hypophosphatemia. Regular monitoring of serum phosphate levels is recommended in patients receiving repeated doses of FC.
Further research is needed to fully understand the mechanisms underlying this adverse effect and to develop strategies for its prevention and management.</p><p><b>Table 1.</b> Phosphorus Levels and Iron Administration.</p><p></p><p>Table 1 shows the response of serum phosphorus levels in a patient given multiple doses of intravenous iron over time.</p><p>Ashley Voyles, RD, LD, CNSC<sup>1</sup>; Jill Palmer, RD, LD, CNSC<sup>1</sup>; Kristin Gillespie, MD, RD, LDN, CNSC<sup>1</sup>; Suzanne Mack, MS, MPH, RD, LDN, CNSC<sup>1</sup>; Tricia Laglenne, MS, RD, LDN, CNSC<sup>1</sup>; Jessica Monczka, RD, CNSC, FASPEN<sup>1</sup>; Susan Dietz, PharmD, BCSCP<sup>1</sup>; Kathy Martinez, RD, LD<sup>1</sup></p><p><sup>1</sup>Option Care Health, Bannockburn, IL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Home parenteral nutrition (HPN) can be successfully initiated in the home setting with careful evaluation and management by an experienced nutrition support team (NST).<sup>1,2</sup> Safe candidates for HPN initiation include medically stable patients with an appropriate indication, a safe environment, and the means for reliable follow-up. However, some patients are not appropriate to start directly with HPN due to logistical reasons or the risk of refeeding syndrome (RFS).<sup>2</sup> Consensus Recommendations for RFS provide guidance regarding recognizing and managing risk.<sup>3</sup> An experienced NST that provides individualized care can recommend intravenous (IV) hydration before initiating HPN to expedite the initiation of therapy and normalize blood chemistry to mitigate the risk of RFS. The purpose of this study is to evaluate the impact of IV hydration on adult patients managed by a home infusion NST who received IV hydration prior to initiating HPN.
The proportion of patients who received IV hydration prior to HPN, the reason for initiating IV hydration, and the impact this intervention may have had on their care and outcomes will be described.</p><p><b>Methods:</b> This retrospective review includes 200 HPN patients 18 years of age and older initiated on HPN therapy with a national home infusion pharmacy over a 6-month period from January 1, 2024, to June 30, 2024. Data collection included baseline demographics, indication for HPN, risk of RFS, days receiving IV hydration prior to initiating HPN, the number of rehospitalizations within the first 2 weeks, whether they received IV hydration, and if so, the indication for IV hydration and the components of the orders. Data was collected via electronic medical records and deidentified into a standardized data collection form.</p><p><b>Results:</b> Of the 200 total patients, 19 (9.5%) received IV hydration prior to HPN. Of these 19 patients, 16 were female, and 3 were male (Table 1). The most common indications for HPN were bariatric surgery complication (5), intestinal failure (4), and oncology diagnosis (4) (Figure 1). Among these patients, 9 (47%) were at moderate RFS risk and 10 (53%) were at high RFS risk. The indications for IV hydration included 7 (37%) due to electrolyte abnormalities/RFS risk, 5 (26%) due to delay in central line placement, and 7 (37%) due to scheduling delays (Figure 2). IV hydration orders included electrolytes in 15 (79%) of the orders. All orders without electrolytes (4) had an indication related to logistical reasons (Figure 3). All 19 patients started HPN within 7 days of receiving IV hydration. Two were hospitalized within the first two weeks of therapy with admitting diagnoses unrelated to HPN.</p><p><b>Conclusion:</b> In this group of patients, HPN was successfully initiated in the home setting when managed by an experienced NST, preventing unnecessary hospitalizations. 
This study demonstrated that safe initiation of HPN may include a preliminary course of IV hydration, with or without electrolytes, either to mitigate RFS risk or for logistical reasons, provided HPN is started within 7 days. The IV hydration orders were individualized to fit the needs of each patient. These data reflect only IV hydration dispensed through the home infusion pharmacy and do not capture IV hydration received at an outside clinic. This patient population also does not include those deemed medically unstable for HPN or those for whom home initiation was not feasible for other reasons. Future research should account for these limitations and detail IV hydration components, dosing, and frequency of orders.</p><p><b>Table 1.</b> Demographics.</p><p></p><p></p><p><b>Figure 1.</b> HPN Indications of IV Hydration.</p><p></p><p><b>Figure 2.</b> Indication for IV Hydration and Refeeding Risk.</p><p></p><p><b>Figure 3.</b> Indications and Types of IV Hydration.</p><p>Emily Boland Kramer, MS, RD, LDN, CNSC<sup>1</sup>; Jessica Monczka, RD, CNSC, FASPEN<sup>1</sup>; Tricia Laglenne, MS, RD, LDN, CNSC<sup>1</sup>; Ashley Voyles, RD, LD, CNSC<sup>1</sup>; Susan Dietz, PharmD, BCSCP<sup>1</sup>; Kathy Martinez, RD, LD<sup>1</sup></p><p><sup>1</sup>Option Care Health, Bannockburn, IL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Parenteral nutrition (PN) is a critical therapy for patients who are unable to absorb nutrients adequately via the gastrointestinal tract.<sup>1</sup> PN is complex, with 10 or more individually dosed components in each order, which inherently increases the risk for dosing errors.
<sup>2</sup> This study seeks to analyze the PN orders at hospital discharge received by a home infusion provider and identify the incidence of the omission of the standard components, as determined by ASPEN Recommendations on Appropriate Parenteral Nutrition Dosing.<sup>3</sup> The primary objective of this study was to identify missing sodium, potassium, magnesium, calcium, phosphorus, and multivitamin components in hospital discharge orders. The secondary objective was to determine whether identified missing components were added back during the transition of care (TOC) process from hospital to home.</p><p><b>Methods:</b> This multi-center, retrospective chart review analyzed patients referred to a national home infusion provider over a 3-month period. Data was collected from the electronic medical record and internal surveillance software. The Registered Dietitian, Certified Nutrition Support Clinician (RD, CNSC) reviewed all PN hospital discharge orders for patients transitioning clinical management and PN therapy from hospital to home. Inclusion criteria were patients 18 years of age or older with PN providing the majority of their nutritional needs, as defined in Table 1, who were missing sodium, potassium, magnesium, calcium, phosphorus, or multivitamin from their PN order at hospital discharge. Exclusion criteria were patients less than 18 years of age, patients receiving supplemental PN not providing majority of nutritional needs, and patients with doses ordered for all electrolytes and multivitamin.</p><p><b>Results:</b> During the 3-month period (April 1, 2024 to June 30, 2024), 267 patients were identified who were greater than 18 years of age, receiving the majority of their nutritional needs via PN, and missing at least one PN component in their hospital discharge order. See Table 2 and Figure 1 for demographics. 
One hundred seventy-five (65.5%) patients were missing one component and 92 (34.5%) were missing multiple components from the hospital discharge order. One hundred seventy-five (65.5%) patients were missing calcium, 68 (25.5%) phosphorus, 38 (14%) multivitamin, 23 (8.6%) magnesium, 20 (7.5%) potassium, and 20 (7.5%) sodium. During the transition from hospital to home, after discussion with the provider, 94.9% of patients had calcium added back, 94.7% multivitamin, 91.3% magnesium, 90% potassium, 88.2% phosphorus, and 80% sodium.</p><p><b>Conclusion:</b> This study highlights the prevalence of missing components in PN hospital discharge orders, with calcium being the most frequently omitted at a rate of 65.5%. Given that many patients discharging home on PN will require long term therapy, adequate calcium supplementation is essential to prevent bone resorption and complications related to metabolic bone disease. In hospital discharge orders that were identified by the RD, CNSC as missing calcium, 94.9% of the time the provider agreed that it was clinically appropriate to add calcium to the PN order during the TOC process. This underlines the importance of nutrition support clinician review and communication during the transition from hospital to home. 
Future research should analyze the reasons why components are missing from PN orders and increase awareness of the need for a thorough clinical review of all patients going home on PN to ensure the adequacy of all components required for safe and optimized long term PN.</p><p><b>Table 1.</b> Inclusion and Exclusion Criteria.</p><p></p><p><b>Table 2.</b> Demographics.</p><p></p><p></p><p><b>Figure 1.</b> Primary PN Diagnosis.</p><p></p><p><b>Figure 2.</b> Components Missing from Order and Added Back During TOC Process.</p><p>Avi Toiv, MD<sup>1</sup>; Hope O'Brien, BS<sup>2</sup>; Arif Sarowar, MSc<sup>2</sup>; Thomas Pietrowsky, MS, RD<sup>1</sup>; Nemie Beltran, RN<sup>1</sup>; Yakir Muszkat, MD<sup>1</sup>; Syed-Mohammad Jafri, MD<sup>1</sup></p><p><sup>1</sup>Henry Ford Hospital, Detroit, MI; <sup>2</sup>Wayne State University School of Medicine, Detroit, MI</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Intestinal Failure Associated Liver Disease (IFALD) is a known complication in patients reliant on total parenteral nutrition (TPN), especially those awaiting intestinal transplantation. There is concern that IFALD may negatively impact post-transplant outcomes, including graft survival and overall patient mortality. This study aims to evaluate the impact of IFALD, as indicated by liver function test (LFT) abnormalities before intestinal or multivisceral transplant, on transplant outcomes.</p><p><b>Methods:</b> We conducted a retrospective chart review of all patients who underwent intestinal transplantation (IT) at an academic transplant center from 2010 to 2023. The primary outcome was patient survival and graft failure in transplant recipients.</p><p><b>Results:</b> Among 50 IT recipients, 30 (60%) required TPN before transplant. The median age at transplant was 50 years (range, 17-68). In both groups, the majority of transplants were isolated IT, though each group also included multivisceral transplants (MVT).
87% of patients on TPN developed elevated LFTs before transplant. 33% had persistently elevated LFTs 1 year after transplant. TPN-associated liver dysfunction in our cohort was associated with mixed liver injury patterns, with both hepatocellular injury (p < 0.001) and cholestatic injury (p < 0.001). No significant associations were found between TPN-related elevated LFTs and major transplant outcomes, including death (p = 0.856), graft failure (p = 0.144), or acute rejection (p = 0.306). Similarly, no significant difference was observed between elevated LFTs and death (p = 0.855), graft failure (p = 0.769), or acute rejection (p = 0.386) in patients who were not on TPN. TPN-associated liver dysfunction before transplant was associated with elevated LFTs at one year after transplant (p < 0.001) but lacked clinical relevance.</p><p><b>Conclusion:</b> Although IFALD related to TPN use is associated with specific liver dysfunction patterns in patients awaiting intestinal or multivisceral transplants, it does not appear to be associated with significant key transplant outcomes such as graft failure, mortality, or acute rejection. However, it is associated with persistently elevated LFTs even 1 year after transplant. These findings suggest that while TPN-related liver injury is common, it may not have a clinically significant effect on long-term transplant success. Further research is needed to explore the long-term implications of IFALD in this patient population.</p><p>Jody (Lind) Payne, RD, CNSC<sup>1</sup>; Kassandra Samuel, MD, MA<sup>2</sup>; Heather Young, MD<sup>3</sup>; Karey Schutte, RD<sup>3</sup>; Kristen Horner, RDN, CNSC<sup>3</sup>; Daniel Yeh, MD, MHPE, FACS, FCCM, FASPEN, CNSC<sup>3</sup></p><p><sup>1</sup>Denver Health, Parker, CO; <sup>2</sup>Denver Health, St. 
Joseph Hospital, Denver, CO; <sup>3</sup>Denver Health, Denver, CO</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Central line-associated blood stream infection (CLABSI) is associated with increased complications, length of stay and cost of care. The majority of CLABSI studies are focused on home parenteral nutrition (PN) patients and there is a paucity of data documenting the incidence of CLABSI attributable to PN in the inpatient setting. At our institution, we have observed that clinicians are reluctant to initiate PN in patients with clear indications for PN due to concerns about CLABSI. Therefore, we performed a quality improvement project to document our CLABSI incidence rate for new central parenteral nutrition (CPN) initiated during hospitalization.</p><p><b>Methods:</b> We performed a retrospective review of adult inpatients who initiated CPN at our facility from 1/1/23-12/31/23. Patients were excluded if they received PN prior to admission or only received peripheral PN. The National Healthcare Safety Network (NHSN) definitions were used for CLABSI and secondary attribution. A deeper review of CLABSI cases was provided by an Infectious Disease (ID) consultant to determine if positive cases were attributable to CPN vs other causes. The type of venous access for the positive patients was also reviewed.</p><p><b>Results:</b> A total of 106 inpatients received CPN for a total of 1121 CPN days. The median [IQR] length of CPN infusion was 8 [4-14] days. Mean (standard deviation) age of patients receiving CPN infusion was 53.3 (18.6) years and 65 (61%) were men. The CPN patients who met criteria for CLABSI were further reviewed by the ID consultant; only four CLABSI cases were deemed attributable to CPN. These four cases resulted in an incidence rate of 3.6 cases of CLABSI per 1000 CPN days.
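As a quick arithmetic check (illustrative only, using the counts reported in this abstract), the incidence rate follows from attributable cases divided by total CPN days, scaled to 1000 days:

```python
# Illustrative check of the CLABSI incidence-rate arithmetic reported above.
clabsi_cases = 4    # CLABSI cases deemed attributable to CPN
cpn_days = 1121     # total CPN days across the 106 inpatients

# Incidence rate expressed per 1000 CPN days
rate_per_1000 = clabsi_cases / cpn_days * 1000
print(round(rate_per_1000, 1))  # 3.6
```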
Two of these patients were noted for additional causes of infection including gastric ulcer perforation and bowel perforation with anastomotic leak. Three of the patients had CPN infused via a central venous catheter (a port, a femoral line, and a non-tunneled internal jugular catheter) and the fourth patient had CPN infused via a peripherally inserted central catheter. The incidence rate for CLABSI cases per catheter days was not reported in our review.</p><p><b>Conclusion:</b> At our institution, < 4% of patients initiating short-term CPN during hospitalization developed a CLABSI attributable to the CPN. This low rate of infection serves as a benchmark for our institution's quality improvement and quality assurance efforts. Collaboration with ID is recommended for additional, deeper review of CPN patients with CLABSI to determine whether the infection is more likely related to causes other than CPN infusion.</p><p>Julianne Harcombe, RPh<sup>1</sup>; Jana Mammen, PharmD<sup>1</sup>; Hayato Delellis, PharmD<sup>1</sup>; Stefani Billante, PharmD<sup>1</sup></p><p><sup>1</sup>Baycare, St. Joseph's Hospital, Tampa, FL</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> Florida Residency Conference 2023.</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Refeeding syndrome is defined as potentially fatal shifts in fluids and electrolytes that may occur in malnourished patients receiving enteral or parenteral nutrition. When refeeding syndrome occurs, a reduction in phosphorus/magnesium/potassium levels and thiamine deficiency can be seen shortly after initiation of calorie provision. The American Society for Parenteral and Enteral Nutrition guidelines consider hypophosphatemia the hallmark sign of refeeding syndrome; however, magnesium and potassium have been shown to be equally important.
The purpose of this study is to identify the incidence of hypophosphatemia in patients who are at risk of refeeding syndrome and the importance of monitoring phosphorus.</p><p><b>Methods:</b> This study was a multicenter retrospective chart review that was conducted using the BayCare Health System medical records database, Cerner. The study included patients who were 18 years or older, admitted between January 2023 and December 2023, received total parenteral nutrition (TPN) and were at risk of refeeding syndrome. We defined patients at risk of refeeding syndrome as patients who meet two of the following criteria prior to starting the TPN: body mass index (BMI) prior to starting TPN < 18.5 kg/m<sup>2</sup>, 5% weight loss in 1 month, no oral intake for 5 days, or low levels of serum phosphorus/magnesium/potassium. COVID-19 patients and patients receiving propofol were excluded from the study. The primary objective of the study was to evaluate the incidence of hypophosphatemia versus hypomagnesemia versus hypokalemia in patients receiving TPN who were at risk of refeeding syndrome. The secondary objective was to evaluate whether the addition of thiamine upon initiation of TPN showed benefit in the incidence of hypophosphatemia.</p><p><b>Results:</b> A total of 83 patients met the criteria for risk of refeeding syndrome. Out of the 83 patients, a total of 53 patients were used to run a pilot study to determine the sample size and 30 patients were included in the study. The results on day 1 and day 2 suggest the incidence of hypomagnesemia differs from that of hypophosphatemia and hypokalemia, with a notably lower occurrence. The Cochran's Q test yielded χ<sup>2</sup>(2) = 9.57 (p-value = 0.008) on day 1 and χ<sup>2</sup>(2) = 4.77 (p-value = 0.097) on day 2, indicating a difference in at least one group compared to the others on only day 1. A post hoc analysis found a difference on day 1 between the incidence of hypophosphatemia vs hypomagnesemia (30%) and hypomagnesemia vs hypokalemia (33.3%).
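Cochran's Q, the test used above to compare paired binary outcomes (deficient vs not deficient) across the three electrolytes in the same patients, can be sketched in a few lines of plain Python. The data here are hypothetical, not the study's:

```python
def cochrans_q(rows):
    """Cochran's Q statistic for k related binary samples.

    rows: list of per-subject lists of 0/1 outcomes, one column per condition.
    The statistic is compared against a chi-squared distribution with k-1 df.
    """
    k = len(rows[0])                                 # number of conditions
    col_totals = [sum(col) for col in zip(*rows)]    # successes per condition
    row_totals = [sum(r) for r in rows]              # successes per subject
    n_total = sum(col_totals)
    num = (k - 1) * (k * sum(c * c for c in col_totals) - n_total ** 2)
    den = k * n_total - sum(r * r for r in row_totals)
    return num / den

# Hypothetical paired data: 4 patients x 3 electrolytes (1 = deficient)
data = [[1, 1, 0], [1, 0, 0], [1, 1, 1], [1, 0, 0]]
q = cochrans_q(data)
```

For real analyses, `statsmodels.stats.contingency_tables.cochrans_q` provides the same statistic with a p-value.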
For the secondary outcome, the difference in day 2 versus day 1 phosphorus levels with the addition of thiamine in the TPN was 0.073 (p-value = 0.668, 95% CI [-0.266, 0.413]).</p><p><b>Conclusion:</b> Among patients who were on parenteral nutrition and were at risk of refeeding syndrome, there was not a statistically significant difference in the incidence of hypophosphatemia vs hypomagnesemia and hypomagnesemia vs hypokalemia. Among patients who were on parenteral nutrition and were at risk of refeeding syndrome, there was not a statistically significant difference in day 2 phosphorus levels vs day 1 phosphorus levels when thiamine was added.</p><p></p><p></p><p>Jennifer McClelland, MS, RN, FNP-BC<sup>1</sup>; Margaret Murphy, PharmD, BCNSP<sup>1</sup>; Matthew Mixdorf<sup>1</sup>; Alexandra Carey, MD<sup>1</sup></p><p><sup>1</sup>Boston Children's Hospital, Boston, MA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Iron deficiency anemia (IDA) is common in patients with intestinal failure (IF) dependent on parenteral nutrition (PN). Treatment with enteral iron is preferred; however, it may not be tolerated or efficacious. In these cases, intravenous (IV) iron is a suitable alternative. Complications include adverse reactions and infection, though rates are low when using low-molecular-weight (LMW) formulations. In a home PN (HPN) program, an algorithm (Figure 1) was developed to treat IDA utilizing IV iron.</p><p><b>Methods:</b> A retrospective chart review was conducted of patients in a large HPN program (~150 patients annually) who were prescribed IV iron following an algorithm from Jan 2019 to April 2024. Laboratory studies were analyzed looking for instances of ferritin >500 ng/mL indicating potential iron overload, as well as transferrin saturation 12-20% indicating iron sufficiency.
In instances of ferritin levels >500 ng/mL, further review was conducted to understand etiology and clinical significance and to determine whether the IV iron algorithm was adhered to.</p><p><b>Results:</b> HPN patients are diagnosed with IDA based on a low iron panel (low hemoglobin and/or MCV; low ferritin, serum iron, and transferrin saturation; high reticulocyte count; and/or high total iron binding capacity (TIBC)). If the patient can tolerate enteral iron supplementation, a dose of 3-6 mg/kg/day is initiated. If the patient cannot tolerate enteral iron, the IV route is initiated. The initial IV dose is administered in the hospital or infusion center for close monitoring and to establish home maintenance administration after repletion dosing. Iron dextran is preferred as it can be added directly into the PN and run for the duration of the cycle. Addition to the PN eliminates an extra infusion and decreases additional CVC access. Iron dextran is incompatible with IV lipids, so the patient must have one lipid-free day weekly to allow administration. If the patient receives daily IV lipids, iron sucrose is given as a separate infusion from the PN. Maintenance IV iron dosing is 1 mg/kg/week, with dose and frequency titrated based on clinical status, lab studies, and trends. An iron panel and C-reactive protein (CRP) are ordered every 2 months. If lab studies are below the desired range and consistent with IDA, the IV iron dose is increased by 50% by dose or frequency; if studies are over the desired range, the IV iron dose is decreased by 50% by dose or frequency. The maximum home dose is < 3 mg/kg/dose; if a higher dose is needed, the patient is referred to an infusion center. IV iron is suspended if ferritin is >500 ng/mL due to the risk of iron overload and deposition in the liver. Ferritin results (n = 4165) for all patients in the HPN program from January 2019 to April 2024 were reviewed looking for levels >500 ng/mL indicating iron overload.
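The maintenance titration rule described above (adjust by 50%, suspend for ferritin >500 ng/mL, cap home dosing below 3 mg/kg) can be sketched as a small decision function. The function name, arguments, and return values are illustrative, not part of the published algorithm:

```python
def titrate_iv_iron(dose_mg_per_kg, iron_status, ferritin_ng_ml):
    """Sketch of the maintenance IV iron titration rule described above.

    iron_status: 'below', 'within', or 'above' the desired lab range.
    Returns (new dose in mg/kg, disposition string).
    """
    if ferritin_ng_ml > 500:
        # Suspend IV iron: risk of iron overload and hepatic deposition
        return 0.0, "suspend"
    if iron_status == "below":
        dose_mg_per_kg *= 1.5   # increase by 50% (by dose or frequency)
    elif iron_status == "above":
        dose_mg_per_kg *= 0.5   # decrease by 50% (by dose or frequency)
    if dose_mg_per_kg >= 3.0:
        # Maximum home dose is < 3 mg/kg/dose
        return dose_mg_per_kg, "refer to infusion center"
    return dose_mg_per_kg, "continue home dosing"
```

For example, a patient on 1 mg/kg with labs below range would move to 1.5 mg/kg at home, while the same adjustment from 2.5 mg/kg would exceed the home cap and trigger referral.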
Twenty-nine instances of ferritin >500 ng/mL (0.7% of values reviewed) were identified in 14 unique patients on maintenance IV iron. In 9 instances, the high ferritin level occurred with concomitant acute illness with an elevated CRP; elevated ferritin in these cases was thought to be related to an inflammatory state vs. iron overload. In 2 instances, the IV iron dose was given the day before the lab draw, rendering a falsely elevated result. Two patients had 12 instances (0.28% of values reviewed) of elevated ferritin thought to be related to IV iron dosing in the absence of inflammation, with normal CRP levels. During this period, there were no recorded adverse events.</p><p><b>Conclusion:</b> IDA is common in patients with IF dependent on PN. Iron is not a standard component or additive in PN. Use of IV iron in this population can increase quality of life by decreasing the need for admissions, visits to infusion centers, or blood transfusions in cases of severe anemia. IV iron can be safely used for maintenance therapy in HPN patients with appropriate dosing and monitoring.</p><p></p><p><b>Figure 1.</b> Intravenous Iron in the Home Parenteral Nutrition-Dependent Patient Algorithm.</p><p>Lynne Sustersic, MS, RD<sup>1</sup>; Debbie Stevenson, MS, RD, CNSC<sup>2</sup></p><p><sup>1</sup>Amerita Specialty Infusion Services, Thornton, CO; <sup>2</sup>Amerita Specialty Infusion Services, Rochester Hills, MI</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Desmoplastic small round tumor (DSRT) is a soft-tissue sarcoma that causes tumors to form in the abdomen and pelvis. To improve control, cytoreductive surgery with hyperthermic intraperitoneal chemotherapy (CRS/HIPEC) and radiotherapy are often used, which can result in bowel obstruction secondary to sclerosing peritonitis. This necessitates total parenteral nutrition therapy (TPN) due to the inability to consume nutrition orally or enterally.
A major complication of parenteral nutrition therapy is parenteral nutrition associated liver disease (PNALD), and the most common metastasis site for DSRT is the liver. This case report details the substitution of an olive and soy oil-based intravenous lipid emulsion (OO, SO-ILE) for a soy, MCT, olive, fish oil-based intravenous lipid emulsion (SO, MCT, OO, FO-ILE) to treat elevated liver function tests (LFTs).</p><p><b>Methods:</b> A 28-year-old male with DSRT metastatic to the peritoneum and a large hepatic mass, complicated by encapsulating peritonitis and enterocutaneous fistula (ECF) following CRS/HIPEC, presented to the parenteral nutrition program at Amerita Specialty Infusion Services in 2022. TPN was provided from January 2022 to March 2023, stopped, and restarted in December 2023 following a biliary obstruction. TPN was initiated and advanced using 1.3 g/kg/day of SMOFlipid (SO, MCT, OO, FO-ILE), known to help mitigate PNALD. The patient developed rising LFTs, with alanine aminotransferase (ALT) peaking at 445 U/L, aspartate aminotransferase (AST) at 606 U/L, and alkaline phosphatase (ALP) at 1265 U/L. Despite transitioning the patient to a cyclic regimen and maximizing calories from dextrose and amino acids, liver function continued to worsen. A switch to Clinolipid (OO, SO-ILE) at 1.3 g/kg/day was tried.</p><p><b>Results:</b> Following the initiation of OO, SO-ILE, LFTs improved within 12 days, with ALT at 263 U/L, AST at 278 U/L, and ALP at 913 U/L. These values continued to improve until the end of therapy in June 2024, with a final ALT value of 224 U/L, AST at 138 U/L, and ALP at 220 U/L. See Figure 1. No significant improvements in total bilirubin were found.
The patient successfully tolerated this switch in lipid emulsions and increased his weight from 50 kg to 53.6 kg.</p><p><b>Conclusion:</b> SO, MCT, OO, FO-ILE is well supported to help prevent and alleviate the adverse effects of PNALD; however, the impact of lipid emulsions on other forms of liver disease needs further research. Our case suggests that the elevated LFTs were likely cancer-induced, rather than associated with prolonged use of parenteral nutrition. A higher olive oil lipid concentration may have beneficial impacts on LFT elevations that are not associated with PNALD. It is also worth noting that soybean oil has been demonstrated in previous research to have a negative impact on liver function, and the concentration of soy in SO, MCT, OO, FO-ILE is higher (30%) than in OO, SO-ILE (20%). This may warrant further investigation into specific soy concentrations' impact on liver function. LFTs should be assessed and treated on a case-by-case basis that evaluates disease mechanisms, medication-drug interactions, parenteral nutrition composition, and patient subjective information.</p><p></p><p><b>Figure 1.</b> OO, SO-ILE Impact on LFTs.</p><p>Shaurya Mehta, BS<sup>1</sup>; Ajay Jain, MD, DNB, MHA<sup>1</sup>; Kento Kurashima, MD, PhD<sup>1</sup>; Chandrashekhara Manithody, PhD<sup>1</sup>; Arun Verma, MD<sup>1</sup>; Marzena Swiderska-Syn<sup>1</sup>; Shin Miyata, MD<sup>1</sup>; Mustafa Nazzal, MD<sup>1</sup>; Miguel Guzman, MD<sup>1</sup>; Sherri Besmer, MD<sup>1</sup>; Matthew Mchale, MD<sup>1</sup>; Jordyn Wray<sup>1</sup>; Chelsea Hutchinson, MD<sup>1</sup>; John Long, DVM<sup>1</sup></p><p><sup>1</sup>Saint Louis University, St. Louis, MO</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> North American Society for Pediatric Gastroenterology, Hepatology and Nutrition (NASPGHAN) Florida.</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Short bowel syndrome (SBS) is a devastating condition.
In the absence of enteral nutrition (EN), patients are dependent on total parenteral nutrition (TPN) and suffer from intestinal failure associated liver disease and gut atrophy. Intestinal adaptation (IA) and enteral autonomy (EA) remain the clinical goals. We hypothesized EA can be achieved using our DREAM system (US patent 63/413,988), which allows EN via the stomach and provides a mechanism for cyclical recirculation of nutrient-rich distal intestinal content into the proximal bowel, enabling full EN despite SBS.</p><p><b>Methods:</b> 24 neonatal pigs were randomly allocated to enteral nutrition (EN; n = 8); TPN-SBS (on TPN only, n = 8); or DREAM (n = 8). Liver, gut, and serum were collected for histology and serum biochemistry. Statistical analysis was performed using GraphPad Prism 10.1.2 (324). All tests were 2-tailed using a significance level of 0.05.</p><p><b>Results:</b> TPN-SBS piglets had significant cholestasis vs DREAM (p = 0.001) with no statistical difference in DREAM vs EN (p = 0.14). DREAM transitioned to full EN by day 4. Mean serum conjugated bilirubin for EN was 0.037 mg/dL, TPN-SBS 1.2 mg/dL, and DREAM 0.05 mg/dL. Serum bile acids were significantly elevated in TPN-SBS vs EN (p = 0.007) and DREAM (p = 0.03). Mean GGT, a marker of cholangiocytic injury, was significantly higher in TPN-SBS vs EN (p < 0.001) and DREAM (p < 0.001) with values of EN 21.2 U/L, TPN-SBS 47.9 U/L, and DREAM 22.5 U/L (p = 0.89 DREAM vs EN). To evaluate gut growth, we measured lineal gut mass (LGM), calculated as the weight of the bowel per centimeter. There was significant IA and prevention of gut atrophy with DREAM. Mean proximal gut LGM was EN 0.21 g/cm, TPN-SBS 0.11 g/cm, and DREAM 0.31 g/cm (p = 0.004, TPN-SBS vs DREAM). Distal gut LGM was EN 0.34 g/cm, TPN-SBS 0.13 g/cm, and DREAM 0.43 g/cm (p = 0.006, TPN-SBS vs DREAM). IHC revealed DREAM had similar hepatic CK-7 (bile duct epithelium marker), p = 0.18 and hepatic Cyp7A1, p = 0.3 vs EN.
No statistical differences were noted in LGR5 positive intestinal stem cells in EN vs DREAM, p = 0.18. DREAM prevented changes in hepatic CYP7A1, BSEP, FGFR4, SHP, and SREBP-1 and gut FXR, TGR5, and EGF vs the TPN-SBS group.</p><p><b>Conclusion:</b> DREAM resulted in a significant reduction of hepatic cholestasis, prevented gut atrophy, and presents a novel method enabling early full EN despite SBS. This system, by driving IA and enteral autonomy, highlights a major advancement in SBS management, bringing a paradigm change to life-saving strategies for SBS patients.</p><p>Silvia Figueiroa, MS, RD, CNSC<sup>1</sup>; Paula Delmerico, MS, RD, CNSC<sup>2</sup></p><p><sup>1</sup>MedStar Washington Hospital Center, Bethesda, MD; <sup>2</sup>MedStar Washington Hospital Center, Arlington, VA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Parenteral nutrition (PN) therapy is a vital clinical intervention for patients of all ages and across care settings. The complexity of PN has the potential to cause significant patient harm, especially when errors occur. According to The Institute for Safe Medication Practices (ISMP), PN is classified as a high-alert medication and safety-focused strategies should be formulated to minimize errors and harm. Processing PN is multifactorial and includes prescribing, order review and verification, compounding, labeling, and administration. PN prescription should consider clinical appropriateness and formulation safety. The PN formulation must be written to provide appropriate amounts of macronutrients and micronutrients based on the patient's clinical condition, laboratory parameters, and nutrition status. The PN admixture should not exceed the total amounts, recommended concentrations, or rate of infusion of these nutrients as it could result in toxicities and formulation incompatibility or instability.
The ASPEN Parenteral Nutrition Safety Consensus Recommendations recommend PN be prescribed using standardized electronic orders via a computerized provider order entry (CPOE) system, as handwritten orders have potential for error. Data suggest up to 40% of PN-related errors occur during the prescription and transcription steps. Registered pharmacists (RPh) are tasked with reviewing PN orders for order accuracy and consistency with recommendations made by the nutrition support team. These RPh adjustments ensure formula stability and nutrient optimization. This quality improvement project compares the frequency of RPh PN adjustments following the initial provider order after transition from a paper to a CPOE ordering system. Our hypothesis is that CPOE reduces the need for PN adjustments by pharmacists during processing, which increases clinical effectiveness and maximizes resource efficiency.</p><p><b>Methods:</b> This was a retrospective evaluation of PN ordering practices at a large, academic medical center after shifting from paper to electronic orders. PN orders were collected during three-week periods in December 2022 (Paper) and December 2023 (CPOE) and analyzed for the frequency of order adjustments by the RPh. Adjustments were classified into intravascular access, infusion rate, macronutrients, electrolytes, multivitamin (MVI) and trace elements (TE), and medication categories. The total number of adjustments made by the RPh during final PN processing was collected. These adjustments were made per nutrition support team recommendations.</p><p><b>Results:</b> Daily PN orders for 106 patients (694 orders in total) were reviewed for provider order accuracy at the time of fax (paper) and electronic (CPOE) submission.
Order corrections made by the RPh decreased by 96% for infusion rate, 91.5% for macronutrients, 79.6% for electrolytes, 81.4% for MVI and TE, and 50% for medication additives (Table 1).</p><p><b>Conclusion:</b> Transitioning to CPOE led to a reduction in the need for PN order adjustments at the time of processing. One reason for this decline is an improvement in physician understanding of PN recommendations. With CPOE, the registered dietitian's formula recommendation is viewable within the order template and can be referenced at the time of ordering. The components of the active PN infusion also automatically populate upon ordering a subsequent bag. This information can aid the provider when calculating insulin needs or repleting electrolytes outside the PN, increasing clinical effectiveness. A byproduct of this process change is improved efficiency, as CPOE requires less time for provider prescription and RPh processing and verification.</p><p><b>Table 1.</b> RPh Order Adjustments Required During Collection Period.</p><p></p><p>Elaina Szeszycki, BS, PharmD, CNSC<sup>1</sup>; Emily Gray, PharmD<sup>2</sup>; Kathleen Doan, PharmD, BCPPS<sup>3</sup>; Kanika Puri, MD<sup>1</sup></p><p><sup>1</sup>Riley Hospital for Children at Indiana University Health, Indianapolis, IN; <sup>2</sup>Lurie Children's Hospital, Chicago, IL; <sup>3</sup>Riley Hospital for Children at IU Health, Indianapolis, IN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Parenteral nutrition (PN) is ordered daily at Riley Hospital for Children at IU Health by different specialties. Our hospital is a stand-alone pediatric hospital, including labor, delivery, and high-risk maternal care. Historically, the PN orders were due by early afternoon with a hard cut-off by the end of the day shift for timely central compounding at a nearby adult hospital.
Due to the relocation of staff and equipment to a new sterile compounding facility, a hard deadline was created with an earlier cutoff time and contingency plans for orders received after the deadline. This updated process was created to allow for timely delivery to Riley and subsequently to the patients to meet the standard PN hang-time of 2100. The Nutrition Support Team (NST) and Pharmacy and Therapeutics (P&T) Committee approved an updated PN order process as follows:</p><p>(1) Enforce a hard PN deadline of 1200 for new and current PN orders; (2) if a PN order is not received by 1200, renew the active PN order for the next 24 hours; (3) if the active PN order is not appropriate for the next 24 hours, providers will need to order IVF in place of PN until the following day; (4) enter PN orders into the PN order software by 1500.</p><p><b>Methods:</b> A quality improvement (QI) process check was performed 3 months after initiation of the updated PN order process. Data collection was performed for 1 month with the following data points: total PN orders, missing PN orders at 1200, PN orders re-ordered per P&T policy after the 1200 deadline, lab review, input and output, subsequent order changes for 24 hours after renewal of an active PN order, waste of PN, and service responsible for the late PN order.</p><p><b>Results:</b></p><p></p><p><b>Conclusion:</b> The number of late PN orders after the hard deadline was < 5%, and there was a minimal number of renewed active PN orders due to the pharmacists' concern for ensuring the safety of our patients. No clinically significant changes resulted from renewal of active PN, so this was considered a safe process despite small numbers. The changes made to late PN orders were minor or related to the planned discontinuation of PN.
After review of the results by the NST and pharmacy administration, the following actions were decided upon: (1) review the data and process with pharmacy staff to assist with workflow and education; (2) create a succinct Riley TPN process document for providers, specifically services with late orders, reviewing the PN order entry hard deadline and the need to discontinue PN orders by the deadline, to assist with pharmacy staff workflow and avoid potential PN waste; and (3) repeat the QI analysis in 6-12 months.</p><p><b>International Poster of Distinction</b></p><p>Muna Islami, PharmD, BCNSP<sup>1</sup>; Mohammed Almusawa, PharmD, BCIDP<sup>2</sup>; Nouf Alotaibi, PharmD, BCPS, BCNSP<sup>3</sup>; Jwael Alhamoud, PharmD<sup>1</sup>; Maha Islami, PharmD<sup>4</sup>; Khalid Eljaaly, PharmD, MS, BCIDP, FCCP, FIDSA<sup>4</sup>; Majda Alattas, PharmD, BCPS, BCIDP<sup>1</sup>; Lama Hefni, RN<sup>5</sup>; Basem Alraddadi, MD<sup>1</sup></p><p><sup>1</sup>King Faisal Specialist Hospital, Jeddah, Makkah; <sup>2</sup>Wayne State University, Jeddah, Makkah; <sup>3</sup>Umm al Qura University, Jeddah, Makkah; <sup>4</sup>King Abdulaziz University Hospital, Jeddah, Makkah; <sup>5</sup>King Faisal Specialist Hospital, Jeddah, Makkah</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Parenteral nutrition (PN) is a critical therapy for patients unable to meet their nutritional needs through the gastrointestinal tract. While it offers a life-saving solution, it also carries the risk of central line-associated bloodstream infections (CLABSIs). However, there is a lack of comprehensive studies examining the risk factors for CLABSIs in a more heterogeneous cohort of PN recipients. This study aims to identify the risk factors associated with CLABSIs in patients receiving PN therapy in Saudi Arabia.</p><p><b>Methods:</b> This retrospective, multicenter cohort study was conducted in three large tertiary referral centers in Saudi Arabia. 
The study included all hospitalized patients who received PN therapy through central lines between 2018 and 2022. The purpose of the study was to investigate the association between parenteral nutrition (PN) and central line-associated bloodstream infections (CLABSIs), using both univariate and multivariate analyses.</p><p><b>Results:</b> Out of 662 hospitalized patients who received PN and had central lines, 123 patients (18.6%) developed CLABSI. Among our patients, the duration of parenteral nutrition was an independent risk factor for CLABSI development (OR, 1.012; 95% CI, 0.9-1.02). Among patients receiving PN, the incidence of CLABSI did not change significantly over the study years.</p><p><b>Conclusion:</b> The length of PN therapy remains an important risk factor for CLABSIs; more research is required to determine the best ways to reduce the incidence of CLABSI in patients on PN.</p><p><b>Table 1.</b> Characteristics of Hospitalized Patients Who Received PN.</p><p></p><p>1 n (%); median (IQR). BMI, body mass index.</p><p><b>Table 2.</b> The Characteristics of Individuals With and Without CLABSI Who Received PN.</p><p></p><p>1 n (%); median (IQR); 2 Fisher's exact test; Pearson's chi-squared test; Mann-Whitney U test. PN, parenteral nutrition.</p><p></p><p>CLABSI, central line-associated bloodstream infection; PN, parenteral nutrition.</p><p><b>Figure 1.</b> Percentage of Patients With a Central Line Receiving PN Who Experienced CLABSI.</p><p>Duy Luu, PharmD<sup>1</sup>; Rachel Leong, PharmD<sup>2</sup>; Nisha Dave, PharmD<sup>2</sup>; Thomas Ziegler, MD<sup>2</sup>; Vivian Zhao, PharmD<sup>2</sup></p><p><sup>1</sup>Emory University Hospital - Nutrition Support Team, Lawrenceville, GA; <sup>2</sup>Emory Healthcare, Atlanta, GA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Intravenous lipid emulsion (ILE) is an essential component in parenteral nutrition (PN)-dependent patients because it provides calories and essential 
fatty acids; however, the use of soybean oil-based ILE (SO-ILE) may contribute to the development of PN-associated liver disease (PNALD) in patients with intestinal failure who require chronic PN. To mitigate this risk, newer ILE formulations, such as a mixture of SO, medium-chain triglycerides (MCT), olive oil (OO), and fish oil-based ILE (SO/MCT/OO/FO-ILE), or pure fish oil-based ILE (FO-ILE), are now available in the US. FO-ILE is approved only for pediatric use in PN-associated cholestasis. This patient case highlights the benefits of using a combination of FO-containing ILEs to improve PNALD.</p><p><b>Methods:</b> A 65-year-old female with symptomatic achalasia required robotic-assisted esophagomyotomy with fundoplasty in February 2020. Her postoperative course was complicated by a bowel injury that required multiple small bowel resections and total colectomy with end jejunostomy, resulting in short bowel syndrome with 80 centimeters of residual small bowel. Postoperatively, daily PN containing SO-ILE was initiated along with tube feedings (TF) during hospitalization, and she was discharged home with PN and TF. Her surgeon referred her to the Emory University Hospital (EUH) Nutrition Support Team (NST) for management. She received daily cyclic PN infused over 12 hours, providing 25.4 kcal/kg/day with SO-ILE 70 grams (1.2 g/kg) three times weekly and standard TF 1-2 containers/day. In September 2020, she complained of persistent jaundice and was admitted to EUH. She presented with scleral icterus, hyperbilirubinemia, and elevated liver function tests (LFTs). The EUH NST optimized PN to provide SO/MCT/OO/FO-ILE (0.8 g/kg/day), which initially improved LFTs, and the dose was then increased to 1 g/kg/day. In the subsequent four months, her LFTs worsened despite optimizing pharmacotherapy, continuing cyclic TF, and reducing and then discontinuing ILE. 
She required multiple readmissions to EUH and underwent two liver biopsies, which confirmed a diagnosis of PN-induced hepatic fibrosis after 15 months of PN. Her serum total bilirubin level peaked at 18.5 mg/dL, which led to an intensive care unit admission and required molecular adsorbent recirculating system therapy. In March 2022, having exhausted all other options, the NST incorporated FO-ILE (0.84 g/kg/day) three times weekly (as a separate infusion) and SO/MCT/OO/FO-ILE (1 g/kg/day) weekly.</p><p><b>Results:</b> The patient's LFTs are shown in Figure 1. The blood level of aspartate aminotransferase improved from 123 to 60 units/L, and alanine aminotransferase decreased from 84 to 51 units/L after 2 months, returning to normal after 4 months of the two-ILE regimen. Similarly, total bilirubin decreased from 5.6 to 2.2 and 1.1 mg/dL by 2 and 6 months, respectively. Both total bilirubin and transaminase levels remained stable. Although her alkaline phosphatase continued to fluctuate and remained elevated over the last two years, this marker decreased from 268 to 104 units/L. All other PNALD-related symptoms resolved.</p><p><b>Conclusion:</b> This case demonstrates that a combination of FO-containing ILEs significantly improved and stabilized LFTs in an adult with PNALD. 
Additional research is needed to investigate the effect of FO-ILE in adult PN patients to mitigate PNALD.</p><p></p><p>SO: soybean oil; MCT: medium-chain triglyceride; OO: olive oil; FO: fish oil; AST: aspartate aminotransferase; ALT: alanine aminotransferase.</p><p><b>Figure 1.</b> Progression of Liver Enzyme Status in Relation to Lipid Injectable Emulsions.</p><p>Narisorn Lakananurak, MD<sup>1</sup>; Leah Gramlich, MD<sup>2</sup></p><p><sup>1</sup>Department of Medicine, Faculty of Medicine, Chulalongkorn University, Bangkok, Krung Thep; <sup>2</sup>Division of Gastroenterology, Faculty of Medicine and Dentistry, University of Alberta, Edmonton, AB</p><p><b>Financial Support:</b> This research study received a grant from Baxter, Canada.</p><p><b>Background:</b> Pre-operative parenteral nutrition (PN) has been shown to enhance outcomes in malnourished surgical patients. Traditionally, pre-operative PN necessitates hospital admission, which leads to increased length of stay (LOS) and higher hospital costs. Furthermore, inpatient pre-operative PN may not be feasible or prioritized when access to hospital beds is restricted. Outpatient PN presents a potential solution to this issue. To date, the feasibility and impact of outpatient PN for surgical patients have not been investigated. This study aims to assess the outcomes and feasibility of outpatient pre-operative PN in malnourished surgical patients.</p><p><b>Methods:</b> Patients scheduled for major surgery who were identified as at risk of malnutrition using the Canadian Nutrition Screening Tool and classified as malnourished by Subjective Global Assessment (SGA) B or C were enrolled. Exclusion criteria included severe systemic disease as defined by American Society of Anesthesiologists (ASA) classification III to V, insulin-dependent diabetes mellitus, and extreme underweight (body weight less than 40 kg). 
Eligible patients received a peripherally inserted central catheter (PICC) line and outpatient PN (Olimel 7.6%E, 1,000 ml) for 5-10 days, using the maximum number of infusion days possible prior to surgery at an infusion clinic. PN was administered via an infusion pump over 4-5 hours by infusion clinic nurses. The outcomes and feasibility of outpatient PN were assessed. Safety outcomes, including refeeding syndrome, dysglycemia, volume overload, and catheter-related complications, were monitored.</p><p><b>Results:</b> Outpatient PN was administered to eight patients (4 males, 4 females). Pancreatic cancer and the Whipple procedure were the most common diagnosis and operation, each accounting for 37.5% of cases (Table 1). The mean (SD) duration of PN was 5.5 (1.2) days (range 5-8 days). Outpatient PN was completed in 75% of patients, with an 88% completion rate of PN days (44/50 days). After PN infusion, mean body weight and body mass index increased by 4.6 kg and 2.1 kg/m², respectively. The mean PG-SGA score improved by 4.9 points, and mean handgrip strength increased from 20 kg to 25.2 kg. Quality of life, as measured by SF-12, improved in both the physical and mental health domains (7.3 and 3.8 points, respectively). Patient-reported feasibility scores were high across all aspects (acceptability, appropriateness, and feasibility), with a total score of 55.7/60 (92.8%). Infusion clinic nurses (n = 3) also reported high total feasibility scores (52.7/60, 87.8%) (Table 2). No complications were observed in any of the patients.</p><p><b>Conclusion:</b> Outpatient pre-operative PN was a feasible approach that was associated with improved outcomes in malnourished surgical patients. This novel approach has the potential to enhance outcomes and decrease the necessity for hospital admission in malnourished surgical patients. 
Future studies involving larger populations are needed to evaluate the efficacy of outpatient PN.</p><p><b>Table 1.</b> Baseline Characteristics of the Participants (n = 8).</p><p></p><p><b>Table 2.</b> Outcomes and Feasibility of Outpatient Preoperative Parenteral Nutrition (n = 8).</p><p></p><p>Adrianna Wierzbicka, MD<sup>1</sup>; Rosmary Carballo Araque, RD<sup>1</sup>; Andrew Ukleja, MD<sup>1</sup></p><p><sup>1</sup>Cleveland Clinic Florida, Weston, FL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Gastroparesis (GP) is a chronic motility disorder marked by delayed gastric emptying and associated with symptoms including nausea, vomiting, and abdominal pain. Treatment consists of diet modifications and medications, with nutritional support tailored to disease severity. Severe refractory cases may require enteral or parenteral nutrition (PN). However, the role of home parenteral nutrition (HPN) in managing GP is underexplored. This study aims to enhance nutrition therapy practice by examining the utilization of HPN in the GP population, addressing a significant gap in current nutrition support strategies.</p><p><b>Methods:</b> We conducted a retrospective, single-center analysis of patients receiving HPN from August 2022 to August 2024. Data were obtained through a review of electronic medical records as part of a quality improvement monitoring process. Patients' demographics, etiology of GP, indications for HPN, types of central access, duration of therapy, and PN-related complications were analyzed using descriptive statistics. Inclusion criteria were: adults (>18 years), GP diagnosis by gastric scintigraphy, and HPN for a minimum of 2 consecutive months. Among 141 identified HPN patients, 10 had GP as the indication for PN.</p><p><b>Results:</b> GP patients constituted 7% (10/141) of our home PN population. 
In this cohort of 10 patients with GP receiving HPN, the demographic profile was predominantly female (80%), with a mean age of 42.6 years; all individuals identified as Caucasian. All patients had idiopathic GP, severe gastric emptying delay was found in 80% of cases, and all experienced predominant symptoms of nausea/vomiting. Central access consisted of PICC lines (50%), Hickman catheters (30%), PowerLines (10%), and mediports (10%). The mean weight change with PN therapy was an increase of 21.9 lbs. Infection-related complications occurred in 80% of patients, including bacteremia (methicillin-sensitive Staphylococcus aureus [MSSA], methicillin-resistant Staphylococcus aureus [MRSA], Pseudomonas) and fungemia. Deep vein thrombosis (DVT) was identified in 20% of patients, alongside one case of a cardiac thrombus. Tube feeding trials were attempted in 70% of cases, but 50% ultimately discontinued them due to intolerance, such as abdominal pain, or complications such as buried bumper syndrome. Chronic pain management was used in 60% of patients, with 40% on opioid therapy (morphine, fentanyl). PN was discontinued in 50% of patients due to recurrent infections (20%), advancement to tube feeding (20%), questionable compliance (20%), or improvement in oral intake (40%).</p><p><b>Conclusion:</b> This retrospective analysis underscores the potential of HPN as a nutritional strategy for GP, particularly in patients with refractory symptoms and severe delay in gastric emptying who previously failed EN or experienced complications related to enteral access. In addition to the observed mean weight gain, HPN seems to play a crucial role in alleviating debilitating symptoms such as nausea, vomiting, and abdominal pain, thereby improving patients' overall quality of life. Nonetheless, the prevalence of infection-related complications and the requirement for chronic pain management underscore the challenges associated with GP treatment. 
The variability in patient responses to different nutritional strategies emphasizes the importance of individualized care plans. These findings advocate for further research to optimize HPN protocols and improve comprehensive management strategies in GP.</p><p></p><p><b>Figure 1.</b> Reasons for PN Discontinuation.</p><p></p><p><b>Figure 2.</b> Complications Associated with PN.</p><p>Longchang Huang, MD<sup>1</sup>; Peng Wang<sup>2</sup>; Shuai Liu<sup>3</sup>; Xin Qi<sup>1</sup>; Li Zhang<sup>1</sup>; Xinying Wang<sup>4</sup></p><p><sup>1</sup>Department of General Surgery, Jinling Hospital, Medical School of Nanjing University, Nanjing, Jiangsu; <sup>2</sup>Department of Digestive Disease Research Center, Gastrointestinal Surgery, The First People's Hospital of Foshan, Guangdong, Foshan; <sup>3</sup>Department of General Surgery, Jinling Hospital, Medical School of Nanjing University, Nanjing, Jiangsu; <sup>4</sup>Department of General Surgery, Jinling Hospital, Medical School of Nanjing University, Nanjing, Jiangsu</p><p><b>Financial Support:</b> National Natural Science Foundation of China, 82170575 and 82370900.</p><p><b>Background:</b> Total parenteral nutrition (TPN)-induced gut microbiota dysbiosis is closely linked to intestinal barrier damage, but the mechanism remains unclear.</p><p><b>Methods:</b> Through the application of 16S rRNA gene sequencing and metagenomic analysis, we examined alterations in the gut microbiota of patients with chronic intestinal failure (CIF) and of TPN mouse models subjected to parenteral nutrition, subsequently validating these observations in an independent verification cohort. Additionally, we conducted a comprehensive analysis of key metabolites utilizing liquid chromatography-mass spectrometry (LC-MS). 
Moreover, we explored modifications in essential innate-like lymphoid cell populations through RNA sequencing (RNA-seq), flow cytometry, and single-cell RNA sequencing (scRNA-seq).</p><p><b>Results:</b> The gut barrier damage associated with TPN is due to decreased Lactobacillus murinus. L. murinus mitigates TPN-induced intestinal barrier damage through the metabolism of tryptophan into indole-3-carboxylic acid (ICA). Furthermore, ICA stimulates group 3 innate lymphoid cells (ILC3) to secrete interleukin-22 by targeting the nuclear receptor Rorc, enhancing intestinal barrier protection.</p><p><b>Conclusion:</b> We elucidate the mechanisms driving TPN-associated intestinal barrier damage and indicate that interventions with L. murinus or ICA could effectively ameliorate TPN-induced gut barrier injury.</p><p></p><p><b>Figure 1.</b> TPN Induces Intestinal Barrier Damage in Humans and Mice. (a) Rates of fever and ICU admission in Cohort 1. (b-d) Serum levels of IFABP, CRP, and LPS in patients with CIF. (e) Representative intestinal H&E staining and injury scores (f) (n = 10 mice per group). (g) Electrical resistance of the intestine in mice measured by Ussing chamber (n = 5 mice per group). (h) Immunofluorescence experiments in the intestines and livers of mice. (i) Western blot results in the Chow and TPN groups.</p><p></p><p><b>Figure 2.</b> TPN Induces Gut Dysbiosis in Humans and Mice. (a) PCoA of 16S rRNA data from fecal content from Cohort 1 (n = 16 individuals/group). (b) Significantly abundant taxa identified using linear discriminant analysis (LDA). (c) Top 10 abundant genera. (d) PCoA of the relative genus or species abundances (n = 5 mice per group). (e) LDA for mice. (f) Sankey diagram showing the top 10 abundant genera in humans and mice. 
(g) Heatmap illustrating the correlation between the abundance of species in the intestinal microbiota and clinical characteristics of patients with CIF.</p><p></p><p><b>Figure 3.</b> Metabolically Active L. murinus Ameliorates Intestinal Barrier Damage. (a) RT-PCR was conducted to quantify the abundance of L. murinus in feces from L-PN and H-PN patients (Cohorts 1 and 2). (b) Representative intestinal H&E staining and injury scores (c) (n = 10 mice per group). (d) Electrical resistance of the intestine in mice measured by Ussing chamber (n = 5 mice per group). (e) Western blot results. (f) 3D-PCA and volcano plot (g) analyses between the Chow and TPN group mice. (h) Metabolome-wide pathways enriched based on metabolomics data obtained from fecal content of Chow and TPN group mice (n = 5 mice per group). (i) Heatmap depicting the correlation between the abundance of intestinal microbiota at the species level and tryptophan metabolites in the Chow and TPN group mice (n = 5 mice per group). (j) VIP scores of the 3D-PCA. A taxon with a variable importance in projection (VIP) score of >1.5 was deemed to be of significant importance in the discrimination process.</p><p></p><p><b>Figure 4.</b> ICA Is Critical for the Effects of L. murinus. (a) Fecal level of ICA from TPN mice treated with PBS control or ICA (n = 10 mice per group). (b) Representative intestinal H&E staining and injury scores (c) (n = 10 mice per group). (d) Electrical resistance of the intestine in mice measured by Ussing chamber (n = 5 mice per group). (e) Western blot results. (f) Metabolic pathway illustrating the production of ICA from tryptophan by the bacterium L. murinus. (g) PLS-DA of the metabolite profiles in feces from TPN mice receiving ΔArAT or live L. murinus (n = 5 mice per group). (h) Heatmap of tryptophan-targeted metabolomics in fecal samples from TPN mice that received either ΔArAT (n = 5) or live L. murinus (n = 5). 
(i) Representative intestinal H&E staining and injury scores (j) (n = 10 mice per group). (k) Western blot results.</p><p>Callie Rancourt, RDN<sup>1</sup>; Osman Mohamed Elfadil, MBBS<sup>1</sup>; Yash Patel, MBBS<sup>1</sup>; Taylor Dale, MS, RDN<sup>1</sup>; Allison Keller, MS, RDN<sup>1</sup>; Alania Bodi, MS, RDN<sup>1</sup>; Suhena Patel, MBBS<sup>1</sup>; Andrea Morand, MS, RDN, LD<sup>1</sup>; Amanda Engle, PharmD, RPh<sup>1</sup>; Manpreet Mundi, MD<sup>1</sup></p><p><sup>1</sup>Mayo Clinic, Rochester, MN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Although a patent foramen ovale (PFO) is generally asymptomatic and causes no health concerns, it can be a risk factor for embolism and stroke. Due to this theoretical risk, some institutions have established protocols requiring most IV solutions to be administered through a small micron filter in patients with PFO. While 2-in-1 dextrose and amino acid solutions can be filtered through a 0.22-micron filter with relative ease, injectable lipid emulsions (ILEs), whether on their own or as part of a total admixture, consist of larger particles, requiring a 1.2-micron or larger filter size. The use of the larger filter precludes the administration of ILE, an essential source of calories, in patients with PFO. It is unknown whether patients who do receive ILE have an increased incidence of lipid embolism and stroke.</p><p><b>Methods:</b> A single-center retrospective review of patients on central parenteral nutrition (CPN) was completed. Demographics and baseline clinical characteristics, including co-morbidities and history of CVA, were collected. The outcome of interest was defined as an ischemic cerebrovascular accident (CVA) within 30 days of CPN and, therefore, potentially attributable to it. Other cardiovascular and thromboembolic events were also captured. 
All patients with a PFO diagnosis and inpatient CPN administration between January 1, 2018, and December 18, 2023, at our quaternary care referral center were included as the case cohort. A 3:1 control group matched for age, gender, duration of inpatient CPN, and clinical co-morbidities was identified and used to examine the difference in the outcome of interest.</p><p><b>Results:</b> Patients with PFO who received CPN (n = 38, 53.8% female) had a mean age of 63.5 ± 13.1 years and a mean BMI of 31.1 ± 12.1 at CPN initiation (Table 1). The PFOs varied in size, with the largest proportion (38.5%) being very small/trivial (Table 2). All patients in this cohort had appropriate-size filters placed for CPN and ILE administration. CPN prescription and duration were comparable between both groups. The majority of patients with PFO (53.8%) received mixed-oil ILE, followed by soy-olive oil ILE (23.1%), whereas the majority of patients without PFO (51.8%) received soy-olive oil ILE and 42.9% received mixed-oil ILE (Table 3). Case and control groups had cardiovascular risks at comparable prevalence, including obesity, hypertension, diabetes, and dyslipidemia. However, more patients had a history of vascular cardiac events and atrial fibrillation in the PFO group, and more patients were smokers in the non-PFO group (Table 4). Patients with PFO received PN for a median of 7 days (IQR: 5, 13), and 32 (84.2%) received ILE. Patients without PFO who received CPN (n = 114, 52.6% female) had a mean age of 64.9 ± 10 years and a mean BMI of 29.6 ± 8.5 at CPN initiation. Patients in this cohort received PN for a median of 7 days (IQR: 5, 13.5), and 113 (99.1%) received ILE. There was no difference in the incidence of ischemic CVA within 30 days of receiving CPN between the two groups (2 (5.3%) in the PFO group vs. 
1 (0.8%) in the non-PFO group; p = 0.092) (Table 4).</p><p><b>Conclusion:</b> The question of the risk of CVA/stroke with CPN in patients with PFO is clinically relevant and often lacks a definitive answer. Our study revealed no difference in ischemic CVA potentially attributable to CPN between patients with PFO and a matched control cohort of patients without PFO in the first 30 days after administration of PN. This finding suggests that CPN with ILE is likely safe for patients with PFO in an inpatient setting.</p><p><b>Table 1.</b> Baseline Demographics and Clinical Characteristics.</p><p></p><p><b>Table 2.</b> PFO Diagnosis.</p><p></p><p>*All received propofol concomitantly.</p><p><b>Table 3.</b> PN Prescription.</p><p></p><p><b>Table 4.</b> Outcomes and Complications.</p><p></p><p><b>Enteral Nutrition Therapy</b></p><p>Osman Mohamed Elfadil, MBBS<sup>1</sup>; Edel Keaveney, PhD<sup>2</sup>; Adele Pattinson, RDN<sup>1</sup>; Danelle Johnson, MS, RDN<sup>1</sup>; Rachael Connolly, BSc.<sup>2</sup>; Suhena Patel, MBBS<sup>1</sup>; Yash Patel, MBBS<sup>1</sup>; Ryan Hurt, MD, PhD<sup>1</sup>; Manpreet Mundi, MD<sup>1</sup></p><p><sup>1</sup>Mayo Clinic, Rochester, MN; <sup>2</sup>Rockfield MD, Galway</p><p><b>Financial Support:</b> Rockfield Medical Devices.</p><p><b>Background:</b> Patients on home enteral nutrition (HEN), many of whom are mobile, can experience significant hardships and reduced quality of life (QoL) due to limitations on mobility, on top of burdens due to underlying disease processes. Improving mobility while feeding could reduce burdens associated with HEN and potentially improve QoL. 
This prospective cohort study aims to evaluate participants’ perspectives on their mobility, ease of performing physical activities while feeding, and QoL following the use of a novel enteral feeding system (EFS).</p><p><b>Methods:</b> A prospective single-center study was conducted to evaluate a novel EFS, which is an FDA-cleared elastomeric system (Mobility + ®) that consists of a lightweight feeding pouch (reservoir for 500 mL feed), a filling set (used in conjunction with a syringe to fill EFS) and a feeding set to deliver EN formula to an extension set/feeding tube with an ISO 80369-3 compatible connector. Adult HEN-dependent patients were recruited by invitation to use the study EFS for a minimum of 2 feeds a day for 14 days, preceded by a familiarization period of 5-7 days. Participant perspectives on how they rated performing typical daily activities while feeding (e.g., moving, traveling, socializing) and feeding system parameters (ease of use, portability, noise, discretion, performance) were evaluated using HEN-expert validated questionnaires. A score was given for each rating from 1 to 5, with 5 being the most positive response. An overall score was calculated and averaged for the cohort. Participants were followed up during the familiarization period. On days 7 and 14, additional telephone interviews were conducted regarding compliance, enteral feed intake, participant perspectives on study EFS vs. current system, and other measures. We excluded those with reduced functional capacity due to their underlying disease(s).</p><p><b>Results:</b> Seventeen participants completed the study (mean age 63.8 ± 12 years; 70.6% male). Participants used various feeding systems, including gravity, bolus method, and pump, with the majority (82.4%) having a G-tube placed (Table 1). Sixteen (94.1%) patients achieved use of study EFS for at least two feeds a day (and majority of daily EN calories) for all study days (Table 2). 
The ratings for the ability to perform various activities using study EFS were significantly different compared to those of the systems used before the study. An improvement in ratings was noted for the ease of performing common daily activities, including moving between rooms or on stairs, taking short and long walks, traveling by car or public transport, engaging in moderate- to high-intensity activities, sleeping, and socializing with family and friends, between the time point before enrolment and end of study (day 14) (p-value < 0.0001) (Table 3). Ratings of feeding system parameters were significantly different between systems used before the study and the study EFS (p < 0.0001) (Table 3), with the largest increases in positive ratings noted in relation to easiness to carry, noise level, and ability to feed discreetly. Ratings for overall satisfaction with the performance of study EFS did not differ from the ratings for the systems used before the study, with participants reporting that the main influencing factors were the length of time and the effort needed to fill study EFS. No difference was noted in the QoL rating.</p><p><b>Conclusion:</b> The studied EFS is safe and effective as an enteral feeding modality that provides an alternative option for HEN recipients. Participants reported a significant positive impact of study EFS on their activities of daily living. 
Although the overall QoL rating remained the same, improvements in mobility, discretion, and ease of carrying (aspects of QoL) were associated with the use of the study EFS.</p><p><b>Table 1.</b> Baseline Demographics and Clinical Characteristics.</p><p></p><p><b>Table 2.</b> Safety and Effectiveness.</p><p></p><p><b>Table 3.</b> Usability and Impact of the Study EFS.</p><p></p><p>Talal Sharaiha, MD<sup>1</sup>; Martin Croce, MD, FACS<sup>2</sup>; Lisa McKnight, RN, BSN MS<sup>2</sup>; Alejandra Alvarez, ACP, PMP, CPXP<sup>2</sup></p><p><sup>1</sup>Aspisafe Solutions Inc., Brooklyn, NY; <sup>2</sup>Regional One Health, Memphis, TN</p><p><b>Financial Support:</b> Talal Sharaiha is an executive of Aspisafe Solutions Inc. Martin Croce, Lisa McKnight and Alejandra Alvarez are employees of Regional One Health institutions. Regional One Health has a financial interest in Aspisafe Solutions Inc. through services, not cash. Aspisafe provided the products at no charge.</p><p><b>Background:</b> Feeding tube securement has seen minimal innovation over the past decades, leaving medical adhesives as the standard method. However, adhesives frequently fail to maintain secure positioning, with dislodgement rates reported between 36% and 62%, averaging approximately 40%. Dislodgement can lead to adverse outcomes, including aspiration, increased risk of malnutrition, higher healthcare costs, and extended nursing care. We aimed to evaluate the safety and efficacy of a novel feeding tube securement device in preventing NG tube dislodgement compared to standard adhesive tape. The device consists of a bracket that sits on the patient's upper lip. The bracket has a non-adhesive mechanism for securing feeding tubes ranging from sizes 10 to 18 French. It extends to two cheek pads that sit on either side of the patient's cheeks and is further supported by a head strap that wraps around the patient's head (Fig. 1 and Fig. 
2).</p><p><b>Methods:</b> We conducted a prospective, case-control trial at Regional One Health Center, Memphis, TN, comparing 50 patients using the novel securement device, the NG Guard, against 50 patients receiving standard care with adhesive tape. The primary outcome was the rate of accidental or intentional NG tube dislodgement. Secondary outcomes included the number of new NG tubes required as a result of dislodgement, and device-related complications or adhesive-related skin injuries in the control group. Statistical analyses employed Student's t-test for continuous variables and Fisher's exact test for categorical variables. Significance was set at an alpha level of 0.05. We adjusted for confounding variables, including age, sex, race, and diagnosis codes related to delirium, dementia, and confusion (Table 1).</p><p><b>Results:</b> There were no significant differences between the groups in baseline characteristics, including age, sex, race, or confusion-related diagnoses (Table 2) (p ≥ 0.09). Nasogastric tube dislodgement occurred significantly more often in the adhesive tape group (31%) compared to the intervention group (11%) (p < 0.05). The novel device reduced the risk of tube dislodgement by 65%. Additionally, 12 new tubes were required in the control group compared to 3 in the intervention group (p < 0.05), translating to 18 fewer reinsertion events per 100 tubes inserted in patients secured by the novel device. No device-related complications or adhesive-related injuries were reported in either group.</p><p><b>Conclusion:</b> The novel securement device significantly reduced the incidence of nasogastric tube dislodgement compared to traditional adhesive tape. 
It is a safe and effective securement method and should be considered for use in patients with nasogastric tubes to reduce the likelihood of dislodgement and the need for reinsertion of new NG tubes.</p><p><b>Table 1.</b> Diagnosis Codes Related to Dementia and Delirium.</p><p></p><p><b>Table 2.</b> Baseline Demographics.</p><p></p><p></p><p><b>Figure 1.</b> Novel Securement Device - Front View.</p><p></p><p><b>Figure 2.</b> Novel Securement Device - Side Profile.</p><p><b>Best of ASPEN-Enteral Nutrition Therapy</b></p><p><b>Poster of Distinction</b></p><p>Alexandra Kimchy, DO<sup>1</sup>; Sophia Dahmani, BS<sup>2</sup>; Sejal Dave, RDN<sup>1</sup>; Molly Good, RDN<sup>1</sup>; Salam Sunna, RDN<sup>1</sup>; Karen Strenger, PA-C<sup>1</sup>; Eshetu Tefera, MS<sup>3</sup>; Alex Montero, MD<sup>1</sup>; Rohit Satoskar, MD<sup>1</sup></p><p><sup>1</sup>MedStar Georgetown University Hospital, Washington, DC; <sup>2</sup>Georgetown University Hospital, Washington, DC; <sup>3</sup>MedStar Health Research Institute, Columbia, MD</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Early nutrition intervention is of high importance in patients with cirrhosis given the faster onset of protein catabolism for gluconeogenesis compared to those without liver disease. Severe malnutrition is associated with frequent complications of cirrhosis such as infection, hepatic encephalopathy, and ascites. Furthermore, studies have demonstrated higher mortality rates in cirrhotic patients who are severely malnourished in both the pre- and post-transplant setting. The current practice guidelines encourage the use of enteral nutrition in cirrhotic patients who are unable to meet their intake requirements with an oral diet. 
The aim of this study was to evaluate the utilization and implications of enteral feeding in hospitalized patients with cirrhosis diagnosed with severe protein calorie malnutrition at our institution.</p><p><b>Methods:</b> This was a retrospective study of patients admitted to the transplant hepatology inpatient service at MedStar Georgetown University Hospital from 2019 to 2023. ICD-10-CM code E43 was then used to identify patients with a diagnosis of severe protein calorie malnutrition. The diagnosis of cirrhosis and pre-transplant status were confirmed by review of the electronic medical record. Patients with the following characteristics were excluded: absence of cirrhosis, history of liver transplant, admission diagnosis of upper gastrointestinal bleed, and/or receipt of total parenteral nutrition. Wilcoxon rank sum and two-sample t-tests were used to examine differences in the averages of continuous variables between two groups. Chi-square and Fisher exact tests were used to investigate differences for categorical variables. Statistical significance was defined as p-values ≤ 0.05.</p><p><b>Results:</b> Of the 96 patients with cirrhosis and severe protein calorie malnutrition, 31 patients (32%) received enteral nutrition. Time from admission to initiation of enteral feeding was on average 7 days, with an average total duration of enteral nutrition of 10 days. In the group that received enteral nutrition, there was no significant change in weight, BMI, creatinine, total bilirubin or MELD 3.0 score from admission to discharge; however, albumin, sodium and INR levels had significantly increased (Table 1). A comparative analysis between patients with and without enteral nutrition showed a significant increase in length of stay, intensive care requirement, bacteremia, gastrointestinal bleeding, discharge MELD 3.0 score and in-hospital mortality rates among patients with enteral nutrition.
There was no significant difference in rates of spontaneous bacterial peritonitis, pneumonia, admission MELD 3.0 score or post-transplant survival duration in patients with enteral nutrition compared to those without enteral nutrition (Table 2).</p><p><b>Conclusion:</b> In this study, only 32% of patients hospitalized with cirrhosis received enteral nutrition despite having a diagnosis of severe protein calorie malnutrition. Initiation of enteral nutrition was delayed a week, on average, after hospital admission. Prolonged length of stay and higher in-hospital mortality rates suggest a lack of benefit of enteral nutrition when started late in the hospital course. Based on these findings, our institution has implemented a quality improvement initiative to establish earlier enteral feeding in hospitalized patients with cirrhosis and severe protein calorie malnutrition. Future studies will evaluate the efficacy of this initiative and implications for clinical outcomes.</p><p><b>Table 1.</b> The Change in Clinical End Points from Admission to Discharge Among Patients Who Received Enteral Nutrition.</p><p></p><p>Abbreviations: kg, kilograms; BMI, body mass index; INR, international normalized ratio; Na, sodium; Cr, creatinine; TB, total bilirubin; MELD, model for end stage liver disease; EN, enteral nutrition; Std, standard deviation</p><p><b>Table 2.</b> Comparative Analysis of Clinical Characteristics and Outcomes Between Patients With And Without Enteral Nutrition.</p><p></p><p>Abbreviations: MASLD, metabolic dysfunction-associated steatotic liver disease; HCV, hepatitis C virus; HBV, hepatitis B virus; AIH, autoimmune hepatitis; PSC, primary sclerosing cholangitis; PBC, primary biliary cholangitis; EN, enteral nutrition; RD, registered dietitian; MICU, medical intensive care unit; PNA, pneumonia; SBP, spontaneous bacterial peritonitis; GIB, gastrointestinal bleed; MELD, model for end stage liver disease; d, days; N, number; Std, standard
deviation.</p><p>Jesse James, MS, RDN, CNSC<sup>1</sup></p><p><sup>1</sup>Williamson Medical Center, Franklin, TN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Feeding tubes (Tubes) are used to deliver enteral nutrition to patients who are unable to safely ingest nutrients and medications orally, a population at elevated risk of malnutrition and dehydration. Unfortunately, these Tubes have a propensity for becoming clogged. Staff will attempt to unclog Tubes using standard bedside techniques, including warm water flushes or chemical enzymes. However, not only are these practices time-consuming, they are often unsuccessful, requiring Tube replacement. An actuated mechanical device for restoring patency in clogged small-bore Tubes was evaluated at a level 2 medical center as an alternative declogging method from September 2021 to July 2023. Study objectives were to explore the actuated mechanical device's ability to unclog indwelling Tubes and monitor any potential safety issues.</p><p><b>Methods:</b> The TubeClear® System (actuated mechanical device, Actuated Medical, Inc., Bellefonte, PA, Figure 1) was developed to resolve clogs from various indwelling Tubes. Twenty patients (Table 1), with n = 16 10Fr, 109 cm long nasogastric (NG) tubes and n = 4 10Fr, 140 cm long nasojejunal (NJ) tubes, underwent clearing attempts with the actuated mechanical device. Initially, patients underwent standard declogging strategies for a minimum of 30 minutes, including warm water flushes and Creon/NaHCO3 slurry. Following unsuccessful patency restoration (n = 17) or patency restoration followed by reclogging (n = 3), the actuated mechanical device was attempted. Procedure time was estimated from the electronic charting system and included set-up, use, and cleaning time for the actuated mechanical device, to the closest five minutes.
All clearing procedures were completed by three trained registered dietitians.</p><p><b>Results:</b> The average time to restore Tube patency (n = 20) was 26.5 minutes (25 minutes for NG, 32.5 minutes for NJ) with a 90% success rate (Table 2), and no significant safety issues were reported by the operator or patient. User satisfaction was 100% (20/20), and patient discomfort was reported in 10% (2/20).</p><p><b>Conclusion:</b> Based on these results, the actuated mechanical device was significantly more successful at resolving clogs compared to alternative bedside practices. Operators noted that the "Actuated mechanical device was able to work with clogs when slurries/water can't be flushed." It was noted that actuated mechanical device use prior to formation of a full clog, utilizing a prophylactic approach, "was substantially easier than waiting until the Tube fully clogged." For a partly clogged Tube, "despite it being somewhat patent and useable, a quick pass of the actuated mechanical device essentially restored full patency and likely prevented a full clog." For an NG patient, "no amount of flushing or medication slurry was effective, but the actuated mechanical device worked in just minutes without issue." "Following standard interventions failure after multiple attempts, only the actuated mechanical device was able to restore Tube patency, saving money on not having to replace Tube." For a failed clearance, the operator noted "that despite failure to restore patency, there was clearly no opportunity for flushes to achieve a better result and having this option [actuated mechanical device] was helpful to attempt to avoid tube replacement." For an NJ patient, "there would have been no other conventional method to restore patency of a NJ small bore feeding tube without extensive x-ray exposure and 'guess work,' which would have been impossible for this patient who was critically ill and ventilator dependent." Having an alternative to standard bedside unclogging techniques proved
beneficial to this facility: it was 90% effective, spared those patients a Tube replacement, and saved the facility money by avoiding Tube replacement costs.</p><p><b>Table 1.</b> Patient and Feeding Tube Demographics.</p><p></p><p><b>Table 2.</b> Actuated Mechanical Device Uses.</p><p></p><p></p><p><b>Figure 1.</b> Actuated Mechanical Device for Clearing Partial and Fully Clogged Indwelling Feeding Tubes.</p><p>Vicki Emch, MS, RD<sup>1</sup>; Dani Foster<sup>2</sup>; Holly Walsworth, RD<sup>3</sup></p><p><sup>1</sup>Aveanna Medical Solutions, Lakewood, CO; <sup>2</sup>Aveanna Medical Solutions, Chandler, AZ; <sup>3</sup>Aveanna Medical Solutions, Erie, CO</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Homecare providers have managed through multiple formula backorders since the pandemic. Through creative problem-solving, clinicians have successfully been able to offer substitutions. However, when pump feeding sets are on backorder, the options are limited; feeding sets are specific to the brand of pump. Recently, a backorder of feeding sets used for pumps common in acute and home care resulted in a severe shortage in the home care supply chain. This required fast action to ensure patients were able to continue administering their tube feeding and to prevent readmissions. One solution is to change the patient to a pump brand that is not on backorder. Normally, transitioning a patient to a different brand of pump would require in-person teaching. Due to the urgency of the situation, a more efficient method needed to be established. A regional home care provider determined that 20% of patients using enteral feeding pumps were using backordered sets, and 50% were pediatric patients who tend not to tolerate other methods of feeding. In response, this home care provider assembled a team to create a new educational model for pump training.
The team was composed of Registered Nurses, Registered Dietitians, and patient care and distribution representatives. The hypothesis was that providing high-quality educational material with instructional videos, detailed communication on the issue, and telephonic clinical support would allow for a successful transition.</p><p><b>Methods:</b> To determine urgency of transition, we prioritized patients with a diagnosis of short bowel syndrome, gastroparesis, glycogen storage disease, or ventilator dependency; patients < 2 years of age; and those living in a rural area with a 2-day shipping zip code, and conducted a clinical review to identify patients with a jejunal feeding tube (See Table 1). A pump conversion team contacted patients/caregivers to review the situation, discuss options for nutrition delivery, determine current inventory of sets, assess urgency for transition, and coordinate delivery of the pump, sets, and educational material. Weekly reporting tracked the number of patients using the impacted pump, transitioned patients, and those requesting to transition back to their original pump.</p><p><b>Results:</b> A total of 2111 patients were using the feeding pump with backordered sets, and 50% of these patients were under 12 years of age. Over a period of 3 months, 1435 patients (68% of this patient population) were successfully transitioned to a different brand of pump; of those, only 7 patients (0.5%) requested to return to their original pump, even though they understood the risk of potentially running short on feeding sets.
(See Figure 1).</p><p><b>Conclusion:</b> A team approach, which included proactively communicating with patients/caregivers, prioritizing patient risk level, providing high-quality educational material with video links, and outbound calls from a clinician, resulted in a successful transition to a new brand of feeding pump.</p><p><b>Table 1.</b> Patient Priority Levels for Pump with Backordered Sets.</p><p></p><p></p><p><b>Figure 1.</b> Number of Pump Conversions.</p><p>Desiree Barrientos, DNP, MSN, RN, LEC<sup>1</sup></p><p><sup>1</sup>Coram CVS, Chino, CA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Follow-up care for the enteral nutrition therapy community is essential for good outcomes. No data had been collected regarding home enteral nutrition (HEN) outcomes at a major university medical center. There was no robust program in place to follow up with patients who were discharged on tube feedings. Consequently, there was little information regarding the follow-up care or patient outcomes related to current practice, complications, re-hospitalizations, and equipment issues for this population.</p><p><b>Methods:</b> The tools utilized were the questionnaire for the 48-hour and 30-day post-discharge outreach calls, pre-discharge handouts, and feeding pump handouts.</p><p><b>Results:</b> Education: Comparison of 48 Hours and 30 Days. Q1: Can you tell me why you were hospitalized? Q2: Did a provider contact you for education prior to discharging home? Q3: Do you understand your nutrition orders from your doctor? Q4: Can you tell me the steps of how to keep your PEG/TF site clean? Q5: Can you tell me how much water to flush your tube? There was an improvement in patient understanding, self-monitoring, and navigation between the two survey timepoints.
Regarding patient education, between the 48-hour and 30-day timepoints understanding of nutrition orders (Q3) improved from 91% to 100%, knowledge of the steps to keep the tube feeding site clean (Q4) improved from 78% to 96%, and knowledge of water flushes before and after each feeding improved from 81% to 100%.</p><p><b>Conclusion:</b> There was an improvement in patient understanding, self-monitoring, and navigation between the two survey timepoints. Verbal responses to open-ended and informational questions were aggregated to analyze complications, care gaps, and service failures.</p><p><b>Table 1.</b> Questionnaire Responses At 48 Hours and 30 Days.</p><p></p><p><b>Table 2.</b> Questionnaire Responses At 48 Hours and 30 Days.</p><p></p><p></p><p><b>Figure 1.</b> Education: Comparison at 48-hours and 30-days.</p><p></p><p><b>Figure 2.</b> Self-monitoring and Navigation: Comparison at 48-hours and 30-days.</p><p>Rachel Ludke, MS, RD, CD, CNSC, CCTD<sup>1</sup>; Cayla Marshall, RD, CD<sup>2</sup></p><p><sup>1</sup>Froedtert Memorial Lutheran Hospital, Waukesha, WI; <sup>2</sup>Froedtert Memorial Lutheran Hospital, Big Bend, WI</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Initiation of early enteral nutrition plays an essential role in improving patient outcomes.<sup>1</sup> Historically, feeding tubes have been placed by nurses, doctors and advanced practice providers. Over the past two decades, the prevalence of dietitians (RDNs) placing feeding tubes at the bedside has grown.
This practice has been endorsed by the Academy of Nutrition and Dietetics and the American Society for Parenteral and Enteral Nutrition through modifications to the Scope of Practice for the Registered Dietitian Nutritionist and Standards of Professional Performance for Registered Dietitian Nutritionists.<sup>2,3</sup> Bedside feeding tube placement by RDNs has the potential to decrease nursing, fluoroscopy and internal transport time, which is of interest to our hospital. In the fall of 2023, we launched a pilot to evaluate the feasibility of an RDN-led bedside tube placement team at our 800-bed level 1 trauma center.</p><p><b>Methods:</b> RDNs first worked closely with nursing leadership to create a tube placement workflow to identify appropriate patients, outline communication needed with the bedside nurse and provider, and establish troubleshooting solutions (Figure 1). Intensive care unit nursing staff then trained RDNs in tube placement using camera-guided technology (IRIS) and deemed RDNs competent after 10 successful tube placements. Given limited literature on RDN-led tube placement, we defined success as >80% of tube placements in an appropriate position within the gastrointestinal tract.</p><p><b>Results:</b> To date, the pilot includes 57 patients. Forty-six tubes (80.7%; 39 gastric and 7 post-pyloric) were placed successfully, as confirmed by KUB. Of note, 2 of the 39 gastric tubes were originally ordered to be placed post-pylorically; however, gastric tubes were deemed acceptable for these patients after issues were identified during placement. Eleven (19%) tube placements were unsuccessful due to behavioral issues, blockages in the nasal cavity, and anatomical abnormalities.</p><p><b>Conclusion:</b> This pilot demonstrated that well-trained RDNs can successfully place feeding tubes at the bedside using camera-guided tube placement technology. One limitation to this pilot is the small sample size.
We initially limited the pilot to 2 hospital floors and had trouble educating nursing staff on the availability of the RDN to place tubes. Through evaluation of tube placement orders, we found the floors placed on average 198 tubes during our pilot, indicating 141 missed opportunities for RDNs to place tubes. To address these issues, we created numerous educational bulletins and worked with unit nursing staff to encourage contacting the RDN when feeding tube placement was needed. We also expanded the pilot hospital-wide and are looking into time periods when tubes are most often placed. Anecdotally, bedside feeding tube placement takes 30 to 60 minutes; therefore, this pilot saved 1350 to 2700 minutes of nursing time and 180 to 360 minutes of fluoroscopy time necessary to place post-pyloric tubes. Overall, our pilot has demonstrated the feasibility of RDN-led bedside feeding tube placement, allowing staff RDNs to practice at the top of their scope and promoting effective use of hospital resources.</p><p></p><p><b>Figure 1.</b> Dietitian Feeding Tube Insertion Pilot: 2NT and 9NT.</p><p>Lauren Murch, MSc, RD<sup>1</sup>; Janet Madill, PhD, RD, FDC<sup>2</sup>; Cindy Steel, MSc, RD<sup>3</sup></p><p><sup>1</sup>Nestle Health Science, Cambridge, ON; <sup>2</sup>Brescia School of Food and Nutritional Sciences, Faculty of Health Sciences, Western University, London, ON; <sup>3</sup>Nestle Health Science, Hamilton, ON</p><p><b>Financial Support:</b> Nestle Health Science.</p><p><b>Background:</b> Continuing education (CE) is a component of professional development which serves two functions: maintaining practice competencies and translating new knowledge into practice. Understanding registered dietitian (RD) participation in and perceptions of CE facilitates creation of more effective CE activities to enhance knowledge acquisition and practice change.
This preliminary analysis of a practice survey describes RD participation in CE and evaluates barriers to CE participation.</p><p><b>Methods:</b> This was a cross-sectional survey of clinical RDs working across care settings in Canada. Targeted participants (n = 4667), identified using a convenience sample and in accordance with applicable law, were invited to complete a 25-question online survey between November 2023 and February 2024. Descriptive statistics and frequencies were reported.</p><p><b>Results:</b> Nationally, 428 RDs working in acute care, long-term care and home care fully or partially completed the survey (9.1% response). Respondents indicated the median ideal number of CE activities per year was 3 in-person activities, 5 reviews of written materials, and 6 virtual activities. However, participation was most frequently reported as “less often than ideal” for in-person activities (74.7% of respondents) and written material (53.6%) and “as often as ideal” for virtual activities (50.7%). Pre-recorded video presentations, live virtual presentations and critically reviewing written materials were the most common types of CE that RDs had participated in at least once in the preceding 12 months. In-person hands-on sessions, multimodal education and simulations were the least common types of CE that RDs had encountered in the preceding 12 months (Figure 1). The most frequent barriers to participation in CE were cost (68% of respondents), scheduling (60%), and location (51%). However, the barriers that most greatly limited participation in CE were inadequate staff coverage, lack of dedicated education days within the role, and lack of dedicated time during work hours (Table 1).
When deciding to participate in CE, RDs ranked the most important aspects of the content as 1) coming from a credible source, 2) addressing a specific/narrow topic relevant to practice, and 3) enabling use of practical tools/skills at the bedside.</p><p><b>Conclusion:</b> These data suggest there is opportunity for improvement in RD CE participation, with the greatest gaps in in-person and written activities. Although RDs recognize the importance of relevant and practical content, they reported infrequent exposure to types of CE that are well-suited to this, such as simulations and hands-on activities. When planning and executing CE, content should be credible, relevant, and practical, using a format that is both accessible and impactful. Results of this study help benchmark Canadian RD participation in CE and provide convincing evidence to address barriers and maximize participation.</p><p><b>Table 1.</b> Frequent and Impactful Barriers Limiting Participation in CE Activities.</p><p></p><p>Note: 1. Percentages total greater than 100% because all respondents selected the 3 most important barriers impacting their participation in CE activities. 2.
Items are ranked based on a weighted score calculated from a 5-point Likert scale, indicating the extent to which the barrier was perceived to have limited participation in CE activities.</p><p></p><p><b>Figure 1.</b> Types of Continuing Education Activities Dietitians Participated In At Least Once, In The Preceding 12-Months.</p><p>Karen Sudders, MS, RDN, LDN<sup>1</sup>; Alyssa Carlson, RD, CSO, LDN, CNSC<sup>2</sup>; Jessica Young, PharmD<sup>3</sup>; Elyse Roel, MS, RDN, LDN, CNSC<sup>2</sup>; Sophia Vainrub, PharmD, BCPS<sup>4</sup></p><p><sup>1</sup>Medtrition, Huntingdon Valley, PA; <sup>2</sup>Endeavor Health/Aramark Healthcare +, Evanston, IL; <sup>3</sup>Parkview Health, Fort Wayne, IN; <sup>4</sup>Endeavor Health, Glenview, IL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Adequate nutrition is a critical component of patient care and plays a significant role in reducing hospital length of stay (LOS). Malnutrition is associated with numerous adverse outcomes, including increased morbidity, delayed recovery, and prolonged hospitalization (Tappenden et al., 2013). Timely nutrition interventions, and strategies for integrating successful nutrition care into hospital protocols to reduce LOS are essential parts of medical nutrition therapy. Modular nutrition often plays a key role in these interventions. A study by Klek et al. (2020) emphasizes the role of modular nutrition in providing personalized nutritional support for the specific needs of critically ill patients. The study suggests that using nutrient modules allows for a more precise adjustment of nutrition based on the metabolic requirements of patients (Klek et al., 2020). Modular nutrition has also been shown to positively impact clinical outcomes in ICU patients. An observational study by Compher et al. 
(2019) reported that the targeted administration of protein modules to achieve higher protein intake was associated with improved clinical outcomes, such as reduced ICU LOS (Compher et al., 2019).</p><p><b>Methods:</b> Administration of modular nutrition can be a challenge. Typically, modular proteins (MP) are ordered through the dietitian and dispensed as part of the diet order. The nursing team is responsible for administration and documentation of the MP. There is commonly a disconnect between MP prescription and administration. In some cases, it's related to the MP not being a tracked task in the electronic health record (EHR). The goal of this evaluation was to review data related to a quality improvement (QI) initiative in which MP (ProSource TF) was added to the medication administration record (MAR), which used a barcode scanning process to track provision and documentation of MP. The objective of this evaluation was to determine a possible correlation between the QI initiative and patients’ ICU LOS. The QI initiative evaluated a pre-implementation timeframe from June 1<sup>st</sup>, 2021 to November 30<sup>th</sup>, 2021, with a post-implementation timeframe from January 1<sup>st</sup>, 2022 to June 30<sup>th</sup>, 2022. There were a total of 1962 ICU encounters in the pre-implementation period and 1844 ICU encounters in the post-implementation period. The data were analyzed using a series of statistical tests.</p><p><b>Results:</b> The t-test for the total sample was significant, t(3804) = 8.35, p < .001, indicating the average LOS was significantly lower post-implementation compared to pre-implementation (Table 1). This association suggests that improved provision of MP may be related to a reduced LOS in the ICU. In addition to LOS, we can also suggest a relationship between the MAR and MP utilization. Pre-implementation, 1600 doses of MP were obtained, increasing to 2400 doses obtained post-implementation.
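The abstract does not report the underlying per-encounter LOS values, but the quoted degrees of freedom are consistent with a standard pooled two-sample t-test on the 1962 pre- and 1844 post-implementation encounters (1962 + 1844 - 2 = 3804). A self-contained sketch of that test statistic (illustrative only, not the authors' code; the sample data below are placeholders):

```python
import math

def two_sample_t(x, y):
    """Pooled two-sample t statistic with df = n1 + n2 - 2."""
    n1, n2 = len(x), len(y)
    m1, m2 = sum(x) / n1, sum(y) / n2
    v1 = sum((a - m1) ** 2 for a in x) / (n1 - 1)  # sample variance, group 1
    v2 = sum((b - m2) ** 2 for b in y) / (n2 - 1)  # sample variance, group 2
    sp2 = ((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)  # pooled variance
    t = (m1 - m2) / math.sqrt(sp2 * (1 / n1 + 1 / n2))
    return t, n1 + n2 - 2

# With the study's group sizes, df = 1962 + 1844 - 2 = 3804,
# matching the reported t(3804) = 8.35.
```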
The data suggest there is a correlation between product use and MAR implementation even though the overall encounters at post-implementation were reduced. There was a 50% increase in product utilization post-implementation compared to the pre-implementation period.</p><p><b>Conclusion:</b> The data provided suggest a benefit of adding MP to the MAR to help improve provision, streamline documentation and potentially reduce ICU LOS.</p><p><b>Table 1.</b> Comparison of LOS Between Pre and Post Total Encounters.</p><p></p><p>Table 1 displays the t-test comparison of LOS in pre vs post implementation of MP on the MAR.</p><p></p><p><b>Figure 1.</b> Product Utilization and Encounters Pre vs Post Implementation of MP on the MAR.</p><p><b>International Poster of Distinction</b></p><p>Eliana Giuntini, PhD<sup>1</sup>; Ana Zanini, RD, MSc<sup>2</sup>; Hellin dos Santos, RD, MSc<sup>2</sup>; Ana Paula Celes, MBA<sup>2</sup>; Bernadette Franco, PhD<sup>3</sup></p><p><sup>1</sup>Food Research Center/University of São Paulo, São Paulo; <sup>2</sup>Prodiet Medical Nutrition, Curitiba, Parana; <sup>3</sup>Food Research Center/School of Pharmaceutical Sciences/University of São Paulo, São Paulo</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Critically ill patients present an increased need for protein to preserve muscle mass due to anabolic resistance. Additionally, these patients are more prone to developing hyperglycemia, and one of the nutritional strategies that can be adopted is to provide a diet with a low glycemic index. Hypercaloric and high-protein enteral formulas can help meet energy and protein goals for these patients. Due to their reduced carbohydrate content, these formulas can also contribute to lowering the postprandial glycemic response.
The study aimed to evaluate the glycemic index (GI) and the glycemic load (GL) of a specialized high-protein enteral nutrition formula.</p><p><b>Methods:</b> Fifteen healthy volunteers, aged between 21 and 49 years, were selected based on self-reported absence of disease or regular medication use, with normal glucose tolerance according to fasting and postprandial glucose assessments over 2 hours. The individuals attended after a 10-hour fast, once per week, consuming the glucose solution (reference food) for 3 weeks, and the specialized high-protein enteral formula (Prodiet Medical Nutrition) in the following week, both in amounts equivalent to 25 g of available carbohydrates. The specialized high-protein formula provides 1.5 kcal/ml, 26% protein (98 g/L), 39% carbohydrates, and 35% lipids, including EPA + DHA. Capillary blood sampling was performed at regular intervals, at 0 (before consumption), 15, 30, 45, 60, 90, and 120 minutes. The incremental area under the curve (iAUC) was calculated, excluding areas below the fasting line. Glycemic load (GL) was determined using the equation GL = [GI (glucose as reference) × grams of available carbohydrates in the portion]/100. Student's t-tests were conducted to identify differences (p < 0.05).</p><p><b>Results:</b> To consume 25 g of available carbohydrates, the individuals ingested 140 g of the high-protein formula. The high-protein enteral formula showed a low GI (GI = 23), with a significant difference compared to glucose (p < 0.0001), and a low GL (GL = 8.2). The glycemic curve data showed significant differences at all time points between glucose and the specialized high-protein formula, except at T90, with the glycemic peak occurring at T30 for glucose (126 mg/dL) and at both T30 and T45 for the specialized high-protein enteral formula, with values significantly lower than glucose (102 vs 126 mg/dL).
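The iAUC and GL calculations described in the Methods can be sketched as follows; this is an illustrative simplification (the trapezoidal rule with clamping approximates exclusion of area below the fasting line, and segments crossing the baseline are not split at the crossing point):

```python
def incremental_auc(times, glucose):
    """Trapezoidal incremental AUC (mg/dL x min), counting only area above
    the fasting (time-0) value; values below baseline contribute zero."""
    baseline = glucose[0]
    area = 0.0
    for i in range(1, len(times)):
        y0 = max(glucose[i - 1] - baseline, 0.0)
        y1 = max(glucose[i] - baseline, 0.0)
        area += (y0 + y1) / 2.0 * (times[i] - times[i - 1])
    return area

def glycemic_index(iauc_test, iauc_reference):
    """GI = 100 x iAUC(test food) / iAUC(glucose reference)."""
    return 100.0 * iauc_test / iauc_reference

def glycemic_load(gi, available_carbs_g):
    """GL = [GI x grams of available carbohydrate in the portion] / 100."""
    return gi * available_carbs_g / 100.0
```

Note that published GI values are conventionally averaged over each subject's individual iAUC ratio rather than computed from group-mean iAUCs, so the group means reported below need not reproduce GI = 23 exactly; likewise, the reported GL of 8.2 presumably reflects a standard serving rather than the 25 g test portion.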
The iAUC was smaller for the specialized high-protein formula compared to glucose (538 ± 91 vs 2061 ± 174 mg/dL × min; p < 0.0001), exhibiting a curve without a high peak, typical of foods with a low glycemic index.</p><p><b>Conclusion:</b> The specialized high-protein enteral nutrition formula showed a low GI and GL, resulting in a significantly reduced postprandial glycemic response, with lower glucose elevation and variation. This may reduce insulin requirements and glycemic variability.</p><p></p><p><b>Figure 1.</b> Mean Glycemic Response of Volunteers (N = 15) to 25 g of Available Carbohydrates After Consumption of Reference Food and a Specialized High-Protein Enteral Nutrition Formula, in 120 Min.</p><p>Lisa Epp, RDN, LD, CNSC, FASPEN<sup>1</sup>; Bethaney Wescott, APRN, CNP, MS<sup>2</sup>; Manpreet Mundi, MD<sup>2</sup>; Ryan Hurt, MD, PhD<sup>2</sup></p><p><sup>1</sup>Mayo Clinic Rochester, Rochester, MN; <sup>2</sup>Mayo Clinic, Rochester, MN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Hypnotherapy is the use of hypnosis for the treatment of a medical or psychological disorder. Specifically, gut-directed hypnotherapy is a treatment option for functional gastrointestinal disorders and disorders of the gut-brain axis. It has been shown to be effective in management of GI symptoms such as abdominal pain, nausea, functional dyspepsia and irritable bowel syndrome symptoms. Evidence suggests that 6%–19% of patients with these GI symptoms exhibit characteristics of avoidant/restrictive food intake disorder (ARFID). Multiple studies show improvement in GI symptoms and the ability to maintain that improvement after 1 year.
However, there is a paucity of data regarding the use of hypnotherapy in home enteral nutrition patients.</p><p><b>Methods:</b> A case report is presented involving a 67-year-old female with a history of irritable bowel syndrome (diarrhea predominant) and new mucinous appendiceal cancer s/p debulking of an abdominal tumor, including colostomy and distal gastrectomy. She was on parenteral nutrition (PN) for 1 month post op due to delayed return of bowel function before her oral diet was advanced. Unfortunately, she had difficulty weaning from PN, as she was “scared to start eating” due to functional dysphagia with gagging at the sight of food, even on TV. After 4 weeks of PN, a nasojejunal feeding tube was placed and she was dismissed home.</p><p><b>Results:</b> At a multidisciplinary outpatient nutrition clinic visit, the patient was dependent on enteral nutrition and reported inability to tolerate oral intake for unclear reasons. Long-term enteral access was discussed; however, the patient wished to avoid this and asked for alternative interventions she could try to help her eat. She was referred for gut-directed hypnotherapy. After 4 in-person hypnotherapy sessions over 3 weeks, the patient was able to tolerate increasing amounts of oral intake and remove her nasojejunal feeding tube. Upon follow-up 4 months later, she was still eating well and continued to praise the outcome she received from gut-directed hypnotherapy.</p><p><b>Conclusion:</b> Patient-centered treatments for gut-brain axis disorders and disordered eating behaviors and/or eating disorders are important to consider in addition to nutrition support. These include but are not limited to Cognitive Behavior Therapy, mindfulness interventions, acupuncture, biofeedback strategies, and gut-directed hypnotherapy. Group, online, and therapist-directed therapies could be considered as treatment avenues depending on patient needs and preferences. 
Additional research is needed to better delineate the impact of these treatment modalities in the home enteral nutrition population.</p><p>Allison Krall, MS, RD, LD, CNSC<sup>1</sup>; Cassie Fackler, RD, LD, CNSC<sup>1</sup>; Gretchen Murray, RD, LD, CNSC<sup>1</sup>; Amy Patton, MHI, RD, CNSC, LSSGB<sup>2</sup></p><p><sup>1</sup>The Ohio State University Wexner Medical Center, Columbus, OH; <sup>2</sup>The Ohio State University Wexner Medical Center, Westerville, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> It is well documented that unnecessary hospital admissions can have a negative impact on patients' physical and emotional wellbeing and can increase healthcare costs.<sup>1</sup> Numerous strategies exist to limit unnecessary hospital admissions; one innovative strategy being utilized at our 1000+ bed Academic Medical Center involves Registered Dietitians (RDs). Literature demonstrates that feeding tube placement by a dedicated team using electromagnetic tracking reduces patient morbidity and mortality and is a cost-effective solution for this procedure.<sup>2</sup> RDs have been part of feeding tube teams for many years, though the exact number of RD-only teams is unclear.<sup>3</sup> The Revised 2021 Standards of Practice and Standards of Professional Performance for RDNs (Competent, Proficient, and Expert) in Nutrition Support identifies that dietitians at the “expert” level strive for additional experience and training, and thus may serve a vital role in feeding tube placement teams.<sup>4</sup> Prior to the implementation of the RD-led tube team, there was no uniform process at our hospital for these patients to obtain enteral access in a timely manner.</p><p><b>Methods:</b> In December 2023 an “RD tube team” consult and order set went live within the electronic medical record at our hospital. 
The original intent of the tube team was to cover the inpatient units, but it soon became apparent that there were opportunities to extend this service to observation areas and the Emergency Department (ED). This case series abstract outlines case studies from three patients with various clinical backgrounds and how the tube team was able to prevent an inpatient admission. Patient 1: An 81-year-old female who returned to the ED on POD# 4 s/p esophageal repair with a dislodged nasoenteric feeding tube. The RD tube team was consulted and was able to replace her tube and bridle it in place. The patient was discharged from the ED without requiring hospital readmission. Patient 2: An 81-year-old male with a history of ENT cancer who transferred to our ED after an outside hospital ED had no trained staff available to replace his dislodged nasoenteric feeding tube. The RD tube team replaced his tube and bridled it into place. The patient was able to discharge from the ED without admission. Patient 3: A 31-year-old female with a complex GI history and multiple prolonged hospitalizations due to PO intolerance. The patient returned to the ED 2 days post discharge with a clogged nasoenteric feeding tube. The tube could not be unclogged, so the RD tube team replaced the tube in the ED and prevented readmission.</p><p><b>Results:</b> Consult volumes validated that there was a need for a tube team service. In the first 8 months of consult and order set implementation, a total of 403 tubes were placed by the RD team. Of those, 24 (6%) were placed in the ED and observation units. In the 3 patient cases described above, and numerous other patient cases, the RD was able to successfully place a tube using an electromagnetic placement device (EMPD), thereby preventing a patient admission.</p><p><b>Conclusion:</b> Creating a feeding tube team can be a complex process to navigate and requires support from senior hospital administration, physician champions, nursing teams, and legal/risk management teams. 
Within the first year of implementation, our hospital system was able to demonstrate that RD-led tube teams have the potential not only to help with establishing safe enteral access for patients, but also to be an asset to the medical facility by preventing admissions and readmissions.</p><p><b>Table 1.</b> RD Tube Team Consults (December 11, 2023-August 31, 2024).</p><p></p><p>Arina Cazac, RD<sup>1</sup>; Joanne Matthews, RD<sup>2</sup>; Kirsten Willemsen, RD<sup>3</sup>; Paisley Steele, RD<sup>4</sup>; Savannah Zantingh, RD<sup>5</sup>; Sylvia Rinaldi, RD, PhD<sup>2</sup></p><p><sup>1</sup>Internal Equilibrium, King City, ON; <sup>2</sup>London Health Sciences Centre, London, ON; <sup>3</sup>NutritionRx, London, ON; <sup>4</sup>Vanier Children's Mental Wellness, London, ON; <sup>5</sup>Listowel-Wingham and Area Family Health Team, Wingham, ON</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Parkinson's disease is the second most prevalent neurodegenerative disease, and dysphagia is a predominant disease-related symptom. Dysphagia can increase the risk of aspiration, the inhaling of food or liquids into the lungs, potentially leading to pneumonia, a recurrent cause of death in patients with Parkinson's disease. Therefore, feeding tubes are placed to deliver nutrition into the stomach or the small intestine to maintain appropriate nutrition delivery and reduce the risk of aspiration from oral intake. To the best of our knowledge, there is no research comparing the differences in outcomes between gastric (G) and jejunal (J) tube feeding in patients with Parkinson's disease-related dysphagia; however, limited research does exist in critically ill populations comparing these two modalities. 
The purpose of this study is to compare the differences in hospital readmissions related to aspiration events and differences in mortality rates after placement of a gastric or jejunal feeding tube in patients with Parkinson's disease-related dysphagia.</p><p><b>Methods:</b> This was a retrospective chart review of patients admitted to the medicine or clinical neurosciences units at University Hospital in London, Ontario, Canada, between January 1, 2010, and December 31, 2022. Patients were included if they had a documented diagnosis of Parkinson's or associated diseases and had a permanent feeding tube placed during hospital admission that was used for enteral nutrition. Patients were excluded from the study if they had a comorbidity that would affect their survival, such as cancer, or if a feeding tube was placed unrelated to Parkinson's disease-related dysphagia, for example, feeding tube placement post-stroke. A p-value < 0.05 was considered statistically significant.</p><p><b>Results:</b> Twenty-five participants were included in this study; 7 had gastric feeding tubes, and 18 had jejunal feeding tubes. Demographic data is shown in Table 1. No statistically significant differences were found in demographic variables between the G- and J-tube groups. Of interest, none of the 7 participants (28%) with dementia were discharged home; 5 were discharged to long-term care, 1 was discharged to a complex continuing care facility, and 1 died in hospital. Differences in readmission rates and mortality between groups did not reach significance, likely due to our small sample size in the G-tube group (Figures 1 and 2). 
However, we found that 50% of participants were known to have died within 1 year of initiating enteral nutrition via their permanent feeding tube, and there was a trend of higher readmission rates in the G-tube group.</p><p><b>Conclusion:</b> While this study did not yield statistically significant results, it highlights the need for further research with a larger sample size to assess confounding factors, such as concurrent oral intake, that affect the difference in outcomes between G- and J-tube groups. Future research would also benefit from examining the impact on quality of life in these patients. Additional research is necessary to inform clinical practice guidelines and clinical decision-making for clinicians, patients, and families when considering a permanent feeding tube.</p><p><b>Table 1.</b> Participant Demographics.</p><p></p><p></p><p>Readmission rates were calculated as a percentage of the number of readmissions to the number of discharges from hospital. If a participant was readmitted more than once within the defined timeframes, subsequent readmissions were counted as a new readmission and new discharge event. Readmission rate calculations did not include participants who died during or after the defined timeframes. Differences in readmission rates between gastric and jejunal feeding tube groups did not reach statistical significance.</p><p><b>Figure 1.</b> Readmission Rate.</p><p></p><p>Mortality rates were calculated from the time that enteral nutrition was initiated through a permanent feeding tube in 30-day, 60-day, 90-day, and 1-year time intervals. 
Differences in mortality rates between gastric and jejunal feeding tube groups did not reach statistical significance.</p><p><b>Figure 2.</b> Mortality Rate.</p><p>Jennifer Carter, MHA, RD<sup>1</sup></p><p><sup>1</sup>Winchester Medical Center, Valley Health, Winchester, VA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Early enteral nutrition has been shown to improve patient outcomes and can decrease, or attenuate, the progression of malnutrition. Placement of nasoenteric feeding tubes was deemed within the scope of practice (SOP) for Registered Dietitian Nutritionists (RDNs) in 2007, with recent revisions in 2021, by the Academy of Nutrition and Dietetics (AND) and the American Society for Parenteral and Enteral Nutrition (ASPEN). With enhanced order-writing privileges, RDNs are well positioned to identify patients in need of enteral nutrition. The goal of this abstract is to highlight the efficiency of an RDN-led nasoenteric tube placement team.</p><p><b>Methods:</b> A retrospective chart review of the first 70 patients who received a nasoenteric tube placed by an RDN in 2023 was conducted. Data points collected included time from tube order to tube placement and time from tube order to enteral nutrition order.</p><p><b>Results:</b> Out of 70 tubes placed, the average time from tube order to tube placement was 2.28 hours. The longest time from tube order to placement was 19 hours. The average time from tube order to enteral nutrition order was 5.58 hours. The longest time from tube order to enteral nutrition order was 23.6 hours.</p><p><b>Conclusion:</b> This retrospective review reflects the timeliness of placement and provision of enteral nutrition in the acute care setting when performed by an all-RDN team. Overall, placement occurred within 2.5 hours of the tube placement order, and enteral nutrition orders were entered within 6 hours of the tube placement order. 
The RDNs at Winchester Medical Center have been placing nasoenteric feeding tubes since 2013 using an electromagnetic tube placement device (EMPD). As of 2018, it became an all-RDN team. With the enhanced support from AND and ASPEN, nasoenteric feeding tube placement continues to be promoted within the SOP for RDNs. Also, the Accreditation Council for Education in Nutrition and Dietetics (ACEND) now requires dietetic interns to learn, observe, and even assist with nasoenteric tube placements. Over time, more RDNs in the acute care setting will likely advance their skillset to include this expertise.</p><p></p><p><b>Figure 1.</b> Time From MD Order to Tube Placement in Hours.</p><p></p><p><b>Figure 2.</b> Time From MD Order of Tube to Tube Feed Order in Hours.</p><p><b>Poster of Distinction</b></p><p>Vanessa Millovich, DCN, MS, RDN, CNSC<sup>1</sup>; Susan Ray, MS, RD, CNSC, CDCES<sup>2</sup>; Robert McMahon, PhD<sup>3</sup>; Christina Valentine, MD, RDN, FAAP, FASPEN<sup>4</sup></p><p><sup>1</sup>Kate Farms, Hemet, CA; <sup>2</sup>Kate Farms, Temecula, CA; <sup>3</sup>Seven Hills Strategies, Columbus, OH; <sup>4</sup>Kate Farms, Cincinnati, OH</p><p><b>Financial Support:</b> Kate Farms provided all financial support.</p><p><b>Background:</b> Whole-food, plant-based diets have demonstrated metabolic benefits across many populations. The resulting increased intake of dietary fiber and phytonutrients is integral to the success of this dietary pattern due to the positive effect on the digestive system. Patients dependent on tube feeding may not receive dietary fiber or sources of phytonutrients, and the impact of this is unknown. Evidence suggests that the pathways that promote digestive health include more than traditional prebiotic sources from carbohydrate fermentation. Data on protein fermentation metabolites and potential adverse effects on colon epithelial cell integrity are emerging. 
These lesser-known metabolites, branched-chain fatty acids (BCFAs), are produced through proteolytic fermentation. Emerging research suggests that the overproduction of BCFAs via protein fermentation may be associated with toxic by-products like p-cresol. These resulting by-products may play a role in digestive disease pathogenesis. Enteral formulas are often used to support the nutritional needs of those with digestive conditions. Plant-based formulations made with yellow pea protein have been reported to improve GI tolerance symptoms. However, the underlying mechanisms responsible have yet to be investigated. The purpose of this study was to assess the impact of a mixed food matrix enteral formula containing pea protein, fiber, and phytonutrients on various markers of gut health in healthy children and adults, using an in-vitro model.</p><p><b>Methods:</b> Stool samples of ten healthy pediatric and ten healthy adult donors were collected and stored at -80°C. The yellow pea protein formulas (Kate Farms™ Pediatric Standard 1.2 Vanilla-P1, Pediatric Peptide 1.0 Vanilla-P2, and Standard 1.4 Plain-P3) were first predigested using standardized intestinal processing to simulate movement along the digestive tract. The in-vitro model was ProDigest's Colon-on-a-Plate (CoaP®) simulation platform, which has demonstrated in vivo-in vitro correlation. Measurements of microbial metabolic activity included pH, gas production, SCFAs, BCFAs, ammonia, and microbiota shifts. Paired two-sided t-tests were performed to evaluate differences between treatment and control. Differential abundance analysis was performed using LEfSe and treeclimbR. Statistical significance, as compared to negative control, is indicated by a p-value of < 0.05.</p><p><b>Results:</b> In the pediatric group, the microbial analysis showed significant enrichment of Bifidobacteria as well as the butyrate-producing genera Agathobacter and Agathobaculum with the use of the pediatric formulas when compared to the control. 
P1 resulted in a statistically significant reduction of BCFA production (p ≤ 0.05). P1 and P2 resulted in statistically significant increases in acetate and propionate. In the adult group, with treatment using P3, microbial analysis showed significant enrichment of Bifidobacteria compared to the control group. P3 also resulted in a reduction of BCFAs, although not statistically significant. Gas production and the drop in pH were statistically significant (p ≤ 0.05) for all groups P1, P2, and P3 compared to control, indicating microbial activity.</p><p><b>Conclusion:</b> All enteral formulas demonstrated a consistent prebiotic effect on the gut microbial community composition in healthy pediatric and adult donors. These findings provide insight into the mechanisms related to digestive health and highlight the importance of designing prospective interventional research to better understand the role of fiber and phytonutrients within enteral products.</p><p>Hill Johnson, MEng<sup>1</sup>; Shanshan Chen, PhD<sup>2</sup>; Garrett Marin<sup>3</sup></p><p><sup>1</sup>Luminoah Inc, Charlottesville, VA; <sup>2</sup>Virginia Commonwealth University, Richmond, VA; <sup>3</sup>Luminoah Inc, San Diego, CA</p><p><b>Financial Support:</b> Research was conducted with the support of VBHRC's VA Catalyst Grant Funding.</p><p><b>Background:</b> Medical devices designed for home use must prioritize user safety, ease of operation, and reliability, especially in critical activities such as enteral feeding. This study aimed to validate the usability, safety, and overall user satisfaction of a novel enteral nutrition system through summative testing and task analysis.</p><p><b>Methods:</b> A simulation-based, human factors summative study was conducted with 36 participants, including both caregivers and direct users of enteral feeding technology. Participants were recruited across three major cities: Houston, Chicago, and Phoenix. 
Task analysis focused on critical and non-critical functions of the Luminoah FLOW™ Enteral Nutrition System, while user satisfaction was measured using the System Usability Scale (SUS). The study evaluated successful task completion, potential use errors, and qualitative user feedback.</p><p><b>Results:</b> All critical tasks were completed successfully by 100% of users, with the exception of a cleaning task, which had an 89% success rate. Non-critical tasks reached an overall completion rate of 95.7%, demonstrating the ease of use and intuitive design of the system. The SUS score was exceptionally high, averaging 91.5, indicating strong user preference for the device over current alternatives. Furthermore, 91% of participants indicated they would choose the new system over other products in the market.</p><p><b>Conclusion:</b> The innovative portable enteral nutrition system demonstrated excellent usability and safety, meeting the design requirements for its intended user population. High completion rates for critical tasks and an overwhelmingly positive SUS score underscore the system's ease of use and desirability. These findings suggest that the system is a superior option for home enteral feeding, providing both safety and efficiency in real-world scenarios. Further refinements in instructional materials may improve user performance on non-critical tasks.</p><p>Elease Tewalt<sup>1</sup></p><p><sup>1</sup>Phoenix Veterans Affairs Administration, Phoenix, AZ</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Enhanced Recovery After Surgery (ERAS) protocols, including preoperative carbohydrate loading, aim to accelerate recovery by reducing stress responses and insulin resistance. These protocols have been shown to decrease hospital stays, postoperative complications, and healthcare costs. However, there is limited knowledge about the safety and efficacy of ERAS for diabetic patients. 
Patients with diabetes make up 15% of surgical cases and often have longer hospital stays and more postoperative complications. This study evaluated outcome measures important to patients with diabetes in a non-diabetic population in order to lay the groundwork for future trials that could include diabetic patients in ERAS protocols.</p><p><b>Methods:</b> A retrospective chart review at the Phoenix Veterans Affairs Health Care System compared 24 colorectal surgery patients who received preoperative carbohydrate drinks with 24 who received traditional care. Outcomes assessed included blood glucose (BG) levels, aspiration, and postoperative complications. Additional analyses evaluated adherence, length of hospital stay, and healthcare costs.</p><p><b>Results:</b> The demographics of the two groups were comparable (Table 1). The preoperative BG levels of the carbohydrate loading group (164.6 ± 36.3 mg/dL) were similar to those of the control group (151.8 ± 47.7 mg/dL) (p > 0.05) (Figure 1). The carbohydrate loading group demonstrated lower and more stable postoperative BG levels (139.4 ± 37.5 mg/dL) compared to the control group (157.6 ± 61.9 mg/dL), but this difference was not statistically significant (p > 0.05) (Figure 2). There were no significant differences in aspiration or vomiting between the groups (p > 0.05) (Table 2). The carbohydrate loading group had a shorter average hospital stay by one day, but this difference was not statistically significant (p > 0.05) (Table 2).</p><p><b>Conclusion:</b> Carbohydrate loading as part of ERAS protocols was associated with better postoperative glucose control, no increased risk of complications, and reduced hospital stays. Although diabetic patients were not included in this study, these findings suggest that carbohydrate loading is a safe and effective component of ERAS. Including diabetic patients in ERAS is a logical next step that could significantly improve surgical outcomes for this population. 
Future research should focus on incorporating diabetic patients to assess the impact of carbohydrate loading on postoperative glucose control, complication rates, length of stay, and healthcare costs.</p><p><b>Table 1.</b> Demographics.</p><p></p><p>The table includes the demographics of the two groups. The study group consists of participants who received the carbohydrate drink, while the control group includes those who received traditional care.</p><p><b>Table 2.</b> Postoperative Outcomes.</p><p></p><p>The table includes the postoperative outcomes of the two groups. The study group consists of participants who received the carbohydrate drink, while the control group includes those who received traditional care.</p><p></p><p>The figure shows the preoperative BG levels of the carbohydrate and non-carbohydrate groups, along with a trendline for each group (p > 0.05).</p><p><b>Figure 1.</b> Preoperative BG Levels.</p><p></p><p>The figure shows the postoperative BG levels of the carbohydrate and non-carbohydrate groups, along with a trendline for each group (p > 0.05).</p><p><b>Figure 2.</b> Postoperative BG Levels.</p><p><b>Malnutrition and Nutrition Assessment</b></p><p>Amy Patton, MHI, RD, CNSC, LSSGB<sup>1</sup>; Elisabeth Schnicke, RD, LD, CNSC<sup>2</sup>; Sarah Holland, MSc, RD, LD, CNSC<sup>3</sup>; Cassie Fackler, RD, LD, CNSC<sup>2</sup>; Holly Estes-Doetsch, MS, RDN, LD<sup>4</sup>; Kristen Roberts, PhD, RDN, LD, CNSC, FASPEN, FAND<sup>5</sup>; Christopher Taylor, PhD, RDN<sup>4</sup></p><p><sup>1</sup>The Ohio State University Wexner Medical Center, Westerville, OH; <sup>2</sup>The Ohio State University Wexner Medical Center, Columbus, OH; <sup>3</sup>The Ohio State University Wexner Medical Center, Upper Arlington, OH; <sup>4</sup>The Ohio State University, Columbus, OH; <sup>5</sup>The Ohio State University, Granville, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> The unfavorable association of malnutrition with hospital 
outcomes such as longer length of stay (LOS), increased falls, and increased hospital readmissions has been well documented in the literature. We aimed to determine whether a different model of care that lowered Registered Dietitian (RD)-to-patient ratios would lead to increased malnutrition identification and documentation within our facility. We also evaluated the relationship between these metrics and LOS on monitored hospital units.</p><p><b>Methods:</b> In July 2022, two additional RDs were hired at a large 1000+ bed academic medical center as part of a pilot program focused on malnutrition identification, documentation integrity, and staff training. The RD-to-patient ratio was reduced from 1:75 to 1:47 on three units of the hospital. On the pilot units, the RD completed a full nutrition assessment, including a Nutrition Focused Physical Exam (NFPE), for all patients who were identified as “at risk” per hospital nutrition screening policy. Patients who were not identified as “at risk” received a full RD assessment with NFPE by day 7 of admission or per consult request. A malnutrition dashboard was created with assistance from a Quality Data Manager from the Quality and Operations team. This visual graphic allowed us to track and monitor RD malnutrition identification rates by unit and the percentage of patients that had a malnutrition diagnosis captured by the billing and coding team. Data was also pulled from the Electronic Medical Record (EMR) to look at other patient outcomes. In a retrospective analysis, we compared the new model of care to the standard model on one of these units.</p><p><b>Results:</b> There was an increase in the RD-identified capture rate of malnutrition on the pilot units. On a cardiac care unit, the RD identification rate went from a baseline of 6% in Fiscal Year (FY) 2022 to an average of 12.5% over FY 2023-2024. 
On two general medicine units, the malnutrition rates identified by the RD nearly doubled during the two-year intervention (Table 1). LOS was significantly lower on one of the general medicine intervention floors compared to a control unit (p < 0.001, Cohen's D: 13.8) (Table 2). LOS was reduced on all units analyzed between FY22 and FY23/24. Patients with a malnutrition diagnosis in the control group had a 15% reduction in LOS from FY22 to FY23/24, compared to a 19% reduction in LOS for those identified with malnutrition on the intervention unit. When comparing intervention versus control units for FY23 and FY24 combined, the intervention unit had a much lower LOS than the control unit.</p><p><b>Conclusion:</b> Dietitian assessments and related interventions may contribute to reducing LOS. Reducing RD-to-patient ratios may allow for greater identification of malnutrition and support patient outcomes such as LOS. There is an opportunity to evaluate other patient outcomes for the pilot units including falls, readmission rates, and Case Mix Index.</p><p><b>Table 1.</b> RD Identified Malnutrition Rates on Two General Medicine Pilot Units.</p><p></p><p><b>Table 2.</b> Control Unit and Intervention Unit Length of Stay Comparison.</p><p></p><p>Amy Patton, MHI, RD, CNSC, LSSGB<sup>1</sup>; Misty McGiffin, DTR<sup>2</sup></p><p><sup>1</sup>The Ohio State University Wexner Medical Center, Westerville, OH; <sup>2</sup>The Ohio State University Wexner Medical Center, Columbus, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Delays in identifying patients at nutrition risk can impact patient outcomes. Per facility policy, Dietetic Technicians (DTs) and Registered Dietitians (RDs) review patient charts, meet with patients, and assign a nutrition risk level. High-risk patients are then assessed by the RD for malnutrition among other nutrition concerns. 
Based on a new tracking process implemented in January 2023, an average of 501 patient nutrition risk assignments were overdue or incomplete per month from January through April 2023, with a dramatic increase to 835 in May. Missed risk assignments occur daily. If the risk assignment is not completed within the 72-hour parameters of the policy, this can result in late or missed RD assessment opportunities and policy compliance concerns.</p><p><b>Methods:</b> In June 2023, a Lean Six Sigma quality improvement project using the DMAIC (Define, Measure, Analyze, Improve, Control) framework was initiated at a 1000+ bed Academic Medical Center with the goal of improving the efficiency of the nutrition risk assignment (NRA) process for RDs and DTs. A secondary goal was to see if potential improved efficiency would also lead to an increase in RD-identified malnutrition based on Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition Indicators of Malnutrition (AAIM) criteria. A group of RDs and DTs met to walk through each of the quality improvement process steps. The problem was defined, and baseline measurements of malnutrition identification rates and missed nutrition risk assignments were analyzed. A fishbone diagram was used to help with root cause analysis, and later a payoff matrix was used to identify potential interventions. The improve phase was implemented in October and November 2023 and included changes to the screening policy itself and redistributing clinical nutrition staff on certain patient units.</p><p><b>Results:</b> The identified improvements had a positive impact on incomplete work and on malnutrition identification rates. Malnutrition identification rates averaged 11.7% from May through October compared to 12.1% from November through April. The number of missed NRAs decreased from an average of 975 per month May through October to 783 per month November through April, a decrease of 192 per month (20%). 
An additional quality improvement process cycle is currently underway to further improve these metrics.</p><p><b>Conclusion:</b> Fostering a culture of ongoing improvement presents a significant challenge for both clinicians and leaders. Enhancing nutrition care and boosting clinician efficiency are critical goals. Introducing clinical nutrition leaders to tools designed for quality monitoring and enhancement can lead to better performance outcomes and more effective nutrition care. Tools such as those used for this project, along with PDSA (Plan, Do, Study, Act) projects, are valuable in this process. Involving team members from the beginning of these improvement efforts can also help ensure the successful adoption of practice changes.</p><p><b>Table 1.</b> RD Identified Malnutrition Rates.</p><p></p><p><b>Table 2.</b> Incomplete Nutrition Risk Assignments (NRAs).</p><p></p><p>Maurice Jeanne Aguero, RN, MD<sup>1</sup>; Precy Gem Calamba, MD, FPCP, DPBCN<sup>2</sup></p><p><sup>1</sup>Department of Internal Medicine, Prosperidad, Agusan del Sur; <sup>2</sup>Medical Nutrition Department, Tagum City, Davao del Norte</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition is a strong predictor of mortality and morbidity, poor response to therapy, and reduced quality of life among gastrointestinal (GI) cancer patients. In a tertiary government hospital in Tagum City where a cancer center is present, although malnutrition screening among patients with cancer is routine, no studies focusing on determining the association between nutritional status and quality of life among GI cancer patients have been conducted in the past. This study aims to determine whether nutritional status is associated with the quality of life among adult GI cancer Filipino patients seeking cancer therapy in a tertiary government hospital.</p><p><b>Methods:</b> A quantitative, observational, cross-sectional, analytical, and predictive survey design was used. 
The World Health Organization Quality of Life Brief Version (WHOQOL-BREF) questionnaire was used to determine the quality of life of cases. Logistic regression analysis was used to assess the association of the demographic, clinical and nutritional profiles of patients with gastrointestinal cancer with quality of life.</p><p><b>Results:</b> Among respondents (n = 160, mean age 56.4 ± 12 years), most were male (61.9%), married (77.5%), and Roman Catholic (81.1%); 38.1% had finished high school. Almost half were diagnosed with colon adenocarcinoma (43.125%), followed by rectal adenocarcinoma (22.5%), rectosigmoid adenocarcinoma (11.875%), then GI stromal tumor (5.625%). On cancer staging, 40.625% were Stage 4, followed by Stage 3b (19.375%), Stage 3c (10%), Stage 3a (5.625%), then Stage 2a (4.375%). Only 2.5% were Stage 4a, while 0.625% were Stage 4b. More than one third received CAPEOX (38.125%), followed by FOLFOX (25.625%), then IMATINIB (5.625%). Among cases, 15.6% were underweight or obese, and 34.4% were overweight. In terms of SGA grading, 38.1% were severely malnourished and 33.8% moderately malnourished, while the rest were normal to mildly malnourished. On quality of life, mean scores per variable were: generally good for general quality of life (3.71 ± 0.93); generally satisfied for perception of general health, satisfaction with one's self, and one's relationships with others (3.46 to 3.86 ± 0.97); moderate satisfaction with having enough energy for daily life, accepting bodily appearance, the availability of information needed for daily living, and the extent of opportunity for leisure (2.71 to 3.36 ± 1.02); and only a little satisfaction with having enough money to meet their needs (2.38 ± 0.92). Participants, on average, quite often experienced negative feelings such as blue mood, despair, depression and anxiety (2.81 ± 0.79). 
Significant associations of age (p = 0.047), cancer diagnosis (p = 0.001), BMI status (p = 0.028), and SGA nutritional status (p = 0.010) with quality of life among adult cancer patients were documented.</p><p><b>Conclusion:</b> Nutritional status was significantly associated with quality of life among adult Filipino GI cancer patients seeking cancer therapy in a tertiary government hospital. Public health interventions may play a critical role in addressing these factors to improve patient survival and outcomes.</p><p>Carmen, Kaman Lo, MS, RD, LDN, CNSC<sup>1</sup>; Hannah Jacobs, OTD, OTR/L<sup>2</sup>; Sydney Duong, MS, RD, LDN<sup>3</sup>; Julie DiCarlo, MS<sup>4</sup>; Donna Belcher, EdD, MS, RD, LDN, CDCES, CNSC, FAND<sup>5</sup>; Galina Gheihman, MD<sup>6</sup>; David Lin, MD<sup>7</sup></p><p><sup>1</sup>Massachusetts General Hospital, Sharon, MA; <sup>2</sup>MedStar National Rehabilitation Hospital, Washington, DC; <sup>3</sup>New England Baptist Hospital, Boston, MA; <sup>4</sup>Center for Neurotechnology and Neurorecovery, Mass General Hospital, Boston, MA; <sup>5</sup>Nutrition and Food Services, MGH, Boston, MA; <sup>6</sup>Harvard Medical School and Mass General Hospital, Boston, MA; <sup>7</sup>Neurocritical Care & Neurorecovery, MGH, Boston, MA</p><p><b>Financial Support:</b> Academy of Nutrition and Dietetics, Dietitian in Nutrition Support Member Research Award.</p><p><b>Background:</b> Nutritional status is a known modifiable factor for optimal recovery in brain injury survivors, yet data on specific benchmarks for optimizing clinical outcomes through nutrition are limited. 
This pilot study aimed to quantify the clinical, nutritional, and functional characteristics of brain injury survivors from ICU admission to 90 days post-discharge.</p><p><b>Methods:</b> Patients admitted to the Massachusetts General Hospital (MGH) Neurosciences ICU over a 12-month period were enrolled based on the following criteria: age 18 years or older, a primary diagnosis of acute brain injury, an ICU stay of at least 72 hours, meeting MGH Neurorecovery Clinic referral criteria for outpatient follow-up, and survival beyond 90 days post-discharge. Data were collected from the electronic health record and from the neurorecovery clinic follow-up visit/phone interview. These included patient characteristics, acute clinical outcomes, nutrition intake, and surrogate nutrition and functional scores at admission, discharge, and 90 days post-discharge. Descriptive statistics were used for analysis.</p><p><b>Results:</b> Of 212 admissions during the study period, 50 patients were included in the analysis. The mean APACHE (n = 50), GCS (n = 50), and NIHSS (n = 20) scores were 18, 11 and 15, respectively. Seventy-eight percent of the patients required ventilation, with a mean duration of 6.3 days. Mean ICU and post-ICU lengths of stay were 17.4 and 15.9 days, respectively. Eighty percent received nutrition (enteral or oral) within 24-48 hours of ICU admission. Mean ICU energy and protein intake over the first 7 days were 1128 kcal/day and 60.3 g protein/day, respectively, each 63% of estimated needs. When assessed against ASPEN guidelines, patients with a BMI ≥ 30 received less energy (11.6 vs 14.6 kcal/kg/day) but more protein (1.04 vs 0.7 g protein/kg/day) than those with a BMI < 30. Twelve percent of patients met less than 50% of their nutritional needs for at least 7 days pre-discharge and were considered nutritionally at risk. Forty-six percent were discharged with long-term enteral access. Only 16% of the patients were discharged to home rather than a rehabilitation facility. 
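Weight-normalized intake figures of this kind are simple divisions of daily intake by body weight; a minimal illustrative sketch (the 80 kg body weight below is hypothetical, not a study value):

```python
def intake_per_kg(daily_total: float, weight_kg: float) -> float:
    """Normalize a daily intake (kcal or g protein) by body weight."""
    return daily_total / weight_kg

# Hypothetical 80 kg patient at the cohort's mean daily intake:
energy = intake_per_kg(1128, 80)   # kcal/kg/day
protein = intake_per_kg(60.3, 80)  # g protein/kg/day
print(round(energy, 1), round(protein, 2))  # 14.1 0.75
```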
By 90 days post-discharge, 32% of the patients had been readmitted, with 27% due to stroke. Upon admission, patients’ mean MUST (Malnutrition Universal Screening Tool) and MST (Malnutrition Screening Tool) scores were 0.56 and 0.48, respectively, reflecting low nutritional risk. By discharge, the mean MUST and MST scores had increased to 1.16 and 2.08, respectively, suggesting these patients had become nutritionally at risk. At 90 days post-discharge, both scores had returned to low nutritional risk (MUST 0.48 and MST 0.59). Patients’ functional scores, as measured by the modified Rankin scale (mRS), followed a similar pattern: the mean score was 0.1 at admission, 4.2 at discharge, and 2.8 at 90 days post-discharge. The mean Barthel index at 90 days post-discharge was 64.1, indicating moderate dependence.</p><p><b>Conclusion:</b> This pilot study highlighted key nutritional, clinical, and functional characteristics of brain injury survivors from ICU admission to 90 days post-discharge. Further statistical analysis will help delineate the relationships between nutritional status and clinical and functional outcomes, which may guide future research and practice in ICU nutrition and neurorehabilitation.</p><p>Lavanya Chhetri, BS<sup>1</sup>; Amanda Van Jacob, MS, RDN, LDN, CCTD<sup>1</sup>; Sandra Gomez, PhD, RD<sup>1</sup>; Pokhraj Suthar, MBBS<sup>1</sup>; Sarah Peterson, PhD, RD<sup>1</sup></p><p><sup>1</sup>Rush University Medical Center, Chicago, IL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Identifying frailty in patients with liver disease provides valuable insights into a patient's nutritional status and physical resilience. However, it is unclear if reduced muscle is an important etiology of frailty in liver disease. Identifying the possible connection between frailty and muscle mass may lead to better risk prediction, personalized interventions, and improved outcomes in patient care. 
The purpose of this study was to determine whether frail patients have a lower skeletal muscle index (SMI) compared with not-frail patients with liver disease undergoing liver transplant evaluation.</p><p><b>Methods:</b> A retrospective, cross-sectional study design was utilized. Patients greater than 18 years of age who underwent a liver transplant evaluation from January 1<sup>st</sup>, 2019 until December 31, 2023 were included if they had a liver frailty index (LFI) assessment completed during the initial liver transplant evaluation and a diagnostic abdominal CT scan completed within 30 days of the initial liver transplant evaluation. Demographic data (age, sex, height, and BMI), etiology of liver disease, MELD-Na score, history of diabetes and hepatocellular carcinoma, liver disease complications (ascites, hepatocellular carcinoma, hepatic encephalopathy & esophageal varices), and LFI score were recorded for each patient. LFI was recorded as both a continuous variable and dichotomized into a categorical variable (frail: defined as LFI ≥ 4.5 versus not frail: defined as LFI ≤ 4.4). Cross-sectional muscle area (cm<sup>2</sup>) from the third lumbar region of the CT was quantified; SMI was calculated (cm<sup>2</sup>/height in meters<sup>2</sup>) and low muscle mass was dichotomized into a categorical variable (low muscle mass: defined as SMI ≤ 50 cm<sup>2</sup>/m<sup>2</sup> for males and ≤39 cm<sup>2</sup>/m<sup>2</sup> for females versus normal muscle mass: defined as SMI > 50 cm<sup>2</sup>/m<sup>2</sup> for males and >39 cm<sup>2</sup>/m<sup>2</sup> for females). An independent t-test was used to determine whether there was a difference in SMI between patients categorized as frail versus not frail.</p><p><b>Results:</b> A total of 104 patients, 57% male with a mean age of 57 ± 10 years and mean BMI of 28.1 ± 6.4 kg/m<sup>2</sup>, were included. 
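The SMI calculation and sex-specific dichotomization described in the Methods can be sketched as follows (a minimal illustration with a hypothetical patient, not the study's analysis code):

```python
def skeletal_muscle_index(l3_muscle_area_cm2: float, height_m: float) -> float:
    """SMI = cross-sectional muscle area at L3 (cm^2) / height (m) squared."""
    return l3_muscle_area_cm2 / (height_m ** 2)

def has_low_muscle_mass(smi: float, sex: str) -> bool:
    """Sex-specific cutoffs from the Methods: <=50 cm^2/m^2 (male), <=39 (female)."""
    cutoff = 50.0 if sex == "male" else 39.0
    return smi <= cutoff

# Hypothetical male patient: 140 cm^2 muscle area at L3, height 1.75 m
smi = skeletal_muscle_index(140.0, 1.75)
print(round(smi, 1), has_low_muscle_mass(smi, "male"))  # 45.7 True
```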
The mean MELD-Na score was 16.5 ± 6.9; 25% had a history of hepatocellular carcinoma and 38% had a history of diabetes. The majority of the sample had at least one liver disease complication (72% had ascites, 54% had hepatic encephalopathy, and 67% had varices). The mean LFI score was 4.5 ± 0.9 and 44% were categorized as frail. The mean SMI was 45.3 ± 12.6 cm<sup>2</sup>/m<sup>2</sup> and 52% were categorized as having low muscle mass (males: 63% and females: 38%). There was no difference in SMI between patients who were frail versus not frail (43.5 ± 10.6 versus 47.3 ± 13.9 cm<sup>2</sup>/m<sup>2</sup>, p = 0.06). SMI by frailty status was also reported separately for males and females; no significance testing was performed due to the small sample sizes. Both frail males (43.5 ± 12.2 versus 48.4 ± 14.9) and frail females (43.4 ± 9.3 versus 45.2 ± 11.8) had a lower SMI compared with their non-frail counterparts.</p><p><b>Conclusion:</b> No difference in SMI between frail versus not frail patients was observed; however, the p-value of 0.06 suggests a marginal trend and a possible difference, and further research is needed to confirm the findings. Additionally, it is concerning that men had a higher rate of low muscle mass and that the mean SMI for both frail and not frail men was below the cut-off used to identify low muscle mass (SMI ≤ 50 cm<sup>2</sup>/m<sup>2</sup>). 
Additional research is needed to explore the underlying factors contributing to low muscle mass in men, particularly in frail populations, and to determine whether targeted interventions aimed at improving muscle mass could mitigate frailty and improve clinical outcomes in patients undergoing liver transplant evaluation.</p><p>Rebekah Preston, MS, RD, LD<sup>1</sup>; Keith Pearson, PhD, RD, LD<sup>2</sup>; Stephanie Dobak, MS, RD, LDN, CNSC<sup>3</sup>; Amy Ellis, PhD, MPH, RD, LD<sup>1</sup></p><p><sup>1</sup>The University of Alabama, Tuscaloosa, AL; <sup>2</sup>The University of Alabama at Birmingham, Birmingham, AL; <sup>3</sup>Thomas Jefferson University, Philadelphia, PA</p><p><b>Financial Support:</b> The ALS Association Quality of Care Grant.</p><p><b>Background:</b> Amyotrophic Lateral Sclerosis (ALS) is a progressive neurodegenerative disease. Malnutrition is common in people with ALS (PALS) due to dysphagia, hypermetabolism, self-feeding difficulty and other challenges. In many clinical settings, malnutrition is diagnosed using the Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition Indicators to Diagnose Malnutrition (AAIM) or the Global Leadership Initiative on Malnutrition (GLIM) criteria. However, little is known about malnutrition assessment practices in ALS clinics. This qualitative study explored how RDs at ALS clinics diagnose malnutrition in PALS.</p><p><b>Methods:</b> Researchers conducted 6 virtual focus groups with 22 RDs working in ALS clinics across the United States. Audio recordings were transcribed verbatim, and transcripts were imported into NVivo 14 software (QSR International, 2023, Melbourne, Australia). Two research team members independently analyzed the data using deductive thematic analysis.</p><p><b>Results:</b> The AAIM indicators were identified as the most used malnutrition diagnostic criteria. Two participants described using a combination of AAIM and GLIM criteria. 
Although all participants described performing thorough nutrition assessments, some said they do not formally document malnutrition in the outpatient setting due to lack of reimbursement and feeling as though the diagnosis would not change the intervention. Conversely, others noted the importance of documenting malnutrition for reimbursement purposes. Across all groups, RDs reported challenges in diagnosing malnutrition because of difficulty differentiating between disease- versus malnutrition-related muscle loss. Consequently, several RDs described adapting current malnutrition criteria to focus on weight loss, decreased energy intake, or fat loss.</p><p><b>Conclusion:</b> Overall, RDs agreed that malnutrition is common among PALS, and they conducted thorough nutrition assessments as part of standard care. Among those who documented malnutrition, most used AAIM indicators to support the diagnosis. However, as muscle loss is a natural consequence of ALS, RDs perceived difficulty in assessing nutrition-related muscle loss. This study highlights the need for malnutrition criteria specific for PALS.</p><p><b>Table 1.</b> Themes Related to Diagnosing Malnutrition in ALS.</p><p></p><p>Carley Rusch, PhD, RDN, LDN<sup>1</sup>; Nicholas Baroun, BS<sup>2</sup>; Katie Robinson, PhD, MPH, RD, LD, CNSC<sup>1</sup>; Maria Geraldine E. Baggs, PhD<sup>1</sup>; Refaat Hegazi, MD, PhD, MPH<sup>1</sup>; Dominique Williams, MD, MPH<sup>1</sup></p><p><sup>1</sup>Abbott Nutrition, Columbus, OH; <sup>2</sup>Miami University, Oxford, OH</p><p><b>Financial Support:</b> This study was supported by Abbott Nutrition.</p><p><b>Background:</b> Malnutrition is increasingly recognized as a condition that is present in all BMI categories. Although much research to date has focused on malnutrition in patients with lower BMIs, there is a need for understanding how nutrition interventions may alter outcomes for those with higher BMIs and existing comorbidities. 
In a post-hoc analysis of the NOURISH trial, which investigated hospitalized older adults with malnutrition, we sought to determine whether consuming a specialized oral nutritional supplement (ONS) containing high energy, protein and beta-hydroxy-beta-methylbutyrate (ONS+HMB) can improve vitamin D and nutritional status in those with a BMI ≥ 27.</p><p><b>Methods:</b> A post-hoc analysis was conducted using data from the NOURISH trial, a randomized, placebo-controlled, multi-center, double-blind study of hospitalized participants with malnutrition and a primary diagnosis of congestive heart failure, acute myocardial infarction, pneumonia, or chronic obstructive pulmonary disease. In the trial, participants received standard care with either ONS+HMB or a placebo beverage (target 2 servings/day) during hospitalization and for 90 days post-discharge. Nutritional status was assessed using Subjective Global Assessment (SGA) and handgrip strength at baseline and 0-, 30-, 60- and 90-days post-discharge. Vitamin D (25-hydroxyvitamin D) was assessed within 72 hours of admission (baseline) and 30- and 60-days post-discharge. Participants with a BMI ≥ 27 formed the analysis cohort. Treatment effect was determined using analysis of covariance adjusted for baseline measures.</p><p><b>Results:</b> The post-hoc cohort consisted of 166 patients with a BMI ≥ 27 and a mean age of 76.41 ± 8.4 years; slightly more than half were female (51.2%). Baseline handgrip strength (n = 137) was 22.3 ± 0.8 kg, and the mean serum concentration of 25-hydroxyvitamin D (n = 138) was 26.0 ± 1.14 ng/mL. At day 90, ONS+HMB improved nutritional status: 64% of the ONS+HMB group were well-nourished (SGA-A) vs. 37% of the control group (p = 0.011). There was a trend toward a greater change in handgrip strength with ONS+HMB during the index hospitalization (baseline to discharge) compared to placebo (least squares means ± standard error: 1.34 ± 0.35 kg vs. 0.41 ± 0.39 kg; p = 0.081), but the difference was not significant at other timepoints. 
Vitamin D concentrations were significantly higher at day 60 in those receiving ONS+HMB compared to placebo (29.7 ± 0.81 vs. 24.8 ± 0.91; p < 0.001).</p><p><b>Conclusion:</b> Hospitalized older patients with malnutrition and a BMI ≥ 27 had significant improvements in vitamin D and nutritional status at days 60 and 90, respectively, if they received standard care plus ONS+HMB as compared to placebo. This suggests that transitions of care to the post-acute setting should consider continuing nutrition interventions such as ONS+HMB, in combination with standard care, for patients with elevated BMI and malnutrition.</p><p>Aline Dos Santos<sup>1</sup>; Isis Helena Buonso<sup>2</sup>; Marisa Chiconeli Bailer<sup>2</sup>; Maria Fernanda Jensen Kok<sup>2</sup></p><p><sup>1</sup>Hospital Samaritano Higienópolis, São Paulo; <sup>2</sup>Hospital Samaritano Higienopolis, São Paulo</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition negatively impacts length of hospital stay, infection rate, mortality, clinical complications, hospital readmission, and average healthcare costs. It is believed that early nutritional interventions could reduce negative events and generate economic benefits. Therefore, our objective was to evaluate the average hospitalization cost of patients identified by nutritional screening as being at nutritional risk with an indication for oral nutritional supplementation.</p><p><b>Methods:</b> Retrospective study including 110 adult patients hospitalized in a private institution, admitted between August 2023 and January 2024. Nutritional screening was performed within 24 hours of admission. To classify low muscle mass according to calf circumference (CC), the cutoff points used were 33 cm for women and 34 cm for men, measured within 96 hours of hospital admission. 
Patients were evaluated in groups: G1, patients with an indication for oral supplementation (OS) that was not started for modifiable reasons; G2, patients with an indication for OS that was started assertively (within 48 hours of the therapeutic indication); G3, patients with an indication for OS that was started late (more than 48 hours after the therapeutic indication); and G4, the combination of G1 and G3, as neither received OS assertively. Patients receiving enteral or parenteral nutritional therapy were excluded.</p><p><b>Results:</b> G2 was prevalent in the studied sample (51%), with an intermediate average length of stay (20.9 days), the lowest average daily hospitalization cost, an average age of 71 years, a significant prevalence of low muscle mass (56%), and the lowest need for hospitalization in intensive care (IC) (63%), with an average length of stay (SLA) in IC of 13.5 days. G1 had the lowest prevalence (9%), the shortest average length of stay (16 days), an average daily hospitalization cost 41% higher than G2, an average age of 68 years, universally adequate muscle mass (100%), and a considerable need for hospitalization in intensive care (70%), but with an SLA in IC of 7.7 days. G3 represented 40% of the sample studied, with the longest average length of stay (21.5 days), an average daily hospitalization cost 22% higher than G2, an average age of 73 years, a significant prevalence of low muscle mass (50%), and an intermediate need for hospitalization in intensive care (66%), but with an SLA in IC of 16.5 days. 
Compared to G2, G4 presented a similar sample size (G2: 56 patients and G4: 54 patients), mean age (72 years), length of stay (20.55 days), rate of hospitalization in IC (66%), and SLA in IC (64.23%), but a higher average daily hospitalization cost (39% higher than G2) and a higher prevalence of patients with low muscle mass (59%).</p><p><b>Conclusion:</b> From the results presented, we can conclude that the percentage of patients who did not receive OS and who spent time in the IC was on average 5% higher than in the other groups; this group had unanimously adequate muscle mass but required supplementation due to clinical conditions, food acceptance and weight loss. More than 50% of patients in all groups except G1 had low muscle mass. Regarding costs, patients supplemented assertively or late cost, respectively, 45% and 29% less compared to patients who did not receive OS. Comparing G2 with G4, the cost remains 39% lower in patients assertively supplemented.</p><p><b>International Poster of Distinction</b></p><p>Daphnee Lovesley, PhD, RD<sup>1</sup>; Rajalakshmi Paramasivam, MSc, RD<sup>1</sup></p><p><sup>1</sup>Apollo Hospitals, Chennai, Tamil Nadu</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition in hospitals, once an underreported issue, has gained significant attention in recent decades. This widespread problem negatively impacts recovery, length of stay (LOS), and overall outcomes. This study aimed to evaluate the effectiveness of clinical nutrition practices and the role of a Nutrition Steering Committee (NSC) in addressing and potentially eradicating this persistent problem by improving nutritional care and patient outcomes.</p><p><b>Methods:</b> Consecutive patients admitted to non-critical units of a tertiary care hospital between January 2018 and August 2024 were included in the study. 
Patient demographics, body mass index (BMI), modified Subjective Global Assessment (mSGA), oral nutritional supplement (ONS) usage, and clinical outcomes were retrospectively extracted from electronic medical records. Data were analyzed using SPSS version 20.0, comparing results before and after the implementation of the NSC.</p><p><b>Results:</b> Out of 239,630 consecutive patients, 139,895 non-critical patients were included, with a mean age of 57.10 ± 15.89 years; 64.3% were men and 35.7% women. The mean BMI was 25.76 ± 4.74 kg/m<sup>2</sup>, and 49.6% of the patients were polymorbid. The largest group (25.8%) were admitted with cardiac illness. According to the mSGA, 87.1% were well-nourished, 12.8% moderately malnourished, and 0.1% severely malnourished. ONS were prescribed for 10% of the population. ONS prescription was highest among underweight patients (28.4%, vs. 13% with normal BMI, 9.1% overweight, and 7.7% obese; p = 0.000), among the severely malnourished by mSGA (53.2%, vs. 41% moderately malnourished and 5.5% well-nourished; p = 0.000), and in pulmonology (23.3%), followed by gastroenterology & hepatology (19.2%) (p = 0.000). The mean hospital LOS was 4.29 ± 4.03 days, with an overall mortality rate of 1.2%. Severe malnutrition, as rated by mSGA, significantly impacted mortality (0.8% vs. 5.1%, p = 0.000). Mortality risk increased with polymorbidity (0.9% vs. 1.5%) and respiratory illness (2.6%, p = 0.000). Poor nutritional status, as assessed by mSGA (34.7%, 57.4%, 70.9%) and BMI (43.7% vs. 38%), was associated with longer hospital stays (LOS ≥ 4 days, p = 0.000). The implementation of the NSC led to significant improvements: average LOS decreased (4.4 vs. 4.1 days, p = 0.000), and mortality risk was reduced from 1.6% to 0.7% (p = 0.000). No significant changes were observed in baseline nutritional status, indicating effective clinical nutrition practices in assessing patient nutrition. 
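As background for logistic regression results such as those reported in this analysis, an odds ratio is obtained by exponentiating the fitted model coefficient; a minimal illustrative sketch (the coefficient below is a hypothetical value chosen for illustration, not a study estimate):

```python
import math

# A logistic model estimates log-odds: log(p/(1-p)) = b0 + b1*x1 + ...
# The odds ratio associated with a predictor is exp(its coefficient).
def odds_ratio(coefficient: float) -> float:
    return math.exp(coefficient)

# Hypothetical fitted coefficient for a binary "malnourished" predictor:
beta = 0.693
print(round(odds_ratio(beta), 2))  # 2.0: malnourished patients have ~twice the odds
```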
ONS prescriptions increased from 5.2% to 9.7% between 2022 and 2024 (p = 0.000), contributing to the reduction in mortality rates to below 1% after 2022, compared to over 1% before the NSC (p = 0.000). A significant negative correlation was found between LOS and ONS usage (p = 0.000). Step-wise binary logistic regression indicated that malnutrition assessed by mSGA predicts mortality with an odds ratio of 2.49 and LOS with an odds ratio of 1.7, followed by ONS prescription and polymorbidity (p = 0.000).</p><p><b>Conclusion:</b> A well-functioning NSC is pivotal in driving successful nutritional interventions and achieving organizational goals. Early identification of malnutrition through mSGA, followed by timely and appropriate nutritional interventions, is essential to closing care gaps and improving clinical outcomes. Strong leadership and governance are critical in driving these efforts, ensuring that patients receive optimal nutritional support to enhance recovery and reduce mortality.</p><p><b>Table 1.</b> Patient Characteristics: Details of Baseline Anthropometric & Nutritional Status.</p><p></p><p>Baseline details of Anthropometric Measurements and Nutrition Status.</p><p><b>Table 2.</b> Logistic Regression to Predict Hospital LOS and Mortality.</p><p></p><p>Step-wise binary logistic regression indicated that malnutrition assessed by mSGA predicts mortality with an odds ratio of 2.49 and LOS with an odds ratio of 1.7, followed by ONS prescription and polymorbidity (p = 0.000).</p><p></p><p>mSGA-rated malnourished patients stayed longer in the hospital compared to the well-nourished category (p = 0.000).</p><p><b>Figure 1.</b> Nutritional Status (mSGA) vs Hospital LOS (> 4 days).</p><p>Hannah Welch, MS, RD<sup>1</sup>; Wendy Raissle, RD, CNSC<sup>2</sup>; Maria Karimbakas, RD, CNSC<sup>3</sup></p><p><sup>1</sup>Optum Infusion Pharmacy, Phoenix, AZ; <sup>2</sup>Optum Infusion Pharmacy, Buckeye, AZ; <sup>3</sup>Optum Infusion Pharmacy, Milton, 
MA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Food insecurity occurs when people do not have enough food to eat and do not know where their next meal will come from. In the United States, approximately 49 million people relied on food assistance charities in 2022 (data from Feeding America). Patients receiving parenteral nutrition (PN), who may be capable of supplementing with oral intake, may experience food insecurity due to chronic health conditions limiting work capability and total family income. Patients may also experience a lack of affordable housing, increased utilities, and the burden of medical expenses. Signs of food insecurity may present as weight loss, malnourishment, low energy, difficulty concentrating, or other physical indicators such as edema, chronically cracked lips, dry skin, and itchy eyes. The purpose of this abstract is to highlight two unique patient case presentations where food insecurity prompted clinicians to intervene.</p><p><b>Methods:</b> Patient 1: A 50-year-old male with short bowel syndrome (SBS) on long-term PN called the registered dietitian (RD) regarding financial difficulties with feeding the family (see Table 1). The patient-clinician relationship allowed the patient to convey sensitive concerns to the RD regarding his inability to feed himself and his family, which had resulted in the patient relying on the PN for all nutrition. Because of the food insecurity present, the clinician made changes to PN/hydration to help improve the patient's clinical status. Patient 2: A 21-year-old male with SBS on long-term PN spoke with his in-home registered nurse (RN) regarding the family's difficulties affording food (see Table 2). The RN informed the clinical team of suspected food insecurity, and the insurance case manager (CM) was contacted regarding food affordability. The RD reached out to local community resources such as food banks, food boxes and community programs. 
A community program was able to assist the patient with meals until the patient's aunt started cooking meals for him. This patient did not directly share his food insecurity with the RD; however, the relationship with the in-home RN proved valuable in having these face-to-face conversations with the patient.</p><p><b>Results:</b> In these two patient examples, difficulty obtaining food affected the patients’ clinical status. The clinical team identified food insecurity and the need for further education for the interdisciplinary team. A food insecurity informational handout was created by the RD, with an in-service for nursing, to aid recognition of signs of possible food insecurity (Figure 1) and to identify potential patient resources available in the community. Figure 2 presents suggested questions to ask a patient if an issue is suspected.</p><p><b>Conclusion:</b> Given the prevalence of food insecurity, routine assessment for signs and symptoms is essential. Home nutrition support teams (including RDs, RNs, pharmacists and care technicians) are positioned to assist in this effort, as they have frequent phone and in-home contact with patients and together build a trusted relationship with patients and caregivers. Clinicians should be aware of potential social situations that can warrant changes to PN formulations. 
To approach this sensitive issue thoughtfully, PN infusion providers should consider enhancing patient assessments and promoting education across the interdisciplinary team to create awareness of accessible community resources.</p><p><b>Table 1.</b> Patient 1 Information.</p><p></p><p><b>Table 2.</b> Suspected Food Insecurity Timeline.</p><p></p><p></p><p><b>Figure 1.</b> Signs to Detect Food Insecurity.</p><p></p><p><b>Figure 2.</b> Questions to Ask.</p><p><b>Poster of Distinction</b></p><p>Christan Bury, MS, RD, LD, CNSC<sup>1</sup>; Amanda Hodge Bode, RDN, LD<sup>2</sup>; David Gardinier, RD, LD<sup>3</sup>; Roshni Sreedharan, MD, FASA, FCCM<sup>3</sup>; Maria Garcia Luis, MS, RD, LD<sup>4</sup></p><p><sup>1</sup>Cleveland Clinic, University Heights, OH; <sup>2</sup>Cleveland Clinic Foundation, Sullivan, OH; <sup>3</sup>Cleveland Clinic, Cleveland, OH; <sup>4</sup>Cleveland Clinic Cancer Center, Cleveland, OH</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> The Society of Critical Care Medicine, Critical Care Congress in Orlando, FL on February 25<sup>th</sup>.</p><p><b>Publication:</b> Critical Care Medicine. 2025;53(1):In press.</p><p><b>Financial Support:</b> Project support provided by Morrison Cleveland Clinic Nutrition Research collaborative.</p><p><b>Background:</b> Hospitalized and critically ill patients who have preexisting malnutrition can have worse outcomes and an increased length of stay (LOS). Currently, Registered Dietitians (RDs) assess malnutrition with a nutrition focused physical exam (NFPE). Recent recommendations encourage the use of body composition tools such as computed tomography (CT) along with the NFPE. Trained RDs can use CT scans to evaluate skeletal muscle mass at the third lumbar vertebrae (L3) and then calculate Skeletal Muscle Index (SMI) and mean Hounsfield Units (HU) to determine muscle size and quality, respectively. 
This approach has been validated in various clinical populations and may be particularly useful in the critically ill, where the NFPE is difficult. We aim to evaluate whether CT scans can serve as a supportive tool to capture missed malnutrition diagnoses in the surgical and critical care population.</p><p><b>Methods:</b> One hundred twenty patients admitted to the Cleveland Clinic from 2021-2023 with a malnutrition evaluation that included an NFPE within 2 days of an abdominal CT were evaluated. Of those, 59 patients had a major surgery or procedure completed during that admission and were included in the final analysis. The CT scans were read by a trained RD at L3 using Terarecon, and results were cross-referenced by an artificial intelligence (AI) software called Veronai. Age, sex, BMI, SMI & HU were analyzed, along with the malnutrition diagnosis.</p><p><b>Results:</b> Fifty-nine patients were analyzed. Of these, 61% were male, 51% were >65 years old, and 24% had a BMI > 30. Malnutrition was diagnosed in 47% of patients. A total of 24 patients had no muscle wasting based on the NFPE, while CT captured low muscle mass in 58% of that group. Twenty-two percent of patients (13/59) had a higher level of malnutrition severity when using CT. Additionally, poor muscle quality was detected in 71% of patients across all age groups. Notably, there was 95% agreement between the AI and the RD's assessment in detecting low muscle mass.</p><p><b>Conclusion:</b> RDs can effectively analyze CT scans and use SMI and HU with their NFPE. The NFPE alone is not always sensitive enough to detect low muscle in surgical and critically ill patients. 
The degree of malnutrition dictates nutrition interventions, including the timing and type of nutrition support, so it is imperative to accurately diagnose and tailor interventions to improve outcomes.</p><p><b>Table 1.</b> Change in Malnutrition Diagnosis Using CT.</p><p></p><p>The graph shows the change in Malnutrition Diagnosis when CT was applied in conjunction with the NFPE utilizing ASPEN Guidelines.</p><p><b>Table 2.</b> Muscle Assessment: CT vs NFPE.</p><p></p><p>This graph compares muscle evaluation using both CT and the NFPE.</p><p></p><p>CT scan at the 3rd lumbar vertebra showing normal muscle mass and normal muscle quality in a patient >65 years old.</p><p><b>Figure 1.</b> CT Scans Evaluating Muscle Size and Quality.</p><p></p><p>CT scan at the 3rd lumbar vertebra showing low muscle mass and low muscle quality in a patient with obesity.</p><p><b>Figure 2.</b> CT Scans Evaluating Muscle Size and Quality.</p><p>Elif Aysin, PhD, RDN, LD<sup>1</sup>; Rachel Platts, RDN, LD<sup>1</sup>; Lori Logan, RN<sup>1</sup></p><p><sup>1</sup>Henry Community Health, New Castle, IN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Disease-related malnutrition alters body composition and causes functional decline. In acute care hospitals, 20-50% of inpatients have malnutrition when they are admitted to the hospital. Patients with malnutrition experience higher medical costs, increased mortality, and longer hospital stays. Malnutrition is associated with an increased risk of readmission and complications. Monitoring, diagnosis, treatment, and documentation of malnutrition are important for treating patients.
It also contributes to proper Diagnosis Related Group (DRG) coding and accurate CMI (Case Mix Index), which can increase reimbursement.</p><p><b>Methods:</b> After dietitians completed the Nutrition Focused Physical Exam (NFPE) course and sufficient staffing was in place, the malnutrition project was initiated under the leadership of RDNs in our small rural community hospital. The interdisciplinary medical committee approved the project to improve malnutrition screening, diagnosis, treatment practices, and coding. It was decided to use the Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition (ASPEN) and Global Leadership Initiative on Malnutrition (GLIM) criteria to diagnose malnutrition. The Malnutrition Screening Tool (MST) was completed by nurses to determine the risk of malnutrition. The Nutrition and Dietetics department created a new custom report that provides NPO-Clear-Full liquids patients' reports using the nutrition database. RDNs check NPO-Clear-Full liquids patients' reports, BMI reports, and length of stay (LOS) patient lists on weekdays and weekends. RDNs also performed the NFPE to evaluate nutritional status. If malnutrition is identified, RDNs communicate with providers through the hospital messenger system. Providers add the malnutrition diagnosis to their documentation and plan of care. RDNs created a dataset and shared it with Coders and Clinical Documentation Integrity Specialists/Care Coordination. RDNs tracked patients with malnutrition. In addition, RDNs spent more time with malnourished patients. They contributed to discharge planning and education.</p><p><b>Results:</b> The prevalence of malnutrition diagnosis and the amount of reimbursement for 2023 were compared to the six months after implementing the malnutrition project. Malnutrition diagnosis increased from 2.6% to 10.8%. Unspecified protein-calorie malnutrition diagnoses decreased from 39% to 1.5%.
RDN-diagnosed malnutrition was documented in provider notes for 82% of cases. The malnutrition diagnosis rate increased by 315% and the malnutrition reimbursement rate increased by 158%. Of those patients identified with malnutrition, 59% received a malnutrition DRG code. The remaining 41% of patients received higher-weighted major complication and comorbidity (MCC) codes. Our malnutrition reimbursement increased from ~$106,000 to ~$276,000.</p><p><b>Conclusion:</b> The implementation of evidence-based practice guidelines was key in identifying and accurately diagnosing malnutrition. The provision of sufficient staff with the necessary training and multidisciplinary teamwork has improved malnutrition diagnosis documentation in our hospital, increasing malnutrition reimbursement.</p><p><b>Table 1.</b> Before and After Malnutrition Implementation Results.</p><p></p><p></p><p><b>Figure 1.</b> Prevalence of Malnutrition Diagnosis.</p><p>Elisabeth Schnicke, RD, LD, CNSC<sup>1</sup>; Sarah Holland, MSc, RD, LD, CNSC<sup>2</sup></p><p><sup>1</sup>The Ohio State University Wexner Medical Center, Columbus, OH; <sup>2</sup>The Ohio State University Wexner Medical Center, Upper Arlington, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition is associated with increased length of stay, readmissions, mortality and poor outcomes. Early identification and treatment are essential. The Malnutrition Screening Tool (MST) is a quick, easy tool recommended for screening malnutrition in adult hospitalized patients. This is commonly used with or without additional indicators. We aimed to evaluate our current nutrition screening policy, which utilizes MST, age and body mass index (BMI), to improve malnutrition identification.</p><p><b>Methods:</b> Data for this quality improvement project were obtained over a 3-month period on 4 different adult services at a large academic medical center.
Services covered included general medicine, hepatology, heart failure and orthopedic surgery. Patients were assessed by a Registered Dietitian (RD) within 72 hrs of admission if they met the following high-risk criteria: MST score ≥2 completed by nursing on admission, age ≥65 yrs, or BMI ≤ 18.5 kg/m<sup>2</sup>. If none of the criteria were met, patients were seen within 7 days of admission or sooner by consult request. Malnutrition was diagnosed using Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition indicators of malnutrition (AAIM) criteria. Data collected included malnutrition severity and etiology, age, gender, BMI and MST generated on admission.</p><p><b>Results:</b> A total of 239 patients were diagnosed with malnutrition. Table 1 shows detailed characteristics. Malnutrition was seen similarly across gender (51% male, 49% female) and age groups. Age range was 21-92 yrs with an average age of 61 yrs. BMI range was 9.8-50.2 kg/m<sup>2</sup> with an average BMI of 24.6 kg/m<sup>2</sup>. More patients were found to have moderate malnutrition at 61.5% and chronic malnutrition at 54%. When data was stratified by age ≥65 yrs, similar characteristics were seen for malnutrition severity and etiology. Notably, more patients (61.5%) had an MST of < 2 or an incomplete MST compared to patients < 65 yrs of age (56%). There were 181 patients (76%) who met high-risk screening criteria and were seen for an assessment within 72 hrs. Seventy patients (39%) were screened only due to age ≥65 yrs. Forty-five (25%) were screened due to MST alone. There were 54 (30%) who met 2 indicators for screening. Only a small number of patients met BMI criteria alone or 3 indicators (6 patients or 3% each).</p><p><b>Conclusion:</b> Utilizing MST alone would have missed over half of patients diagnosed with malnutrition and there was a higher miss rate with older adults using MST alone. Age alone as a screening criterion caught more patients than MST alone did.
Adding BMI to screening criteria added very little and we still missed 24% of patients with our criteria. A multi-faceted tool should be explored to best capture patients.</p><p><b>Table 1.</b> Malnutrition characteristics.</p><p></p><p>*BMI: excludes those with amputations, paraplegia; all patients n = 219, patients ≥65yo n = 106.</p><p>Robin Nuse Tome, MS, RD, CSP, LDN, CLC, FAND<sup>1</sup></p><p><sup>1</sup>Nemours Children's Hospital, DE, Landenberg, PA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition is a global problem impacting patients of all ages, genders, and races. Malnutrition in hospitalized patients is associated with poorer outcomes, including longer in-hospital length of stay, higher rates of death, greater need for home healthcare services, and a higher rate of 30-day readmission. This results in a higher economic burden for the healthcare industry, insurance, and the individual. Malnutrition documentation plays a vital role in capturing the status of the patient and starting the conversation about interventions to address the concern.</p><p><b>Methods:</b> A taskforce made up of physicians, registered dietitians (RDs), and clinical documentation specialists met to discuss strategies to increase documentation of the malnutrition diagnosis to facilitate better conversation about the concern and potential interventions. Options included inbox messages, a best practice alert, a SmartLink between the physician and RD note, and adding the diagnosis to the problem list.
Of these options, the team selected a SmartLink, developed within the Electronic Medical Record (EMR), that links text from the RD note about malnutrition to the physician note to capture the diagnosis of malnutrition, its severity, and the progression of the diagnosis over time.</p><p><b>Results:</b> Preliminary data shows that physician documentation of the malnutrition diagnosis as well as the severity and progression of the diagnosis increased by 20% in the pilot medical team. Anecdotally, physicians were more aware of the patient's nutrition status with documentation linked to their note, and collaboration between the medical team and the RD to treat malnutrition increased.</p><p><b>Conclusion:</b> We hypothesize that expanding the practice to the entire hospital will increase documentation of the malnutrition diagnosis in the physician note. This will help increase awareness of the nutrition status of the patient, draw attention and promote collaboration on interventions to treat it, and increase billable revenue to the hospital by capturing the documentation of the degree of malnutrition in the physician note.</p><p>David López-Daza, RD<sup>1</sup>; Cristina Posada-Alvarez, Centro Latinoamericano de Nutrición<sup>1</sup>; Alejandra Agudelo-Martínez, Universidad CES<sup>2</sup>; Ana Rivera-Jaramillo, Boydorr SAS<sup>3</sup>; Yeny Cuellar-Fernández, Centro Latinoamericano de Nutrición<sup>1</sup>; Ricardo Merchán-Chaverra, Centro Latinoamericano de Nutrición<sup>1</sup>; María-Camila Gómez-Univio, Centro Latinoamericano de Nutrición<sup>1</sup>; Patricia Savino-Lloreda, Centro Latinoamericano de Nutrición<sup>1</sup></p><p><sup>1</sup>Centro Latinoamericano de Nutrición (Latin American Nutrition Center), Chía, Cundinamarca; <sup>2</sup>Universidad CES (CES University), Medellín, Antioquia; <sup>3</sup>Boydorr SAS, Chía, Cundinamarca</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> The Malnutrition Screening Tool (MST) is a simple
and quick instrument designed to identify the risk of malnutrition in various healthcare settings, including home hospitalization. Its use has become widespread due to its ease of application and ability to preliminarily assess the nutritional status of patients. In the context of home care, where clinical resources may be limited, an efficient screening tool is crucial to ensure early interventions that prevent nutritional complications in vulnerable populations. The objective was to evaluate the diagnostic accuracy of the MST in detecting malnutrition among patients receiving home care.</p><p><b>Methods:</b> A diagnostic test study was conducted, collecting sociodemographic data, MST results, and malnutrition diagnoses based on the Global Leadership Initiative on Malnutrition (GLIM) criteria. A positive MST score was defined as a value of 2 or above. Categorical data were summarized using proportions, while quantitative variables were described using measures of central tendency. Sensitivity, specificity, the area under the receiver operating characteristic curve (AUC), positive likelihood ratio (LR+), and negative likelihood ratio (LR-) were estimated along with their respective 95% confidence intervals.</p><p><b>Results:</b> A total of 676 patients were included, with a median age of 82 years (interquartile range: 68-89 years), and 57.3% were female. According to the GLIM criteria, 59.8% of the patients met the criteria for malnutrition. The MST classified patients into low risk (62.4%), medium risk (30.8%), and high risk (6.8%). The sensitivity of the MST was 11.4% (95% CI: 8.5-14.9%), while its specificity was 100% (95% CI: 98.7-100%). The positive likelihood ratio (LR+) was 1.75, and the negative likelihood ratio (LR-) was 0.0.
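The screening indices reported here follow from a standard 2×2 table. A minimal sketch; the cell counts below are back-calculated from the reported prevalence and sensitivity for illustration only and are not the authors' actual table:

```python
def diagnostic_indices(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Standard screening-accuracy indices from a 2x2 confusion table."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return {
        "sensitivity": sensitivity,
        "specificity": specificity,
        # LR+ = sens / (1 - spec); undefined (infinite) when specificity is exactly 1.0
        "lr_plus": sensitivity / (1 - specificity) if specificity < 1 else float("inf"),
        # LR- = (1 - sens) / spec
        "lr_minus": (1 - sensitivity) / specificity,
    }

# Hypothetical counts: 676 patients, ~59.8% malnourished, sensitivity ~11.4%, no false positives
idx = diagnostic_indices(tp=46, fp=0, fn=358, tn=272)
print(f"sens={idx['sensitivity']:.3f} spec={idx['specificity']:.3f}")
```

Note that with a specificity of exactly 100%, LR+ is formally undefined rather than finite, which is why it is guarded above.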
The area under the curve was 0.71 (95% CI: 0.69-0.73), indicating moderate discriminative capacity.</p><p><b>Conclusion:</b> While the MST demonstrates extremely high specificity, its low sensitivity limits its effectiveness in accurately identifying malnourished patients in the context of home care. This suggests that, although a positive MST result reliably indicates malnutrition (no false positives were observed), the tool fails to detect a significant number of patients who are malnourished. As a result, while the MST may be useful as an initial screening tool, its use should be complemented with more comprehensive assessments to ensure more precise detection of malnutrition in this high-risk population.</p><p><b>Poster of Distinction</b></p><p>Colby Teeman, PhD, RDN, CNSC<sup>1</sup>; Kaylee Griffith, BS<sup>2</sup>; Karyn Catrine, MS, RDN, LD<sup>3</sup>; Lauren Murray, MS, RD, CNSC, LD<sup>3</sup>; Amanda Vande Griend, BS, MS<sup>2</sup></p><p><sup>1</sup>University of Dayton, Xenia, OH; <sup>2</sup>University of Dayton, Dayton, OH; <sup>3</sup>Premier Health, Dayton, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> The prevalence of malnutrition in critically ill populations has previously been shown to be between 38% and 78%. Previously published guidelines have stated that patients in the ICU should be screened for malnutrition within 24-48 hours and all patients in the ICU for >48 hours should be considered at high risk for malnutrition. Patients with a malnutrition diagnosis in the ICU have been shown to have poorer clinical outcomes, including longer length of stay, greater readmission rates, and increased mortality.
The purpose of the current study was to determine whether severity of malnutrition impacted the time to initiate enteral nutrition and the time to reach goal enteral nutrition rate in critically ill patients and to determine the possible impact of malnutrition severity on clinical outcomes.</p><p><b>Methods:</b> A descriptive, retrospective chart review was conducted in multiple ICUs at a large level I trauma hospital in the Midwest. All participants included in analysis had been assessed for malnutrition by a registered dietitian according to ASPEN clinical practice guidelines. Exclusion criteria included patients receiving EN prior to the RDN assessment, those who received EN for < 24 hours total, patients on mixed oral and enteral nutrition diets, and patients receiving any parenteral nutrition. Participants were grouped by malnutrition status: no malnutrition n = 27, moderate malnutrition n = 22, and severe malnutrition n = 32. All data were analyzed using SPSS version 29.</p><p><b>Results:</b> There was no difference in primary outcomes (time to EN initiation, time to EN goal rate) by malnutrition status (both p > 0.05). Multiple regression analysis found neither moderately malnourished nor severely malnourished patients were more likely to have enteral nutrition initiation delayed for >48 hours from admission (p > 0.05). Neither ICU LOS nor hospital LOS was different among malnutrition groups (p > 0.05). Furthermore, neither ICU nor hospital mortality was different among malnutrition groups (p > 0.05). Among patients who were moderately malnourished, 81.8% required vasopressors, compared to 75% of patients who were severely malnourished, and 44.4% of patients who did not have a malnutrition diagnosis (p = 0.010).
Extended time on a ventilator (>72 hours) was required by 90.9% of moderately malnourished patients, compared to 59.4% of severely malnourished patients and 51.9% of patients without a malnutrition diagnosis (p = 0.011).</p><p><b>Conclusion:</b> Although the severity of malnutrition did not impact LOS, readmission, or mortality, malnutrition status did significantly predict greater odds of a patient requiring vasopressors and spending an extended amount of time on a ventilator. Further studies with larger sample sizes are warranted to continue developing a better understanding of the relationship between malnutrition status and clinical outcomes.</p><p>Jamie Grandic, RDN-AP, CNSC<sup>1</sup>; Cindi Stefl, RN, BSN, CCDS<sup>2</sup></p><p><sup>1</sup>Inova Health System, Fairfax Station, VA; <sup>2</sup>Inova Health System, Fairfax, VA</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> Vizient Connections Summit 2024 (Sept 16-19, 2024).</p><p><b>Publication:</b> 2024 Vizient supplement to the American Journal of Medical Quality (AJMQ).</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Research reveals that up to 50% of hospitalized patients are malnourished, yet only 9% of these cases are diagnosed.<sup>(1)</sup> Inadequate diagnosis and intervention of malnutrition can lead to poorer patient outcomes and reduced revenue. Our systemwide malnutrition awareness campaign successfully enhanced dietitian engagement, provider education, and streamlined documentation processes. This initiative resulted in a two-fold increase in the capture of malnutrition codes, a notable rise in malnutrition variable capture, and an average increase of approximately 0.9 in diagnosis-related group relative weight. Consequently, there was a ~300% increase in revenue associated with accurate malnutrition diagnosis and documentation, alongside improvements in the observed-to-expected (O/E) ratio for mortality and length of stay.
Given the critical impact of malnutrition on mortality, length of stay, and costs, enhanced identification programs are essential. Within our health system, a malnutrition identification program has been implemented across five hospitals for several years. System leadership roles for clinical nutrition and clinical documentation integrity (CDI) were established to ensure consistency, implement best practices, and optimize program efficacy. In 2022, a comprehensive analysis identified opportunities for improvement: a low systemwide capture rate (2%), limited awareness of the program's benefits, and inconsistent documentation practices. The leadership team, with the support of our executive sponsors, addressed these issues, engaged with service line leaders, and continues to drive program enhancements.</p><p><b>Methods:</b> A 4-part malnutrition education campaign was implemented: (1) strengthened collaboration between Clinical Nutrition and CDI, ensuring daily systemwide communication of newly identified malnourished patients, with leadership teams, including coding and compliance, reviewing documentation protocols in light of denial risks and regulatory audits; (2) launched a systemwide dietitian training program with a core RD malnutrition optimization team, a 5-hour comprehensive training, and monthly chart audits aiming for >80% documentation compliance; (3) created a Provider Awareness Campaign featuring interactive presentations on the malnutrition program's benefits and provider documentation recommendations; and (4) developed an electronic health record (EHR) report and a malnutrition EHR tool to standardize documentation. EHR and financial reports were used to monitor program impact and capture rates.</p><p><b>Results:</b> The malnutrition campaign has notably improved outcomes through ongoing education for stakeholders. A malnutrition EHR tool was also created in November 2022. This tool is vital for enhancing documentation, significantly boosting provider and CDI efficiency.
Key results include: Dietitian documentation compliance increased from 85% (July 2022) to 95% (2024); RD-identified malnutrition cases increased from 2% (2021) to 16% (2024); Monthly average of final coded malnutrition diagnoses increased from 240 (2021) to 717 (2023); Average DRG relative weight climbed from 1.24 (2021) to 2.17 (2023); Financial impact increased from $5.5M (2021) to $17.7M (2024); and LOS O/E improved from 1.04 to 0.94 and mortality O/E improved from 0.77 to 0.62 (2021-2023).</p><p><b>Conclusion:</b> This systemwide initiative not only elevates capture rates and documentation but also enhances overall outcomes. With CDI and RD teams taking on a more collaborative leadership role, providers can concentrate on patient care while these teams operate at their peak. Looking ahead to 2025, the focus will shift towards leading indicators to refine malnutrition identification and assess the educational campaign's impact further.</p><p>Ryota Sakamoto, MD, PhD<sup>1</sup></p><p><sup>1</sup>Kyoto University, Kyoto</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Growing concern about the environmental impact of meat eating has led to consideration of a shift to a plant-based diet. One nutrient that tends to be particularly deficient in a plant-based diet is vitamin B12. Since vitamin B12 deficiency can cause anemia, limb paresthesia and muscle weakness, and psychiatric symptoms including depression, delirium, and cognitive impairment, it is important to find sustainable local sources of vitamin B12. In this study, we focused on gundruk and sinki, two fermented and preserved vegetables traditionally consumed mainly in Nepal, India, and Bhutan, and investigated their vitamin B12 content. Sinki is mainly made from radish roots, while gundruk is made from green leaves such as mustard leaves; both are preserved through fermentation and sun-drying processes.
Previous reports indicated that, in these regions, not only vegetarians and vegans but also a significant number of other people, especially the poor, may have consistently low meat intake. Governments and other organizations have been initiating feeding programs to supply foods fortified with vitamins A, B1, B2, B3, B6, B9, B12, iron, and zinc, especially to schools. At this time, however, it is not easy to get fortified foods to residents in the community. It is important to explore the possibility of getting vitamin B12 from locally available products that can be taken by vegetarians, vegans, or the poor in the communities.</p><p><b>Methods:</b> Four samples of gundruk and five samples of sinki were obtained from markets, and their vitamin B12 content was determined using Lactobacillus delbrueckii subsp. lactis (Lactobacillus leichmannii) ATCC7830. The lower limit of quantification was set at 0.03 µg/100 g. The sample with the highest vitamin B12 concentration in the microbial quantification method was also measured for cyanocobalamin using LC-MS/MS (Shimadzu LC system equipped with Triple Quad 5500 plus AB-Sciex mass spectrometer). The Multiple Reaction Monitoring transition for cyanocobalamin was Q1: 678.3 m/z, Q3: 147.1 m/z.</p><p><b>Results:</b> For gundruk, vitamin B12 was detected in all four samples, with values, from highest to lowest, of 5.0 µg/100 g, 0.13 µg/100 g, 0.12 µg/100 g, and 0.04 µg/100 g. For sinki, it was detected in four of the five samples, with values, from highest to lowest, of 1.4 µg/100 g, 0.41 µg/100 g, 0.34 µg/100 g, and 0.16 µg/100 g.
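To give a rough sense of scale for these measurements, the daily portion needed to meet an adult RDA of 2.4 µg can be estimated from the measured content; a simple sketch of the arithmetic, illustrative only and ignoring bioavailability and the large sample-to-sample variability:

```python
def grams_to_meet_rda(b12_ug_per_100g: float, rda_ug_per_day: float = 2.4) -> float:
    """Grams of food per day needed to supply the vitamin B12 RDA at the measured content."""
    return 100.0 * rda_ug_per_day / b12_ug_per_100g

# Highest-content samples reported above
print(round(grams_to_meet_rda(5.0), 1))  # gundruk: 48.0 g/day
print(round(grams_to_meet_rda(1.4), 1))  # sinki: 171.4 g/day
```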
The cyanocobalamin concentration by LC-MS/MS in one sample was estimated to be 1.18 µg/100 g.</p><p><b>Conclusion:</b> According to “Vitamin and mineral requirements in human nutrition (2nd edition) (2004)” by the World Health Organization and the Food and Agriculture Organization of the United Nations, the recommended intake of vitamin B12 is 2.4 µg/day for adults, 2.6 µg/day for pregnant women and 2.8 µg/day for lactating women. The results of this study suggest that gundruk and sinki have potential as a source of vitamin B12, although there is a great deal of variability among samples. In order to use gundruk and sinki as a source of vitamin B12, it may be necessary to find a way to stabilize the vitamin B12 content while focusing on the relationship between vitamin B12 and the different ways of making gundruk and sinki.</p><p>Teresa Capello, MS, RD, LD<sup>1</sup>; Amanda Truex, MS, RRT, RCP, AE-C<sup>1</sup>; Jennifer Curtiss, MS, RD, LD, CLC<sup>1</sup>; Ada Lin, MD<sup>1</sup></p><p><sup>1</sup>Nationwide Children's Hospital, Columbus, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> The metabolic demands of critically ill children are defined by an increase in resting energy expenditure (1,2). Energy needs in the PICU are ever-changing and accurate evaluations are challenging to obtain (3). Predictive equations have been found to be inaccurate, leading to over- or underfeeding, which can cause negative outcomes such as muscle loss and poor healing (underfeeding) or weight gain as adipose tissue (overfeeding) (1,4,5). Indirect calorimetry (IC) is considered the gold standard for assessing metabolic demand, especially in critically ill pediatric patients (1,2,4). The use of IC may be limited due to staffing, equipment availability and cost as well as other patient-related issues and/or cart specifications (6). In our facility, we identified limited use of IC by dietitians.
Most tests were ordered by PICU dietitians and rarely ordered outside the critical care division, even though testing would benefit patients in other divisions of the hospital such as the CTICU, rehab, NICU and stepdown areas. Informal polling of non-PICU dietitians revealed that they had significant uncertainty interpreting data and providing recommendations based on test results. Reasons for uncertainty mostly centered on a lack of familiarity with this technology. The purpose of this study was to develop guidelines and a worksheet for consistently evaluating IC results with the goal of encouraging increased use of indirect calorimetry at our pediatric facility.</p><p><b>Methods:</b> A committee of registered dietitians (RDs) and respiratory therapists (RTs) met in January 2023 and agreed on step-by-step guidelines, which were trialed, reviewed, and updated monthly. Finalized guidelines were transitioned to a worksheet to improve consistency of use and aid in interpretation of IC results. A shared file was established for articles about IC as well as access to the guidelines and worksheet (Figures 1 and 2). For this study, IC data from January 1, 2022 to July 31, 2024 was reviewed. This data included number of tests completed and where the orders originated.</p><p><b>Results:</b> Since the guidelines were implemented, the share of IC tests from non-PICU areas increased from 16% in 2022 to 30% in 2023 and appears on track to be the same in 2024 (Figure 3). RDs report an improved comfort level with evaluating test results as well as making recommendations for test ordering.</p><p><b>Conclusion:</b> The standardized guidelines and worksheet increased RDs' comfort with and interpretation of test results. The PICU RDs have become more proficient and comfortable explaining IC during PICU rounds.
It is our hope that, with the development of the guidelines/worksheet, more non-PICU RDs will utilize IC testing outside the critical care areas, where longer lengths of stay may occur. IC allows for more individualized nutrition prescriptions. An additional benefit was the mutual exchange of information between disciplines. The RTs provided education on the use of the machine to the RDs. This enhanced RDs' understanding of IC test results from the RT perspective. In return, the RDs educated the RTs on why reporting certain aspects of the patient's testing environment alongside the results helps the RD interpret the information correctly. The committee continues to meet and discuss patients’ tests to see how testing can be optimized as well as how results may be used to guide nutrition care.</p><p></p><p><b>Figure 1.</b> Screen Capture of Metabolic Cart Shared File.</p><p></p><p><b>Figure 2.</b> IC Worksheet.</p><p></p><p><b>Figure 3.</b> Carts completed per year by unit: 2022 is pre-intervention; 2023 and 2024 are post-intervention.
Key: H2B = PICU; other areas are non-PICU (H4A = cardiothoracic stepdown, H4B = cardiothoracic ICU, H5B = burn, H8A = pulmonary, H8B = stable trach/vent unit, H10B = Neurosurgery/Neurology, H11B = Nephrology/GI, H12A = Hematology/Oncology, C4A = NICU, C5B = infectious disease).</p><p>Alfredo Lozornio-Jiménez-de-la-Rosa, MD, MSCN<sup>1</sup>; Minu Rodríguez-Gil, MSCN<sup>2</sup>; Luz Romero-Manriqe, MSCN<sup>2</sup>; Cynthia García-Vargas, MD, MSCN<sup>2</sup>; Rosa Castillo-Valenzuela, PhD<sup>2</sup>; Yolanda Méndez-Romero, MD, MSC<sup>1</sup></p><p><sup>1</sup>Colegio Mexicano de Nutrición Clinica y Terapia Nutricional (Mexican College of Clinical Nutrition and Nutritional Therapy), León, Guanajuato; <sup>2</sup>Colegio Mexicano de Nutrición Clínica y Terapia Nutricional (Mexican College of Clinical Nutrition and Nutritional Therapy), León, Guanajuato</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Sarcopenia is a systemic, progressive musculoskeletal disorder associated with an increased risk of adverse events and is highly prevalent in older adults. This condition leads to a decline in functionality and quality of life and has an economic impact. Sarcopenia is becoming increasingly common due to age-related changes, metabolic changes, obesity, sedentary lifestyle, chronic degenerative diseases, malnutrition, and pro-inflammatory states. The objective of this study was to investigate the relationship between strength and muscle mass, measured by calf circumference, both corrected for BMI, in young and older Mexican adults.</p><p><b>Methods:</b> This is a prospective, observational, cross-sectional, population-based clinical study conducted among Mexican men and women aged 30 to 90 years old, obtained through convenience sampling. The study was approved by the Ethics Committee of the Aranda de la Parra Hospital in León, Guanajuato, Mexico, and adheres to the Declaration of Helsinki.
Informed consent was obtained from all participants after explaining the nature of the study. Inclusion criteria were: Mexican men and women aged 30 to 90 years old who were functionally independent (Katz index category "a"). Participants with amputations, movement disorders, or immobilization devices on their extremities were excluded. The research team had previously been standardized in anthropometric measurement technique. Demographic data and measurements of weight, height, BMI, calf circumference, and grip strength using a dynamometer were collected. Data are presented as mean and standard deviation. Spearman's correlation analysis was used to assess the relationship between BMI and calf circumference adjusted for BMI, with grip strength, considering a significance level of p < 0.05.</p><p><b>Results:</b> The results of 1032 subjects were presented, 394 men and 638 women from central Mexico, located in workplaces, recreation centers, and health facilities, aged between 30 and 90 years old. Table 1 shows the distribution of the population in each age category, categorized by sex. Combined obesity and overweight were found in 75.1% of the sample population, with a frequency of 69.2% in men and 78.7% in women; 20% had a normal weight, with 25.6% in men and 16.6% in women, and 4.8% had low BMI, with 5.1% of men and 4.7% of women (Graph 1). The depletion of calf circumference corrected for BMI and age in the female population begins at 50 years old with exacerbation at 65 years old and older, while in men a greater depletion can be observed from 70 years old onwards (Graph 2). When analyzing strength corrected for BMI and age, grip strength declines from 55 years old and decreases further with age in both genders (Chi-square = 83.5, p < 0.001) (Graph 3). By Spearman correlation, a strong inverse relationship was found in both genders between age and grip strength; that is, as age increases, grip strength decreases (r = -0.530, p < 0.001).
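Spearman's coefficient used here is Pearson's correlation computed on ranks. A self-contained sketch on toy, hypothetical values (stdlib only, no tie handling; the study itself used standard statistical software):

```python
def spearman_rho(x, y):
    """Spearman's rank correlation: Pearson's r computed on the ranks of x and y."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = float(rank)
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Toy data: grip strength declining monotonically with age (hypothetical values)
age = [35, 45, 55, 65, 75, 85]
grip = [38, 36, 33, 30, 26, 22]
print(round(spearman_rho(age, grip), 2))  # -1.0 for a perfectly monotone decline
```

Because only ranks are used, the coefficient captures any monotone association, not just linear trends, which suits the age-related declines reported here.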
A moderate and negative correlation was found between age and calf circumference: as age increases, calf circumference decreases independently of BMI (r = -0.365, p < 0.001). Calf circumference and grip strength were positively and moderately correlated: as calf circumference decreases, grip strength decreases, independently of BMI (r = 0.447, p < 0.0001).</p><p><b>Conclusion:</b> These results show that the study population exhibited a decrease in grip strength, not related to BMI, from early ages, which may increase the risk of early-onset sarcopenia. These findings encourage early assessment of both grip strength and muscle mass, using simple and accessible measurements such as grip strength and calf circumference, adjusted for BMI. These measurements can be performed in the office during the initial patient encounter or in large populations, as in this study.</p><p><b>Table 1.</b> Distribution of the Population According to Age and Gender.</p><p></p><p>Alison Hannon, Medical Student<sup>1</sup>; Anne McCallister, DNP, CPNP<sup>2</sup>; Kanika Puri, MD<sup>3</sup>; Anthony Perkins, MS<sup>1</sup>; Charles Vanderpool, MD<sup>1</sup></p><p><sup>1</sup>Indiana University School of Medicine, Indianapolis, IN; <sup>2</sup>Indiana University Health, Indianapolis, IN; <sup>3</sup>Riley Hospital for Children at Indiana University Health, Indianapolis, IN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> The Academy of Nutrition and Dietetics and American Society for Parenteral and Enteral Nutrition have published criteria to diagnose a patient with mild, moderate, or severe malnutrition that use either single or multiple data points. Malnutrition is associated with worse clinical outcomes, and previous data from our institution showed that hospitalized children with severe malnutrition are at higher risk of mortality compared to mild and moderate malnutrition.
This project aims to determine if differences in clinical outcomes exist in patients with severe malnutrition based on the diagnostic criteria used or on anthropometric differences between patients.</p><p><b>Methods:</b> We included all patients discharged from Riley Hospital for Children within the 2023 calendar year diagnosed with severe malnutrition, excluding maternity discharges. Diagnostic criteria used to determine severe malnutrition were collected from registered dietitian (RD) documentation and the RD-assigned malnutrition statement within medical records for the admission. Data were collected on readmission rates, mortality, length of stay (LOS), LOS index, cost, operative procedures, pediatric intensive care unit (ICU) admissions, and anthropometrics measured on admission. We used mixed-effects regression or mixed-effects logistic regression to test whether the outcome of interest differed by severe malnutrition type. Each model contained a random effect for patient to account for correlation within admissions by the same patient and a fixed effect for severe malnutrition type. All analyses were performed using SAS v9.4.</p><p><b>Results:</b> Data were gathered on 409 patient admissions, of which 383 had diagnostic criteria clearly defined regarding severity of malnutrition. This represented 327 unique patients (due to readmissions). There was no difference in any measured clinical outcome based on the criteria used for severe malnutrition, including single or multiple point indicators or patients who met both single and multiple point indicators (Table 1). Anthropometric data were analyzed, including weight Z-score (n = 398) and BMI Z-score (n = 180). There was no difference seen in the majority of measured clinical outcomes in admissions with severe malnutrition when comparing based on weight or BMI Z-score categories of Z < -2, -2 < Z < -0.01, or Z > 0 (Table 2).
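The BMI and weight Z-score categories above are conventionally computed from reference LMS parameters (the CDC/WHO method). A minimal sketch follows; the L, M, S values used are hypothetical placeholders, not real reference data.

```python
# Sketch of the LMS method used to convert a measurement (here BMI) into
# an age/sex-specific Z-score, per the CDC/WHO convention.
# The L, M, S values below are hypothetical placeholders.
import math

def lms_z_score(x: float, L: float, M: float, S: float) -> float:
    """Z = ((x/M)**L - 1) / (L*S); in the limit L -> 0, Z = ln(x/M)/S."""
    if abs(L) < 1e-9:
        return math.log(x / M) / S
    return ((x / M) ** L - 1.0) / (L * S)

# Example with a hypothetical reference (L=-1.6, M=16.5, S=0.11):
# a BMI of 14.0 lands in the -2 < Z < 0 band used in Table 2.
z = lms_z_score(14.0, L=-1.6, M=16.5, S=0.11)
```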
Patients admitted with severe malnutrition and a BMI Z-score > 0 had an increase in median cost (p = 0.042) compared to those with BMI Z-score < -2 or between -2 and 0 (Table 2). There was a trend towards increased median cost (p = 0.067) and median LOS (p = 0.104) in patients with weight Z-score > 0.</p><p><b>Conclusion:</b> Hospitalized patients with severe malnutrition have similar clinical outcomes regardless of the diagnostic criteria used to determine the diagnosis of severe malnutrition. Only 180 admissions with severe malnutrition (44%) had sufficient anthropometric data to determine BMI. Based on these data, future programs at our institution aimed at intervention prior to and following admission will need to focus on all patients with severe malnutrition and will not be narrowed based on criteria (single, multiple data point) of severe malnutrition or anthropometrics. Quality improvement projects include improving height measurement and BMI determination upon admission, which will allow for future evaluation of the impact of anthropometrics on clinical outcomes.</p><p><b>Table 1.</b> Outcomes by Severe Malnutrition Diagnosis Category.</p><p></p><p>Outcomes by severe malnutrition compared based on the diagnostic criteria used to determine malnutrition diagnosis. Patients with severe malnutrition only are represented. Diagnostic criteria determined based on ASPEN/AND guidelines and defined during admission by registered dietitian (RD). OR = operating room; ICU = intensive care unit; LOS = length of stay. Data on 383 admissions presented, total of 327 patients due to readmissions: 284 patients had 1 admission; 33 patients had 2 admissions; 8 patients had 3 admissions; 1 patient had 4 admissions; 1 patient had 5 admissions.</p><p><b>Table 2.</b> Outcomes by BMI Z-score Category.</p><p></p><p>Outcomes of patients admitted with severe malnutrition, stratified based on BMI Z-score. Patients with severe malnutrition only are represented.
BMI Z-score determined based on weight and height measurement at time of admission, recorded by bedside admission nurse. OR = operating room; ICU = intensive care unit; LOS = length of stay. Due to incomplete height measurements, data on only 180 admissions were available, total of 158 patients: 142 patients had 1 admission; 12 patients had 2 admissions; 3 patients had 3 admissions; 1 patient had 5 admissions.</p><p>Claudia Maza, ND MSc<sup>1</sup>; Isabel Calvo, MD, MSc<sup>2</sup>; Andrea Gómez, ND<sup>2</sup>; Tania Abril, MSc<sup>3</sup>; Evelyn Frias-Toral, MD, MSc<sup>4</sup></p><p><sup>1</sup>Centro Médico Militar (Military Medical Center), Guatemala, Santa Rosa; <sup>2</sup>Hospital General de Tijuana (Tijuana General Hospital), Tijuana, Baja California; <sup>3</sup>Universidad Católica de Santiago de Guayaquil (Catholic University of Santiago de Guayaquil), Guayaquil, Guayas; <sup>4</sup>Universidad Espíritu Santo (Holy Spirit University), Dripping Springs, TX</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition is a common and significant issue among hospitalized patients, particularly in older adults or those with multiple comorbidities. The presence of malnutrition in such patients is associated with an increased risk of morbidity, mortality, prolonged hospital stays, and elevated healthcare costs. A crucial indicator of nutritional status is muscle strength, which can be effectively measured using handgrip strength (HGS). This study aimed to describe the relationship between nutritional status and muscle strength reduction in patients from two hospitals in Latin America.</p><p><b>Methods:</b> A retrospective observational study was conducted from February to May 2022. Data were collected from two hospitals: one in Guatemala and one in Mexico. A total of 169 patients aged 19-98 years were initially considered for the study, and 127 met the inclusion criteria.
The sample comprised adult patients of both sexes admitted to internal medicine, surgery, and geriatrics departments. Handgrip strength, demographic data, baseline medical diagnosis, weight, and height were recorded at admission and on the 14th day of hospitalization. Exclusion criteria included patients with arm or hand movement limitations, those under sedation or mechanical ventilation, and those hospitalized for less than 24 hours. HGS was measured using JAMAR® and Smedley dynamometers, following standard protocols. Statistical analysis was performed using measures of central tendency, and results were presented in tables and figures.</p><p><b>Results:</b> In the first hospital (Mexico), 62 patients participated, with a predominantly female sample. The average weight was 69.02 kg, height 1.62 meters, and BMI 26.14 kg/m² (classified as overweight). The most common admission diagnoses were infectious diseases, nervous system disorders, and digestive diseases (Table 1). A slight increase in HGS (0.49 kg) was observed between the first and second measurements (Figure 1). In the second hospital (Guatemala), 62 patients also met the inclusion criteria, with a predominantly male sample. The average weight was 65.92 kg, height 1.61 meters, and BMI 25.47 kg/m² (classified as overweight). Infectious diseases and musculoskeletal disorders were the most common diagnoses (Table 1). HGS decreased by 2 kg between the first and second measurements (Figure 2). Low HGS was associated with underweight patients and those with class II and III obesity. Patients with normal BMI in both centers exhibited significant reductions in muscle strength, indicating that weight alone is not a sufficient indicator of muscle strength preservation.</p><p><b>Conclusion:</b> This multicenter study highlights the significant relationship between nutritional status and decreased muscle strength in hospitalized patients.
While underweight patients showed reductions in HGS, those with class II and III obesity also experienced significant strength loss. These findings suggest that HGS is a valuable, non-invasive tool for assessing both nutritional status and muscle strength in hospitalized patients. Early identification of muscle strength deterioration can help healthcare providers implement timely nutritional interventions to improve patient outcomes.</p><p><b>Table 1.</b> Baseline Demographic and Clinical Characteristics of the Study Population.</p><p></p><p>NS: Nervous System, BMI: Body Mass Index</p><p></p><p><b>Figure 1.</b> Relationship Between Nutritional Status and Handgrip Strength (Center 1 - Mexico).</p><p></p><p><b>Figure 2.</b> Relationship Between Nutritional Status and Handgrip Strength (Center 2 - Guatemala).</p><p>Reem Farra, MDS, RD, CNSC, CCTD<sup>1</sup>; Cassie Greene, RD, CNSC, CDCES<sup>2</sup>; Michele Gilson, MDA, RD, CEDS<sup>2</sup>; Mary Englick, MS, RD, CSO, CDCES<sup>2</sup>; Kristine Thornham, MS, RD, CDE<sup>2</sup>; Debbie Andersen, MS, RD, CEDRD-S, CHC<sup>3</sup>; Stephanie Hancock, RD, CSP, CNSC<sup>4</sup></p><p><sup>1</sup>Kaiser Permanente, Lone Tree, CO; <sup>2</sup>Kaiser Permanente, Denver, CO; <sup>3</sup>Kaiser Permanente, Castle Rock, CO; <sup>4</sup>Kaiser Permanente, Littleton, CO</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Currently, there is no standardized screening required by the Centers for Medicare and Medicaid Services for malnutrition in outpatient settings. This raises concerns about early identification of malnutrition, the likelihood of nutrition interventions, and increased healthcare costs. While malnutrition screening in hospitalized patients is well-studied, its impact in outpatient care has not been thoroughly examined. Studies show that malnourished patients are 40% more likely to be readmitted within 30 days of discharge, adding over $10,000 to hospitalization costs. 
This quality improvement project aimed to evaluate the impact of implementing a standardized malnutrition screening tool for all patients as part of nutrition assessments by Registered Dietitians (RDs).</p><p><b>Methods:</b> The Malnutrition Screening Tool (MST) was chosen due to its validity, sensitivity, and specificity in identifying malnutrition risk in outpatient settings. This tool assesses risk by asking about recent unintentional weight loss and decreased intake due to appetite loss. Based on responses, a score of 0-5 indicates the severity of risk. Scores of 0-1 indicate no risk, while scores of 2-5 indicate a risk for malnutrition. This questionnaire was integrated into the nutrition assessment section of the electronic medical record to standardize screening for all patients. Those with scores of 2 or greater were included, with no exclusions for disease states. Patients were scheduled for follow-ups 2-6 weeks after the initial assessment, during which their MST score was recalculated.</p><p><b>Results:</b> A total of 414 patients were screened, with 175 completing follow-up visits. Of these, 131 showed improvements in their MST scores after nutrition interventions, 12 had score increases, and 32 maintained the same score. Two hundred thirty-nine patients were lost to follow-up for various reasons, including lack of response, limited RD scheduling access, changes in insurance, and mortality. Those with improved MST scores experienced an average cost avoidance of $15,000 each in subsequent hospitalizations. This cost avoidance was due to shorter hospitalizations, better treatment responses, and reduced need for medical interventions. The project raised awareness about the importance of early nutrition interventions among the multidisciplinary team.</p><p><b>Conclusion:</b> This project suggests that standardizing malnutrition screening in outpatient settings could lead to cost avoidance for patients and health systems, along with improved overall care. 
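As a concrete illustration of the scoring described in the Methods, a minimal sketch of MST scoring follows. The weight-loss sub-scores shown follow the commonly published MST scheme and should be verified against the validated instrument; the function names are ours.

```python
# Minimal sketch of Malnutrition Screening Tool (MST) scoring as described
# above (total score 0-5; >= 2 flags malnutrition risk). Sub-scores follow
# the commonly published MST scheme; verify against the validated
# instrument before any clinical use.
def mst_weight_loss_score(lost_weight: bool, unsure: bool, kg_lost: float = 0.0) -> int:
    if unsure:
        return 2
    if not lost_weight:
        return 0
    if kg_lost <= 5:
        return 1
    if kg_lost <= 10:
        return 2
    if kg_lost <= 15:
        return 3
    return 4

def mst_score(lost_weight: bool, unsure: bool, kg_lost: float, poor_appetite: bool) -> int:
    # Appetite question contributes 0 or 1 point.
    return mst_weight_loss_score(lost_weight, unsure, kg_lost) + (1 if poor_appetite else 0)

def at_risk(score: int) -> bool:
    return score >= 2  # scores 2-5 indicate malnutrition risk

# Example: 7 kg unintentional loss (2 points) plus poor appetite (1 point).
s = mst_score(lost_weight=True, unsure=False, kg_lost=7, poor_appetite=True)
```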
Further studies are needed to identify the best tools for outpatient malnutrition screening, optimal follow-up timelines, and effective nutrition interventions for greatest cost avoidance.</p><p>Amy Sharn, MS, RDN, LD<sup>1</sup>; Raissa Sorgho, PhD, MScIH<sup>2</sup>; Suela Sulo, PhD, MSc<sup>3</sup>; Emilio Molina-Molina, PhD, MSc, MEd<sup>4</sup>; Clara Rojas Montenegro, RD<sup>5</sup>; Mary Jean Villa-Real Guno, MD, FPPS, FPSPGHAN, MBA<sup>6</sup>; Sue Abdel-Rahman, PharmD, MA<sup>7</sup></p><p><sup>1</sup>Abbott Nutrition, Columbus, OH; <sup>2</sup>Center for Wellness and Nutrition, Public Health Institute, Sacramento, CA; <sup>3</sup>Global Medical Affairs and Research, Abbott Nutrition, Chicago, IL; <sup>4</sup>Research & Development, Abbott Nutrition, Granada, Andalucia; <sup>5</sup>Universidad del Rosario, Escuela de Medicina, Bogota, Cundinamarca; <sup>6</sup>Ateneo de Manila University, School of Medicine and Public Health, Metro Manila, National Capital Region; <sup>7</sup>Health Data Synthesis Institute, Chicago, IL</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> American Society for Nutrition, June 29-July 2, Chicago, IL, USA; American Academy of Pediatrics, September 27-October 1, Orlando, FL, USA.</p><p><b>Publication:</b> Sharn AR, Sorgho R, Sulo S, Molina-Molina E, Rojas Montenegro C, Villa-Real Guno MJ, Abdel-Rahman S. Using mid-upper arm circumference z-score measurement to support youth malnutrition screening as part of a global sports and wellness program and improve access to nutrition care. Front Nutr. 2024 Aug 12;11:1423978. doi: 10.3389/fnut.2024.1423978. 
PMID: 39188981; PMCID: PMC11345244.</p><p><b>Financial Support:</b> This study was financially supported by the Abbott Center for Malnutrition Solutions, Chicago, IL, USA.</p><p>Veeradej Pisprasert, MD, PhD<sup>1</sup>; Kittipadh Boonyavarakul, MD<sup>2</sup>; Sornwichate Rattanachaiwong, MD<sup>3</sup>; Thunchanok Kuichanuan, MD<sup>3</sup>; Pranithi Hongsprabhas, MD<sup>3</sup>; Chingching Foocharoen, MD<sup>3</sup></p><p><sup>1</sup>Faculty of Medicine, Khon Kaen University, Muang Khon Kaen, Khon Kaen; <sup>2</sup>Chulalongkorn University, Bangkok, Krung Thep; <sup>3</sup>Department of Medicine, Khon Kaen University, Muang Khon Kaen, Khon Kaen</p><p><b>Financial Support:</b> Grant supported by Khon Kaen University.</p><p><b>Background:</b> Systemic sclerosis (SSc) is an autoimmune disease in which malnutrition is a common complication, caused by the chronic inflammation of its natural history and/or gastrointestinal tract involvement. Current nutritional assessment tools, e.g. GLIM criteria, may include data regarding muscle mass measurement for nutritional diagnosis. Anthropometric measurement is a basic method of determining muscle mass; however, data in this condition are limited. This study aimed to determine the utility of anthropometric measurement of muscle mass and muscle function for diagnosing malnutrition in SSc patients.</p><p><b>Methods:</b> A cross-sectional diagnostic study was conducted in adult SSc patients at Srinagarind Hospital, Thailand. All patients were assessed for malnutrition based on Subjective Global Assessment (SGA). Muscle mass was measured by mid-upper-arm muscle circumference (MUAC) and calf circumference (CC); in addition, muscle function was determined by handgrip strength (HGS).</p><p><b>Results:</b> A total of 208 SSc patients were included, of which 149 were females (71.6%). Mean age and body mass index were 59.3 ± 11.0 years and 21.1 ± 3.9 kg/m², respectively. Nearly half (95 cases; 45.7%) were malnourished based on SGA.
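An AUC of this kind can be computed in principle from raw measurements and SGA labels via the Mann-Whitney identity AUC = U/(n1·n2); a sketch on synthetic data follows (group means and spreads are invented for illustration).

```python
# Sketch of computing ROC AUC for an anthropometric measure (e.g. MUAC)
# against binary SGA malnutrition labels, using the Mann-Whitney identity
# AUC = U / (n1 * n2). Group means/SDs below are invented for illustration.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(1)
n = 300
muac_malnourished = rng.normal(23.0, 3.0, n)  # cm, hypothetical
muac_well = rng.normal(27.0, 3.0, n)          # cm, hypothetical

# Lower MUAC should indicate malnutrition, so the AUC of interest is
# P(malnourished MUAC < well-nourished MUAC).
u, _ = mannwhitneyu(muac_well, muac_malnourished, alternative="two-sided")
auc = u / (n * n)
```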
Mean values of MUAC, CC, and HGS were 25.9 ± 3.83 cm, 31.5 ± 3.81 cm, and 19.0 ± 6.99 kg, respectively. The area under the curve (AUC) of the receiver operating characteristic (ROC) curve of MUAC for diagnosing malnutrition was 0.796; of CC, 0.759; and of HGS, 0.720. Proposed cut-off values are shown in Table 1.</p><p><b>Conclusion:</b> Muscle mass and muscle function were associated with malnutrition. Assessment of muscle mass and/or function by anthropometric measurement may be one part of nutritional assessment in patients with systemic sclerosis.</p><p><b>Table 1.</b> Proposed Cut-Off Values of MUAC, CC, and HGS in Patients With Systemic Sclerosis.</p><p></p><p>CC: calf circumference; HGS: handgrip strength; MUAC: mid-upper-arm circumference.</p><p></p><p><b>Figure 1.</b> ROC Curve of MUAC, CC, and HGS in Diagnosing Malnutrition by Subjective Global Assessment (SGA).</p><p>Trevor Sytsma, BS<sup>1</sup>; Megan Beyer, MS, RD, LDN<sup>2</sup>; Hilary Winthrop, MS, RD, LDN, CNSC<sup>3</sup>; William Rice, BS<sup>4</sup>; Jeroen Molinger, PhDc<sup>5</sup>; Suresh Agarwal, MD<sup>3</sup>; Cory Vatsaas, MD<sup>3</sup>; Paul Wischmeyer, MD, EDIC, FCCM, FASPEN<sup>6</sup>; Krista Haines, DO, MA<sup>3</sup></p><p><sup>1</sup>Duke University, Durham, NC; <sup>2</sup>Duke University School of Medicine- Department of Anesthesiology, Durham, NC; <sup>3</sup>Duke University School of Medicine, Durham, NC; <sup>4</sup>Eastern Virginia Medical School, Norfolk, VA; <sup>5</sup>Duke University Medical Center - Department of Anesthesiology - Duke Heart Center, Raleigh, NC; <sup>6</sup>Duke University Medical School, Durham, NC</p><p><b>Financial Support:</b> Baxter.</p><p><b>Background:</b> Achieving acceptable nutritional goals is a crucial but often overlooked component of postoperative care, influencing patient-important outcomes such as infectious complications and ICU length of stay.
Predictive resting energy expenditure (pREE) equations correlate poorly with actual measured REE (mREE), leading to potentially harmful over- or under-feeding. International ICU guidelines now recommend the use of indirect calorimetry (IC) to determine mREE and personalize patient nutrition. Surgical stress increases protein catabolism and insulin resistance, but the effect of age on postoperative mREE trends, which commonly used pREE equations do not account for, is not well studied. This study hypothesized that older adults undergoing major abdominal surgery experience slower metabolic recovery than younger patients, as measured by IC.</p><p><b>Methods:</b> This was an IRB-approved prospective trial of adult patients following major abdominal surgery with open abdomens secondary to blunt or penetrating trauma, sepsis, or vascular emergencies. Patients underwent serial IC assessments to guide postoperative nutrition delivery. Assessments were offered within the first 72 hours after surgery, then every 3 ± 2 days during the ICU stay, followed by every 7 ± 2 days in stepdown. Patient mREE was determined using the Q-NRG® Metabolic Monitor (COSMED) in ventilator mode for mechanically ventilated patients or the mask or canopy modes, depending on medical feasibility. IC data were selected from ≥ 3-minute intervals that met steady-state conditions, defined by a variance of oxygen consumption and carbon dioxide production of less than 10%. Measurements not meeting these criteria were excluded from the final analysis. Patients without mREE data at two or more time points in the first nine postoperative days were also excluded. Older adult patients were defined as ≥ 65 years and younger patients as ≤ 50 years. Trends in REE were calculated using the method of least squares and compared using t-tests assuming unequal variance (α = 0.05).
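The steady-state screen and the conversion from gas exchange to energy expenditure can be sketched as follows. The 10% threshold is taken from the Methods; the abbreviated Weir equation is a standard IC formula rather than anything study-specific, and the sample readings are invented.

```python
# Sketch of the steady-state filter described above: a window of VO2/VCO2
# readings qualifies if the coefficient of variation of both gases is under
# 10%. mREE is then computed with the abbreviated Weir equation (a standard
# indirect-calorimetry formula; the readings below are hypothetical).
import statistics

def is_steady_state(vo2: list[float], vco2: list[float], cv_limit: float = 0.10) -> bool:
    def cv(xs: list[float]) -> float:
        return statistics.stdev(xs) / statistics.mean(xs)
    return cv(vo2) < cv_limit and cv(vco2) < cv_limit

def weir_ree_kcal_day(vo2_ml_min: float, vco2_ml_min: float) -> float:
    # Abbreviated Weir: REE (kcal/day) = (3.941*VO2 + 1.106*VCO2) * 1.44
    return (3.941 * vo2_ml_min + 1.106 * vco2_ml_min) * 1.44

vo2 = [250, 255, 248, 252]   # mL/min, hypothetical minute-averaged readings
vco2 = [200, 205, 198, 202]
steady = is_steady_state(vo2, vco2)
ree = weir_ree_kcal_day(statistics.mean(vo2), statistics.mean(vco2))
```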
Patients' pREE values were calculated from admission anthropometric data using ASPEN-SCCM equations and compared against IC measurements.</p><p><b>Results:</b> Eighteen older and 15 younger adults met pre-specified eligibility criteria and were included in the final analysis. Average rates and standard error of REE recovery in older and younger adult patients were 28.9 ± 17.1 kcal/day and 75.5 ± 18.9 kcal/day, respectively, which approached – but did not reach – statistical significance (p = 0.07). The lower and upper bands of pREE for the older cohort averaged 1728 ± 332 kcal and 2093 ± 390 kcal, respectively, markedly exceeding the mREE obtained by IC for most patients and failing to capture the observed variability identified using mREE. In younger adults, pREE values were closer to IC measurements, with lower and upper averages of 1705 ± 278 kcal and 2084 ± 323 kcal, respectively.</p><p><b>Conclusion:</b> Our data signal a difference in rates of metabolic recovery after major abdominal surgery between younger and older adult patients but did not reach statistical significance, possibly due to insufficient sample size. Predictive energy equations do not adequately capture changes in REE and may overestimate postoperative energy requirements in older adult patients, failing to appreciate the increased variability in mREE that our study found in older patients. These findings reinforce the importance of using IC to guide nutrition delivery during the early recovery period post-operatively. 
Larger trials employing IC and quantifying protein metabolism contributions are needed to explore these questions further.</p><p><b>Table 1.</b> Patient Demographics.</p><p></p><p></p><p><b>Figure 1.</b> Postoperative Changes in mREE in Older and Younger Adult Patients Following Major Abdominal Surgery Compared to pREE (ASPEN).</p><p>Amber Foster, BScFN, BSc<sup>1</sup>; Heather Resvick, PhD(c), MScFN, RD<sup>2</sup>; Janet Madill, PhD, RD, FDC<sup>3</sup>; Patrick Luke, MD, FRCSC<sup>2</sup>; Alp Sener, MD, PhD, FRCSC<sup>4</sup>; Max Levine, MD, MSc<sup>5</sup></p><p><sup>1</sup>Western University, Ilderton, ON; <sup>2</sup>LHSC, London, ON; <sup>3</sup>Brescia School of Food and Nutritional Sciences, Faculty of Health Sciences, Western University, London, ON; <sup>4</sup>London Health Sciences Centre, London, ON; <sup>5</sup>University of Alberta, Edmonton, AB</p><p><b>Financial Support:</b> Brescia University College MScFN stipend.</p><p><b>Background:</b> Currently, body mass index (BMI) is used as the sole criterion to determine whether patients with chronic kidney disease (CKD) are eligible for kidney transplantation. However, BMI is not a good measure of health in this patient population, as it does not distinguish between muscle mass, fat mass, and water weight. Individuals with end-stage kidney disease often experience shifts in fluid balance, resulting in fluid retention, swelling, and weight gain. Consequently, the BMI of these patients may be falsely elevated. Therefore, it is vitally important to consider more accurate and objective measures of body composition for this patient population. The aim of this study is to determine whether there is a difference in body composition between individuals with CKD who are categorized as having healthy body weight, overweight, or obesity.</p><p><b>Methods:</b> This was a cross-sectional study analyzing body composition of 114 adult individuals with CKD being assessed for kidney transplantation.
Participants were placed into one of three BMI groups: healthy weight (group 1, BMI < 24.9 kg/m<sup>2</sup>, n = 29), overweight (group 2, BMI ≥ 24.9-29.9 kg/m<sup>2</sup>, n = 39), or with obesity (group 3, BMI ≥ 30 kg/m<sup>2</sup>, n = 45). Fat mass, lean body mass (LBM), and phase angle (PhA) were measured using Bioelectrical Impedance Analysis (BIA). Standardized phase angle (SPhA), a measure of cellular health, was calculated as [(observed PhA - mean PhA)/standard deviation of PhA]. Handgrip strength (HGS) was measured using a Jamar dynamometer, and quadriceps muscle layer thickness (QMLT) was measured using ultrasonography. Normalized HGS (nHGS) was calculated as [HGS/body weight (kg)], and values were compared to age- and sex-specific standardized cutoff values. Fat free mass index (FFMI) was calculated as [LBM/(height (m))<sup>2</sup>]. Low FFMI, which may identify high risk of malnutrition, was determined using ESPEN cut-off values of < 17 kg/m<sup>2</sup> for males and < 15 kg/m<sup>2</sup> for females. Frailty status was determined using the validated Fried frailty phenotype assessment tool. Statistical analysis: continuous data were analyzed using one-way ANOVA followed by Tukey post hoc tests, while chi-square tests were used for analysis of categorical data (IBM SPSS version 29; significance p < 0.05).</p><p><b>Results:</b> Participants in group 1 were younger than those in either group 2 (p = 0.004) or group 3 (p < 0.001). There was no significant difference in the proportions of males and females across the three groups. The proportion of participants with FFMI below the cutoff was significantly higher in group 1 (13%) versus group 2 (0%) and group 3 (2.1%) (p = 0.02). A significant difference in nHGS was found between groups, with lower muscle strength occurring more frequently among participants in group 3 (75%) vs 48.7% in group 2 and 28.5% in group 1 (p < 0.001).
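The derived indices in the Methods reduce to one-line formulas; a minimal sketch follows (ESPEN FFMI cut-offs as quoted in the text; the example patient values are hypothetical).

```python
# Minimal sketch of the derived body-composition indices described above.
# ESPEN FFMI cut-offs (< 17 kg/m^2 males, < 15 kg/m^2 females) are as
# quoted in the text; the example patient values are hypothetical.
def ffmi(lbm_kg: float, height_m: float) -> float:
    return lbm_kg / height_m ** 2

def low_ffmi(lbm_kg: float, height_m: float, male: bool) -> bool:
    return ffmi(lbm_kg, height_m) < (17.0 if male else 15.0)

def standardized_phase_angle(pha: float, ref_mean: float, ref_sd: float) -> float:
    return (pha - ref_mean) / ref_sd

def normalized_hgs(hgs_kg: float, weight_kg: float) -> float:
    return hgs_kg / weight_kg

# Hypothetical male patient: 52 kg lean mass at 1.75 m gives FFMI ~16.98,
# just under the 17 kg/m^2 cut-off, so he would flag as at-risk.
flag = low_ffmi(52.0, 1.75, male=True)
```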
No significant differences were seen in QMLT, SPhA, HGS, or frailty status between the three BMI groups.</p><p><b>Conclusion:</b> It appears that there is no difference in body composition parameters such as QMLT, SPhA, or frailty status between the three BMI groups. However, patients with CKD categorized as having a healthy BMI were more likely to be at risk for malnutrition. Furthermore, those individuals categorized as having a healthy BMI appeared to have more muscle strength compared to the other two groups. Taken together, these results provide convincing evidence that BMI should not be the sole criterion for listing patients for kidney transplantation. Further research is needed to confirm these findings.</p><p>Kylie Waynick, BS<sup>1</sup>; Katherine Petersen, MS, RDN, CSO<sup>2</sup>; Julie Kurtz, MS, CDCES, RDN<sup>2</sup>; Maureen McCoy, MS, RDN<sup>3</sup>; Mary Chew, MS, RDN<sup>4</sup></p><p><sup>1</sup>Arizona State University and Veterans Healthcare Administration, Phoenix, AZ; <sup>2</sup>Veterans Healthcare Administration, Phoenix, AZ; <sup>3</sup>Arizona State University, Phoenix, AZ; <sup>4</sup>Phoenix VAHCS, Phoenix, AZ</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition does not have a standardized definition nor universal identification criteria. Registered dietitian nutritionists (RDNs) most often diagnose based on Academy and ASPEN Identification of Malnutrition (AAIM) criteria while physicians are required to use International Classification of Diseases Version 10 (ICD-10-CM). However, there are major differences in how malnutrition is identified between the two. The malnutrition ICD-10 codes (E43.0, E44.0, E44.1, E50.0-E64.9) have vague diagnostic criteria leading providers to use clinical expertise and prior nutrition education. 
For dietitians, AAIM's diagnostic criteria are clearly defined and validated to identify malnutrition based on reduced weight and intake, loss of muscle and fat mass, fluid accumulation, and decrease in physical functioning. Due to lack of standardization, the process of identifying and diagnosing malnutrition is inconsistent. The purpose of this study was to analyze the congruence of malnutrition diagnosis between physicians and dietitians using the two methods and to compare patient outcomes between the congruent and incongruent groups.</p><p><b>Methods:</b> Records of 668 inpatients assigned a malnutrition diagnostic code were electronically pulled from the Veteran Health Administration's Clinical Data Warehouse for the time periods of April through July in 2019, 2020, and 2021 for retrospective chart review. Length of stay, infection, pressure injury, falls, thirty-day readmissions, and documentation of communication between providers were collected from chart review. Data for cost to the hospital were pulled from Veterans Equitable Resource Allocation (VERA) and paired with matching social security numbers in the sample. Chi-square tests were used to compare differences between incongruency and congruency for infection, pressure injury, falls, and readmissions. Means for length of stay and cost to hospital between the two groups were analyzed using ANOVA through SPSS.</p><p><b>Results:</b> The diagnosis of malnutrition is incongruent between providers. The incongruent group had a higher percentage of adverse patient outcomes than those with a congruent diagnosis. Congruent diagnoses were found to be significantly associated with incidence of documented communication (p < 0.001).</p><p><b>Conclusion:</b> This study showcases a gap in malnutrition patient care.
Further research needs to be conducted to understand the barriers to congruent diagnosis and communication between providers.</p><p>Nana Matsumoto, RD, MS<sup>1</sup>; Koji Oba, Associate Professor<sup>2</sup>; Tomonori Narita, MD<sup>3</sup>; Reo Inoue, MD<sup>2</sup>; Satoshi Murakoshi, MD, PhD<sup>4</sup>; Yuki Taniguchi, MD<sup>2</sup>; Kenichi Kono, MD<sup>2</sup>; Midori Noguchi, BA<sup>5</sup>; Seiko Tsuihiji<sup>2</sup>; Kazuhiko Fukatsu, MD, PhD<sup>2</sup></p><p><sup>1</sup>The University of Tokyo, Bunkyo-City, Tokyo; <sup>2</sup>The University of Tokyo, Bunkyo-ku, Tokyo; <sup>3</sup>The University of Tokyo, Chuo-City, Tokyo; <sup>4</sup>Kanagawa University of Human Services, Yokosuka-city, Kanagawa; <sup>5</sup>The University of Tokyo Hospital, Bunkyo-ku, Tokyo</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Therapeutic diets are often prescribed for patients with various disorders, for example, diabetes, renal dysfunction, and hypertension. However, because they restrict the amounts of certain nutrients, therapeutic diets might reduce appetite. Hospital meals maintain patients’ nutritional status when fully consumed, regardless of diet type. It is possible that therapeutic diets are at least partly associated with malnutrition in hospital. Therefore, we conducted an exploratory retrospective cohort study to investigate whether there are differences in patients’ oral consumption between therapeutic and regular diets, taking into account other factors.</p><p><b>Methods:</b> The study protocol was approved by the Ethics Committee of the University of Tokyo under protocol No.2023396NI-(1). We retrospectively extracted information from the medical records of patients admitted to the department of orthopedic and spine surgery at the University of Tokyo Hospital between June and October 2022. Eligible patients were older than 20 years old and hospitalized more than 7 days.
These patients were provided oral diets as the main source of nutrition. Patients prescribed texture-modified, half, or liquid diets were excluded. The measurements included percentages of oral food intake at various points during the hospitalization (e.g., at admission, before and after surgery, and at discharge), sex, and age. Differences in patients' oral consumption rates between therapeutic and regular diets were analyzed through a linear mixed-effect model.</p><p><b>Results:</b> A total of 290 patients were analyzed, with 50 patients receiving a therapeutic diet and 240 patients receiving a regular diet at admission. The mean percentage of oral intake was 83.1% for the therapeutic diet and 87.2% for the regular diet, and was consistently 4-6% higher for regular diets than for therapeutic diets at each time point during hospitalization (Figure). In a linear mixed-effect model adjusted for sex and age, the mean percentage of oral intake of a regular diet was 4.0% higher (95% confidence interval [CI], -0.8% to 8.9%; p = 0.100) than that of a therapeutic diet, although the difference did not reach statistical significance. The mean percentage of oral intake in women was 15.6% lower than in men (95% CI, -19.5% to -11.8%). Likewise, older patients' intake rates were reduced compared with younger patients' (difference, -0.2% per year of age; 95% CI, -0.3% to -0.1%).</p><p><b>Conclusion:</b> This exploratory study failed to show that therapeutic diets reduce food intake in orthopedic and spine surgery patients as compared with regular diets. However, sex and age were important factors affecting food intake. We need to pay special attention to female and/or older patients to increase oral food intake.
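A linear mixed-effect analysis of this shape can be sketched with statsmodels on synthetic repeated-measures data; the effect sizes, variable names, and data-generating assumptions below are invented for illustration, not the study's actual model code.

```python
# Sketch of a linear mixed-effect model for repeated oral-intake
# measurements: fixed effects for diet type, sex, and age, plus a random
# intercept per patient. All data below are synthetic/hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n_pat, n_obs = 100, 4
pid = np.repeat(np.arange(n_pat), n_obs)
regular = np.repeat(rng.integers(0, 2, n_pat), n_obs)  # 1 = regular diet
age = np.repeat(rng.uniform(30, 90, n_pat), n_obs)
female = np.repeat(rng.integers(0, 2, n_pat), n_obs)
intercepts = np.repeat(rng.normal(0, 5, n_pat), n_obs)  # per-patient effect

intake = (90 + 4 * regular - 15 * female - 0.2 * (age - 60)
          + intercepts + rng.normal(0, 5, n_pat * n_obs))

df = pd.DataFrame(dict(pid=pid, regular=regular, age=age,
                       female=female, intake=intake))
fit = smf.mixedlm("intake ~ regular + female + age", df,
                  groups=df["pid"]).fit()
```

The fitted fixed-effect coefficients recover the built-in directions: a positive diet effect and a large negative effect for female sex, analogous in form to the 4.0% and -15.6% differences reported above.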
Future research will increase the number of patients examined, expand the cohort to other departments, and proceed to a prospective study to identify the factors that truly affect patients’ oral intake during hospitalization.</p><p></p><p><b>Figure 1.</b> The Percentage of Oral Intake During Hospitalization in Each Diet.</p><p>Lorena Muhaj, MS<sup>1</sup>; Michael Owen-Michaane, MD, MA, CNSC<sup>2</sup></p><p><sup>1</sup>Institute of Human Nutrition, Vagelos College of Physicians and Surgeons, Columbia University Irving Medical Center, New York, NY; <sup>2</sup>Columbia University Irving Medical Center, New York, NY</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Muscle mass is crucial for overall health and well-being. Consequently, accurate estimation of muscle mass is essential for diagnosing malnutrition and conditions such as sarcopenia and cachexia. The Kim equation uses biomarker data to estimate muscle mass, but whether this tool provides an accurate estimate in populations with high BMI and kidney disease remains uncertain. Therefore, the aim of this study is to assess whether the Kim equation is suitable and reliable for estimating muscle mass and predicting malnutrition, sarcopenia, and related outcomes in a cohort with diverse BMI and kidney disease.</p><p><b>Methods:</b> This is a cross-sectional study using data from the All of Us Research Program. Data on demographics, weight, height, creatinine, cystatin C, and diagnoses of malnutrition, hip fractures, and cachexia were obtained from electronic health records (EHR). The Kim equation was derived from creatinine, cystatin C, and weight (Table 1) and compared with established sarcopenia cutoffs for appendicular lean mass (ALM), including ALM/BMI and ALM/height<sup>2</sup>. Malnutrition was identified through specific ICD-10-CM codes recorded in the EHR, and participants were categorized based on malnutrition status (with or without malnutrition).
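The Kim equation referenced above (given in full in the Table 1 footnote: body weight × serum creatinine / (K × body weight × serum cystatin C + serum creatinine)) can be sketched as a small function. The constant K is sex-specific in the original publication and is left here as a caller-supplied parameter; the input values below are arbitrary illustrations, not study data:

```python
def kim_muscle_mass(weight_kg, creatinine, cystatin_c, k):
    """Estimated total body muscle mass (kg) via the Kim equation:
    weight * Cr / (K * weight * CysC + Cr). The constant `k` is
    sex-specific in the original publication and must be supplied."""
    return weight_kg * creatinine / (k * weight_kg * cystatin_c + creatinine)

# Arbitrary illustrative values (not study data): 160 / 6 ≈ 26.7 kg
print(round(kim_muscle_mass(80.0, 2.0, 1.0, 0.05), 1))
```

Note that creatinine appears in both numerator and denominator, which is why the estimate is sensitive to the kidney-disease-related creatinine and cystatin C shifts discussed in the abstract.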
Muscle mass and biomarker levels were compared between groups with or without severe/moderate malnutrition, and the relationships of BMI with creatinine and cystatin C levels were analyzed using linear regression. Wilcoxon rank-sum tests were used to assess associations between estimated muscle mass and malnutrition diagnosis.</p><p><b>Results:</b> Baseline characteristics were stratified by gender for comparison. The mean age of participants was 58.2 years (SD = 14.7). The mean BMI was 30.4 kg/m<sup>2</sup> (SD = 7.5). Mean serum creatinine and cystatin C levels were 2.01 mg/dL (SD = 1.82) and 0.18 mg/dL (SD = 0.11), respectively. The mean estimated muscle mass was 80.1 kg (SD = 21), with estimated muscle mass as a percentage of body weight being 92.8%. Mean ALM/BMI was 2.38 (SD = 0.32), while ALM/height<sup>2</sup> value was 25.41 kg/m<sup>2</sup> (SD = 5.6). No participant met the cutoffs for sarcopenia. All calculated variables are summarized in Table 1. In this cohort, < 2% were diagnosed with severe malnutrition and < 2% with moderate malnutrition (Table 2). Muscle mass was lower in participants with severe malnutrition compared to those without (W = 2035, p < 0.05) (Figure 1).</p><p><b>Conclusion:</b> This study suggests the Kim equation overestimates muscle mass in populations with high BMI or kidney disease, as no participants met sarcopenia cutoffs despite the expected prevalence in CKD. This overestimation is concerning given the known risk of low muscle mass in CKD. While lower muscle mass was significantly associated with severe malnutrition (p < 0.05), the Kim equation identified fewer malnutrition cases than expected based on clinical data. Though biomarkers like creatinine and cystatin C may help diagnose malnutrition, the Kim equation may not accurately estimate muscle mass or predict malnutrition and sarcopenia in diverse populations. 
Further research is needed to improve these estimates.</p><p><b>Table 1.</b> Muscle Mass Metrics Calculations and Diagnosis of Sarcopenia Based on FNIH (ALM/BMI) and EWGSOP (ALM/Height2) Cut-off Values.</p><p></p><p>Abbreviations: BMI-Body Mass Index; TBMM-Total Body Muscle Mass (also referred to as Muscle Mass) (calculated using the Kim Equation); ALM-Appendicular Lean Muscle Mass (using the McCarthy equation); ALM/Height2-Appendicular Lean Muscle Mass adjusted for height squared (using EWGSOP cutoffs for diagnosing sarcopenia); ALM/BMI-Appendicular Lean Muscle Mass adjusted for BMI (using FNIH cutoffs for diagnosing sarcopenia). Equation 1: Kim equation - Calculated body muscle mass = body weight * serum creatinine/((K * body weight * serum cystatin C) + serum creatinine)</p><p><b>Table 2.</b> Prevalence of Severe and Moderate Malnutrition.</p><p></p><p>(Counts less than 20 suppressed to prevent reidentification of participants).</p><p></p><p><b>Figure 1.</b> Muscle Mass in Groups With and Without Severe Malnutrition.</p><p><b>Poster of Distinction</b></p><p>Robert Weimer, BS<sup>1</sup>; Lindsay Plank, PhD<sup>2</sup>; Alisha Rovner, PhD<sup>1</sup>; Carrie Earthman, PhD, RD<sup>1</sup></p><p><sup>1</sup>University of Delaware, Newark, DE; <sup>2</sup>University of Auckland, Auckland</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Loss of skeletal muscle is common in patients with liver cirrhosis, and low muscle mass is internationally recognized as a key phenotypic criterion to diagnose malnutrition.<sup>1,2</sup> Muscle can be clinically assessed through various modalities, although reference data and consensus guidance are limited for the interpretation of muscle measures to define malnutrition in this patient population.
The aim of this study was to evaluate the sensitivity and specificity of published sarcopenia cutpoints applied to dual-energy X-ray absorptiometry (DXA) muscle measures to diagnose malnutrition by the GLIM criteria, using in vivo neutron activation analysis (IVNAA) measures of total body protein (TBP) as the reference.</p><p><b>Methods:</b> Adults with liver cirrhosis underwent IVNAA and whole body DXA at the Body Composition Laboratory of the University of Auckland. DXA-fat-free mass (FFM) and appendicular skeletal muscle mass (ASMM, with and without correction for wet bone mass<sup>3</sup>) were measured and indexed to height squared (FFMI, ASMI). The ratio of measured to predicted TBP based on healthy reference data matched for age, sex and height was calculated as a protein index; values more than 2 standard deviations below the mean (< 0.77) were defined as protein depletion (malnutrition). Published cut points from recommended guidelines were evaluated (Table 1).<sup>4-9</sup> DXA values below the cut point were interpreted as ‘sarcopenic’. Sensitivity and specificity for each cut point were determined.</p><p><b>Results:</b> The study sample included 350 adults (238 males/112 females, median age 52 years) with liver cirrhosis and median model for end-stage liver disease (MELD) score 12 (range 5-36). Application of published sarcopenia cutpoints to diagnose malnutrition by DXA in patients with liver cirrhosis had sensitivity ranging from 40.8% to 79.0% and specificity ranging from 79.6% to 94.2% (Table 1). Although all of the selected published cutpoints for DXA-measured ASMI were similar, the Baumgartner<sup>4</sup> and Newman<sup>5</sup> ASMI cutpoints, particularly after correction for wet bone mass, yielded the best combination of sensitivity and specificity for diagnosing malnutrition as identified by protein depletion in patients with liver cirrhosis.
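The cut-point evaluation above reduces to standard 2×2 classification arithmetic: each DXA value below a cut point is called ‘sarcopenic’ and compared against IVNAA-defined protein depletion. A minimal sketch with made-up labels (not study data):

```python
def sens_spec(reference, test_positive):
    """Sensitivity and specificity of a binary test vs. a binary reference."""
    pairs = list(zip(reference, test_positive))
    tp = sum(1 for r, t in pairs if r and t)          # depleted, flagged
    fn = sum(1 for r, t in pairs if r and not t)      # depleted, missed
    tn = sum(1 for r, t in pairs if not r and not t)  # not depleted, not flagged
    fp = sum(1 for r, t in pairs if not r and t)      # not depleted, flagged
    return tp / (tp + fn), tn / (tn + fp)

depleted  = [1, 1, 1, 0, 0, 0]  # reference: protein index < 0.77
below_cut = [1, 0, 1, 0, 1, 0]  # test: ASMI below a published cut point
sens, spec = sens_spec(depleted, below_cut)
print(sens, spec)  # 2/3 and 2/3 for these made-up labels
```

Lowering a cut point trades sensitivity for specificity, which is the trade-off the abstract reports across the published cutpoints.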
The Studenski ASMM cutpoints in Table 1 yielded unacceptably low sensitivity (41-55%).</p><p><b>Conclusion:</b> These findings suggest that the DXA-derived Baumgartner/Newman bone-corrected ASMI cutpoints offer acceptable validity in the diagnosis of malnutrition by GLIM in patients with liver cirrhosis. However, because correcting DXA measures of ASMI for wet bone mass is not common practice, applying these cutpoints to standard uncorrected ASMI measures would likely yield much lower sensitivity, and many individuals with low muscularity and malnutrition would be misdiagnosed as non-malnourished.</p><p><b>Table 1.</b> Evaluation of Selected Published Cut-Points for Dual-Energy X-Ray Absorptiometry Appendicular Skeletal Muscle Index to Identify Protein Depletion in Patients with Liver Cirrhosis.</p><p></p><p>Abbreviations: DXA, dual-energy X-ray absorptiometry; M, male; F, female; GLIM, Global Leadership Initiative on Malnutrition; EWGSOP, European Working Group on Sarcopenia in Older People; AWGS, Asia Working Group for Sarcopenia; FNIH, Foundation for the National Institutes of Health; ASMM, appendicular skeletal muscle mass in kg determined from DXA-measured lean soft tissue of the arms and legs; ASMI, ASMM indexed to height in meters-squared; ASMM-BC, ASMM corrected for wet bone mass according to Heymsfield et al 1990; ASMI-BC, ASMI corrected for wet bone mass.</p><p><b>Critical Care and Critical Health Issues</b></p><p>Amir Kamel, PharmD, FASPEN<sup>1</sup>; Tori Gray, PharmD<sup>2</sup>; Cara Nys, PharmD, BCIDP<sup>3</sup>; Erin Vanzant, MD, FACS<sup>4</sup>; Martin Rosenthal, MD, FACS, FASPEN<sup>1</sup></p><p><sup>1</sup>University of Florida, Gainesville, FL; <sup>2</sup>Cincinnati Children, Gainesville, FL; <sup>3</sup>Orlando Health, Orlando, FL; <sup>4</sup>Department of Surgery, Division of Trauma and Acute Care Surgery, College of Medicine, University of
Florida, Gainesville, FL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Amino acids (AAs) serve different purposes in the body, including structural, enzymatic, and integral cellular functions. Amino acid utilization and demand may vary between healthy and disease states. Certain conditions, such as chronic kidney disease or short bowel syndrome, can affect plasma AA levels. Previous research has identified citrulline as a marker of intestinal function and absorptive capacity. Stressors such as surgery or trauma can alter AA metabolism, potentially leading to a hypercatabolic state and changes in the available AA pool. The primary objective of this study is to compare AA levels in patients who have undergone abdominal surgery with those who have not. The secondary endpoint is to describe post-surgical complications and correlate plasma AA levels with such complications.</p><p><b>Methods:</b> This study was a single-center retrospective analysis, conducted between January 1, 2007, and March 15, 2019, of patients who were referred to the University of Florida Health Nutrition Support Team (NST) and had a routine metabolic evaluation with amino acid levels as part of a nutrition support consult. Amino acid data were excluded if the specimen was deemed contaminated. Patients with genetic disorders were also excluded from the study. During the study period, the amino acid bioassay was performed using Bio-chrome ion exchange chromatography (ARUP Laboratories, Salt Lake City, UT).</p><p><b>Results:</b> Of the 227 patients screened, 181 patients were included in the study (58 who underwent abdominal surgery and 123 who did not). The mean age, BMI, and height of participants were 52.2 years, 25.1 kg/m<sup>2</sup>, and 169 cm, respectively.
Baseline characteristics were similar between the two groups: 31% of the surgery arm had undergone a surgical procedure within a year of the index encounter, 86.5% retained their colon, and 69.2% had a bowel resection, with a mean of 147.6 cm of bowel remaining among those with a documented length of remaining bowel (36 out of 58). Rates of the postoperative complications small bowel obstruction, ileus, leak, abscess, bleeding, and surgical site infection (SSI) were 12.1%, 24%, 17.2%, 20.7%, 3.4%, and 17.2%, respectively. Among the 19 AAs evaluated, median citrulline and methionine levels were significantly different between the 2 groups (23 [14-35] vs 17 [11-23], p = 0.0031, and 27 [20-39] vs 33 [24-51], p = 0.0383, respectively). Alanine and arginine levels were associated with postoperative ileus, leucine levels correlated with SSI, and glutamic acid and glycine levels were linked to postoperative fistula formation.</p><p><b>Conclusion:</b> Most amino acid levels showed no significant differences between patients who underwent abdominal surgery and those who did not, except for citrulline and methionine. Specific amino acids, such as alanine, arginine, leucine, glutamic acid, and glycine, may serve as early indicators of post-surgical complications; however, a larger prospective trial is warranted to validate our findings.</p><p>Kento Kurashima, MD, PhD<sup>1</sup>; Grace Trello<sup>1</sup>; James Fox<sup>1</sup>; Edward Portz<sup>1</sup>; Shaurya Mehta, BS<sup>1</sup>; Austin Sims<sup>1</sup>; Arun Verma, MD<sup>1</sup>; Chandrashekhara Manithody, PhD<sup>1</sup>; Yasar Caliskan, MD<sup>1</sup>; Mustafa Nazzal, MD<sup>1</sup>; Ajay Jain, MD, DNB, MHA<sup>1</sup></p><p><sup>1</sup>Saint Louis University, St. Louis, MO</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Marginal donor livers (MDLs) have been used for liver transplantation to address major organ shortages. However, MDLs are notably susceptible to ischemia/reperfusion injury (IRI).
Recent investigations have highlighted ferroptosis, a newly described type of programmed cell death, as a potential contributor to IRI. We hypothesized that modulating ferroptosis with the iron chelator deferoxamine (DFO) could alter the course of IRI.</p><p><b>Methods:</b> Using our novel Perfusion Regulated Organ Therapeutics with Enhanced Controlled Testing (PROTECT) model (US provisional patent, US63/136,165), six human MDLs (livers A to F) were procured and split into paired lobes. Simultaneous perfusion was performed on both lobes, with one lobe subjected to DFO while the other served as an internal control. Histology, serum chemistry, expression of ferroptosis-associated genes, determination of iron accumulation, and measurement of lipid peroxidation were performed.</p><p><b>Results:</b> Histological analysis revealed severe macrovesicular steatosis (>30%) in livers A and D, while livers B and E exhibited mild to moderate macrovesicular steatosis. The majority of the samples showed mild inflammation, predominantly in zone 3. No significant necrosis was noted during perfusion. Perl's Prussian blue stain and non-heme iron quantification demonstrated suppression of iron accumulation in livers A to D with DFO treatment (p < 0.05). Based on the degree of iron chelation, the 12 lobes were categorized into two groups: lobes with decreased iron (n = 4) and those with increased iron (n = 8). Comparative analysis demonstrated that ferroptosis-associated genes (HIF1-alpha, RPL8, IREB2, ACSF2, NQO1) were significantly downregulated in the former (p = 0.0338, p = 0.0085, p = 0.0138, p = 0.0138, p = 0.0209, respectively). Lipid peroxidation was significantly suppressed in lobes with decreased iron (p = 0.02). While serum AST was lower in iron-chelated lobes, this did not reach statistical significance.</p><p><b>Conclusion:</b> This study affirmed that iron accumulation was driven by normothermic perfusion.
Reduction of iron content suppressed ferroptosis-associated genes and lipid peroxidation, mitigating IRI. Our results using human MDLs revealed a novel relationship between iron content and ferroptosis, providing a solid foundation for future development of IRI therapeutics.</p><p>Gabriella ten Have, PhD<sup>1</sup>; Macie Mackey, BSc<sup>1</sup>; Carolina Perez, MSc<sup>1</sup>; John Thaden, PhD<sup>1</sup>; Sarah Rice, PhD<sup>1</sup>; Marielle Engelen, PhD<sup>1</sup>; Nicolaas Deutz, PhD, MD<sup>1</sup></p><p><sup>1</sup>Texas A&M University, College Station, TX</p><p><b>Financial Support:</b> Department of Defense: CDMRP PR190829 - ASPEN Rhoads Research Foundation - C. Richard Fleming Grant & Daniel H. Teitelbaum Grant.</p><p><b>Background:</b> Sepsis is a potentially life-threatening complication of infection in critically ill patients and is characterized by severe tissue breakdown in several organs, leading to long-term muscle weakness, fatigue, and reduced physical activity. Recent guidelines suggest increasing protein intake gradually in the first days of recovery from a septic event. It is, however, unclear how this affects whole-body protein turnover. Therefore, in an acute sepsis-recovery pig model, we studied whole-body protein metabolism in the early sepsis recovery phase after restricted feeding with a balanced meal of amino acids (AA).</p><p><b>Methods:</b> In 25 catheterized pigs (± 25 kg), sepsis was induced by IV infusion of live Pseudomonas aeruginosa bacteria (5 × 10<sup>8</sup> CFU/hour). At t = 9 h, recovery was initiated with IV administration of gentamicin. Post sepsis, food was given twice daily and incrementally (Day 1: 25%; Day 2: 50%; Day 3: 75%; Day ≥4: 100%). The 100% meals contained, per kg BW, 15.4 g CHO, 3.47 g fat, and a balanced free AA mixture reflecting the muscle AA profile (0.56 g N = 3.9 g AA). Before sepsis (Baseline) and on recovery day 3, an amino acid stable isotope mixture was administered IV as a pulse (post-absorptive).
Subsequently, arterial blood samples were collected post-absorptively for 2 hours. Amino acid concentrations and enrichments were determined with LCMS. Statistics: RM-ANOVA, <i><b>α </b></i>= 0.05.</p><p><b>Results:</b> At day 3, animal body weight was decreased (2.4 [0.9, 3.9]%, p = 0.0025). Compared to baseline values, plasma AA concentration profiles were changed. Overall, the total non-essential AA plasma concentration did not change. Essential AA plasma concentrations of histidine, leucine, methionine, phenylalanine, tryptophan, and valine were lower (p < 0.05) and lysine was higher (p = 0.0027); isoleucine was unchanged. We observed lower whole-body production (WBP) of the non-essential amino acids arginine (p < 0.0001), glutamine (p < 0.0001), glutamate (p < 0.0001), glycine (p < 0.0001), hydroxyproline (p = 0.0041), ornithine (p = 0.0003), taurine (p < 0.0001), and tyrosine (p < 0.0001). Citrulline production did not change. In addition, lower WBP was observed for the essential amino acids isoleucine (p = 0.0002), leucine (p < 0.0001), valine (p < 0.0001), methionine (p < 0.0001), tryptophan (p < 0.0001), and lysine (p < 0.0001). Whole-body protein breakdown and protein synthesis were also lower (p < 0.0001), while net protein breakdown did not change.</p><p><b>Conclusion:</b> Our sepsis-recovery pig model suggests that food restriction in the early phase of sepsis recovery leads to diminished protein turnover.</p><p>Gabriella ten Have, PhD<sup>1</sup>; Macie Mackey, BSc<sup>1</sup>; Carolina Perez, MSc<sup>1</sup>; John Thaden, PhD<sup>1</sup>; Sarah Rice, PhD<sup>1</sup>; Marielle Engelen, PhD<sup>1</sup>; Nicolaas Deutz, PhD, MD<sup>1</sup></p><p><sup>1</sup>Texas A&M University, College Station, TX</p><p><b>Financial Support:</b> Department of Defense: CDMRP PR190829 - ASPEN Rhoads Research Foundation - C. Richard Fleming Grant & Daniel H.
Teitelbaum Grant.</p><p><b>Background:</b> Sepsis is a potentially life-threatening complication of infection in critically ill patients. It is characterized by severe tissue breakdown in several organs, leading to long-term muscle weakness, fatigue, and reduced physical activity (ICU-AW). In an acute sepsis-recovery ICU-AW pig model, we studied whether meals that contain only essential amino acids (EAA) can restore the metabolic deregulations during sepsis recovery, as assessed by comprehensive metabolic phenotyping<sup>1</sup>.</p><p><b>Methods:</b> In 49 catheterized pigs (± 25 kg), sepsis was induced by IV infusion of live Pseudomonas aeruginosa bacteria (5 × 10<sup>8</sup> CFU/hour). At t = 9 h, recovery was initiated with IV administration of gentamicin. Post sepsis, food was given twice daily, blindly and incrementally (Day 1: 25%; Day 2: 50%; Day 3: 75%; Day ≥4: 100%). The 100% meals contained, per kg BW, 15.4 g CHO, 3.47 g fat, and 0.56 g N of an EAA mixture (reflecting muscle protein EAA, 4.3 g AA) or control (TAA, 3.9 g AA). Before sepsis (Baseline) and on recovery day 7, an amino acid stable isotope mixture was administered IV as a pulse (post-absorptive). Subsequently, arterial blood samples were collected for 2 hours. AA concentrations and enrichments were determined with LCMS. Statistics: RM-ANOVA, <i><b>α</b></i> = 0.05.</p><p><b>Results:</b> A body weight reduction was found after sepsis, restored by day 7 post sepsis. Compared to baseline, in the EAA group, increased muscle fatigue (p < 0.0001), tau-methylhistidine whole-body production (WBP) (reflecting myofibrillar muscle breakdown; p < 0.0001), and whole-body net protein breakdown (p < 0.0001) were observed, but to a lesser extent in the control group (muscle fatigue: p < 0.0001; tau-methylhistidine: p = 0.0531; net protein breakdown: p < 0.0001).
In addition, on day 7, lower WBP was observed for glycine (p < 0.0001), hydroxyproline (p < 0.0001), glutamate (p < 0.0001), glutamine (p < 0.0001), and taurine (p < 0.0001); these changes were smaller (glycine: p = 0.0014; hydroxyproline: p = 0.0007; glutamate: p = 0.0554) or larger (glutamine: p = 0.0497; taurine: p < 0.0001) in the control group. In addition, the WBP of citrulline (p = 0.0011) was increased on day 7, but less so in the control group (p = 0.0078). Higher plasma concentrations of asparagine (p < 0.0001), citrulline (p < 0.0001), glutamine (p = 0.0001), tau-methylhistidine (p = 0.0319), serine (p < 0.0001), taurine (p < 0.0001), and tyrosine (p < 0.0001) were observed in the EAA group. In the EAA group, clearance was lower (p < 0.05), except for glycine, tau-methylhistidine, and ornithine.</p><p><b>Conclusion:</b> Our sepsis-recovery pig ICU-AW model shows that feeding EAA-only meals after sepsis is associated with increased muscle and whole-body net protein breakdown and affects non-EAA metabolism. We hypothesize that non-essential amino acids in post-sepsis nutrition are needed to improve protein anabolism.</p><p>Rebecca Wehner, RD, LD, CNSC<sup>1</sup>; Angela Parillo, MS, RD, LD, CNSC<sup>1</sup>; Lauren McGlade, RD, LD, CNSC<sup>1</sup>; Nan Yang, RD, LD, CNSC<sup>1</sup>; Allyson Vasu-Sarver, MSN, APRN-CNP<sup>1</sup>; Michele Weber, DNP, RN, APRN-CNS, APRN-NP, CCRN, CCNS, OCN, AOCNS<sup>1</sup>; Stella Ogake, MD, FCCP<sup>1</sup></p><p><sup>1</sup>The Ohio State University Wexner Medical Center, Columbus, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Patients in the intensive care unit, especially those on mechanical ventilation, frequently receive inadequate enteral nutrition (EN) therapy. Critically ill patients receive on average 40-50% of prescribed nutritional requirements, while ASPEN/SCCM guidelines encourage that efforts be made to provide > 80% of goal energy and protein needs.
One method to help achieve these efforts is the use of volume-based feedings (VBF). At our institution, an hourly rate-based feeding (RBF) approach is standard. In 2022, our Medical Intensive Care Unit (MICU) operations committee inquired about the potential benefits of implementing VBF. Before changing our practice, we collected data to assess our current performance in meeting EN goals and to identify the reasons for interruptions. While literature suggests that VBF is considered relatively safe in terms of EN complications compared to RBF, to our knowledge, there is currently no information on the safety of starting VBF in patients at risk of gastrointestinal (GI) intolerance. Therefore, we also sought to determine whether EN is more frequently held due to GI intolerance versus operative procedures.</p><p><b>Methods:</b> We conducted a retrospective evaluation of EN delivery compared to EN goal and the reason for interruption if EN delivery was below goal in the MICU of a large tertiary academic medical center. We reviewed ten days of information on any MICU patient on EN. One day constituted the total EN volume received, in milliliters, from 0700-0659 hours. Using the QI Assessment Form, we collected the following data: goal EN volume in 24 hours, volume received in 24 hours, percent volume received versus prescribed, and hours feeds were held (or below goal rate) each day. The reasons for holding tube feeds were divided into six categories based on underlying causes: feeding initiation/titration, GI issues (constipation, diarrhea, emesis, nausea, distention, high gastric residual volume), operative procedure, non-operative procedure, mechanical issue, and practice issues. Data was entered on a spreadsheet, and descriptive statistics were used to evaluate results.</p><p><b>Results:</b> MICU patients receiving EN were observed over ten random days in a two-week period in August 2022. Eighty-two patients were receiving EN. 
Three hundred and four EN days were observed. The average percentage of EN delivered was 70% among all patients. EN was withheld for the following reasons: 34 cases (23%) were related to feeding initiation, 55 (37%) to GI issues, 19 (13%) to operative procedures, 32 (22%) to non-operative procedures, 2 (1%) to mechanical issues, and 5 (3%) to practice issues. VBF could have been considered in 51 cases (35%).</p><p><b>Conclusion:</b> These results suggest that EN delivery in our MICU most often falls below the prescribed amount due to GI issues and feeding initiation; together, these comprised 89 cases (60%). VBF protocols would not improve delivery in either case: VBF would likely lead to increased discomfort in patients experiencing GI issues, and feeding initiation can be improved with changes to advancement protocols. Because VBF had potential benefit in only 35% of cases, and above-average EN delivery was observed, this protocol was not implemented in the observed MICU.</p><p>Delaney Adams, PharmD<sup>1</sup>; Brandon Conaway, PharmD<sup>2</sup>; Julie Farrar, PharmD<sup>3</sup>; Saskya Byerly, MD<sup>4</sup>; Dina Filiberto, MD<sup>4</sup>; Peter Fischer, MD<sup>4</sup>; Roland Dickerson, PharmD<sup>3</sup></p><p><sup>1</sup>Regional One Health, Memphis, TN; <sup>2</sup>Veterans Affairs Medical Center, Memphis, TN; <sup>3</sup>University of Tennessee College of Pharmacy, Memphis, TN; <sup>4</sup>University of Tennessee College of Medicine, Memphis, TN</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> Society for Critical Care Medicine 54th Annual Critical Care Congress.
February 23 to 25, 2025, Orlando, FL.</p><p><b>Publication:</b> Critical Care Medicine. 2025;53(1): in press.</p><p><b>Financial Support:</b> None Reported.</p><p><b>Best of ASPEN-Critical Care and Critical Health Issues</b></p><p>Megan Beyer, MS, RD, LDN<sup>1</sup>; Krista Haines, DO, MA<sup>2</sup>; Suresh Agarwal, MD<sup>2</sup>; Hilary Winthrop, MS, RD, LDN, CNSC<sup>2</sup>; Jeroen Molinger, PhDc<sup>3</sup>; Paul Wischmeyer, MD, EDIC, FCCM, FASPEN<sup>4</sup></p><p><sup>1</sup>Duke University School of Medicine- Department of Anesthesiology, Durham, NC; <sup>2</sup>Duke University School of Medicine, Durham, NC; <sup>3</sup>Duke University Medical Center - Department of Anesthesiology - Duke Heart Center, Raleigh, NC; <sup>4</sup>Duke University Medical School, Durham, NC</p><p><b>Financial Support:</b> Baxter, Abbott.</p><p><b>Background:</b> Resting energy expenditure (REE) is critical in managing nutrition in intensive care unit (ICU) patients. Accurate energy needs assessments are vital for optimizing nutritional interventions. However, little is known about how different disease states specifically influence REE in ICU patients. Existing energy recommendations are generalized and do not account for the metabolic variability across disease types. Indirect calorimetry (IC) is considered the gold standard for measuring REE but is underutilized. This study addresses this gap by analyzing REE across disease states using metabolic cart assessments in a large academic medical center. The findings are expected to inform more precise, disease-specific nutritional recommendations in critical care.</p><p><b>Methods:</b> This is a pooled analysis of patients enrolled in four prospective clinical trials evaluating ICU patients across a range of disease states. Patients included in this analysis were admitted to the ICU with diagnoses of COVID-19, respiratory failure, cardiothoracic (CT) surgery, trauma, or surgical intensive care conditions.
All patients underwent IC within 72 hours of ICU admission to assess REE, with follow-up measurements conducted when patients were medically stable. In each study, patients were managed under standard ICU care protocols, and nutritional interventions were individualized or standardized based on clinical trial protocols. The primary outcome was the measured REE, expressed in kcal/day and normalized to body weight (kcal/kg/day). Summary statistics for demographic and clinical characteristics, such as age, gender, height, weight, BMI, and comorbidities, were reported. Comparative analyses across the five disease states were performed using ANOVA tests to determine the significance of differences in REE.</p><p><b>Results:</b> The analysis included 165 ICU patients. The cohort had a mean age of 58 years, with 58% male and 42% female. Racial demographics included 36% black, 52% white, and 10% from other backgrounds. Patients in the surgical ICU group had the lowest caloric requirements, averaging 1503 kcal/day, while COVID-19 patients had the highest, at 1982 kcal/day. CT surgery patients measured 1644 kcal/day, respiratory failure patients 1763 kcal/day, and trauma patients 1883 kcal/day. ANOVA demonstrated statistically significant differences in REE between these groups (p < 0.001). When normalized to body weight (kcal/kg/day), REE ranged from 20.3 to 23.5 kcal/kg/day, with statistically significant differences between disease states (p < 0.001).</p><p><b>Conclusion:</b> This study reveals significant variability in REE across different disease states in ICU patients, highlighting the need for disease-specific energy recommendations. These findings indicate that specific disease processes, such as COVID-19 and trauma, may increase metabolic demands, while patients recovering from surgical procedures may have comparatively lower energy needs.
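The ANOVA comparison described above can be illustrated by computing the one-way F statistic directly. The sketch below draws synthetic samples around the reported group means; the group sizes and spread are assumptions, not the study data:

```python
import numpy as np

def one_way_anova_f(groups):
    """One-way ANOVA F statistic: between-group over within-group mean square."""
    all_vals = np.concatenate(groups)
    grand = all_vals.mean()
    k, n = len(groups), all_vals.size
    ss_between = sum(g.size * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

rng = np.random.default_rng(1)
# Synthetic REE samples centred on the reported group means (kcal/day);
# the per-group n of 33 and SD of 150 are illustrative assumptions.
means = [1503, 1644, 1763, 1883, 1982]  # SICU, CT surgery, resp. failure, trauma, COVID-19
groups = [rng.normal(m, 150, 33) for m in means]
print(one_way_anova_f(groups))  # a large F here reflects the between-group spread
```

A large F (compared against the F distribution with k-1 and n-k degrees of freedom) corresponds to the p < 0.001 reported for the five disease states.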
These findings emphasize the importance of individualized nutritional interventions based on a patient's disease state to optimize recovery and clinical outcomes and to prevent underfeeding or overfeeding, both of which can adversely affect patient outcomes. The results suggest IC should be more widely implemented in ICU settings to guide precise and effective nutrition delivery based on real-time metabolic data rather than relying on standard predictive equations. Further research is needed to refine these recommendations and explore continuous monitoring of REE and tailored nutrition needs in the ICU.</p><p><b>Table 1.</b> Demographic and Clinical Characteristics.</p><p></p><p><b>Table 2.</b> Disease Group Diagnoses.</p><p></p><p></p><p><b>Figure 1.</b> Average Measured Resting Energy Expenditure by Disease Group.</p><p>Hailee Prieto, MA, RD, LDN, CNSC<sup>1</sup>; Emily McDermott, MS, RD, LDN, CNSC<sup>2</sup></p><p><sup>1</sup>Northwestern Memorial Hospital, Shorewood, IL; <sup>2</sup>Northwestern Memorial Hospital, Chicago, IL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Communication between Registered Dietitians (RDs) and CTICU teams is non-standardized and often ineffective in supporting proper implementation of RD recommendations. Late or absent RD support can impact the quality of nutrition care provided to patients. In FY23, the CTICU nutrition consult/risk turnaround time was 58% within 24 hrs and missed nutrition consults/risks were 9%. Our goal was to improve the RD consult/risk turnaround time within 24 hrs, based on our department goal, from 58% to 75%, and missed RD consults/risks from 9% to 6%, by standardizing communication between RDs and CTICU APRNs. The outcome metrics were nutrition risk turnaround time and nutrition consult turnaround time. The process metric was our percentage of RD presence in rounds.</p><p><b>Methods:</b> We used the DMAIC methodology to attempt to solve our communication issue in the CTICU.
We took the voice of the customer by surveying the CTICU APRNs and found that one barrier was the RDs' limited presence in the CTICU. The CTICU APRNs reported that having an RD round daily with their team would be valuable. We then performed a literature search on RDs rounding in the ICU, specifically cardiac/thoracic ICUs, and found that critically ill cardiac surgery patients are at high risk of developing malnutrition; however, initiation of medical nutrition therapy and overall adequacy of nutrition provision are lower compared to noncardiac surgical or MICU patients. RD-written orders directly improve outcomes in patients receiving nutrition support. To have the most influence, RDs need to be present in the ICU and be involved when important decisions are being made. Dietitian involvement in the ICU team significantly improves the team's ability to implement prompt, relevant nutrition support interventions. We used process mapping to address the overlap between CTICU rounding times and rounding on the step-down cardiac floors. We optimized the RDs' daily schedules to allow attendance at as many rounds as possible, including the CTICU rounds. We then implemented a new rounding structure within the Cardiac Service Line based on the literature search for the standard of care and the RD role in ICU rounding.</p><p><b>Results:</b> The percentage of nutrition consults/risks addressed within 24 hours increased by 26 percentage points (58% to 84%), and missed consults/risks decreased to 1%, exceeding both goals. The number of nutrition interventions we were able to implement increased with more RDs attending rounds, which was tracked following implementation of an RD rounding structure within the CTICU.
The comparison of implemented interventions between 1 and 2 RDs was skewed because, on days when only 1 RD was available, that RD attempted to round with both teams.</p><p><b>Conclusion:</b> Communication between the CTICU team and Clinical Nutrition continues to improve, with consistent positive feedback from the ICU providers regarding the new rounding structure. The new workflow was implemented in the Clinical Nutrition Cardiac Service Line. As a future opportunity, other ICU teams at NMH that lack a dedicated rounding RD because of RD staffing constraints could also benefit from daily RD presence in their rounds.</p><p><b>Table 1.</b> New Rounding Structure.</p><p></p><p>*Critical Care Rounds; Green: Attend; Gold: Unable to attend.</p><p><b>Table 2.</b> Control Plan.</p><p></p><p></p><p><b>Figure 1.</b> Results Consult Risk Turn Around Time Pre & Post Rounding.</p><p></p><p><b>Figure 2.</b> Number of Implemented RD Nutrition Interventions by Number of RDs Rounding.</p><p>Kenny Ngo, PharmD<sup>1</sup>; Rachel Leong, PharmD<sup>2</sup>; Vivian Zhao, PharmD<sup>2</sup>; Nisha Dave, PharmD<sup>2</sup>; Thomas Ziegler, MD<sup>2</sup></p><p><sup>1</sup>Emory Healthcare, Macon, GA; <sup>2</sup>Emory Healthcare, Atlanta, GA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Micronutrients play a crucial role in biochemical processes in the body. During critical illness, the status of micronutrients can be affected by factors such as disease severity and medical interventions. Extracorporeal membrane oxygenation (ECMO) is a vital supportive therapy that has seen increased utilization for critically ill patients with acute severe refractory cardiorespiratory failure. The potential alterations in micronutrient status and requirements during ECMO are an area of significant interest, but data are limited.
This study aimed to determine the incidence of micronutrient depletion in critically ill patients requiring ECMO.</p><p><b>Methods:</b> A retrospective chart review was conducted for patients with at least one micronutrient level measured in blood while receiving ECMO between January 1, 2015, and September 30, 2023. Chart reviews were completed using the Emory Healthcare electronic medical record system after the Emory University Institutional Review Board approved the study. A full waiver for informed consent and authorization was approved for this study. Data on demographic characteristics, ECMO therapy-related information, and reported micronutrient levels were collected. Descriptive statistics were used to evaluate the data.</p><p><b>Results:</b> A total of 77 of the 128 reviewed patients met inclusion criteria and were included in data analysis (Table 1). The average age of patients was 49, and 55.8% were female. The average duration of ECMO was 14 days, and the average length of stay in the intensive care unit was 46.7 days. Among the included patients, 56 required continuous renal replacement therapy (CRRT) along with ECMO. Patients who required CRRT received empiric standard supplementation of folic acid (1 mg), pyridoxine (200 mg), and thiamine (100 mg) every 48 hours. Of the 77 patients, 44% had below-normal blood levels of at least one of the measured micronutrients. The depletion percentages of various nutrients were as follows: vitamin C (80.6%), vitamin D (75.0%), iron (72.7%), copper (53.3%), carnitine (31.3%), selenium (30.0%), pyridoxine (25.0%), folic acid (18.8%), vitamin A (10.0%), zinc (7.1%), and thiamine (3.7%) (Table 2). Measured vitamin B12, manganese, and vitamin E levels were within normal limits.</p><p><b>Conclusion:</b> This study demonstrated that 60% of patients on ECMO had orders to evaluate at least one micronutrient in their blood. Of these, almost half had at least one micronutrient level below normal limits. 
These findings underscore the need for regular nutrient monitoring for critically ill patients. Prospective studies are needed to understand the impact of ECMO on micronutrient status, determine the optimal time for evaluation, and assess the need for and efficacy of supplementation in these patients.</p><p><b>Table 1.</b> General Demographic and ECMO Characteristics (N = 77).</p><p></p><p><b>Table 2.</b> Observed Micronutrient Status during ECMO for Critically Ill Patients.</p><p></p><p>Diane Nowak, RD, LD, CNSC<sup>1</sup>; Mary Kronik, RD, LD, CNSC<sup>2</sup>; Caroline Couper, RD, LD, CNSC<sup>3</sup>; Mary Rath, MEd, RD, LD, CNSC<sup>4</sup>; Ashley Ratliff, MS, RD, LD, CNSC<sup>4</sup>; Eva Leszczak-Lesko, BS Health Sciences, RRT<sup>4</sup></p><p><sup>1</sup>Cleveland Clinic, Elyria, OH; <sup>2</sup>Cleveland Clinic, Olmsted Twp, OH; <sup>3</sup>Cleveland Clinic, Rocky River, OH; <sup>4</sup>Cleveland Clinic, Cleveland, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Indirect calorimetry (IC) is the gold standard for the accurate determination of energy expenditure. The team performed a comprehensive literature review on current IC practices across the nation which showed facilities employing IC typically follow a standard protocol dictated by length of stay (LOS). Intensive Care Unit (ICU) Registered Dietitians (RD) have directed IC intervention to reduce reliance on inaccurate predictive equations and judiciously identify patients (1, 2) with the assistance of IC order-based practice. While IC candidacy is determined by clinical criteria, implementation has been primarily dictated by RD time constraints. Our project aims to include IC in our standard of care by using a standardized process for implementation.</p><p><b>Methods:</b> To implement IC at our 1,299-bed quaternary care hospital, including 249 ICU beds, a multidisciplinary team including ICU RDs and Respiratory Therapists (RT) partnered with a physician champion. 
Three Cosmed QNRG+ indirect calorimeters were purchased after a 6-month trial period. Due to the potential for rapid clinical status changes and RD staffing, the ICU team selected an order-based practice as opposed to a protocol. The order is signed by a Licensed Independent Practitioner (LIP) and includes three components: an order for IC with indication, a nutrition assessment consult, and a conditional order for the RD to release to RT once testing is approved. After the order is signed, the RD collaborates with the Registered Nurse and RT by verifying standardized clinical criteria to assess IC candidacy. If appropriate, the RD will release the order for RT prior to testing to allow for documentation of ventilator settings. To begin the test, the RD enters patient information and calibrates the pneumotach, after which the RT secures ventilation connections. Next, the RD starts the test and remains at bedside for the standardized 20-minute duration to ensure steady state is not interrupted. Once testing is completed, the best 5-minute average is selected to obtain the measured resting energy expenditure (mREE). The RD interprets the results considering a multitude of factors and, if warranted, modifies nutrition interventions.</p><p><b>Results:</b> Eight ICU registered dietitians completed 87 IC measurements from May 2024 through August 2024, which included patients across various ICUs. All 87 patients were selected by the RD due to concerns for over- or underfeeding. Eighty-three percent of the measurements were valid tests, and seventy-nine percent of the measurements led to intervention modifications. The amount of face-to-face time spent was 66 hours and 45 minutes, an average of 45 minutes per test. Additional time spent interpreting results and making modifications to interventions ranged from 15 to 30 minutes.</p><p><b>Conclusion:</b> IC has the ability to capture accurate energy expenditures in the critically ill.
RD-directed, order-based IC practice has allowed for the successful introduction of IC at our institution. The future transition from limited IC implementation to a standard of care will depend on numerous challenges, including RD time constraints and patient volumes amid the ebbs and flows of critical care. To align with the ever-changing dynamics of critical care, staffing levels and workflows are being actively evaluated.</p><p><b>Table 1.</b> Indirect Calorimetry (IC) Checklist.</p><p></p><p></p><p><b>Figure 1.</b> IC Result with Invalid Test.</p><p></p><p><b>Figure 2.</b> IC Result with Valid Test.</p><p></p><p><b>Figure 3.</b> IC Indications and Contraindications.</p><p></p><p><b>Figure 4.</b> IC EPIC Order.</p><p>Rebecca Frazier, MS, RD, CNSC<sup>1</sup>; Chelsea Heisler, MD, MPH<sup>1</sup>; Bryan Collier, DO, FACS, FCCM<sup>1</sup></p><p><sup>1</sup>Carilion Roanoke Memorial Hospital, Roanoke, VA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Adequate energy intake with appropriate macronutrient composition is an essential part of patient recovery; however, predictive equations have been found to be of variable accuracy. Indirect calorimetry (IC) gives insight into caloric needs and the primary nutrition substrate being utilized as metabolic fuel, often identifying over- and underfeeding. Though IC is considered the gold standard for determining resting energy expenditure, it has challenges with cost, equipment feasibility, and time constraints with personnel, namely respiratory therapy (RT). Our hypothesis: Registered Dietitian (RD)-led IC tests can be conducted in a safe and feasible manner without undue risk of complication. In addition, IC will show a higher caloric need, particularly in patients who require at least seven days of ventilatory support.</p><p><b>Methods:</b> A team of RDs screened surgical ICU patients at a single institution.
Patients intubated for at least 3 days were considered eligible for testing. Exclusion criteria included PEEP > 10 cm H2O, fraction of inspired oxygen > 60%, Richmond Agitation Sedation Scale ≥ 1, chest tube air leak, extracorporeal membrane oxygenation use, and > 1°C temperature change in 1 hour. Tests were completed using the Q-NRG+ Portable Metabolic Monitor (Baxter) based on RD and patient availability. Test results were compared to calculated needs based on the Penn State Equation (39 tests). Overfeeding/underfeeding was defined as > 15% deviation from the equation results. Analysis of mean difference in energy needs was calculated using a standard paired, two-tailed t-test for ≤ 7 total ventilated days and > 7 ventilated days.</p><p><b>Results:</b> Thirty patients underwent IC testing; a total of 39 tests were completed. There were no complications in RD-led IC testing, and minimal RT involvement was required after the first 5 tests were completed. Overall, 56.4% of all IC tests, regardless of ventilator days, indicated overfeeding. In addition, 33.3% of tests indicated appropriate feeding (85-115% of calculated REE), and 10.3% of tests demonstrated underfeeding. When stratified by ventilator days (> 7 d vs. ≤ 7 d), similar results were found, with 66% of IC tests deviating > 15% from calculated caloric needs; 54.4-60.0% fed per equation were overfed and 12.5-6.7% were underfed, respectively.</p><p><b>Conclusion:</b> Equations estimating caloric needs provide inconsistent results. Nutritional equations under- and overestimate nutritional needs similarly, regardless of ventilatory days, compared to IC. Despite the lack of statistical significance, the effects of poor nutrition are well documented and vastly clinically significant. With minimal training, IC can be performed safely with an RD and bedside RN. Utilizing the RD to coordinate and perform IC testing is a feasible process that maximizes personnel efficiency and allows for immediate adjustment in the nutrition plan.
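The comparison described in the Methods, predicted REE from the Penn State equation versus IC-measured REE with a >15% deviation threshold, can be sketched as below. This is a hedged illustration: the Penn State (2003b) coefficients and the Mifflin-St Jeor equation are standard published forms, but the classification helper and all example values are assumptions for demonstration, not the study's code.

```python
def mifflin_st_jeor(weight_kg, height_cm, age_yr, male=True):
    """Mifflin-St Jeor resting metabolic rate (kcal/day)."""
    base = 10 * weight_kg + 6.25 * height_cm - 5 * age_yr
    return base + (5 if male else -161)

def penn_state_2003b(weight_kg, height_cm, age_yr, male, tmax_c, ve_l_min):
    """Penn State 2003b REE estimate for ventilated patients (kcal/day).
    tmax_c: maximum body temperature (deg C); ve_l_min: minute ventilation (L/min)."""
    return (mifflin_st_jeor(weight_kg, height_cm, age_yr, male) * 0.96
            + tmax_c * 167 + ve_l_min * 31 - 6212)

def classify_feeding(measured_kcal, predicted_kcal, tolerance=0.15):
    """Compare IC-measured REE to the equation estimate (assumed helper).
    If measured REE is <85% of predicted, feeding to the equation would
    overfeed; if >115%, it would underfeed; otherwise 'appropriate'."""
    ratio = measured_kcal / predicted_kcal
    if ratio < 1 - tolerance:
        return "overfeeding"
    if ratio > 1 + tolerance:
        return "underfeeding"
    return "appropriate"
```

For example, for a hypothetical 70 kg, 170 cm, 40-year-old male with Tmax 37 °C and VE 10 L/min, the equation predicts roughly 1782 kcal/day; an IC measurement of 1400 kcal/day would then be classified as overfeeding under the study's threshold.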
IC as the gold standard for nutrition estimation should be performed on surgical ICU patients to assist in developing nutritional treatment algorithms.</p><p>Dolores Rodríguez<sup>1</sup>; Mery Guerrero<sup>2</sup>; María Centeno<sup>2</sup>; Barbara Maldonado<sup>2</sup>; Sandra Herrera<sup>2</sup>; Sergio Santana<sup>3</sup></p><p><sup>1</sup>Ecuadorian Society for the Fight against Cancer, Guayaquil, Guayas; <sup>2</sup>SOLCA, Guayaquil, Guayas; <sup>3</sup>University of Havana, La Habana, Ciudad de la Habana</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> In 2022, the International Agency for Research on Cancer-Globocan reported nearly 20 million new cases of cancer worldwide, with 30,888 cases in Ecuador. Breast, prostate, and stomach cancers were the most diagnosed types. Oncohematological diseases (OHD) significantly affect the nutritional status of patients. The ELAN Ecuador-2014 study, involving over 5,000 patients, found malnutrition in 37% of participants overall, rising to 65% among those with OHD. The Latin American Study of Malnutrition in Oncology (LASOMO), conducted by FELANPE between 2019 and 2020, revealed a 59.1% frequency of malnutrition among 1,842 patients across 52 health centers in 10 Latin American countries. This study aims to present the current state of malnutrition associated with OHD among patients treated in Ecuadorian hospitals.</p><p><b>Methods:</b> The Ecuadorian segment of the LASOMO Study was conducted between 2019 and 2020, as part of the previously mentioned regional epidemiological initiative. This study was designed as a one-day, nationwide, multicenter survey involving health centers and specialized services for patients with Oncohematological diseases (OHD) across five hospitals located in the provinces of Guayas (3), Manabí (1), and Azuay (1). 
The nutritional status of patients with Oncohematological diseases (OHD) was assessed using the B + C scores from Detsky et al.'s Subjective Global Assessment (SGA). This study included male and female patients aged 18 years and older, admitted to clinical, surgical, intensive care, and bone marrow transplant (BMT) units during October and November 2019. Participation was voluntary, and patients provided informed consent by signing a consent form. Data were analyzed using location, dispersion, and aggregation statistics based on variable types. The nature and strength of relationships were assessed using chi-square tests for independence, with a significance level of < 5% to identify significant associations. Odds ratios for malnutrition were calculated along with their associated 95% confidence intervals.</p><p><b>Results:</b> The study enrolled 390 patients, with 63.6% women and 36.4% men, averaging 55.3 ± 16.5 years old; 47.2% were aged 60 years or older. The most common tumor locations included kidneys, urinary tract, uterus, ovaries, prostate, and testicles, accounting for 18.7% of all cases (refer to Table 1). Chemotherapy was the predominant oncological treatment, administered to 42.8% of the patients surveyed. Malnutrition affected 49.7% of the patients surveyed, with 14.4% categorized as severely malnourished (see figure 1). The incidence of malnutrition was found to be independent of age, educational level, tumor location, and current cytoreductive treatment (refer to Table 2). Notably, the majority of the malnourished individuals were men.</p><p><b>Conclusion:</b> Malnutrition is highly prevalent in patients treated for OHD in Ecuadorian hospitals.</p><p><b>Table 1.</b> Most Frequent Location of Neoplastic Disease in Hospitalized Ecuadorian Patients and Type of Treatment Received. 
The number and (in brackets) the percentage of patients included in the corresponding category are presented.</p><p><b>Table 2.</b> Distribution of Malnutrition Associated with Cancer According to Selected Demographic, Clinical, and Health Characteristics of Patients Surveyed During the Oncology Malnutrition Study in Ecuador. The number and (in brackets) the percentage of malnourished patients included in each characteristic category are presented. The frequency of malnutrition was estimated using the Subjective Global Assessment (Detsky et al., 1987).</p><p></p><p></p><p><b>Figure 1.</b> State of Malnutrition Among Patients Treated for Cancer in Hospitals in Ecuador.</p><p>Ranna Modir, MS, RD, CNSC, CDE, CCTD<sup>1</sup>; Christina Salido, RD<sup>1</sup>; William Hiesinger, MD<sup>2</sup></p><p><sup>1</sup>Stanford Healthcare, Stanford, CA; <sup>2</sup>Stanford Medicine, Stanford, CA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Cardiovascular Intensive Care Unit (CVICU) patients are at high risk for significant nutrient deficits, especially during the early intensive care unit (ICU) phase, when postoperative complications and hemodynamic instability are prevalent. These deficits can exacerbate the catabolic state, leading to muscle wasting, impaired immune function, and delayed recovery. A calorie deficit > 10,000 kcal, with < 80% of nutritional needs met, in the early ICU phase (first 14 days) has been linked to worse outcomes, including prolonged intubation, increased ICU length of stay (LOS), and higher risk of organ dysfunction and infections such as central line-associated bloodstream infections (CLABSI). When evaluating CLABSI risk factors, the role of nutrition adequacy and malnutrition is often underestimated and overlooked, with more emphasis placed on the type of nutrition support (NS) provided, whether enteral nutrition (EN) or parenteral nutrition (PN).
Historically, there has been a practice of avoiding PN to reduce CLABSI risk, rather than ensuring that nutritional needs are fully met. This practice is based on initial reports from decades ago linking PN with hospital-acquired infections. However, updated guidelines based on modern data now indicate no difference in infectious outcomes with EN vs PN. In fact, adding PN when EN alone is insufficient can help reduce nutritional deficiencies, supporting optimal immune response and resilience to infection. As part of an ongoing NS audit in our CVICU, we reviewed all CLABSI cases over a 23-month period to assess nutrition adequacy and the modality of NS (EN vs PN) provided.</p><p><b>Methods:</b> Data were extracted from electronic medical records for all CLABSI cases from September 2020 to July 2022. Data collected included patient characteristics and clinical and nutrition outcomes (Table 1). Descriptive statistics (means, standard deviations, frequencies) were calculated. Chi-square or Fisher's exact test assessed the association between type of NS and meeting >80% of calorie/protein targets within 14 days and until CLABSI onset. A significance level of 0.05 was used.</p><p><b>Results:</b> Over the 23-month period, 28/51 (54.9%) patients with a CLABSI required exclusive NS throughout their entire CVICU stay: 18 were male (64.3%), median age was 54.5 years, mean BMI was 27.4, and median CVICU LOS was 49.5 days, with a 46.4% mortality rate. Surgical intervention was indicated in 60.7% of patients, with 41.2% requiring preoperative extracorporeal membrane oxygenation (ECMO) and 52.9% postoperative ECMO (Table 1). The majority of patients received exclusive EN (53.6%), with 46.4% receiving EN + PN and 0% exclusive PN. In the first 14 ICU days, 21.4% met >80% of calorie needs, 32.1% met >80% of protein needs, and 32.1% had a calorie deficit >10,000 kcal. There was no difference between type of NS and ability to meet >80% of nutrient targets in the first 14 days (Table 1; p = 0.372, p = 0.689).
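The association test named in the Methods (chi-square or Fisher's exact test of NS modality against meeting >80% of targets) takes the shape sketched below. The 2x2 counts are hypothetical placeholders chosen for illustration, not the audit's data.

```python
# Sketch of the Methods' association test: Fisher's exact test on a 2x2
# contingency table of NS modality vs. meeting >80% of calorie targets.
# Counts are hypothetical, not the study's data.
from scipy.stats import fisher_exact

#                    met >80%   did not meet
table = [[9, 4],   # EN + PN
         [3, 12]]  # exclusive EN

odds_ratio, p_value = fisher_exact(table)
significant = p_value < 0.05  # the study's significance level
```

Fisher's exact test is the usual choice over chi-square when expected cell counts are small, as in a 28-patient cohort split across modalities.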
The majority of PN (61.5%) was initiated after ICU day 7. From ICU day 1 until CLABSI onset, the EN + PN group was better able to meet >80% of calorie targets vs exclusive EN (p = 0.016). Half (50%) were diagnosed with malnutrition; 82% required ECMO cannulas and 42.9% a dialysis triple lumen. Enterococcus faecalis was the most common organism in both the EN (43.7%) and EN + PN (35.7%) groups (Table 2).</p><p><b>Conclusion:</b> This single-center analysis of CVICU CLABSI patients found that the majority of patients requiring exclusive NS failed to meet >80% of nutrition needs during the early ICU phase. Exclusive EN was the primary mode of NS compared to EN + PN or PN alone, challenging the assumption that PN inherently increases CLABSI risk. In fact, EN + PN improved the ability to meet calorie targets until CLABSI onset. These findings suggest that early nutrient deficits may increase CLABSI risk and that the risk does not depend on the type of NS provided.</p><p><b>Table 1.</b> Patient Characteristics, Clinical and Nutritional Outcomes.</p><p></p><p><b>Table 2.</b> Type of Central Access Device and Microorganism in Relation to Modality of Nutrition Support Provided.</p><p></p><p>Oki Yonatan, MD<sup>1</sup>; Faya Nuralda Sitompul<sup>2</sup></p><p><sup>1</sup>ASPEN, Jakarta, Jakarta Raya; <sup>2</sup>Osaka University, Minoh, Osaka</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Ginseng, widely used as a functional food or therapeutic supplement in Asia, contains bioactive compounds such as ginsenosides, which exert a range of biological effects, including hypoglycemic, anti-inflammatory, cardioprotective, and anti-tumor properties. However, studies have indicated that ginseng also has anticoagulant and anti-aggregation effects and may be associated with bleeding. This case report presents a potential case of ginseng-induced bleeding in an elderly patient with advanced pancreatic cancer.
Case Description: A 76-year-old male with stage IV pancreatic cancer and metastases to the liver, lymph nodes, and peritoneum was in home care with BIPAP ventilation, NGT feeding, ascites drainage, and a Foley catheter. He had a history of type A aortic dissection repair, anemia, and thrombocytopenia, with platelet counts consistently below 50,000/µL. Despite no history of anticoagulant use, the patient developed massive gastrointestinal bleeding and hematuria after consuming 100 grams of American Ginseng (AG) per day for five days. Disseminated intravascular coagulation (DIC) was initially suspected, but no signs of bleeding were observed until the third week of care, coinciding with ginseng consumption. Endoscopy was not performed due to the patient's unstable condition and the family's refusal. Discussion: The consumption of AG may have triggered bleeding due to the patient's already unstable condition and low platelet count. Ginsenosides, particularly Rg1, Rg2, and Rg3, have been shown to exert anticoagulant effects, prolonging clotting times and inhibiting platelet aggregation. Studies have demonstrated that AG extracts can significantly extend clotting times and reduce platelet activity, potentially contributing to the observed bleeding. Conclusion: This case highlights the possible role of AG in inducing severe bleeding in a patient with pancreatic cancer and thrombocytopenia.
Given ginseng's known anticoagulant properties, caution should be exercised when administering it to patients with hematological abnormalities or bleeding risks, and further research is warranted to assess its safety in these populations.</p><p><b>Methods:</b> None Reported.</p><p><b>Results:</b> None Reported.</p><p><b>Conclusion:</b> None Reported.</p><p>Kursat Gundogan, MD<sup>1</sup>; Mary Nellis, PhD<sup>2</sup>; Nurhayat Ozer, PhD<sup>3</sup>; Sahin Temel, MD<sup>3</sup>; Recep Yuksel, MD<sup>4</sup>; Murat Sungar, MD<sup>5</sup>; Dean Jones, PhD<sup>2</sup>; Thomas Ziegler, MD<sup>6</sup></p><p><sup>1</sup>Division of Clinical Nutrition, Erciyes University Health Sciences Institute, Kayseri; <sup>2</sup>Emory University, Atlanta, GA; <sup>3</sup>Erciyes University Health Sciences Institute, Kayseri; <sup>4</sup>Department of Internal Medicine, Erciyes University School of Medicine, Kayseri; <sup>5</sup>Department of Internal Medicine, Erciyes University School of Medicine, Kayseri; <sup>6</sup>Emory Healthcare, Atlanta, GA</p><p><b>Financial Support:</b> Erciyes University Scientific Research Committee (TSG- 2021–10078) and TUBITAK 2219 program (1059B-192000150), each to KG, and National Institutes of Health grant P30 ES019776, to DPJ and TRZ.</p><p><b>Background:</b> Metabolomics represents a promising profiling technique for the investigation of significant metabolic alterations that arise in response to critical illness. The present study utilized plasma high-resolution metabolomics (HRM) analysis to define systemic metabolism associated with common critical illness severity scores in critically ill adults.</p><p><b>Methods:</b> This cross-sectional study was performed at Erciyes University Hospital, Kayseri, Turkiye and Emory University, Atlanta, GA, USA. Participants were critically ill adults with an expected length of intensive care unit stay longer than 48 h. Plasma for metabolomics was obtained on the day of ICU admission.
Data were analyzed using regression analysis of two ICU admission illness severity scores (APACHE II and mNUTRIC) against all plasma metabolomic features in metabolome-wide association studies (MWAS). APACHE II score was analyzed as a continuous variable, and mNUTRIC score was analyzed as a dichotomous variable [≤4 (low) vs. > 4 (high)]. Pathway enrichment analysis was performed on significant metabolites (raw p < 0.05) related to each of the two illness severity scores independently.</p><p><b>Results:</b> A total of 77 patients were included. The mean age was 69 years (range 33-92 years); 65% were female. More than 15,000 metabolomic features were identified for the MWAS of APACHE II and mNUTRIC scores. Metabolic pathways significantly associated with APACHE II score at ICU admission included C21-steroid hormone biosynthesis, the urea cycle, and vitamin E, seleno amino acid, aspartate/asparagine, and thiamine metabolism. Metabolic pathways associated with mNUTRIC score at ICU admission were N-glycan degradation and metabolism of fructose/mannose, vitamin D3, pentose phosphate, sialic acid, and linoleic acid. Within the significant pathways, the gut microbiome-derived metabolites hippurate and N-acetyl ornithine were downregulated, and creatine and glutamate were upregulated with increasing APACHE II scores.
Metabolites involved in energy metabolism that were altered with a high (> 4) mNUTRIC score included N-acetylglucosamine (increased) and gluconate (decreased).</p><p><b>Conclusion:</b> Plasma HRM identified significant associations between two commonly used illness severity scores and metabolic processes involving steroid biosynthesis, the gut microbiome, skeletal muscle, and amino acid, vitamin, and energy metabolism in adult critically ill patients.</p><p>Hilary Winthrop, MS, RD, LDN, CNSC<sup>1</sup>; Megan Beyer, MS, RD, LDN<sup>2</sup>; Jeroen Molinger, PhDc<sup>3</sup>; Suresh Agarwal, MD<sup>4</sup>; Paul Wischmeyer, MD, EDIC, FCCM, FASPEN<sup>5</sup>; Krista Haines, DO, MA<sup>4</sup></p><p><sup>1</sup>Duke Health, Durham, NC; <sup>2</sup>Duke University School of Medicine- Department of Anesthesiology, Durham, NC; <sup>3</sup>Duke University Medical Center - Department of Anesthesiology - Duke Heart Center, Raleigh, NC; <sup>4</sup>Duke University School of Medicine, Durham, NC; <sup>5</sup>Duke University Medical School, Durham, NC</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Little evidence currently exists on the effect of body mass index (BMI) on resting energy expenditure (REE). Current clinical practice guidelines are based primarily on expert opinion and provide a wide range of calorie recommendations without delineations specific to BMI categories, making it difficult for the clinician to accurately prescribe calorie needs for their hospitalized patients. This abstract utilizes metabolic cart data from studies conducted at a large academic healthcare system to investigate trends within BMI and REE.</p><p><b>Methods:</b> A pooled cohort of hospitalized patients was compiled from three clinical trials where metabolic cart measurements were collected. In all three studies, indirect calorimetry was initially conducted in the intensive care setting, with follow up measurements conducted as clinically able. 
Variables included in the analysis were measured resting energy expenditure (mREE) in total kcals, kcals per kilogram of body weight on ICU admission, respiratory quotient, days post ICU admission, and demographic and clinical characteristics. ANOVA tests were utilized to analyze continuous data.</p><p><b>Results:</b> A total of 165 patients were included in the final analysis with 338 indirect calorimetry measurements. Disease groups included COVID-19 pneumonia, non-COVID respiratory failure, surgical ICU, cardiothoracic surgery, and trauma. The average age of patients was 58 years, with 96 males (58.2%) and 69 females (41.8%), and an average BMI of 29.0 kg/m<sup>2</sup>. The metabolic cart measurements on average were taken on day 8 post ICU admission (ranging from day 1 to day 61). See Table 1 for more demographics and clinical characteristics. Indirect calorimetry measurements were grouped into three BMI categories: BMI ≤ 29.9 (normal), BMI 30-39.9 (obese), and BMI ≥ 40 (super obese). ANOVA analyses showed statistical significance among the three BMI groups in both total kcals (p < 0.001) and kcals per kg (p < 0.001). The normal BMI group had an average mREE of 1632 kcals (range of 767 to 4023), compared to 1868 kcals (range of 1107 to 3754) in the obese BMI group, and 2004 kcals (range of 1219 to 3458) in the super obese BMI group. Similarly, when analyzing kcals per kg, the normal BMI group averaged 23.3 kcals/kg, the obese BMI group 19.8, and the super obese BMI group 16.3.</p><p><b>Conclusion:</b> Without access to a metabolic cart to accurately measure REE, the majority of nutrition clinicians are left to rely on estimations. Current clinical guidelines and published data do not provide the guidance that is necessary to accurately feed many hospitalized patients.
This current analysis only scratches the surface of the metabolic demands of different patient populations based on their BMI status, especially given the wide ranges of energy expenditures. Robust studies are needed to further elucidate the relationships between BMI and REE in different disease states.</p><p><b>Table 1.</b> Demographics and Clinical Characteristics.</p><p></p><p></p><p><b>Figure 1.</b> Average Measured Resting Energy Expenditure (in Total Kcals), by BMI Group.</p><p></p><p><b>Figure 2.</b> Average Measured Resting Energy Expenditure (in kcals/kg), by BMI Group.</p><p>Carlos Reyes Torres, PhD, MSc<sup>1</sup>; Daniela Delgado Salgado, Dr<sup>2</sup>; Sergio Diaz Paredes, Dr<sup>1</sup>; Sarish Del Real Ordoñez, Dr<sup>1</sup>; Eva Willars Inman, Dr<sup>1</sup></p><p><sup>1</sup>Hospital Oncológico de Coahuila (Oncological Hospital of Coahuila), Saltillo, Coahuila de Zaragoza; <sup>2</sup>ISSSTE, Saltillo, Coahuila de Zaragoza</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Chemotherapy is one of the principal treatments in cancer. Some degree of toxicity is described in 98% of patients. Changes in body composition are frequent and are related to worse outcomes. Low muscle mass is associated with chemotherapy toxicity in observational studies. Phase angle (PhA) is an indicator of cell integrity and positively correlates with adequate nutritional status and muscle mass. Few studies have evaluated the association of PhA with chemotherapy toxicity. The aim of this study was to evaluate the association of PhA and body composition with chemotherapy toxicity in cancer patients.</p><p><b>Methods:</b> A prospective cohort study was conducted in adult patients with solid neoplastic disease receiving first-line systemic treatment. The subjects were evaluated at the first chemotherapy treatment using bioelectrical impedance analysis with an RJL device, according to the standardized technique.
The outcome was chemotherapy toxicity during the first 4 cycles of chemotherapy and its association with PhA and body composition. Toxicity was evaluated using the National Cancer Institute (NCI) Common Terminology Criteria for Adverse Events, version 5.0. A PhA < 4.7 was considered low according to other studies.</p><p><b>Results:</b> A total of 54 patients were evaluated and included in the study. The most common cancer diagnoses were breast cancer (40%), gastrointestinal tumors (33%), and lung cancer (18%). Chemotherapy toxicity occurred in 46% of the patients. The most common adverse effects were gastrointestinal (48%), blood disorders (32%), and metabolic disorders (40%). There were statistically significant differences in PhA between patients with chemotherapy toxicity and patients without adverse effects: 4.45º (3.08-4.97) vs 6.07º (5.7-6.2), respectively, p value < 0.001. PhA was associated with the risk of chemotherapy toxicity (HR 8.7, 95% CI 6.1-10.7; log-rank test p = 0.02).</p><p><b>Conclusion:</b> PhA was associated with the risk of chemotherapy toxicity in cancer patients.</p><p>Lizl Veldsman, RD, M Nutr, BSc Dietetics<sup>1</sup>; Guy Richards, MD, PhD<sup>2</sup>; Carl Lombard, PhD<sup>3</sup>; Renée Blaauw, PhD, RD<sup>1</sup></p><p><sup>1</sup>Division of Human Nutrition, Department of Global Health, Faculty of Medicine & Health Sciences, Stellenbosch University, Cape Town, Western Cape; <sup>2</sup>Department of Surgery, Division of Critical Care, Faculty of Health Sciences, University of the Witwatersrand, Johannesburg, Gauteng; <sup>3</sup>Division of Epidemiology and Biostatistics, Department of Global Health, Faculty of Medicine and Health Sciences, Stellenbosch University, Cape Town, Western Cape</p><p><b>Financial Support:</b> Fresenius Kabi JumpStart Research Grant.</p><p><b>Background:</b> Critically ill patients lose a significant amount of muscle mass over the first ICU week. 
We sought to determine the effect of bolus amino acid (AA) supplementation on the urea-to-creatinine ratio (UCR) trajectory over time, and whether UCR correlates with histological myofiber cross-sectional area (CSA) as a potential surrogate marker of muscle mass.</p><p><b>Methods:</b> This was a secondary analysis of data from a registered clinical trial (ClinicalTrials.gov NCT04099108) undertaken in a predominantly trauma surgical ICU. Participants were randomly assigned to two groups, both of which received standard care nutrition (SCN) and mobilisation. Study participants randomised to the intervention group also received a bolus AA supplement, with a 45-minute in-bed cycling session, from average ICU day 3 for a mean of 6 days. The change in vastus lateralis myofiber CSA, measured from pre-intervention (average ICU day 2) to post-intervention (average ICU day 8), was assessed through biopsy and histological analysis. A linear mixed-effects regression model was used to compare the mean daily UCR profiles between study groups from ICU day 0 to day 10. As a sensitivity analysis, we adjusted for disease severity on admission (APACHE II and SOFA scores), daily fluid balance, and the presence of acute kidney injury (AKI). A Spearman correlation was used to compare the UCR on ICU day 2 (pre-intervention) and ICU day 8 (post-intervention) with the corresponding myofiber CSA.</p><p><b>Results:</b> A total of 50 enrolled participants were randomised to the study intervention and control groups in a 1:1 ratio. The control and intervention groups received, on average, 87.62 ± 32.18 and 85.53 ± 29.29 grams of protein per day (1.26 ± 0.41 and 1.29 ± 0.40 g/kg/day, respectively) from SCN, and the intervention group an additional 30.43 ± 5.62 grams of AA (0.37 ± 0.06 g/kg protein equivalents) from the AA supplement. At baseline, the mean UCRs for the control (75.6 ± 31.5) and intervention (63.8 ± 27.1) groups were similar. 
Mean UCR increased daily from baseline in both arms, but at a faster rate in the intervention arm with a significant intervention effect (p = 0.0127). The UCR for the intervention arm at day 7 and 8 was significantly higher by 21 and 22 units compared to controls (p = 0.0214 and p = 0.0215, respectively). After day 7 the mean daily UCR plateaued in the intervention arm, but not in the controls. Adjusting for disease severity, daily fluid balance, and AKI did not alter the intervention effect. A significant negative association was found between UCR and myofiber CSA (r = -0.39, p = 0.011) at ICU day 2 (pre-intervention), but not at day 8 (post-intervention) (r = 0.23, p = 0.153).</p><p><b>Conclusion:</b> Bolus amino acid supplementation significantly increases the UCR during the first ICU week, thereafter plateauing. UCR at baseline may be an indicator of muscle status.</p><p></p><p><b>Figure 1.</b> Change in Urea-to-Creatinine Ratio (UCR) Over ICU Days in the Control and Intervention Group. Error bars Represent 95% Confidence Intervals (CIs).</p><p>Paola Renata Lamoyi Domínguez, MSc<sup>1</sup>; Iván Osuna Padilla, PhD<sup>2</sup>; Lilia Castillo Martínez, PhD<sup>3</sup>; Josué Daniel Cadeza-Aguilar, MD<sup>2</sup>; Martín Ríos-Ayala, MD<sup>2</sup></p><p><sup>1</sup>UNAM, National Autonomous University of Mexico, Mexico City, Distrito Federal; <sup>2</sup>National Institute of Respiratory Diseases, Mexico City, Distrito Federal; <sup>3</sup>National Institute of Medical Sciences and Nutrition Salvador Zubirán, Mexico City, Distrito Federal</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Non-defecation (ND) is highly prevalent in critically ill patients on mechanical ventilation (MV) and has been reported in up to 83% of cases. This disorder is associated with high morbidity and mortality rates. 
Most existing research has focused on the association between clinical data and non-defecation; however, there is a lack of evidence regarding its association with dietary factors. We aimed to analyze the association of dietary fiber in enteral nutrition and the amount of fluids administered through enteral and parenteral routes with defecation during the first 6 days of MV in critically ill patients with pneumonia and other lung manifestations.</p><p><b>Methods:</b> We conducted a longitudinal analysis of MV patients receiving enteral nutrition (EN) at a tertiary care hospital in Mexico City between May 2023 and April 2024. The inclusion criteria were age >18 years, MV, admission to the respiratory intensive care unit (ICU) for pneumonia or other lung manifestations, and nutritional assessment performed during the first 24 h after ICU admission. Patients who required parenteral nutrition or who had major surgery, traumatic brain injury, or neuromuscular disorders were excluded from this study. A nutritional assessment, including the NUTRIC score, SOFA and APACHE II assessments, and an estimation of energy-protein requirements, was performed by trained dietitians within the first 24 hours after ICU admission. During each day of follow-up (0 to 6 days), we recorded the amount of fiber provided in EN, the volume of infusion fluids through enteral and parenteral routes, and the medical prescription of opioids, sedatives, neuromuscular blockers, and vasopressors. ND was defined as >6 days without defecation from ICU admission. The differences between ND and defecation were also assessed. The association of ND with dietary factors was examined using discrete-time survival analysis.</p><p><b>Results:</b> Seventy-four patients were included; ND was observed in 40 patients (54%). The non-defecation group had a longer ICU length of stay, and 50% of this group did not have a first defecation until day 10. 
No differences in fiber provision or volume of infusion fluids were observed between the groups. In multivariate analysis, no associations between ND and fiber (fiber intake 10 to 20 g per day, OR 1.17, 95% CI 0.41-3.38, p = 0.29) or total fluids (fluid intake 25 to 30 ml/kg/d, OR 1.85, 95% CI 0.44-7.87, p = 0.404) were observed.</p><p><b>Conclusion:</b> Non-defecation affected 54% of the study population. Although fiber and fluids are considered a treatment for non-defecation, we did not find an association in critically ill patients.</p><p><b>Table 1.</b> Demographic and Clinical Characteristics by Groups.</p><p></p><p><b>Table 2.</b> Daily Comparison of Dietary Factors.</p><p></p><p>Andrea Morand, MS, RDN, LD<sup>1</sup>; Osman Mohamed Elfadil, MBBS<sup>1</sup>; Kiah Graber, RDN<sup>1</sup>; Yash Patel, MBBS<sup>1</sup>; Suhena Patel, MBBS<sup>1</sup>; Chloe Loersch, RDN<sup>1</sup>; Isabelle Wiggins, RDN<sup>1</sup>; Anna Santoro, MS, RDN<sup>1</sup>; Natalie Johnson, MS<sup>1</sup>; Kristin Eckert, MS, RDN<sup>1</sup>; Dana Twernbold, RDN<sup>1</sup>; Dacia Talmo, RDN<sup>1</sup>; Elizabeth Engel, RRT, LRT<sup>1</sup>; Avery Erickson, MS, RDN<sup>1</sup>; Alex Kirby, MS, RDN<sup>1</sup>; Mackenzie Vukelich, RDN<sup>1</sup>; Kate Sandbakken, RDN<sup>1</sup>; Victoria Vasquez, RDN<sup>1</sup>; Manpreet Mundi, MD<sup>1</sup></p><p><sup>1</sup>Mayo Clinic, Rochester, MN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Current guidelines for nutrition support in critically ill patients recommend the utilization of indirect calorimetry (IC) to determine energy needs. However, IC testing is limited at many institutions due to accessibility, labor, and costs, which leads to reliance on predictive equations to determine caloric targets. 
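Two widely used predictive equations can be sketched as follows; the coefficients are the published Mifflin-St Jeor and original Harris-Benedict formulas, while the patient values are hypothetical.

```python
# Illustrative sketch of two common REE predictive equations.
# Coefficients are the published formulas; patient values are hypothetical.

def mifflin_st_jeor(weight_kg, height_cm, age_yr, male):
    """Resting energy expenditure (kcal/day), Mifflin-St Jeor."""
    ree = 10 * weight_kg + 6.25 * height_cm - 5 * age_yr
    return ree + 5 if male else ree - 161

def harris_benedict(weight_kg, height_cm, age_yr, male):
    """Basal energy expenditure (kcal/day), original Harris-Benedict."""
    if male:
        return 66.47 + 13.75 * weight_kg + 5.003 * height_cm - 6.755 * age_yr
    return 655.1 + 9.563 * weight_kg + 1.850 * height_cm - 4.676 * age_yr

# Hypothetical 61-year-old male, 170 cm, 86 kg
print(mifflin_st_jeor(86, 170, 61, male=True),
      harris_benedict(86, 170, 61, male=True))
```

Differences of a few hundred kcal/day between such estimates and measured REE are exactly what indirect calorimetry is meant to resolve.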
A quality improvement (QI) initiative was implemented to assess the impact on nutrition care when IC is routinely completed.</p><p><b>Methods:</b> A prospective QI project in selected medical and surgical intensive care units (ICU) included critically ill patients assessed by the dietitian within 24-48 hours of consult order or by hospital day 4. Patients with contraindications to IC were excluded, including those requiring ECMO, CRRT, MARS, or FiO2 > 55, as well as spontaneously breathing patients requiring significant supplemental oxygen. The initial caloric target was established utilizing predictive equations, which is the standard of care at our institution. After day 4 of the ICU stay, IC measurements, predictive equations, and weight-based nomograms were collected. The predictive equations utilized included Harris-Benedict (HB) - basal, adjusted HB (75% of basal when body mass index (BMI) > 30), Penn State if ventilated, Mifflin St. Jeor (MSJ), revised HB, and revised HB adjusted (75% of basal when BMI > 30). Additional demographic, anthropometric, and clinical data were collected.</p><p><b>Results:</b> Patients (n = 85) were predominantly male (n = 53, 62.4%), admitted to the surgical ICU (n = 57, 67.1%), and overweight (mean BMI 29.8 kg/m<sup>2</sup>), and the average age was 61.3 years (SD 16.5). At the time of the IC test, the median ICU length of stay was 6 days; 77.6% (n = 66) were supported with mechanical ventilation, and median ventilator days were 4 days (Table 1). Mean IC-measured REE was compared to predictive equations, showing that except for weight-based nomogram high caloric needs (p = 0.3615), all equations were significantly lower than IC (p < 0.0001). Median absolute differences from IC for the evaluated predictive equations are shown in Figure 1. After REE was measured, caloric goals increased significantly for patients on enteral (EN) and parenteral nutrition (PN) (p = 0.0016 and p = 0.05, respectively). 
In enterally fed patients, the mean calorie goal before REE was 1655.4 kcal (SD 588) and after REE 1917.6 kcal (SD 528.6), an average increase of 268.4 kcal/day; in parenterally fed patients, the mean calorie goal before REE was 1395.2 kcal (SD 313.6) and after REE 1614.1 kcal (SD 239.3), an average increase of 167.5 kcal (Table 2). The mean REE per BMI category, per kg of actual body weight, was BMI < 29.9 = 25.7 ± 7.9 kcal/kg, BMI 30-34.9 = 20.3 ± 3.8 kcal/kg, BMI 35-39.9 = 22.8 ± 4.6 kcal/kg, and BMI ≥ 40 = 16.3 ± 2.9 kcal/kg (25.4 ± 10.5 kcal/kg of ideal body weight). Figure 2 illustrates the average daily calorie need broken down by BMI for IC and the examined predictive equations.</p><p><b>Conclusion:</b> There was a significant difference between IC measurements and the various predictive equations, except for weight-based high-estimated calorie needs. Nutrition goals changed significantly in response to IC measurements. It is recommended that we expand the use of IC in the critically ill population at our institution. 
In settings where IC is not possible, weight-based nomograms should be utilized.</p><p><b>Table 1.</b> Baseline Demographics and Clinical Characteristics.</p><p></p><p><b>Table 2.</b> Nutrition Support.</p><p></p><p></p><p><b>Figure 1.</b> Difference in Daily Calorie Estimation Utilizing IC Compared to Predictive Equations.</p><p></p><p><b>Figure 2.</b> RMR by IC and Other Predictive Equations by BMI.</p><p><b>GI, Obesity, Metabolic, and Other Nutrition Related Concepts</b></p><p>Suhena Patel, MBBS<sup>1</sup>; Osman Mohamed Elfadil, MBBS<sup>1</sup>; Yash Patel, MBBS<sup>1</sup>; Chanelle Hager, RN<sup>1</sup>; Manpreet Mundi, MD<sup>1</sup>; Ryan Hurt, MD, PhD<sup>1</sup></p><p><sup>1</sup>Mayo Clinic, Rochester, MN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Chronic capillary leak syndrome is a rare but potentially fatal side effect of immunotherapy for some malignancies, mainly manifested with intractable generalized edema and often refractory hypotension. An idiopathic type of the syndrome is also known. It can be diagnosed by exclusion in patients with a single or recurrent episode of intravascular hypovolemia or generalized edema, primarily manifested by the diagnostic triad of hypotension, hemoconcentration, and hypoalbuminemia in the absence of an identifiable alternative cause. Supportive care with a role for steroids remains the standard treatment. In capillary leak syndrome secondary to cancer immune therapy, discontinuing the offending agent is typically considered. This relatively rare syndrome can be associated with significant clinical challenges. This clinical case report focuses on aspects of nutrition care.</p><p><b>Methods:</b> A 45-year-old male with a past medical history of hypertension, pulmonary tuberculosis in childhood, and rectal cancer was admitted for evaluation of anasarca. He was first diagnosed with moderately differentiated invasive adenocarcinoma of rectum IIIb (cT3, cN1, cM0) in October 2022. 
As initial therapy, he was enrolled in a clinical trial. He received 25 cycles of immunotherapy with the study drug Vudalimab (PD1/CTLA4 bispecific antibody), achieving a complete clinical response without additional chemotherapy, radiation, or surgery. He unfortunately developed extensive capillary leak syndrome, manifested by recurrent anasarca, chylous ascites, and pleural effusions, beginning in November 2023. His treatment was also complicated by the development of thyroiditis and insulin-dependent diabetes. The patient most recently presented with abdominal fullness, ascites, and peripheral edema that did not improve despite diuretic therapy. A diagnostic and therapeutic paracentesis was performed, and chylous ascites was revealed. Two weeks later, the patient presented with re-accumulation of ascites and worsening anasarca with pleural and pericardial effusion. A PET-CT was negative for malignant lesions but revealed increased uptake along the peritoneal wall, suggestive of peritonitis. A lymphangiogram performed for further evaluation revealed no gross leak/obstruction; however, this study could not rule out a microleak from increased capillary permeability. He required bilateral pleural and peritoneal drains (output ranged from 0.5 to 1 L daily). A diagnosis of capillary leak syndrome was made. In addition to octreotide, immunosuppression therapy was initiated with IV methylprednisolone (40 mg BID) followed by a transition to oral steroids (60 mg PO); however, the patient's symptoms reappeared with the reduction in prednisone dose and the transition to oral steroids. His immunosuppression regimen was modified to include a trial of weekly IVIG and IV albumin twice daily. From a nutritional perspective, he was initially on a routine oral diet. However, his drain output increased, particularly after fatty food consumption, so he was switched to a low-fat (40 g/day), high-protein diet to prevent worsening chylous ascites. 
In the setting of worsening anasarca and moderate malnutrition based on ASPEN criteria, along with clinically significant muscle loss, he was started on TPN. A no-fat diet was initiated to minimize lymphatic flow, with subsequent improvement in his chest tube output volume, followed by a transition to home parenteral nutrition with a mixed-oil lipid emulsion and an oral diet.</p><p><b>Results:</b> None Reported.</p><p><b>Conclusion:</b> Chronic capillary/lymphatic leak syndrome can be challenging and may necessitate dietary modification. Along with dietary changes to significantly reduce oral fat intake, short- or long-term PN can be considered.</p><p>Kishore Iyer, MBBS<sup>1</sup>; Francisca Joly, MD, PhD<sup>2</sup>; Donald Kirby, MD, FACG, FASPEN<sup>3</sup>; Simon Lal, MD, PhD, FRCP<sup>4</sup>; Kelly Tappenden, PhD, RD, FASPEN<sup>2</sup>; Palle Jeppesen, MD, PhD<sup>5</sup>; Nader Youssef, MD, MBA<sup>6</sup>; Mena Boules, MD, MBA, FACG<sup>6</sup>; Chang Ming, MS, PhD<sup>6</sup>; Tomasz Masior, MD<sup>6</sup>; Susanna Huh, MD, MPH<sup>7</sup>; Tim Vanuytsel, MD, PhD<sup>8</sup></p><p><sup>1</sup>Icahn School of Medicine at Mount Sinai, New York, NY; <sup>2</sup>Nutritional Support, Hôpital Beaujon, Paris, Ile-de-France; <sup>3</sup>Department of Intestinal Failure and Liver Diseases, Cleveland, OH; <sup>4</sup>Salford Royal NHS Foundation Trust, Salford, England; <sup>5</sup>Department of Intestinal Failure and Liver Diseases, Rigshospitalet, Copenhagen, Denmark, Hovedstaden; <sup>6</sup>Ironwood Pharmaceuticals, Basel, Basel-Stadt; <sup>7</sup>Ironwood Pharmaceuticals, Boston, MA; <sup>8</sup>University Hospitals Leuven, Leuven, Brabant Wallon</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> American College of Gastroenterology 2024, October 25-30, 2024, Philadelphia, Pennsylvania.</p><p><b>Financial Support:</b> None Reported.</p><p><b>International Poster of Distinction</b></p><p>Francisca Joly, MD, PhD<sup>1</sup>; Tim Vanuytsel, MD, PhD<sup>2</sup>; Donald Kirby, MD, FACG, 
FASPEN<sup>3</sup>; Simon Lal, MD, PhD, FRCP<sup>4</sup>; Kelly Tappenden, PhD, RD, FASPEN<sup>1</sup>; Palle Jeppesen, MD, PhD<sup>5</sup>; Federico Bolognani, MD, PhD<sup>6</sup>; Nader Youssef, MD, MBA<sup>6</sup>; Carrie Li, PhD<sup>6</sup>; Reda Sheik, MPH<sup>6</sup>; Isabelle Statovci, BS, CH<sup>6</sup>; Susanna Huh, MD, MPH<sup>7</sup>; Kishore Iyer, MBBS<sup>8</sup></p><p><sup>1</sup>Nutritional Support, Hôpital Beaujon, Paris, Ile-de-France; <sup>2</sup>University Hospitals Leuven, Leuven, Brabant Wallon; <sup>3</sup>Department of Intestinal Failure and Liver Diseases, Cleveland, OH; <sup>4</sup>Salford Royal NHS Foundation Trust, Salford, England; <sup>5</sup>Department of Intestinal Failure and Liver Diseases, Rigshospitalet, Copenhagen, Denmark, Hovedstaden; <sup>6</sup>Ironwood Pharmaceuticals, Basel, Basel-Stadt; <sup>7</sup>Ironwood Pharmaceuticals, Boston, MA; <sup>8</sup>Icahn School of Medicine at Mount Sinai, New York, NY</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> Digestive Disease Week 2024, May 18 - 21, 2024, Washington, US.</p><p><b>Financial Support:</b> None Reported.</p><p>Tim Vanuytsel, MD, PhD<sup>1</sup>; Simon Lal, MD, PhD, FRCP<sup>2</sup>; Kelly Tappenden, PhD, RD, FASPEN<sup>3</sup>; Donald Kirby, MD, FACG, FASPEN<sup>4</sup>; Palle Jeppesen, MD, PhD<sup>5</sup>; Francisca Joly, MD, PhD<sup>3</sup>; Tomasz Masior, MD<sup>6</sup>; Patricia Valencia, PharmD<sup>7</sup>; Chang Ming, MS, PhD<sup>6</sup>; Mena Boules, MD, MBA, FACG<sup>6</sup>; Susanna Huh, MD, MPH<sup>7</sup>; Kishore Iyer, MBBS<sup>8</sup></p><p><sup>1</sup>University Hospitals Leuven, Leuven, Brabant Wallon; <sup>2</sup>Salford Royal NHS Foundation Trust, Salford, England; <sup>3</sup>Nutritional Support, Hôpital Beaujon, Paris, Ile-de-France; <sup>4</sup>Department of Intestinal Failure and Liver Diseases, Cleveland, OH; <sup>5</sup>Department of Intestinal Failure and Liver Diseases, Rigshospitalet, Copenhagen, Denmark, Hovedstaden; <sup>6</sup>Ironwood 
Pharmaceuticals, Basel, Basel-Stadt; <sup>7</sup>Ironwood Pharmaceuticals, Boston, MA; <sup>8</sup>Icahn School of Medicine at Mount Sinai, New York, NY</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> American College of Gastroenterology 2024, October 25-30, 2024, Philadelphia, Pennsylvania.</p><p><b>Financial Support:</b> None Reported.</p><p>Boram Lee, MD<sup>1</sup>; Ho-Seong Han, PhD<sup>1</sup></p><p><sup>1</sup>Seoul National University Bundang Hospital, Seoul, Seoul-t'ukpyolsi</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Pancreatic cancer is one of the most fatal malignancies, with a 5-year survival rate of less than 10%. Despite advancements in treatment, its incidence is increasing, driven by aging populations and increasing obesity rates. Obesity is traditionally considered a negative prognostic factor for many cancers, including pancreatic cancer. However, the "obesity paradox" suggests that obesity might be associated with better outcomes in certain diseases. This study investigated the effect of obesity on the survival of long-term pancreatic cancer survivors after pancreatectomy.</p><p><b>Methods:</b> A retrospective analysis was conducted on 404 patients with pancreatic ductal adenocarcinoma (PDAC) who underwent surgery between January 2004 and June 2022. Patients were classified into non-obese (BMI 18.5-24.9) (n = 313) and obese (BMI ≥ 25.0) (n = 91) groups. The data collected included demographic, clinical, perioperative, and postoperative information. Survival outcomes (overall survival [OS], recurrence-free survival [RFS], and cancer-specific survival [CSS]) were analyzed using Kaplan-Meier curves and Cox regression models. A subgroup analysis examined the impact of the visceral fat to subcutaneous fat ratio (VSR) on survival within the obese cohort.</p><p><b>Results:</b> Obese patients (n = 91) had a significantly better 5-year OS (38.9% vs. 27.9%, p = 0.040) and CSS (41.4% vs. 
33%, p = 0.047) than non-obese patients. RFS did not differ significantly between the groups. Within the obese cohort, a lower VSR was associated with improved survival (p = 0.012), indicating the importance of fat distribution in outcomes.</p><p><b>Conclusion:</b> Obesity is associated with improved overall and cancer-specific survival in patients with pancreatic cancer undergoing surgery, highlighting the potential benefits of a nuanced approach to managing obese patients. The distribution of adipose tissue, specifically higher subcutaneous fat relative to visceral fat, further influences survival, suggesting that tailored treatment strategies could enhance the outcomes.</p><p>Nicole Nardella, MS<sup>1</sup>; Nathan Gilchrist, BS<sup>1</sup>; Adrianna Oraiqat, BS<sup>1</sup>; Sarah Goodchild, BS<sup>1</sup>; Dena Berhan, BS<sup>1</sup>; Laila Stancil, HS<sup>1</sup>; Jeanine Milano, BS<sup>1</sup>; Christina Santiago, BS<sup>1</sup>; Melissa Adams, PA-C<sup>1</sup>; Pamela Hodul, MD<sup>1</sup></p><p><sup>1</sup>Moffitt Cancer Center, Tampa, FL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Pancreatic cancer (PC) is a devastating diagnosis with 66,440 new cases and 51,750 deaths estimated in 2024. The prevalence of malnutrition in patients with cancer has been reported to range from 30-85% depending on patient age, cancer type, and stage of disease. Specifically, PC patients frequently present with malnutrition which can lead to negative effects on quality of life and tumor therapy. We hypothesize that increased awareness of early nutritional intervention for PC patients has led to high utilization of dietitian consultations at our tertiary cancer center.</p><p><b>Methods:</b> This IRB-exempt retrospective review included newly diagnosed, treatment naïve PC patients presenting to our institution in 2021-2023 (n = 701). We define newly diagnosed as having pathologically confirmed adenocarcinoma within 30 days of clinic presentation. 
Patients were screened for weight loss and risk of malnutrition using the validated Malnutrition Screening Tool (MST) (89.5% positive predictive value) at the initial consultation and were referred to a dietitian based on risk or patient preference. Data was collected on demographics, disease stage (localized vs metastatic), tumor location (head/neck/uncinate, body/tail, multi-focal), presenting symptoms (percent weight loss, abdominal pain, bloating, nausea/vomiting, fatigue, change in bowel habits), experience of jaundice, pancreatitis, and gastric outlet obstruction, and dietitian consultation. Descriptive variables and Fisher's exact test were used to report outcomes.</p><p><b>Results:</b> The majority of patients were male (54%) with a median age of 70 (27-95). About half of the patients had localized disease (54%), with the primary tumor location in the head/neck/uncinate region (57%). Patients with head/neck/uncinate tumor location mostly had localized disease (66%), while patients with body/tail tumors tended to have metastatic disease (63%). See Table 1 for further demographics. Unintentional weight loss was experienced by 66% of patients (n = 466), 69% of localized patients (n = 261), and 64% of metastatic patients (n = 205). Patients with localized disease reported a 12% weight loss over a median of 3 months, while metastatic patients reported a 10% weight loss over a median of 5 months. Of the localized patients, the majority presented with symptoms of abdominal pain (66%), nausea/vomiting/fatigue (61%), and change in bowel habits (44%). Presenting symptoms of the metastatic patients were similar (see Table 2). Tumor location was not significantly associated with presenting symptoms. Dietitian consults occurred for 67% (n = 473) of the patient population, 77% for those with localized disease and 57% for those with metastatic disease. 
Of those with reported weight loss, 74% (n = 343) had a dietitian consultation.</p><p><b>Conclusion:</b> Overall, a high proportion of newly diagnosed, treatment-naïve PC patients presented with malnutrition. Patients with localized disease and tumors located in the head/neck/uncinate region experienced the most gastrointestinal symptoms (nausea, vomiting, change in bowel habits, and fatigue). Early implementation of a proactive nutritional screening program resulted in increased awareness of malnutrition and referral for nutritional intervention for newly diagnosed PC patients.</p><p><b>Table 1.</b> Demographics and Disease Characteristics.</p><p></p><p><b>Table 2.</b> Presenting Symptoms.</p><p></p><p>Nicole Nardella, MS<sup>1</sup>; Nathan Gilchrist, BS<sup>1</sup>; Adrianna Oraiqat, BS<sup>1</sup>; Sarah Goodchild, BS<sup>1</sup>; Dena Berhan, BS<sup>1</sup>; Laila Stancil, HS<sup>1</sup>; Jeanine Milano, BS<sup>1</sup>; Christina Santiago, BS<sup>1</sup>; Melissa Adams, PA-C<sup>1</sup>; Pamela Hodul, MD<sup>1</sup></p><p><sup>1</sup>Moffitt Cancer Center, Tampa, FL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Pancreatic cancer (PC) is an aggressive disease, with a 5-year survival rate of 13%. Symptoms occur late in the disease course, leading to approximately 50% of patients presenting with metastatic disease. New-onset diabetes is often one of the first symptoms of PC, with diagnosis occurring up to 3 years before cancer diagnosis. We hypothesize that increasing awareness of PC prevalence in diabetic patients, both new-onset and pre-existing, may lead to early PC diagnosis.</p><p><b>Methods:</b> This IRB-exempt retrospective review included new PC patients presenting to our institution in 2021-2023 with a diabetes diagnosis (n = 458). We define new-onset diabetes as having been diagnosed within 3 years prior to pathologically confirmed adenocarcinoma. 
We define newly diagnosed PC as having pathologically confirmed adenocarcinoma within 30 days of clinic presentation. Data was collected on demographics, staging (localized vs metastatic), tumor location (head/neck/uncinate, body/tail, multi-focal), treatment initiation, diabetes onset (new-onset vs pre-existing), diabetes regimen, and weight loss. Descriptive variables were used to report outcomes.</p><p><b>Results:</b> In the study period, 1,310 patients presented to our institution. Of those, 35% had a diabetes diagnosis (n = 458). The majority of patients were male (61%) with age at PC diagnosis of 69 (41-92). Patients mostly had localized disease (57%) with primary tumor location in the head/neck/uncinate region (59%). New-onset diabetes was present in 31% of diabetics, 11% of all new patients, with 63% having localized disease (79% head/neck/uncinate) and 37% metastatic (66% body/tail). Of those with pre-existing diabetes (69%), 54% had localized disease (69% head/neck/uncinate) and 46% had metastatic disease (53% body/tail). See Table 1 for further demographic/disease characteristics. Abrupt worsening of diabetes was seen in 10% (n = 31) of patients with pre-existing diabetes, and 12% had a change in their regimen prior to PC diagnosis. Hence 13% (175/1,310) of all new patients presented with either new-onset or worsening diabetes. Weight loss was present in 75% (n = 108) of patients with new-onset diabetes, with a median of 14% weight loss (3%-38%) over 12 months (1-24). Alternatively, weight loss was present in 66% (n = 206) of patients with pre-existing diabetes, with a median of 14% weight loss (4%-51%) over 6 months (0.5-18). Diabetes medication was as follows: 41% oral, 30% insulin, 20% both oral and insulin, 10% no medication. Of those patients with new-onset diabetes, 68% were diagnosed within 1 year of PC diagnosis and 32% were diagnosed within 1-3 years of PC diagnosis. 
Of those within 1 year of diagnosis, 68% had localized disease with 81% having head/neck/uncinate tumors. Of the metastatic (31%), 73% had body/tail tumors. For patients with diabetes diagnosis within 1-3 years of PC diagnosis, 52% had localized disease (75% head/neck/uncinate) and 48% had metastatic disease (59% body/tail). See Table 2 for further characteristics.</p><p><b>Conclusion:</b> Overall, approximately one-third of new patients presenting with PC at our institution had diabetes, and new-onset diabetes was present in one-third of those patients. The majority of diabetes patients presented with localized head/neck/uncinate tumors. When comparing new-onset vs pre-existing diabetes, new-onset tended to experience greater weight loss over a longer time with more localized disease than pre-existing diabetes patients. Patients with diabetes diagnosis within 1 year of PC diagnosis had more localized disease (head/neck/uncinate). Hence increased awareness of diabetes in relation to PC, particularly new onset and worsening pre-existing, may lead to early diagnosis.</p><p><b>Table 1.</b> Demographics and Disease Characteristics.</p><p></p><p><b>Table 2.</b> New-Onset Diabetes Characteristics.</p><p></p><p>Marcelo Mendes, PhD<sup>1</sup>; Gabriela Oliveira, RD<sup>2</sup>; Ana Zanini, RD, MSc<sup>2</sup>; Hellin dos Santos, RD, MSc<sup>2</sup></p><p><sup>1</sup>Cicatripelli, Belém, Para; <sup>2</sup>Prodiet Medical Nutrition, Curitiba, Parana</p><p><b>Encore Poster</b></p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> According to the NPUAP, a pressure injury (PI) is damage that occurs to the skin and/or underlying soft tissue, primarily over bony prominences, and may also be related to the use of medical devices. PIs can range from intact skin to deeper ulcers, affecting structures such as muscles and bones. 
The aim of this study was to report the experience of using a specialized supplement for wound healing in the treatment of a PI.</p><p><b>Methods:</b> This is a case report based on the experience of the nurses from Cicatripelli®. Data were collected from May to July 2024 through a review of medical records and photographs showing the wound's progression. The patient was a 69-year-old woman with COPD, diabetes mellitus, and systemic arterial hypertension, who denied smoking and alcohol consumption. She developed a stage 4 PI in the sacral region after 15 days of mechanical ventilation due to bacterial pneumonia and was admitted to a private clinic for treatment on May 2, 2024. Initial wound assessment: Measurements: 16.5x13x4cm (WxLxD); abundant purulent exudate with a foul odor; intact peripheral skin; mild to moderate pain; 75% granulation tissue and 25% liquefactive necrosis (slough) (Figure 1). The wound was cleansed with 0.1% polyhexamethylene biguanide (PHMB) solution, followed by conservative debridement, primary coverage with hydrophiber with 1.2% silver, and secondary coverage with cotton gauze and transparent film, with dressing changes scheduled every 72 hours. Supplementation (Correctmax – Prodiet Medical Nutrition) started on 05/20/2024, with a dosage of 2 sachets per day, containing 10 g of collagen peptide, 3 g of L-arginine, 612 mg of vitamin A, 16 mg of vitamin E, 508 mg of vitamin C, 30 mcg of selenium, and 16 mg of zinc.</p><p><b>Results:</b> On the 17th day of supplementation, the hydrophiber with silver dressing was replaced with PHMB-impregnated gauze, as the wound showed no signs of infection and demonstrated significant clinical improvement. Measurements: 8x6x2cm (WxLxD); moderate serohematic exudate; intact peripheral skin; 100% granulation tissue; significant improvement in pain and odor (Figure 2). 
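The percent reduction in wound area reported in this case can be computed from the width-by-length measurements; a minimal sketch using the initial and day-56 dimensions described in the report:

```python
# Percent reduction in wound surface area from width x length (cm)
# measurements, as tracked across dressing changes in this case report.

def area_reduction_pct(w0, l0, w1, l1):
    """Percent reduction in wound area, approximating area as width x length."""
    a0, a1 = w0 * l0, w1 * l1
    return 100 * (a0 - a1) / a0

# Initial 16.5 x 13 cm vs. day-56 5 x 3.5 cm
print(round(area_reduction_pct(16.5, 13, 5, 3.5), 1))
```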
On the 28th day, the dressing was switched to calcium and sodium alginate to optimize exudate control due to the appearance of mild dermatitis. Low-intensity laser therapy was applied, and a skin protective spray was used. Wound assessment: Measurements: 7x5.5x1.5 cm (WxLxD), with maintained characteristics (Figure 3). On the 56th day, the patient returned for dressing change and discharge instructions, as she could not continue the treatment due to a lack of resources. The approach remained the same with dressing changes every 3 days. Wound assessment: Measurements: 5x3.5x0.5 cm (WxLxD), with approximately 92% reduction in wound area, epithelialized margins, and maintained characteristics (Figure 4).</p><p><b>Conclusion:</b> Nutritional intervention with specific nutrient supplementation can aid in the management of complex wounds, serving as a crucial tool in the healing process and contributing to reduced healing time.</p><p></p><p><b>Figure 1.</b> Photo of the wound on the day of the initial assessment on 05/02/2024.</p><p></p><p><b>Figure 2.</b> Photo of the wound after 17 days of supplementation on 06/06/2024.</p><p></p><p><b>Figure 3.</b> Photo of the wound after 28 days of supplementation on 06/17/2024.</p><p></p><p><b>Figure 4.</b> Photo of the wound after 56 days of supplementation on 07/15/2024.</p><p>Ludimila Ribeiro, RD, MSc<sup>1</sup>; Bárbara Gois, RD, PhD<sup>2</sup>; Ana Zanini, RD, MSc<sup>3</sup>; Hellin dos Santos, RD, MSc<sup>3</sup>; Ana Paula Celes, MBA<sup>3</sup>; Flávia Corgosinho, PhD<sup>2</sup>; Joao Mota, PhD<sup>4</sup></p><p><sup>1</sup>School of Nutrition, Federal University of Goiás, Goiania, Goias; <sup>2</sup>School of Nutrition, Federal University of Goiás, Goiânia, Goias; <sup>3</sup>Prodiet Medical Nutrition, Curitiba, Parana; <sup>4</sup>Federal University of Goias, Goiania, Goias</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Postprandial blood glucose is considered an important risk factor for the 
development of macrovascular and microvascular diseases. Despite the use of hypoglycemic agents, patients with diabetes often experience postprandial hyperglycemia due to unbalanced meals. The aim of this study was to compare the effects on glycemic control of a low glycemic index formula used as a substitute for a standard breakfast in patients with type 2 diabetes.</p><p><b>Methods:</b> This randomized, placebo-controlled, crossover study included 18 individuals with type 2 diabetes. Participants were instructed to consume, in random order, either a nutritional formula or a typical Brazilian breakfast with the same caloric content for three consecutive weekdays in different weeks. The nutritional formula (200 mL) provided 200 kcal, with 20 g of carbohydrates, 8.8 g of protein, 9.4 g of fat (MUFA: 5.6 g, PUFA: 2.0 g), and 3.0 g of fiber (DiamaxIG – Prodiet Medical Nutrition), serving as a breakfast substitute. Both the nutritional formula and the standard breakfast were provided to the participants. During the two weeks of intervention, participants used continuous glucose monitoring sensors (Libre 2). Weight and height were measured to calculate body mass index (BMI), and medication use was monitored.</p><p><b>Results:</b> The sample was 61% female, with a mean age of 50.28 ± 12.58 years. The average blood glucose level was 187.13 ± 77.98 mg/dL and mean BMI was 29.67 ± 4.86 kg/m². All participants were taking metformin, and two were taking it concomitantly with insulin. There were no changes in medication doses or regimens during the study. The incremental area under the curve was significantly lower with the nutritional formula than with the standard breakfast (2,794.02 ± 572.98 vs. 4,461.55 ± 2,815.73, p = 0.01).</p><p><b>Conclusion:</b> The low glycemic index formula significantly reduced the postprandial glycemic response compared with a standard Brazilian breakfast in patients with type 2 diabetes. 
These findings suggest that incorporating low glycemic index meals could be an effective strategy for better managing postprandial blood glucose levels in this population, which may help mitigate the risk of developing macro- and microvascular complications.</p><p>Kirk Kerr, PhD<sup>1</sup>; Bjoern Schwander, PhD<sup>2</sup>; Dominique Williams, MD, MPH<sup>1</sup></p><p><sup>1</sup>Abbott Nutrition, Columbus, OH; <sup>2</sup>AHEAD GmbH, Bietigheim-Bissingen, Baden-Wurttemberg</p><p><b>Financial Support:</b> Abbott Nutrition.</p><p><b>Background:</b> According to the World Health Organization, obesity is a leading risk factor for global noncommunicable diseases such as diabetes, heart disease, and cancer. Weight cycling, often defined as intentionally losing and unintentionally regaining weight, is observed in people with obesity and can have adverse health and economic consequences. This study develops the first model of the health economic consequences of weight cycling in individuals with obesity, defined by a body-mass index (BMI) ≥ 30 kg/m².</p><p><b>Methods:</b> A lifetime state-transition model (STM) with monthly cycles was developed to simulate a cohort of individuals with obesity, comparing “weight cyclers” versus “non-cyclers”. The simulated patient cohort was assumed to have an average BMI of 35.5 kg/m², with 11% of patients having cardiovascular disease and 6% having type 2 diabetes (T2D). Key outcomes were the cost per obesity-associated event avoided, the cost per life-year (LY) gained, and the cost per quality-adjusted life year (QALY) gained, using a US societal perspective. Transition probabilities for obesity-associated diseases were informed by meta-analyses and were based on US disease-related base risks. Risks were adjusted by BMI-related, weight-cycling-related, and T2D-related relative risks (RR). BMI progression, health utilities, direct and indirect costs, and other population characteristics were informed by published US studies. 
Future costs and effects were discounted by 3% per year. Deterministic and probabilistic sensitivity analyses were performed to investigate the robustness of the results.</p><p><b>Results:</b> Over a simulated lifetime horizon, non-cyclers avoided 0.090 obesity-associated events, gained 0.602 LYs and 0.518 QALYs, and had total costs reduced by approximately $4,592 ($1,004 direct and $3,588 indirect costs) per person. Sensitivity analysis showed the model was most responsive to changes in patient age and cardiovascular disease morbidity and mortality risks. Non-cycling remained the cost-effective option across sensitivity analyses.</p><p><b>Conclusion:</b> The model shows that weight cycling has a major impact on the health of people with obesity, resulting in increased direct and indirect costs. The approach to chronic weight management programs should focus not only on weight reduction but also on weight maintenance, to avoid the increased risks associated with weight cycling.</p><p>Avi Toiv, MD<sup>1</sup>; Arif Sarowar, MSc<sup>2</sup>; Hope O'Brien, BS<sup>2</sup>; Thomas Pietrowsky, MS, RD<sup>1</sup>; Nemie Beltran, RN<sup>1</sup>; Yakir Muszkat, MD<sup>1</sup>; Syed-Mohammad Jafri, MD<sup>1</sup></p><p><sup>1</sup>Henry Ford Hospital, Detroit, MI; <sup>2</sup>Wayne State University School of Medicine, Detroit, MI</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Age is an important factor in transplant evaluation, as age at transplantation has historically been thought to influence outcomes in organ transplant recipients. There are limited data on the impact of age on intestinal (IT) and multivisceral (MVT) organ transplantation. 
This study investigates the impact of age on post-transplant outcomes in patients who received an intestinal or multivisceral (including intestine) transplant, comparing those under 40 years old to those aged 40 and above.</p><p><b>Methods:</b> We conducted a retrospective chart review of all patients who underwent IT at an academic transplant center from 2010 to 2023. The primary outcomes were patient survival and graft failure, analyzed with Kaplan-Meier survival analysis.</p><p><b>Results:</b> Among 50 IT recipients, there were 11 IT recipients <40 years old and 39 IT recipients ≥40 years old (Table 1). The median age at transplant in the <40 group was 37 years (range, 17-39) and in the ≥40 group was 54 years (range, 40-68). In both groups, the majority of transplants were exclusively IT, although both groups also included MVT recipients. Kaplan-Meier survival analysis revealed no significant differences between the two age groups in graft survival or patient mortality. Reoperation within 1 month was significantly linked to decreased patient survival (p = 0.015) and decreased graft survival (p = 0.003), as was moderate to severe rejection within 1 month (p = 0.009), but neither complication differed significantly between the two age groups. The Wilcoxon rank-sum test showed no difference between groups with regard to reoperation or moderate to severe rejection at 1 or 3 months or the development of chronic kidney disease.</p><p><b>Conclusion:</b> Age at the time of intestinal transplantation (<40 vs. ≥40 years old) does not appear to significantly impact major transplant outcomes, such as patient mortality, graft survival, or rejection rates. 
While reoperation and moderate to severe rejection within 1 and 3 months negatively affected overall outcomes, these complications were not more frequent in older or younger patients.</p><p><b>Table 1.</b> Demographic Characteristics of Intestinal Transplant Recipients.</p><p></p><p>BMI, body mass index; TPN, total parenteral nutrition.</p><p><b>International Poster of Distinction</b></p><p>Gabriela de Oliveira Lemos, MD<sup>1</sup>; Natasha Mendonça Machado, PhD<sup>2</sup>; Raquel Torrinhas, PhD<sup>3</sup>; Dan Linetzky Waitzberg, PhD<sup>3</sup></p><p><sup>1</sup>University of Sao Paulo School of Medicine, Brasília, Distrito Federal; <sup>2</sup>University of Sao Paulo School of Medicine, São Paulo; <sup>3</sup>Faculty of Medicine of the University of São Paulo, São Paulo</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> Ganepão 2023.</p><p><b>Publication:</b> Braspen Journal. ISSN 2764-1546 | Online Version.</p><p><b>Financial Support:</b> This study is linked to project no. 2011/09612-3 and was funded by the Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP).</p><p><b>Background:</b> Sphingolipids (SLs) are key molecules in cell signaling and play a central role in lipotoxicity. Moreover, they are major constituents of eukaryotic cell membranes. Data on SL remodeling in the gastrointestinal tract after Roux-en-Y gastric bypass (RYGB) are lacking; such data may help to elucidate tissue turnover and metabolism. This protocol aims to evaluate the SL profile and remodeling within the plasma and different segments of the gastrointestinal tract (GIT) before and 3 months after RYGB in a population of women with obesity and type 2 diabetes mellitus (T2DM) and to correlate the changes within these tissues. This investigation is part of the SURMetaGIT study, registered at www.clinicalTrials.gov (NCT01251016).</p><p><b>Methods:</b> Twenty-eight women with obesity and T2DM who underwent RYGB were enrolled in this protocol. Those on insulin therapy were excluded. 
We collected plasma (n = 28) and intestinal samples from the gastric pouch (n = 9), duodenum (n = 8), jejunum (n = 9), and ileum (n = 9) for untargeted metabolomics analysis at baseline and 3 months post-surgery. India ink (SPOT®) was used to mark the sites for the follow-up biopsies. SLs were identified using high-performance liquid chromatography coupled with mass spectrometry. Data were processed and analyzed using AnalysisBaseFileConverter and MS-DIAL, respectively. The magnitude of SL changes after RYGB was assessed by the fold change (log2[post-surgery mean/pre-surgery mean]). The Spearman test was performed for the correlation analysis. P value < 0.05 was considered significant. Statistics were carried out in the Jamovi software (2.2.5) and MetaboAnalyst 5.0.</p><p><b>Results:</b> Thirty-four SLs were identified, including sphingomyelins (SM), ceramides (Cer), and glycosphingolipids (GlcSL). SMs were the most common SLs found in the plasma (27 SM, 3 Cer, and 4 GlcSL) and in the GIT (16 SM, 13 Cer, and 5 GlcSL). Every GIT tissue presented distinct SL remodeling. The plasma and the jejunum best discriminated SL changes following surgery (Figure 1). The jejunum expressed the most robust changes, followed by the plasma and the duodenum (Figure 2). Figure 3 presents the heatmap of the plasma and the GIT tissues. Correlation analysis showed that the plasma SLs, particularly SM(d32:0), GlcCer(d42:1), and SM(d36:1), strongly correlated with jejunal SLs. These lipids showed a strong negative correlation with jejunal sphingomyelins but a strong positive correlation with jejunal ceramides (Table 1).</p><p><b>Conclusion:</b> RYGB was associated with SL remodeling in the plasma and GIT. SM was the main SL found in the plasma and the GIT. The most robust changes occurred in the jejunum and the plasma, and these two sample types presented the strongest correlation. 
Considering our findings, the role of SM in metabolic changes after RYGB should be investigated.</p><p><b>Table 1.</b> Correlation Analysis of Sphingolipids from the Plasma with Sphingolipids from the Gastrointestinal Tract.</p><p></p><p>*p < 0.05; **p < 0.01; ***p < 0.001.</p><p></p><p>The green circles represent samples at baseline, and the red circles represent samples 3 months after RYGB.</p><p><b>Figure 1.</b> Principal Component Analysis (PCA) from GIT Tissues and Plasma.</p><p></p><p>Fold change = log2(post-surgery mean/pre-surgery mean).</p><p><b>Figure 2.</b> Fold Change of Sphingolipids from the Plasma and Gastrointestinal Tract.</p><p></p><p>The map under the top right green box represents lipid abundance before surgery, and the map under the top left red box represents lipid abundance after RYGB.</p><p><b>Figure 3.</b> Heatmap of Sphingolipids from the Plasma and Gastrointestinal Tract.</p><p>Lucas Santander<sup>1</sup>; Gabriela de Oliveira Lemos, MD<sup>2</sup>; Daiane Mancuzo<sup>3</sup>; Natasha Mendonça Machado, PhD<sup>4</sup>; Raquel Torrinhas, PhD<sup>5</sup>; Dan Linetzky Waitzberg, PhD<sup>5</sup></p><p><sup>1</sup>Universidade Santo Amaro (Santo Amaro University), São Bernardo Do Campo, Sao Paulo; <sup>2</sup>University of Sao Paulo School of Medicine, Brasília, Distrito Federal; <sup>3</sup>Universidade São Caetano (São Caetano University), São Caetano do Sul, Sao Paulo; <sup>4</sup>University of Sao Paulo School of Medicine, São Paulo; <sup>5</sup>Faculty of Medicine of the University of São Paulo, São Paulo</p><p><b>Financial Support:</b> Fundação de Amparo a Pesquisa do Estado de São Paulo.</p><p><b>Background:</b> Microalbuminuria (MAL) is an early biomarker of kidney injury, linked to conditions such as hypertension, type 2 diabetes (T2DM), and obesity. It is also associated with higher cardiovascular (CV) risk. 
This protocol examines the impact of Roux-en-Y gastric bypass (RYGB) on MAL and CV markers in patients with obesity, T2DM, and MAL.</p><p><b>Methods:</b> Eight women with grade II-III obesity, T2DM, and MAL who underwent RYGB were included. Patients on insulin therapy were excluded. MAL was defined as a urinary albumin-to-creatinine ratio > 30 mg/g. MAL, glycemic, and lipid serum biomarkers were measured at baseline and three months post-surgery. Systolic (SBP) and diastolic (DBP) blood pressures and medical treatments for T2DM and hypertension were also assessed. T2DM remission was defined by ADA 2021 criteria. Categorical variables were reported as absolute and relative frequencies, while continuous variables were expressed as median and IQR based on normality tests. Intragroup and intergroup comparisons were conducted using the Wilcoxon and Mann-Whitney tests for numeric data. Fisher's exact test was performed when necessary to compare dichotomous variables. Data were analyzed in the JASP software version 0.18.1.0.</p><p><b>Results:</b> Overall, RYGB was associated with weight loss, improved body composition, and better CV markers (Table 1). Post-surgery, MAL decreased by at least 70% in all patients and resolved in half. All patients with MAL resolution had pre-surgery levels ≤ 100 mg/g. Those without resolution had more severe pre-surgery MAL (33.8 vs. 667.5, p = 0.029) and higher SBP (193 vs. 149.5, p = 0.029) and DBP (138 vs. 98, p = 0.025). Blood pressure decreased after surgery but remained higher in patients without MAL resolution: SBP (156.0 vs. 129.2, p = 0.069) and DBP (109.5 vs. 76.5, p < 0.001). MAL resolution was not linked to T2DM remission at 3 months (75% vs. 50%, p = 1.0). One patient had worsened MAL (193.0 vs. 386.9 mg/g) after RYGB. Glomerular filtration rate (GFR) tended to increase post-surgery only in the group with MAL resolution (95.4 vs. 108.2 ml/min/1.73 m², p = 0.089), compared to the group without MAL resolution (79.2 vs. 
73.7 ml/min/1.73 m², p = 0.6).</p><p><b>Conclusion:</b> RYGB effectively reduced markers of renal dysfunction and cardiovascular risk in this cohort. Patients showed a decrease in MAL, with resolution in half of the patients. The small sample size and short follow-up period may have limited the overall impact of the surgery on renal function. Future studies with larger cohorts and longer follow-ups are needed to better understand the effects of bariatric surgery on MAL and its relation to other CV markers.</p><p><b>Table 1.</b> Biochemical and Clinical Data Analysis Following RYGB.</p><p></p><p>eGFR: estimated glomerular filtration rate; HbA1c: glycated hemoglobin; HDL-c: high-density lipoprotein cholesterol; HOMA-BETA: beta-cell function by the homeostasis model; HOMA-IR: Homeostasis Model Assessment of Insulin Resistance; LDL-c: low-density lipoprotein cholesterol; Non-HDL-c: non-high-density lipoprotein cholesterol; VLDL-c: very-low-density lipoprotein cholesterol; DBP: diastolic blood pressure; SBP: systolic blood pressure; WC: waist circumference.</p><p>Michelle Nguyen, BSc, MSc<sup>1</sup>; Johane P Allard, MD, FRCPC<sup>2</sup>; Dane Christina Daoud, MD<sup>3</sup>; Maitreyi Raman, MD, MSc<sup>4</sup>; Jennifer Jin, MD, FRCPC<sup>5</sup>; Leah Gramlich, MD<sup>6</sup>; Jessica Weiss, MSc<sup>1</sup>; Johnny H. 
Chen, PhD<sup>7</sup>; Lidia Demchyshyn, PhD<sup>8</sup></p><p><sup>1</sup>Pentavere Research Group Inc., Toronto, ON; <sup>2</sup>Division of Gastroenterology, Department of Medicine, Toronto General Hospital, Toronto, ON; <sup>3</sup>Division of Gastroenterology, Centre Hospitalier de l'Université de Montréal (CHUM), Department of Medicine, University of Montreal, Montreal, QC; <sup>4</sup>Division of Gastroenterology, University of Calgary, Calgary, AB; <sup>5</sup>Department of Medicine, University of Alberta, Division of Gastroenterology, Royal Alexandra Hospital, Edmonton, AB; <sup>6</sup>Division of Gastroenterology, Faculty of Medicine and Dentistry, University of Alberta, Edmonton, AB; <sup>7</sup>Takeda Canada Inc., Vancouver, BC; <sup>8</sup>Takeda Canada Inc., Toronto, ON</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> 46th European Society for Clinical Nutrition and Metabolism (ESPEN) Congress. 7-10 September 2024, Milan, Italy.</p><p><b>Financial Support:</b> Funding of this study is from Takeda Canada Inc.</p><p><b>Background:</b> Teduglutide is indicated for the treatment of patients with short bowel syndrome (SBS) who are dependent on parenteral support (PS). This study evaluated longer-term teduglutide effectiveness and safety in Canadian patients diagnosed with SBS dependent on PS using real-world evidence.</p><p><b>Methods:</b> This was an observational, retrospective study using data from the national Canadian Takeda patient support program and included adults with SBS. Data were collected 6 months before teduglutide initiation and from initiation to Dec-01-2023, death, or loss of follow-up. Descriptive statistics characterized the population and treatment-emergent adverse events (TEAEs). Changes in parenteral nutrition/intravenous fluid supply (PN/IV) were assessed based on decreases in PN/IV volume from baseline. Statistical significance was set at p < 0.05.</p><p><b>Results:</b> Fifty-two patients (60% women) were included in this study. 
Median age (range) was 54 (22–81) years, and 50% had Crohn's disease as their etiology of SBS. At 6 months, median (range) absolute and percentage reductions from baseline in PN/IV volume were 3,900 mL/week (−6,960 to 26,784; p < 0.001) and 28.1% (−82.9 to 100). At 24 months, median (range) absolute reduction from baseline was 6,650 mL/week (−4,400 to 26,850; p = 0.003), and the proportion of patients who achieved ≥20% reduction in weekly PN/IV was 66.7%. Over the study, 27% achieved independence from PN/IV. TEAEs were reported in 51 (98%) patients (83% were serious TEAEs) during the study period; the three most common were weight changes, diarrhea, and fatigue.</p><p><b>Conclusion:</b> Patients showed significant decreases in PN/IV volumes after initiating teduglutide, with no unexpected safety findings. This study demonstrates real-world, longer-term effectiveness and safety of teduglutide in Canadian patients with SBS, complementing previous clinical trials and real-world studies.</p><p><b>Poster of Distinction</b></p><p>Sarah Carter, RD, LDN, CNSC<sup>1</sup>; Ruth Fisher, RDN, LD, CNSC<sup>2</sup></p><p><sup>1</sup>Coram CVS/Specialty Infusion Services, Tullahoma, TN; <sup>2</sup>Coram CVS/Specialty Infusion Services, Saint Hilaire, MN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Perceived benefit is one factor determining therapy continuation. Little information is published regarding positive outcomes for patients receiving the GLP-2 analog teduglutide apart from rates of successfully weaning HPN and hydration volumes by 20%. Patients who are new to therapy may question how long after initiation they should expect to see results. Patients may collaborate with prescribers to improve therapy tolerance if they have hope for improvements in their quality of life. 
This data analysis will provide details regarding patients receiving teduglutide and their perceived benefits of therapy.</p><p><b>Methods:</b> Dietitians interview patients receiving teduglutide as part of a service agreement, monitoring persistence and helping to eliminate barriers to therapy. Dietitians make weekly and monthly calls based on patients’ drug start dates and document interventions in flowsheets in patients’ electronic medical records. The interventions are published to data visualization software to monitor compliance with dietitian outreach as part of a quality improvement project. All patients were diagnosed with short bowel syndrome, but baseline characteristics were not exported to this dashboard. This study is a retrospective analysis of the existing dashboard using 3 years of assessments (May 1, 2021–April 30, 2024). Exclusion criteria included therapy dispensed from a pharmacy not using the company's newest computer platform and patients who did not respond to outreach. We analyzed the time on therapy before a positive outcome was reported by the patient to the dietitian and which positive outcomes were most frequently reported.</p><p><b>Results:</b> The data set included 336 patients with 2509 phone assessments. The most frequently reported first positive outcome was improved ostomy output/less diarrhea (72%, n = 243), occurring just after a month on therapy (mean 31 ± 26.4 days). The mean time to first positive outcome for all patients who reported one was 32 ± 28.5 days (n = 314). Of the 22 patients who reported no positive outcome, 13 did not answer the dietitians’ calls after initial contact. A summary is listed in Table 1. 
Overall positive outcomes reported were improved ostomy output/less diarrhea (87%, n = 292), weight gain (70%, n = 236), improved appetite/interest in food (33%, n = 112), feeling stronger/more energetic (27%, n = 92), improved quality of life (23%, n = 90), improved lab results (13%, n = 45), and fewer antidiarrheal medications (12%, n = 40). Of the 218 patients receiving parenteral support, 44 patients stopped hydration and HPN completely (20%), with another 92 patients reporting less time or fewer days on hydration and HPN (42%), for a total of 136 patients experiencing a positive outcome of parenteral support weaning (62%). Patients reported improvements in other areas of their lives, including fewer hospitalizations (n = 39), being able to travel (n = 35), tolerating more enteral nutrition volume (n = 19), returning to work/school (n = 14), and improved sleep (n = 13). A summary is diagrammed in Figure 1.</p><p><b>Conclusion:</b> This retrospective analysis indicates that teduglutide is associated with improved symptom control and improved quality of life measures, with most patients seeing a response to therapy within the first 2 months. Patients responded to teduglutide with a decrease in ostomy output and diarrhea as the most frequent recognizable response to therapy. 
In addition to the goal of weaning parenteral support, clinicians should be cognizant of improvements in patients’ clinical status that can have a significant impact on quality of life.</p><p><b>Table 1.</b> Timing of First Reported Positive Outcome by Patients Receiving Teduglutide.</p><p></p><p></p><p><b>Figure 1.</b> Total Positive Outcomes Reported by Patients (n = 336).</p><p><b>Poster of Distinction</b></p><p>Jennifer Cholewka, RD, CNSC, CDCES, CDN<sup>1</sup>; Jeffrey Mechanick, MD<sup>1</sup></p><p><sup>1</sup>The Mount Sinai Hospital, New York, NY</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Bariatric surgery is a guideline-directed intervention for patients with severe obesity. Adherence to post-operative recommendations is variable, and consequent undernutrition complicated by multiple micronutrient deficiencies is prevalent. A post-bariatric surgery syndrome (PBSS) is not well defined and therefore poorly understood, though early detection and intervention would likely decrease clinical and economic burdens. Our current experience with PBSS is presented here as a case series. We will define PBSS based on clinical evidence related to risk factors, interventions, and clinical/biochemical responses.</p><p><b>Methods:</b> Twenty-four consecutive patients referred to our metabolic support service were identified between January 1, 2019, and December 31, 2023, who were admitted to The Mount Sinai Hospital in New York City with a history of RYGB (Roux-en-Y gastric bypass) or BPDDS (biliopancreatic diversion with duodenal switch) and found to have failure to thrive, undernutrition, clinical/biochemical features of at least one micronutrient deficiency, and indications for parenteral nutrition. Patients were excluded if they had surgical complications or were not treated with parenteral nutrition. 
Fifteen patients were included in this case series and de-identified prior to data collection using the electronic health records (EPIC) and coded data collection sheets. Descriptive statistical methods were used to analyze parametric variables as mean ± standard deviation and non-parametric variables as median (interquartile range).</p><p><b>Results:</b> Results are provided in Table 1.</p><p><b>Conclusion:</b> The PBSS is defined by significant decompensation following a bariatric surgery procedure with a malabsorptive component, characterized by failure to thrive, hypoalbuminemia, multiple micronutrient deficiencies, and need for parenteral nutrition. Major risk factors include inadequate protein and micronutrient intake due to either unawareness (e.g., not recommended) or poor adherence, significant alcohol consumption, or a complicating medical/surgical condition. Parenteral nutrition formulation was safe in this population and prioritized adequate nitrogen, nonprotein calories, and micronutrients. Further analyses on risk factors, responses to therapy, and the role of a multidisciplinary team are in progress.</p><p><b>Table 1.</b> Risks/Presentation.</p><p></p><p><b>Table 2.</b> Responses to Parenteral Nutrition Intervention.</p><p></p><p>Holly Estes-Doetsch, MS, RDN, LD<sup>1</sup>; Aimee Gershberg, RD, CDN, CPT<sup>2</sup>; Megan Smetana, PharmD, BCPS, BCTXP<sup>3</sup>; Lindsay Sobotka, DO<sup>3</sup>; Kristen Roberts, PhD, RDN, LD, CNSC, FASPEN, FAND<sup>4</sup></p><p><sup>1</sup>The Ohio State University, Columbus, OH; <sup>2</sup>NYC Health + Hospitals, New York City, NY; <sup>3</sup>The Ohio State University Wexner Medical Center, Columbus, OH; <sup>4</sup>The Ohio State University, Granville, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Decompensated cirrhosis increases the risk of fat maldigestion through altered bile synthesis and impaired excretion via the bile canaliculi. 
Maldigestion increases the risk of vitamin and mineral deficiencies, which, when untreated, contribute to serious health issues such as metabolic bone disease, xerophthalmia, and hyperkeratosis. There is an absence of comprehensive guidelines for prevention and treatment of deficiencies.</p><p><b>Methods:</b> Medical and surgical history, anthropometrics, medications and nutritional supplements, laboratory data, and medical procedures were extracted and analyzed from the electronic medical record.</p><p><b>Results:</b> A patient with congenital biliary atresia and decompensated cirrhosis was seen in a hepatology outpatient clinic. Biochemical assessment revealed severe vitamin A deficiency and suboptimal vitamin D and zinc status. Physical assessment indicated telogen effluvium and transient blurry vision. Despite a history of high-dose oral retinyl acetate ranging from 10,000-50,000 units daily, a 3-day course of 100,000 units via intramuscular injection, and co-treatment of zinc deficiency to ensure adequate circulating retinol-binding protein, normalization of serum retinol was not achieved over 10 years. The patient's serum vitamin A level normalized following liver transplantation.</p><p><b>Conclusion:</b> In decompensated cirrhosis, there is a lack of sufficient guidelines for micronutrient dosing when traditional treatment strategies are unsuccessful. Furthermore, altered secretion of transport proteins due to underlying liver dysfunction may pose challenges in evaluating laboratory markers of micronutrient status. Collaborations with pharmacy and medicine support a thorough assessment and the establishment of a safe treatment and monitoring plan. 
Clinical research is needed to identify acceptable and safe dosing strategies for patients with chronic, unresponsive fat-soluble vitamin deficiencies.</p><p>Gang Wang, PhD<sup>1</sup></p><p><sup>1</sup>Nimble Science, Calgary, AB</p><p><b>Financial Support:</b> This work was supported by the National Research Council of Canada Industrial Research Assistance Program (NRC IRAP), an Alberta Innovate Accelerating Innovations into CarE (AICE) grant, and partially by a Clinical Problems Incubator Grant from the Snyder Institute for Chronic Disease at the University of Calgary.</p><p><b>Background:</b> The small intestine (SI) microbiome plays a crucial role in nutrient absorption, and emerging evidence indicates that fecal content is insufficient to represent the SI microenvironment. Endoscopic sampling is possible but expensive and not scalable. Ingestible sampling capsule technologies are emerging. However, potential contamination remains a major limitation of these devices.</p><p><b>Methods:</b> We previously reported the Small Intestinal MicroBiome Aspiration (SIMBA) capsule as an effective means of sampling, sealing, and preserving SI luminal contents for 16S rRNA gene sequencing analysis. A subset of the DNA samples, including SI samples collected by SIMBA capsules (CAP) and the matched saliva (SAL), fecal (FEC) and duodenal endoscopic aspirate (ASP) and brush (BRU) samples, from 16 participants recruited for an observational clinical validation study were sent for shotgun metagenomic sequencing. 
The aims were 1) to compare the sampling performance of the capsule (CAP) with that of endoscopic aspirates (ASP) and with 850 small intestine, large intestine and fecal samples from the Clinical Microbiomics data warehouse (PRJEB28097), and 2) to characterize samples from the 4 different sampling sites in terms of species composition and functional potential.</p><p><b>Results:</b> 4/80 samples (1/16 SAL, 2/16 ASP, 0/16 BRU, 1/16 CAP and 0/16 FEC) failed library preparation and 76 samples were shotgun sequenced (average of 38.5 M read pairs per sample) (Figure 1). Quality assessment demonstrated that despite the low raw DNA yield of CAP samples, they retained a minimal level of host contamination in comparison to ASP and BRU (mean 5.27% vs. 93.09-96.44% of average total reads per sample) (Figure 2). CAP samples shared the majority of species with ASP samples, as well as a large fraction of the species detected in terminal ileum samples. ASP and CAP sample composition was more similar to duodenum, jejunum and saliva samples and very different from large intestine and stool samples. Functional genomics further revealed GI region-specific differences: in both ASP and CAP samples we detected a number of Gut Metabolic Modules (GMMs) for carbohydrate digestion and short-chain fatty acid production. However, probiotic species, and species and genes involved in bile acid metabolism, were prevalent mainly in CAP and FEC samples and could not be detected in ASP samples.</p><p><b>Conclusion:</b> CAP and ASP microbiomes are compositionally similar despite the high level of host contamination in ASP samples. CAP samples appear better suited than ASP to reveal GI region-specific functional potential.
This analysis demonstrates the potential of the SIMBA capsule to unveil the SI microbiome and supports the prospective use of SIMBA capsules in observational and interventional studies to investigate the impacts of short-term and long-term biotic food interventions on the gut microbiome (Figure 3). A series of studies utilizing the SIMBA capsules is under way, and the detectability of biotic food intervention impacts will be reported in the near future (Table 1).</p><p><b>Table 1.</b> List of Ongoing Observational and Interventional Clinical Studies using SIMBA Capsule.</p><p></p><p></p><p><b>Figure 1.</b> Shotgun Metagenomic Sequencing: Taxonomy Overview (Relative Abundance of the 10 Most Abundant Species by Sampling Site).</p><p></p><p><b>Figure 2.</b> Shotgun Metagenomic Sequencing: High-Quality Non-Host Reads by Sampling Site.</p><p></p><p><b>Figure 3.</b> Short-term and Long-term Interventional Study Protocols Using SIMBA Capsules.</p><p>Darius Bazimya, MSc. Nutrition, RN<sup>1</sup>; Francine Mwitende, RN<sup>1</sup>; Theogene Uwizeyimana, Phn<sup>1</sup></p><p><sup>1</sup>University of Global Health Equity, Kigali</p><p><b>Financial Support:</b> University of Global Health Equity.</p><p><b>Background:</b> Rwanda is currently grappling with the double burden of malnutrition and rising obesity, particularly in urban populations. As the country experiences rapid urbanization and dietary shifts, traditional issues of undernutrition coexist with increasing rates of obesity and related metabolic disorders. This study aims to investigate the relationship between these nutrition-related issues and their impact on gastrointestinal (GI) health and metabolic outcomes in urban Rwandan populations.</p><p><b>Methods:</b> A cross-sectional study was conducted in Kigali, Rwanda's capital, involving 1,200 adult participants aged 18 to 65.
Data were collected on dietary intake, body mass index (BMI), GI symptoms, and metabolic markers such as fasting glucose, cholesterol levels, and liver enzymes. Participants were categorized into three groups: undernourished, normal weight, and overweight/obese based on their BMI. GI symptoms were assessed using a validated questionnaire, and metabolic markers were evaluated through blood tests. Statistical analyses were performed to assess correlations between dietary patterns, BMI categories, and GI/metabolic health outcomes.</p><p><b>Results:</b> The study found that 25% of participants were classified as undernourished, while 22% were obese, reflecting the double burden of malnutrition and rising obesity in Rwanda's urban population. Among obese participants, 40% exhibited elevated fasting glucose levels (p < 0.01), and 30% reported significant GI disturbances, such as irritable bowel syndrome (IBS) and non-alcoholic fatty liver disease (NAFLD). In contrast, undernourished individuals reported fewer GI symptoms but showed a higher prevalence of micronutrient deficiencies, including anaemia (28%) and vitamin A deficiency (15%). Dietary patterns characterized by high-fat and low-fiber intake were significantly associated with increased GI disorders and metabolic dysfunction in both obese and normal-weight participants (p < 0.05).</p><p><b>Conclusion:</b> This study highlights the growing public health challenge posed by the coexistence of undernutrition and obesity in Rwanda's urban centres. The dietary shifts associated with urbanization are contributing to both ends of the nutritional spectrum, adversely affecting GI and metabolic health. Addressing these issues requires comprehensive nutrition interventions that consider the dual challenges of undernutrition and obesity, promoting balanced diets and improving access to health services. 
These findings have important implications for nutrition therapy and metabolic support practices in Rwanda, emphasizing the need for tailored interventions that reflect the country's unique nutritional landscape.</p><p>Levi Teigen, PhD, RD<sup>1</sup>; Nataliia Kuchma, MD<sup>2</sup>; Hijab Zehra, BS<sup>1</sup>; Annie Lin, PhD, RD<sup>3</sup>; Sharon Lopez, BS<sup>2</sup>; Amanda Kabage, MS<sup>2</sup>; Monika Fischer, MD<sup>4</sup>; Alexander Khoruts, MD<sup>2</sup></p><p><sup>1</sup>University of Minnesota, St. Paul, MN; <sup>2</sup>University of Minnesota, Minneapolis, MN; <sup>3</sup>University of Minnesota, Austin, MN; <sup>4</sup>Indiana University School of Medicine, Indianapolis, IN</p><p><b>Financial Support:</b> Achieving Cures Together.</p><p><b>Background:</b> Fecal microbiota transplantation (FMT) is a highly effective treatment for recurrent Clostridioides difficile infection (rCDI). The procedure results in the repair of the gut microbiota following severe antibiotic injury. Recovery from rCDI is associated with a high incidence of post-infection irritable bowel syndrome (IBS). In addition, older age, medical comorbidities, and prolonged diarrheal illness contribute to frailty in this patient population. The effect of FMT on IBS symptoms and frailty in patients with rCDI is largely unknown. In this prospective cohort study, we collected IBS symptom and frailty data over the 3 months following FMT treatment for rCDI in patients at two large academic medical centers.</p><p><b>Methods:</b> Consenting adults who underwent FMT treatment for rCDI were enrolled in this study (n = 113). We excluded patients who developed a recurrence of CDI within the 3-month follow-up period (n = 15) or had incomplete IBS symptom severity scale (IBS-SSS) scores at any timepoint. The IBS-SSS is a 5-item survey measuring symptom intensity, scored from 0 to 500, with higher scores representing greater severity.
IBS-SSS was collected at baseline, 1 week post-FMT, 1 month post-FMT, and 3 months post-FMT. Frailty was assessed at baseline and 3 months using the FRAIL scale (categorical variable: “Robust Health”, “Pre-Frail”, “Frail”). The Kruskal-Wallis test was used to compare IBS-SSS across timepoints. Post-hoc analysis was performed with pairwise Wilcoxon rank sum tests using the False Discovery Rate adjustment method. The Friedman test was used to compare the frailty distribution between the baseline and 3-month timepoints.</p><p><b>Results:</b> Mean age of the cohort was 63.3 (SD 15.4) years; 75% of patients were female (total n = 58). The IBS-SSS scores across timepoints are presented in Table 1 and Figure 1. The median IBS-SSS score at baseline was 134 [IQR 121], which decreased to a median score of 65 [IQR 174] at 1 week post-FMT. The baseline timepoint was found to differ from the 1-week, 1-month, and 3-month timepoints (p < 0.05). No other differences between timepoints were observed. Frailty was assessed at baseline and 3 months (total n = 52). At baseline, 71% of patients (n = 37) were considered Pre-Frail or Frail, but this percentage decreased to 46% (n = 24) at 3 months (Table 2; p < 0.05).</p><p><b>Conclusion:</b> Findings from this multicenter, prospective cohort study indicate an overall improvement in both IBS symptoms and frailty in the 3 months following FMT therapy for rCDI. Notably, IBS symptom scores improved by 1 week post-FMT. Further research is required to understand what predicts IBS symptom improvement following FMT and whether nutrition therapy can help support further improvement.
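The timepoint comparisons described in the Methods above can be sketched in Python. This is a minimal illustration with hypothetical scores, not the study's data or code; the study names the tests, but the implementation below is an assumption:

```python
# Illustrative sketch only (hypothetical IBS-SSS scores, not study data):
# a Kruskal-Wallis omnibus test across timepoints, then pairwise Wilcoxon
# rank sum tests with Benjamini-Hochberg False Discovery Rate adjustment.
from itertools import combinations
from scipy.stats import kruskal, mannwhitneyu  # mannwhitneyu = Wilcoxon rank sum


def bh_adjust(pvals):
    """Benjamini-Hochberg adjusted p-values."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])  # ascending by p
    adj = [0.0] * m
    running_min = 1.0
    for rank in range(m - 1, -1, -1):                 # from largest p down
        i = order[rank]
        running_min = min(running_min, pvals[i] * m / (rank + 1))
        adj[i] = min(running_min, 1.0)
    return adj


# Hypothetical scores per timepoint
scores = {
    "baseline": [134, 150, 120, 210, 98, 175, 160, 140],
    "1 week":   [65, 80, 200, 40, 55, 170, 90, 60],
    "3 months": [70, 60, 150, 45, 50, 120, 85, 55],
}

stat, p_global = kruskal(*scores.values())            # omnibus test
pairs = list(combinations(scores, 2))
raw_p = [mannwhitneyu(scores[a], scores[b]).pvalue for a, b in pairs]
adj_p = bh_adjust(raw_p)                              # FDR-adjusted pairwise p-values
```

Pairwise tests are run only on the timepoint pairs, and the Benjamini-Hochberg step enforces the monotone adjusted p-values used for FDR control.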
It will also be important to further understand how nutrition therapy can help support the improvement in frailty status observed following FMT for rCDI.</p><p><b>Table 1.</b> Distribution of IBS-SSS Scores at Baseline and Following FMT.</p><p></p><p><b>Table 2.</b> Frailty Distribution Assessed by FRAIL Scale at Baseline and 3-Months Post-FMT.</p><p></p><p></p><p>Box-plot distributions of IBS-SSS scores across timepoints. The median IBS-SSS score at baseline was 134 and decreased to a median score of 65 at 1 week post-FMT. The baseline timepoint was found to differ from the 1-week, 1-month, and 3-month timepoints (p < 0.05).</p><p><b>Figure 1.</b> Distribution of IBS-SSS Scores by Timepoint.</p><p>Oshin Khan, BS<sup>1</sup>; Subanandhini Subramaniam Parameshwari, MD<sup>2</sup>; Kristen Heitman, PhD, RDN<sup>1</sup>; Kebire Gofar, MD, MPH<sup>2</sup>; Kristin Goheen, BS, RDN<sup>1</sup>; Gabrielle Vanhouwe, BS<sup>1</sup>; Lydia Forsthoefel, BS<sup>1</sup>; Mahima Vijaybhai Vyas<sup>2</sup>; Saranya Arumugam, MBBS<sup>2</sup>; Peter Madril, MS, RDN<sup>1</sup>; Praveen Goday, MBBS<sup>3</sup>; Thangam Venkatesan, MD<sup>2</sup>; Kristen Roberts, PhD, RDN, LD, CNSC, FASPEN, FAND<sup>4</sup></p><p><sup>1</sup>The Ohio State University, Columbus, OH; <sup>2</sup>The Ohio State University Wexner Medical Center, Columbus, OH; <sup>3</sup>Nationwide Children's Hospital, Columbus, OH; <sup>4</sup>The Ohio State University, Granville, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Cyclic vomiting syndrome (CVS) is a disorder of gut-brain interaction characterized by recurrent spells of nausea, vomiting, and abdominal pain lasting hours to days. Estimated prevalence is approximately 2% in adults and children. Due to recalcitrant symptoms, patients might develop disordered eating patterns, though data are lacking.
Identifying patients at risk for disordered eating patterns and malnutrition can help optimize nutritional status and may improve overall patient outcomes. The purpose of this study is to define nutrient intakes and dietary patterns of those with CVS seeking care at a tertiary care referral clinic, and to establish variation in dietary intakes based on disease severity.</p><p><b>Methods:</b> In this ongoing, cross-sectional study of adults diagnosed with CVS based on Rome IV criteria, participants are asked to complete validated surveys including a food frequency questionnaire (FFQ). Baseline demographics and clinical characteristics including disease severity (defined by the number of episodes per year) were ascertained. Healthy eating index (HEI) scores (scale of 0-100) were calculated to assess diet quality, with higher scores indicating better diet quality. Those with complete data were included in this interim analysis.</p><p><b>Results:</b> Data from 33 participants with an average age of 40 ± 16 years and an average BMI of 28.6 ± 7.9 kg/m<sup>2</sup> are presented. The cohort was predominantly female (67%), white (79%) and with moderate to severe disease (76%). The malnutrition screening tool indicated that 42% of participants were at risk of malnutrition independent of BMI status (p = 0.358) and disease severity (p = 0.074). HEI scores were poor among those with CVS (55) and did not differ based on disease severity (58 vs 54; p = 0.452). Energy intake varied widely, ranging from 416 to 3974 kcal/day, with a median intake of 1562 kcal/day.</p><p><b>Conclusion:</b> In CVS, dietary intake is poor and there is a high risk of malnutrition regardless of disease severity and BMI. Providers and registered dietitian nutritionists must be aware of the high rates of malnutrition risk and poor dietary intakes in this patient population to improve delivery of dietary interventions.
Insight into disordered eating and metabolic derangements may improve the understanding of dietary intakes in CVS.</p><p>Hannah Huey, MDN<sup>1</sup>; Holly Estes-Doetsch, MS, RDN, LD<sup>2</sup>; Christopher Taylor, PhD, RDN<sup>2</sup>; Kristen Roberts, PhD, RDN, LD, CNSC, FASPEN, FAND<sup>3</sup></p><p><sup>1</sup>Nationwide Children's Hospital, Columbus, OH; <sup>2</sup>The Ohio State University, Columbus, OH; <sup>3</sup>The Ohio State University, Granville, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Micronutrient deficiencies are common in Crohn's disease (CD) due to poor dietary absorption. Pregnancy increases nutritional demands, and when these are coupled with a malabsorptive condition like CD, clinicians must closely monitor micronutrient status. However, there is a lack of evidence-based guidelines for managing these complex patients, leaving clinicians to rely on clinical judgement. We present a case of a pregnant female with CD who presented for delivery with an undetected fat-soluble vitamin deficiency. The impact of a vitamin K deficiency on the post-partum management of a patient with CD is presented along with potential assessment and treatment strategies. At 25 weeks gestation, the patient presented with biochemical iron deficiency anemia, vitamin B12 deficiency, and zinc deficiency, for which she was treated with oral supplementation and/or intramuscular injections. No assessment of fat-soluble vitamins was conducted during the gestation period despite a pre-pregnancy history of multiple micronutrient deficiencies. At 33 weeks gestation, the mother was diagnosed with preeclampsia and delivered at 35 weeks. After birth, the infant presented to the NICU with a mediastinal mass, abnormal liver function tests and initial coagulopathy. The mother experienced a uterine hemorrhage post-cesarean section.
At this time her INR was 14.8 with severe prolongation of the PT and PTT and suboptimal levels of blood clotting factors II, VII, IX, and X. The patient was diagnosed with a vitamin K deficiency and was treated initially with 10 mg of vitamin K daily by mouth for 3 days, resulting in an elevated serum vitamin K while PT and INR trended towards normal limits. At discharge she was advised to take 1 mg of vitamin K daily by mouth to prevent further deficiency. PT and INR were reassessed every 3 months, since serum vitamin K is more reflective of recent intake. CD represents a complex disorder, and the impact of pregnancy on micronutrient status is unknown. During pregnancy, patients with CD may require additional micronutrient monitoring, particularly in the case of historical micronutrient deficiencies or other risk factors. This case highlights the need for further research into CD-specific micronutrient deficiencies and the creation of specific supplementation guidelines and treatment algorithms for detection of micronutrient deficiencies in at-risk patients.</p><p><b>Methods:</b> None Reported.</p><p><b>Results:</b> None Reported.</p><p><b>Conclusion:</b> None Reported.</p><p>Gretchen Murray, BS, RDN<sup>1</sup>; Kristen Roberts, PhD, RDN, LD, CNSC, FASPEN, FAND<sup>2</sup>; Phil Hart, MD<sup>1</sup>; Mitchell Ramsey, MD<sup>1</sup></p><p><sup>1</sup>The Ohio State University Wexner Medical Center, Columbus, OH; <sup>2</sup>The Ohio State University, Granville, OH</p><p><b>Financial Support:</b> UL1TR002733.</p><p><b>Background:</b> Enteric hyperoxaluria (EH) and resultant lithiasis are well documented in many malabsorptive conditions including inflammatory bowel disease, celiac disease, short bowel syndrome, and post gastric bypass.
Chronic pancreatitis (CP) often leads to exocrine pancreatic insufficiency (EPI) and subsequent fat malabsorption, increasing the risk of EH as calcium binds to dietary fat, leaving oxalate available for colonic absorption. Modulating oxalate intake by reducing whole grains, greens, baked beans, berries, nuts, beer and chocolate while simultaneously improving hydration is the accepted medical nutrition therapy (MNT) for EH and calcium-oxalate stones. Although sources of dietary oxalate are well-known, there is limited literature regarding dietary oxalates in the CP diet, leaving a paucity of data to guide MNT for EH and calcium-oxalate lithiasis for these patients.</p><p><b>Methods:</b> A cross-sectional, case-control study was performed comparing subjects with CP to healthy controls. The Vioscreen™ food frequency questionnaire was used to assess and quantify total oxalic acid intake in the CP cohort and to describe dietary sources. Descriptive statistics were used to summarize dietary intake of oxalic acid and contributing food sources.</p><p><b>Results:</b> A total of 52 subjects with CP were included, with a mean age of 50 ± 15 years. Most subjects were male (n = 35; 67%). Mean BMI was 24 ± 6 kg/m<sup>2</sup>, and 8 subjects (15%) were classified as underweight by BMI. Median daily caloric intake was 1549 kcal with a median daily oxalic acid intake of 104 mg (range 11-1428 mg). The top three contributors to dietary oxalate intake were raw and cooked greens such as spinach or lettuce, followed by mixed foods (such as pizza, spaghetti, and tacos) and tea. Other significant contributors (>100 mg) to dietary oxalate intake included sports or meal replacement bars, cookies and cakes, potato products (mashed, baked, chips, fried), and refined grains (breads, tortillas, bagels).</p><p><b>Conclusion:</b> In the CP population, the highest contributors to oxalate intake include greens, mixed foods, tea, meal replacement bars, some desserts, potatoes, and refined grains.
Many of the identified dietary oxalate sources are not considered for exclusion in a typical oxalate-restricted diet. A personalized approach to dietary oxalate modulation is necessary to drive MNT for EH prevention in those with CP.</p><p>Qian Ren, PhD<sup>1</sup>; Peizhan Chen, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine<sup>2</sup></p><p><sup>1</sup>Department of Clinical Nutrition, Shanghai Sixth People's Hospital Affiliated to Shanghai Jiao Tong University School of Medicine, Shanghai; <sup>2</sup>Clinical Research Center, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai</p><p><b>Financial Support:</b> This study was supported by the Youth Cultivation Program of Shanghai Sixth People's Hospital (Grant No. ynqn202223), the Key Laboratory of Trace Element Nutrition, National Health Commission of the People's Republic of China (Grant No. wlkfz202308), and the Danone Institute China Diet Nutrition Research and Communication (Grant No. DIC2023-06).</p><p><b>Background:</b> Low serum vitamin D status was reported to be associated with reduced muscle mass; however, it is inconclusive whether this relationship is causal. This study used data from the National Health and Nutrition Examination Survey (NHANES) and two-sample Mendelian randomization (MR) analyses to ascertain the causal relationship between serum 25-hydroxyvitamin D [25(OH)D] and appendicular muscle mass (AMM).</p><p><b>Methods:</b> In the NHANES 2011–2018 dataset, 11,242 participants aged 18–59 years were included, and multivariable linear regression was performed to assess the relationship between 25(OH)D and AMM measured by dual-energy X-ray absorptiometry (Figure 1).
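As a rough illustration of the covariate-adjusted regression model described above (simulated data with made-up covariates and effect sizes, not NHANES data or the study's code):

```python
# Illustrative sketch only (simulated, hypothetical data): a multivariable
# linear regression of appendicular muscle mass (AMM) on serum 25(OH)D with
# covariate adjustment, the general form of model used in the analysis above.
import numpy as np

rng = np.random.default_rng(42)
n = 2000
vit_d = rng.normal(60.0, 20.0, n)     # serum 25(OH)D (hypothetical units)
age = rng.uniform(18.0, 59.0, n)      # hypothetical covariate
bmi = rng.normal(27.0, 5.0, n)        # hypothetical covariate

# Simulate AMM with a known 25(OH)D coefficient of 0.013
amm = 15.0 + 0.013 * vit_d - 0.02 * age + 0.30 * bmi + rng.normal(0.0, 1.0, n)

# Design matrix: intercept + exposure + covariates
X = np.column_stack([np.ones(n), vit_d, age, bmi])
beta, *_ = np.linalg.lstsq(X, amm, rcond=None)
beta_vitd = beta[1]   # adjusted association of 25(OH)D with AMM
```

Fitting by ordinary least squares on the full design matrix is what "adjustment" for the listed covariates amounts to; the recovered `beta_vitd` approximates the simulated effect.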
In two-sample MR analysis, 167 single nucleotide polymorphisms significantly associated with serum 25(OH)D were applied as instrumental variables (IVs) to assess vitamin D effects on AMM in the UK Biobank (417,580 Europeans) using univariable and multivariable MR models (Figure 2).</p><p><b>Results:</b> In the NHANES 2011–2018 dataset, serum 25(OH)D concentrations were positively associated with AMM (β = 0.013, SE = 0.001, p < 0.001) in all participants, after adjustment for age, race, season of blood collection, education, income, body mass index and physical activity. In stratification analysis by sex, males (β = 0.024, SE = 0.002, p < 0.001) showed more pronounced positive associations than females (β = 0.003, SE = 0.002, p = 0.024). In univariable MR, genetically higher serum 25(OH)D levels were positively associated with AMM in all participants (β = 0.049, SE = 0.024, p = 0.039) and males (β = 0.057, SE = 0.025, p = 0.021), but the association was only marginally significant in females (β = 0.043, SE = 0.025, p = 0.090) based on IVW models. No significant pleiotropy effects were detected for the IVs in the two-sample MR investigations. In MVMR analysis, a positive causal effect of 25(OH)D on AMM was observed in the total population (β = 0.116, SE = 0.051, p = 0.022), males (β = 0.111, SE = 0.053, p = 0.036) and females (β = 0.124, SE = 0.054, p = 0.021).</p><p><b>Conclusion:</b> Our results suggested a positive causal effect of serum 25(OH)D concentration on AMM; however, more research is needed to understand the underlying biological mechanisms.</p><p></p><p><b>Figure 1.</b> Working Flowchart of Participant Selection in the Cross-Sectional Study.</p><p></p><p><b>Figure 2.</b> The study assumptions of the two-sample Mendelian Randomization analysis between serum 25(OH)D and appendicular muscle mass.
The assumptions include: (1) the genetic instrumental variables (IVs) should exhibit a significant association with serum 25(OH)D; (2) the genetic IVs should not associate with any other potential confounding factors; and (3) the genetic IVs must influence appendicular muscle mass only through serum 25(OH)D and not through any other pathway. The dotted lines indicate violations of the assumptions.</p><p>Qian Ren, PhD<sup>1</sup>; Junxian Wu<sup>1</sup></p><p><sup>1</sup>Department of Clinical Nutrition, Shanghai Sixth People's Hospital Affiliated to Shanghai Jiao Tong University School of Medicine, Shanghai</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> A healthy diet is essential for both preventing and treating type 2 diabetes mellitus (T2DM), which has a negative impact on public health. Whole grains, which are rich in dietary fibre, can serve as a good source of carbohydrates. However, the correlation and causality between whole grain intake and the risk of T2DM and glucose metabolism remain to be clarified.</p><p><b>Methods:</b> First, the National Health and Nutrition Examination Survey database 2003-2018 was used to investigate the correlation between dietary whole grain/fibre intake and the risk of T2DM and glucose metabolism. Then, based on the largest publicly published genome-wide association analysis of whole grain intake in the Neale lab database, single nucleotide polymorphisms (SNPs) significantly associated with whole grain intake were selected as instrumental variables (p < 5×10<sup>-8</sup>, linkage disequilibrium r<sup>2</sup> < 0.1). The inverse variance weighted (IVW) method, the weighted median method and other approaches were used to analyze the causal relationship between whole grain intake and T2DM.
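The IVW estimator named above pools per-SNP Wald ratios (outcome effect divided by exposure effect), weighting each by the precision of its outcome estimate. A minimal sketch with made-up summary statistics (not the study's actual instruments):

```python
# Illustrative sketch only (hypothetical SNP summary statistics): the
# fixed-effect inverse variance weighted (IVW) estimator used in MR.
import math

# (beta_exposure, beta_outcome, se_outcome) per instrumental SNP -- made up
snps = [
    (0.10, -0.0050, 0.0020),
    (0.08, -0.0030, 0.0030),
    (0.12, -0.0070, 0.0020),
    (0.05, -0.0015, 0.0040),
]

# IVW closed form: weighted regression of outcome on exposure betas
# through the origin, with weights 1 / se_outcome^2.
num = sum(bx * by / se ** 2 for bx, by, se in snps)
den = sum(bx ** 2 / se ** 2 for bx, by, se in snps)
beta_ivw = num / den              # pooled causal-effect estimate
se_ivw = 1.0 / math.sqrt(den)     # fixed-effect standard error
```

The pooled estimate always lies between the smallest and largest per-SNP Wald ratios; random-effects and radial variants differ only in how the weights and standard error are computed.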
Heterogeneity tests, pleiotropy tests and sensitivity analyses were performed to evaluate the stability and reliability of the results.</p><p><b>Results:</b> The results showed that dietary intakes of whole grains (OR = 0.999, 95%CI: 0.999 ~ 1.000, p = 0.004)/fibre (OR = 0.996, 95% CI: 0.993 ~ 0.999, p = 0.014) were negatively associated with the risk of T2DM. In the population with normal glucose metabolism, dietary fibre intake was negatively associated with FPG (β = -0.003, SE = 0.001, p < 0.001), FINS (β = -0.023, SE = 0.011, p = 0.044), and HOMA-IR (β = -0.007, SE = 0.003, p = 0.023). In the population with abnormal glucose metabolism, dietary intakes of whole grains (β = -0.001, SE = 0.001, p = 0.036) and fibre (β = -0.006, SE = 0.002, p = 0.005) were negatively associated with HbA1c. In further MR analyses, the IVW method demonstrated that for every one standard deviation increase in whole grain intake, T2DM risk was decreased by 1.9% (OR = 0.981, 95%CI: 0.970 ~ 0.993, β = -0.019, p = 0.002), with consistent findings for IVW-multiplicative random effects, IVW-fixed effects and IVW-radial. Meanwhile, MR-Egger regression analysis (intercept = -2.7 × 10<sup>-5</sup>, p = 0.954) showed that gene pleiotropy did not influence the results of the MR analysis. Leave-one-out analysis showed that no individual SNP strongly influenced the results (p<sub>heterogeneity</sub> = 0.445).</p><p><b>Conclusion:</b> Dietary intakes of whole grains may reduce the risk of T2DM and improve glucose metabolic homeostasis and insulin resistance.
The causal relationship between whole grain intake and T2DM, as well as the optimal daily intake of whole grains, still needs to be further explored through large randomised controlled intervention studies and prospective cohort studies.</p><p>Hikono Sakata, Registered Dietitian<sup>1</sup>; Misa Funaki, Registered Dietitian<sup>2</sup>; Kanae Masuda, Registered Dietitian<sup>2</sup>; Rio Kurihara, Registered Dietitian<sup>2</sup>; Tomomi Komura, Registered Dietitian<sup>2</sup>; Masaru Yoshida, Doctor<sup>2</sup></p><p><sup>1</sup>University of Hyogo, Ashiya-shi, Hyogo; <sup>2</sup>University of Hyogo, Himezi-shi, Hyogo</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> In recent years, lifestyle-related diseases such as obesity, diabetes, and dyslipidemia have been recognized as problems, one cause of which is an excessively fatty diet. Obesity and related diseases are known to be risk factors for the severity of infectious diseases such as sepsis and novel coronavirus infection, but the underlying pathomechanisms have not been clarified. We therefore hypothesized that a high-fat diet might induce functional changes not only in adipocytes but also in macrophages, weakening the immune response and leading to aggravation of infectious diseases. In this study, we performed proteome analysis and RNA sequence analysis to examine which genes and proteins are induced in macrophages by high-fat diet loading.</p><p><b>Methods:</b> Four-week-old mice were divided into a normal diet (ND) group and a high-fat diet (HFD) group and kept for four weeks. Peritoneal macrophages were collected from mice that had been injected intraperitoneally with 2 mL of thioglycolate medium one week before dissection to promote macrophage proliferation, and were incubated at 37°C with 5% CO2 in Roswell Park Memorial Institute (RPMI) medium.
After 2 hours of culture, non-adherent cells were removed, and proteome analysis was performed using the recovered macrophages. In addition, RNA sequence analysis was performed on RNA extracted from the macrophages.</p><p><b>Results:</b> Proteome analysis identified more than 4000 proteins in each group. In the HFD group, compared to the ND group, decreased expression of proteins involved in phagocytosis, such as immunoglobulin and eosinophil peroxidase, was observed. RNA sequencing likewise showed decreased expression of phagocytosis-related genes, consistent with the proteome analysis.</p><p><b>Conclusion:</b> These findings suggest that the phagocytic ability of macrophages is reduced by high-fat diet loading. This research is expected to clarify the molecular mechanisms by which high-fat dietary loading alters gene and protein expression and induces immunosuppressive effects.</p><p>Benjamin Davies, BS<sup>1</sup>; Chloe Amsterdam, BA<sup>1</sup>; Basya Pearlmutter, BS<sup>1</sup>; Jackiethia Butsch, C-CHW<sup>2</sup>; Aldenise Ewing, PhD, MPH, CPH<sup>3</sup>; Erin Holley, MS, RDN, LD<sup>2</sup>; Subhankar Chakraborty, MD, PHD<sup>4</sup></p><p><sup>1</sup>The Ohio State University College of Medicine, Columbus, OH; <sup>2</sup>The Ohio State University Wexner Medical Center, Columbus, OH; <sup>3</sup>The Ohio State University College of Public Health, Columbus, OH; <sup>4</sup>The Ohio State University Wexner Medical Center, Dublin, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Food insecurity (FI) refers to lack of consistent access to sufficient food for an active, healthy life. This issue, influenced by economic, social, and environmental factors, disproportionately affects disadvantaged populations. According to the United States Department of Agriculture, more than 10% of U.S.
households experience FI, leading to physical and mental health consequences. FI has been linked to poorer dietary quality, increased consumption of processed, calorie-dense foods, and a higher prevalence of obesity, diabetes, and cardiovascular diseases. Chronic stress from FI can also trigger or exacerbate mental health disorders, including depression and anxiety, further complicating health outcomes. Affecting ~20% of the global population, dyspepsia significantly diminishes quality of life and increases healthcare costs. Its etiology is multifactorial, involving abnormal gastric motility, visceral hypersensitivity, inflammation, and psychosocial stressors. Although research on the link between FI and disorders of gut-brain interaction (DGBIs) like dyspepsia is limited, emerging evidence suggests a bidirectional relationship. Therefore, this study was designed to examine the association between FI, dyspepsia, and other health-related social needs (HRSN). Our hypotheses include 1) patients with FI have more severe dyspepsia symptoms, and 2) FI is associated with HRSN in other domains.</p><p><b>Methods:</b> Patients presenting to a specialty motility clinic were prospectively enrolled into a registry created with the goal of holistically investigating the pathophysiology of DGBIs. Validated questionnaires for HRSN and dyspepsia were completed prior to their clinic visits. Data were managed with REDCap and statistical analyses were performed using SPSS.</p><p><b>Results:</b> 53 patients completed the questionnaires. 88.7% of patients were White and 73.6% were female with an average age of 45.6 years (21-72) and BMI of 28.7 kg/m<sup>2</sup> (17.8-51.1). FI was present in 13 (24.5%) patients. The overall severity of dyspepsia symptoms was significantly less in the food secure patients (13.8 vs. 18.8, p = 0.042). 
Of the four subscales of dyspepsia (nausea-vomiting, postprandial fullness, bloating, and loss of appetite), only loss of appetite was significantly greater in those with FI (2.3 vs. 1.3, p = 0.017). Patients with FI were more likely to be at medium (61.5% vs. 5.0%) or high risk (30.8% vs. 2.5%, p < 0.001) of financial hardship, experience unmet transportation needs (38.5% vs. 5.0%, p = 0.019) and housing instability (30.8% vs. 5.0%, p = 0.023) compared to those who were food secure. They were also at higher risk of depression (54% vs. 12.5%, p = 0.005) and of reporting insufficient physical activity (92.3% vs. 55.0%, p = 0.05). After adjusting for age, gender, race, and BMI, FI was not a predictor of global dyspepsia severity. Greater BMI (O.R. 0.89, 95% C.I. 0.81-0.98) was associated with severity of early satiety. Female gender (O.R. 10.0, 95% C.I. 1.3-76.9, p = 0.03) was associated with the severity of nausea. Greater BMI (O.R. 1.10, 95% C.I. 1.001-1.22, p = 0.048) and female gender (O.R. 10.8, 95% C.I. 1.6-72.9, p = 0.015) both correlated with severity of postprandial fullness.</p><p><b>Conclusion:</b> FI affected ~25% of patients seen in our clinic. It was, however, not an independent predictor of overall severity of dyspepsia symptoms. Patients who experienced FI had a higher prevalence of other HRSN and a higher risk of depression and physical inactivity. Our results highlight the importance of considering FI and other HRSN in the management of dyspepsia. Understanding this interaction is essential for improving clinical outcomes and guiding public health interventions.</p><p>Ashlesha Bagwe, MD<sup>1</sup>; Chandrashekhara Manithody, PhD<sup>1</sup>; Kento Kurashima, MD, PhD<sup>1</sup>; Shaurya Mehta, BS<sup>1</sup>; Marzena Swiderska-Syn<sup>1</sup>; Arun Verma, MD<sup>1</sup>; Austin Sims<sup>1</sup>; Uthayashanker Ezekiel<sup>1</sup>; Ajay Jain, MD, DNB, MHA<sup>1</sup></p><p><sup>1</sup>Saint Louis University, St.
Louis, MO</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Farnesoid X receptor (FXR), a gut nuclear receptor, regulates intestine-driven bile acid homeostasis in short bowel syndrome. Chenodeoxycholic acid (CDCA), a primary bile acid, acts as an FXR ligand. Stem cell-derived intestinal enteroids offer a valuable system to study intestinal FXR function. We hypothesized that transfection of porcine enteroids with small interfering RNA (siRNA) would modulate FXR to further define its mechanistic role.</p><p><b>Methods:</b> We developed a porcine protocol for Matrigel-based 3D culture systems to generate enteroids from the small bowel of neonatal Yorkshire pigs. After 7 days, cultures were passaged and expanded. RNA strands for Dicer-substrate siRNAs (DsiRNAs) were synthesized as single-strand RNAs (ssRNAs) by Integrated DNA Technologies and resuspended in an RNase-free buffer. The DsiRNAs targeted the Sus scrofa farnesoid X receptor (FXR) gene (KF597010.1). Three sets of DsiRNA were made for gene-specific silencing of the FXR gene. Porcine enteroids were cultured and transfected with FXR-specific siRNA and control siRNA using Lipofectamine RNAiMAX reagent. They were also treated with escalating CDCA concentrations (25 μM to 50 μM) for 24 and 48 hours. FXR mRNA levels were quantified by real-time PCR, and functional assays were performed to assess changes in bile acid uptake and efflux following transfection.</p><p><b>Results:</b> Data from 3 separate experiments on intestinal crypts consistently showed enhanced FXR expression with CDCA versus control (p < 0.01). CDCA treatment resulted in a dose-dependent increase in FXR mRNA expression, reaching peak levels after 48 hours of exposure (2.8X increase) with enhanced nuclear localization. Functionally, CDCA-treated enteroids displayed increased bile acid uptake and reduced efflux, validating FXR's role in mediating bile acid driven enterohepatic circulation. Several runs with siRNA were conducted. 
Using 30 pmol siRNA (sense: 5' GAUCAACAGAAUCUUCUUCAUUATA 3' and antisense: 5' UAUAAUGAAGAAGAUUCUGUUGAUCUG 3') there was a 68% reduction in FXR expression compared with scrambled control. FXR silencing led to decreased bile acid uptake and increased efflux. No significant effects were observed in enteroids transfected with control siRNA. Paradoxically, CDCA-treated cultures showed a higher ratio of immature to mature enteroids.</p><p><b>Conclusion:</b> In porcine enteroids, CDCA treatment increased FXR gene expression and promoted its nuclear localization. This finding implies the existence of a positive feedback cycle in which CDCA, an FXR ligand, induces further synthesis and uptake. siRNA transfection significantly decreased FXR activity. By employing this innovative methodology, one can effectively examine the function of FXR in ligand-treated or control systems.</p><p><b>Pediatric, Neonatal, Pregnancy, and Lactation</b></p><p>Kento Kurashima, MD, PhD<sup>1</sup>; Si-Min Park, MD<sup>1</sup>; Arun Verma, MD<sup>1</sup>; Marzena Swiderska-Syn<sup>1</sup>; Shaurya Mehta, BS<sup>1</sup>; Austin Sims<sup>1</sup>; Mustafa Nazzal, MD<sup>1</sup>; John Long, DVM<sup>1</sup>; Chandrashekhara Manithody, PhD<sup>1</sup>; Shin Miyata, MD<sup>1</sup>; Ajay Jain, MD, DNB, MHA<sup>1</sup></p><p><sup>1</sup>Saint Louis University, St. Louis, MO</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> North American Society for Pediatric Gastroenterology, Hepatology and Nutrition (NASPGHAN) Florida.</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Biliary atresia (BA) is a major cause of obstructive neonatal cholestatic disease. Although hepato-portoenterostomy (HPE) is routinely performed for BA patients, more than half eventually require liver transplantation. The intricate mechanisms of bile ductular injury driving its pathogenesis remain elusive and are not recapitulated in current small animal and rodent models that rely on bile duct ligation. 
Addressing this gap, we hypothesized that extra- and intrahepatic bile duct destruction through an endothelial irritant would recapitulate the human condition. We thus developed a novel neonatal piglet BA model called ‘BATTED’. Piglets have hepatic and gastrointestinal homology to human infants and share anatomic and physiological processes, providing a robust platform for BATTED and HPE.</p><p><b>Methods:</b> Six 7- to 10-day-old piglets were randomized to BATTED (US provisional Patent US63/603,995) or sham surgery. BATTED included cholecystectomy, common bile duct and hepatic duct injection of 95% ethanol, and a retainer suture for continued ethanol-induced intrahepatic injury. Vascular access ports were placed and scheduled ultrasound-guided liver biopsies were performed. Six weeks post-BATTED, piglets underwent HPE. Eight weeks after the initial surgery, animals were euthanized. Serology, histology, gene expression and immunohistochemistry were performed.</p><p><b>Results:</b> Serological evaluation revealed a surge in conjugated bilirubin from baseline 6 weeks after the BATTED procedure (mean Δ 0.39 mg/dL to 3.88 mg/dL). Gamma-glutamyl transferase (GGT) also exhibited a several-fold increase (mean Δ 16.3 IU to 89.5 IU). Sham did not display these elevations (conjugated bilirubin: Δ 0.39 mg/dL to 0.98 mg/dL, GGT: Δ 9.2 IU to 10.4 IU). Sirius red staining demonstrated significant periportal and diffuse liver fibrosis (16-fold increase), and the bile duct proliferation marker CK-7 increased 9-fold with BATTED. Piglets in the BATTED group demonstrated enhanced CD-3 (7-fold), alpha-SMA (8.85-fold), COL1A1 (11.7-fold) and CYP7A1 (7-fold), vs sham. 
Successful HPE was accomplished in piglets with improved nutritional status and a reduction in conjugated bilirubin (Δ 4.89 mg/dL to 2.11 mg/dL).</p><p><b>Conclusion:</b> BATTED replicated BA features, including hyperbilirubinemia, GGT elevation, significant hepatic fibrosis, bile duct proliferation, and inflammatory infiltration, with subsequent successful HPE. This model offers substantial opportunities to elucidate the mechanisms underlying BA and adaptation post-HPE, paving the way for the development of diagnostics and therapeutics.</p><p>Sirine Belaid, MBBS, MPH<sup>1</sup>; Vikram Raghu, MD, MS<sup>1</sup></p><p><sup>1</sup>UPMC, Pittsburgh, PA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> At our institution, pediatric residents are responsible for managing patients with intestinal failure (IF) in the inpatient units. However, they have reported feelings of inexperience and anxiety when dealing with these complex cases. This project aimed to identify knowledge gaps and evaluate the confidence levels of pediatric residents in managing IF patients.</p><p><b>Methods:</b> We conducted an online needs assessment survey using Qualtrics, which included Likert-scale, multiple-choice, open-ended and rating questions to assess residents' confidence levels (from 1 to 10) in performing tasks related to IF patient care. This voluntary survey, approved by the IRB as exempt, was distributed to all pediatric residents at the University of Pittsburgh via QR codes and emails.</p><p><b>Results:</b> Thirty-two percent of residents completed the survey, with nearly 50% having completed a rotation on the Intestinal Rehabilitation (IR) service. 
Residents reported the lowest confidence in calculating Total Parenteral Nutrition (TPN)-like Intravenous (IV) fluids (solutions administered centrally or peripherally, containing dextrose and major electrolytes designed to match the patients’ home TPN contents), identifying signs of D-lactic acidosis and small intestinal bacterial overgrowth (SIBO), and managing SIBO (average confidence rating of 2/10). They also expressed low confidence in ordering home TPN and TPN-like IV Fluids, understanding related anatomy, ensuring proper stoma care, managing central access loss, and addressing poor catheter blood flow (average confidence ratings of 3-4/10). Conversely, residents felt more confident managing feeding intolerance and central line-associated infections (average confidence ratings of 5-6/10). Additionally, they rated their ability to identify signs of septic and hypovolemic shock as 8/10, though they felt less confident in managing these conditions (7/10). Furthermore, 64% of respondents agreed that managing IF patients is educationally valuable and preferred laminated cards or simulated sessions as educational resources (average ratings of 8 and 6.8 out of 10, respectively).</p><p><b>Conclusion:</b> The survey highlights several areas where pediatric residents need further education. 
Addressing these knowledge gaps through targeted curricular interventions can better equip residents to manage IF patients and potentially increase their interest in this specialty.</p><p></p><p>CLABSI = Central Line-Associated Bloodstream Infection, TPN = Total Parenteral Nutrition, EHR = Electronic Health Record, IV = Intravenous, SIBO = Small Intestinal Bacterial Overgrowth.</p><p><b>Figure 1.</b> Stratifies the tasks related to managing patients with Intestinal Failure (IF) into three categories based on the average confidence rating score (>= 7/10, 5-6/10, <=4/10) of pediatric residents.</p><p></p><p><b>Figure 2.</b> Illustrates the distribution of pediatric residents’ opinions on the educational value of managing patients with intestinal failure.</p><p>Alyssa Ramuscak, MHSc, MSc<sup>1</sup>; Inez Martincevic, MSc<sup>1</sup>; Hebah Assiri, MD<sup>1</sup>; Estefania Carrion, MD<sup>2</sup>; Jessie Hulst, MD, PhD<sup>1</sup></p><p><sup>1</sup>The Hospital for Sick Children, Toronto, ON; <sup>2</sup>Hospital Metropolitano de Quito, Quito, Pichincha</p><p><b>Financial Support:</b> Nestle Health Science Canada, North York, Ontario, Canada.</p><p><b>Background:</b> Enteral nutrition provides fluids and nutrients to individuals unable to meet needs orally. Recent interest in real food-based formulas highlights a shift towards providing nutrition with familiar fruit, vegetable and protein ingredients. This study aimed to evaluate the tolerability and nutritional adequacy of hypercaloric, plant-based, real food ingredient formula in pediatric tube-fed patients.</p><p><b>Methods:</b> This prospective, single-arm, open-label study evaluated the tolerability and nutritional adequacy of a hypercaloric, plant-based formula, Compleat® Junior 1.5, in medically complex, stable, pediatric tube-fed patients aged 1-13 years. Participants were recruited from outpatient clinics at The Hospital for Sick Children, Toronto from May 2023 to June 2024. 
Demographic and anthropometric measurements (weight, height) were obtained at baseline. The daily dose of study product was isocaloric with routine feeds. Total fluid needs were met by adding water. Participants transitioned to the study formula over 3 days, followed by 14 days of exclusive use of the study product. Caregivers monitored volume of daily intake, feed tolerance, and bowel movements during the transition and study period using an electronic database (Medrio®). An end-of-study visit was conducted to collect weight measurements. Descriptive statistics summarized demographic and clinical characteristics (Table 1). The paired t-test compared weight-for-age and BMI-for-age z-scores between baseline and the end of study. Symptoms of intolerance and bowel movements, using either the Bristol Stool Scale or Brussels Infant and Toddler Stool Scale, were described as frequency of events and compared between baseline and the intervention period. Percent of calorie and protein goals met during the study period was calculated as calories received over calories prescribed, and as protein received relative to the dietary reference intake for age and weight.</p><p><b>Results:</b> In total, 27 ambulatory pediatric participants with a median age of 5.5 years (IQR, 2.5-7) were recruited for the study, with 26 completing the study (Table 1). Participant weight-for-age and BMI-for-age z-scores significantly improved between baseline and end of study, from -1.75 ± 1.93 to -1.67 ± 1.88 (p < 0.05), and from -0.47 ± 1.46 to 0.15 ± 0.23 (p < 0.05), respectively. There was no significant difference in the frequency of any GI symptom, including vomiting, gagging/retching, tube venting or perceived pain/discomfort with feeds, between baseline and end of study. There was no significant difference in frequency or type of stool between baseline and end of study. Study participants met 100% of their prescribed energy for the majority (13 ± 1.7 days) of the study period. 
All participants exceeded protein requirements during the study period. Twenty families (76.9%) indicated wanting to continue to use study product after completing the study.</p><p><b>Conclusion:</b> This prospective study demonstrated that a hypercaloric, plant-based, real food ingredient formula among stable, yet medically complex children was well tolerated and calorically adequate to maintain or facilitate weight gain over a 14-day study period. The majority of caregivers preferred to continue use of the study product.</p><p><b>Table 1.</b> Demographic and Clinical Characteristics of Participants (n = 27).</p><p></p><p><b>Poster of Distinction</b></p><p>Gustave Falciglia, MD, MSCI, MSHQPS<sup>1</sup>; Daniel Robinson, MD, MSCI<sup>1</sup>; Karna Murthy, MD, MSCI<sup>1</sup>; Irem Sengul Orgut, PhD<sup>2</sup>; Karen Smilowitz, PhD, MS<sup>3</sup>; Julie Johnson, MSPH PhD<sup>4</sup></p><p><sup>1</sup>Northwestern University Feinberg School of Medicine, Chicago, IL; <sup>2</sup>University of Alabama Culverhouse College of Business, Tuscaloosa, AL; <sup>3</sup>Northwestern University Kellogg School of Business & McCormick School of Engineering, Evanston, IL; <sup>4</sup>University of North Carolina School of Medicine, Chapel Hill, NC</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> Children's Hospital Neonatal Consortium (CHNC) Annual Conference, November 1, 2021, Houston, TX.</p><p><b>Financial Support:</b> None Reported.</p><p>Lyssa Lamport, MS, RDN, CDN<sup>1</sup>; Abigail O'Rourke, MD<sup>2</sup>; Barry Weinberger, MD<sup>2</sup>; Vitalia Boyar, MD<sup>2</sup></p><p><sup>1</sup>Cohen Children's Medical Center of New York, Port Washington, NY; <sup>2</sup>Cohen Children's Medical Center of NY, New Hyde Park, NY</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Premature infants in the neonatal intensive care unit (NICU) are at high risk for peripheral intravenous catheter infiltration (PIVI) because of the frequent cannulation of 
small and fragile vessels. The most common infusate in neonates is parenteral nutrition (PN), followed by antibiotics. Previous reports have suggested that the intrinsic properties of infusates, such as pH, osmolality, and calcium content, determine the severity of PIVIs. This has led to the common practice of restricting the intake of protein and/or calcium to less than what is recommended for optimal growth and bone mineralization.</p><p><b>Methods:</b> Our objective was to identify the characteristics of infants and intravenous (IV) infusates that were associated with the development of severe neonatal PIVIs. We conducted a retrospective analysis of PIVIs in our level IV NICU from 2018-2022 (n = 120). Each PIVI was evaluated by a wound-certified neonatologist and classified as mild, moderate, or severe using a scoring system based on the Infusion Nurses Society (INS) staging criteria. Comparisons between groups were made using ANOVA and chi-square analysis, or Mann-Whitney tests for non-parametric data.</p><p><b>Results:</b> Infants with severe PIVIs had a lower mean birthweight than those with mild or moderate PIVIs (1413.1 g vs 2116.9 g and 2020.3 g, respectively; p = .01) (Table 1). Most PIVIs occurred during infusions of PN and lipids, but the severity was not associated with the infusion rate, osmolality, or with the concentration of amino acids (median 3.8 g/dL in the mild group, 3.5 g/dL in the moderate group and 3.4 g/dL in the severe group) or calcium (median 6500 mg/L in all groups) (Table 2, Figure 1). Of note, the infusion of IV medications within 24 hours of PIVI was most common in the severe PIVI group (p = .03) (Table 2). Most PIVIs, including mild, were treated with hyaluronidase. 
Advanced wound care was required for 4% of moderate and 44% of severe PIVIs, and none required surgical intervention.</p><p><b>Conclusion:</b> Severe PIVIs in the NICU are most likely to occur in infants with a low birthweight and within 24 hours of administration of IV medications. This is most likely because medications must be kept at acidic or basic pH for stability, and many have high osmolarity and/or intrinsic caustic properties. Thus, medications may induce chemical phlebitis and extravasation with inflammation. In contrast, PN components, including amino acids and calcium, are not related to the severity of extravasations. Our findings suggest that increased surveillance of IV sites for preterm infants following medication administration may decrease the risk of severe PIVIs. Conversely, reducing or withholding parenteral amino acids and/or calcium to mitigate PIVI risk may introduce nutritional deficiencies without decreasing the risk of clinically significant PIVIs.</p><p><b>Table 1.</b> Characteristic Comparison of Mild, Moderate, and Severe PIVIs in the Neonatal ICU. 
PIVI Severity Was Designated Based on INS Criteria.</p><p></p><p><b>Table 2.</b> Association of Medication Administration and Components of Infusates With the Incidence and Severity of PIVI in NICU.</p><p></p><p></p><p><b>Figure 1.</b> Infusate Properties.</p><p>Stephanie Oliveira, MD, CNSC<sup>1</sup>; Josie Shiff<sup>2</sup>; Emily Romantic, RD<sup>3</sup>; Kathryn Hitchcock, RD<sup>4</sup>; Gillian Goddard, MD<sup>4</sup>; Paul Wales, MD<sup>5</sup></p><p><sup>1</sup>Cincinnati Children's Hospital Medical Center, Mason, OH; <sup>2</sup>University of Cincinnati, Cincinnati, OH; <sup>3</sup>Cincinnati Children's Hospital Medical Center, Cincinnati, OH; <sup>4</sup>Cincinnati Children's Hospital, Cincinnati, OH; <sup>5</sup>Cincinnati Children's Hospital Medical Center, Cincinnati, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> It is common for children with intestinal failure on parenteral nutrition to be fed an elemental enteral formula, as these are believed to be better tolerated due to the protein module being free amino acids, the absence of other allergens, and the presence of long chain fatty acids. In February 2022, a popular elemental formula on the market was recalled due to bacterial contamination that necessitated an immediate transition to an alternative enteral formula. This included initiating plant-based options for some of our patients. We have experienced growing interest and requests from families to switch to plant-based formulas due to religious practices, cost concerns, and personal preferences. While plant-based formulas lack major allergens and may contain beneficial soluble fiber, they are understudied in this patient population. 
This study aimed to determine whether growth was affected among children with intestinal failure on parenteral nutrition who switched from elemental to plant-based formulas.</p><p><b>Methods:</b> We conducted a retrospective cohort study of IF patients on PN managed by our intestinal rehabilitation program who transitioned from elemental to plant-based formulas during the product recall. Data were collected on demographics, intestinal anatomy, formula intake, parenteral nutrition support, tolerance, stool consistency, and weight gain for 6 months before and after formula transition. Paired analyses were performed for change in growth and nutritional intake, using the Wilcoxon signed-rank test. Chi-squared tests were performed to compare formula tolerance. An alpha value < 0.05 was considered significant.</p><p><b>Results:</b> Eleven patients were included in the study [8 males; median gestational age 33 (IQR = 29, 35.5) weeks; median age at assessment 20.4 (IQR = 18.7, 29.7) months]. All participants had short bowel syndrome (SBS) as their IF category. Residual small bowel length was 28 (IQR = 14.5, 47.5) cm. Overall, there was no statistically significant difference in growth observed after switching to plant-based formulas (p = 0.76) (Figure 1). Both median enteral formula volume and calorie intake were higher on plant-based formula, but the differences were not statistically significant (p = 0.83 and p = 0.41) (Figure 2). Seven of 11 patients (64%) reported decreased stool count (p = 0.078) and improved stool consistency (p = 0.103) after switching to plant-based formula. Throughout the study, the rates of PN calorie and volume weaning were not different after switching to plant-based formula (calories: p = 0.83; volume: p = 0.52) (Figure 3).</p><p><b>Conclusion:</b> In this small study of children with IF, the switch from free amino acid formula to an intact plant-based formula was well tolerated. Growth was maintained between groups. 
After switching to plant-based formulas these children tolerated increased enteral volumes, but we were underpowered to demonstrate a statistical difference. There was no evidence of protein allergy among children who switched. Plant-based formulas may be an alternative option to elemental formulas for children with intestinal failure.</p><p></p><p><b>Figure 1:</b> Change in Weight Gain (g/day) on Elemental and Plant-Based Formulas.</p><p></p><p><b>Figure 2.</b> Change in Enteral Calories (kcal/kg/day) and Volume (mL/kg/day) on Elemental and Plant-Based Formulas.</p><p></p><p><b>Figure 3.</b> Change in PN Calories (kcal/kg/day) and Volume (mL/kg/day) on Elemental and Plant-Based Formulas.</p><p>Carly McPeak, RD, LD<sup>1</sup>; Amanda Jacobson-Kelly, MD, MSc<sup>1</sup></p><p><sup>1</sup>Nationwide Children's Hospital, Columbus, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> In pediatrics, post-pyloric enteral feeding via jejunostomy, gastrojejunostomy or naso-jejunostomy tubes is increasingly utilized to overcome problems related to aspiration, severe gastroesophageal reflux, poor gastric motility, and gastric outlet obstruction. Jejunal enteral feeding bypasses the duodenum, a primary site of absorption for many nutrients. Copper is thought to be primarily absorbed in the stomach and proximal duodenum, thus patients receiving nutritional support in a way that bypasses this region can experience copper deficiency. The prevalence and complications of copper deficiency in the pediatric population are not well documented.</p><p><b>Methods:</b> This was a retrospective case series of two patients treated at Nationwide Children's Hospital (NCH). Medical records were reviewed to collect laboratory, medication/supplement data and enteral feeding history. 
Both patients were receiving Pediasure Peptide® as their enteral formula.</p><p><b>Results:</b> Case 1: A 14-year-old male who had received exclusive post-pyloric enteral nutrition for two years presented with pancytopenia and worsening anemia. Laboratory data drawn in 3/2017 demonstrated deficient levels of copper (< 10 ug/dL) and ceruloplasmin (20 mg/dL). Repletion was initiated with intravenous cupric chloride 38 mcg/kg/day for 3 days, then transitioned to 119 mcg/kg/day via j-tube for 4 months. Labs redrawn 2 months after the initial episode of deficiency indicated overall improvement of the pancytopenia (Table 1). After 4 months, cupric chloride was decreased to 57 mcg/kg/day as a maintenance dose. Laboratory data redrawn two and a half years after the initial episode of deficiency revealed deficient levels of copper (27 ug/dL) and ceruloplasmin (10 mg/dL) despite continued lower-dose supplementation. Case 2: An 8-year-old female had received exclusive post-pyloric enteral nutrition for 3 months. Laboratory data drawn in 3/2019 revealed deficient levels of copper (38 ug/dL) and ceruloplasmin (13 mg/dL). Cupric chloride 50 mcg/kg/day was administered daily through the jejunal tube. Copper and ceruloplasmin labs redrawn 11 and 15 months after initiation of supplementation revealed continued deficiency, though hematologic values remained stable (Table 2).</p><p><b>Conclusion:</b> There are currently no clinical guidelines for the prevention, screening, treatment, and maintenance therapy of copper deficiency in pediatric post-pyloric enteral feeding. Current dosing for copper repletion in profound copper deficiency is largely based on case series and expert opinions. At NCH, the current standard-of-care supplementation demonstrates inconsistent improvement in copper repletion, as evidenced by the cases described above. 
Future research should determine appropriate supplementation regimens and evaluate their efficacy in patients receiving post-pyloric enteral feeding.</p><p><b>Table 1.</b> Laboratory Evaluation of Case 1.</p><p></p><p>'-' indicates no data available, bolded indicates result below the lower limit of normal for age.</p><p><b>Table 2.</b> Laboratory Evaluation of Case 2.</p><p></p><p>'-' indicates no data available, bolded indicates result below the lower limit of normal for age.</p><p>Meighan Marlo, PharmD<sup>1</sup>; Ethan Mezoff, MD<sup>1</sup>; Shawn Pierson, PhD, RPh<sup>1</sup>; Zachary Thompson, PharmD, MPH, BCPPS<sup>1</sup></p><p><sup>1</sup>Nationwide Children's Hospital, Columbus, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Parenteral nutrition (PN) remains a high-risk therapy, typically containing more than 40 different ingredients. Incorporation of PN prescribing into the electronic health record (EHR) has been recommended to minimize the risk of transcription errors and allow for implementation of important safety alerts for prescribers. Ambulatory PN prescribing workflows typically require manual transcription of orders in the absence of robust EHR interoperability between systems used for transitions of care. Pediatric patients receiving ambulatory PN have an increased risk of medication events due to the need for weight-based customization and the low utilization of ambulatory PN, which leaves many pharmacies inexperienced. Development and implementation of a prescribing system incorporating inpatient and outpatient functionality within the EHR is necessary to improve the quality and safety of ambulatory PN in pediatric patients. The primary goal was to create a workflow that improved transitions of care by minimizing manual transcription and improving medication safety. 
We describe modification of standard EHR tools to achieve this aim.</p><p><b>Methods:</b> A multidisciplinary team developed and incorporated ambulatory PN prescribing within the EHR at Nationwide Children's Hospital. Inpatient and outpatient variances, safety parameters to provide appropriate alerts to prescribers, and legal requirements were considered and evaluated for optimization within the new system.</p><p><b>Results:</b> The final product successfully incorporated ambulatory PN prescribing while allowing seamless transfer of prescriptions between care settings. The prescriber orders the patient-specific ambulatory PN during an inpatient or outpatient encounter. The order is subsequently queued for pharmacist review/verification to assess the adjustments and determine extended stability considerations in the ambulatory setting. After pharmacist review, the prescription is printed, signed by the provider, and faxed to the pharmacy.</p><p><b>Conclusion:</b> To our knowledge, ours is the first institution to develop and incorporate pediatric PN prescribing into the EHR such that prescriptions transfer between the inpatient and outpatient settings without manual transcription while still allowing for customization of PN.</p><p>Faith Bala, PhD<sup>1</sup>; Enas Alshaikh, PhD<sup>1</sup>; Sudarshan Jadcherla, MD<sup>1</sup></p><p><sup>1</sup>The Research Institute at Nationwide Children's Hospital, Columbus, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Extrauterine growth remains a concern among preterm-born infants admitted to the neonatal ICU (NICU) as it rarely matches intrauterine fetal growth rates. Although the reasons are multifactorial, the role played by the duration of exclusive parenteral nutrition (EPN) and the transition to the exclusive enteral nutrition (EEN) phase remains unclear. 
Significant nutrient deficits can exist during the critical phase from birth to EEN, and thereafter, and these likely impact short- and long-term outcomes. Given this rationale, our aim was to examine the relationship between the duration from birth to EEN and both growth and length of hospital stay (LOHS) among convalescing preterm-born infants with oral feeding difficulties.</p><p><b>Methods:</b> This is a retrospective analysis of prospectively collected data from 77 preterm infants admitted to the all-referral level IV NICU at Nationwide Children's Hospital, Columbus, Ohio, who were later referred to our innovative neonatal and infant feeding disorders program for the evaluation and management of severe feeding/aero-digestive difficulties. Inclusion criteria: infants born < 32 weeks gestation, birthweight < 1500 g, absence of chromosomal/genetic disorders, discharged at term-equivalent postmenstrual age (37-42 weeks PMA) on full oral feeding. Growth variables were converted to age- and gender-specific Z-scores using the Fenton growth charts. Using the Academy of Nutrition and Dietetics criteria for neonates and preterm populations, extrauterine growth restriction (EUGR) was defined as weight Z-score decline from birth to discharge > 0.8. Clinical characteristics stratified by EUGR status were compared using the chi-square test, Fisher exact test, Mann-Whitney U test, and t-test as appropriate. Multivariate regression was used to assess the relationship between the duration from birth to EEN and growth Z-scores at discharge simultaneously. Multiple linear regression was used to assess the relationship between the duration from birth to EEN and LOHS.</p><p><b>Results:</b> Forty-two infants (54.5%) had EUGR at discharge, and the proportions of infants with weight and length percentiles < 10% were significantly greater at discharge than at birth (Table 1). 
Infants with growth restriction at discharge had significantly lower gestational age at birth, more often required mechanical ventilation at birth, had a higher incidence of sepsis, and took longer to attain EEN (Table 2). The duration from birth to EEN was significantly negatively associated with weight, length, and head circumference Z-scores at discharge. Likewise, the duration from birth to EEN was significantly positively associated with the LOHS (Figure 1).</p><p><b>Conclusion:</b> The duration from birth to exclusive enteral nutrition (EEN) can influence growth outcomes. We speculate that significant gaps between recommended and actual nutrient intake exist during the period to EEN, particularly for those with chronic tube-feeding difficulties. Well-planned and personalized nutrition is relevant until EEN is established, and enteral nutrition advancement strategies using standardized feeding protocols offer prospects for generalizability while still providing opportunities for personalization.</p><p><b>Table 1.</b> Participant Growth Characteristics.</p><p></p><p><b>Table 2.</b> Participants' Clinical Characteristics.</p><p></p><p></p><p><b>Figure 1.</b> Relationship of the Duration from Birth to EEN With Growth Parameters and Length of Hospital Stay.</p><p>Alayne Gatto, MS, MBA, RD, CSP, LD, FAND<sup>1</sup>; Jennifer Fowler, MS, RDN, CSPCC, LDN<sup>2</sup>; Deborah Abel, PhD, RDN, LDN<sup>3</sup>; Christina Valentine, MD, MS, RDN, FAAP, FASPEN<sup>4</sup></p><p><sup>1</sup>Florida International University, Bloomingdale, GA; <sup>2</sup>East Carolina Health, Washington, NC; <sup>3</sup>Florida International University, Miami Beach, FL; <sup>4</sup>Banner University Medical Center, The University of Arizona, Tucson, AZ</p><p><b>Financial Support:</b> The Rickard Foundation.</p><p><b>Background:</b> The neonatal registered dietitian nutritionist (NICU RDN) plays a crucial role in the care of premature and critically ill infants in the neonatal 
intensive care unit (NICU). Advanced pediatric competencies are frequently a gap in nutrition degree coursework or dietetic internships. Critical care success with such vulnerable patients requires expertise in patient-centered care, multidisciplinary collaboration, and adaptive clinical problem-solving. This research aimed to identify the needs, engagement levels, and expertise of NICU RDNs, while also providing insights into their job satisfaction and career longevity. Currently in the US, there are approximately 850 Level III and Level IV NICUs in which neonatal dietitians are vital care team members, making it imperative that hospitals provide appropriate compensation, benefits, and educational support.</p><p><b>Methods:</b> This was a cross-sectional examination using a national, online, IRB-approved survey during March 2024 sent to established Neonatal and Pediatric Dietitian practice groups. A Qualtrics link was provided for current and former NICU RDNs to complete a 10-minute online survey that offered an optional gift card for completion. The link remained open until 200 gift cards were exhausted, approximately one week after the survey opened. Statistical analyses were performed using Stats IQ Qualtrics. In the descriptive statistics, frequencies of responses were represented as counts and percentages. For comparison of differences, the Chi-Squared test and Fisher's Exact test were used for categorical analysis.</p><p><b>Results:</b> In total, 253 current (n = 206) and former (n = 47) NICU RDNs completed the online questionnaire. Of the 210 respondents, 84 (40%) reported having pediatric clinical experience, 94 (44%) had clinical pediatric dietetic intern experience, 21 (10%) had previously worked as a WIC nutritionist, 15 (7.1%) had specialized pediatric certification or fellowship, and 12 (5.7%) had no prior experience before starting in the NICU.
(Table 1) Of 163 respondents, 83 (50.9%) reported receiving financial support or reimbursement for additional NICU training. Respondents who felt valued as team members planned to stay in the NICU RD role for more than 5 years (p = 0.0046). Additionally, they reported having acknowledgement and appreciation (64.4%), motivation (54.1%), and opportunities for advancement (22.9%) (Table 2).</p><p><b>Conclusion:</b> NICU RDNs do not have a clear competency roadmap or a career development track. In addition, financial support or reimbursement for continuing education is not consistently an employee benefit, which may play a key role in job satisfaction and retention. These data provide valuable insight not only for managers of dietitians but also for professional societies to build programs and retention opportunities.</p><p><b>Table 1.</b> Question: When You Started as a NICU RD, What Experience Did You Have? (n = 210).</p><p></p><p>N and percentages will total more than 210 as respondents could check multiple answers.</p><p><b>Table 2.</b> Comparison of Questions: Do You Feel You Have the Following in Your Role in the NICU and How Long Do You Plan to Stay in Your Role?</p><p></p><p>Sivan Kinberg, MD<sup>1</sup>; Christine Hoyer, RD<sup>2</sup>; Everardo Perez Montoya, RD<sup>2</sup>; June Chang, MA<sup>2</sup>; Elizabeth Berg, MD<sup>2</sup>; Jyneva Pickel, DNP<sup>2</sup></p><p><sup>1</sup>Columbia University Irving Medical Center, New York, NY; <sup>2</sup>Columbia University Medical Center, New York, NY</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Patients with short bowel syndrome (SBS) can have significant fat malabsorption due to decreased intestinal surface area, bile acid deficiency, and rapid transit time, often leading to feeding intolerance and dependence on parenteral nutrition (PN).
Patients with SBS can have deficiency of pancreatic enzymes and/or reduced effectiveness of available pancreatic enzymes, resulting in symptoms of exocrine pancreatic insufficiency (EPI), including weight loss, poor weight gain, abdominal pain, diarrhea, and fat-soluble vitamin deficiencies. Oral pancreatic enzyme replacement therapy (PERT) is often tried but is impractical for use with enteral nutrition (EN) due to inconsistent enzyme delivery and risk of clogged feeding tubes. When used with continuous EN, oral PERT provides inadequate enzyme delivery, as its ability to hydrolyze fats decreases significantly 30 minutes after ingestion. The significant need for therapies to improve enteral absorption in this population has led to interest in using an in-line digestive cartridge to treat EPI symptoms in SBS patients on tube feedings. The immobilized lipase cartridge is an FDA-approved in-line digestive cartridge designed to hydrolyze fat in EN before it reaches the gastrointestinal tract, allowing for the delivery of absorbable fats through continuous or bolus feeds. In patients with SBS, this modality may be more effective in improving enteral fat absorption than oral preparations of PERT. Preclinical studies have demonstrated that use of an in-line digestive cartridge in a porcine SBS model increased fat-soluble vitamin absorption, reduced PN dependence, and improved intestinal adaptation. Our study aims to evaluate changes in PN, EN, growth parameters, stool output, and fat-soluble vitamin levels in pediatric SBS patients using an in-line digestive cartridge at our center.</p><p><b>Methods:</b> Single-center retrospective study in pediatric patients with SBS on EN who used an in-line immobilized lipase (RELiZORB) cartridge.
Data collection included patient demographics, etiology of SBS, surgical history, PN characteristics (calories, volume, infusion hours/days), EN characteristics (tube type, bolus or continuous feeds, formula, calories, volume, hours), immobilized lipase cartridge use (number of cartridges/day, duration), anthropometrics, stool output, gastrointestinal symptoms, medications (including previous PERT use), laboratory assessments (fat-soluble vitamin levels, fatty acid panels, pancreatic elastase), indication to start the immobilized lipase cartridge, and any reported side effects. Patients with small intestinal transplant or cystic fibrosis were excluded.</p><p><b>Results:</b> Eleven patients were included in the study (mean age 10.4 years, 55% female). The most common etiology of SBS was necrotizing enterocolitis (45%), and 7 (64%) patients were dependent on PN. The interim analysis showed a mean duration of immobilized lipase cartridge use of 3.9 months, a PN calorie decrease in 43% of patients, weight gain in 100% of patients, and improvement in stool output in 6/9 (67%) patients. Clogging of the cartridges was the most common reported technical difficulty (33%), which was overcome with better mixing of the formula. No adverse events or side effects were reported.</p><p><b>Conclusion:</b> In this single-center study, use of an in-line immobilized lipase digestive cartridge in pediatric patients with SBS demonstrated promising outcomes, including weight gain, improved stool output, and reduced dependence on PN. These findings suggest that in-line digestive cartridges may play a role in improving fat malabsorption and decreasing PN dependence in pediatric SBS patients.
Larger multicenter studies are needed to further evaluate the efficacy, tolerability, and safety of in-line digestive cartridges in this population.</p><p>Vikram Raghu, MD, MS<sup>1</sup>; Feras Alissa, MD<sup>2</sup>; Simon Horslen, MB ChB<sup>3</sup>; Jeffrey Rudolph, MD<sup>2</sup></p><p><sup>1</sup>University of Pittsburgh School of Medicine, Gibsonia, PA; <sup>2</sup>UPMC Children's Hospital of Pittsburgh, Pittsburgh, PA; <sup>3</sup>University of Pittsburgh School of Medicine, Pittsburgh, PA</p><p><b>Financial Support:</b> National Center for Advancing Translational Sciences (KL2TR001856).</p><p><b>Background:</b> Administrative databases can provide unique perspectives in rare disease due to the ability to query multicenter data efficiently. Pediatric intestinal failure can be challenging to study due to the rarity of the condition at single centers and the previous lack of a single diagnosis code. In October 2023, a new diagnosis code for intestinal failure was added to the International Classification of Diseases, 10<sup>th</sup> revision (ICD-10). We aimed to describe the usage and limitations of this code to identify children with intestinal failure.</p><p><b>Methods:</b> We performed a multicenter cross-sectional study using the Pediatric Health Information Systems database from October 1, 2023, to June 30, 2024. Children with intestinal failure were identified by ICD-10 code (K90.83). Descriptive statistics were used to characterize demographics, diagnoses, utilization, and outcomes.</p><p><b>Results:</b> We identified 1804 inpatient encounters from 849 unique patients with a diagnosis code of intestinal failure. Figure 1 shows the trend of code use by month since its inception in October 2023. The 849 patients had a total of 7085 inpatient encounters over that timeframe, meaning only 25% of their encounters included the intestinal failure diagnosis.
Among these 849 patients, 638 had at least one encounter over the timeframe in which they received parenteral nutrition; for 400, this corresponded to an admission in which they also had an intestinal failure diagnosis code. Examining only inpatient stays longer than 2 days, 592/701 (84%) patients with such a stay received parenteral nutrition. Central line-associated bloodstream infections accounted for 501 encounters. Patients spent a median of 37 days (IQR 13-96) in the hospital. Patients were predominantly non-Hispanic White (43.7%), non-Hispanic Black (14.8%), or Hispanic (27.9%). Most had government-based insurance (63.5%). Child Opportunity Index was evenly split among all five quintiles. The standardized cost from all encounters with an intestinal failure diagnosis totaled $157 million, and the total from all encounters with these patients was $259 million. The median cost over those 9 months per patient was $104,890 (IQR $31,149 - $315,167). Death occurred in 28 patients (3.3%) over the study period.</p><p><b>Conclusion:</b> The diagnosis code of intestinal failure has been used inconsistently since implementation in October 2023, perhaps due to varying definitions of intestinal failure. Children with intestinal failure experience high inpatient stay costs and rare but significant mortality.
Future work must consider the limitations of using only the new code in identifying these patients.</p><p></p><p><b>Figure 1.</b> Number of Encounters With an Intestinal Failure Diagnosis Code.</p><p><b>Poster of Distinction</b></p><p>Kera McNelis, MD, MS<sup>1</sup>; Allison Ta, MD<sup>2</sup>; Ting Ting Fu, MD<sup>2</sup></p><p><sup>1</sup>Emory University, Atlanta, GA; <sup>2</sup>Cincinnati Children's Hospital Medical Center, Cincinnati, OH</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> 6th Annual Pediatric Early Career Research Conference, August 27, 2024, Health Sciences Research Building I, Emory University.</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> The neonatal period is a time of rapid growth, and many infants who require intensive care need extra nutritional support. The Academy of Nutrition and Dietetics (AND) published an expert consensus statement to establish criteria for the identification of neonatal malnutrition. There is limited evidence regarding outcomes associated with diagnosis from these opinion-derived criteria. The objective of this study was to compare anthropometric-based malnutrition indicators with direct body composition measurements in infancy.</p><p><b>Methods:</b> Air displacement plethysmography is considered the gold standard for non-invasive body composition measurement, and this was incorporated into routine clinical care at a referral Level IV neonatal intensive care unit. Late preterm and term infants (34-42 weeks gestational age) with body composition measurement available were included in this study. Infants were categorized as having malnutrition per AND criteria. Fat mass, fat-free mass, and body fat percentage z-scores were determined per Norris body composition growth curves. Logistic regression was conducted to ascertain the relationship of fat mass, fat-free mass, and body fat percentage with malnutrition diagnosis. 
Linear regression was performed to predict body mass index (BMI) at age 18-24 months from each body composition variable.</p><p><b>Results:</b> Eighty-four infants were included, with 39% female and 96% singleton (Table 1). Fifteen percent were small for gestational age and 12% were large for gestational age at birth. Nearly half had a congenital intestinal anomaly, including gastroschisis and intestinal atresia. Sixty-three percent of the group met at least one malnutrition criterion. Fat-free mass z-score was negatively associated with a malnutrition diagnosis, with an odds ratio 0.77 (95% CI 0.59-0.99, p < 0.05). There was not a statistically significant association between malnutrition diagnosis and body fat percentage or fat mass. There was not a statistically significant relationship between any body composition variable and BMI at 18-24 months, even after removing outliers with a high Cook's distance.</p><p><b>Conclusion:</b> Malnutrition diagnosis is associated with low fat-free mass in critically ill term infants. Body composition is not a predictor of later BMI in this small study.</p><p><b>Table 1.</b> Characteristics of Late Preterm and Term Infants in the Neonatal Intensive Care Unit With a Body Composition Measurement Performed Via Air Displacement Plethysmography.</p><p></p><p>John Stutts, MD, MPH<sup>1</sup>; Yong Choe, MAS<sup>1</sup></p><p><sup>1</sup>Abbott, Columbus, OH</p><p><b>Financial Support:</b> Abbott.</p><p><b>Background:</b> The prevalence of obesity in children is rising. Despite the awareness and work toward weight reduction, less is known about malnutrition in children with obesity. The purpose of this study was to evaluate the prevalence of obesity in U.S. 
children and determine which combination of indicators best defines malnutrition in this population.</p><p><b>Methods:</b> The 2017-2018 National Health and Nutrition Examination Survey (NHANES) database was utilized to assess biomarkers (the most recent complete dataset due to the Covid-19 pandemic). Trends in prevalence were obtained from 2013-2018 survey data. Obesity was defined as ≥ 95th percentile of the CDC sex-specific BMI-for-age growth charts. Cohort age range was 12-18 years. Nutrient intake and serum levels were analyzed for vitamin D, vitamin E, vitamin C, potassium, calcium, vitamin B9, vitamin A, total protein, albumin, globulin, high sensitivity C-reactive protein (hs-CRP), iron, hemoglobin, and mean corpuscular volume (MCV). Intake levels of fiber were also analyzed. Children consuming supplements were excluded. Categorical and continuous data analysis was performed using SAS® Proc SURVEYFREQ (with Wald chi-square test) and Proc SURVEYMEANS (with t-test), respectively, in SAS® Version 9.4 and SAS® Enterprise Guide Version 8.3. Hypothesis tests were performed using 2-sided, 0.05 level tests. Results were reported as mean ± standard error (SE) (n = survey sample size) or percent ± SE (n).</p><p><b>Results:</b> The prevalence of obesity in the cohort was 21.3% ± 1.3 (993). Prevalence trended upward annually: 20.3% ± 2.1 (1232) in 2013-2014 and 20.5% ± 2.0 (1129) in 2015-2016. When compared with children without obesity, the mean serum levels of those with obesity were significantly (p ≤ 0.05) lower for vitamin D (50.3 ± 2.4 vs. 61.5 ± 2.1, p < 0.001), iron (14.4 ± 0.5 vs. 16.4 ± 0.4, p < 0.001), albumin (41.7 ± 0.3 vs. 43.3 ± 0.3, p < 0.001), and MCV (83.8 ± 0.5 vs. 85.8 ± 0.3, p = 0.003). When compared with children without obesity, the mean serum levels of those with obesity were significantly (p ≤ 0.05) higher for total protein (73.2 ± 0.4 vs. 72.0 ± 0.3, p = 0.002), globulin (31.5 ± 0.4 vs. 28.7 ± 0.3, p < 0.001) and hs-CRP (3.5 ± 0.3 vs. 1.2 ± 0.2, p < 0.001).
A higher prevalence of insufficiency was found for vitamin D (51.9% ± 5.6 vs. 26.8% ± 3.7, p = 0.001), hemoglobin (16.3% ± 3.1 vs. 7.5% ± 1.8, p = 0.034), and the combination of low hemoglobin + low MCV (11.2% ± 2.9 vs. 3.3% ± 1.0, p = 0.049). All other serum levels were not significantly (p > 0.05) different, with no significant difference in intake.</p><p><b>Conclusion:</b> Results indicate a continued increase in the prevalence of obesity in children. Compared with the non-obese pediatric population, children with obesity showed differences in micro- and macronutrient serum levels, with no significant differences in dietary intake of these nutrients. The higher prevalence of low hemoglobin + low MCV supports iron deficiency and adds clinical relevance to the data surrounding low mean blood levels of iron. Children with obesity show higher mean globulin and hs-CRP levels consistent with an inflammatory state. The results underscore the existence of malnutrition in children with obesity and the need for nutrition awareness in this pediatric population.</p><p>Elisha London, BS, RD<sup>1</sup>; Derek Miketinas, PhD, RD<sup>2</sup>; Ariana Bailey, PhD, MS<sup>3</sup>; Thomas Houslay, PhD<sup>4</sup>; Fabiola Gutierrez-Orozco, PhD<sup>1</sup>; Tonya Bender, MS, PMP<sup>5</sup>; Ashley Patterson, PhD<sup>1</sup></p><p><sup>1</sup>Reckitt/Mead Johnson, Evansville, IN; <sup>2</sup>Data Minded Consulting, LLC, Houston, TX; <sup>3</sup>Reckitt/Mead Johnson Nutrition, Henderson, KY; <sup>4</sup>Reckitt/Mead Johnson Nutrition, Manchester, England; <sup>5</sup>Reckitt/Mead Johnson Nutrition, Newburgh, IN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> The objective was to examine whether nutrient intake varied across malnutrition classification among a nationally representative sample of children and adolescents.</p><p><b>Methods:</b> This was a secondary analysis of children and adolescents 1-18 y who participated in the National Health and Nutrition
Examination Survey 2001-March 2020. Participants were excluded if they were pregnant or did not provide at least one reliable dietary recall. The degree of malnutrition risk was assessed by weight-for-height and BMI-for-age Z-scores for 1-2 y and 3-18 y, respectively. As per the Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition, malnutrition was classified using Z-scores: none (Z > -1), mild (Z between -1 and -1.9), and moderate/severe (Z ≤ -2). Dietary intake was assessed from one to two 24-hr dietary recalls. Usual intakes of macronutrients from foods and beverages, and of micronutrients from foods, beverages, and supplements were assessed using the National Cancer Institute method. The cut-point approach was used to estimate the proportion of children with intake below the estimated average requirement for micronutrients and outside the acceptable macronutrient distribution range for macronutrients. All analyses were adjusted for sampling methodology and the appropriate sample weights were applied. Independent samples t-tests were conducted to compare estimates within age groups between nutrition status classifications, with no malnutrition as the reference group.</p><p><b>Results:</b> A total of 32,188 participants were analyzed. Of those, 31,689 (98.4%) provided anthropometrics. The majority (91%) did not meet criteria for malnutrition, while 7.4% and 1.6% met criteria for mild and moderate/severe malnutrition, respectively. Dietary supplement use (mean[SE]) was reported among 33.3[0.7]% of all children/adolescents. Of those experiencing moderate/severe malnutrition, inadequate calcium intake was greatest in adolescents 14-18 y (70.6[3.3]%), and older children 9-13 y (69.7[3.4]%), compared to 3-8 y (29.5[3.9]%) and 1-2 y (5.0[1.2]%). 
In children 9-13 y, risk for inadequate calcium intake was greater among those experiencing mild (65.3[2.1]%) and moderate/severe malnutrition (69.7[2.1]%) compared to those not experiencing malnutrition (61.2[1.1]%). Similarly, in those experiencing moderate/severe malnutrition, inadequate zinc and phosphorus intake was greatest in adolescents 14-18 y (20.9[3.3]% and 30.6[3.4]%, respectively) and 9-13 y (13.4[2.6]% and 33.9[3.6]%, respectively) compared to younger children (range 0.2[0.1]-0.9[0.4]% and 0.2[0.1]-0.4[0.2]% in 1-8 y, respectively). Although the greatest risk for inadequate protein intake was observed among those experiencing moderate/severe malnutrition, percent energy intake from protein and carbohydrate was adequate for most children and adolescents. Total and saturated fat were consumed in excess by all age groups regardless of nutrition status. The greatest risk for excessive saturated fat intake was reported in those experiencing moderate/severe malnutrition (range across all age groups: 85.9-95.1%).</p><p><b>Conclusion:</b> Older children and adolescents experiencing malnutrition were at greatest risk for inadequate intakes of certain micronutrients such as calcium, zinc, and phosphorus. These results may indicate poor diet quality among those at greatest risk for malnutrition, especially adolescents.</p><p>Anna Benson, DO<sup>1</sup>; Louis Martin, PhD<sup>2</sup>; Katie Huff, MD, MS<sup>2</sup></p><p><sup>1</sup>Indiana University School of Medicine, Carmel, IN; <sup>2</sup>Indiana University School of Medicine, Indianapolis, IN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Trace metals are essential for growth and are especially important in the neonatal period. Recommendations are available for intake of these trace metals in the neonate. However, these recommendations are based on limited data and there are few available descriptions regarding trace metal levels in neonates and their influence on outcomes. 
In addition, monitoring trace metal levels can be difficult, as multiple factors, including inflammation, can affect accuracy. The goal of this project was to evaluate patient serum levels of zinc, selenium, and copper and related outcomes, including growth, rate of cholestasis, sepsis, bronchopulmonary dysplasia (BPD), and death, in a parenteral nutrition-dependent cohort admitted to the neonatal intensive care unit (NICU).</p><p><b>Methods:</b> We completed a retrospective chart review of NICU patients who received parenteral nutrition (PN) and had trace metal panels drawn between January 2016 and February 2023. Charts were reviewed for baseline labs, time on PN, trace metal panel levels, dose of trace metals in PN, enteral feeds and supplements, and outcomes including morbidities and mortality. Sepsis was diagnosed based on a positive blood culture and cholestasis as a direct bilirubin > 2 mg/dL. Fisher's Exact test or Chi-square test was used to assess associations between categorical variables. Spearman correlation was used to assess the correlation between two continuous variables. A p-value of 0.05 was used for significance.</p><p><b>Results:</b> We included 98 patients in the study, with demographic data shown in Table 1. The numbers of patients with trace metal elevation and deficiency are noted in Tables 1 and 2, respectively. The correlation between growth and trace metal levels is shown in Figure 1. Patient outcomes related to the diagnosis of trace metal deficiency are noted in Table 2. Copper deficiency was found to be significantly associated with sepsis (p = 0.010) and BPD (p = 0.033). Selenium deficiency was associated with cholestasis (p = 0.001) and BPD (p = 0.003). To further assess the relationship between selenium and cholestasis, Spearman correlation was used and showed a significant negative correlation between selenium levels and direct bilirubin levels (p = 0.002; Figure 2).</p><p><b>Conclusion:</b> Trace metal deficiency was common in our population.
In addition, selenium and copper deficiency were associated with neonatal morbidities including sepsis, cholestasis, and BPD. When assessing selenium deficiency and cholestasis, the measured selenium level was found to correlate with direct bilirubin level. While there was correlation between trace metal levels and growth, the negative association noted is unclear and highlights the need for further assessment to determine the influence of other patient factors and the technique used for growth measurement. Overall, this project highlights the important relationship between trace metals and neonatal morbidities. Further research is needed to understand how trace metal supplementation might be used to optimize neonatal outcomes in the future.</p><p><b>Table 1.</b> Patient Demographic and Outcome Information for Entire Population Having Trace Metal Levels Obtained. (Total population n = 98 unless noted).</p><p></p><p>Patient demographic and outcome information for entire population having trace metal levels obtained.</p><p><b>Table 2.</b> Rate of Trace Metal Deficiency and Association With Patient Outcomes.</p><p>(Total n = 98).</p><p></p><p>Rate of trace metal deficiency and association with patient outcomes.</p><p></p><p>Scatter plot of average trace metal level and change in growth over time (growth represented as change in parameter over days). The Spearman correlation coefficient is listed for each graph, with significance noted by symbol: *p-value < 0.05, †p-value < 0.01, ‡p-value < 0.001.</p><p><b>Figure 1.</b> Correlation of Trace Metal Level and Growth.</p><p></p><p>Scatter plot of individual direct bilirubin levels plotted by selenium levels.
The Spearman correlation coefficient showed a negative correlation (p = 0.002).</p><p><b>Figure 2.</b> Correlation of Selenium Level With Direct Bilirubin Level.</p><p>Kaitlin Berris, RD, PhD (student)<sup>1</sup>; Qian Zhang, MPH<sup>2</sup>; Jennifer Ying, BA<sup>3</sup>; Tanvir Jassal, BSc<sup>3</sup>; Rajavel Elango, PhD<sup>4</sup></p><p><sup>1</sup>BC Children's Hospital, North Vancouver, BC; <sup>2</sup>BCCHR, Vancouver, BC; <sup>3</sup>University of British Columbia, Vancouver, BC; <sup>4</sup>UBC/BCCHR, Vancouver, BC</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Pediatric critical illness increases demand for several nutrients. Children admitted who require nutrition support receive a nasogastric tube to deliver enteral nutrition (EN) formula as liquid nutrition. These formulas are developed using dietary reference intakes (DRI) for healthy populations and do not account for altered nutrient needs in critical illness. Imbalanced nutrient delivery during this vulnerable time could lead to malnutrition and negatively impact hospital outcomes. Guidelines published by the American Society for Parenteral and Enteral Nutrition (ASPEN) in 2017 aim to alleviate poor nutrition: use concentrated formula during fluid restriction, meet 2/3 of estimated calories (resting energy expenditure, REE) by the end of the first week, provide a minimum of 1.5 g/kg/day dietary protein, and provide the DRI for each micronutrient. The objective of this retrospective cohort study was to evaluate nutrition delivery (prescribed vs. delivered) against the 2017 guidelines and correlate it with adequacy of intake in children admitted to a Canadian PICU.</p><p><b>Methods:</b> Three years of charts were included over two retrospective cohorts: September 2018-December 2020 and February 2022-March 2023. The first cohort, based on paper charts, included children 1-18 y with tube feeding started within 3 d after admission.
The second cohort, after the transition to electronic medical records, included children 1-6 y on exclusive tube feeding during the first week of admission. Patient characteristics, daily formula type, rate prescribed (physician order), amount delivered (nursing notes), and interruption hours and reasons were collected. Statistical analysis included descriptive analysis of characteristics and logistic regression for odds of achieving adequacy of intake with two exposures: age categories and formula type. Pearson correlation was used to interpret interruption hours against percentage of calories met.</p><p><b>Results:</b> The included patients (n = 86) spanned 458 nutrition support days (NSD). Admissions were predominantly for respiratory disease (73%), requiring ventilation support (81.4%). Calorie prescription (WHO REE equation) was met in 20.3% of NSD, and 43.9% met 2/3 of the calorie recommendation (Table 1). Concentrated calories were provided in 34% of patients. Hours of interruptions vs. percentage of goal calories met was negatively correlated (r = -.52, p = .002) among those ordered EN without prior EN history (i.e., home tube fed). Patients with more than 4 h of interruptions were more likely not to meet the 2/3 calorie goal. Calorie goals were met when standard pediatric formula was used in children 1-8 y and concentrated adult formula in 9-18 y. Odds of meeting the calorie goal increased by 85% per 1-day increase (OR 1.85 [1.52, 2.26], p < .0001), with a median of 4 d after admission to meet 2/3 of calories. Minimum protein intake (1.5 g/kg/d) was met in only 24.9% of all NSD. Micronutrients examined, except for vitamin D, met the DRI based on prescribed amounts. Delivered amounts provided suboptimal micronutrient intake, especially for vitamin D (Figure 1).</p><p><b>Conclusion:</b> Current ASPEN recommendations for EN were not being achieved in critically ill children. Concentrated formula was often not chosen, which decreased the ability to meet calorie goals in younger patients.
Prescribing a shorter continuous EN duration (20/24 h) may improve odds of meeting calorie targets. Evaluation of NSD showed an improving trend in calorie intake over the first week toward meeting the 2/3 goal recommendation. However, the results highlight the inadequacy of protein intake even as calorie needs are increasingly met. Attention to micronutrient delivery, with supplementation of vitamin D, is required.</p><p><b>Table 1.</b> Enteral Nutrition Characteristics Per Nutrition Support Days. (N = 458)</p><p></p><p>Estimated Vitamin D Intake and 95% Confidence Intervals by Age and Formula Groups.</p><p><b>Figure 1.</b> Estimated Vitamin D Intake by Age and Formula Groups.</p><p>Dana Steien, MD<sup>1</sup>; Megan Thorvilson, MD<sup>1</sup>; Erin Alexander, MD<sup>1</sup>; Molissa Hager, NP<sup>1</sup>; Andrea Armellino, RDN<sup>1</sup></p><p><sup>1</sup>Mayo Clinic, Rochester, MN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Home parenteral nutrition (HPN) is a life-sustaining therapy for children with long-term digestive dysfunction. Historically, HPN has been considered a bridge to enteral autonomy or intestinal transplantation. However, thanks to medical and management improvements, HPN is now used for a variety of diagnoses, including intractable feeding intolerance (IFI) in children with severe neurological impairment (SNI). IFI occurs most often near the end of life (EOL) in patients with SNI. Thus, outpatient planning and preparation for HPN in this population differ vastly from historical HPN use.</p><p><b>Methods:</b> Case series of four pediatric patients with SNI who developed IFI and utilized HPN during their EOL care. Data was collected by retrospective chart review. The hospital pediatric palliative care service was heavily involved in the patients’ care when HPN was discussed and planned.
The pediatric intestinal rehabilitation (PIR) and palliative care teams worked closely together during discharge planning and throughout the outpatient courses.</p><p><b>Results:</b> The children with SNI in this case series developed IFI between ages 1 and 12 years. Duration of HPN use varied from 5 weeks to 2 years. All patients were enrolled in hospice, but at various stages. Routine outpatient HPN management plans and expectations were modified based on each family's goals of EOL care. The use and timing of laboratory studies, fever plans, central line issues, growth, and follow-up appointments required detailed discussion and planning.</p><p><b>Conclusion:</b> EOL care for children differs from most EOL care in adults. Providing HPN to children with SNI and IFI can provide time, opportunities, and peace for families during their child's EOL journey, if it aligns with their EOL goals. PIR teams can provide valuable HPN expertise for palliative care services and families during these challenging times.</p><p>Jessica Lowe, DCN, MPH, RDN<sup>1</sup>; Carolyn Ricciardi, MS, RD<sup>2</sup>; Melissa Blandford, MS, RD<sup>3</sup></p><p><sup>1</sup>Nutricia North America, Roseville, CA; <sup>2</sup>Nutricia North America, Rockville, MD; <sup>3</sup>Nutricia North America, Greenville, NC</p><p><b>Financial Support:</b> This study was conducted by Nutricia North America.</p><p><b>Background:</b> Extensively hydrolyzed formulas (eHFs) are indicated for the management of cow milk allergy (CMA) and related symptoms. This category of formulas is often associated with an unpleasant smell and bitter taste; however, whey-based eHFs are considered more palatable than casein-based eHFs.<sup>1-4</sup> The inclusion of lactose, the primary carbohydrate in human milk, in hypoallergenic formula can also promote palatability. Historically, concerns about residual protein traces in lactose have resulted in complete avoidance of lactose in CMA.
However, “adverse reactions to lactose in CMA are not supported in the literature, and complete avoidance of lactose in CMA is not warranted.”<sup>5</sup> Clinicians in the United Kingdom previously reported that taste and acceptance of an eHF is an important consideration when prescribing, as they believe a palatable eHF may result in decreased formula refusal and lead to more content families.<sup>1</sup> The objective of this study was to understand caregiver sensory perspectives on an infant whey-based eHF containing lactose.</p><p><b>Methods:</b> Fifteen clinical sites were recruited from across the United States. Clinicians enrolled 132 infants, whose families received the whey-based eHF for 2 weeks, based on clinician recommendation. Caregivers completed two surveys: an enrollment survey and a 2-week follow-up survey characterizing eHF intake, CMA-related symptoms, stooling patterns, sensory perspectives, and satisfaction with the eHF. Data was analyzed using SPSS 27 and descriptive statistics.</p><p><b>Results:</b> One hundred and twenty-two infants completed the study. At enrollment, infants were 22 (±14.7) weeks old. Prior to study initiation, 12.3% of infants were breastfed, 40.2% were on a casein-based eHF, and 18.9% were on a standard formula with intact proteins. Most patients (97.5%) were fed orally and 2.5% were tube fed. Among all parents who responded, 92.5% (n = 86/93) reported better taste and 88.9% (n = 96/108) reported better smell for the whey-based formula containing lactose compared to the previous formula. For caregivers whose child was on a casein-based eHF at enrollment and responded, 97.6% (n = 41/42) reported better taste and 95.7% (n = 44/46) reported better smell than the previous formula. Additional caregiver-reported perceptions of taste and smell are reported in Figure 1 and Figure 2, respectively. 
Finally, 89.3% of caregivers said it was easy to start their child on the whey-based eHF containing lactose and 91.8% would recommend it to other caregivers whose child requires a hypoallergenic formula.</p><p><b>Conclusion:</b> The majority of caregivers had a positive sensory experience with the whey-based eHF containing lactose compared to the baseline formulas. Additionally, they found the trial formula easy to transition to and would recommend it to other families. These data support the findings of Maslin et al. and the clinicians' expectation that good palatability would result in better acceptance and more content infants and families.<sup>1</sup> Further research is needed to better understand how improved palatability can contribute to decreased waste and health care costs.</p><p></p><p><b>Figure 1.</b> Caregiver Ranking: Taste of Whey-Based, Lactose-Containing eHF.</p><p></p><p><b>Figure 2.</b> Caregiver Ranking: Smell of Whey-Based, Lactose-Containing eHF.</p><p>Michele DiCarlo, PharmD<sup>1</sup>; Emily Barlow, PharmD, BCPPS<sup>1</sup>; Laura Dinnes, PharmD, BCIDP<sup>1</sup></p><p><sup>1</sup>Mayo Clinic, Rochester, MN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> There is limited information on hyperkalemia in adult patients who received trimethoprim-sulfamethoxazole (TMP-SMX). The mechanism for hyperkalemia is related to the TMP component, which is structurally related to the potassium-sparing diuretic amiloride. In children, there is no information on clinical impact or monitoring required. We noted that a pediatric patient on total parenteral nutrition (TPN) had a drop in TPN potassium dosing once TMP-SMX was started. This reduction persisted for two weeks following the last dose of the antibiotic. Case presentation: A 7-month-old was in a cardiac intensive care unit following complex surgical procedures and a requirement for extracorporeal membrane oxygenation. 
TPN was started due to concerns related to poor perfusion and possible necrotizing enterocolitis. TPN continued for a total of one hundred and ten days. Per protocol while on TPN, electrolytes and renal function (urine output, serum creatinine) were monitored daily. Diuretic therapy, including loop diuretics, chlorothiazide and spironolactone, was prescribed prior to TPN and continued for the entire duration. Renal function remained stable for the duration of TPN therapy. Dosing of potassium in the TPN was initiated per ASPEN guidelines and adjusted for serum potassium levels. Due to respiratory requirements and positive cultures, TMP-SMX was added to the medication regimen on two separate occasions. TMP-SMX 15 mg/kg/day was ordered twelve days after the start of the TPN and continued for three days. TMP-SMX 15 mg/kg/day was again started on day forty-three of TPN and continued for a five-day duration. Serum potassium was closely monitored for adjustments once TMP-SMX was started. When the TPN was successfully weaned off, we reviewed this information again. TPN potassium dosing dropped markedly by day two of both TMP-SMX regimens and did not return to the prior stable dosing until approximately two weeks after the last dose of the antibiotic. This reduction lasted far beyond the projected half-life of TMP-SMX (Figure 1). Discussion: TMP-SMX is known to cause hyperkalemia in adult patients with multiple confounding factors. Factors include high dosage, renal dysfunction, congestive heart failure, and concomitant medications known to cause hyperkalemia. Little literature exists to note this side effect in pediatrics. The onset of our patient's increased serum potassium levels, and the concurrent decrease in TPN potassium dosing, could be expected, as TMP-SMX's time to peak effect is 1-4 hours. Half-life of TMP in children < 2 years old is 5.9 hours. 
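The thirty-hour clearance estimate that follows from this half-life is simple first-order arithmetic; a minimal sketch, assuming first-order elimination and treating roughly five half-lives as "cleared" (a common rule of thumb, not a figure stated in this report):

```python
# First-order elimination arithmetic for TMP in children < 2 years old,
# using the 5.9 h half-life cited above.
half_life_h = 5.9
n_half_lives = 5                               # assumed "cleared" threshold
time_to_clear_h = half_life_h * n_half_lives   # 29.5 h, i.e. roughly thirty hours
fraction_remaining = 0.5 ** n_half_lives       # about 3% of the peak level remains
print(f"{time_to_clear_h:.1f} h, {fraction_remaining:.1%} remaining")
```

This is what makes the two-week delay notable: the drug itself should be gone within about a day and a half, so the persistence of the potassium effect points to other factors.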
Given this information, one would expect TMP-SMX to be cleared approximately thirty hours (about five half-lives) from the last dose administered. Our patient's potassium dosing took approximately two weeks from the end of TMP-SMX administration to return to the pre-TMP-SMX potassium dosing for both treatment regimens. Potential causes for the extended time to stabilize include concurrent high-dose TMP-SMX and continuation of the potassium-sparing diuretic. Prolonged potassium monitoring for pediatric patients started on high-dose TMP-SMX while on TPN should be considered and further evaluation explored.</p><p><b>Methods:</b> None Reported.</p><p><b>Results:</b> None Reported.</p><p><b>Conclusion:</b> None Reported.</p><p></p><p>Graph representing TPN potassium dose (in mEq/kg/day) and addition of the TMP-SMX regimen on two separate occasions. Note the drop in TPN potassium dose and delayed return after each TMP-SMX regimen.</p><p><b>Figure 1.</b> TPN Potassium Dose and TMP-SMX Addition.</p><p>Jennifer Smith, MS, RD, CSP, LD, LMT<sup>1</sup>; Praveen Goday, MBBS<sup>2</sup>; Lauren Storch, MS, RD, CSP, LD<sup>2</sup>; Kirsten Jones, RD, CSP, LD<sup>2</sup>; Hannah Huey, MDN<sup>2</sup>; Hilary Michel, MD<sup>2</sup></p><p><sup>1</sup>Nationwide Children's Hospital, Dresden, OH; <sup>2</sup>Nationwide Children's Hospital, Columbus, OH</p><p><b>Financial Support:</b> North American Society for Pediatric Gastroenterology, Hepatology, and Nutrition (NASPGHAN) Foundation.</p><p><b>Background:</b> The prevalence of Avoidant/Restrictive Food Intake Disorder (ARFID) is approximately 3% in the general population and 10% in adults with inflammatory bowel disease (IBD). 
Up to 10% of adolescents with IBD have reported disordered eating behaviors; however, there have been no prospective studies on the prevalence of ARFID eating behaviors in this population.</p><p><b>Methods:</b> This is a one-time, cross-sectional, non-consecutive study of English-speaking patients with a confirmed diagnosis of IBD, aged 12-18 years. Participants completed the validated Nine Item ARFID Screen (NIAS), the SCOFF eating disorder screen (this screen utilizes an acronym [<span>S</span>ick, <span>C</span>ontrol, <span>O</span>ne, <span>F</span>at, and <span>F</span>ood] in relation to the five questions on the screen) and answered one question about perceived food intolerances. The NIAS is organized into the three specific ARFID domains: eating restriction due to picky eating, poor appetite/limited interest in eating, and fear of negative consequences from eating, each of which is addressed by three questions. Questions are based on a 6-point Likert scale. Participants scoring ≥23 on the total scale or ≥12 on an individual subscale were considered to meet criteria for ARFID eating behaviors. Since some individuals with a positive NIAS screen may have anorexia nervosa or bulimia, we also used the SCOFF questionnaire to assess the possible presence of an eating disorder. A score of two or more positive answers has a sensitivity of 100% and specificity of 90% for anorexia nervosa or bulimia. Apart from descriptive statistics, Chi-square testing was used to study the prevalence of malnutrition, positive SCOFF screening, or food intolerances in patients with and without positive NIAS screens.</p><p><b>Results:</b> We enrolled 82 patients whose demographics are shown in Table 1. Twenty percent (16/82) scored positive on the NIAS questionnaire, 16% (13/82) scored positive on the SCOFF questionnaire, and 48% (39/82) noted food intolerances. 
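The NIAS positivity rule described in the Methods (total ≥23, or ≥12 on any of the three subscales, each subscale being the sum of three 0-5 Likert items) can be expressed as a small decision rule. An illustrative sketch, using hypothetical scores rather than participant data:

```python
def nias_positive(picky, appetite, fear):
    """NIAS screening rule as described in the Methods: each subscale is the
    sum of three items scored 0-5 (range 0-15; total range 0-45). The screen
    is positive if the total is >= 23 or any single subscale is >= 12."""
    subscales = (picky, appetite, fear)
    return sum(subscales) >= 23 or any(s >= 12 for s in subscales)

# Hypothetical scores for illustration (not study data):
print(nias_positive(12, 4, 3))   # True: one subscale meets the >= 12 cutoff
print(nias_positive(8, 8, 8))    # True: total of 24 meets the >= 23 cutoff
print(nias_positive(5, 6, 7))    # False: total 18, no subscale >= 12
```

The subscale cutoff matters because a participant can screen positive from a single domain (e.g., fear of negative consequences) while the total stays well below 23.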
Of the 16 participants who scored positive on the NIAS, 50% (8/16) were male, 56% (9/16) had a diagnosis of Crohn's Disease, and 69% (11/16) had inactive disease. Twenty-five percent of those with a positive NIAS (4/16) met criteria for malnutrition (1 mild, 2 moderate, and 1 severe). Sixty-nine percent of those who scored positive on the NIAS (11/16) noted food intolerances and 30% (5/16) had a positive SCOFF screener. The prevalence of malnutrition (p = 0.4), positive SCOFF screens (p = 0.3), and reported food intolerances (p = 0.6) was similar in participants who scored positive on the NIAS vs. those who did not.</p><p><b>Conclusion:</b> Using the NIAS, 20% of adolescents with IBD met criteria for ARFID. Participants who met criteria for ARFID were no more likely to have malnutrition, a positive score on the SCOFF eating disorder screen, or reported food intolerances than those who did not. Routine screening of adolescents with IBD for ARFID or other eating disorders may identify patients who would benefit from further evaluation.</p><p><b>Table 1.</b> Demographics.</p><p></p><p>Qian Wen Sng, RN<sup>1</sup>; Jacqueline Soo May Ong<sup>2</sup>; Sin Wee Loh, MB BCh BAO (Ireland), MMed (Paed) (Spore), MRCPCH (RCPCH, UK)<sup>1</sup>; Joel Kian Boon Lim, MBBS, MRCPCH, MMed (Paeds)<sup>1</sup>; Li Jia Fan, MBBS, MMed (Paeds), MRCPCH (UK)<sup>3</sup>; Rehena Sultana<sup>4</sup>; Chengsi Ong, BS (Dietician), MS (Nutrition and Public Health), PhD<sup>1</sup>; Charlotte Lin<sup>3</sup>; Judith Ju Ming Wong, MB BCh BAO, LRCP & SI (Ireland), MRCPCH (Paeds) (RCPCH, UK)<sup>1</sup>; Ryan Richard Taylor<sup>3</sup>; Elaine Hor<sup>2</sup>; Pei Fen Poh, MSc (Nursing), BSN<sup>1</sup>; Priscilla Cheng<sup>2</sup>; Jan Hau Lee, MCI, MRCPCH (UK), USMLE, MBBS<sup>1</sup></p><p><sup>1</sup>KK Hospital, Singapore; <sup>2</sup>National University Hospital, Singapore; <sup>3</sup>National University Hospital Singapore, Singapore; 
<sup>4</sup>Duke-NUS Graduate Medical School, Singapore</p><p><b>Financial Support:</b> This work is supported by the National Medical Research Council, Ministry of Health, Singapore.</p><p><b>Background:</b> Protein-energy malnutrition is pervasive in pediatric intensive care unit (PICU) patients. There remains clinical equipoise on the impact of protein supplementation in critically ill children. Our primary aim was to determine the feasibility of conducting a randomized controlled trial (RCT) of protein supplementation versus standard enteral nutrition (EN) in the PICU.</p><p><b>Methods:</b> An open-labelled pilot RCT was conducted from January 2021 to June 2024 in 2 tertiary pediatric centers in Singapore. Children with body mass index (BMI) z-score < 0, who were expected to require invasive or non-invasive mechanical ventilation for at least 48 hours and required EN support for feeding, were included. Patients were randomized (1:1 allocation) to protein supplementation of ≥1.5 g/kg/day in addition to standard EN, or standard EN alone, for 7 days after enrolment or until discharge to the high-dependency unit, whichever was earlier. Feasibility was based on 4 outcomes: effective screening (>80% of eligible patients approached for consent), satisfactory enrolment (>1 patient/center/month), timely protocol implementation (>80% of participants receiving protein supplementation within the first 72 hours), and protocol adherence (receiving >80% of protein supplementation as per protocol).</p><p><b>Results:</b> A total of 20 patients were recruited: 10 (50.0%) in the protein supplementation group and 10 (50.0%) in the standard EN group. Median age was 13.0 [interquartile range (IQR) 4.2, 49.1] months. Respiratory distress was the most common reason for PICU admission [11 (55.0%)]. Median PICU and hospital lengths of stay were 8.0 (IQR 4.5, 16.5) and 19.0 (IQR 11.5, 36.5) days, respectively. There were 3 (15%) deaths, none related to the trial intervention. 
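The four feasibility criteria defined in the Methods can be checked mechanically against the reported figures. An illustrative sketch (not the authors' analysis code; the criterion names are ours):

```python
def feasibility_met(screen_frac, enrol_rate, timely_frac, adherent_frac):
    """Check the four pilot feasibility criteria defined in the Methods:
    screening > 80% of eligible patients, enrolment > 1 patient/center/month,
    timely implementation (protein within 72 h) > 80%, and adherence > 80%."""
    return {
        "screening": screen_frac > 0.80,
        "enrolment": enrol_rate > 1.0,
        "timeliness": timely_frac > 0.80,
        "adherence": adherent_frac > 0.80,
    }

# Values reported in the Results: 50/74 screened, 0.45 patients/center/month,
# 15/20 timely starts, and adherence on 11/15 of supplementation days.
print(feasibility_met(50 / 74, 0.45, 15 / 20, 11 / 15))
# every criterion evaluates to False, matching the trial's conclusion
```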
Screening rate was 50/74 (67.6%). Mean enrollment was 0.45 patients/center/month. Timely protocol implementation was achieved in 15/20 (75%) participants. Protocol adherence was achieved on 11/15 (73.3%) of protein supplementation days.</p><p><b>Conclusion:</b> Satisfactory feasibility outcomes were not met in this pilot RCT. Based on the inclusion criteria of this pilot study and the setup of the centers, a larger study in Singapore alone will not be feasible. With revised logistical arrangements, a larger multi-center feasibility study involving regional countries should be piloted.</p><p>Veronica Urbik, MD<sup>1</sup>; Kera McNelis, MD<sup>1</sup></p><p><sup>1</sup>Emory University, Atlanta, GA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Many clinical management and physiologic knowledge gaps are present in the care of the tiny baby population (neonates born at ≤23 weeks gestation, also labelled periviable gestational age). There is therefore significant center-specific variation in practice, as well as increased morbidity and mortality observed in infants born at 22-23 weeks compared to those born at later gestational ages<sup>1</sup>. The importance of nutrition applies to tiny babies, and appropriate nutrition is perhaps even more critical in this population than in others. Numerous studies have demonstrated the benefits of full early enteral feeding, including decreased rates of central line-associated bloodstream infection and cholestasis<sup>2,3</sup>. The risk of developing necrotizing enterocolitis (NEC) is a balancing measure for advancement of enteral feeds<sup>4</sup>. Adequate nutrition is critical for growth, reducing morbidity and mortality, and improving overall outcomes. 
Current proposed protocols for this population target full enteral feeding volumes to be reached by 10-14 days of life<sup>5</sup>.</p><p><b>Methods:</b> From baseline data collected at two Level III neonatal intensive care units (NICU) attended by a single group of academic neonatology faculty from January 2020 – January 2024, the average length of time from birth until full enteral feeds were achieved was 31 days. Using quality improvement (QI) methodology, we identified the barriers to advancement to full enteral feeds (defined as 120 cc/kg/day) in babies born at 22 and 23 weeks gestational age admitted to the pediatric resident-staffed Level III NICU.</p><p><b>Results:</b> The Pareto chart displays the primary barriers, including undefined critical illness, vasopressor use, evaluation for NEC or spontaneous intestinal perforation, patent ductus arteriosus treatment, and electrolyte derangements (Figure 1). In many cases, no specific reason for not advancing toward full enteral feeds could be identified in chart review.</p><p><b>Conclusion:</b> In this ongoing QI project, our SMART aim is to reduce the number of days to reach full feeds over the period of January 2024 – June 2025 by 10%, informed by root cause analysis and the key driver diagram developed from the data thus far (Figure 2). The first plan-do-study-act cycle started on January 16, 2024. 
Data are analyzed using statistical process control methods.</p><p></p><p>Pareto Chart.</p><p><b>Figure 1.</b></p><p></p><p>Key Driver Diagram.</p><p><b>Figure 2.</b></p><p>Bridget Hron, MD, MMSc<sup>1</sup>; Katelyn Ariagno, RD, LDN, CNSC, CSPCC<sup>1</sup>; Matthew Mixdorf<sup>1</sup>; Tara McCarthy, MS, RD, LDN<sup>1</sup>; Lori Hartigan, ND, RN, CPN<sup>1</sup>; Jennifer Lawlor, RN, BSN, CPN<sup>1</sup>; Coleen Liscano, MS, RD, CSP, LDN, CNSC, CLE, FAND<sup>1</sup>; Michelle Raymond, RD, LDN, CDCES<sup>1</sup>; Tyra Bradbury, MPH, RD, CSP, LDN<sup>1</sup>; Erin Keenan, MS, RD, LDN<sup>1</sup>; Christopher Duggan, MD, MPH<sup>1</sup>; Melissa McDonnell, RD, LDN, CSP<sup>1</sup>; Rachel Rosen, MD, MPH<sup>1</sup>; Elizabeth Hait, MD, MPH<sup>1</sup></p><p><sup>1</sup>Boston Children's Hospital, Boston, MA</p><p><b>Financial Support:</b> Some investigators received support from agencies including the National Institutes of Health and NASPGHAN, which did not directly fund this project.</p><p><b>Background:</b> The widespread shortage of amino acid-based formula in February 2022 highlighted the need for an urgent and coordinated hospital response to disseminate accurate information to front-line staff in both inpatient and ambulatory settings.</p><p><b>Methods:</b> An interdisciplinary working group consisting of pediatric gastroenterologists, dietitians, nurses, and a quality improvement analyst was established in September 2022. The group met at regular intervals to conduct a needs assessment for all disciplines. The group developed and refined a novel clinical process algorithm to respond to reports of formula shortages and/or recalls. The Clinical Process Map is presented in Figure 1. Plan-do-study-act cycles were implemented to improve the quality of the communication output based on staff feedback. 
The key performance indicator is time from notification of a possible shortage to dissemination of communication to stakeholders, with a goal of < 24 hours.</p><p><b>Results:</b> From September 2022 to August 2024, the group met 18 times for unplanned responses to formula recall/shortage events. Email communication was disseminated within 24 hours for 8/18 (44%) of events, within 48 hours for 9/18 (50%), and after 48 hours for 1/18 (6%). Iterative changes included the initiation of an urgent huddle for key stakeholders to identify impact and substitution options; development of a preliminary investigation pathway to ensure validity of reports; development of a structured email format that was further refined to a table format including images of products (Figure 2); and creation of an email distribution list to disseminate shortage reports. The keys to this project's success were the creation of a multidisciplinary team dedicated to meeting urgently for all events, and the real-time drafting and approval of communication within the meeting. Of note, the one communication which was substantially delayed (94.7 hours) was addressed over email only, underscoring the importance of the multidisciplinary working meeting.</p><p><b>Conclusion:</b> Establishing clear lines of communication and assembling key stakeholders resulted in timely, accurate, and coordinated communication regarding nutrition recalls/shortage events at our institution.</p><p></p><p><b>Figure 1.</b> Formula Recall Communication Algorithm.</p><p></p><p><b>Figure 2.</b></p><p>Nicole Misner, MS, RDN<sup>1</sup>; Michelle Yavelow, MS, RDN, LDN, CNSC, CSP<sup>1</sup>; Athanasios Tsalatsanis, PhD<sup>1</sup>; Racha Khalaf, MD, MSCS<sup>1</sup></p><p><sup>1</sup>University of South Florida, Tampa, FL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Early introduction of peanuts and eggs decreases the incidence of peanut and egg allergies in infants who are at high risk of developing food allergies. 
Prevention guidelines and clinical practice have shifted with recent studies. However, little is known about introducing common food allergens in infants fed via enteral feeding tubes. Early introduction of allergens could be of importance in infants working toward a tube feeding wean and those who may benefit from blended tube feeding in the future. We aimed to compare the characteristics of patients with enteral tubes who received education during their gastroenterology visit versus those who did not.</p><p><b>Methods:</b> We performed a single-center retrospective chart review involving all patients 4 to 24 months of age with an enteral feeding tube seen at the University of South Florida Pediatric Gastroenterology clinic from August 2020 to July 2024. Corrected age was used for infants born < 37 weeks’ gestational age. All types of enteral feeding tubes were included (i.e., nasogastric, gastrostomy, and gastrostomy-jejunostomy tubes). Data on demographics, clinical characteristics, and parent-reported food allergen exposure were collected. An exception waiver was received from the University of South Florida Institutional Review Board for this retrospective chart review. Differences between patients who received education and those who did not were evaluated using Student's t-test for continuous variables and the chi-square test for categorical variables. All analysis was performed using R Statistical Software (v4.4.2). A p value ≤ 0.05 was considered statistically significant.</p><p><b>Results:</b> A total of 77 patients met inclusion criteria, with 349 total visits. Patient demographics at each visit are shown in Table 1. There was a documented food allergy in 12% (43) of total visits. Education on early introduction of common food allergens was provided in 12% (42) of total visits. Patients who received education at their visit were significantly younger compared to those who did not and were also more likely to have eczema. 
Table 2 compares nutrition characteristics of the patients at visits where education was discussed versus those where it was not. Infants with any percent of oral intake were more likely to have received education than those who were nil per os (p = 0.013). There was a significant association between starting solids and receiving education (p < 0.001). Reported allergen exposure across all visits was low. For total visits with the patient < 8 months of age (n = 103), only 6% (6) reported peanut and 8% (8) egg exposure. Expanded to < 12 months of age at the time of visit (n = 198), there was minimal increase in reported allergen exposure: 7% (14) reported peanut and 8% (15) egg exposure. Oral feeding was the most common reported form of allergen exposure. Only one patient received a commercial early allergen introduction product. Cow's milk was the most commonly reported allergen exposure, with 61% (63) under 8 months and 54% (106) under 12 months of age, the majority from infant formula.</p><p><b>Conclusion:</b> Age and any proportion of oral intake were associated with receiving education on common food allergen introduction at the visit. However, there were missed opportunities for education in infants with enteral feeding tubes. There were few visits at which patients reported peanut or egg exposure. 
Further research and national guidelines are needed on optimal methods of introduction in this population.</p><p><b>Table 1.</b> Demographics.</p><p></p><p><b>Table 2.</b> Nutrition Characteristics.</p><p></p><p>Samantha Goedde-Papamihail, MS, RD, LD<sup>1</sup>; Ada Lin, MD<sup>2</sup>; Stephanie Peters, MS, CPNP-PC/AC<sup>2</sup></p><p><sup>1</sup>Nationwide Children's Hospital, Grove City, OH; <sup>2</sup>Nationwide Children's Hospital, Columbus, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Critically ill children with severe acute kidney injury (AKI) requiring continuous renal replacement therapy (CRRT) are at high risk for micronutrient deficiencies, including vitamin C (VC). VC acts as an antioxidant and enzyme cofactor. It cannot be made endogenously and must be derived from the diet. Critically ill patients often enter the hospital with some degree of nutritional debt and may have low VC concentrations upon admission. This is exacerbated in the setting of sepsis, multi-organ dysfunction, burns, and similar conditions, when VC needs are higher to address increased oxidative and inflammatory stress. When these patients develop a severe AKI requiring CRRT, their kidneys do not reabsorb VC as healthy kidneys would, further increasing losses. Moreover, VC is a small molecule and is filtered out of the blood via CRRT. The combination of increased VC losses and elevated VC requirements in critically ill patients on CRRT results in a high risk of VC deficiency. The true prevalence of VC deficiency in this population is not well known; by comparison, the average prevalence of deficiency is 5.9% in the general population and 18.3% in critically ill children. 
The aim of this study is to ascertain the prevalence of VC deficiency in critically ill pediatric patients on CRRT by monitoring serum VC levels throughout their CRRT course and correcting noted deficiencies.</p><p><b>Methods:</b> An observational study was conducted from May 2023 through August 2024 in a 54-bed high-acuity PICU offering ECMO, CRRT, Level 1 trauma, burn care, solid organ transplant, and bone marrow transplantation as tertiary care services. Fifteen patients were identified upon initiation of CRRT and followed until transfer from the PICU. Serial serum VC levels were checked after 5-10 days on CRRT and rechecked weekly thereafter as appropriate. Deficiency was defined as VC concentrations < 11 umol/L. Inadequacy was defined as concentrations between 11-23 umol/L. Supplementation was initiated for levels < 23 umol/L; doses varied from 250 to 500 mg/d depending on age and clinical situation. Most deficient patients received 500 mg/d supplementation. Those with inadequacy received 250 mg/d.</p><p><b>Results:</b> Of 15 patients, 9 had VC deficiency and 4 had VC inadequacy (Figure 1). Of those with deficiency, 5 of 9 patients were admitted for septic shock (Figure 2). VC level was rechecked in 8 patients; the level returned to normal in 5 patients, and 4 of those 5 had received 500 mg/d supplementation. Levels remained low in 3 patients; all had received 250 mg/d supplementation (Figure 3). Supplementation dose changes are noted in Figure 4.</p><p><b>Conclusion:</b> VC deficiency was present in 60% of CRRT patients, suggesting deficiency is more common in this population and that critically ill patients on CRRT are at higher risk of developing deficiency than those who are not receiving CRRT. Septic shock and degree of VC deficiency appeared to be correlated; 56% of deficient patients were admitted with septic shock. 
Together, these findings suggest a need to start supplementation earlier, perhaps upon CRRT initiation, or upon PICU admission in a septic patient, and to use higher supplementation doses, as our patients with persistently low VC levels at follow-up were all receiving 250 mg/d. Study limitations include (1) the potential for VC deficiency prior to CRRT initiation, with critical illness and/or poor intake as confounding factors, and (2) the small sample size. Our results suggest that critically ill children with AKI requiring CRRT are at increased risk of VC deficiency while on CRRT. Future research should focus on identifying at-risk micronutrients for this patient population and creating supplementation regimens to prevent the development of deficiencies. Our institution is currently crafting a quality improvement project with these aims.</p><p></p><p><b>Figure 1.</b> Initial Serum Vitamin C Levels of Our Patients on CRRT, Obtained 5-10 Days After CRRT Initiation (N = 15).</p><p></p><p><b>Figure 2.</b> Underlying Disease Process of Patients on CRRT (N = 15).</p><p></p><p><b>Figure 3.</b> Follow Up Vitamin C Levels After Supplementation (N = 8), Including Supplementation Regimen Prior to Follow-Up Lab (dose/route).</p><p></p><p><b>Figure 4.</b> Alterations in Supplementation Regimen (dose) Based on Follow-up Lab Data (N = 6).</p><p>Tanner Sergesketter, RN, BSN<sup>1</sup>; Kanika Puri, MD<sup>2</sup>; Emily Israel, PharmD, BCPS, BCPPS<sup>1</sup>; Ryan Pitman, MD, MSc<sup>3</sup>; Elaina Szeszycki, BS, PharmD, CNSC<sup>2</sup>; Ahmad Furqan Kazi, PharmD, MS<sup>1</sup>; Ephrem Abebe, PhD<sup>1</sup></p><p><sup>1</sup>Purdue University College of Pharmacy, West Lafayette, IN; <sup>2</sup>Riley Hospital for Children at Indiana University Health, Indianapolis, IN; <sup>3</sup>Indiana University, Indianapolis, IN</p><p><b>Financial Support:</b> The Gerber Foundation.</p><p><b>Background:</b> During the hospital-to-home transition period, family members or caregivers of medically 
complex children are expected to assume the responsibility of managing medication and feeding regimens for the child under their care. However, this transition period represents a time of high vulnerability, including the risk of communication breakdowns and a lack of tools tailored to caregivers’ context. This vulnerability is further heightened when a language barrier is present, as it introduces additional opportunities for misunderstandings, leading to adverse events and errors in the home setting. Hence, it is critical that caregivers are educated to develop skills that aid in successful implementation of post-discharge care plans. Addressing unmet needs for caregivers who use languages other than English (LOE) requires an in-depth understanding of the current challenges associated with educating and preparing caregivers for the post-discharge period.</p><p><b>Methods:</b> In this prospective qualitative study, healthcare workers (HCWs) were recruited from a tertiary care children's hospital in central Indiana and were eligible to participate if they were involved directly or indirectly in preparing and assisting families of children under three years of age with medication and feeding needs around the hospital discharge period and/or outpatient care following discharge. Each HCW completed a brief demographic survey, was observed on the job for 2-3 hours, and participated in a follow-up semi-structured interview using video conferencing technology to expand on observed behavior. Handwritten field notes were taken during the observations and were immediately typed and expanded upon post-observation. Audio recordings from the interviews were transcribed and de-identified. Both the typed observation notes and interview transcripts were subjected to thematic content analysis, which was completed using Dedoose software.</p><p><b>Results:</b> Data collection is ongoing with anticipated completion in October 2024. 
Fourteen HCW interviews have been completed to date, with a target sample of 20-25 participants. The preliminary analysis presented is from transcripts of seven interviews. Participants included six females and one male with a mean age of 35.4 years (range, 24-59). HCWs were from diverse inpatient and outpatient clinical backgrounds, including registered dietitians, physicians, pharmacists, and nurses. Four overarching themes describe the challenges that HCWs experience when communicating with caregivers who use LOE during the hospital-to-home transition: lack of equipment and materials in diverse languages; challenges with the people and technologies that assist with translating information; instructions getting lost in translation or uncertainty about translation; and difficulty getting materials translated in a timely manner. Main themes, subthemes, and examples are presented in Figure 1, and themes, subthemes, and quotes are presented in Table 1.</p><p><b>Conclusion:</b> The study is ongoing; however, based on the preliminary analysis, it is evident that the systems and processes in place to aid communication between HCWs and caregivers who use LOE can be improved. This can ultimately lead to improved quality of care for caregivers who use LOE during the hospital-to-home transition and resultant safer care in the home setting for medically complex children.</p><p><b>Table 1.</b> Themes, Subthemes, and Quotes.</p><p></p><p></p><p><b>Figure 1.</b> Main Themes, Subthemes, and Examples.</p><p>Ruthfirst Ayande, PhD, MSc, RD<sup>1</sup>; Shruti Gupta, MD, NABBLM-C<sup>1</sup>; Sarah Taylor, MD, MSCR<sup>1</sup></p><p><sup>1</sup>Yale University, New Haven, CT</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Growth faltering among preterm neonates admitted to NICUs may be managed with hypercaloric and/or hypervolemic feeding. 
However, in intractable cases of growth faltering or when fluid restriction is indicated, fortification with extensively hydrolyzed liquid protein may offer a solution to meeting the high protein demands for infant growth while limiting overfeeding. Yet, there is limited clinical data regarding supplementation with liquid protein, leaving clinicians to make decisions about dosing and duration on a case-by-case basis.</p><p><b>Methods:</b> We present a case of neonatal growth faltering managed with a liquid protein modular in a level IV NICU in North America.</p><p><b>Results:</b> Case Summary: A male infant was born extremely preterm (GA 24 1/7 weeks) and admitted to the NICU for respiratory distress requiring intubation. The NICU course was complicated by a patent ductus arteriosus (PDA) requiring surgery on day of life (DOL) 31 and by severe bronchopulmonary dysplasia. Birth anthropometrics: weight: 0.78 kg; height: 31.5 cm. TPN was initiated at birth, with trophic feeds of donor human milk per gavage (PG) for a total provision of 117 ml/kg, 75 kcal/kg, and 3.5 gm/kg of protein. The regimen was advanced per unit protocol; however, on DOL 5, total volume was decreased in the setting of metabolic acidosis and significant PDA. PG feeds of maternal milk were fortified to 24 kcal/oz on DOL 23, and the infant reached full feeds on DOL 26. Feed provision by DOL 28 was ~144 ml/kg/day and 4 g/kg of protein based on estimated dry weight. TPN was first discontinued on DOL 25 but restarted on DOL 32 due to frequent NPO status and clinical instability. Of note, the infant required diuretics during the hospital stay. TPN was again discontinued on DOL 43. At DOL 116, the infant was receiving and tolerating PG feeds fortified to 24 kcal/oz at 151 ml/kg, 121 kcal/kg, and 2.1 gm/kg protein. The infant's weight gain rate was 56 g/day; however, linear growth was impaired, with a gain of 0.5 cm over 21 days (rate of ~0.2 cm/week). 
Liquid protein was commenced at DOL 124 to supply an additional 0.5 gm/kg of protein. A week after adding liquid protein, the infant's weight gain rate was 39 g/day, and height increased by 2.5 cm/week. Feed fortification was reduced to 22 kcal/oz on DOL 156 due to rapid weight gain, and liquid protein dosage increased to 0.6 gm/kg for a total protein intake of 2.2 g/kg. At DOL 170, calorie fortification of maternal milk was discontinued, and liquid protein dosage was increased to 1 g/kg in the setting of a relapse of poor linear growth for a total protein intake of 3.1 g/kg. Liquid protein was provided for two months until discontinuation (d/c) at DOL 183 per parent request. At the time of d/c of liquid protein, the infant's weight and length gain rates for the protein supplementation period (59 days) were 42 gm/day and 1.78 cm/week, respectively.</p><p><b>Conclusion:</b> While we observed objective increases in linear growth for the presented case following the addition of a liquid protein modular, it is crucial to note that these findings are not generalizable, and evidence and guidelines on the use of hydrolyzed liquid protein are limited. Larger, well-controlled studies examining mechanisms of action, appropriate dosage and duration, short- and long-term efficacy, and safety are required to guide best practices for using these modulars.</p><p>Sarah Peterson, PhD, RD<sup>1</sup>; Nicole Salerno, BS<sup>1</sup>; Hannah Buckley, RDN, LDN<sup>1</sup>; Gretchen Coonrad, RDN, LDN<sup>1</sup></p><p><sup>1</sup>Rush University Medical Center, Chicago, IL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Accurate assessment of growth and nutritional status is critical for preterm infants. Various growth charts have been developed to track the growth of preterm infants, but differences in reference standards may influence the diagnosis and management of malnutrition. 
The goal of this study was to compare the rate of malnutrition, defined by a decline in weight-for-age z-score, using the Fenton growth chart, the Olsen growth chart, and the INTERGROWTH-21st Preterm Postnatal Growth Standard.</p><p><b>Methods:</b> All preterm infants born between 24 and 37 weeks of gestational age who were admitted to the neonatal intensive care unit (NICU) in 2022 and had weight at birth and on day 28 recorded were included. Preterm infants were excluded if they were admitted to the NICU ≥seven days after birth. Sex and gestational age were recorded for each infant. Weight and weight-for-age z-score at birth and on day 28 were recorded. Weight-for-age z-score was determined using three growth charts: (1) the Fenton growth chart, which was developed using growth data from uncomplicated preterm infants who were at least 22 weeks of gestational age from the United States (US), Australia, Canada, Germany, Italy, and Scotland; (2) the Olsen growth chart, which was developed using growth data from uncomplicated preterm infants who were at least 23 weeks of gestational age from the US; and (3) the INTERGROWTH-21st Preterm Postnatal Growth Standard, which was developed using growth data from uncomplicated preterm infants who were at least 24 weeks of gestational age from the US, United Kingdom, Brazil, China, India, Italy, Kenya, and Oman. Change in weight-for-age z-score from birth to day 28 was calculated using z-scores from each growth chart. 
Malnutrition was defined as a decline in weight-for-age z-score of ≥0.8; any infant meeting this cut-point had their malnutrition status further classified into three categories: (1) mild malnutrition was defined as a weight-for-age z-score decline between 0.8-1.1, (2) moderate malnutrition was defined as a weight-for-age z-score decline between 1.2-1.9, and (3) severe malnutrition was defined as a weight-for-age z-score decline of ≥2.0.</p><p><b>Results:</b> The sample included 102 preterm infants, 58% male, with a mean gestational age of 29.3 weeks. At birth, the average weight was 1,192 grams, and the average weight-for-age z-score was -0.50, -0.36, and -1.14 using the Olsen, Fenton, and INTERGROWTH-21st growth charts, respectively. At 28 days, the average weight was 1,690 grams, and the average weight-for-age z-score was -0.96, -1.00, and -1.43 using the Olsen, Fenton, and INTERGROWTH-21st growth charts, respectively. Using the Olsen growth chart, 29 infants met the criteria for malnutrition; 15 had mild malnutrition and 14 had moderate malnutrition. Using the Fenton growth chart, 33 infants met the criteria for malnutrition; 21 had mild malnutrition and 12 had moderate malnutrition. Using the INTERGROWTH-21st growth chart, 32 infants met the criteria for malnutrition; 14 had mild malnutrition, 14 had moderate malnutrition, and 4 had severe malnutrition. In total, 24 infants met the criteria for malnutrition using all three growth charts, while 19 infants were only categorized as malnourished by one of the three growth charts.</p><p><b>Conclusion:</b> The findings of this study reveal important discrepancies in average z-scores and the classification of malnutrition among preterm infants when using the Olsen, Fenton, and INTERGROWTH-21st growth charts. These differences suggest that the choice of growth chart has important implications for identifying rates of malnutrition. 
Therefore, standardized guidelines for growth monitoring in preterm infants are necessary to ensure consistent and accurate diagnosis of malnutrition.</p><p>Emaan Abbasi, BSc<sup>1</sup>; Debby Martins, RD<sup>2</sup>; Hannah Piper, MD<sup>2</sup></p><p><sup>1</sup>University of Galway, Vancouver, BC; <sup>2</sup>BC Children's Hospital, Vancouver, BC</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Infants with gastroschisis have variable intestinal function with some achieving enteral independence within a few weeks and others remaining dependent on parenteral nutrition (PN) for prolonged periods. Establishing enteral feeds can be challenging for many of these neonates due to poor intestinal motility, frequent emesis and/or abdominal distension. Therefore, many care teams use standardized post-natal nutrition protocols in an attempt to minimize PN exposure and maximize oral feeding. However, it remains unclear whether initiating continuous feeds is advantageous or whether bolus feeding is preferred. Potential benefits of bolus feeding include that it is more physiologic and that the feeds can be given orally, but there remain concerns about feed tolerance and prolonged periods of withholding feeds with this approach. The objective of this study was to compare the initial feeding strategy in infants with gastroschisis to determine whether bolus feeding is a feasible approach.</p><p><b>Methods:</b> After obtaining REB approval (H24-01052), a retrospective chart review was performed in neonates born with gastroschisis, cared for by a neonatal intestinal rehabilitation team between 2018 and 2023. A continuous feeding protocol was used between 2018-2020 (human milk at 1 ml/h with 10 ml/kg/d advancements given continuously until 50 ml/kg/d and then trialing bolus feeding) and a bolus protocol was used between 2021-2023 (10-15 ml/kg divided into 8 feeds/d with 15-20 ml/kg/d advancements). 
Clinical data were collected, and gestational age, gastroschisis prognosis score (GPS), need for intestinal resection, age when feeds were initiated, time to full feeds, route of feeding at full feeds, and hepatic cholestasis were compared between groups. Welch's t-test and the chi-square test were performed to compare variables, with p-values < 0.05 considered significant.</p><p><b>Results:</b> Forty-one infants with gastroschisis were reviewed (23 who were managed with continuous feed initiation and 18 with bolus feed initiation). Continuous feed and bolus feed groups had comparable mean gestational age at birth, GPS score, need for intestinal surgery, age at feed initiation, and the incidence of cholestasis (Table 1). Time to achieve enteral independence was similar between both groups with nearly half of infants reaching full feeds by 6 weeks (48% continuous feeds vs. 44% bolus feeds) and most by 9 weeks of life (74% continuous vs. 72% bolus). Significantly more infants in the bolus feeding group were feeding exclusively orally compared to the continuous feeding group at the time of reaching full enteral feeds (50% vs. 17%, p = 0.017).</p><p><b>Conclusion:</b> Initiating bolus enteral feeding for infants with gastroschisis, including those requiring intestinal resection, is safe and does not result in prolonged time to full enteral feeding. 
Avoiding continuous feeds may improve oral feeding in this population.</p><p><b>Table 1.</b> Clinical Characteristics and Initial Feeding Strategy.</p><p></p><p><b>International Poster of Distinction</b></p><p>Matheus Albuquerque<sup>1</sup>; Diogo Ferreira<sup>1</sup>; João Victor Maldonado<sup>2</sup>; Mateus Margato<sup>2</sup>; Luiz Eduardo Nunes<sup>1</sup>; Emanuel Sarinho<sup>1</sup>; Lúcia Cordeiro<sup>1</sup>; Amanda Fifi<sup>3</sup></p><p><sup>1</sup>Federal University of Pernambuco, Recife, Pernambuco; <sup>2</sup>University of Brasilia, Brasília, Distrito Federal; <sup>3</sup>University of Miami, Miami, FL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Intestinal failure secondary to short bowel syndrome is a malabsorptive condition caused by intestinal resection. Patients with intestinal failure require parenteral support to maintain hydration and nutrition. Long-term parenteral nutrition leads to complications. Teduglutide, an analog of GLP-2, may improve intestinal adaptation, thereby minimizing reliance on parenteral nutrition. This meta-analysis evaluates the efficacy of teduglutide in reducing parenteral nutrition dependency in pediatric patients with intestinal failure.</p><p><b>Methods:</b> We included randomized controlled trials (RCTs) that assessed the efficacy of teduglutide in reducing parenteral nutrition support and improving anthropometrics in pediatric patients with intestinal failure secondary to short bowel syndrome. The RoB-2 tool (Cochrane) was used to evaluate the risk of bias, and statistical analyses were conducted using RevMan 5.4.1 software. Results are expressed as mean differences with 95% CIs and p-values.</p><p><b>Results:</b> Data were extracted from three clinical trials involving a total of 172 participants. 
Teduglutide use was associated with a reduction in parenteral nutrition volume (-17.92 mL, 95% CI -24.65 to -11.20, p < 0.00001), with most patients reducing parenteral support by >20% (11.79, 95% CI 2.04 to 68.24, p = 0.006) (Figure 1). Treatment with teduglutide also improved height (0.27 Z-score, 95% CI 0.08 to 0.46, p = 0.005) but did not significantly increase weight when compared to the control group (-0.13 Z-score, 95% CI -0.41 to 0.16, p = 0.38) (Figure 2).</p><p><b>Conclusion:</b> This meta-analysis suggests that therapy with teduglutide reduces parenteral nutrition volume in patients with short bowel syndrome and intestinal failure. Reduced parenteral nutrition dependency can minimize complications and improve quality of life in patients with short bowel syndrome and intestinal failure.</p><p></p><p><b>Figure 1.</b> Parenteral Nutrition Support Volume Change.</p><p></p><p><b>Figure 2.</b> Anthropometric Data (Weight and Height) Change from Baseline.</p><p>Korinne Carr<sup>1</sup>; Liyun Zhang, MS<sup>1</sup>; Amy Pan, PhD<sup>1</sup>; Theresa Mikhailov, MD, PhD<sup>2</sup></p><p><sup>1</sup>Medical College of Wisconsin, Milwaukee, WI; <sup>2</sup>Children's Hospital of Wisconsin, Milwaukee, WI</p><p><b>Financial Support:</b> Medical College of Wisconsin, Department of Pediatrics.</p><p><b>Background:</b> Malnutrition is a significant concern in pediatric patients, particularly those critically ill. In children with diabetes mellitus (DM), the presence of malnutrition can exacerbate complications such as unstable blood sugar levels and delayed wound healing, potentially leading to worse clinical outcomes. Despite the known impact of malnutrition on adult patients and critically ill children, established criteria for identifying malnutrition in critically ill children are lacking. 
This study was designed to determine the relationship between malnutrition, mortality, and length of stay (LOS) in critically ill pediatric patients with diabetes mellitus.</p><p><b>Methods:</b> We conducted a retrospective cohort study using the VPS (Virtual Pediatric Systems, LLC) Database. We categorized critically ill pediatric patients with DM as malnourished or at risk of being malnourished based on admission nutrition screens. We compared mortality rates between malnourished and non-malnourished patients using Fisher's Exact test. We used logistic regression analysis to compare mortality controlling for measures like PRISM3 (a severity of illness measure), demographic, and clinical factors. We compared the LOS in the Pediatric Intensive Care Unit (PICU) between malnourished and non-malnourished patients using the Mann-Whitney-Wilcoxon test. Additionally, we used a general linear model with appropriate transformation to adjust for the severity of illness, demographic, and clinical factors. We considered statistical significance at p < 0.05.</p><p><b>Results:</b> We analyzed data for 4,014 patients, of whom 2,653 were screened for malnutrition. Of these 2,653, 88.5% were type 1 DM, 9.3% were type 2 DM, and the remaining patients were unspecified DM. Of the 2,653 patients, 841 (31.7%) were malnourished based on their nutrition screen at admission to the PICU. Mortality in patients who were screened as malnourished did not differ from mortality in those who were not malnourished (0.4% vs. 0.2%, p = 0.15). Malnourished patients also had longer PICU LOS, with a geometric mean and 95% CI of 1.03 (0.94–1.13) days, compared to 0.91 (0.86–0.96) days for non-malnourished patients. Similarly, the malnourished patients had longer hospital LOS with a geometric mean and 95% CI of 5.31 (4.84–5.83) days, compared to 2.67 (2.53–2.82) days for those who were not malnourished. 
Both differences were significant with p < 0.0001, after adjusting for age, race/ethnicity, and PRISM3.</p><p><b>Conclusion:</b> We found no difference in mortality rates, but critically ill children who were screened as malnourished had longer PICU and hospital LOS than those who were not malnourished. This was true even after adjusting for age, race/ethnicity, and PRISM3.</p><p>Emily Gutzwiller<sup>1</sup>; Katie Huff, MD, MS<sup>1</sup></p><p><sup>1</sup>Indiana University School of Medicine, Indianapolis, IN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Neonates with intestinal failure require parenteral nutrition for survival. While life-sustaining, it can lead to serious complications, including intestinal failure-associated liver disease (IFALD). The etiology of IFALD is likely multifactorial, with intravenous lipid emulsions (ILE) being a large contributor, particularly soybean oil-based lipid emulsions (SO-ILE). Alternate ILEs, including those containing fish oil, can be used to prevent and treat IFALD. Fish oil-based ILE (FO-ILE) is only approved at a dose of 1 g/kg/d, limiting calories prescribed from fat and shifting the calorie delivery to carbohydrate predominance. While FO-ILE was shown to have comparable growth to SO-ILE, a comparison to soy, MCT, olive, fish oil-based ILE (SO,MCT,OO,FO-ILE) has not been conducted to our knowledge. The purpose of this study is to compare the growth and laboratory data associated with SO,MCT,OO,FO-ILE and FO-ILE therapy in the setting of IFALD in a cohort of neonates treated during two time periods.</p><p><b>Methods:</b> We performed a retrospective chart review of patients with IFALD receiving SO,MCT,OO,FO-ILE (SMOFlipid) or FO-ILE (Omegaven) in a level IV neonatal intensive care unit from September 2016 – May 2024. IFALD was defined as direct bilirubin >2 mg/dL after receiving >2 weeks of parenteral nutrition. 
Patients with underlying genetic or hepatic diagnoses, as well as patients with elevated direct bilirubin prior to two weeks of life, were excluded. Patients were divided based on the period during which they were treated. Data were collected on demographic characteristics, parenteral and enteral nutrition, weekly labs, and growth changes while receiving SO,MCT,OO,FO-ILE or FO-ILE. Rate of change of weight, length, and head circumference and comparison of z-scores over time were studied for each ILE group. Secondary outcomes included nutritional data in addition to hepatic labs. Nonparametric analysis using the Mann-Whitney U test was conducted to compare ILE groups, and a p-value of < 0.05 was used to define statistical significance.</p><p><b>Results:</b> A total of 51 patients were enrolled, 25 receiving SO,MCT,OO,FO-ILE and 26 FO-ILE. Table 1 notes the demographic and baseline characteristics of the two ILE groups. There was no difference in the rate of OFC (p = 0.984) or length (p = 0.279) growth between the two treatment groups (Table 2). There was a difference, however, in the rate of weight gain between groups (p = 0.002; Table 2), with the FO-ILE group gaining more weight over time. When comparing nutritional outcomes (Table 2), SO,MCT,OO,FO-ILE patients received greater total calories than FO-ILE patients (p = 0.005), including a higher ILE dose (p < 0.001) and enteral calories (p = 0.029). The FO-ILE group, however, received a higher carbohydrate dose (p = 0.003; Table 2). There was no difference in amino acid dose (p = 0.127) or parenteral nutrition calories (p = 0.821). Hepatic labs differed over time, with the FO-ILE group having a larger decrease in AST, direct bilirubin, and total bilirubin over time compared to the SO,MCT,OO,FO-ILE group (Table 2).</p><p><b>Conclusion:</b> Our results show the FO-ILE patients did have a significant increase in weight gain compared to the SO,MCT,OO,FO-ILE patients. 
This is despite SO,MCT,OO,FO-ILE patients receiving greater total calories and enteral calories. The FO-ILE group received greater calories only in the form of glucose infusion. With this increased weight gain but similar length growth between groups, concerns regarding alterations in body composition and increased fat mass arise. Further research is needed to determine the influence of these ILE products on neonatal body composition over time.</p><p><b>Table 1.</b> Demographic and Baseline Lab Data by Lipid Treatment Group.</p><p>(All data presented as median and interquartile range, unless specified.)</p><p></p><p><b>Table 2.</b> Nutritional, Hepatic Lab, and Growth Outcomes by Lipid Treatment Group.</p><p>(All data presented as median and interquartile range, unless specified.)</p><p>*z-score change compares z-score at end and beginning of study period</p><p>OFC-occipitofrontal circumference</p><p></p><p>Rachel Collins, BSN, RN<sup>1</sup>; Brooke Cherven, PhD, MPH, RN, CPON<sup>2</sup>; Ann-Marie Brown, PhD, APRN, CPNP-AC/PC, CCRN, CNE, FCCM, FAANP, FASPEN<sup>1</sup>; Christina Calamaro, PhD, PPCNP-BC, FNP-BC, FAANP, FAAN<sup>3</sup></p><p><sup>1</sup>Emory University Nell Hodgson Woodruff School of Nursing, Atlanta, GA; <sup>2</sup>Emory University School of Medicine; Children's Healthcare of Atlanta, Atlanta, GA; <sup>3</sup>Emory University Nell Hodgson Woodruff School of Nursing; Children's Healthcare of Atlanta, Atlanta, GA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Many children receiving hematopoietic stem cell transplant (HSCT) are malnourished prior to the start of transplant or develop malnutrition post-infusion. Children are at risk for malnutrition due to the pre-transplant conditioning phase, from chemotherapy treatments for their primary diagnosis, and from acute graft versus host disease (GVHD). These factors can cause impaired oral feeding, vomiting, diarrhea, and mucositis. 
Enteral nutrition (EN) and parenteral nutrition (PN) are support options for this patient population when oral feeding is impaired. There is currently a paucity of research and evidence-based guidelines on nutrition in this population. The purpose of this integrative literature review was to determine the outcomes of EN and PN in pediatric HSCT and to discuss evidence-based implications for practice to positively affect quality of life for these patients.</p><p><b>Methods:</b> A literature search was conducted using the following databases: PubMed, CINAHL, Cochrane, and Healthsource: Nursing/Academic Edition. The search strategy included randomized controlled trials, prospective and retrospective cohort studies, case-control studies, cross-sectional studies, systematic reviews, and meta-analyses. Relevant papers were utilized if they discussed patients 0-21 years of age, allogeneic or autologous HSCT, and enteral and/or parenteral nutrition. Papers were excluded if there was no English translation, they did not discuss nutrition, or they had animal subjects.</p><p><b>Results:</b> Initially, 477 papers were identified, and after the screening process, 15 papers were utilized for this integrative review. EN and PN have effects on clinical outcomes, complications, and hospital and survival outcomes. EN was associated with faster platelet engraftment, improved gut microbiome, and decreased mucositis and GVHD. PN was used more in severe mucositis due to interference with feeding tube placement, therefore decreasing the use of EN. Use of PN is more common in severe grades III-IV of gut GVHD. Initiation of EN later in treatment, such as after conditioning and the presence of mucositis, can be associated with severe grades III-IV of gut GVHD. This is because conditioning can cause damage to the gut, leading to mucosal atrophy and intestinal permeability that alter the gut microbiota. 
PN can induce gut mucosal atrophy and dysbiosis, allowing for bacterial translocation, while EN improves the gut epithelium and microbiota, reducing translocation. Additionally, increased access to central venous lines with PN can introduce bacterial infections to the bloodstream. Feeding tube placement complications include dislodgement, refusal of replacement, and increased risk of bleeding due to thrombocytopenia. Electrolyte imbalance can be attributed to loss of absorption through the gut. If a gastrostomy tube is present, there can be infection at the site. There is currently no consensus on the appropriate timeline of tube placement. There was no significant difference in neutrophil engraftment, and findings on morbidity/mortality and weight gain were variable. Weight gain with PN can be attributed to edema. Length of stay was significantly shorter with EN alone than with PN (p < 0.0001).</p><p><b>Conclusion:</b> This literature review indicates a need for a more comprehensive nutritional assessment to adequately evaluate nutritional status before and after HSCT. EN should be given as a first-line therapy and should be considered prior to the conditioning phase. The initiation of a feeding tube prior to conditioning should be considered. Finally, PN may be considered if EN cannot be tolerated. 
More research is needed for a sensitive nutritional evaluation, earlier administration of EN, and standardized pathways for EN and PN nutrition in pediatric HSCT.</p>","PeriodicalId":16668,"journal":{"name":"Journal of Parenteral and Enteral Nutrition","volume":"49 S1","pages":"S90-S308"},"PeriodicalIF":3.2000,"publicationDate":"2025-03-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1002/jpen.2735","citationCount":"0","resultStr":"{\"title\":\"Poster Abstracts\",\"authors\":\"\",\"doi\":\"10.1002/jpen.2735\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><b>P1–P34 Parenteral Nutrition Therapy</b></p><p><b>P35–P52 Enteral Nutrition Therapy</b></p><p><b>P53–P83 Malnutrition and Nutrition Assessment</b></p><p><b>P84–P103 Critical Care and Critical Health Issues</b></p><p><b>P104–P131 GI, Obesity, Metabolic, and Other Nutrition Related Concepts</b></p><p><b>P132–P165 Pediatric, Neonatal, Pregnancy, and Lactation</b></p><p><b>Parenteral Nutrition Therapy</b></p><p>Sarah Williams, MD, CNSC<sup>1</sup>; Angela Zimmerman, RD, CNSC<sup>2</sup>; Denise Jezerski, RD, CNSC<sup>2</sup>; Ashley Bestgen, RD, CNSC<sup>2</sup></p><p><sup>1</sup>Cleveland Clinic Foundation, Parma, OH; <sup>2</sup>Cleveland Clinic Foundation, Cleveland, OH</p><p><b>Financial Support:</b> Morrison Healthcare.</p><p><b>Background:</b> Essential fatty acid deficiency (EFAD) is a rare disorder among the general population but can be a concern in patients reliant on home parenteral nutrition (HPN), particularly those who are not receiving intravenous lipid emulsions (ILE). In the US, the only ILE available until 2016 was soybean oil based (SO-ILE), which contains more than adequate amounts of essential fatty acids, including alpha-linolenic acid (ALA, an omega-3 fatty acid) and linoleic acid (LA, an omega-6 fatty acid). 
In 2016, a mixed ILE containing soybean oil, medium chain triglycerides, olive oil, and fish oil became available (SO, MCT, OO, FO-ILE). However, it contains a lower concentration of essential fatty acids compared to SO-ILE, raising theoretical concerns for development of EFAD if not administered in adequate amounts. Liver dysfunction is a common complication in HPN patients that can occur with soybean oil-based ILE use due to its pro-inflammatory properties. Short-term studies and case reports in patients receiving SO, MCT, OO, FO-ILE have shown improvements in liver dysfunction for some patients. Our study evaluates the long-term impact of SO, MCT, OO, FO-ILE in our HPN patient population.</p><p><b>Methods:</b> This single-center, retrospective cohort study was conducted at the Cleveland Clinic Center for Human Nutrition using data from 2017 to 2020. It involved adult patients who received HPN with SO, MCT, OO, FO-ILE for a minimum of one year. The study assessed changes in essential fatty acid profiles, including triene-tetraene ratios (TTRs) and liver function tests (LFTs) over the year. Data were described as mean and standard deviation for normally distributed continuous variables, median and interquartile range for non-normally distributed continuous variables, and frequency for categorical variables. The Wilcoxon signed-rank test was used to compare the baseline and follow-up TTR values (mixed time points). The Wilcoxon signed-rank test with pairwise comparisons was used to compare the LFTs at different time points and to determine which time groups were different. P-values were adjusted using Bonferroni corrections. Ordinal logistic regression was used to assess the association between lipid dosing and follow-up TTR level. Analyses were performed using R software, and a significance level of 0.05 was assumed for all tests.</p><p><b>Results:</b> Out of 110 patients screened, 26 met the inclusion criteria of having baseline and follow-up TTRs. 
None of the patients developed EFAD, and there was no significant difference in the distribution of TTR values between baseline and follow-up. Additionally, 5.5% of patients reported adverse GI symptoms while receiving SO, MCT, OO, FO-ILE. A separate subgroup of 14 patients who had abnormal LFTs, including bilirubin, alkaline phosphatase (AP), aspartate aminotransferase (AST) or alanine aminotransferase (ALT), were evaluated. There was a statistically significant improvement of AST and ALT and decreases in bilirubin and AP that were not statistically significant.</p><p><b>Conclusion:</b> We found that using SO, MCT, OO, FO-ILE as the primary lipid source did not result in EFAD in any of our subset of 26 patients, and TTRs remained statistically unchanged after introduction of SO, MCT, OO, FO-ILE. Additionally, there was a statistically significant decrease in AST and ALT following the start of SO, MCT, OO, FO-ILE. While liver dysfunction from PN is multifactorial, the use of fish oil based lipids has been shown to improve LFT results due to a reduction of phytosterol content as well as less pro-inflammatory omega-6 content when compared to SO-ILEs. 
A significant limitation was the difficulty in obtaining TTR measurements by home health nursing in the outpatient setting, which considerably reduced the number of patients who could be analyzed for EFAD.</p><p><b>Table 1.</b> Summary Descriptive Statistics of 26 Patients With Baseline and Follow Up TTR.</p><p></p><p><b>Table 2.</b> Change in LFTs From Baseline Levels Compared to 3 Months, 6 Months, 9 Months and 12 Months.</p><p></p><p>Wendy Raissle, RD, CNSC<sup>1</sup>; Hannah Welch, MS, RD<sup>2</sup>; Jan Nguyen, PharmD<sup>3</sup></p><p><sup>1</sup>Optum Infusion Pharmacy, Buckeye, AZ; <sup>2</sup>Optum Infusion Pharmacy, Phoenix, AZ; <sup>3</sup>Optum Infusion Pharmacy, Mesa, AZ</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Aluminum is a non-nutrient contaminant of parenteral nutrition (PN) solution. The additive effects of PN components can contribute to toxicity, causing central nervous system issues and contributing to metabolic bone disease, as observed in adults with osteomalacia. When renal function and gastrointestinal mechanisms are impaired, aluminum can accumulate in the body. Aluminum toxicity can result in anemia, dementia, bone disease and encephalopathy. Symptoms of aluminum toxicity may include mental status change, bone pain, muscle weakness, nonhealing fractures and premature osteoporosis. In July 2004, the U.S. Food and Drug Administration (FDA) mandated labeling of aluminum content with a goal to limit exposure to less than 5 mcg/kg/day. Adult and pediatric dialysis patients, as well as patients of all ages receiving PN support, have an increased risk of high aluminum exposure. Reducing PN additives high in aluminum is the most effective way to decrease aluminum exposure and risk of toxicity. 
This abstract presents a unique case where antiperspirant use contributed to an accumulation of aluminum in an adult PN patient.</p><p><b>Methods:</b> A patient on long-term PN (Table 1) often had low ionized calcium results of < 3 mg/dL, leading to consideration of other contributing factors. In addition, the patient was taking very high doses of vitamin D (50,000 IU orally 6 days/week) to stay in the normal range. Risk factors for developing metabolic bone disease include mineral imbalances of calcium, magnesium, phosphorus, vitamin D, corticosteroid use, long-term PN use and aluminum toxicity (Table 2). The patient, who had a known osteoporosis diagnosis, had two stress fractures in the left lower leg. Aluminum testing was completed in order to identify other factors that may be contributing to low ionized calcium values and osteoporosis. During patient discussion, the patient revealed they used an aluminum-containing antiperspirant once daily. The range of aluminum content in antiperspirants is unknown, but studies show that minimal absorption may be possible, especially in populations with kidney insufficiency.</p><p><b>Results:</b> After an elevated aluminum value was reported on July 3, 2023 (Figure 1), the patient switched to a non-aluminum-containing antiperspirant. Aluminum values were rechecked at 3 and 7 months. Results indicate that the patient's antiperspirant choice may have been contributing to aluminum content through skin absorption. Antiperspirant choice may not lead to aluminum toxicity but can contribute to an increased total daily aluminum content.</p><p><b>Conclusion:</b> Preventing aluminum accumulation is vital for patients receiving long-term PN support due to heightened risk of aluminum toxicity. 
Other potential sources of contamination outside of PN include dialysis, processed food, aluminum foil, cosmetic products (antiperspirants, deodorant, toothpaste), medications (antacids), vaccinations, work environments with aluminum welding and certain processing industry plants. Aluminum content of medications and PN additives varies based on brand and amount. Clinicians should review all potential aluminum-containing sources and assess ways to reduce aluminum exposure and prevent potential aluminum toxicity in long-term PN patients.</p><p><b>Table 1.</b> Patient Demographics.</p><p></p><p><b>Table 2.</b> Aluminum Content in PN Prescription.</p><p></p><p></p><p><b>Figure 1.</b> Aluminum Lab Value Result.</p><p>Haruka Takayama, RD, PhD<sup>1</sup>; Kazuhiko Fukatsu, MD, PhD<sup>2</sup>; Midori Noguchi, BA<sup>3</sup>; Nana Matsumoto, RD, MS<sup>2</sup>; Tomonori Narita, MD<sup>4</sup>; Reo Inoue, MD, PhD<sup>3</sup>; Satoshi Murakoshi, MD, PhD<sup>5</sup></p><p><sup>1</sup>St. Luke's International Hospital, Chuo-ku, Tokyo; <sup>2</sup>The University of Tokyo, Bunkyo-ku, Tokyo; <sup>3</sup>The University of Tokyo Hospital, Bunkyo-ku, Tokyo; <sup>4</sup>The University of Tokyo, Chuo-City, Tokyo; <sup>5</sup>Kanagawa University of Human Services, Yokosuka-city, Kanagawa</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Our previous study demonstrated that beta-hydroxy-beta-methylbutyrate (HMB)-supplemented total parenteral nutrition (TPN) partially restores the gut-associated lymphoid tissue (GALT) atrophy observed in standard TPN-fed mice. Oral intake of HMB is now popular among bodybuilders and athletes. Herein, we examined whether oral supplementation of HMB could increase GALT mass in mice that eat dietary chow ad libitum.</p><p><b>Methods:</b> Six-week-old male Institute of Cancer Research (ICR) mice were divided into the Control (n = 9), the H600 (n = 9) and the H2000 (n = 9) groups.
All mice were allowed to take chow and water ad libitum for 7 days. The H600 and H2000 mice were given water containing Ca-HMB at 3 and 10 mg/mL, respectively, while the Controls drank normal tap water. Because these mice drank approximately 6-7 mL of water per day, the H600 and H2000 groups received approximately 600 and 2000 mg/kg of Ca-HMB per day, respectively. After 7 days of this regimen, all mice were killed by cardiac puncture under general anesthesia, and the whole small intestine was harvested for GALT cell isolation. GALT cell numbers were evaluated in each tissue (Peyer's patches, PPs; intraepithelial spaces, IE; and lamina propria, LP). The nasal washings, bronchoalveolar lavage fluid (BALF), and intestinal washings were also collected for IgA level measurement by ELISA. The Kruskal-Wallis test was used for all parameter analyses, and the significance level was set at less than 5%.</p><p><b>Results:</b> There were no significant differences in the number of GALT cells at any site among the 3 groups (Table 1). Likewise, mucosal IgA levels did not differ between any two groups (Table 2).</p><p><b>Conclusion:</b> Oral intake of HMB does not affect GALT cell number or mucosal IgA levels when mice are given a normal diet orally. It appears that the beneficial effects of HMB on the GALT are expected only in parenterally fed mice. We should examine the influence of IV HMB in an orally fed model in the next study.</p><p><b>Table 1.</b> GALT Cell Number (x10<sup>7</sup>/body).</p><p></p><p><b>Table 2.</b> IgA Levels.</p><p></p><p>Median (interquartile range). Kruskal-Wallis test.
n: Control = 9, H600 = 8, H2000 = 9.</p><p>Nahoki Hayashi, MS<sup>1</sup>; Yoshikuni Kawaguchi, MD, PhD, MPH, MMA<sup>2</sup>; Kenta Murotani, PhD<sup>3</sup>; Satoru Kamoshita, BA<sup>1</sup></p><p><sup>1</sup>Medical Affairs Department, Research and Development Center, Chiyoda-ku, Tokyo; <sup>2</sup>Hepato-Biliary-Pancreatic Surgery Division, Bunkyo-ku, Tokyo; <sup>3</sup>School of Medical Technology, Kurume, Fukuoka</p><p><b>Financial Support:</b> Otsuka Pharmaceutical Factory, Inc.</p><p><b>Background:</b> The American Society for Parenteral and Enteral Nutrition guideline recommends a target energy intake of 20 to 30 kcal/kg/day in patients undergoing surgery. Infectious complications reportedly decreased when the target energy and protein intake were achieved in the early period after gastrointestinal cancer surgery. However, no studies have investigated the association of prescribed parenteral energy doses with clinical outcomes in patients who did not receive oral/tube feeding in the early period after gastrointestinal cancer surgery.</p><p><b>Methods:</b> Data of patients who underwent gastrointestinal cancer surgery during 2011–2022 and fasted for 7 days or longer after surgery were extracted from a nationwide medical claims database. The patients were divided into 3 groups based on the mean prescribed parenteral energy doses during the 7 days after surgery as follows: the very low group (<10 kcal/kg/day), the low group (10–20 kcal/kg/day), and the moderate group (≥20 kcal/kg/day). Multivariable logistic regression analysis was performed using in-hospital mortality, postoperative complications, length of hospital stay, and total in-hospital medical cost as the objective variables and the 3 groups and confounding factors as the explanatory variables.</p><p><b>Results:</b> Of the 18,294 study patients, the number of patients in the very low, low, and moderate groups was 6,727, 9,760, and 1,807, respectively.
The median prescribed energy doses on the 7th day after surgery were 9.2 kcal/kg, 16 kcal/kg, and 27 kcal/kg in the very low, low, and moderate groups, respectively. The adjusted odds ratio (95% confidence interval) for in-hospital mortality with reference to the very low group was 1.060 (1.057–1.062) for the low group and 1.281 (1.275–1.287) for the moderate group. That for postoperative complications was 1.030 (0.940–1.128) and 0.982 (0.842–1.144) for the low and moderate groups, respectively. The partial regression coefficient (95% confidence interval) for length of hospital stay (days) with reference to the very low group was 2.0 (0.7–3.3) and 3.2 (1.0–5.5), and that for total in-hospital medical cost (US$) was 1,220 (705–1,735) and 2,000 (1,136–2,864), for the low and moderate groups, respectively.</p><p><b>Conclusion:</b> Contrary to the guideline recommendation, prescribed energy doses of ≥10 kcal/kg/day were associated with increases in in-hospital mortality, length of hospital stay, and total in-hospital medical cost. Our findings question the efficacy of the guideline-recommended energy intake for patients during the first 7 days after gastrointestinal cancer surgery.</p><p>Jayme Scali, BS<sup>1</sup>; Gaby Luna, BS<sup>2</sup>; Kristi Griggs, MSN, FNP-C, CRNI<sup>3</sup>; Kristie Jesionek, MPS, RDN, LDN<sup>4</sup>; Christina Ritchey, MS, RD, LD, CNSC, FASPEN, FNHIA<sup>5</sup></p><p><sup>1</sup>Optum Infusion Pharmacy, Thornton, PA; <sup>2</sup>Optum Infusion Pharmacy, Milford, MA; <sup>3</sup>Optum Infusion Pharmacy, Murphy, NC; <sup>4</sup>Optum Infusion Pharmacy, Franklin, TN; <sup>5</sup>Optum Infusion Pharmacy, Bulverde, TX</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Home parenteral nutrition (HPN) is a life-sustaining nutrition support therapy that is administered through a central venous access device (CVAD).
Caregivers of children on HPN are initially trained to provide CVAD care and therapy administration by their clinical team or home infusion nurse. Often there is a gap in training when the patient is ready to assume responsibility for CVAD care. Without proper training, patients are at significant risk of complications such as bloodstream infections and catheter occlusions. The purpose of this study was twofold: 1) to explore the caregiver's perspective about current and future CVAD training practices and 2) to evaluate the need for a proactive formalized CVAD training program when care is transitioned from caregiver to patient.</p><p><b>Methods:</b> An 8-question survey was created using an online software tool. The target audience included caregivers of children receiving HPN. A link to the survey was sent via email and posted on various social media platforms that support the HPN community. The survey was conducted from June 17 to July 18, 2024. Respondents who were not caregivers of a child receiving HPN via a CVAD were excluded.</p><p><b>Results:</b> The survey received 114 responses, but only 86 were included in the analysis based on exclusion criteria. The distribution of children with a CVAD receiving HPN was evenly weighted between 0 and 18 years of age. The majority of the time, initial training regarding HPN therapy and CVAD care was conducted by the HPN clinic/hospital resource/learning center or home infusion pharmacy (Table 1). Forty-eight percent of respondents indicated their HPN team never offers reeducation or shares best practices (Figure 1). Most respondents selected the caregiver as the best individual to train their child on CVAD care and safety (Figure 2).
In addition, 60% of respondents indicated they would want their child to participate in CVAD training if offered (Figure 3).</p><p><b>Conclusion:</b> This survey confirms that most caregivers anticipate training their child to perform CVAD care when it is determined the child is ready for this responsibility. One challenge to this provision of training is that almost half of the respondents in this survey stated they never receive reeducation or best practice recommendations from their team. This finding demonstrates a need for a formalized training program to assist caregivers when transitioning CVAD care to the patient. Since most respondents reported relying on their intestinal rehabilitation or GI/motility clinic for CVAD-related concerns, these centers would be the best place to establish a transition training program. A limitation of the study is that the survey was distributed only via select social media platforms, so users outside of these platforms were not captured. Additional studies would be beneficial in helping to determine the best sequence and cadence for content training.</p><p><b>Table 1.</b> Central Venous Access Device (CVAD) Training and Support Practices.</p><p></p><p></p><p><b>Figure 1.</b> How Often Does Your HPN Team Offer Reeducation or Share Best Practices?</p><p></p><p><b>Figure 2.</b> Who is Best to Train Your Child on CVAD Care Management and Safety?</p><p></p><p><b>Figure 3.</b> If Formalized CVAD Training is Offered, Would You Want Your Child to Participate?</p><p>Laryssa Grguric, MS, RDN, LDN, CNSC<sup>1</sup>; Elena Stoyanova, MSN, RN<sup>2</sup>; Crystal Wilkinson, PharmD<sup>3</sup>; Emma Tillman, PharmD, PhD<sup>4</sup></p><p><sup>1</sup>Nutrishare, Tamarac, FL; <sup>2</sup>Nutrishare, Kansas City, MO; <sup>3</sup>Nutrishare, San Diego, CA; <sup>4</sup>Indiana University, Carmel, IN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Long-term parenteral nutrition (LTPN) within the home is a lifeline for many patients
throughout the United States. Patients utilize central venous access devices (CVAD) to administer LTPN. Central line-associated bloodstream infection (CLABSI) is a serious risk for patients who require LTPN. Rates of CLABSI in LTPN populations range from 0.9-1.1 per 1000 catheter days. The aim of this study was to determine the incidence of CLABSI in a cohort of patients serviced by a national home infusion provider specializing in LTPN and to identify variables associated with an increased incidence of CLABSI.</p><p><b>Methods:</b> Electronic medical records of LTPN patients with intestinal failure were queried from March 2023 to May 2024 for patient demographics, anthropometric data, nursing utilization, parenteral nutrition prescription including lipid type, length of therapy use, geographic distribution, prescriber specialty, history of CLABSI, blood culture results as available, and use of ethanol lock. Patient zip codes were used to determine rural health areas, as defined by the US Department of Health & Human Services. Patients were divided into two groups: 1) patients who had at least one CLABSI and 2) patients with no CLABSI during the study period. Demographic and clinical variables were compared between the two groups. Nominal data were analyzed by Fisher's exact test; continuous data were analyzed with Student's t-test when normally distributed and the Mann-Whitney U-test when non-normally distributed.</p><p><b>Results:</b> We identified 198 persons who were maintained on LTPN during the study time. The overall CLABSI rate for this cohort during the study period was 0.49 per 1000 catheter days. Forty-four persons with LTPN had one or more CLABSI and 154 persons with LTPN did not have a CLABSI during the study period.
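The per-1000-catheter-day incidence used in these Results is a simple normalized ratio. A minimal sketch follows; the total catheter days are hypothetical, since the abstract reports only the resulting rate:

```python
def clabsi_rate_per_1000(infections: int, catheter_days: int) -> float:
    """CLABSI incidence normalized to 1000 catheter days."""
    return 1000 * infections / catheter_days

# Hypothetical denominator: 44 infections over ~90,000 catheter days
# yields a rate close to the 0.49 per 1000 catheter days reported here.
print(round(clabsi_rate_per_1000(44, 90_000), 2))  # → 0.49
```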
Persons who experienced CLABSI weighed significantly more, had fewer days of infusing injectable lipid emulsions (ILE), and had a shorter catheter dwell duration compared to those who did not have a CLABSI (Table 1). There was no significant difference between the CLABSI and no-CLABSI groups in the length of time on LTPN, location of consumer (rural versus non-rural), utilization of home health services, number of days parenteral nutrition (PN) was infused, or use of ethanol locks (Table 1).</p><p><b>Conclusion:</b> In this retrospective cohort study, we report a CLABSI rate of 0.49 per 1000 catheter days, which is lower than previously published CLABSI rates for similar patient populations. Patient weight, days of infusing ILE, and catheter dwell duration were significantly different between those who did and did not have a CLABSI in this study period. Yet, variables such as use of ethanol lock and proximity to care providers, which had previously been reported to impact CLABSI, were not significantly different in this cohort. An expanded study with more LTPN patients or a longer study duration may be necessary to confirm these results and their impact on CLABSI rates.</p><p><b>Table 1.</b> Long Term Parenteral Nutrition (LTPN) Characteristics.</p><p></p><p>Silvia Figueiroa, MS, RD, CNSC<sup>1</sup>; Stacie Townsend, MS, RD, CNSC<sup>2</sup></p><p><sup>1</sup>MedStar Washington Hospital Center, Bethesda, MD; <sup>2</sup>National Institutes of Health, Bethesda, MD</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> In hospitalized patients, lipid emulsions constitute an essential component of balanced parenteral nutrition (PN). Soy oil-based lipid injectable emulsions (SO-ILE) were traditionally administered as part of PN formulations primarily as a source of energy and for prevention of essential fatty acid deficiency.
Contemporary practice has evolved to incorporate mixtures of different lipid emulsions, including a combination of soy, MCT, olive, and fish oils (SO, MCT, OO, FO-ILE). Evidence suggests that the use of SO, MCT, OO, FO-ILEs may alter essential fatty acid profiles, impacting hepatic metabolism and other processes associated with clinical benefits. The aim of this project was to compare essential fatty acid profile (EFAP), triglycerides (TGL), liver function tests (LFTs), and total bilirubin (TB) levels in adult patients receiving parenteral nutrition with SO-ILE or SO, MCT, OO, FO-ILE in a unique hospital entirely dedicated to conducting clinical research.</p><p><b>Methods:</b> This was a retrospective chart review from 1/1/2019 to 12/31/2023 of adult patients in our hospital who received PN with SO-ILE or SO, MCT, OO, FO-ILE and had EFAP assessed after 7 days of receiving ILE. Data included demographic, clinical, and nutritional parameters. Patients with no laboratory markers, those on propofol, and those who received both ILE products in the 7 days prior to collection of EFAP were excluded. Data were statistically analyzed using Fisher's tests and Mann-Whitney U tests as appropriate.</p><p><b>Results:</b> A total of 42 patient charts were included (14 SO-ILE; 28 SO, MCT, OO, FO-ILE). Group characteristics can be found in Table 1. Patients on SO-ILE received more ILE (0.84 vs 0.79 g/kg/day, p < 0.0001). TGL levels changed significantly after the start of ILE (p < 0.0001). LFTs were found to be elevated in 57% of patients in the SO-ILE group and 60% in the SO, MCT, OO, FO-ILE group, while TB was increased in 21% and 40% of the patients, respectively (Figure 1). Further analysis showed no significant differences in LFTs and TB between the two groups. Assessment of EFAP revealed a significant difference in the levels of DHA, docosenoic acid, and EPA, which were found to be higher in the group receiving SO, MCT, OO, FO-ILE.
Conversely, significant differences were also observed in the levels of linoleic acid, homo-γ-linolenic acid, and total omega-6 fatty acids, with those being higher in patients administered SO-ILE (Figure 2). No differences were observed between groups regarding the presence of essential fatty acid deficiency, as indicated by the triene:tetraene ratio.</p><p><b>Conclusion:</b> In our sample analysis, LFTs and TB levels did not differ significantly between the SO-ILE and SO, MCT, OO, FO-ILE groups. Increased levels of DHA, docosenoic acid, and EPA were found in the SO, MCT, OO, FO-ILE group, while linoleic acid, homo-γ-linolenic acid, and total omega-6 fatty acids tended to be higher in the SO-ILE group. Although in our sample the SO, MCT, OO, FO-ILE group received a lower dosage of ILE/kg/day, there were no differences in the rate of essential fatty acid deficiency between groups.</p><p><b>Table 1.</b> General Characteristics (N = 42).</p><p></p><p></p><p><b>Figure 1.</b> Liver Function Tests (N = 39).</p><p></p><p><b>Figure 2.</b> Essential Fatty Acid Profile (N = 42).</p><p>Kassandra Samuel, MD, MA<sup>1</sup>; Jody (Lind) Payne, RD, CNSC<sup>2</sup>; Karey Schutte, RD<sup>3</sup>; Kristen Horner, RDN, CNSC<sup>3</sup>; Daniel Yeh, MD, MHPE, FACS, FCCM, FASPEN, CNSC<sup>3</sup></p><p><sup>1</sup>Denver Health, St. Joseph Hospital, Denver, CO; <sup>2</sup>Denver Health, Parker, CO; <sup>3</sup>Denver Health, Denver, CO</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Early nutritional support in hospitalized patients has long been established as a key intervention to improve overall patient outcomes. Patients admitted to the hospital often have barriers to receiving adequate nutrition via the enteral route and may be candidates for parenteral nutrition (PN). Central parenteral nutrition (CPN) requires central access, which has historically led to concerns for central line-associated bloodstream infection (CLABSI).
Obtaining central access can be resource-intensive and may result in treatment delays while awaiting access. Conversely, peripheral parenteral nutrition (PPN) can be delivered without central access. In this quality improvement project, we sought to characterize our PPN utilization at a large urban tertiary hospital.</p><p><b>Methods:</b> We performed a retrospective review of adult inpatients receiving PN at our facility from 1/1/23–12/31/23. Patients were excluded from review if they had PN initiated prior to hospitalization. Demographic information, duration of treatment, timely administration status, and information regarding formula nutrition composition were collected.</p><p><b>Results:</b> A total of 128 inpatients received PN for a total of 1302 PN days. The mean age of these patients was 53.8 years old (SD: 17.9) and 65 (50%) were male. Twenty-six (20%) patients received only PPN for a median [IQR] length of 3 [2–4] days, and 61 (48%) patients received only CPN for a median length of 6 [3–10] days. Thirty-nine (30%) patients were started on PPN, with a median time to transition to CPN of 1 [1–3] day and a median total duration of CPN of 8 [5–15.5] days. A small minority of patients received CPN and then transitioned to PPN (2%).</p><p><b>Conclusion:</b> At our institution, PPN is utilized in more than 50% of all inpatient PN courses, most commonly at PN initiation, with eventual transition to CPN for a relatively short duration of one to two weeks.
Additional research is required to identify those patients who might avoid central access by increasing PPN volume and macronutrients to provide adequate nutrition therapy.</p><p>Nicole Halton, NP, CNSC<sup>1</sup>; Marion Winkler, PhD, RD, LDN, CNSC, FASPEN<sup>2</sup>; Elizabeth Colgan, MS, RD<sup>3</sup>; Benjamin Hall, MD<sup>4</sup></p><p><sup>1</sup>Brown Surgical Associates, Providence, RI; <sup>2</sup>Department of Surgery and Nutritional Support at Rhode Island Hospital, Providence, RI; <sup>3</sup>Rhode Island Hospital, Providence, RI; <sup>4</sup>Brown Surgical Associates, Brown University School of Medicine, Providence, RI</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Parenteral nutrition (PN) provides adequate nutrition and fluids to patients with impaired gastrointestinal function who cannot meet their nutritional needs orally or enterally. PN requires a venous access device that has associated risks, including infection, as well as metabolic abnormalities associated with the therapy. Monitoring of PN therapy in the hospital setting involves regular blood work, yet improperly collected samples can lead to abnormal laboratory results and unnecessary medical interventions.</p><p><b>Methods:</b> An IRB-exempt quality improvement study was conducted at Rhode Island Hospital by the Surgical Nutrition Service, which manages all adult PN. The purpose of the study was to quantify the occurrence of contaminated blood samples among PN patients between January 1, 2024, and August 31, 2024. Demographic data, venous access device, and PN-related diagnoses were collected. Quantification of contaminated blood specimens was determined per patient, per hospital unit, and adjusted for total PN days. Comparisons were made between serum glucose, potassium, and phosphorus levels from contaminated and redrawn blood samples.
Descriptive data are reported.</p><p><b>Results:</b> 138 patients received PN for a total of 1840 days, with a median length of PN therapy of 8 days (IQR 9, range 2-84). The most common vascular access device was a dual-lumen peripherally inserted central catheter. The majority (63%) of patients were referred by surgery teams and received care on surgical floors or critical care units. The most frequent PN-related diagnoses were ileus, gastric or small bowel obstruction, and short bowel syndrome. There were 74 contaminated blood specimens among 42 (30%) patients receiving TPN, for a rate of 4% of total PN days. Of 25 nursing units, 64% had at least one occurrence of a contaminated blood specimen among TPN patients on that unit. Contaminated samples showed significantly different serum glucose, potassium, and phosphorus compared to redrawn samples (p < 0.001); glucose in contaminated vs redrawn samples was 922 ± 491 vs 129 ± 44 mg/dL; potassium 6.1 ± 1.6 vs 3.9 ± 0.5 mEq/L; phosphorus 4.9 ± 1.2 vs 3.3 ± 0.6 mg/dL. The average time delay between repeated blood samples was 3 hours.</p><p><b>Conclusion:</b> Contaminated blood samples can lead to delays in patient care, discomfort from multiple blood draws, unnecessary medical interventions (insulin; discontinuation of PN), delays in placement of timely PN orders, and increased infection risk. Nursing re-education on proper blood sampling techniques is critical for reducing contamination occurrences. All policies and procedures will be reviewed, and an educational program will be implemented. Following this, occurrences of blood contamination during PN will be reassessed.</p><p>Hassan Dashti, PhD, RD<sup>1</sup>; Priyasahi Saravana<sup>1</sup>; Meghan Lau<sup>1</sup></p><p><sup>1</sup>Massachusetts General Hospital, Boston, MA</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> ASN Nutrition 2024.</p><p><b>Publication:</b> Saravana P, Lau M, Dashti HS.
Continuous glucose monitoring in adults with short bowel syndrome receiving overnight infusions of home parenteral nutrition. Eur J Clin Nutr. 2024 Nov 23. doi: 10.1038/s41430-024-01548-z. Online ahead of print. PMID: 39580544.</p><p><b>Financial Support:</b> ASPEN Rhoads Research Foundation.</p><p>Maria Romanova, MD<sup>1</sup>; Azadeh Lankarani-Fard, MD<sup>2</sup></p><p><sup>1</sup>VA Greater Los Angeles Healthcare System, Oak Park, CA; <sup>2</sup>VA Greater Los Angeles Healthcare System, Los Angeles, CA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition is a serious complication of the hospital stay. Parenteral nutrition (PN) is the most sophisticated way of addressing it but requires ongoing monitoring. In our medical center, PN provision is guided by the interdisciplinary Nutrition Support Team (NST). In 2024, we began creating a dashboard to monitor safety and utilization of PN at the Greater Los Angeles VA. Here we discuss the collaborative process of developing the dashboard and its first use.</p><p><b>Methods:</b> A dashboard was constructed using data from the VA electronic health record. The dashboard used Microsoft Power BI technology to customize data visualization. The NST group worked closely with the Data Analytics team at the facility to modify and validate the dashboard to accommodate the needs of the group. The dashboard was maintained behind a VA firewall and only accessible to members of the NST. The dashboard presented patient-level data for all patients for whom a Nutrition Support consult had been placed in the last 2 years.
The variables included were the date of admission, date of consult request, the treating specialty at the time of request, demographics, admission diagnosis, discharge diagnosis, number of orders for PPN/TPN, number of blood sugars >200 mg/dL after admission, number of serum phosphorus values < 2.5 mg/dL, number of serum potassium values < 3.5 mmol/L, any discharge diagnosis of refeeding (ICD 10 E87.8), micronutrient levels during admission, and any discharge diagnosis of infection. The ICD10 codes used to capture infection were for: bacteremia (R78.81), sepsis (A41.*), or catheter associated line infection (ICD10 = T80.211*). The asterisk (*) denotes any number in that ICD10 classification. The dashboard was updated once a week. The NST validated the information on the dashboard to ensure accuracy and to refine information as needed.</p><p><b>Results:</b> The initial data extraction noted duplicate consult requests as patients changed treating specialties during the same admission, and duplicate orders for PPN/TPN as the formulations were frequently modified before administration. The Data Analytics team worked to reduce these duplicates. The NST also collaborated with the Data Analytics team to modify their existing documentation to better capture the data needed going forward. Dashboard data were verified by direct chart review. Between April 2022 and April 2024, 68 consults were placed from the acute care setting and 58 patients received PPN or TPN during this time period. Thirty-five patients experienced hyperglycemia. Two patients were deemed to have experienced refeeding syndrome at the time of discharge. Fourteen episodes of infection were noted in those who received PPN/TPN, but the etiology was unclear from the dashboard alone and required additional chart review.</p><p><b>Conclusion:</b> A dashboard can facilitate monitoring of Nutrition Support services in the hospital.
Refinement of the dashboard requires collaboration between the clinical team and the data analytics team to ensure validity and workload capture.</p><p>Michael Fourkas, MS<sup>1</sup>; Julia Rasooly, MS<sup>1</sup>; Gregory Schears, MD<sup>2</sup></p><p><sup>1</sup>PuraCath Medical Inc., Newark, CA; <sup>2</sup>Department of Anesthesiology and Perioperative Medicine, Mayo Clinic, Rochester, MN</p><p><b>Financial Support:</b> Funding of the study has been provided by Puracath Medical.</p><p><b>Background:</b> Intravenous catheters can provide venous access for drug and nutrition delivery in patients for extended periods of time, but risk the occurrence of central line-associated bloodstream infections (CLABSI) due to inadequate asepsis. Needleless connectors (NC), which provide access for injection of medications, are known to be one of the major sources of contamination. Studies demonstrate that current methods of disinfecting connectors, such as a 15-second antiseptic wipe, do not guarantee complete disinfection inside of connectors. With the rise of superbugs such as Candida auris, there is an urgent need for better aseptic technique compliance and non-antibiotic disinfection methods. Ultraviolet-C (UV-C) light is an established technology that is commonly used in hospital settings for disinfection of equipment and rooms. In this study, we investigate the efficacy of our novel UV-C light disinfection device on UV-C light-transmissive NCs inoculated with common CLABSI-associated organisms.</p><p><b>Methods:</b> Staphylococcus aureus (ATCC #6538), Candida albicans (ATCC #10231), Candida auris (CDC B11903), Escherichia coli (ATCC #8739), Pseudomonas aeruginosa (ATCC #9027), and Staphylococcus epidermidis (ATCC #12228) were used as test organisms for this study. A total of 29 NC samples were tested for each organism with 3 positive controls and 1 negative control.
Each UV-C light-transmissive NC was inoculated with 10 µL of cultured inoculum (7.00-7.66 log) and exposed to an average of 48 mW/cm<sup>2</sup> of UV light for 1 second using our in-house UV light disinfection device, Firefly™. After UV disinfection, 10 mL of 0.9% saline solution was flushed through the NC and filtered through a 0.45 µm membrane. The membrane filter was plated onto an agar medium matched to the organism and was incubated overnight at 37°C for S. aureus, E. coli, S. epidermidis, and P. aeruginosa, and for two days at room temperature for C. albicans and C. auris. Positive controls followed the same procedure without exposure to UV light and were diluted 100-fold before being spread onto agar plates in triplicate. The negative controls followed the same procedure without inoculation. After the plates were incubated, the number of colonies on each plate was counted and recorded. Log reduction was calculated as the log of the positive control concentration over the sample concentration in cfu/mL; a detection limit of 1 cfu/10 mL was used in calculations for complete kills.</p><p><b>Results:</b> Using our UV light generating device, we achieved an average log reduction greater than 4 and complete kills for all test organisms. The log reductions for S. aureus, C. albicans, C. auris, E. coli, P. aeruginosa, and S. epidermidis were 5.29, 5.73, 5.05, 5.24, 5.10, and 5.19, respectively.</p><p><b>Conclusion:</b> We demonstrated greater than 4-log reduction in common CLABSI-associated organisms using our UV light disinfection device and UV-C transmissive NCs. By injecting inoculum directly inside the NC, we demonstrated that disinfection inside NCs can be achieved, which is not possible with conventional scrubbing methods.
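The log-reduction figure used throughout this abstract is the log<sup>10</sup> ratio of the untreated control concentration to the post-treatment concentration, with the detection limit substituted when no colonies grow. A minimal sketch with hypothetical concentrations (the real control titers are not reported per organism here):

```python
import math

# Detection limit described in the Methods: 1 cfu recoverable from a 10 mL flush.
DETECTION_LIMIT_CFU_PER_ML = 1 / 10

def log_reduction(control_cfu_per_ml: float, sample_cfu_per_ml: float) -> float:
    """Log10 reduction of viable organisms relative to the positive control."""
    return math.log10(control_cfu_per_ml / sample_cfu_per_ml)

# Hypothetical complete kill: a control of 1e5 cfu/mL with no recoverable
# colonies is capped at the detection limit, giving a 6-log reduction.
print(round(log_reduction(1e5, DETECTION_LIMIT_CFU_PER_ML), 2))  # → 6.0
```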
A one-second NC disinfection time will allow less disruption of workflow in hospitals, particularly in intensive care units, where highly effective and efficient disinfection rates are essential for adoption of the technology.</p><p><b>Table 1.</b> Log Reduction of Tested Organisms After Exposure to 48 mW/cm<sup>2</sup> UV-C for 1 Second.</p><p></p><p>Yaiseli Figueredo, PharmD<sup>1</sup></p><p><sup>1</sup>University of Miami Hospital, Miami, FL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Octreotide belongs to the somatostatin analog class. It is used off-label for malignant bowel obstructions (MBO). Somatostatin analogs (SSA) inhibit the release and action of multiple hormones, reducing gastric secretions, peristalsis, and splanchnic blood flow while enhancing water and electrolyte absorption. National Comprehensive Cancer Network (NCCN) guidelines recommend octreotide 100-300 mcg subcutaneously two to three times a day or 10-40 mcg/hour by continuous infusion for the management of malignant bowel obstructions and, if prognosis is greater than 8 weeks, consideration of long-acting release (LAR) or depot injection. Using octreotide as an additive to parenteral nutrition solutions has been a debated topic due to concerns about formation of a glycosyl octreotide conjugate that may decrease octreotide's efficacy. However, other compatibility studies have found little octreotide loss over 48 hours in TPN solutions at room temperature in ambient room light. At the University of Miami Hospital, octreotide is used as an additive to total parenteral nutrition (TPN) solutions to reduce gastrointestinal secretions in patients with malignant bowel obstructions. The starting dose is 300 mcg, and the dose is increased in 300 mcg increments to a maximum of 900 mcg if output remains uncontrolled/elevated.
The objective of this study is to evaluate the effectiveness and safety of octreotide as an additive to TPN in hospitalized patients with malignant bowel obstructions.</p><p><b>Methods:</b> A three-year retrospective chart review (June 2021-June 2024) was conducted to evaluate the effectiveness and safety of octreotide as an additive to TPN in hospitalized patients with an MBO diagnosis at UMH. The following information was obtained from chart review: age, gender, oncologic diagnosis, TPN indication, TPN dependency, octreotide doses used, baseline and final gastrointestinal secretion output recorded, type of venting gastrostomy in place, length of hospital stay, and baseline and final hepatic function tests.</p><p><b>Results:</b> A total of 27 patients were identified with malignant bowel obstruction requiring TPN with octreotide as an additive. All patients were started on octreotide 300 mcg/day added to a 2-in-1 TPN solution. Gastrointestinal secretion output was reduced on average by 65% among all patients, with a final average daily output of 540 mL recorded. The baseline average output recorded was 1,518 mL/day. The average length of treatment as an inpatient was 23 days (range 3-98 days). Liver function tests (LFTs) were assessed at baseline and at the last inpatient value available for the admission. Four of the 27 patients (15%) reviewed were observed to have a significant rise in liver enzymes, greater than three times the upper limit of normal.</p><p><b>Conclusion:</b> Octreotide represents a valuable addition to the limited pharmacological options for managing malignant bowel obstruction. Its ability to reduce gastrointestinal secretions by 65% on average, as observed in this retrospective chart review, can significantly alleviate symptoms and improve patient care. 
Using octreotide as an additive to TPN solutions for patients with malignant bowel obstructions who are TPN dependent reduces the number of infusions or subcutaneous injections patients receive per day. According to octreotide's package insert, the incidence of hepatobiliary complications is up to 63%. The finding that 15% of patients in this retrospective chart review had significant liver enzyme elevations underscores the importance of monitoring liver function.</p><p>Pavel Tesinsky, Assoc. Prof., MUDr.<sup>1</sup>; Jan Gojda, Prof., MUDr, PhD<sup>2</sup>; Petr Wohl, MUDr, PhD<sup>3</sup>; Katerina Koudelkova, MUDr<sup>4</sup></p><p><sup>1</sup>Department of Medicine, Prague, Hlavni mesto Praha; <sup>2</sup>Department of Medicine, University Hospital, 3rd Faculty of Medicine Charles University in Prague, Praha, Hlavni mesto Praha; <sup>3</sup>Institute for Clinical and Experimental Medicine, Prague, Hlavni mesto Praha; <sup>4</sup>Department of Medicine, University Hospital and 3rd Faculty of Medicine in Prague, Prague, Hlavni mesto Praha</p><p><b>Financial Support:</b> The Registry was supported by Takeda and Baxter scientific grants.</p><p><b>Background:</b> To describe trends in indications, syndromes, performance, weaning, and complications of patients on total HPN, based on an updated 30-year analysis and stratification of patients on home parenteral nutrition (HPN) in the Czech Republic.</p><p><b>Methods:</b> Records from the HPN National Registry were analysed for the period 2007-2023, based on data from the HPN centers. Catheter-related sepsis (CRS), catheter occlusions, and thrombotic complications were analyzed for time-to-event using the competing-risks regression (Fine and Gray) model. Other data are presented as median or mean with 95% CI (p < 0.05 considered significant).</p><p><b>Results:</b> The incidence rate of HPN is 1.98 per 100,000 inhabitants (population 10.5 million). 
Lifetime dependency is expected in 20% of patients, potential weaning in 40%, and 40% of patients are palliative. Out of 1838 records representing almost 1.5 million catheter days, short bowel syndrome was present in 672 patients (36.6%), intestinal obstruction in 531 patients (28.9%), malabsorption in 274 patients (14.9%), and the remaining 361 patients (19.6%) were split among fistulas, dysphagia, or remained unspecified. The majority of SBS were type I (57.8%) and II (20.8%). Mean length of residual intestine was 104.3 cm (35.9-173.4 cm), with longer remnants in type I SBS. Dominant indications for HPN were pseudoobstruction (35.8%), non-malignant surgical conditions (8.9%), Crohn disease (7.3%), and mesenteric occlusion (6.8%). Mobility for a substantial part of the day was reported by 77.8% of HPN patients; economic activity and independence were reported by 162 (24.8%) of 653 patients with the potential for economic activity. A tunneled catheter was primarily used in 49.1%, PICC in 24.3%, and IV port in 19.8% of patients. Commercially prepared bags were used in 69.7%, and pharmacy-prepared admixtures in 24.7% of patients. A total of 66.9% of patients were administered 1 bag per day, 7 days a week. The sepsis ratio per 1000 catheter days decreased from 0.84 in 2013 to 0.15 in 2022. The catheter occlusion ratio decreased from 0.152 to 0.10 per 1000 catheter days, and the thrombotic complication ratio from 0.05 to 0.04. Prevalence of metabolic bone disease is 15.6%, and prevalence of PNALD is 22.3%. In the first 12 months, 28% of patients achieved intestinal autonomy, increasing to 45% after 5 years. Patient survival is 62% in the first year, 45% at 5 years, and 35% at the 10-year mark. Teduglutide was indicated in 36 patients to date, with reduction of the daily HPN volume to 60.3% on average.</p><p><b>Conclusion:</b> The prevalence of HPN patients in the Czech Republic has been increasing over the past ten years, corresponding to the incidence rate. 
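The per-1000-catheter-day ratios reported above follow standard rate arithmetic; a minimal sketch (the function name and example numbers are illustrative, not registry data):

```python
def rate_per_1000_catheter_days(events: int, catheter_days: int) -> float:
    """Complication rate expressed as events per 1000 catheter days."""
    return events / catheter_days * 1000

# Illustrative only: 15 septic events over 100,000 catheter days
# gives a rate of 0.15 per 1000 catheter days.
print(round(rate_per_1000_catheter_days(15, 100_000), 2))  # 0.15
```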
The majority of patients are expected to terminate HPN within the first year. The risk of CRS decreased significantly in the past five years and remains low, while catheter occlusion and thrombotic complications show a stable trend. Teduglutide significantly reduced the required IV volume.</p><p></p><p><b>Figure 1.</b> Per-Year Prevalence of HPN Patients and Average Number of Catheter Days Per Patient (2007 - 2022).</p><p></p><p><b>Figure 2.</b> Annual Incidence of HPN Patients (2007 - 2022).</p><p></p><p><b>Figure 3.</b> Catheter-related bloodstream infections (events per 1,000 catheter-days).</p><p>Jill Murphree, MS, RD, CNSC, LDN<sup>1</sup>; Anne Ammons, RD, LDN, CNSC<sup>2</sup>; Vanessa Kumpf, PharmD, BCNSP, FASPEN<sup>2</sup>; Dawn Adams, MD, MS, CNSC<sup>2</sup></p><p><sup>1</sup>Vanderbilt University Medical Center, Nashville, TN; <sup>2</sup>Vanderbilt University Medical Center, Nashville, TN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Determining macronutrient goals in patients requiring home parenteral nutrition (HPN) can be difficult due to various factors. While indirect calorimetry is the gold standard for measuring energy expenditure, it is not readily available in the outpatient setting. Therefore, clinicians typically rely on less accurate weight-based equations for assessment of protein and energy requirements. Energy goals are also impacted by the targeted desire for weight loss, weight gain, or weight maintenance. Patients receiving HPN may consume some oral dietary intake and experience variable degrees of macronutrient absorption. These factors, as well as underlying clinical conditions, can significantly impact protein and energy requirements and may change over the course of HPN therapy. 
The purpose of this study was to evaluate the range of protein and energy doses prescribed in patients receiving HPN managed by an interdisciplinary intestinal failure clinic at a large academic medical center.</p><p><b>Methods:</b> Patient demographics including age, gender, and PN indication/diagnosis were retrospectively obtained for all patients discharged home with PN between May 2021 and May 2023 utilizing an HPN patient database. Additional information was extracted from the electronic medical record at the start of HPN, then at 2-week, 2 to 3 month, and 6-month intervals following discharge home, including height, actual weight, target weight, HPN energy dose, HPN protein dose, and whether the patient was eating. Data collection ended at completion of HPN therapy or up to 6 months of HPN. All data were entered and stored in an electronic database.</p><p><b>Results:</b> During the study period, 248 patients were started on HPN and 56 of these patients received HPN for at least 6 months. Patient demographics are included in Table 1. At the start of HPN, prescribed energy doses ranged from 344 to 2805 kcal/d (6 to 45 kcal/kg/d) and prescribed protein doses ranged from 35 to 190 g/d (0.6 to 2.1 g/kg/d). There continued to be a broad range of prescribed energy and protein doses at the 2-week, 2 to 3 month, and 6-month intervals of HPN. Figures 1 and 2 provide the prescribed energy and protein doses for all patients and for those who were eating and not eating. For patients not eating, the prescribed range at the start of PN therapy was 970 to 2791 kcal/d (8 to 45 kcal/kg/d) for energy and 40 to 190 g/d (0.6 to 2.0 g/kg/d) for protein. The difference between actual weight and target weight was assessed at each study interval. 
Over the study period, patients demonstrated a decrease in the difference between actual and target weight, suggesting improvement in reaching target weight (Figure 3).</p><p><b>Conclusion:</b> The results of this study demonstrate a wide range of energy and protein doses prescribed in patients receiving HPN. This differs from use of PN in the inpatient setting, where weight-based macronutrient goals tend to be more defined. Macronutrient adjustments may be necessary in the long-term setting when patients are consuming oral intake, for achievement/maintenance of target weight, or for changes in underlying conditions. Patients receiving HPN require an individualized approach to care that can be provided by interdisciplinary nutrition support teams specializing in intestinal failure.</p><p><b>Table 1.</b> Patient Demographics Over 6-Month Study Period.</p><p></p><p></p><p><b>Figure 1.</b> Parenteral Nutrition (PN) Energy Range.</p><p></p><p><b>Figure 2.</b> Parenteral Nutrition (PN) Protein Range.</p><p></p><p><b>Figure 3.</b> Difference Between Actual Weight and Target Weight.</p><p>Jennifer Lachnicht, RD, CNSC<sup>1</sup>; Christine Miller, PharmD<sup>2</sup>; Jessica Younkman, RD CNSC<sup>2</sup></p><p><sup>1</sup>Soleo Home Infusion, Frisco, TX; <sup>2</sup>Soleo Health, Frisco, TX</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Parenteral nutrition (PN) has been initiated at home since the 1990s, though some clinicians prefer hospital initiation due to risks like refeeding syndrome (RS). A key factor in successful home PN initiation is careful evaluation by an experienced nutrition support clinician, particularly assessing RS risk. In 2020, ASPEN published consensus recommendations for identifying patients at risk for RS and guidelines for initiating and advancing nutrition. 
Home PN initiation offers advantages, including avoiding hospitalization, reducing healthcare costs, minimizing hospital-acquired infections, and improving quality of life. Literature suggests a savings of $2,000 per day when PN is started at home. This study aims to determine the risk and incidence of RS in patients who began PN at home, based on the 2020 ASPEN Consensus Recommendations for RS.</p><p><b>Methods:</b> A national home infusion provider's nutrition support service reviewed medical records for 27 adult patients who initiated home PN between September 2022 and June 2024. Patients were evaluated for RS risk before PN initiation, and the actual incidence of RS was determined from pre- and post-feeding phosphorus, potassium, and magnesium levels using ASPEN 2020 criteria. Initial lab work was obtained before and after PN initiation. Refeeding risk was categorized as mild, moderate, moderate to severe, or severe based on the initial nutrition assessment, including BMI, weight loss history, recent caloric intake, pre-feeding lab abnormalities, fat/muscle wasting, and high-risk comorbidities. The percent change in phosphorus, potassium, and magnesium was evaluated and categorized as mild, moderate, or severe if levels decreased after therapy start. Initial PN prescriptions included multivitamins and supplemental thiamin per provider policy and consensus recommendations.</p><p><b>Results:</b> The average baseline BMI for the study population was 19.6 kg/m² (range 12.7-31.8, median 18.9). Weight loss was reported in 88.9% of patients, averaging 22%. Little to no oral intake for at least 5-10 days before assessment was reported in 92.3% of patients. Initial lab work was obtained within 5 days of therapy start in 96.2% of cases, with 18.5% showing low pre-feeding electrolytes. All patients had high-risk comorbidities. RS risk was categorized as mild (4%), moderate (48%), moderate to severe (11%), and severe (37%). Home PN was successfully initiated in 25 patients (93%). 
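The percent-change grading described in the Methods can be sketched as follows; this is a minimal illustration using the 10-20% / 20-30% / >30% cut-offs reported in the Results, and the function name and boundary handling are our assumptions, not ASPEN's exact wording.

```python
def grade_electrolyte_drop(baseline: float, follow_up: float) -> str:
    """Grade a post-initiation drop in phosphorus, potassium, or magnesium.

    Cut-offs follow the abstract: mild 10-20%, moderate 20-30%, severe >30%
    decrease from baseline; drops under 10% are not flagged. The half-open
    interval boundaries are our assumption.
    """
    if baseline <= 0:
        raise ValueError("baseline must be positive")
    pct_drop = (baseline - follow_up) / baseline * 100
    if pct_drop < 10:
        return "not significant"
    if pct_drop < 20:
        return "mild"
    if pct_drop < 30:
        return "moderate"
    return "severe"

# Illustrative: phosphorus falling from 3.5 to 2.6 mg/dL is a ~25.7% drop.
print(grade_electrolyte_drop(3.5, 2.6))  # moderate
```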
Two patients could not start PN at home: one due to persistently low pre-PN electrolyte levels despite IV repletion, and one due to not meeting home care admission criteria. Starting dextrose averaged 87.2 g/d (range: 50-120, median 100). Average total starting calories were 730 kcal/d, representing 12.5 kcal/kg (range: 9-20, median 12). Initial PN formula electrolyte content included potassium (average 55.4 mEq/d, range: 15-69, median 60), magnesium (average 11.6 mEq/d, range: 4-16, median 12), and phosphorus (average 15.6 mmol/d, range: 8-30, median 15). Labs were drawn on average 4.3 days after therapy start. Potassium, magnesium, and phosphorus levels were monitored for decreases ≥10% from baseline to detect RS. Decreases in magnesium and in potassium were each classified as mild (10-20%) and each occurred in 4% of patients. Eight patients (32%) had a ≥10% decrease in phosphorus: 4 mild (10-20%), 2 moderate (20-30%), and 2 severe (>30%).</p><p><b>Conclusion:</b> Home initiation of PN can be safely implemented with careful monitoring and evaluation of RS risk. This review showed a low incidence of RS based on ASPEN criteria, even in patients at moderate to severe risk prior to home PN initiation. Close monitoring of labs and patient status, along with adherence to initial prescription recommendations, resulted in successful PN initiation at home in 92.5% of patients.</p><p>Dana Finke, MS, RD, CNSC<sup>1</sup>; Christine Miller, PharmD<sup>1</sup>; Paige Paswaters, RD, CNSC<sup>1</sup>; Jessica Younkman, RD, CNSC<sup>1</sup></p><p><sup>1</sup>Soleo Health, Frisco, TX</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Iron deficiency anemia is common in home parenteral nutrition (HPN) patients (Hwa et al., 2016). Post iron infusion, hypophosphatemia due to fibroblast growth factor 23 is a known side effect of some IV iron formulations (Wolf et al., 2019). 
Ferric carboxymaltose can cause serum phosphorus levels to drop below 2 mg/dL in 27% of patients (Onken et al., 2014). Hypophosphatemia (< 2.7 mg/dL) can lead to neurological, neuromuscular, cardiopulmonary, and hematologic issues, and long-term effects like osteopenia and osteoporosis (Langley et al., 2017). This case series reviews the occurrence and clinical implications of transient serum phosphorus drops in patients receiving ferric carboxymaltose and HPN.</p><p><b>Methods:</b> A retrospective case series review was performed for three patients who were administered ferric carboxymaltose while on HPN therapy. Serum phosphorus levels were measured at baseline prior to initial iron dose, within 1 week post injection, and at subsequent time intervals up to 4 months post initial injection. Data was collected on the timing, magnitude, and duration of any phosphorus decreases, and any associated clinical symptoms or complications. Patient records were also reviewed to evaluate any correlations with HPN composition or phosphorus dosing.</p><p><b>Results:</b> Among the three patients reviewed, all exhibited short-term drops in serum phosphorus levels following ferric carboxymaltose injection (Table 1). All patients had baseline serum phosphorus levels within normal limits prior to the initial dose of ferric carboxymaltose. All cases involved multiple doses of ferric carboxymaltose, which contributed to the fluctuations in phosphorus levels. The average drop in serum phosphorus from baseline to the lowest point was 50.3%. The lowest recorded phosphorus level among these three patients was 1.4 mg/dL, and this was in a patient who received more than two doses of ferric carboxymaltose. In two cases, increases were made in HPN phosphorus in response to serum levels, and in one case no HPN changes were made. However, all serum phosphorus levels returned to normal despite varied interventions. 
Despite the low phosphorus levels, none of the patients reported significant symptoms of hypophosphatemia during the monitoring periods. Ferric carboxymaltose significantly impacts serum phosphorus in HPN patients, consistent with existing literature. The need for vigilant monitoring is highlighted; patients receiving HPN are closely monitored by a trained nutrition support team with frequent lab monitoring. Lab monitoring in patients receiving ferric carboxymaltose who are not on HPN may be less common. The lowest level recorded was 1.4 mg/dL, indicating potential severity. Despite significant drops, no clinical symptoms were observed, suggesting subclinical hypophosphatemia may be common. In two of the reviewed cases, hypophosphatemia was addressed by making incremental increases in the patients' HPN formulas. Note that there are limitations to phosphorus management in HPN due to compatibility and stability issues, and alternative means of supplementation may be necessary depending upon the patient's individual formula.</p><p><b>Conclusion:</b> Three HPN patients receiving ferric carboxymaltose experienced transient, generally mild reductions in serum phosphorus. Monitoring is crucial, but the results from this case series suggest that clinical complications are rare. Adjustments to HPN or additional supplementation may be needed based on individual patient needs, with some cases self-correcting over time.</p><p><b>Table 1.</b> Timeline of Iron Injections and the Resulting Serum Phosphorus Levels and HPN Formula Adjustments.</p><p></p><p>Danial Nadeem, MD<sup>1</sup>; Stephen Adams, MS, RPh, BCNSP<sup>2</sup>; Bryan Snook<sup>2</sup></p><p><sup>1</sup>Geisinger Wyoming Valley, Bloomsburg, PA; <sup>2</sup>Geisinger, Danville, PA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Ferric carboxymaltose (FC) is a widely used intravenous iron formulation, primarily employed in the treatment of iron deficiency. 
It offers significant benefits, particularly in cases where oral iron supplementation proves ineffective or is not well-tolerated. However, an important potential adverse effect associated with FC use is hypophosphatemia. This condition has been observed in multiple patients following their treatment with FC. This report discusses the potential mechanisms leading to this adverse effect and its significant implications for patient care.</p><p><b>Methods:</b> A middle-aged female with a history of malnutrition and iron deficiency receiving parenteral nutrition at home had received multiple doses of Venofer in the past, with the last dose given in 2017, to which she developed an anaphylactic reaction. She was therefore switched to FC therapy. However, upon receiving multiple doses of FC in 2018, the patient developed significant hypophosphatemia. As hypophosphatemia was noted, adjustments were made to the patient's total parenteral nutrition (TPN) regimen to increase the total phosphorus content in an effort to treat the low phosphate levels. The patient also received continued doses of FC in subsequent years, with persistent hypophosphatemia despite repletion.</p><p><b>Results:</b> FC is a widely used intravenous iron therapy for the treatment of iron deficiency. It is particularly beneficial in cases where oral iron supplementation is ineffective or not tolerated. FC works by delivering iron directly to the macrophages in the reticuloendothelial system. The iron is then released slowly for use by the body, primarily for the production of hemoglobin. However, recent studies have highlighted the potential adverse effect of hypophosphatemia associated with the use of FC. Hypophosphatemia induced by FC is thought to be caused by an increase in the secretion of the hormone fibroblast growth factor 23 (FGF23). FGF23 is a hormone that regulates phosphate homeostasis. 
When FGF23 levels rise, the kidneys increase the excretion of phosphate, leading to lower levels of phosphate in the blood. Hypophosphatemia has many implications for patient care. Symptoms of hypophosphatemia can include muscle weakness, fatigue, bone pain, and confusion. In severe cases, persistent hypophosphatemia can lead to serious complications such as rhabdomyolysis, hemolysis, respiratory failure, and even death. Therefore, it is crucial for clinicians to be aware of the potential risk of hypophosphatemia when administering FC.</p><p><b>Conclusion:</b> While FC is an effective treatment for iron deficiency, it is important for clinicians to be aware of the potential risk of hypophosphatemia. Regular monitoring of serum phosphate levels is recommended in patients receiving repeated doses of FC. 
Further research is needed to fully understand the mechanisms underlying this adverse effect and to develop strategies for its prevention and management.</p><p><b>Table 1.</b> Phosphorus Levels and Iron Administration.</p><p></p><p>Table 1 shows the response of serum phosphorus levels in a patient given multiple doses of intravenous iron over time.</p><p>Ashley Voyles, RD, LD, CNSC<sup>1</sup>; Jill Palmer, RD, LD, CNSC<sup>1</sup>; Kristin Gillespie, MD, RD, LDN, CNSC<sup>1</sup>; Suzanne Mack, MS, MPH, RD, LDN, CNSC<sup>1</sup>; Tricia Laglenne, MS, RD, LDN, CNSC<sup>1</sup>; Jessica Monczka, RD, CNSC, FASPEN<sup>1</sup>; Susan Dietz, PharmD, BCSCP<sup>1</sup>; Kathy Martinez, RD, LD<sup>1</sup></p><p><sup>1</sup>Option Care Health, Bannockburn, IL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Home parenteral nutrition (HPN) can be successfully initiated in the home setting with careful evaluation and management by an experienced nutrition support team (NST).<sup>1,2</sup> Safe candidates for HPN initiation include medically stable patients with an appropriate indication, a safe environment, and the means for reliable follow-up. However, some patients are not appropriate to start directly with HPN due to logistical reasons or the risk of refeeding syndrome (RFS).<sup>2</sup> Consensus Recommendations for RFS provide guidance regarding recognizing and managing risk.<sup>3</sup> An experienced NST that provides individualized care can recommend intravenous (IV) hydration before initiating HPN to expedite the initiation of therapy and normalize blood chemistry to mitigate the risk of RFS. The purpose of this study is to evaluate the impact of IV hydration in adult patients managed by a home infusion NST who received it prior to initiating HPN. 
The proportion of patients who received IV hydration prior to HPN, the reason for initiating IV hydration, and the impact this intervention may have had on their care and outcomes will be described.</p><p><b>Methods:</b> This retrospective review includes 200 HPN patients 18 years of age and older initiated on HPN therapy with a national home infusion pharmacy over a 6-month period from January 1, 2024, to June 30, 2024. Data collection included baseline demographics, indication for HPN, risk of RFS, days receiving IV hydration prior to initiating HPN, the number of rehospitalizations within the first 2 weeks, whether they received IV hydration, and if so, the indication for IV hydration and the components of the orders. Data was collected via electronic medical records and deidentified into a standardized data collection form.</p><p><b>Results:</b> Of the 200 total patients, 19 (9.5%) received IV hydration prior to HPN. Of these 19 patients, 16 were female, and 3 were male (Table 1). The most common indications for HPN were bariatric surgery complication (5), intestinal failure (4), and oncology diagnosis (4) (Figure 1). Among these patients, 9 (47%) were at moderate RFS risk and 10 (53%) were at high RFS risk. The indications for IV hydration included 7 (37%) due to electrolyte abnormalities/RFS risk, 5 (26%) due to delay in central line placement, and 7 (37%) due to scheduling delays (Figure 2). IV hydration orders included electrolytes in 15 (79%) of the orders. All orders without electrolytes (4) had an indication related to logistical reasons (Figure 3). All 19 patients started HPN within 7 days of receiving IV hydration. Two were hospitalized within the first two weeks of therapy with admitting diagnoses unrelated to HPN.</p><p><b>Conclusion:</b> In this group of patients, HPN was successfully initiated in the home setting when managed by an experienced NST, preventing unnecessary hospitalizations. 
This study demonstrated that safe initiation of HPN may be preceded by IV hydration, with or without electrolytes, either to mitigate RFS risk or for logistical reasons, with HPN started within 7 days of hydration. The IV hydration orders were individualized to fit the needs of each patient. This data only reflects IV hydration dispensed through the home infusion pharmacy and does not capture IV hydration received at an outside clinic. This patient population also does not include those deemed medically unstable for HPN or those for whom starting in the home setting was not feasible for other reasons. Future research should account for these limitations and detail IV hydration components, dosing, and frequency of orders.</p><p><b>Table 1.</b> Demographics.</p><p></p><p></p><p><b>Figure 1.</b> HPN Indications of IV Hydration.</p><p></p><p><b>Figure 2.</b> Indication for IV Hydration and Refeeding Risk.</p><p></p><p><b>Figure 3.</b> Indications and Types of IV Hydration.</p><p>Emily Boland Kramer, MS, RD, LDN, CNSC<sup>1</sup>; Jessica Monczka, RD, CNSC, FASPEN<sup>1</sup>; Tricia Laglenne, MS, RD, LDN, CNSC<sup>1</sup>; Ashley Voyles, RD, LD, CNSC<sup>1</sup>; Susan Dietz, PharmD, BCSCP<sup>1</sup>; Kathy Martinez, RD, LD<sup>1</sup></p><p><sup>1</sup>Option Care Health, Bannockburn, IL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Parenteral nutrition (PN) is a critical therapy for patients who are unable to absorb nutrients adequately via the gastrointestinal tract.<sup>1</sup> PN is complex, with 10 or more individually dosed components in each order, which inherently increases the risk for dosing errors.<sup>2</sup> This study seeks to analyze the PN orders at hospital discharge received by a home infusion provider and identify the incidence of omission of standard components, as determined by ASPEN Recommendations on Appropriate Parenteral Nutrition Dosing.<sup>3</sup> The primary objective of this study was to identify missing sodium, potassium, magnesium, calcium, phosphorus, and multivitamin components in hospital discharge orders. The secondary objective was to determine whether identified missing components were added back during the transition of care (TOC) process from hospital to home.</p><p><b>Methods:</b> This multi-center, retrospective chart review analyzed patients referred to a national home infusion provider over a 3-month period. Data was collected from the electronic medical record and internal surveillance software. The Registered Dietitian, Certified Nutrition Support Clinician (RD, CNSC) reviewed all PN hospital discharge orders for patients transitioning clinical management and PN therapy from hospital to home. Inclusion criteria were patients 18 years of age or older with PN providing the majority of their nutritional needs, as defined in Table 1, who were missing sodium, potassium, magnesium, calcium, phosphorus, or multivitamin from their PN order at hospital discharge. Exclusion criteria were patients less than 18 years of age, patients receiving supplemental PN not providing the majority of nutritional needs, and patients with doses ordered for all electrolytes and multivitamin.</p><p><b>Results:</b> During the 3-month period (April 1, 2024 to June 30, 2024), 267 patients were identified who were greater than 18 years of age, receiving the majority of their nutritional needs via PN, and missing at least one PN component in their hospital discharge order. See Table 2 and Figure 1 for demographics. 
One hundred seventy-five (65.5%) patients were missing one component and 92 (34.5%) were missing multiple components from the hospital discharge order. One hundred seventy-five (65.5%) patients were missing calcium, 68 (25.5%) phosphorus, 38 (14%) multivitamin, 23 (8.6%) magnesium, 20 (7.5%) potassium, and 20 (7.5%) sodium. During the transition from hospital to home, after discussion with the provider, 94.9% of patients had calcium added back, 94.7% multivitamin, 91.3% magnesium, 90% potassium, 88.2% phosphorus, and 80% sodium.</p><p><b>Conclusion:</b> This study highlights the prevalence of missing components in PN hospital discharge orders, with calcium being the most frequently omitted at a rate of 65.5%. Given that many patients discharging home on PN will require long term therapy, adequate calcium supplementation is essential to prevent bone resorption and complications related to metabolic bone disease. In hospital discharge orders that were identified by the RD, CNSC as missing calcium, 94.9% of the time the provider agreed that it was clinically appropriate to add calcium to the PN order during the TOC process. This underlines the importance of nutrition support clinician review and communication during the transition from hospital to home. 
Future research should analyze the reasons why components are missing from PN orders and increase awareness of the need for a thorough clinical review of all patients going home on PN, to ensure the adequacy of all components required for safe and optimized long-term PN.</p><p><b>Table 1.</b> Inclusion and Exclusion Criteria.</p><p></p><p><b>Table 2.</b> Demographics.</p><p></p><p></p><p><b>Figure 1.</b> Primary PN Diagnosis.</p><p></p><p><b>Figure 2.</b> Components Missing from Order and Added Back During TOC Process.</p><p>Avi Toiv, MD<sup>1</sup>; Hope O'Brien, BS<sup>2</sup>; Arif Sarowar, MSc<sup>2</sup>; Thomas Pietrowsky, MS, RD<sup>1</sup>; Nemie Beltran, RN<sup>1</sup>; Yakir Muszkat, MD<sup>1</sup>; Syed-Mohammad Jafri, MD<sup>1</sup></p><p><sup>1</sup>Henry Ford Hospital, Detroit, MI; <sup>2</sup>Wayne State University School of Medicine, Detroit, MI</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Intestinal Failure Associated Liver Disease (IFALD) is a known complication in patients reliant on total parenteral nutrition (TPN), especially those awaiting intestinal transplantation (IT). There is concern that IFALD may negatively impact post-transplant outcomes, including graft survival and overall patient mortality. This study aims to evaluate the impact of IFALD, as indicated by liver function test (LFT) abnormalities before intestinal or multivisceral transplant (MVT), on transplant outcomes.</p><p><b>Methods:</b> We conducted a retrospective chart review of all patients who underwent IT at an academic transplant center from 2010 to 2023. The primary outcomes were patient survival and graft failure in transplant recipients.</p><p><b>Results:</b> Among 50 IT recipients, 30 (60%) required TPN before IT. The median age at transplant was 50 years (range, 17-68). In both groups, the majority of transplants were exclusively IT; however, both included MVT as well. 
Eighty-seven percent of patients on TPN developed elevated LFTs before transplant, and 33% had persistently elevated LFTs 1 year after transplant. TPN-associated liver dysfunction in our cohort was associated with mixed liver injury patterns, with both hepatocellular injury (p < 0.001) and cholestatic injury (p < 0.001). No significant associations were found between TPN-related elevated LFTs and major transplant outcomes, including death (p = 0.856), graft failure (p = 0.144), or acute rejection (p = 0.306). Similarly, no significant association was observed between elevated LFTs and death (p = 0.855), graft failure (p = 0.769), or acute rejection (p = 0.386) in patients who were not on TPN. TPN-associated liver dysfunction before transplant was associated with elevated LFTs at one year after transplant (p < 0.001) but lacked clear clinical relevance.</p><p><b>Conclusion:</b> Although IFALD related to TPN use is associated with specific liver dysfunction patterns in patients awaiting intestinal or multivisceral transplants, it does not appear to be associated with key transplant outcomes such as graft failure, mortality, or acute rejection. However, it is associated with persistently elevated LFTs even 1 year after transplant. These findings suggest that while TPN-related liver injury is common, it may not have a clinically significant effect on long-term transplant success. Further research is needed to explore the long-term implications of IFALD in this patient population.</p><p>Jody (Lind) Payne, RD, CNSC<sup>1</sup>; Kassandra Samuel, MD, MA<sup>2</sup>; Heather Young, MD<sup>3</sup>; Karey Schutte, RD<sup>3</sup>; Kristen Horner, RDN, CNSC<sup>3</sup>; Daniel Yeh, MD, MHPE, FACS, FCCM, FASPEN, CNSC<sup>3</sup></p><p><sup>1</sup>Denver Health, Parker, CO; <sup>2</sup>Denver Health, St.
Joseph Hospital, Denver, CO; <sup>3</sup>Denver Health, Denver, CO</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Central line-associated blood stream infection (CLABSI) is associated with increased complications, length of stay and cost of care. The majority of CLABSI studies are focused on home parenteral nutrition (PN) patients and there is a paucity of data documenting the incidence of CLABSI attributable to PN in the inpatient setting. At our institution, we have observed that clinicians are reluctant to initiate PN in patients with clear indications for PN due to concerns about CLABSI. Therefore, we performed a quality improvement project to document our CLABSI incidence rate for new central parenteral nutrition (CPN) initiated during hospitalization.</p><p><b>Methods:</b> We performed a retrospective review of adult inpatients who initiated CPN at our facility from 1/1/23-12/31/23. Patients were excluded if they received PN prior to admission or only received peripheral PN. The National Healthcare Safety Network (NHSN) definitions were used for CLABSI and secondary attribution. A deeper review of CLABSI cases was performed by an Infectious Disease (ID) consultant to determine whether positive cases were attributable to CPN or other causes. The type of venous access for the positive patients was also reviewed.</p><p><b>Results:</b> A total of 106 inpatients received CPN for a total of 1121 CPN days. The median [IQR] length of CPN infusion was 8 [4-14] days. Mean (standard deviation) age of patients receiving CPN infusion was 53.3 (18.6) years and 65 (61%) were men. The CPN patients who met criteria for CLABSI were further reviewed by the ID consultant; only four CLABSI cases were attributable to CPN. These four cases resulted in an incidence rate of 3.6 cases of CLABSI per 1000 CPN days.
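The incidence figure is cases per 1000 device-days, the standard NHSN-style rate. A minimal check of the arithmetic, using the counts reported above (4 attributable cases, 1121 CPN days):

```python
# Incidence rate per 1000 device-days, as used for CLABSI reporting:
# rate = 1000 * cases / device_days.
def incidence_per_1000(cases: int, device_days: int) -> float:
    return round(1000 * cases / device_days, 1)

# Figures from the Results: 4 CPN-attributable CLABSI over 1121 CPN days.
print(incidence_per_1000(4, 1121))  # 3.6
```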
Two of these patients were noted for additional causes of infection including gastric ulcer perforation and bowel perforation with anastomotic leak. Three of the patients had CPN infused via a central venous catheter (a port, a femoral line, and a non-tunneled internal jugular catheter) and the fourth patient had CPN infused via a peripherally inserted central catheter. The incidence rate for CLABSI cases per catheter days was not reported in our review.</p><p><b>Conclusion:</b> At our institution, < 4% of patients initiating short-term CPN during hospitalization developed a CLABSI attributable to the CPN. This low rate of infection serves as a benchmark for our institution's quality improvement and quality assurance efforts. Collaboration with ID is recommended for deeper review of CPN patients with CLABSI to determine whether the infection is more likely related to causes other than the infusion of CPN.</p><p>Julianne Harcombe, RPh<sup>1</sup>; Jana Mammen, PharmD<sup>1</sup>; Hayato Delellis, PharmD<sup>1</sup>; Stefani Billante, PharmD<sup>1</sup></p><p><sup>1</sup>Baycare, St. Joseph's Hospital, Tampa, FL</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> Florida Residency Conference 2023.</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Refeeding syndrome is defined as potentially fatal shifts in fluids and electrolytes that may occur in malnourished patients receiving enteral or parenteral nutrition. When refeeding syndrome occurs, a reduction in phosphorus/magnesium/potassium levels and thiamine deficiency can be seen shortly after initiation of calorie provision. The American Society for Parenteral and Enteral Nutrition (ASPEN) guidelines consider hypophosphatemia the hallmark sign of refeeding syndrome; however, magnesium and potassium have been shown to be equally important.
The purpose of this study is to identify the incidence of hypophosphatemia in patients who are at risk of refeeding syndrome and the importance of monitoring phosphorus.</p><p><b>Methods:</b> This study was a multicenter retrospective chart review that was conducted using the BayCare Health System medical records database, Cerner. The study included patients who were 18 years or older, admitted between January 2023 and December 2023, received total parenteral nutrition (TPN), and were at risk of refeeding syndrome. We defined patients at risk of refeeding syndrome as patients who met two of the following criteria prior to starting the TPN: body mass index (BMI) prior to starting TPN < 18.5 kg/m², 5% weight loss in 1 month, no oral intake for 5 days, or low levels of serum phosphorus/magnesium/potassium. COVID-19 patients and patients receiving propofol were excluded from the study. The primary objective of the study was to evaluate the incidence of hypophosphatemia versus hypomagnesemia versus hypokalemia in patients receiving TPN who were at risk of refeeding syndrome. The secondary objective was to evaluate whether the addition of thiamine upon initiation of TPN affected the incidence of hypophosphatemia.</p><p><b>Results:</b> A total of 83 patients met the criteria for risk of refeeding syndrome. Of the 83 patients, 53 were used to run a pilot study to determine the sample size and 30 patients were included in the study. The results on day 1 and day 2 suggest the incidence of hypomagnesemia differs from that of hypophosphatemia and hypokalemia, with a notably lower occurrence. Cochran's Q test yielded χ²(2) = 9.57 (p = 0.008) on day 1 and χ²(2) = 4.77 (p = 0.097) on day 2, indicating a difference in at least one group compared to the others on day 1 only. A post hoc analysis found a difference on day 1 between the incidence of hypophosphatemia vs hypomagnesemia (30%) and hypomagnesemia vs hypokalemia (33.3%).
For the secondary outcome, the difference in day 2 versus day 1 phosphorus levels with the addition of thiamine in the TPN was 0.073 (p = 0.668; 95% CI, -0.266 to 0.413).</p><p><b>Conclusion:</b> Among patients who were on parenteral nutrition and at risk of refeeding syndrome, there was no statistically significant difference in the incidence of hypophosphatemia vs hypomagnesemia or of hypomagnesemia vs hypokalemia. Likewise, there was no statistically significant difference between day 2 and day 1 phosphorus levels when thiamine was added.</p><p></p><p></p><p>Jennifer McClelland, MS, RN, FNP-BC<sup>1</sup>; Margaret Murphy, PharmD, BCNSP<sup>1</sup>; Matthew Mixdorf<sup>1</sup>; Alexandra Carey, MD<sup>1</sup></p><p><sup>1</sup>Boston Children's Hospital, Boston, MA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Iron deficiency anemia (IDA) is common in patients with intestinal failure (IF) dependent on parenteral nutrition (PN). Treatment with enteral iron is preferred; however, it may not be tolerated or efficacious. In these cases, intravenous (IV) iron is a suitable alternative. Complications include adverse reactions and infection, though rates are low when using low-molecular-weight (LMW) formulations. In a home PN (HPN) program, an algorithm (Figure 1) was developed to treat IDA utilizing IV iron.</p><p><b>Methods:</b> A retrospective chart review was conducted of patients in a large HPN program (~150 patients annually) who were prescribed IV iron following the algorithm from January 2019 to April 2024. Laboratory studies were analyzed looking for instances of ferritin >500 ng/mL indicating potential iron overload, as well as transferrin saturation 12-20% indicating iron sufficiency.
In instances of ferritin levels >500 ng/mL, further review was conducted to determine etiology, clinical significance, and whether the IV iron algorithm was adhered to.</p><p><b>Results:</b> HPN patients are diagnosed with IDA based on the iron panel (low hemoglobin and/or MCV, low ferritin, high reticulocyte count, low serum iron and transferrin saturation, and/or high total iron binding capacity [TIBC]). If the patient can tolerate enteral iron supplementation, a dose of 3-6 mg/kg/day is initiated. If the patient cannot tolerate enteral iron, the IV route is initiated. The initial IV dose is administered in the hospital or infusion center for close monitoring and to establish home maintenance administration after repletion dosing. Iron dextran is preferred as it can be added directly into the PN and run for the duration of the cycle. Addition to the PN eliminates an extra infusion and decreases additional CVC access. Iron dextran is incompatible with IV lipids, so the patient must have one lipid-free day weekly to be able to administer it. If the patient receives daily IV lipids, iron sucrose is given as a separate infusion from the PN. Maintenance IV iron dosing is 1 mg/kg/week, with dose and frequency titrated based on clinical status, lab studies and trends. Iron panel and C-reactive protein (CRP) are ordered every 2 months. If lab studies are below the desired range and consistent with IDA, the IV iron dose is increased by 50% (by dose or frequency); if studies are over the desired range, the IV iron dose is decreased by 50% (by dose or frequency). The maximum home dose is < 3 mg/kg/dose; if a higher dose is needed, the patient is referred to an infusion center. IV iron is suspended if ferritin is >500 ng/mL due to risk for iron overload and deposition in the liver. Ferritin results (n = 4165) for all patients in the HPN program from January 2019-April 2024 were reviewed for levels >500 ng/mL indicating iron overload.
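The titration rule just described (increase or decrease by 50%, suspend at ferritin >500 ng/mL, cap home dosing at 3 mg/kg/dose with referral above that) can be sketched as a simple decision function. The thresholds come from the text; the function itself is an illustrative simplification that adjusts dose only, whereas the algorithm allows adjusting dose or frequency:

```python
def adjust_iv_iron(dose_mg_kg: float, ferritin_ng_ml: float,
                   iron_low: bool, iron_high: bool) -> float:
    """Return the next maintenance IV iron dose (mg/kg) per the rule."""
    if ferritin_ng_ml > 500:    # risk of iron overload: suspend IV iron
        return 0.0
    if iron_low:                # studies below desired range: +50%
        dose_mg_kg *= 1.5
    elif iron_high:             # studies above desired range: -50%
        dose_mg_kg *= 0.5
    # doses at or above 3 mg/kg trigger referral to an infusion center;
    # here we simply cap the home dose at that ceiling
    return min(dose_mg_kg, 3.0)

print(adjust_iv_iron(1.0, 120, iron_low=True, iron_high=False))   # 1.5
print(adjust_iv_iron(1.0, 600, iron_low=False, iron_high=False))  # 0.0
```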
Twenty-nine instances of ferritin >500 ng/mL (0.7% of values reviewed) were identified in 14 unique patients on maintenance IV iron. In 9 instances, the high ferritin level occurred with concomitant acute illness with an elevated CRP; elevated ferritin in these cases was thought to be related to an inflammatory state vs. iron overload. In 2 instances, the IV iron dose was given the day before the lab draw, rendering a falsely elevated result. Two patients had 12 instances (0.28% of values reviewed) of elevated ferritin thought to be related to IV iron dosing in the absence of inflammation, with normal CRP levels. During this period, there were no recorded adverse events.</p><p><b>Conclusion:</b> IDA is common in patients with IF dependent on PN. Iron is not a standard component or additive in PN. Use of IV iron in this population can increase quality of life by decreasing the need for admissions, visits to infusion centers, or blood transfusions in cases of severe anemia. IV iron can be safely used for maintenance therapy in HPN patients with appropriate dosing and monitoring.</p><p></p><p><b>Figure 1.</b> Intravenous Iron in the Home Parenteral Nutrition dependent patient Algorithm.</p><p>Lynne Sustersic, MS, RD<sup>1</sup>; Debbie Stevenson, MS, RD, CNSC<sup>2</sup></p><p><sup>1</sup>Amerita Specialty Infusion Services, Thornton, CO; <sup>2</sup>Amerita Specialty Infusion Services, Rochester Hills, MI</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Desmoplastic small round tumor (DSRT) is a soft-tissue sarcoma that causes tumors to form in the abdomen and pelvis. To improve disease control, cytoreductive surgery with hyperthermic intraperitoneal chemotherapy (CRS/HIPEC) and radiotherapy are often used, which can result in bowel obstruction secondary to sclerosing peritonitis. This necessitates total parenteral nutrition therapy (TPN) due to the inability to consume nutrition orally or enterally.
A major complication of parenteral nutrition therapy is parenteral nutrition associated liver disease (PNALD), and the most common site of DSRT metastasis is the liver. This case report details the substitution of an olive and soy oil-based intravenous lipid emulsion (OO, SO-ILE) for a soy, MCT, olive, fish oil-based intravenous lipid emulsion (SO, MCT, OO, FO-ILE) to treat high liver function tests (LFTs).</p><p><b>Methods:</b> A 28-year-old male with DSRT metastatic to peritoneum and large hepatic mass complicated by encapsulating peritonitis and enterocutaneous fistula (ECF), following CRS/HIPEC, presented to the parenteral nutrition program at Amerita Specialty Infusion Services in 2022. TPN was administered from January 2022 to March 2023, stopped, and restarted in December 2023 following a biliary obstruction. TPN was initiated and advanced using 1.3 g/kg/day SMOFlipid (SO, MCT, OO, FO-ILE), known to help mitigate PNALD. The patient developed rising LFTs, with alanine aminotransferase (ALT) peaking at 445 U/L, aspartate aminotransferase (AST) at 606 U/L, and alkaline phosphatase (ALP) at 1265 U/L. Despite transitioning the patient to a cyclic regimen and maximizing calories in dextrose and amino acids, liver function continued to worsen. A switch to Clinolipid (OO, SO-ILE) at 1.3 g/kg/day was tried.</p><p><b>Results:</b> Following the initiation of OO, SO-ILE, LFTs improved within 12 days, with ALT at 263 U/L, AST at 278 U/L, and ALP at 913 U/L. These values continued to improve until the end of therapy in June 2024, with a final ALT of 224 U/L, AST of 138 U/L, and ALP of 220 U/L. See Figure 1. No significant improvements in total bilirubin were found.
The patient was able to successfully tolerate this switch in lipid emulsions and increased his weight from 50 kg to 53.6 kg.</p><p><b>Conclusion:</b> SO, MCT, OO, FO-ILE is well supported for helping prevent and alleviate the adverse effects of PNALD; however, the impact of lipid emulsions on other forms of liver disease needs further research. Our case suggests that the elevated LFTs were likely cancer induced rather than associated with prolonged use of parenteral nutrition. A higher olive oil lipid concentration may have beneficial impacts on LFTs that are not associated with PNALD. It is also worth noting that soybean oil has been demonstrated in previous research to have a negative impact on liver function, and the concentration of soy in SO, MCT, OO, FO-ILE is higher (30%) than in OO, SO-ILE (20%). This may warrant further investigation into the impact of specific soy concentrations on liver function. LFTs should be assessed and treated on a case-by-case basis that evaluates disease mechanisms, drug interactions, parenteral nutrition composition, and patient subjective information.</p><p></p><p><b>Figure 1.</b> OO, SO-ILE Impact on LFTs.</p><p>Shaurya Mehta, BS<sup>1</sup>; Ajay Jain, MD, DNB, MHA<sup>1</sup>; Kento Kurashima, MD, PhD<sup>1</sup>; Chandrashekhara Manithody, PhD<sup>1</sup>; Arun Verma, MD<sup>1</sup>; Marzena Swiderska-Syn<sup>1</sup>; Shin Miyata, MD<sup>1</sup>; Mustafa Nazzal, MD<sup>1</sup>; Miguel Guzman, MD<sup>1</sup>; Sherri Besmer, MD<sup>1</sup>; Matthew Mchale, MD<sup>1</sup>; Jordyn Wray<sup>1</sup>; Chelsea Hutchinson, MD<sup>1</sup>; John Long, DVM<sup>1</sup></p><p><sup>1</sup>Saint Louis University, St. Louis, MO</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> North American Society for Pediatric Gastroenterology, Hepatology and Nutrition (NASPGHAN) Florida.</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Short bowel syndrome (SBS) is a devastating condition.
In the absence of enteral nutrition (EN), patients are dependent on total parenteral nutrition (TPN) and suffer from intestinal failure-associated liver disease and gut atrophy. Intestinal adaptation (IA) and enteral autonomy (EA) remain the clinical goals. We hypothesized that EA can be achieved using our DREAM system (US patent 63/413,988), which allows EN via the stomach together with a mechanism for cyclical recirculation of nutrient-rich distal intestinal content into the proximal bowel, enabling full EN despite SBS.</p><p><b>Methods:</b> Twenty-four neonatal pigs were randomly allocated to enteral nutrition (EN; n = 8); TPN-SBS (on TPN only, n = 8); or DREAM (n = 8). Liver, gut, and serum were collected for histology and serum biochemistry. Statistical analysis was performed using GraphPad Prism 10.1.2 (324). All tests were 2-tailed using a significance level of 0.05.</p><p><b>Results:</b> TPN-SBS piglets had significant cholestasis vs DREAM (p = 0.001) with no statistical difference in DREAM vs EN (p = 0.14). DREAM transitioned to full EN by day 4. Mean serum conjugated bilirubin for EN was 0.037 mg/dL, TPN-SBS 1.2 mg/dL, and DREAM 0.05 mg/dL. Serum bile acids were significantly elevated in TPN-SBS vs EN (p = 0.007) and DREAM (p = 0.03). Mean GGT, a marker of cholangiocyte injury, was significantly higher in TPN-SBS vs EN (p < 0.001) and DREAM (p < 0.001), with values of EN 21.2 U/L, TPN-SBS 47.9 U/L, and DREAM 22.5 U/L (p = 0.89 DREAM vs EN). To evaluate gut growth, we measured lineal gut mass (LGM), calculated as the weight of the bowel per centimeter. There was significant IA and prevention of gut atrophy with DREAM. Mean proximal gut LGM was EN 0.21 g/cm, TPN-SBS 0.11 g/cm, and DREAM 0.31 g/cm (p = 0.004, TPN-SBS vs DREAM). Distal gut LGM was EN 0.34 g/cm, TPN-SBS 0.13 g/cm, and DREAM 0.43 g/cm (p = 0.006, TPN-SBS vs DREAM). IHC revealed DREAM had similar hepatic CK-7 (bile duct epithelium marker), p = 0.18, and hepatic Cyp7A1, p = 0.3, vs EN.
No statistical differences were noted in LGR5-positive intestinal stem cells in EN vs DREAM, p = 0.18. DREAM prevented the changes in hepatic Cyp7A1, BSEP, FGFR4, SHP, and SREBP-1 and in gut FXR, TGR5, and EGF seen in the TPN-SBS group.</p><p><b>Conclusion:</b> DREAM resulted in a significant reduction of hepatic cholestasis, prevented gut atrophy, and presents a novel method enabling early full EN despite SBS. This system, by driving IA and EA, represents a major advancement in SBS management, bringing a paradigm change to life-saving strategies for SBS patients.</p><p>Silvia Figueiroa, MS, RD, CNSC<sup>1</sup>; Paula Delmerico, MS, RD, CNSC<sup>2</sup></p><p><sup>1</sup>MedStar Washington Hospital Center, Bethesda, MD; <sup>2</sup>MedStar Washington Hospital Center, Arlington, VA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Parenteral nutrition (PN) therapy is a vital clinical intervention for patients of all ages and across care settings. The complexity of PN has the potential to cause significant patient harm, especially when errors occur. According to The Institute for Safe Medication Practices (ISMP), PN is classified as a high-alert medication, and safety-focused strategies should be formulated to minimize errors and harm. Processing PN is multifactorial and includes prescribing, order review and verification, compounding, labeling, and administration. PN prescription should consider clinical appropriateness and formulation safety. The PN formulation must be written to provide appropriate amounts of macronutrients and micronutrients based on the patient's clinical condition, laboratory parameters, and nutrition status. The PN admixture should not exceed the total amounts, recommended concentrations, or rate of infusion of these nutrients as it could result in toxicities and formulation incompatibility or instability.
The ASPEN Parenteral Nutrition Safety Consensus Recommendations recommend PN be prescribed using standardized electronic orders via a computerized provider order entry (CPOE) system, as handwritten orders have potential for error. Data suggest up to 40% of PN-related errors occur during the prescription and transcription steps. Registered pharmacists (RPh) are tasked with reviewing PN orders for order accuracy and consistency with recommendations made by the nutrition support team. These RPh adjustments ensure formula stability and nutrient optimization. This quality improvement project compares the frequency of RPh PN adjustments following the initial provider order after transitioning from a paper to a CPOE ordering system. Our hypothesis was that CPOE reduces the need for pharmacist PN adjustments during processing, which increases clinical effectiveness and maximizes resource efficiency.</p><p><b>Methods:</b> This was a retrospective evaluation of PN ordering practices at a large, academic medical center after shifting from paper to electronic orders. PN orders were collected during three-week periods in December 2022 (Paper) and December 2023 (CPOE) and analyzed for the frequency of order adjustments by the RPh. Adjustments were classified into intravascular access, infusion rate, macronutrients, electrolytes, multivitamin (MVI) and trace elements (TE), and medication categories. The total number of adjustments made by the RPh during final PN processing was collected. These adjustments were made per nutrition support team recommendations.</p><p><b>Results:</b> Daily PN orders for 106 patients – totaling 694 orders – were reviewed for provider order accuracy at the time of fax (paper) and electronic (CPOE) submission.
Order corrections made by the RPh decreased by 96% for infusion rate, 91.5% for macronutrients, 79.6% for electrolytes, 81.4% for MVI and TE, and 50% for medication additives (Table 1).</p><p><b>Conclusion:</b> Transitioning to CPOE led to a reduction in the need for PN order adjustments at the time of processing. One reason for this decline is improved physician understanding of PN recommendations. With CPOE, the registered dietitian's formula recommendation is viewable within the order template and can be referenced at the time of ordering. The components of the active PN infusion also automatically populate upon ordering a subsequent bag. This information can aid the provider when calculating insulin needs or repleting electrolytes outside the PN, increasing clinical effectiveness. A byproduct of this process change is improved efficiency, as CPOE requires less time for provider prescription and RPh processing and verification.</p><p><b>Table 1.</b> RPh Order Adjustments Required During Collection Period.</p><p></p><p>Elaina Szeszycki, BS, PharmD, CNSC<sup>1</sup>; Emily Gray, PharmD<sup>2</sup>; Kathleen Doan, PharmD, BCPPS<sup>3</sup>; Kanika Puri, MD<sup>1</sup></p><p><sup>1</sup>Riley Hospital for Children at Indiana University Health, Indianapolis, IN; <sup>2</sup>Lurie Children's Hospital, Chicago, IL; <sup>3</sup>Riley Hospital for Children at IU Health, Indianapolis, IN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Parenteral nutrition (PN) is ordered daily at Riley Hospital for Children at IU Health by different specialties. Our hospital is a stand-alone pediatric hospital, including labor, delivery, and high-risk maternal care. Historically, the PN orders were due by early afternoon with a hard cut-off by the end of the day shift for timely central compounding at a nearby adult hospital.
Due to the relocation of staff and equipment to a new sterile compounding facility, a hard deadline was created with an earlier cutoff time and contingency plans for orders received after the deadline. This updated process was created to allow for timely delivery to Riley and subsequently to the patients to meet the standard PN hang-time of 2100. The Nutrition Support Team (NST) and Pharmacy and Therapeutics (P&T) Committee approved an updated PN order process as follows:</p><p>(1) enforce a hard PN deadline of 1200 for new and current PN orders; (2) if the PN order is not received by 1200, renew the active PN order for the next 24 hours; (3) if the active PN order is not appropriate for the next 24 hours, providers will need to order IVF in place of PN until the following day; (4) enter PN orders into the PN order software by 1500.</p><p><b>Methods:</b> A quality improvement (QI) process check was performed 3 months after initiation of the updated PN order process. Data collection was performed for 1 month with the following data points: total PN orders; missing PN orders at 1200; PN orders re-ordered per P&T policy after the 1200 deadline; lab review; input and output; subsequent order changes for 24 hours after renewal of the active PN order; PN waste; and service responsible for the late PN order.</p><p><b>Results:</b></p><p></p><p><b>Conclusion:</b> The number of late PN orders after the hard deadline was < 5%, and there was a minimal number of renewed active PN orders due to the pharmacists' concern for ensuring the safety of our patients. No clinically significant changes resulted from renewal of active PN orders, so the process was considered safe despite the small numbers. The changes made to late PN orders were minor or related to the planned discontinuation of PN.
After review of the results by the NST and pharmacy administration, it was decided to take the following actions: (1) review the data and process with pharmacy staff to assist with workflow and education; (2) create a succinct Riley TPN process document for providers, specifically services with late orders, reviewing the PN order entry hard deadline and the need to discontinue PN orders by the deadline, to assist with pharmacy staff workflow and avoid potential PN waste; and (3) repeat the QI analysis in 6-12 months.</p><p><b>International Poster of Distinction</b></p><p>Muna Islami, PharmD, BCNSP<sup>1</sup>; Mohammed Almusawa, PharmD, BCIDP<sup>2</sup>; Nouf Alotaibi, PharmD, BCPS, BCNSP<sup>3</sup>; Jwael Alhamoud, PharmD<sup>1</sup>; Maha Islami, PharmD<sup>4</sup>; Khalid Eljaaly, PharmD, MS, BCIDP, FCCP, FIDSA<sup>4</sup>; Majda Alattas, PharmD, BCPS, BCIDP<sup>1</sup>; Lama Hefni, RN<sup>5</sup>; Basem Alraddadi, MD<sup>1</sup></p><p><sup>1</sup>King Faisal Specialist Hospital, Jeddah, Makkah; <sup>2</sup>Wayne State University, Jeddah, Makkah; <sup>3</sup>Umm al Qura University, Jeddah, Makkah; <sup>4</sup>King Abdulaziz University Hospital, Jeddah, Makkah; <sup>5</sup>King Faisal Specialist Hospital, Jeddah, Makkah</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Parenteral nutrition (PN) is a critical therapy for patients unable to meet their nutritional needs through the gastrointestinal tract. While it offers a life-saving solution, it also carries the risk of central line-associated bloodstream infections (CLABSIs). However, there is a lack of comprehensive studies examining the risk factors for CLABSIs in a more heterogeneous cohort of PN recipients. This study aims to identify the risk factors associated with CLABSIs in patients receiving PN therapy in Saudi Arabia.</p><p><b>Methods:</b> This multicenter retrospective cohort study was conducted in three large tertiary referral centers in Saudi Arabia.
The study included all hospitalized patients who received PN therapy through central lines between 2018 and 2022. The purpose of the study was to investigate the association between parenteral nutrition (PN) and central line-associated bloodstream infections (CLABSIs), using both univariate and multivariate analysis.</p><p><b>Results:</b> Out of 662 hospitalized patients who received PN and had central lines, 123 patients (18.6%) developed CLABSI. Among our patients, the duration of parenteral nutrition was an independent risk factor for CLABSI development (OR, 1.012; 95% CI, 0.9-1.02). In patients receiving PN, the incidence of CLABSI did not change significantly over the study years.</p><p><b>Conclusion:</b> The length of PN therapy is still an important risk factor for CLABSIs; more research is required to determine the best ways to reduce the incidence of CLABSI in patients on PN.</p><p><b>Table 1.</b> Characteristics of Hospitalized Patients Who Received PN.</p><p></p><p>1 n (%); Median (IQR) BMI, Body Mass Index.</p><p><b>Table 2.</b> The Characteristics of Individuals With and Without CLABSI Who Received PN.</p><p></p><p>1 n (%); Median (IQR), 2 Fisher's exact test; Pearson's Chi-squared test; Mann Whitney U test PN, Parenteral Nutrition</p><p></p><p>CLABSI, central line-associated bloodstream infections PN, parenteral nutrition</p><p><b>Figure 1.</b> Percentage of Patients With a Central Line Receiving PN Who Experienced CLABSI.</p><p>Duy Luu, PharmD<sup>1</sup>; Rachel Leong, PharmD<sup>2</sup>; Nisha Dave, PharmD<sup>2</sup>; Thomas Ziegler, MD<sup>2</sup>; Vivian Zhao, PharmD<sup>2</sup></p><p><sup>1</sup>Emory University Hospital - Nutrition Support Team, Lawrenceville, GA; <sup>2</sup>Emory Healthcare, Atlanta, GA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Intravenous lipid emulsion (ILE) is an essential component in parenteral nutrition (PN)-dependent patients because it provides calories and essential
fatty acids; however, the use of soybean oil-based ILE (SO-ILE) may contribute to the development of PN-associated liver disease (PNALD) in patients with intestinal failure who require chronic PN. To mitigate this risk, newer formulations of ILE, such as a mixture of SO, medium chain triglycerides (MCT), olive oil (OO), and fish oil-based ILE (SO/MCT/OO/FO-ILE), or pure fish oil-based ILE (FO-ILE), are now available in the US. FO-ILE is approved only for pediatric use in PN-associated cholestasis. This patient case highlights the benefits of using a combination of FO-containing ILEs to improve PNALD.</p><p><b>Methods:</b> A 65-year-old female with symptomatic achalasia required robotic-assisted esophagomyotomy with fundoplasty in February 2020. Her postoperative course was complicated by bowel injury that required multiple small bowel resections and total colectomy with end jejunostomy, resulting in short bowel syndrome with 80 centimeters of residual small bowel. Postoperatively, daily PN containing SO-ILE was initiated along with tube feedings (TF) during hospitalization, and she was discharged home with PN and TF. Her surgeon referred her to the Emory University Hospital (EUH) Nutrition Support Team (NST) for management. She received daily cyclic-PN infused over 12 hours, providing 25.4 kcal/kg/day with SO-ILE 70 grams (1.2 g/kg) three times weekly and standard TF 1-2 containers/day. In September 2020, she complained of persistent jaundice and was admitted to EUH. She presented with scleral icterus, hyperbilirubinemia, and elevated liver function tests (LFTs). The EUH NST optimized PN to provide SO/MCT/OO/FO-ILE (0.8 g/kg/day), which improved her LFTs, and the dose was then increased to 1 g/kg/day. In the subsequent four months, her LFTs worsened despite optimizing pharmacotherapy, continuing cyclic-TF, and reducing and then discontinuing ILE.
She required multiple readmissions to EUH and underwent two liver biopsies, which confirmed a diagnosis of PN-induced hepatic fibrosis after 15 months of PN. Her serum total bilirubin level peaked at 18.5 mg/dL, which led to an intensive care unit admission and required molecular adsorbent recirculating system therapy. In March 2022, having exhausted all other options, the NST incorporated FO-ILE (0.84 g/kg/day) three times weekly (as a separate infusion) and SO/MCT/OO/FO-ILE (1 g/kg/day) once weekly.</p><p><b>Results:</b> The patient's LFTs are shown in Figure 1. The blood level of aspartate aminotransferase improved from 123 to 60 units/L, and alanine aminotransferase decreased from 84 to 51 units/L, after 2 months and returned to normal after 4 months of the two ILEs. Similarly, the total bilirubin decreased from 5.6 to 2.2 and 1.1 mg/dL by 2 and 6 months, respectively. Both total bilirubin and transaminase levels remained stable. Although her alkaline phosphatase fluctuated and remained elevated over the last two years, this marker decreased from 268 to 104 units/L. All other PNALD-related symptoms resolved.</p><p><b>Conclusion:</b> This case demonstrates that the combination of FO-containing ILEs significantly improved and stabilized LFTs in an adult with PNALD.
Additional research is needed to investigate the effect of FO-ILE in adult PN patients to mitigate PNALD.</p><p></p><p>SO: soybean oil; MCT: medium-chain triglyceride; OO: olive oil; FO: fish oil; AST: aspartate aminotransferase; ALT: alanine aminotransferase.</p><p><b>Figure 1.</b> Progression of Liver Enzyme Status in Relation to Lipid Injectable Emulsions.</p><p>Narisorn Lakananurak, MD<sup>1</sup>; Leah Gramlich, MD<sup>2</sup></p><p><sup>1</sup>Department of Medicine, Faculty of Medicine, Chulalongkorn University, Bangkok, Krung Thep; <sup>2</sup>Division of Gastroenterology, Faculty of Medicine and Dentistry, University of Alberta, Edmonton, AB</p><p><b>Financial Support:</b> This research study received a grant from Baxter, Canada.</p><p><b>Background:</b> Pre-operative parenteral nutrition (PN) has been shown to enhance outcomes in malnourished surgical patients. Traditionally, pre-operative PN necessitates hospital admission, which leads to increased length of stay (LOS) and higher hospital costs. Furthermore, inpatient pre-operative PN may not be feasible or prioritized when access to hospital beds is restricted. Outpatient PN presents a potential solution to this issue. To date, the feasibility and impact of outpatient PN for surgical patients have not been investigated. This study aims to assess the outcomes and feasibility of outpatient pre-operative PN in malnourished surgical patients.</p><p><b>Methods:</b> Patients scheduled for major surgery who were identified as at risk of malnutrition using the Canadian Nutrition Screening Tool and classified as malnourished by Subjective Global Assessment (SGA) B or C were enrolled. Exclusion criteria included severe systemic disease as defined by American Society of Anesthesiologists (ASA) classification III to V, insulin-dependent diabetes mellitus, and extreme underweight (body weight less than 40 kg). 
Eligible patients received a peripherally inserted central catheter (PICC) line and outpatient PN (Olimel 7.6%E, 1,000 mL) at an infusion clinic for 5-10 days, using the maximum number of infusion days possible prior to surgery. PN was administered via an infusion pump over 4-5 hours by infusion clinic nurses. The outcomes and feasibility of outpatient PN were assessed. Safety outcomes, including refeeding syndrome, dysglycemia, volume overload, and catheter-related complications, were monitored.</p><p><b>Results:</b> Outpatient PN was administered to eight patients (4 males, 4 females). Pancreatic cancer and the Whipple procedure were the most common diagnosis and operation, each accounting for 37.5% of cases (Table 1). The mean (SD) duration of PN was 5.5 (1.2) days (range 5-8 days). Outpatient PN was completed in 75% of patients, with an 88% completion rate of PN days (44/50 days). After PN infusion, mean body weight and body mass index had increased by 4.6 kg and 2.1 kg/m², respectively. The mean PG-SGA score improved by 4.9 points, and mean handgrip strength increased from 20 kg to 25.2 kg. Quality of life, as measured by SF-12, improved in both the physical and mental health domains (by 7.3 and 3.8 points, respectively). Patient-reported feasibility scores were high across all aspects (Acceptability, Appropriateness, and Feasibility), with a total score of 55.7/60 (92.8%). Infusion clinic nurses (n = 3) also reported high total feasibility scores (52.7/60, 87.8%) (Table 2). No complications were observed in any of the patients.</p><p><b>Conclusion:</b> Outpatient pre-operative PN was a feasible approach that was associated with improved outcomes in malnourished surgical patients. This novel approach has the potential to enhance outcomes and decrease the necessity for hospital admission in malnourished surgical patients. 
Future studies involving larger populations are needed to evaluate the efficacy of outpatient PN.</p><p><b>Table 1.</b> Baseline Characteristics of the Participants (n = 8).</p><p></p><p><b>Table 2.</b> Outcomes and Feasibility of Outpatient Preoperative Parenteral Nutrition (n = 8).</p><p></p><p>Adrianna Wierzbicka, MD<sup>1</sup>; Rosmary Carballo Araque, RD<sup>1</sup>; Andrew Ukleja, MD<sup>1</sup></p><p><sup>1</sup>Cleveland Clinic Florida, Weston, FL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Gastroparesis (GP) is a chronic motility disorder marked by delayed gastric emptying and associated symptoms of nausea, vomiting, and abdominal pain. Treatment consists of diet modifications and medications, with nutritional support tailored to disease severity. Severe refractory cases may require enteral or parenteral nutrition (PN). However, the role of home parenteral nutrition (HPN) in managing GP is underexplored. This study aims to enhance nutrition therapy practice by examining the utilization of HPN in the GP population, addressing a significant gap in current nutrition support strategies.</p><p><b>Methods:</b> We conducted a retrospective, single-center analysis of patients receiving HPN from August 2022 to August 2024. Data were obtained through a review of electronic medical records as part of a quality improvement monitoring process. Patients' demographics, etiology of GP, indications for HPN, types of central access, duration of therapy, and PN-related complications were analyzed using descriptive statistics. Inclusion criteria were: adults (>18 years), GP diagnosis by gastric scintigraphy, and HPN for a minimum of 2 consecutive months. Among 141 identified HPN patients, 10 had GP as the indication for PN.</p><p><b>Results:</b> GP patients constituted 7% (10/141) of our HPN population. 
In this cohort of 10 patients with GP receiving HPN, the demographic profile was predominantly female (80%), with a mean age of 42.6 years; all individuals identified as Caucasian. All patients had idiopathic GP; severe gastric emptying delay was found in 80% of cases, and all experienced predominant symptoms of nausea/vomiting. Central access consisted of PICC lines (50%), Hickman catheters (30%), Powerlines (10%), and a mediport (10%). The mean weight change with PN therapy was an increase of 21.9 lbs. Eighty percent of patients experienced infection-related complications, including bacteremia (methicillin-sensitive Staphylococcus aureus [MSSA], methicillin-resistant Staphylococcus aureus [MRSA], Pseudomonas) and fungemia. Deep vein thrombosis (DVT) was identified in 20% of patients, alongside one case of a cardiac thrombus. Tube feeding trials were attempted in 70% of cases, but 50% were ultimately discontinued due to intolerance, such as abdominal pain, or complications such as buried bumper syndrome. Chronic pain management was used in 60% of patients, with 40% on opioid therapy (morphine, fentanyl). PN was discontinued in 50% of patients; reasons included recurrent infections (20%), advancement to tube feeding (20%), questionable compliance (20%), and improvement in oral intake (40%).</p><p><b>Conclusion:</b> This retrospective analysis underscores the potential of HPN as a nutritional strategy for GP, particularly in patients with refractory symptoms and severe delay in gastric emptying who previously failed EN or experienced complications related to enteral access. In addition to the observed mean weight gain, HPN appears to play a crucial role in alleviating debilitating symptoms such as nausea, vomiting, and abdominal pain, thereby improving patients' overall quality of life. Nonetheless, the prevalence of infection-related complications and the requirement for chronic pain management underscore the challenges associated with GP treatment. 
The variability in patient responses to different nutritional strategies emphasizes the importance of individualized care plans. These findings advocate for further research to optimize HPN protocols and improve comprehensive management strategies in GP.</p><p></p><p><b>Figure 1.</b> Reasons for PN Discontinuation.</p><p></p><p><b>Figure 2.</b> Complications Associated with PN.</p><p>Longchang Huang, MD<sup>1</sup>; Peng Wang<sup>2</sup>; Shuai Liu<sup>3</sup>; Xin Qi<sup>1</sup>; Li Zhang<sup>1</sup>; Xinying Wang<sup>4</sup></p><p><sup>1</sup>Department of General Surgery, Jinling Hospital, Medical School of Nanjing University, Nanjing, Jiangsu; <sup>2</sup>Department of Digestive Disease Research Center, Gastrointestinal Surgery, The First People's Hospital of Foshan, Foshan, Guangdong; <sup>3</sup>Department of General Surgery, Jinling Hospital, Medical School of Nanjing University, Nanjing, Jiangsu; <sup>4</sup>Department of General Surgery, Jinling Hospital, Medical School of Nanjing University, Nanjing, Jiangsu</p><p><b>Financial Support:</b> National Natural Science Foundation of China, 82170575 and 82370900.</p><p><b>Background:</b> Total parenteral nutrition (TPN)-induced gut microbiota dysbiosis is closely linked to intestinal barrier damage, but the mechanism remains unclear.</p><p><b>Methods:</b> Using 16S rRNA gene sequencing and metagenomic analysis, we examined alterations in the gut microbiota of patients with chronic intestinal failure (CIF) and of TPN mouse models, subsequently validating these observations in an independent verification cohort. Additionally, we conducted a comprehensive analysis of key metabolites using liquid chromatography-mass spectrometry (LC-MS). 
Moreover, we explored changes in key innate-like lymphoid cell populations using RNA sequencing (RNA-seq), flow cytometry, and single-cell RNA sequencing (scRNA-seq).</p><p><b>Results:</b> The gut barrier damage associated with TPN is attributable to decreased Lactobacillus murinus. L. murinus mitigates TPN-induced intestinal barrier damage through the metabolism of tryptophan into indole-3-carboxylic acid (ICA). Furthermore, ICA stimulates group 3 innate lymphoid cells (ILC3s) to secrete interleukin-22 by targeting the nuclear receptor Rorc, enhancing intestinal barrier protection.</p><p><b>Conclusion:</b> We elucidate the mechanisms driving TPN-associated intestinal barrier damage and indicate that interventions with L. murinus or ICA could effectively ameliorate TPN-induced gut barrier injury.</p><p></p><p><b>Figure 1.</b> TPN Induces Intestinal Barrier Damage in Humans and Mice. (a) Rates of fever and ICU admission in Cohort 1. (b-d) Serum levels of IFABP, CRP, and LPS in patients with CIF. (e) Representative intestinal H&E staining and (f) injury scores (n = 10 mice per group). (g) Intestinal electrical resistance in mice measured by Ussing chamber (n = 5 mice per group). (h) Immunofluorescence experiments in the intestines and livers of mice. (i) Western blot results for the Chow and TPN groups.</p><p></p><p><b>Figure 2.</b> TPN Induces Gut Dysbiosis in Humans and Mice. (a) PCoA of 16S rRNA data from fecal content from Cohort 1 (n = 16 individuals/group). (b) Significantly different abundances identified using linear discriminant analysis (LDA). (c) Top 10 abundant genera. (d) PCoA of relative genus or species abundances (n = 5 mice per group). (e) LDA for mice. (f) Sankey diagram showing the top 10 abundant genera in humans and mice. 
(g) Heatmap of the correlation between species abundance in the intestinal microbiota and clinical characteristics of patients with CIF.</p><p></p><p><b>Figure 3.</b> Metabolically Active L. murinus Ameliorates Intestinal Barrier Damage. (a) RT-PCR quantification of L. murinus abundance in feces from L-PN and H-PN patients (Cohorts 1 and 2). (b) Representative intestinal H&E staining and (c) injury scores (n = 10 mice per group). (d) Intestinal electrical resistance in mice measured by Ussing chamber (n = 5 mice per group). (e) Western blot results. (f) 3D-PCA and (g) volcano plot analyses between the Chow and TPN group mice. (h) Metabolome-wide pathway enrichment based on metabolomics data from fecal content of Chow and TPN group mice (n = 5 mice per group). (i) Heatmap of the correlation between intestinal microbiota abundance at the species level and tryptophan metabolites in the Chow and TPN group mice (n = 5 mice per group). (j) VIP scores of the 3D-PCA. A taxon with a variable importance in projection (VIP) score of >1.5 was deemed to be of significant importance in the discrimination process.</p><p></p><p><b>Figure 4.</b> ICA Is Critical for the Effects of L. murinus. (a) Fecal ICA levels in TPN mice treated with PBS (control) or ICA (n = 10 mice per group). (b) Representative intestinal H&E staining and (c) injury scores (n = 10 mice per group). (d) Intestinal electrical resistance in mice measured by Ussing chamber (n = 5 mice per group). (e) Western blot results. (f) Metabolic pathway illustrating the production of ICA from tryptophan by L. murinus. (g) PLS-DA of fecal metabolite profiles from TPN mice receiving ΔArAT or live L. murinus (n = 5 mice per group). (h) Heatmap of tryptophan-targeted metabolomics in fecal samples from TPN mice that received either ΔArAT (n = 5) or live L. murinus (n = 5). 
(i) Representative intestinal H&E staining and (j) injury scores (n = 10 mice per group). (k) Western blot results.</p><p>Callie Rancourt, RDN<sup>1</sup>; Osman Mohamed Elfadil, MBBS<sup>1</sup>; Yash Patel, MBBS<sup>1</sup>; Taylor Dale, MS, RDN<sup>1</sup>; Allison Keller, MS, RDN<sup>1</sup>; Alania Bodi, MS, RDN<sup>1</sup>; Suhena Patel, MBBS<sup>1</sup>; Andrea Morand, MS, RDN, LD<sup>1</sup>; Amanda Engle, PharmD, RPh<sup>1</sup>; Manpreet Mundi, MD<sup>1</sup></p><p><sup>1</sup>Mayo Clinic, Rochester, MN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Although a patent foramen ovale (PFO) is generally asymptomatic and causes no health concerns, it can be a risk factor for embolism and stroke. Due to this theoretical risk, some institutions have established protocols requiring most IV solutions to be administered through a small-micron filter in patients with PFO. While 2-in-1 dextrose and amino acid solutions can be filtered through a 0.22-micron filter with relative ease, injectable lipid emulsions (ILEs), whether on their own or as part of a total admixture, consist of larger particles, requiring a 1.2-micron or larger filter. This filter requirement precludes the administration of ILE, an essential source of calories, in patients with PFO. It is unknown whether patients who do receive ILE have an increased incidence of lipid embolism and stroke.</p><p><b>Methods:</b> A single-center retrospective review of patients on central parenteral nutrition (CPN) was completed. Demographics and baseline clinical characteristics, including co-morbidities and history of CVA, were collected. The outcome of interest was defined as an ischemic cerebrovascular accident (CVA) within 30 days of CPN and, therefore, potentially attributable to it. Other cardiovascular and thromboembolic events were also captured. 
All patients with a PFO diagnosis and inpatient CPN administration between January 1, 2018, and December 18, 2023, at our quaternary care referral center were included as the case cohort. A 3:1 control group matched for age, gender, duration of inpatient CPN, and clinical co-morbidities was identified and used to examine the difference in the outcome of interest.</p><p><b>Results:</b> Patients with PFO who received CPN (n = 38, 53.8% female) had a mean age of 63.5 ± 13.1 years and a mean BMI of 31.1 ± 12.1 at CPN initiation (Table 1). The PFOs varied in size, with the majority (38.5%) being very small/trivial (Table 2). All patients in this cohort had appropriately sized filters placed for CPN and ILE administration. CPN prescription and duration were comparable between the two groups. The majority of patients with PFO (53.8%) received mixed oil ILE, followed by soy-olive oil ILE (23.1%), whereas most patients without PFO received soy-olive oil ILE (51.8%) or mixed oil ILE (42.9%) (Table 3). The case and control groups had comparable prevalences of cardiovascular risk factors, including obesity, hypertension, diabetes, and dyslipidemia. However, more patients had a history of vascular cardiac events and atrial fibrillation in the PFO group, and more patients were smokers in the non-PFO group (Table 4). Patients with PFO received PN for a median of 7 days (IQR 5-13), and 32 (84.2%) received ILE. Patients without PFO who received CPN (n = 114, 52.6% female) had a mean age of 64.9 ± 10 years and a mean BMI of 29.6 ± 8.5 at CPN initiation. Patients in this cohort received PN for a median of 7 days (IQR 5-13.5), and 113 (99.1%) received ILE. There was no difference in the incidence of ischemic CVA within 30 days of receiving CPN between the two groups (2 (5.3%) in the PFO group vs. 
1 (0.8%) in the non-PFO group; p = 0.092) (Table 4).</p><p><b>Conclusion:</b> The question of the risk of CVA/stroke with CPN in patients with PFO is clinically relevant and often lacks a definitive answer. Our study revealed no difference in ischemic CVA potentially attributable to CPN between patients with PFO and matched control patients without PFO in the first 30 days after administration of PN. This finding suggests that CPN with ILE is likely safe for patients with PFO in the inpatient setting.</p><p><b>Table 1.</b> Baseline Demographics and Clinical Characteristics.</p><p></p><p><b>Table 2.</b> PFO Diagnosis.</p><p></p><p>*All received propofol concomitantly.</p><p><b>Table 3.</b> PN Prescription.</p><p></p><p><b>Table 4.</b> Outcomes and Complications.</p><p></p><p><b>Enteral Nutrition Therapy</b></p><p>Osman Mohamed Elfadil, MBBS<sup>1</sup>; Edel Keaveney, PhD<sup>2</sup>; Adele Pattinson, RDN<sup>1</sup>; Danelle Johnson, MS, RDN<sup>1</sup>; Rachael Connolly, BSc.<sup>2</sup>; Suhena Patel, MBBS<sup>1</sup>; Yash Patel, MBBS<sup>1</sup>; Ryan Hurt, MD, PhD<sup>1</sup>; Manpreet Mundi, MD<sup>1</sup></p><p><sup>1</sup>Mayo Clinic, Rochester, MN; <sup>2</sup>Rockfield MD, Galway</p><p><b>Financial Support:</b> Rockfield Medical Devices.</p><p><b>Background:</b> Patients on home enteral nutrition (HEN), many of whom are mobile, can experience significant hardships and reduced quality of life (QoL) due to limitations on mobility, on top of the burdens of their underlying disease processes. Improving mobility while feeding could reduce the burdens associated with HEN and potentially improve QoL. 
This prospective cohort study aimed to evaluate participants' perspectives on their mobility, the ease of performing physical activities while feeding, and QoL following the use of a novel enteral feeding system (EFS).</p><p><b>Methods:</b> A prospective single-center study was conducted to evaluate a novel EFS, an FDA-cleared elastomeric system (Mobility+®) that consists of a lightweight feeding pouch (a 500-mL feed reservoir), a filling set (used in conjunction with a syringe to fill the EFS), and a feeding set that delivers EN formula to an extension set/feeding tube with an ISO 80369-3 compatible connector. Adult HEN-dependent patients were recruited by invitation to use the study EFS for a minimum of 2 feeds a day for 14 days, preceded by a familiarization period of 5-7 days. Participants rated the ease of performing typical daily activities while feeding (e.g., moving, traveling, socializing) and feeding system parameters (ease of use, portability, noise, discretion, performance) using HEN-expert-validated questionnaires. Each rating was scored from 1 to 5, with 5 being the most positive response. An overall score was calculated and averaged for the cohort. Participants were followed up during the familiarization period. On days 7 and 14, additional telephone interviews were conducted regarding compliance, enteral feed intake, participant perspectives on the study EFS vs. their current system, and other measures. We excluded those with reduced functional capacity due to their underlying disease(s).</p><p><b>Results:</b> Seventeen participants completed the study (mean age 63.8 ± 12 years; 70.6% male). Participants used various feeding systems, including gravity, bolus method, and pump, with the majority (82.4%) having a G-tube placed (Table 1). Sixteen patients (94.1%) used the study EFS for at least two feeds a day (and the majority of daily EN calories) on all study days (Table 2). 
Ratings of the ability to perform various activities using the study EFS differed significantly from those for the systems used before the study. An improvement in ratings was noted for the ease of performing common daily activities, including moving between rooms or on stairs, taking short and long walks, traveling by car or public transport, engaging in moderate- to high-intensity activities, sleeping, and socializing with family and friends, between enrollment and the end of the study (day 14) (p < 0.0001) (Table 3). Ratings of feeding system parameters also differed significantly between the systems used before the study and the study EFS (p < 0.0001) (Table 3), with the largest increases in positive ratings noted for ease of carrying, noise level, and the ability to feed discreetly. Ratings of overall satisfaction with the performance of the study EFS did not differ from those for the systems used before the study; participants reported that the main influencing factors were the length of time and the effort needed to fill the study EFS. No difference was noted in the QoL rating.</p><p><b>Conclusion:</b> The study EFS is a safe and effective enteral feeding modality that provides an alternative option for HEN recipients. Participants reported a significant positive impact of the study EFS on their activities of daily living. 
Although the overall QoL rating remained the same, improvements in mobility, discretion, and ease of carrying (aspects of QoL) were associated with the use of the study EFS.</p><p><b>Table 1.</b> Baseline Demographics and Clinical Characteristics.</p><p></p><p><b>Table 2.</b> Safety and Effectiveness.</p><p></p><p><b>Table 3.</b> Usability and Impact of the Study EFS.</p><p></p><p>Talal Sharaiha, MD<sup>1</sup>; Martin Croce, MD, FACS<sup>2</sup>; Lisa McKnight, RN, BSN MS<sup>2</sup>; Alejandra Alvarez, ACP, PMP, CPXP<sup>2</sup></p><p><sup>1</sup>Aspisafe Solutions Inc., Brooklyn, NY; <sup>2</sup>Regional One Health, Memphis, TN</p><p><b>Financial Support:</b> Talal Sharaiha is an executive of Aspisafe Solutions Inc. Martin Croce, Lisa McKnight and Alejandra Alvarez are employees of Regional One Health institutions. Regional One Health has a financial interest in Aspisafe Solutions Inc. through services, not cash. Aspisafe provided the products at no charge.</p><p><b>Background:</b> Feeding tube securement has seen minimal innovation over the past decades, leaving medical adhesives as the standard method. However, adhesives frequently fail to maintain secure positioning, with dislodgement rates reported between 36% and 62%, averaging approximately 40%. Dislodgement can lead to adverse outcomes, including aspiration, increased risk of malnutrition, higher healthcare costs, and extended nursing care. We aimed to evaluate the safety and efficacy of a novel feeding tube securement device in preventing NG tube dislodgement compared to standard adhesive tape. The device consists of a bracket that sits on the patient's upper lip. The bracket has a non-adhesive mechanism for securing feeding tubes ranging from 10 to 18 French. It extends to two cheek pads that sit on either side of the patient's cheeks and is further supported by a head strap that wraps around the patient's head (Fig. 1 and Fig. 
2).</p><p><b>Methods:</b> We conducted a prospective, case-control trial at Regional One Health Center, Memphis, TN, comparing 50 patients using the novel securement device, the NG Guard, against 50 patients receiving standard care with adhesive tape. The primary outcome was the rate of accidental or intentional NG tube dislodgement. Secondary outcomes included the number of new NG tubes required as a result of dislodgement, device-related complications, and adhesive-related skin injuries in the control group. Statistical analyses employed Student's t-test for continuous variables and Fisher's exact test for categorical variables. Significance was set at an alpha level of 0.05. We adjusted for confounding variables, including age, sex, race, and diagnosis codes related to delirium, dementia, and confusion (Table 1).</p><p><b>Results:</b> There were no significant differences between the groups in baseline characteristics, including age, sex, race, or confusion-related diagnoses (Table 2) (p ≥ 0.09). Nasogastric tube dislodgement occurred significantly more often in the adhesive tape group (31%) than in the intervention group (11%) (p < 0.05). The novel device reduced the risk of tube dislodgement by 65%. Additionally, 12 new tubes were required in the control group compared to 3 in the intervention group (p < 0.05), translating to 18 fewer reinsertion events per 100 tubes inserted in patients secured by the novel device. No device-related complications or adhesive-related injuries were reported in either group.</p><p><b>Conclusion:</b> The novel securement device significantly reduced the incidence of nasogastric tube dislodgement compared to traditional adhesive tape. 
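The comparison above reduces to Fisher's exact test on a 2×2 table plus simple risk arithmetic. A minimal standard-library sketch follows; the per-group counts (15/50 and 5/50 dislodgements) are approximated from the reported 31% and 11% rates, and the function reimplements Fisher's method rather than reproducing the authors' statistical software:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of every table with the same
    margins that is no more likely than the observed table.
    """
    row1, row2, col1, n = a + b, c + d, a + c, a + b + c + d
    denom = comb(n, col1)

    def p_table(k):  # probability of k events in the first group
        return comb(row1, k) * comb(row2, col1 - k) / denom

    p_obs = p_table(a)
    lo, hi = max(0, col1 - row2), min(col1, row1)
    return sum(p_table(k) for k in range(lo, hi + 1)
               if p_table(k) <= p_obs + 1e-12)

# Dislodgements: ~15/50 with tape vs ~5/50 with the device (approximate counts)
p = fisher_exact_two_sided(15, 35, 5, 45)
print(f"p = {p:.3f}")  # significant at alpha = 0.05

# Relative risk reduction from the reported rates: 1 - 0.11/0.31, i.e. ~65%
rrr = 1 - 0.11 / 0.31
print(f"risk reduction = {rrr:.0%}")

# New tubes: 12/50 (tape) vs 3/50 (device) -> 18 fewer reinsertions per 100 tubes
print((12 / 50 - 3 / 50) * 100)
```

With these assumed counts the two-sided p-value falls below 0.05, consistent with the significance reported above; the 65% risk reduction and the 18-per-100 figure follow directly from the published rates.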
It is a safe and effective securement method and should be considered for use in patients with nasogastric tubes to reduce the likelihood of dislodgement and the need for reinsertion of new NG tubes.</p><p><b>Table 1.</b> Diagnosis Codes Related to Dementia and Delirium.</p><p></p><p><b>Table 2.</b> Baseline Demographics.</p><p></p><p></p><p><b>Figure 1.</b> Novel Securement Device - Front View.</p><p></p><p><b>Figure 2.</b> Novel Securement Device - Side Profile.</p><p><b>Best of ASPEN-Enteral Nutrition Therapy</b></p><p><b>Poster of Distinction</b></p><p>Alexandra Kimchy, DO<sup>1</sup>; Sophia Dahmani, BS<sup>2</sup>; Sejal Dave, RDN<sup>1</sup>; Molly Good, RDN<sup>1</sup>; Salam Sunna, RDN<sup>1</sup>; Karen Strenger, PA-C<sup>1</sup>; Eshetu Tefera, MS<sup>3</sup>; Alex Montero, MD<sup>1</sup>; Rohit Satoskar, MD<sup>1</sup></p><p><sup>1</sup>MedStar Georgetown University Hospital, Washington, DC; <sup>2</sup>Georgetown University Hospital, Washington, DC; <sup>3</sup>MedStar Health Research Institute, Columbia, MD</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Early nutrition intervention is of high importance in patients with cirrhosis, given the faster onset of protein catabolism for gluconeogenesis compared with patients without liver disease. Severe malnutrition is associated with frequent complications of cirrhosis such as infection, hepatic encephalopathy, and ascites. Furthermore, studies have demonstrated higher mortality rates in cirrhotic patients who are severely malnourished in both the pre- and post-transplant settings. Current practice guidelines encourage the use of enteral nutrition in cirrhotic patients who are unable to meet their intake requirements with an oral diet. 
The aim of this study was to evaluate the utilization and implications of enteral feeding in hospitalized patients with cirrhosis diagnosed with severe protein calorie malnutrition at our institution.</p><p><b>Methods:</b> This was a retrospective study of patients admitted to the transplant hepatology inpatient service at MedStar Georgetown University Hospital from 2019-2023. ICD-10-CM code E43 was used to identify patients with a diagnosis of severe protein calorie malnutrition. The diagnosis of cirrhosis and pre-transplant status were confirmed by review of the electronic medical record. Patients with the following characteristics were excluded: absence of cirrhosis, history of liver transplant, admission diagnosis of upper gastrointestinal bleed, and/or receipt of total parenteral nutrition. Wilcoxon rank sum and two-sample t-tests were used to examine differences in the averages of continuous variables between the two groups. Chi-square and Fisher exact tests were used to investigate differences in categorical variables. Statistical significance was defined as p ≤ 0.05.</p><p><b>Results:</b> Of the 96 patients with cirrhosis and severe protein calorie malnutrition, 31 (32%) received enteral nutrition. Time from admission to initiation of enteral feeding averaged 7 days, with an average total duration of enteral nutrition of 10 days. In the group that received enteral nutrition, there was no significant change in weight, BMI, creatinine, total bilirubin, or MELD 3.0 score from admission to discharge; however, albumin, sodium, and INR levels had significantly increased (Table 1). A comparative analysis between patients with and without enteral nutrition showed a significantly greater length of stay, intensive care requirement, bacteremia, gastrointestinal bleeding, discharge MELD 3.0 score, and in-hospital mortality rate among patients with enteral nutrition. 
There was no significant difference in rates of spontaneous bacterial peritonitis, pneumonia, admission MELD 3.0 score, or post-transplant survival duration in patients with enteral nutrition compared to those without (Table 2).</p><p><b>Conclusion:</b> In this study, fewer than half (32%) of patients hospitalized with cirrhosis and a diagnosis of severe protein calorie malnutrition received enteral nutrition. Initiation of enteral nutrition was delayed by an average of a week after hospital admission. The prolonged length of stay and higher in-hospital mortality rates suggest a lack of benefit of enteral nutrition when started late in the hospital course. Based on these findings, our institution has implemented a quality improvement initiative to establish earlier enteral feeding in hospitalized patients with cirrhosis and severe protein calorie malnutrition. Future studies will evaluate the efficacy of this initiative and its implications for clinical outcomes.</p><p><b>Table 1.</b> The Change in Clinical End Points from Admission to Discharge Among Patients Who Received Enteral Nutrition.</p><p></p><p>Abbreviations: kg, kilograms; BMI, body mass index; INR, international normalized ratio; Na, sodium; Cr, creatinine; TB, total bilirubin; MELD, model for end stage liver disease; EN, enteral nutrition; Std, standard deviation.</p><p><b>Table 2.</b> Comparative Analysis of Clinical Characteristics and Outcomes Between Patients With and Without Enteral Nutrition.</p><p></p><p>Abbreviations: MASLD, metabolic dysfunction-associated steatotic liver disease; HCV, hepatitis C virus; HBV, hepatitis B virus; AIH, autoimmune hepatitis; PSC, primary sclerosing cholangitis; PBC, primary biliary cholangitis; EN, enteral nutrition; RD, registered dietitian; MICU, medical intensive care unit; PNA, pneumonia; SBP, spontaneous bacterial peritonitis; GIB, gastrointestinal bleed; MELD, model for end stage liver disease; d, days; N, number; Std, standard 
deviation.</p><p>Jesse James, MS, RDN, CNSC<sup>1</sup></p><p><sup>1</sup>Williamson Medical Center, Franklin, TN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Feeding tubes (Tubes) are used to deliver enteral nutrition to patients who cannot safely ingest nutrients and medications orally, a population at elevated risk of malnutrition and dehydration. Unfortunately, these Tubes have a propensity for becoming clogged. Staff attempt to unclog Tubes using standard bedside techniques, including warm water flushes or chemical enzymes. However, not only are these practices time-consuming, they are often unsuccessful, requiring Tube replacement. An actuated mechanical device for restoring patency in clogged small-bore Tubes was evaluated at a level 2 medical center as an alternative declogging method from September 2021 to July 2023. Study objectives were to explore the actuated mechanical device's ability to unclog indwelling Tubes and to monitor any potential safety issues.</p><p><b>Methods:</b> The TubeClear® System (actuated mechanical device, Actuated Medical, Inc., Bellefonte, PA, Figure 1) was developed to resolve clogs in various indwelling Tubes. Twenty patients (Table 1), 16 with 10 Fr, 109-cm nasogastric (NG) tubes and 4 with 10 Fr, 140-cm nasojejunal (NJ) tubes, underwent clearing attempts with the actuated mechanical device. Initially, patients underwent standard declogging strategies for a minimum of 30 minutes, including warm water flushes and Creon/NaHCO3 slurry. After these strategies failed to restore patency (n = 17), or patency was restored but the Tube reclogged (n = 3), the actuated mechanical device was attempted. Procedure time, including setup, use, and cleaning of the actuated mechanical device, was estimated from the electronic charting system to the nearest five minutes. 
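As an arithmetic aside, the cohort-level mean clearing time reported in the Results is simply the count-weighted average of the per-tube-type means; a one-line check (counts and per-type means taken from this abstract):

```python
# Tube counts and mean clearing times (minutes) as reported in the abstract
n_ng, mean_ng = 16, 25.0   # nasogastric tubes
n_nj, mean_nj = 4, 32.5    # nasojejunal tubes

# The overall mean is the count-weighted average of the two groups
overall = (n_ng * mean_ng + n_nj * mean_nj) / (n_ng + n_nj)
print(overall)  # 26.5
```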
All clearing procedures were completed by three trained registered dietitians.</p><p><b>Results:</b> The average time to restore Tube patency (n = 20) was 26.5 minutes (25 minutes for NG, 32.5 minutes for NJ) with 90% success (Table 2), and no significant safety issues were reported by the operator or patient. User satisfaction was 100% (20/20), and patient discomfort was reported in 10% (2/20).</p><p><b>Conclusion:</b> Based on these results, the actuated mechanical device was significantly more successful at resolving clogs than alternative bedside practices. Operators noted that the “Actuated mechanical device was able to work with clogs when slurries/water can't be flushed.” It was noted that actuated mechanical device use prior to formation of a full clog, utilizing a prophylactic approach, “was substantially easier than waiting until the Tube fully clogged.” For a partly clogged Tube, “despite it being somewhat patent and useable, a quick pass of the actuated mechanical device essentially restored full patency and likely prevented a full clog.” For an NG patient, “no amount of flushing or medication slurry was effective, but the actuated mechanical device worked in just minutes without issue.” “Following standard interventions failure after multiple attempts, only the actuated mechanical device was able to restore Tube patency, saving money on not having to replace Tube.” For a failed clearance, the operator noted “that despite failure to restore patency, there was clearly no opportunity for flushes to achieve a better result and having this option [actuated mechanical device] was helpful to attempt to avoid tube replacement.” For an NJ patient, “there would have been no other conventional method to restore patency of a NJ small bore feeding tube without extensive x-ray exposure and 'guess work,' which would have been impossible for this patient who was critically ill and ventilator dependent.” Having an alternative to standard bedside unclogging techniques proved 
beneficial to this facility, with 90% effectiveness, sparing those patients a Tube replacement and saving the facility the associated replacement costs.</p><p><b>Table 1.</b> Patient and Feeding Tube Demographics.</p><p></p><p><b>Table 2.</b> Actuated Mechanical Device Uses.</p><p></p><p></p><p><b>Figure 1.</b> Actuated Mechanical Device for Clearing Partial and Fully Clogged Indwelling Feeding Tubes.</p><p>Vicki Emch, MS, RD<sup>1</sup>; Dani Foster<sup>2</sup>; Holly Walsworth, RD<sup>3</sup></p><p><sup>1</sup>Aveanna Medical Solutions, Lakewood, CO; <sup>2</sup>Aveanna Medical Solutions, Chandler, AZ; <sup>3</sup>Aveanna Medical Solutions, Erie, CO</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Homecare providers have managed through multiple formula backorders since the pandemic. Through creative problem-solving, clinicians have successfully been able to offer substitutions. However, when pump feeding sets are on backorder, options are limited; feeding sets are specific to the brand of pump. Recently, a backorder of feeding sets used for pumps common in acute and home care resulted in a severe shortage in the home care supply chain. This required fast action to ensure patients were able to continue administering their tube feeding and to prevent readmissions. One solution is to change the patient to a pump brand that is not on backorder. Normally, transitioning a patient to a different brand of pump would require in-person teaching. Due to the urgency of the situation, a more efficient method needed to be established. A regional home care provider determined that 20% of patients using enteral feeding pumps were using backordered sets, and 50% were pediatric patients, who tend not to tolerate other methods of feeding. In response, this home care provider assembled a team to create a new educational model for pump training. 
The team was composed of Registered Nurses, Registered Dietitians, and patient care and distribution representatives. The hypothesis was that providing high-quality educational material with instructional videos, detailed communication on the issue, and telephonic clinical support would allow for a successful transition.</p><p><b>Methods:</b> To determine urgency of transition, we prioritized patients with a diagnosis of short bowel syndrome, gastroparesis, or glycogen storage disease; ventilator dependency; age < 2 years; or residence in a rural area with a 2-day shipping zip code, and conducted a clinical review to identify patients with jejunal feeding tubes (see Table 1). A pump conversion team contacted patients/caregivers to review the situation, discuss options for nutrition delivery, determine current inventory of sets, assess urgency for transition, and coordinate pump, set, and educational material delivery. Weekly reporting tracked the number of patients using the impacted pump, transitioned patients, and those requesting to transition back to their original pump.</p><p><b>Results:</b> A total of 2111 patients were using the feeding pump with backordered sets, and 50% of these patients were under 12 years of age. Over a period of 3 months, 1435 patients (68% of this patient population) were successfully transitioned to a different brand of pump, and of those, only 7 patients (0.5%) requested to return to their original pump even though they understood the risk of potentially running short on feeding sets. 
(See Figure 1).</p><p><b>Conclusion:</b> A team approach that included proactively communicating with patients/caregivers, prioritizing patient risk level, providing high-quality educational material with video links, and outbound calls from a clinician resulted in a successful transition to a new brand of feeding pump.</p><p><b>Table 1.</b> Patient Priority Levels for Pump with Backordered Sets.</p><p></p><p></p><p><b>Figure 1.</b> Number of Pump Conversions.</p><p>Desiree Barrientos, DNP, MSN, RN, LEC<sup>1</sup></p><p><sup>1</sup>Coram CVS, Chino, CA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Follow-up care for the enteral nutrition therapy community is essential for good outcomes. No data had been collected regarding home enteral nutrition (HEN) outcomes at a major university medical center. There was no robust program in place to follow up with patients who were discharged on tube feedings. Consequently, there was little information regarding the follow-up care or patient outcomes related to current practice, complications, re-hospitalizations, and equipment issues for this population.</p><p><b>Methods:</b> The tools utilized were the questionnaire for the 48-hour and 30-day post-discharge outreach calls, pre-discharge handouts, and feeding pump handouts.</p><p><b>Results:</b> Education: comparison at 48 hours and 30 days. Q1: Can you tell me why you were hospitalized? Q2: Did a provider contact you for education prior to discharging home? Q3: Do you understand your nutrition orders from your doctor? Q4: Can you tell me the steps of how to keep your PEG/TF site clean? Q5: Can you tell me how much water to flush your tube? There was an improvement in patient understanding, self-monitoring, and navigation between the two survey timepoints. 
Regarding patient education, in Q3 understanding of nutrition orders improved from 91% to 100%, in Q4 knowledge of the steps to keeping the tube feeding site clean improved from 78% to 96%, and knowledge of water flushes before and after each feeding improved from 81% to 100% at the 48-hour and 30-day timepoints, respectively.</p><p><b>Conclusion:</b> There was an improvement in patient understanding, self-monitoring, and navigation between the two survey timepoints. Verbal responses to open-ended and informational questions were aggregated to analyze complications, care gaps, and service failures.</p><p><b>Table 1.</b> Questionnaire Responses At 48 Hours and 30 Days.</p><p></p><p><b>Table 2.</b> Questionnaire Responses At 48 Hours and 30 Days.</p><p></p><p></p><p><b>Figure 1.</b> Education: Comparison at 48-hours and 30-days.</p><p></p><p><b>Figure 2.</b> Self-monitoring and Navigation: Comparison at 48-hours and 30-days.</p><p>Rachel Ludke, MS, RD, CD, CNSC, CCTD<sup>1</sup>; Cayla Marshall, RD, CD<sup>2</sup></p><p><sup>1</sup>Froedtert Memorial Lutheran Hospital, Waukesha, WI; <sup>2</sup>Froedtert Memorial Lutheran Hospital, Big Bend, WI</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Initiation of early enteral nutrition plays an essential role in improving patient outcomes<sup>1</sup>. Historically, feeding tubes have been placed by nurses, doctors, and advanced practice providers. Over the past two decades, the prevalence of dietitians (RDNs) placing feeding tubes at the bedside has grown. 
This practice has been endorsed by the Academy of Nutrition and Dietetics and the American Society for Parenteral and Enteral Nutrition through modifications to the Scope of Practice for the Registered Dietitian Nutritionist and Standards of Professional Performance for Registered Dietitian Nutritionists.<sup>2,3</sup> Feeding tubes placed at the bedside by RDNs have the potential to decrease nursing, fluoroscopy, and internal transport time, which is of interest to our hospital. In fall of 2023, we launched a pilot to evaluate the feasibility of an RDN-led bedside tube placement team at our 800-bed level 1 trauma center.</p><p><b>Methods:</b> RDNs first worked closely with nursing leadership to create a tube placement workflow to identify appropriate patients, outline communication needed with the bedside nurse and provider, and establish troubleshooting solutions (Figure 1). Intensive care unit nursing staff then trained RDNs in tube placement using camera-guided technology (IRIS) and deemed RDNs competent after 10 successful tube placements. Given limited literature on RDN-led tube placement, we defined success as >80% of tube placements in an appropriate position within the gastrointestinal tract.</p><p><b>Results:</b> To date, the pilot includes 57 patients; 46 tubes (80.7%; 39 gastric and 7 post-pyloric) were placed successfully, as confirmed by KUB. Of note, 2 of the 39 gastric tubes were originally ordered to be placed post-pylorically; however, gastric tubes were deemed acceptable for these patients after issues were identified during placement. Eleven (19%) tube placements were unsuccessful due to behavioral issues, blockages in the nasal cavity, and anatomical abnormalities.</p><p><b>Conclusion:</b> This pilot demonstrated that well-trained RDNs can successfully place feeding tubes at the bedside using camera-guided tube placement technology. One limitation of this pilot is the small sample size. 
We initially limited the pilot to 2 hospital floors and had trouble educating nursing staff on the availability of the RDN to place tubes. Through evaluation of tube placement orders, we found the floors placed 198 tubes during our pilot, indicating 141 missed opportunities for RDNs to place tubes. To address these issues, we created numerous educational bulletins and worked with unit nursing staff to encourage contacting the RDN when feeding tube placement was needed. We also expanded the pilot hospital-wide and are looking into time periods when tubes are most often placed. Anecdotally, bedside feeding tube placement takes 30 to 60 minutes; therefore, this pilot saved 1350 to 2700 minutes of nursing time and 180 to 360 minutes of fluoroscopy time necessary to place post-pyloric tubes. Overall, our pilot has demonstrated the feasibility of RDN-led bedside feeding tube placement, allowing staff RDNs to practice at the top of their scope and promoting effective use of hospital resources.</p><p></p><p><b>Figure 1.</b> Dietitian Feeding Tube Insertion Pilot: 2NT and 9NT.</p><p>Lauren Murch, MSc, RD<sup>1</sup>; Janet Madill, PhD, RD, FDC<sup>2</sup>; Cindy Steel, MSc, RD<sup>3</sup></p><p><sup>1</sup>Nestle Health Science, Cambridge, ON; <sup>2</sup>Brescia School of Food and Nutritional Sciences, Faculty of Health Sciences, Western University, London, ON; <sup>3</sup>Nestle Health Science, Hamilton, ON</p><p><b>Financial Support:</b> Nestle Health Science.</p><p><b>Background:</b> Continuing education (CE) is a component of professional development which serves two functions: maintaining practice competencies and translating new knowledge into practice. Understanding registered dietitian (RD) participation in and perceptions of CE facilitates creation of more effective CE activities to enhance knowledge acquisition and practice change. 
This preliminary analysis of a practice survey describes RD participation in CE and evaluates barriers to CE participation.</p><p><b>Methods:</b> This was a cross-sectional survey of clinical RDs working across care settings in Canada. Targeted participants (n = 4667), identified using a convenience sample and in accordance with applicable law, were invited to complete a 25-question online survey between November 2023 and February 2024. Descriptive statistics and frequencies were reported.</p><p><b>Results:</b> Nationally, 428 RDs working in acute care, long term care, and home care fully or partially completed the survey (9.1% response rate). Respondents indicated the median ideal number of CE activities per year was 3 in-person activities, 5 reviews of written materials, and 6 virtual activities. However, participation was most frequently reported as “less often than ideal” for in-person activities (74.7% of respondents) and written material (53.6%) and “as often as ideal” for virtual activities (50.7%). Pre-recorded video presentations, live virtual presentations, and critically reviewing written materials were the most common types of CE that RDs had participated in at least once in the preceding 12 months. In-person hands-on sessions, multimodal education, and simulations were the least common types of CE that RDs had encountered in the preceding 12 months (Figure 1). The most frequent barriers to participation in CE were cost (68% of respondents), scheduling (60%), and location (51%). However, the encountered barriers that most greatly limited participation in CE were inadequate staff coverage, lack of dedicated education days within the role, and lack of dedicated time during work hours (Table 1). 
When deciding to participate in CE, RDs ranked the most important aspects of the content as 1) from a credible source, 2) a specific/narrow topic relevant to practice, and 3) enabling use of practical tools/skills at the bedside.</p><p><b>Conclusion:</b> These data suggest there is opportunity for improvement in RD CE participation, with the greatest gaps in in-person and written activities. Although RDs recognize the importance of relevant and practical content, they reported infrequent exposure to the types of CE that are well suited to this, such as simulations and hands-on activities. When planning and executing CE, content should be credible, relevant, and practical, using a format that is both accessible and impactful. Results of this study help benchmark Canadian RD participation in CE and provide evidence to address barriers and maximize participation.</p><p><b>Table 1.</b> Frequent and Impactful Barriers Limiting Participation in CE Activities.</p><p></p><p>Note: 1. Percentages total greater than 100% because all respondents selected the 3 most important barriers impacting their participation in CE activities. 2. 
Items are ranked based on a weighted score calculated from a 5-point Likert scale, indicating the extent to which the barrier was perceived to have limited participation in CE activities.</p><p></p><p><b>Figure 1.</b> Types of Continuing Education Activities Dietitians Participated In At Least Once, In The Preceding 12 Months.</p><p>Karen Sudders, MS, RDN, LDN<sup>1</sup>; Alyssa Carlson, RD, CSO, LDN, CNSC<sup>2</sup>; Jessica Young, PharmD<sup>3</sup>; Elyse Roel, MS, RDN, LDN, CNSC<sup>2</sup>; Sophia Vainrub, PharmD, BCPS<sup>4</sup></p><p><sup>1</sup>Medtrition, Huntingdon Valley, PA; <sup>2</sup>Endeavor Health/Aramark Healthcare +, Evanston, IL; <sup>3</sup>Parkview Health, Fort Wayne, IN; <sup>4</sup>Endeavor Health, Glenview, IL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Adequate nutrition is a critical component of patient care and plays a significant role in reducing hospital length of stay (LOS). Malnutrition is associated with numerous adverse outcomes, including increased morbidity, delayed recovery, and prolonged hospitalization (Tappenden et al., 2013). Timely nutrition interventions and strategies for integrating successful nutrition care into hospital protocols to reduce LOS are essential parts of medical nutrition therapy. Modular nutrition often plays a key role in these interventions. A study by Klek et al. (2020) emphasizes the role of modular nutrition in providing personalized nutritional support for the specific needs of critically ill patients. The study suggests that using nutrient modules allows for a more precise adjustment of nutrition based on the metabolic requirements of patients (Klek et al., 2020). Modular nutrition has also been shown to positively impact clinical outcomes in ICU patients. An observational study by Compher et al. 
(2019) reported that the targeted administration of protein modules to achieve higher protein intake was associated with improved clinical outcomes, such as reduced ICU LOS (Compher et al., 2019).</p><p><b>Methods:</b> Administration of modular nutrition can be a challenge. Typically, modular proteins (MP) are ordered through the dietitian and dispensed as part of the diet order. The nursing team is responsible for administration and documentation of the MP. There is commonly a disconnect between MP prescription and administration; in some cases, this is related to the MP not being a tracked task in the electronic health record (EHR). The goal of this evaluation was to review data related to a quality improvement (QI) initiative in which MP (ProSource TF) was added to the medication administration record (MAR), which used a barcode scanning process to track provision and documentation of MP. The objective of this evaluation was to determine a possible correlation between the QI initiative and patients’ ICU LOS. The QI initiative evaluated a pre-implementation timeframe from June 1<sup>st</sup>, 2021 to November 30<sup>th</sup>, 2021, and a post-implementation timeframe from January 1<sup>st</sup>, 2022 to June 30<sup>th</sup>, 2022. There were a total of 1962 ICU encounters in the pre-implementation period and 1844 ICU encounters in the post-implementation period. The data were analyzed using a series of statistical tests.</p><p><b>Results:</b> The t-test for the total sample was significant, t(3804) = 8.35, p < .001, indicating the average LOS was significantly lower post-implementation compared to pre-implementation (Table 1). This finding suggests that improved provision of MP may be related to a reduced LOS in the ICU. In addition to LOS, we can also suggest a relationship between the MAR and MP utilization. Pre-implementation, 1600 doses of MP were obtained, increasing to 2400 doses obtained post-implementation. 
The data suggest a correlation between product use and MAR implementation even though overall encounters post-implementation were reduced; there was a 50% increase in product utilization post-implementation compared to the pre-implementation period.</p><p><b>Conclusion:</b> These data suggest a benefit of adding MP to the MAR to help improve provision, streamline documentation, and potentially reduce ICU LOS.</p><p><b>Table 1.</b> Comparison of LOS Between Pre and Post Total Encounters.</p><p></p><p>Table 1 displays the t-test comparison of LOS pre vs post implementation of MP on the MAR.</p><p></p><p><b>Figure 1.</b> Displays Product Utilization and Encounters Pre vs Post Implementation of MP on the MAR.</p><p><b>International Poster of Distinction</b></p><p>Eliana Giuntini, PhD<sup>1</sup>; Ana Zanini, RD, MSc<sup>2</sup>; Hellin dos Santos, RD, MSc<sup>2</sup>; Ana Paula Celes, MBA<sup>2</sup>; Bernadette Franco, PhD<sup>3</sup></p><p><sup>1</sup>Food Research Center/University of São Paulo, São Paulo; <sup>2</sup>Prodiet Medical Nutrition, Curitiba, Parana; <sup>3</sup>Food Research Center/School of Pharmaceutical Sciences/University of São Paulo, São Paulo</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Critically ill patients have an increased need for protein to preserve muscle mass due to anabolic resistance. Additionally, these patients are more prone to developing hyperglycemia, and one nutritional strategy that can be adopted is to provide a diet with a low glycemic index. Hypercaloric, high-protein enteral formulas can help meet energy and protein goals for these patients. Due to their reduced carbohydrate content, these formulas can also contribute to lowering the postprandial glycemic response. 
The study aimed to evaluate the glycemic index (GI) and glycemic load (GL) of a specialized high-protein enteral nutrition formula.</p><p><b>Methods:</b> Fifteen healthy volunteers were selected based on self-reported absence of diseases or regular medication use, aged between 21 and 49 years, with normal glucose tolerance according to fasting and postprandial glucose assessments over 2 hours. The individuals attended after a 10-hour fast, once per week, consuming the glucose solution (reference food) for 3 weeks and the specialized high-protein enteral formula (Prodiet Medical Nutrition) in the following week, both in amounts equivalent to 25 g of available carbohydrates. The specialized high-protein formula provides 1.5 kcal/mL, 26% protein (98 g/L), 39% carbohydrates, and 35% lipids, including EPA + DHA. Capillary blood sampling was performed at regular intervals: at 0 (before consumption), 15, 30, 45, 60, 90, and 120 minutes. The incremental area under the curve (iAUC) was calculated, excluding areas below the fasting line. Glycemic load was determined using the equation GL = [GI (glucose as reference) × grams of available carbohydrates in the portion]/100. Student's t-tests were conducted to identify differences (p < 0.05).</p><p><b>Results:</b> To consume 25 g of available carbohydrates, the individuals ingested 140 g of the high-protein formula. The high-protein enteral formula showed a low GI (GI = 23), with a significant difference compared to glucose (p < 0.0001), and a low GL (GL = 8.2). The glycemic curve data showed significant differences at all time points between glucose and the specialized high-protein formula, except at T90, with the glycemic peak occurring at T30 for glucose (126 mg/dL) and at both T30 and T45 for the specialized high-protein enteral formula, with values significantly lower than glucose (102 vs 126 mg/dL). 
The iAUC was smaller for the specialized high-protein formula compared to glucose (538 ± 91 vs 2061 ± 174 mg/dL × min) (p < 0.0001), exhibiting a curve without a high peak, typical of foods with a low glycemic index.</p><p><b>Conclusion:</b> The specialized high-protein enteral nutrition formula showed a low GI and GL, resulting in a significantly reduced postprandial glycemic response, with lower glucose elevation and variation. This may reduce insulin requirements and glycemic variability.</p><p></p><p><b>Figure 1.</b> Mean Glycemic Response of Volunteers (N = 15) to 25 G of Available Carbohydrates After Consumption of Reference Food and a Specialized High-Protein Enteral Nutrition Formula, in 120 Min.</p><p>Lisa Epp, RDN, LD, CNSC, FASPEN<sup>1</sup>; Bethaney Wescott, APRN, CNP, MS<sup>2</sup>; Manpreet Mundi, MD<sup>2</sup>; Ryan Hurt, MD, PhD<sup>2</sup></p><p><sup>1</sup>Mayo Clinic Rochester, Rochester, MN; <sup>2</sup>Mayo Clinic, Rochester, MN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Hypnotherapy is the use of hypnosis for the treatment of a medical or psychological disorder. Specifically, gut-directed hypnotherapy is a treatment option for functional gastrointestinal disorders and disorders of the gut-brain axis. It has been shown to be effective in management of GI symptoms such as abdominal pain, nausea, functional dyspepsia, and irritable bowel syndrome symptoms. Evidence suggests that 6%–19% of patients with these GI symptoms exhibit characteristics of avoidant/restrictive food intake disorder (ARFID). Multiple studies show improvement in GI symptoms and the ability to maintain that improvement after 1 year. 
However, there is a paucity of data regarding use of hypnotherapy in home enteral nutrition patients.</p><p><b>Methods:</b> A case report is presented involving a 67-year-old female with a history of irritable bowel syndrome (diarrhea predominant) and new mucinous appendiceal cancer status post debulking of an abdominal tumor, including colostomy and distal gastrectomy. She was on parenteral nutrition (PN) for 1 month postoperatively due to delayed return of bowel function before her oral diet was advanced. Unfortunately, she had difficulty weaning from PN as she was “scared to start eating” due to functional dysphagia with gagging at the sight of food, even on TV. After 4 weeks of PN, a nasojejunal feeding tube was placed and she was dismissed home.</p><p><b>Results:</b> At a multidisciplinary outpatient nutrition clinic visit, the patient was dependent on enteral nutrition and reported inability to tolerate oral intake for unclear reasons. Long-term enteral access was discussed; however, the patient wished to avoid this and asked for alternative interventions she could try to help her eat. She was referred for gut-directed hypnotherapy. After 4 in-person sessions of hypnotherapy over 3 weeks, the patient was able to tolerate increasing amounts of oral intake and remove her nasojejunal feeding tube. Upon follow-up 4 months later, she was still eating well and continued to praise the outcome she received from gut-directed hypnotherapy.</p><p><b>Conclusion:</b> Patient-centered treatments for gut-brain axis disorders and disordered eating behaviors and/or eating disorders are important to consider in addition to nutrition support. These include but are not limited to cognitive behavior therapy, mindfulness interventions, acupuncture, biofeedback strategies, and gut-directed hypnotherapy. Group, online, and therapist-directed therapies could be considered as treatment avenues depending on patient needs and preferences. 
Additional research is needed to better delineate the impact of these treatment modalities in the home enteral nutrition population.</p><p>Allison Krall, MS, RD, LD, CNSC<sup>1</sup>; Cassie Fackler, RD, LD, CNSC<sup>1</sup>; Gretchen Murray, RD, LD, CNSC<sup>1</sup>; Amy Patton, MHI, RD, CNSC, LSSGB<sup>2</sup></p><p><sup>1</sup>The Ohio State University Wexner Medical Center, Columbus, OH; <sup>2</sup>The Ohio State University Wexner Medical Center, Westerville, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> It is well documented that unnecessary hospital admissions can have a negative impact on patients’ physical and emotional wellbeing and can increase healthcare costs.<sup>1</sup> Numerous strategies exist to limit unnecessary hospital admissions; one innovative strategy being utilized at our 1000+ bed academic medical center involves Registered Dietitians (RDs). Literature demonstrates that feeding tube placement by a dedicated team using electromagnetic tracking improves patient morbidity and mortality and is a cost-effective solution for this procedure.<sup>2</sup> RDs have been part of feeding tube teams for many years, though the exact number of RD-only teams is unclear.<sup>3</sup> The Revised 2021 Standards of Practice and Standards of Professional Performance for RDNs (Competent, Proficient, and Expert) in Nutrition Support identifies that dietitians at the “expert” level strive for additional experience and training, and thus may serve a vital role in feeding tube placement teams.<sup>4</sup> Prior to the implementation of the RD-led tube team, there was no uniform process at our hospital for these patients to obtain enteral access in a timely manner.</p><p><b>Methods:</b> In December 2023, an “RD tube team” consult and order set went live within the electronic medical record at our hospital. 
The original intent of the tube team was to cover the inpatient units, but it soon became apparent that there were opportunities to extend this service to observation areas and the Emergency Department (ED). This case series abstract outlines case studies from three patients with various clinical backgrounds and how the tube team was able to prevent an inpatient admission. Patient 1: An 81-year-old female who returned to the ED on POD #4 s/p esophageal repair with a dislodged nasoenteric feeding tube. The RD tube team was consulted, replaced her tube, and bridled it in place. The patient discharged from the ED without requiring hospital readmission. Patient 2: An 81-year-old male with a history of ENT cancer who transferred to our ED after the outside hospital ED had no trained staff available to replace his dislodged nasoenteric feeding tube. The RD tube team replaced his tube and bridled it into place. The patient was able to discharge from the ED without admission. Patient 3: A 31-year-old female with a complex GI history and multiple prolonged hospitalizations due to PO intolerance. The patient returned to the ED 2 days post discharge with a clogged nasoenteric feeding tube. The tube was unable to be unclogged, thus the RD tube team replaced the tube in the ED and prevented readmission.</p><p><b>Results:</b> Consult volumes validated that there was a need for a tube team service. In the first 8 months of consult and order set implementation, a total of 403 tubes were placed by the RD team. Of those, 24 (6%) were placed in the ED and observation units. In the 3 patient cases described above, and in numerous other patient cases, the RD was able to successfully place a tube using an electromagnetic placement device (EMPD), thereby preventing a patient admission.</p><p><b>Conclusion:</b> Creating a feeding tube team can be a complex process to navigate and requires support from senior hospital administration, physician champions, nursing teams, and legal/risk management teams. 
Within the first year of implementation, our hospital system was able to demonstrate that RD-led tube teams have the potential not only to help with establishing safe enteral access for patients, but also to be an asset to the medical facility by preventing admissions and readmissions.</p><p><b>Table 1.</b> RD Tube Team Consults (December 11, 2023-August 31, 2024).</p><p></p><p>Arina Cazac, RD<sup>1</sup>; Joanne Matthews, RD<sup>2</sup>; Kirsten Willemsen, RD<sup>3</sup>; Paisley Steele, RD<sup>4</sup>; Savannah Zantingh, RD<sup>5</sup>; Sylvia Rinaldi, RD, PhD<sup>2</sup></p><p><sup>1</sup>Internal Equilibrium, King City, ON; <sup>2</sup>London Health Sciences Centre, London, ON; <sup>3</sup>NutritionRx, London, ON; <sup>4</sup>Vanier Children's Mental Wellness, London, ON; <sup>5</sup>Listowel-Wingham and Area Family Health Team, Wingham, ON</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Parkinson's disease is the second most prevalent neurodegenerative disease, and dysphagia is a predominant disease-related symptom. Dysphagia can increase the risk of aspiration, the inhaling of food or liquids into the lungs, potentially instigating the onset of pneumonia, a recurrent cause of death in patients with Parkinson's disease. Therefore, feeding tubes are placed to deliver nutrition into the stomach or the small intestine to maintain appropriate nutrition delivery and reduce the risk of aspiration from oral intake. To the best of our knowledge, there is no research comparing the differences in outcomes between gastric (G) and jejunal (J) tube feeding in patients with Parkinson's disease-related dysphagia; however, limited research does exist in critically ill populations comparing these two modalities. 
The purpose of this study is to compare differences in hospital readmissions related to aspiration events and differences in mortality rates after placement of a gastric or jejunal feeding tube in patients with Parkinson's disease-related dysphagia.</p><p><b>Methods:</b> This was a retrospective chart review of patients admitted to either the medicine or clinical neurosciences units at University Hospital in London, Ontario, Canada, between January 1, 2010, and December 31, 2022. Patients were included if they had a documented diagnosis of Parkinson's or associated diseases and had a permanent feeding tube placed during hospital admission that was used for enteral nutrition. Patients were excluded from the study if they had a comorbidity that would affect their survival, such as cancer, or if a feeding tube was placed for a reason unrelated to Parkinson's disease-related dysphagia, for example, feeding tube placement post-stroke. A p-value < 0.05 was considered statistically significant.</p><p><b>Results:</b> Twenty-five participants were included in this study; 7 had gastric feeding tubes, and 18 had jejunal feeding tubes. Demographic data are shown in Table 1. No statistically significant differences were found in demographic variables between the G- and J-tube groups. Of interest, none of the participants with dementia (28% of the cohort) was discharged home; 5 were discharged to long-term care, 1 was discharged to a complex continuing care facility, and 1 died in hospital. Differences in readmission rates and mortality between groups did not reach significance, likely due to our small sample size in the G-tube group (Figures 1 and 2).
However, we found that 50% of participants were known to have died within 1 year of initiating enteral nutrition via their permanent feeding tube, and there was a trend toward higher readmission rates in the G-tube group.</p><p><b>Conclusion:</b> While this study did not yield statistically significant results, it highlights the need for further research with a larger sample size to assess confounding factors, such as concurrent oral intake, that may affect differences in outcomes between G- and J-tube groups. Future research would also benefit from examining the impact on quality of life in these patients. Additional research is necessary to inform clinical practice guidelines and clinical decision-making for clinicians, patients, and families when considering a permanent feeding tube.</p><p><b>Table 1.</b> Participant Demographics.</p><p></p><p></p><p>Readmission rates were calculated as a percentage of the number of readmissions relative to the number of hospital discharges within the defined timeframes. If a participant was readmitted more than once within the defined timeframes, each subsequent readmission was counted as a new readmission and new discharge event. Readmission rate calculations did not include participants who died during or after the defined timeframes. Differences in readmission rates between gastric and jejunal feeding tube groups did not reach statistical significance.</p><p><b>Figure 1.</b> Readmission Rate.</p><p></p><p>Mortality rates were calculated from the time that enteral nutrition was initiated through a permanent feeding tube in 30-day, 60-day, 90-day, and 1-year time intervals.
Differences in mortality rates between gastric and jejunal feeding tube groups did not reach statistical significance.</p><p><b>Figure 2.</b> Mortality Rate.</p><p>Jennifer Carter, MHA, RD<sup>1</sup></p><p><sup>1</sup>Winchester Medical Center, Valley Health, Winchester, VA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Early enteral nutrition has been shown to improve patient outcomes and can decrease, or attenuate, the progression of malnutrition. Placement of nasoenteric feeding tubes was deemed within the scope of practice (SOP) for Registered Dietitian Nutritionists (RDNs) in 2007, with recent revisions in 2021, by the Academy of Nutrition and Dietetics (AND) and the American Society for Parenteral and Enteral Nutrition (ASPEN). With enhanced order-writing privileges, RDNs are well positioned to identify patients in need of enteral nutrition. The goal of this abstract is to highlight the efficiency of an RDN-led nasoenteric tube placement team.</p><p><b>Methods:</b> A retrospective chart review of the first 70 patients who received a nasoenteric tube placed by an RDN in 2023 was conducted. Data points collected included time from tube order to tube placement and time from tube order to enteral nutrition order.</p><p><b>Results:</b> Of the 70 tubes placed, the average time from tube order to tube placement was 2.28 hours. The longest time from tube order to placement was 19 hours. The average time from tube order to enteral nutrition order was 5.58 hours. The longest time from tube order to enteral nutrition order was 23.6 hours.</p><p><b>Conclusion:</b> This retrospective review reflects the timeliness of placement and provision of enteral nutrition in the acute care setting when performed by an all-RDN team. Overall, placement occurred within 2.5 hours of the tube placement order, and enteral nutrition orders were entered within 6 hours of the tube placement order.
The RDNs at Winchester Medical Center have been placing nasoenteric feeding tubes since 2013 using an electromagnetic tube placement device (EMPD). As of 2018, it became an all-RDN team. With enhanced support from AND and ASPEN, nasoenteric feeding tube placement continues to be promoted within the SOP for RDNs. Also, the Accreditation Council for Education in Nutrition and Dietetics (ACEND) now requires dietetic interns to learn, observe, and even assist with nasoenteric tube placements. Over time, more RDNs in the acute care setting will likely advance their skillset to include this expertise.</p><p></p><p><b>Figure 1.</b> Time From MD Order to Tube Placement in Hours.</p><p></p><p><b>Figure 2.</b> Time From MD Order of Tube to Tube Feed Order in Hours.</p><p><b>Poster of Distinction</b></p><p>Vanessa Millovich, DCN, MS, RDN, CNSC<sup>1</sup>; Susan Ray, MS, RD, CNSC, CDCES<sup>2</sup>; Robert McMahon, PhD<sup>3</sup>; Christina Valentine, MD, RDN, FAAP, FASPEN<sup>4</sup></p><p><sup>1</sup>Kate Farms, Hemet, CA; <sup>2</sup>Kate Farms, Temecula, CA; <sup>3</sup>Seven Hills Strategies, Columbus, OH; <sup>4</sup>Kate Farms, Cincinnati, OH</p><p><b>Financial Support:</b> Kate Farms provided all financial support.</p><p><b>Background:</b> Whole food plant-based diets have demonstrated metabolic benefits across many populations. The resulting increased intake of dietary fiber and phytonutrients is integral to the success of this dietary pattern due to the positive effect on the digestive system. Patients dependent on tube feeding may not receive dietary fiber or sources of phytonutrients, and the impact of this is unknown. Evidence suggests that the pathways that promote digestive health include more than traditional prebiotic sources from carbohydrate fermentation. Data on protein fermentation metabolites and potential adverse effects on colon epithelial cell integrity are emerging.
These lesser-known metabolites, branched-chain fatty acids (BCFAs), are produced through proteolytic fermentation. Emerging research suggests that the overproduction of BCFAs via protein fermentation may be associated with toxic by-products like p-cresol. These resulting by-products may play a role in digestive disease pathogenesis. Enteral formulas are often used to support the nutritional needs of those with digestive conditions. Plant-based formulations made with yellow pea protein have been reported to improve GI tolerance symptoms. However, the underlying mechanisms responsible have yet to be investigated. The purpose of this study was to assess the impact of a mixed food matrix enteral formula containing pea protein, fiber, and phytonutrients on various markers of gut health in healthy children and adults, using an in-vitro model.</p><p><b>Methods:</b> Stool samples from 10 healthy pediatric and 10 adult donors were collected and stored at -80°C. The yellow pea protein formulas (Kate Farms™ Pediatric Standard 1.2 Vanilla-P1, Pediatric Peptide 1.0 Vanilla-P2, and Standard 1.4 Plain-P3) were first predigested using standardized intestinal processing to simulate movement along the digestive tract. The in-vitro model was ProDigest's Colon-on-a-Plate (CoaP®) simulation platform, which has demonstrated in vivo-in vitro correlation. Measurements of microbial metabolic activity included pH, gas production, short-chain fatty acids (SCFAs), BCFAs, ammonia, and microbiota shifts. Paired two-sided t-tests were performed to evaluate differences between treatment and control. Differential abundance analysis was performed using LEfSe and treeclimbR. Statistical significance, as compared to negative control, is indicated by a p-value of < 0.05.</p><p><b>Results:</b> In the pediatric group, the microbial analysis showed significant enrichment of Bifidobacteria as well as the butyrate-producing genera Agathobacter and Agathobaculum with the use of the pediatric formulas when compared to the control.
P1 resulted in a statistically significant reduction of BCFA production (p ≤ 0.05). P1 and P2 resulted in statistically significant increases in acetate and propionate. In the adult group, with treatment using P3, microbial analysis showed significant enrichment of Bifidobacteria compared to the control group. P3 also resulted in a reduction of BCFAs, although this was not statistically significant. Gas production and the drop in pH were statistically significant (p ≤ 0.05) for all groups (P1, P2, and P3) compared to control, indicating microbial activity.</p><p><b>Conclusion:</b> All enteral formulas demonstrated a consistent prebiotic effect on the gut microbial community composition in healthy pediatric and adult donors. These findings provide insight into the mechanisms related to digestive health and highlight the importance of designing prospective interventional research to better understand the role of fiber and phytonutrients within enteral products.</p><p>Hill Johnson, MEng<sup>1</sup>; Shanshan Chen, PhD<sup>2</sup>; Garrett Marin<sup>3</sup></p><p><sup>1</sup>Luminoah Inc, Charlottesville, VA; <sup>2</sup>Virginia Commonwealth University, Richmond, VA; <sup>3</sup>Luminoah Inc, San Diego, CA</p><p><b>Financial Support:</b> Research was conducted with the support of VBHRC's VA Catalyst Grant Funding.</p><p><b>Background:</b> Medical devices designed for home use must prioritize user safety, ease of operation, and reliability, especially in critical activities such as enteral feeding. This study aimed to validate the usability, safety, and overall user satisfaction of a novel enteral nutrition system through summative testing and task analysis.</p><p><b>Methods:</b> A simulation-based, human factors summative study was conducted with 36 participants, including both caregivers and direct users of enteral feeding technology. Participants were recruited across three major cities: Houston, Chicago, and Phoenix.
Task analysis focused on critical and non-critical functions of the Luminoah FLOW™ Enteral Nutrition System, while user satisfaction was measured using the System Usability Scale (SUS). The study evaluated successful task completion, potential use errors, and qualitative user feedback.</p><p><b>Results:</b> Critical tasks were completed successfully by 100% of users, with the exception of a cleaning task, which had an 89% success rate. Non-critical tasks reached an overall completion rate of 95.7%, demonstrating the ease of use and intuitive design of the system. The SUS score was exceptionally high, averaging 91.5, indicating strong user preference for the device over current alternatives. Furthermore, 91% of participants indicated they would choose the new system over other products on the market.</p><p><b>Conclusion:</b> The innovative portable enteral nutrition system demonstrated excellent usability and safety, meeting the design requirements for its intended user population. High completion rates for critical tasks and an overwhelmingly positive SUS score underscore the system's ease of use and desirability. These findings suggest that the system is a superior option for home enteral feeding, providing both safety and efficiency in real-world scenarios. Further refinements in instructional materials may improve user performance on non-critical tasks.</p><p>Elease Tewalt<sup>1</sup></p><p><sup>1</sup>Phoenix Veterans Affairs Administration, Phoenix, AZ</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Enhanced Recovery After Surgery (ERAS) protocols, including preoperative carbohydrate loading, aim to accelerate recovery by reducing stress responses and insulin resistance. These protocols have been shown to decrease hospital stays, postoperative complications, and healthcare costs. However, there is limited knowledge about the safety and efficacy of ERAS for diabetic patients.
Patients with diabetes make up 15% of surgical cases and often have longer hospital stays and more postoperative complications. This study evaluated outcome measures important to patients with diabetes in a non-diabetic population in order to lay the groundwork for future trials that could include diabetic patients in ERAS protocols.</p><p><b>Methods:</b> A retrospective chart review at the Phoenix Veterans Affairs Health Care System compared 24 colorectal surgery patients who received preoperative carbohydrate drinks with 24 who received traditional care. Outcomes assessed included blood glucose (BG) levels, aspiration, and postoperative complications. Additional analyses evaluated adherence, length of hospital stay, and healthcare costs.</p><p><b>Results:</b> The demographics of the two groups were comparable (Table 1). The preoperative BG levels of the carbohydrate loading group (164.6 ± 36.3 mg/dL) were similar to those of the control group (151.8 ± 47.7 mg/dL) (p > 0.05) (Figure 1). The carbohydrate loading group demonstrated lower and more stable postoperative BG levels (139.4 ± 37.5 mg/dL) compared to the control group (157.6 ± 61.9 mg/dL), but this difference was not statistically significant (p > 0.05) (Figure 2). There were no significant differences in aspiration or vomiting between the groups (p > 0.05) (Table 2). The carbohydrate loading group had a shorter average hospital stay by one day, but this difference was not statistically significant (p > 0.05) (Table 2).</p><p><b>Conclusion:</b> Carbohydrate loading as part of ERAS protocols was associated with better postoperative glucose control, no increased risk of complications, and reduced hospital stays. Although diabetic patients were not included in this study, these findings suggest that carbohydrate loading is a safe and effective component of ERAS. Including diabetic patients in ERAS is a logical next step that could significantly improve surgical outcomes for this population.
Future research should focus on incorporating diabetic patients to assess the impact of carbohydrate loading on postoperative glucose control, complication rates, length of stay, and healthcare costs.</p><p><b>Table 1.</b> Demographics.</p><p></p><p>The table includes the demographics of the two groups. The study group consists of participants who received the carbohydrate drink, while the control group includes those who received traditional care.</p><p><b>Table 2.</b> Postoperative Outcomes.</p><p></p><p>The table includes the postoperative outcomes of the two groups. The study group consists of participants who received the carbohydrate drink, while the control group includes those who received traditional care.</p><p></p><p>The figure shows the preoperative BG levels of the carbohydrate and non-carbohydrate groups, along with a trendline for each group (p > 0.05).</p><p><b>Figure 1.</b> Preoperative BG Levels.</p><p></p><p>The figure shows the postoperative BG levels of the carbohydrate and non-carbohydrate groups, along with a trendline for each group (p > 0.05).</p><p><b>Figure 2.</b> Postoperative BG Levels.</p><p><b>Malnutrition and Nutrition Assessment</b></p><p>Amy Patton, MHI, RD, CNSC, LSSGB<sup>1</sup>; Elisabeth Schnicke, RD, LD, CNSC<sup>2</sup>; Sarah Holland, MSc, RD, LD, CNSC<sup>3</sup>; Cassie Fackler, RD, LD, CNSC<sup>2</sup>; Holly Estes-Doetsch, MS, RDN, LD<sup>4</sup>; Kristen Roberts, PhD, RDN, LD, CNSC, FASPEN, FAND<sup>5</sup>; Christopher Taylor, PhD, RDN<sup>4</sup></p><p><sup>1</sup>The Ohio State University Wexner Medical Center, Westerville, OH; <sup>2</sup>The Ohio State University Wexner Medical Center, Columbus, OH; <sup>3</sup>The Ohio State University Wexner Medical Center, Upper Arlington, OH; <sup>4</sup>The Ohio State University, Columbus, OH; <sup>5</sup>The Ohio State University, Granville, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> The unfavorable association of malnutrition with hospital
outcomes, such as longer length of stay (LOS), increased falls, and increased hospital readmissions, has been well documented in the literature. We aimed to determine whether a different model of care that lowered Registered Dietitian (RD)-to-patient ratios would lead to increased malnutrition identification and documentation within our facility. We also evaluated the relationship between these metrics and LOS on monitored hospital units.</p><p><b>Methods:</b> In July 2022, two additional RDs were hired at a large 1000+ bed academic medical center as part of a pilot program focused on malnutrition identification, documentation integrity, and staff training. The RD-to-patient ratio was reduced from 1:75 to 1:47 on three units of the hospital. On the pilot units, the RD completed a full nutrition assessment, including a Nutrition Focused Physical Exam (NFPE), for all patients who were identified as "at risk" per hospital nutrition screening policy. Patients not identified as "at risk" received a full RD assessment with NFPE by day 7 of admission or per consult request. A malnutrition dashboard was created with assistance from a Quality Data Manager from the Quality and Operations team. This visual graphic allowed us to track and monitor RD malnutrition identification rates by unit and the percentage of patients who had a malnutrition diagnosis captured by the billing and coding team. Data were also pulled from the Electronic Medical Record (EMR) to examine other patient outcomes. In a retrospective analysis, we compared the new model of care to the standard model on one of these units.</p><p><b>Results:</b> There was an increase in the RD-identified capture rate of malnutrition on the pilot units. On a cardiac care unit, the RD identification rate rose from a baseline of 6% in Fiscal Year (FY) 2022 to an average of 12.5% over FY 2023-2024.
On two general medicine units, the malnutrition rates identified by the RD nearly doubled during the two-year intervention (Table 1). LOS was significantly lower on one of the general medicine intervention floors compared to a control unit (p < 0.001, Cohen's d: 13.8) (Table 2). LOS was reduced on all units analyzed between FY22 and FY23/24. Patients with a malnutrition diagnosis on the control unit had a 15% reduction in LOS from FY22 to FY23/24, compared with a 19% reduction for those identified with malnutrition on the intervention unit. When comparing intervention versus control units for FY23 and FY24 combined, the intervention unit had a much lower LOS than the control unit.</p><p><b>Conclusion:</b> Dietitian assessments and related interventions may contribute to reducing LOS. Reducing RD-to-patient ratios may allow for greater identification of malnutrition and support patient outcomes such as LOS. There is an opportunity to evaluate other patient outcomes for the pilot units, including falls, readmission rates, and Case Mix Index.</p><p><b>Table 1.</b> RD Identified Malnutrition Rates on Two General Medicine Pilot Units.</p><p></p><p><b>Table 2.</b> Control Unit and Intervention Unit Length of Stay Comparison.</p><p></p><p>Amy Patton, MHI, RD, CNSC, LSSGB<sup>1</sup>; Misty McGiffin, DTR<sup>2</sup></p><p><sup>1</sup>The Ohio State University Wexner Medical Center, Westerville, OH; <sup>2</sup>The Ohio State University Wexner Medical Center, Columbus, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Delays in identifying patients at nutrition risk can impact patient outcomes. Per facility policy, Dietetic Technicians (DTs) and Registered Dietitians (RDs) review patient charts, meet with patients, and assign a nutrition risk level. High-risk patients are then assessed by the RD for malnutrition among other nutrition concerns.
Based on a new tracking process implemented in January 2023, an average of 501 patient nutrition risk assignments per month were overdue or incomplete from January through April 2023, with a dramatic increase to 835 in May. Missed risk assignments occur daily. If the risk assignment is not completed within the 72-hour parameter of the policy, the result can be late or missed RD assessment opportunities and policy compliance concerns.</p><p><b>Methods:</b> In June 2023, a Lean Six Sigma quality improvement project using the DMAIC (Define, Measure, Analyze, Improve, Control) framework was initiated at a 1000+ bed Academic Medical Center with the goal of improving the efficiency of the nutrition risk assignment (NRA) process for RDs and DTs. A secondary goal was to determine whether improved efficiency would also lead to an increase in RD-identified malnutrition based on Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition Indicators of Malnutrition (AAIM) criteria. A group of RDs and DTs met to walk through each of the quality improvement process steps. The problem was defined, and baseline measurements of malnutrition identification rates and missed nutrition risk assignments were analyzed. A fishbone diagram was used to help with root cause analysis, and later a payoff matrix was used to identify potential interventions. The improve phase was implemented in October and November 2023 and included changes to the screening policy itself and redistribution of clinical nutrition staff on certain patient units.</p><p><b>Results:</b> The identified improvements had a positive impact on incomplete work and on malnutrition identification rates. Malnutrition identification rates averaged 11.7% from May through October compared to 12.1% from November through April. The number of missed NRAs decreased from an average of 975 per month (May through October) to 783 per month (November through April), a decrease of 192 per month (20%).
An additional quality improvement process cycle is currently underway to further improve these metrics.</p><p><b>Conclusion:</b> Fostering a culture of ongoing improvement presents a significant challenge for both clinicians and leaders. Enhancing nutrition care and boosting clinician efficiency are critical goals. Introducing clinical nutrition leaders to tools designed for quality monitoring and enhancement can lead to better performance outcomes and more effective nutrition care. Tools such as those used for this project, along with PDSA (Plan, Do, Study, Act) projects, are valuable in this process. Involving team members from the beginning of these improvement efforts can also help ensure the successful adoption of practice changes.</p><p><b>Table 1.</b> RD Identified Malnutrition Rates.</p><p></p><p><b>Table 2.</b> Incomplete Nutrition Risk Assignments (NRAs).</p><p></p><p>Maurice Jeanne Aguero, RN, MD<sup>1</sup>; Precy Gem Calamba, MD, FPCP, DPBCN<sup>2</sup></p><p><sup>1</sup>Department of Internal Medicine, Prosperidad, Agusan del Sur; <sup>2</sup>Medical Nutrition Department, Tagum City, Davao del Norte</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition is a strong predictor of mortality, morbidity, poor response to therapy, and reduced quality of life among gastrointestinal (GI) cancer patients. In a tertiary government hospital in Tagum City with a cancer center, malnutrition screening among patients with cancer is routine; however, no prior studies have examined the association between nutritional status and quality of life among GI cancer patients. This study aims to determine whether nutritional status is associated with quality of life among adult Filipino GI cancer patients seeking cancer therapy in a tertiary government hospital.</p><p><b>Methods:</b> A quantitative, observational, cross-sectional, analytical, and predictive survey study was conducted.
The World Health Organization Quality of Life Brief Version (WHOQOL-BREF) questionnaire was utilized to determine the quality of life of cases. Logistic regression analysis was used to assess the association between the demographic, clinical, and nutritional profiles and quality of life among patients with gastrointestinal cancer.</p><p><b>Results:</b> Among respondents (n = 160, mean age 56.4 ± 12 years), the majority were male (61.9%), married (77.5%), Roman Catholic (81.1%), and had finished high school (38.1%). Almost half were diagnosed with colon adenocarcinoma (43.125%), followed by rectal adenocarcinoma (22.5%), rectosigmoid adenocarcinoma (11.875%), and GI stromal tumor (5.625%). On cancer staging, 40.625% were Stage 4, followed by Stage 3b (19.375%), Stage 3c (10%), Stage 3a (5.625%), and Stage 2a (4.375%); only 2.5% were Stage 4a, while 0.625% were Stage 4b. More than one-third received CAPEOX (38.125%), followed by FOLFOX (25.625%) and imatinib (5.625%). Among cases, 15.6% were underweight or obese, while 34.4% were overweight. In terms of SGA grading, 38.1% had a severe level, 33.8% a moderate level, and the rest normal to mild. On quality of life, mean scores per variable were as follows: generally good for general quality of life (3.71 ± 0.93); general satisfaction with perceived general health, with one's self, and with one's relationships with others (3.46 to 3.86 ± 0.97); moderate satisfaction with having enough energy for daily life, accepting bodily appearance, the availability of information needed for daily living, and the extent of opportunities for leisure (2.71 to 3.36 ± 1.02); and little satisfaction with having enough money to meet their needs (2.38 ± 0.92). On average, participants quite often experienced negative feelings such as low mood, despair, depression, and anxiety (2.81 ± 0.79).
A significant association of age (p = 0.047), cancer diagnosis (p = 0.001), BMI status (p = 0.028), and SGA nutritional status (p = 0.010) with quality of life among adult cancer patients was documented.</p><p><b>Conclusion:</b> Nutritional status was significantly associated with quality of life among adult Filipino GI cancer patients seeking cancer therapy in a tertiary government hospital. Public health interventions targeting these factors may play a critical role in improving patient survival and outcomes.</p><p>Carmen Kaman Lo, MS, RD, LDN, CNSC<sup>1</sup>; Hannah Jacobs, OTD, OTR/L<sup>2</sup>; Sydney Duong, MS, RD, LDN<sup>3</sup>; Julie DiCarlo, MS<sup>4</sup>; Donna Belcher, EdD, MS, RD, LDN, CDCES, CNSC, FAND<sup>5</sup>; Galina Gheihman, MD<sup>6</sup>; David Lin, MD<sup>7</sup></p><p><sup>1</sup>Massachusetts General Hospital, Sharon, MA; <sup>2</sup>MedStart National Rehabilitation Hospital, Washington, DC; <sup>3</sup>New England Baptist Hospital, Boston, MA; <sup>4</sup>Center for Neurotechnology and Neurorecovery, Mass General Hospital, Boston, MA; <sup>5</sup>Nutrition and Food Services, MGH, Boston, MA; <sup>6</sup>Harvard Medical School and Mass General Hospital, Boston, MA; <sup>7</sup>Neurocritical Care & Neurorecovery, MGH, Boston, MA</p><p><b>Financial Support:</b> Academy of Nutrition and Dietetics, Dietitian in Nutrition Support Member Research Award.</p><p><b>Background:</b> Nutritional status is a known modifiable factor for optimal recovery in brain injury survivors, yet data on specific benchmarks for optimizing clinical outcomes through nutrition are limited.
This pilot study aimed to quantify the clinical, nutritional, and functional characteristics of brain injury survivors from ICU admission to 90 days post-discharge.</p><p><b>Methods:</b> Patients admitted to the Massachusetts General Hospital (MGH) Neurosciences ICU over a 12-month period were enrolled based on the following criteria: age 18 years or older, a primary diagnosis of acute brain injury, an ICU stay of at least 72 hours, and meeting MGH Neurorecovery Clinic referral criteria for outpatient follow-up with survival beyond 90 days post-discharge. Data were collected from the electronic health record and from the neurorecovery clinic follow-up visit/phone interview. These included patient characteristics, acute clinical outcomes, nutrition intake, and surrogate nutrition and functional scores at admission, discharge, and 90 days post-discharge. Descriptive statistics were used for analysis.</p><p><b>Results:</b> Of 212 admissions during the study period, 50 patients were included in the analysis. The mean APACHE (n = 50), GCS (n = 50), and NIHSS (n = 20) scores were 18, 11, and 15, respectively. Seventy-eight percent of the patients required ventilation, with a mean duration of 6.3 days. Mean ICU and post-ICU lengths of stay were 17.4 and 15.9 days, respectively. Eighty percent received nutrition (enteral or oral) within 24-48 hours of ICU admission. Mean ICU energy and protein intake over the first 7 days were 1128 kcal/day and 60.3 g protein/day, respectively, both 63% of estimated needs. Assessed against ASPEN guidelines, patients with a BMI ≥ 30 received less energy (11.6 vs 14.6 kcal/kg/day) but more protein (1.04 vs 0.7 g protein/kg/day) than those with a BMI < 30. Twelve percent of patients took in less than 50% of their nutritional needs for at least 7 days pre-discharge and were considered nutritionally at risk. Forty-six percent were discharged with long-term enteral access. Only 16% of the patients were discharged to home rather than a rehabilitation facility.
By 90 days post-discharge, 32% of the patients had been readmitted, with 27% due to stroke. At admission, patients' mean MUST (Malnutrition Universal Screening Tool) and MST (Malnutrition Screening Tool) scores were 0.56 and 0.48, respectively, reflecting low nutritional risk. By discharge, the mean MUST and MST scores had increased to 1.16 and 2.08, respectively, suggesting these patients had become nutritionally at risk. At 90 days post-discharge, both scores had returned to low nutrition risk (MUST 0.48 and MST 0.59). Patients' functional scores, as measured by the modified Rankin scale (mRS), followed a similar pattern: the mean score was 0.1 at admission, 4.2 at discharge, and 2.8 at 90 days post-discharge. The mean Barthel Index at 90 days post-discharge was 64.1, indicating moderate dependence in these patients.</p><p><b>Conclusion:</b> This pilot study highlighted key nutritional, clinical, and functional characteristics of brain injury survivors from ICU admission to 90 days post-discharge. Further statistical analysis will help delineate the relationships between nutritional status and clinical and functional outcomes, which may guide future research and practice in ICU nutrition and neurorehabilitation.</p><p>Lavanya Chhetri, BS<sup>1</sup>; Amanda Van Jacob, MS, RDN, LDN, CCTD<sup>1</sup>; Sandra Gomez, PhD, RD<sup>1</sup>; Pokhraj Suthar, MBBS<sup>1</sup>; Sarah Peterson, PhD, RD<sup>1</sup></p><p><sup>1</sup>Rush University Medical Center, Chicago, IL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Identifying frailty in patients with liver disease provides valuable insights into a patient's nutritional status and physical resilience. However, it is unclear whether reduced muscle mass is an important etiology of frailty in liver disease. Identifying the possible connection between frailty and muscle mass may lead to better risk prediction, personalized interventions, and improved outcomes in patient care.
The purpose of this study was to determine if frail patients have a lower skeletal muscle index (SMI) compared to non-frail patients with liver disease undergoing liver transplant evaluation.</p><p><b>Methods:</b> A retrospective, cross-sectional study design was utilized. Patients greater than 18 years of age who underwent a liver transplant evaluation from January 1<sup>st</sup>, 2019 until December 31, 2023 were included if they had a liver frailty index (LFI) assessment completed during the initial liver transplant evaluation and a diagnostic abdominal CT scan completed within 30 days of the initial liver transplant evaluation. Demographic data (age, sex, height, and BMI), etiology of liver disease, MELD-Na score, history of diabetes and hepatocellular carcinoma, liver disease complications (ascites, hepatocellular carcinoma, hepatic encephalopathy & esophageal varices), and LFI score were recorded for each patient. LFI was recorded as both a continuous variable and dichotomized into a categorical variable (frail: defined as LFI ≥ 4.5 versus not frail: defined as LFI ≤ 4.4). Cross-sectional muscle area (cm<sup>2</sup>) from the third lumbar region of the CT was quantified; SMI was calculated (cm<sup>2</sup>/height in meters<sup>2</sup>) and low muscle mass was dichotomized into a categorical variable (low muscle mass: defined as SMI ≤ 50 cm<sup>2</sup>/m<sup>2</sup> for males and ≤39 cm<sup>2</sup>/m<sup>2</sup> for females versus normal muscle mass: defined as SMI > 50 cm<sup>2</sup>/m<sup>2</sup> for males and >39 cm<sup>2</sup>/m<sup>2</sup> for females). An independent t-test was used to determine if there was a difference in SMI between patients categorized as frail versus not frail.</p><p><b>Results:</b> A total of 104 patients, 57% male with a mean age of 57 ± 10 years and a mean BMI of 28.1 ± 6.4 kg/m<sup>2</sup>, were included. 
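The SMI calculation and sex-specific dichotomization described in the Methods can be sketched as follows. This is an illustrative sketch only; the function names and example values are hypothetical, not the study's actual analysis code.

```python
# Illustrative sketch of the SMI computation and the low-muscle-mass
# dichotomization described above; names and values are hypothetical.

def skeletal_muscle_index(l3_muscle_area_cm2: float, height_m: float) -> float:
    """SMI = cross-sectional muscle area at L3 (cm^2) / height (m) squared."""
    return l3_muscle_area_cm2 / height_m ** 2

def has_low_muscle_mass(smi: float, sex: str) -> bool:
    """Apply the study's cut-offs: low if SMI <= 50 (males) or <= 39 (females)."""
    cutoff = 50.0 if sex == "male" else 39.0
    return smi <= cutoff

# Example: 140 cm^2 of L3 muscle in a 1.75 m tall male
smi = skeletal_muscle_index(140.0, 1.75)   # ~45.7 cm^2/m^2
low = has_low_muscle_mass(smi, "male")     # True: below the 50 cm^2/m^2 cut-off
```

The same SMI can fall on either side of the threshold depending on sex, which is why the cohort's low-muscle-mass rates differ so markedly between males (63%) and females (38%).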
The mean MELD-Na score was 16.5 ± 6.9; 25% had a history of hepatocellular carcinoma and 38% had a history of diabetes. The majority of the sample had at least one liver disease complication (72% had ascites, 54% had hepatic encephalopathy, and 67% had varices). The mean LFI score was 4.5 ± 0.9 and 44% were categorized as frail. The mean SMI was 45.3 ± 12.6 cm<sup>2</sup>/m<sup>2</sup> and 52% were categorized as having low muscle mass (males: 63% and females: 38%). There was no difference in SMI between patients who were frail versus not frail (43.5 ± 10.6 versus 47.3 ± 13.9 cm<sup>2</sup>/m<sup>2</sup>, p = 0.06). The difference in SMI by frailty status was reported separately for males and females; no significance testing was performed due to the small sample sizes. Both frail males (43.5 ± 12.2 versus 48.4 ± 14.9) and frail females (43.4 ± 9.3 versus 45.2 ± 11.8) had a lower SMI compared to their non-frail counterparts.</p><p><b>Conclusion:</b> No difference in SMI between frail versus not frail patients was observed; however, the p-value of 0.06 suggests a marginal trend and a possible difference, and further research is needed to confirm the findings. Additionally, it is concerning that men had a higher rate of low muscle mass and that the mean SMI for both frail and not frail men was below the cut-off used to identify low muscle mass (SMI ≤ 50 cm<sup>2</sup>/m<sup>2</sup>). 
Additional research is needed to explore the underlying factors contributing to low muscle mass in men, particularly in frail populations, and to determine whether targeted interventions aimed at improving muscle mass could mitigate frailty and improve clinical outcomes in patients undergoing liver transplant evaluation.</p><p>Rebekah Preston, MS, RD, LD<sup>1</sup>; Keith Pearson, PhD, RD, LD<sup>2</sup>; Stephanie Dobak, MS, RD, LDN, CNSC<sup>3</sup>; Amy Ellis, PhD, MPH, RD, LD<sup>1</sup></p><p><sup>1</sup>The University of Alabama, Tuscaloosa, AL; <sup>2</sup>The University of Alabama at Birmingham, Birmingham, AL; <sup>3</sup>Thomas Jefferson University, Philadelphia, PA</p><p><b>Financial Support:</b> The ALS Association Quality of Care Grant.</p><p><b>Background:</b> Amyotrophic Lateral Sclerosis (ALS) is a progressive neurodegenerative disease. Malnutrition is common in people with ALS (PALS) due to dysphagia, hypermetabolism, self-feeding difficulty, and other challenges. In many clinical settings, malnutrition is diagnosed using the Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition Indicators of Malnutrition (AAIM) or the Global Leadership Initiative on Malnutrition (GLIM) criteria. However, little is known about malnutrition assessment practices in ALS clinics. This qualitative study explored how RDs at ALS clinics are diagnosing malnutrition in PALS.</p><p><b>Methods:</b> Researchers conducted 6 virtual focus groups with 22 RDs working in ALS clinics across the United States. Audio recordings were transcribed verbatim, and transcripts were imported into NVivo 14 software (QSR International, 2023, Melbourne, Australia). Two research team members independently analyzed the data using deductive thematic analysis.</p><p><b>Results:</b> The AAIM indicators were identified as the most used malnutrition diagnostic criteria. Two participants described using a combination of AAIM and GLIM criteria. 
Although all participants described performing thorough nutrition assessments, some said they do not formally document malnutrition in the outpatient setting due to lack of reimbursement and feeling as though the diagnosis would not change the intervention. Conversely, others noted the importance of documenting malnutrition for reimbursement purposes. Across all groups, RDs reported challenges in diagnosing malnutrition because of difficulty differentiating between disease- versus malnutrition-related muscle loss. Consequently, several RDs described adapting current malnutrition criteria to focus on weight loss, decreased energy intake, or fat loss.</p><p><b>Conclusion:</b> Overall, RDs agreed that malnutrition is common among PALS, and they conducted thorough nutrition assessments as part of standard care. Among those who documented malnutrition, most used AAIM indicators to support the diagnosis. However, as muscle loss is a natural consequence of ALS, RDs perceived difficulty in assessing nutrition-related muscle loss. This study highlights the need for malnutrition criteria specific for PALS.</p><p><b>Table 1.</b> Themes Related to Diagnosing Malnutrition in ALS.</p><p></p><p>Carley Rusch, PhD, RDN, LDN<sup>1</sup>; Nicholas Baroun, BS<sup>2</sup>; Katie Robinson, PhD, MPH, RD, LD, CNSC<sup>1</sup>; Maria Geraldine E. Baggs, PhD<sup>1</sup>; Refaat Hegazi, MD, PhD, MPH<sup>1</sup>; Dominique Williams, MD, MPH<sup>1</sup></p><p><sup>1</sup>Abbott Nutrition, Columbus, OH; <sup>2</sup>Miami University, Oxford, OH</p><p><b>Financial Support:</b> This study was supported by Abbott Nutrition.</p><p><b>Background:</b> Malnutrition is increasingly recognized as a condition that is present in all BMI categories. Although much research to date has focused on malnutrition in patients with lower BMIs, there is a need for understanding how nutrition interventions may alter outcomes for those with higher BMIs and existing comorbidities. 
In a post-hoc analysis of the NOURISH trial, which investigated hospitalized older adults with malnutrition, we sought to determine whether consuming a specialized ONS containing high energy, protein, and beta-hydroxy-beta-methylbutyrate (ONS+HMB) can improve vitamin D and nutritional status in those with a BMI ≥ 27.</p><p><b>Methods:</b> A post-hoc analysis was conducted using data from the NOURISH trial, a randomized, placebo-controlled, multi-center, double-blind study conducted in hospitalized participants with malnutrition and a primary diagnosis of congestive heart failure, acute myocardial infarction, pneumonia, or chronic obstructive pulmonary disease. In the trial, participants received standard care with either ONS+HMB or a placebo beverage (target 2 servings/day) during hospitalization and for 90 days post-discharge. Nutritional status was assessed using Subjective Global Assessment (SGA) and handgrip strength at baseline and 0-, 30-, 60- and 90-days post-discharge. Vitamin D (25-hydroxyvitamin D) was assessed within 72 hours of admission (baseline) and 30- and 60-days post-discharge. Participants with a BMI ≥ 27 formed the analysis cohort. Treatment effect was determined using analysis of covariance adjusted for baseline measures.</p><p><b>Results:</b> The post-hoc cohort consisted of 166 patients with a BMI ≥ 27 and a mean age of 76.41 ± 8.4 years; slightly more than half (51.2%) were female. Baseline handgrip strength (n = 137) was 22.3 ± 0.8 kg, while the mean serum concentration of 25-hydroxyvitamin D (n = 138) was 26.0 ± 1.14 ng/mL. At day 90, ONS+HMB improved nutritional status: 64% of the ONS+HMB group were well-nourished (SGA-A) vs. 37% of the control group (p = 0.011). There was a trend towards a greater change in handgrip strength with ONS+HMB during the index hospitalization (baseline to discharge) compared to placebo (least squares means ± standard error: 1.34 ± 0.35 kg vs. 0.41 ± 0.39; p = 0.081), but differences at the other timepoints were not significant. 
Vitamin D concentrations were significantly higher at day 60 in those receiving ONS+HMB compared to placebo (29.7 ± 0.81 vs. 24.8 ± 0.91; p < 0.001).</p><p><b>Conclusion:</b> Hospitalized older patients with malnutrition and a BMI ≥ 27 had significant improvements in their vitamin D and nutritional status at day 60 and day 90, respectively, if they received standard care plus ONS+HMB as compared to placebo. This suggests that transitions of care to the post-acute setting should consider continuing nutrition interventions such as ONS+HMB, in combination with standard care, for patients with elevated BMI and malnutrition.</p><p>Aline Dos Santos<sup>1</sup>; Isis Helena Buonso<sup>2</sup>; Marisa Chiconeli Bailer<sup>2</sup>; Maria Fernanda Jensen Kok<sup>2</sup></p><p><sup>1</sup>Hospital Samaritano Higienópolis, São Paulo; <sup>2</sup>Hospital Samaritano Higienopolis, São Paulo</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition negatively impacts length of hospital stay, infection rate, mortality, clinical complications, hospital readmission, and average healthcare costs. It is believed that early nutritional interventions could reduce negative events and generate economic impact. Therefore, our objective was to evaluate the average hospitalization cost of patients identified by nutritional screening as being at nutritional risk with an indication for oral nutritional supplementation.</p><p><b>Methods:</b> Retrospective study including 110 adult patients hospitalized in a private institution, admitted between August 2023 and January 2024. Nutritional screening was performed within 24 hours of admission. To classify low muscle mass according to calf circumference (CC), the following cutoff points were used: 33 cm for women and 34 cm for men, measured within 96 hours of hospital admission. 
Patients were evaluated in groups: G1, patients with an indication for oral supplementation (OS) that was not started for modifiable reasons; G2, patients with an indication for OS that was started assertively (within 48 hours of the therapeutic indication); G3, patients with an indication for OS that was started late (more than 48 hours after the therapeutic indication); and G4, the combination of G1 and G3, as neither received OS assertively. Patients receiving enteral or parenteral nutritional therapy were excluded.</p><p><b>Results:</b> G2 was the most prevalent group in the studied sample (51%), with an intermediate average length of stay (20.9 days), the lowest average daily hospitalization cost, an average age of 71 years, a significant prevalence of low muscle mass (56%), and the lowest need for hospitalization in intensive care (IC) (63%), with an average length of stay (SLA) in IC of 13.5 days. G1 had the lowest prevalence (9%), the shortest average length of stay (16 days), an average daily hospitalization cost 41% higher than G2, an average age of 68 years, adequate muscle mass in all patients (100%), and a considerable need for hospitalization in intensive care (70%), but with a SLA in IC of 7.7 days. G3 represented 40% of the studied sample, with the longest average length of stay (21.5 days), an average daily hospitalization cost 22% higher than G2, an average age of 73 years, a significant prevalence of low muscle mass (50%), and an intermediate need for hospitalization in intensive care (66%), but with a SLA in IC of 16.5 days. 
Compared to G2, G4 was similar in sample size (G2: 56 patients; G4: 54 patients), mean age (72 years), average length of stay (20.55 days), hospitalization in IC (66%), and SLA in IC (64.23%), but had a higher average daily hospitalization cost (39% higher than G2) and a higher prevalence of patients with low muscle mass (59%).</p><p><b>Conclusion:</b> From the results presented, we can conclude that the percentage of patients who did not receive OS and who spent time in the IC was on average 5% higher than in the other groups, with adequate muscle mass in all patients in this group, but with a need for supplementation due to clinical conditions, food acceptance, and weight loss. More than 50% of patients in all groups except G1 had low muscle mass. Regarding costs, patients supplemented assertively or late cost, respectively, 45% and 29% less compared to patients who did not receive OS. Comparing G2 with G4, the cost remains 39% lower in patients supplemented assertively.</p><p><b>International Poster of Distinction</b></p><p>Daphnee Lovesley, PhD, RD<sup>1</sup>; Rajalakshmi Paramasivam, MSc, RD<sup>1</sup></p><p><sup>1</sup>Apollo Hospitals, Chennai, Tamil Nadu</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition in hospitals, once an underreported issue, has gained significant attention in recent decades. This widespread problem negatively impacts recovery, length of stay (LOS), and overall outcomes. This study aimed to evaluate the effectiveness of clinical nutrition practices and the role of a Nutrition Steering Committee (NSC) in addressing and potentially eradicating this persistent problem by improving nutritional care and patient outcomes.</p><p><b>Methods:</b> Consecutive patients admitted to non-critical units of a tertiary care hospital between January 2018 and August 2024 were included in the study. 
Patient demographics, body mass index (BMI), modified Subjective Global Assessment (mSGA), oral nutritional supplement (ONS) usage, and clinical outcomes were retrospectively extracted from electronic medical records. Data were analyzed using SPSS version 20.0, comparing results before and after the implementation of the NSC.</p><p><b>Results:</b> Out of 239,630 consecutive patients, 139,895 non-critical patients were included, with a mean age of 57.10 ± 15.89 years; 64.3% were men and 35.7% women. The mean BMI was 25.76 ± 4.74 kg/m<sup>2</sup>, and 49.6% of the patients were polymorbid. The largest group (25.8%) was admitted with cardiac illness. According to the mSGA, 87.1% were well-nourished, 12.8% moderately malnourished, and 0.1% severely malnourished. ONS were prescribed for 10% of the population. ONS prescription was highest among the underweight (underweight 28.4%, normal BMI 13%, overweight 9.1%, obese 7.7%; p = 0.000), increased with worsening mSGA category (well-nourished 5.5%, moderately malnourished 41%, severely malnourished 53.2%; p = 0.000), and was highest in pulmonology (23.3%), followed by gastroenterology and hepatology (19.2%) (p = 0.000). The mean hospital LOS was 4.29 ± 4.03 days, with an overall mortality rate of 1.2%. Severe malnutrition, as rated by mSGA, significantly impacted mortality (0.8% vs. 5.1%, p = 0.000). Mortality risk increased with polymorbidity (0.9% vs. 1.5%) and respiratory illness (2.6%, p = 0.000). Poor nutritional status, as assessed by mSGA (34.7%, 57.4%, 70.9%) and BMI (43.7% vs. 38%), was associated with longer hospital stays (LOS ≥ 4 days, p = 0.000). The implementation of the NSC led to significant improvements: average LOS decreased (4.4 vs. 4.1 days, p = 0.000), and mortality risk was reduced from 1.6% to 0.7% (p = 0.000). No significant changes were observed in baseline nutritional status, indicating effective clinical nutrition practices in assessing patient nutrition. 
ONS prescriptions increased from 5.2% to 9.7% between 2022 and 2024 (p = 0.000), contributing to the reduction in mortality rates to below 1% after 2022, compared to over 1% before NSC (p = 0.000). A significant negative correlation was found between LOS and ONS usage (p = 0.000). Step-wise binary logistic regression indicated that malnutrition assessed by mSGA predicts mortality with an odds ratio of 2.49 and LOS with an odds ratio of 1.7, followed by ONS prescription and polymorbidity (p = 0.000).</p><p><b>Conclusion:</b> A well-functioning NSC is pivotal in driving successful nutritional interventions and achieving organizational goals. Early identification of malnutrition through mSGA, followed by timely and appropriate nutritional interventions, is essential to closing care gaps and improving clinical outcomes. Strong leadership and governance are critical in driving these efforts, ensuring that the patients receive optimal nutritional support to enhance recovery and reduce mortality.</p><p><b>Table 1.</b> Patient Characteristics: Details of Baseline Anthropometric & Nutritional Status.</p><p></p><p>Baseline details of Anthropometric Measurements and Nutrition Status.</p><p><b>Table 2.</b> Logistic Regression to Predict Hospital LOS and Mortality.</p><p></p><p>Step-wise binary logistic regression indicated that malnutrition assessed by mSGA predicts mortality with an odds ratio of 2.49 and LOS with an odds ratio of 1.7, followed by ONS prescription and polymorbidity (p = 0.000).</p><p></p><p>mSGA-rated malnourished patients stayed longer in the hospital compared to the well-nourished category (p = 0.000)</p><p><b>Figure 1.</b> Nutritional Status (mSGA) Vs Hospital LOS (>4days).</p><p>Hannah Welch, MS, RD<sup>1</sup>; Wendy Raissle, RD, CNSC<sup>2</sup>; Maria Karimbakas, RD, CNSC<sup>3</sup></p><p><sup>1</sup>Optum Infusion Pharmacy, Phoenix, AZ; <sup>2</sup>Optum Infusion Pharmacy, Buckeye, AZ; <sup>3</sup>Optum Infusion Pharmacy, Milton, 
MA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Food insecurity exists when people do not have enough food to eat and do not know where their next meal will come from. In the United States, approximately 49 million people relied on food assistance charities in 2022 (data from Feeding America). Patients receiving parenteral nutrition (PN), who may be capable of supplementing with oral intake, may experience food insecurity due to chronic health conditions limiting work capability and total family income. Patients may also experience a lack of affordable housing, increased utilities, and the burden of medical expenses. Signs of food insecurity may present as weight loss, malnourishment, low energy, difficulty concentrating, or other physical indicators such as edema, chronically cracked lips, dry skin, and itchy eyes. The purpose of this abstract is to highlight two unique patient case presentations where food insecurity prompted clinicians to intervene.</p><p><b>Methods:</b> Patient 1: A 50-year-old male with short bowel syndrome (SBS) on long-term PN called the registered dietitian (RD) regarding financial difficulties in feeding his family (see Table 1). The patient and clinician relationship allowed the patient to convey sensitive concerns to the RD regarding his inability to feed himself and his family, which resulted in the patient relying on the PN for all nutrition. Due to the food insecurity present, the clinician made changes to PN/hydration to help improve the patient's clinical status. Patient 2: A 21-year-old male with SBS on long-term PN spoke with his in-home registered nurse (RN) regarding his family's difficulties affording food (see Table 2). The RN informed the clinical team of suspected food insecurity, and the insurance case manager (CM) was contacted regarding food affordability. The RD reached out to local community resources such as food banks, food boxes, and community programs. 
A community program was able to assist the patient with meals until the patient's aunt started cooking meals for him. This patient did not directly share his food insecurity with the RD; however, the relationship with the in-home RN proved valuable in having these face-to-face conversations with the patient.</p><p><b>Results:</b> In these two patient examples, difficulty obtaining food affected the patients’ clinical status. The clinical team identified food insecurity and the need for further education for the interdisciplinary team. A food insecurity informational handout was created by the RD, with an in-service to nursing, to aid recognition of the signs of possible food insecurity (Figure 1) and of potential patient resources available in the community. Figure 2 presents suggested questions to ask a patient if an issue is suspected.</p><p><b>Conclusion:</b> Given the prevalence of food insecurity, routine assessment for signs and symptoms is essential. Home nutrition support teams (including RDs, RNs, pharmacists, and care technicians) are positioned to assist in this effort, as they have frequent phone and in-home contact with patients and together build a trusted relationship with patients and caregivers. Clinicians should be aware of potential social situations that can warrant changes to PN formulations. 
To approach this sensitive issue thoughtfully, PN infusion providers should consider enhancing patient assessments and promoting education across the interdisciplinary team to create awareness of accessible community resources.</p><p><b>Table 1.</b> Patient 1 Information.</p><p></p><p><b>Table 2.</b> Suspected Food Insecurity Timeline.</p><p></p><p></p><p><b>Figure 1.</b> Signs to Detect Food Insecurity.</p><p></p><p><b>Figure 2.</b> Questions to Ask.</p><p><b>Poster of Distinction</b></p><p>Christan Bury, MS, RD, LD, CNSC<sup>1</sup>; Amanda Hodge Bode, RDN, LD<sup>2</sup>; David Gardinier, RD, LD<sup>3</sup>; Roshni Sreedharan, MD, FASA, FCCM<sup>3</sup>; Maria Garcia Luis, MS, RD, LD<sup>4</sup></p><p><sup>1</sup>Cleveland Clinic, University Heights, OH; <sup>2</sup>Cleveland Clinic Foundation, Sullivan, OH; <sup>3</sup>Cleveland Clinic, Cleveland, OH; <sup>4</sup>Cleveland Clinic Cancer Center, Cleveland, OH</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> The Society of Critical Care Medicine, Critical Care Congress in Orlando, FL on February 25<sup>th</sup>.</p><p><b>Publication:</b> Critical Care Medicine. 2025;53(1):In press.</p><p><b>Financial Support:</b> Project support provided by Morrison Cleveland Clinic Nutrition Research collaborative.</p><p><b>Background:</b> Hospitalized and critically ill patients who have preexisting malnutrition can have worse outcomes and an increased length of stay (LOS). Currently, Registered Dietitians (RDs) assess malnutrition with a nutrition focused physical exam (NFPE). Recent recommendations encourage the use of body composition tools such as computed tomography (CT) along with the NFPE. Trained RDs can use CT scans to evaluate skeletal muscle mass at the third lumbar vertebra (L3) and then calculate Skeletal Muscle Index (SMI) and mean Hounsfield Units (HU) to determine muscle size and quality, respectively. 
This has been validated in various clinical populations and may be particularly useful in the critically ill, where the NFPE is difficult. We aim to evaluate whether using CT scans in the surgical and critical care population can be a supportive tool to capture a missed malnutrition diagnosis.</p><p><b>Methods:</b> One hundred twenty patients admitted to the Cleveland Clinic from 2021-2023 with a malnutrition evaluation that included an NFPE within 2 days of an abdominal CT were evaluated. Of those, 59 patients had a major surgery or procedure completed during that admission and were included in the final analysis. The CT scans were read by a trained RD at L3 using Terarecon, and results were cross-referenced by an artificial intelligence (AI) software called Veronai. Age, sex, BMI, SMI, and HU were analyzed, along with the malnutrition diagnosis.</p><p><b>Results:</b> Fifty-nine patients were analyzed. Of these, 61% were male, 51% were >65 years old, and 24% had a BMI > 30. Malnutrition was diagnosed in 47% of patients. A total of 24 patients had no muscle wasting based on the NFPE, while CT captured low muscle mass in 58% of that group. Twenty-two percent of patients (13/59) had a higher level of malnutrition severity when using CT. Additionally, poor muscle quality was detected in 71% of patients across all age groups. Notably, there was 95% agreement between the AI and the RD's assessment in detecting low muscle mass.</p><p><b>Conclusion:</b> RDs can effectively analyze CT scans and use SMI and HU with their NFPE. The NFPE alone is not always sensitive enough to detect low muscle in surgical and critically ill patients. 
The degree of malnutrition dictates nutrition interventions, including the timing and type of nutrition support, so it is imperative to accurately diagnose and tailor interventions to improve outcomes.</p><p><b>Table 1.</b> Change in Malnutrition Diagnosis Using CT.</p><p></p><p>The graph shows the change in malnutrition diagnosis when CT was applied in conjunction with the NFPE utilizing ASPEN guidelines.</p><p><b>Table 2.</b> Muscle Assessment: CT vs NFPE.</p><p></p><p>This graph compares muscle evaluation using both CT and the NFPE.</p><p></p><p>CT scan at the 3rd lumbar vertebra showing normal muscle mass and normal muscle quality in a patient >65 years old.</p><p><b>Figure 1.</b> CT Scans Evaluating Muscle Size and Quality.</p><p></p><p>CT scan at the 3rd lumbar vertebra showing low muscle mass and low muscle quality in a patient with obesity.</p><p><b>Figure 2.</b> CT Scans Evaluating Muscle Size and Quality.</p><p>Elif Aysin, PhD, RDN, LD<sup>1</sup>; Rachel Platts, RDN, LD<sup>1</sup>; Lori Logan, RN<sup>1</sup></p><p><sup>1</sup>Henry Community Health, New Castle, IN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Disease-related malnutrition alters body composition and causes functional decline. In acute care hospitals, 20-50% of inpatients have malnutrition when they are admitted to the hospital. Patients with malnutrition experience higher medical costs, increased mortality, and longer hospital stays. Malnutrition is associated with an increased risk of readmission and complications. Monitoring, diagnosis, treatment, and documentation of malnutrition are important for treating patients. 
It also contributes to proper Diagnosis Related Group (DRG) coding and an accurate Case Mix Index (CMI), which can increase reimbursement.</p><p><b>Methods:</b> After dietitians completed the Nutrition Focused Physical Exam (NFPE) course and sufficient staffing had been secured, the malnutrition project was initiated under the leadership of RDNs in our small rural community hospital. The interdisciplinary medical committee approved the project to improve malnutrition screening, diagnosis, treatment practices, and coding. It was decided to use the Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition (ASPEN) and Global Leadership Initiative on Malnutrition (GLIM) criteria to diagnose malnutrition. The Malnutrition Screening Tool (MST) was completed by nurses to determine the risk of malnutrition. The Nutrition and Dietetics department created a new custom report of patients on NPO, clear liquid, or full liquid diets using the nutrition database. RDNs checked the NPO/clear/full liquid reports, BMI reports, and length of stay (LOS) patient lists on weekdays and weekends. RDNs also performed the NFPE to evaluate nutritional status. When malnutrition was identified, RDNs communicated with providers through the hospital messenger system, and providers added the malnutrition diagnosis to their documentation and plan of care. RDNs created a dataset, shared it with Coders and Clinical Documentation Integrity Specialists/Care Coordination, and tracked patients with malnutrition. In addition, RDNs spent more time with malnourished patients and contributed to discharge planning and education.</p><p><b>Results:</b> The prevalence of malnutrition diagnosis and the amount of reimbursement for 2023 were compared to the six months after implementing the malnutrition project. Malnutrition diagnosis increased from 2.6% to 10.8%. Unspecified protein-calorie malnutrition diagnoses decreased from 39% to 1.5%. 
RDN-diagnosed malnutrition was documented in provider notes for 82% of cases. The malnutrition diagnosis rate increased by 315% and the malnutrition reimbursement rate increased by 158%. Of those patients identified with malnutrition, 59% received a malnutrition DRG code. The remaining 41% of patients received higher-weighted major complication and comorbidity (MCC) codes. Our malnutrition reimbursement increased from ~$106,000 to ~$276,000.</p><p><b>Conclusion:</b> The implementation of evidence-based practice guidelines was key to identifying and accurately diagnosing malnutrition. Providing sufficient staff with the necessary training, together with multidisciplinary teamwork, has improved malnutrition diagnosis documentation in our hospital, increasing malnutrition reimbursement.</p><p><b>Table 1.</b> Before and After Malnutrition Implementation Results.</p><p></p><p></p><p><b>Figure 1.</b> Prevalence of Malnutrition Diagnosis.</p><p>Elisabeth Schnicke, RD, LD, CNSC<sup>1</sup>; Sarah Holland, MSc, RD, LD, CNSC<sup>2</sup></p><p><sup>1</sup>The Ohio State University Wexner Medical Center, Columbus, OH; <sup>2</sup>The Ohio State University Wexner Medical Center, Upper Arlington, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition is associated with increased length of stay, readmissions, mortality, and poor outcomes. Early identification and treatment are essential. The Malnutrition Screening Tool (MST) is a quick, easy tool recommended for screening for malnutrition in adult hospitalized patients. It is commonly used with or without additional indicators. We aimed to evaluate our current nutrition screening policy, which utilizes MST, age, and body mass index (BMI), to improve malnutrition identification.</p><p><b>Methods:</b> Data for this quality improvement project were obtained over a 3-month period on 4 different adult services at a large academic medical center. 
Services covered included general medicine, hepatology, heart failure, and orthopedic surgery. Patients were assessed by a Registered Dietitian (RD) within 72 hrs of admission if they met any of the following high-risk criteria: MST score ≥2 (completed by nursing on admission), age ≥65 yrs, or BMI ≤ 18.5 kg/m<sup>2</sup>. If none of the criteria were met, patients were seen within 7 days of admission, or sooner by consult request. Malnutrition was diagnosed using the Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition indicators of malnutrition (AAIM) criteria. Data collected included malnutrition severity and etiology, age, gender, BMI, and the MST generated on admission.</p><p><b>Results:</b> A total of 239 patients were diagnosed with malnutrition. Table 1 shows detailed characteristics. Malnutrition was seen similarly across gender (51% male, 49% female) and age groups. Age range was 21-92 yrs with an average age of 61 yrs. BMI range was 9.8-50.2 kg/m<sup>2</sup> with an average BMI of 24.6 kg/m<sup>2</sup>. More patients were found to have moderate malnutrition (61.5%) and chronic malnutrition (54%). When data were stratified by age ≥65 yrs, similar characteristics were seen for malnutrition severity and etiology. Notably, more of these patients (61.5%) had an MST of < 2 or an incomplete MST compared to patients < 65 yrs of age (56%). There were 181 patients (76%) who met high-risk screening criteria and were seen for an assessment within 72 hrs. Seventy patients (39%) were screened only due to age ≥65 yrs. Forty-five (25%) were screened due to MST alone. There were 54 (30%) who met 2 indicators for screening. Only a small number of patients met BMI criteria alone or all 3 indicators (6 patients, or 3%, each).</p><p><b>Conclusion:</b> Utilizing MST alone would have missed over half of the patients diagnosed with malnutrition, and the miss rate with MST alone was higher among older adults. Age alone as a screening criterion caught more patients than MST alone did. 
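The screening policy described in the Methods is a simple any-of-three rule, which can be sketched as a predicate. This is a hypothetical illustration, not the center's actual logic; in particular, the handling of an incomplete MST (treated here as not triggering on its own) and the MST ≥ 2 cut-off (the tool's standard at-risk threshold) are assumptions.

```python
# Hypothetical encoding of the high-risk screening rule described above:
# an RD assessment within 72 h is triggered if ANY criterion is met.
from typing import Optional

def meets_high_risk_criteria(mst: Optional[int], age_years: int, bmi: float) -> bool:
    """MST >= 2 (an incomplete MST, passed as None, does not trigger),
    age >= 65 years, or BMI <= 18.5 kg/m^2."""
    mst_positive = mst is not None and mst >= 2
    return mst_positive or age_years >= 65 or bmi <= 18.5

# Examples (illustrative values)
meets_high_risk_criteria(mst=0, age_years=72, bmi=27.0)    # True: age criterion
meets_high_risk_criteria(mst=None, age_years=50, bmi=24.6) # False: seen within 7 days
```

Framed this way, the Results quantify each disjunct's marginal contribution: age alone triggered 39% of the assessments, MST alone 25%, and BMI alone almost none.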
Adding BMI to the screening criteria contributed little, and 24% of patients diagnosed with malnutrition were still missed under our criteria. A multi-faceted tool should be explored to best capture patients.</p><p><b>Table 1.</b> Malnutrition characteristics.</p><p></p><p>*BMI: excludes those with amputations, paraplegia; all patients n = 219, patients ≥65yo n = 106.</p><p>Robin Nuse Tome, MS, RD, CSP, LDN, CLC, FAND<sup>1</sup></p><p><sup>1</sup>Nemours Children's Hospital, DE, Landenberg, PA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition is a global problem impacting patients of all ages, genders, and races. Malnutrition in hospitalized patients is associated with poorer outcomes, including longer in-hospital lengths of stay, higher mortality, greater need for home healthcare services, and a higher rate of 30-day readmission. This results in a higher economic burden for the healthcare industry, insurance, and the individual. Malnutrition documentation plays a vital role in capturing the status of the patient and starting the conversation about interventions to address the concern.</p><p><b>Methods:</b> A taskforce made up of physicians, registered dietitians (RDs), and clinical documentation specialists met to discuss strategies to increase documentation of the malnutrition diagnosis to facilitate better conversation about the concern and potential interventions. Options included inbox messages, a best practice alert, a SmartLink between the physician and RD note, and adding the diagnosis to the problem list.
Of these options, the team selected a SmartLink, developed within the Electronic Medical Record (EMR), that links text from the RD note about malnutrition to the physician note to capture the diagnosis of malnutrition, its severity, and the progression of the diagnosis over time.</p><p><b>Results:</b> Preliminary data show that physician documentation of the malnutrition diagnosis, as well as its severity and progression, increased by 20% in the pilot medical team. Anecdotally, physicians were more aware of the patient's nutrition status with documentation linked to their note, and collaboration between the medical team and the RD to treat malnutrition has increased.</p><p><b>Conclusion:</b> We hypothesize that expanding the practice to the entire hospital will increase documentation of the malnutrition diagnosis in the physician note. This will help increase awareness of the nutrition status of the patient, draw attention to and promote collaboration on treatment interventions, and increase billable revenue to the hospital by capturing the documentation of the degree of malnutrition in the physician note.</p><p>David López-Daza, RD<sup>1</sup>; Cristina Posada-Alvarez, Centro Latinoamericano de Nutrición<sup>1</sup>; Alejandra Agudelo-Martínez, Universidad CES<sup>2</sup>; Ana Rivera-Jaramillo, Boydorr SAS<sup>3</sup>; Yeny Cuellar-Fernández, Centro Latinoamericano de Nutrición<sup>1</sup>; Ricardo Merchán-Chaverra, Centro Latinoamericano de Nutrición<sup>1</sup>; María-Camila Gómez-Univio, Centro Latinoamericano de Nutrición<sup>1</sup>; Patricia Savino-Lloreda, Centro Latinoamericano de Nutrición<sup>1</sup></p><p><sup>1</sup>Centro Latinoamericano de Nutrición (Latin American Nutrition Center), Chía, Cundinamarca; <sup>2</sup>Universidad CES (CES University), Medellín, Antioquia; <sup>3</sup>Boydorr SAS, Chía, Cundinamarca</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> The Malnutrition Screening Tool (MST) is a simple
and quick instrument designed to identify the risk of malnutrition in various healthcare settings, including home hospitalization. Its use has become widespread due to its ease of application and ability to preliminarily assess the nutritional status of patients. In the context of home care, where clinical resources may be limited, an efficient screening tool is crucial to ensure early interventions that prevent nutritional complications in vulnerable populations. The objective was to evaluate the diagnostic accuracy of the MST in detecting malnutrition among patients receiving home care.</p><p><b>Methods:</b> A diagnostic test study was conducted, collecting sociodemographic data, MST results, and malnutrition diagnoses based on the Global Leadership Initiative on Malnutrition (GLIM) criteria. A positive MST score was defined as a value of 2 or above. Categorical data were summarized using proportions, while quantitative variables were described using measures of central tendency. Sensitivity, specificity, the area under the receiver operating characteristic curve (AUC), positive likelihood ratio (LR+), and negative likelihood ratio (LR-) were estimated along with their respective 95% confidence intervals.</p><p><b>Results:</b> A total of 676 patients were included, with a median age of 82 years (interquartile range: 68-89 years), and 57.3% were female. According to the GLIM criteria, 59.8% of the patients met the criteria for malnutrition. The MST classified patients into low risk (62.4%), medium risk (30.8%), and high risk (6.8%). The sensitivity of the MST was 11.4% (95% CI: 8.5-14.9%), while its specificity was 100% (95% CI: 98.7-100%). The positive likelihood ratio (LR+) was 1.75, and the negative likelihood ratio (LR-) was 0.0.
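The accuracy metrics in this design follow directly from a 2×2 table of screening result versus GLIM diagnosis. A minimal sketch of the arithmetic, using hypothetical counts (not the study's data); note that when specificity is exactly 100% the LR+ formula divides by zero and is returned here as infinity:

```python
# Diagnostic accuracy from a 2x2 table: screening test vs. reference
# standard (e.g., MST vs. GLIM). Counts below are hypothetical.

def diagnostic_accuracy(tp, fp, fn, tn):
    sens = tp / (tp + fn)  # sensitivity: positives among the malnourished
    spec = tn / (tn + fp)  # specificity: negatives among the well-nourished
    lr_pos = sens / (1 - spec) if spec < 1 else float("inf")
    lr_neg = (1 - sens) / spec if spec > 0 else float("inf")
    return sens, spec, lr_pos, lr_neg

# Hypothetical example: 40 true positives, 10 false positives,
# 60 false negatives, 90 true negatives.
sens, spec, lr_pos, lr_neg = diagnostic_accuracy(40, 10, 60, 90)
print(f"sens={sens:.2f} spec={spec:.2f} LR+={lr_pos:.2f} LR-={lr_neg:.2f}")
```

With zero false positives, as in a specificity of 100%, every positive screen is a true positive, which is why a positive MST can rule malnutrition in even while a low sensitivity leaves many malnourished patients undetected.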
The area under the curve was 0.71 (95% CI: 0.69-0.73), indicating moderate discriminative capacity.</p><p><b>Conclusion:</b> While the MST demonstrates extremely high specificity, its low sensitivity limits its effectiveness in accurately identifying malnourished patients in the context of home care. This suggests that, although a positive score reliably indicates malnutrition, the tool fails to detect a significant number of patients who are malnourished. As a result, while the MST may be useful as an initial screening tool, its use should be complemented with more comprehensive assessments to ensure more precise detection of malnutrition in this high-risk population.</p><p><b>Poster of Distinction</b></p><p>Colby Teeman, PhD, RDN, CNSC<sup>1</sup>; Kaylee Griffith, BS<sup>2</sup>; Karyn Catrine, MS, RDN, LD<sup>3</sup>; Lauren Murray, MS, RD, CNSC, LD<sup>3</sup>; Amanda Vande Griend, BS, MS<sup>2</sup></p><p><sup>1</sup>University of Dayton, Xenia, OH; <sup>2</sup>University of Dayton, Dayton, OH; <sup>3</sup>Premier Health, Dayton, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> The prevalence of malnutrition in critically ill populations has previously been shown to be between 38% and 78%. Previously published guidelines have stated that patients in the ICU should be screened for malnutrition within 24-48 hours, and all patients in the ICU for >48 hours should be considered at high risk for malnutrition. Patients with a malnutrition diagnosis in the ICU have been shown to have poorer clinical outcomes, including longer length of stay, greater readmission rates, and increased mortality.
The purpose of the current study was to determine whether severity of malnutrition impacted the time to initiate enteral nutrition and the time to reach goal enteral nutrition rate in critically ill patients, and to determine the possible impact of malnutrition severity on clinical outcomes.</p><p><b>Methods:</b> A descriptive, retrospective chart review was conducted in multiple ICU units at a large level I trauma hospital in the Midwest. All participants included in analysis had been assessed for malnutrition by a registered dietitian according to ASPEN clinical practice guidelines. Exclusion criteria included patients receiving EN prior to the RDN assessment, those who received EN for < 24 hours total, patients on mixed oral and enteral nutrition diets, and patients receiving any parenteral nutrition. Participants were grouped by malnutrition status: no malnutrition n = 27, moderate malnutrition n = 22, and severe malnutrition n = 32. All data were analyzed using SPSS version 29.</p><p><b>Results:</b> There was no difference in primary outcomes (time to EN initiation, time to EN goal rate) by malnutrition status (both p > 0.05). Multiple regression analysis found that neither moderately malnourished nor severely malnourished patients were more likely to have enteral nutrition initiation delayed for >48 hours from admission (p > 0.05). Neither ICU LOS nor hospital LOS was different among malnutrition groups (p > 0.05). Furthermore, neither ICU nor hospital mortality was different among malnutrition groups (p > 0.05). Among patients who were moderately malnourished, 81.8% required vasopressors, compared to 75% of patients who were severely malnourished, and 44.4% of patients who did not have a malnutrition diagnosis (p = 0.010).
Extended time on a ventilator (>72 hours) was required by 90.9% of moderately malnourished patients, compared to 59.4% of severely malnourished patients and 51.9% of patients without a malnutrition diagnosis (p = 0.011).</p><p><b>Conclusion:</b> Although the severity of malnutrition did not impact LOS, readmission, or mortality, malnutrition status did significantly predict greater odds of a patient requiring vasopressors and spending an extended amount of time on a ventilator. Further studies with larger sample sizes are warranted to continue developing a better understanding of the relationship between malnutrition status and clinical outcomes.</p><p>Jamie Grandic, RDN-AP, CNSC<sup>1</sup>; Cindi Stefl, RN, BSN, CCDS<sup>2</sup></p><p><sup>1</sup>Inova Health System, Fairfax Station, VA; <sup>2</sup>Inova Health System, Fairfax, VA</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> Vizient Connections Summit 2024 (Sept 16-19, 2024).</p><p><b>Publication:</b> 2024 Vizient supplement to the American Journal of Medical Quality (AJMQ).</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Research reveals that up to 50% of hospitalized patients are malnourished, yet only 9% of these cases are diagnosed.<sup>(1)</sup> Inadequate diagnosis and intervention of malnutrition can lead to poorer patient outcomes and reduced revenue. Our systemwide malnutrition awareness campaign successfully enhanced dietitian engagement and provider education and streamlined documentation processes. This initiative resulted in a two-fold increase in the capture of malnutrition codes, a notable rise in malnutrition variable capture, and an average increase in diagnosis-related group relative weight of approximately 0.9. Consequently, there was a ~300% increase in revenue associated with accurate malnutrition diagnosis and documentation, alongside improvements in the observed-to-expected (O/E) ratio for mortality and length of stay.
Given the critical impact of malnutrition on mortality, length of stay, and costs, enhanced identification programs are essential. Within our health system, a malnutrition identification program has been implemented across five hospitals for several years. System leadership roles for clinical nutrition and clinical documentation integrity (CDI) were established to ensure consistency, implement best practices, and optimize program efficacy. In 2022, a comprehensive analysis identified opportunities for improvement: a low systemwide capture rate (2%), limited awareness of the program's benefits, and inconsistent documentation practices. The leadership team, with the support of our executive sponsors, addressed these issues, engaged with service line leaders, and continues to drive program enhancements.</p><p><b>Methods:</b> A 4-part malnutrition education campaign was implemented: (1) strengthened collaboration between Clinical Nutrition and CDI, ensuring daily systemwide communication of newly identified malnourished patients, with leadership teams, including coding and compliance, reviewing documentation protocols in light of denial risks and regulatory audits; (2) launched a systemwide dietitian training program with a core RD malnutrition optimization team, a 5-hour comprehensive training, and monthly chart audits aiming for >80% documentation compliance; (3) created a Provider Awareness Campaign featuring interactive presentations on the malnutrition program's benefits and provider documentation recommendations; and (4) developed an electronic health record (EHR) report and a malnutrition EHR tool to standardize documentation. EHR and financial reports were used to monitor program impact and capture rates.</p><p><b>Results:</b> The malnutrition campaign has notably improved outcomes through ongoing education for stakeholders. A malnutrition EHR tool was also created in November 2022. This tool is vital for enhancing documentation, significantly boosting provider and CDI efficiency.
Key results include: Dietitian documentation compliance increased from 85% (July 2022) to 95% (2024); RD-identified malnutrition cases increased from 2% (2021) to 16% (2024); Monthly average of final coded malnutrition diagnoses increased from 240 (2021) to 717 (2023); Average DRG relative weight climbed from 1.24 (2021) to 2.17 (2023); Financial impact increased from $5.5 M (2021) to $17.7 M (2024); and LOS O/E improved from 1.04 to 0.94 and mortality O/E improved from 0.77 to 0.62 (2021-2023).</p><p><b>Conclusion:</b> This systemwide initiative not only elevates capture rates and documentation but also enhances overall outcomes. With the CDI and RD teams taking on a more collaborative leadership role, providers can concentrate more on patient care, allowing these teams to operate at their peak. Looking ahead to 2025, the focus will shift towards leading indicators to refine malnutrition identification and further assess the educational campaign's impact.</p><p>Ryota Sakamoto, MD, PhD<sup>1</sup></p><p><sup>1</sup>Kyoto University, Kyoto</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Growing concern about the environmental impact of meat eating has led to consideration of a shift to a plant-based diet. One nutrient that tends to be particularly deficient in a plant-based diet is vitamin B12. Since vitamin B12 deficiency can cause anemia, limb paresthesia and muscle weakness, and psychiatric symptoms including depression, delirium, and cognitive impairment, it is important to find sustainable local sources of vitamin B12. In this study, we focused on gundruk and sinki, two fermented and preserved vegetables traditionally consumed mainly in Nepal, India, and Bhutan, and investigated their vitamin B12 content. Sinki is mainly made from radish roots, while gundruk is made from green leaves such as mustard leaves; both are preserved through fermentation and sun-drying processes.
Previous reports indicated that, in these regions, not only vegetarians and vegans but also many others, especially the poor, may have consistently low meat intake. Governments and other organizations have initiated feeding programs, especially in schools, to supply foods fortified with vitamins A, B1, B2, B3, B6, B9, B12, iron, and zinc. At this time, however, it is not easy to get fortified foods to residents in the community. It is important to explore the possibility of obtaining vitamin B12 from locally available products that can be consumed by vegetarians, vegans, and the poor in these communities.</p><p><b>Methods:</b> Four samples of gundruk and five samples of sinki were obtained from markets, and their vitamin B12 content was determined using Lactobacillus delbrueckii subsp. lactis (Lactobacillus leichmannii) ATCC7830. The lower limit of quantification was set at 0.03 µg/100 g. The sample with the highest vitamin B12 concentration in the microbial quantification method was also measured for cyanocobalamin using LC-MS/MS (Shimadzu LC system equipped with Triple Quad 5500 plus AB-Sciex mass spectrometer). The Multiple Reaction Monitoring transition for cyanocobalamin was Q1: 678.3 m/z, Q3: 147.1 m/z.</p><p><b>Results:</b> For gundruk, vitamin B12 was detected in all four samples, with values of 5.0 µg/100 g, 0.13 µg/100 g, 0.12 µg/100 g, and 0.04 µg/100 g, from highest to lowest. For sinki, it was detected in four of the five samples, with values of 1.4 µg/100 g, 0.41 µg/100 g, 0.34 µg/100 g, and 0.16 µg/100 g, from highest to lowest.
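For a rough sense of scale, the WHO/FAO adult recommendation of 2.4 µg/day can be combined with the concentrations above. A small illustrative calculation (not part of the abstract) showing how sample variability translates into daily portion sizes:

```python
# Illustrative arithmetic: grams of product needed per day to meet the
# WHO/FAO adult vitamin B12 recommendation (2.4 µg/day), given a
# measured concentration in µg per 100 g.
ADULT_RDA_UG = 2.4

def grams_to_meet_rda(ug_per_100g: float, rda_ug: float = ADULT_RDA_UG) -> float:
    """Return grams of product supplying `rda_ug` of vitamin B12."""
    return 100.0 * rda_ug / ug_per_100g

# Highest and lowest gundruk values reported in the Results above.
print(round(grams_to_meet_rda(5.0)))   # richest sample: 48 g/day
print(round(grams_to_meet_rda(0.04)))  # poorest sample: 6000 g/day
```

The two orders of magnitude between samples (about 48 g versus 6 kg per day) illustrate why stabilizing the B12 content across production methods matters before these foods can serve as a reliable source.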
The cyanocobalamin concentration by LC-MS/MS in one sample was estimated to be 1.18 µg/100 g.</p><p><b>Conclusion:</b> According to “Vitamin and mineral requirements in human nutrition (2nd edition) (2004)” by the World Health Organization and the Food and Agriculture Organization of the United Nations, the recommended intake of vitamin B12 is 2.4 µg/day for adults, 2.6 µg/day for pregnant women and 2.8 µg/day for lactating women. The results of this study suggest that gundruk and sinki have potential as a source of vitamin B12, although there is a great deal of variability among samples. To use gundruk and sinki as a source of vitamin B12, it may be necessary to find a way to stabilize their vitamin B12 content, focusing on how different production methods affect it.</p><p>Teresa Capello, MS, RD, LD<sup>1</sup>; Amanda Truex, MS, RRT, RCP, AE-C<sup>1</sup>; Jennifer Curtiss, MS, RD, LD, CLC<sup>1</sup>; Ada Lin, MD<sup>1</sup></p><p><sup>1</sup>Nationwide Children's Hospital, Columbus, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> The metabolic demands of critically ill children are defined by an increase in resting energy expenditure (1,2). Energy needs in the PICU are ever changing, and accurate evaluations are challenging to obtain (3). Predictive equations have been found to be inaccurate, leading to over- or underfeeding, which can cause negative outcomes such as muscle loss and poor healing (underfeeding) or weight gain as adipose tissue (overfeeding) (1,4,5). Indirect calorimetry (IC) is considered the gold standard to assess metabolic demand, especially for critically ill pediatric patients (1,2,4). The use of IC may be limited due to staffing, equipment availability and cost, as well as other patient-related issues and/or cart specifications (6). In our facility, we identified limited use of IC by dietitians.
Most tests were ordered by PICU dietitians and rarely outside the critical care division, even though testing would benefit patients in other divisions of the hospital such as the CTICU, rehab, NICU and stepdown areas. Informal polling of non-PICU dietitians revealed significant uncertainty in interpreting data and providing recommendations based on test results, mostly stemming from a lack of familiarity with this technology. The purpose of this study was to develop guidelines and a worksheet for consistently evaluating IC results, with the goal of encouraging increased use of indirect calorimetry at our pediatric facility.</p><p><b>Methods:</b> A committee of registered dietitians (RDs) and respiratory therapists (RTs) met in January 2023 and agreed on step-by-step guidelines, which were trialed, reviewed, and updated monthly. Finalized guidelines were transitioned to a worksheet to improve consistency of use and aid in interpretation of IC results. A shared file was established for articles about IC as well as access to the guidelines and worksheet (Figures 1 and 2). For this study, IC data from January 1, 2022 to July 31, 2024 were reviewed. These data included the number of tests completed and where the orders originated.</p><p><b>Results:</b> Since the guidelines were implemented, IC use in non-PICU areas increased from 16% of tests in 2022 to 30% in 2023 and appears to be on track for the same in 2024 (Figure 3). RDs report an improved comfort level with evaluating test results as well as making recommendations for test ordering.</p><p><b>Conclusion:</b> The standardized guidelines and worksheet increased RDs' comfort with interpreting test results. The PICU RDs have become more proficient and comfortable explaining IC during PICU rounds.
It is our hope that, with the guidelines and worksheet in place, more non-PICU RDs will utilize IC testing outside the critical care areas, where longer lengths of stay may occur. IC allows for more individualized nutrition prescriptions. An additional benefit was the mutual exchange of information between disciplines. The RTs provided education on the use of the machine to the RDs, which enhanced the RDs' understanding of IC test results from the RT perspective. In return, the RDs educated the RTs on why certain aspects of the patient's testing environment were helpful to report with the results so the RD could interpret the information correctly. The committee continues to meet and discuss patients’ tests to see how testing can be optimized as well as how results may be used to guide nutrition care.</p><p></p><p><b>Figure 1.</b> Screen Capture of Metabolic Cart Shared File.</p><p></p><p><b>Figure 2.</b> IC Worksheet.</p><p></p><p><b>Figure 3.</b> Carts completed per year by unit: 2022 is pre-intervention; 2023 and 2024 are post intervention.
Key: H2B = PICU; other areas are non-PICU (H4A = cardiothoracic stepdown, H4B = cardiothoracic ICU, H5B = burn, H8A = pulmonary, H8B = stable trach/vent unit, H10B = Neurosurgery/Neurology, H11B = Nephrology/GI, H12A = Hematology/Oncology, C4A = NICU, C5B = infectious disease).</p><p>Alfredo Lozornio-Jiménez-de-la-Rosa, MD, MSCN<sup>1</sup>; Minu Rodríguez-Gil, MSCN<sup>2</sup>; Luz Romero-Manriqe, MSCN<sup>2</sup>; Cynthia García-Vargas, MD, MSCN<sup>2</sup>; Rosa Castillo-Valenzuela, PhD<sup>2</sup>; Yolanda Méndez-Romero, MD, MSC<sup>1</sup></p><p><sup>1</sup>Colegio Mexicano de Nutrición Clínica y Terapia Nutricional (Mexican College of Clinical Nutrition and Nutritional Therapy), León, Guanajuato; <sup>2</sup>Colegio Mexicano de Nutrición Clínica y Terapia Nutricional (Mexican College of Clinical Nutrition and Nutritional Therapy), León, Guanajuato</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Sarcopenia is a systemic, progressive musculoskeletal disorder associated with an increased risk of adverse events and is highly prevalent in older adults. This condition leads to a decline in functionality and quality of life and has an economic impact. Sarcopenia is becoming increasingly common due to age-related changes, metabolic changes, obesity, sedentary lifestyle, chronic degenerative diseases, malnutrition, and pro-inflammatory states. The objective of this study was to investigate the relationship between strength and muscle mass, measured by calf circumference, both corrected for BMI, in young and older Mexican adults.</p><p><b>Methods:</b> This is a prospective, observational, cross-sectional, population-based clinical study conducted among Mexican men and women aged 30 to 90 years old, obtained through convenience sampling. The study was approved by the Ethics Committee of the Aranda de la Parra Hospital in León, Guanajuato, Mexico, and adheres to the Declaration of Helsinki.
Informed consent was obtained from all participants after explaining the nature of the study. Inclusion criteria were: Mexican men and women aged 30 to 90 years old who were functionally independent (Katz index category "a"). Participants with amputations, movement disorders, or immobilization devices on their extremities were excluded. The research team was previously standardized in anthropometric measurements. Demographic data and measurements of weight, height, BMI, calf circumference, and grip strength using a dynamometer were collected. Data are presented as mean and standard deviation. Spearman's correlation analysis was used to assess the relationship of BMI and calf circumference adjusted for BMI with grip strength, considering a significance level of p < 0.05.</p><p><b>Results:</b> Results from 1032 subjects were presented, 394 men and 638 women from central Mexico, recruited at workplaces, recreation centers, and health facilities, aged between 30 and 90 years old. Table 1 shows the distribution of the population in each age category, categorized by sex. Combined obesity and overweight were found in 75.1% of the sample population, with a frequency of 69.2% in men and 78.7% in women; 20% had a normal weight (25.6% of men and 16.6% of women), and 4.8% had low BMI (5.1% of men and 4.7% of women) (Graph 1). Depletion of calf circumference corrected for BMI and age begins at 50 years old in women, with exacerbation at 65 years old and older, while in men a greater depletion can be observed from 70 years old onwards (Graph 2). When analyzing strength corrected for BMI and age, grip strength declines at 55 years old and falls further as age increases, in both genders (Chi-square = 83.5, p < 0.001) (Graph 3). By Spearman correlation, a strong inverse relationship was found in both genders between age and grip strength; that is, as age increases, grip strength decreases (r = -0.530, p < 0.001).
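The Spearman coefficients reported here amount to ranking both variables and taking Pearson's correlation of the ranks. A minimal pure-Python sketch with hypothetical data (not the study's measurements):

```python
# Spearman rank correlation: rank both variables (average ranks for
# ties), then compute Pearson's r on the ranks.

def _ranks(xs):
    """Average ranks (1-based), handling ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of tied positions i..j, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical ages and grip strengths (kg): strength falls with age,
# so the coefficient is negative, as in the study.
age = [35, 42, 50, 58, 65, 72, 80]
grip = [38, 36, 33, 30, 26, 22, 18]
print(round(spearman(age, grip), 3))  # → -1.0 (perfectly monotonic toy data)
```

Because it works on ranks, Spearman's r captures any monotonic decline in strength with age without assuming the relationship is linear, which suits anthropometric data like these.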
A moderate negative correlation was found between age and calf circumference: as age increases, calf circumference decreases independently of BMI (r = -0.365, p < 0.001). Calf circumference and grip strength are positively and moderately related: as calf circumference decreases, grip strength decreases, independently of BMI (r = 0.447, p < 0.0001).</p><p><b>Conclusion:</b> These results show that the study population exhibited a decrease in grip strength, not related to BMI, from early ages, which may increase the risk of early-onset sarcopenia. These findings encourage early assessment of both grip strength and muscle mass, using simple and accessible measurements such as grip strength and calf circumference, adjusted for BMI. These measurements can be performed in the office during the initial patient encounter or in large populations, as in this study.</p><p><b>Table 1.</b> Distribution of the Population According to Age and Gender.</p><p></p><p>Alison Hannon, Medical Student<sup>1</sup>; Anne McCallister, DNP, CPNP<sup>2</sup>; Kanika Puri, MD<sup>3</sup>; Anthony Perkins, MS<sup>1</sup>; Charles Vanderpool, MD<sup>1</sup></p><p><sup>1</sup>Indiana University School of Medicine, Indianapolis, IN; <sup>2</sup>Indiana University Health, Indianapolis, IN; <sup>3</sup>Riley Hospital for Children at Indiana University Health, Indianapolis, IN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> The Academy of Nutrition and Dietetics and American Society for Parenteral and Enteral Nutrition have published criteria to diagnose a patient with mild, moderate, or severe malnutrition that use either single or multiple data points. Malnutrition is associated with worse clinical outcomes, and previous data from our institution showed that hospitalized children with severe malnutrition are at higher risk of mortality compared to those with mild or moderate malnutrition.
This project aims to determine whether differences in clinical outcomes exist in patients with severe malnutrition based on the diagnostic criteria used or on anthropometric differences.</p><p><b>Methods:</b> We included all patients discharged from Riley Hospital for Children within the 2023 calendar year diagnosed with severe malnutrition, excluding maternity discharges. Diagnostic criteria used to determine severe malnutrition were collected from registered dietitian (RD) documentation and the RD-assigned malnutrition statement within medical records for the admission. Data were collected on readmission rates, mortality, length of stay (LOS), LOS index, cost, operative procedures, pediatric intensive care unit (ICU) admissions, and anthropometrics measured on admission. We used mixed-effects regression or mixed-effects logistic regression to test whether the outcome of interest differed by severe malnutrition type. Each model contained a random effect for patient, to account for correlation within admissions by the same patient, and a fixed effect for severe malnutrition type. All analyses were performed using SAS v9.4.</p><p><b>Results:</b> Data were gathered on 409 patient admissions, of which 383 had clearly defined diagnostic criteria regarding severity of malnutrition. This represented 327 unique patients (due to readmissions). There was no difference in any measured clinical outcome based on the criteria used for severe malnutrition, including single or multiple point indicators or patients who met both single and multiple point indicators (Table 1). Anthropometric data were analyzed, including weight Z-score (n = 398) and BMI Z-score (n = 180). There was no difference in the majority of measured clinical outcomes in admissions with severe malnutrition when comparing weight or BMI Z-score categories of Z < -2, Z between -2 and 0, or Z > 0 (Table 2).
Patients admitted with severe malnutrition and a BMI Z-score > 0 had an increase in median cost (p = 0.042) compared to those with BMI Z-scores < -2 or between -2 and 0 (Table 2). There was a trend towards increased median cost (p = 0.067) and median LOS (p = 0.104) in patients with weight Z-score > 0.</p><p><b>Conclusion:</b> Hospitalized patients with severe malnutrition have similar clinical outcomes regardless of the diagnostic criteria used to determine the diagnosis of severe malnutrition. Fewer admissions with severe malnutrition (n = 180, or 44%) had sufficient anthropometric data to determine BMI. Based on these data, future programs at our institution aimed at intervention prior to and following admission will need to focus on all patients with severe malnutrition and will not be narrowed based on criteria (single or multiple data point) of severe malnutrition or anthropometrics. Quality improvement projects include improving height measurement and BMI determination upon admission, which will allow for future evaluation of the impact of anthropometrics on clinical outcomes.</p><p><b>Table 1.</b> Outcomes by Severe Malnutrition Diagnosis Category.</p><p></p><p>Outcomes by severe malnutrition compared based on the diagnostic criteria used to determine malnutrition diagnosis. Patients with severe malnutrition only are represented. Diagnostic criteria determined based on ASPEN/AND guidelines and defined during admission by registered dietitian (RD). OR = operating room; ICU = intensive care unit; LOS = length of stay. Data on 383 admissions presented, total of 327 patients due to readmissions: 284 patients had 1 admission; 33 patients had 2 admissions; 8 patients had 3 admissions; 1 patient had 4 admissions; 1 patient had 5 admissions.</p><p><b>Table 2.</b> Outcomes By BMI Z-score Category.</p><p></p><p>Outcomes of patients admitted with severe malnutrition, stratified based on BMI Z-score. Patients with severe malnutrition only are represented.
BMI Z-score determined based on weight and height measurement at time of admission, recorded by bedside admission nurse. OR = operating room; ICU = intensive care unit; LOS = length of stay. Due to incomplete height measurements, data on only 180 admissions were available, for a total of 158 patients: 142 patients had 1 admission; 12 patients had 2 admissions; 3 patients had 3 admissions; 1 patient had 5 admissions.</p><p>Claudia Maza, ND MSc<sup>1</sup>; Isabel Calvo, MD, MSc<sup>2</sup>; Andrea Gómez, ND<sup>2</sup>; Tania Abril, MSc<sup>3</sup>; Evelyn Frias-Toral, MD, MSc<sup>4</sup></p><p><sup>1</sup>Centro Médico Militar (Military Medical Center), Guatemala, Santa Rosa; <sup>2</sup>Hospital General de Tijuana (Tijuana General Hospital), Tijuana, Baja California; <sup>3</sup>Universidad Católica de Santiago de Guayaquil (Catholic University of Santiago de Guayaquil), Guayaquil, Guayas; <sup>4</sup>Universidad Espíritu Santo (Holy Spirit University), Dripping Springs, TX</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition is a common and significant issue among hospitalized patients, particularly in older adults or those with multiple comorbidities. The presence of malnutrition in such patients is associated with an increased risk of morbidity, mortality, prolonged hospital stays, and elevated healthcare costs. A crucial indicator of nutritional status is muscle strength, which can be effectively measured using handgrip strength (HGS). This study aimed to describe the relationship between nutritional status and muscle strength reduction in patients from two hospitals in Latin America.</p><p><b>Methods:</b> A retrospective observational study was conducted from February to May 2022. Data were collected from two hospitals: one in Guatemala and one in Mexico. A total of 169 patients aged 19-98 years were initially considered for the study, and 127 met the inclusion criteria.
The sample comprised adult patients of both sexes admitted to internal medicine, surgery, and geriatrics departments. Handgrip strength, demographic data, baseline medical diagnosis, weight, and height were recorded at admission and on the 14th day of hospitalization. Exclusion criteria included patients with arm or hand movement limitations, those under sedation or mechanical ventilation, and those hospitalized for less than 24 hours. HGS was measured using JAMAR® and Smedley dynamometers, following standard protocols. Statistical analysis was performed using measures of central tendency, and results were presented in tables and figures.</p><p><b>Results:</b> In the first hospital (Mexico), 62 patients participated, with a predominant female sample. The average weight was 69.02 kg, height 1.62 meters, and BMI 26.14 kg/m² (classified as overweight). The most common admission diagnoses were infectious diseases, nervous system disorders, and digestive diseases. (Table 1) A slight increase in HGS (0.49 kg) was observed between the first and second measurements. (Figure 1) In the second hospital (Guatemala), 62 patients also met the inclusion criteria, with a predominant male sample. The average weight was 65.92 kg, height 1.61 meters, and BMI 25.47 kg/m² (classified as overweight). Infectious diseases and musculoskeletal disorders were the most common diagnoses. (Table 1) HGS decreased by 2 kg between the first and second measurements. (Figure 2) Low HGS was associated with underweight patients and those with class II and III obesity. Patients with normal BMI in both centers exhibited significant reductions in muscle strength, indicating that weight alone is not a sufficient indicator of muscle strength preservation.</p><p><b>Conclusion:</b> This multicenter study highlights the significant relationship between nutritional status and decreased muscle strength in hospitalized patients. 
While underweight patients showed reductions in HGS, those with class II and III obesity also experienced significant strength loss. These findings suggest that HGS is a valuable, non-invasive tool for assessing both nutritional status and muscle strength in hospitalized patients. Early identification of muscle strength deterioration can help healthcare providers implement timely nutritional interventions to improve patient outcomes.</p><p><b>Table 1.</b> Baseline Demographic and Clinical Characteristics of the Study Population.</p><p></p><p>NS: Nervous System, BMI: Body Mass Index</p><p></p><p><b>Figure 1.</b> Relationship Between Nutritional Status and Handgrip Strength (Center 1 - Mexico).</p><p></p><p><b>Figure 2.</b> Relationship Between Nutritional Status and Handgrip Strength (Center 2 - Guatemala).</p><p>Reem Farra, MDS, RD, CNSC, CCTD<sup>1</sup>; Cassie Greene, RD, CNSC, CDCES<sup>2</sup>; Michele Gilson, MDA, RD, CEDS<sup>2</sup>; Mary Englick, MS, RD, CSO, CDCES<sup>2</sup>; Kristine Thornham, MS, RD, CDE<sup>2</sup>; Debbie Andersen, MS, RD, CEDRD-S, CHC<sup>3</sup>; Stephanie Hancock, RD, CSP, CNSC<sup>4</sup></p><p><sup>1</sup>Kaiser Permanente, Lone Tree, CO; <sup>2</sup>Kaiser Permanente, Denver, CO; <sup>3</sup>Kaiser Permanente, Castle Rock, CO; <sup>4</sup>Kaiser Permanente, Littleton, CO</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Currently, there is no standardized screening required by the Centers for Medicare and Medicaid Services for malnutrition in outpatient settings. This raises concerns about early identification of malnutrition, the likelihood of nutrition interventions, and increased healthcare costs. While malnutrition screening in hospitalized patients is well-studied, its impact in outpatient care has not been thoroughly examined. Studies show that malnourished patients are 40% more likely to be readmitted within 30 days of discharge, adding over $10,000 to hospitalization costs. 
This quality improvement project aimed to evaluate the impact of implementing a standardized malnutrition screening tool for all patients as part of nutrition assessments by Registered Dietitians (RDs).</p><p><b>Methods:</b> The Malnutrition Screening Tool (MST) was chosen due to its validity, sensitivity, and specificity in identifying malnutrition risk in outpatient settings. This tool assesses risk by asking about recent unintentional weight loss and decreased intake due to appetite loss. Based on responses, a score of 0-5 indicates the severity of risk. Scores of 0-1 indicate no risk, while scores of 2-5 indicate a risk for malnutrition. This questionnaire was integrated into the nutrition assessment section of the electronic medical record to standardize screening for all patients. Those with scores of 2 or greater were included, with no exclusions for disease states. Patients were scheduled for follow-ups 2-6 weeks after the initial assessment, during which their MST score was recalculated.</p><p><b>Results:</b> A total of 414 patients were screened, with 175 completing follow-up visits. Of these, 131 showed improvements in their MST scores after nutrition interventions, 12 had score increases, and 32 maintained the same score. Two hundred thirty-nine patients were lost to follow-up for various reasons, including lack of response, limited RD scheduling access, changes in insurance, and mortality. Those with improved MST scores experienced an average cost avoidance of $15,000 each in subsequent hospitalizations. This cost avoidance was due to shorter hospitalizations, better treatment responses, and reduced need for medical interventions. The project raised awareness about the importance of early nutrition interventions among the multidisciplinary team.</p><p><b>Conclusion:</b> This project suggests that standardizing malnutrition screening in outpatient settings could lead to cost avoidance for patients and health systems, along with improved overall care. 
Further studies are needed to identify the best tools for outpatient malnutrition screening, optimal follow-up timelines, and effective nutrition interventions for greatest cost avoidance.</p><p>Amy Sharn, MS, RDN, LD<sup>1</sup>; Raissa Sorgho, PhD, MScIH<sup>2</sup>; Suela Sulo, PhD, MSc<sup>3</sup>; Emilio Molina-Molina, PhD, MSc, MEd<sup>4</sup>; Clara Rojas Montenegro, RD<sup>5</sup>; Mary Jean Villa-Real Guno, MD, FPPS, FPSPGHAN, MBA<sup>6</sup>; Sue Abdel-Rahman, PharmD, MA<sup>7</sup></p><p><sup>1</sup>Abbott Nutrition, Columbus, OH; <sup>2</sup>Center for Wellness and Nutrition, Public Health Institute, Sacramento, CA; <sup>3</sup>Global Medical Affairs and Research, Abbott Nutrition, Chicago, IL; <sup>4</sup>Research & Development, Abbott Nutrition, Granada, Andalucia; <sup>5</sup>Universidad del Rosario, Escuela de Medicina, Bogota, Cundinamarca; <sup>6</sup>Ateneo de Manila University, School of Medicine and Public Health, Metro Manila, National Capital Region; <sup>7</sup>Health Data Synthesis Institute, Chicago, IL</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> American Society for Nutrition, June 29-July 2, Chicago, IL, USA; American Academy of Pediatrics, September 27-October 1, Orlando, FL, USA.</p><p><b>Publication:</b> Sharn AR, Sorgho R, Sulo S, Molina-Molina E, Rojas Montenegro C, Villa-Real Guno MJ, Abdel-Rahman S. Using mid-upper arm circumference z-score measurement to support youth malnutrition screening as part of a global sports and wellness program and improve access to nutrition care. Front Nutr. 2024 Aug 12;11:1423978. doi: 10.3389/fnut.2024.1423978. 
PMID: 39188981; PMCID: PMC11345244.</p><p><b>Financial Support:</b> This study was financially supported by the Abbott Center for Malnutrition Solutions, Chicago, IL, USA.</p><p>Veeradej Pisprasert, MD, PhD<sup>1</sup>; Kittipadh Boonyavarakul, MD<sup>2</sup>; Sornwichate Rattanachaiwong, MD<sup>3</sup>; Thunchanok Kuichanuan, MD<sup>3</sup>; Pranithi Hongsprabhas, MD<sup>3</sup>; Chingching Foocharoen, MD<sup>3</sup></p><p><sup>1</sup>Faculty of Medicine, Khon Kaen University, Muang Khon Kaen, Khon Kaen; <sup>2</sup>Chulalongkorn University, Bangkok, Krung Thep; <sup>3</sup>Department of Medicine, Khon Kaen University, Muang Khon Kaen, Khon Kaen</p><p><b>Financial Support:</b> Supported by a grant from Khon Kaen University.</p><p><b>Background:</b> Systemic sclerosis (SSc) is an autoimmune disease in which malnutrition is a common complication, caused by the chronic inflammation of its natural history and/or gastrointestinal tract involvement. Current nutritional assessment tools, e.g. the GLIM criteria, may include data regarding muscle mass measurement for nutritional diagnosis. Anthropometric measurement is a basic method of determining muscle mass; however, data in this condition are limited. This study aimed to determine the utility of assessing muscle mass and muscle function by anthropometric measurement for diagnosing malnutrition in SSc patients.</p><p><b>Methods:</b> A cross-sectional diagnostic study was conducted in adult SSc patients at Srinagarind Hospital, Thailand. All patients were assessed for malnutrition based on the Subjective Global Assessment (SGA). Muscle mass was measured by mid-upper-arm muscle circumference (MUAC) and calf circumference (CC); in addition, muscle function was determined by handgrip strength (HGS).</p><p><b>Results:</b> A total of 208 SSc patients were included, of which 149 were females (71.6%). Mean age and body mass index were 59.3 ± 11.0 years and 21.1 ± 3.9 kg/m², respectively. Nearly half (95 cases; 45.7%) were malnourished based on SGA. 
Mean values of MUAC, CC, and HGS were 25.9 ± 3.83 cm, 31.5 ± 3.81 cm, and 19.0 ± 6.99 kg, respectively. The area under the curve (AUC) of the receiver operating characteristic (ROC) curve for diagnosing malnutrition was 0.796 for MUAC, 0.759 for CC, and 0.720 for HGS. Proposed cut-off values are shown in Table 1.</p><p><b>Conclusion:</b> Muscle mass and muscle function were associated with malnutrition. Assessment of muscle mass and/or function by anthropometric measurement may be one part of nutritional assessment in patients with systemic sclerosis.</p><p><b>Table 1.</b> Proposed Cut-Off Values of MUAC, CC, and HGS in Patients With Systemic Sclerosis.</p><p></p><p>CC, calf circumference; HGS, handgrip strength; MUAC, mid-upper-arm circumference.</p><p></p><p><b>Figure 1.</b> ROC Curve of MUAC, CC, and HGS in Diagnosing Malnutrition by Subjective Global Assessment (SGA).</p><p>Trevor Sytsma, BS<sup>1</sup>; Megan Beyer, MS, RD, LDN<sup>2</sup>; Hilary Winthrop, MS, RD, LDN, CNSC<sup>3</sup>; William Rice, BS<sup>4</sup>; Jeroen Molinger, PhDc<sup>5</sup>; Suresh Agarwal, MD<sup>3</sup>; Cory Vatsaas, MD<sup>3</sup>; Paul Wischmeyer, MD, EDIC, FCCM, FASPEN<sup>6</sup>; Krista Haines, DO, MA<sup>3</sup></p><p><sup>1</sup>Duke University, Durham, NC; <sup>2</sup>Duke University School of Medicine - Department of Anesthesiology, Durham, NC; <sup>3</sup>Duke University School of Medicine, Durham, NC; <sup>4</sup>Eastern Virginia Medical School, Norfolk, VA; <sup>5</sup>Duke University Medical Center - Department of Anesthesiology - Duke Heart Center, Raleigh, NC; <sup>6</sup>Duke University Medical School, Durham, NC</p><p><b>Financial Support:</b> Baxter.</p><p><b>Background:</b> Achieving acceptable nutritional goals is a crucial but often overlooked component of postoperative care, impacting patient-important outcomes like reducing infectious complications and shortening ICU length of stay. 
Predictive resting energy expenditure (pREE) equations correlate poorly with actual measured REE (mREE), leading to potentially harmful over- or under-feeding. International ICU guidelines now recommend the use of indirect calorimetry (IC) to determine mREE and personalize patient nutrition. Surgical stress increases protein catabolism and insulin resistance, but the effect of age on postoperative mREE trends, which commonly used pREE equations do not account for, is not well studied. This study hypothesized that older adults undergoing major abdominal surgery experience slower metabolic recovery than younger patients, as measured by IC.</p><p><b>Methods:</b> This was an IRB-approved prospective trial of adult patients following major abdominal surgery with open abdomens secondary to blunt or penetrating trauma, sepsis, or vascular emergencies. Patients underwent serial IC assessments to guide postoperative nutrition delivery. Assessments were offered within the first 72 hours after surgery, then every 3 ± 2 days during the ICU stay, followed by every 7 ± 2 days in stepdown. Patient mREE was determined using the Q-NRG® Metabolic Monitor (COSMED) in ventilator mode for mechanically ventilated patients or in the mask or canopy modes, depending on medical feasibility. IC data were selected from ≥ 3-minute intervals that met steady-state conditions, defined by a variance of oxygen consumption and carbon dioxide production of less than 10%. Measurements not meeting these criteria were excluded from the final analysis. Patients without mREE data during at least two time points in the first nine postoperative days were also excluded. Older adult patients were defined as ≥ 65 years and younger patients as ≤ 50. Trends in REE were calculated using the method of least squares and compared using t-tests assuming unequal variance (α = 0.05). 
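The trend analysis described above (a per-patient least-squares slope of mREE over postoperative day, with group slopes compared by an unequal-variance t-test) can be sketched as follows; the function names and data values are illustrative, not study data:

```python
# Sketch of the described analysis: ordinary least-squares REE slope per
# patient, then Welch's t statistic between age groups. Pure-Python, no
# dependencies; inputs below are made-up example numbers.

def ls_slope(days, ree):
    """Least-squares slope (kcal/day) of mREE versus postoperative day."""
    n = len(days)
    mx = sum(days) / n
    my = sum(ree) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(days, ree))
    sxx = sum((x - mx) ** 2 for x in days)
    return sxy / sxx

def welch_t(a, b):
    """Welch's t statistic for two samples with unequal variances."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    return (ma - mb) / ((va / na + vb / nb) ** 0.5)

# Example: mREE of 1500, 1560, 1620 kcal on postoperative days 1, 4, 7
# recovers at 20 kcal/day.
slope = ls_slope([1, 4, 7], [1500, 1560, 1620])
```

In practice the resulting t statistic would be referred to a t distribution with Welch-Satterthwaite degrees of freedom (e.g. via `scipy.stats.ttest_ind(..., equal_var=False)`) to obtain the p-value reported in the Results.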
Patients' pREE values were calculated from admission anthropometric data using ASPEN-SCCM equations and compared against IC measurements.</p><p><b>Results:</b> Eighteen older and 15 younger adults met pre-specified eligibility criteria and were included in the final analysis. Average rates and standard error of REE recovery in older and younger adult patients were 28.9 ± 17.1 kcal/day and 75.5 ± 18.9 kcal/day, respectively, which approached – but did not reach – statistical significance (p = 0.07). The lower and upper bands of pREE for the older cohort averaged 1728 ± 332 kcal and 2093 ± 390 kcal, respectively, markedly exceeding the mREE obtained by IC for most patients and failing to capture the observed variability identified using mREE. In younger adults, pREE values were closer to IC measurements, with lower and upper averages of 1705 ± 278 kcal and 2084 ± 323 kcal, respectively.</p><p><b>Conclusion:</b> Our data signal a difference in rates of metabolic recovery after major abdominal surgery between younger and older adult patients but did not reach statistical significance, possibly due to insufficient sample size. Predictive energy equations do not adequately capture changes in REE and may overestimate postoperative energy requirements in older adult patients, failing to appreciate the increased variability in mREE that our study found in older patients. These findings reinforce the importance of using IC to guide nutrition delivery during the early recovery period post-operatively. 
Larger trials employing IC and quantifying the contribution of protein metabolism are needed to explore these questions further.</p><p><b>Table 1.</b> Patient Demographics.</p><p></p><p></p><p><b>Figure 1.</b> Postoperative Changes in mREE in Older and Younger Adult Patients Following Major Abdominal Surgery Compared to pREE (ASPEN).</p><p>Amber Foster, BScFN, BSc<sup>1</sup>; Heather Resvick, PhD(c), MScFN, RD<sup>2</sup>; Janet Madill, PhD, RD, FDC<sup>3</sup>; Patrick Luke, MD, FRCSC<sup>2</sup>; Alp Sener, MD, PhD, FRCSC<sup>4</sup>; Max Levine, MD, MSc<sup>5</sup></p><p><sup>1</sup>Western University, Ilderton, ON; <sup>2</sup>LHSC, London, ON; <sup>3</sup>Brescia School of Food and Nutritional Sciences, Faculty of Health Sciences, Western University, London, ON; <sup>4</sup>London Health Sciences Centre, London, ON; <sup>5</sup>University of Alberta, Edmonton, AB</p><p><b>Financial Support:</b> Brescia University College MScFN stipend.</p><p><b>Background:</b> Currently, body mass index (BMI) is used as the sole criterion to determine whether patients with chronic kidney disease (CKD) are eligible for kidney transplantation. However, BMI is not a good measure of health in this patient population, as it does not distinguish between muscle mass, fat mass, and water weight. Individuals with end-stage kidney disease often experience shifts in fluid balance, resulting in fluid retention, swelling, and weight gain. Consequently, the BMI of these patients may be falsely elevated. Therefore, it is vitally important to consider more accurate and objective measures of body composition for this patient population. The aim of this study is to determine whether there is a difference in body composition between individuals with CKD who are categorized as having healthy body weight, overweight, or obesity.</p><p><b>Methods:</b> This was a cross-sectional study analyzing body composition of 114 adult individuals with CKD being assessed for kidney transplantation. 
Participants were placed into one of three BMI groups: healthy weight (group 1, BMI < 24.9 kg/m<sup>2</sup>, n = 29), overweight (group 2, BMI 24.9-29.9 kg/m<sup>2</sup>, n = 39) or with obesity (group 3, BMI ≥ 30 kg/m<sup>2</sup>, n = 45). Fat mass, lean body mass (LBM), and phase angle (PhA) were measured using Bioelectrical Impedance Analysis (BIA). Standardized phase angle (SPhA), a measure of cellular health, was calculated as [(observed PhA - mean PhA)/standard deviation of PhA]. Handgrip strength (HGS) was measured using a Jamar dynamometer, and quadriceps muscle layer thickness (QMLT) was measured using ultrasonography. Normalized HGS (nHGS) was calculated as [HGS/body weight (kg)], and values were compared to age- and sex-specific standardized cutoff values. Fat-free mass index (FFMI) was calculated as [LBM/(height (m))<sup>2</sup>]. Low FFMI, which may identify high risk of malnutrition, was determined using ESPEN cut-off values of < 17 kg/m<sup>2</sup> for males and < 15 kg/m<sup>2</sup> for females. Frailty status was determined using the validated Fried frailty phenotype assessment tool. Statistical analysis: continuous data were analyzed using one-way ANOVA followed by Tukey post hoc tests, while chi-square tests were used for analysis of categorical data. IBM SPSS version 29, significance p < 0.05.</p><p><b>Results:</b> Participants in group 1 were younger than those in either group 2 (p = 0.004) or group 3 (p < 0.001). There was no significant difference in the distribution of males and females between the three groups. The proportion of FFMI values below the cutoff was significantly higher in group 1 (13%) versus group 2 (0%) and group 3 (2.1%) (p = 0.02). A significant difference in nHGS was found between groups, with lower muscle strength occurring more frequently among participants in group 3 (75%) vs 48.7% in group 2 and 28.5% in group 1 (p < 0.001). 
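The body-composition indices defined in the Methods above (SPhA, nHGS, FFMI, and the ESPEN low-FFMI cutoffs) can be computed directly from their formulas; the helper names and input values below are illustrative, not study data:

```python
# Minimal sketch of the indices described in the Methods. Function names
# and example inputs are hypothetical; formulas follow the text.

def spha(pha, ref_mean, ref_sd):
    """Standardized phase angle: (observed PhA - reference mean) / reference SD."""
    return (pha - ref_mean) / ref_sd

def nhgs(hgs_kg, weight_kg):
    """Normalized handgrip strength: HGS divided by body weight (kg)."""
    return hgs_kg / weight_kg

def ffmi(lbm_kg, height_m):
    """Fat-free mass index: lean body mass over height squared (kg/m^2)."""
    return lbm_kg / height_m ** 2

def low_ffmi(ffmi_value, sex):
    """ESPEN low-FFMI flag: < 17 kg/m^2 for males, < 15 kg/m^2 for females."""
    return ffmi_value < (17 if sex == "M" else 15)

# Example: 52.02 kg LBM at 1.70 m gives FFMI = 18.0 kg/m^2, above both cutoffs.
example_ffmi = ffmi(52.02, 1.70)
```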
No significant differences were seen in QMLT, SPhA, HGS, or frailty status between the three BMI groups.</p><p><b>Conclusion:</b> It appears that there is no difference in body composition parameters such as QMLT, SPhA, or frailty status between the three BMI groups. However, patients with CKD categorized as having a healthy BMI were more likely to be at risk for malnutrition. Furthermore, those individuals categorized as having a healthy BMI appeared to have more muscle strength compared to the other two groups. Taken together, these results provide convincing evidence that BMI should not be the sole criterion for listing patients for kidney transplantation. Further research is needed to confirm these findings.</p><p>Kylie Waynick, BS<sup>1</sup>; Katherine Petersen, MS, RDN, CSO<sup>2</sup>; Julie Kurtz, MS, CDCES, RDN<sup>2</sup>; Maureen McCoy, MS, RDN<sup>3</sup>; Mary Chew, MS, RDN<sup>4</sup></p><p><sup>1</sup>Arizona State University and Veterans Healthcare Administration, Phoenix, AZ; <sup>2</sup>Veterans Healthcare Administration, Phoenix, AZ; <sup>3</sup>Arizona State University, Phoenix, AZ; <sup>4</sup>Phoenix VAHCS, Phoenix, AZ</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition does not have a standardized definition nor universal identification criteria. Registered dietitian nutritionists (RDNs) most often diagnose based on Academy and ASPEN Identification of Malnutrition (AAIM) criteria while physicians are required to use International Classification of Diseases Version 10 (ICD-10-CM). However, there are major differences in how malnutrition is identified between the two. The malnutrition ICD-10 codes (E43.0, E44.0, E44.1, E50.0-E64.9) have vague diagnostic criteria leading providers to use clinical expertise and prior nutrition education. 
For dietitians, AAIM's diagnostic criteria are clearly defined and validated to identify malnutrition based on reduced weight and intake, loss of muscle and fat mass, fluid accumulation, and decreased physical functioning. Due to this lack of standardization, the process of identifying and diagnosing malnutrition is inconsistent. The purpose of this study was to analyze the congruence of malnutrition diagnosis between physicians and dietitians using the two methods and to compare patient outcomes between the congruent and incongruent groups.</p><p><b>Methods:</b> A retrospective chart review was conducted of 668 inpatients assigned a malnutrition diagnostic code, electronically pulled from the Veteran Health Administration's Clinical Data Warehouse for the periods of April through July in 2019, 2020, and 2021. Length of stay, infection, pressure injury, falls, thirty-day readmissions, and documentation of communication between providers were collected from chart review. Data on cost to the hospital were pulled from the Veterans Equitable Resource Allocation (VERA) system and paired with matching social security numbers in the sample. Chi-square tests were used to compare differences between incongruency and congruency for infection, pressure injury, falls, and readmissions. Means for length of stay and cost to the hospital between the two groups were analyzed using ANOVA in SPSS.</p><p><b>Results:</b> The diagnosis of malnutrition is incongruent between providers. The incongruent group had a higher percentage of adverse patient outcomes than those with a congruent diagnosis. Congruent diagnoses were found to be significantly associated with the incidence of documented communication (p < 0.001).</p><p><b>Conclusion:</b> This study showcases a gap in malnutrition patient care. 
Further research needs to be conducted to understand the barriers to congruent diagnosis and communication between providers.</p><p>Nana Matsumoto, RD, MS<sup>1</sup>; Koji Oba, Associate Professor<sup>2</sup>; Tomonori Narita, MD<sup>3</sup>; Reo Inoue, MD<sup>2</sup>; Satoshi Murakoshi, MD, PhD<sup>4</sup>; Yuki Taniguchi, MD<sup>2</sup>; Kenichi Kono, MD<sup>2</sup>; Midori Noguchi, BA<sup>5</sup>; Seiko Tsuihiji<sup>2</sup>; Kazuhiko Fukatsu, MD, PhD<sup>2</sup></p><p><sup>1</sup>The University of Tokyo, Bunkyo-City, Tokyo; <sup>2</sup>The University of Tokyo, Bunkyo-ku, Tokyo; <sup>3</sup>The University of Tokyo, Chuo-City, Tokyo; <sup>4</sup>Kanagawa University of Human Services, Yokosuka-city, Kanagawa; <sup>5</sup>The University of Tokyo Hospital, Bunkyo-ku, Tokyo</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Therapeutic diets are often prescribed for patients with various disorders, for example, diabetes, renal dysfunction and hypertension. However, due to limitations on the amounts of nutrients provided, therapeutic diets might reduce appetite. Hospital meals maintain patients’ nutritional status when the meals are fully consumed, regardless of diet type. It is possible that therapeutic diets are at least partly associated with malnutrition in hospital. Therefore, we conducted an exploratory retrospective cohort study to investigate whether there are differences in patients’ oral consumption between therapeutic and regular diets, taking into account other factors.</p><p><b>Methods:</b> The study protocol was approved by the Ethics Committee of the University of Tokyo under protocol No. 2023396NI-(1). We retrospectively extracted information from the medical records of patients who were admitted to the department of orthopedic and spine surgery at the University of Tokyo Hospital between June and October 2022. Eligible patients were older than 20 years and hospitalized for more than 7 days. 
These patients were provided oral diets as the main source of nutrition. Patients prescribed texture-modified, half, or liquid diets were excluded. The measurements included percentages of oral food intake at various points during the hospitalization (e.g. at admission, before and after surgery, and at discharge), sex, and age. The differences in patients’ oral consumption rates between the therapeutic and regular diets were analyzed with a linear mixed-effect model.</p><p><b>Results:</b> A total of 290 patients were analyzed, with 50 patients receiving a therapeutic diet and 240 patients receiving a regular diet at admission. The mean percentage of oral intake was 83.1% for the therapeutic diets and 87.2% for the regular diets, and was consistently 4-6% higher for regular diets compared to therapeutic diets at each time point during hospitalization (Figure). In a linear mixed-effect model adjusted for sex and age, the mean percentage of oral intake of a regular diet was 4.0% higher (95% confidence interval [CI], -0.8% to 8.9%, p = 0.100) than that of a therapeutic diet, although the difference did not reach statistical significance. The mean percentage of oral intake in women was 15.6% lower than in men (95% CI, -19.5% to -11.8%). Likewise, older patients’ intake rates were reduced compared with younger patients’ (difference, -0.2% per year of age, 95% CI -0.3% to -0.1%).</p><p><b>Conclusion:</b> This exploratory study failed to show that therapeutic diets reduce food intake in orthopedic and spine surgery patients as compared with regular diets. However, sex and age were important factors affecting food intake. We need to pay special attention to female and/or older patients to increase oral food intake. 
Future research will increase the number of patients examined, expand the cohort to other departments, and proceed to a prospective study to determine which factors truly affect patients’ oral intake during hospitalization.</p><p></p><p><b>Figure 1.</b> The Percentage of Oral Intake During Hospitalization in Each Diet.</p><p>Lorena Muhaj, MS<sup>1</sup>; Michael Owen-Michaane, MD, MA, CNSC<sup>2</sup></p><p><sup>1</sup>Institute of Human Nutrition, Vagelos College of Physicians and Surgeons, Columbia University Irving Medical Center, New York, NY; <sup>2</sup>Columbia University Irving Medical Center, New York, NY</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Muscle mass is crucial for overall health and well-being. Consequently, accurate estimation of muscle mass is essential for diagnosing malnutrition and conditions such as sarcopenia and cachexia. The Kim equation uses biomarker data to estimate muscle mass, but whether this tool provides an accurate estimate in populations with high BMI and kidney disease remains uncertain. Therefore, the aim of this study is to assess whether the Kim equation is suitable and reliable for estimating muscle mass and predicting malnutrition, sarcopenia, and related outcomes in a cohort with diverse BMI and kidney disease.</p><p><b>Methods:</b> This is a cross-sectional study using data from the All of Us Research Program. Data on demographics, weight, height, creatinine, cystatin C, and diagnoses of malnutrition, hip fractures, and cachexia were obtained from electronic health records (EHR). The Kim equation was derived from creatinine, cystatin C and weight (Table 1) and compared with established sarcopenia cutoffs for appendicular lean mass (ALM), including ALM/BMI and ALM/height<sup>2</sup>. Malnutrition was identified through specific ICD-10-CM codes recorded in the EHR and participants were categorized based on malnutrition status (with or without malnutrition). 
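The Kim equation referenced above (stated in full in the Table 1 legend as: calculated body muscle mass = body weight × serum creatinine / ((K × body weight × serum cystatin C) + serum creatinine)) can be expressed directly. The constant K comes from the original publication and is passed as a parameter here; the function name and example values are illustrative:

```python
# Sketch of the Kim equation as given in the Table 1 legend (Equation 1).
# K is the constant from the original Kim publication, supplied by the
# caller rather than hard-coded; inputs below are made-up example values.

def kim_muscle_mass(weight_kg, serum_creatinine, serum_cystatin_c, k):
    """Calculated body muscle mass (kg) per the Kim equation."""
    return (weight_kg * serum_creatinine
            / (k * weight_kg * serum_cystatin_c + serum_creatinine))

# Example: if the cystatin C term in the denominator equals the creatinine
# term, the estimate is exactly half of body weight.
example = kim_muscle_mass(80.0, 1.0, 0.0125, 1.0)  # denominator = 2.0
```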
Muscle mass and biomarker levels were compared between groups with or without severe/moderate malnutrition, and the relationships of BMI with creatinine and cystatin C levels were analyzed using linear regression. Wilcoxon rank-sum tests were used to assess associations between estimated muscle mass and malnutrition diagnosis.</p><p><b>Results:</b> Baseline characteristics were stratified by gender for comparison. The mean age of participants was 58.2 years (SD = 14.7). The mean BMI was 30.4 kg/m<sup>2</sup> (SD = 7.5). Mean serum creatinine and cystatin C levels were 2.01 mg/dL (SD = 1.82) and 0.18 mg/dL (SD = 0.11), respectively. The mean estimated muscle mass was 80.1 kg (SD = 21), with estimated muscle mass as a percentage of body weight being 92.8%. Mean ALM/BMI was 2.38 (SD = 0.32), while ALM/height<sup>2</sup> value was 25.41 kg/m<sup>2</sup> (SD = 5.6). No participant met the cutoffs for sarcopenia. All calculated variables are summarized in Table 1. In this cohort, < 2% were diagnosed with severe malnutrition and < 2% with moderate malnutrition (Table 2). Muscle mass was lower in participants with severe malnutrition compared to those without (W = 2035, p < 0.05) (Figure 1).</p><p><b>Conclusion:</b> This study suggests the Kim equation overestimates muscle mass in populations with high BMI or kidney disease, as no participants met sarcopenia cutoffs despite the expected prevalence in CKD. This overestimation is concerning given the known risk of low muscle mass in CKD. While lower muscle mass was significantly associated with severe malnutrition (p < 0.05), the Kim equation identified fewer malnutrition cases than expected based on clinical data. Though biomarkers like creatinine and cystatin C may help diagnose malnutrition, the Kim equation may not accurately estimate muscle mass or predict malnutrition and sarcopenia in diverse populations. 
Further research is needed to improve these estimates.</p><p><b>Table 1.</b> Muscle Mass Metrics Calculations and Diagnosis of Sarcopenia Based on FNIH (ALM/BMI) and EWGSOP (ALM/Height2) Cut-off Values.</p><p></p><p>Abbreviations: BMI-Body Mass Index; TBMM-Total Body Muscle Mass (referred as Muscle Mass as well) (calculated using the Kim Equation); ALM-Appendicular Lean Muscle Mass (using the McCarthy equation); ALM/Height2-Appendicular Lean Muscle Mass adjusted for height square (using EWGSOP cutoffs for diagnosing sarcopenia); ALM/BMI-Appendicular Lean Muscle Mass adjusted for BMI (using FNIH cutoffs for diagnosing sarcopenia). Equation 1: Kim equation - Calculated body muscle mass = body weight * serum creatinine/((K * body weight * serum cystatin C) + serum creatinine)</p><p><b>Table 2.</b> Prevalence of Severe and Moderate Malnutrition.</p><p></p><p>(Counts less than 20 suppressed to prevent reidentification of participants).</p><p></p><p><b>Figure 1.</b> Muscle Mass in Groups With and Without Severe Malnutrition.</p><p><b>Poster of Distinction</b></p><p>Robert Weimer, BS<sup>1</sup>; Lindsay Plank, PhD<sup>2</sup>; Alisha Rovner, PhD<sup>1</sup>; Carrie Earthman, PhD, RD<sup>1</sup></p><p><sup>1</sup>University of Delaware, Newark, DE; <sup>2</sup>University of Auckland, Auckland</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Loss of skeletal muscle is common in patients with liver cirrhosis and low muscle mass is internationally recognized as a key phenotypic criterion to diagnose malnutrition.<sup>1,2</sup> Muscle can be clinically assessed through various modalities, although reference data and consensus guidance are limited for the interpretation of muscle measures to define malnutrition in this patient population. 
The aim of this study was to evaluate the sensitivity and specificity of published sarcopenia cutpoints applied to dual-energy X-ray absorptiometry (DXA) muscle measures to diagnose malnutrition by the GLIM criteria, using in vivo neutron activation analysis (IVNAA) measures of total body protein (TBP) as the reference.</p><p><b>Methods:</b> Adults with liver cirrhosis underwent IVNAA and whole body DXA at the Body Composition Laboratory of the University of Auckland. DXA-fat-free mass (FFM) and appendicular skeletal muscle mass (ASMM, with and without correction for wet bone mass<sup>3</sup>) were measured and indexed to height squared (FFMI, ASMI). The ratio of measured to predicted TBP based on healthy reference data matched for age, sex and height was calculated as a protein index; values less than 2 standard deviations below the mean (< 0.77) were defined as protein depletion (malnutrition). Published cut points from recommended guidelines were evaluated (Table 1).<sup>4-9</sup> DXA values below the cut point were interpreted as ‘sarcopenic’. Sensitivity and specificity for each cut point were determined.</p><p><b>Results:</b> Study sample included 350 adults (238 males/112 females, median age 52 years) with liver cirrhosis and median model for end-stage liver disease (MELD) score 12 (range 5-36). Application of published sarcopenia cutpoints to diagnose malnutrition by DXA in patients with liver cirrhosis had sensitivity ranging from 40.8% - 79.0% and specificity ranging from 79.6% to 94.2% (Table 1). Although all of the selected published cutpoints for DXA-measured ASMI were similar, the Baumgartner<sup>4</sup> and Newman<sup>5</sup> ASMI cutpoints when applied to our DXA-measured ASMI, particularly after correction for wet bone mass, yielded the best combination of sensitivity and specificity for diagnosing malnutrition as identified by protein depletion in patients with liver cirrhosis. 
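The cutpoint evaluation described in the Methods (DXA index values below a cutpoint read as 'sarcopenic', scored against protein depletion, protein index < 0.77, as the reference) can be sketched as follows; the function name and data values are illustrative, not study data:

```python
# Sketch of the sensitivity/specificity calculation described in the
# Methods: reference = protein depletion (protein index < 0.77), test =
# DXA value below the published cutpoint. Inputs are illustrative.

def sens_spec(dxa_values, protein_indices, cutpoint, depletion_cutoff=0.77):
    """Return (sensitivity, specificity) of a DXA cutpoint for depletion."""
    tp = fn = tn = fp = 0
    for dxa, pi in zip(dxa_values, protein_indices):
        depleted = pi < depletion_cutoff    # reference: malnourished
        sarcopenic = dxa < cutpoint         # test: below DXA cutpoint
        if depleted:
            tp += sarcopenic
            fn += not sarcopenic
        else:
            fp += sarcopenic
            tn += not sarcopenic
    return tp / (tp + fn), tn / (tn + fp)

# Example: a cutpoint of 7 separates the depleted (5, 6) from the
# non-depleted (8, 9) values perfectly, giving sensitivity = specificity = 1.
sens, spec = sens_spec([5, 8, 6, 9], [0.70, 0.90, 0.70, 0.90], 7)
```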
The Studenski ASMM cutpoints in Table 1 yielded unacceptably low sensitivity (41-55%).</p><p><b>Conclusion:</b> These findings suggest that the use of the DXA-derived Baumgartner/Newman bone-corrected ASMI cutpoints offers acceptable validity in the diagnosis of malnutrition by GLIM in patients with liver cirrhosis. However, given that it is not common practice to make this correction for wet bone mass in DXA measures of ASMI, the application of these cutpoints to standard uncorrected measures of ASMI by DXA would likely yield much lower sensitivity, suggesting that many individuals with low muscularity and malnutrition would be misdiagnosed as non-malnourished.</p><p><b>Table 1.</b> Evaluation of Selected Published Cut-Points for Dual-Energy X-Ray Absorptiometry Appendicular Skeletal Muscle Index to Identify Protein Depletion in Patients with Liver Cirrhosis.</p><p></p><p>Abbreviations: DXA, dual-energy X-ray absorptiometry; M, male; F, female; GLIM, Global Leadership Initiative on Malnutrition; EWGSOP, European Working Group on Sarcopenia in Older People; AWGS, Asia Working Group for Sarcopenia; FNIH, Foundation for the National Institutes of Health; ASMM, appendicular skeletal muscle mass in kg determined from DXA-measured lean soft tissue of the arms and legs; ASMI, ASMM indexed to height in meters-squared; ASMM-BC, ASMM corrected for wet bone mass according to Heymsfield et al. 1990; ASMI-BC, ASMI corrected for wet bone mass.</p><p><b>Critical Care and Critical Health Issues</b></p><p>Amir Kamel, PharmD, FASPEN<sup>1</sup>; Tori Gray, PharmD<sup>2</sup>; Cara Nys, PharmD, BCIDP<sup>3</sup>; Erin Vanzant, MD, FACS<sup>4</sup>; Martin Rosenthal, MD, FACS, FASPEN<sup>1</sup></p><p><sup>1</sup>University of Florida, Gainesville, FL; <sup>2</sup>Cincinnati Children's, Gainesville, FL; <sup>3</sup>Orlando Health, Orlando, FL; <sup>4</sup>Department of Surgery, Division of Trauma and Acute Care Surgery, College of Medicine, University of 
Florida, Gainesville, FL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Amino acids (AAs) serve different purposes in our body, including structural, enzymatic, and integral cellular functions. Amino acid utilization and demand may vary between healthy and disease states. Certain conditions such as chronic kidney disease or short bowel syndrome can affect plasma AA levels. Previous research has identified citrulline as a marker of intestinal function and absorptive capacity. Stressors such as surgery or trauma can alter AA metabolism, potentially leading to a hypercatabolic state and changes in the available AA pool. The primary objective of this study is to compare AA levels in patients who have undergone abdominal surgery with those who have not. The secondary endpoint is to describe post-surgical complications and correlate plasma AA levels with such complications.</p><p><b>Methods:</b> This study was a single-center retrospective analysis conducted between January 1, 2007, and March 15, 2019, of patients who were referred to the University of Florida Health Nutrition Support Team (NST) and had a routine metabolic evaluation with amino acid levels as part of the nutrition support consult. Amino acid data were excluded if specimens were deemed contaminated. Patients with genetic disorders were also excluded from the study. During the study period, the amino acid bioassay was performed using Bio-chrome ion-exchange chromatography (ARUP Laboratories, Salt Lake City, UT).</p><p><b>Results:</b> Of the 227 patients screened, 181 patients were included in the study (58 who underwent abdominal surgery and 123 who did not). The mean age, BMI and height of participants were 52.2 years, 25.1 kg/m<sup>2</sup> and 169 cm, respectively. 
Baseline characteristics were similar between the two groups. In the surgery arm, 31% had undergone a surgical procedure within a year of the index encounter, 86.5% retained their colon, and 69.2% had a bowel resection, with a mean of 147.6 cm of bowel remaining among those with a documented length of remaining bowel (36 out of 58). Postoperative small bowel obstruction, ileus, leak, abscess, bleeding and surgical site infection (SSI) occurred in 12.1%, 24%, 17.2%, 20.7%, 3.4% and 17.2% respectively. Among the 19 AAs evaluated, median citrulline and methionine levels were significantly different between the 2 groups (23 [14-35] vs 17 [11-23]; p = 0.0031, and 27 [20-39] vs 33 [24-51]; p = 0.0383). Alanine and arginine levels were associated with postoperative ileus, leucine levels correlated with SSI, and glutamic acid and glycine levels were linked to postoperative fistula formation.</p><p><b>Conclusion:</b> Most amino acid levels showed no significant differences between patients who underwent abdominal surgery and those who did not, except for citrulline and methionine. Specific amino acids, such as alanine, arginine, leucine, glutamic acid and glycine, may serve as early indicators of post-surgical complications; however, a larger prospective trial is warranted to validate our findings.</p><p>Kento Kurashima, MD, PhD<sup>1</sup>; Grace Trello<sup>1</sup>; James Fox<sup>1</sup>; Edward Portz<sup>1</sup>; Shaurya Mehta, BS<sup>1</sup>; Austin Sims<sup>1</sup>; Arun Verma, MD<sup>1</sup>; Chandrashekhara Manithody, PhD<sup>1</sup>; Yasar Caliskan, MD<sup>1</sup>; Mustafa Nazzal, MD<sup>1</sup>; Ajay Jain, MD, DNB, MHA<sup>1</sup></p><p><sup>1</sup>Saint Louis University, St. Louis, MO</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Marginal donor livers (MDLs) have been used for liver transplantation to address major organ shortages. However, MDLs are notably susceptible to ischemia/reperfusion injury (IRI). 
Recent investigations have highlighted ferroptosis, a recently described form of programmed cell death, as a potential contributor to IRI. We hypothesized that modulating ferroptosis with the iron chelator deferoxamine (DFO) could alter the course of IRI.</p><p><b>Methods:</b> Using our novel Perfusion Regulated Organ Therapeutics with Enhanced Controlled Testing (PROTECT) model (US provisional patent US63/136,165), six human MDLs (livers A to F) were procured and split into paired lobes. Simultaneous perfusion was performed on both lobes, with one lobe subjected to DFO while the other served as an internal control. Histology, serum chemistry, expression of ferroptosis-associated genes, determination of iron accumulation, and measurement of lipid peroxidation were performed.</p><p><b>Results:</b> Histological analysis revealed severe macrovesicular steatosis (>30%) in livers A and D, while livers B and E exhibited mild to moderate macrovesicular steatosis. Most samples showed mild inflammation, predominantly in zone 3. No significant necrosis was noted during perfusion. Perls' Prussian blue stain and non-heme iron quantification demonstrated suppression of iron accumulation in livers A to D with DFO treatment (p < 0.05). Based on the degree of iron chelation, the 12 lobes were categorized into two groups: lobes with decreased iron (n = 4) and those with increased iron (n = 8). Comparative analysis demonstrated that ferroptosis-associated genes (HIF1-alpha, RPL8, IREB2, ACSF2, NQO1) were significantly downregulated in the former (p = 0.0338, p = 0.0085, p = 0.0138, p = 0.0138, p = 0.0209, respectively). Lipid peroxidation was significantly suppressed in lobes with decreased iron (p = 0.02). While serum AST was lower in iron-chelated lobes, this did not reach statistical significance.</p><p><b>Conclusion:</b> This study affirmed that iron accumulation was driven by normothermic perfusion. 
Reduction of iron content suppressed ferroptosis-associated genes and lipid peroxidation, mitigating IRI. Our results using human MDLs revealed a novel relationship between iron content and ferroptosis, providing a solid foundation for future development of IRI therapeutics.</p><p>Gabriella ten Have, PhD<sup>1</sup>; Macie Mackey, BSc<sup>1</sup>; Carolina Perez, MSc<sup>1</sup>; John Thaden, PhD<sup>1</sup>; Sarah Rice, PhD<sup>1</sup>; Marielle Engelen, PhD<sup>1</sup>; Nicolaas Deutz, PhD, MD<sup>1</sup></p><p><sup>1</sup>Texas A&M University, College Station, TX</p><p><b>Financial Support:</b> Department of Defense: CDMRP PR190829 - ASPEN Rhoads Research Foundation - C. Richard Fleming Grant & Daniel H. Teitelbaum Grant.</p><p><b>Background:</b> Sepsis is a potentially life-threatening complication of infection in critically ill patients and is characterized by severe tissue breakdown in several organs, leading to long-term muscle weakness, fatigue, and reduced physical activity. Recent guidelines suggest increasing protein intake gradually in the first days of recovery from a septic event. It is, however, unclear how this affects whole-body protein turnover. Therefore, in an acute sepsis-recovery pig model, we studied whole-body protein metabolism in the early sepsis recovery phase after restricted feeding with a balanced meal of amino acids (AA).</p><p><b>Methods:</b> In 25 catheterized pigs (± 25 kg), sepsis was induced by IV infusion of live Pseudomonas aeruginosa bacteria (5 × 10<sup>8</sup> CFU/hour). At t = 9 h, recovery was initiated with IV administration of gentamicin. Post sepsis, food was given twice daily in incremental amounts (Day 1: 25%, Day 2: 50%, Day 3: 75%, Day >4: 100%). The 100% meals contained, per kg BW, 15.4 g CHO and 3.47 g fat, plus a balanced free AA mixture reflecting the muscle AA profile (0.56 g N = 3.9 g AA). Before sepsis (Baseline) and on recovery day 3, an amino acid stable isotope mixture was administered IV as a pulse (post-absorptive). 
Subsequently, arterial blood samples were collected in the post-absorptive state for 2 hours. Amino acid concentrations and enrichments were determined with LCMS. Statistics: RM-ANOVA, <i><b>α</b></i> = 0.05.</p><p><b>Results:</b> At day 3, animal body weight had decreased (by 2.4 [0.9, 3.9]%; p = 0.0025). Compared to baseline values, plasma AA concentration profiles were changed. Overall, the total non-essential AA plasma concentration did not change. Essential AA plasma concentrations of histidine, leucine, methionine, phenylalanine, tryptophan, and valine were lower (p < 0.05) and lysine was higher (p = 0.0027); isoleucine did not change. We observed lower whole-body production (WBP) of the non-essential amino acids arginine (p < 0.0001), glutamine (p < 0.0001), glutamate (p < 0.0001), glycine (p < 0.0001), hydroxyproline (p = 0.0041), ornithine (p = 0.0003), taurine (p < 0.0001), and tyrosine (p < 0.0001). Citrulline production did not change. In addition, lower WBP was observed for the essential amino acids isoleucine (p = 0.0002), leucine (p < 0.0001), valine (p < 0.0001), methionine (p < 0.0001), tryptophan (p < 0.0001), and lysine (p < 0.0001). Whole-body protein breakdown and protein synthesis were also lower (p < 0.0001), while net protein breakdown did not change.</p><p><b>Conclusion:</b> Our sepsis-recovery pig model suggests that food restriction in the early phase of sepsis recovery leads to diminished protein turnover.</p><p>Gabriella ten Have, PhD<sup>1</sup>; Macie Mackey, BSc<sup>1</sup>; Carolina Perez, MSc<sup>1</sup>; John Thaden, PhD<sup>1</sup>; Sarah Rice, PhD<sup>1</sup>; Marielle Engelen, PhD<sup>1</sup>; Nicolaas Deutz, PhD, MD<sup>1</sup></p><p><sup>1</sup>Texas A&M University, College Station, TX</p><p><b>Financial Support:</b> Department of Defense: CDMRP PR190829 - ASPEN Rhoads Research Foundation - C. Richard Fleming Grant & Daniel H. 
Teitelbaum Grant.</p><p><b>Background:</b> Sepsis is a potentially life-threatening complication of infection in critically ill patients. It is characterized by severe tissue breakdown in several organs, leading to long-term muscle weakness, fatigue, and reduced physical activity (ICU-AW). In an acute sepsis-recovery ICU-AW pig model, we studied whether meals that contain only essential amino acids (EAA) can restore the metabolic deregulations during sepsis recovery, as assessed by comprehensive metabolic phenotyping<sup>1</sup>.</p><p><b>Methods:</b> In 49 catheterized pigs (± 25 kg), sepsis was induced by IV infusion of live Pseudomonas aeruginosa bacteria (5 × 10<sup>8</sup> CFU/hour). At t = 9 h, recovery was initiated with IV administration of gentamicin. Post sepsis, food was given twice daily, blindly and incrementally (Day 1: 25%, Day 2: 50%, Day 3: 75%, Day >4: 100%). The 100% meals contained, per kg BW, 15.4 g CHO and 3.47 g fat, plus 0.56 g N of an EAA mixture (reflecting muscle protein EAA; 4.3 g AA) or a control mixture (TAA; 3.9 g AA). Before sepsis (Baseline) and on recovery day 7, an amino acid stable isotope mixture was administered IV as a pulse (post-absorptive). Subsequently, arterial blood samples were collected for 2 hours. AA concentrations and enrichments were determined with LCMS. Statistics: RM-ANOVA, <i><b>α</b></i> = 0.05.</p><p><b>Results:</b> A body weight reduction was found after sepsis, which was restored by day 7 post sepsis. Compared to baseline, in the EAA group increased muscle fatigue (p < 0.0001), tau-methylhistidine whole-body production (WBP) (reflecting myofibrillar muscle breakdown; p < 0.0001), and whole-body net protein breakdown (p < 0.0001) were observed, but to a lesser degree in the control group (muscle fatigue: p < 0.0001; tau-methylhistidine: p = 0.0531; net protein breakdown: p < 0.0001). 
In addition, on day 7 lower WBP was observed for glycine (p < 0.0001), hydroxyproline (p < 0.0001), glutamate (p < 0.0001), glutamine (p < 0.0001), and taurine (p < 0.0001); the reductions were smaller in the control group for glycine (p = 0.0014), hydroxyproline (p = 0.0007), and glutamate (p = 0.0554), and larger for glutamine (p = 0.0497) and taurine (p < 0.0001). In addition, the WBP of citrulline (p = 0.0011) was increased on day 7, but less so in the control group (p = 0.0078). Higher plasma concentrations of asparagine (p < 0.0001), citrulline (p < 0.0001), glutamine (p = 0.0001), tau-methylhistidine (p = 0.0319), serine (p < 0.0001), taurine (p < 0.0001), and tyrosine (p < 0.0001) were observed in the EAA group. In the EAA group, clearance was lower (p < 0.05), except for glycine, tau-methylhistidine, and ornithine.</p><p><b>Conclusion:</b> Our sepsis-recovery pig ICU-AW model shows that feeding EAA-only meals after sepsis is associated with increased muscle and whole-body net protein breakdown and affects non-EAA metabolism. We hypothesize that non-essential amino acids in post-sepsis nutrition are needed to improve protein anabolism.</p><p>Rebecca Wehner, RD, LD, CNSC<sup>1</sup>; Angela Parillo, MS, RD, LD, CNSC<sup>1</sup>; Lauren McGlade, RD, LD, CNSC<sup>1</sup>; Nan Yang, RD, LD, CNSC<sup>1</sup>; Allyson Vasu-Sarver, MSN, APRN-CNP<sup>1</sup>; Michele Weber, DNP, RN, APRN-CNS, APRN-NP, CCRN, CCNS, OCN, AOCNS<sup>1</sup>; Stella Ogake, MD, FCCP<sup>1</sup></p><p><sup>1</sup>The Ohio State University Wexner Medical Center, Columbus, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Patients in the intensive care unit, especially those on mechanical ventilation, frequently receive inadequate enteral nutrition (EN) therapy. Critically ill patients receive on average 40-50% of prescribed nutritional requirements, while ASPEN/SCCM guidelines encourage that efforts be made to provide > 80% of goal energy and protein needs. 
One method to help achieve these goals is the use of volume-based feedings (VBF). At our institution, an hourly rate-based feeding (RBF) approach is standard. In 2022, our Medical Intensive Care Unit (MICU) operations committee inquired about the potential benefits of implementing VBF. Before changing our practice, we collected data to assess our current performance in meeting EN goals and to identify the reasons for interruptions. While the literature suggests that VBF is relatively safe in terms of EN complications compared to RBF, to our knowledge there is currently no information on the safety of starting VBF in patients at risk of gastrointestinal (GI) intolerance. Therefore, we also sought to determine whether EN is more frequently held due to GI intolerance versus operative procedures.</p><p><b>Methods:</b> We conducted a retrospective evaluation of EN delivery compared to the EN goal, and of the reason for interruption if EN delivery was below goal, in the MICU of a large tertiary academic medical center. We reviewed ten days of information on any MICU patient on EN. One day constituted the total EN volume received, in milliliters, from 0700-0659 hours. Using the QI Assessment Form, we collected the following data: goal EN volume in 24 hours, volume received in 24 hours, percent volume received versus prescribed, and hours feeds were held (or run below the goal rate) each day. The reasons for holding tube feeds were divided into six categories based on underlying causes: feeding initiation/titration, GI issues (constipation, diarrhea, emesis, nausea, distention, high gastric residual volume), operative procedure, non-operative procedure, mechanical issue, and practice issues. Data was entered on a spreadsheet, and descriptive statistics were used to evaluate results.</p><p><b>Results:</b> MICU patients receiving EN were observed over ten random days in a two-week period in August 2022. Eighty-two patients were receiving EN. 
Three hundred and four EN days were observed. The average percent EN delivered was 70% among all patients. EN was withheld for the following reasons: 34 cases (23%) were related to feeding initiation, 55 (37%) to GI issues, 19 (13%) to operative procedures, 32 (22%) to non-operative procedures, 2 (1%) to mechanical issues, and 5 (3%) to practice issues. VBF could have been considered in 51 cases (35%).</p><p><b>Conclusion:</b> These results suggest that EN delivery in our MICU is most often below the prescribed amount due to GI issues and feeding initiation; together, they comprised 89 cases (60%). VBF protocols would not improve delivery in either case. VBF would likely lead to increased discomfort in patients experiencing GI issues, and feeding initiation can be improved with changes to advancement protocols. Because VBF had potential benefit in only 35% of cases, and above-average EN delivery was observed, this protocol was not implemented in the observed MICU.</p><p>Delaney Adams, PharmD<sup>1</sup>; Brandon Conaway, PharmD<sup>2</sup>; Julie Farrar, PharmD<sup>3</sup>; Saskya Byerly, MD<sup>4</sup>; Dina Filiberto, MD<sup>4</sup>; Peter Fischer, MD<sup>4</sup>; Roland Dickerson, PharmD<sup>3</sup></p><p><sup>1</sup>Regional One Health, Memphis, TN; <sup>2</sup>Veterans Affairs Medical Center, Memphis, TN; <sup>3</sup>University of Tennessee College of Pharmacy, Memphis, TN; <sup>4</sup>University of Tennessee College of Medicine, Memphis, TN</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> Society for Critical Care Medicine 54th Annual Critical Care Congress. 
February 23 to 25, 2025, Orlando, FL.</p><p><b>Publication:</b> Critical Care Medicine. 2025;53(1):In press.</p><p><b>Financial Support:</b> None Reported.</p><p><b>Best of ASPEN-Critical Care and Critical Health Issues</b></p><p>Megan Beyer, MS, RD, LDN<sup>1</sup>; Krista Haines, DO, MA<sup>2</sup>; Suresh Agarwal, MD<sup>2</sup>; Hilary Winthrop, MS, RD, LDN, CNSC<sup>2</sup>; Jeroen Molinger, PhDc<sup>3</sup>; Paul Wischmeyer, MD, EDIC, FCCM, FASPEN<sup>4</sup></p><p><sup>1</sup>Duke University School of Medicine - Department of Anesthesiology, Durham, NC; <sup>2</sup>Duke University School of Medicine, Durham, NC; <sup>3</sup>Duke University Medical Center - Department of Anesthesiology - Duke Heart Center, Raleigh, NC; <sup>4</sup>Duke University Medical School, Durham, NC</p><p><b>Financial Support:</b> Baxter, Abbott.</p><p><b>Background:</b> Resting energy expenditure (REE) is critical in managing nutrition in intensive care unit (ICU) patients. Accurate energy needs assessments are vital for optimizing nutritional interventions. However, little is known about how different disease states specifically influence REE in ICU patients. Existing energy recommendations are generalized and do not account for the metabolic variability across disease types. Indirect calorimetry (IC) is considered the gold standard for measuring REE but is underutilized. This study addresses this gap by analyzing REE across disease states using metabolic cart assessments in a large academic medical center. The findings are expected to inform more precise, disease-specific nutritional recommendations in critical care.</p><p><b>Methods:</b> This is a pooled analysis of patients enrolled in four prospective clinical trials evaluating ICU patients across a range of disease states. Patients included in this analysis were admitted to the ICU with diagnoses of COVID-19, respiratory failure, cardiothoracic (CT) surgery, trauma, or surgical intensive care conditions. 
All patients underwent IC within 72 hours of ICU admission to assess REE, with follow-up measurements conducted when patients were medically stable. In each study, patients were managed under standard ICU care protocols, and nutritional interventions were individualized or standardized based on clinical trial protocols. The primary outcome was the measured REE, expressed in kcal/day and normalized to body weight (kcal/kg/day). Summary statistics for demographic and clinical characteristics, such as age, gender, height, weight, BMI, and comorbidities, were reported. Comparative analyses across the five disease states were performed using ANOVA tests to determine the significance of differences in REE.</p><p><b>Results:</b> The analysis included 165 ICU patients. The cohort had a mean age of 58 years, with 58% male and 42% female. Racial demographics included 36% black, 52% white, and 10% from other backgrounds. Patients in the surgical ICU group had the lowest caloric requirements, averaging 1503 kcal/day, while COVID-19 patients had the highest, at 1982 kcal/day. CT surgery patients measured 1644 kcal/day, respiratory failure patients 1763 kcal/day, and trauma patients 1883 kcal/day. ANOVA analysis demonstrated statistically significant differences in REE between these groups (p < 0.001). When normalized to body weight (kcal/kg/day), REE ranged from 20.3 to 23.5 kcal/kg/day, with statistically significant differences between disease states (p < 0.001).</p><p><b>Conclusion:</b> This study reveals significant variability in REE across different disease states in ICU patients, highlighting the need for disease-specific energy recommendations. These findings indicate that specific disease processes, such as COVID-19 and trauma, may increase metabolic demands, while patients recovering from surgical procedures may have comparatively lower energy needs. 
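The weight-normalized values above are obtained by dividing the measured REE by actual body weight; a minimal sketch of that arithmetic (the 90 kg weight here is hypothetical, not patient data from the study):

```python
def ree_per_kg(ree_kcal_day, weight_kg):
    """Normalize a measured resting energy expenditure to kcal/kg/day."""
    return ree_kcal_day / weight_kg

# Hypothetical example: 1982 kcal/day measured in a 90 kg patient
# falls near 22 kcal/kg/day, within the 20.3-23.5 range reported above.
value = ree_per_kg(1982, 90)
```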
These findings emphasize the importance of individualized nutritional interventions based on a patient's disease state to optimize recovery and clinical outcomes and to prevent underfeeding or overfeeding, which can adversely affect patient outcomes. The results suggest IC should be more widely implemented in ICU settings to guide precise and effective nutrition delivery based on real-time metabolic data rather than relying on standard predictive equations. Further research is needed to refine these recommendations and explore continuous monitoring of REE and tailored nutrition needs in the ICU.</p><p><b>Table 1.</b> Demographic and Clinical Characteristics.</p><p></p><p><b>Table 2.</b> Disease Group Diagnoses.</p><p></p><p></p><p><b>Figure 1.</b> Average Measured Resting Energy Expenditure by Disease Group.</p><p>Hailee Prieto, MA, RD, LDN, CNSC<sup>1</sup>; Emily McDermott, MS, RD, LDN, CNSC<sup>2</sup></p><p><sup>1</sup>Northwestern Memorial Hospital, Shorewood, IL; <sup>2</sup>Northwestern Memorial Hospital, Chicago, IL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Communication between Registered Dietitians (RDs) and CTICU teams is non-standardized and often ineffective in supporting proper implementation of RD recommendations. Late or absent RD support can impact the quality of nutrition care provided to patients. In FY23, the CTICU nutrition consult/risk turnaround time within 24 hrs was 58%, and missed nutrition consults/risks were 9%. Our goal was to improve the RD consult/risk turnaround time within 24 hrs from 58% to 75%, based on our department goal, and missed RD consults/risks from 9% to 6%, by standardizing communication between RDs and CTICU APRNs. The outcome metrics were nutrition risk turnaround time and nutrition consult turnaround time. The process metric was our percent of RD presence in rounds.</p><p><b>Methods:</b> We used the DMAIC model to attempt to solve our communication issue in the CTICU. 
We took the voice of the customer by surveying the CTICU APRNs and found that a barrier was the RDs' limited presence in the CTICU. We found that the CTICU APRNs find it valuable to have an RD rounding daily with their team. We then did a literature search on RDs rounding in the ICU, specifically cardiac/thoracic ICUs, and found that critically ill cardiac surgery patients are at high risk of developing malnutrition; however, initiation of medical nutrition therapy and overall adequacy of nutrition provision are lower compared to noncardiac surgical or MICU patients. RD-written orders directly improve outcomes in patients receiving nutrition support. To have the most influence, RDs need to be present in the ICU and be involved when important decisions are being made. Dietitian involvement in the ICU team significantly improves the team's ability to implement prompt, relevant nutrition support interventions. We used process mapping to address the overlap between rounding times on the step-down cardiac floors and in the ICU. We optimized the RDs' daily schedules to allow attendance at as many rounds as possible, including the CTICU rounds. We then implemented a new rounding structure within the Cardiac Service Line based on a literature search for the standard of care and the RD role in ICU rounding.</p><p><b>Results:</b> Our nutrition consult/risk turnaround time within 24 hrs increased by 26 percentage points (58% to 84%), and missed consults/risks decreased to 1%, exceeding both goals. The number of nutrition interventions we were able to implement increased with more RDs attending rounds, which was tracked after implementation of the RD rounding structure within the CTICU. 
The comparison of implemented interventions between 1 and 2 RDs was skewed because, on days when there was only 1 RD, that RD attempted to round with both teams.</p><p><b>Conclusion:</b> Communication between the CTICU team and Clinical Nutrition continues to improve, with consistent positive feedback from the ICU providers regarding the new rounding structure. The new workflow was implemented in the Clinical Nutrition Cardiac Service Line. As a future opportunity, other ICU teams at NMH that, due to RD staffing, do not have a dedicated RD rounding with them could also benefit from a dedicated RD in their rounds daily.</p><p><b>Table 1.</b> New Rounding Structure.</p><p></p><p>*Critical Care Rounds; Green: Attend; Gold: Unable to attend.</p><p><b>Table 2.</b> Control Plan.</p><p></p><p></p><p><b>Figure 1.</b> Results Consult Risk Turn Around Time Pre & Post Rounding.</p><p></p><p><b>Figure 2.</b> Number of Implemented RD Nutrition Interventions by Number of RDs Rounding.</p><p>Kenny Ngo, PharmD<sup>1</sup>; Rachel Leong, PharmD<sup>2</sup>; Vivian Zhao, PharmD<sup>2</sup>; Nisha Dave, PharmD<sup>2</sup>; Thomas Ziegler, MD<sup>2</sup></p><p><sup>1</sup>Emory Healthcare, Macon, GA; <sup>2</sup>Emory Healthcare, Atlanta, GA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Micronutrients play a crucial role in biochemical processes in the body. During critical illness, micronutrient status can be affected by factors such as disease severity and medical interventions. Extracorporeal membrane oxygenation (ECMO) is a vital supportive therapy that has seen increased utilization for critically ill patients with acute severe refractory cardiorespiratory failure. The potential alterations in micronutrient status and requirements during ECMO are an area of significant interest, but data are limited. 
This study aimed to determine the incidence of micronutrient depletion in critically ill patients requiring ECMO.</p><p><b>Methods:</b> A retrospective chart review was conducted for patients with at least one micronutrient level measured in blood while receiving ECMO between January 1, 2015, and September 30, 2023. Chart reviews were completed using the Emory Healthcare electronic medical record system after the Emory University Institutional Review Board approved the study. A full waiver for informed consent and authorization was approved for this study. Data on demographic characteristics, ECMO therapy-related information, and reported micronutrient levels were collected. Descriptive statistics were used to evaluate the data.</p><p><b>Results:</b> A total of 77 of the 128 reviewed patients met inclusion criteria and were included in data analysis (Table 1). The average age of patients was 49 years, and 55.8% were female. The average duration of ECMO was 14 days, and the average length of stay in the intensive care unit was 46.7 days. Among the included patients, 56 required continuous renal replacement therapy (CRRT) along with ECMO. Patients who required CRRT received empiric standard supplementation of folic acid (1 mg), pyridoxine (200 mg), and thiamine (100 mg) every 48 hours. Of the 77 patients, 44% had below-normal blood levels of at least one of the measured micronutrients. The depletion percentages of various nutrients were as follows: vitamin C (80.6%), vitamin D (75.0%), iron (72.7%), copper (53.3%), carnitine (31.3%), selenium (30.0%), pyridoxine (25.0%), folic acid (18.8%), vitamin A (10.0%), zinc (7.1%), and thiamine (3.7%) (Table 2). Measured vitamin B12, manganese, and vitamin E levels were within normal limits.</p><p><b>Conclusion:</b> This study demonstrated that 60% of patients on ECMO had orders to evaluate at least one micronutrient in their blood. Of these, almost half had at least one micronutrient level below normal limits. 
These findings underscore the need for regular micronutrient monitoring in critically ill patients. Prospective studies are needed to understand the impact of ECMO on micronutrient status, determine the optimal time for evaluation, and assess the need for and efficacy of supplementation in these patients.</p><p><b>Table 1.</b> General Demographic and ECMO Characteristics (N = 77).</p><p></p><p><b>Table 2.</b> Observed Micronutrient Status during ECMO for Critically Ill Patients.</p><p></p><p>Diane Nowak, RD, LD, CNSC<sup>1</sup>; Mary Kronik, RD, LD, CNSC<sup>2</sup>; Caroline Couper, RD, LD, CNSC<sup>3</sup>; Mary Rath, MEd, RD, LD, CNSC<sup>4</sup>; Ashley Ratliff, MS, RD, LD, CNSC<sup>4</sup>; Eva Leszczak-Lesko, BS Health Sciences, RRT<sup>4</sup></p><p><sup>1</sup>Cleveland Clinic, Elyria, OH; <sup>2</sup>Cleveland Clinic, Olmsted Twp, OH; <sup>3</sup>Cleveland Clinic, Rocky River, OH; <sup>4</sup>Cleveland Clinic, Cleveland, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Indirect calorimetry (IC) is the gold standard for the accurate determination of energy expenditure. The team performed a comprehensive literature review of current IC practices across the nation, which showed that facilities employing IC typically follow a standard protocol dictated by length of stay (LOS). Intensive Care Unit (ICU) Registered Dietitians (RDs) have directed IC intervention to reduce reliance on inaccurate predictive equations and to judiciously identify patients (1, 2) with the assistance of an IC order-based practice. While IC candidacy is determined by clinical criteria, implementation has been primarily dictated by RD time constraints. Our project aims to include IC in our standard of care by using a standardized process for implementation.</p><p><b>Methods:</b> To implement IC at our 1,299-bed quaternary care hospital, including 249 ICU beds, a multidisciplinary team including ICU RDs and Respiratory Therapists (RTs) partnered with a physician champion. 
Three Cosmed QNRG+ indirect calorimeters were purchased after a 6-month trial period. Due to the potential for rapid clinical status changes and RD staffing, the ICU team selected an order-based practice as opposed to a protocol. The order is signed by a Licensed Independent Practitioner (LIP) and includes three components: an order for IC with indication, a nutrition assessment consult, and a conditional order for the RD to release to RT once testing is approved. After the order is signed, the RD collaborates with the Registered Nurse and RT by verifying standardized clinical criteria to assess IC candidacy. If appropriate, the RD will release the order for RT prior to testing to allow for documentation of ventilator settings. To begin the test, the RD enters patient information and calibrates the pneumotach, after which the RT secures the ventilator connections. Next, the RD starts the test and remains at bedside for the standardized 20-minute duration to ensure steady state is not interrupted. Once testing is completed, the best 5-minute average is selected to obtain the measured resting energy expenditure (mREE). The RD interprets the results considering a multitude of factors and, if warranted, modifies nutrition interventions.</p><p><b>Results:</b> Eight ICU registered dietitians completed 87 IC measurements from May 2024 through August 2024, which included patients across various ICUs. All 87 patients were selected by the RD due to concerns for over- or underfeeding. Eighty-three percent of the measurements were valid tests, and seventy-nine percent of the measurements led to intervention modifications. The amount of face-to-face time spent was 66 hours and 45 minutes, or an average of 45 minutes per test. Additional time spent interpreting results and making modifications to interventions ranged from 15 to 30 minutes.</p><p><b>Conclusion:</b> IC has the ability to capture accurate energy expenditures in the critically ill.
RD-directed, order-based IC practice has allowed for the successful introduction of IC at our institution. The future transition from limited IC implementation to a standard of care will be dependent on the consideration of numerous challenges, including RD time constraints and patient volumes amid the ebbs and flows of critical care. To align with the ever-changing dynamics of critical care, staffing levels and workflows are being actively evaluated.</p><p><b>Table 1.</b> Indirect Calorimetry (IC) Checklist.</p><p></p><p></p><p><b>Figure 1.</b> IC Result with Invalid Test.</p><p></p><p><b>Figure 2.</b> IC Result with Valid Test.</p><p></p><p><b>Figure 3.</b> IC Indications and Contraindications.</p><p></p><p><b>Figure 4.</b> IC EPIC Order.</p><p>Rebecca Frazier, MS, RD, CNSC<sup>1</sup>; Chelsea Heisler, MD, MPH<sup>1</sup>; Bryan Collier, DO, FACS, FCCM<sup>1</sup></p><p><sup>1</sup>Carilion Roanoke Memorial Hospital, Roanoke, VA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Adequate energy intake with appropriate macronutrient composition is an essential part of patient recovery; however, predictive equations have been found to be of variable accuracy. Indirect calorimetry (IC) gives insight into the primary nutrition substrate being utilized as metabolic fuel and caloric needs, often identifying over- and under-feeding. Though IC is considered the gold standard for determining resting energy expenditure, it has challenges with cost, equipment feasibility, and time constraints with personnel, namely respiratory therapy (RT). Our hypothesis: Registered Dietitian (RD)-led IC tests can be conducted in a safe and feasible manner without undue risk of complication. In addition, IC will show a higher caloric need, particularly in patients who require at least seven days of ventilatory support.</p><p><b>Methods:</b> A team of RDs screened surgical ICU patients at a single institution.
Patients intubated for at least 3 days were considered eligible for testing. Exclusion criteria included a PEEP > 10, fraction of inspired oxygen >60%, Richmond Agitation Sedation Scale ≥ 1, chest tube air leak, extracorporeal membrane oxygenation use, and >1°C temperature change in 1 hour. Tests were completed using the Q-NRG+ Portable Metabolic Monitor (Baxter) based on RD and patient availability. Test results were compared to calculated needs based on the Penn State Equation (39 tests). Overfeeding/underfeeding was defined as >15% deviation from the equation results. The mean difference in energy needs was analyzed using a standard paired, two-tailed t-test for ≤7 total ventilated days and >7 ventilated days.</p><p><b>Results:</b> Thirty patients underwent IC testing; a total of 39 tests were completed. There were no complications in RD-led IC testing, and minimal RT involvement was required after the first 5 tests were completed. Overall, 56.4% of all IC tests, regardless of ventilator days, indicated overfeeding. In addition, 33.3% of tests indicated appropriate feeding (85-115% of calculated REE), and 10.3% of tests demonstrated underfeeding. When stratified by ventilator days (>7 d vs. ≤7 d), similar results were found, noting 66% of IC tests deviated >15% from calculated caloric needs; 54.4-60.0% were overfed and 12.5-6.7% were underfed per equation estimates, respectively.</p><p><b>Conclusion:</b> Equations estimating caloric needs provide inconsistent results. Nutritional equations under- and overestimate nutritional needs similarly, regardless of ventilatory days, compared to IC. Despite the lack of statistical significance, the effects of poor nutrition are well documented and vastly clinically significant. With minimal training, IC can be performed safely with an RD and bedside RN. Utilizing the RD to coordinate and perform IC testing is a feasible process that maximizes personnel efficiency and allows for immediate adjustment in the nutrition plan.
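The feeding-adequacy rule stated in the Methods (measured REE within 85-115% of the Penn State estimate counts as appropriate; >15% deviation as over- or underfeeding) can be sketched as a small classifier. The function name and return labels are illustrative, not the study's code.

```python
def classify_feeding(measured_ree, predicted_ree):
    """Apply the >15% deviation rule described above.

    If the predictive equation exceeds measured needs by >15%, following
    the equation would overfeed the patient ('overfeeding'); if it falls
    >15% short, it would underfeed ('underfeeding'); otherwise feeding is
    'appropriate' (measured REE within 85-115% of the estimate).
    """
    ratio = measured_ree / predicted_ree
    if ratio < 0.85:
        return "overfeeding"    # equation over-predicts measured needs
    if ratio > 1.15:
        return "underfeeding"   # equation under-predicts measured needs
    return "appropriate"
```

For example, a measured REE of 1600 kcal against a 2000 kcal equation estimate (ratio 0.80) would be counted in the overfeeding group.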
IC as the gold standard for nutrition estimation should be performed on surgical ICU patients to assist in developing nutritional treatment algorithms.</p><p>Dolores Rodríguez<sup>1</sup>; Mery Guerrero<sup>2</sup>; María Centeno<sup>2</sup>; Barbara Maldonado<sup>2</sup>; Sandra Herrera<sup>2</sup>; Sergio Santana<sup>3</sup></p><p><sup>1</sup>Ecuadorian Society for the Fight against Cancer, Guayaquil, Guayas; <sup>2</sup>SOLCA, Guayaquil, Guayas; <sup>3</sup>University of Havana, La Habana, Ciudad de la Habana</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> In 2022, the International Agency for Research on Cancer-Globocan reported nearly 20 million new cases of cancer worldwide, with 30,888 cases in Ecuador. Breast, prostate, and stomach cancers were the most diagnosed types. Oncohematological diseases (OHD) significantly affect the nutritional status of patients. The ELAN Ecuador-2014 study, involving over 5,000 patients, found malnutrition in 37% of participants overall, rising to 65% among those with OHD. The Latin American Study of Malnutrition in Oncology (LASOMO), conducted by FELANPE between 2019 and 2020, revealed a 59.1% frequency of malnutrition among 1,842 patients across 52 health centers in 10 Latin American countries. This study aims to present the current state of malnutrition associated with OHD among patients treated in Ecuadorian hospitals.</p><p><b>Methods:</b> The Ecuadorian segment of the LASOMO Study was conducted between 2019 and 2020, as part of the previously mentioned regional epidemiological initiative. This study was designed as a one-day, nationwide, multicenter survey involving health centers and specialized services for patients with Oncohematological diseases (OHD) across five hospitals located in the provinces of Guayas (3), Manabí (1), and Azuay (1). 
The nutritional status of patients with Oncohematological diseases (OHD) was assessed using the B + C scores from Detsky et al.'s Subjective Global Assessment (SGA). This study included male and female patients aged 18 years and older, admitted to clinical, surgical, intensive care, and bone marrow transplant (BMT) units during October and November 2019. Participation was voluntary, and patients provided informed consent by signing a consent form. Data were analyzed using location, dispersion, and aggregation statistics based on variable types. The nature and strength of relationships were assessed using chi-square tests for independence, with a significance level of < 5% to identify significant associations. Odds ratios for malnutrition were calculated along with their associated 95% confidence intervals.</p><p><b>Results:</b> The study enrolled 390 patients, with 63.6% women and 36.4% men, averaging 55.3 ± 16.5 years old; 47.2% were aged 60 years or older. The most common tumor locations included kidneys, urinary tract, uterus, ovaries, prostate, and testicles, accounting for 18.7% of all cases (refer to Table 1). Chemotherapy was the predominant oncological treatment, administered to 42.8% of the patients surveyed. Malnutrition affected 49.7% of the patients surveyed, with 14.4% categorized as severely malnourished (see figure 1). The incidence of malnutrition was found to be independent of age, educational level, tumor location, and current cytoreductive treatment (refer to Table 2). Notably, the majority of the malnourished individuals were men.</p><p><b>Conclusion:</b> Malnutrition is highly prevalent in patients treated for OHD in Ecuadorian hospitals.</p><p><b>Table 1.</b> Most Frequent Location of Neoplastic Disease in Hospitalized Ecuadorian Patients and Type of Treatment Received. 
The number and (in brackets) the percentage of patients included in the corresponding category are presented.</p><p></p><p><b>Table 2.</b> Distribution of Malnutrition Associated with Cancer According to Selected Demographic, Clinical, and Health Characteristics of Patients Surveyed During the Oncology Malnutrition Study in Ecuador. The number and (in brackets) the percentage of malnourished patients included in each characteristic category are presented. The frequency of malnutrition was estimated using the Subjective Global Assessment (Detsky et al., 1987).</p><p></p><p></p><p><b>Figure 1.</b> State of Malnutrition Among Patients Treated for Cancer in Hospitals in Ecuador.</p><p>Ranna Modir, MS, RD, CNSC, CDE, CCTD<sup>1</sup>; Christina Salido, RD<sup>1</sup>; William Hiesinger, MD<sup>2</sup></p><p><sup>1</sup>Stanford Healthcare, Stanford, CA; <sup>2</sup>Stanford Medicine, Stanford, CA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Cardiovascular Intensive Care Unit (CVICU) patients are at high risk for significant nutrient deficits, especially during the early intensive care unit (ICU) phase, when postoperative complications and hemodynamic instability are prevalent. These deficits can exacerbate the catabolic state, leading to muscle wasting, impaired immune function, and delayed recovery. A calorie deficit > 10,000 kcal, with < 80% of nutritional needs met in the early ICU phase (first 14 days), has been linked to worse outcomes, including prolonged intubation, increased ICU length of stay (LOS), and higher risk of organ dysfunction and infections like central line-associated bloodstream infections (CLABSI). When evaluating CLABSI risk factors, the role of nutrition adequacy and malnutrition is often underestimated and overlooked, with more emphasis placed on the type of nutrition support (NS) provided, whether enteral nutrition (EN) or parenteral nutrition (PN).
Historically, there has been a practice of avoiding PN to reduce CLABSI risk, rather than ensuring that nutritional needs are fully met. This practice is based on initial reports from decades ago linking PN with hospital-acquired infections. However, updated guidelines based on modern data now indicate no difference in infectious outcomes with EN vs PN. In fact, adding PN when EN alone is insufficient can help reduce nutritional deficiencies, supporting optimal immune response and resilience to infection. As part of an ongoing NS audit in our CVICU, we reviewed all CLABSI cases over a 23-month period to assess nutrition adequacy and the modality of NS (EN vs PN) provided.</p><p><b>Methods:</b> Data were extracted from electronic medical records for all CLABSI cases from September 2020 to July 2022. Data collected included patient characteristics and clinical and nutrition outcomes (Table 1). Descriptive statistics (means, standard deviations, frequencies) were calculated. Chi-square or Fisher's exact test assessed the association between type of NS and meeting >80% of calorie/protein targets within 14 days and until CLABSI onset. A significance level of 0.05 was used.</p><p><b>Results:</b> Over the 23-month period, 28/51 (54.9%) patients with a CLABSI required exclusive NS throughout their entire CVICU stay (n = 18 male, 64.3%; median age 54.5 years; mean BMI 27.4; median CVICU LOS 49.5 days; 46.4% mortality rate). Surgical intervention was indicated in 60.7% of patients, with 41.2% requiring preoperative extracorporeal membrane oxygenation (ECMO) and 52.9% postoperative ECMO (Table 1). The majority of patients received exclusive EN (53.6%), with 46.4% receiving EN + PN and 0% exclusive PN. In the first 14 ICU days, 21.4% met >80% of calorie needs and 32.1% met >80% of protein needs, with 32.1% having a calorie deficit >10,000 kcal. There was no difference between type of NS and ability to meet >80% of nutrient targets in the first 14 days (Table 1; p = 0.372, p = 0.689).
The majority of PN (61.5%) was initiated after ICU day 7. From ICU day 1 until CLABSI onset, the EN + PN group was better able to meet >80% of calorie targets than the exclusive EN group (p = 0.016). Half (50%) of the patients were diagnosed with malnutrition; 82% required ECMO cannulas and 42.9% a dialysis triple-lumen catheter. Enterococcus faecalis was the most common organism for both the EN (43.7%) and EN + PN (35.7%) groups (Table 2).</p><p><b>Conclusion:</b> This single-center analysis of CVICU CLABSI patients found that the majority of patients requiring exclusive NS failed to meet >80% of nutrition needs during the early ICU phase. Exclusive EN was the primary mode of NS compared to EN + PN or PN alone, challenging the assumption that PN inherently increases CLABSI risk. In fact, EN + PN improved the ability to meet calorie targets until CLABSI onset. These findings suggest that early nutrient deficits may increase CLABSI risk and that the risk is not dependent on the type of NS provided.</p><p><b>Table 1.</b> Patient Characteristics, Clinical and Nutritional Outcomes.</p><p></p><p><b>Table 2.</b> Type of Central Access Device and Microorganism in Relation to Modality of Nutrition Support Provided.</p><p></p><p>Oki Yonatan, MD<sup>1</sup>; Faya Nuralda Sitompul<sup>2</sup></p><p><sup>1</sup>ASPEN, Jakarta, Jakarta Raya; <sup>2</sup>Osaka University, Minoh, Osaka</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Ginseng, widely used as a functional food or therapeutic supplement in Asia, contains bioactive compounds such as ginsenosides, which exert a range of biological effects, including hypoglycemic, anti-inflammatory, cardioprotective, and anti-tumor properties. However, studies have indicated that ginseng also has anticoagulant and anti-aggregation effects and may be associated with bleeding. This case report presents a potential case of ginseng-induced bleeding in an elderly patient with advanced pancreatic cancer.
</p><p><b>Case Description:</b> A 76-year-old male with stage IV pancreatic cancer and metastases to the liver, lymph nodes, and peritoneum was in home care with BIPAP ventilation, NGT feeding, ascites drainage, and a Foley catheter. He had a history of type A aortic dissection repair, anemia, and thrombocytopenia, with platelet counts consistently below 50,000/µL. Despite no history of anticoagulant use, the patient developed massive gastrointestinal bleeding and hematuria after consuming 100 grams of American Ginseng (AG) per day for five days. Disseminated intravascular coagulation (DIC) was initially suspected, but no signs of bleeding were observed until the third week of care, coinciding with ginseng consumption. Endoscopy was not performed due to the patient's unstable condition and the family's refusal.</p><p><b>Discussion:</b> The consumption of AG may have triggered bleeding due to the patient's already unstable condition and low platelet count. Ginsenosides, particularly Rg1, Rg2, and Rg3, have been shown to exert anticoagulant effects, prolonging clotting times and inhibiting platelet aggregation. Studies have demonstrated that AG extracts can significantly extend clotting times and reduce platelet activity, potentially contributing to the observed bleeding.</p><p><b>Conclusion:</b> This case highlights the possible role of AG in inducing severe bleeding in a patient with pancreatic cancer and thrombocytopenia.
Given ginseng's known anticoagulant properties, caution should be exercised when administering it to patients with hematological abnormalities or bleeding risks, and further research is warranted to assess its safety in these populations.</p><p><b>Methods:</b> None Reported.</p><p><b>Results:</b> None Reported.</p><p><b>Conclusion:</b> None Reported.</p><p>Kursat Gundogan, MD<sup>1</sup>; Mary Nellis, PhD<sup>2</sup>; Nurhayat Ozer, PhD<sup>3</sup>; Sahin Temel, MD<sup>3</sup>; Recep Yuksel, MD<sup>4</sup>; Murat Sungar, MD<sup>5</sup>; Dean Jones, PhD<sup>2</sup>; Thomas Ziegler, MD<sup>6</sup></p><p><sup>1</sup>Division of Clinical Nutrition, Erciyes University Health Sciences Institute, Kayseri; <sup>2</sup>Emory University, Atlanta, GA; <sup>3</sup>Erciyes University Health Sciences Institute, Kayseri; <sup>4</sup>Department of Internal Medicine, Erciyes University School of Medicine, Kayseri; <sup>5</sup>Department of Internal Medicine, Erciyes University School of Medicine, Kayseri; <sup>6</sup>Emory Healthcare, Atlanta, GA</p><p><b>Financial Support:</b> Erciyes University Scientific Research Committee (TSG- 2021–10078) and TUBITAK 2219 program (1059B-192000150), each to KG, and National Institutes of Health grant P30 ES019776, to DPJ and TRZ.</p><p><b>Background:</b> Metabolomics represents a promising profiling technique for the investigation of significant metabolic alterations that arise in response to critical illness. The present study utilized plasma high-resolution metabolomics (HRM) analysis to define systemic metabolism associated with common critical illness severity scores in critically ill adults.</p><p><b>Methods:</b> This cross-sectional study was performed at Erciyes University Hospital, Kayseri, Turkiye, and Emory University, Atlanta, GA, USA. Participants were critically ill adults with an expected length of intensive care unit stay longer than 48 h. Plasma for metabolomics was obtained on the day of ICU admission.
Data were analyzed using regression analysis of two ICU admission illness severity scores (APACHE II and mNUTRIC) against all plasma metabolomic features in metabolome-wide association studies (MWAS). APACHE II score was analyzed as a continuous variable, and mNUTRIC score was analyzed as a dichotomous variable [≤4 (low) vs. > 4 (high)]. Pathway enrichment analysis was performed on significant metabolites (raw p < 0.05) related to each of the two illness severity scores independently.</p><p><b>Results:</b> A total of 77 patients were included. The mean age was 69 years (range 33-92 years); 65% were female. More than 15,000 metabolomic features were identified for the MWAS of the APACHE II and mNUTRIC scores. Metabolic pathways significantly associated with APACHE II score at ICU admission included C21-steroid hormone biosynthesis and urea cycle, vitamin E, selenoamino acid, aspartate/asparagine, and thiamine metabolism. Metabolic pathways associated with mNUTRIC score at ICU admission were N-glycan degradation and metabolism of fructose/mannose, vitamin D3, pentose phosphate, sialic acid, and linoleic acid. Within the significant pathways, the gut microbiome-derived metabolites hippurate and N-acetyl ornithine were downregulated, and creatine and glutamate were upregulated with increasing APACHE II scores.
Metabolites involved in energy metabolism that were altered with a high (> 4) mNUTRIC score included N-acetylglucosamine (increased) and gluconate (decreased).</p><p><b>Conclusion:</b> Plasma HRM identified significant associations between two commonly used illness severity scores and metabolic processes involving steroid biosynthesis, the gut microbiome, skeletal muscle, and amino acid, vitamin, and energy metabolism in adult critically ill patients.</p><p>Hilary Winthrop, MS, RD, LDN, CNSC<sup>1</sup>; Megan Beyer, MS, RD, LDN<sup>2</sup>; Jeroen Molinger, PhDc<sup>3</sup>; Suresh Agarwal, MD<sup>4</sup>; Paul Wischmeyer, MD, EDIC, FCCM, FASPEN<sup>5</sup>; Krista Haines, DO, MA<sup>4</sup></p><p><sup>1</sup>Duke Health, Durham, NC; <sup>2</sup>Duke University School of Medicine- Department of Anesthesiology, Durham, NC; <sup>3</sup>Duke University Medical Center - Department of Anesthesiology - Duke Heart Center, Raleigh, NC; <sup>4</sup>Duke University School of Medicine, Durham, NC; <sup>5</sup>Duke University Medical School, Durham, NC</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Little evidence currently exists on the effect of body mass index (BMI) on resting energy expenditure (REE). Current clinical practice guidelines are based primarily on expert opinion and provide a wide range of calorie recommendations without delineations specific to BMI categories, making it difficult for the clinician to accurately prescribe calorie needs for their hospitalized patients. This abstract utilizes metabolic cart data from studies conducted at a large academic healthcare system to investigate trends within BMI and REE.</p><p><b>Methods:</b> A pooled cohort of hospitalized patients was compiled from three clinical trials where metabolic cart measurements were collected. In all three studies, indirect calorimetry was initially conducted in the intensive care setting, with follow up measurements conducted as clinically able. 
Variables included in the analysis were measured resting energy expenditure (mREE) in total kcals as well as kcals per kilogram of body weight on ICU admission, respiratory quotient, days post ICU admission, and demographics and clinical characteristics. ANOVA tests were utilized to analyze continuous data.</p><p><b>Results:</b> A total of 165 patients were included in the final analysis with 338 indirect calorimetry measurements. Disease groups included COVID-19 pneumonia, non-COVID respiratory failure, surgical ICU, cardiothoracic surgery, and trauma. The average age of patients was 58 years, with 96 males (58.2%) and 69 females (41.8%), and an average BMI of 29.0 kg/m<sup>2</sup>. The metabolic cart measurements on average were taken on day 8 post ICU admission (ranging from day 1 to day 61). See Table 1 for more demographics and clinical characteristics. Indirect calorimetry measurements were grouped into three BMI categories: BMI ≤ 29.9 (normal), BMI 30-39.9 (obese), and BMI ≥ 40 (super obese). ANOVA analyses showed statistical significance amongst the three BMI groups in both total kcals (p < 0.001) and kcals per kg (p < 0.001). The normal BMI group had an average mREE of 1632 kcals (range of 767 to 4023), compared to 1868 kcals (range of 1107 to 3754) in the obese BMI group, and 2004 kcals (range of 1219 to 3458) in the super obese BMI group. Similarly, when analyzing kcals per kg, the normal BMI group averaged 23.3 kcals/kg, the obese BMI group 19.8, and the super obese BMI group 16.3.</p><p><b>Conclusion:</b> Without access to a metabolic cart to accurately measure REE, the majority of nutrition clinicians are left to rely on estimates. Current clinical guidelines and published data do not provide the guidance that is necessary to accurately feed many hospitalized patients.
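The BMI grouping used in this analysis (≤29.9, 30-39.9, ≥40) and the per-group mean mREE comparison can be sketched as follows; the sample measurements in the assertions are invented for illustration, not study data.

```python
from statistics import mean

def bmi_group(bmi):
    """Assign the three BMI categories used in the analysis above."""
    if bmi <= 29.9:
        return "normal"        # study label for BMI <= 29.9
    if bmi < 40:
        return "obese"         # BMI 30-39.9
    return "super obese"       # BMI >= 40

def group_mean_mree(measurements):
    """Mean measured REE (kcal) per BMI group.

    measurements: iterable of (bmi, mree_kcal) pairs, one pair per
    indirect calorimetry measurement.
    """
    groups = {}
    for bmi, mree in measurements:
        groups.setdefault(bmi_group(bmi), []).append(mree)
    return {group: mean(values) for group, values in groups.items()}
```

The resulting group means could then be compared with a one-way ANOVA, as done in the study.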
This current analysis only scratches the surface of the metabolic demands of different patient populations based on their BMI status, especially given the wide ranges of energy expenditures. Robust studies are needed to further elucidate the relationships between BMI and REE in different disease states.</p><p><b>Table 1.</b> Demographics and Clinical Characteristics.</p><p></p><p></p><p><b>Figure 1.</b> Average Measured Resting Energy Expenditure (in Total Kcals), by BMI Group.</p><p></p><p><b>Figure 2.</b> Average Measured Resting Energy Expenditure (in kcals/kg), by BMI Group.</p><p>Carlos Reyes Torres, PhD, MSc<sup>1</sup>; Daniela Delgado Salgado, Dr<sup>2</sup>; Sergio Diaz Paredes, Dr<sup>1</sup>; Sarish Del Real Ordoñez, Dr<sup>1</sup>; Eva Willars Inman, Dr<sup>1</sup></p><p><sup>1</sup>Hospital Oncológico de Coahuila (Oncological Hospital of Coahuila), Saltillo, Coahuila de Zaragoza; <sup>2</sup>ISSSTE, Saltillo, Coahuila de Zaragoza</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Chemotherapy is one of the principal treatments for cancer. Some degree of toxicity has been described in 98% of patients. Changes in body composition are frequent and are related to worse outcomes. Low muscle mass is associated with chemotherapy toxicity in observational studies. Phase angle (PhA) is an indicator of cell integrity and correlates positively with adequate nutritional status and muscle mass. Few studies have evaluated the association of PhA with chemotherapy toxicity. The aim of this study was to evaluate the association of PhA and body composition with chemotherapy toxicity in cancer patients.</p><p><b>Methods:</b> A prospective cohort study was conducted in adult patients with solid neoplastic disease receiving first-line systemic treatment. The subjects were evaluated at the first chemotherapy treatment using bioelectrical impedance analysis with an RJL device according to the standardized technique.
The outcome was chemotherapy toxicity during the first 4 cycles of chemotherapy and its association with PhA and body composition. Toxicity was evaluated using the National Cancer Institute (NCI) Common Terminology Criteria for Adverse Events, version 5.0. A PhA < 4.7 was considered low according to other studies.</p><p><b>Results:</b> A total of 54 patients were evaluated and included in the study. The most common cancer diagnoses were breast cancer (40%), gastrointestinal tumors (33%), and lung cancer (18%). Chemotherapy toxicity occurred in 46% of the patients. The most common adverse effects were gastrointestinal (48%), blood disorders (32%), and metabolic disorders (40%). There were statistically significant differences in PhA between patients with chemotherapy toxicity and patients without adverse effects: 4.45° (3.08-4.97) vs 6.07° (5.7-6.2), respectively; p value < 0.001. PhA was associated with the risk of chemotherapy toxicity (HR 8.7, 95% CI 6.1-10.7; log-rank test p = 0.02).</p><p><b>Conclusion:</b> PhA was associated with the risk of chemotherapy toxicity in cancer patients.</p><p>Lizl Veldsman, RD, M Nutr, BSc Dietetics<sup>1</sup>; Guy Richards, MD, PhD<sup>2</sup>; Carl Lombard, PhD<sup>3</sup>; Renée Blaauw, PhD, RD<sup>1</sup></p><p><sup>1</sup>Division of Human Nutrition, Department of Global Health, Faculty of Medicine & Health Sciences, Stellenbosch University, Cape Town, Western Cape; <sup>2</sup>Department of Surgery, Division of Critical Care, Faculty of Health Sciences, University of the Witwatersrand, Johannesburg, Gauteng; <sup>3</sup>Division of Epidemiology and Biostatistics, Department of Global Health, Faculty of Medicine and Health Sciences, Stellenbosch University, Cape Town, Western Cape</p><p><b>Financial Support:</b> Fresenius Kabi JumpStart Research Grant.</p><p><b>Background:</b> Critically ill patients lose a significant amount of muscle mass over the first ICU week.
We sought to determine the effect of bolus amino acid (AA) supplementation on the urea-to-creatinine ratio (UCR) trajectory over time, and whether UCR correlates with histology myofiber cross-sectional area (CSA) as a potential surrogate marker of muscle mass.</p><p><b>Methods:</b> This was a secondary analysis of data from a registered clinical trial (ClinicalTrials.gov NCT04099108) undertaken in a predominantly trauma surgical ICU. Participants were randomly assigned into two groups, both of which received standard care nutrition (SCN) and mobilisation. Study participants randomised to the intervention group also received a bolus AA supplement, with a 45-minute in-bed cycling session, from average ICU day 3 for a mean of 6 days. The change in vastus lateralis myofiber CSA, measured from pre-intervention (average ICU day 2) to post-intervention (average ICU day 8), was assessed through biopsy and histological analysis. A linear mixed-effects regression model was used to compare the mean daily UCR profiles between study groups from ICU day 0 to day 10. As a sensitivity analysis, we adjusted for disease severity on admission (APACHE II and SOFA scores), daily fluid balance, and the presence of acute kidney injury (AKI). A Spearman correlation compared the UCR on ICU day 2 (pre-intervention) and ICU day 8 (post-intervention) with the corresponding myofiber CSA.</p><p><b>Results:</b> A total of 50 enrolled participants were randomised to the study intervention and control groups in a 1:1 ratio. The control and intervention groups received, on average, 87.62 ± 32.18 and 85.53 ± 29.29 grams of protein per day (1.26 ± 0.41 and 1.29 ± 0.40 g/kg/day, respectively) from SCN, and the intervention group an additional 30.43 ± 5.62 grams of AA (0.37 ± 0.06 g/kg protein equivalents) from the AA supplement. At baseline, the mean UCR values for the control (75.6 ± 31.5) and intervention (63.8 ± 27.1) groups were similar.
Mean UCR increased daily from baseline in both arms, but at a faster rate in the intervention arm with a significant intervention effect (p = 0.0127). The UCR for the intervention arm at day 7 and 8 was significantly higher by 21 and 22 units compared to controls (p = 0.0214 and p = 0.0215, respectively). After day 7 the mean daily UCR plateaued in the intervention arm, but not in the controls. Adjusting for disease severity, daily fluid balance, and AKI did not alter the intervention effect. A significant negative association was found between UCR and myofiber CSA (r = -0.39, p = 0.011) at ICU day 2 (pre-intervention), but not at day 8 (post-intervention) (r = 0.23, p = 0.153).</p><p><b>Conclusion:</b> Bolus amino acid supplementation significantly increases the UCR during the first ICU week, thereafter plateauing. UCR at baseline may be an indicator of muscle status.</p><p></p><p><b>Figure 1.</b> Change in Urea-to-Creatinine Ratio (UCR) Over ICU Days in the Control and Intervention Group. Error bars Represent 95% Confidence Intervals (CIs).</p><p>Paola Renata Lamoyi Domínguez, MSc<sup>1</sup>; Iván Osuna Padilla, PhD<sup>2</sup>; Lilia Castillo Martínez, PhD<sup>3</sup>; Josué Daniel Cadeza-Aguilar, MD<sup>2</sup>; Martín Ríos-Ayala, MD<sup>2</sup></p><p><sup>1</sup>UNAM, National Autonomous University of Mexico, Mexico City, Distrito Federal; <sup>2</sup>National Institute of Respiratory Diseases, Mexico City, Distrito Federal; <sup>3</sup>National Institute of Medical Sciences and Nutrition Salvador Zubirán, Mexico City, Distrito Federal</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Non-defecation (ND) is highly prevalent in critically ill patients on mechanical ventilation (MV) and has been reported in up to 83% of cases. This disorder is associated with high morbidity and mortality rates. 
Most existing research has focused on the association between clinical data and non-defecation; however, there is a lack of evidence regarding its association with dietary factors. We aimed to analyze the association of dietary fiber in enteral nutrition and the amount of fluids administered through enteral and parenteral routes with defecation during the first 6 days of MV in critically ill patients with pneumonia and other lung manifestations.</p><p><b>Methods:</b> We conducted a longitudinal analysis of MV patients receiving enteral nutrition (EN) at a tertiary care hospital in Mexico City between May 2023 and April 2024. The inclusion criteria were age >18 years, MV, admission to the respiratory intensive care unit (ICU) for pneumonia or other lung manifestations, and nutritional assessment performed during the first 24 h after ICU admission. Patients who required parenteral nutrition or major surgery, or who had traumatic brain injury or neuromuscular disorders, were excluded from this study. A nutritional assessment, including the NUTRIC score, SOFA and APACHE II assessments, and an estimation of energy-protein requirements, was performed by trained dietitians within the first 24 hours after ICU admission. During each day of follow-up (0 to 6 days), we recorded the amount of fiber provided in EN, the volume of infusion fluids via the enteral and parenteral routes, and medical prescriptions of opioids, sedatives, neuromuscular blockers, and vasopressors. ND was defined as >6 days without defecation from ICU admission. Differences between the ND and defecation groups were also assessed. Associations of ND with dietary factors were examined using discrete-time survival analysis.</p><p><b>Results:</b> Seventy-four patients were included; ND was observed in 40 patients (54%). The non-defecation group had a longer ICU length of stay, and 50% of this group did not have a first defecation until day 10.
No differences in fiber provision or volume of infusion fluids were observed between the groups. In multivariate analysis, no associations between ND and fiber (fiber intake 10 to 20 g per day; OR 1.17, 95% CI 0.41-3.38, p = 0.29) or total fluids (fluid intake 25 to 30 mL/kg/d; OR 1.85, 95% CI 0.44-7.87, p = 0.404) were observed.</p><p><b>Conclusion:</b> Non-defecation affected 54% of the study population. Although fiber and fluids are considered a treatment for non-defecation, we did not find an association in critically ill patients.</p><p><b>Table 1.</b> Demographic and Clinical Characteristics by Groups.</p><p></p><p><b>Table 2.</b> Daily Comparison of Dietary Factors.</p><p></p><p>Andrea Morand, MS, RDN, LD<sup>1</sup>; Osman Mohamed Elfadil, MBBS<sup>1</sup>; Kiah Graber, RDN<sup>1</sup>; Yash Patel, MBBS<sup>1</sup>; Suhena Patel, MBBS<sup>1</sup>; Chloe Loersch, RDN<sup>1</sup>; Isabelle Wiggins, RDN<sup>1</sup>; Anna Santoro, MS, RDN<sup>1</sup>; Natalie Johnson, MS<sup>1</sup>; Kristin Eckert, MS, RDN<sup>1</sup>; Dana Twernbold, RDN<sup>1</sup>; Dacia Talmo, RDN<sup>1</sup>; Elizabeth Engel, RRT, LRT<sup>1</sup>; Avery Erickson, MS, RDN<sup>1</sup>; Alex Kirby, MS, RDN<sup>1</sup>; Mackenzie Vukelich, RDN<sup>1</sup>; Kate Sandbakken, RDN<sup>1</sup>; Victoria Vasquez, RDN<sup>1</sup>; Manpreet Mundi, MD<sup>1</sup></p><p><sup>1</sup>Mayo Clinic, Rochester, MN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Current guidelines for nutrition support in critically ill patients recommend the utilization of indirect calorimetry (IC) to determine energy needs. However, IC testing is limited at many institutions due to accessibility, labor, and costs, which leads to reliance on predictive equations to determine caloric targets. 
A quality improvement (QI) initiative was implemented to assess the impact of routinely completed IC on nutrition care.</p><p><b>Methods:</b> A prospective QI project in selected medical and surgical intensive care units (ICU) included critically ill patients assessed by the dietitian within 24-48 hours of consult order or by hospital day 4. Patients with contraindications to IC were excluded, including those requiring ECMO, CRRT, MARS, or FiO2 > 55, as well as spontaneously breathing patients requiring significant supplemental oxygen. The initial caloric target was established utilizing predictive equations, which is the standard of care at our institution. After day 4 of the ICU stay, IC measurements, predictive equations, and weight-based nomograms were collected. The predictive equations utilized included Harris-Benedict (HB) - basal, adjusted HB (75% of basal when body mass index (BMI) > 30), Penn State if ventilated, Mifflin St. Jeor (MSJ), revised HB, and revised HB adjusted (75% of basal when BMI > 30). Additional demographic, anthropometric, and clinical data were collected.</p><p><b>Results:</b> Patients (n = 85) were predominantly male (n = 53, 62.4%), admitted to the surgical ICU (n = 57, 67.1%), and overweight (mean BMI 29.8 kg/m²); the average age was 61.3 years (SD 16.5). At the time of the IC test, the median ICU length of stay was 6 days; 77.6% (n = 66) were supported with mechanical ventilation, and median ventilator days were 4 days (Table 1). Mean IC-measured REE was compared to predictive equations, showing that except for weight-based nomogram high caloric needs (p = 0.3615), all equations were significantly lower than IC (p < 0.0001). Median absolute differences from IC for the evaluated predictive equations are shown in Figure 1. After REE was measured, caloric goals increased significantly for patients on enteral (EN) and parenteral nutrition (PN) (p = 0.0016 and p = 0.05, respectively). 
In enterally fed patients, the mean calorie goal before REE measurement was 1655.4 kcal (SD 588) and after was 1917.6 kcal (SD 528.6), an average increase of 268.4 kcal/day; in parenterally fed patients, the mean calorie goal before REE was 1395.2 kcal (SD 313.6) and after was 1614.1 kcal (SD 239.3), an average increase of 167.5 kcal (Table 2). The mean REE per BMI category per actual body weight was: BMI < 29.9, 25.7 ± 7.9 kcal/kg; BMI 30-34.9, 20.3 ± 3.8 kcal/kg; BMI 35-39.9, 22.8 ± 4.6 kcal/kg; and BMI ≥ 40, 16.3 ± 2.9 kcal/kg (25.4 ± 10.5 kcal/kg of ideal body weight). Figure 2 illustrates the average daily calorie need broken down by BMI for IC and the examined predictive equations.</p><p><b>Conclusion:</b> There was a significant difference between IC measurements and various predictive equations except for weight-based high-estimated calorie needs. Nutrition goals changed significantly in response to IC measurements. It is recommended that we expand the use of IC in the critically ill population at our institution. 
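The abstract does not specify the exact coefficients of the "revised" equation variants used in this QI project, but two of the named equations have well-known published forms; as a reference point, a sketch using the commonly cited coefficients (illustrative only, including the project's 75%-of-basal adjustment for BMI > 30):

```python
def mifflin_st_jeor(weight_kg, height_cm, age_yr, male):
    """Mifflin-St Jeor resting energy expenditure (kcal/day),
    standard published form."""
    sex_term = 5 if male else -161
    return 10 * weight_kg + 6.25 * height_cm - 5 * age_yr + sex_term


def harris_benedict(weight_kg, height_cm, age_yr, male, bmi=None):
    """Original Harris-Benedict basal equation (kcal/day), commonly
    cited coefficients. If bmi is given and > 30, applies the 75%
    adjustment described in the abstract ('adjusted HB')."""
    if male:
        ree = 66.47 + 13.75 * weight_kg + 5.003 * height_cm - 6.755 * age_yr
    else:
        ree = 655.1 + 9.563 * weight_kg + 1.850 * height_cm - 4.676 * age_yr
    if bmi is not None and bmi > 30:
        ree *= 0.75  # adjusted HB per the QI protocol above
    return ree
```

For a 60-year-old, 80 kg, 175 cm man, these yield roughly 1599 kcal (MSJ) and 1637 kcal (HB basal), illustrating the spread between equations that the IC comparison in this project quantified.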
In settings where IC is not possible, weight-based nomograms should be utilized.</p><p><b>Table 1.</b> Baseline Demographics and Clinical Characteristics.</p><p></p><p><b>Table 2.</b> Nutrition Support.</p><p></p><p></p><p><b>Figure 1.</b> Difference in Daily Calorie Estimation Utilizing IC Compared to Predictive Equations.</p><p></p><p><b>Figure 2.</b> RMR by IC and Other Predictive Equations by BMI.</p><p><b>GI, Obesity, Metabolic, and Other Nutrition Related Concepts</b></p><p>Suhena Patel, MBBS<sup>1</sup>; Osman Mohamed Elfadil, MBBS<sup>1</sup>; Yash Patel, MBBS<sup>1</sup>; Chanelle Hager, RN<sup>1</sup>; Manpreet Mundi, MD<sup>1</sup>; Ryan Hurt, MD, PhD<sup>1</sup></p><p><sup>1</sup>Mayo Clinic, Rochester, MN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Chronic capillary leak syndrome is a rare but potentially fatal side effect of immunotherapy for some malignancies, manifesting mainly as intractable generalized edema and often refractory hypotension. An idiopathic form of the syndrome is also recognized. It can be diagnosed by exclusion in patients with a single or recurrent episode of intravascular hypovolemia or generalized edema, primarily manifested by the diagnostic triad of hypotension, hemoconcentration, and hypoalbuminemia in the absence of an identifiable alternative cause. Supportive care with a role for steroids remains the standard treatment. In capillary leak syndrome secondary to cancer immunotherapy, discontinuing the offending agent is typically considered. This relatively rare syndrome can be associated with significant clinical challenges. This clinical case report focuses on aspects of nutrition care.</p><p><b>Methods:</b> A 45-year-old male with a past medical history of hypertension, pulmonary tuberculosis in childhood, and rectal cancer was admitted for evaluation of anasarca. He was first diagnosed with moderately differentiated invasive adenocarcinoma of the rectum, stage IIIb (cT3, cN1, cM0), in October 2022. 
As initial therapy, he was enrolled in a clinical trial. He received 25 cycles of immunotherapy with the study drug Vudalimab (PD1/CTLA4 bispecific antibody), achieving a complete clinical response without additional chemotherapy, radiation, or surgery. Unfortunately, he developed extensive capillary leak syndrome, manifested by recurrent anasarca, chylous ascites, and pleural effusions, beginning in November 2023. His treatment was also complicated by the development of thyroiditis and insulin-dependent diabetes. The patient most recently presented with abdominal fullness, ascites, and peripheral edema that did not improve despite diuretic therapy. A diagnostic and therapeutic paracentesis was performed, revealing chylous ascites. Two weeks later, the patient presented with re-accumulation of ascites and worsening anasarca with pleural and pericardial effusions. A PET-CT was negative for malignant lesions but revealed increased uptake along the peritoneal wall, suggestive of peritonitis. A lymphangiogram performed for further evaluation revealed no gross leak/obstruction; however, this study could not rule out a microleak from increased capillary permeability. He nonetheless required bilateral pleural and peritoneal drains (output ranged from 0.5 to 1 L daily). A diagnosis of capillary leak syndrome was made. In addition to octreotide, immunosuppression therapy was initiated with IV methylprednisolone (40 mg BID), followed by a transition to oral prednisone (60 mg PO); however, the patient's symptoms reappeared with the dose reduction and transition to oral steroids. His immunosuppression regimen was modified to include a trial of weekly IVIG and IV albumin twice daily. From a nutritional perspective, he was initially on a routine oral diet. However, his drain output increased, particularly after fatty food consumption, so he was switched to a low-fat (40 g/day), high-protein diet to prevent worsening chylous ascites. 
In the setting of worsening anasarca and moderate malnutrition based on ASPEN criteria, along with clinically significant muscle loss, he was started on TPN. A no-fat diet was initiated to minimize lymphatic flow, with subsequent improvement in his chest tube output volume, followed by a transition to home parenteral nutrition with a mixed-oil lipid emulsion and an oral diet.</p><p><b>Results:</b> None Reported.</p><p><b>Conclusion:</b> Chronic capillary/lymphatic leak syndrome can be challenging and may necessitate dietary modification. Along with dietary changes to significantly reduce oral fat intake, short- or long-term PN can be considered.</p><p>Kishore Iyer, MBBS<sup>1</sup>; Francisca Joly, MD, PhD<sup>2</sup>; Donald Kirby, MD, FACG, FASPEN<sup>3</sup>; Simon Lal, MD, PhD, FRCP<sup>4</sup>; Kelly Tappenden, PhD, RD, FASPEN<sup>2</sup>; Palle Jeppesen, MD, PhD<sup>5</sup>; Nader Youssef, MD, MBA<sup>6</sup>; Mena Boules, MD, MBA, FACG<sup>6</sup>; Chang Ming, MS, PhD<sup>6</sup>; Tomasz Masior, MD<sup>6</sup>; Susanna Huh, MD, MPH<sup>7</sup>; Tim Vanuytsel, MD, PhD<sup>8</sup></p><p><sup>1</sup>Icahn School of Medicine at Mount Sinai, New York, NY; <sup>2</sup>Nutritional Support, Hôpital Beaujon, Paris, Ile-de-France; <sup>3</sup>Department of Intestinal Failure and Liver Diseases, Cleveland, OH; <sup>4</sup>Salford Royal NHS Foundation Trust, Salford, England; <sup>5</sup>Department of Intestinal Failure and Liver Diseases, Rigshospitalet, Copenhagen, Denmark, Hovedstaden; <sup>6</sup>Ironwood Pharmaceuticals, Basel, Basel-Stadt; <sup>7</sup>Ironwood Pharmaceuticals, Boston, MA; <sup>8</sup>University Hospitals Leuven, Leuven, Brabant Wallon</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> American College of Gastroenterology 2024, October 25-30, 2024, Philadelphia, Pennsylvania.</p><p><b>Financial Support:</b> None Reported.</p><p><b>International Poster of Distinction</b></p><p>Francisca Joly, MD, PhD<sup>1</sup>; Tim Vanuytsel, MD, PhD<sup>2</sup>; Donald Kirby, MD, FACG, 
FASPEN<sup>3</sup>; Simon Lal, MD, PhD, FRCP<sup>4</sup>; Kelly Tappenden, PhD, RD, FASPEN<sup>1</sup>; Palle Jeppesen, MD, PhD<sup>5</sup>; Federico Bolognani, MD, PhD<sup>6</sup>; Nader Youssef, MD, MBA<sup>6</sup>; Carrie Li, PhD<sup>6</sup>; Reda Sheik, MPH<sup>6</sup>; Isabelle Statovci, BS, CH<sup>6</sup>; Susanna Huh, MD, MPH<sup>7</sup>; Kishore Iyer, MBBS<sup>8</sup></p><p><sup>1</sup>Nutritional Support, Hôpital Beaujon, Paris, Ile-de-France; <sup>2</sup>University Hospitals Leuven, Leuven, Brabant Wallon; <sup>3</sup>Department of Intestinal Failure and Liver Diseases, Cleveland, OH; <sup>4</sup>Salford Royal NHS Foundation Trust, Salford, England; <sup>5</sup>Department of Intestinal Failure and Liver Diseases, Rigshospitalet, Copenhagen, Denmark, Hovedstaden; <sup>6</sup>Ironwood Pharmaceuticals, Basel, Basel-Stadt; <sup>7</sup>Ironwood Pharmaceuticals, Boston, MA; <sup>8</sup>Icahn School of Medicine at Mount Sinai, New York, NY</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> Digestive Disease Week 2024, May 18 - 21, 2024, Washington, US.</p><p><b>Financial Support:</b> None Reported.</p><p>Tim Vanuytsel, MD, PhD<sup>1</sup>; Simon Lal, MD, PhD, FRCP<sup>2</sup>; Kelly Tappenden, PhD, RD, FASPEN<sup>3</sup>; Donald Kirby, MD, FACG, FASPEN<sup>4</sup>; Palle Jeppesen, MD, PhD<sup>5</sup>; Francisca Joly, MD, PhD<sup>3</sup>; Tomasz Masior, MD<sup>6</sup>; Patricia Valencia, PharmD<sup>7</sup>; Chang Ming, MS, PhD<sup>6</sup>; Mena Boules, MD, MBA, FACG<sup>6</sup>; Susanna Huh, MD, MPH<sup>7</sup>; Kishore Iyer, MBBS<sup>8</sup></p><p><sup>1</sup>University Hospitals Leuven, Leuven, Brabant Wallon; <sup>2</sup>Salford Royal NHS Foundation Trust, Salford, England; <sup>3</sup>Nutritional Support, Hôpital Beaujon, Paris, Ile-de-France; <sup>4</sup>Department of Intestinal Failure and Liver Diseases, Cleveland, OH; <sup>5</sup>Department of Intestinal Failure and Liver Diseases, Rigshospitalet, Copenhagen, Denmark, Hovedstaden; <sup>6</sup>Ironwood 
Pharmaceuticals, Basel, Basel-Stadt; <sup>7</sup>Ironwood Pharmaceuticals, Boston, MA; <sup>8</sup>Icahn School of Medicine at Mount Sinai, New York, NY</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> American College of Gastroenterology 2024, October 25-30, 2024, Philadelphia, Pennsylvania.</p><p><b>Financial Support:</b> None Reported.</p><p>Boram Lee, MD<sup>1</sup>; Ho-Seong Han, PhD<sup>1</sup></p><p><sup>1</sup>Seoul National University Bundang Hospital, Seoul, Seoul-t'ukpyolsi</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Pancreatic cancer is one of the most fatal malignancies, with a 5-year survival rate of less than 10%. Despite advancements in treatment, its incidence is increasing, driven by aging populations and rising obesity rates. Obesity is traditionally considered a negative prognostic factor for many cancers, including pancreatic cancer. However, the "obesity paradox" suggests that obesity might be associated with better outcomes in certain diseases. This study investigated the effect of obesity on the survival of long-term pancreatic cancer survivors after pancreatectomy.</p><p><b>Methods:</b> A retrospective analysis was conducted on 404 patients with pancreatic ductal adenocarcinoma (PDAC) who underwent surgery between January 2004 and June 2022. Patients were classified into the non-obese (BMI 18.5-24.9) (n = 313) and obese (BMI ≥ 25.0) (n = 91) groups. The data collected included demographic, clinical, perioperative, and postoperative information. Survival outcomes (overall survival [OS], recurrence-free survival [RFS], and cancer-specific survival [CSS]) were analyzed using Kaplan-Meier curves and Cox regression models. A subgroup analysis examined the impact of the visceral fat to subcutaneous fat ratio (VSR) on survival within the obese cohort.</p><p><b>Results:</b> Obese patients (n = 91) had a significantly better 5-year OS (38.9% vs. 27.9%, p = 0.040) and CSS (41.4% vs. 
33%, p = 0.047) than non-obese patients. RFS did not differ significantly between the groups. Within the obese cohort, a lower VSR was associated with improved survival (p = 0.012), indicating the importance of fat distribution in outcomes.</p><p><b>Conclusion:</b> Obesity is associated with improved overall and cancer-specific survival in patients with pancreatic cancer undergoing surgery, highlighting the potential benefits of a nuanced approach to managing obese patients. The distribution of adipose tissue, specifically higher subcutaneous fat relative to visceral fat, further influences survival, suggesting that tailored treatment strategies could enhance outcomes.</p><p>Nicole Nardella, MS<sup>1</sup>; Nathan Gilchrist, BS<sup>1</sup>; Adrianna Oraiqat, BS<sup>1</sup>; Sarah Goodchild, BS<sup>1</sup>; Dena Berhan, BS<sup>1</sup>; Laila Stancil, HS<sup>1</sup>; Jeanine Milano, BS<sup>1</sup>; Christina Santiago, BS<sup>1</sup>; Melissa Adams, PA-C<sup>1</sup>; Pamela Hodul, MD<sup>1</sup></p><p><sup>1</sup>Moffitt Cancer Center, Tampa, FL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Pancreatic cancer (PC) is a devastating diagnosis, with 66,440 new cases and 51,750 deaths estimated in 2024. The prevalence of malnutrition in patients with cancer has been reported to range from 30% to 85% depending on patient age, cancer type, and stage of disease. Specifically, PC patients frequently present with malnutrition, which can negatively affect quality of life and tumor therapy. We hypothesize that increased awareness of early nutritional intervention for PC patients has led to high utilization of dietitian consultations at our tertiary cancer center.</p><p><b>Methods:</b> This IRB-exempt retrospective review included newly diagnosed, treatment naïve PC patients presenting to our institution in 2021-2023 (n = 701). We define newly diagnosed as having pathologically confirmed adenocarcinoma within 30 days of clinic presentation. 
Patients were screened for weight loss and risk of malnutrition using the validated Malnutrition Screening Tool (MST) (89.5% positive predictive value) at the initial consultation and were referred to a dietitian based on risk or patient preference. Data was collected on demographics, disease stage (localized vs metastatic), tumor location (head/neck/uncinate, body/tail, multi-focal), presenting symptoms (percent weight loss, abdominal pain, bloating, nausea/vomiting, fatigue, change in bowel habits), experience of jaundice, pancreatitis, and gastric outlet obstruction, and dietitian consultation. Descriptive variables and Fisher's exact test were used to report outcomes.</p><p><b>Results:</b> The majority of patients were male (54%) with a median age of 70 (27-95). About half of the patients had localized disease (54%), with the primary tumor located in the head/neck/uncinate region (57%). Patients with head/neck/uncinate tumors mostly had localized disease (66%), while patients with body/tail tumors tended to have metastatic disease (63%). See Table 1 for further demographics. Unintentional weight loss was experienced by 66% of patients (n = 466), 69% of localized patients (n = 261), and 64% of metastatic patients (n = 205). Patients with localized disease reported a 12% weight loss over a median of 3 months, while metastatic patients reported a 10% weight loss over a median of 5 months. Of the localized patients, the majority presented with symptoms of abdominal pain (66%), nausea/vomiting/fatigue (61%), and change in bowel habits (44%). Presenting symptoms of the metastatic patients were similar (see Table 2). Tumor location was not significantly associated with presenting symptoms. Dietitian consults occurred for 67% (n = 473) of the patient population, 77% for those with localized disease and 57% for those with metastatic disease. 
Of those with reported weight loss, 74% (n = 343) had a dietitian consultation.</p><p><b>Conclusion:</b> Overall, a high proportion of newly diagnosed, treatment naïve PC patients presented with malnutrition. Patients with localized disease and tumors located in the head/neck/uncinate region experienced the most gastrointestinal symptoms (nausea, vomiting, change in bowel habits, and fatigue). Early implementation of a proactive nutritional screening program resulted in increased awareness of malnutrition and referral for nutritional intervention for newly diagnosed PC patients.</p><p><b>Table 1.</b> Demographics and Disease Characteristics.</p><p></p><p><b>Table 2.</b> Presenting Symptoms.</p><p></p><p>Nicole Nardella, MS<sup>1</sup>; Nathan Gilchrist, BS<sup>1</sup>; Adrianna Oraiqat, BS<sup>1</sup>; Sarah Goodchild, BS<sup>1</sup>; Dena Berhan, BS<sup>1</sup>; Laila Stancil, HS<sup>1</sup>; Jeanine Milano, BS<sup>1</sup>; Christina Santiago, BS<sup>1</sup>; Melissa Adams, PA-C<sup>1</sup>; Pamela Hodul, MD<sup>1</sup></p><p><sup>1</sup>Moffitt Cancer Center, Tampa, FL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Pancreatic cancer (PC) is an aggressive disease, with a 5-year survival rate of 13%. Symptoms occur late in the disease course, leading to approximately 50% of patients presenting with metastatic disease. New-onset diabetes is often one of the first symptoms of PC, with diagnosis occurring up to 3 years before cancer diagnosis. We hypothesize that increasing awareness of PC prevalence in diabetic patients, both new-onset and pre-existing, may lead to earlier PC diagnosis.</p><p><b>Methods:</b> This IRB-exempt retrospective review included new PC patients presenting to our institution in 2021-2023 with a diabetes diagnosis (n = 458). We define new-onset diabetes as having been diagnosed within 3 years prior to pathologically confirmed adenocarcinoma. 
We define newly diagnosed PC as having pathologically confirmed adenocarcinoma within 30 days of clinic presentation. Data was collected on demographics, staging (localized vs metastatic), tumor location (head/neck/uncinate, body/tail, multi-focal), treatment initiation, diabetes onset (new-onset vs pre-existing), diabetes regimen, and weight loss. Descriptive variables were used to report outcomes.</p><p><b>Results:</b> In the study period, 1,310 patients presented to our institution. Of those, 35% had a diabetes diagnosis (n = 458). The majority of patients were male (61%) with age at PC diagnosis of 69 (41-92). Patients mostly had localized disease (57%) with primary tumor location in the head/neck/uncinate region (59%). New-onset diabetes was present in 31% of diabetics, 11% of all new patients, with 63% having localized disease (79% head/neck/uncinate) and 37% metastatic (66% body/tail). Of those with pre-existing diabetes (69%), 54% had localized disease (69% head/neck/uncinate) and 46% had metastatic disease (53% body/tail). See Table 1 for further demographic/disease characteristics. Abrupt worsening of diabetes was seen in 10% (n = 31) of patients with pre-existing diabetes, and 12% had a change in their regimen prior to PC diagnosis. Hence 13% (175/1,310) of all new patients presented with either new-onset or worsening diabetes. Weight loss was present in 75% (n = 108) of patients with new-onset diabetes, with a median of 14% weight loss (3%-38%) over 12 months (1-24). Alternatively, weight loss was present in 66% (n = 206) of patients with pre-existing diabetes, with a median of 14% weight loss (4%-51%) over 6 months (0.5-18). Diabetes medication was as follows: 41% oral, 30% insulin, 20% both oral and insulin, 10% no medication. Of those patients with new-onset diabetes, 68% were diagnosed within 1 year of PC diagnosis and 32% were diagnosed within 1-3 years of PC diagnosis. 
Of those diagnosed with diabetes within 1 year of PC diagnosis, 68% had localized disease, with 81% having head/neck/uncinate tumors; of those with metastatic disease (31%), 73% had body/tail tumors. For patients with a diabetes diagnosis within 1-3 years of PC diagnosis, 52% had localized disease (75% head/neck/uncinate) and 48% had metastatic disease (59% body/tail). See Table 2 for further characteristics.</p><p><b>Conclusion:</b> Overall, approximately one-third of new patients presenting with PC at our institution had diabetes, and new-onset diabetes was present in one-third of those patients. The majority of diabetes patients presented with localized head/neck/uncinate tumors. When comparing new-onset vs pre-existing diabetes, patients with new-onset diabetes tended to experience greater weight loss over a longer time and had more localized disease than pre-existing diabetes patients. Patients with a diabetes diagnosis within 1 year of PC diagnosis had more localized disease (head/neck/uncinate). Hence, increased awareness of diabetes in relation to PC, particularly new-onset and worsening pre-existing diabetes, may lead to earlier diagnosis.</p><p><b>Table 1.</b> Demographics and Disease Characteristics.</p><p></p><p><b>Table 2.</b> New-Onset Diabetes Characteristics.</p><p></p><p>Marcelo Mendes, PhD<sup>1</sup>; Gabriela Oliveira, RD<sup>2</sup>; Ana Zanini, RD, MSc<sup>2</sup>; Hellin dos Santos, RD, MSc<sup>2</sup></p><p><sup>1</sup>Cicatripelli, Belém, Para; <sup>2</sup>Prodiet Medical Nutrition, Curitiba, Parana</p><p><b>Encore Poster</b></p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> According to the NPUAP, a pressure injury (PI) is damage that occurs to the skin and/or underlying soft tissue, primarily over bony prominences, and may also be related to the use of medical devices. PIs can range from intact skin to deeper ulcers, affecting structures such as muscles and bones. 
The aim of this study was to report the experience of using a specialized supplement for wound healing in the treatment of a PI.</p><p><b>Methods:</b> This is a case report based on the experience of the nurses from Cicatripelli®. Data were collected from May to July 2024 through a review of medical records and photographs showing the wound's progression. The patient was a 69-year-old woman with COPD, diabetes mellitus, and systemic arterial hypertension, who denied smoking and alcohol consumption. She developed a stage 4 PI in the sacral region after 15 days of mechanical ventilation due to bacterial pneumonia and was admitted to a private clinic for treatment on May 2, 2024. Initial wound assessment: Measurements: 16.5x13x4cm (WxLxD); abundant purulent exudate with a foul odor; intact peripheral skin; mild to moderate pain; 75% granulation tissue and 25% liquefactive necrosis (slough) (Figure 1). The wound was cleansed with 0.1% polyhexamethylene biguanide (PHMB) solution, followed by conservative debridement, primary coverage with hydrophiber with 1.2% silver, and secondary coverage with cotton gauze and transparent film, with dressing changes scheduled every 72 hours. Supplementation (Correctmax – Prodiet Medical Nutrition) started on 05/20/2024, with a dosage of 2 sachets per day, containing 10 g of collagen peptide, 3 g of L-arginine, 612 mg of vitamin A, 16 mg of vitamin E, 508 mg of vitamin C, 30 mcg of selenium, and 16 mg of zinc.</p><p><b>Results:</b> On the 17th day of supplementation, the hydrophiber with silver dressing was replaced with PHMB-impregnated gauze, as the wound showed no signs of infection and demonstrated significant clinical improvement. Measurements: 8x6x2cm (WxLxD); moderate serohematic exudate; intact peripheral skin; 100% granulation tissue; significant improvement in pain and odor (Figure 2). 
On the 28th day, the dressing was switched to calcium and sodium alginate to optimize exudate control due to the appearance of mild dermatitis. Low-intensity laser therapy was applied, and a skin protective spray was used. Wound assessment: Measurements: 7x5.5x1.5 cm (WxLxD), with maintained characteristics (Figure 3). On the 56th day, the patient returned for dressing change and discharge instructions, as she could not continue the treatment due to a lack of resources. The approach remained the same with dressing changes every 3 days. Wound assessment: Measurements: 5x3.5x0.5 cm (WxLxD), with approximately 92% reduction in wound area, epithelialized margins, and maintained characteristics (Figure 4).</p><p><b>Conclusion:</b> Nutritional intervention with specific nutrient supplementation can aid in the management of complex wounds, serving as a crucial tool in the healing process and contributing to reduced healing time.</p><p></p><p><b>Figure 1.</b> Photo of the wound on the day of the initial assessment on 05/02/2024.</p><p></p><p><b>Figure 2.</b> Photo of the wound after 17 days of supplementation on 06/06/2024.</p><p></p><p><b>Figure 3.</b> Photo of the wound after 28 days of supplementation on 06/17/2024.</p><p></p><p><b>Figure 4.</b> Photo of the wound after 56 days of supplementation on 07/15/2024.</p><p>Ludimila Ribeiro, RD, MSc<sup>1</sup>; Bárbara Gois, RD, PhD<sup>2</sup>; Ana Zanini, RD, MSc<sup>3</sup>; Hellin dos Santos, RD, MSc<sup>3</sup>; Ana Paula Celes, MBA<sup>3</sup>; Flávia Corgosinho, PhD<sup>2</sup>; Joao Mota, PhD<sup>4</sup></p><p><sup>1</sup>School of Nutrition, Federal University of Goiás, Goiania, Goias; <sup>2</sup>School of Nutrition, Federal University of Goiás, Goiânia, Goias; <sup>3</sup>Prodiet Medical Nutrition, Curitiba, Parana; <sup>4</sup>Federal University of Goias, Goiania, Goias</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Postprandial blood glucose is considered an important risk factor for the 
development of macrovascular and microvascular diseases. Despite the use of hypoglycemic agents, patients with diabetes often experience postprandial hyperglycemia due to unbalanced meals. The aim of this study was to compare the effects of a low glycemic index formula for glycemic control as a substitute for a standard breakfast in patients with type 2 diabetes.</p><p><b>Methods:</b> This randomized, placebo-controlled, crossover study included 18 individuals with type 2 diabetes. Participants were instructed to consume, in random order, either a nutritional formula or a typical Brazilian breakfast with the same caloric content for three consecutive weekdays in different weeks. The nutritional formula (200 mL) provided 200 kcal, with 20 g of carbohydrates, 8.8 g of protein, 9.4 g of fat (MUFA: 5.6 g, PUFA: 2.0 g), and 3.0 g of fiber (DiamaxIG – Prodiet Medical Nutrition), serving as a breakfast substitute. Both the nutritional formula and the standard breakfast were provided to the participants. During the two weeks of intervention, participants used continuous glucose monitoring sensors (Libre 2). Weight and height were measured to calculate body mass index (BMI), and medication use was monitored.</p><p><b>Results:</b> The sample consisted of 61% females, with a mean age of 50.28 ± 12.58 years. The mean blood glucose level was 187.13 ± 77.98 mg/dL, and the mean BMI was 29.67 ± 4.86 kg/m². All participants were taking metformin, and two were taking it concomitantly with insulin. There were no changes in medication doses or regimens during the study. The incremental area under the curve was significantly lower with the nutritional formula than with the standard breakfast (2,794.02 ± 572.98 vs. 4,461.55 ± 2,815.73, p = 0.01).</p><p><b>Conclusion:</b> The low glycemic index formula for glycemic control significantly reduced the postprandial glycemic response compared to a standard Brazilian breakfast in patients with type 2 diabetes. 
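The incremental area under the curve (iAUC) reported above is conventionally computed by trapezoidal integration of the glucose rise above the pre-meal baseline. The abstract does not state the exact convention used, so the sketch below assumes one common variant (negative increments truncated at zero), with hypothetical readings:

```python
def incremental_auc(times_min, glucose_mg_dl):
    """Incremental area under the glucose curve: trapezoidal area of
    readings above the baseline (time 0) value. One common convention
    truncates dips below baseline at zero, as done here; other studies
    use net AUC without truncation."""
    baseline = glucose_mg_dl[0]
    inc = [max(g - baseline, 0.0) for g in glucose_mg_dl]
    auc = 0.0
    for i in range(1, len(times_min)):
        # trapezoid between consecutive sensor readings
        auc += (inc[i] + inc[i - 1]) / 2.0 * (times_min[i] - times_min[i - 1])
    return auc
```

For example, readings of 100, 140, and 120 mg/dL at 0, 30, and 60 minutes give an iAUC of 1500 mg/dL·min under this convention.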
These findings suggest that incorporating low glycemic index meals could be an effective strategy for better managing postprandial blood glucose levels in this population, which may help mitigate the risk of developing macro- and microvascular complications.</p><p>Kirk Kerr, PhD<sup>1</sup>; Bjoern Schwander, PhD<sup>2</sup>; Dominique Williams, MD, MPH<sup>1</sup></p><p><sup>1</sup>Abbott Nutrition, Columbus, OH; <sup>2</sup>AHEAD GmbH, Bietigheim-Bissingen, Baden-Wurttemberg</p><p><b>Financial Support:</b> Abbott Nutrition.</p><p><b>Background:</b> According to the World Health Organization, obesity is a leading risk factor for global noncommunicable diseases such as diabetes, heart disease, and cancer. Weight cycling, often defined as intentionally losing and unintentionally regaining weight, is observed in people with obesity and can have adverse health and economic consequences. This study develops the first model of the health economic consequences of weight cycling in individuals with obesity, defined by a body-mass index (BMI) ≥ 30 kg/m².</p><p><b>Methods:</b> A lifetime state-transition model (STM) with monthly cycles was developed to simulate a cohort of individuals with obesity, comparing “weight cyclers” versus “non-cyclers”. The simulated patient cohort was assumed to have an average BMI of 35.5, with 11% of patients having cardiovascular disease and 6% having type 2 diabetes (T2D). Key outcomes were the cost per obesity-associated event avoided, the cost per life-year (LY) gained, and the cost per quality-adjusted life year (QALY) gained, using a US societal perspective. Transition probabilities for obesity-associated diseases were informed by meta-analyses and were based on US disease-related base risks. Risks were adjusted by BMI-related, weight-cycling-related, and T2D-related relative risks (RR). BMI progression, health utilities, direct and indirect costs, and other population characteristics were informed by published US studies. 
Future costs and effects were discounted by 3% per year. Deterministic and probabilistic sensitivity analyses were performed to investigate the robustness of the results.</p><p><b>Results:</b> Simulating a lifetime horizon, non-cyclers avoided 0.090 obesity-associated events, gained 0.602 LYs and 0.518 QALYs, and had reduced total costs of approximately $4,592 ($1,004 direct and $3,588 indirect costs) per person. Sensitivity analysis showed the model was most responsive to changes in patient age and cardiovascular disease morbidity and mortality risks. Non-cycling remained the cost-effective option throughout the sensitivity analyses.</p><p><b>Conclusion:</b> The model shows that weight cycling has a major impact on the health of people with obesity, resulting in increased direct and indirect costs. The approach to chronic weight management programs should focus not only on weight reduction but also on weight maintenance to prevent the enhanced risks of weight cycling.</p><p>Avi Toiv, MD<sup>1</sup>; Arif Sarowar, MSc<sup>2</sup>; Hope O'Brien, BS<sup>2</sup>; Thomas Pietrowsky, MS, RD<sup>1</sup>; Nemie Beltran, RN<sup>1</sup>; Yakir Muszkat, MD<sup>1</sup>; Syed-Mohammad Jafri, MD<sup>1</sup></p><p><sup>1</sup>Henry Ford Hospital, Detroit, MI; <sup>2</sup>Wayne State University School of Medicine, Detroit, MI</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Age is an important factor in the transplant evaluation as age at transplantation is historically thought to influence transplant outcomes in organ transplant recipients. There are limited data on the impact of age on intestinal (IT) and multivisceral (MVT) organ transplantation. 
This study investigates the impact of age on post-transplant outcomes in patients who received an intestinal or multivisceral (including intestine) transplant, comparing those under 40 years old to those aged 40 and above.</p><p><b>Methods:</b> We conducted a retrospective chart review of all patients who underwent IT at an academic transplant center from 2010 to 2023. The primary outcomes were patient survival and graft failure, analyzed with Kaplan-Meier survival analysis.</p><p><b>Results:</b> Among 50 IT recipients, there were 11 IT recipients < 40 years old and 39 IT recipients ≥40 years old (Table 1). The median age at transplant in the <40 group was 37 years (range, 17-39) and in the ≥40 group was 54 years (range, 40-68). In both groups, the majority of transplants were exclusively IT, although both groups also included MVT recipients. Kaplan-Meier survival analysis revealed no significant differences between the two age groups in graft survival or patient mortality. Reoperation within 1 month was significantly associated with decreased patient survival (p = 0.015) and decreased graft survival (p = 0.003), as was moderate to severe rejection within 1 month (p = 0.009), but neither complication differed significantly between the two age groups. Wilcoxon rank-sum test showed no difference between groups with regard to reoperation or moderate to severe rejection at 1 or 3 months or the development of chronic kidney disease.</p><p><b>Conclusion:</b> Age at the time of intestinal transplantation (< 40 vs. ≥40 years old) does not appear to significantly impact major transplant outcomes, such as patient mortality, graft survival, or rejection rates. 
While reoperation and moderate to severe rejection within 1 and 3 months negatively affected overall outcomes, these complications were not more frequent in older or younger patients.</p><p><b>Table 1.</b> Demographic Characteristics of Intestinal Transplant Recipients.</p><p></p><p>BMI, body mass index; TPN, total parenteral nutrition.</p><p><b>International Poster of Distinction</b></p><p>Gabriela de Oliveira Lemos, MD<sup>1</sup>; Natasha Mendonça Machado, PhD<sup>2</sup>; Raquel Torrinhas, PhD<sup>3</sup>; Dan Linetzky Waitzberg, PhD<sup>3</sup></p><p><sup>1</sup>University of Sao Paulo School of Medicine, Brasília, Distrito Federal; <sup>2</sup>University of Sao Paulo School of Medicine, São Paulo; <sup>3</sup>Faculty of Medicine of the University of São Paulo, São Paulo</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> Ganepão 2023.</p><p><b>Publication:</b> Braspen Journal. ISSN 2764-1546 | Online Version.</p><p><b>Financial Support:</b> This study is linked to project no. 2011/09612-3 and was funded by the Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP).</p><p><b>Background:</b> Sphingolipids (SLs) are key molecules in cell signaling and play a central role in lipotoxicity. Moreover, they are major constituents of eukaryotic cell membranes. Data on SL remodeling in the gastrointestinal tract after Roux-en-Y gastric bypass (RYGB) are lacking and may help to elucidate tissue turnover and metabolism. This protocol aims to evaluate the SL profile and remodeling within the plasma and different segments of the gastrointestinal tract (GIT) before and 3 months after RYGB in a population of women with obesity and type 2 diabetes mellitus (T2DM) and to correlate the changes within these tissues. This investigation is part of the SURMetaGIT study, registered at www.clinicalTrials.gov (NCT01251016).</p><p><b>Methods:</b> Twenty-eight women with obesity and T2DM who underwent RYGB were enrolled in this protocol. Those on insulin therapy were excluded. 
We collected plasma (n = 28) and intestinal samples from the gastric pouch (n = 9), duodenum (n = 8), jejunum (n = 9), and ileum (n = 9) for untargeted metabolomics analysis at baseline and 3 months post-surgery. Indian ink (SPOT®) was used to mark the sites for the follow-up biopsies. SLs were identified using high-performance liquid chromatography coupled to mass spectrometry. Data were processed and analyzed using AnalysisBaseFileConverter and MS-DIAL, respectively. The magnitude of SL changes after RYGB was assessed by the fold change (log2 post-surgery mean/pre-surgery mean). The Spearman test was performed for the correlation analysis. A p value < 0.05 was considered significant. Statistics were carried out in the Jamovi software (2.2.5) and MetaboAnalyst 5.0.</p><p><b>Results:</b> Thirty-four SLs were identified, including sphingomyelins (SM), ceramides (Cer), and glycosphingolipids (GlcSL). SMs were the most common SL class both in the plasma (27 SM, 3 Cer, and 4 GlcSL) and in the GIT (16 SM, 13 Cer, and 5 GlcSL). Every GIT tissue presented distinct SL remodeling. The plasma and the jejunum best discriminated SL changes following surgery (Figure 1). The jejunum expressed the most robust changes, followed by the plasma and the duodenum (Figure 2). Figure 3 presents the heatmap of the plasma and the GIT tissues. Correlation analysis showed that the plasmatic SLs, particularly SM(d32:0), GlcCer(d42:1), and SM(d36:1), strongly correlated with jejunum SLs. These lipids showed a strong negative correlation with jejunal sphingomyelins, but a strong positive correlation with jejunal ceramides (Table 1).</p><p><b>Conclusion:</b> RYGB was associated with SL remodeling in the plasma and GIT. SM was the main SL found in the plasma and the GIT. The most robust changes occurred in the jejunum and the plasma, and these 2 samples presented the strongest correlation. 
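The fold-change metric defined in the Methods (log2 of the post-surgery mean over the pre-surgery mean) can be sketched as follows; the abundance values below are hypothetical illustrations, not study data.

```python
import numpy as np

def fold_change(post_abundances, pre_abundances):
    """Fold change as defined in the Methods: log2(post-surgery mean /
    pre-surgery mean). Positive values indicate a lipid is more abundant
    after RYGB; negative values, less abundant. Assumes strictly
    positive abundances."""
    return float(np.log2(np.mean(post_abundances) / np.mean(pre_abundances)))

# Hypothetical abundances for one sphingomyelin species (pre mean 5.0, post mean 10.0):
print(fold_change([10.0, 10.0, 10.0], [4.0, 5.0, 6.0]))  # a doubling of the mean gives 1.0
```

On this log2 scale, each unit corresponds to a doubling or halving of mean abundance, which makes increases and decreases symmetric around zero.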
Considering our findings, the role of SM in metabolic changes after RYGB should be investigated.</p><p><b>Table 1.</b> Correlation Analysis of Sphingolipids from the Plasma with the Sphingolipids from the Gastrointestinal Tract.</p><p></p><p>*p < 0.05; **p < 0.01; ***p < 0.001.</p><p></p><p>The green circles represent samples at baseline and the red circles represent samples 3 months after RYGB.</p><p><b>Figure 1.</b> Principal Component Analysis (PCA) from GIT Tissues and Plasma.</p><p></p><p>Fold change = log2 post-surgery mean/pre-surgery mean.</p><p><b>Figure 2.</b> Fold Change of Sphingolipids from the Plasma and Gastrointestinal Tract.</p><p></p><p>The map under the top right green box represents lipids’ abundance before surgery, and the map under the top left red box represents lipids’ abundance after RYGB.</p><p><b>Figure 3.</b> Heatmap of Sphingolipids from the Plasma and Gastrointestinal Tract.</p><p>Lucas Santander<sup>1</sup>; Gabriela de Oliveira Lemos, MD<sup>2</sup>; Daiane Mancuzo<sup>3</sup>; Natasha Mendonça Machado, PhD<sup>4</sup>; Raquel Torrinhas, PhD<sup>5</sup>; Dan Linetzky Waitzberg, PhD<sup>5</sup></p><p><sup>1</sup>Universidade Santo Amaro (Santo Amaro University), São Bernardo Do Campo, Sao Paulo; <sup>2</sup>University of Sao Paulo School of Medicine, Brasília, Distrito Federal; <sup>3</sup>Universidade São Caetano (São Caetano University), São Caetano do Sul, Sao Paulo; <sup>4</sup>University of Sao Paulo School of Medicine, São Paulo; <sup>5</sup>Faculty of Medicine of the University of São Paulo, São Paulo</p><p><b>Financial Support:</b> Fundação de Amparo a Pesquisa do Estado de São Paulo.</p><p><b>Background:</b> Microalbuminuria (MAL) is an early biomarker of kidney injury, linked to conditions like hypertension, type 2 diabetes (T2DM), and obesity. It is also associated with higher cardiovascular (CV) risk. 
This protocol examines the impact of Roux-en-Y gastric bypass (RYGB) on MAL and CV markers in patients with obesity, T2DM, and MAL.</p><p><b>Methods:</b> Eight women with grade II-III obesity, T2DM, and MAL who underwent RYGB were included. Patients on insulin therapy were not included. MAL was defined as a urinary albumin-to-creatinine ratio > 30 mg/g. MAL, glycemic, and lipid serum biomarkers were measured at baseline and three months post-surgery. Systolic (SBP) and diastolic (DBP) blood pressures and medical treatments for T2DM and hypertension were also assessed. T2DM remission was defined by ADA 2021 criteria. Categorical variables were reported as absolute and relative frequencies, while continuous variables were expressed as median and IQR based on normality tests. Intragroup and intergroup comparisons were conducted using the Wilcoxon and Mann-Whitney tests for numeric data. Fisher's exact test was performed when necessary to compare dichotomous variables. Data were analyzed in the JASP software version 0.18.1.0.</p><p><b>Results:</b> Overall, RYGB was associated with weight loss, improved body composition, and better CV markers (Table 1). Post-surgery, MAL decreased by at least 70% in all patients and resolved in half. All patients with MAL resolution had pre-surgery levels ≤ 100 mg/g. Those without resolution had more severe pre-surgery MAL (33.8 vs. 667.5 mg/g, p = 0.029) and higher SBP (193 vs. 149.5, p = 0.029) and DBP (138 vs. 98, p = 0.025). Blood pressure decreased after surgery but remained higher in patients without MAL resolution: SBP (156.0 vs. 129.2, p = 0.069) and DBP (109.5 vs. 76.5, p < 0.001). MAL resolution was not linked to T2DM remission at 3 months (75% vs. 50%, p = 1.0). One patient had worsened MAL (193.0 vs. 386.9 mg/g) after RYGB. Glomerular filtration rate (GFR) tended to increase post-surgery only in the group with MAL resolution (95.4 vs. 108.2 ml/min/1.73 m², p = 0.089), compared to the group without MAL resolution (79.2 vs. 
73.7 ml/min/1.73 m², p = 0.6).</p><p><b>Conclusion:</b> RYGB effectively reduced markers of renal dysfunction and cardiovascular risk in this cohort. Patients showed a decrease in MAL, with resolution in half of the patients. The small sample size and short follow-up period may have limited the assessment of the overall impact of the surgery on renal function. Future studies with larger cohorts and longer follow-ups are needed to better understand the effects of bariatric surgery on MAL and its relation to other CV markers.</p><p><b>Table 1.</b> Biochemical and Clinical Data Analysis Following RYGB.</p><p></p><p>eGFR: estimated glomerular filtration rate; HbA1c: glycated hemoglobin; HDL-c: high-density lipoprotein cholesterol; HOMA-BETA: beta-cell function by the homeostasis model; HOMA-IR: Homeostasis Model Assessment of Insulin Resistance; LDL-c: low-density lipoprotein cholesterol; Non-HDL-c: non-high-density lipoprotein cholesterol; VLDL-c: very-low-density lipoprotein cholesterol; DBP: diastolic blood pressure; SBP: systolic blood pressure; WC: waist circumference.</p><p>Michelle Nguyen, BSc, MSc<sup>1</sup>; Johane P Allard, MD, FRCPC<sup>2</sup>; Dane Christina Daoud, MD<sup>3</sup>; Maitreyi Raman, MD, MSc<sup>4</sup>; Jennifer Jin, MD, FRCPC<sup>5</sup>; Leah Gramlich, MD<sup>6</sup>; Jessica Weiss, MSc<sup>1</sup>; Johnny H. 
Chen, PhD<sup>7</sup>; Lidia Demchyshyn, PhD<sup>8</sup></p><p><sup>1</sup>Pentavere Research Group Inc., Toronto, ON; <sup>2</sup>Division of Gastroenterology, Department of Medicine, Toronto General Hospital, Toronto, ON; <sup>3</sup>Division of Gastroenterology, Centre Hospitalier de l'Université de Montréal (CHUM), Department of Medicine, University of Montreal, Montreal, QC; <sup>4</sup>Division of Gastroenterology, University of Calgary, Calgary, AB; <sup>5</sup>Department of Medicine, University of Alberta, Division of Gastroenterology, Royal Alexandra Hospital, Edmonton, AB; <sup>6</sup>Division of Gastroenterology, Faculty of Medicine and Dentistry, University of Alberta, Edmonton, AB; <sup>7</sup>Takeda Canada Inc., Vancouver, BC; <sup>8</sup>Takeda Canada Inc., Toronto, ON</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> 46th European Society for Clinical Nutrition and Metabolism (ESPEN) Congress. 7-10 September 2024, Milan, Italy.</p><p><b>Financial Support:</b> Funding for this study was provided by Takeda Canada Inc.</p><p><b>Background:</b> Teduglutide is indicated for the treatment of patients with short bowel syndrome (SBS) who are dependent on parenteral support (PS). This study evaluated longer-term teduglutide effectiveness and safety in Canadian patients diagnosed with SBS dependent on PS using real-world evidence.</p><p><b>Methods:</b> This was an observational, retrospective study, using data from the national Canadian Takeda patient support program, and included adults with SBS. Data were collected 6 months before teduglutide initiation and from initiation to Dec-01-2023, death, or loss of follow-up. Descriptive statistics characterized the population and treatment-emergent adverse events (TEAEs). Changes in parenteral nutrition/intravenous fluid supply (PN/IV) were assessed based on decreases in PN/IV volume from baseline. Statistical significance was set at p < 0.05.</p><p><b>Results:</b> Fifty-two patients (60% women) were included in this study. 
Median age (range) was 54 (22–81) years and 50% had Crohn's disease as their etiology of SBS. At 6 months, the median (range) absolute and percentage reductions from baseline in PN/IV volume were 3,900 mL/week (-6,960 to 26,784; p < 0.001) and 28.1% (-82.9 to 100). At 24 months, the median (range) absolute reduction from baseline was 6,650 mL/week (-4,400 to 26,850; p = 0.003), and the proportion of patients who achieved ≥20% reduction in weekly PN/IV was 66.7%. Over the study, 27% achieved independence from PN/IV. TEAEs were reported in 51 (98%) patients (83% were serious TEAEs) during the study period; the 3 most common were weight changes, diarrhea, and fatigue.</p><p><b>Conclusion:</b> Patients showed significant decreases in PN/IV volumes after initiating teduglutide, with no unexpected safety findings. This study demonstrates real-world, longer-term effectiveness and safety of teduglutide in Canadian patients with SBS, complementing previous clinical trials and real-world studies.</p><p><b>Poster of Distinction</b></p><p>Sarah Carter, RD, LDN, CNSC<sup>1</sup>; Ruth Fisher, RDN, LD, CNSC<sup>2</sup></p><p><sup>1</sup>Coram CVS/Specialty Infusion Services, Tullahoma, TN; <sup>2</sup>Coram CVS/Specialty Infusion Services, Saint Hilaire, MN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Perceived benefit is one factor determining therapy continuation. Little information is published regarding positive outcomes for patients receiving the GLP-2 analog teduglutide apart from rates of success in weaning HPN and hydration volumes by 20%. Patients who are new to therapy may question how long after initiation they should expect to see results. Patients may collaborate with prescribers to improve therapy tolerance if they have hope for improvements in their quality of life. 
This data analysis provides details regarding patients receiving teduglutide and their perceived benefits of therapy.</p><p><b>Methods:</b> Dietitians interview patients receiving teduglutide as part of a service agreement, monitoring persistency and helping to eliminate barriers to therapy. Dietitians make weekly and monthly calls based on patients’ drug start dates and document interventions in flowsheets in patients’ electronic medical records. The interventions are published to data visualization software to monitor compliance with dietitian outreach as part of a quality improvement project. All patients were diagnosed with short bowel syndrome, but baseline characteristics were not exported to this dashboard. This study is a retrospective analysis of the existing dashboard using 3 years of assessments (May 1, 2021-April 30, 2024). Exclusion criteria included therapy dispensed from a pharmacy not using the company's newest computer platform and patients who did not respond to outreach. We analyzed the time on therapy before a positive outcome was reported by the patient to the dietitian and which positive outcomes were most frequently reported.</p><p><b>Results:</b> The data set included 336 patients with 2,509 phone assessments. The most frequently reported first positive outcome was improved ostomy output/less diarrhea (72%, n = 243), occurring just after a month on therapy (mean 31 ± 26.4 days). The mean time to first positive outcome for all patients who reported one was 32 ± 28.5 days (n = 314). Of the 22 patients who reported no positive outcome, 13 did not answer the dietitians’ calls after initial contact. A summary is listed in Table 1. 
Overall positive outcomes reported were improved ostomy output/less diarrhea (87%, n = 292), weight gain (70%, n = 236), improved appetite/interest in food (33%, n = 112), feeling stronger/more energetic (27%, n = 92), improved quality of life (23%, n = 90), improved lab results (13%, n = 45) and fewer antidiarrheal medications (12%, n = 40). Of the 218 patients receiving parenteral support, 44 patients stopped hydration and HPN completely (20%), with another 92 patients reporting less time or fewer days on hydration and HPN (42%), for a total of 136 patients experiencing a positive outcome of parenteral support weaning (62%). Patients reported improvements in other areas of their lives including fewer hospitalizations (n = 39), being able to travel (n = 35), tolerating more enteral nutrition volume (n = 19), returning to work/school (n = 14) and improved sleep (n = 13). A summary is diagrammed in Figure 1.</p><p><b>Conclusion:</b> This retrospective analysis indicates that teduglutide is associated with improved symptom control and improved quality of life measures, with most patients seeing a response to therapy within the first 2 months. A decrease in ostomy output and diarrhea was the most frequent recognizable response to therapy. 
In addition to the goal of weaning parenteral support, clinicians should be cognizant of improvements in patients’ clinical status that can have a significant impact on quality of life.</p><p><b>Table 1.</b> Timing of First Reported Positive Outcome by Patients Receiving Teduglutide.</p><p></p><p></p><p><b>Figure 1.</b> Total Positive Outcomes Reported by Patients (n = 336).</p><p><b>Poster of Distinction</b></p><p>Jennifer Cholewka, RD, CNSC, CDCES, CDN<sup>1</sup>; Jeffrey Mechanick, MD<sup>1</sup></p><p><sup>1</sup>The Mount Sinai Hospital, New York, NY</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Bariatric surgery is a guideline-directed intervention for patients with severe obesity. Adherence to post-operative recommendations is variable, and consequent undernutrition complicated by multiple micronutrient deficiencies is prevalent. A post-bariatric surgery syndrome (PBSS) is not well defined and therefore poorly understood, though early detection and intervention would likely decrease clinical and economic burdens. Our current experience with PBSS is presented here as a case series. We will define PBSS based on clinical evidence related to risk factors, interventions, and clinical/biochemical responses.</p><p><b>Methods:</b> Twenty-four consecutive patients referred to our metabolic support service were identified between January 1, 2019 and December 31, 2023 who were admitted to The Mount Sinai Hospital in New York City with a history of RYGB (Roux-en-Y gastric bypass) or BPDDS (biliopancreatic diversion with duodenal switch) and found to have failure to thrive, undernutrition, clinical/biochemical features of at least one micronutrient deficiency, and indications for parenteral nutrition. Patients were excluded if they had surgical complications or were not treated with parenteral nutrition. 
Fifteen patients were included in this case series and de-identified prior to data collection using the electronic health record (EPIC) and coded data collection sheets. Descriptive statistical methods were used to analyze parametric variables as mean ± standard deviation and non-parametric variables as median (interquartile range).</p><p><b>Results:</b> Results are provided in Table 1.</p><p><b>Conclusion:</b> The PBSS is defined by significant decompensation following a bariatric surgery procedure with a malabsorptive component, characterized by failure to thrive, hypoalbuminemia, multiple micronutrient deficiencies, and need for parenteral nutrition. Major risk factors include inadequate protein and micronutrient intake due to either unawareness (e.g., not recommended) or poor adherence, significant alcohol consumption, or a complicating medical/surgical condition. Parenteral nutrition was safe in this population; formulations prioritized adequate nitrogen, nonprotein calories, and micronutrients. Further analyses on risk factors, responses to therapy, and the role of a multidisciplinary team are in progress.</p><p><b>Table 1.</b> Risks/Presentation.</p><p></p><p><b>Table 2.</b> Responses to Parenteral Nutrition Intervention.</p><p></p><p>Holly Estes-Doetsch, MS, RDN, LD<sup>1</sup>; Aimee Gershberg, RD, CDN, CPT<sup>2</sup>; Megan Smetana, PharmD, BCPS, BCTXP<sup>3</sup>; Lindsay Sobotka, DO<sup>3</sup>; Kristen Roberts, PhD, RDN, LD, CNSC, FASPEN, FAND<sup>4</sup></p><p><sup>1</sup>The Ohio State University, Columbus, OH; <sup>2</sup>NYC Health + Hospitals, New York City, NY; <sup>3</sup>The Ohio State University Wexner Medical Center, Columbus, OH; <sup>4</sup>The Ohio State University, Granville, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Decompensated cirrhosis increases the risk of fat maldigestion via altered bile synthesis and impaired excretion through the bile canaliculi. 
Maldigestion increases the risk of vitamin and mineral deficiencies, which, when left untreated, contribute to consequential health issues such as metabolic bone disease, xerophthalmia, and hyperkeratosis. Comprehensive guidelines for the prevention and treatment of these deficiencies are lacking.</p><p><b>Methods:</b> Medical and surgical history, anthropometrics, medications and nutritional supplements, laboratory data, and medical procedures were extracted and analyzed from the electronic medical record.</p><p><b>Results:</b> A patient with congenital biliary atresia and decompensated cirrhosis was seen in a hepatology outpatient clinic. Biochemical assessment revealed severe vitamin A deficiency and suboptimal vitamin D and zinc status. Physical assessment indicated telogen effluvium and transient blurry vision. Despite a history of high-dose oral retinyl acetate ranging from 10,000-50,000 units daily, a 3-day course of 100,000 units via intramuscular injection, and co-treatment of zinc deficiency to ensure adequate circulating retinol binding protein, normalization of serum retinol was not achieved over a 10-year period. The patient's serum vitamin A level normalized following liver transplantation.</p><p><b>Conclusion:</b> In decompensated cirrhosis, there is a lack of sufficient guidelines for micronutrient dosing when traditional treatment strategies are unsuccessful. Furthermore, altered secretion of transport proteins due to underlying liver dysfunction may pose challenges in evaluating laboratory markers of micronutrient status. Collaboration with pharmacy and medicine supports a thorough assessment and the establishment of a safe treatment and monitoring plan. 
Clinical research is needed to establish acceptable and safe dosing strategies for patients with chronic fat-soluble vitamin deficiencies unresponsive to treatment.</p><p>Gang Wang, PhD<sup>1</sup></p><p><sup>1</sup>Nimble Science, Calgary, AB</p><p><b>Financial Support:</b> This work was supported by the National Research Council of Canada Industrial Research Assistance Program (NRC IRAP), an Alberta Innovate Accelerating Innovations into CarE (AICE) grant, and partially by a Clinical Problems Incubator Grant from the Snyder Institute for Chronic Disease at the University of Calgary.</p><p><b>Background:</b> The small intestine (SI) microbiome plays a crucial role in nutrient absorption, and emerging evidence indicates that fecal contents are insufficient to represent the SI microenvironment. Endoscopic sampling is possible but expensive and not scalable. Ingestible sampling capsule technologies are emerging; however, potential contamination is a major limitation of these devices.</p><p><b>Methods:</b> We previously reported the Small Intestinal MicroBiome Aspiration (SIMBA) capsules as an effective means for sampling, sealing and preserving SI luminal contents for 16S rRNA gene sequencing analysis. A subset of the DNA samples, including SI samples collected by SIMBA capsules (CAP) and the matched saliva (SAL), fecal (FEC) and duodenal endoscopic aspirate (ASP) and brush (BRU) samples, from 16 participants recruited for an observational clinical validation study were sent for shotgun metagenomic sequencing. 
The aims were 1) to compare the sampling performance of the capsule (CAP) with that of endoscopic aspirates (ASP) and with 850 small intestine, large intestine and fecal samples from the Clinical Microbiomics data warehouse (PRJEB28097), and 2) to characterize samples from the 4 different sampling sites in terms of species composition and functional potential.</p><p><b>Results:</b> 4/80 samples (1/16 SAL, 2/16 ASP, 0/16 BRU, 1/16 CAP and 0/16 FEC) failed library preparation, and 76 samples were shotgun sequenced (average of 38.5 M read pairs per sample) (Figure 1). Quality assessment demonstrated that despite the low raw DNA yield of CAP samples, they retained a minimal level of host contamination compared with ASP and BRU samples (mean 5.27% vs. 93.09-96.44% of average total reads per sample) (Figure 2). CAP samples shared the majority of species with ASP samples, as well as a large fraction of the species detected in terminal ileum samples. ASP and CAP sample composition was more similar to duodenum, jejunum and saliva samples and very different from large intestine and stool samples. Functional genomics further revealed GI region-specific differences: in both ASP and CAP samples we detected a number of Gut Metabolic Modules (GMMs) for carbohydrate digestion and short-chain fatty acids. However, probiotic species, and species and genes involved in bile acid metabolism, were mainly prevalent in CAP and FEC samples and could not be detected in ASP samples.</p><p><b>Conclusion:</b> CAP and ASP microbiomes are compositionally similar despite the high level of host contamination in ASP samples. CAP samples appear better suited than ASP samples to reveal GI region-specific functional potential. 
This analysis demonstrates the potential of the SIMBA capsule for characterizing the SI microbiome and supports the prospective use of SIMBA capsules in observational and interventional studies to investigate the impacts of short-term and long-term biotic food interventions on the gut microbiome (Figure 3). A series of studies utilizing the SIMBA capsule is under way, and the detectability of biotic food intervention impacts will be reported in the near future (Table 1).</p><p><b>Table 1.</b> List of Ongoing Observational and Interventional Clinical Studies Using the SIMBA Capsule.</p><p></p><p></p><p><b>Figure 1.</b> Shotgun Metagenomic Sequencing: Taxonomy Overview (Relative Abundance of the 10 Most Abundant Species by Sampling Site).</p><p></p><p><b>Figure 2.</b> Shotgun Metagenomic Sequencing: High-Quality Non-Host Reads by Sampling Site.</p><p></p><p><b>Figure 3.</b> Short-term and Long-term Interventional Study Protocols Using SIMBA Capsules.</p><p>Darius Bazimya, MSc. Nutrition, RN<sup>1</sup>; Francine Mwitende, RN<sup>1</sup>; Theogene Uwizeyimana, Phn<sup>1</sup></p><p><sup>1</sup>University of Global Health Equity, Kigali</p><p><b>Financial Support:</b> University of Global Health Equity.</p><p><b>Background:</b> Rwanda is currently grappling with the double burden of malnutrition and rising obesity, particularly in urban populations. As the country experiences rapid urbanization and dietary shifts, traditional issues of undernutrition coexist with increasing rates of obesity and related metabolic disorders. This study aims to investigate the relationship between these nutrition-related issues and their impact on gastrointestinal (GI) health and metabolic outcomes in urban Rwandan populations.</p><p><b>Methods:</b> A cross-sectional study was conducted in Kigali, Rwanda's capital, involving 1,200 adult participants aged 18 to 65. 
Data were collected on dietary intake, body mass index (BMI), GI symptoms, and metabolic markers such as fasting glucose, cholesterol levels, and liver enzymes. Participants were categorized into three groups: undernourished, normal weight, and overweight/obese based on their BMI. GI symptoms were assessed using a validated questionnaire, and metabolic markers were evaluated through blood tests. Statistical analyses were performed to assess correlations between dietary patterns, BMI categories, and GI/metabolic health outcomes.</p><p><b>Results:</b> The study found that 25% of participants were classified as undernourished, while 22% were obese, reflecting the double burden of malnutrition and rising obesity in Rwanda's urban population. Among obese participants, 40% exhibited elevated fasting glucose levels (p < 0.01), and 30% reported significant GI disturbances, such as irritable bowel syndrome (IBS) and non-alcoholic fatty liver disease (NAFLD). In contrast, undernourished individuals reported fewer GI symptoms but showed a higher prevalence of micronutrient deficiencies, including anaemia (28%) and vitamin A deficiency (15%). Dietary patterns characterized by high-fat and low-fiber intake were significantly associated with increased GI disorders and metabolic dysfunction in both obese and normal-weight participants (p < 0.05).</p><p><b>Conclusion:</b> This study highlights the growing public health challenge posed by the coexistence of undernutrition and obesity in Rwanda's urban centres. The dietary shifts associated with urbanization are contributing to both ends of the nutritional spectrum, adversely affecting GI and metabolic health. Addressing these issues requires comprehensive nutrition interventions that consider the dual challenges of undernutrition and obesity, promoting balanced diets and improving access to health services. 
These findings have important implications for nutrition therapy and metabolic support practices in Rwanda, emphasizing the need for tailored interventions that reflect the country's unique nutritional landscape.</p><p>Levi Teigen, PhD, RD<sup>1</sup>; Nataliia Kuchma, MD<sup>2</sup>; Hijab Zehra, BS<sup>1</sup>; Annie Lin, PhD, RD<sup>3</sup>; Sharon Lopez, BS<sup>2</sup>; Amanda Kabage, MS<sup>2</sup>; Monika Fischer, MD<sup>4</sup>; Alexander Khoruts, MD<sup>2</sup></p><p><sup>1</sup>University of Minnesota, St. Paul, MN; <sup>2</sup>University of Minnesota, Minneapolis, MN; <sup>3</sup>University of Minnesota, Austin, MN; <sup>4</sup>Indiana University School of Medicine, Indianapolis, IN</p><p><b>Financial Support:</b> Achieving Cures Together.</p><p><b>Background:</b> Fecal microbiota transplantation (FMT) is a highly effective treatment for recurrent Clostridioides difficile infection (rCDI). The procedure results in the repair of the gut microbiota following severe antibiotic injury. Recovery from rCDI is associated with a high incidence of post-infection irritable bowel syndrome (IBS). In addition, older age, medical comorbidities, and prolonged diarrheal illness contribute to frailty in this patient population. The effect of FMT on IBS symptoms and frailty in patients with rCDI is largely unknown. In this prospective cohort study, we collected IBS symptom and frailty data over the 3 months following FMT treatment for rCDI in patients at two large academic medical centers.</p><p><b>Methods:</b> Consenting adults who underwent FMT treatment for rCDI were enrolled in this study (n = 113). We excluded patients who developed a recurrence of CDI within the 3-month follow-up period (n = 15) or had incomplete IBS symptom severity scale (IBS-SSS) scores at any timepoint. The IBS-SSS is a 5-item survey measuring symptom intensity, with scores ranging from 0 to 500 and higher scores representing greater severity. 
IBS-SSS was collected at baseline, 1 week post-FMT, 1 month post-FMT, and 3 months post-FMT. Frailty was assessed at baseline and 3 months using the FRAIL scale (categorical variable: “Robust Health”, “Pre-Frail”, “Frail”). The Kruskal-Wallis test was used to compare IBS-SSS across timepoints. Post-hoc analysis was performed with pairwise Wilcoxon rank-sum tests using the false discovery rate (FDR) adjustment method. The Friedman test was used to compare the frailty distribution between the baseline and 3-month timepoints.</p><p><b>Results:</b> The mean age of the cohort was 63.3 (SD 15.4) years, and 75% of patients were female (total n = 58). The IBS-SSS scores across timepoints are presented in Table 1 and Figure 1. The median IBS-SSS score at baseline was 134 [IQR 121], which decreased to a median score of 65 [IQR 174] at 1 week post-FMT. The baseline timepoint was found to differ from the 1-week, 1-month, and 3-month timepoints (p < 0.05). No other differences between timepoints were observed. Frailty was assessed at baseline and 3 months (total n = 52). At baseline, 71% of patients (n = 37) were considered Pre-Frail or Frail; this percentage decreased to 46% (n = 24) at 3 months (Table 2; p < 0.05).</p><p><b>Conclusion:</b> Findings from this multicenter, prospective cohort study indicate an overall improvement in both IBS symptoms and frailty in the 3 months following FMT therapy for rCDI. Notably, IBS symptom scores improved by 1 week post-FMT. Further research is required to understand what predicts IBS symptom improvement following FMT and whether nutrition therapy can help support further improvement. 
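The false discovery rate correction named in the Methods above (applied to the pairwise post-hoc Wilcoxon tests) is the Benjamini-Hochberg procedure; a minimal pure-Python sketch, using made-up p-values rather than the study's actual test results:

```python
# Benjamini-Hochberg false discovery rate adjustment, as used for the
# pairwise post-hoc comparisons described above. The input p-values
# below are illustrative only, not the study's values.

def fdr_adjust(pvals):
    """Return Benjamini-Hochberg adjusted p-values (same order as input)."""
    m = len(pvals)
    # Indices sorted by raw p-value, ascending.
    order = sorted(range(m), key=lambda i: pvals[i])
    adjusted = [0.0] * m
    running_min = 1.0
    # Walk from the largest p-value down, enforcing monotonicity:
    # adjusted p at rank k is min over ranks >= k of p * m / rank.
    for rank in range(m, 0, -1):
        i = order[rank - 1]
        running_min = min(running_min, pvals[i] * m / rank)
        adjusted[i] = running_min
    return adjusted

raw = [0.01, 0.04, 0.03, 0.20]   # hypothetical pairwise-test p-values
adj = fdr_adjust(raw)
print([round(p, 4) for p in adj])  # → [0.04, 0.0533, 0.0533, 0.2]
```

The adjustment controls the expected proportion of false positives among the comparisons declared significant, which is less conservative than a Bonferroni correction across the same set of pairwise tests.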
It will also be important to further understand how nutrition therapy can help support the improvement in frailty status observed following FMT for rCDI.</p><p><b>Table 1.</b> Distribution of IBS-SSS Scores at Baseline and Following FMT.</p><p></p><p><b>Table 2.</b> Frailty Distribution Assessed by FRAIL Scale at Baseline and 3-Months Post-FMT.</p><p></p><p></p><p>Box-plot distributions of IBS-SSS scores across timepoints. The median IBS-SSS score at baseline was 134 and decreased to a median score of 65 at 1 week post-FMT. The baseline timepoint was found to differ from the 1-week, 1-month, and 3-month timepoints (p < 0.05).</p><p><b>Figure 1.</b> Distribution of IBS-SSS Scores by Timepoint.</p><p>Oshin Khan, BS<sup>1</sup>; Subanandhini Subramaniam Parameshwari, MD<sup>2</sup>; Kristen Heitman, PhD, RDN<sup>1</sup>; Kebire Gofar, MD, MPH<sup>2</sup>; Kristin Goheen, BS, RDN<sup>1</sup>; Gabrielle Vanhouwe, BS<sup>1</sup>; Lydia Forsthoefel, BS<sup>1</sup>; Mahima Vijaybhai Vyas<sup>2</sup>; Saranya Arumugam, MBBS<sup>2</sup>; Peter Madril, MS, RDN<sup>1</sup>; Praveen Goday, MBBS<sup>3</sup>; Thangam Venkatesan, MD<sup>2</sup>; Kristen Roberts, PhD, RDN, LD, CNSC, FASPEN, FAND<sup>4</sup></p><p><sup>1</sup>The Ohio State University, Columbus, OH; <sup>2</sup>The Ohio State University Wexner Medical Center, Columbus, OH; <sup>3</sup>Nationwide Children's Hospital, Columbus, OH; <sup>4</sup>The Ohio State University, Granville, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Cyclic vomiting syndrome (CVS) is a disorder of gut-brain interaction characterized by recurrent spells of nausea, vomiting, and abdominal pain lasting hours to days. Estimated prevalence is approximately 2% in adults and children. Due to recalcitrant symptoms, patients might develop disordered eating patterns, though data are lacking. 
Identifying patients at risk for disordered eating patterns and malnutrition can help optimize nutritional status and may improve overall patient outcomes. The purpose of this study is to define nutrient intakes and dietary patterns of those with CVS seeking care at a tertiary care referral clinic, and to establish variation in dietary intakes based on disease severity.</p><p><b>Methods:</b> In this ongoing, cross-sectional study of adults diagnosed with CVS based on Rome IV criteria, participants are asked to complete validated surveys including a food frequency questionnaire (FFQ). Baseline demographics and clinical characteristics including disease severity (defined by the number of episodes per year) were ascertained. Healthy eating index (HEI) scores (scale of 0-100) were calculated to assess diet quality, with higher scores indicating better diet quality. Those with complete data were included in this interim analysis.</p><p><b>Results:</b> Data from 33 participants with an average age of 40 ± 16 years and an average BMI of 28.6 ± 7.9 are presented. The cohort was predominantly female (67%), white (79%), and had moderate to severe disease (76%). The malnutrition screening tool indicated that 42% of participants were at risk of malnutrition independent of BMI status (p = 0.358) and disease severity (p = 0.074). HEI scores were poor amongst those with CVS (55) and did not differ based on disease severity (58 vs 54; p = 0.452). Energy intakes varied widely, ranging from 416 to 3974 kcal/day with a median intake of 1562 kcal/day.</p><p><b>Conclusion:</b> In CVS, dietary intake is poor and there is a high risk of malnutrition regardless of disease severity and BMI. Providers and registered dietitian nutritionists must be aware of the high rates of malnutrition risk and poor dietary intakes in this patient population to improve delivery of dietary interventions. 
Insight into disordered eating and metabolic derangements may improve the understanding of dietary intakes in CVS.</p><p>Hannah Huey, MDN<sup>1</sup>; Holly Estes-Doetsch, MS, RDN, LD<sup>2</sup>; Christopher Taylor, PhD, RDN<sup>2</sup>; Kristen Roberts, PhD, RDN, LD, CNSC, FASPEN, FAND<sup>3</sup></p><p><sup>1</sup>Nationwide Children's Hospital, Columbus, OH; <sup>2</sup>The Ohio State University, Columbus, OH; <sup>3</sup>The Ohio State University, Granville, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Micronutrient deficiencies are common in Crohn's disease (CD) due to poor dietary absorption. Pregnancy increases nutritional demands, and when coupled with a malabsorptive condition like CD, micronutrient status must be closely monitored. However, there is a lack of evidence-based guidelines for managing these complex patients, leaving clinicians to rely on clinical judgement. We present a case of a pregnant female with CD who presented for delivery with an undetected fat-soluble vitamin deficiency. The impact of a vitamin K deficiency on the post-partum management of a patient with CD is presented along with potential assessment and treatment strategies. At 25 weeks gestation, the patient presented with a biochemical iron deficiency anemia, vitamin B12 deficiency, and zinc deficiency for which she was treated with oral supplementation and/or intramuscular injections. No assessment of fat-soluble vitamins during the gestation period was conducted despite a pre-pregnancy history of multiple micronutrient deficiencies. At 33 weeks gestation, the mother was diagnosed with preeclampsia and delivered the fetus at 35 weeks. After birth, the infant presented to the NICU with a mediastinal mass, abnormal liver function tests and initial coagulopathy. The mother experienced a uterine hemorrhage post-cesarean section. 
At this time her INR was 14.8 with a severe prolongation of the PT and PTT and suboptimal levels of blood clotting factors II, VII, IX, and X. The patient was diagnosed with a vitamin K deficiency and was treated initially with 10 mg daily by mouth x 3 days resulting in an elevated serum vitamin K while PT and INR were trending towards normal limits. At discharge she was recommended to take 1 mg daily by mouth of vitamin K to prevent further deficiency. PT and INR were the biochemical assays that were reassessed every 3 months since serum vitamin K is more reflective of recent intake. CD represents a complex disorder and the impact of pregnancy on micronutrient status is unknown. During pregnancy, patients with CD may require additional micronutrient monitoring particularly in the case of historical micronutrient deficiencies or other risk factors. This case presents the need for further research into CD-specific micronutrient deficiencies and the creation of specific supplementation guidelines and treatment algorithms for detection of micronutrient deficiencies in at-risk patients.</p><p><b>Methods:</b> None Reported.</p><p><b>Results:</b> None Reported.</p><p><b>Conclusion:</b> None Reported.</p><p>Gretchen Murray, BS, RDN<sup>1</sup>; Kristen Roberts, PhD, RDN, LD, CNSC, FASPEN, FAND<sup>2</sup>; Phil Hart, MD<sup>1</sup>; Mitchell Ramsey, MD<sup>1</sup></p><p><sup>1</sup>The Ohio State University Wexner Medical Center, Columbus, OH; <sup>2</sup>The Ohio State University, Granville, OH</p><p><b>Financial Support:</b> UL1TR002733.</p><p><b>Background:</b> Enteric hyperoxaluria (EH) and resultant lithiasis is well documented in many malabsorptive conditions including inflammatory bowel disease, celiac disease, short bowel syndrome, and post gastric bypass. 
Chronic pancreatitis (CP) often leads to exocrine pancreatic insufficiency (EPI) and subsequent fat malabsorption, increasing the risk of EH as calcium binds to dietary fat, leaving oxalates available for colonic absorption. Modulating oxalate intake by reducing whole grains, greens, baked beans, berries, nuts, beer and chocolate while simultaneously improving hydration is the accepted medical nutrition therapy (MNT) for EH and calcium-oxalate stones. Although sources of dietary oxalate are well-known, there is limited literature regarding dietary oxalates in the CP diet, leaving a paucity of data to guide MNT for EH and calcium-oxalate lithiasis for these patients.</p><p><b>Methods:</b> A cross-sectional, case-control study was performed comparing subjects with CP to healthy controls. The Vioscreen™ food frequency questionnaire was used to assess and quantify total oxalic acid intake in the CP cohort and to describe dietary sources. Descriptive statistics were used to describe dietary intake of oxalic acid and contributing food sources.</p><p><b>Results:</b> A total of 52 subjects with CP were included and had a mean age of 50 ± 15 years. Most subjects were male (n = 35; 67%). Mean BMI was 24 ± 6 kg/m<sup>2</sup> and 8 subjects (15%) were classified as underweight by BMI. Median daily caloric intake was 1549 kcal with a median daily oxalic acid intake of 104 mg (range 11-1428 mg). The top three contributors to dietary oxalate intake were raw and cooked greens such as spinach or lettuce, followed by mixed foods such as pizza, spaghetti, and tacos, and then tea. Other significant contributors (>100 mg) to dietary oxalate intake included sports or meal replacement bars, cookies and cakes, potato products (mashed, baked, chips, fried), and refined grains (breads, tortillas, bagels).</p><p><b>Conclusion:</b> In the CP population, the highest contributors to oxalate intake include greens, mixed foods, tea, meal replacement bars, some desserts, potatoes, and refined grains. 
Many of the identified dietary oxalate sources are not considered for exclusion in a typical oxalate-restricted diet. A personalized approach to dietary oxalate modulation is necessary to drive MNT for EH prevention in those with CP.</p><p>Qian Ren, PhD<sup>1</sup>; Peizhan Chen<sup>2</sup></p><p><sup>1</sup>Department of Clinical Nutrition, Shanghai Sixth People's Hospital Affiliated to Shanghai Jiao Tong University School of Medicine, Shanghai; <sup>2</sup>Clinical Research Center, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai</p><p><b>Financial Support:</b> This study was supported by the Youth Cultivation Program of Shanghai Sixth People's Hospital (Grant No. ynqn202223), the Key Laboratory of Trace Element Nutrition, National Health Commission of the People's Republic of China (Grant No. wlkfz202308), and the Danone Institute China Diet Nutrition Research and Communication (Grant No. DIC2023-06).</p><p><b>Background:</b> Low serum vitamin D status has been reported to be associated with reduced muscle mass; however, it is inconclusive whether this relationship is causal. This study used data from the National Health and Nutrition Examination Survey (NHANES) and two-sample Mendelian randomization (MR) analyses to ascertain the causal relationship between serum 25-hydroxyvitamin D [25(OH)D] and appendicular muscle mass (AMM).</p><p><b>Methods:</b> In the NHANES 2011–2018 dataset, 11,242 participants aged 18–59 years were included, and multivariable linear regression was performed to assess the relationship between 25(OH)D and AMM measured by dual-energy X-ray absorptiometry (Figure 1). 
In two-sample MR analysis, 167 single nucleotide polymorphisms significantly associated with serum 25(OH)D were applied as instrumental variables (IVs) to assess vitamin D effects on AMM in the UK Biobank (417,580 Europeans) using univariable and multivariable MR models (Figure 2).</p><p><b>Results:</b> In the NHANES 2011–2018 dataset, serum 25(OH)D concentrations were positively associated with AMM (β = 0.013, SE = 0.001, p < 0.001) in all participants, after adjustment for age, race, season of blood collection, education, income, body mass index and physical activity. In analyses stratified by sex, males (β = 0.024, SE = 0.002, p < 0.001) showed more pronounced positive associations than females (β = 0.003, SE = 0.002, p = 0.024). In univariable MR, genetically higher serum 25(OH)D levels were positively associated with AMM in all participants (β = 0.049, SE = 0.024, p = 0.039) and males (β = 0.057, SE = 0.025, p = 0.021), but the association was only marginally significant in females (β = 0.043, SE = 0.025, p = 0.090) based on IVW models. No significant pleiotropy effects were detected for the IVs in the two-sample MR investigations. In MVMR analysis, a positive causal effect of 25(OH)D on AMM was observed in the total population (β = 0.116, SE = 0.051, p = 0.022), males (β = 0.111, SE = 0.053, p = 0.036) and females (β = 0.124, SE = 0.054, p = 0.021).</p><p><b>Conclusion:</b> Our results suggest a positive causal effect of serum 25(OH)D concentration on AMM; however, more research is needed to understand the underlying biological mechanisms.</p><p></p><p><b>Figure 1.</b> Working Flowchart of Participant Selection in the Cross-Sectional Study.</p><p></p><p><b>Figure 2.</b> The study assumptions of the two-sample Mendelian Randomization analysis between serum 25(OH)D and appendicular muscle mass. 
The assumptions include: (1) the genetic instrumental variables (IVs) should exhibit a significant association with serum 25(OH)D; (2) the genetic IVs should not associate with any other potential confounding factors; and (3) the genetic IVs must influence appendicular muscle mass only through serum 25(OH)D and not through any other pathway. The dotted lines indicate violations of the assumptions.</p><p>Qian Ren, PhD<sup>1</sup>; Junxian Wu<sup>1</sup></p><p><sup>1</sup>Department of Clinical Nutrition, Shanghai Sixth People's Hospital Affiliated to Shanghai Jiao Tong University School of Medicine, Shanghai</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> A healthy diet is essential for both preventing and treating type 2 diabetes mellitus (T2DM), which poses a substantial public health burden. Whole grains, which are rich in dietary fibre, can serve as a good source of carbohydrates. However, the correlation and causality between whole grain intake and the risk of T2DM and glucose metabolism remain to be clarified.</p><p><b>Methods:</b> First, the National Health and Nutrition Examination Survey database 2003-2018 was used to investigate the correlation between dietary whole grain/fibre intake and the risk of T2DM and glucose metabolism. Then, based on the largest publicly published genome-wide association analysis of whole grain intake in the Neale lab database, single nucleotide polymorphisms (SNPs) significantly associated with whole grain intake were selected as instrumental variables (p < 5×10<sup>-8</sup>, linkage disequilibrium r<sup>2</sup> < 0.1). Inverse variance weighted analysis (IVW), the weighted median method and other methods were used to analyze the causal relationship between whole grain intake and T2DM. 
Heterogeneity tests, gene pleiotropy tests and sensitivity analyses were performed to evaluate the stability and reliability of the results.</p><p><b>Results:</b> The results showed that dietary intakes of whole grains (OR = 0.999, 95%CI: 0.999 ~ 1.000, p = 0.004)/fibre (OR = 0.996, 95% CI: 0.993 ~ 0.999, p = 0.014) were negatively associated with the risk of T2DM. In the population with normal glucose metabolism, dietary fibre intake was negatively associated with FPG (β = -0.003, SE = 0.001, p < 0.001), FINS (β = -0.023, SE = 0.011, p = 0.044), and HOMA-IR (β = -0.007, SE = 0.003, p = 0.023). In the population with abnormal glucose metabolism, dietary intakes of whole grains (β = -0.001, SE = 0.001, p = 0.036) and fibre (β = -0.006, SE = 0.002, p = 0.005) were negatively associated with HbA1c. In further MR analyses, the IVW method demonstrated that for every one standard deviation increase in whole grain intake, T2DM risk decreased by 1.9% (OR = 0.981, 95%CI: 0.970 ~ 0.993, β = -0.019, p = 0.002), with consistent findings for IVW-multiplicative random effects, IVW-fixed effects and IVW-radial. Meanwhile, MR-Egger regression analysis (intercept = -2.7 × 10<sup>-5</sup>, p = 0.954) indicated that gene pleiotropy did not influence the results of the MR analysis. Leave-one-out analysis showed that no individual SNP strongly influenced the results (p<sub>heterogeneity</sub> = 0.445).</p><p><b>Conclusion:</b> Dietary intakes of whole grains may reduce the risk of T2DM and improve glucose metabolic homeostasis and insulin resistance. 
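The fixed-effects inverse variance weighted estimator used in the MR analyses above combines per-SNP Wald ratios (SNP-outcome effect divided by SNP-exposure effect), weighted by the precision of the SNP-outcome association. A minimal sketch with synthetic summary statistics (not the study's GWAS data):

```python
import math

# Fixed-effects IVW Mendelian randomization estimate from GWAS summary
# statistics. All numbers below are synthetic, for illustration only.

def ivw_estimate(beta_exposure, beta_outcome, se_outcome):
    """Return (causal estimate, standard error) via fixed-effects IVW."""
    # Per-SNP weight: beta_x^2 / se_y^2 (precision of the Wald ratio).
    denominator = sum(bx**2 / se**2
                      for bx, se in zip(beta_exposure, se_outcome))
    # Weighted sum equivalent to averaging Wald ratios beta_y / beta_x.
    numerator = sum(bx * by / se**2
                    for bx, by, se in zip(beta_exposure, beta_outcome, se_outcome))
    return numerator / denominator, math.sqrt(1.0 / denominator)

# Three hypothetical SNPs whose outcome effects are exactly twice their
# exposure effects, so the recovered causal estimate should be 2.0.
bx = [0.10, 0.20, 0.30]
by = [0.20, 0.40, 0.60]
se = [1.0, 1.0, 1.0]
est, se_est = ivw_estimate(bx, by, se)
print(round(est, 3))  # → 2.0
```

This is algebraically the slope of a weighted regression of SNP-outcome effects on SNP-exposure effects through the origin; the MR-Egger and weighted median estimators mentioned in the Methods relax its assumptions in different ways.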
The causal relationship between whole grain intake and T2DM, as well as the optimal daily intake of whole grains, still needs to be further explored through large randomised controlled intervention studies and prospective cohort studies.</p><p>Hikono Sakata, Registered Dietitian<sup>1</sup>; Misa Funaki, Registered Dietitian<sup>2</sup>; Kanae Masuda, Registered Dietitian<sup>2</sup>; Rio Kurihara, Registered Dietitian<sup>2</sup>; Tomomi Komura, Registered Dietitian<sup>2</sup>; Masaru Yoshida, Doctor<sup>2</sup></p><p><sup>1</sup>University of Hyogo, Ashiya-shi, Hyogo; <sup>2</sup>University of Hyogo, Himezi-shi, Hyogo</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> In recent years, lifestyle-related diseases such as obesity, diabetes, and dyslipidemia have been recognized as problems, one cause of which is an excessively fatty diet. Obesity and related diseases are known risk factors for severe infectious diseases such as sepsis and novel coronavirus infection, but the underlying pathomechanisms have not been clarified. We therefore hypothesized that a high-fat diet might induce functional changes not only in adipocytes but also in macrophages, weakening the immune response and leading to aggravation of infectious diseases. In this study, we performed proteome analysis and RNA sequencing analysis to examine what kind of gene and protein expression is induced in macrophages by high-fat diet loading.</p><p><b>Methods:</b> Four-week-old mice were divided into a normal diet (ND) group and a high-fat diet (HFD) group and kept for four weeks. Peritoneal macrophages were collected from mice that had been intraperitoneally injected with 2 mL of thioglycolate medium one week before dissection to promote macrophage proliferation, and were cultured at 37°C with 5% CO<sub>2</sub> in Roswell Park Memorial Institute (RPMI) medium. 
After 2 hours of culture, floating cells were removed, and proteome analysis was performed using the recovered macrophages. In addition, RNA sequencing analysis was performed on RNA extracted from the macrophages.</p><p><b>Results:</b> Proteome analysis identified more than 4000 proteins in each group. In the HFD group, compared to the ND group, decreased expression of proteins involved in phagocytosis, such as immunoglobulin and eosinophil peroxidase, was observed. RNA sequencing data analysis likewise showed decreased expression of phagocytosis-related genes, mirroring the decreases observed in the proteome analysis.</p><p><b>Conclusion:</b> These findings suggest that the phagocytic ability of macrophages is reduced by high-fat diet loading. This research is expected to clarify the molecular mechanisms by which high-fat dietary loading alters gene and protein expression and induces immunosuppressive effects.</p><p>Benjamin Davies, BS<sup>1</sup>; Chloe Amsterdam, BA<sup>1</sup>; Basya Pearlmutter, BS<sup>1</sup>; Jackiethia Butsch, C-CHW<sup>2</sup>; Aldenise Ewing, PhD, MPH, CPH<sup>3</sup>; Erin Holley, MS, RDN, LD<sup>2</sup>; Subhankar Chakraborty, MD, PHD<sup>4</sup></p><p><sup>1</sup>The Ohio State University College of Medicine, Columbus, OH; <sup>2</sup>The Ohio State University Wexner Medical Center, Columbus, OH; <sup>3</sup>The Ohio State University College of Public Health, Columbus, OH; <sup>4</sup>The Ohio State University Wexner Medical Center, Dublin, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Food insecurity (FI) refers to a lack of consistent access to sufficient food for an active, healthy life. This issue, influenced by economic, social, and environmental factors, disproportionately affects disadvantaged populations. According to the United States Department of Agriculture, more than 10% of U.S. 
households experience FI, leading to physical and mental health consequences. FI has been linked to poorer dietary quality, increased consumption of processed, calorie-dense foods, and a higher prevalence of obesity, diabetes, and cardiovascular diseases. Chronic stress from FI can also trigger or exacerbate mental health disorders, including depression and anxiety, further complicating health outcomes. Affecting ~20% of the global population, dyspepsia significantly diminishes quality of life and increases healthcare costs. Its etiology is multifactorial, involving abnormal gastric motility, visceral hypersensitivity, inflammation, and psychosocial stressors. Although research on the link between FI and disorders of gut-brain interaction (DGBIs) like dyspepsia is limited, emerging evidence suggests a bidirectional relationship. Therefore, this study was designed to examine the association between FI, dyspepsia, and other health-related social needs (HRSN). Our hypotheses include 1) patients with FI have more severe dyspepsia symptoms, and 2) FI is associated with HRSN in other domains.</p><p><b>Methods:</b> Patients presenting to a specialty motility clinic were prospectively enrolled into a registry created with the goal of holistically investigating the pathophysiology of DGBIs. Validated questionnaires for HRSN and dyspepsia were completed prior to their clinic visits. Data were managed with REDCap and statistical analyses were performed using SPSS.</p><p><b>Results:</b> 53 patients completed the questionnaires. 88.7% of patients were White and 73.6% were female with an average age of 45.6 years (21-72) and BMI of 28.7 kg/m<sup>2</sup> (17.8-51.1). FI was present in 13 (24.5%) patients. The overall severity of dyspepsia symptoms was significantly less in the food secure patients (13.8 vs. 18.8, p = 0.042). 
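Odds ratios and 95% confidence intervals of the kind reported in this analysis are conventionally derived from a logistic-model coefficient and its standard error; a generic sketch with an illustrative coefficient (not the study's fitted model):

```python
import math

# Convert a logistic regression coefficient and its standard error into
# an odds ratio with a 95% Wald confidence interval. The coefficient
# below is hypothetical, chosen only to illustrate the transformation.

def odds_ratio_ci(beta, se, z=1.96):
    """Return (OR, lower 95% bound, upper 95% bound) for coefficient beta."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical coefficient per unit of a predictor: beta = 0.0953, SE = 0.05.
or_, lo, hi = odds_ratio_ci(0.0953, 0.05)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # → 1.1 1.0 1.21
```

Because the interval is computed on the log-odds scale and then exponentiated, it is asymmetric around the odds ratio, which is why reported intervals such as 1.3-76.9 around an OR of 10.0 are plausible.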
Of the four subscales of dyspepsia (nausea-vomiting, postprandial fullness, bloating, and loss of appetite), only loss of appetite was significantly greater in those with FI (2.3 vs. 1.3, p = 0.017). Patients with FI were more likely to be at medium (61.5% vs. 5.0%) or high risk (30.8% vs. 2.5%, p < 0.001) of financial hardship, to experience unmet transportation needs (38.5% vs. 5.0%, p = 0.019) and housing instability (30.8% vs. 5.0%, p = 0.023) compared to those who were food secure. They were also at higher risk of depression (54% vs. 12.5%, p = 0.005) and of reporting insufficient physical activity (92.3% vs. 55.0%, p = 0.05). After adjusting for age, gender, race, and BMI, FI was not a predictor of global dyspepsia severity. Greater BMI (O.R. 0.89, 95% C.I. 0.81-0.98) was associated with the severity of early satiety. Female gender (O.R. 10.0, 95% C.I. 1.3-76.9, p = 0.03) was associated with the severity of nausea. Greater BMI (O.R. 1.10, 95% C.I. 1.001-1.22, p = 0.048) and female gender (O.R. 10.8, 95% C.I. 1.6-72.9, p = 0.015) both correlated with the severity of postprandial fullness.</p><p><b>Conclusion:</b> FI affected ~25% of patients seen in our clinic. It was, however, not an independent predictor of overall severity of dyspepsia symptoms. Patients who experienced FI had a higher prevalence of other HRSN, and higher risk of depression and physical inactivity. Our results highlight the importance of considering FI and other HRSN in the management of dyspepsia. Understanding this interaction is essential for improving clinical outcomes and guiding public health interventions.</p><p>Ashlesha Bagwe, MD<sup>1</sup>; Chandrashekhara Manithody, PhD<sup>1</sup>; Kento Kurashima, MD, PhD<sup>1</sup>; Shaurya Mehta, BS<sup>1</sup>; Marzena Swiderska-Syn<sup>1</sup>; Arun Verma, MD<sup>1</sup>; Austin Sims<sup>1</sup>; Uthayashanker Ezekiel<sup>1</sup>; Ajay Jain, MD, DNB, MHA<sup>1</sup></p><p><sup>1</sup>Saint Louis University, St. 
Louis, MO</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Farnesoid X receptor (FXR), a gut nuclear receptor, regulates intestine-driven bile acid homeostasis in short bowel syndrome. Chenodeoxycholic acid (CDCA), a primary bile acid, acts as an FXR ligand. Stem cell-derived intestinal enteroids offer a valuable system to study intestinal FXR function. We hypothesized that transfection of porcine enteroids with small interfering RNA (siRNA) would modulate FXR to further define its mechanistic role.</p><p><b>Methods:</b> We developed a porcine protocol for Matrigel-based 3D culture systems to generate enteroids from the small bowel of neonatal Yorkshire pigs. After 7 days, cultures were passaged and expanded. RNA strands for Dicer-substrate siRNAs (DsiRNAs) were synthesized as single-stranded RNAs (ssRNAs) by Integrated DNA Technologies and resuspended in an RNase-free buffer. The DsiRNAs targeted the Sus scrofa farnesoid X receptor (FXR) gene (KF597010.1). Three sets of DsiRNA were made for gene-specific silencing of the FXR gene. Porcine enteroids were cultured and transfected with FXR-specific siRNA and control siRNA using Lipofectamine RNAiMAX reagent. They were also treated with escalating CDCA concentrations (25 μM to 50 μM) for 24 and 48 hrs. FXR mRNA levels were quantified by real-time PCR, and functional assays were performed to assess changes in bile acid uptake and efflux following transfection.</p><p><b>Results:</b> Data from 3 separate experiments on intestinal crypts consistently showed enhanced FXR expression with CDCA versus control (p < 0.01). CDCA treatment resulted in a dose-dependent increase in FXR mRNA expression, reaching peak levels after 48 hours of exposure (2.8X increase) with enhanced nuclear localization. Functionally, CDCA-treated enteroids displayed increased bile acid uptake and reduced efflux, validating FXR's role in mediating bile acid-driven enterohepatic circulation. Several runs with siRNA were conducted. 
Using 30 pmol siRNA (sense: 5' GAUCAACAGAAUCUUCUUCAUUATA 3' and antisense: 5' UAUAAUGAAGAAGAUUCUGUUGAUCUG 3'), there was a 68% reduction in FXR expression versus scrambled control. FXR silencing led to decreased bile acid uptake and increased efflux. No significant effects were observed in enteroids transfected with control siRNA. Paradoxically, CDCA-treated cultures showed a higher proportion of immature relative to mature enteroids.</p><p><b>Conclusion:</b> In porcine enteroids, CDCA treatment increased FXR gene expression and promoted its nuclear localization. This finding implies the existence of a positive feedback cycle in which CDCA, an FXR ligand, induces further synthesis and uptake. siRNA transfection significantly decreased FXR activity. By employing this innovative methodology, one can effectively examine the function of FXR in ligand-treated or control systems.</p><p><b>Pediatric, Neonatal, Pregnancy, and Lactation</b></p><p>Kento Kurashima, MD, PhD<sup>1</sup>; Si-Min Park, MD<sup>1</sup>; Arun Verma, MD<sup>1</sup>; Marzena Swiderska-Syn<sup>1</sup>; Shaurya Mehta, BS<sup>1</sup>; Austin Sims<sup>1</sup>; Mustafa Nazzal, MD<sup>1</sup>; John Long, DVM<sup>1</sup>; Chandrashekhara Manithody, PhD<sup>1</sup>; Shin Miyata, MD<sup>1</sup>; Ajay Jain, MD, DNB, MHA<sup>1</sup></p><p><sup>1</sup>Saint Louis University, St. Louis, MO</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> North American Society for Pediatric Gastroenterology, Hepatology and Nutrition (NASPGHAN) Florida.</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Biliary atresia (BA) is a major cause of obstructive neonatal cholestatic disease. Although hepato-portoenterostomy (HPE) is routinely performed for BA patients, more than half eventually require liver transplantation. The intricate mechanisms of bile ductular injury driving its pathogenesis remain elusive and are not recapitulated in current small-animal and rodent models that rely on bile duct ligation. 
Addressing prevailing lacunae, we hypothesized that extra- and intrahepatic bile duct destruction through an endothelial irritant would recapitulate the human condition. We have thus developed a novel neonatal piglet BA model called ‘BATTED’. Piglets have liver and gastro-intestinal homology to human infants and share anatomic and physiological processes, providing a robust platform for BATTED and HPE.</p><p><b>Methods:</b> Six 7-10-day-old piglets were randomized to BATTED (US provisional Patent US63/603,995) or sham surgery. BATTED included cholecystectomy, common bile duct and hepatic duct injection of 95% ethanol, and a retainer suture for continued ethanol-induced intrahepatic injury. Vascular access ports were placed and scheduled ultrasound-guided liver biopsies were performed. Six weeks post-BATTED, piglets underwent HPE. Eight weeks after the initial surgery, animals were euthanized. Serology, histology, gene expression and immunohistochemistry were performed.</p><p><b>Results:</b> Serological evaluation revealed a surge in conjugated bilirubin 6 weeks after the BATTED procedure from baseline (mean Δ 0.39 mg/dL to 3.88 mg/dL). Gamma-glutamyl transferase (GGT) also exhibited a severalfold increase (mean Δ 16.3 IU to 89.5 IU). Sham did not display these elevations (conjugated bilirubin: Δ 0.39 mg/dL to 0.98 mg/dL; GGT: Δ 9.2 IU to 10.4 IU). Sirius red staining demonstrated significant periportal and diffuse liver fibrosis (16-fold increase), and the bile duct proliferation marker CK-7 increased 9-fold with BATTED. Piglets in the BATTED group demonstrated enhanced CD-3 (7-fold), alpha-SMA (8.85-fold), COL1A1 (11.7-fold) and CYP7A1 (7-fold) vs sham. 
Successful HPE was accomplished in piglets, with improved nutritional status and a reduction in conjugated bilirubin (Δ 4.89 mg/dL to 2.11 mg/dL).</p><p><b>Conclusion:</b> BATTED replicated BA features, including hyperbilirubinemia, GGT elevation, significant hepatic fibrosis, bile duct proliferation, and inflammatory infiltration with subsequent successful HPE. This model offers substantial opportunities to elucidate the mechanisms underlying BA and adaptation post-HPE, paving the way for the development of diagnostics and therapeutics.</p><p>Sirine Belaid, MBBS, MPH<sup>1</sup>; Vikram Raghu, MD, MS<sup>1</sup></p><p><sup>1</sup>UPMC, Pittsburgh, PA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> At our institution, pediatric residents are responsible for managing patients with intestinal failure (IF) in the inpatient units. However, they have reported feelings of inexperience and anxiety when dealing with these complex cases. This project aimed to identify knowledge gaps and evaluate the confidence levels of pediatric residents in managing IF patients.</p><p><b>Methods:</b> We conducted an online needs assessment survey using Qualtrics, which included Likert-scale, multiple-choice, open-ended and rating questions to assess residents' confidence levels (from 1 to 10) in performing tasks related to IF patient care. This voluntary survey, approved by the IRB as exempt, was distributed to all pediatric residents at the University of Pittsburgh via QR codes and emails.</p><p><b>Results:</b> The survey response rate was 32%, with nearly 50% of respondents having completed a rotation on the Intestinal Rehabilitation (IR) service. 
Residents reported the lowest confidence in calculating Total Parenteral Nutrition (TPN)-like Intravenous (IV) fluids (solutions administered centrally or peripherally, containing dextrose and major electrolytes designed to match the patients’ home TPN contents), identifying signs of D-lactic acidosis and small intestinal bacterial overgrowth (SIBO), and managing SIBO (average confidence rating of 2/10). They also expressed low confidence in ordering home TPN and TPN-like IV Fluids, understanding related anatomy, ensuring proper stoma care, managing central access loss, and addressing poor catheter blood flow (average confidence ratings of 3-4/10). Conversely, residents felt more confident managing feeding intolerance and central line-associated infections (average confidence ratings of 5-6/10). Additionally, they rated their ability to identify signs of septic and hypovolemic shock as 8/10, though they felt less confident in managing these conditions (7/10). Furthermore, 64% of respondents agreed that managing IF patients is educationally valuable and preferred laminated cards or simulated sessions as educational resources (average ratings of 8 and 6.8 out of 10, respectively).</p><p><b>Conclusion:</b> The survey highlights several areas where pediatric residents need further education. 
Addressing these knowledge gaps through targeted curricular interventions can better equip residents to manage IF patients and potentially increase their interest in this specialty.</p><p></p><p>CLABSI = Central Line-Associated Bloodstream Infection, TPN = Total Parenteral Nutrition, EHR = Electronic Health Record, IV = Intravenous, SIBO = Small Intestinal Bacterial Overgrowth.</p><p><b>Figure 1.</b> Tasks related to managing patients with intestinal failure (IF), stratified into three categories based on pediatric residents' average confidence rating score (>= 7/10, 5-6/10, <= 4/10).</p><p></p><p><b>Figure 2.</b> Distribution of pediatric residents’ opinions on the educational value of managing patients with intestinal failure.</p><p>Alyssa Ramuscak, MHSc, MSc<sup>1</sup>; Inez Martincevic, MSc<sup>1</sup>; Hebah Assiri, MD<sup>1</sup>; Estefania Carrion, MD<sup>2</sup>; Jessie Hulst, MD, PhD<sup>1</sup></p><p><sup>1</sup>The Hospital for Sick Children, Toronto, ON; <sup>2</sup>Hospital Metropolitano de Quito, Quito, Pichincha</p><p><b>Financial Support:</b> Nestle Health Science Canada, North York, Ontario, Canada.</p><p><b>Background:</b> Enteral nutrition provides fluids and nutrients to individuals unable to meet needs orally. Recent interest in real food-based formulas highlights a shift towards providing nutrition with familiar fruit, vegetable and protein ingredients. This study aimed to evaluate the tolerability and nutritional adequacy of a hypercaloric, plant-based, real food ingredient formula in pediatric tube-fed patients.</p><p><b>Methods:</b> This prospective, single-arm, open-label study evaluated the tolerability and nutritional adequacy of a hypercaloric, plant-based formula, Compleat® Junior 1.5, in medically complex, stable, pediatric tube-fed patients aged 1-13 years. Participants were recruited from outpatient clinics at The Hospital for Sick Children, Toronto from May 2023 to June 2024. 
Demographic and anthropometric measurements (weight, height) were obtained at baseline. The daily dose of study product was isocaloric with routine feeds. Total fluid needs were met by adding water. Participants transitioned to the study formula over 3 days, followed by 14 days of exclusive use of study product. Caregivers monitored volume of daily intake, feed tolerance, and bowel movements during the transition and study period using an electronic database (Medrio®). An end-of-study visit was conducted to collect weight measurements. Descriptive statistics summarize demographic and clinical characteristics (Table 1). The paired t-test compared weight-for-age and BMI-for-age z-scores between baseline and end of study. Symptoms of intolerance and bowel movements, assessed using either the Bristol Stool Scale or the Brussels Infant and Toddler Stool Scale, were described as frequency of events and compared between baseline and the intervention period. Percent calorie and protein goals during the study period were calculated as calories received divided by calories prescribed, and as protein received relative to the dietary reference intake for age and weight.</p><p><b>Results:</b> In total, 27 ambulatory pediatric participants with a median age of 5.5 years (IQR, 2.5-7) were recruited for the study, with 26 completing (Table 1). Participant weight-for-age and BMI-for-age z-scores significantly improved between baseline and end of study, from -1.75 ± 1.93 to -1.67 ± 1.88 (p < 0.05), and from -0.47 ± 1.46 to 0.15 ± 0.23 (p < 0.05), respectively. There was no significant difference in the frequency of any GI symptom, including vomiting, gagging/retching, tube venting or perceived pain/discomfort with feeds, between baseline and end of study. There was no significant difference in frequency or type of stool between baseline and end of study. Study participants met 100% of their prescribed energy for the majority (13 ± 1.7 days) of the study period. 
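The paired baseline-versus-end-of-study comparison named in the methods, and the percent-of-goal arithmetic, can be sketched as follows. All values below are illustrative, not the study's data:

```python
# Illustrative sketch of the paired analysis described in the methods:
# baseline vs end-of-study weight-for-age z-scores compared with a paired
# t-test, and percent of calorie goal as calories received / prescribed.
# The z-scores and calorie figures are hypothetical, not study data.
from scipy import stats

baseline_z = [-1.9, -2.1, -0.8, -1.5, -2.4]   # hypothetical z-scores
end_z      = [-1.8, -2.0, -0.6, -1.4, -2.2]

t_stat, p_value = stats.ttest_rel(baseline_z, end_z)
print(f"paired t-test: t={t_stat:.2f}, p={p_value:.4f}")

kcal_received, kcal_prescribed = 1180, 1200   # hypothetical intake
pct_goal = 100 * kcal_received / kcal_prescribed
print(f"percent of calorie goal: {pct_goal:.1f}%")
```

A negative t statistic here simply reflects that the hypothetical end-of-study z-scores sit above baseline, mirroring the direction of change reported in the results.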
All participants exceeded protein requirements during the study period. Twenty families (76.9%) indicated wanting to continue to use study product after completing the study.</p><p><b>Conclusion:</b> This prospective study demonstrated that a hypercaloric, plant-based, real food ingredient formula among stable, yet medically complex children was well tolerated and calorically adequate to maintain or facilitate weight gain over a 14-day study period. The majority of caregivers preferred to continue use of the study product.</p><p><b>Table 1.</b> Demographic and Clinical Characteristics of Participants (n = 27).</p><p></p><p><b>Poster of Distinction</b></p><p>Gustave Falciglia, MD, MSCI, MSHQPS<sup>1</sup>; Daniel Robinson, MD, MSCI<sup>1</sup>; Karna Murthy, MD, MSCI<sup>1</sup>; Irem Sengul Orgut, PhD<sup>2</sup>; Karen Smilowitz, PhD, MS<sup>3</sup>; Julie Johnson, MSPH PhD<sup>4</sup></p><p><sup>1</sup>Northwestern University Feinberg School of Medicine, Chicago, IL; <sup>2</sup>University of Alabama Culverhouse College of Business, Tuscaloosa, AL; <sup>3</sup>Northwestern University Kellogg School of Business & McCormick School of Engineering, Evanston, IL; <sup>4</sup>University of North Carolina School of Medicine, Chapel Hill, NC</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> Children's Hospital Neonatal Consortium (CHNC) Annual Conference, November 1, 2021, Houston, TX.</p><p><b>Financial Support:</b> None Reported.</p><p>Lyssa Lamport, MS, RDN, CDN<sup>1</sup>; Abigail O'Rourke, MD<sup>2</sup>; Barry Weinberger, MD<sup>2</sup>; Vitalia Boyar, MD<sup>2</sup></p><p><sup>1</sup>Cohen Children's Medical Center of New York, Port Washington, NY; <sup>2</sup>Cohen Children's Medical Center of NY, New Hyde Park, NY</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Premature infants in the neonatal intensive care unit (NICU) are at high risk for peripheral intravenous catheter infiltration (PIVI) because of the frequent cannulation of 
small and fragile vessels. The most common infusate in neonates is parenteral nutrition (PN), followed by antibiotics. Previous reports have suggested that the intrinsic properties of infusates, such as pH, osmolality, and calcium content, determine the severity of PIVIs. This has led to the common practice of restricting the intake of protein and/or calcium to less than that which is recommended for optimal growth and bone mineralization.</p><p><b>Methods:</b> Our objective was to identify the characteristics of infants and intravenous (IV) infusates that were associated with the development of severe neonatal PIVIs. We conducted a retrospective analysis of PIVIs in our level IV NICU from 2018-2022 (n = 120). Each PIVI was evaluated by a wound-certified neonatologist and classified as mild, moderate, or severe using a scoring system based on the Infusion Nurses Society (INS) staging criteria. Comparisons between groups were done using ANOVA and chi-square analysis, or Mann-Whitney tests for non-parametric data.</p><p><b>Results:</b> Infants with severe PIVIs had a lower mean birthweight than those with mild or moderate PIVIs (1413.1 g vs 2116.9 g and 2020.3 g, respectively; p = .01) (Table 1). Most PIVIs occurred during infusions of PN and lipids, but the severity was not associated with the infusion rate, osmolality, or with the concentration of amino acids (median 3.8 g/dL in the mild group, 3.5 g/dL in the moderate group and 3.4 g/dL in the severe group) or calcium (median 6500 mg/L in all groups) (Table 2, Figure 1). Of note, the infusion of IV medications within 24 hours of PIVI was most common in the severe PIVI group (p = .03) (Table 2). Most PIVIs, including mild, were treated with hyaluronidase. 
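The categorical group comparison named in the methods (chi-square across PIVI groups) can be sketched on a 2x2 table; the counts below are hypothetical, not the study's data:

```python
# Hedged sketch of the chi-square comparison described in the methods:
# PIVI severity group vs IV medication exposure within 24 h of the event.
# The 2x2 counts are invented for illustration only.
from scipy.stats import chi2_contingency

# rows: severe vs non-severe PIVI; cols: IV medication within 24 h yes/no
table = [[18, 7],
         [38, 57]]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.4f}")
```

`chi2_contingency` applies Yates' continuity correction by default for 2x2 tables; the `expected` array can be checked against the usual rule of thumb (expected counts of at least 5 per cell) before trusting the asymptotic p-value.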
Advanced wound care was required for 4% of moderate and 44% of severe PIVIs, and none required surgical intervention.</p><p><b>Conclusion:</b> Severe PIVIs in the NICU are most likely to occur in infants with a low birthweight and within 24 hours of administration of IV medications. This is most likely because medications must be kept at acidic or basic pH for stability, and many have high osmolarity and/or intrinsic caustic properties. Thus, medications may induce chemical phlebitis and extravasation with inflammation. In contrast, PN components, including amino acids and calcium, are not related to the severity of extravasations. Our findings suggest that increased surveillance of IV sites for preterm infants following medication administration may decrease the risk of severe PIVIs. Conversely, reducing or withholding parenteral amino acids and/or calcium to mitigate PIVI risk may introduce nutritional deficiencies without decreasing the risk of clinically significant PIVIs.</p><p><b>Table 1.</b> Characteristic Comparison of Mild, Moderate, and Severe PIVIs in Neonatal ICU. 
PIVI Severity Was Designated Based on INS Criteria.</p><p></p><p><b>Table 2.</b> Association of Medication Administration and Components of Infusates With the Incidence and Severity of PIVI in NICU.</p><p></p><p></p><p><b>Figure 1.</b> Infusate Properties.</p><p>Stephanie Oliveira, MD, CNSC<sup>1</sup>; Josie Shiff<sup>2</sup>; Emily Romantic, RD<sup>3</sup>; Kathryn Hitchcock, RD<sup>4</sup>; Gillian Goddard, MD<sup>4</sup>; Paul Wales, MD<sup>5</sup></p><p><sup>1</sup>Cincinnati Children's Hospital Medical Center, Mason, OH; <sup>2</sup>University of Cincinnati, Cincinnati, OH; <sup>3</sup>Cincinnati Children's Hospital Medical Center, Cincinnati, OH; <sup>4</sup>Cincinnati Children's Hospital, Cincinnati, OH; <sup>5</sup>Cincinnati Children's Hospital Medical Center, Cincinnati, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> It is common for children with intestinal failure on parenteral nutrition to be fed an elemental enteral formula, as these formulas are believed to be better tolerated due to the protein module consisting of free amino acids, the absence of other allergens, and the presence of long chain fatty acids. In February 2022, a popular elemental formula on the market was recalled due to bacterial contamination, necessitating an immediate transition to an alternative enteral formula. This included initiating plant-based options for some of our patients. We have experienced growing interest and requests from families to switch to plant-based formulas due to religious practices, cost concerns, and personal preferences. While plant-based formulas lack major allergens and may contain beneficial soluble fiber, they are understudied in this patient population. 
This study aimed to determine whether growth was affected among children with intestinal failure on parenteral nutrition who switched from elemental to plant-based formulas.</p><p><b>Methods:</b> We conducted a retrospective cohort study of IF patients on PN managed by our intestinal rehabilitation program who transitioned from elemental to plant-based formulas during the product recall. Data were collected on demographics, intestinal anatomy, formula intake, parenteral nutrition support, tolerance, stool consistency, and weight gain for 6 months before and after formula transition. Paired analyses were performed for change in growth and nutritional intake, using the Wilcoxon signed-rank test. Chi-squared tests were performed to compare formula tolerance. An alpha value < 0.05 was considered significant.</p><p><b>Results:</b> Eleven patients were included in the study [8 males; median gestational age 33 (IQR 29-35.5) weeks; median age at assessment 20.4 (IQR 18.7-29.7) months]. All participants had short bowel syndrome (SBS) as their IF category. Residual small bowel length was 28 (IQR 14.5-47.5) cm. Overall, there was no statistically significant difference in growth observed after switching to plant-based formulas (p = 0.76) (Figure 1). Both median enteral formula volume and calorie intake were higher on plant-based formula, but the differences were not statistically significant (p = 0.83 and p = 0.41) (Figure 2). Seven of 11 patients (64%) reported decreased stool count (p = 0.078) and improved stool consistency (p = 0.103) after switching to plant-based formula. Throughout the study, the rates of PN calorie and volume weaning were not different after switching to plant-based formula (calories: p = 0.83; volume: p = 0.52) (Figure 3).</p><p><b>Conclusion:</b> In this small study of children with IF, the switch from free amino acid formula to an intact plant-based formula was well tolerated. Growth was maintained between groups. 
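The paired Wilcoxon signed-rank comparison used for change in growth before versus after the formula transition can be sketched as follows; the weight-gain values are illustrative, not the study's data:

```python
# Hypothetical sketch of the Wilcoxon signed-rank test named in the
# methods: paired weight gain (g/day) on elemental vs plant-based
# formula for the same patients. Values are invented for illustration.
from scipy.stats import wilcoxon

gain_elemental   = [12.0, 8.5, 15.2, 10.1, 9.8, 14.0, 11.3]  # g/day
gain_plant_based = [12.3, 8.4, 15.8, 10.5, 10.5, 13.5, 12.1]

stat, p = wilcoxon(gain_elemental, gain_plant_based)
print(f"Wilcoxon signed-rank: W={stat}, p={p:.3f}")
```

With samples this small, `wilcoxon` computes an exact p-value, which is appropriate for a cohort of eleven patients; a non-significant p here mirrors the abstract's finding that growth did not differ between formulas.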
After switching to plant-based formulas, these children tolerated increased enteral volumes, but we were underpowered to demonstrate a statistical difference. There was no evidence of protein allergy among children who switched. Plant-based formulas may be an alternative option to elemental formulas for children with intestinal failure.</p><p></p><p><b>Figure 1.</b> Change in Weight Gain (g/day) on Elemental and Plant-Based Formulas.</p><p></p><p><b>Figure 2.</b> Change in Enteral Calories (kcal/kg/day) and Volume (mL/kg/day) on Elemental and Plant-Based Formulas.</p><p></p><p><b>Figure 3.</b> Change in PN Calories (kcal/kg/day) and Volume (mL/kg/day) on Elemental and Plant-Based Formulas.</p><p>Carly McPeak, RD, LD<sup>1</sup>; Amanda Jacobson-Kelly, MD, MSc<sup>1</sup></p><p><sup>1</sup>Nationwide Children's Hospital, Columbus, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> In pediatrics, post-pyloric enteral feeding via jejunostomy, gastrojejunostomy or naso-jejunostomy tubes is increasingly utilized to overcome problems related to aspiration, severe gastroesophageal reflux, poor gastric motility, and gastric outlet obstruction. Jejunal enteral feeding bypasses the duodenum, a primary site of absorption for many nutrients. Copper is thought to be primarily absorbed in the stomach and proximal duodenum; thus, patients receiving nutritional support in a way that bypasses this region can experience copper deficiency. The prevalence and complications of copper deficiency in the pediatric population are not well documented.</p><p><b>Methods:</b> This was a retrospective case series of two patients treated at Nationwide Children's Hospital (NCH). Medical records were reviewed to collect laboratory, medication/supplement data and enteral feeding history. 
Both patients were receiving Pediasure Peptide® as their enteral formula.</p><p><b>Results:</b> Case 1: A 14-year-old male who had received exclusive post-pyloric enteral nutrition for two years presented with pancytopenia and worsening anemia. Laboratory data drawn in 3/2017 demonstrated deficient levels of copper (< 10 ug/dL) and ceruloplasmin (20 mg/dL). Repletion was initiated with intravenous cupric chloride 38 mcg/kg/day for 3 days, then transitioned to 119 mcg/kg/day via j-tube for 4 months. Labs redrawn 2 months after the initial episode of deficiency indicated overall improvement of the pancytopenia (Table 1). After 4 months, cupric chloride was decreased to 57 mcg/kg/day as a maintenance dose. Laboratory data redrawn two and a half years after the initial episode of deficiency revealed deficient levels of copper (27 ug/dL) and ceruloplasmin (10 mg/dL) despite the lower-dose supplementation being administered. Case 2: An 8-year-old female who had received exclusive post-pyloric enteral nutrition for 3 months. Laboratory data drawn in 3/2019 revealed deficient levels of copper (38 ug/dL) and ceruloplasmin (13 mg/dL). Cupric chloride 50 mcg/kg/day was administered daily through the jejunal tube. Copper and ceruloplasmin labs redrawn 11 and 15 months after initiation of supplementation revealed continued deficiency, though hematologic values remained stable (Table 2).</p><p><b>Conclusion:</b> There are currently no guidelines for clinicians for prevention, screening, treatment, and maintenance of copper deficiency in post-pyloric enteral feeding in pediatrics. Current dosing for copper repletion in profound copper deficiency is largely based on case series and expert opinions. At NCH, the current standard-of-care supplementation demonstrates inconsistent improvement in copper repletion, as evidenced by the case reports discussed above. 
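The weight-based dosing arithmetic running through these cases (e.g., 38 mcg/kg/day of cupric chloride) can be made explicit with a small helper; the weight is hypothetical and this is not a clinical dosing recommendation:

```python
# Hypothetical helper illustrating weight-based dose arithmetic as used
# in the cases above; illustrative only, not clinical guidance.
def daily_dose_mcg(dose_mcg_per_kg: float, weight_kg: float) -> float:
    """Total daily dose in micrograms for a weight-based prescription."""
    return dose_mcg_per_kg * weight_kg

# e.g., 38 mcg/kg/day at a hypothetical 40 kg body weight
print(daily_dose_mcg(38, 40))
```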
Future research should determine appropriate supplementation regimens and evaluate their efficacy in patients receiving post-pyloric enteral feeding.</p><p><b>Table 1.</b> Laboratory Evaluation of Case 1.</p><p></p><p>'-' indicates no data available, bolded indicates result below the lower limit of normal for age.</p><p><b>Table 2.</b> Laboratory Evaluation of Case 2.</p><p></p><p>'-' indicates no data available, bolded indicates result below the lower limit of normal for age.</p><p>Meighan Marlo, PharmD<sup>1</sup>; Ethan Mezoff, MD<sup>1</sup>; Shawn Pierson, PhD, RPh<sup>1</sup>; Zachary Thompson, PharmD, MPH, BCPPS<sup>1</sup></p><p><sup>1</sup>Nationwide Children's Hospital, Columbus, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Parenteral nutrition (PN) remains a high-risk therapy, typically containing more than 40 different ingredients. Incorporation of PN prescribing into the electronic health record (EHR) has been recommended to minimize the risk of transcription errors and allow for implementation of important safety alerts for prescribers. Ambulatory PN prescribing workflows typically require manual transcription of orders in the absence of robust EHR interoperability between systems used for transitions of care. Pediatric patients receiving ambulatory PN have an increased risk of medication events due to the need for weight-based customization and the low utilization of ambulatory PN, which leaves many pharmacies inexperienced. Development and implementation of a prescribing system incorporating inpatient and outpatient functionality within the EHR is necessary to improve the quality and safety of ambulatory PN in pediatric patients. The primary goal was to create a workflow that improved transitions of care by minimizing manual transcription and improving medication safety. 
We describe modification of standard EHR tools to achieve this aim.</p><p><b>Methods:</b> A multidisciplinary team developed and incorporated ambulatory PN prescribing within the EHR at Nationwide Children's Hospital. Inpatient and outpatient variances, safety parameters to provide appropriate alerts to prescribers, and legal requirements were considered and evaluated for optimization within the new system.</p><p><b>Results:</b> The final product successfully incorporated ambulatory PN prescribing while allowing seamless transfer of prescriptions between care settings. The prescriber orders the patient-specific ambulatory PN during an inpatient or outpatient encounter. The order is subsequently queued for pharmacist review/verification to assess the adjustments and determine extended stability considerations in the ambulatory setting. After pharmacist review, the prescription is printed, signed by the provider, and faxed to the pharmacy.</p><p><b>Conclusion:</b> To our knowledge, ours is the first institution to develop and incorporate pediatric PN prescribing into the EHR such that orders transfer between inpatient and outpatient settings without manual transcription while still allowing for customization of PN.</p><p>Faith Bala, PhD<sup>1</sup>; Enas Alshaikh, PhD<sup>1</sup>; Sudarshan Jadcherla, MD<sup>1</sup></p><p><sup>1</sup>The Research Institute at Nationwide Children's Hospital, Columbus, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Extrauterine growth remains a concern among preterm-born infants admitted to the neonatal ICU (NICU) as it rarely matches intrauterine fetal growth rates. Although the reasons are multifactorial, the role played by the duration of exclusive parenteral nutrition (EPN) and the transition to the exclusive enteral nutrition (EEN) phase remains unclear. 
Significant nutrient deficits can exist during the critical phase from birth to EEN, and thereafter, and these likely impact short- and long-term outcomes. Given this rationale, our aims were to examine the effect of the duration from birth to EEN on growth and length of hospital stay (LOHS) among convalescing preterm-born infants with oral feeding difficulties.</p><p><b>Methods:</b> This is a retrospective analysis of prospectively collected data from 77 preterm infants admitted to the all-referral level IV NICU at Nationwide Children's Hospital, Columbus, Ohio, who were later referred to our innovative neonatal and infant feeding disorders program for the evaluation and management of severe feeding/aero-digestive difficulties. Inclusion criteria: infants born < 32 weeks gestation, birthweight < 1500 g, absence of chromosomal/genetic disorders, discharged at term-equivalent postmenstrual age (37-42 weeks PMA) on full oral feeding. Growth variables were converted to age- and gender-specific Z-scores using the Fenton growth charts. Using the Academy of Nutrition and Dietetics criteria for neonates and preterm populations, extrauterine growth restriction (EUGR) was defined as weight Z-score decline from birth to discharge > 0.8. Clinical characteristics stratified by EUGR status were compared using the Chi-Square test, Fisher exact test, Mann-Whitney U test, and t-test as appropriate. Multivariate regression was used to assess the relationship between the duration from birth to EEN and growth Z-scores at discharge simultaneously. Multiple linear regression was used to assess the relationship between the duration from birth to EEN and LOHS.</p><p><b>Results:</b> Forty-two infants (54.5%) had EUGR at discharge, and the proportions of infants with weight and length percentiles < 10% were significantly greater at discharge than at birth (Table 1). 
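The EUGR definition used in the methods (a weight Z-score decline from birth to discharge of more than 0.8) reduces to a simple classification rule, sketched below with hypothetical z-scores (the study derived its z-scores from the Fenton growth charts):

```python
# Minimal sketch of the EUGR classification rule described above.
# Z-scores here are hypothetical; the study used Fenton-chart z-scores.
def is_eugr(z_birth: float, z_discharge: float, threshold: float = 0.8) -> bool:
    """Flag extrauterine growth restriction by weight z-score decline."""
    return (z_birth - z_discharge) > threshold

print(is_eugr(-0.2, -1.3))  # decline of 1.1 -> growth-restricted
print(is_eugr(-0.5, -0.9))  # decline of 0.4 -> not growth-restricted
```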
Infants who were growth-restricted at discharge had a significantly lower gestational age at birth, were more likely to have required mechanical ventilation at birth, had a higher incidence of sepsis, and took longer to attain EEN (Table 2). The duration from birth to EEN was significantly negatively associated with weight, length, and head circumference Z-scores at discharge. Likewise, the duration from birth to EEN was significantly positively associated with LOHS (Figure 1).</p><p><b>Conclusion:</b> The duration from birth to exclusive enteral nutrition (EEN) can influence growth outcomes. We speculate that significant gaps between recommended and actual nutrient intake exist during the period to EEN, particularly for those with chronic tube-feeding difficulties. Well-planned and personalized nutrition is relevant until EEN is established, and enteral nutrition advancement strategies using standardized feeding protocols offer prospects for generalizability while still providing opportunities for personalization.</p><p><b>Table 1.</b> Participant Growth Characteristics.</p><p></p><p><b>Table 2.</b> Participants' Clinical Characteristics.</p><p></p><p></p><p><b>Figure 1.</b> Relationship Between the Duration from Birth to EEN and Growth Parameters and Length of Hospital Stay.</p><p>Alayne Gatto, MS, MBA, RD, CSP, LD, FAND<sup>1</sup>; Jennifer Fowler, MS, RDN, CSPCC, LDN<sup>2</sup>; Deborah Abel, PhD, RDN, LDN<sup>3</sup>; Christina Valentine, MD, MS, RDN, FAAP, FASPEN<sup>4</sup></p><p><sup>1</sup>Florida International University, Bloomingdale, GA; <sup>2</sup>East Carolina Health, Washington, NC; <sup>3</sup>Florida International University, Miami Beach, FL; <sup>4</sup>Banner University Medical Center, The University of Arizona, Tucson, AZ</p><p><b>Financial Support:</b> The Rickard Foundation.</p><p><b>Background:</b> The neonatal registered dietitian nutritionist (NICU RDN) plays a crucial role in the care of premature and critically ill infants in the neonatal 
intensive care unit (NICU). Advanced pediatric competencies are frequently a gap in nutrition degree coursework or dietetic internships. Critical care success with such vulnerable patients requires expertise in patient-centered care, multidisciplinary collaboration, and adaptive clinical problem-solving. This research aimed to identify the needs, engagement levels, and expertise of NICU RDNs, while also providing insights into their job satisfaction and career longevity. Currently in the US, there are approximately 850 Level III and Level IV NICUs in which neonatal dietitians are vital care team members, making it imperative that hospitals provide appropriate compensation, benefits, and educational support.</p><p><b>Methods:</b> This was a cross-sectional examination using a national, online, IRB-approved survey distributed in March 2024 to established neonatal and pediatric dietitian practice groups. A Qualtrics link was provided for current and former NICU RDNs to complete a 10-minute online survey that provided an optional gift card for completion. The link remained open until 200 gift cards were exhausted, approximately one week after the survey opened. Statistical analyses were performed using Stats IQ Qualtrics. In the descriptive statistics, frequencies of responses were represented as counts and percentages. For comparison of differences, the Chi-Squared test and Fisher's Exact test were used for categorical analysis.</p><p><b>Results:</b> In total, 253 current (n = 206) and former (n = 47) NICU RDNs completed the online questionnaire. Of the 210 respondents, 84 (40%) reported having pediatric clinical experience, 94 (44%) had clinical pediatric dietetic intern experience, 21 (10%) had previously worked as a WIC nutritionist, 15 (7.1%) had specialized pediatric certification or fellowship, and 12 (5.7%) had no prior experience before starting in the NICU. 
(Table 1). Of 163 respondents, 83 (50.9%) reported receiving financial support or reimbursement for additional NICU training. Respondents who felt valued as team members planned to stay in the NICU RD role for more than 5 years (p = 0.0046). Additionally, they reported having acknowledgement and appreciation (64.4%), motivation (54.1%), and opportunities for advancement (22.9%) (Table 2).</p><p><b>Conclusion:</b> NICU RDNs do not have a clear competency roadmap or career development track. In addition, financial support or reimbursement for continuing education is not consistently an employee benefit, which may play a key role in job satisfaction and retention. These data provide valuable insight not only for managers of dietitians but also for professional societies building programs and retention opportunities.</p><p><b>Table 1.</b> Question: When You Started as a NICU RD, What Experience Did You Have? (n = 210).</p><p></p><p>N and percentages total more than 210 because respondents could check multiple answers.</p><p><b>Table 2.</b> Comparison of Questions: Do You Feel You Have the Following in Your Role in the NICU and How Long Do You Plan to Stay in Your Role?</p><p></p><p>Sivan Kinberg, MD<sup>1</sup>; Christine Hoyer, RD<sup>2</sup>; Everardo Perez Montoya, RD<sup>2</sup>; June Chang, MA<sup>2</sup>; Elizabeth Berg, MD<sup>2</sup>; Jyneva Pickel, DNP<sup>2</sup></p><p><sup>1</sup>Columbia University Irving Medical Center, New York, NY; <sup>2</sup>Columbia University Medical Center, New York, NY</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Patients with short bowel syndrome (SBS) can have significant fat malabsorption due to decreased intestinal surface area, bile acid deficiency, and rapid transit time, often leading to feeding intolerance and dependence on parenteral nutrition (PN). 
Patients with SBS can have deficiency of pancreatic enzymes and/or reduced effectiveness of available pancreatic enzymes, resulting in symptoms of exocrine pancreatic insufficiency (EPI), including weight loss, poor weight gain, abdominal pain, diarrhea, and fat-soluble vitamin deficiencies. Oral pancreatic enzyme replacement therapy (PERT) is often tried but is impractical for use with enteral nutrition (EN) due to inconsistent enzyme delivery and risk of clogged feeding tubes. When used with continuous EN, oral PERT provides inadequate enzyme delivery as its ability to hydrolyze fats decreases significantly after 30 minutes of ingestion. The significant need for therapies to improve enteral absorption in this population has led to interest in using an in-line digestive cartridge to treat EPI symptoms in SBS patients on tube feedings. Immobilized lipase cartridge is an FDA-approved in-line digestive cartridge designed to hydrolyze fat in EN before it reaches the gastrointestinal tract, allowing for the delivery of absorbable fats through continuous or bolus feeds. In patients with SBS, this modality may be more effective in improving enteral fat absorption compared to oral preparations of PERT. Preclinical studies have demonstrated that use of an in-line digestive cartridge in a porcine SBS model increased fat-soluble vitamin absorption, reduced PN dependence, and improved intestinal adaptation. Our study aims to evaluate changes in PN, EN, growth parameters, stool output, and fat-soluble vitamin levels in pediatric SBS patients using an in-line digestive cartridge at our center.</p><p><b>Methods:</b> Single-center retrospective study in pediatric patients with SBS on EN who used an in-line immobilized lipase (RELiZORB) cartridge. 
Data collection included patient demographics, etiology of SBS, surgical history, PN characteristics (calories, volume, infusion hours/days), EN characteristics (tube type, bolus or continuous feeds, formula, calories, volume, hours), immobilized lipase cartridge use (number of cartridges/day, duration), anthropometrics, stool output, gastrointestinal symptoms, medications (including previous PERT use), laboratory assessments (fat-soluble vitamin levels, fatty acid panels, pancreatic elastase), indication to start the immobilized lipase cartridge, and any reported side effects. Patients with small intestinal transplant or cystic fibrosis were excluded.</p><p><b>Results:</b> Eleven patients were included in the study (mean age 10.4 years, 55% female). The most common etiology of SBS was necrotizing enterocolitis (45%), and 7 patients (64%) were dependent on PN. Interim analysis showed a mean duration of immobilized lipase cartridge use of 3.9 months, a PN calorie decrease in 43% of patients, weight gain in 100% of patients, and improvement in stool output in 6/9 (67%) patients. Clogging of the cartridges was the most common reported technical difficulty (33%), which was overcome with better mixing of the formula. No adverse events or side effects were reported.</p><p><b>Conclusion:</b> In this single-center study, use of an in-line immobilized lipase digestive cartridge in pediatric patients with SBS demonstrated promising outcomes, including weight gain, improved stool output and reduced dependence on PN. These findings suggest that in-line digestive cartridges may play a role in improving fat malabsorption and decreasing PN dependence in pediatric SBS patients. 
Larger multicenter studies are needed to further evaluate the efficacy, tolerability, and safety of in-line digestive cartridges in this population.</p><p>Vikram Raghu, MD, MS<sup>1</sup>; Feras Alissa, MD<sup>2</sup>; Simon Horslen, MB ChB<sup>3</sup>; Jeffrey Rudolph, MD<sup>2</sup></p><p><sup>1</sup>University of Pittsburgh School of Medicine, Gibsonia, PA; <sup>2</sup>UPMC Children's Hospital of Pittsburgh, Pittsburgh, PA; <sup>3</sup>University of Pittsburgh School of Medicine, Pittsburgh, PA</p><p><b>Financial Support:</b> National Center for Advancing Translational Sciences (KL2TR001856.)</p><p><b>Background:</b> Administrative databases can provide unique perspectives in rare disease due to the ability to query multicenter data efficiently. Pediatric intestinal failure can be challenging to study due to the rarity of the condition at single centers and the previous lack of a single diagnosis code. In October 2023, a new diagnosis code for intestinal failure was added to the International Classification of Diseases, 10<sup>th</sup> revision (ICD-10). We aimed to describe the usage and limitations of this code to identify children with intestinal failure.</p><p><b>Methods:</b> We performed a multicenter cross-sectional study using the Pediatric Health Information Systems database from October 1, 2023, to June 30, 2024. Children with intestinal failure were identified by ICD-10 code (K90.83). Descriptive statistics were used to characterize demographics, diagnoses, utilization, and outcomes.</p><p><b>Results:</b> We identified 1804 inpatient encounters from 849 unique patients with a diagnosis code of intestinal failure. Figure 1 shows the trend of code use by month since its inception in October 2023. The 849 patients had a total of 7085 inpatient encounters over that timeframe, meaning only 25% of their encounters included the intestinal failure diagnosis. 
Among these 849 patients, 638 had at least one encounter during the timeframe in which they received parenteral nutrition; in 400 of these, the admission also carried an intestinal failure diagnosis code. Examining only inpatient stays longer than 2 days, 592/701 (84%) patients with such a stay received parenteral nutrition. Central line-associated bloodstream infections accounted for 501 encounters. Patients spent a median of 37 days (IQR 13-96) in the hospital. Patients were predominantly non-Hispanic White (43.7%), non-Hispanic Black (14.8%), or Hispanic (27.9%). Most had government-based insurance (63.5%). Child Opportunity Index was evenly split among all five quintiles. The total standardized cost from all encounters with an intestinal failure diagnosis was $157 million; the total from all encounters for these patients was $259 million. The median cost over those 9 months per patient was $104,890 (IQR $31,149 - $315,167). Death occurred in 28 patients (3.3%) over the study period.</p><p><b>Conclusion:</b> The diagnosis code of intestinal failure has been used inconsistently since implementation in October 2023, perhaps due to varying definitions of intestinal failure. Children with intestinal failure experience high inpatient stay costs and rare but significant mortality.
Future work must consider the limitations of using only the new code in identifying these patients.</p><p></p><p><b>Figure 1.</b> Number of Encounters With an Intestinal Failure Diagnosis Code.</p><p><b>Poster of Distinction</b></p><p>Kera McNelis, MD, MS<sup>1</sup>; Allison Ta, MD<sup>2</sup>; Ting Ting Fu, MD<sup>2</sup></p><p><sup>1</sup>Emory University, Atlanta, GA; <sup>2</sup>Cincinnati Children's Hospital Medical Center, Cincinnati, OH</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> 6th Annual Pediatric Early Career Research Conference, August 27, 2024, Health Sciences Research Building I, Emory University.</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> The neonatal period is a time of rapid growth, and many infants who require intensive care need extra nutritional support. The Academy of Nutrition and Dietetics (AND) published an expert consensus statement to establish criteria for the identification of neonatal malnutrition. There is limited evidence regarding outcomes associated with diagnosis from these opinion-derived criteria. The objective of this study was to compare anthropometric-based malnutrition indicators with direct body composition measurements in infancy.</p><p><b>Methods:</b> Air displacement plethysmography is considered the gold standard for non-invasive body composition measurement, and this was incorporated into routine clinical care at a referral Level IV neonatal intensive care unit. Late preterm and term infants (34-42 weeks gestational age) with body composition measurement available were included in this study. Infants were categorized as having malnutrition per AND criteria. Fat mass, fat-free mass, and body fat percentage z-scores were determined per Norris body composition growth curves. Logistic regression was conducted to ascertain the relationship of fat mass, fat-free mass, and body fat percentage with malnutrition diagnosis. 
Linear regression was performed to predict body mass index (BMI) at age 18-24 months from each body composition variable.</p><p><b>Results:</b> Eighty-four infants were included, with 39% female and 96% singleton (Table 1). Fifteen percent were small for gestational age and 12% were large for gestational age at birth. Nearly half had a congenital intestinal anomaly, including gastroschisis and intestinal atresia. Sixty-three percent of the group met at least one malnutrition criterion. Fat-free mass z-score was negatively associated with a malnutrition diagnosis, with an odds ratio of 0.77 (95% CI 0.59-0.99, p < 0.05). There was no statistically significant association between malnutrition diagnosis and body fat percentage or fat mass. There was no statistically significant relationship between any body composition variable and BMI at 18-24 months, even after removing outliers with a high Cook's distance.</p><p><b>Conclusion:</b> Malnutrition diagnosis is associated with low fat-free mass in critically ill term infants. Body composition is not a predictor of later BMI in this small study.</p><p><b>Table 1.</b> Characteristics of Late Preterm and Term Infants in the Neonatal Intensive Care Unit With a Body Composition Measurement Performed Via Air Displacement Plethysmography.</p><p></p><p>John Stutts, MD, MPH<sup>1</sup>; Yong Choe, MAS<sup>1</sup></p><p><sup>1</sup>Abbott, Columbus, OH</p><p><b>Financial Support:</b> Abbott.</p><p><b>Background:</b> The prevalence of obesity in children is rising. Despite awareness and efforts toward weight reduction, less is known about malnutrition in children with obesity. The purpose of this study was to evaluate the prevalence of obesity in U.S.
children and determine which combination of indicators best defines malnutrition in this population.</p><p><b>Methods:</b> The 2017-2018 National Health and Nutrition Examination Survey (NHANES) database was utilized to assess biomarkers (most recent complete dataset due to the COVID-19 pandemic). Trends in prevalence were obtained from 2013-2018 survey data. Obesity was defined as ≥ 95th percentile of the CDC sex-specific BMI-for-age growth charts. Cohort age range was 12-18 years. Nutrient intake and serum levels were analyzed for vitamin D, vitamin E, vitamin C, potassium, calcium, vitamin B9, vitamin A, total protein, albumin, globulin, high-sensitivity C-reactive protein (hs-CRP), iron, hemoglobin, and mean corpuscular volume (MCV). Intake levels of fiber were also analyzed. Children consuming supplements were excluded. Categorical and continuous data analysis was performed using SAS® Proc SURVEYFREQ (with Wald chi-square test) and Proc SURVEYMEANS (with t-test), respectively, in SAS® Version 9.4 and SAS® Enterprise Guide Version 8.3. Hypothesis tests were performed using 2-sided, 0.05 level tests. Results were reported as mean ± standard error (SE) (n = survey sample size) or percent ± SE (n).</p><p><b>Results:</b> The prevalence of obesity in the cohort was 21.3% ± 1.3 (993). Prevalence trended upward annually: 20.3% ± 2.1 (1232) in 2013-2014 and 20.5% ± 2.0 (1129) in 2015-2016. When compared with children without obesity, the mean serum levels of those with obesity were significantly (p ≤ 0.05) lower for vitamin D (50.3 ± 2.4 vs. 61.5 ± 2.1, p < 0.001), iron (14.4 ± 0.5 vs. 16.4 ± 0.4, p < 0.001), albumin (41.7 ± 0.3 vs. 43.3 ± 0.3, p < 0.001), and MCV (83.8 ± 0.5 vs. 85.8 ± 0.3, p = 0.003). When compared with children without obesity, the mean serum levels of those with obesity were significantly (p ≤ 0.05) higher for total protein (73.2 ± 0.4 vs. 72.0 ± 0.3, p = 0.002), globulin (31.5 ± 0.4 vs. 28.7 ± 0.3, p < 0.001) and hs-CRP (3.5 ± 0.3 vs. 1.2 ± 0.2, p < 0.001).
A higher prevalence of insufficiency was found for vitamin D (51.9% ± 5.6 vs. 26.8% ± 3.7, p = 0.001), hemoglobin (16.3% ± 3.1 vs. 7.5% ± 1.8, p = 0.034) and the combination of low hemoglobin + low MCV (11.2% ± 2.9 vs. 3.3% ± 1.0, p = 0.049). All other serum levels were not significantly (p > 0.05) different, with no significant difference in intake.</p><p><b>Conclusion:</b> Results indicate a continued increase in the prevalence of obesity in children. Compared with the non-obese pediatric population, children with obesity showed differences in micro- and macronutrient serum levels despite no significant differences in dietary intake of these nutrients. The higher prevalence of low hemoglobin + low MCV supports iron deficiency and adds clinical relevance to the data surrounding low mean blood levels of iron. Children with obesity show higher mean globulin and hs-CRP levels consistent with an inflammatory state. The results underscore the existence of malnutrition in children with obesity and the need for nutrition awareness in this pediatric population.</p><p>Elisha London, BS, RD<sup>1</sup>; Derek Miketinas, PhD, RD<sup>2</sup>; Ariana Bailey, PhD, MS<sup>3</sup>; Thomas Houslay, PhD<sup>4</sup>; Fabiola Gutierrez-Orozco, PhD<sup>1</sup>; Tonya Bender, MS, PMP<sup>5</sup>; Ashley Patterson, PhD<sup>1</sup></p><p><sup>1</sup>Reckitt/Mead Johnson, Evansville, IN; <sup>2</sup>Data Minded Consulting, LLC, Houston, TX; <sup>3</sup>Reckitt/Mead Johnson Nutrition, Henderson, KY; <sup>4</sup>Reckitt/Mead Johnson Nutrition, Manchester, England; <sup>5</sup>Reckitt/Mead Johnson Nutrition, Newburgh, IN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> The objective was to examine whether nutrient intake varied across malnutrition classification among a nationally representative sample of children and adolescents.</p><p><b>Methods:</b> This was a secondary analysis of children and adolescents 1-18 y who participated in the National Health and Nutrition
Examination Survey 2001-March 2020. Participants were excluded if they were pregnant or did not provide at least one reliable dietary recall. The degree of malnutrition risk was assessed by weight-for-height and BMI-for-age Z-scores for 1-2 y and 3-18 y, respectively. As per the Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition, malnutrition was classified using Z-scores: none (Z > -1), mild (Z between -1 and -1.9), and moderate/severe (Z ≤ -2). Dietary intake was assessed from one to two 24-hr dietary recalls. Usual intakes of macronutrients from foods and beverages, and of micronutrients from foods, beverages, and supplements were assessed using the National Cancer Institute method. The cut-point approach was used to estimate the proportion of children with intake below the estimated average requirement for micronutrients and outside the acceptable macronutrient distribution range for macronutrients. All analyses were adjusted for sampling methodology and the appropriate sample weights were applied. Independent samples t-tests were conducted to compare estimates within age groups between nutrition status classifications, with no malnutrition as the reference group.</p><p><b>Results:</b> A total of 32,188 participants were analyzed. Of those, 31,689 (98.4%) provided anthropometrics. The majority (91%) did not meet criteria for malnutrition, while 7.4% and 1.6% met criteria for mild and moderate/severe malnutrition, respectively. Dietary supplement use (mean[SE]) was reported among 33.3[0.7]% of all children/adolescents. Of those experiencing moderate/severe malnutrition, inadequate calcium intake was greatest in adolescents 14-18 y (70.6[3.3]%), and older children 9-13 y (69.7[3.4]%), compared to 3-8 y (29.5[3.9]%) and 1-2 y (5.0[1.2]%). 
In children 9-13 y, risk for inadequate calcium intake was greater among those experiencing mild (65.3[2.1]%) and moderate/severe malnutrition (69.7[2.1]%) compared to those not experiencing malnutrition (61.2[1.1]%). Similarly, in those experiencing moderate/severe malnutrition, inadequate zinc and phosphorus intake was greatest in adolescents 14-18 y (20.9[3.3]% and 30.6[3.4]%, respectively) and 9-13 y (13.4[2.6]% and 33.9[3.6]%, respectively) compared to younger children (range 0.2[0.1]-0.9[0.4]% and 0.2[0.1]-0.4[0.2]% in 1-8 y, respectively). Although the greatest risk for inadequate protein intake was observed among those experiencing moderate/severe malnutrition, percent energy intake from protein and carbohydrate was adequate for most children and adolescents. Total and saturated fat were consumed in excess by all age groups regardless of nutrition status. The greatest risk for excessive saturated fat intake was reported in those experiencing moderate/severe malnutrition (range across all age groups: 85.9-95.1%).</p><p><b>Conclusion:</b> Older children and adolescents experiencing malnutrition were at greatest risk for inadequate intakes of certain micronutrients such as calcium, zinc, and phosphorus. These results may indicate poor diet quality among those at greatest risk for malnutrition, especially adolescents.</p><p>Anna Benson, DO<sup>1</sup>; Louis Martin, PhD<sup>2</sup>; Katie Huff, MD, MS<sup>2</sup></p><p><sup>1</sup>Indiana University School of Medicine, Carmel, IN; <sup>2</sup>Indiana University School of Medicine, Indianapolis, IN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Trace metals are essential for growth and are especially important in the neonatal period. Recommendations are available for intake of these trace metals in the neonate. However, these recommendations are based on limited data and there are few available descriptions regarding trace metal levels in neonates and their influence on outcomes. 
In addition, monitoring trace metal levels can be difficult, as multiple factors, including inflammation, can affect accuracy. The goal of this project was to evaluate patient serum levels of zinc, selenium, and copper and related outcomes, including growth, rate of cholestasis, sepsis, bronchopulmonary dysplasia (BPD), and death, in a parenterally dependent cohort admitted to the neonatal intensive care unit (NICU).</p><p><b>Methods:</b> We completed a retrospective chart review of NICU patients who received parenteral nutrition (PN) and had trace metal panels drawn between January 2016 and February 2023. Charts were reviewed for baseline labs, time on PN, trace metal panel levels, dose of trace metals in PN, enteral feeds and supplements, and outcomes including morbidities and mortality. Sepsis was diagnosed based on a positive blood culture and cholestasis as a direct bilirubin >2 mg/dL. Fisher's exact test or chi-square tests were used to assess associations between categorical variables. Spearman correlation was used to assess the correlation between two continuous variables. A p-value of 0.05 was used for significance.</p><p><b>Results:</b> We included 98 patients in the study, with demographic data shown in Table 1. The numbers of patients with trace metal elevation and deficiency are noted in Tables 1 and 2, respectively. The correlation between growth and trace metal levels is shown in Figure 1. Patient outcomes related to the diagnosis of trace metal deficiency are noted in Table 2. Copper deficiency was found to be significantly associated with sepsis (p = 0.010) and BPD (p = 0.033). Selenium deficiency was associated with cholestasis (p = 0.001) and BPD (p = 0.003). To further assess the relation between selenium and cholestasis, Spearman analysis noted a significant negative correlation between selenium and direct bilirubin levels (p = 0.002; Figure 2).</p><p><b>Conclusion:</b> Trace metal deficiency was common in our population.
In addition, selenium and copper deficiency were associated with neonatal morbidities, including sepsis, cholestasis, and BPD. When assessing selenium deficiency and cholestasis, the measured selenium level was found to correlate with direct bilirubin level. While there was correlation between trace metal levels and growth, the negative association noted is unclear and highlights the need for further assessment to determine the influence of other patient factors and the technique used for growth measurement. Overall, this project highlights the important relation between trace metals and neonatal morbidities. Further research is needed to understand how trace metal supplementation might be used to optimize neonatal outcomes in the future.</p><p><b>Table 1.</b> Patient Demographic and Outcome Information for Entire Population Having Trace Metal Levels Obtained. (Total population n = 98 unless noted).</p><p></p><p><b>Table 2.</b> Rate of Trace Metal Deficiency and Association With Patient Outcomes.</p><p>(Total n = 98).</p><p></p><p></p><p>Scatter plot of average trace metal level and change in growth over time. (Growth represented as change in parameter over days). The Spearman correlation coefficient is listed for each graph. Significance is noted by symbol: *p < 0.05, †p < 0.01, ‡p < 0.001.</p><p><b>Figure 1.</b> Correlation of Trace Metal Level and Growth.</p><p></p><p>Scatter plot of individual direct bilirubin levels plotted by selenium levels.
The Spearman correlation coefficient is noted, showing a negative correlation (p = 0.002).</p><p><b>Figure 2.</b> Correlation of Selenium Level With Direct Bilirubin Level.</p><p>Kaitlin Berris, RD, PhD (student)<sup>1</sup>; Qian Zhang, MPH<sup>2</sup>; Jennifer Ying, BA<sup>3</sup>; Tanvir Jassal, BSc<sup>3</sup>; Rajavel Elango, PhD<sup>4</sup></p><p><sup>1</sup>BC Children's Hospital, North Vancouver, BC; <sup>2</sup>BCCHR, Vancouver, BC; <sup>3</sup>University of British Columbia, Vancouver, BC; <sup>4</sup>UBC/BCCHR, Vancouver, BC</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Pediatric critical illness causes increased demand for several nutrients. Children admitted who require nutrition support receive a nasogastric tube to deliver enteral nutrition (EN) formula as liquid nutrition. These formulas are developed using dietary reference intakes (DRI) for healthy populations and do not account for altered nutrient needs in critical illness. Imbalanced nutrient delivery during this vulnerable time could lead to malnutrition and negatively impact hospital outcomes. Published guidelines by the American Society for Parenteral and Enteral Nutrition (ASPEN) in 2017 aim to alleviate poor nutrition: use concentrated formula in fluid restriction, meet 2/3 of estimated calories (resting energy expenditure, REE) by the end of the first week, provide a minimum of 1.5 g/kg/day dietary protein, and provide the DRI for each micronutrient. The objective of this retrospective cohort study was to evaluate nutrition delivery (prescribed vs. delivered) compared to the 2017 guidelines and correlate delivery to adequacy in children admitted to a Canadian PICU.</p><p><b>Methods:</b> Three years of charts were included across two retrospective cohorts: September 2018-December 2020 and February 2022-March 2023. The first cohort, paper-chart based, included children 1-18 y with tube feeding started within 3 d after admission.
The second cohort, after transition to electronic medical records, included children 1-6 y on exclusive tube feeding during the first week of admission. Patient characteristics, daily formula type, rate prescribed (physician order), amount delivered (nursing notes), and interruption hours and reasons were collected. Statistical analysis included descriptive analysis of characteristics and logistic regression for odds of achieving adequacy of intake with two exposures: age categories and formula type. Pearson correlation was used to relate interruption hours to percentage of calories met.</p><p><b>Results:</b> Included patients (n = 86) spanned 458 nutrition support days (NSD). Admissions were predominantly for respiratory disease (73%), requiring ventilation support (81.4%). Calorie prescription (WHO REE equation) was met in 20.3% of NSD, and 43.9% met 2/3 of the calorie recommendation (Table 1). Concentrated calories were provided in 34% of patients. Hours of interruption and percentage of goal calories met were negatively correlated (r = -0.52, p = 0.002) among those ordered EN without a prior EN history (i.e., not previously tube fed at home). Patients with more than 4 h of interruptions were more likely to miss the 2/3 calorie goal. Calorie goals were met when standard pediatric formula was used in children 1-8 y and concentrated adult formula in 9-18 y. Odds of meeting the calorie goal increased by 85% per 1-day increase (OR 1.85 [1.52, 2.26], p < .0001), with a median of 4 d after admission to meet 2/3 of calories. Minimum protein intake (1.5 g/kg/d) was met in only 24.9% of all NSD. Micronutrients examined, except for vitamin D, met the DRI based on prescribed amounts. Delivered amounts provided suboptimal micronutrient intake, especially for vitamin D (Figure 1).</p><p><b>Conclusion:</b> Current ASPEN recommendations for EN were not being achieved in critically ill children. Concentrated formula was often not chosen, which decreased the ability to meet calorie goals in younger patients.
Prescribing shorter continuous EN duration (20/24 h) may improve odds of meeting calorie targets. Evaluation of NSD showed an improving trend in calorie intake over the first week toward meeting the 2/3 goal recommendation. However, results highlight protein inadequacy even as calorie needs are increasingly met. Attention to micronutrient delivery, with supplementation of vitamin D, is required.</p><p><b>Table 1.</b> Enteral Nutrition Characteristics Per Nutrition Support Days. (N = 458)</p><p></p><p></p><p>Estimated Vitamin D Intake and 95% Confidence Intervals by Age and Formula Groups.</p><p><b>Figure 1.</b> Estimated Vitamin D Intake by Age and Formula Groups.</p><p>Dana Steien, MD<sup>1</sup>; Megan Thorvilson, MD<sup>1</sup>; Erin Alexander, MD<sup>1</sup>; Molissa Hager, NP<sup>1</sup>; Andrea Armellino, RDN<sup>1</sup></p><p><sup>1</sup>Mayo Clinic, Rochester, MN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Home parenteral nutrition (HPN) is a life-sustaining therapy for children with long-term digestive dysfunction. Historically, HPN has been considered a bridge to enteral autonomy or intestinal transplantation. However, thanks to medical and management improvements, HPN is now used for a variety of diagnoses, including intractable feeding intolerance (IFI) in children with severe neurological impairment (SNI). IFI occurs most often near the end of life (EOL) in patients with SNI. Thus, outpatient planning and preparation for HPN in this population vastly differs from historical HPN use.</p><p><b>Methods:</b> Case series of four pediatric patients with SNI who developed IFI and utilized HPN during their EOL care. Data was collected by retrospective chart review. The hospital pediatric palliative care service was heavily involved in the patients’ care when HPN was discussed and planned.
The pediatric intestinal rehabilitation (PIR) and palliative care teams worked closely together during discharge planning and throughout the outpatient courses.</p><p><b>Results:</b> The children with SNI in this case series developed IFI between ages 1 and 12 years. Duration of HPN use varied from 5 weeks to 2 years. All patients were enrolled in hospice, but at various stages. Routine outpatient HPN management plans and expectations were modified based on each family's goals of EOL care. The use and timing of laboratory studies, fever plans, central line issues, growth, and follow-up appointments required detailed discussion and planning.</p><p><b>Conclusion:</b> EOL care for children differs from most EOL care in adults. Providing HPN to children with SNI and IFI can provide time, opportunities, and peace for families during their child's EOL journey, if it aligns with their EOL goals. PIR teams can provide valuable HPN expertise for palliative care services and families during these challenging times.</p><p>Jessica Lowe, DCN, MPH, RDN<sup>1</sup>; Carolyn Ricciardi, MS, RD<sup>2</sup>; Melissa Blandford, MS, RD<sup>3</sup></p><p><sup>1</sup>Nutricia North America, Roseville, CA; <sup>2</sup>Nutricia North America, Rockville, MD; <sup>3</sup>Nutricia North America, Greenville, NC</p><p><b>Financial Support:</b> This study was conducted by Nutricia North America.</p><p><b>Background:</b> Extensively hydrolyzed formulas (eHFs) are indicated for the management of cow milk allergy (CMA) and related symptoms. This category of formulas is often associated with an unpleasant smell and bitter taste; however, whey-based eHFs are considered more palatable than casein-based eHFs.<sup>1-4</sup> The inclusion of lactose, the primary carbohydrate in human milk, in hypoallergenic formula can also promote palatability. Historically, concerns for residual protein traces in lactose have resulted in complete avoidance of lactose in CMA.
However, “adverse reactions to lactose in CMA are not supported in the literature, and complete avoidance of lactose in CMA is not warranted.”<sup>5</sup> Clinicians in the United Kingdom previously reported that taste and acceptance of an eHF is an important consideration when prescribing, as they believe a palatable eHF may result in decreased formula refusal and lead to more content families.<sup>1</sup> The objective of this study was to understand caregiver sensory perspectives on an infant whey-based eHF containing lactose.</p><p><b>Methods:</b> Fifteen clinical sites were recruited from across the United States. Clinicians enrolled 132 infants, whose families received the whey-based eHF for 2 weeks based on clinician recommendation. Caregivers completed two surveys: an enrollment survey and a 2-week post-survey characterizing eHF intake, CMA-related symptoms, stooling patterns, sensory perspectives, and satisfaction with the eHF. Data was analyzed using SPSS 27 and descriptive statistics.</p><p><b>Results:</b> One hundred and twenty-two infants completed the study. At enrollment, infants were 22 (±14.7) weeks old. Prior to study initiation, 12.3% of infants were breastfed, 40.2% were on a casein-based eHF, and 18.9% were on a standard formula with intact proteins. Most patients (97.5%) were fed orally and 2.5% were tube fed. Among all parents who responded, 92.5% (n = 86/93) reported better taste and 88.9% (n = 96/108) reported better smell for the whey-based formula containing lactose compared to the previous formula. For caregivers whose child was on a casein-based eHF at enrollment and responded, 97.6% (n = 41/42) reported better taste and 95.7% (n = 44/46) reported better smell than the previous formula. Additional caregiver-reported perceptions of taste and smell are reported in Figure 1 and Figure 2, respectively.
Finally, 89.3% of caregivers said it was easy to start their child on the whey-based eHF containing lactose and 91.8% would recommend it to other caregivers whose child requires a hypoallergenic formula.</p><p><b>Conclusion:</b> The majority of caregivers had a positive sensory experience with the whey-based eHF containing lactose compared to the baseline formulas. Additionally, they found the trial formula easy to transition to and would recommend it to other families. These data support the findings of Maslin et al. and support the clinicians' expectation that good palatability would result in better acceptance and more content infants and families.<sup>1</sup> Further research is needed to better understand how improved palatability can contribute to decreased waste and health care costs.</p><p></p><p><b>Figure 1.</b> Caregiver Ranking: Taste of Whey-Based, Lactose-Containing eHF.</p><p></p><p><b>Figure 2.</b> Caregiver Ranking: Smell of Whey-Based, Lactose-Containing eHF.</p><p>Michele DiCarlo, PharmD<sup>1</sup>; Emily Barlow, PharmD, BCPPS<sup>1</sup>; Laura Dinnes, PharmD, BCIDP<sup>1</sup></p><p><sup>1</sup>Mayo Clinic, Rochester, MN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> There is limited information on hyperkalemia in adult patients who received trimethoprim-sulfamethoxazole (TMP-SMX). The mechanism for hyperkalemia is related to the TMP component, which is structurally related to the potassium-sparing diuretic amiloride. In children, there is no information on clinical impact or required monitoring. We noted that a pediatric patient on total parenteral nutrition (TPN) had a drop in TPN potassium dosing once TMP-SMX was started. This reduction persisted for two weeks following the last dose of the antibiotic. Case presentation: A 7-month-old was in a cardiac intensive care unit following complex surgical procedures and a requirement for extracorporeal membrane oxygenation.
TPN was started due to concerns related to poor perfusion and possible necrotizing enterocolitis. TPN continued for a total of one hundred and ten days. Per protocol while on TPN, electrolytes and renal function (urine output, serum creatinine) were monitored daily. Diuretic therapy, including loop diuretics, chlorothiazide, and spironolactone, was prescribed prior to TPN and continued for the entire duration. Renal function remained stable for the duration of TPN therapy. Dosing of potassium in the TPN was initiated per ASPEN guidelines and adjusted for serum potassium levels. Due to respiratory requirements and positive cultures, TMP-SMX was added to the medication regimen on two separate occasions. TMP-SMX 15 mg/kg/day was ordered twelve days after the start of the TPN and continued for three days. TMP-SMX 15 mg/kg/day was again started on day forty-three of TPN and continued for a five-day duration. Serum potassium was closely monitored for adjustments once TMP-SMX was started. When the TPN was successfully weaned off, we reviewed this information again. TPN potassium dosing dropped noticeably by day two of both TMP-SMX regimens and did not return to the prior stable dosing until approximately two weeks after the last dose of the antibiotic. This reduction lasted far beyond the projected half-life of TMP-SMX (Figure 1). Discussion: TMP-SMX is known for potential hyperkalemia in adult patients with multiple confounding factors. Factors include high dosage, renal dysfunction, congestive heart failure, and concomitant medications known to cause hyperkalemia. Little literature exists to note this side effect in pediatrics. The onset of our patient's increased serum potassium levels, and concurrent decrease in TPN dosing, could be expected, as TMP-SMX's time to peak effect is 1-4 hours. Half-life of TMP in children < 2 years old is 5.9 hours.
Given this information, one would expect TMP-SMX to be cleared approximately thirty hours from the last dose administered. Our patient's potassium dosing took approximately two weeks after the end of TMP-SMX administration to return to pre-TMP-SMX levels for both treatment regimens. Potential causes for the extended time to stabilize include concurrent high-dose TMP-SMX and continuation of the potassium-sparing diuretic. Prolonged potassium monitoring for pediatric patients started on high-dose TMP-SMX while on TPN should be considered and further evaluation explored.</p><p><b>Methods:</b> None Reported.</p><p><b>Results:</b> None Reported.</p><p><b>Conclusion:</b> None Reported.</p><p></p><p>Graph representing TPN potassium dose (in mEq/kg/day) and the addition of TMP-SMX on two separate occasions. Note the drop in TPN potassium dose and delayed return after each TMP-SMX regimen.</p><p><b>Figure 1.</b> TPN Potassium Dose and TMP-SMX Addition.</p><p>Jennifer Smith, MS, RD, CSP, LD, LMT<sup>1</sup>; Praveen Goday, MBBS<sup>2</sup>; Lauren Storch, MS, RD, CSP, LD<sup>2</sup>; Kirsten Jones, RD, CSP, LD<sup>2</sup>; Hannah Huey, MDN<sup>2</sup>; Hilary Michel, MD<sup>2</sup></p><p><sup>1</sup>Nationwide Children's Hospital, Dresden, OH; <sup>2</sup>Nationwide Children's Hospital, Columbus, OH</p><p><b>Financial Support:</b> North American Society for Pediatric Gastroenterology, Hepatology, and Nutrition (NASPGHAN) Foundation.</p><p><b>Background:</b> The prevalence of Avoidant/Restrictive Food Intake Disorder (ARFID) is approximately 3% in the general population and 10% in adults with inflammatory bowel disease (IBD).
Up to 10% of adolescents with IBD have reported disordered eating behaviors; however, there have been no prospective studies on the prevalence of ARFID eating behaviors in this population.</p><p><b>Methods:</b> This is a one-time, cross-sectional, non-consecutive study of English-speaking patients with a confirmed diagnosis of IBD, aged 12-18 years. Participants completed the validated Nine Item ARFID Screen (NIAS), the SCOFF eating disorder screen (this screen utilizes an acronym [<span>S</span>ick, <span>C</span>ontrol, <span>O</span>ne, <span>F</span>at, and <span>F</span>ood] in relation to the five questions on the screen) and answered one question about perceived food intolerances. The NIAS is organized into the three specific ARFID domains: eating restriction due to picky eating, poor appetite/limited interest in eating, and fear of negative consequences from eating, each of which is addressed by three questions. Questions are based on a 6-point Likert scale. Participants scoring ≥23 on the total scale or ≥12 on an individual subscale were considered to meet criteria for ARFID eating behaviors. Since some individuals with a positive NIAS screen may have anorexia nervosa or bulimia, we also used the SCOFF questionnaire to assess the possible presence of an eating disorder. A score of two or more positive answers has a sensitivity of 100% and specificity of 90% for anorexia nervosa or bulimia. Apart from descriptive statistics, Chi-square testing was used to study the prevalence of malnutrition, positive SCOFF screening, or food intolerances in patients with and without positive NIAS screens.</p><p><b>Results:</b> We enrolled 82 patients whose demographics are shown in Table 1. Twenty percent (16/82) scored positive on the NIAS questionnaire, 16% (13/82) scored positive on the SCOFF questionnaire, and 48% (39/82) noted food intolerances. 
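The NIAS decision rule described in the Methods (three subscales of three items each, positive at a total score ≥23 or any subscale ≥12) can be sketched as a small scoring function. This is an illustrative reimplementation of the thresholds only, not the validated instrument; item scores are assumed to run 0-5 on the 6-point Likert scale, and all names are our own.

```python
# Illustrative sketch of the NIAS positivity rule: nine items in three
# subscales (picky eating, appetite, fear of negative consequences),
# each item assumed scored 0-5. Positive at total >= 23 or any
# subscale >= 12. Not the validated instrument itself.

from typing import Dict, List

def nias_positive(subscale_scores: Dict[str, List[int]]) -> bool:
    """subscale_scores maps each of the three domains to its three item scores."""
    subscale_totals = {name: sum(items) for name, items in subscale_scores.items()}
    total = sum(subscale_totals.values())
    return total >= 23 or any(s >= 12 for s in subscale_totals.values())

# Example: a high fear-of-consequences subscale alone triggers a positive screen,
# even though the total (21) is below the 23-point cut-point.
example = {"picky": [1, 2, 1], "appetite": [2, 1, 2], "fear": [4, 4, 4]}
print(nias_positive(example))  # True (fear subscale = 12)
```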
Of the 16 participants who scored positive on the NIAS, 50% (8/16) were male, 56% (9/16) had a diagnosis of Crohn's Disease, and 69% (11/16) had inactive disease. Twenty-five percent of those with a positive NIAS (4/16) met criteria for malnutrition (1 mild, 2 moderate, and 1 severe). Sixty-nine percent of those who scored positive on the NIAS (11/16) noted food intolerances and 30% (5/16) had a positive SCOFF screener. The prevalence of malnutrition (p = 0.4), the percentage of patients who scored positive on the SCOFF eating disorder screen (p = 0.3), and the percentage with reported food intolerances (p = 0.6) were similar in participants who scored positive on the NIAS vs. not.</p><p><b>Conclusion:</b> Using the NIAS, 20% of adolescents with IBD met criteria for ARFID. Participants were no more likely to have malnutrition, a positive score on the SCOFF eating disorder screen, or reported food intolerances whether or not they met criteria for ARFID. Routine screening of adolescents with IBD for ARFID or other eating disorders may identify patients who would benefit from further evaluation.</p><p><b>Table 1.</b> Demographics.</p><p></p><p>Qian Wen Sng, RN<sup>1</sup>; Jacqueline Soo May Ong<sup>2</sup>; Sin Wee Loh, MB BCh BAO (Ireland), MMed (Paed) (Spore), MRCPCH (RCPCH, UK)<sup>1</sup>; Joel Kian Boon Lim, MBBS, MRCPCH, MMed (Paeds)<sup>1</sup>; Li Jia Fan, MBBS, MMed (Paeds), MRCPCH (UK)<sup>3</sup>; Rehena Sultana<sup>4</sup>; Chengsi Ong, BS (Dietician), MS (Nutrition and Public Health), PhD<sup>1</sup>; Charlotte Lin<sup>3</sup>; Judith Ju Ming Wong, MB BCh BAO, LRCP & SI (Ireland), MRCPCH (Paeds) (RCPCH, UK)<sup>1</sup>; Ryan Richard Taylor<sup>3</sup>; Elaine Hor<sup>2</sup>; Pei Fen Poh, MSc (Nursing), BSN<sup>1</sup>; Priscilla Cheng<sup>2</sup>; Jan Hau Lee, MCI, MRCPCH (UK), USMLE, MBBS<sup>1</sup></p><p><sup>1</sup>KK Hospital, Singapore; <sup>2</sup>National University Hospital, Singapore; <sup>3</sup>National University Hospital Singapore, Singapore; 
<sup>4</sup>Duke-NUS Graduate Medical School, Singapore</p><p><b>Financial Support:</b> This work is supported by the National Medical Research Council, Ministry of Health, Singapore.</p><p><b>Background:</b> Protein-energy malnutrition is pervasive in pediatric intensive care unit (PICU) patients. There remains clinical equipoise on the impact of protein supplementation in critically ill children. Our primary aim was to determine the feasibility of conducting a randomized controlled trial (RCT) of protein supplementation versus standard enteral nutrition (EN) in the PICU.</p><p><b>Methods:</b> An open-labelled pilot RCT was conducted from January 2021 to June 2024 in 2 tertiary pediatric centers in Singapore. Children with body mass index (BMI) z-score < 0, who were expected to require invasive or non-invasive mechanical ventilation for at least 48 hours and required EN support for feeding were included. Patients were randomized (1:1 allocation) to protein supplementation of ≥1.5 g/kg/day in addition to standard EN or standard EN alone for 7 days after enrolment or discharge to the high dependency unit, whichever was earlier. Feasibility was based on 4 outcomes: Effective screening (>80% eligible patients approached for consent), satisfactory enrolment (>1 patient/center/month), timely protocol implementation (>80% of participants receiving protein supplementation within first 72 hours) and protocol adherence (receiving >80% of protein supplementation as per protocol).</p><p><b>Results:</b> A total of 20 patients were recruited - 10 (50.0%) and 10 (50.0%) in protein supplementation and standard EN groups, respectively. Median age was 13.0 [Interquartile range (IQR) 4.2, 49.1] months. Respiratory distress was the most common reason for PICU admission [11 (55.0%)]. Median PICU and hospital length of stay were 8.0 (IQR 4.5, 16.5) and 19.0 (IQR 11.5, 36.5) days, respectively. There were 3 (15%) deaths which were not related to the trial intervention. 
Screening rate was 50/74 (67.6%). Mean enrollment was 0.45 patients/center/month. Timely protocol implementation was performed in 15/20 (75%) participants. Protocol adherence was achieved on 11/15 (73.3%) of protein supplementation days.</p><p><b>Conclusion:</b> Satisfactory feasibility outcomes were not met in this pilot RCT. Based on the inclusion criteria of this pilot study and the setup of the centers, a larger study in Singapore alone will not be feasible. With incorporation of revised logistic arrangements, a larger multi-center feasibility study involving regional countries should be piloted.</p><p>Veronica Urbik, MD<sup>1</sup>; Kera McNelis, MD<sup>1</sup></p><p><sup>1</sup>Emory University, Atlanta, GA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Many clinical management and physiologic knowledge gaps are present in the care of the tiny baby population (neonates born at ≤23 weeks gestation, also labelled periviable gestational age). There is therefore significant center-specific variation in practice, as well as increased morbidity and mortality observed in infants born at 22-23 weeks compared to those born at later gestational ages<sup>1</sup>. The importance of nutrition applies to tiny babies, and appropriate nutrition is perhaps even more critical in this population than in others. Numerous studies have demonstrated the benefits of full early enteral feeding, including decreased rates of central line-associated bloodstream infection and cholestasis<sup>2,3</sup>. The risk of developing necrotizing enterocolitis (NEC) is a balancing measure for advancement of enteral feeds<sup>4</sup>. Adequate nutrition is critical for growth, reducing morbidity and mortality, and improving overall outcomes. 
Current proposed protocols for this population target full enteral feeding volumes to be reached by 10-14 days of life<sup>5</sup>.</p><p><b>Methods:</b> From baseline data collected at two Level III neonatal intensive care units (NICU) attended by a single group of academic neonatology faculty from January 2020 – January 2024, the average time from birth to full enteral feeds was 31 days. Using quality improvement (QI) methodology, we identified the barriers to advancement to full enteral feeds (defined as 120 cc/kg/day) in babies born at 22 and 23 weeks gestational age admitted to the pediatric resident-staffed Level III NICU.</p><p><b>Results:</b> The Pareto chart displays the primary barriers, including undefined critical illness, vasopressor use, evaluation for NEC or spontaneous intestinal perforation, patent ductus arteriosus treatment, and electrolyte derangements (Figure 1). In many cases, no specific reason could be identified in chart review for not advancing toward full enteral feeds.</p><p><b>Conclusion:</b> In this ongoing QI project, our SMART aim is to reduce the number of days to reach full feeds over the period of January 2024 – June 2025 by 10%, informed by root cause analysis and the key driver diagram developed from the data thus far (Figure 2). The first plan-do-study-act cycle started on January 16, 2024. 
Data are analyzed using statistical process control methods.</p><p></p><p>Pareto Chart.</p><p><b>Figure 1.</b></p><p></p><p>Key Driver Diagram.</p><p><b>Figure 2.</b></p><p>Bridget Hron, MD, MMSc<sup>1</sup>; Katelyn Ariagno, RD, LDN, CNSC, CSPCC<sup>1</sup>; Matthew Mixdorf<sup>1</sup>; Tara McCarthy, MS, RD, LDN<sup>1</sup>; Lori Hartigan, ND, RN, CPN<sup>1</sup>; Jennifer Lawlor, RN, BSN, CPN<sup>1</sup>; Coleen Liscano, MS, RD, CSP, LDN, CNSC, CLE, FAND<sup>1</sup>; Michelle Raymond, RD, LDN, CDCES<sup>1</sup>; Tyra Bradbury, MPH, RD, CSP, LDN<sup>1</sup>; Erin Keenan, MS, RD, LDN<sup>1</sup>; Christopher Duggan, MD, MPH<sup>1</sup>; Melissa McDonnell, RD, LDN, CSP<sup>1</sup>; Rachel Rosen, MD, MPH<sup>1</sup>; Elizabeth Hait, MD, MPH<sup>1</sup></p><p><sup>1</sup>Boston Children's Hospital, Boston, MA</p><p><b>Financial Support:</b> Some investigators received support from agencies including National Institutes of Health and NASPGHAN which did not directly fund this project.</p><p><b>Background:</b> The widespread shortage of amino acid-based formula in February 2022 highlighted the need for urgent and coordinated hospital response to disseminate accurate information to front-line staff in both inpatient and ambulatory settings.</p><p><b>Methods:</b> An interdisciplinary working group consisting of pediatric gastroenterologists, dietitians, nurses and a quality improvement analyst was established in September 2022. The group met at regular intervals to conduct a needs assessment for all disciplines. The group developed and refined a novel clinical process algorithm to respond to reports of formula shortages and/or recalls. The Clinical Process Map is presented in Figure 1. Plan-do-study-act cycles were implemented to improve the quality of the communication output based on staff feedback. 
The key performance indicator is time from notification of a possible shortage to dissemination of communication to stakeholders, with a goal of < 24 hours.</p><p><b>Results:</b> From September 2022 to August 2024, the group met 18 times for unplanned responses to formula recall/shortage events. Email communication was disseminated within 24 hours for 8/18 (44%) events; within 48 hours for 9/18 (50%); and after 48 hours for 1/18 (6%). Iterative changes included the initiation of an urgent huddle for key stakeholders to identify impact and substitution options; development of a preliminary investigation pathway to ensure validity of the report; development of a structured email format that was further refined to a table format including images of products (Figure 2); and creation of an email distribution list to disseminate shortage reports. The keys to this project's success were the creation of a multidisciplinary team dedicated to meeting urgently for all events, and the real-time drafting and approval of communication within the meeting. Of note, the one communication which was substantially delayed (94.7 hours) was addressed over email only, underscoring the importance of the multidisciplinary working meeting.</p><p><b>Conclusion:</b> Establishing clear lines of communication and assembling key stakeholders resulted in timely, accurate and coordinated communication regarding nutrition recalls/shortage events at our institution.</p><p></p><p><b>Figure 1.</b> Formula Recall Communication Algorithm.</p><p></p><p><b>Figure 2.</b></p><p>Nicole Misner, MS, RDN<sup>1</sup>; Michelle Yavelow, MS, RDN, LDN, CNSC, CSP<sup>1</sup>; Athanasios Tsalatsanis, PhD<sup>1</sup>; Racha Khalaf, MD, MSCS<sup>1</sup></p><p><sup>1</sup>University of South Florida, Tampa, FL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Early introduction of peanuts and eggs decreases the incidence of peanut and egg allergies in infants who are at high risk of developing food allergies. 
Prevention guidelines and clinical practice have shifted with recent studies. However, little is known about introducing common food allergens in infants fed via enteral feeding tubes. Early introduction of allergens could be of importance in infants working towards a tube feeding wean and those who may benefit from blended tube feeding in the future. We aimed to compare the characteristics of patients with enteral tubes who received education during their gastroenterology visit versus those who did not.</p><p><b>Methods:</b> We performed a single-center retrospective chart review involving all patients aged 4 to 24 months with an enteral feeding tube seen at the University of South Florida Pediatric Gastroenterology clinic from August 2020 to July 2024. Corrected age was used for infants born < 37 weeks’ gestational age. All types of enteral nutrition were included (i.e., nasogastric, gastrostomy, and gastrostomy-jejunostomy tubes). Data on demographics, clinical characteristics and parent-reported food allergen exposure were collected. An exception waiver was received from the University of South Florida Institutional Review Board for this retrospective chart review. Differences between patients who received education and those who did not were evaluated using Student's t-test for continuous variables and chi-square test for categorical variables. All analyses were performed using R Statistical Software (v4.4.2). A p value ≤0.05 was considered statistically significant.</p><p><b>Results:</b> A total of 77 patients met inclusion criteria, with 349 total visits. Patient demographics at each visit are shown in Table 1. There was a documented food allergy in 12% (43) of total visits. Education on early introduction of common food allergens was provided in 12% (42) of total visits. Patients who received education at their visit were significantly younger compared to those who did not and were also more likely to have eczema. 
Table 2 compares nutrition characteristics of patients at visits where education was discussed vs. those where it was not. Infants with any percent of oral intake were more likely to have received education than those who were nil per os (p = 0.013). There was a significant association between starting solids and receiving education (p < 0.001). Reported allergen exposure across all visits was low. For total visits with the patient < 8 months of age (n = 103), only 6% (6) reported peanut and 8% (8) egg exposure. When expanded to < 12 months of age at the time of visit (n = 198), there was a minimal increase in reported allergen exposure: 7% (14) reported peanut and 8% (15) egg exposure. Oral feeds were the most common reported form of allergen exposure. Only one patient received a commercial early allergen introduction product. Cow's milk was the most reported allergen exposure, with 61% (63) under 8 months and 54% (106) under 12 months of age, the majority from infant formula.</p><p><b>Conclusion:</b> Age and any proportion of oral intake were associated with receiving education on common food allergen introduction at the visit. However, there were missed opportunities for education in infants with enteral feeding tubes. There were few visits at which peanut or egg exposure was reported. 
Further research and national guidelines are needed on optimal methods of introduction in this population.</p><p><b>Table 1.</b> Demographics.</p><p></p><p><b>Table 2.</b> Nutrition Characteristics.</p><p></p><p>Samantha Goedde-Papamihail, MS, RD, LD<sup>1</sup>; Ada Lin, MD<sup>2</sup>; Stephanie Peters, MS, CPNP-PC/AC<sup>2</sup></p><p><sup>1</sup>Nationwide Children's Hospital, Grove City, OH; <sup>2</sup>Nationwide Children's Hospital, Columbus, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Critically ill children with severe acute kidney injury (AKI) requiring continuous renal replacement therapy (CRRT) are at high risk for micronutrient deficiencies, including vitamin C (VC). VC acts as an antioxidant and enzyme cofactor. It cannot be made endogenously and must be derived from the diet. Critically ill patients often enter the hospital with some degree of nutritional debt and may have low VC concentrations upon admission. This is exacerbated in the case of sepsis, multi-organ dysfunction, burns, etc., when VC needs are higher to address increased oxidative and inflammatory stress. When these patients develop severe AKI requiring CRRT, their kidneys do not reabsorb VC as healthy kidneys would, further increasing losses. Moreover, VC is a small molecule and is filtered out of the blood via CRRT. The combination of increased VC losses and elevated VC requirements in critically ill patients on CRRT results in a high risk of VC deficiency. The true prevalence of VC deficiency in this population is not well known, whereas the prevalence of deficiency is, on average, 5.9% in the general population and 18.3% in critically ill children. 
The aim of this study is to ascertain the prevalence of VC deficiency in critically ill pediatric patients on CRRT by monitoring serum VC levels throughout their CRRT course and correcting noted deficiencies.</p><p><b>Methods:</b> An observational study was conducted from May 2023 through August 2024 in a 54-bed high-acuity PICU offering ECMO, CRRT, Level 1 Trauma, burn care, solid organ transplant and bone marrow transplantation as tertiary care services. Fifteen patients were identified upon initiation of CRRT and followed until transfer from the PICU. Serial serum VC levels were checked after 5-10 days on CRRT and rechecked weekly thereafter as appropriate. Deficiency was defined as VC concentrations < 11 umol/L. Inadequacy was defined as concentrations of 11-23 umol/L. Supplementation was initiated for levels < 23 umol/L; doses varied from 250 to 500 mg/d depending on age and clinical situation. Most deficient patients received 500 mg/d supplementation. Those with inadequacy received 250 mg/d.</p><p><b>Results:</b> Of 15 patients, 9 had VC deficiency and 4 had VC inadequacy [FIGURE 1]. Of those with deficiency, 5 of 9 patients were admitted for septic shock [FIGURE 2]. VC level was rechecked in 8 patients; the level returned to normal in 5 patients, and 4 of those 5 received 500 mg/d supplementation. Levels remained low in 3 patients; all received 250 mg/d supplementation [FIGURE 3]. Supplementation dose changes are noted in Figure 4.</p><p><b>Conclusion:</b> VC deficiency was present in 60% of CRRT patients, suggesting deficiency is more common in this population and that critically ill patients on CRRT are at higher risk of developing deficiency than those who are not receiving CRRT. Septic shock and degree of VC deficiency appeared to be correlated; 56% of deficient patients were admitted with septic shock. 
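The screening thresholds and supplementation rule described in the Methods (deficiency < 11 umol/L, inadequacy 11-23 umol/L, with 500 mg/d typically used for deficiency and 250 mg/d for inadequacy) can be sketched as a simple decision function. This is an illustrative summary only, with names of our own choosing; actual dosing in the study also depended on age and clinical situation, and handling of the boundary at exactly 23 umol/L is an assumption.

```python
# Hedged sketch of the study's vitamin C (VC) thresholds and the typical
# supplementation doses described in the Methods. Illustrative only; real
# dosing varied with age and clinical situation.

def classify_vc(level_umol_l: float) -> str:
    """Classify a serum VC level against the study's cut-points."""
    if level_umol_l < 11:
        return "deficient"
    if level_umol_l <= 23:  # boundary handling at 23 is an assumption
        return "inadequate"
    return "normal"

def typical_dose_mg_per_day(level_umol_l: float) -> int:
    """Typical daily supplementation dose for a given VC level."""
    return {"deficient": 500, "inadequate": 250, "normal": 0}[classify_vc(level_umol_l)]

print(classify_vc(8), typical_dose_mg_per_day(8))    # deficient 500
print(classify_vc(15), typical_dose_mg_per_day(15))  # inadequate 250
print(classify_vc(30), typical_dose_mg_per_day(30))  # normal 0
```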
Together, this suggests a need to start supplementation earlier, perhaps upon CRRT initiation or even upon admission to the PICU in a septic patient, and to use higher supplementation doses, as our patients with low VC levels at their follow-up check were all receiving 250 mg/d. Study limitations include (1) the potential for VC deficiency prior to CRRT initiation, with critical illness and/or poor intake as confounding factors, and (2) the small sample size. Our results suggest that critically ill children with AKI requiring CRRT are at increased risk of VC deficiency while on CRRT. Future research should focus on identifying at-risk micronutrients for this patient population and creating supplementation regimens to prevent the development of deficiencies. Our institution is currently crafting a quality improvement project with these aims.</p><p></p><p><b>Figure 1.</b> Initial Serum Vitamin C Levels of Our Patients on CRRT, Obtained 5-10 Days After CRRT Initiation (N = 15).</p><p></p><p><b>Figure 2.</b> Underlying Disease Process of Patients on CRRT (N = 15).</p><p></p><p><b>Figure 3.</b> Follow Up Vitamin C Levels After Supplementation (N = 8), Including Supplementation Regimen Prior to Follow-Up Lab (dose/route).</p><p></p><p><b>Figure 4.</b> Alterations in Supplementation Regimen (dose) Based on Follow-up Lab Data (N = 6).</p><p>Tanner Sergesketter, RN, BSN<sup>1</sup>; Kanika Puri, MD<sup>2</sup>; Emily Israel, PharmD, BCPS, BCPPS<sup>1</sup>; Ryan Pitman, MD, MSc<sup>3</sup>; Elaina Szeszycki, BS, PharmD, CNSC<sup>2</sup>; Ahmad Furqan Kazi, PharmD, MS<sup>1</sup>; Ephrem Abebe, PhD<sup>1</sup></p><p><sup>1</sup>Purdue University College of Pharmacy, West Lafayette, IN; <sup>2</sup>Riley Hospital for Children at Indiana University Health, Indianapolis, IN; <sup>3</sup>Indiana University, Indianapolis, IN</p><p><b>Financial Support:</b> The Gerber Foundation.</p><p><b>Background:</b> During the hospital-to-home transition period, family members or caregivers of medically 
complex children are expected to assume the responsibility of managing medication and feeding regimens for the child under their care. However, this transition period represents a time of high vulnerability, including the risk of communication breakdowns and lack of tools tailored to caregivers’ context. This vulnerability is further heightened when a language barrier is present as it introduces additional opportunities for misunderstandings leading to adverse events and errors in the home setting. Hence, it is critical that caregivers are educated to develop skills that aid in successful implementation of post-discharge care plans. Addressing unmet needs for caregivers who use languages other than English (LOE) requires an in-depth understanding of the current challenges associated with educating and preparing caregivers for the post-discharge period.</p><p><b>Methods:</b> In this prospective qualitative study, healthcare workers (HCWs) were recruited from a tertiary care children's hospital in central Indiana and were eligible to participate if they were involved directly or indirectly in preparing and assisting families of children under three years of age with medication and feeding needs around the hospital discharge period and/or outpatient care following discharge. Each HCW completed a brief demographic survey followed by observation on the job for 2-3 hours, and participated in a follow-up semi-structured interview using video conferencing technology to expand on observed behavior. Handwritten field notes were taken during the observations, which were immediately typed and expanded upon post-observation. Audio recordings from the interviews were transcribed and de-identified. Both the typed observation notes and interview transcripts were subjected to thematic content analysis, which was completed using the Dedoose software.</p><p><b>Results:</b> Data collection is ongoing with anticipated completion in October 2024. 
Fourteen HCW interviews have been completed to date, with a target sample of 20-25 participants. The preliminary analysis presented is from transcripts of seven interviews. Participants included six females and one male with a mean age of 35.4 years (range, 24-59). HCWs were from diverse inpatient and outpatient clinical backgrounds, including registered dietitians, physicians, pharmacists, and nurses. Four overarching themes describe the challenges that HCWs experience when communicating with caregivers who use LOE during the hospital-to-home transition. These themes include lack of equipment and materials in diverse languages, challenges with people and technologies that assist with translating information, instructions getting lost in translation/uncertainty of translation, and difficulty getting materials translated in a timely manner. Main themes, subthemes, and examples are presented in Figure 1, and themes, subthemes, and quotes are presented in Table 1.</p><p><b>Conclusion:</b> The study is ongoing; however, based on the preliminary analysis, it is evident that the systems and processes that are in place to aid in communication between HCWs and caregivers who use LOE can be improved. This can ultimately lead to improved quality of care provided to caregivers who use LOE during the hospital-to-home transition and resultant safer care in the home setting for medically complex children.</p><p><b>Table 1.</b> Themes, Subthemes, and Quotes.</p><p></p><p></p><p><b>Figure 1.</b> Main Themes, Subthemes, and Examples.</p><p>Ruthfirst Ayande, PhD, MSc, RD<sup>1</sup>; Shruti Gupta, MD, NABBLM-C<sup>1</sup>; Sarah Taylor, MD, MSCR<sup>1</sup></p><p><sup>1</sup>Yale University, New Haven, CT</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Growth faltering among preterm neonates admitted to NICUs may be managed with hypercaloric and/or hypervolemic feeding. 
However, in intractable cases of growth faltering or when fluid restriction is indicated, fortification with extensively hydrolyzed liquid protein may offer a solution to meeting the high protein demands for infant growth while limiting overfeeding. Yet, there are limited clinical data regarding supplementation with liquid protein, leaving clinicians to make decisions about dosing and duration on a case-by-case basis.</p><p><b>Methods:</b> We present a case of neonatal growth faltering managed with a liquid protein modular in a level IV NICU in North America.</p><p><b>Results:</b> Case Summary: A male infant, born extremely preterm (GA 24 1/7 weeks), was admitted to the NICU for respiratory distress requiring intubation. The NICU course was complicated by patent ductus arteriosus (PDA), requiring surgery on day of life (DOL) 31, and severe bronchopulmonary dysplasia. Birth anthropometrics: weight 0.78 kg; height 31.5 cm. TPN was initiated at birth, with trophic feeds of donor human milk per gavage (PG) for a total provision of 117 ml/kg, 75 kcal/kg, and 3.5 gm/kg of protein. The regimen was advanced per unit protocol; however, on DOL 5, total volume was decreased in the setting of metabolic acidosis and significant PDA. PG feeds of maternal milk were fortified to 24 kcal/oz on DOL 23, and the infant reached full feeds on DOL 26. Feed provision by DOL 28 was ~144 ml/kg/day and 4 g/kg of protein based on estimated dry weight. TPN was first discontinued on DOL 25 but restarted on DOL 32 due to frequent NPO status and clinical instability. Of note, the infant required diuretics during the hospital stay. TPN was again discontinued on DOL 43. At DOL 116, the infant was receiving and tolerating PG feeds fortified to 24 kcal/oz at 151 ml/kg, 121 kcal/kg, and 2.1 gm/kg protein. The infant's weight gain rate was 56 g/day; however, linear growth was impaired, with a gain of 0.5 cm over 21 days (rate of ~0.2 cm/week). 
Liquid protein was commenced on DOL 124 to supply an additional 0.5 gm/kg of protein. A week after adding liquid protein, the infant's weight gain rate was 39 g/day, and height increased by 2.5 cm/week. Feed fortification was reduced to 22 kcal/oz on DOL 156 due to rapid weight gain, and liquid protein dosage was increased to 0.6 gm/kg for a total protein intake of 2.2 g/kg. At DOL 170, calorie fortification of maternal milk was discontinued, and liquid protein dosage was increased to 1 g/kg in the setting of a relapse of poor linear growth, for a total protein intake of 3.1 g/kg. Liquid protein was provided for two months until discontinuation (d/c) at DOL 183 per parent request. At the time of d/c of liquid protein, the infant's weight and length gain rates for the protein supplementation period (59 days) were 42 gm/day and 1.78 cm/week, respectively.</p><p><b>Conclusion:</b> While we observed objective increases in linear growth for the presented case following the addition of a liquid protein modular, it is crucial to note that these findings are not generalizable, and evidence and guidelines on the use of hydrolyzed liquid protein are limited. Larger, well-controlled studies examining mechanisms of action, appropriate dosage and duration, short- and long-term efficacy, and safety are required to guide best practices for using these modulars.</p><p>Sarah Peterson, PhD, RD<sup>1</sup>; Nicole Salerno, BS<sup>1</sup>; Hannah Buckley, RDN, LDN<sup>1</sup>; Gretchen Coonrad, RDN, LDN<sup>1</sup></p><p><sup>1</sup>Rush University Medical Center, Chicago, IL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Accurate assessment of growth and nutritional status is critical for preterm infants. Various growth charts have been developed to track the growth of preterm infants, but differences in reference standards may influence the diagnosis and management of malnutrition. 
The goal of this study was to compare the rate of malnutrition, defined by a decline in weight-for-age z-score, using the Fenton growth chart, the Olsen growth chart, and the INTERGROWTH-21st Preterm Postnatal Growth Standard.</p><p><b>Methods:</b> All preterm infants born between 24 and 37 weeks of gestational age who were admitted to the neonatal intensive care unit (NICU) in 2022 and had weight at birth and on day 28 recorded were included. Preterm infants were excluded if they were admitted to the NICU seven or more days after birth. Sex and gestational age were recorded for each infant. Weight and weight-for-age z-score at birth and on day 28 were recorded. Weight-for-age z-score was determined using three growth charts: (1) the Fenton growth chart, which was developed using growth data from uncomplicated preterm infants who were at least 22 weeks of gestational age from the United States (US), Australia, Canada, Germany, Italy, and Scotland; (2) the Olsen growth chart, which was developed using growth data from uncomplicated preterm infants who were at least 23 weeks of gestational age from the US; and (3) the INTERGROWTH-21st Preterm Postnatal Growth Standard, which was developed using growth data from uncomplicated preterm infants who were at least 24 weeks of gestational age from the US, United Kingdom, Brazil, China, India, Italy, Kenya, and Oman. Change in weight-for-age z-score from birth to day 28 was calculated using z-scores from each growth chart. 
Malnutrition was defined as a decline in weight-for-age z-score of ≥0.8; any infant meeting this cut-point had their malnutrition status further classified into three categories: (1) mild malnutrition was defined as a weight-for-age z-score decline of 0.8-1.1, (2) moderate malnutrition was defined as a weight-for-age z-score decline of 1.2-1.9, and (3) severe malnutrition was defined as a weight-for-age z-score decline of ≥2.0.</p><p><b>Results:</b> The sample included 102 preterm infants, 58% male, with a mean gestational age of 29.3 weeks. At birth, the average weight was 1,192 grams, and the average weight-for-age z-score was -0.50, -0.36, and -1.14 using the Olsen, Fenton, and INTERGROWTH-21st growth charts, respectively. At 28 days, the average weight was 1,690 grams, and the average weight-for-age z-score was -0.96, -1.00, and -1.43 using the Olsen, Fenton, and INTERGROWTH-21st growth charts, respectively. Using the Olsen growth chart, 29 infants met the criteria for malnutrition; 15 had mild malnutrition and 14 had moderate malnutrition. Using the Fenton growth chart, 33 infants met the criteria for malnutrition; 21 had mild malnutrition and 12 had moderate malnutrition. Using the INTERGROWTH-21st growth chart, 32 infants met the criteria for malnutrition; 14 had mild malnutrition, 14 had moderate malnutrition, and 4 had severe malnutrition. In total, 24 infants met the criteria for malnutrition using all three growth charts, while 19 infants were only categorized as malnourished by one of the three growth charts.</p><p><b>Conclusion:</b> The findings of this study reveal important discrepancies in average z-scores and the classification of malnutrition among preterm infants when using the Olsen, Fenton, and INTERGROWTH-21st growth charts. These differences suggest that the choice of growth chart has important implications for identifying rates of malnutrition. 
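The z-score-decline grading defined in the Methods (malnutrition at a decline of ≥0.8, graded mild at 0.8-1.1, moderate at 1.2-1.9, severe at ≥2.0) can be sketched as a small decision function. This is illustrative only; names are our own, and the handling of continuous values between the one-decimal grade boundaries is an assumption.

```python
# Minimal sketch of the malnutrition grading used in the Methods: a decline
# in weight-for-age z-score from birth to day 28 of >= 0.8 indicates
# malnutrition, graded mild (0.8-1.1), moderate (1.2-1.9), or severe (>= 2.0).
# Half-open intervals between grades are an assumption for continuous values.

def classify_malnutrition(z_birth: float, z_day28: float) -> str:
    decline = z_birth - z_day28  # positive value = z-score fell
    if decline < 0.8:
        return "none"
    if decline < 1.2:
        return "mild"
    if decline < 2.0:
        return "moderate"
    return "severe"

# Example using the cohort averages on the INTERGROWTH-21st chart:
# the mean z-score fell from -1.14 at birth to -1.43 at day 28 (decline of 0.29),
# below the 0.8 malnutrition cut-point.
print(classify_malnutrition(-1.14, -1.43))  # none
```

Note how the same decline threshold applied to different charts' z-scores yields different classifications, which is the discrepancy the study highlights.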
Therefore, standardized guidelines for growth monitoring in preterm infants are necessary to ensure consistent and accurate diagnosis of malnutrition.</p><p>Emaan Abbasi, BSc<sup>1</sup>; Debby Martins, RD<sup>2</sup>; Hannah Piper, MD<sup>2</sup></p><p><sup>1</sup>University of Galway, Vancouver, BC; <sup>2</sup>BC Children's Hospital, Vancouver, BC</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Infants with gastroschisis have variable intestinal function, with some achieving enteral independence within a few weeks and others remaining dependent on parenteral nutrition (PN) for prolonged periods. Establishing enteral feeds can be challenging for many of these neonates due to poor intestinal motility, frequent emesis, and/or abdominal distension. Therefore, many care teams use standardized post-natal nutrition protocols in an attempt to minimize PN exposure and maximize oral feeding. However, it remains unclear whether initiating continuous feeds is advantageous or whether bolus feeding is preferred. Potential benefits of bolus feeding include that it is more physiologic and that the feeds can be given orally, but there remain concerns about feed tolerance and prolonged periods of withholding feeds with this approach. The objective of this study was to compare initial feeding strategies in infants with gastroschisis to determine whether bolus feeding is a feasible approach.</p><p><b>Methods:</b> After obtaining REB approval (H24-01052), a retrospective chart review was performed in neonates born with gastroschisis and cared for by a neonatal intestinal rehabilitation team between 2018 and 2023. A continuous feeding protocol was used between 2018-2020 (human milk at 1 ml/h with 10 ml/kg/d advancements given continuously until 50 ml/kg/d and then trialing bolus feeding) and a bolus protocol was used between 2021-2023 (10-15 ml/kg divided into 8 feeds/d with 15-20 ml/kg/d advancements). 
Clinical data were collected, including gestational age, gastroschisis prognosis score (GPS), need for intestinal resection, age when feeds were initiated, time to full feeds, route of feeding at full feeds, and hepatic cholestasis; these variables were compared between groups. Welch's t-test and chi-square test were performed to compare variables, with p-values < 0.05 considered significant.</p><p><b>Results:</b> Forty-one infants with gastroschisis were reviewed (23 who were managed with continuous feed initiation and 18 with bolus feed initiation). Continuous feed and bolus feed groups had comparable mean gestational age at birth, GPS score, need for intestinal surgery, age at feed initiation, and incidence of cholestasis (Table 1). Time to achieve enteral independence was similar between the groups, with nearly half of infants reaching full feeds by 6 weeks (48% continuous feeds vs. 44% bolus feeds) and most by 9 weeks of life (74% continuous vs. 72% bolus). Significantly more infants in the bolus feeding group were feeding exclusively orally compared to the continuous feeding group at the time of reaching full enteral feeds (50% vs. 17%, p = 0.017).</p><p><b>Conclusion:</b> Initiating bolus enteral feeding for infants with gastroschisis, including those requiring intestinal resection, is safe and does not result in prolonged time to full enteral feeding. 
Avoiding continuous feeds may improve oral feeding in this population.</p><p><b>Table 1.</b> Clinical Characteristics and Initial Feeding Strategy.</p><p></p><p><b>International Poster of Distinction</b></p><p>Matheus Albuquerque<sup>1</sup>; Diogo Ferreira<sup>1</sup>; João Victor Maldonado<sup>2</sup>; Mateus Margato<sup>2</sup>; Luiz Eduardo Nunes<sup>1</sup>; Emanuel Sarinho<sup>1</sup>; Lúcia Cordeiro<sup>1</sup>; Amanda Fifi<sup>3</sup></p><p><sup>1</sup>Federal University of Pernambuco, Recife, Pernambuco; <sup>2</sup>University of Brasilia, Brasília, Distrito Federal; <sup>3</sup>University of Miami, Miami, FL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Intestinal failure secondary to short bowel syndrome is a malabsorptive condition caused by intestinal resection. Patients with intestinal failure require parenteral support to maintain hydration and nutrition. Long-term parenteral nutrition leads to complications. Teduglutide, an analog of GLP-2, may improve intestinal adaptation, thereby minimizing reliance on parenteral nutrition. This meta-analysis evaluates the efficacy of teduglutide in reducing parenteral nutrition dependency in pediatric patients with intestinal failure.</p><p><b>Methods:</b> We included randomized controlled trials (RCTs) that assessed the efficacy of teduglutide in reducing parenteral nutrition support and improving anthropometrics in pediatric patients with intestinal failure secondary to short bowel syndrome. The RoB-2 tool (Cochrane) was used to evaluate the risk of bias, and statistical analyses were conducted using RevMan 5.4.1 software. Results are expressed as mean differences with 95% CIs and p-values.</p><p><b>Results:</b> Data were extracted from three clinical trials involving a total of 172 participants. 
Teduglutide use was associated with a reduction in parenteral nutrition volume (-17.92 mL, 95% CI -24.65 to -11.20, p < 0.00001), with most patients reducing parenteral support by >20% (11.79, 95% CI 2.04 to 68.24, p = 0.006) (Figure 1). Treatment with teduglutide also improved height (0.27 Z-score, 95% CI 0.08 to 0.46, p = 0.005) but did not significantly increase weight when compared to the control group (-0.13 Z-score, 95% CI -0.41 to 0.16, p = 0.38) (Figure 2).</p><p><b>Conclusion:</b> This meta-analysis suggests that therapy with teduglutide reduces parenteral nutrition volume in patients with short bowel syndrome and intestinal failure. Reduced parenteral nutrition dependency can minimize complications and improve quality of life in patients with short bowel syndrome and intestinal failure.</p><p></p><p><b>Figure 1.</b> Parenteral Nutrition Support Volume Change.</p><p></p><p><b>Figure 2.</b> Anthropometric Data (Weight and Height) Change from Baseline.</p><p>Korinne Carr<sup>1</sup>; Liyun Zhang, MS<sup>1</sup>; Amy Pan, PhD<sup>1</sup>; Theresa Mikhailov, MD, PhD<sup>2</sup></p><p><sup>1</sup>Medical College of Wisconsin, Milwaukee, WI; <sup>2</sup>Childrens Hospital of Wisconsin, Milwaukee, WI</p><p><b>Financial Support:</b> Medical College of Wisconsin, Department of Pediatrics.</p><p><b>Background:</b> Malnutrition is a significant concern in pediatric patients, particularly those who are critically ill. In children with diabetes mellitus (DM), the presence of malnutrition can exacerbate complications such as unstable blood sugar levels and delayed wound healing, potentially leading to worse clinical outcomes. Despite the known impact of malnutrition on adult patients and critically ill children, established criteria for identifying malnutrition in critically ill children are lacking. 
This study was designed to determine the relationship between malnutrition, mortality, and length of stay (LOS) in critically ill pediatric patients with diabetes mellitus.</p><p><b>Methods:</b> We conducted a retrospective cohort study using the VPS (Virtual Pediatric Systems, LLC) Database. We categorized critically ill pediatric patients with DM as malnourished or at risk of being malnourished based on admission nutrition screens. We compared mortality rates between malnourished and non-malnourished patients using Fisher's Exact test. We used logistic regression analysis to compare mortality, controlling for severity of illness (PRISM3) and demographic and clinical factors. We compared the LOS in the Pediatric Intensive Care Unit (PICU) between malnourished and non-malnourished patients using the Mann-Whitney-Wilcoxon test. Additionally, we used a general linear model with appropriate transformation to adjust for severity of illness, demographic, and clinical factors. We considered statistical significance at p < 0.05.</p><p><b>Results:</b> We analyzed data for 4,014 patients, of whom 2,653 were screened for malnutrition. Of these 2,653, 88.5% had type 1 DM, 9.3% had type 2 DM, and the remaining patients had unspecified DM. Of the 2,653 patients, 841 (31.7%) were malnourished based on their nutrition screen at admission to the PICU. Mortality in patients who were screened as malnourished did not differ from mortality in those who were not malnourished (0.4% vs. 0.2%, p = 0.15). Malnourished patients had longer PICU LOS, with a geometric mean and 95% CI of 1.03 (0.94–1.13) days, compared to 0.91 (0.86–0.96) days for non-malnourished patients. Similarly, malnourished patients had longer hospital LOS, with a geometric mean and 95% CI of 5.31 (4.84–5.83) days, compared to 2.67 (2.53–2.82) days for those who were not malnourished. 
Both differences were significant with p < 0.0001, after adjusting for age, race/ethnicity, and PRISM3.</p><p><b>Conclusion:</b> We found no difference in mortality rates, but critically ill children who were screened as malnourished had longer PICU and hospital LOS than those who were not malnourished. This was true even after adjusting for age, race/ethnicity, and PRISM3.</p><p>Emily Gutzwiller<sup>1</sup>; Katie Huff, MD, MS<sup>1</sup></p><p><sup>1</sup>Indiana University School of Medicine, Indianapolis, IN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Neonates with intestinal failure require parenteral nutrition for survival. While life sustaining, it can lead to serious complications, including intestinal failure associated liver disease (IFALD). The etiology of IFALD is likely multifactorial, with intravenous lipid emulsions (ILE) being a large contributor, particularly soybean oil-based lipid emulsions (SO-ILE). Alternate ILEs, including those containing fish oil, can be used to prevent and treat IFALD. Fish oil-based ILE (FO-ILE) is only approved at a dose of 1 g/kg/d, limiting calories prescribed from fat and shifting the calorie delivery to carbohydrate predominance. While FO-ILE was shown to have comparable growth to SO-ILE, a comparison to soy, MCT, olive, fish oil-based ILE (SO,MCT,OO,FO-ILE) has not, to our knowledge, been conducted. The purpose of this study is to compare the growth and laboratory data associated with SO,MCT,OO,FO-ILE and FO-ILE therapy in the setting of IFALD in a cohort of neonates treated during two time periods.</p><p><b>Methods:</b> We performed a retrospective chart review of patients with IFALD receiving SO,MCT,OO,FO-ILE (SMOFlipid) or FO-ILE (Omegaven) in a level IV neonatal intensive care unit from September 2016 – May 2024. IFALD was defined as direct bilirubin >2 mg/dL after receiving >2 weeks of parenteral nutrition. 
Patients with underlying genetic or hepatic diagnoses, as well as patients with elevated direct bilirubin prior to two weeks of life, were excluded. Patients were divided based on the period they were treated. Data were collected on demographic characteristics, parenteral and enteral nutrition, weekly labs, and growth changes while receiving SO,MCT,OO,FO-ILE or FO-ILE. Rate of change of weight, length, and head circumference and comparison of z-scores over time were studied for each ILE group. Secondary outcomes included nutritional data in addition to hepatic labs. Nonparametric analysis using the Mann-Whitney U test was conducted to compare ILE groups, and a p-value of < 0.05 was used to define statistical significance.</p><p><b>Results:</b> A total of 51 patients were enrolled, 25 receiving SO,MCT,OO,FO-ILE and 26 FO-ILE. Table 1 notes the demographic and baseline characteristics of the two ILE groups. There was no difference in the rate of OFC (p = 0.984) or length (p = 0.279) growth between the two treatment groups (Table 2). There was a difference, however, in the rate of weight gain between groups (p = 0.002; Table 2), with the FO-ILE group gaining more weight over time. When comparing nutritional outcomes (Table 2), SO,MCT,OO,FO-ILE patients received greater total calories than FO-ILE patients (p = 0.005), including a higher ILE dose (p < 0.001) and enteral calories (p = 0.029). The FO-ILE group, however, received a higher carbohydrate dose (p = 0.003; Table 2). There was no difference in amino acid dose (p = 0.127) or parenteral nutrition calories (p = 0.821). Hepatic labs differed over time, with the FO-ILE group showing a larger decrease in AST, direct bilirubin, and total bilirubin over time compared to the SO,MCT,OO,FO-ILE group (Table 2).</p><p><b>Conclusion:</b> Our results show the FO-ILE patients did have a significant increase in weight gain compared to the SO,MCT,OO,FO-ILE patients. 
This is despite SO,MCT,OO,FO-ILE patients receiving greater total calories and enteral calories; the FO-ILE group received greater calories only in the form of glucose infusion. With this increased weight gain but similar length growth between groups, concerns arise regarding alterations in body composition and increased fat mass. Further research is needed to determine the influence of these ILE products on neonatal body composition over time.</p><p><b>Table 1.</b> Demographic and Baseline Lab Data by Lipid Treatment Group.</p><p>(All data presented as median and interquartile range, unless specified.)</p><p></p><p><b>Table 2.</b> Nutritional, Hepatic Lab, and Growth Outcomes by Lipid Treatment Group.</p><p>(All data presented as median and interquartile range, unless specified.)</p><p>*z-score change compares z-score at end and beginning of study period</p><p>OFC = occipitofrontal circumference</p><p></p><p>Rachel Collins, BSN, RN<sup>1</sup>; Brooke Cherven, PhD, MPH, RN, CPON<sup>2</sup>; Ann-Marie Brown, PhD, APRN, CPNP-AC/PC, CCRN, CNE, FCCM, FAANP, FASPEN<sup>1</sup>; Christina Calamaro, PhD, PPCNP-BC, FNP-BC, FAANP, FAAN<sup>3</sup></p><p><sup>1</sup>Emory University Nell Hodgson Woodruff School of Nursing, Atlanta, GA; <sup>2</sup>Emory University School of Medicine; Children's Healthcare of Atlanta, Atlanta, GA; <sup>3</sup>Emory University Nell Hodgson Woodruff School of Nursing; Children's Healthcare of Atlanta, Atlanta, GA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Many children receiving hematopoietic stem cell transplant (HSCT) are malnourished prior to the start of transplant or develop malnutrition post-infusion. Children are at risk for malnutrition due to the pre-transplant conditioning phase, from chemotherapy treatments for their primary diagnosis, and from acute graft versus host disease (GVHD). These factors can cause impaired oral feeding, vomiting, diarrhea, and mucositis. 
Enteral nutrition (EN) and parenteral nutrition (PN) are support options for this patient population when oral feeding is impaired. There is currently a paucity of research and evidence-based guidance on nutrition in this population. The purpose of this integrative literature review was to determine the outcomes of EN and PN in pediatric HSCT and to discuss evidence-based implications for practice to positively affect quality of life for these patients.</p><p><b>Methods:</b> A literature search was conducted using the following databases: PubMed, CINAHL, Cochrane, and Healthsource: Nursing/Academic Edition. The search strategy included randomized controlled trials, prospective and retrospective cohort studies, case control studies, cross sectional studies, systematic reviews, and meta-analyses. Papers were included if they discussed patients 0-21 years of age, allogeneic or autologous HSCT, and enteral and/or parenteral nutrition. Papers were excluded if there was no English translation, they did not discuss nutrition, or they had animal subjects.</p><p><b>Results:</b> Initially 477 papers were identified, and after the screening process 15 papers were utilized for this integrative review. EN and PN have effects on clinical outcomes, complications, and hospital and survival outcomes. EN was associated with faster platelet engraftment, improved gut microbiome, and decreased mucositis and GVHD. PN was used more often in severe mucositis, which interferes with feeding tube placement and therefore decreases the use of EN. Use of PN is also more common in severe (grade III-IV) gut GVHD. Later initiation of EN, such as after conditioning or once mucositis is present, can be associated with severe (grade III-IV) gut GVHD. This is because conditioning can damage the gut, leading to mucosal atrophy and intestinal permeability that alter the gut microbiota. 
PN can induce gut mucosal atrophy and dysbiosis, allowing for bacterial translocation, while EN improves the gut epithelium and microbiota, reducing translocation. Additionally, the central venous access required for PN can introduce bacterial infections into the bloodstream. Feeding tube placement complications include dislodgement, refusal of replacement, and increased risk of bleeding due to thrombocytopenia. Electrolyte imbalance can be attributed to loss of absorption through the gut. If a gastrostomy tube is present, there can be infection at the site. There is currently no consensus on the appropriate timeline of tube placement. There was no significant difference in neutrophil engraftment, and findings on morbidity/mortality and weight gain were variable. Weight gain with PN can be attributed to edema. Length of stay was significantly shorter with EN alone than with PN (p < 0.0001).</p><p><b>Conclusion:</b> This literature review indicates a need for a more comprehensive nutritional assessment to adequately evaluate nutritional status before and after HSCT. EN should be given as first-line therapy and should be considered prior to the conditioning phase. The initiation of a feeding tube prior to conditioning should be considered. Finally, PN may be considered if EN cannot be tolerated. 
More research is needed on sensitive nutritional evaluation, earlier administration of EN, and standardized pathways for EN and PN in pediatric HSCT.</p>
Background: Essential fatty acid deficiency (EFAD) is a rare disorder in the general population but can be a concern in patients reliant on home parenteral nutrition (HPN), particularly those who are not receiving intravenous lipid emulsions (ILE). In the US, the only ILE available until 2016 was soybean oil-based (SO-ILE), which contains more than adequate amounts of essential fatty acids, including alpha-linolenic acid (ALA, an omega-3 fatty acid) and linoleic acid (LA, an omega-6 fatty acid). In 2016, a mixed ILE containing soybean oil, medium chain triglycerides, olive oil, and fish oil became available (SO, MCT, OO, FO-ILE). However, it contains a lower concentration of essential fatty acids compared to SO-ILE, raising theoretical concerns for development of EFAD if not administered in adequate amounts. Liver dysfunction is a common complication in HPN patients that can occur with soybean-based ILE use due to its pro-inflammatory properties. Short-term studies and case reports in patients receiving SO, MCT, OO, FO-ILE have shown improvements in liver dysfunction for some patients. Our study evaluates the long-term impact of SO, MCT, OO, FO-ILE in our HPN patient population.
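For context (not stated in the abstract): EFAD is typically screened biochemically with the triene:tetraene ratio, Mead acid (20:3n-9) divided by arachidonic acid (20:4n-6), with TTR > 0.2 a commonly cited cut-off. A minimal sketch with hypothetical values:

```python
# Triene:tetraene ratio (TTR) screen for EFAD. The 0.2 cut-off is a commonly
# cited threshold in the literature, not one stated in this abstract.
def triene_tetraene_ratio(mead_acid: float, arachidonic_acid: float) -> float:
    return mead_acid / arachidonic_acid

def suggests_efad(ttr: float, cutoff: float = 0.2) -> bool:
    return ttr > cutoff

ttr = triene_tetraene_ratio(mead_acid=5.0, arachidonic_acid=100.0)  # hypothetical values
print(ttr, suggests_efad(ttr))  # 0.05 False
```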
Methods: This single-center, retrospective cohort study was conducted at the Cleveland Clinic Center for Human Nutrition using data from 2017 to 2020. It involved adult patients who received HPN with SO, MCT, OO, FO-ILE for a minimum of one year. The study assessed changes in essential fatty acid profiles, including triene-tetraene ratios (TTRs) and liver function tests (LFTs), over the year. Data were described as means and standard deviations for normally distributed continuous variables, medians and interquartile ranges for non-normally distributed continuous variables, and frequencies for categorical variables. The Wilcoxon signed rank test was used to compare the baseline and follow-up TTR values (mixed time points). The Wilcoxon signed rank test with pairwise comparisons was used to compare the LFTs at different time points and to determine which time groups differed. P-values were adjusted using Bonferroni corrections. Ordinal logistic regression was used to assess the association between lipid dosing and follow-up TTR level. Analyses were performed using R software, and a significance level of 0.05 was assumed for all tests.
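The paired comparison named above can be sketched as a pure-Python Wilcoxon signed-rank statistic (the study's analyses were run in R; the data here are hypothetical, and p-values would come from standard tables or software):

```python
# Wilcoxon signed-rank statistic W for paired samples: drop zero differences,
# rank absolute differences (average ranks for ties), and take the smaller of
# the positive-rank and negative-rank sums.
def wilcoxon_w(baseline, follow_up):
    diffs = [f - b for b, f in zip(baseline, follow_up) if f != b]
    abs_sorted = sorted(abs(d) for d in diffs)

    def rank(v):  # average rank of value v among the absolute differences
        idxs = [i + 1 for i, a in enumerate(abs_sorted) if a == v]
        return sum(idxs) / len(idxs)

    w_pos = sum(rank(abs(d)) for d in diffs if d > 0)
    w_neg = sum(rank(abs(d)) for d in diffs if d < 0)
    return min(w_pos, w_neg)

print(wilcoxon_w([1, 2, 3, 4, 5], [2, 1, 5, 3, 8]))  # 4.0
```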
Results: Of 110 patients screened, 26 met the inclusion criteria of having baseline and follow-up TTRs. None of the patients developed EFAD, and there was no significant difference in the distribution of TTR values between baseline and follow-up. Additionally, 5.5% of patients reported adverse GI symptoms while receiving SO, MCT, OO, FO-ILE. A separate subgroup of 14 patients who had abnormal LFTs, including bilirubin, alkaline phosphatase (AP), aspartate aminotransferase (AST), or alanine aminotransferase (ALT), was evaluated. There was a statistically significant improvement in AST and ALT, and decreases in bilirubin and AP that were not statistically significant.
Conclusion: We found that using SO, MCT, OO, FO-ILE as the primary lipid source did not result in EFAD in any of our subset of 26 patients, and TTRs remained statistically unchanged after introduction of SO, MCT, OO, FO-ILE. Additionally, there was a statistically significant decrease in AST and ALT following the start of SO, MCT, OO, FO-ILE. While liver dysfunction from PN is multifactorial, the use of fish oil-based lipids has been shown to improve LFT results due to lower phytosterol content as well as less pro-inflammatory omega-6 content compared with SO-ILE. A significant limitation was the difficulty of obtaining TTR measurements by home health nursing in the outpatient setting, which considerably reduced the number of patients who could be analyzed for EFAD.
Table 1. Summary Descriptive Statistics of 26 Patients With Baseline and Follow Up TTR.
Table 2. Change in LFTs From Baseline Levels Compared to 3 Months, 6 Months, 9 Months and 12 Months.
Background: Aluminum is a non-nutrient contaminant of parenteral nutrition (PN) solutions. The additive aluminum content of PN components can contribute to toxicity, causing central nervous system issues and contributing to metabolic bone disease, as observed in adults with osteomalacia. When renal function and gastrointestinal mechanisms are impaired, aluminum can accumulate in the body. Aluminum toxicity can result in anemia, dementia, bone disease, and encephalopathy. Symptoms of aluminum toxicity may include mental status change, bone pain, muscle weakness, nonhealing fractures, and premature osteoporosis. In July 2004, the U.S. Food and Drug Administration (FDA) mandated labeling of aluminum content with a goal to limit exposure to less than 5 mcg/kg/day. Adult and pediatric dialysis patients, as well as patients of all ages receiving PN support, have an increased risk of high aluminum exposure. Reducing PN additives high in aluminum is the most effective way to decrease aluminum exposure and risk of toxicity. This abstract presents a unique case in which antiperspirant use contributed to an accumulation of aluminum in an adult PN patient.
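The FDA limit above implies a simple exposure check: sum the aluminum delivered by each PN additive per day and divide by body weight. A hedged sketch; the additive names and aluminum contents below are illustrative, not from this case:

```python
# Total parenteral aluminum exposure vs. the FDA target of < 5 mcg/kg/day.
# Additive names and per-day aluminum contents are hypothetical examples.
def aluminum_exposure_mcg_per_kg(additives_mcg_per_day: dict, weight_kg: float) -> float:
    return sum(additives_mcg_per_day.values()) / weight_kg

additives = {"calcium gluconate": 200.0, "potassium phosphate": 100.0}  # mcg/day, hypothetical
exposure = aluminum_exposure_mcg_per_kg(additives, weight_kg=70.0)
print(f"{exposure:.2f} mcg/kg/day")  # ~4.29, under the 5 mcg/kg/day target
```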
Methods: A patient on long-term PN (Table 1) often had low ionized calcium results of < 3 mg/dL, leading to consideration of other contributing factors. In addition, the patient was taking very high doses of oral vitamin D daily to stay in the normal range (50,000 IU 6 days/week). Risk factors for developing metabolic bone disease include mineral imbalances of calcium, magnesium, phosphorus, and vitamin D, corticosteroid use, long-term PN use, and aluminum toxicity (Table 2). The patient, who had a known osteoporosis diagnosis, had two stress fractures in the left lower leg. Aluminum testing was completed to identify other factors that might be contributing to the low ionized calcium values and osteoporosis. During patient discussion, the patient revealed they used an aluminum-containing antiperspirant once daily. The range of aluminum content in antiperspirants is unknown, but studies show that minimal absorption may be possible, especially in populations with kidney insufficiency.
Results: After an elevated aluminum value resulted on July 3, 2023 (Figure 1), the patient switched to a non-aluminum-containing antiperspirant. Aluminum values were rechecked at 3 and 7 months. Results indicate that the patient's antiperspirant choice may have been contributing to the aluminum burden through skin absorption. Antiperspirant choice alone may not lead to aluminum toxicity but can add to total daily aluminum exposure.
Conclusion: Preventing aluminum accumulation is vital for patients receiving long-term PN support due to their heightened risk of aluminum toxicity. Potential sources of aluminum outside of PN include dialysis, processed food, aluminum foil, cosmetic products (antiperspirants, deodorants, toothpaste), medications (antacids), vaccinations, and work environments with aluminum welding or certain processing industry plants. Aluminum content of medications and PN additives varies by brand and amount. Clinicians should review all potential aluminum-containing sources and assess ways to reduce aluminum exposure and prevent potential aluminum toxicity in long-term PN patients.
1St. Luke's International Hospital, Chuo-ku, Tokyo; 2The University of Tokyo, Bunkyo-ku, Tokyo; 3The University of Tokyo Hospital, Bunkyo-ku, Tokyo; 4The University of Tokyo, Chuo-City, Tokyo; 5Kanagawa University of Human Services, Yokosuka-city, Kanagawa
Financial Support: None Reported.
Background: Our previous study demonstrated that beta-hydroxy-beta-methylbutyrate (HMB)-supplemented total parenteral nutrition (TPN) partially restores the gut-associated lymphoid tissue (GALT) atrophy observed in standard TPN-fed mice. Oral intake of HMB is now popular among bodybuilders and athletes. Herein, we examined whether oral supplementation of HMB could increase GALT mass in mice eating dietary chow ad libitum.
Methods: Six-week-old male Institute of Cancer Research (ICR) mice were divided into the Control (n = 9), H600 (n = 9), and H2000 (n = 9) groups. All mice were allowed to take chow and water ad libitum for 7 days. The H600 and H2000 mice were given water containing Ca-HMB at 3 or 10 mg/mL, respectively, while the Controls drank normal tap water. Because these mice drank approximately 6-7 mL of water per day, the H600 and H2000 groups received approximately 600 and 2000 mg/kg of Ca-HMB per day, respectively. After 7 days of treatment, all mice were killed by cardiac puncture under general anesthesia, and the whole small intestine was harvested for GALT cell isolation. GALT cell numbers were evaluated in each tissue (Peyer's patches, PPs; intraepithelial spaces, IE; and lamina propria, LP). Nasal washings, bronchoalveolar lavage fluid (BALF), and intestinal washings were also collected for IgA level measurement by ELISA. The Kruskal-Wallis test was used for all parameter analyses, and the significance level was set at less than 5%.
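The stated daily doses follow from the drinking volumes and concentrations; a quick check of the arithmetic (the ~30 g body weight for an adult ICR mouse is an assumption, not stated in the Methods):

```python
# Daily Ca-HMB dose from drinking water: concentration x intake / body weight.
def daily_dose_mg_per_kg(conc_mg_per_ml: float, intake_ml: float, weight_kg: float) -> float:
    return conc_mg_per_ml * intake_ml / weight_kg

WEIGHT_KG = 0.030  # assumed adult ICR mouse weight

print(daily_dose_mg_per_kg(3, 6.5, WEIGHT_KG))   # ~650, consistent with ~600 mg/kg/day (H600)
print(daily_dose_mg_per_kg(10, 6.5, WEIGHT_KG))  # ~2167, consistent with ~2000 mg/kg/day (H2000)
```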
Results: There were no significant differences in the number of GALT cells at any site among the 3 groups (Table 1). Likewise, mucosal IgA levels did not differ between any two groups (Table 2).
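The Kruskal-Wallis comparison named in the Methods can be sketched in pure Python (H statistic without tie correction; real analyses would use statistical software for p-values, and the toy data are hypothetical):

```python
# Kruskal-Wallis H statistic for k independent groups (no tie correction):
# H = 12 / (N (N + 1)) * sum_i n_i (rbar_i - (N + 1) / 2)^2, using average
# ranks across the pooled data.
def kruskal_h(groups):
    data = sorted(v for g in groups for v in g)
    n = len(data)

    def rank(v):  # average rank of value v in the pooled, sorted data
        idxs = [i + 1 for i, a in enumerate(data) if a == v]
        return sum(idxs) / len(idxs)

    h = sum(len(g) * (sum(rank(v) for v in g) / len(g) - (n + 1) / 2) ** 2 for g in groups)
    return 12.0 / (n * (n + 1)) * h

print(kruskal_h([[1, 2], [3, 4]]))  # 2.4 for this toy example
```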
Conclusion: Oral intake of HMB does not affect GALT cell number or mucosal IgA levels when mice are fed a normal diet orally. It appears that the beneficial effects of HMB on the GALT occur only in parenterally fed mice. We plan to examine the influence of intravenous HMB in an orally fed model in a future study.
1Medical Affairs Department, Research and Development Center, Chiyoda-ku, Tokyo; 2Hepato-Biliary-Pancreatic Surgery Division, Bunkyo-ku, Tokyo; 3School of Medical Technology, Kurume, Fukuoka
Financial Support: Otsuka Pharmaceutical Factory, Inc.
Background: The American Society for Parenteral and Enteral Nutrition guideline recommends a target energy intake of 20 to 30 kcal/kg/day in patients undergoing surgery. Infectious complications reportedly decreased when the target energy and protein intake were achieved in the early period after gastrointestinal cancer surgery. However, no studies have investigated the association of prescribed parenteral energy doses with clinical outcomes in patients who did not receive oral/tube feeding in the early period after gastrointestinal cancer surgery.
Methods: Data of patients who underwent gastrointestinal cancer surgery during 2011–2022 and fasted for 7 days or longer after surgery were extracted from a nationwide medical claims database. The patients were divided into 3 groups based on the mean prescribed parenteral energy dose during the 7 days after surgery: the very-low group (<10 kcal/kg/day), the low group (10–20 kcal/kg/day), and the moderate group (≥20 kcal/kg/day). Multivariable regression analyses (logistic for in-hospital mortality and postoperative complications; linear for length of hospital stay and total in-hospital medical cost) were performed using each outcome as the objective variable and the 3 groups and confounding factors as the explanatory variables.
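The grouping step above can be sketched as a small helper (thresholds from the Methods; the example dose vectors are hypothetical):

```python
# Bin a patient by mean prescribed parenteral energy dose (kcal/kg/day)
# over the 7 days after surgery, per the study's three groups.
def energy_group(daily_doses_kcal_per_kg: list) -> str:
    mean_dose = sum(daily_doses_kcal_per_kg) / len(daily_doses_kcal_per_kg)
    if mean_dose < 10:
        return "very low"
    if mean_dose < 20:
        return "low"
    return "moderate"

print(energy_group([8, 9, 9, 10, 9, 8, 9]))        # "very low" (mean ~8.9)
print(energy_group([15, 16, 18, 17, 16, 15, 16]))  # "low" (mean ~16.1)
```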
Results: Of the 18,294 study patients, the number of patients in the very low, low, and moderate groups was 6,727, 9,760, and 1,807, respectively. The median prescribed energy doses on the 7th day after surgery were 9.2 kcal/kg, 16 kcal/kg, and 27 kcal/kg in the very low, low, and moderate groups, respectively. The adjusted odds ratio (95% confidence interval) for in-hospital mortality with reference to the very low group was 1.060 (1.057–1.062) for the low group and 1.281 (1.275–1.287) for the moderate group. That of postoperative complications was 1.030 (0.940–1.128) and 0.982 (0.842–1.144) for the low and moderate groups, respectively. The partial regression coefficient (95% confidence interval) for length of hospital stay (day) with reference to the very low group was 2.0 (0.7–3.3) and 3.2 (1.0–5.5), and that of total in-hospital medical cost (US$) was 1,220 (705–1,735) and 2,000 (1,136–2,864), for the low and moderate groups, respectively.
Conclusion: Contrary to the guideline recommendation, prescribed energy doses of ≥10 kcal/kg/day were associated with increases in in-hospital mortality, length of hospital stay, and total in-hospital medical cost. Our findings question the efficacy of the guideline-recommended energy intake during the first 7 days after gastrointestinal cancer surgery.
Background: Home parenteral nutrition (HPN) is a life-sustaining nutrition support therapy that is administered through a central venous access device (CVAD). Caregivers of children on HPN are initially trained to provide CVAD care and therapy administration by their clinical team or home infusion nurse. Often there is a gap in training when the patient is ready to assume responsibility for CVAD care. Without proper training, patients are at significant risk of complications such as bloodstream infections and catheter occlusions. The purpose of this study was twofold: 1) to explore the caregiver's perspective about current and future CVAD training practices and 2) to evaluate the need for a proactive formalized CVAD training program when care is transitioned from caregiver to patient.
Methods: An 8-question survey was created using an online software tool. The target audience included caregivers of children receiving HPN. A link to the survey was sent via email and posted on various social media platforms that support the HPN community. The survey was conducted from June 17 to July 18, 2024. Respondents who were not caregivers of a child receiving HPN via a CVAD were excluded.
Results: The survey received 114 responses, but only 86 were included in the analysis based on exclusion criteria. The distribution of children with a CVAD receiving HPN was evenly weighted between 0 and 18 years of age. The majority of the time, initial training regarding HPN therapy and CVAD care was conducted by the HPN clinic/hospital resource/learning center or home infusion pharmacy (Table 1). Forty-eight percent of respondents indicated their HPN team never offers reeducation or shares best practices (Figure 1). Most respondents selected the best individual to train their child on CVAD care and safety is the caregiver (Figure 2). In addition, 60% of respondents selected yes, they would want their child to participate in CVAD training if offered (Figure 3).
Conclusion: This survey confirms that most caregivers anticipate training their child to perform CVAD care when it is determined the child is ready for this responsibility. One challenge to this provision of training is that almost half of the respondents in this survey stated they never receive reeducation or best practice recommendations from their team. This finding demonstrates a need for a formalized training program to assist caregivers when transitioning CVAD care to the patient. Since most respondents reported relying on their intestinal rehab or GI/motility clinic for CVAD related concerns, these centers would be the best place to establish a transition training program. Limitations of the study are as follows: It was only distributed via select social platforms, and users outside of these platforms were not captured. Additional studies would be beneficial in helping to determine the best sequence and cadence for content training.
Table 1. Central Venous Access Device (CVAD) Training and Support Practices.
Figure 1. How Often Does Your HPN Team Offer Reeducation or Share Best Practices?
Figure 2. Who is Best to Train Your Child on CVAD Care Management and Safety?
Figure 3. If Formalized CVAD Training is Offered, Would You Want Your Child to Participate?
Laryssa Grguric, MS, RDN, LDN, CNSC1; Elena Stoyanova, MSN, RN2; Crystal Wilkinson, PharmD3; Emma Tillman, PharmD, PhD4
1Nutrishare, Tamarac, FL; 2Nutrishare, Kansas City, MO; 3Nutrishare, San Diego, CA; 4Indiana University, Carmel, IN
Financial Support: None Reported.
Background: Long-term parenteral nutrition (LTPN) within the home is a lifeline for many patients throughout the United States. Patients utilize central venous access devices (CVAD) to administer LTPN. Central line-associated bloodstream infection (CLABSI) is a serious risk associated with patients who require LTPN. Rates of CLABSI in LTPN populations range from 0.9-1.1 per 1000 catheter days. The aim of this study was to determine the incidence of CLABSI in a cohort of patients serviced by a national home infusion provider specializing in LTPN and identify variables associated with an increased incidence of CLABSI.
Methods: Electronic medical records of LTPN patients with intestinal failure were retrospectively queried from March 2023 to May 2024 for patient demographics, anthropometric data, nursing utilization, parenteral nutrition prescription including lipid type, length of therapy use, geographic distribution, prescriber specialty, history of CLABSI, blood culture results as available, and use of ethanol lock. Patient zip codes were used to identify rural health areas, as defined by the US Department of Health & Human Services. Patients were divided into two groups: 1) patients who had at least one CLABSI and 2) patients with no CLABSI during the study period. Demographic and clinical variables were compared between the two groups. Nominal data were analyzed by Fisher's exact test; continuous data were analyzed with Student's t-test for normally distributed data and the Mann-Whitney U-test for non-normally distributed data.
Results: We identified 198 persons that were maintained on LTPN during the study time. The overall CLABSI rate for this cohort during the study period was 0.49 per 1000 catheter days. Forty-four persons with LTPN had one or more CLABSI and 154 persons with LTPN did not have a CLABSI during the study period. Persons who experienced CLABSI weighed significantly more, had fewer days of infusing injectable lipid emulsions (ILE), and had a shorter catheter dwell duration compared to those that did not have a CLABSI (Table 1). There was no significant difference between the CLABSI and no CLABSI groups in the length of time on LTPN, location of consumer (rural versus non-rural), utilization of home health services, number of days parenteral nutrition (PN) was infused, or use of ethanol locks (Table 1).
Conclusion: In this retrospective cohort study, we report a CLABSI rate of 0.49 per 1000 catheter days, which is lower than previously published CLABSI rates for similar patient populations. Patient weight, days of infusing ILE, and catheter dwell duration were significantly different between those that did and did not have a CLABSI in this study period. Yet, variables such as use of ethanol lock and proximity to care providers that had previously been reported to impact CLABSI were not significantly different in this cohort. An expanded study with more LTPN patients or a longer study duration may be necessary to confirm these results and their impact on CLABSI rates.
Table 1. Long Term Parenteral Nutrition (LTPN) Characteristics.
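The per-1,000-catheter-day rate cited throughout this abstract follows the standard surveillance formula. A minimal sketch (the counts passed in below are illustrative, not the study's raw event and catheter-day totals, which are not reported here):

```python
# Standard CLABSI rate: infection events per 1,000 catheter days.
# Example inputs are hypothetical; the study reports only the final
# rate of 0.49 per 1,000 catheter days, not its raw numerator/denominator.

def clabsi_rate(events: int, catheter_days: int) -> float:
    """Return CLABSI events per 1,000 catheter days."""
    return events / catheter_days * 1000

# e.g., 2 infections over 4,000 catheter days:
print(clabsi_rate(2, 4000))  # 0.5
```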
1MedStar Washington Hospital Center, Bethesda, MD; 2National Institutes of Health, Bethesda, MD
Financial Support: None Reported.
Background: In hospitalized patients, lipid emulsions constitute an essential component of balanced parenteral nutrition (PN). Soy oil-based lipid injectable emulsions (SO-ILE) were traditionally administered as part of PN formulations primarily as a source of energy and for prevention of essential fatty acid deficiency. Contemporary practice has evolved to incorporate mixtures of different lipid emulsions, including a combination of soy, MCT, olive, and fish oils (SO, MCT, OO, FO-ILE). Evidence suggests that the use of SO, MCT, OO, FO-ILEs may alter essential fatty acid profiles, impacting hepatic metabolism and other processes associated with clinical benefits. The aim of this project was to compare essential fatty acid profile (EFAP), triglycerides (TGL), liver function tests (LFTs), and total bilirubin (TB) levels in adult patients receiving parenteral nutrition with SO-ILE or SO, MCT, OO, FO-ILE in a unique hospital entirely dedicated to conducting clinical research.
Methods: This was a retrospective chart review from 1/1/2019 to 12/31/2023 of adult patients in our hospital who received PN with SO-ILE or SO, MCT, OO, FO-ILE and had EFAP assessed after 7 days of receiving ILE. Data included demographic, clinical, and nutritional parameters. Patients with no laboratory markers, those on propofol, and those who received both ILE products in the 7 days prior to collection of EFAP were excluded. Data was statistically analyzed using Fisher's tests and Mann-Whitney U tests as appropriate.
Results: A total of 42 patient charts were included (14 SO-ILE; 28 SO, MCT, OO, FO-ILE). Group characteristics can be found in Table 1. Patients on SO-ILE received more ILE (0.84 vs 0.79 g/kg/day, p < 0.0001). TGL levels changed significantly after the start of ILE (p < 0.0001). LFTs were elevated in 57% of patients in the SO-ILE group and 60% in the SO, MCT, OO, FO-ILE group, while TB was increased in 21% and 40% of patients, respectively (Figure 1). Further analysis showed no significant differences in LFTs and TB between the two groups. Assessment of EFAP revealed a significant difference in the levels of DHA, docosenoic acid, and EPA, which were higher in the group receiving SO, MCT, OO, FO-ILE. Conversely, significant differences were also observed in the levels of linoleic acid, homo-γ-linolenic acid, and total omega-6 fatty acids, which were higher in patients administered SO-ILE (Figure 2). No differences were observed between groups regarding the presence of essential fatty acid deficiency, as indicated by the triene:tetraene ratio.
Conclusion: In our sample analysis, LFTs and TB levels did not differ significantly between the SO-ILE and SO, MCT, OO, FO-ILE groups. Increased levels of DHA, docosenoic acid, and EPA were found in the SO, MCT, OO, FO-ILE group, while linoleic acid, homo-γ-linolenic acid, and total omega-6 fatty acids tended to be higher in the SO-ILE group. Although in our sample the SO, MCT, OO, FO-ILE group received a lower dosage of ILE/kg/day, there were no differences in the rate of essential fatty acid deficiency between groups.
1Denver Health, St. Joseph Hospital, Denver, CO; 2Denver Health, Parker, CO; 3Denver Health, Denver, CO
Financial Support: None Reported.
Background: Early nutritional support in hospitalized patients has long been established as a key intervention to improve overall patient outcomes. Patients admitted to the hospital often have barriers to receiving adequate nutrition via the enteral route and may be candidates for parenteral nutrition (PN). Central parenteral nutrition (CPN) requires central access, which has historically raised concerns for central line-associated bloodstream infection (CLABSI). Obtaining central access can be resource-intensive and may result in treatment delays while awaiting access. Conversely, peripheral parenteral nutrition (PPN) can be delivered without central access. In this quality improvement project, we sought to characterize PPN utilization at a large urban tertiary hospital.
Methods: We performed a retrospective review of adult inpatients receiving PN at our facility from 1/1/23–12/31/23. Patients were excluded from review if they had PN initiated prior to hospitalization. Demographic information, duration of treatment, timely administration status, and information regarding formula nutrition composition were collected.
Results: A total of 128 inpatients received PN for a total of 1302 PN days. The mean age of these patients was 53.8 years (SD: 17.9) and 65 (50%) were male. Twenty-six (20%) patients received only PPN for a median [IQR] length of 3 [2–4] days, and 61 (48%) patients received only CPN for a median length of 6 [3–10] days. Thirty-nine (30%) patients were started on PPN, with a median time to transition to CPN of 1 [1–3] day and a median total duration of CPN of 8 [5–15.5] days. A small minority of patients received CPN and then transitioned to PPN (2%).
Conclusion: At our institution, PPN is utilized in more than 50% of all inpatient PN courses, most commonly at PN initiation with eventual transition to CPN for a relatively short duration of one to two weeks. Additional research is required to identify patients who might avoid central access through increased PPN volume and macronutrients to provide adequate nutrition therapy.
Nicole Halton, NP, CNSC1; Marion Winkler, PhD, RD, LDN, CNSC, FASPEN2; Elizabeth Colgan, MS, RD3; Benjamin Hall, MD4
1Brown Surgical Associates, Providence, RI; 2Department of Surgery and Nutritional Support at Rhode Island Hospital, Providence, RI; 3Rhode Island Hospital, Providence, RI; 4Brown Surgical Associates, Brown University School of Medicine, Providence, RI
Financial Support: None Reported.
Background: Parenteral nutrition (PN) provides adequate nutrition and fluids to patients with impaired gastrointestinal function who cannot meet their nutritional needs orally or enterally. PN requires a venous access device that has associated risks including infection as well as metabolic abnormalities associated with the therapy. Monitoring of PN therapy in the hospital setting involves regular blood work, yet improperly collected samples can lead to abnormal laboratory results and unnecessary medical interventions.
Methods: An IRB exempted quality improvement study was conducted at Rhode Island Hospital by the Surgical Nutrition Service which manages all adult PN. The purpose of the study was to quantify the occurrence of contaminated blood samples among PN patients between January 1, 2024, and August 31, 2024. Demographic data, venous access device, and PN-related diagnoses were collected. Quantification of contaminated blood specimens was determined per patient, per hospital unit, and adjusted for total PN days. Comparisons were made between serum glucose, potassium, and phosphorus levels from contaminated and redrawn blood samples. Descriptive data are reported.
Results: 138 patients received PN for a total of 1840 days, with a median length of PN therapy of 8 days (IQR 9, range 2-84). The most common vascular access device was a dual-lumen peripherally inserted central catheter. The majority (63%) of patients were referred by surgery teams and received care on surgical floors or critical care units. The most frequent PN-related diagnoses were ileus, gastric or small bowel obstruction, and short bowel syndrome. There were 74 contaminated blood specimens among 42 (30%) patients receiving TPN, a rate of 4% of total PN days. Of 25 nursing units, 64% had at least one occurrence of a contaminated blood specimen among TPN patients on that unit. Contaminated samples showed significantly different serum glucose, potassium, and phosphorus compared to redrawn samples (p < 0.001); glucose in contaminated vs redrawn samples was 922 ± 491 vs 129 ± 44 mg/dL; potassium 6.1 ± 1.6 vs 3.9 ± 0.5 mEq/L; phosphorus 4.9 ± 1.2 vs 3.3 ± 0.6 mg/dL. The average delay between repeated blood samples was 3 hours.
Conclusion: Contaminated blood samples can lead to delays in patient care, discomfort from multiple blood draws, unnecessary medical interventions (insulin; discontinuation of PN), delay in placement of timely PN orders, and increased infection risk. Nursing re-education on proper blood sampling techniques is critical for reducing contamination occurrences. All policies and procedures will be reviewed, and an educational program will be implemented. Following this, occurrences of blood contamination during PN will be reassessed.
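The summary rates in the Results above follow directly from the stated counts; a quick check of that arithmetic:

```python
# Reproducing the contamination rates reported in this abstract from
# the stated counts: 74 contaminated specimens among 42 of 138 TPN
# patients, over 1,840 total PN days.

patients_total = 138
patients_affected = 42
contaminated_specimens = 74
pn_days = 1840

pct_patients = patients_affected / patients_total * 100    # ~30% of patients
rate_per_pn_day = contaminated_specimens / pn_days * 100   # ~4% of PN days

print(f"{pct_patients:.0f}% of patients; {rate_per_pn_day:.0f}% of PN days")
```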
Publication: Saravana P, Lau M, Dashti HS. Continuous glucose monitoring in adults with short bowel syndrome receiving overnight infusions of home parenteral nutrition. Eur J Clin Nutr. 2024 Nov 23. doi: 10.1038/s41430-024-01548-z. Online ahead of print. PMID: 39580544.
Financial Support: ASPEN Rhoads Research Foundation.
Maria Romanova, MD1; Azadeh Lankarani-Fard, MD2
1VA Greater Los Angeles Healthcare System, Oak Park, CA; 2VA Greater Los Angeles Healthcare System, Los Angeles, CA
Financial Support: None Reported.
Background: Malnutrition is a serious complication of the hospital stay. Parenteral nutrition (PN) is the most sophisticated way of addressing it but requires ongoing monitoring. In our medical center, PN provision is guided by the interdisciplinary Nutrition Support Team (NST). In 2024 we began creating a dashboard to monitor the safety and utilization of PN at the Greater Los Angeles VA. Here we discuss the collaborative process of developing the dashboard and its first use.
Methods: A dashboard was constructed using data from the VA electronic health record, with Microsoft Power BI used to customize data visualization. The NST worked closely with the facility's Data Analytics team to modify and validate the dashboard to accommodate the needs of the group. The dashboard was maintained behind a VA firewall and accessible only to members of the NST. It reviewed patient-level data for all patients for whom a Nutrition Support consult had been placed over the last 2 years. The variables included the date of admission, date of consult request, the treating specialty at the time of request, demographics, admission diagnosis, discharge diagnosis, number of orders for PPN/TPN, number of blood sugars >200 mg/dL after admission, number of serum phosphorus values < 2.5 mg/dL, number of serum potassium values < 3.5 mmol/L, any discharge diagnosis of refeeding (ICD-10 E87.8), micronutrient levels during admission, and any discharge diagnosis of infection. The ICD-10 codes used to capture infection were bacteremia (R78.81), sepsis (A41.*), and catheter-associated line infection (T80.211*); the asterisk (*) denotes any number in that ICD-10 classification. The dashboard was updated once a week. The NST validated the information on the dashboard to ensure accuracy and refined it as needed.
Results: The initial data extraction noted duplicate consult requests as patients changed treating specialties during the same admission, and duplicate orders for PPN/TPN because formulations were frequently modified before administration. The Data Analytics team worked to reduce these duplicates. The NST also collaborated with the Data Analytics team to modify their existing documentation to better capture the data needed going forward. Dashboard data were verified by direct chart review. Between April 2022 and April 2024, 68 consults were placed from the acute care setting and 58 patients received PPN or TPN during this period. Thirty-five patients experienced hyperglycemia. Two patients were deemed to have experienced refeeding syndrome at the time of discharge. Fourteen episodes of infection were noted in those who received PPN/TPN, but the etiology was unclear from the dashboard alone and required additional chart review.
Conclusion: A dashboard can facilitate monitoring of Nutrition Support services in the hospital. Refinement of the dashboard requires collaboration between the clinical team and the data analytics team to ensure validity and workload capture.
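The wildcard-style ICD-10 capture described in the Methods above can be sketched with shell-style pattern matching. This is illustrative only; the actual dashboard queries run inside Power BI / the VA data platform, whose pattern syntax differs:

```python
# Sketch of the ICD-10 infection capture described in Methods:
# exact code R78.81, plus wildcard families A41.* (sepsis) and
# T80.211* (catheter-associated line infection). fnmatch provides
# shell-style '*' matching; the real extraction logic is an assumption.
from fnmatch import fnmatch

INFECTION_PATTERNS = ["R78.81", "A41.*", "T80.211*"]

def is_infection_code(icd10: str) -> bool:
    """True if the discharge code matches any tracked infection pattern."""
    return any(fnmatch(icd10, pattern) for pattern in INFECTION_PATTERNS)

print(is_infection_code("A41.9"))     # True  (sepsis, unspecified)
print(is_infection_code("T80.211A"))  # True
print(is_infection_code("E87.8"))     # False (refeeding, tracked separately)
```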
Michael Fourkas, MS1; Julia Rasooly, MS1; Gregory Schears, MD2
1PuraCath Medical Inc., Newark, CA; 2Department of Anesthesiology and Perioperative Medicine, Mayo Clinic, Rochester, MN
Financial Support: Funding of the study has been provided by Puracath Medical.
Background: Intravenous catheters can provide venous access for drug and nutrition delivery in patients for extended periods of time, but risk the occurrence of central line associated bloodstream infections (CLABSI) due to inadequate asepsis. Needleless connectors (NC), which provide access for injection of medications, are known to be one of the major sources of contamination. Studies demonstrate that current methods of disinfecting connectors such as a 15 second antiseptic wipe do not guarantee complete disinfection inside of connectors. With the rise of superbugs such as Candida auris, there is an urgent need for better aseptic technique compliance and non-antibiotic disinfection methods. Ultraviolet light-C (UV-C) is an established technology that is commonly used in hospital settings for disinfection of equipment and rooms. In this study, we investigate the efficacy of our novel UV-C light disinfection device on UV-C light-transmissive NCs inoculated with common CLABSI-associated organisms.
Methods: Staphylococcus aureus (ATCC #6538), Candida albicans (ATCC #10231), Candida auris (CDC B11903), Escherichia coli (ATCC #8739), Pseudomonas aeruginosa (ATCC #9027), and Staphylococcus epidermidis (ATCC #12228) were used as test organisms for this study. A total of 29 NC samples were tested for each organism, with 3 positive controls and 1 negative control. Each UV-C light-transmissive NC was inoculated with 10 µl of cultured inoculum (7.00-7.66 log) and exposed to an average of 48 mW/cm2 of UV light for 1 second using our in-house UV light disinfection device FireflyTM. After UV disinfection, 10 mL of 0.9% saline solution was flushed through the NC and filtered through a 0.45 µm membrane. The membrane filter was plated onto an agar medium matched to the organism and incubated overnight at 37°C for S. aureus, E. coli, S. epidermidis, and P. aeruginosa, and for two days at room temperature for C. albicans and C. auris. Positive controls followed the same procedure without exposure to UV light and were diluted 100-fold before being spread onto agar plates in triplicate. The negative controls followed the same procedure without inoculation. After incubation, the number of colonies on each plate was counted and recorded. Log reduction was calculated as the log of the positive control concentration over the sample concentration in cfu/mL. A value of 1 cfu/10 mL was used for total-kill calculations.
Results: Using our UV light generating device, we achieved an average log reduction greater than 4 and complete kills for all test organisms. The log reductions for S. aureus, C. albicans, C. auris, E. coli, P. aeruginosa, and S. epidermidis were 5.29, 5.73, 5.05, 5.24, 5.10, and 5.19, respectively.
Conclusion: We demonstrated greater than 4-log reduction in common CLABSI-associated organisms using our UV light disinfection device and UV-C transmissive NCs. By injecting inoculum directly inside the NC, we demonstrated that disinfection inside NCs can be achieved, which is not possible with conventional scrubbing methods. A one second NC disinfection time will allow less disruption in the workflow in hospitals, particularly in intensive care units where highly effective and efficient disinfection rates are essential for adoption of the technology.
Table 1. Log Reduction of Tested Organisms After Exposure to 48 mW/cm2 UV-C for 1 Second.
Yaiseli Figueredo, PharmD1
1University of Miami Hospital, Miami, FL
Financial Support: None Reported.
Background: Octreotide belongs to the somatostatin analog class. It is used off-label for malignant bowel obstructions (MBO). Somatostatin analogs (SSA) inhibit the release and action of multiple hormones, reducing gastric secretions, peristalsis, and splanchnic blood flow while enhancing water and electrolyte absorption. National Comprehensive Cancer Network (NCCN) guidelines recommend octreotide 100-300 mcg subcutaneously two to three times a day or 10-40 mcg/hour by continuous infusion for the management of malignant bowel obstructions and, if prognosis is greater than 8 weeks, consideration of long-acting release (LAR) or depot injection. Using octreotide as an additive to parenteral nutrition solutions has been debated due to concerns about formation of a glycosyl octreotide conjugate that may decrease octreotide's efficacy. However, other compatibility studies have concluded there is little octreotide loss over 48 hours in TPN solutions at room temperature in ambient room light. At the University of Miami Hospital, it is standard practice to use octreotide as an additive to Total Parenteral Nutrition (TPN) solutions to reduce gastrointestinal secretions in patients with malignant bowel obstructions. The starting dose is 300 mcg, increased in 300 mcg increments to a maximum of 900 mcg if output remains uncontrolled/elevated. The objective of this study is to evaluate the effectiveness and safety of octreotide as an additive to TPN in hospitalized patients with malignant bowel obstructions.
Methods: A three-year retrospective chart review (June 2021-June 2024) was conducted to evaluate the effectiveness and safety of octreotide as an additive to TPN in hospitalized patients with MBO diagnosis at UMH. The following information was obtained from chart review: age, gender, oncologic diagnosis, TPN indication, TPN dependency, octreotide doses used, baseline and final gastrointestinal secretion output recorded, type of venting gastrostomy in place, length of hospital stay, and baseline and final hepatic function tests.
Results: A total of 27 patients were identified with malignant bowel obstruction requiring TPN with octreotide as an additive. All patients were started on octreotide 300 mcg/day added to a 2-in-1 TPN solution. Gastrointestinal secretion output was reduced by an average of 65% among all patients, with a final average daily output of 540 mL. The baseline average output was 1,518 mL/day. The average length of inpatient treatment was 23 days (range 3-98 days). Liver function tests (LFTs) were assessed at baseline and at the last inpatient value available for the admission. Four of the 27 patients (15%) reviewed had a significant rise in liver enzymes, greater than three times the upper limit of normal.
Conclusion: Octreotide represents a valuable addition to the limited pharmacological options for managing malignant bowel obstruction. Its ability to reduce gastrointestinal secretions by 65% on average as observed in this retrospective chart review can significantly alleviate symptoms and improve patient care. Using octreotide as an additive to TPN solutions for patients with malignant bowel obstructions who are TPN dependent reduces the number of infusions or subcutaneous injections patients receive per day. According to octreotide's package insert, the incidence of hepato-biliary complications is up to 63%. The finding that 15% of patients from this retrospective chart review had significant liver enzyme elevations remains an important monitoring parameter to evaluate.
Pavel Tesinsky, Assoc. Prof., MUDr.1; Jan Gojda, Prof., MUDr, PhD2; Petr Wohl, MUDr, PhD3; Katerina Koudelkova, MUDr4
1Department of Medicine, Prague, Hlavni mesto Praha; 2Department of Medicine, University Hospital, 3rd Faculty of Medicine Charles University in Prague, Praha, Hlavni mesto Praha; 3Institute for Clinical and Experimental Medicine, Prague, Hlavni mesto Praha; 4Department of Medicine, University Hospital and 3rd Faculty of Medicine in Prague, Prague, Hlavni mesto Praha
Financial Support: The Registry was supported by Takeda and Baxter scientific grants.
Background: We report trends in indications, syndromes, performance, weaning, and complications of patients on total HPN, based on an updated 30-year analysis and stratification of patients on home parenteral nutrition (HPN) in the Czech Republic.
Methods: Records from the HPN National Registry were analysed for the period 2007-2023, based on data from the HPN centers. Catheter-related sepsis (CRS), catheter occlusions, and thrombotic complications were analyzed for time-to-event using the competing-risks regression (Fine and Gray) model. Other data are presented as median or mean with 95% CI (p < 0.05 considered significant).
Results: The incidence rate of HPN is 1.98 per 100,000 inhabitants (population 10.5 million). Lifetime dependency is expected in 20% of patients, potential weaning in 40%, and 40% of patients are palliative. Of 1838 records representing almost 1.5 million catheter days, short bowel syndrome was present in 672 patients (36.6%), intestinal obstruction in 531 patients (28.9%), malabsorption in 274 patients (14.9%), and the remaining 361 patients (19.6%) were split among fistulas, dysphagia, or remained unspecified. The majority of SBS cases were type I (57.8%) and II (20.8%). Mean length of residual intestine was 104.3 cm (35.9-173.4 cm), with longer remnants in type I SBS. Dominant indications for HPN were pseudoobstruction (35.8%), non-malignant surgical conditions (8.9%), Crohn disease (7.3%), and mesenteric occlusion (6.8%). Mobility for a substantial part of the day was reported by 77.8% of HPN patients, and economic activity and independence by 162 (24.8%) of 653 working-age patients. A tunneled catheter was primarily used in 49.1%, a PICC in 24.3%, and an IV port in 19.8% of patients. Commercially prepared bags were used in 69.7% and pharmacy-prepared admixtures in 24.7% of patients. A total of 66.9% of patients were administered 1 bag per day, 7 days a week. The sepsis rate per 1000 catheter days decreased from 0.84 in 2013 to 0.15 in 2022. The catheter occlusion rate decreased from 0.152 to 0.10 per 1000 catheter days, and the thrombotic complication rate from 0.05 to 0.04. Prevalence of metabolic bone disease is 15.6%, and prevalence of PNALD is 22.3%. In the first 12 months, 28% of patients achieved intestinal autonomy, increasing to 45% after 5 years. Patient survival is 62% at 1 year, 45% at 5 years, and 35% at 10 years. Teduglutide has been indicated in 36 patients to date, with reduction of the daily HPN volume to 60.3% on average.
Conclusion: The prevalence of HPN patients in the Czech Republic has been increasing over the past ten years, in keeping with the incidence rate. The majority of patients are expected to terminate HPN within the first year. The risk of CRS decreased significantly in the past five years and remains low, while catheter occlusions and thrombotic complications show a stable trend. Teduglutide significantly reduced the required IV volume.
Figure 1. Per-Year Prevalence of HPN Patients and Average Number of Catheter Days Per Patient (2007 - 2022).
Figure 2. Annual Incidence of HPN Patients (2007 - 2022).
Figure 3. Catheter related bloodstream infections (events per 1,000 catheter-days).
1Vanderbilt University Medical Center, Nashville, TN; 2Vanderbilt University Medical Center, Nashville, TN
Financial Support: None Reported.
Background: Determining macronutrient goals in patients requiring home parenteral nutrition (HPN) can be difficult due to various factors. While indirect calorimetry is the gold standard for measuring energy expenditure, it is not readily available in the outpatient setting. Therefore, clinicians typically rely on less accurate weight-based equations for assessment of protein and energy requirements. Energy goals are also impacted by the targeted desire for weight loss, weight gain, or weight maintenance. Patients receiving HPN may consume some oral dietary intake and experience variable degrees of macronutrient absorption. These factors, as well as underlying clinical conditions, can significantly impact protein and energy requirements and may change over the course of HPN therapy. The purpose of this study was to evaluate the range of protein and energy doses prescribed in patients receiving HPN managed by an interdisciplinary intestinal failure clinic at a large academic medical center.
Methods: Patient demographics including patient age, gender, and PN indication/diagnosis were retrospectively obtained for all patients discharged home with PN between May 2021 to May 2023 utilizing an HPN patient database. Additional information was extracted from the electronic medical record at the start of HPN, then at 2-week, 2 to 3 month, and 6-month intervals following discharge home that included height, actual weight, target weight, HPN energy dose, HPN protein dose, and whether the patient was eating. Data collection ended at completion of HPN therapy or up to 6 months of HPN. All data was entered and stored in an electronic database.
Results: During the study period, 248 patients were started on HPN and 56 of these patients received HPN for at least 6 months. Patient demographics are included in Table 1. At the start of HPN, prescribed energy doses ranged from 344 to 2805 kcal/d (6 to 45 kcal/kg/d) and prescribed protein doses ranged from 35 to 190 g/d (0.6 to 2.1 g/kg/d). There continued to be a broad range of prescribed energy and protein doses at the 2-week, 2 to 3 month, and 6-month intervals of HPN. Figures 1 and 2 provide the prescribed energy and protein doses for all patients and for those who were eating and not eating. For patients not eating, the prescribed range at the start of PN therapy was 970 to 2791 kcal/d (8 to 45 kcal/kg/d) for energy and 40 to 190 g/d (0.6 to 2.0 g/kg/d) for protein. The difference between actual weight and target weight was assessed at each study interval. Over the study period, patients demonstrated a decrease in the difference between actual and target weight, suggesting improvement in reaching target weight (Figure 3).
Conclusion: The results of this study demonstrate a wide range of energy and protein doses prescribed in patients receiving HPN. This differs from use of PN in the inpatient setting, where weight-based macronutrient goals tend to be more defined. Macronutrient adjustments may be necessary in the long-term setting when patients are consuming oral intake, for achievement/maintenance of target weight, or for changes in underlying conditions. Patients receiving HPN require an individualized approach to care that can be provided by interdisciplinary nutrition support teams specializing in intestinal failure.
Table 1. Patient Demographics Over 6-Month Study Period.
Figure 1. Parenteral Nutrition (PN) Energy Range.
Figure 2. Parenteral Nutrition (PN) Protein Range.
Figure 3. Difference Between Actual Weight and Target Weight.
1Soleo Home Infusion, Frisco, TX; 2Soleo Health, Frisco, TX
Financial Support: None Reported.
Background: Parenteral nutrition (PN) has been initiated at home since the 1990s, though some clinicians prefer hospital initiation due to risks such as refeeding syndrome (RS). A key factor in successful home PN initiation is careful evaluation by an experienced nutrition support clinician, particularly assessing RS risk. In 2020, ASPEN published consensus recommendations for identifying patients at risk for RS and guidelines for initiating and advancing nutrition. Home PN initiation offers advantages, including avoiding hospitalization, reducing healthcare costs, minimizing hospital-acquired infections, and improving quality of life. Literature suggests a savings of $2,000 per day when PN is started at home. This study aims to determine the risk and incidence of RS in patients who began PN at home, based on the 2020 ASPEN Consensus Recommendations for RS.
Methods: A national home infusion provider's nutrition support service reviewed medical records for 27 adult patients who initiated home PN between September 2022 and June 2024. Patients were evaluated for RS risk before PN initiation, and the actual incidence of RS was assessed from pre- and post-feeding phosphorus, potassium, and magnesium levels using ASPEN 2020 criteria. Initial lab work was obtained before and after PN initiation. Refeeding risk was categorized as mild, moderate, moderate to severe, or severe based on the initial nutrition assessment, including BMI, weight loss history, recent caloric intake, pre-feeding lab abnormalities, fat/muscle wasting, and high-risk comorbidities. The percent change in phosphorus, potassium, and magnesium was evaluated and categorized as mild, moderate, or severe if levels decreased after therapy start. Initial PN prescriptions included multivitamins and supplemental thiamin per provider policy and consensus recommendations.
Results: The average baseline BMI for the study population was 19.6 kg/m² (range 12.7-31.8, median 18.9). Weight loss was reported in 88.9% of patients, averaging 22%. Little to no oral intake for at least 5-10 days before assessment was reported in 92.3% of patients. Initial lab work was obtained within 5 days of therapy start in 96.2% of cases, with 18.5% showing low prefeeding electrolytes. All patients (100%) had high-risk comorbidities. RS risk was categorized as mild (4%), moderate (48%), moderate to severe (11%), and severe (37%). Home PN was successfully initiated in 25 patients (93%). Two patients could not start PN at home: one due to persistently low pre-PN electrolyte levels despite IV repletion, and one due to not meeting home care admission criteria. Starting dextrose averaged 87.2 g/d (range: 50-120, median 100). Average total starting calories were 730 kcal/d, representing 12.5 kcal/kg (range: 9-20, median 12). Initial PN formula electrolyte content included potassium (average 55.4 mEq/d, range: 15-69, median 60), magnesium (average 11.6 mEq/d, range: 4-16, median 12), and phosphorus (average 15.6 mmol/d, range: 8-30, median 15). Labs were drawn on average 4.3 days after therapy start. Potassium, magnesium, and phosphorus levels were monitored for decreases ≥10% of baseline to detect RS. Mild (10-20%) decreases in magnesium and in potassium were each experienced by 4% of patients. Eight patients (32%) had a ≥10% decrease in phosphorus: 4 mild (10-20%), 2 moderate (20-30%), and 2 severe (>30%).
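The percent-change severity bands used above are simple arithmetic. The sketch below is a hypothetical helper, not the authors' actual tool; the band edges are assumed inclusive at the upper bound.

```python
def classify_electrolyte_drop(baseline, followup):
    """Classify the percent decrease of an electrolyte from baseline
    using the severity bands described in the abstract.

    Hypothetical helper (not the authors' tool); band edges assumed
    inclusive at the upper bound.
    Bands: <10% not flagged, 10-20% mild, 20-30% moderate, >30% severe.
    """
    if baseline <= 0:
        raise ValueError("baseline must be positive")
    pct_drop = (baseline - followup) / baseline * 100
    if pct_drop < 10:
        return "none"
    if pct_drop <= 20:
        return "mild"
    if pct_drop <= 30:
        return "moderate"
    return "severe"

# Example: phosphorus falling from 3.8 to 2.5 mg/dL is a ~34% drop
print(classify_electrolyte_drop(3.8, 2.5))  # severe
```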
Conclusion: Home initiation of PN can be safely implemented with careful monitoring and evaluation of RS risk. This review showed a low incidence of RS based on ASPEN criteria, even in patients at moderate to severe risk prior to home PN initiation. Close monitoring of labs and patient status, along with adherence to initial prescription recommendations, resulted in successful PN initiation at home in 92.5% of patients.
Background: Iron deficiency anemia is common in home parenteral nutrition (HPN) patients (Hwa et al., 2016). Post iron infusion, hypophosphatemia due to fibroblast growth factor 23 is a known side effect of some IV iron formulations (Wolf et al., 2019). Ferric carboxymaltose can cause serum phosphorus levels to drop below 2 mg/dL in 27% of patients (Onken et al., 2014). Hypophosphatemia (< 2.7 mg/dL) can lead to neurological, neuromuscular, cardiopulmonary, and hematologic issues, and long-term effects like osteopenia and osteoporosis (Langley et al., 2017). This case series reviews the occurrence and clinical implications of transient serum phosphorus drops in patients receiving ferric carboxymaltose and HPN.
Methods: A retrospective case series review was performed for three patients who were administered ferric carboxymaltose while on HPN therapy. Serum phosphorus levels were measured at baseline prior to initial iron dose, within 1 week post injection, and at subsequent time intervals up to 4 months post initial injection. Data was collected on the timing, magnitude, and duration of any phosphorus decreases, and any associated clinical symptoms or complications. Patient records were also reviewed to evaluate any correlations with HPN composition or phosphorus dosing.
Results: Among the three patients reviewed, all exhibited short-term drops in serum phosphorus levels following ferric carboxymaltose injection (Table 1). All patients had baseline serum phosphorus levels within normal limits prior to the initial dose of ferric carboxymaltose. All cases involved multiple doses of ferric carboxymaltose, which contributed to the fluctuations in phosphorus levels. The average drop in serum phosphorus from baseline to the lowest point was 50.3%. The lowest recorded phosphorus level among these three patients was 1.4 mg/dL, indicating potential severity; this occurred in a patient who received more than two doses of ferric carboxymaltose. In two cases, increases were made in HPN phosphorus in response to serum levels, and in one case no HPN changes were made. However, all serum phosphorus levels returned to normal despite varied interventions. Despite the low phosphorus levels, none of the patients reported significant symptoms of hypophosphatemia during the monitoring periods, suggesting subclinical hypophosphatemia may be common. Ferric carboxymaltose significantly impacts serum phosphorus in HPN patients, consistent with existing literature, highlighting the need for vigilant monitoring. Patients receiving HPN are closely monitored by a trained nutrition support team with frequent lab work; lab monitoring may be less common in patients receiving ferric carboxymaltose who are not on HPN. In two of the reviewed cases, hypophosphatemia was addressed by incremental increases in the patient's HPN formulas. Note that there are limitations to phosphorus management in HPN due to compatibility and stability issues, and alternative means of supplementation may be necessary depending upon the patient's individual formula.
Conclusion: Three HPN patients receiving ferric carboxymaltose experienced transient, generally mild reductions in serum phosphorus. Monitoring is crucial, but the results from this case series suggest that clinical complications are rare. Adjustments to HPN or additional supplementation may be needed based on individual patient needs, with some cases self-correcting over time.
Table 1. Timeline of Iron Injections and the Resulting Serum Phosphorus Levels and HPN Formula Adjustments.
Danial Nadeem, MD1; Stephen Adams, MS, RPh, BCNSP2; Bryan Snook2
1Geisinger Wyoming Valley, Bloomsburg, PA; 2Geisinger, Danville, PA
Financial Support: None Reported.
Background: Ferric carboxymaltose (FC) is a widely used intravenous iron formulation, primarily employed in the treatment of iron deficiency. It offers significant benefits, particularly in cases where oral iron supplementation proves ineffective or is not well-tolerated. However, an important potential adverse effect associated with FC use is hypophosphatemia. This condition has been observed in multiple patients following their treatment with FC. The paper discusses the potential mechanisms leading to this adverse effect and its significant implications for patient care.
Methods: A middle-aged female with a history of malnutrition and iron deficiency receiving parenteral nutrition at home had received multiple doses of Venofer in the past; the last dose, given in 2017, caused an anaphylactic reaction. She was therefore switched to ferric carboxymaltose (FC) therapy. However, upon receiving multiple doses of FC in 2018, the patient developed significant hypophosphatemia. As hypophosphatemia was noted, adjustments were made to the patient's total parenteral nutrition (TPN) regimen to increase the total phosphorus content in an effort to treat the low phosphate levels. The patient also received continued doses of FC in subsequent years, with persistent hypophosphatemia despite repletion.
Results: Ferric carboxymaltose (FC) is a widely used intravenous iron therapy for the treatment of iron deficiency. It is particularly beneficial in cases where oral iron supplementation is ineffective or not tolerated. FC works by delivering iron directly to the macrophages in the reticuloendothelial system. The iron is then released slowly for use by the body, primarily for the production of hemoglobin. However, recent studies have highlighted the potential adverse effect of hypophosphatemia associated with the use of FC. Hypophosphatemia induced by FC is thought to be caused by an increase in the secretion of the hormone fibroblast growth factor 23 (FGF23). FGF23 is a hormone that regulates phosphate homeostasis. When FGF23 levels rise, the kidneys increase the excretion of phosphate, leading to lower levels of phosphate in the blood. There are many implications of hypophosphatemia with regard to patient care. Symptoms of hypophosphatemia can include muscle weakness, fatigue, bone pain, and confusion. In severe cases, persistent hypophosphatemia can lead to serious complications such as rhabdomyolysis, hemolysis, respiratory failure, and even death. Therefore, it is crucial for clinicians to be aware of the potential risk of hypophosphatemia when administering FC. Regular monitoring of serum phosphate levels is recommended in patients receiving repeated doses of FC. Further research is needed to fully understand the mechanisms underlying this adverse effect and to develop strategies for its prevention and management.
Conclusion: In conclusion, while FC is an effective treatment for iron deficiency, it is important for clinicians to be aware of the potential risk of hypophosphatemia. Regular monitoring of serum phosphate levels is recommended in patients receiving repeated doses of FC. Further research is needed to fully understand the mechanisms underlying this adverse effect and to develop strategies for its prevention and management.
Table 1. Phosphorus Levels and Iron Administration.
Table 1 shows the response of serum phosphorus levels in a patient given multiple doses of intravenous iron over time.
Background: Home parenteral nutrition (HPN) can be successfully initiated in the home setting with careful evaluation and management by an experienced nutrition support team (NST).1,2 Safe candidates for HPN initiation include medically stable patients with an appropriate indication, a safe environment, and the means for reliable follow-up. However, some patients are not appropriate to start directly with HPN due to logistical reasons or the risk of refeeding syndrome (RFS).2 Consensus Recommendations for RFS provide guidance regarding recognizing and managing risk.3 An experienced NST that provides individualized care can recommend intravenous (IV) hydration before initiating HPN to expedite the initiation of therapy and normalize blood chemistry to mitigate the risk of RFS. The purpose of this study is to evaluate the impact of IV hydration on adult patients managed by a home infusion NST who received IV hydration prior to initiating HPN. The proportion of patients who received IV hydration prior to HPN, the reason for initiating IV hydration, and the impact this intervention may have had on their care and outcomes will be described.
Methods: This retrospective review includes 200 HPN patients 18 years of age and older initiated on HPN therapy with a national home infusion pharmacy over a 6-month period from January 1, 2024, to June 30, 2024. Data collection included baseline demographics, indication for HPN, risk of RFS, days receiving IV hydration prior to initiating HPN, the number of rehospitalizations within the first 2 weeks, whether they received IV hydration, and if so, the indication for IV hydration and the components of the orders. Data was collected via electronic medical records and deidentified into a standardized data collection form.
Results: Of the 200 total patients, 19 (9.5%) received IV hydration prior to HPN. Of these 19 patients, 16 were female, and 3 were male (Table 1). The most common indications for HPN were bariatric surgery complication (5), intestinal failure (4), and oncology diagnosis (4) (Figure 1). Among these patients, 9 (47%) were at moderate RFS risk and 10 (53%) were at high RFS risk. The indications for IV hydration included 7 (37%) due to electrolyte abnormalities/RFS risk, 5 (26%) due to delay in central line placement, and 7 (37%) due to scheduling delays (Figure 2). IV hydration orders included electrolytes in 15 (79%) of the orders. All orders without electrolytes (4) had an indication related to logistical reasons (Figure 3). All 19 patients started HPN within 7 days of receiving IV hydration. Two were hospitalized within the first two weeks of therapy with admitting diagnoses unrelated to HPN.
Conclusion: In this group of patients, HPN was successfully initiated in the home setting when managed by an experienced NST, preventing unnecessary hospitalizations. This study demonstrated that safe initiation of HPN may include IV hydration with or without electrolytes first, either to mitigate RFS or for logistical reasons, when HPN is started within 7 days. The IV hydration orders were individualized to fit the needs of each patient. These data only reflect IV hydration dispensed through the home infusion pharmacy and do not capture IV hydration received at an outside clinic. This patient population also does not include those deemed medically unstable for HPN or those for whom home initiation was not suitable for other reasons. Future research should account for these limitations and detail IV hydration components, dosing, and frequency of orders.
Table 1. Demographics.
Figure 1. HPN Indications of IV Hydration.
Figure 2. Indication for IV Hydration and Refeeding Risk.
Background: Parenteral nutrition (PN) is a critical therapy for patients who are unable to absorb nutrients adequately via the gastrointestinal tract.1 PN is complex, with 10 or more individually dosed components in each order, which inherently increases the risk for dosing errors.2 This study seeks to analyze the PN orders at hospital discharge received by a home infusion provider and identify the incidence of the omission of standard components, as determined by ASPEN Recommendations on Appropriate Parenteral Nutrition Dosing.3 The primary objective of this study was to identify missing sodium, potassium, magnesium, calcium, phosphorus, and multivitamin components in hospital discharge orders. The secondary objective was to determine whether identified missing components were added back during the transition of care (TOC) process from hospital to home.
Methods: This multi-center, retrospective chart review analyzed patients referred to a national home infusion provider over a 3-month period. Data was collected from the electronic medical record and internal surveillance software. The Registered Dietitian, Certified Nutrition Support Clinician (RD, CNSC) reviewed all PN hospital discharge orders for patients transitioning clinical management and PN therapy from hospital to home. Inclusion criteria were patients 18 years of age or older with PN providing the majority of their nutritional needs, as defined in Table 1, who were missing sodium, potassium, magnesium, calcium, phosphorus, or multivitamin from their PN order at hospital discharge. Exclusion criteria were patients less than 18 years of age, patients receiving supplemental PN not providing majority of nutritional needs, and patients with doses ordered for all electrolytes and multivitamin.
Results: During the 3-month period (April 1, 2024 to June 30, 2024), 267 patients were identified who were greater than 18 years of age, receiving the majority of their nutritional needs via PN, and missing at least one PN component in their hospital discharge order. See Table 2 and Figure 1 for demographics. One hundred seventy-five (65.5%) patients were missing one component and 92 (34.5%) were missing multiple components from the hospital discharge order. One hundred seventy-five (65.5%) patients were missing calcium, 68 (25.5%) phosphorus, 38 (14%) multivitamin, 23 (8.6%) magnesium, 20 (7.5%) potassium, and 20 (7.5%) sodium. During the transition from hospital to home, after discussion with the provider, 94.9% of patients had calcium added back, 94.7% multivitamin, 91.3% magnesium, 90% potassium, 88.2% phosphorus, and 80% sodium.
Conclusion: This study highlights the prevalence of missing components in PN hospital discharge orders, with calcium being the most frequently omitted at a rate of 65.5%. Given that many patients discharging home on PN will require long term therapy, adequate calcium supplementation is essential to prevent bone resorption and complications related to metabolic bone disease. In hospital discharge orders that were identified by the RD, CNSC as missing calcium, 94.9% of the time the provider agreed that it was clinically appropriate to add calcium to the PN order during the TOC process. This underlines the importance of nutrition support clinician review and communication during the transition from hospital to home. Future research should analyze the reasons why components are missing from PN orders and increase awareness of the need for a thorough clinical review of all patients going home on PN to ensure the adequacy of all components required for safe and optimized long term PN.
Table 1. Inclusion and Exclusion Criteria.
Table 2. Demographics.
Figure 1. Primary PN Diagnosis.
Figure 2. Components Missing from Order and Added Back During TOC Process.
Avi Toiv, MD1; Hope O'Brien, BS2; Arif Sarowar, MSc2; Thomas Pietrowsky, MS, RD1; Nemie Beltran, RN1; Yakir Muszkat, MD1; Syed-Mohammad Jafri, MD1
1Henry Ford Hospital, Detroit, MI; 2Wayne State University School of Medicine, Detroit, MI
Financial Support: None Reported.
Background: Intestinal Failure Associated Liver Disease (IFALD) is a known complication in patients reliant on total parenteral nutrition (TPN), especially those awaiting intestinal transplantation. There is concern that IFALD may negatively impact post-transplant outcomes, including graft survival and overall patient mortality. This study aims to evaluate the impact of IFALD, as indicated by liver function test (LFT) abnormalities before intestinal or multivisceral transplant, on transplant outcomes.
Methods: We conducted a retrospective chart review of all patients who underwent intestinal transplantation (IT) at an academic transplant center from 2010 to 2023. The primary outcomes were patient survival and graft failure in transplant recipients.
Results: Among 50 IT recipients, 30 (60%) required TPN before IT. The median age at transplant was 50 years (range, 17-68). In both groups, the majority of transplants were isolated IT, though both groups also included multivisceral transplants (MVT). Of patients on TPN, 87% developed elevated LFTs before transplant, and 33% had persistently elevated LFTs 1 year after transplant. TPN-associated liver dysfunction in our cohort was associated with mixed liver injury patterns, with both hepatocellular injury (p < 0.001) and cholestatic injury (p < 0.001). No significant associations were found between TPN-related elevated LFTs and major transplant outcomes, including death (p = 0.856), graft failure (p = 0.144), or acute rejection (p = 0.306). Similarly, no significant difference was observed between elevated LFTs and death (p = 0.855), graft failure (p = 0.769), or acute rejection (p = 0.386) in patients who were not on TPN. TPN-associated liver dysfunction before transplant was associated with elevated LFTs at one year after transplant (p < 0.001) but lacked clinical relevance.
Conclusion: Although IFALD related to TPN use is associated with specific liver dysfunction patterns in patients awaiting intestinal or multivisceral transplants, it does not appear to be associated with key transplant outcomes such as graft failure, mortality, or acute rejection. However, it is associated with persistently elevated LFTs even 1 year after transplant. These findings suggest that while TPN-related liver injury is common, it may not have a clinically significant effect on long-term transplant success. Further research is needed to explore the long-term implications of IFALD in this patient population.
1Denver Health, Parker, CO; 2Denver Health, St. Joseph Hospital, Denver, CO; 3Denver Health, Denver, CO
Financial Support: None Reported.
Background: Central line-associated blood stream infection (CLABSI) is associated with increased complications, length of stay, and cost of care. The majority of CLABSI studies focus on home parenteral nutrition (PN) patients, and there is a paucity of data documenting the incidence of CLABSI attributable to PN in the inpatient setting. At our institution, we have observed that clinicians are reluctant to initiate PN in patients with clear indications for PN due to concerns about CLABSI. Therefore, we performed a quality improvement project to document our incidence of CLABSI for new central parenteral nutrition (CPN) initiated during hospitalization.
Methods: We performed a retrospective review of adult inpatients who initiated CPN at our facility from 1/1/23-12/31/23. Patients were excluded if they received PN prior to admission or only received peripheral PN. The National Healthcare Safety Network (NHSN) definitions were used for CLABSI and secondary attribution. CLABSI cases were then reviewed in greater depth by an Infectious Disease (ID) consultant to determine whether positive cases were attributable to CPN versus other causes. The type of venous access for the positive patients was also reviewed.
Results: A total of 106 inpatients received CPN for a total of 1121 CPN days. The median [IQR] length of CPN infusion was 8 [4-14] days. Mean (standard deviation) age of patients receiving CPN infusion was 53.3 (18.6) years, and 65 (61%) were men. The CPN patients who met criteria for CLABSI were further reviewed by the ID consultant; only four CLABSI cases were attributable to CPN. These four cases resulted in an incidence rate of 3.6 cases of CLABSI per 1000 CPN days. Two of these patients were noted for additional causes of infection, including gastric ulcer perforation and bowel perforation with anastomotic leak. Three of the patients had CPN infused via a central venous catheter (a port, a femoral line, and a non-tunneled internal jugular catheter), and the fourth patient had CPN infused via a peripherally inserted central catheter. The incidence rate for CLABSI cases per catheter days was not reported in our review.
Conclusion: At our institution, < 4% of patients initiating short-term CPN during hospitalization developed a CLABSI attributable to the CPN. This low rate of infection serves as a benchmark for our institution's quality improvement and quality assurance efforts. Collaboration with ID is recommended for additional deeper review of CPN patients with CLABSI to determine if the infection is more likely to be related to other causes than infusion of CPN.
Background: Refeeding syndrome is defined as potentially fatal shifts in fluids and electrolytes that may occur in malnourished patients receiving enteral or parenteral nutrition. When refeeding syndrome occurs, a reduction in phosphorus/magnesium/potassium levels and thiamine deficiency can be seen shortly after initiation of calorie provision. The American Society for Parenteral and Enteral Nutrition guidelines consider hypophosphatemia the hallmark sign of refeeding syndrome; however, magnesium and potassium have been shown to be equally important. The purpose of this study is to identify the incidence of hypophosphatemia in patients who are at risk of refeeding syndrome and the importance of monitoring phosphorus.
Methods: This study was a multicenter retrospective chart review conducted using the BayCare Health System medical records database, Cerner. The study included patients who were 18 years or older, admitted between January 2023 and December 2023, received total parenteral nutrition (TPN), and were at risk of refeeding syndrome. We defined patients at risk of refeeding syndrome as those who met two of the following criteria prior to starting TPN: body mass index (BMI) prior to starting TPN < 18.5 kg/m2, 5% weight loss in 1 month, no oral intake for 5 days, or low levels of serum phosphorus/magnesium/potassium. COVID-19 patients and patients receiving propofol were excluded from the study. The primary objective of the study was to evaluate the incidence of hypophosphatemia versus hypomagnesemia versus hypokalemia in patients receiving TPN who were at risk of refeeding syndrome. The secondary objective was to evaluate whether the addition of thiamine upon initiation of TPN showed benefit in the incidence of hypophosphatemia.
Results: A total of 83 patients met the criteria for risk of refeeding syndrome. Of the 83 patients, 53 were used to run a pilot study to determine the sample size and 30 patients were included in the study. The results on day 1 and day 2 suggest the incidence of hypomagnesemia differs from that of hypophosphatemia and hypokalemia, with a notably lower occurrence. The Cochran's Q test yielded χ²(2) = 9.57 (p = 0.008) on day 1 and χ²(2) = 4.77 (p = 0.097) on day 2, indicating a difference in at least one group compared to the others on day 1 only. A post hoc analysis found a difference on day 1 between the incidence of hypophosphatemia vs hypomagnesemia (30%) and hypomagnesemia vs hypokalemia (33.3%). For the secondary outcome, the difference in day 2 versus day 1 phosphorus levels with the addition of thiamine in the TPN was 0.073 (p = 0.668, 95% CI [−0.266, 0.413]).
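Cochran's Q, used above, tests whether the proportion of a binary outcome (e.g., a low-electrolyte flag) differs across k related measurements on the same patients. A minimal pure-Python sketch with made-up data (not the study's data):

```python
def cochrans_q(data):
    """Cochran's Q statistic for k related binary (0/1) samples.

    data: one row per subject, each row a list of k 0/1 outcomes
    (e.g., hypophosphatemia / hypomagnesemia / hypokalemia flags).
    Q is compared against a chi-squared distribution with k-1 df.
    """
    k = len(data[0])
    col_totals = [sum(row[j] for row in data) for j in range(k)]
    row_totals = [sum(row) for row in data]
    grand_total = sum(row_totals)
    num = (k - 1) * (k * sum(c * c for c in col_totals) - grand_total ** 2)
    den = k * grand_total - sum(r * r for r in row_totals)
    return num / den

# Illustrative (made-up) data: 4 patients x 3 low-electrolyte flags
q = cochrans_q([[1, 1, 0], [1, 0, 0], [1, 1, 1], [1, 0, 0]])
print(round(q, 3))  # 4.667, compare to chi-squared with 2 df
```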
Conclusion: Among patients on parenteral nutrition who were at risk of refeeding syndrome, there was not a statistically significant difference in the incidence of hypophosphatemia vs hypomagnesemia or of hypomagnesemia vs hypokalemia. There was also no statistically significant difference in day 2 versus day 1 phosphorus levels when thiamine was added.
Jennifer McClelland, MS, RN, FNP-BC1; Margaret Murphy, PharmD, BCNSP1; Matthew Mixdorf1; Alexandra Carey, MD1
1Boston Children's Hospital, Boston, MA
Financial Support: None Reported.
Background: Iron deficiency anemia (IDA) is common in patients with intestinal failure (IF) dependent on parenteral nutrition (PN). Treatment with enteral iron is preferred; however, it may not be tolerated or efficacious. In these cases, intravenous (IV) iron is a suitable alternative. Complications include adverse reactions and infection, though rates are low with low-molecular-weight (LMW) formulations. In a home PN (HPN) program, an algorithm (Figure 1) was developed to treat IDA utilizing IV iron.
Methods: A retrospective chart review was conducted in a large HPN program (~150 patients annually) of patients prescribed IV iron per the algorithm from January 2019 to April 2024. Laboratory studies were analyzed for instances of ferritin >500 ng/mL, indicating potential iron overload, as well as transferrin saturation of 12-20%, indicating iron sufficiency. In instances of ferritin levels >500 ng/mL, further review was conducted to understand etiology and clinical significance and whether the IV iron algorithm was adhered to.
Results: HPN patients are diagnosed with IDA based on an iron panel (low hemoglobin and/or MCV, low ferritin, high reticulocyte count, low serum iron and transferrin saturation, and/or high total iron binding capacity (TIBC)). If the patient can tolerate enteral iron supplementation, a dose of 3-6 mg/kg/day is initiated. If the patient cannot tolerate enteral iron, the IV route is used. The initial IV dose is administered in the hospital or infusion center for close monitoring and to establish home maintenance administration after repletion dosing. Iron dextran is preferred as it can be added directly into the PN and run for the duration of the cycle. Addition to the PN eliminates an extra infusion and decreases additional CVC access. Iron dextran is incompatible with IV lipids, so the patient must have one lipid-free day weekly to administer it. If the patient receives daily IV lipids, iron sucrose is given as a separate infusion from the PN. Maintenance IV iron dosing is 1 mg/kg/week, with dose and frequency titrated based on clinical status, lab studies, and trends. An iron panel and C-reactive protein (CRP) are ordered every 2 months. If lab studies are below the desired range and consistent with IDA, the IV iron dose is increased by 50% in dose or frequency; if studies are above the desired range, the IV iron dose is decreased by 50% in dose or frequency. The maximum home dose is < 3 mg/kg/dose; if a higher dose is needed, the patient is referred to an infusion center. IV iron is suspended if ferritin is >500 ng/mL due to risk for iron overload and deposition in the liver. Ferritin results (n = 4165) for all patients in the HPN program from January 2019 to April 2024 were reviewed for levels >500 ng/mL indicating iron overload. Twenty-nine instances of ferritin >500 ng/mL (0.7% of values reviewed) were identified in 14 unique patients on maintenance IV iron.
In 9 instances, the high ferritin level occurred with concomitant acute illness with an elevated CRP; elevated ferritin in these cases was thought to be related to an inflammatory state vs. iron overload. In 2 instances, IV iron dose was given the day before lab draw, rendering a falsely elevated result. Two patients had 12 instances (0.28% of values reviewed) of elevated ferritin thought to be related to IV iron dosing in the absence of inflammation, with normal CRP levels. During this period, there were no recorded adverse events.
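The titration rules described above (increase or decrease by 50%, suspend for ferritin >500 ng/mL, refer out if the dose reaches 3 mg/kg) can be sketched as a simple decision function. This is an illustrative simplification, not the program's actual algorithm; the function name and inputs are hypothetical, and in practice either dose or frequency may be adjusted under clinical judgment.

```python
def next_iv_iron_dose(current_dose_mg_kg, ferritin_ng_ml, iron_status):
    """Sketch of the maintenance IV iron titration rules (hypothetical
    simplification of the algorithm described in the abstract).

    iron_status: "low" (consistent with IDA), "in_range", or "high".
    Returns (new dose in mg/kg, disposition).
    """
    if ferritin_ng_ml > 500:
        return 0.0, "suspend"                  # iron-overload risk
    if iron_status == "low":
        new_dose = current_dose_mg_kg * 1.5    # increase by 50%
    elif iron_status == "high":
        new_dose = current_dose_mg_kg * 0.5    # decrease by 50%
    else:
        new_dose = current_dose_mg_kg          # in range: no change
    if new_dose >= 3:                          # home max is < 3 mg/kg/dose
        return new_dose, "refer to infusion center"
    return new_dose, "home"

print(next_iv_iron_dose(1.0, 300, "low"))  # (1.5, 'home')
```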
Conclusion: IDA is common in patients with IF dependent on PN. Iron is not a standard component or additive in PN. Use of IV iron in this population can increase quality of life by decreasing need for admissions, visits to infusion centers, or need for blood transfusions in cases of severe anemia. IV iron can be safely used for maintenance therapy in HPN patients with appropriate dosing and monitoring.
Figure 1. Algorithm for Intravenous Iron in the Home Parenteral Nutrition-Dependent Patient.
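The titration rules described above (50% dose/frequency adjustments, the < 3 mg/kg home dose ceiling, and the ferritin suspension threshold) can be sketched as a simple decision function. This is a hypothetical illustration of the stated rules, not the program's actual software; the function and parameter names are assumptions.

```python
# Hypothetical sketch of the maintenance IV iron titration rules described in
# the Results. Thresholds come from the abstract; names and the boolean lab
# inputs are illustrative assumptions.

FERRITIN_SUSPEND_NG_ML = 500   # suspend IV iron above this ferritin level
MAX_HOME_DOSE_MG_KG = 3.0      # home dose must stay below 3 mg/kg/dose

def adjust_weekly_iron(dose_mg_kg, ferritin_ng_ml, labs_below_range, labs_above_range):
    """Return (new_dose_mg_kg, action) for one bimonthly titration step."""
    if ferritin_ng_ml > FERRITIN_SUSPEND_NG_ML:
        return 0.0, "suspend: risk of iron overload"
    if labs_below_range:                 # iron studies consistent with IDA
        new_dose = dose_mg_kg * 1.5      # increase dose (or frequency) by 50%
    elif labs_above_range:
        new_dose = dose_mg_kg * 0.5      # decrease dose (or frequency) by 50%
    else:
        return dose_mg_kg, "continue current dose"
    if new_dose >= MAX_HOME_DOSE_MG_KG:
        return new_dose, "refer to infusion center (exceeds home maximum)"
    return new_dose, "continue at home"

# Example: maintenance 1 mg/kg/week with labs still consistent with IDA
dose, action = adjust_weekly_iron(1.0, ferritin_ng_ml=150,
                                  labs_below_range=True, labs_above_range=False)
```

In this example the dose increases to 1.5 mg/kg/week and stays within the home ceiling; a ferritin of, say, 600 ng/mL would instead trigger suspension regardless of the iron studies.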
1Amerita Specialty Infusion Services, Thornton, CO; 2Amerita Specialty Infusion Services, Rochester Hills, MI
Financial Support: None Reported.
Background: Desmoplastic small round tumor (DSRT) is a soft-tissue sarcoma that forms tumors in the abdomen and pelvis. To improve disease control, cytoreductive surgery with hyperthermic intraperitoneal chemotherapy (CRS/HIPEC) and radiotherapy are often used, which can result in bowel obstruction secondary to sclerosing peritonitis. This necessitates total parenteral nutrition therapy (TPN) due to the inability to consume nutrition orally or enterally. A major complication of parenteral nutrition therapy is parenteral nutrition-associated liver disease (PNALD), and the liver is the most common site of metastasis for DSRT. This case report details the substitution of an olive and soybean oil-based intravenous lipid emulsion (OO, SO-ILE) for a soybean, MCT, olive, and fish oil-based intravenous lipid emulsion (SO, MCT, OO, FO-ILE) to treat elevated liver function tests (LFTs).
Methods: A 28-year-old male with DSRT metastatic to the peritoneum and a large hepatic mass, complicated by encapsulating peritonitis and an enterocutaneous fistula (ECF) following CRS/HIPEC, presented to the parenteral nutrition program at Amerita Specialty Infusion Services in 2022. TPN was administered from January 2022 to March 2023, stopped, and restarted in December 2023 following a biliary obstruction. TPN was initiated and advanced using 1.3 g/kg/day of SMOFlipid (SO, MCT, OO, FO-ILE), known to help mitigate PNALD. The patient developed rising LFTs, with alanine aminotransferase (ALT) peaking at 445 U/L, aspartate aminotransferase (AST) at 606 U/L, and alkaline phosphatase (ALP) at 1265 U/L. Despite transitioning the patient to a cyclic regimen and maximizing calories from dextrose and amino acids, liver function continued to worsen. A switch to Clinolipid (OO, SO-ILE) at 1.3 g/kg/day was tried.
Results: Following the initiation of OO, SO-ILE, LFTs improved within 12 days, with ALT falling to 263 U/L, AST to 278 U/L, and ALP to 913 U/L. These values continued to improve until the end of therapy in June 2024, with final values of ALT 224 U/L, AST 138 U/L, and ALP 220 U/L (see Figure 1). No significant improvement in total bilirubin was found. The patient tolerated the switch in lipid emulsions and increased his weight from 50 kg to 53.6 kg.
Conclusion: SO, MCT, OO, FO-ILE is well supported for preventing and alleviating the adverse effects of PNALD; however, the impact of lipid emulsions on other forms of liver disease needs further research. Our case suggests that the elevated LFTs were likely cancer induced rather than associated with prolonged use of parenteral nutrition. A higher olive oil concentration may have beneficial effects on LFT elevations that are not associated with PNALD. It is also worth noting that soybean oil has been shown in previous research to have a negative impact on liver function, and the soybean oil concentration in SO, MCT, OO, FO-ILE (30%) is higher than in OO, SO-ILE (20%). This may warrant further investigation into the impact of specific soybean oil concentrations on liver function. LFTs should be assessed and treated on a case-by-case basis that evaluates disease mechanisms, medication-drug interactions, parenteral nutrition composition, and patient subjective information.
Figure 1. OO, SO-ILE Impact on LFTs.
Shaurya Mehta, BS1; Ajay Jain, MD, DNB, MHA1; Kento Kurashima, MD, PhD1; Chandrashekhara Manithody, PhD1; Arun Verma, MD1; Marzena Swiderska-Syn1; Shin Miyata, MD1; Mustafa Nazzal, MD1; Miguel Guzman, MD1; Sherri Besmer, MD1; Matthew Mchale, MD1; Jordyn Wray1; Chelsea Hutchinson, MD1; John Long, DVM1
1Saint Louis University, St. Louis, MO
Encore Poster
Presentation: North American Society for Pediatric Gastroenterology, Hepatology and Nutrition (NASPGHAN) Florida.
Financial Support: None Reported.
Background: Short bowel syndrome (SBS) is a devastating condition. In the absence of enteral nutrition (EN), patients are dependent on total parenteral nutrition (TPN) and suffer from intestinal failure-associated liver disease and gut atrophy. Intestinal adaptation (IA) and enteral autonomy (EA) remain the clinical goals. We hypothesized that EA can be achieved using our DREAM system (US patent 63/413,988), which allows EN via the stomach along with a mechanism for cyclical recirculation of nutrient-rich distal intestinal content into the proximal bowel, enabling full EN despite SBS.
Methods: Twenty-four neonatal pigs were randomly allocated to enteral nutrition (EN; n = 8), TPN-SBS (TPN only; n = 8), or DREAM (n = 8). Liver, gut, and serum were collected for histology and serum biochemistry. Statistical analysis was performed using GraphPad Prism 10.1.2 (324). All tests were two-tailed with a significance level of 0.05.
Results: TPN-SBS piglets had significant cholestasis vs DREAM (p = 0.001), with no statistical difference between DREAM and EN (p = 0.14). DREAM transitioned to full EN by day 4. Mean serum conjugated bilirubin was 0.037 mg/dL for EN, 1.2 mg/dL for TPN-SBS, and 0.05 mg/dL for DREAM. Serum bile acids were significantly elevated in TPN-SBS vs EN (p = 0.007) and DREAM (p = 0.03). Mean GGT, a marker of cholangiocyte injury, was significantly higher in TPN-SBS vs EN (p < 0.001) and DREAM (p < 0.001), with values of 21.2 U/L for EN, 47.9 U/L for TPN-SBS, and 22.5 U/L for DREAM (p = 0.89, DREAM vs EN). To evaluate gut growth, we measured lineal gut mass (LGM), calculated as the weight of the bowel per centimeter. DREAM showed significant IA and prevention of gut atrophy. Mean proximal gut LGM was 0.21 g/cm for EN, 0.11 g/cm for TPN-SBS, and 0.31 g/cm for DREAM (p = 0.004, TPN-SBS vs DREAM). Distal gut LGM was 0.34 g/cm for EN, 0.13 g/cm for TPN-SBS, and 0.43 g/cm for DREAM (p = 0.006, TPN-SBS vs DREAM). IHC revealed that DREAM had hepatic CK-7 (a bile duct epithelium marker; p = 0.18) and hepatic Cyp7A1 (p = 0.3) similar to EN. No statistical differences were noted in LGR5-positive intestinal stem cells between EN and DREAM (p = 0.18). DREAM prevented changes in hepatic Cyp7A1, BSEP, FGFR4, SHP, and SREBP-1 and gut FXR, TGR5, and EGF relative to the TPN-SBS group.
Conclusion: DREAM resulted in a significant reduction of hepatic cholestasis, prevented gut atrophy, and presents a novel method enabling early full EN despite SBS. By driving IA and enteral autonomy, this system represents a major advancement in SBS management, bringing a paradigm change to life-saving strategies for SBS patients.
Silvia Figueiroa, MS, RD, CNSC1; Paula Delmerico, MS, RD, CNSC2
1MedStar Washington Hospital Center, Bethesda, MD; 2MedStar Washington Hospital Center, Arlington, VA
Financial Support: None Reported.
Background: Parenteral nutrition (PN) therapy is a vital clinical intervention for patients of all ages and across care settings. The complexity of PN has the potential to cause significant patient harm, especially when errors occur. According to the Institute for Safe Medication Practices (ISMP), PN is classified as a high-alert medication, and safety-focused strategies should be formulated to minimize errors and harm. Processing PN is multifactorial and includes prescribing, order review and verification, compounding, labeling, and administration. PN prescription should consider clinical appropriateness and formulation safety. The PN formulation must be written to provide appropriate amounts of macronutrients and micronutrients based on the patient's clinical condition, laboratory parameters, and nutrition status. The PN admixture should not exceed the total amounts, recommended concentrations, or rate of infusion of these nutrients, as exceeding them could result in toxicities and formulation incompatibility or instability. The ASPEN Parenteral Nutrition Safety Consensus Recommendations advise that PN be prescribed using standardized electronic orders via a computerized provider order entry (CPOE) system, as handwritten orders have potential for error. Data suggest up to 40% of PN-related errors occur during the prescription and transcription steps. Registered pharmacists (RPh) are tasked with reviewing PN orders for accuracy and consistency with recommendations made by the nutrition support team. These RPh adjustments ensure formula stability and nutrient optimization. This quality improvement project compares the frequency of RPh PN adjustments following the initial provider order after transition from a paper to a CPOE ordering system. Our hypothesis is that CPOE reduces the need for PN adjustments by pharmacists during processing, which increases clinical effectiveness and maximizes resource efficiency.
Methods: This was a retrospective evaluation of PN ordering practices at a large academic medical center after shifting from paper to electronic orders. PN orders were collected during three-week periods in December 2022 (paper) and December 2023 (CPOE) and analyzed for the frequency of order adjustments by the RPh. Adjustments were classified into intravascular access, infusion rate, macronutrient, electrolyte, multivitamin (MVI) and trace element (TE), and medication categories. The total number of adjustments made by the RPh during final PN processing was collected. These adjustments were made per nutrition support team recommendations.
Results: Daily PN orders for 106 patients – totaling 694 orders – were reviewed for provider order accuracy at the time of fax (paper) and electronic (CPOE) submission. Order corrections made by the RPh decreased by 96% for infusion rate, 91.5% for macronutrients, 79.6% for electrolytes, 81.4% for MVI and TE, and 50% for medication additives (Table 1).
Conclusion: Transitioning to CPOE led to a reduction in the need for PN order adjustments at the time of processing. One reason for this decline is improved provider understanding of PN recommendations. With CPOE, the registered dietitian's formula recommendation is viewable within the order template and can be referenced at the time of ordering. The components of the active PN infusion also automatically populate upon ordering a subsequent bag. This information can aid the provider when calculating insulin needs or repleting electrolytes outside the PN, increasing clinical effectiveness. A byproduct of this process change is improved efficiency, as CPOE requires less time for provider prescription and for RPh processing and verification.
Table 1. RPh Order Adjustments Required During Collection Period.
1Riley Hospital for Children at Indiana University Health, Indianapolis, IN; 2Lurie Children's Hospital, Chicago, IL; 3Riley Hospital for Children at IU Health, Indianapolis, IN
Financial Support: None Reported.
Background: Parenteral nutrition (PN) is ordered daily at Riley Hospital for Children at IU Health by different specialties. Our hospital is a stand-alone pediatric hospital that includes labor, delivery, and high-risk maternal care. Historically, PN orders were due by early afternoon, with a hard cut-off by the end of the day shift, for timely central compounding at a nearby adult hospital. Due to the relocation of staff and equipment to a new sterile compounding facility, a hard deadline was created with an earlier cutoff time and contingency plans for orders received after the deadline. This updated process was created to allow for timely delivery to Riley, and subsequently to the patients, to meet the standard PN hang time of 2100. The Nutrition Support Team (NST) and Pharmacy and Therapeutics (P&T) Committee approved an updated PN order process as follows:
- Enforce a hard PN deadline of 1200 for new and current PN orders
- If a PN order is not received by 1200, renew the active PN order for the next 24 hours
- If the active PN order is not appropriate for the next 24 hours, providers will need to order IVF in place of PN until the following day
- Enter PN orders into the PN order software by 1500
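The approved contingency rules form a simple decision sequence. The sketch below is purely illustrative of that logic (the actual process is a manual pharmacy workflow, not software); the function and argument names are assumptions.

```python
# Minimal sketch of the 1200-deadline contingency logic approved by the NST
# and P&T Committee. Names are illustrative assumptions.

from datetime import time

PN_DEADLINE = time(12, 0)   # hard 1200 deadline for new/current PN orders

def resolve_pn_order(order_received, received_at, active_order_appropriate):
    """Decide what is dispensed for tonight's hang under the deadline policy."""
    if order_received and received_at <= PN_DEADLINE:
        return "process new PN order"
    if active_order_appropriate:
        return "renew active PN order for 24 hours"
    return "order IVF in place of PN until the following day"

# Order missed the deadline but yesterday's PN is still appropriate:
print(resolve_pn_order(False, time(13, 30), active_order_appropriate=True))
# prints "renew active PN order for 24 hours"
```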
Methods: A quality improvement (QI) process check was performed 3 months after initiation of the updated PN order process. Data collection was performed for 1 month with the following data points: total PN orders, missing PN orders at 1200, PN orders re-ordered per P&T policy after the 1200 deadline, lab review, input and output, subsequent order changes within 24 hours after renewal of an active PN order, waste of PN, and service responsible for the late PN order.
Results:
Conclusion: The number of late PN orders after the hard deadline was < 5%, and there was a minimal number of renewed active PN orders, reflecting the pharmacists' concern for ensuring patient safety. No clinically significant changes resulted from renewal of active PN orders, so the process was considered safe despite small numbers. The changes made to late PN orders were minor or related to planned discontinuation of PN. After review of the results by the NST and pharmacy administration, the following actions were decided:
- Review data and process with pharmacy staff to assist with workflow and education
- Create a succinct Riley TPN process document for providers, specifically services with late orders, reviewing the PN order entry hard deadline and the need to discontinue PN orders by the deadline, to assist with pharmacy staff workflow and avoid potential PN waste
- Repeat the QI analysis in 6-12 months
1King Faisal Specialist Hospital, Jeddah, Makkah; 2Wayne State University, Jeddah, Makkah; 3Umm al Qura University, Jeddah, Makkah; 4King Abdulaziz University Hospital, Jeddah, Makkah; 5King Faisal Specialist Hospital, Jeddah, Makkah
Financial Support: None Reported.
Background: Parenteral nutrition (PN) is a critical therapy for patients unable to meet their nutritional needs through the gastrointestinal tract. While it offers a life-saving solution, it also carries the risk of central line-associated bloodstream infections (CLABSIs). However, there is a lack of comprehensive studies examining the risk factors for CLABSIs in a more heterogeneous cohort of PN recipients. This study aims to identify the risk factors associated with CLABSIs in patients receiving PN therapy in Saudi Arabia.
Methods: This retrospective cohort multicenter study was conducted in three large tertiary referral centers in Saudi Arabia. The study included all hospitalized patients who received PN therapy through central lines between 2018 and 2022. The purpose of the study was to investigate the association between parenteral nutrition (PN) and central line-associated bloodstream infections (CLABSIs), using both univariate and multivariate analysis.
Results: Of 662 hospitalized patients who received PN through central lines, 123 (18.6%) developed CLABSI. The duration of parenteral nutrition was an independent risk factor for CLABSI development (OR, 1.012; 95% CI, 0.9-1.02). Among patients receiving PN, the incidence of CLABSI did not change significantly over the course of the study years.
Conclusion: The length of PN therapy is still an important risk factor for CLABSIs; more research is required to determine the best ways to reduce the incidence of CLABSI in patients on PN.
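To make the per-day odds ratio concrete, the cumulative effect over a longer PN course can be computed by compounding, assuming the OR of 1.012 applies multiplicatively per additional day of therapy (an illustrative assumption; note that the reported 95% CI crosses 1, so the estimate is imprecise).

```python
# Back-of-envelope illustration: compounding a per-day odds ratio of 1.012
# over a PN course. The multiplicative-per-day assumption is ours, for
# illustration only.

OR_PER_DAY = 1.012

def cumulative_or(days):
    """Cumulative odds ratio for `days` additional days of PN."""
    return OR_PER_DAY ** days

print(round(cumulative_or(30), 2))  # → 1.43, i.e. ~43% higher odds after 30 days
```

Under this reading, a patient on PN for a month would have roughly 1.4-fold the odds of CLABSI relative to baseline, which is consistent with the conclusion that duration of therapy remains clinically important.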
Table 1. Characteristics of Hospitalized Patients Who Received PN.
1 n (%); Median (IQR). BMI, body mass index.
Table 2. The Characteristics of Individuals With and Without CLABSI Who Received PN.
1 n (%); Median (IQR). 2 Fisher's exact test; Pearson's chi-squared test; Mann-Whitney U test.
PN, parenteral nutrition; CLABSI, central line-associated bloodstream infection.
Figure 1. Percentage of Patients With a Central Line Receiving PN Who Experienced CLABSI.
1Emory University Hospital - Nutrition Support Team, Lawrenceville, GA; 2Emory Healthcare, Atlanta, GA
Financial Support: None Reported.
Background: Intravenous lipid emulsion (ILE) is an essential component in parenteral nutrition (PN)-dependent patients because it provides calories and essential fatty acids; however, the use of soybean oil-based ILE (SO-ILE) may contribute to the development of PN-associated liver disease (PNALD) in patients with intestinal failure who require chronic PN. To mitigate this risk, new formulations of ILE, such as a mixture of SO, medium chain triglycerides (MCT), olive oil (OO), and fish oil-based ILE (SO/MCT/OO/FO-ILE), or pure fish oil-based ILE (FO-ILE) are now available in the US. FO-ILE is only approved for pediatric use for PN-associated cholestasis. This patient case highlights the benefits of using a combination of FO-containing ILEs to improve PNALD.
Methods: A 65-year-old female with symptomatic achalasia required robotic-assisted esophagomyotomy with fundoplasty in February 2020. Her postoperative course was complicated by bowel injury that required multiple small bowel resections and total colectomy with end jejunostomy, resulting in short bowel syndrome with 80 centimeters of residual small bowel. Postoperatively, daily PN containing SO-ILE was initiated along with tube feedings (TF) during hospitalization, and she was discharged home with PN and TF. Her surgeon referred her to the Emory University Hospital (EUH) Nutrition Support Team (NST) for management. She received daily cyclic PN infused over 12 hours, providing 25.4 kcal/kg/day with SO-ILE 70 grams (1.2 g/kg) three times weekly and standard TF 1-2 containers/day. In September 2020, she complained of persistent jaundice and was admitted to EUH. She presented with scleral icterus, hyperbilirubinemia, and elevated liver function tests (LFTs). The EUH NST optimized PN to provide SO/MCT/OO/FO-ILE (0.8 g/kg/day), which improved blood LFTs, and the dose was then increased to 1 g/kg/day. In the subsequent four months, her LFTs worsened despite optimizing pharmacotherapy, continuing cyclic TF, and reducing and then discontinuing ILE. She required multiple readmissions to EUH and underwent two liver biopsies that confirmed a diagnosis of PN-induced hepatic fibrosis after 15 months of PN. Her serum total bilirubin level peaked at 18.5 mg/dL, which led to an intensive care unit admission and required molecular adsorbent recirculating system therapy. In March 2022, the NST exhausted all options and incorporated FO-ILE (0.84 g/kg/day) three times weekly (as a separate infusion) and SO/MCT/OO/FO-ILE (1 g/kg/day) weekly.
Results: The patient's LFTs are shown in Figure 1. The blood level of aspartate aminotransferase improved from 123 to 60 units/L, and alanine aminotransferase decreased from 84 to 51 units/L, after 2 months, and both returned to normal after 4 months of the two ILEs. Similarly, total bilirubin decreased from 5.6 to 2.2 and 1.1 mg/dL by 2 and 6 months, respectively. Both total bilirubin and transaminase levels remained stable. Although her alkaline phosphatase continued to fluctuate and remained elevated over the last two years, this marker decreased from 268 to 104 units/L. All other PNALD-related symptoms resolved.
Conclusion: This case demonstrates that the combination of FO-containing ILEs significantly improved and stabilized LFTs in an adult with PNALD. Additional research is needed to investigate the effect of FO-ILE in adult PN patients to mitigate PNALD.
SO: soybean oil; MCT: medium-chain triglyceride; OO: olive oil; FO: fish oil; AST: aspartate aminotransferase; ALT: alanine aminotransferase.
Figure 1. Progression of Liver Enzymes Status in Relation to Lipid Injectable Emulsions.
Narisorn Lakananurak, MD1; Leah Gramlich, MD2
1Department of Medicine, Faculty of Medicine, Chulalongkorn University, Bangkok, Krung Thep; 2Division of Gastroenterology, Faculty of Medicine and Dentistry, University of Alberta, Edmonton, AB
Financial Support: This research study received a grant from Baxter, Canada.
Background: Pre-operative parenteral nutrition (PN) has been shown to enhance outcomes in malnourished surgical patients. Traditionally, pre-operative PN necessitates hospital admission, which leads to increased length of stay (LOS) and higher hospital costs. Furthermore, inpatient pre-operative PN may not be feasible or prioritized when access to hospital beds is restricted. Outpatient PN presents a potential solution to this issue. To date, the feasibility and impact of outpatient PN for surgical patients have not been investigated. This study aims to assess the outcomes and feasibility of outpatient pre-operative PN in malnourished surgical patients.
Methods: Patients scheduled for major surgery who were identified as at risk of malnutrition using the Canadian Nutrition Screening Tool and classified as malnourished by Subjective Global Assessment (SGA) B or C were enrolled. Exclusion criteria included severe systemic diseases as defined by the American Society of Anesthesiologists (ASA) classification III to V, insulin-dependent diabetes mellitus, and extreme underweight (less than 40 kg). Eligible patients received a peripherally inserted central catheter (PICC) line and outpatient PN (Olimel 7.6%E 1,000 ml) for 5-10 days using the maximum infusion days possible prior to surgery at an infusion clinic. PN was administered via an infusion pump over 4-5 hours by infusion clinic nurses. The outcomes and feasibility of outpatient PN were assessed. Safety outcomes, including refeeding syndrome, dysglycemia, volume overload, and catheter-related complications, were monitored.
Results: Outpatient PN was administered to eight patients (4 males, 4 females). Pancreatic cancer and the Whipple procedure were the most common diagnosis and operation, each accounting for 37.5% of cases (Table 1). The mean (SD) duration of PN was 5.5 (1.2) days (range 5-8 days). Outpatient PN was completed in 75% of patients, with an 88% completion rate of PN days (44/50 days). After the PN infusion course, mean body weight and body mass index had increased by 4.6 kg and 2.1 kg/m², respectively. The mean PG-SGA score improved by 4.9 points, and mean handgrip strength increased from 20 kg to 25.2 kg. Quality of life, as measured by SF-12, improved in both the physical and mental health domains (by 7.3 and 3.8 points, respectively). Patient-reported feasibility scores were high across all aspects (acceptability, appropriateness, and feasibility), with a total score of 55.7/60 (92.8%). Infusion clinic nurses (n = 3) also reported high total feasibility scores (52.7/60, 87.8%) (Table 2). No complications were observed in any of the patients.
Conclusion: Outpatient pre-operative PN was a feasible approach that was associated with improved outcomes in malnourished surgical patients. This novel approach has the potential to enhance outcomes and decrease the necessity for hospital admission in malnourished surgical patients. Future studies involving larger populations are needed to evaluate the efficacy of outpatient PN.
Table 1. Baseline Characteristics of the Participants (n = 8).
Table 2. Outcomes and Feasibility of Outpatient Preoperative Parenteral Nutrition (n = 8).
Adrianna Wierzbicka, MD1; Rosmary Carballo Araque, RD1; Andrew Ukleja, MD1
1Cleveland Clinic Florida, Weston, FL
Financial Support: None Reported.
Background: Gastroparesis (GP) is a chronic motility disorder marked by delayed gastric emptying and associated with symptoms such as nausea, vomiting, and abdominal pain. Treatment consists of diet modifications and medications, with nutritional support tailored to disease severity. Severe refractory cases may require enteral or parenteral nutrition (PN). However, the role of home parenteral nutrition (HPN) in managing GP is underexplored. This study aims to enhance nutrition therapy practice by examining the utilization of HPN in the GP population, addressing a significant gap in current nutrition support strategies.
Methods: We conducted a retrospective, single-center analysis of patients receiving HPN from August 2022 to August 2024. Data were obtained through a review of electronic medical records as part of a quality improvement monitoring process. Patients' demographics, etiology of GP, indications for HPN, types of central access, duration of therapy, and PN-related complications were analyzed using descriptive statistics. Inclusion criteria were: adults (>18 years), GP diagnosis by gastric scintigraphy, and HPN for a minimum of 2 consecutive months. Among 141 identified HPN patients, 10 had GP as the indication for PN.
Results: GP patients constituted 7% (10/141) of our HPN population. In this cohort of 10 GP patients receiving HPN, the demographic profile was predominantly female (80%), with a mean age of 42.6 years; all individuals identified as Caucasian. All patients had idiopathic GP; severe gastric emptying delay was found in 80% of cases, and all experienced predominant symptoms of nausea/vomiting. Central access types were: 50% PICC lines, 30% Hickman catheters, 10% PowerLines, and 10% mediports. The mean weight change with PN therapy was an increase of 21.9 lbs. Eighty percent of patients experienced infection-related complications, including bacteremia (methicillin-sensitive Staphylococcus aureus (MSSA), methicillin-resistant Staphylococcus aureus (MRSA), Pseudomonas) and fungemia. Deep vein thrombosis (DVT) was identified in 20% of patients, alongside one case of a cardiac thrombus. Tube feeding trials were attempted in 70% of cases, but 50% ultimately discontinued due to intolerance, such as abdominal pain, or complications such as buried bumper syndrome. Chronic pain management was used in 60% of patients, with 40% on opioid therapy (morphine, fentanyl). PN was discontinued in 50% of patients due to recurrent infections (20%), advancement to tube feeding (20%), questionable compliance (20%), or improvement in oral intake (40%).
Conclusion: This retrospective analysis underscores the potential of HPN as a nutritional strategy for GP, particularly in patients with refractory symptoms and severe delay in gastric emptying who previously failed EN or experienced complications related to the enteral access. In addition to the observed mean weight gain, HPN seems to play a crucial role in alleviating debilitating symptoms such as nausea, vomiting, and abdominal pain, thereby improving patients' overall quality of life. Nonetheless, the prevalence of infection-related complications and the requirement for chronic pain management underscore the challenges associated with GP treatment. The variability in patient responses to different nutritional strategies emphasizes the importance of individualized care plans. These findings advocate for further research to optimize HPN protocols and improve comprehensive management strategies in GP.
1Department of General Surgery, Jinling Hospital, Medical School of Nanjing University, Nanjing, Jiangsu; 2Department of Digestive Disease Research Center, Gastrointestinal Surgery, The First People's Hospital of Foshan, Guangdong, Foshan; 3Department of General Surgery, Jinling Hospital, Medical School of Nanjing University, Nanjing, Jiangsu; 4Department of General Surgery, Jinling Hospital, Medical School of Nanjing University, Nanjing, Jiangsu
Financial Support: National Natural Science Foundation of China, 82170575 and 82370900.
Background: Total parenteral nutrition (TPN) induced gut microbiota dysbiosis is closely linked to intestinal barrier damage, but the mechanism remains unclear.
Methods: Through the application of 16S rRNA gene sequencing and metagenomic analysis, we examined alterations in the gut microbiota of patients with chronic intestinal failure (CIF) and TPN mouse models subjected to parenteral nutrition, subsequently validating these observations in an independent verification cohort. Additionally, we conducted a comprehensive analysis of key metabolites utilizing liquid chromatography-mass spectrometry (LC-MS). Moreover, we explored modifications in essential innate-like lymphoid cell populations through RNA sequencing (RNA-seq), flow cytometry, and single-cell RNA sequencing (scRNA-seq).
Results: The gut barrier damage associated with TPN is driven by a decreased abundance of Lactobacillus murinus. L. murinus mitigates TPN-induced intestinal barrier damage through the metabolism of tryptophan into indole-3-carboxylic acid (ICA). Furthermore, ICA stimulates group 3 innate lymphoid cells (ILC3) to secrete interleukin-22 by targeting the nuclear receptor Rorc, enhancing intestinal barrier protection.
Conclusion: We elucidate the mechanisms driving TPN-associated intestinal barrier damage and indicate that interventions with L. murinus or ICA could effectively ameliorate TPN-induced gut barrier injury.
Figure 1. TPN Induces Intestinal Barrier Damage in Humans and Mice. (a) Rates of fever and ICU admission in Cohort 1. (b-d) Serum levels of IFABP, CRP, and LPS in patients with CIF. (e) Representative intestinal H&E staining and (f) injury scores (n = 10 mice per group). (g) Intestinal electrical resistance in mice measured by Ussing chamber (n = 5 mice per group). (h) Immunofluorescence of mouse intestine and liver. (i) Western blot results for the Chow and TPN groups.
Figure 2. TPN Induces Gut Dysbiosis in Humans and Mice. (a) PCoA of 16S rRNA from fecal content in Cohort 1 (n = 16 individuals per group). (b) Significantly different abundances identified using linear discriminant analysis (LDA). (c) Top 10 abundant genera. (d) PCoA of relative genus- or species-level abundances (n = 5 mice per group). (e) LDA for mice. (f) Sankey diagram showing the top 10 abundant genera in humans and mice. (g) Heatmap of the correlation between species abundance in the intestinal microbiota and clinical characteristics of patients with CIF.
Figure 3. Metabolically Active L. murinus Ameliorates Intestinal Barrier Damage. (a) RT-PCR quantification of L. murinus abundance in feces from L-PN and H-PN patients (Cohorts 1 and 2). (b) Representative intestinal H&E staining and (c) injury scores (n = 10 mice per group). (d) Intestinal electrical resistance in mice measured by Ussing chamber (n = 5 mice per group). (e) Western blot results. (f) 3D-PCA and (g) volcano plot analyses between the Chow and TPN groups. (h) Metabolome-wide pathway enrichment based on metabolomics of fecal content from the Chow and TPN groups (n = 5 mice per group). (i) Heatmap of the correlation between species-level intestinal microbiota abundance and tryptophan metabolites in the Chow and TPN groups (n = 5 mice per group). (j) VIP scores from the 3D-PCA; a taxon with a variable importance in projection (VIP) score >1.5 was deemed significant for discrimination.
Figure 4. ICA Is Critical for the Effects of L. murinus. (a) Fecal ICA levels from TPN mice treated with PBS control or ICA (n = 10 mice per group). (b) Representative intestinal H&E staining and (c) injury scores (n = 10 mice per group). (d) Intestinal electrical resistance in mice measured by Ussing chamber (n = 5 mice per group). (e) Western blot results. (f) Metabolic pathway illustrating the production of ICA from tryptophan by L. murinus. (g) PLS-DA of fecal metabolite profiles from TPN mice receiving ΔArAT or live L. murinus (n = 5 mice per group). (h) Heatmap of tryptophan-targeted metabolomics in fecal samples from TPN mice that received either ΔArAT (n = 5) or live L. murinus (n = 5). (i) Representative intestinal H&E staining and (j) injury scores (n = 10 mice per group). (k) Western blot results.
Background: Although a patent foramen ovale (PFO) is generally asymptomatic and causes no health concerns, it can be a risk factor for embolism and stroke. Because of this theoretical risk, some institutions have established protocols requiring most IV solutions to be administered through a small-micron filter in patients with a PFO. While 2-in-1 dextrose and amino acid solutions can be filtered through a 0.22-micron filter with relative ease, injectable lipid emulsions (ILEs), whether given alone or as part of a total admixture, consist of larger particles that require a 1.2-micron or larger filter. This filter requirement can preclude the administration of ILE, an essential source of calories, in patients with a PFO. It is unknown whether patients who do receive ILE have an increased incidence of lipid embolism and stroke.
Methods: A single-center retrospective review of patients on central parenteral nutrition (CPN) was completed. Demographics and baseline clinical characteristics, including co-morbidities and history of CVA, were collected. The outcome of interest was defined as an ischemic cerebrovascular accident (CVA) within 30 days of CPN and, therefore, potentially attributable to it. Other cardiovascular and thromboembolic events were captured. All patients with a PFO diagnosis and inpatient CPN administration between January 1, 2018, and December 18, 2023, at our quaternary care referral center were included as the case cohort. A 3:1 control group matched on age, gender, duration of inpatient CPN, and clinical co-morbidities was identified and used to examine the difference in the outcome of interest.
Results: Patients with PFO who received CPN (n = 38, 53.8% female) had a mean age of 63.5 ± 13.1 years and a mean BMI of 31.1 ± 12.1 at CPN initiation (Table 1). The PFO varied in size, with the largest proportion (38.5%) classified as very small/trivial (Table 2). All patients in this cohort had appropriately sized filters placed for CPN and ILE administration. CPN prescription and duration were comparable between both groups. The majority of patients with PFO (53.8%) received mixed oil ILE, followed by soy-olive oil ILE (23.1%), whereas the majority of patients without PFO (51.8%) received soy-olive oil ILE and 42.9% received mixed oil ILE (Table 3). Case and control groups had a comparable prevalence of cardiovascular risks, including obesity, hypertension, diabetes, and dyslipidemia. However, more patients had a history of vascular cardiac events and atrial fibrillation in the PFO group, and more patients were smokers in the non-PFO group (Table 4). Patients with PFO received PN for a median of 7 days (IQR: 5, 13), and 32 (84.2%) received ILE. Patients without PFO who received CPN (n = 114, 52.6% female) had a mean age of 64.9 ± 10 years and a mean BMI of 29.6 ± 8.5 at CPN initiation. Patients in this cohort received PN for a median of 7 days (IQR: 5, 13.5), and 113 (99.1%) received ILE. There was no difference in the incidence of ischemic CVA within 30 days of receiving CPN between both groups (2 (5.3%) in the PFO group vs. 1 (0.8%) in the non-PFO group; p = 0.092) (Table 4).
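As a consistency check on the reported comparison (2 of 38 vs. 1 of 114 ischemic CVAs), a Pearson chi-square test on the 2×2 contingency table yields a p-value of approximately 0.092. The abstract does not name the specific test used, so the choice of chi-square here is our assumption; a stdlib-only Python sketch:

```python
import math

# 2x2 table from the Results: rows = PFO / non-PFO, cols = CVA / no CVA
obs = [[2, 36], [1, 113]]

row = [sum(r) for r in obs]                            # 38, 114
col = [obs[0][0] + obs[1][0], obs[0][1] + obs[1][1]]   # 3, 149
n = sum(row)                                           # 152

# Pearson chi-square statistic (no continuity correction)
chi2 = sum((obs[i][j] - row[i] * col[j] / n) ** 2 / (row[i] * col[j] / n)
           for i in range(2) for j in range(2))

# for df = 1, the chi-square survival function is erfc(sqrt(x/2))
p = math.erfc(math.sqrt(chi2 / 2))
print(round(p, 3))  # 0.092, consistent with the reported p-value
```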
Conclusion: The question of the risk of CVA/stroke with CPN in patients with PFO is clinically relevant and often lacks a definitive answer. Our study revealed no difference in ischemic CVA potentially attributable to CPN between patients with and without PFO in a matched control cohort in the first 30 days after administration of PN. This finding suggests that CPN with ILE is safe for patients with PFO in the inpatient setting.
Table 1. Baseline Demographics and Clinical Characteristics.
Background: Patients on home enteral nutrition (HEN), many of whom are mobile, can experience significant hardships and reduced quality of life (QoL) due to limitations on mobility, in addition to the burdens of their underlying disease processes. Improving mobility while feeding could reduce the burdens associated with HEN and potentially improve QoL. This prospective cohort study aims to evaluate participants' perspectives on their mobility, ease of performing physical activities while feeding, and QoL following the use of a novel enteral feeding system (EFS).
Methods: A prospective single-center study was conducted to evaluate a novel EFS, an FDA-cleared elastomeric system (Mobility+®) that consists of a lightweight feeding pouch (a 500-mL feed reservoir), a filling set (used in conjunction with a syringe to fill the EFS), and a feeding set to deliver EN formula to an extension set/feeding tube with an ISO 80369-3 compatible connector. Adult HEN-dependent patients were recruited by invitation to use the study EFS for a minimum of 2 feeds a day for 14 days, preceded by a familiarization period of 5-7 days. Participant perspectives on performing typical daily activities while feeding (e.g., moving, traveling, socializing) and on feeding system parameters (ease of use, portability, noise, discretion, performance) were evaluated using HEN-expert validated questionnaires. Each rating was scored from 1 to 5, with 5 being the most positive response. An overall score was calculated and averaged for the cohort. Participants were followed up during the familiarization period. On days 7 and 14, additional telephone interviews were conducted regarding compliance, enteral feed intake, participant perspectives on the study EFS vs. their current system, and other measures. Patients with reduced functional capacity due to their underlying disease(s) were excluded.
Results: Seventeen participants completed the study (mean age 63.8 ± 12 years; 70.6% male). Participants used various feeding systems, including gravity, bolus method, and pump, with the majority (82.4%) having a G-tube placed (Table 1). Sixteen patients (94.1%) used the study EFS for at least two feeds a day (and the majority of daily EN calories) on all study days (Table 2). Ratings for the ability to perform various activities using the study EFS differed significantly from those for the systems used before the study. An improvement in ratings was noted for the ease of performing common daily activities, including moving between rooms or on stairs, taking short and long walks, traveling by car or public transport, engaging in moderate- to high-intensity activities, sleeping, and socializing with family and friends, between the time point before enrollment and the end of study (day 14) (p < 0.0001) (Table 3). Ratings of feeding system parameters also differed significantly between the systems used before the study and the study EFS (p < 0.0001) (Table 3), with the largest increases in positive ratings noted for ease of carrying, noise level, and ability to feed discreetly. Ratings for overall satisfaction with the performance of the study EFS did not differ from those for the systems used before the study, with participants reporting that the main influencing factors were the length of time and the effort needed to fill the study EFS. No difference was noted in the QoL rating.
Conclusion: The studied EFS is safe and effective as an enteral feeding modality and provides an alternative option for HEN recipients. Participants reported a significant positive impact of the study EFS on their activities of daily living. Although the overall QoL rating remained the same, improvements in mobility, discretion, and ease of carrying (aspects of QoL) were associated with the use of the study EFS.
Table 1. Baseline Demographics and Clinical Characteristics.
Table 2. Safety and Effectiveness.
Table 3. Usability and Impact of the Study EFS.
Talal Sharaiha, MD1; Martin Croce, MD, FACS2; Lisa McKnight, RN, BSN MS2; Alejandra Alvarez, ACP, PMP, CPXP2
1Aspisafe Solutions Inc., Brooklyn, NY; 2Regional One Health, Memphis, TN
Financial Support: Talal Sharaiha is an executive of Aspisafe Solutions Inc. Martin Croce, Lisa McKnight and Alejandra Alvarez are employees of Regional One Health institutions. Regional One Health has a financial interest in Aspisafe Solutions Inc. through services, not cash. Aspisafe provided the products at no charge.
Background: Feeding tube securement has seen minimal innovation over the past decades, leaving medical adhesives as the standard method. However, adhesives frequently fail to maintain secure positioning, with dislodgement rates reported between 36% and 62%, averaging approximately 40%. Dislodgement can lead to adverse outcomes, including aspiration, increased risk of malnutrition, higher healthcare costs, and extended nursing care. We aimed to evaluate the safety and efficacy of a novel feeding tube securement device in preventing NG tube dislodgement compared to standard adhesive tape. The device consists of a bracket that sits on the patient's upper lip. The bracket has a non-adhesive mechanism for securing feeding tubes ranging from 10 to 18 French. It extends to two cheek pads that sit on either side of the patient's cheeks and is further supported by a head strap that wraps around the patient's head (Figures 1 and 2).
Methods: We conducted a prospective, case-control trial at Regional One Health Center, Memphis, TN, comparing 50 patients using the novel securement device, the NG Guard, against 50 patients receiving standard care with adhesive tape. The primary outcome was the rate of accidental or intentional NG tube dislodgement. Secondary outcomes included the number of new NG tubes required as a result of dislodgement, device-related complications, and adhesive-related skin injuries in the control group. Statistical analyses employed Student's t-test for continuous variables and Fisher's exact test for categorical variables. Significance was set at an alpha level of 0.05. We adjusted for confounding variables, including age, sex, race, and diagnosis codes related to delirium, dementia, and confusion (Table 1).
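Fisher's exact test on a 2×2 table, as used here for categorical outcomes, can be sketched with the Python standard library alone. This is a generic illustration of the test, not the authors' analysis code, and the table in the usage example is the classic "lady tasting tea" example rather than study data:

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher's exact test for the table [[a, b], [c, d]].

    Sums hypergeometric probabilities of all tables with the same
    margins that are no more likely than the observed table.
    """
    r1, r2, c1, n = a + b, c + d, a + c, a + b + c + d
    total = comb(n, c1)

    def prob(k):  # P(first cell == k) under fixed margins
        return comb(r1, k) * comb(r2, c1 - k) / total

    p_obs = prob(a)
    lo, hi = max(0, c1 - r2), min(r1, c1)
    # small tolerance guards against float round-off, as in common implementations
    return sum(prob(k) for k in range(lo, hi + 1) if prob(k) <= p_obs * (1 + 1e-7))

# classic "lady tasting tea" table: two-sided p = 34/70
print(fisher_exact_2x2(3, 1, 1, 3))  # ≈ 0.486
```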
Results: There were no significant differences between the groups in baseline characteristics, including age, sex, race, or confusion-related diagnoses (Table 2) (p ≥ 0.09). Nasogastric tube dislodgement occurred significantly more often in the adhesive tape group (31%) compared to the intervention group (11%) (p < 0.05). The novel device reduced the risk of tube dislodgement by 65%. Additionally, 12 new tubes were required in the control group compared to 3 in the intervention group (p < 0.05), translating to 18 fewer reinsertion events per 100 tubes inserted in patients secured by the novel device. No device-related complications or adhesive-related injuries were reported in either group.
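The effect sizes reported above follow directly from the stated rates and counts; a quick arithmetic check:

```python
# dislodgement rates as reported
control_rate, device_rate = 0.31, 0.11

# relative risk reduction: (0.31 - 0.11) / 0.31 ≈ 65%
rrr = (control_rate - device_rate) / control_rate
print(round(rrr * 100))  # 65

# reinsertions: 12/50 in the control group vs 3/50 with the device
per_100_fewer = (12 - 3) * 100 / 50
print(per_100_fewer)  # 18.0 fewer reinsertion events per 100 tubes
```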
Conclusion: The novel securement device significantly reduced the incidence of nasogastric tube dislodgement compared to traditional adhesive tape. It is a safe and effective securement method and should be considered for use in patients with nasogastric tubes to reduce the likelihood of dislodgement and the need for reinsertion of new NG tubes.
Table 1. Diagnosis Codes Related to Dementia and Delirium.
1MedStar Georgetown University Hospital, Washington, DC; 2Georgetown University Hospital, Washington, DC; 3MedStar Health Research Institute, Columbia, MD
Financial Support: None Reported.
Background: Early nutrition intervention is of high importance in patients with cirrhosis, given the faster onset of protein catabolism for gluconeogenesis compared with patients without liver disease. Severe malnutrition is associated with frequent complications of cirrhosis such as infection, hepatic encephalopathy, and ascites. Furthermore, studies have demonstrated higher mortality rates in cirrhotic patients who are severely malnourished in both the pre- and post-transplant setting. Current practice guidelines encourage the use of enteral nutrition in cirrhotic patients who are unable to meet their intake requirements with an oral diet. The aim of this study was to evaluate the utilization and implications of enteral feeding in hospitalized patients with cirrhosis diagnosed with severe protein calorie malnutrition at our institution.
Methods: This was a retrospective study of patients admitted to the transplant hepatology inpatient service at MedStar Georgetown University Hospital from 2019-2023. ICD-10-CM code E43 was then used to identify patients with a diagnosis of severe protein calorie malnutrition. The diagnosis of cirrhosis and pre-transplant status were confirmed by review of the electronic medical record. Patients with the following characteristics were excluded: absence of cirrhosis, history of liver transplant, admission diagnosis of upper gastrointestinal bleed, and/or receipt of total parenteral nutrition. Wilcoxon rank sum and two-sample t-tests were used to examine differences in the averages of continuous variables between two groups. Chi-square and Fisher exact tests were used to investigate differences for categorical variables. Statistical significance was defined as p-values ≤ 0.05.
Results: Of the 96 patients with cirrhosis and severe protein calorie malnutrition, 31 (32%) received enteral nutrition. Time from admission to initiation of enteral feeding was on average 7 days, with an average total duration of enteral nutrition of 10 days. In the group that received enteral nutrition, there was no significant change in weight, BMI, creatinine, total bilirubin, or MELD 3.0 score from admission to discharge; however, albumin, sodium, and INR levels significantly increased (Table 1). A comparative analysis between patients with and without enteral nutrition showed significantly longer length of stay, higher rates of intensive care requirement, bacteremia, and gastrointestinal bleeding, higher discharge MELD 3.0 scores, and higher in-hospital mortality among patients with enteral nutrition. There was no significant difference in rates of spontaneous bacterial peritonitis, pneumonia, admission MELD 3.0 score, or post-transplant survival duration between patients with and without enteral nutrition (Table 2).
Conclusion: In this study, only about one-third (32%) of patients hospitalized with cirrhosis received enteral nutrition despite having a diagnosis of severe protein calorie malnutrition. Initiation of enteral nutrition was delayed by an average of one week after hospital admission. Prolonged length of stay and higher in-hospital mortality rates suggest a lack of benefit of enteral nutrition when started late in the hospital course. Based on these findings, our institution has implemented a quality improvement initiative to establish earlier enteral feeding in hospitalized patients with cirrhosis and severe protein calorie malnutrition. Future studies will evaluate the efficacy of this initiative and its implications for clinical outcomes.
Table 1. The Change in Clinical End Points from Admission to Discharge Among Patients Who Received Enteral Nutrition.
Abbreviations: kg, kilograms; BMI, body mass index; INR, international normalized ratio; Na, sodium; Cr, creatinine; TB, total bilirubin; MELD, model for end stage liver disease; EN, enteral nutrition; Std, standard deviation.
Table 2. Comparative Analysis of Clinical Characteristics and Outcomes Between Patients With And Without Enteral Nutrition.
Abbreviations: MASLD, metabolic dysfunction-associated steatotic liver disease; HCV, hepatitis C virus; HBV, hepatitis B virus; AIH, autoimmune hepatitis; PSC, primary sclerosing cholangitis; PBC, primary biliary cholangitis; EN, enteral nutrition; RD, registered dietitian; MICU, medical intensive care unit; PNA, pneumonia; SBP, spontaneous bacterial peritonitis; GIB, gastrointestinal bleed; MELD, model for end stage liver disease; d, days; N, number; Std, standard deviation.
Jesse James, MS, RDN, CNSC1
1Williamson Medical Center, Franklin, TN
Financial Support: None Reported.
Background: Feeding tubes (Tubes) are used to deliver enteral nutrition to patients who are unable to safely ingest nutrients and medications orally, a population at elevated risk of malnutrition and dehydration. Unfortunately, these Tubes have a propensity for becoming clogged. Staff attempt to unclog Tubes using standard bedside techniques, including warm water flushes or chemical enzymes. However, not only are these practices time-consuming, they are often unsuccessful, requiring tube replacement. An actuated mechanical device for restoring patency in clogged small bore Tubes was evaluated at a level 2 medical center as an alternative declogging method from September 2021 to July 2023. Study objectives were to explore the actuated mechanical device's ability to unclog indwelling Tubes and to monitor any potential safety issues.
Methods: The TubeClear® System (actuated mechanical device, Actuated Medical, Inc., Bellefonte, PA, Figure 1) was developed to resolve clogs in various indwelling Tubes. N = 20 patients (Table 1), with n = 16, 10Fr, 109 cm long nasogastric (NG) tubes and n = 4, 10Fr, 140 cm long nasojejunal (NJ) tubes, underwent clearing attempts with the actuated mechanical device. Initially, patients underwent standard declogging strategies for a minimum of 30 minutes, including warm water flushes and Creon/NaHCO3 slurry. Following unsuccessful patency restoration (n = 17), or patency restoration followed by reclogging (n = 3), the actuated mechanical device was attempted. Procedure time was estimated from the electronic charting system to the closest five minutes and included setup, use, and cleaning time for the actuated mechanical device. All clearing procedures were completed by three trained registered dietitians.
Results: The average time to restore Tube patency (n = 20) was 26.5 minutes (25 minutes for NG, 32.5 minutes for NJ) with 90% success (Table 2), and no significant safety issues were reported by the operator or patient. User satisfaction was 100% (20/20), and patient discomfort was reported in 10% (2/20).
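The overall mean clearing time is simply the case-weighted average of the NG and NJ subgroup means, which the reported figures satisfy:

```python
# reported subgroup counts and mean clearing times (minutes)
ng_n, ng_mean = 16, 25.0    # nasogastric
nj_n, nj_mean = 4, 32.5     # nasojejunal

# case-weighted average across all 20 attempts
overall = (ng_n * ng_mean + nj_n * nj_mean) / (ng_n + nj_n)
print(overall)  # 26.5, matching the reported overall mean
```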
Conclusion: Based on the presented results, the actuated mechanical device was markedly more successful at resolving clogs than alternative bedside practices. Operators noted that the “Actuated mechanical device was able to work with clogs when slurries/water can't be flushed.” It was noted that using the actuated mechanical device prior to formation of a full clog, as a prophylactic approach, “was substantially easier than waiting until the Tube fully clogged.” For a partly clogged Tube, “despite it being somewhat patent and useable, a quick pass of the actuated mechanical device essentially restored full patency and likely prevented a full clog.” For an NG patient, “no amount of flushing or medication slurry was effective, but the actuated mechanical device worked in just minutes without issue.” “Following standard interventions failure after multiple attempts, only the actuated mechanical device was able to restore Tube patency, saving money on not having to replace Tube.” For a failed clearance, the operator noted “that despite failure to restore patency, there was clearly no opportunity for flushes to achieve a better result and having this option [actuated mechanical device] was helpful to attempt to avoid tube replacement.” For an NJ patient, “there would have been no other conventional method to restore patency of a NJ small bore feeding tube without extensive x-ray exposure and "guess work," which would have been impossible for this patient who was critically ill and ventilator dependent.” Having an alternative to standard bedside unclogging techniques proved beneficial to this facility, with 90% effectiveness, sparing those patients a Tube replacement and saving our facility money by avoiding Tube replacement costs.
Table 1. Patient and Feeding Tube Demographics.
Table 2. Actuated Mechanical Device Uses.
Figure 1. Actuated Mechanical Device for Clearing Partial and Fully Clogged Indwelling Feeding Tubes.
1Aveanna Medical Solutions, Lakewood, CO; 2Aveanna Medical Solutions, Chandler, AZ; 3Aveanna Medical Solutions, Erie, CO
Financial Support: None Reported.
Background: Homecare providers have managed through multiple formula backorders since the pandemic. Through creative problem-solving, clinicians have successfully been able to offer substitutions. However, when pump feeding sets are on backorder, the options are limited; feeding sets are specific to the brand of pump. Recently, a backorder of feeding sets used for pumps common in acute and home care resulted in a severe shortage in the home care supply chain. This required fast action to ensure patients were able to continue administering their tube feeding and to prevent readmissions. One solution is to change the patient to a pump brand that is not on backorder. Normally, transitioning a patient to a different brand of pump would require in-person teaching. Due to the urgency of the situation, a more efficient method needed to be established. A regional home care provider determined that 20% of patients using enteral feeding pumps were using backordered sets, and 50% of those were pediatric patients, who tend not to tolerate other methods of feeding. In response, this home care provider assembled a team to create a new educational model for pump training. The team was composed of Registered Nurses, Registered Dietitians, and patient care and distribution representatives. The hypothesis was that providing high-quality educational material with instructional videos, detailed communication on the issue, and telephonic clinical support would allow for a successful transition.
Methods: To determine urgency of transition, we prioritized patients with a diagnosis of short bowel syndrome, gastroparesis, glycogen storage disease, or vent dependency; those < 2 years of age; and those living in a rural area with a 2-day shipping zip code, and conducted a clinical review to identify patients with a jejunal feeding tube (Table 1). A pump conversion team contacted patients/caregivers to review the situation, discuss options for nutrition delivery, determine current inventory of sets, assess urgency for transition, and coordinate delivery of the pump, sets, and educational material. Weekly reporting tracked the number of patients using the impacted pump, transitioned patients, and those requesting to transition back to their original pump.
Results: A total of 2111 patients were using the feeding pump with backordered sets, and 50% of these patients were under the age of 12 years. Over a period of 3 months, 1435 patients (68% of this patient population) were successfully transitioned to a different brand of pump, and of those, only 7 patients (0.5%) requested to return to their original pump even though they understood the risk of potentially running short on feeding sets (Figure 1).
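The reported percentages are consistent with the underlying counts:

```python
total, transitioned, returned = 2111, 1435, 7

# share of the affected population successfully transitioned
print(round(transitioned / total * 100))        # 68

# share of transitioned patients requesting their original pump back
print(round(returned / transitioned * 100, 1))  # 0.5
```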
Conclusion: A team approach which included proactively communicating with patients/caregivers, prioritizing patient risk level, providing high-quality educational material with video links and outbound calls from a clinician resulted in a successful transition to a new brand of feeding pump.
Table 1. Patient Priority Levels for Pump with Backordered Sets.
Figure 1. Number of Pump Conversions.
Desiree Barrientos, DNP, MSN, RN, LEC1
1Coram CVS, Chino, CA
Financial Support: None Reported.
Background: Follow-up care for the enteral nutrition therapy community is essential for good outcomes. No data had been collected regarding home enteral nutrition (HEN) outcomes at a major university medical center. There was no robust program in place to follow up with patients who were discharged on tube feedings. Consequently, there was little information regarding the follow-up care or patient outcomes related to current practice, complications, re-hospitalizations, and equipment issues for this population.
Methods: The tools utilized were a questionnaire for the 48-hour and 30-day post-discharge outreach calls, pre-discharge handouts, and feeding pump handouts.
Results: Education: comparison at 48 hours and 30 days. Q1: Can you tell me why you were hospitalized? Q2: Did a provider contact you for education prior to discharging home? Q3: Do you understand your nutrition orders from your doctor? Q4: Can you tell me the steps of how to keep your PEG/TF site clean? Q5: Can you tell me how much water to flush your tube? There was an improvement in patient understanding, self-monitoring, and navigation between the two survey timepoints. Regarding patient education, understanding of nutrition orders (Q3) improved from 91% to 100%, knowledge of the steps to keeping the tube feeding site clean (Q4) improved from 78% to 96%, and knowledge of water flushes before and after each feeding improved from 81% to 100% between the 48-hour and 30-day timepoints, respectively.
Conclusion: There was an improvement in patient understanding, self-monitoring, and navigation between the two survey timepoints. Verbal responses to open-ended and informational questions were aggregated to analyze complications, care gaps, and service failures.
Table 1. Questionnaire Responses At 48 Hours and 30 Days.
Table 2. Questionnaire Responses At 48 Hours and 30 Days.
Figure 1. Education: Comparison at 48-hours and 30-days.
Figure 2. Self-monitoring and Navigation: Comparison at 48-hours and 30-days.
1Froedtert Memorial Lutheran Hospital, Waukesha, WI; 2Froedtert Memorial Lutheran Hospital, Big Bend, WI
Financial Support: None Reported.
Background: Initiation of early enteral nutrition plays an essential role in improving patient outcomes1. Historically, feeding tubes have been placed by nurses, doctors, and advanced practice providers. Over the past two decades, the prevalence of dietitians (RDNs) placing feeding tubes at the bedside has grown. This practice has been endorsed by the Academy of Nutrition and Dietetics and the American Society for Parenteral and Enteral Nutrition through modifications to the Scope of Practice for the Registered Dietitian Nutritionist and Standards of Professional Performance for Registered Dietitian Nutritionists.2,3 Feeding tubes placed at the bedside by RDNs have the potential to decrease nursing, fluoroscopy, and internal transport time, which is of interest to our hospital. In the fall of 2023, we launched a pilot to evaluate the feasibility of an RDN-led bedside tube placement team at our 800-bed level 1 trauma center.
Methods: RDNs first worked closely with nursing leadership to create a tube placement workflow to identify appropriate patients, outline communication needed with the bedside nurse and provider, and establish troubleshooting solutions (Figure 1). Intensive care unit nursing staff then trained RDNs in tube placement using camera-guided technology (IRIS) and deemed RDNs competent after 10 successful tube placements. Given the limited literature on RDN-led tube placement, we defined success as >80% of tube placements in an appropriate position within the gastrointestinal tract.
Results: To date, the pilot includes 57 patients; forty-six tubes (80.7%; 39 gastric and 7 post-pyloric) were placed successfully, as confirmed by KUB. Of note, 2 of the 39 gastric tubes were originally ordered to be placed post-pylorically; however, gastric tubes were deemed acceptable for these patients after issues were identified during placement. Eleven (19%) tube placements were unsuccessful due to behavioral issues, blockages in the nasal cavity, and anatomical abnormalities.
Conclusion: This pilot demonstrated that well-trained RDNs can successfully place feeding tubes at the bedside using camera-guided tube placement technology. One limitation of this pilot is the small sample size. We initially limited the pilot to 2 hospital floors and had trouble educating nursing staff on the availability of the RDN to place tubes. Through evaluation of tube placement orders, we found the floors placed on average 198 tubes during our pilot, indicating 141 missed opportunities for RDNs to place tubes. To address these issues, we created numerous educational bulletins and worked with unit nursing staff to encourage contacting the RDN when feeding tube placement was needed. We also expanded the pilot hospital-wide and are looking into the time periods when tubes are most often placed. Anecdotally, bedside feeding tube placement takes 30 to 60 minutes; therefore, this pilot saved 1350 to 2700 minutes of nursing time and 180 to 360 minutes of fluoroscopy time necessary to place post-pyloric tubes. Overall, our pilot has demonstrated the feasibility of RDN-led bedside feeding tube placement, allowing staff RDNs to practice at the top of their scope and promoting effective use of hospital resources.
Figure 1. Dietitian Feeding Tube Insertion Pilot: 2NT and 9NT.
1Nestle Health Science, Cambridge, ON; 2Brescia School of Food and Nutritional Sciences, Faculty of Health Sciences, Western University, London, ON; 3Nestle Health Science, Hamilton, ON
Financial Support: Nestle Health Science.
Background: Continuing education (CE) is a component of professional development that serves two functions: maintaining practice competencies and translating new knowledge into practice. Understanding registered dietitian (RD) participation in and perceptions of CE facilitates creation of more effective CE activities to enhance knowledge acquisition and practice change. This preliminary analysis of a practice survey describes RD participation in CE and evaluates barriers to CE participation.
Methods: This was a cross-sectional survey of clinical RDs working across care settings in Canada. Targeted participants (n = 4667), identified using a convenience sample and in accordance with applicable law, were invited to complete a 25-question online survey between November 2023 and February 2024. Descriptive statistics and frequencies were reported.
Results: Nationally, 428 RDs working in acute care, long-term care, and home care fully or partially completed the survey (9.1% response rate). Respondents indicated the median ideal number of CE activities per year was 3 in-person activities, 5 reviews of written materials, and 6 virtual activities. However, participation was most frequently reported as “less often than ideal” for in-person activities (74.7% of respondents) and written material (53.6%), and “as often as ideal” for virtual activities (50.7%). Pre-recorded video presentations, live virtual presentations, and critically reviewing written materials were the most common types of CE that RDs had participated in at least once in the preceding 12 months. In-person hands-on sessions, multimodal education, and simulations were the least common types of CE that RDs had encountered in the preceding 12 months (Figure 1). The most frequent barriers to participation in CE were cost (68% of respondents), scheduling (60%), and location (51%). However, the encountered barriers that most greatly limited participation in CE were inadequate staff coverage, lack of dedicated education days within the role, and lack of dedicated time during work hours (Table 1). When deciding to participate in CE, RDs ranked the most important aspects of the content as 1) coming from a credible source, 2) covering a specific/narrow topic relevant to practice, and 3) enabling use of practical tools/skills at the bedside.
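Rankings like these are commonly derived as weighted scores over a 5-point Likert scale, as described in the Table 1 note. A minimal sketch of that calculation, with made-up response counts (the counts and barrier labels here are illustrative assumptions, not study data):

```python
# hypothetical counts of responses 1 ("not at all") .. 5 ("to a great extent")
# for two barriers; the real per-response counts are not given in the abstract
responses = {
    "inadequate staff coverage": [2, 5, 10, 20, 30],
    "cost":                      [10, 20, 15, 10, 5],
}

def weighted_score(counts):
    # score = sum(Likert weight * response count); higher = more limiting
    return sum(w * c for w, c in zip(range(1, 6), counts))

# rank barriers from most to least limiting by weighted score
ranked = sorted(responses, key=lambda b: weighted_score(responses[b]), reverse=True)
print(ranked[0])  # the barrier with the highest weighted score
```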
Conclusion: These data suggest there is opportunity for improvement in RD CE participation, with the greatest gaps in in-person and written activities. Although RDs recognize the importance of relevant and practical content, they reported infrequent exposure to the types of CE that are well suited to this, such as simulations and hands-on activities. When planning and executing CE, content should be credible, relevant, and practical, using a format that is both accessible and impactful. Results of this study help benchmark Canadian RD participation in CE and provide evidence to support addressing barriers and maximizing participation.
Table 1. Frequent and Impactful Barriers Limiting Participation in CE Activities.
Note: 1. Percentages total greater than 100% because all respondents selected the 3 most important barriers impacting their participation in CE activities. 2. Items are ranked based on a weighted score calculated from a 5-point Likert scale, indicating the extent to which the barrier was perceived to have limited participation in CE activities.
Figure 1. Types of Continuing Education Activities Dietitians Participated in at Least Once in the Preceding 12 Months.
1Medtrition, Huntingdon Valley, PA; 2Endeavor Health/Aramark Healthcare +, Evanston, IL; 3Parkview Health, Fort Wayne, IN; 4Endeavor Health, Glenview, IL
Financial Support: None Reported.
Background: Adequate nutrition is a critical component of patient care and plays a significant role in reducing hospital length of stay (LOS). Malnutrition is associated with numerous adverse outcomes, including increased morbidity, delayed recovery, and prolonged hospitalization (Tappenden et al., 2013). Timely nutrition interventions and strategies for integrating successful nutrition care into hospital protocols to reduce LOS are essential parts of medical nutrition therapy. Modular nutrition often plays a key role in these interventions. A study by Klek et al. (2020) emphasizes the role of modular nutrition in providing personalized nutritional support for the specific needs of critically ill patients, suggesting that nutrient modules allow a more precise adjustment of nutrition based on patients' metabolic requirements. Modular nutrition has also been shown to positively impact clinical outcomes in ICU patients: an observational study by Compher et al. (2019) reported that targeted administration of protein modules to achieve higher protein intake was associated with improved clinical outcomes, such as reduced ICU LOS.
Methods: Administration of modular nutrition can be a challenge. Typically, modular proteins (MP) are ordered through the dietitian and dispensed as part of the diet order. The nursing team is responsible for administration and documentation of the MP. There is commonly a disconnect between MP prescription and administration; in some cases, this is because the MP is not a tracked task in the electronic health record (EHR). The goal of this evaluation was to review data from a quality improvement (QI) initiative in which MP (ProSource TF) was added to the medication administration record (MAR), using a barcode scanning process to track provision and documentation of MP. The objective of this evaluation was to determine a possible correlation between the QI initiative and patients' ICU LOS. The QI initiative evaluated a pre-implementation timeframe from June 1, 2021 to November 30, 2021, and a post-implementation timeframe from January 1, 2022 to June 30, 2022. There were a total of 1962 ICU encounters in the pre-implementation period and 1844 ICU encounters in the post-implementation period. The data were analyzed using a series of statistical tests.
Results: The t-test for the total sample was significant, t(3804) = 8.35, p < .001, indicating the average LOS was significantly lower post-implementation compared to pre-implementation (Table 1). This association suggests that improved provision of MP may be related to a reduced ICU LOS. In addition to LOS, we can also suggest a relationship between the MAR and MP utilization. Pre-implementation, 1600 doses of MP were obtained; this increased to 2400 doses post-implementation. The data suggest a correlation between product use and MAR implementation even though the overall encounters post-implementation were reduced. There was a 50% increase in product utilization post-implementation compared to the pre-implementation period.
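The pre/post comparison above is consistent with a pooled two-sample t-test, since the reported df of 3804 equals 1962 + 1844 - 2. A minimal sketch of that calculation is below; the LOS values passed in are illustrative only, not study data:

```python
import math
from statistics import mean, variance

def pooled_t(sample_a, sample_b):
    """Two-sample pooled t statistic with df = n1 + n2 - 2.

    With 1962 pre- and 1844 post-implementation encounters this gives
    df = 3804, matching the reported t(3804).
    """
    n1, n2 = len(sample_a), len(sample_b)
    m1, m2 = mean(sample_a), mean(sample_b)
    # Pooled variance: weighted average of the two sample variances.
    sp2 = ((n1 - 1) * variance(sample_a)
           + (n2 - 1) * variance(sample_b)) / (n1 + n2 - 2)
    t = (m1 - m2) / math.sqrt(sp2 * (1 / n1 + 1 / n2))
    return t, n1 + n2 - 2

# Hypothetical LOS values in days; a positive t means lower post LOS.
t_stat, df = pooled_t([6.0, 7.0, 8.0], [4.0, 5.0, 6.0])
```

A positive t with p < .001, as reported, indicates the pre-implementation mean LOS exceeded the post-implementation mean beyond what sampling variation would explain.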
Conclusion: The data provided suggest a benefit of adding MP to the MAR to help improve provision, streamline documentation, and potentially reduce ICU LOS.
Table 1. Comparison of LOS Between Pre and Post Total Encounters.
Table 1 displays the t-test comparison of LOS in pre vs post implementation of MP on the MAR.
Figure 1. Product Utilization and Encounters Pre vs Post Implementation of MP on the MAR.
International Poster of Distinction
Eliana Giuntini, PhD1; Ana Zanini, RD, MSc2; Hellin dos Santos, RD, MSc2; Ana Paula Celes, MBA2; Bernadette Franco, PhD3
1Food Research Center/University of São Paulo, São Paulo; 2Prodiet Medical Nutrition, Curitiba, Parana; 3Food Research Center/School of Pharmaceutical Sciences/University of São Paulo, São Paulo
Financial Support: None Reported.
Background: Critically ill patients present an increased need for protein to preserve muscle mass due to anabolic resistance. Additionally, these patients are more prone to developing hyperglycemia, and one of the nutritional strategies that can be adopted is to provide a diet with a low glycemic index. Hypercaloric and high-protein enteral formulas can help meet energy and protein goals for these patients. Due to their reduced carbohydrate content, these formulas can also contribute to lowering the postprandial glycemic response. The study aimed to evaluate the glycemic index (GI) and the glycemic load (GL) of a specialized high-protein enteral nutrition formula.
Methods: Fifteen healthy volunteers were selected, based on self-reported absence of diseases or regular medication use, aged between 21 and 49 years, with normal glucose tolerance according to fasting and postprandial glucose assessments over 2 hours. The individuals attended after a 10-hour fast, once per week, consuming the glucose solution (reference food) for 3 weeks and the specialized high-protein enteral formula (Prodiet Medical Nutrition) in the following week, both in amounts equivalent to 25 g of available carbohydrates. The specialized high-protein formula provides 1.5 kcal/mL, 26% protein (98 g/L), 39% carbohydrates, and 35% lipids, including EPA + DHA. Capillary blood sampling was performed at regular intervals: at 0 (before consumption), 15, 30, 45, 60, 90, and 120 minutes. The incremental area under the curve (iAUC) was calculated, excluding areas below the fasting line. Glycemic load (GL) was determined using the equation GL = [GI (glucose = reference) × grams of available carbohydrates in the portion]/100. Student's t-tests were conducted to identify differences (p < 0.05).
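The two calculations named in the methods, the iAUC and the GL equation, can be sketched as follows. Clipping excursions below the fasting line to zero before trapezoidal integration is one common iAUC convention and is an assumption here, not necessarily the authors' exact algorithm:

```python
def incremental_auc(times_min, glucose_mg_dl):
    """iAUC by the trapezoidal rule over the excursion above fasting.

    Values below the fasting (t = 0) glucose are clipped to zero before
    integration (a common convention, assumed here). Returns mg/dL x min.
    """
    base = glucose_mg_dl[0]
    inc = [max(g - base, 0.0) for g in glucose_mg_dl]
    area = 0.0
    for i in range(1, len(times_min)):
        area += (inc[i - 1] + inc[i]) / 2 * (times_min[i] - times_min[i - 1])
    return area

def glycemic_load(gi, available_carbs_g):
    """GL = [GI (glucose = reference) x grams of available carbohydrates
    in the portion] / 100, as stated in the methods."""
    return gi * available_carbs_g / 100.0

# Illustrative curve only: fasting 90 mg/dL, peak 120 mg/dL at 30 min.
iauc = incremental_auc([0, 15, 30, 45, 60], [90, 110, 120, 100, 90])
```

Note that the portion size underlying the reported GL of 8.2 is not stated in this excerpt, so the grams of available carbohydrates per portion are left as an input to the function.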
Results: To consume 25 g of available carbohydrates, the individuals ingested 140 g of the high-protein formula. The high-protein enteral formula showed a low GI (GI = 23), significantly different from glucose (p < 0.0001), and a low GL (GL = 8.2). The glycemic curve data showed significant differences between glucose and the specialized high-protein formula at all time points except T90, with the glycemic peak occurring at T30 for glucose (126 mg/dL) and at both T30 and T45 for the specialized high-protein enteral formula, with values significantly lower than glucose (102 vs 126 mg/dL). The iAUC was smaller for the specialized high-protein formula compared to glucose (538 ± 91 vs 2061 ± 174 mg/dL × min) (p < 0.0001), exhibiting a curve without a high peak, as typically observed in foods with a reduced glycemic index.
Conclusion: The specialized high-protein enteral nutrition formula showed a low GI and GL, resulting in a significantly reduced postprandial glycemic response, with lower glucose elevation and variation. This may reduce insulin requirements and glycemic variability.
Figure 1. Mean Glycemic Response of Volunteers (N = 15) to 25 G of Available Carbohydrates After Consumption of Reference Food and a Specialized High-Protein Enteral Nutrition Formula, in 120 Min.
Lisa Epp, RDN, LD, CNSC, FASPEN1; Bethaney Wescott, APRN, CNP, MS2; Manpreet Mundi, MD2; Ryan Hurt, MD, PhD2
Background: Hypnotherapy is the use of hypnosis for the treatment of a medical or psychological disorder. Specifically, gut directed hypnotherapy is a treatment option for functional gastrointestinal disorders and disorders of the gut-brain axis. It has been shown to be effective in the management of GI symptoms such as abdominal pain, nausea, functional dyspepsia, and irritable bowel syndrome symptoms. Evidence suggests that 6%-19% of patients with these GI symptoms exhibit characteristics of avoidant/restrictive food intake disorder (ARFID). Multiple studies show improvement in GI symptoms and the ability to maintain that improvement after 1 year. However, there is a paucity of data regarding the use of hypnotherapy in home enteral nutrition patients.
Methods: A case report is presented involving a 67-year-old female with a history of irritable bowel syndrome (diarrhea predominant) and new mucinous appendiceal cancer s/p debulking of an abdominal tumor, including colostomy and distal gastrectomy. She was on parenteral nutrition (PN) for 1 month postoperatively due to delayed return of bowel function before her oral diet was advanced. Unfortunately, she had difficulty weaning from PN, as she was "scared to start eating" due to functional dysphagia with gagging at the sight of food, even on TV. After 4 weeks of PN, a nasojejunal feeding tube was placed and she was discharged home.
Results: At a multidisciplinary outpatient nutrition clinic visit, the patient was dependent on enteral nutrition and reported inability to tolerate oral intake for unclear reasons. Long-term enteral access was discussed; however, the patient wished to avoid this and asked for alternative interventions she could try to help her eat. She was referred for gut directed hypnotherapy. After 4 in-person hypnotherapy sessions over 3 weeks, the patient was able to tolerate increasing amounts of oral intake and have her nasojejunal feeding tube removed. Upon follow-up 4 months later, she was still eating well and continued to praise the outcome she received from gut directed hypnotherapy.
Conclusion: Patient-centered treatments for gut-brain axis disorders and disordered eating behaviors and/or eating disorders are important to consider in addition to nutrition support. These include but are not limited to Cognitive Behavior Therapy, mindfulness interventions, acupuncture, biofeedback strategies, and gut directed hypnotherapy. Group, online and therapist directed therapies could be considered for treatment avenues dependent on patient needs and preferences. Additional research is needed to better delineate impact of these treatment modalities in the home enteral nutrition population.
1The Ohio State University Wexner Medical Center, Columbus, OH; 2The Ohio State University Wexner Medical Center, Westerville, OH
Financial Support: None Reported.
Background: It is well documented that unnecessary hospital admissions can have a negative impact on patients' physical and emotional wellbeing and can increase healthcare costs.1 Numerous strategies exist to limit unnecessary hospital admissions; one innovative strategy being utilized at our 1000+ bed Academic Medical Center involves Registered Dietitians (RDs). Literature demonstrates that feeding tube placement by a dedicated team using electromagnetic tracking improves patient morbidity and mortality and is a cost-effective solution for this procedure.2 RDs have been part of feeding tube teams for many years, though the exact number of RD-only teams is unclear.3 The Revised 2021 Standards of Practice and Standards of Professional Performance for RDNs (Competent, Proficient, and Expert) in Nutrition Support identify that dietitians at the "expert" level strive for additional experience and training, and thus may serve a vital role in feeding tube placement teams.4 Prior to the implementation of the RD-led tube team, there was no uniform process at our hospital for these patients to obtain enteral access in a timely manner.
Methods: In December 2023, an "RD tube team" consult and order set went live within the electronic medical record at our hospital. The original intent of the tube team was to cover the inpatient units, but it soon became apparent that there were opportunities to extend this service to observation areas and the Emergency Department (ED). This case series abstract outlines case studies from three patients with various clinical backgrounds and how the tube team was able to prevent an inpatient admission. Patient 1: An 81-year-old female who returned to the ED on POD #4 s/p esophageal repair with a dislodged nasoenteric feeding tube. The RD tube team was consulted and was able to replace her tube and bridle it in place. The patient was discharged from the ED without requiring hospital readmission. Patient 2: An 81-year-old male with a history of ENT cancer who was transferred to our ED after an outside hospital ED had no trained staff available to replace his dislodged nasoenteric feeding tube. The RD tube team replaced his tube and bridled it into place, and the patient was discharged from the ED without admission. Patient 3: A 31-year-old female with a complex GI history and multiple prolonged hospitalizations due to PO intolerance. The patient returned to the ED 2 days post discharge with a clogged nasoenteric feeding tube. The tube was unable to be unclogged, so the RD tube team replaced the tube in the ED and prevented readmission.
Results: Consult volumes validated there was a need for a tube team service. In the first 8 months of consult and order set implementation, a total of 403 tubes were placed by the RD team. Of those, 24 (6%) were placed in the ED and observation units. In the 3 patient cases described above, and numerous other patient cases, the RD was able to successfully place a tube using an electromagnetic placement device (EMPD), thereby preventing a patient admission.
Conclusion: Creating a feeding tube team can be a complex process to navigate and requires support from senior hospital administration, physician champions, nursing teams and legal/risk management teams. Within the first year of implementation, our hospital system was able to demonstrate that RD led tube teams have the potential to not only help with establishing safe enteral access for patients, but also can be an asset to the medical facility by preventing admissions and readmissions.
1Internal Equilibrium, King City, ON; 2London Health Sciences Centre, London, ON; 3NutritionRx, London, ON; 4Vanier Children's Mental Wellness, London, ON; 5Listowel-Wingham and Area Family Health Team, Wingham, ON
Financial Support: None Reported.
Background: Parkinson's disease is the second most prevalent neurodegenerative disease with a predominant disease-related symptom known as dysphagia. Dysphagia can increase the risk of aspiration, the inhaling of food or liquids into the lungs, potentially instigating the onset of pneumonia, a recurrent fatality in patients with Parkinson's disease. Therefore, feeding tubes are placed to deliver nutrition into the stomach or the small intestine to maintain appropriate nutrition delivery and reduce the risk of aspiration by oral intake. To the best of our knowledge, there is no research comparing the differences in outcomes between gastric (G) or jejunal (J) tube feeding in patients with Parkinson's disease-related dysphagia, however, limited research does exist in critically ill populations comparing these two modalities. The purpose of this study is to compare the differences in hospital readmissions related to aspiration events and differences in mortality rates after placement of a gastric or jejunal feeding tube in patients with Parkinson's disease-related dysphagia.
Methods: This was a retrospective chart review of patients admitted to either the medicine or clinical neurosciences units at University Hospital in London, Ontario, Canada, between January 1, 2010, and December 31, 2022. Patients were included if they had a documented diagnosis of Parkinson's or associated diseases and had a permanent feeding tube placed during hospital admission that was used for enteral nutrition. Patients were excluded from the study if they had a comorbidity that would affect their survival, such as cancer, or if the feeding tube was placed for reasons unrelated to Parkinson's disease-related dysphagia, for example, post-stroke. A p-value < 0.05 was considered statistically significant.
Results: 25 participants were included in this study; 7 had gastric feeding tubes, and 18 had jejunal feeding tubes. Demographic data are shown in Table 1. No statistically significant differences were found in demographic variables between the G- and J-tube groups. Of interest, none of the 7 participants (28%) with dementia were discharged home; 5 were discharged to long-term care, 1 was discharged to a complex continuing care facility, and 1 passed in hospital. Differences in readmission rates and mortality between groups did not reach significance, likely due to our small sample size in the G-tube group (Figures 1 and 2). However, we found that 50% of participants were known to have passed within 1 year of initiating enteral nutrition via their permanent feeding tube, and there was a trend toward higher readmission rates in the G-tube group.
Conclusion: While this study did not yield statistically significant results, it highlights the need for further research of a larger sample size to assess confounding factors, such as concurrent oral intake, that affect the difference in outcomes between G- and J-tube groups. Future research would also benefit from examining the influence on quality of life in these patients. Additional research is necessary to inform clinical practice guidelines and clinical decision-making for clinicians, patients and families when considering a permanent feeding tube.
Table 1. Participant Demographics.
Readmission rates were calculated as a percentage of the number of readmissions to the number of discharges from hospital. If a participant was readmitted more than once within the defined timeframes, subsequent readmissions were counted as a new readmission and new discharge event. Readmission rate calculations did not include participants who passed during or after the defined timeframes. Differences in readmission rates between gastric and jejunal feeding tube groups did not reach statistical significance.
Figure 1. Readmission Rate.
Mortality rates were calculated from the time that enteral nutrition was initiated through a permanent feeding tube in 30-day, 60-day, 90-day, and 1-year time intervals. Differences in mortality rates between gastric and jejunal feeding tube groups did not reach statistical significance.
Figure 2. Mortality Rate.
Jennifer Carter, MHA, RD1
1Winchester Medical Center, Valley Health, Winchester, VA
Financial Support: None Reported.
Background: Early enteral nutrition is shown to improve patient outcomes and can decrease, or attenuate, the progression of malnutrition. Placement of nasoenteric feeding tubes was deemed within the scope of practice (SOP) for Registered Dietitian Nutritionists (RDNs) in 2007, with recent revisions in 2021, by the Academy of Nutrition and Dietetics (AND) and the American Society for Parenteral and Enteral Nutrition (ASPEN). With enhanced order writing privileges, RDNs are knowledgeable and aware of those in need of enteral nutrition recommendations. The goal of this abstract is to highlight the efficiency of an RDN-led nasoenteric tube placement team.
Methods: A retrospective chart review of the first 70 patients who received a nasoenteric tube placed by an RDN in 2023 was conducted. Data points collected included time from tube order to tube placement and time from tube order to enteral nutrition order.
Results: Out of 70 tubes placed, the average time from tube order to tube placement was 2.28 hours. The longest time from tube order to placement was 19 hours. The average time from tube order to enteral nutrition order was 5.58 hours. The longest time from tube order to enteral nutrition order was 23.6 hours.
Conclusion: This retrospective review reflects the timeliness of placement and provision of enteral nutrition in the acute care setting when performed by an all-RDN team. Overall, placement occurred less than 2.5 hours after the tube placement order, and enteral nutrition orders were entered less than 6 hours after the tube placement order. The RDNs at Winchester Medical Center have been placing nasoenteric feeding tubes since 2013 using an electromagnetic tube placement device (EMPD). As of 2018, it became an all-RDN team. With the enhanced support from AND and ASPEN, nasoenteric feeding tube placement continues to be promoted within the SOP for RDNs. Also, the Accreditation Council for Education in Nutrition and Dietetics (ACEND) now requires dietetic interns to learn, observe, and even assist with nasoenteric tube placements. Over time, more RDNs in the acute care setting will likely advance their skillset to include this expertise.
Figure 1. Time From MD Order to Tube Placement in Hours.
Figure 2. Time From MD Order of Tube to Tube Feed Order in Hours.
Poster of Distinction
Vanessa Millovich, DCN, MS, RDN, CNSC1; Susan Ray, MS, RD, CNSC, CDCES2; Robert McMahon, PhD3; Christina Valentine, MD, RDN, FAAP, FASPEN4
Financial Support: Kate Farms provided all financial support.
Background: Whole food plant-based diets have demonstrated metabolic benefits across many populations. The resulting increased intake of dietary fiber and phytonutrients is integral to the success of this dietary pattern due to the positive effect on the digestive system. Patients dependent on tube feeding may not receive dietary fiber or sources of phytonutrients, and the impact of this is unknown. Evidence suggests that the pathways that promote digestive health include more than traditional prebiotic sources from carbohydrate fermentation. Data on protein fermentation metabolites and potential adverse effects on colon epithelial cell integrity are emerging. These lesser-known metabolites, branched-chain fatty acids (BCFAs), are produced through proteolytic fermentation. Emerging research suggests that the overproduction of BCFAs via protein fermentation may be associated with toxic by-products like p-cresol. These resulting by-products may play a role in digestive disease pathogenesis. Enteral formulas are often used to support the nutritional needs of those with digestive conditions. Plant-based formulations made with yellow pea protein have been reported to improve GI tolerance symptoms. However, the underlying mechanisms responsible have yet to be investigated. The purpose of this study was to assess the impact of a mixed food matrix enteral formula containing pea protein, fiber, and phytonutrients on various markers of gut health in healthy children and adults, using an in-vitro model.
Methods: Stool samples of 10 healthy pediatric and 10 adult donors were collected and stored at -80°C. The yellow pea protein formulas (Kate Farms™ Pediatric Standard 1.2 Vanilla-P1, Pediatric Peptide 1.0 Vanilla-P2, and Standard 1.4 Plain-P3) were first predigested using standardized intestinal processing to simulate movement along the digestive tract. The in-vitro model was ProDigest's Colon-on-a-Plate (CoaP®) simulation platform, which has demonstrated in vivo-in vitro correlation. Measurements of microbial metabolic activity included pH, production of gas, SCFAs, BCFAs, ammonia, and microbiota shifts. Paired two-sided t-tests were performed to evaluate differences between treatment and control. Differential abundance analysis was performed using LEfSe and treeclimbR. Statistical significance, as compared to the negative control, was indicated by a p-value < 0.05.
Results: In the pediatric group, the microbial analysis showed significant enrichment of Bifidobacteria as well as butyrate-producing genera Agathobacter and Agathobaculum with the use of the pediatric formulas when compared to the control. P1 resulted in a statistically significant reduction of BCFA production (p ≤ 0.05). P1 and P2 resulted in statistically significant increases in acetate and propionate. In the adult group, with treatment using P3, microbial analysis showed significant enrichment of Bifidobacteria compared to the control group. P3 also resulted in a reduction of BCFAs, although not statistically significant. Gas production and drop in pH were statistically significant (p ≤ 0.05) for all groups P1, P2, and P3 compared to control, which indicates microbial activity.
Conclusion: All enteral formulas demonstrated a consistent prebiotic effect on the gut microbial community composition in healthy pediatric and adult donors. These findings provide insight into the mechanisms related to digestive health and highlight the importance of designing prospective interventional research to better understand the role of fiber and phytonutrients within enteral products.
Hill Johnson, MEng1; Shanshan Chen, PhD2; Garrett Marin3
1Luminoah Inc, Charlottesville, VA; 2Virginia Commonwealth University, Richmond, VA; 3Luminoah Inc, San Diego, CA
Financial Support: Research was conducted with the support of VBHRC's VA Catalyst Grant Funding.
Background: Medical devices designed for home use must prioritize user safety, ease of operation, and reliability, especially in critical activities such as enteral feeding. This study aimed to validate the usability, safety, and overall user satisfaction of a novel enteral nutrition system through summative testing and task analysis.
Methods: A simulation-based, human factors summative study was conducted with 36 participants, including both caregivers and direct users of enteral feeding technology. Participants were recruited across three major cities: Houston, Chicago, and Phoenix. Task analysis focused on critical and non-critical functions of the Luminoah FLOW™ Enteral Nutrition System, while user satisfaction was measured using the System Usability Scale (SUS). The study evaluated successful task completion, potential use errors, and qualitative user feedback.
Results: All critical tasks were completed successfully by 100% of users, with the exception of a cleaning task, which had an 89% success rate. Non-critical tasks reached an overall completion rate of 95.7%, demonstrating the ease of use and intuitive design of the system. The SUS score was exceptionally high, with an average score of 91.5, indicating strong user preference for the device over current alternatives. Furthermore, 91% of participants indicated they would choose the new system over other products in the market.
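For context on the reported average of 91.5, the System Usability Scale is conventionally scored per participant from ten 1-5 Likert items and then averaged. A minimal sketch of the standard scoring rule (not the study's own code) is:

```python
def sus_score(responses):
    """Standard System Usability Scale score (0-100) for one participant.

    `responses` are the ten 1-5 Likert answers in questionnaire order:
    odd-numbered items contribute (answer - 1), even-numbered items
    (5 - answer); the sum is scaled by 2.5.
    """
    if len(responses) != 10:
        raise ValueError("SUS uses exactly 10 items")
    total = sum((r - 1) if i % 2 == 0 else (5 - r)  # i = 0 is item 1 (odd)
                for i, r in enumerate(responses))
    return total * 2.5

def mean_sus(all_responses):
    """Average SUS score across participants, as reported in the study."""
    return sum(sus_score(r) for r in all_responses) / len(all_responses)
```

On the usual interpretation benchmarks, a mean SUS near 91.5 falls well above the commonly cited average of 68, consistent with the strong user preference described.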
Conclusion: The innovative portable enteral nutrition system demonstrated excellent usability and safety, meeting the design requirements for its intended user population. High completion rates for critical tasks and an overwhelmingly positive SUS score underscore the system's ease of use and desirability. These findings suggest that the system is a superior option for home enteral feeding, providing both safety and efficiency in real-world scenarios. Further refinements in instructional materials may improve user performance on non-critical tasks.
Elease Tewalt1
1Phoenix Veterans Affairs Administration, Phoenix, AZ
Financial Support: None Reported.
Background: Enhanced Recovery After Surgery (ERAS) protocols, including preoperative carbohydrate loading, aim to accelerate recovery by reducing stress responses and insulin resistance. These protocols have been shown to decrease hospital stays, postoperative complications, and healthcare costs. However, there is limited knowledge about the safety and efficacy of ERAS for diabetic patients. Patients with diabetes make up 15% of surgical cases and often have longer hospital stays and more postoperative complications. This study evaluated outcome measures important to patients with diabetes in a non-diabetic population in order to lay the groundwork for future trials that could include diabetic patients in ERAS protocols.
Methods: A retrospective chart review at the Phoenix Veterans Affairs Health Care System compared 24 colorectal surgery patients who received preoperative carbohydrate drinks with 24 who received traditional care. Outcomes assessed included blood glucose (BG) levels, aspiration, and postoperative complications. Additional analyses evaluated adherence, length of hospital stay, and healthcare costs.
Results: The demographics of the two groups were comparable (Table 1). The preoperative BG levels of the carbohydrate loading group were similar (164.6 ± 36.3 mg/dL) to the control group (151.8 ± 47.7 mg/dL) (p > 0.05) (Figure 1). The carbohydrate loading group demonstrated lower and more stable postoperative BG levels (139.4 ± 37.5 mg/dL) compared to the control group (157.6 ± 61.9 mg/dL), but this difference was not statistically significant (p > 0.05) (Figure 2). There were no significant differences in aspiration or vomiting between the groups (p > 0.05) (Table 2). The carbohydrate loading group had a shorter average hospital stay by one day, but this difference was not statistically significant (p > 0.05) (Table 2).
Conclusion: Carbohydrate loading as part of ERAS protocols was associated with better postoperative glucose control, no increased risk of complications, and reduced hospital stays. Although diabetic patients were not included in this study, these findings suggest that carbohydrate loading is a safe and effective component of ERAS. Including diabetic patients in ERAS is a logical next step that could significantly improve surgical outcomes for this population. Future research should focus on incorporating diabetic patients to assess the impact of carbohydrate loading on postoperative glucose control, complication rates, length of stay, and healthcare costs.
Table 1. Demographics.
The table includes the demographics of the two groups. The study group consists of participants who received the carbohydrate drink, while the control group includes those who received traditional care.
Table 2. Postoperative Outcomes.
The table includes the postoperative outcomes of the two groups. The study group consists of participants who received the carbohydrate drink, while the control group includes those who received traditional care.
The figure shows the preoperative BG levels of the carbohydrate and non-carbohydrate groups, along with a trendline for each group (p > 0.05).
Figure 1. Preoperative BG Levels.
The figure shows the postoperative BG levels of the carbohydrate and non-carbohydrate groups, along with a trendline for each group (p > 0.05).
Figure 2. Postoperative BG Levels.
Malnutrition and Nutrition Assessment
Amy Patton, MHI, RD, CNSC, LSSGB1; Elisabeth Schnicke, RD, LD, CNSC2; Sarah Holland, MSc, RD, LD, CNSC3; Cassie Fackler, RD, LD, CNSC2; Holly Estes-Doetsch, MS, RDN, LD4; Kristen Roberts, PhD, RDN, LD, CNSC, FASPEN, FAND5; Christopher Taylor, PhD, RDN4
1The Ohio State University Wexner Medical Center, Westerville, OH; 2The Ohio State University Wexner Medical Center, Columbus, OH; 3The Ohio State University Wexner Medical Center, Upper Arlington, OH; 4The Ohio State University, Columbus, OH; 5The Ohio State University, Granville, OH
Financial Support: None Reported.
Background: The unfavorable association of malnutrition with hospital outcomes such as longer length of stay (LOS), increased falls, and increased hospital readmissions has been well documented in the literature. We aimed to see if a different model of care that lowered Registered Dietitian (RD) to patient ratios would lead to increased malnutrition identification and documentation within our facility. We also evaluated the relationship between these metrics and LOS on monitored hospital units.
Methods: In July 2022, two additional RDs were hired at a large 1000+ bed academic medical center as part of a pilot program focused on malnutrition identification, documentation integrity, and staff training. The RD to patient ratio was reduced from 1:75 to 1:47 on three units of the hospital. On the pilot units, the RD completed a full nutrition assessment, including a Nutrition Focused Physical Exam (NFPE), for all patients who were identified as "at risk" per hospital nutrition screening policy. Those patients that were not identified as “at risk” received a full RD assessment with NFPE by day 7 of admission or per consult request. A malnutrition dashboard was created with assistance from a Quality Data Manager from the Quality and Operations team. This visual graphic allowed us to track and monitor RD malnutrition identification rates by unit and the percentage of patients that had a malnutrition diagnosis captured by the billing and coding team. Data was also pulled from the Electronic Medical Record (EMR) to look at other patient outcomes. In a retrospective analysis we compared the new model of care to the standard model on one of these units.
Results: There was an increase in the RD-identified capture rate of malnutrition on the pilot units. On a cardiac care unit, the RD identification rate went from a baseline of 6% in Fiscal Year (FY) 2022 to an average of 12.5% over FY 2023-2024. On two general medicine units, the malnutrition rates identified by the RD nearly doubled during the two-year intervention (Table 1). LOS was significantly lower on one of the general medicine intervention floors compared to a control unit (p < 0.001, Cohen's D: 13.8) (Table 2). LOS was reduced on all units analyzed between FY22 and FY23/24. Patients with a malnutrition diagnosis had a 15% reduction in LOS from FY22 to FY23/24 in the control group, compared to a 19% reduction for those identified with malnutrition on the intervention unit. When comparing intervention versus control units for FY23 and FY24 combined, the intervention unit had a much lower LOS than the control unit.
Conclusion: Dietitian assessments and related interventions may contribute to reducing LOS. Reducing RD to patient ratios may allow for greater identification of malnutrition and support patient outcomes such as LOS. There is an opportunity to evaluate other patient outcomes for the pilot units, including falls, readmission rates, and Case Mix Index.
Table 1. RD Identified Malnutrition Rates on Two General Medicine Pilot Units.
Table 2. Control Unit and Intervention Unit Length of Stay Comparison.
1The Ohio State University Wexner Medical Center, Westerville, OH; 2The Ohio State University Wexner Medical Center, Columbus, OH
Financial Support: None Reported.
Background: Delays in identifying patients at nutrition risk can impact patient outcomes. Per facility policy, Dietetic Technicians (DTs) and Registered Dietitians (RDs) review patient charts, meet with patients, and assign a nutrition risk level. High-risk patients are then assessed by the RD for malnutrition among other nutrition concerns. Based on a new tracking process implemented in January 2023, an average of 501 patient nutrition risk assignments were overdue or incomplete per month from January through April 2023, with a dramatic increase to 835 in May. Missed risk assignments occur daily. If the risk assignment is not completed within the 72-hour window set by the policy, late or missed RD assessment opportunities and policy compliance concerns can result.
Methods: In June 2023, a Lean Six Sigma quality improvement project using the DMAIC (Define, Measure, Analyze, Improve, Control) framework was initiated at a 1000+ bed Academic Medical Center with the goal of improving the efficiency of the nutrition risk assignment (NRA) process for RDs and DTs. A secondary goal was to see if improved efficiency would also lead to an increase in RD-identified malnutrition based on Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition Indicators of Malnutrition (AAIM) criteria. A group of RDs and DTs met to walk through each of the quality improvement process steps. The problem was defined, and baseline measurements of malnutrition identification rates and missed nutrition risk assignments were analyzed. A fishbone diagram was used to aid root cause analysis, and a payoff matrix was later used to identify potential interventions. The improve phase was implemented in October and November 2023 and included changes to the screening policy itself and redistribution of clinical nutrition staff on certain patient units.
Results: The identified improvements had a positive impact on incomplete work and on malnutrition identification rates. Malnutrition identification rates averaged 11.7% from May through October compared with 12.1% from November through April. The number of missed NRAs decreased from an average of 975 per month (May through October) to 783 per month (November through April), a decrease of 192 per month (20%). An additional quality improvement cycle is currently underway to further improve these metrics.
Conclusion: Fostering a culture of ongoing improvement presents a significant challenge for both clinicians and leaders. Enhancing nutrition care and boosting clinician efficiency are critical goals. Introducing clinical nutrition leaders to tools designed for quality monitoring and enhancement can lead to better performance outcomes and more effective nutrition care. Tools such as those used for this project along with PDSA (Plan, Do, Study, Act) projects are valuable in this process. Involving team members from the beginning of these improvement efforts can also help ensure the successful adoption of practice changes.
1Department of Internal Medicine, Prosperidad, Agusan del Sur; 2Medical Nutrition Department, Tagum City, Davao del Norte
Financial Support: None Reported.
Background: Malnutrition is a strong predictor of mortality and morbidity, through poor response to therapy and reduced quality of life, among gastrointestinal (GI) cancer patients. In a tertiary government hospital in Tagum City with a cancer center, malnutrition screening among patients with cancer is routine, but no prior studies have examined the association between nutritional status and quality of life among GI cancer patients there. This study aims to determine whether nutritional status is associated with quality of life among adult Filipino GI cancer patients seeking cancer therapy in a tertiary government hospital.
Methods: A quantitative, observational, cross-sectional, analytical, and predictive survey design was used. The World Health Organization Quality of Life Brief Version (WHOQOL-BREF) questionnaire was used to determine participants' quality of life. Logistic regression analysis was used to assess the association of the demographic, clinical, and nutritional profiles of patients with gastrointestinal cancer with quality of life.
Results: Among respondents (n = 160, mean age 56.4 ± 12 years), the majority were male (61.9%), married (77.5%), Roman Catholic (81.1%), and had finished high school (38.1%). Almost half were diagnosed with colon adenocarcinoma (43.125%), followed by rectal adenocarcinoma (22.5%), rectosigmoid adenocarcinoma (11.875%), and GI stromal tumor (5.625%). On cancer staging, 40.625% were Stage 4, followed by Stage 3b (19.375%), Stage 3c (10%), Stage 3a (5.625%), and Stage 2a (4.375%). Only 2.5% were Stage 4a, while 0.625% were Stage 4b. More than one third received CAPEOX (38.125%), followed by FOLFOX (25.625%) and imatinib (5.625%). Among cases, 15.6% were underweight and 34.4% were overweight or obese. In terms of SGA grading, 38.1% were severely malnourished and 33.8% moderately malnourished, while the rest were normal to mildly malnourished. On quality of life, mean scores per variable were: generally good for general quality of life (3.71 ± 0.93); general satisfaction with perception of general health, with one's self, and with relationships with others (3.46 to 3.86 ± 0.97); moderate satisfaction with having enough energy for daily life, accepting bodily appearance, the availability of information needed for daily living, and the opportunity for leisure (2.71 to 3.36 ± 1.02); and little satisfaction with having enough money to meet needs (2.38 ± 0.92). Participants, on average, quite often experienced negative feelings such as low mood, despair, depression, and anxiety (2.81 ± 0.79). A significant association with quality of life among adult cancer patients was documented for age (p = 0.047), cancer diagnosis (p = 0.001), BMI status (p = 0.028), and SGA nutritional status (p = 0.010).
Conclusion: Nutritional status was significantly associated with quality of life among adult Filipino GI cancer patients seeking cancer therapy in a tertiary government hospital. Public health interventions addressing these factors may play a critical role in improving patient survival and outcomes.
1Massachusetts General Hospital, Sharon, MA; 2MedStart National Rehabilitation Hospital, Washington, DC; 3New England Baptist Hospital, Boston, MA; 4Center for Neurotechnology and Neurorecovery, Mass General Hospital, Boston, MA; 5Nutrition and Food Services, MGH, Boston, MA; 6Harvard Medical School and Mass General Hospital, Boston, MA; 7Neurocritical Care & Neurorecovery, MGH, Boston, MA
Financial Support: Academy of Nutrition and Dietetics, Dietitian in Nutrition Support Member Research Award.
Background: Nutritional status is a known modifiable factor for optimal recovery in brain injury survivors, yet, data on specific benchmarks for optimizing clinical outcomes through nutrition are limited. This pilot study aimed to quantify the clinical, nutritional, and functional characteristics of brain injury survivors from ICU admission to 90 days post-discharge.
Methods: Patients admitted to the Massachusetts General Hospital (MGH) Neurosciences ICU over a 12-month period were enrolled based on these criteria: age 18 years or greater, a primary diagnosis of acute brain injury, ICU stay of at least 72 hours, meeting MGH Neurorecovery Clinic referral criteria for outpatient follow-up, and survival beyond 90 days post-discharge. Data were collected from the electronic health record and from the neurorecovery clinic follow-up visit/phone interview. These included patient characteristics, acute clinical outcomes, nutrition intake, and surrogate nutrition and functional scores at admission, discharge, and 90 days post-discharge. Descriptive statistics were used for analysis.
Results: Of 212 admissions during the study period, 50 patients were included in the analysis. The mean APACHE (n = 50), GCS (n = 50), and NIHSS (n = 20) scores were 18, 11, and 15, respectively. Seventy-eight percent of the patients required ventilation, with a mean duration of 6.3 days. Mean ICU and post-ICU lengths of stay were 17.4 and 15.9 days, respectively. Eighty percent received nutrition (enteral or oral) within 24-48 hours of ICU admission. Mean ICU energy and protein intake over the first 7 days were 1128 kcal/day and 60.3 g protein/day, respectively, both 63% of estimated needs. Assessed against ASPEN guidelines, patients with a BMI ≥ 30 received less energy (11.6 vs 14.6 kcal/kg/day) but more protein (1.04 vs 0.7 g protein/kg/day) than those with a BMI < 30. Twelve percent of patients had less than 50% of their nutritional intake for at least 7 days pre-discharge and were considered nutritionally at risk. Forty-six percent were discharged with long-term enteral access. Only 16% of the patients were discharged to home rather than a rehabilitation facility. By 90 days post-discharge, 32% of the patients were readmitted, 27% due to stroke. Upon admission, patients' mean MUST (Malnutrition Universal Screening Tool) and MST (Malnutrition Screening Tool) scores were 0.56 and 0.48, respectively, reflecting low nutritional risk. By discharge, the mean MUST and MST scores increased to 1.16 and 2.08, respectively, suggesting these patients had become nutritionally at risk. At 90 days post-discharge, both scores returned to low nutrition risk (MUST 0.48 and MST 0.59). Patients' functional scores, as measured by the modified Rankin scale (mRS), followed a similar pattern: the mean score was 0.1 at admission, 4.2 at discharge, and 2.8 at 90 days post-discharge. The mean Barthel index at 90 days post-discharge was 64.1, indicating moderate dependence.
Conclusion: This pilot study highlighted key nutritional, clinical, and functional characteristics of brain injury survivors from ICU admission to 90 days post-discharge. Further statistical analysis will help delineate the relationships between nutritional status and clinical and functional outcomes, which may guide future research and practice in ICU nutrition and neurorehabilitation.
Background: Identifying frailty in patients with liver disease provides valuable insights into a patient's nutritional status and physical resilience. However, it is unclear whether reduced muscle mass is an important etiology of frailty in liver disease. Identifying a possible connection between frailty and muscle mass may lead to better risk prediction, personalized interventions, and improved outcomes in patient care. The purpose of this study is to determine whether frail patients have a lower skeletal muscle index (SMI) compared with non-frail patients with liver disease undergoing liver transplant evaluation.
Methods: A retrospective, cross-sectional study design was utilized. Patients greater than 18 years of age who underwent a liver transplant evaluation from January 1, 2019 through December 31, 2023 were included if they had a liver frailty index (LFI) assessment completed during the initial liver transplant evaluation and a diagnostic abdominal CT scan completed within 30 days of the initial liver transplant evaluation. Demographic data (age, sex, height, and BMI), etiology of liver disease, MELD-Na score, history of diabetes and hepatocellular carcinoma, liver disease complications (ascites, hepatocellular carcinoma, hepatic encephalopathy, and esophageal varices), and LFI score were recorded for each patient. LFI was recorded both as a continuous variable and dichotomized into a categorical variable (frail: LFI ≥ 4.5; not frail: LFI ≤ 4.4). Cross-sectional muscle area (cm2) at the third lumbar region of the CT was quantified; SMI was calculated (cm2/height in meters2), and low muscle mass was dichotomized into a categorical variable (low muscle mass: SMI ≤ 50 cm2/m2 for males and ≤39 cm2/m2 for females; normal muscle mass: SMI > 50 cm2/m2 for males and >39 cm2/m2 for females). An independent t-test was used to determine whether there was a difference in SMI between patients categorized as frail versus not frail.
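The SMI calculation and the categorical cutoffs described above can be expressed as a minimal sketch (function names and the worked example are illustrative, not part of the study; the cutoff values are those stated in the Methods):

```python
def skeletal_muscle_index(muscle_area_cm2: float, height_m: float) -> float:
    """SMI = L3 cross-sectional muscle area (cm2) divided by height squared (m2)."""
    return muscle_area_cm2 / height_m ** 2

def low_muscle_mass(smi: float, sex: str) -> bool:
    """Dichotomize SMI using the study's cutoffs:
    <= 50 cm2/m2 for males, <= 39 cm2/m2 for females."""
    cutoff = 50.0 if sex == "male" else 39.0
    return smi <= cutoff

def frail(lfi: float) -> bool:
    """Liver frailty index >= 4.5 is classified as frail."""
    return lfi >= 4.5

# Illustrative example: a 1.75 m male with 140 cm2 of L3 muscle area
smi = skeletal_muscle_index(140.0, 1.75)
print(round(smi, 1), low_muscle_mass(smi, "male"), frail(4.6))  # 45.7 True True
```

This sketch only mirrors the dichotomization rules reported in the Methods; it is not the authors' analysis code.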
Results: A total of 104 patients, 57% male, with a mean age of 57 ± 10 years and a mean BMI of 28.1 ± 6.4 kg/m2, were included. The mean MELD-Na score was 16.5 ± 6.9; 25% had a history of hepatocellular carcinoma and 38% had a history of diabetes. The majority of the sample had at least one liver disease complication (72% had ascites, 54% had hepatic encephalopathy, and 67% had varices). The mean LFI score was 4.5 ± 0.9, and 44% were categorized as frail. The mean SMI was 45.3 ± 12.6 cm2/m2, and 52% were categorized as having low muscle mass (males: 63%; females: 38%). There was no difference in SMI between patients who were frail versus not frail (43.5 ± 10.6 versus 47.3 ± 13.9 cm2/m2, p = 0.06). The difference in SMI by frailty status was reported separately for males and females; no significance testing was performed due to the small sample size. Both frail males (43.5 ± 12.2 versus 48.4 ± 14.9) and frail females (43.4 ± 9.3 versus 45.2 ± 11.8) had a lower SMI compared with their non-frail counterparts.
Conclusion: No difference in SMI between frail and non-frail patients was observed; however, the p-value of 0.06 suggests a marginal trend and a possible difference, and further research is needed to confirm these findings. Additionally, it is concerning that men had a higher rate of low muscle mass and that the mean SMI for both frail and non-frail men was below the cutoff used to identify low muscle mass (SMI ≤ 50 cm2/m2). Additional research is needed to explore the underlying factors contributing to low muscle mass in men, particularly in frail populations, and to determine whether targeted interventions aimed at improving muscle mass could mitigate frailty and improve clinical outcomes in patients undergoing liver transplant evaluation.
1The University of Alabama, Tuscaloosa, AL; 2The University of Alabama at Birmingham, Birmingham, AL; 3Thomas Jefferson University, Philadelphia, PA
Financial Support: The ALS Association Quality of Care Grant.
Background: Amyotrophic Lateral Sclerosis (ALS) is a progressive neurodegenerative disease. Malnutrition is common in people with ALS (PALS) due to dysphagia, hypermetabolism, self-feeding difficulty, and other challenges. In many clinical settings, malnutrition is diagnosed using the Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition indicators to diagnose malnutrition (AAIM) or the Global Leadership Initiative on Malnutrition (GLIM) criteria. However, little is known about malnutrition assessment practices in ALS clinics. This qualitative study explored how RDs at ALS clinics diagnose malnutrition in PALS.
Methods: Researchers conducted 6 virtual focus groups with 22 RDs working in ALS clinics across the United States. Audio recordings were transcribed verbatim, and transcripts were imported into NVivo 14 software (QSR International, 2023, Melbourne, Australia). Two research team members independently analyzed the data using deductive thematic analysis.
Results: The AAIM indicators were identified as the most used malnutrition diagnostic criteria. Two participants described using a combination of AAIM and GLIM criteria. Although all participants described performing thorough nutrition assessments, some said they do not formally document malnutrition in the outpatient setting due to lack of reimbursement and feeling as though the diagnosis would not change the intervention. Conversely, others noted the importance of documenting malnutrition for reimbursement purposes. Across all groups, RDs reported challenges in diagnosing malnutrition because of difficulty differentiating between disease- versus malnutrition-related muscle loss. Consequently, several RDs described adapting current malnutrition criteria to focus on weight loss, decreased energy intake, or fat loss.
Conclusion: Overall, RDs agreed that malnutrition is common among PALS, and they conducted thorough nutrition assessments as part of standard care. Among those who documented malnutrition, most used AAIM indicators to support the diagnosis. However, as muscle loss is a natural consequence of ALS, RDs perceived difficulty in assessing nutrition-related muscle loss. This study highlights the need for malnutrition criteria specific for PALS.
Table 1. Themes Related to Diagnosing Malnutrition in ALS.
Carley Rusch, PhD, RDN, LDN1; Nicholas Baroun, BS2; Katie Robinson, PhD, MPH, RD, LD, CNSC1; Maria Geraldine E. Baggs, PhD1; Refaat Hegazi, MD, PhD, MPH1; Dominique Williams, MD, MPH1
Financial Support: This study was supported by Abbott Nutrition.
Background: Malnutrition is increasingly recognized as a condition present in all BMI categories. Although much research to date has focused on malnutrition in patients with lower BMIs, there is a need to understand how nutrition interventions may alter outcomes for those with higher BMIs and existing comorbidities. In a post-hoc analysis of the NOURISH trial, which investigated hospitalized older adults with malnutrition, we sought to determine whether consuming a specialized oral nutritional supplement containing high energy, protein, and beta-hydroxy-beta-methylbutyrate (ONS+HMB) can improve vitamin D and nutritional status in those with a BMI ≥ 27.
Methods: Using data from the NOURISH trial, a randomized, placebo-controlled, multi-center, double-blind study conducted in hospitalized participants with malnutrition and a primary diagnosis of congestive heart failure, acute myocardial infarction, pneumonia, or chronic obstructive pulmonary disease, post-hoc analysis was conducted. In the trial, participants received standard care with either ONS + HMB or a placebo beverage (target 2 servings/day) during hospitalization and for 90 days post-discharge. Nutritional status was assessed using Subjective Global Assessment (SGA) and handgrip strength at baseline, 0-, 30-, 60- and 90-days post-discharge. Vitamin D (25-hydroxyvitamin D) was assessed within 72 hours of admission (baseline), 30- and 60-days post-discharge. Participants with a BMI ≥ 27 formed the analysis cohort. Treatment effect was determined using analysis of covariance adjusted for baseline measures.
Results: The post-hoc cohort consisted of 166 patients with a BMI ≥ 27 and a mean age of 76.41 ± 8.4 years; slightly more than half were female (51.2%). Baseline handgrip strength (n = 137) was 22.3 ± 0.8 kg, while the serum concentration of 25-hydroxyvitamin D (n = 138) was 26.0 ± 1.14 ng/mL. At day 90, ONS+HMB improved nutritional status: 64% of the ONS+HMB group were well-nourished (SGA-A) vs. 37% of the control group (p = 0.011). There was a trend toward a greater change in handgrip strength with ONS+HMB during the index hospitalization (baseline to discharge) compared to placebo (least squares means ± standard error: 1.34 ± 0.35 kg vs. 0.41 ± 0.39; p = 0.081), but the difference was not significant at other timepoints. Vitamin D concentrations were significantly higher at day 60 in those receiving ONS+HMB compared to placebo (29.7 ± 0.81 vs. 24.8 ± 0.91 ng/mL; p < 0.001).
Conclusion: Hospitalized older patients with malnutrition and a BMI ≥ 27 had significant improvements in their vitamin D and nutritional status at days 60 and 90, respectively, when they received standard care plus ONS+HMB as compared to placebo. This suggests that transitions of care to the post-acute setting should consider continuation of nutrition interventions such as ONS+HMB, in combination with standard care, for patients with elevated BMI and malnutrition.
Aline Dos Santos1; Isis Helena Buonso2; Marisa Chiconeli Bailer2; Maria Fernanda Jensen Kok2
1Hospital Samaritano Higienópolis, São Paulo; 2Hospital Samaritano Higienópolis, São Paulo
Financial Support: None Reported.
Background: Malnutrition negatively impacts length of hospital stay, infection rate, mortality, clinical complications, hospital readmission, and average healthcare costs. Early nutritional intervention is believed to reduce negative events and generate economic impact. Therefore, our objective was to evaluate the average cost of hospitalization of patients at nutritional risk, identified through nutritional screening, with an indication for oral nutritional supplementation.
Methods: Retrospective study including 110 adult patients hospitalized in a private institution, admitted between August 2023 and January 2024. Nutritional screening was performed within 24 hours of admission. To classify low muscle mass according to calf circumference (CC), cutoff points of 33 cm for women and 34 cm for men were used, measured within 96 hours of hospital admission. Patients were evaluated in groups: G1, patients with an indication for oral supplementation (OS) not started for modifiable reasons; G2, patients with an indication for OS started assertively (within 48 hours of the therapeutic indication); G3, patients with an indication for OS started late (after 48 hours of the therapeutic indication); and G4, the combination of G1 and G3, as neither received OS assertively. Patients receiving enteral or parenteral nutrition therapy were excluded.
Results: G2 was the largest group in the studied sample (51%) and had an intermediate average length of stay (20.9 days), the lowest average daily hospitalization cost, an average age of 71 years, a significant prevalence of low muscle mass (56%), and a lower need for intensive care (IC) admission (63%), with an average IC length of stay of 13.5 days. G1 had the lowest prevalence (9%), the shortest average length of stay (16 days), an average daily hospitalization cost 41% higher than G2, an average age of 68 years, uniformly adequate muscle mass (100%), and a considerable need for IC admission (70%), but with an IC length of stay of 7.7 days. G3 represented 40% of the sample, with the longest average length of stay (21.5 days), an average daily hospitalization cost 22% higher than G2, an average age of 73 years, a significant prevalence of low muscle mass (50%), and an intermediate need for IC admission (66%), but with an IC length of stay of 16.5 days. Compared to G2, G4 had a similar sample size (G2: 56 patients; G4: 54 patients) as well as similar mean age (72 years), length of stay (20.55 days), IC admission (66%), and IC length of stay (64.23%), but a higher average daily hospitalization cost (39% higher than G2) and a higher prevalence of low muscle mass (59%).
Conclusion: From the results presented, we conclude that the percentage of patients who did not receive OS and who spent time in IC was on average 5% higher than in the other groups; this group had uniformly adequate muscle mass but required supplementation due to clinical conditions, food acceptance, and weight loss. More than 50% of patients in all groups except G1 had low muscle mass. Regarding costs, patients supplemented assertively or late cost, respectively, 45% and 29% less compared with patients who did not receive OS. Comparing G2 with G4, the cost remains 39% lower in assertively supplemented patients.
Background: Malnutrition in hospitals, once an underreported issue, has gained significant attention in recent decades. This widespread problem negatively impacts recovery, length of stay (LOS), and overall outcomes. This study aimed to evaluate the effectiveness of clinical nutrition practices and the role of a Nutrition Steering Committee (NSC) in addressing and potentially eradicating this persistent problem by improving nutritional care and patient outcomes.
Methods: Consecutive patients admitted to non-critical units of a tertiary care hospital between January 2018 and August 2024 were included in the study. Patient demographics, body mass index (BMI), modified Subjective Global Assessment (mSGA), oral nutritional supplement (ONS) usage, and clinical outcomes were retrospectively extracted from electronic medical records. Data were analyzed using SPSS version 20.0, comparing results before and after the implementation of the NSC.
Results: Of 239,630 consecutive patients, 139,895 non-critical patients were included, with a mean age of 57.10 ± 15.89 years; 64.3% were men and 35.7% women. The mean BMI was 25.76 ± 4.74 kg/m2, and 49.6% of the patients were polymorbid. The largest group (25.8%) was admitted with cardiac illness. According to the mSGA, 87.1% were well-nourished, 12.8% moderately malnourished, and 0.1% severely malnourished. ONS were prescribed for 10% of the population. ONS prescription was highest among underweight patients (28.4%), followed by those with normal BMI (13%), overweight (9.1%), and obesity (7.7%) (p < 0.001); by mSGA, ONS prescription was 5.5% in the well-nourished, 41% in the moderately malnourished (MM), and 53.2% in the severely malnourished (SM) (p < 0.001); by specialty, it was highest in pulmonology (23.3%), followed by gastroenterology and hepatology (19.2%) (p < 0.001). The mean hospital LOS was 4.29 ± 4.03 days, with an overall mortality rate of 1.2%. Severe malnutrition, as rated by mSGA, significantly impacted mortality (0.8% vs. 5.1%, p < 0.001). Mortality risk increased with polymorbidity (0.9% vs. 1.5%) and respiratory illness (2.6%, p < 0.001). Poor nutritional status, as assessed by mSGA (34.7%, 57.4%, 70.9%) and BMI (43.7% vs. 38%), was associated with longer hospital stays (LOS ≥ 4 days, p < 0.001). The implementation of the NSC led to significant improvements: average LOS decreased (4.4 vs. 4.1 days, p < 0.001), and mortality risk was reduced from 1.6% to 0.7% (p < 0.001). No significant changes were observed in baseline nutritional status, indicating effective clinical nutrition practices in assessing patient nutrition. ONS prescriptions increased from 5.2% to 9.7% between 2022 and 2024 (p < 0.001), contributing to the reduction in mortality rates to below 1% after 2022, compared with over 1% before the NSC (p < 0.001). A significant negative correlation was found between LOS and ONS usage (p < 0.001).
Step-wise binary logistic regression indicated that malnutrition assessed by mSGA predicts mortality with an odds ratio of 2.49 and LOS with an odds ratio of 1.7, followed by ONS prescription and polymorbidity (p < 0.001).
Conclusion: A well-functioning NSC is pivotal in driving successful nutritional interventions and achieving organizational goals. Early identification of malnutrition through mSGA, followed by timely and appropriate nutritional interventions, is essential to closing care gaps and improving clinical outcomes. Strong leadership and governance are critical in driving these efforts, ensuring that the patients receive optimal nutritional support to enhance recovery and reduce mortality.
Table 1. Baseline Details of Anthropometric Measurements and Nutrition Status.
Table 2. Logistic Regression to Predict Hospital LOS and Mortality.
mSGA-rated malnourished patients stayed longer in the hospital compared with the well-nourished category (p < 0.001).
Figure 1. Nutritional Status (mSGA) vs Hospital LOS (> 4 days).
Background: Food insecurity is the lack of reliable access to enough food and uncertainty about where the next meal will come from. In the United States, approximately 49 million people relied on food assistance charities in 2022 (data from Feeding America). Patients receiving parenteral nutrition (PN) who may be capable of supplementing with oral intake may experience food insecurity due to chronic health conditions limiting work capability and total family income. Patients may also face a lack of affordable housing, increased utilities, and the burden of medical expenses. Signs of food insecurity may present as weight loss, malnourishment, low energy, difficulty concentrating, or other physical indicators such as edema, chronically cracked lips, dry skin, and itchy eyes. The purpose of this abstract is to highlight two unique patient case presentations in which food insecurity prompted clinicians to intervene.
Methods: Patient 1: A 50-year-old male with short bowel syndrome (SBS) on long-term PN called the registered dietitian (RD) regarding financial difficulties with feeding his family (see Table 1). The patient-clinician relationship allowed the patient to convey sensitive concerns to the RD regarding his inability to feed himself and his family, which resulted in the patient relying on PN for all nutrition. Given the food insecurity present, the clinician made changes to PN/hydration to help improve the patient's clinical status. Patient 2: A 21-year-old male with SBS on long-term PN spoke with his in-home registered nurse (RN) regarding his family's difficulties affording food (see Table 2). The RN informed the clinical team of suspected food insecurity, and the insurance case manager (CM) was contacted regarding food affordability. The RD reached out to local community resources such as food banks, food boxes, and community programs. A community program was able to assist the patient with meals until the patient's aunt started cooking meals for him. This patient did not directly share his food insecurity with the RD; however, the relationship with the in-home RN proved valuable in having these face-to-face conversations with the patient.
Results: In these two patient examples, difficulty obtaining food affected the patients’ clinical status. The clinical team identified food insecurity and the need for further education for the interdisciplinary team. A food insecurity informational handout was created by the RD with an in-service to nursing to help aid recognition of signs (Figure 1) to detect possible food insecurity and potential patient resources available in the community. Figure 2 presents suggested questions to ask a patient if an issue is suspected.
Conclusion: Given the prevalence of food insecurity, routine assessment for its signs and symptoms is essential. Home nutrition support teams (including RDs, RNs, pharmacists, and care technicians) are positioned to assist in this effort, as they have frequent phone and in-home contact with patients and together build a trusted relationship with patients and caregivers. Clinicians should be aware of potential social situations that can warrant changes to PN formulations. To approach this sensitive issue thoughtfully, PN infusion providers should consider enhancing patient assessments and promoting education across the interdisciplinary team to create awareness of accessible community resources.
1Cleveland Clinic, University Heights, OH; 2Cleveland Clinic Foundation, Sullivan, OH; 3Cleveland Clinic, Cleveland, OH; 4Cleveland Clinic Cancer Center, Cleveland, OH
Encore Poster
Presentation: The Society of Critical Care Medicine, Critical Care Congress in Orlando, FL on February 25th.
Publication: Critical Care Medicine. 2025;53(1): in press.
Financial Support: Project support provided by Morrison Cleveland Clinic Nutrition Research collaborative.
Background: Hospitalized and critically ill patients with preexisting malnutrition can have worse outcomes and an increased length of stay (LOS). Currently, Registered Dietitians (RDs) assess malnutrition with a nutrition focused physical exam (NFPE). Recent recommendations encourage the use of body composition tools such as computed tomography (CT) along with the NFPE. Trained RDs can use CT scans to evaluate skeletal muscle mass at the third lumbar vertebra (L3) and then calculate the Skeletal Muscle Index (SMI) and mean Hounsfield Units (HU) to determine muscle size and quality, respectively. This approach has been validated in various clinical populations and may be particularly useful in the critically ill, in whom the NFPE is difficult. We aim to evaluate whether CT scans in the surgical and critical care population can serve as a supportive tool to capture a missed malnutrition diagnosis.
Methods: One hundred twenty patients admitted to the Cleveland Clinic from 2021-2023 with a malnutrition evaluation that included an NFPE within 2 days of an abdominal CT were evaluated. Of those, 59 patients had a major surgery or procedure completed during that admission and were included in the final analysis. The CT scans were read by a trained RD at L3 using Terarecon, and results were cross-referenced by an artificial intelligence (AI) software called Veronai. Age, sex, BMI, SMI, and HU were analyzed, along with the malnutrition diagnosis.
Results: Fifty-nine patients were analyzed. Of these, 61% were male, 51% were >65 years old, and 24% had a BMI > 30. Malnutrition was diagnosed in 47% of patients. A total of 24 patients had no muscle wasting based on the NFPE, while CT captured low muscle mass in 58% of that group. Twenty-two percent of patients (13/59) had a higher level of malnutrition severity when using CT. Additionally, poor muscle quality was detected in 71% of patients across all age groups. Notably, there was 95% agreement between the AI and the RD's assessments in detecting low muscle mass.
Conclusion: RDs can effectively analyze CT scans and use SMI and HU with their NFPE. The NFPE alone is not always sensitive enough to detect low muscle in surgical and critically ill patients. The degree of malnutrition dictates nutrition interventions, including the timing and type of nutrition support, so it is imperative to accurately diagnose and tailor interventions to improve outcomes.
Table 1. Change in Malnutrition Diagnosis Using CT.
The graph shows the change in Malnutrition Diagnosis when CT was applied in conjunction with the NFPE utilizing ASPEN Guidelines.
Table 2. Muscle Assessment: CT vs NFPE.
This graph compares muscle evaluation using both CT and the NFPE.
CT scan at the 3rd lumbar vertebra showing normal muscle mass and normal muscle quality in a patient >65 years old.
Figure 1. CT Scans Evaluating Muscle Size and Quality.
CT scan at the 3rd lumbar vertebra showing low muscle mass and low muscle quality in a patient with obesity.
Figure 2. CT Scans Evaluating Muscle Size and Quality.
Background: Disease-related malnutrition alters body composition and causes functional decline. In acute care hospitals, 20-50% of inpatients have malnutrition on admission. Patients with malnutrition experience higher medical costs, increased mortality, and longer hospital stays, and malnutrition is associated with an increased risk of readmission and complications. Monitoring, diagnosis, treatment, and documentation of malnutrition are important components of patient care. Documentation also contributes to proper Diagnosis Related Group (DRG) coding and an accurate Case Mix Index (CMI), which can increase reimbursement.
Methods: After dietitians completed the Nutrition Focused Physical Exam (NFPE) course and sufficient staff had been secured, the malnutrition project was initiated under the leadership of RDNs in our small rural community hospital. The interdisciplinary medical committee approved the project to improve malnutrition screening, diagnosis, treatment practices, and coding. The Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition (ASPEN) and Global Leadership Initiative on Malnutrition (GLIM) criteria were used to diagnose malnutrition. The Malnutrition Screening Tool (MST) was completed by nurses to determine the risk of malnutrition. The Nutrition and Dietetics department created a new custom report that identifies patients on NPO, clear liquid, or full liquid diets using the nutrition database. RDNs check these diet reports, BMI reports, and length of stay (LOS) patient lists on weekdays and weekends, and perform the NFPE to evaluate nutritional status. If malnutrition is identified, RDNs communicate with providers through the hospital messenger system, and providers add the malnutrition diagnosis to their documentation and plan of care. RDNs created a dataset, shared it with Coders and Clinical Documentation Integrity Specialists/Care Coordination, and track malnourished patients. In addition, RDNs spent more time with malnourished patients and contributed to discharge planning and education.
Results: The prevalence of malnutrition diagnosis and the amount of reimbursement in 2023 were compared with the six months after implementing the malnutrition project. Malnutrition diagnosis increased from 2.6% to 10.8%. Unspecified protein-calorie malnutrition diagnoses decreased from 39% to 1.5%. RDN-diagnosed malnutrition was documented in provider notes for 82% of cases. The malnutrition diagnosis rate increased by 315% and the malnutrition reimbursement rate increased by 158%. Of the patients identified with malnutrition, 59% received a malnutrition DRG code; the remaining 41% received higher major complications and comorbidities (MCC) codes. Our malnutrition reimbursement increased from ~$106,000 to ~$276,000.
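The reported relative increases follow from simple percent-change arithmetic; a quick check against the figures above:

```python
# Percent-change arithmetic behind the before/after comparisons above.
def pct_increase(before, after):
    return (after - before) / before * 100

# Diagnosis rate: 2.6% -> 10.8% of cases, a ~315% relative increase.
print(round(pct_increase(2.6, 10.8)))
# Reimbursement: ~$106,000 -> ~$276,000. The rounded dollar figures give
# ~160%, close to the reported 158% (presumably computed from unrounded
# totals).
print(round(pct_increase(106_000, 276_000)))
```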
Conclusion: The implementation of evidence-based practice guidelines was key in identifying and accurately diagnosing malnutrition. The provision of sufficient staff with the necessary training and multidisciplinary teamwork has improved malnutrition diagnosis documentation in our hospital, increasing malnutrition reimbursement.
Table 1. Before and After Malnutrition Implementation Results.
Figure 1. Prevalence of Malnutrition Diagnosis.
Elisabeth Schnicke, RD, LD, CNSC1; Sarah Holland, MSc, RD, LD, CNSC2
1The Ohio State University Wexner Medical Center, Columbus, OH; 2The Ohio State University Wexner Medical Center, Upper Arlington, OH
Financial Support: None Reported.
Background: Malnutrition is associated with increased length of stay, readmissions, mortality, and poor outcomes, so early identification and treatment are essential. The Malnutrition Screening Tool (MST) is a quick, easy tool recommended for malnutrition screening in adult hospitalized patients and is commonly used with or without additional indicators. We aimed to evaluate our current nutrition screening policy, which utilizes MST, age, and body mass index (BMI), to improve malnutrition identification.
Methods: This quality improvement project data was obtained over a 3-month period on 4 different adult services at a large academic medical center. Services covered included general medicine, hepatology, heart failure, and orthopedic surgery. Patients were assessed by a Registered Dietitian (RD) within 72 hrs of admission if they met any of the following high-risk criteria: MST score ≥2 completed by nursing on admission, age ≥65 yrs, or BMI ≤ 18.5 kg/m2. If none of the criteria were met, patients were seen within 7 days of admission, or sooner by consult request. Malnutrition was diagnosed using Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition indicators of malnutrition (AAIM) criteria. Data collected included malnutrition severity and etiology, age, gender, BMI, and the MST generated on admission.
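The screening rule above can be sketched as a simple predicate. This is a hypothetical helper, not the site's actual logic, and it treats the MST cutoff as a score of 2 or more, the conventional at-risk threshold.

```python
# Sketch of the admission triage rule described in the methods above.

def high_risk(mst, age_years, bmi_kg_m2):
    """True if any high-risk criterion is met -> RD assessment within 72 h."""
    return any([
        mst is not None and mst >= 2,   # nursing-completed MST on admission
        age_years >= 65,
        bmi_kg_m2 <= 18.5,
    ])

# Patients not flagged are seen within 7 days, or sooner by consult.
print(high_risk(mst=0, age_years=70, bmi_kg_m2=24.0))   # True: age criterion
print(high_risk(mst=1, age_years=50, bmi_kg_m2=22.0))   # False: none met
```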
Results: A total of 239 patients were diagnosed with malnutrition; Table 1 shows detailed characteristics. Malnutrition was distributed similarly across gender (51% male, 49% female) and age groups. Age ranged from 21-92 yrs (average 61 yrs) and BMI from 9.8-50.2 kg/m2 (average 24.6 kg/m2). Most patients had moderate malnutrition (61.5%) and chronic malnutrition (54%). When data were stratified by age ≥65 yrs, similar characteristics were seen for malnutrition severity and etiology. Notably, more of these patients (61.5%) had an MST of < 2 or an incomplete MST compared to patients < 65 yrs of age (56%). There were 181 patients (76%) who met high-risk screening criteria and were seen for an assessment within 72 hrs. Seventy patients (39%) were screened due to age ≥65 yrs alone, 45 (25%) due to MST alone, and 54 (30%) met 2 indicators for screening. Only a small number of patients met BMI criteria alone or all 3 indicators (6 patients, or 3%, each).
Conclusion: Utilizing the MST alone would have missed over half of patients diagnosed with malnutrition, and the miss rate was higher among older adults. Age alone as a screening criterion identified more patients than MST alone. Adding BMI to the screening criteria added very little, and we still missed 24% of patients with our criteria. A multi-faceted tool should be explored to best capture patients.
Table 1. Malnutrition characteristics.
*BMI: excludes those with amputations, paraplegia; all patients n = 219, patients ≥65yo n = 106.
Robin Nuse Tome, MS, RD, CSP, LDN, CLC, FAND1
1Nemours Children's Hospital, DE, Landenberg, PA
Financial Support: None Reported.
Background: Malnutrition is a global problem impacting patients of all ages, genders, and races. Malnourished hospitalized patients have poorer outcomes, including longer in-hospital lengths of stay, higher rates of death, greater need for home healthcare services, and higher rates of 30-day readmission. This results in a higher economic burden for the healthcare industry, insurance, and the individual. Malnutrition documentation plays a vital role in capturing the status of the patient and starting the conversation about interventions to address the concern.
Methods: A taskforce made up of physicians, registered dietitians (RDs), and clinical documentation specialists met to discuss strategies to increase documentation of the malnutrition diagnosis and facilitate better conversation about the concern and potential interventions. Options included inbox messages, a best practice alert, a SmartLink between the physician and RD notes, and adding the diagnosis to the problem list. Of these options, the team selected a SmartLink, developed within the Electronic Medical Record (EMR), that links text from the RD note to the physician note to capture the diagnosis of malnutrition, its severity, and the progression of the diagnosis over time.
Results: Preliminary data show that physician documentation of the malnutrition diagnosis, as well as its severity and progression, increased by 20% in the pilot medical team. Anecdotally, physicians were more aware of the patient's nutrition status with documentation linked to their note, and collaboration between the medical team and the RD to treat malnutrition increased.
Conclusion: We hypothesize that expanding the practice to the entire hospital will increase documentation of the malnutrition diagnosis in the physician note. This will increase awareness of the patient's nutrition status, draw attention to and promote collaboration on interventions to treat malnutrition, and increase billable revenue to the hospital by capturing documentation of the degree of malnutrition in the physician note.
David López-Daza, RD1; Cristina Posada-Alvarez, Centro Latinoamericano de Nutrición1; Alejandra Agudelo-Martínez, Universidad CES2; Ana Rivera-Jaramillo, Boydorr SAS3; Yeny Cuellar-Fernández, Centro Latinoamericano de Nutrición1; Ricardo Merchán-Chaverra, Centro Latinoamericano de Nutrición1; María-Camila Gómez-Univio, Centro Latinoamericano de Nutrición1; Patricia Savino-Lloreda, Centro Latinoamericano de Nutrición1
1Centro Latinoamericano de Nutrición (Latin American Nutrition Center), Chía, Cundinamarca; 2Universidad CES (CES University), Medellín, Antioquia; 3Boydorr SAS, Chía, Cundinamarca
Financial Support: None Reported.
Background: The Malnutrition Screening Tool (MST) is a simple and quick instrument designed to identify the risk of malnutrition in various healthcare settings, including home hospitalization. Its use has become widespread due to its ease of application and ability to preliminarily assess the nutritional status of patients. In the context of home care, where clinical resources may be limited, an efficient screening tool is crucial to ensure early interventions that prevent nutritional complications in vulnerable populations. The objective was to evaluate the diagnostic accuracy of the MST in detecting malnutrition among patients receiving home care.
Methods: A diagnostic test study was conducted, collecting sociodemographic data, MST results, and malnutrition diagnoses based on the Global Leadership Initiative on Malnutrition (GLIM) criteria. A positive MST score was defined as a value of 2 or above. Categorical data were summarized using proportions, while quantitative variables were described using measures of central tendency. Sensitivity, specificity, the area under the receiver operating characteristic curve (AUC), positive likelihood ratio (LR+), and negative likelihood ratio (LR-) were estimated along with their respective 95% confidence intervals.
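For reference, the accuracy measures named above follow from the standard 2x2 table of test result against GLIM diagnosis. A minimal sketch, with hypothetical counts rather than the study's data:

```python
# Standard 2x2 diagnostic-accuracy formulas referenced in the methods
# above. The counts below are a hypothetical illustration.

def diagnostics(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)          # P(test+ | malnourished)
    specificity = tn / (tn + fp)          # P(test- | not malnourished)
    lr_pos = (sensitivity / (1 - specificity)
              if specificity < 1 else float("inf"))
    lr_neg = (1 - sensitivity) / specificity
    return sensitivity, specificity, lr_pos, lr_neg

sens, spec, lr_pos, lr_neg = diagnostics(tp=40, fp=10, fn=60, tn=90)
print(f"sens={sens:.2f} spec={spec:.2f} LR+={lr_pos:.2f} LR-={lr_neg:.2f}")
```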
Results: A total of 676 patients were included, with a median age of 82 years (interquartile range: 68-89 years), and 57.3% were female. According to the GLIM criteria, 59.8% of the patients met the criteria for malnutrition. The MST classified patients into low risk (62.4%), medium risk (30.8%), and high risk (6.8%). The sensitivity of the MST was 11.4% (95% CI: 8.5-14.9%), while its specificity was 100% (95% CI: 98.7-100%). The positive likelihood ratio (LR+) was 1.75, and the negative likelihood ratio (LR-) was 0.0. The area under the curve was 0.71 (95% CI: 0.69-0.73), indicating moderate discriminative capacity.
Conclusion: While the MST demonstrates extremely high specificity, its low sensitivity limits its effectiveness in accurately identifying malnourished patients in the context of home care. This suggests that, although the tool is highly accurate for confirming the absence of malnutrition, it fails to detect a significant number of patients who are malnourished. As a result, while the MST may be useful as an initial screening tool, its use should be complemented with more comprehensive assessments to ensure more precise detection of malnutrition in this high-risk population.
1University of Dayton, Xenia, OH; 2University of Dayton, Dayton, OH; 3Premier Health, Dayton, OH
Financial Support: None Reported.
Background: The prevalence of malnutrition in critically ill populations has previously been shown to be between 38-78%. Previously published guidelines state that patients in the ICU should be screened for malnutrition within 24-48 hours and that all patients in the ICU for >48 hours should be considered at high risk for malnutrition. Patients with a malnutrition diagnosis in the ICU have been shown to have poorer clinical outcomes, including longer length of stay, greater readmission rates, and increased mortality. The purpose of the current study was to determine whether severity of malnutrition impacted the time to initiate enteral nutrition and the time to reach goal enteral nutrition rate in critically ill patients, and to determine the possible impact of malnutrition severity on clinical outcomes.
Methods: A descriptive, retrospective chart review was conducted in multiple ICU units at a large level I trauma hospital in the Midwest. All participants included in the analysis had been assessed for malnutrition by a registered dietitian according to ASPEN clinical practice guidelines. Exclusion criteria included patients receiving EN prior to the RDN assessment, those who received EN for < 24 hours total, patients on mixed oral and enteral nutrition diets, and patients receiving any parenteral nutrition. Participants were grouped by malnutrition status: no malnutrition (n = 27), moderate malnutrition (n = 22), and severe malnutrition (n = 32). All data were analyzed using SPSS version 29.
Results: There was no difference in primary outcomes (time to EN initiation, time to EN goal rate) by malnutrition status (both p > 0.05). Multiple regression analysis found that neither moderately nor severely malnourished patients were more likely to have enteral nutrition initiation delayed >48 hours from admission (p > 0.05). Neither ICU LOS nor hospital LOS differed among malnutrition groups (p > 0.05), and neither ICU nor hospital mortality differed among malnutrition groups (p > 0.05). Among moderately malnourished patients, 81.8% required vasopressors, compared to 75% of severely malnourished patients and 44.4% of patients without a malnutrition diagnosis (p = 0.010). Extended time on a ventilator (>72 hours) was required by 90.9% of moderately malnourished patients, compared to 59.4% of severely malnourished patients and 51.9% of patients without a malnutrition diagnosis (p = 0.011).
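The vasopressor comparison can be illustrated with a chi-square test of independence. The counts below are reconstructed approximately from the reported group sizes and percentages, not taken from the raw data, and the test is sketched in pure Python rather than SPSS.

```python
# Pure-Python chi-square test of independence for the vasopressor
# comparison above. Counts reconstructed from the reported figures:
# moderate 18/22, severe 24/32, no malnutrition 12/27 on vasopressors.

def chi_square(table):
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            stat += (observed - expected) ** 2 / expected
    return stat

# Rows: moderate, severe, none; columns: vasopressors yes, no.
table = [[18, 4], [24, 8], [12, 15]]
stat = chi_square(table)
# With (3-1)*(2-1) = 2 degrees of freedom, the 0.05 critical value is
# 5.991; a statistic above it is consistent with the reported p = 0.010.
print(round(stat, 2))
```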
Conclusion: Although the severity of malnutrition did not impact LOS, readmission, or mortality, malnutrition status did significantly predict greater odds of a patient requiring vasopressors and spending an extended amount of time on a ventilator. Further studies with larger sample sizes are warranted to continue developing a better understanding of the relationship between malnutrition status and clinical outcomes.
Publication: 2024 Vizient supplement to the American Journal of Medical Quality (AJMQ).
Financial Support: None Reported.
Background: Research reveals that up to 50% of hospitalized patients are malnourished, yet only 9% of these cases are diagnosed (1). Inadequate diagnosis and intervention of malnutrition can lead to poorer patient outcomes and reduced revenue. Our systemwide malnutrition awareness campaign successfully enhanced dietitian engagement, provider education, and streamlined documentation processes. This initiative resulted in a two-fold increase in the capture of malnutrition codes, a notable rise in malnutrition variable capture, and an average increase in diagnosis-related group relative weight by approximately 0.9. Consequently, there was a ~300% increase in revenue associated with accurate malnutrition diagnosis and documentation, alongside improvements in the observed-to-expected (O/E) ratios for mortality and length of stay. Given the critical impact of malnutrition on mortality, length of stay, and costs, enhanced identification programs are essential. Within our health system, a malnutrition identification program has been implemented across five hospitals for several years. System leadership roles for clinical nutrition and clinical documentation integrity (CDI) were established to ensure consistency, implement best practices, and optimize program efficacy. In 2022, a comprehensive analysis identified opportunities for improvement: a low systemwide capture rate (2%), limited awareness of the program's benefits, and inconsistent documentation practices. The leadership team, with the support of our executive sponsors, addressed these issues, engaged with service line leaders, and continues to drive program enhancements.
Methods: A 4-part malnutrition education campaign was implemented: (1) strengthened collaboration between Clinical Nutrition and CDI, ensuring daily systemwide communication of newly identified malnourished patients, with leadership teams, including coding and compliance, reviewing documentation protocols in light of denial risks and regulatory audits; (2) a systemwide dietitian training program with a core RD malnutrition optimization team, a 5-hour comprehensive training, and monthly chart audits aiming for >80% documentation compliance; (3) a Provider Awareness Campaign featuring interactive presentations on the malnutrition program's benefits and provider documentation recommendations; and (4) an electronic health record (EHR) report and a malnutrition EHR tool to standardize documentation. EHR and financial reports were used to monitor program impact and capture rates.
Results: The malnutrition campaign has notably improved outcomes through ongoing education for stakeholders. A malnutrition EHR tool was also created in November 2022. This tool is vital for enhancing documentation, significantly boosting provider and CDI efficiency. Key results include: Dietitian documentation compliance increased from 85% (July 2022) to 95% (2024); RD-identified malnutrition cases increased from 2% (2021) to 16% (2024); Monthly average of final coded malnutrition diagnoses increased from 240 (2021) to 717 (2023); Average DRG relative weight climbed from 1.24 (2021) to 2.17 (2023); Financial impact increased from $5.5 M (2021) to $17.7 M (2024); and LOS O/E improved from 1.04 to 0.94 and mortality O/E improved from 0.77 to 0.62 (2021-2023).
Conclusion: This systemwide initiative not only elevates capture rates and documentation but also enhances overall outcomes. With CDI and RD teams taking on a more collaborative leadership role, providers can concentrate more on patient care, allowing these teams to operate at their peak. Looking ahead to 2025, the focus will shift toward leading indicators to refine malnutrition identification and further assess the educational campaign's impact.
Ryota Sakamoto, MD, PhD1
1Kyoto University, Kyoto
Financial Support: None Reported.
Background: Growing concern about the environmental impact of meat eating has led to consideration of a shift to a plant-based diet. One nutrient that tends to be particularly deficient in a plant-based diet is vitamin B12. Since vitamin B12 deficiency can cause anemia, limb paresthesia and muscle weakness, and psychiatric symptoms including depression, delirium, and cognitive impairment, it is important to find sustainable local sources of vitamin B12. In this study, we focused on gundruk and sinki, two fermented and preserved vegetables traditionally consumed mainly in Nepal, India, and Bhutan, and investigated their vitamin B12 content. Sinki is mainly made from radish roots, while gundruk is made from green leaves such as mustard leaves; both are preserved through fermentation and sun-drying. Previous reports indicated that, in these regions, not only vegetarians and vegans but also a significant number of people may have consistently low meat intake, especially among the poor. Governments and other organizations have been initiating feeding programs to supply foods fortified with vitamins A, B1, B2, B3, B6, B9, and B12, iron, and zinc, especially to schools. At this time, however, it is not easy to get fortified foods to residents in the community. It is therefore important to explore the possibility of obtaining vitamin B12 from locally available products that can be consumed by vegetarians, vegans, and the poor in these communities.
Methods: Four samples of gundruk and five samples of sinki were obtained from markets, and their vitamin B12 content was determined using Lactobacillus delbrueckii subsp. lactis (Lactobacillus leichmannii) ATCC7830. The lower limit of quantification was set at 0.03 µg/100 g. The sample with the highest vitamin B12 concentration in the microbial quantification method was also measured for cyanocobalamin using LC-MS/MS (Shimadzu LC system equipped with a Triple Quad 5500 plus AB-Sciex mass spectrometer). The Multiple Reaction Monitoring transition for cyanocobalamin was Q1: 678.3 m/z, Q3: 147.1 m/z.
Results: For gundruk, vitamin B12 was detected in all four samples, with values (from highest to lowest) of 5.0, 0.13, 0.12, and 0.04 µg/100 g. For sinki, it was detected in four of the five samples, with values of 1.4, 0.41, 0.34, and 0.16 µg/100 g. The cyanocobalamin concentration by LC-MS/MS in one sample was estimated to be 1.18 µg/100 g.
Conclusion: According to “Vitamin and mineral requirements in human nutrition (2nd edition) (2004)” by the World Health Organization and the Food and Agriculture Organization of the United Nations, the recommended intake of vitamin B12 is 2.4 µg/day for adults, 2.6 µg/day for pregnant women and 2.8 µg/day for lactating women. The results of this study suggest that gundruk and sinki have potential as a source of vitamin B12, although there is a great deal of variability among samples. In order to use gundruk and sinki as a source of vitamin B12, it may be necessary to find a way to stabilize the vitamin B12 content while focusing on the relationship between vitamin B12 and the different ways of making gundruk and sinki.
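To put the measured concentrations against the WHO/FAO adult recommendation of 2.4 µg/day, the daily portion required can be computed directly; a small sketch using values reported above, which also illustrates the between-sample variability the conclusion notes:

```python
# Grams of product needed per day to meet the WHO/FAO adult vitamin B12
# recommendation (2.4 ug/day), given a concentration in ug per 100 g.

def grams_needed(rda_ug_per_day, conc_ug_per_100g):
    return 100 * rda_ug_per_day / conc_ug_per_100g

# Selected reported concentrations: the highest gundruk and sinki
# samples versus the lowest gundruk sample.
for conc in (5.0, 1.4, 0.04):
    print(f"{conc} ug/100 g -> {grams_needed(2.4, conc):,.0f} g/day")
```

At 5.0 µg/100 g, roughly 48 g/day would suffice, whereas the lowest measured sample would require several kilograms, underscoring why stabilizing the B12 content matters.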
Teresa Capello, MS, RD, LD1; Amanda Truex, MS, RRT, RCP, AE-C1; Jennifer Curtiss, MS, RD, LD, CLC1; Ada Lin, MD1
1Nationwide Children's Hospital, Columbus, OH
Financial Support: None Reported.
Background: The metabolic demands of critically ill children are defined by an increase in resting energy expenditure (1,2). Energy needs in the PICU are ever changing, and accurate evaluations are challenging to obtain (3). Predictive equations have been found to be inaccurate, leading to over- or underfeeding of patients, which can cause negative outcomes such as muscle loss and poor healing (underfeeding) or weight gain as adipose tissue (overfeeding) (1,4,5). Indirect calorimetry (IC) is considered the gold standard for assessing metabolic demand, especially for critically ill pediatric patients (1,2,4). The use of IC may be limited by staffing, equipment availability, and cost, as well as other patient-related issues and/or cart specifications (6). In our facility, we identified limited use of IC by dietitians. Most tests were ordered by PICU dietitians and rarely outside the critical care division, even though testing would benefit patients in other divisions of the hospital such as the CTICU, rehab, NICU, and stepdown areas. Informal polling of non-PICU dietitians revealed significant uncertainty in interpreting data and providing recommendations based on test results, mostly stemming from a lack of familiarity with this technology. The purpose of this study was to develop guidelines and a worksheet for consistently evaluating IC results, with the goal of encouraging increased use of indirect calorimetry at our pediatric facility.
Methods: A committee of registered dietitians (RDs) and respiratory therapists (RTs) met in January 2023 and agreed on step-by-step guidelines which were trialed, reviewed, and updated monthly. Finalized guidelines were transitioned to a worksheet to improve consistency of use and aid in interpretation of IC results. A shared file was established for articles about IC as well as access to the guidelines and worksheet (Figures 1 and 2). For this study, IC data from January 1, 2022 to July 31, 2024 was reviewed. This data included number of tests completed and where the orders originated.
Results: Since the guidelines have been implemented, the non-PICU areas using IC data increased from 16% in 2022 to 30% in 2023 and appears to be on track to be the same in 2024 (Figure 3). RDs report an improved comfort level with evaluating test results as well as making recommendations for test ordering.
Conclusion: The standardized guidelines and worksheet increased RDs' comfort with and interpretation of test results. The PICU RDs have become more proficient and comfortable explaining IC during PICU rounds. It is our hope that with the development of the guidelines/worksheet, more non-PICU RDs will utilize IC testing outside of the critical care areas, where longer lengths of stay may occur. IC allows for more individualized nutrition prescriptions. An additional benefit was the mutual exchange of information between disciplines: the RTs educated the RDs on the use of the machine, enhancing the RDs' understanding of IC test results from the RT perspective, and in return the RDs explained why certain aspects of the patient's testing environment were helpful to report with the results so that the information could be interpreted correctly. The committee continues to meet and discuss patients' tests to see how testing can be optimized and how results may be used to guide nutrition care.
Figure 1. Screen Capture of Metabolic Cart Shared File.
Figure 2. IC Worksheet.
Figure 3. Carts completed per year by unit: 2022 is pre-intervention; 2023 and 2024 are post intervention. Key: H2B = PICU; other areas are non-PICU (H4A = cardiothoracic stepdown, H4B = cardiothoracic ICU, H5B = burn, H8A = pulmonary, H8B = stable trach/vent unit, H10B = Neurosurgery/Neurology, H11B = Nephrology/GI, H12A = Hematology/Oncology, C4A = NICU, C5B = infectious disease).
1Colegio Mexicano de Nutrición Clínica y Terapia Nutricional (Mexican College of Clinical Nutrition and Nutritional Therapy), León, Guanajuato; 2Colegio Mexicano de Nutrición Clínica y Terapia Nutricional (Mexican College of Clinical Nutrition and Nutritional Therapy), León, Guanajuato
Financial Support: None Reported.
Background: Sarcopenia is a systemic, progressive musculoskeletal disorder associated with an increased risk of adverse events and is highly prevalent in older adults. This condition leads to a decline in functionality, quality of life, and has an economic impact. Sarcopenia is becoming increasingly common due to age-related changes, metabolic changes, obesity, sedentary lifestyle, chronic degenerative diseases, malnutrition, and pro-inflammatory states. The objective of this study was to investigate the relationship between strength and muscle mass, measured by calf circumference, both corrected for BMI, in young and older Mexican adults.
Methods: This is a prospective, observational, cross-sectional, population-based clinical study conducted among Mexican men and women aged 30 to 90 years old, obtained through convenience sampling. The study was approved by the Ethics Committee of the Aranda de la Parra Hospital in León, Guanajuato, Mexico, and adheres to the Declaration of Helsinki. Informed consent was obtained from all participants after explaining the nature of the study. Inclusion criteria were Mexican men and women aged 30 to 90 years old who were functionally independent (Katz index category "a"). Participants with amputations, movement disorders, or immobilization devices on their extremities were excluded. The research team was previously standardized in anthropometric measurements. Demographic data and measurements of weight, height, BMI, calf circumference, and grip strength using a dynamometer were collected. Data are presented as mean and standard deviation. Spearman's correlation analysis was used to assess the relationship between calf circumference and grip strength, both adjusted for BMI, considering a significance level of p < 0.05.
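The Spearman analysis named in the methods can be sketched in plain Python: rank both variables (with averaged ranks for ties) and take the Pearson correlation of the ranks. The age and grip values below are invented for illustration, not study data.

```python
# Minimal pure-Python Spearman rank correlation.

def ranks(xs):
    """Ranks starting at 1; tied values receive their average rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1          # average rank for the tie group
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(xs, ys):
    """Pearson correlation computed on the ranks of xs and ys."""
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Example: grip strength falling monotonically with age gives -1.0.
age = [35, 45, 55, 65, 75, 85]
grip = [40, 38, 35, 30, 24, 20]
print(round(spearman(age, grip), 2))
```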
Results: Results for 1032 subjects (394 men and 638 women) from central Mexico, recruited at workplaces, recreation centers, and health facilities and aged between 30 and 90 years old, are presented. Table 1 shows the distribution of the population in each age category, by sex. Combined obesity and overweight were found in 75.1% of the sample (69.2% of men and 78.7% of women); 20% had a normal weight (25.6% of men and 16.6% of women), and 4.8% had a low BMI (5.1% of men and 4.7% of women) (Graph 1). Depletion of calf circumference corrected for BMI and age begins at 50 years old in women, with exacerbation at 65 years old and older, while in men greater depletion is observed from 70 years old onwards (Graph 2). When strength was corrected for BMI and age, grip strength declined from 55 years old and continued to decline with increasing age in both genders (Chi-square = 83.5, p < 0.001) (Graph 3). By Spearman correlation, a strong inverse relationship was found in both genders between age and grip strength: as age increases, grip strength decreases (r = -0.530, p < 0.001). A moderate negative correlation was found between age and calf circumference: as age increases, calf circumference decreases, independently of BMI (r = -0.365, p < 0.001). Calf circumference and grip strength were positively and moderately related: as calf circumference decreases, grip strength decreases, independently of BMI (r = 0.447, p < 0.0001).
Conclusion: These results show that the study population exhibited a decrease in grip strength, not related to BMI, from an early age, which may increase the risk of early-onset sarcopenia. These findings encourage early assessment of both strength and muscle mass using simple and accessible measurements such as grip strength and calf circumference adjusted for BMI. These measurements can be performed in the office during the initial patient encounter or in large populations, as in this study.
Table 1. Distribution of the Population According to Age and Gender.
Alison Hannon, Medical Student1; Anne McCallister, DNP, CPNP2; Kanika Puri, MD3; Anthony Perkins, MS1; Charles Vanderpool, MD1
1Indiana University School of Medicine, Indianapolis, IN; 2Indiana University Health, Indianapolis, IN; 3Riley Hospital for Children at Indiana University Health, Indianapolis, IN
Financial Support: None Reported.
Background: The Academy of Nutrition and Dietetics and the American Society for Parenteral and Enteral Nutrition have published criteria to diagnose a patient with mild, moderate, or severe malnutrition that use either single or multiple data points. Malnutrition is associated with worse clinical outcomes, and previous data from our institution showed that hospitalized children with severe malnutrition are at higher risk of mortality compared to those with mild or moderate malnutrition. This project aims to determine whether differences in clinical outcomes exist among patients with severe malnutrition based on the diagnostic criteria used or on anthropometric differences.
Methods: We included all patients discharged from Riley Hospital for Children within the 2023 calendar year with a diagnosis of severe malnutrition, excluding maternity discharges. The diagnostic criteria used to determine severe malnutrition were collected from registered dietitian (RD) documentation and the RD-assigned malnutrition statement within the medical record for the admission. Data were collected on readmission rates, mortality, length of stay (LOS), LOS index, cost, operative procedures, pediatric intensive care unit (ICU) admissions, and anthropometrics measured on admission. We used mixed-effects regression or mixed-effects logistic regression to test whether the outcome of interest differed by severe malnutrition type. Each model contained a random effect for patient, to account for correlation among admissions by the same patient, and a fixed effect for severe malnutrition type. All analyses were performed using SAS v9.4.
Results: Data were gathered on 409 patient admissions, of which 383 had clearly defined diagnostic criteria regarding severity of malnutrition, representing 327 unique patients (due to readmissions). There was no difference in any measured clinical outcome based on the criteria used for severe malnutrition, including single or multiple data point indicators or patients who met both single and multiple data point indicators (Table 1). Anthropometric data were analyzed, including weight Z-score (n = 398) and BMI Z-score (n = 180). No difference was seen in the majority of measured clinical outcomes in admissions with severe malnutrition when comparing weight or BMI Z-score categories of Z < -2, -2 ≤ Z ≤ 0, or Z > 0 (Table 2). Patients admitted with severe malnutrition and a BMI Z-score > 0 had an increased median cost (p = 0.042) compared to those with a BMI Z-score < -2 or between -2 and 0 (Table 2). There was a trend toward increased median cost (p = 0.067) and median LOS (p = 0.104) in patients with weight Z-score > 0.
Conclusion: Hospitalized patients with severe malnutrition have similar clinical outcomes regardless of the diagnostic criteria used to determine the diagnosis of severe malnutrition. Only a minority of admissions with severe malnutrition (n = 180, 44%) had sufficient anthropometric data to determine BMI. Based on these data, future programs at our institution aimed at intervention prior to and following admission will need to focus on all patients with severe malnutrition and will not be narrowed by the criteria (single or multiple data point) for severe malnutrition or by anthropometrics. Quality improvement projects include improving height measurement and BMI determination upon admission, which will allow future evaluation of the impact of anthropometrics on clinical outcomes.
Table 1. Outcomes by Severe Malnutrition Diagnosis Category.
Outcomes by severe malnutrition compared based on the diagnostic criteria used to determine the malnutrition diagnosis. Patients with severe malnutrition only are represented. Diagnostic criteria were determined based on ASPEN/AND guidelines and defined during admission by a registered dietitian (RD). OR = operating room; ICU = intensive care unit; LOS = length of stay. Data on 383 admissions are presented, from a total of 327 patients due to readmissions: 284 patients had 1 admission; 33 patients had 2 admissions; 8 patients had 3 admissions; 1 patient had 4 admissions; 1 patient had 5 admissions.
Table 2. Outcomes By BMI Z-score Category.
Outcomes of patients admitted with severe malnutrition, stratified by BMI Z-score. Patients with severe malnutrition only are represented. BMI Z-score was determined from the weight and height measured at the time of admission, recorded by the bedside admission nurse. OR = operating room; ICU = intensive care unit; LOS = length of stay. Due to incomplete height measurements, data on only 180 admissions were available, from a total of 158 patients: 142 patients had 1 admission; 12 patients had 2 admissions; 3 patients had 3 admissions; 1 patient had 5 admissions.
1Centro Médico Militar (Military Medical Center), Guatemala, Santa Rosa; 2Hospital General de Tijuana (Tijuana General Hospital), Tijuana, Baja California; 3Universidad Católica de Santiago de Guayaquil (Catholic University of Santiago de Guayaquil), Guayaquil, Guayas; 4Universidad Espíritu Santo (Holy Spirit University), Dripping Springs, TX
Financial Support: None Reported.
Background: Malnutrition is a common and significant issue among hospitalized patients, particularly in older adults or those with multiple comorbidities. The presence of malnutrition in such patients is associated with an increased risk of morbidity, mortality, prolonged hospital stays, and elevated healthcare costs. A crucial indicator of nutritional status is muscle strength, which can be effectively measured using handgrip strength (HGS). This study aimed to describe the relationship between nutritional status and muscle strength reduction in patients from two hospitals in Latin America.
Methods: A retrospective observational study was conducted from February to May 2022. Data were collected from two hospitals: one in Guatemala and one in Mexico. A total of 169 patients aged 19-98 years were initially considered for the study, and 127 met the inclusion criteria. The sample comprised adult patients of both sexes admitted to internal medicine, surgery, and geriatrics departments. Handgrip strength, demographic data, baseline medical diagnosis, weight, and height were recorded at admission and on the 14th day of hospitalization. Exclusion criteria included patients with arm or hand movement limitations, those under sedation or mechanical ventilation, and those hospitalized for less than 24 hours. HGS was measured using JAMAR® and Smedley dynamometers, following standard protocols. Statistical analysis was performed using measures of central tendency, and results were presented in tables and figures.
Results: In the first hospital (Mexico), 62 patients participated, with a predominantly female sample. The average weight was 69.02 kg, height 1.62 m, and BMI 26.14 kg/m² (classified as overweight). The most common admission diagnoses were infectious diseases, nervous system disorders, and digestive diseases (Table 1). A slight increase in HGS (0.49 kg) was observed between the first and second measurements (Figure 1). In the second hospital (Guatemala), 62 patients also met the inclusion criteria, with a predominantly male sample. The average weight was 65.92 kg, height 1.61 m, and BMI 25.47 kg/m² (classified as overweight). Infectious diseases and musculoskeletal disorders were the most common diagnoses (Table 1). HGS decreased by 2 kg between the first and second measurements (Figure 2). Low HGS was associated with underweight patients and those with class II and III obesity. Patients with normal BMI in both centers also exhibited significant reductions in muscle strength, indicating that weight alone is not a sufficient indicator of muscle strength preservation.
Conclusion: This multicenter study highlights the significant relationship between nutritional status and decreased muscle strength in hospitalized patients. While underweight patients showed reductions in HGS, those with class II and III obesity also experienced significant strength loss. These findings suggest that HGS is a valuable, non-invasive tool for assessing both nutritional status and muscle strength in hospitalized patients. Early identification of muscle strength deterioration can help healthcare providers implement timely nutritional interventions to improve patient outcomes.
Table 1. Baseline Demographic and Clinical Characteristics of the Study Population.
NS: Nervous System, BMI: Body Mass Index
Figure 1. Relationship Between Nutritional Status and Handgrip Strength (Center 1 - Mexico).
Figure 2. Relationship Between Nutritional Status and Handgrip Strength (Center 2 - Guatemala).
1Kaiser Permanente, Lone Tree, CO; 2Kaiser Permanente, Denver, CO; 3Kaiser Permanente, Castle Rock, CO; 4Kaiser Permanente, Littleton, CO
Financial Support: None Reported.
Background: Currently, there is no standardized screening required by the Centers for Medicare and Medicaid Services for malnutrition in outpatient settings. This raises concerns about early identification of malnutrition, the likelihood of nutrition interventions, and increased healthcare costs. While malnutrition screening in hospitalized patients is well-studied, its impact in outpatient care has not been thoroughly examined. Studies show that malnourished patients are 40% more likely to be readmitted within 30 days of discharge, adding over $10,000 to hospitalization costs. This quality improvement project aimed to evaluate the impact of implementing a standardized malnutrition screening tool for all patients as part of nutrition assessments by Registered Dietitians (RDs).
Methods: The Malnutrition Screening Tool (MST) was chosen due to its validity, sensitivity, and specificity in identifying malnutrition risk in outpatient settings. This tool assesses risk by asking about recent unintentional weight loss and decreased intake due to appetite loss. Based on responses, a score of 0-5 indicates the severity of risk. Scores of 0-1 indicate no risk, while scores of 2-5 indicate a risk for malnutrition. This questionnaire was integrated into the nutrition assessment section of the electronic medical record to standardize screening for all patients. Those with scores of 2 or greater were included, with no exclusions for disease states. Patients were scheduled for follow-ups 2-6 weeks after the initial assessment, during which their MST score was recalculated.
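The MST classification described above reduces to a simple threshold on the summed score. A minimal sketch of that logic, assuming the standard published MST sub-score ranges (0-4 for weight loss, 0-1 for appetite); the function name and inputs are illustrative, not part of the project's electronic medical record implementation:

```python
def mst_at_risk(weight_loss_score: int, appetite_score: int) -> bool:
    """Classify malnutrition risk from Malnutrition Screening Tool sub-scores.

    weight_loss_score: points for recent unintentional weight loss (0-4,
        per the published MST; the abstract gives only the 0-5 total).
    appetite_score: points for decreased intake due to poor appetite (0-1).
    Returns True when the total score is 2-5 (at risk); 0-1 means no risk.
    """
    total = weight_loss_score + appetite_score
    if not 0 <= total <= 5:
        raise ValueError("MST total must be between 0 and 5")
    return total >= 2
```

In the project workflow, patients returning True would be included for RD intervention and rescored at the 2-6 week follow-up.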
Results: A total of 414 patients were screened, with 175 completing follow-up visits. Of these, 131 showed improvements in their MST scores after nutrition interventions, 12 had score increases, and 32 maintained the same score. Two hundred thirty-nine patients were lost to follow-up for various reasons, including lack of response, limited RD scheduling access, changes in insurance, and mortality. Those with improved MST scores experienced an average cost avoidance of $15,000 each in subsequent hospitalizations. This cost avoidance was due to shorter hospitalizations, better treatment responses, and reduced need for medical interventions. The project raised awareness about the importance of early nutrition interventions among the multidisciplinary team.
Conclusion: This project suggests that standardizing malnutrition screening in outpatient settings could lead to cost avoidance for patients and health systems, along with improved overall care. Further studies are needed to identify the best tools for outpatient malnutrition screening, optimal follow-up timelines, and effective nutrition interventions for greatest cost avoidance.
Amy Sharn, MS, RDN, LD1; Raissa Sorgho, PhD, MScIH2; Suela Sulo, PhD, MSc3; Emilio Molina-Molina, PhD, MSc, MEd4; Clara Rojas Montenegro, RD5; Mary Jean Villa-Real Guno, MD, FPPS, FPSPGHAN, MBA6; Sue Abdel-Rahman, PharmD, MA7
1Abbott Nutrition, Columbus, OH; 2Center for Wellness and Nutrition, Public Health Institute, Sacramento, CA; 3Global Medical Affairs and Research, Abbott Nutrition, Chicago, IL; 4Research & Development, Abbott Nutrition, Granada, Andalucia; 5Universidad del Rosario, Escuela de Medicina, Bogota, Cundinamarca; 6Ateneo de Manila University, School of Medicine and Public Health, Metro Manila, National Capital Region; 7Health Data Synthesis Institute, Chicago, IL
Encore Poster
Presentation: American Society for Nutrition, June 29-July 2, Chicago, IL, USA; American Academy of Pediatrics, September 27-October 1, Orlando, FL, USA.
Publication: Sharn AR, Sorgho R, Sulo S, Molina-Molina E, Rojas Montenegro C, Villa-Real Guno MJ, Abdel-Rahman S. Using mid-upper arm circumference z-score measurement to support youth malnutrition screening as part of a global sports and wellness program and improve access to nutrition care. Front Nutr. 2024 Aug 12;11:1423978. doi: 10.3389/fnut.2024.1423978. PMID: 39188981; PMCID: PMC11345244.
Financial Support: This study was financially supported by the Abbott Center for Malnutrition Solutions, Chicago, IL, USA.
Financial Support: Grant supported by Khon Kaen University.
Background: Systemic sclerosis (SSc) is an autoimmune disease in which malnutrition is a common complication, caused by the chronic inflammation of its natural history and/or gastrointestinal tract involvement. Current nutritional assessment tools, e.g., the GLIM criteria, may include muscle mass measurement in the nutritional diagnosis. Anthropometric measurement is a basic method of determining muscle mass; however, data in this condition are limited. This study aimed to determine the utility of assessing muscle mass and muscle function by anthropometric measurement for diagnosing malnutrition in SSc patients.
Methods: A cross-sectional diagnostic study was conducted in adult SSc patients at Srinagarind Hospital, Thailand. All patients were assessed for malnutrition based on the Subjective Global Assessment (SGA). Muscle mass was measured by mid-upper-arm muscle circumference (MUAC) and calf circumference (CC); muscle function was determined by handgrip strength (HGS).
Results: A total of 208 SSc patients were included, of whom 149 (71.6%) were female. Mean age and body mass index were 59.3 ± 11.0 years and 21.1 ± 3.9 kg/m², respectively. Nearly half (95 cases; 45.7%) were malnourished based on SGA. Mean values of MUAC, CC, and HGS were 25.9 ± 3.83 cm, 31.5 ± 3.81 cm, and 19.0 ± 6.99 kg, respectively. The area under the receiver operating characteristic (ROC) curve (AUC) for diagnosing malnutrition was 0.796 for MUAC, 0.759 for CC, and 0.720 for HGS. Proposed cut-off values are shown in Table 1.
Conclusion: Muscle mass and muscle function were associated with malnutrition. Assessment of muscle mass and/or function by anthropometric measurement may be one part of nutritional assessment in patients with systemic sclerosis.
Table 1. Proposed Cut-Off Values of MUAC, CC, and HGS in Patients With Systemic Sclerosis.
1Duke University, Durham, NC; 2Duke University School of Medicine- Department of Anesthesiology, Durham, NC; 3Duke University School of Medicine, Durham, NC; 4Eastern Virginia Medical School, Norfolk, VA; 5Duke University Medical Center - Department of Anesthesiology - Duke Heart Center, Raleigh, NC; 6Duke University Medical School, Durham, NC
Financial Support: Baxter.
Background: Achieving acceptable nutritional goals is a crucial but often overlooked component of postoperative care, impacting patient-important outcomes such as infectious complications and ICU length of stay. Predictive resting energy expenditure (pREE) equations correlate poorly with actual measured REE (mREE), leading to potentially harmful over- or under-feeding. International ICU guidelines now recommend the use of indirect calorimetry (IC) to determine mREE and personalize patient nutrition. Surgical stress increases protein catabolism and insulin resistance, but the effect of age on postoperative mREE trends, which commonly used pREE equations do not account for, is not well studied. This study hypothesized that older adults undergoing major abdominal surgery experience slower metabolic recovery than younger patients, as measured by IC.
Methods: This was an IRB-approved prospective trial of adult patients following major abdominal surgery with open abdomens secondary to blunt or penetrating trauma, sepsis, or vascular emergencies. Patients underwent serial IC assessments to guide postoperative nutrition delivery. Assessments were offered within the first 72 hours after surgery, then every 3 ± 2 days during the ICU stay, followed by every 7 ± 2 days in stepdown. Patient mREE was determined using the Q-NRG® Metabolic Monitor (COSMED) in ventilator mode for mechanically ventilated patients, or in mask or canopy mode, depending on medical feasibility. IC data were selected from intervals of ≥ 3 minutes that met steady-state conditions, defined by a variance in oxygen consumption and carbon dioxide production of less than 10%. Measurements not meeting these criteria were excluded from the final analysis. Patients without mREE data during at least two time points in the first nine postoperative days were also excluded. Older adult patients were defined as ≥ 65 years and younger patients as ≤ 50 years. Trends in REE were calculated using the method of least squares and compared using t-tests assuming unequal variance (α = 0.05). Patients' pREE values were calculated from admission anthropometric data using ASPEN-SCCM equations and compared against IC measurements.
Results: Eighteen older and 15 younger adults met the pre-specified eligibility criteria and were included in the final analysis. Average rates (± standard error) of REE recovery in older and younger adult patients were 28.9 ± 17.1 kcal/day and 75.5 ± 18.9 kcal/day, respectively, which approached, but did not reach, statistical significance (p = 0.07). The lower and upper bounds of pREE for the older cohort averaged 1728 ± 332 kcal and 2093 ± 390 kcal, respectively, markedly exceeding the mREE obtained by IC for most patients and failing to capture the variability observed in mREE. In younger adults, pREE values were closer to IC measurements, with lower and upper averages of 1705 ± 278 kcal and 2084 ± 323 kcal, respectively.
Conclusion: Our data signal a difference in rates of metabolic recovery after major abdominal surgery between younger and older adult patients, but the difference did not reach statistical significance, possibly due to insufficient sample size. Predictive energy equations do not adequately capture changes in REE and may overestimate postoperative energy requirements in older adult patients, failing to appreciate the increased variability in mREE that our study found in this group. These findings reinforce the importance of using IC to guide nutrition delivery during the early postoperative recovery period. Larger trials employing IC and quantifying the contribution of protein metabolism are needed to explore these questions further.
Table 1. Patient Demographics.
Figure 1. Postoperative Changes in mREE in Older and Younger Adult Patients Following Major Abdominal Surgery Compared to pREE (ASPEN).
Amber Foster, BScFN, BSc1; Heather Resvick, PhD(c), MScFN, RD2; Janet Madill, PhD, RD, FDC3; Patrick Luke, MD, FRCSC2; Alp Sener, MD, PhD, FRCSC4; Max Levine, MD, MSc5
1Western University, Ilderton, ON; 2LHSC, London, ON; 3Brescia School of Food and Nutritional Sciences, Faculty of Health Sciences, Western University, London, ON; 4London Health Sciences Centre, London, ON; 5University of Alberta, Edmonton, AB
Financial Support: Brescia University College MScFN stipend.
Background: Currently, body mass index (BMI) is used as the sole criterion to determine whether patients with chronic kidney disease (CKD) are eligible for kidney transplantation. However, BMI is not a good measure of health in this patient population, as it does not distinguish between muscle mass, fat mass, and water weight. Individuals with end-stage kidney disease often experience shifts in fluid balance, resulting in fluid retention, swelling, and weight gain; consequently, their BMI may be falsely elevated. It is therefore vitally important to consider more accurate and objective measures of body composition for this patient population. The aim of this study is to determine whether there is a difference in body composition between individuals with CKD who are categorized as having healthy body weight, overweight, or obesity.
Methods: This was a cross-sectional study analyzing the body composition of 114 adults with CKD being assessed for kidney transplantation. Participants were placed into one of three BMI groups: healthy weight (group 1, BMI < 24.9 kg/m2, n = 29), overweight (group 2, BMI 24.9-29.9 kg/m2, n = 39), or with obesity (group 3, BMI ≥ 30 kg/m2, n = 45). Fat mass, lean body mass (LBM), and phase angle (PhA) were measured using bioelectrical impedance analysis (BIA). Standardized phase angle (SPhA), a measure of cellular health, was calculated as [(observed PhA - mean PhA)/standard deviation of PhA]. Handgrip strength (HGS) was measured using a Jamar dynamometer, and quadriceps muscle layer thickness (QMLT) was measured using ultrasonography. Normalized HGS (nHGS) was calculated as [HGS/body weight (kg)], and values were compared to age- and sex-specific standardized cutoff values. Fat-free mass index (FFMI) was calculated as [LBM/(height (m))2]. Low FFMI, which may identify high risk of malnutrition, was determined using ESPEN cutoff values of < 17 kg/m2 for males and < 15 kg/m2 for females. Frailty status was determined using the validated Fried frailty phenotype assessment tool. Statistical analysis: continuous data were analyzed using one-way ANOVA followed by Tukey post hoc tests, while chi-square tests were used for categorical data (IBM SPSS version 29; significance p < 0.05).
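The derived indices in the methods are simple ratios, and can be sketched directly from the formulas as stated; the example values in the tests are hypothetical illustrations, not study data, and the FFMI cutoffs are the ESPEN values cited in the abstract:

```python
def standardized_phase_angle(pha: float, mean_pha: float, sd_pha: float) -> float:
    """SPhA = (observed PhA - mean PhA) / standard deviation of PhA."""
    return (pha - mean_pha) / sd_pha

def normalized_hgs(hgs_kg: float, weight_kg: float) -> float:
    """nHGS = handgrip strength (kg) / body weight (kg)."""
    return hgs_kg / weight_kg

def fat_free_mass_index(lbm_kg: float, height_m: float) -> float:
    """FFMI = lean body mass / height squared, in kg/m^2."""
    return lbm_kg / height_m ** 2

def low_ffmi(ffmi: float, male: bool) -> bool:
    """ESPEN cutoffs: < 17 kg/m^2 for males, < 15 kg/m^2 for females."""
    return ffmi < (17.0 if male else 15.0)
```

For example, an LBM of 50 kg at a height of 1.70 m gives an FFMI of about 17.3 kg/m2, just above the male cutoff.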
Results: Participants in group 1 were younger than those in either group 2 (p = 0.004) or group 3 (p < 0.001). There was no significant difference in the distribution of males and females among the three groups. The proportion of participants with FFMI below the cutoff was significantly higher in group 1 (13%) versus group 2 (0%) and group 3 (2.1%) (p = 0.02). A significant difference in nHGS was found between groups, with low muscle strength occurring more frequently among participants in group 3 (75%) vs 48.7% in group 2 and 28.5% in group 1 (p < 0.001). No significant differences were seen in QMLT, SPhA, HGS, or frailty status between the three BMI groups.
Conclusion: It appears that there is no difference in body composition parameters such as QMLT, SPhA, or frailty status between the three BMI groups. However, patients with CKD categorized as having a healthy BMI were more likely to be at risk for malnutrition. Furthermore, those individuals categorized as having a healthy BMI appeared to have more muscle strength compared to the other two groups. Taken together, these results provide convincing evidence that BMI should not be the sole criterion for listing patients for kidney transplantation. Further research is needed to confirm these findings.
1Arizona State University and Veterans Healthcare Administration, Phoenix, AZ; 2Veterans Healthcare Administration, Phoenix, AZ; 3Arizona State University, Phoenix, AZ; 4Phoenix VAHCS, Phoenix, AZ
Financial Support: None Reported.
Background: Malnutrition has neither a standardized definition nor universal identification criteria. Registered dietitian nutritionists (RDNs) most often diagnose malnutrition based on the Academy and ASPEN Identification of Malnutrition (AAIM) criteria, while physicians are required to use the International Classification of Diseases, Version 10 (ICD-10-CM). However, there are major differences in how malnutrition is identified between the two. The malnutrition ICD-10 codes (E43.0, E44.0, E44.1, E50.0-E64.9) have vague diagnostic criteria, leading providers to rely on clinical expertise and prior nutrition education. For dietitians, the AAIM diagnostic criteria are clearly defined and validated to identify malnutrition based on reduced weight and intake, loss of muscle and fat mass, fluid accumulation, and decreased physical functioning. Due to this lack of standardization, the process of identifying and diagnosing malnutrition is inconsistent. The purpose of this study was to analyze the congruence of malnutrition diagnoses between physicians and dietitians using the two methods and to compare patient outcomes between the congruent and incongruent groups.
Methods: A retrospective chart review was conducted of 668 inpatients assigned a malnutrition diagnostic code, electronically pulled from the Veterans Health Administration's Clinical Data Warehouse for the periods of April through July in 2019, 2020, and 2021. Length of stay, infection, pressure injury, falls, thirty-day readmissions, and documentation of communication between providers were collected from chart review. Hospital cost data were pulled from the Veterans Equitable Resource Allocation (VERA) system and paired with matching social security numbers in the sample. Chi-square tests were used to compare differences between incongruent and congruent diagnoses for infection, pressure injury, falls, and readmissions. Mean length of stay and cost to the hospital between the two groups were compared using ANOVA in SPSS.
Results: The diagnosis of malnutrition was incongruent between providers. The incongruent group had a higher percentage of adverse patient outcomes than the congruent group. Congruent diagnoses were significantly associated with the incidence of documented communication between providers (p < 0.001).
Conclusion: This study showcases a gap in malnutrition patient care. Further research needs to be conducted to understand the barriers to congruent diagnosis and communication between providers.
1The University of Tokyo, Bunkyo-City, Tokyo; 2The University of Tokyo, Bunkyo-ku, Tokyo; 3The University of Tokyo, Chuo-City, Tokyo; 4Kanagawa University of Human Services, Yokosuka-city, Kanagawa; 5The University of Tokyo Hospital, Bunkyo-ku, Tokyo
Financial Support: None Reported.
Background: Therapeutic diets are often prescribed for patients with various disorders, for example diabetes, renal dysfunction, and hypertension. However, because they restrict the amounts of nutrients given, therapeutic diets might reduce appetite. Hospital meals maintain patients' nutritional status when the meals are fully consumed, regardless of diet type. It is possible that therapeutic diets are at least partly associated with malnutrition in hospital. Therefore, we conducted an exploratory retrospective cohort study to investigate whether there are differences in patients' oral consumption between therapeutic and regular diets, taking other factors into account.
Methods: The study protocol was approved by the Ethics Committee of the University of Tokyo under protocol No. 2023396NI-(1). We retrospectively extracted information from the medical records of patients admitted to the department of orthopedic and spine surgery at the University of Tokyo Hospital between June and October 2022. Eligible patients were older than 20 years, were hospitalized for more than 7 days, and were provided oral diets as the main source of nutrition. Patients prescribed a texture-modified, half, or liquid diet were excluded. Measurements included the percentage of oral food intake at various points during hospitalization (e.g., at admission, before and after surgery, and at discharge), sex, and age. Differences in oral consumption rate between the therapeutic and regular diets were analyzed with a linear mixed-effects model.
Results: A total of 290 patients were analyzed, with 50 patients receiving a therapeutic diet and 240 patients receiving a regular diet at admission. The mean percentage of oral intake was 83.1% for the therapeutic diet and 87.2% for the regular diet, and was consistently 4-6% higher for regular diets than for therapeutic diets at each time point during hospitalization (Figure). In a linear mixed-effects model adjusted for sex and age, the mean percentage of oral intake on a regular diet was 4.0% higher (95% confidence interval [CI], -0.8% to 8.9%; p = 0.100) than on a therapeutic diet, although the difference did not reach statistical significance. The mean percentage of oral intake in women was 15.6% lower than in men (95% CI, -19.5% to -11.8%). Likewise, older patients' intake was lower than younger patients' (difference, -0.2% per year of age; 95% CI, -0.3% to -0.1%).
Conclusion: This exploratory study did not show that therapeutic diets reduce food intake in orthopedic and spine surgery patients compared with regular diets. However, sex and age were important factors affecting food intake, and special attention should be paid to increasing oral food intake in female and/or older patients. Future research will increase the number of patients examined, expand the cohort to other departments, and proceed to a prospective study to identify the factors that truly affect patients' oral intake during hospitalization.
Figure 1. The Percentage of Oral Intake During Hospitalization in Each Diet.
Lorena Muhaj, MS1; Michael Owen-Michaane, MD, MA, CNSC2
1Institute of Human Nutrition, Vagelos College of Physicians and Surgeons, Columbia University Irving Medical Center, New York, NY; 2Columbia University Irving Medical Center, New York, NY
Financial Support: None Reported.
Background: Muscle mass is crucial for overall health and well-being. Consequently, accurate estimation of muscle mass is essential for diagnosing malnutrition and conditions such as sarcopenia and cachexia. The Kim equation uses biomarker data to estimate muscle mass, but whether this tool provides an accurate estimate in populations with high BMI and kidney disease remains uncertain. Therefore, the aim of this study is to assess whether the Kim equation is suitable and reliable for estimating muscle mass and predicting malnutrition, sarcopenia, and related outcomes in a cohort with diverse BMI and kidney disease.
Methods: This is a cross-sectional study using data from the All of Us Research Program. Data on demographics, weight, height, creatinine, cystatin C, and diagnoses of malnutrition, hip fractures, and cachexia were obtained from electronic health records (EHR). The Kim equation was derived from creatinine, cystatin C and weight (Table 1) and compared with established sarcopenia cutoffs for appendicular lean mass (ALM), including ALM/BMI and ALM/height2. Malnutrition was identified through specific ICD-10-CM codes recorded in the EHR and participants were categorized based on malnutrition status (with or without malnutrition). Muscle mass and biomarker levels were compared between groups with or without severe/moderate malnutrition, and the relationships of BMI with creatinine and cystatin C levels were analyzed using linear regression. Wilcoxon rank-sum tests were used to assess associations between estimated muscle mass and malnutrition diagnosis.
Results: Baseline characteristics were stratified by gender for comparison. The mean age of participants was 58.2 years (SD = 14.7). The mean BMI was 30.4 kg/m2 (SD = 7.5). Mean serum creatinine and cystatin C levels were 2.01 mg/dL (SD = 1.82) and 0.18 mg/dL (SD = 0.11), respectively. The mean estimated muscle mass was 80.1 kg (SD = 21), with estimated muscle mass as a percentage of body weight being 92.8%. Mean ALM/BMI was 2.38 (SD = 0.32), while ALM/height2 value was 25.41 kg/m2 (SD = 5.6). No participant met the cutoffs for sarcopenia. All calculated variables are summarized in Table 1. In this cohort, < 2% were diagnosed with severe malnutrition and < 2% with moderate malnutrition (Table 2). Muscle mass was lower in participants with severe malnutrition compared to those without (W = 2035, p < 0.05) (Figure 1).
Conclusion: This study suggests the Kim equation overestimates muscle mass in populations with high BMI or kidney disease, as no participants met sarcopenia cutoffs despite the expected prevalence in CKD. This overestimation is concerning given the known risk of low muscle mass in CKD. While lower muscle mass was significantly associated with severe malnutrition (p < 0.05), the Kim equation identified fewer malnutrition cases than expected based on clinical data. Though biomarkers like creatinine and cystatin C may help diagnose malnutrition, the Kim equation may not accurately estimate muscle mass or predict malnutrition and sarcopenia in diverse populations. Further research is needed to improve these estimates.
Table 1. Muscle Mass Metrics Calculations and Diagnosis of Sarcopenia Based on FNIH (ALM/BMI) and EWGSOP (ALM/Height²) Cut-off Values.
Abbreviations: BMI-Body Mass Index; TBMM-Total Body Muscle Mass (also referred to as Muscle Mass) (calculated using the Kim equation); ALM-Appendicular Lean Muscle Mass (using the McCarthy equation); ALM/Height²-Appendicular Lean Muscle Mass adjusted for height squared (using EWGSOP cutoffs for diagnosing sarcopenia); ALM/BMI-Appendicular Lean Muscle Mass adjusted for BMI (using FNIH cutoffs for diagnosing sarcopenia). Equation 1: Kim equation - Calculated body muscle mass = body weight × serum creatinine / ((K × body weight × serum cystatin C) + serum creatinine)
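Equation 1 and the two indexing schemes in Table 1 can be applied directly. The sketch below leaves the constant K as a parameter, since neither its value nor the expected biomarker units are stated in the abstract (both should be taken from the original Kim publication):

```python
def kim_muscle_mass(weight_kg, serum_cr, serum_cysc, k):
    """Equation 1 (Kim equation) as printed in Table 1:
    TBMM = weight * Cr / ((k * weight * CysC) + Cr).
    The constant k and the expected biomarker units are assumptions here."""
    return weight_kg * serum_cr / ((k * weight_kg * serum_cysc) + serum_cr)


def sarcopenia_indices(alm_kg, height_m, bmi):
    """Index ALM the two ways used in Table 1:
    ALM/height^2 (EWGSOP cutoffs) and ALM/BMI (FNIH cutoffs)."""
    return {"alm_per_height2": alm_kg / height_m ** 2,
            "alm_per_bmi": alm_kg / bmi}
```

As a sanity check, with k = 0 the equation collapses to body weight, so any real k > 0 shrinks the estimate below total body weight.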
Table 2. Prevalence of Severe and Moderate Malnutrition.
(Counts less than 20 suppressed to prevent reidentification of participants).
Figure 1. Muscle Mass in Groups With and Without Severe Malnutrition.
1University of Delaware, Newark, DE; 2University of Auckland, Auckland
Financial Support: None Reported.
Background: Loss of skeletal muscle is common in patients with liver cirrhosis and low muscle mass is internationally recognized as a key phenotypic criterion to diagnose malnutrition.1,2 Muscle can be clinically assessed through various modalities, although reference data and consensus guidance are limited for the interpretation of muscle measures to define malnutrition in this patient population. The aim of this study was to evaluate the sensitivity and specificity of published sarcopenia cutpoints applied to dual-energy X-ray absorptiometry (DXA) muscle measures to diagnose malnutrition by the GLIM criteria, using in vivo neutron activation analysis (IVNAA) measures of total body protein (TBP) as the reference.
Methods: Adults with liver cirrhosis underwent IVNAA and whole body DXA at the Body Composition Laboratory of the University of Auckland. DXA-fat-free mass (FFM) and appendicular skeletal muscle mass (ASMM, with and without correction for wet bone mass3) were measured and indexed to height squared (FFMI, ASMI). The ratio of measured to predicted TBP based on healthy reference data matched for age, sex and height was calculated as a protein index; values less than 2 standard deviations below the mean (< 0.77) were defined as protein depletion (malnutrition). Published cut points from recommended guidelines were evaluated (Table 1).4-9 DXA values below the cut point were interpreted as ‘sarcopenic’. Sensitivity and specificity for each cut point were determined.
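With a binary reference standard (protein index < 0.77 = protein depletion) and a DXA cutpoint, sensitivity and specificity reduce to a 2×2 tally. A minimal sketch, using illustrative numbers rather than study data (7.26 kg/m² is shown only as an example male ASMI cutpoint):

```python
def cutpoint_sens_spec(dxa_asmi, cutpoint, protein_index, depletion=0.77):
    """Sensitivity and specificity of a DXA sarcopenia cutpoint against the
    protein-depletion reference used in this study (protein index < 0.77).
    DXA values below the cutpoint are read as 'sarcopenic' (test-positive)."""
    test = [v < cutpoint for v in dxa_asmi]
    ref = [p < depletion for p in protein_index]
    tp = sum(t and r for t, r in zip(test, ref))          # true positives
    fn = sum(not t and r for t, r in zip(test, ref))      # missed depletion
    tn = sum(not t and not r for t, r in zip(test, ref))  # true negatives
    fp = sum(t and not r for t, r in zip(test, ref))      # false alarms
    return tp / (tp + fn), tn / (tn + fp)
```

For example, `cutpoint_sens_spec([6.0, 8.0, 7.9, 9.0], 7.26, [0.70, 0.75, 0.80, 0.90])` returns `(0.5, 1.0)`: one of two depleted patients is flagged, with no false positives.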
Results: The study sample included 350 adults (238 males/112 females, median age 52 years) with liver cirrhosis and a median model for end-stage liver disease (MELD) score of 12 (range 5-36). Application of published sarcopenia cutpoints to diagnose malnutrition by DXA in patients with liver cirrhosis yielded sensitivity ranging from 40.8% to 79.0% and specificity ranging from 79.6% to 94.2% (Table 1). Although all of the selected published cutpoints for DXA-measured ASMI were similar, the Baumgartner4 and Newman5 ASMI cutpoints, when applied to our DXA-measured ASMI (particularly after correction for wet bone mass), yielded the best combination of sensitivity and specificity for diagnosing malnutrition as identified by protein depletion in patients with liver cirrhosis. The Studenski ASMM cutpoints in Table 1 yielded unacceptably low sensitivity (41-55%).
Conclusion: These findings suggest that the DXA-derived Baumgartner/Newman bone-corrected ASMI cutpoints offer acceptable validity for the diagnosis of malnutrition by GLIM in patients with liver cirrhosis. However, because correcting DXA measures of ASMI for wet bone mass is not common practice, applying these cutpoints to standard uncorrected ASMI measures would likely yield much lower sensitivity, meaning that many individuals with low muscularity and malnutrition would be misdiagnosed as non-malnourished.
Table 1. Evaluation of Selected Published Cut-Points for Dual-Energy X-Ray Absorptiometry Appendicular Skeletal Muscle Index to Identify Protein Depletion in Patients with Liver Cirrhosis.
Abbreviations: DXA, dual-energy X-ray absorptiometry; M, male; F, female; GLIM, Global Leadership Initiative on Malnutrition; EWGSOP, European Working Group on Sarcopenia in Older People; AWGS, Asia Working Group for Sarcopenia; FNIH, Foundation for the National Institutes of Health; ASMM, appendicular skeletal muscle mass in kg determined from DXA-measured lean soft tissue of the arms and legs; ASMI, ASMM indexed to height in meters squared; ASMM-BC, ASMM corrected for wet bone mass according to Heymsfield et al 1990; ASMI-BC, ASMI corrected for wet bone mass.
Critical Care and Critical Health Issues
Amir Kamel, PharmD, FASPEN1; Tori Gray, PharmD2; Cara Nys, PharmD, BCIDP3; Erin Vanzant, MD, FACS4; Martin Rosenthal, MD, FACS, FASPEN1
1University of Florida, Gainesville, FL; 2Cincinnati Children's, Gainesville, FL; 3Orlando Health, Orlando, FL; 4Department of Surgery, Division of Trauma and Acute Care Surgery, College of Medicine, University of Florida, Gainesville, FL
Financial Support: None Reported.
Background: Amino acids (AAs) serve different purposes in the body, including structural, enzymatic, and integral cellular functions. Amino acid utilization and demand may vary between healthy and disease states. Certain conditions, such as chronic kidney disease or short bowel syndrome, can affect plasma AA levels. Previous research has identified citrulline as a marker of intestinal function and absorptive capacity. Stressors such as surgery or trauma can alter AA metabolism, potentially leading to a hypercatabolic state and changes in the available AA pool. The primary objective of this study was to compare AA levels in patients who have undergone abdominal surgery with those who have not. The secondary endpoint was to describe post-surgical complications and correlate plasma AA levels with such complications.
Methods: This study was a single-center retrospective analysis, conducted between January 1, 2007, and March 15, 2019, of patients who were referred to the University of Florida Health Nutrition Support Team (NST) and had a routine metabolic evaluation with amino acid levels as part of the nutrition support consult. Amino acid data were excluded if the specimen was deemed contaminated. Patients with genetic disorders were also excluded from the study. During the study period, the amino acid bioassay was performed using Biochrom ion-exchange chromatography (ARUP Laboratories, Salt Lake City, UT).
Results: Of the 227 patients screened, 181 were included in the study (58 who underwent abdominal surgery and 123 who did not). The mean age, BMI, and height of participants were 52.2 years, 25.1 kg/m², and 169 cm, respectively. Baseline characteristics were similar between the two groups: 31% of the surgery arm had undergone a surgical procedure within a year of the index encounter, 86.5% retained their colon, and 69.2% had a bowel resection, with a mean of 147.6 cm of bowel left for those with a documented length of remaining bowel (36 out of 58). Postoperative complications of small bowel obstruction, ileus, leak, abscess, bleeding, and surgical site infection (SSI) occurred in 12.1%, 24%, 17.2%, 20.7%, 3.4%, and 17.2%, respectively. Among the 19 AAs evaluated, median citrulline and methionine levels were significantly different between the two groups (23 [14-35] vs 17 [11-23]; p = 0.0031 and 27 [20-39] vs 33 [24-51]; p = 0.0383, respectively). Alanine and arginine levels were associated with postoperative ileus, leucine levels correlated with SSI, and glutamic acid and glycine levels were linked to postoperative fistula formation.
Conclusion: Most amino acid levels showed no significant differences between patients who underwent abdominal surgery and those who did not, except for citrulline and methionine. Specific amino acids, such as alanine, arginine, leucine, glutamic acid, and glycine, may serve as early indicators of post-surgical complications; however, a larger prospective trial is warranted to validate our findings.
Background: Marginal donor livers (MDLs) have been used for liver transplantation to address major organ shortages. However, MDLs are notably susceptible to ischemia/reperfusion injury (IRI). Recent investigations have highlighted ferroptosis, a new type of programmed cell death, as a potential contributor to IRI. We hypothesized that modulating ferroptosis with the iron chelator deferoxamine (DFO) could alter the course of IRI.
Methods: Using our novel Perfusion Regulated Organ Therapeutics with Enhanced Controlled Testing (PROTECT) model (US provisional patent, US63/136,165), six human MDLs (livers A to F) were procured and split into paired lobes. Simultaneous perfusion was performed on both lobes, with one lobe subjected to DFO while the other served as an internal control. Histology, serum chemistry, expression of ferroptosis-associated genes, determination of iron accumulation, and measurement of lipid peroxidation were performed.
Results: Histological analysis revealed severe macrovesicular steatosis (>30%) in livers A and D, while livers B and E exhibited mild to moderate macrovesicular steatosis. The majority of samples showed mild inflammation, predominantly in zone 3. No significant necrosis was noted during perfusion. Perl's Prussian blue stain and non-heme iron quantification demonstrated a suppression of iron accumulation in livers A to D with DFO treatment (p < 0.05). Based on the degree of iron chelation, the 12 lobes were categorized into two groups: lobes with decreased iron (n = 4) and those with increased iron (n = 8). Comparative analysis demonstrated that ferroptosis-associated genes (HIF1-alpha, RPL8, IREB2, ACSF2, NQO1) were significantly downregulated in the former (p = 0.0338, p = 0.0085, p = 0.0138, p = 0.0138, p = 0.0209, respectively). Lipid peroxidation was significantly suppressed in lobes with decreased iron (p = 0.02). While serum AST was lower in iron-chelated lobes, this did not reach statistical significance.
Conclusion: This study affirmed that iron accumulation was driven by normothermic perfusion. Reduction of iron content suppressed ferroptosis-associated genes and lipid peroxidation to mitigate IRI. Our results using human MDLs revealed a novel relationship between iron content and ferroptosis, providing a solid foundation for future development of IRI therapeutics.
Gabriella ten Have, PhD1; Macie Mackey, BSc1; Carolina Perez, MSc1; John Thaden, PhD1; Sarah Rice, PhD1; Marielle Engelen, PhD1; Nicolaas Deutz, PhD, MD1
1Texas A&M University, College Station, TX
Financial Support: Department of Defense: CDMRP PR190829 - ASPEN Rhoads Research Foundation - C. Richard Fleming Grant & Daniel H. Teitelbaum Grant.
Background: Sepsis is a potentially life-threatening complication of infection in critically ill patients and is characterized by severe tissue breakdown in several organs, leading to long-term muscle weakness, fatigue, and reduced physical activity. Recent guidelines suggest increasing protein intake gradually in the first days of recovery from a septic event. It is, however, unclear how this affects whole-body protein turnover. Therefore, in an acute sepsis-recovery pig model, we studied whole-body protein metabolism in the early sepsis recovery phase after restricted feeding with a balanced meal of amino acids (AA).
Methods: In 25 catheterized pigs (~25 kg), sepsis was induced by IV infusion of live Pseudomonas aeruginosa bacteria (5 × 10⁸ CFU/hour). At t = 9 h, recovery was initiated with IV administration of gentamicin. Post sepsis, food was given twice daily and incrementally (day 1: 25%; day 2: 50%; day 3: 75%; day ≥4: 100%). The 100% meals contained, per kg BW, 15.4 g CHO, 3.47 g fat, and 0.56 g N of a balanced free AA mixture reflecting the muscle AA profile (3.9 g AA). Before sepsis (baseline) and on recovery day 3, an amino acid stable isotope mixture was administered IV as a pulse (post-absorptive). Subsequently, arterial blood samples were collected post-absorptively for 2 hours. Amino acid concentrations and enrichments were determined with LC-MS. Statistics: RM-ANOVA, α = 0.05.
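With an IV tracer pulse, whole-body production (the flux, or rate of appearance) is commonly derived non-compartmentally as tracer dose divided by the area under the plasma tracer-to-tracee ratio decay curve. The sketch below uses simple trapezoidal integration; the authors' exact kinetic model (e.g., multi-exponential fitting) is not specified in the abstract, so this is an illustrative assumption:

```python
def auc_trapezoid(times_h, values):
    """Trapezoidal area under a sampled decay curve."""
    return sum((times_h[i + 1] - times_h[i]) * (values[i] + values[i + 1]) / 2
               for i in range(len(times_h) - 1))


def whole_body_production(pulse_dose_umol, times_h, ttr):
    """Non-compartmental whole-body production (flux, umol/h) after an IV
    tracer pulse: dose divided by the AUC of the plasma tracer-to-tracee
    ratio. Assumes enrichment decays to ~zero within the sampling window."""
    return pulse_dose_umol / auc_trapezoid(times_h, ttr)
```

For instance, a 10 µmol pulse whose tracer-to-tracee ratio falls linearly from 0.2 to 0 over 2 h (AUC = 0.2 h) implies a whole-body production of 50 µmol/h.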
Results: At day 3, animal body weight was decreased (2.4 [0.9, 3.9]%, p = 0.0025). Compared to baseline values, plasma AA concentration profiles were changed. Overall, the total non-essential AA plasma concentration did not change. Essential AA plasma concentrations of histidine, leucine, methionine, phenylalanine, tryptophan, and valine were lower (p < 0.05) and lysine was higher (p = 0.0027); isoleucine did not change. We observed lower whole-body production (WBP) of the non-essential amino acids arginine (p < 0.0001), glutamine (p < 0.0001), glutamate (p < 0.0001), glycine (p < 0.0001), hydroxyproline (p = 0.0041), ornithine (p = 0.0003), taurine (p < 0.0001), and tyrosine (p < 0.0001). Citrulline production did not change. In addition, lower WBP was observed for the essential amino acids isoleucine (p = 0.0002), leucine (p < 0.0001), valine (p < 0.0001), methionine (p < 0.0001), tryptophan (p < 0.0001), and lysine (p < 0.0001). Whole-body protein breakdown and protein synthesis were also lower (p < 0.0001), while net protein breakdown did not change.
Conclusion: Our sepsis-recovery pig model suggests that food restriction in the early phase of sepsis recovery leads to diminished protein turnover.
Gabriella ten Have, PhD1; Macie Mackey, BSc1; Carolina Perez, MSc1; John Thaden, PhD1; Sarah Rice, PhD1; Marielle Engelen, PhD1; Nicolaas Deutz, PhD, MD1
1Texas A&M University, College Station, TX
Financial Support: Department of Defense: CDMRP PR190829 - ASPEN Rhoads Research Foundation - C. Richard Fleming Grant & Daniel H. Teitelbaum Grant.
Background: Sepsis is a potentially life-threatening complication of infection in critically ill patients. It is characterized by severe tissue breakdown in several organs, leading to long-term muscle weakness, fatigue, and reduced physical activity (ICU-AW). In an acute sepsis-recovery ICU-AW pig model, we studied whether meals that only contain essential amino acids (EAA) can restore the metabolic deregulations during sepsis recovery, as assessed by comprehensive metabolic phenotyping1.
Methods: In 49 catheterized pigs (~25 kg), sepsis was induced by IV infusion of live Pseudomonas aeruginosa bacteria (5 × 10⁸ CFU/hour). At t = 9 h, recovery was initiated with IV administration of gentamicin. Post sepsis, food was given twice daily, blindly and incrementally (day 1: 25%; day 2: 50%; day 3: 75%; day ≥4: 100%). The 100% meals contained, per kg BW, 15.4 g CHO, 3.47 g fat, and 0.56 g N of either an EAA mixture (reflecting muscle protein EAA; 4.3 g AA) or control (TAA; 3.9 g AA). Before sepsis (baseline) and on recovery day 7, an amino acid stable isotope mixture was administered IV as a pulse (post-absorptive). Subsequently, arterial blood samples were collected for 2 hours. AA concentrations and enrichments were determined with LC-MS. Statistics: RM-ANOVA, α = 0.05.
Results: A body weight reduction was found after sepsis, which was restored by day 7 post sepsis. Compared to baseline, the EAA group showed increased muscle fatigue (p < 0.0001), tau-methylhistidine whole-body production (WBP) (reflecting myofibrillar muscle breakdown, p < 0.0001), and whole-body net protein breakdown (p < 0.0001); these increases were smaller in the control group (muscle fatigue: p < 0.0001; tau-methylhistidine: p = 0.0531; net protein breakdown: p < 0.0001). In addition, on day 7 lower WBP was observed for glycine (p < 0.0001), hydroxyproline (p < 0.0001), glutamate (p < 0.0001), glutamine (p < 0.0001), and taurine (p < 0.0001); these changes were smaller (glycine: p = 0.0014; hydroxyproline: p = 0.0007; glutamate: p = 0.0554) or larger (glutamine: p = 0.0497; taurine: p < 0.0001) in the control group. The WBP of citrulline (p = 0.0011) was increased on day 7, but less so in the control group (p = 0.0078). Higher plasma concentrations of asparagine (p < 0.0001), citrulline (p < 0.0001), glutamine (p = 0.0001), tau-methylhistidine (p = 0.0319), serine (p < 0.0001), taurine (p < 0.0001), and tyrosine (p < 0.0001) were observed in the EAA group. In the EAA group, clearance was lower (p < 0.05), except for glycine, tau-methylhistidine, and ornithine.
Conclusion: Our sepsis-recovery pig ICU-AW model shows that feeding EAA-only meals after sepsis is related to increased muscle and whole-body net protein breakdown and affects non-EAA metabolism. We hypothesize that non-essential amino acids are needed in post-sepsis nutrition to improve protein anabolism.
1The Ohio State University Wexner Medical Center, Columbus, OH
Financial Support: None Reported.
Background: Patients in the intensive care unit, especially those on mechanical ventilation, frequently receive inadequate enteral nutrition (EN) therapy. Critically ill patients receive on average 40-50% of prescribed nutritional requirements, while ASPEN/SCCM guidelines encourage that efforts be made to provide > 80% of goal energy and protein needs. One method to help achieve this is the use of volume-based feedings (VBF). At our institution, an hourly rate-based feeding (RBF) approach is standard. In 2022, our Medical Intensive Care Unit (MICU) operations committee inquired about the potential benefits of implementing VBF. Before changing our practice, we collected data to assess our current performance in meeting EN goals and to identify the reasons for interruptions. While the literature suggests that VBF is relatively safe in terms of EN complications compared with RBF, to our knowledge there is currently no information on the safety of starting VBF in patients at risk of gastrointestinal (GI) intolerance. Therefore, we also sought to determine whether EN is more frequently held due to GI intolerance or operative procedures.
Methods: We conducted a retrospective evaluation of EN delivery compared with the EN goal, and of the reason for interruption if EN delivery was below goal, in the MICU of a large tertiary academic medical center. We reviewed ten days of information on any MICU patient on EN. One day constituted the total EN volume received, in milliliters, from 0700 to 0659 hours. Using the QI Assessment Form, we collected the following data: goal EN volume in 24 hours, volume received in 24 hours, percent volume received versus prescribed, and hours feeds were held (or below goal rate) each day. The reasons for holding tube feeds were divided into six categories based on underlying causes: feeding initiation/titration, GI issues (constipation, diarrhea, emesis, nausea, distention, high gastric residual volume), operative procedure, non-operative procedure, mechanical issue, and practice issues. Data were entered into a spreadsheet, and descriptive statistics were used to evaluate results.
Results: MICU patients receiving EN were observed over ten random days in a two-week period in August 2022. Eighty-two patients were receiving EN. Three hundred and four EN days were observed. Average percent EN delivered was 70% among all patients. EN was withheld for the following reasons: 34 cases (23%) were related to feeding initiation, 55 (37%) GI issues, 19 (13%) operative procedures, 32 (22%) non-operative procedures, 2 (1%) mechanical issues, and 5 (3%) cases were related to practice issues. VBF could have been considered in 51 cases (35%).
Conclusion: These results suggest that EN delivery in our MICU is most often below the prescribed amount due to GI issues and feeding initiation; together, these comprised 89 cases (60%). VBF protocols would not improve delivery in either case: VBF would likely increase discomfort in patients experiencing GI issues, and feeding initiation can be improved with changes to advancement protocols. Because VBF had potential benefit in only 35% of cases, and above-average EN delivery was observed, this protocol was not implemented in the observed MICU.
Delaney Adams, PharmD1; Brandon Conaway, PharmD2; Julie Farrar, PharmD3; Saskya Byerly, MD4; Dina Filiberto, MD4; Peter Fischer, MD4; Roland Dickerson, PharmD3
1Regional One Health, Memphis, TN; 2Veterans Affairs Medical Center, Memphis, TN; 3University of Tennessee College of Pharmacy, Memphis, TN; 4University of Tennessee College of Medicine, Memphis, TN
Encore Poster
Presentation: Society for Critical Care Medicine 54th Annual Critical Care Congress. February 23 to 25, 2025, Orlando, FL.
Publication: Critical Care Medicine. 2025;53(1): in press.
Financial Support: None Reported.
Best of ASPEN-Critical Care and Critical Health Issues
1Duke University School of Medicine- Department of Anesthesiology, Durham, NC; 2Duke University School of Medicine, Durham, NC; 3Duke University Medical Center - Department of Anesthesiology - Duke Heart Center, Raleigh, NC; 4Duke University Medical School, Durham, NC
Financial Support: Baxter, Abbott.
Background: Resting energy expenditure (REE) is critical in managing nutrition in intensive care unit (ICU) patients. Accurate energy needs assessments are vital for optimizing nutritional interventions. However, little is known about how different disease states specifically influence REE in ICU patients. Existing energy recommendations are generalized and do not account for the metabolic variability across disease types. Indirect calorimetry (IC) is considered the gold standard for measuring REE but is underutilized. This study addresses this gap by analyzing REE across disease states using metabolic cart assessments in a large academic medical center. The findings are expected to inform more precise, disease-specific nutritional recommendations in critical care.
Methods: This is a pooled analysis of patients enrolled in four prospective clinical trials evaluating ICU patients across a range of disease states. Patients included in this analysis were admitted to the ICU with diagnoses of COVID-19, respiratory failure, cardiothoracic (CT) surgery, trauma, or surgical intensive care conditions. All patients underwent IC within 72 hours of ICU admission to assess REE, with follow-up measurements conducted when patients were medically stable. In each study, patients were managed under standard ICU care protocols, and nutritional interventions were individualized or standardized based on the clinical trial protocol. The primary outcome was measured REE, expressed in kcal/day and normalized to body weight (kcal/kg/day). Summary statistics for demographic and clinical characteristics, such as age, gender, height, weight, BMI, and comorbidities, were reported. Comparative analyses across the five disease states were performed using ANOVA to determine the significance of differences in REE.
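The group comparison described above is a standard one-way ANOVA. The analysis presumably used statistical software; as a minimal, dependency-free sketch of the F statistic it computes:

```python
def one_way_anova_f(groups):
    """One-way ANOVA F statistic: between-group mean square divided by
    within-group mean square, for a list of groups of REE values."""
    k = len(groups)                                  # number of disease groups
    n_total = sum(len(g) for g in groups)            # total patients
    grand_mean = sum(sum(g) for g in groups) / n_total
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g)
                    for g, m in zip(groups, means))
    # F = MS_between / MS_within with df (k-1) and (n_total-k)
    return (ss_between / (k - 1)) / (ss_within / (n_total - k))
```

For example, `one_way_anova_f([[1, 2, 3], [4, 5, 6]])` yields F = 13.5; the p-value then follows from the F distribution with (k − 1, N − k) degrees of freedom.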
Results: The analysis included 165 ICU patients. The cohort had a mean age of 58 years, with 58% male and 42% female. Racial demographics included 36% black, 52% white, and 10% from other backgrounds. Patients in the surgical ICU group had the lowest caloric requirements, averaging 1503 kcal/day, while COVID-19 patients had the highest calorie needs at 1982 kcal/day. CT surgery patients measured 1644 kcal/day, respiratory failure patients 1763 kcal/day, and trauma patients required 1883 kcal/day. ANOVA demonstrated statistically significant differences in REE between these groups (p < 0.001). When normalized to body weight (kcal/kg/day), REE varied from 20.3 to 23.5 kcal/kg/day, with statistically significant differences between disease states (p < 0.001).
Conclusion: This study reveals significant variability in REE across different disease states in ICU patients, highlighting the need for disease-specific energy recommendations. These findings indicate that specific disease processes, such as COVID-19 and trauma, may increase metabolic demands, while patients recovering from surgical procedures may have comparatively lower energy needs. They also emphasize the importance of individualized nutritional interventions based on a patient's disease state to optimize recovery and clinical outcomes and to prevent underfeeding or overfeeding, both of which can adversely affect patients. The results suggest IC should be more widely implemented in ICU settings to guide precise and effective nutrition delivery based on real-time metabolic data rather than relying on standard predictive equations. Further research is needed to refine these recommendations and to explore continuous monitoring of REE and tailored nutrition needs in the ICU.
Table 1. Demographic and Clinical Characteristics.
Table 2. Disease Group Diagnoses.
Figure 1. Average Measured Resting Energy Expenditure by Disease Group.
1Northwestern Memorial Hospital, Shorewood, IL; 2Northwestern Memorial Hospital, Chicago, IL
Financial Support: None Reported.
Background: Communication between Registered Dietitians (RDs) and CTICU teams is non-standardized and often ineffective in supporting proper implementation of RD recommendations. Late or absent RD support can impact the quality of nutrition care provided to patients. In FY23, 58% of CTICU nutrition consults/risk assessments were addressed within 24 hours, and 9% were missed. Our goal was to improve the RD consult/risk turnaround time within 24 hours from 58% to 75%, based on our department goal, and to reduce missed RD consults/risks from 9% to 6% by standardizing communication between RDs and CTICU APRNs. The outcome metrics were nutrition risk turnaround time and nutrition consult turnaround time; the process metric was the percentage of rounds with an RD present.
Methods: We used the DMAIC methodology to address our communication issue in the CTICU. We captured the voice of the customer by surveying the CTICU APRNs and found that a barrier was the RDs' limited presence in the CTICU, and that the APRNs find it valuable to have an RD rounding daily with their team. A literature search on RD rounding in ICUs, specifically cardiac/thoracic ICUs, found that critically ill cardiac surgery patients are at high risk of developing malnutrition, yet initiation of medical nutrition therapy and overall adequacy of nutrition provision are lower compared with noncardiac surgical or MICU patients. RD-written orders directly improve outcomes in patients receiving nutrition support. To have the most influence, RDs need to be present in the ICU and involved when important decisions are being made; dietitian involvement in the ICU team significantly improves the team's ability to implement prompt, relevant nutrition support interventions. We used process mapping to address the overlap of rounding times between the step-down cardiac floors and the ICU. We optimized the RDs' daily schedules to allow attendance at as many rounds as possible, including CTICU rounds, and then implemented a new rounding structure within the Cardiac Service Line based on the literature on the standard of care and the RD role in ICU rounding.
Results: Our turnaround time for nutrition consults/risks within 24 hours increased by 26% (58% to 84%), and missed consults/risks decreased to 1%, exceeding both goals. The number of nutrition interventions we were able to implement increased with more RDs attending rounds, which was tracked after implementation of the RD rounding structure within the CTICU. The comparison of implemented interventions between 1-RD and 2-RD days was skewed because on days with only 1 RD, that RD attempted to round with both teams.
Conclusion: Communication between the CTICU team and Clinical Nutrition continues to improve with consistent positive feedback from the ICU providers regarding the new rounding structure. The new workflow was implemented in the Clinical Nutrition Cardiac Service Line. For future opportunities, there are other ICU teams at NMH that do not have a dedicated RD to round with them due to RD staffing that could also benefit from a dedicated RD in those rounds daily.
Table 1. New Rounding Structure.
*Critical Care Rounds; Green: Attend; Gold: Unable to attend.
Table 2. Control Plan.
Figure 1. Results Consult Risk Turn Around Time Pre & Post Rounding.
Figure 2. Number of Implemented RD Nutrition Interventions by Number of RDs Rounding.
1Emory Healthcare, Macon, GA; 2Emory Healthcare, Atlanta, GA
Financial Support: None Reported.
Background: Micronutrients play a crucial role in biochemical processes in the body. During critical illness, the status of micronutrients can be affected by factors such as disease severity and medical interventions. Extracorporeal membrane oxygenation (ECMO) is a vital supportive therapy that has seen increased utilization for critically ill patients with acute severe refractory cardiorespiratory failure. The potential alterations in micronutrient status and requirements during ECMO are an area of significant interest, but data are limited. This study aimed to determine the incidence of micronutrient depletion in critically ill patients requiring ECMO.
Methods: A retrospective chart review was conducted for patients with at least one micronutrient level measured in blood while receiving ECMO between January 1, 2015, and September 30, 2023. Chart reviews were completed using the Emory Healthcare electronic medical record system after the Emory University Institutional Review Board approved the study. A full waiver for informed consent and authorization was approved for this study. Data on demographic characteristics, ECMO therapy-related information, and reported micronutrient levels were collected. Descriptive statistics were used to evaluate the data.
Results: A total of 77 of the 128 reviewed patients met inclusion criteria and were included in data analysis (Table 1). The average age of patients was 49 years, and 55.8% were female. The average duration of ECMO was 14 days, and the average length of stay in the intensive care unit was 46.7 days. Among the included patients, 56 required continuous renal replacement therapy (CRRT) along with ECMO. Patients who required CRRT received empiric standard supplementation of folic acid (1 mg), pyridoxine (200 mg), and thiamine (100 mg) every 48 hours. Of the 77 patients, 44% had below-normal blood levels of at least one of the measured micronutrients. The depletion percentages of various nutrients were as follows: vitamin C (80.6%), vitamin D (75.0%), iron (72.7%), copper (53.3%), carnitine (31.3%), selenium (30.0%), pyridoxine (25.0%), folic acid (18.8%), vitamin A (10.0%), zinc (7.1%), and thiamine (3.7%) (Table 2). Measured vitamin B12, manganese, and vitamin E levels were within normal limits.
Conclusion: This study demonstrated that 60% of patients on ECMO had orders to evaluate at least one micronutrient in their blood. Of these, almost half had at least one micronutrient level below normal limits. These findings underscore the need for regular nutrient monitoring for critically ill patients. Prospective studies are needed to understand the impact of ECMO on micronutrient status, determine the optimal time for evaluation, and assess the need for and efficacy of supplementation in these patients.
Table 1. General Demographic and ECMO Characteristics (N = 77).
Table 2. Observed Micronutrient Status during ECMO for Critically Ill Patients.
Diane Nowak, RD, LD, CNSC1; Mary Kronik, RD, LD, CNSC2; Caroline Couper, RD, LD, CNSC3; Mary Rath, MEd, RD, LD, CNSC4; Ashley Ratliff, MS, RD, LD, CNSC4; Eva Leszczak-Lesko, BS Health Sciences, RRT4
Background: Indirect calorimetry (IC) is the gold standard for the accurate determination of energy expenditure. The team performed a comprehensive literature review of current IC practices across the nation, which showed that facilities employing IC typically follow a standard protocol dictated by length of stay (LOS). Intensive Care Unit (ICU) Registered Dietitians (RDs) have directed IC intervention to reduce reliance on inaccurate predictive equations and to judiciously identify candidate patients (1, 2) with the assistance of an IC order-based practice. While IC candidacy is determined by clinical criteria, implementation has been primarily dictated by RD time constraints. Our project aims to make IC part of our standard of care by using a standardized implementation process.
Methods: To implement IC at our 1,299-bed quaternary care hospital, including 249 ICU beds, a multidisciplinary team of ICU RDs and Respiratory Therapists (RTs) partnered with a physician champion. Three Cosmed QNRG+ indirect calorimeters were purchased after a 6-month trial period. Due to the potential for rapid clinical status changes and RD staffing constraints, the ICU team selected an order-based practice rather than a protocol. The order is signed by a Licensed Independent Practitioner (LIP) and includes three components: an order for IC with indication, a nutrition assessment consult, and a conditional order for the RD to release to RT once testing is approved. After the order is signed, the RD collaborates with the Registered Nurse and RT to verify standardized clinical criteria and assess IC candidacy. If appropriate, the RD releases the order to RT prior to testing to allow for documentation of ventilator settings. To begin the test, the RD enters patient information and calibrates the pneumotach, after which the RT secures ventilation connections. Next, the RD starts the test and remains at the bedside for the standardized 20-minute duration to ensure steady state is not interrupted. Once testing is completed, the best 5-minute average is selected to obtain the measured resting energy expenditure (mREE). The RD interprets the results in light of a multitude of factors and, if warranted, modifies nutrition interventions.
Results: Eight ICU registered dietitians completed 87 IC measurements from May 2024 through August 2024 which included patients across various ICUs. All 87 patients were selected by the RD due to concerns for over or underfeeding. Eighty-three percent of the measurements were valid tests and seventy-nine percent of the measurements led to intervention modifications. The amount of face-to-face time spent was 66 hours and 45 minutes or an average of 45 minutes per test. Additional time spent interpreting results and making modifications to interventions ranged from 15-30 minutes.
Conclusion: IC has the ability to capture accurate energy expenditures in the critically ill. An RD-directed, IC order-based practice has allowed for the successful introduction of IC at our institution. The future transition from limited IC implementation to a standard of care will depend on numerous challenges, including RD time constraints and patient volumes amid the ebbs and flows of critical care. To align with the ever-changing dynamics of critical care, staffing levels and workflows are being actively evaluated.
Background: Adequate energy intake with appropriate macronutrient composition is an essential part of patient recovery; however, predictive equations have been found to be of variable accuracy. Indirect calorimetry (IC) gives insight into caloric needs and the primary nutrition substrate being utilized as metabolic fuel, often identifying over- and under-feeding. Though IC is considered the gold standard for determining resting energy expenditure, it poses challenges with cost, equipment feasibility, and time constraints on personnel, namely respiratory therapy (RT). Our hypothesis: Registered Dietitian (RD)-led IC tests can be conducted in a safe and feasible manner without undue risk of complication. In addition, IC will show a higher caloric need, particularly in patients who require at least seven days of ventilatory support.
Methods: A team of RDs screened surgical ICU patients at a single institution. Patients intubated for at least 3 days were considered eligible for testing. Exclusion criteria included PEEP > 10 cm H2O, fraction of inspired oxygen > 60%, Richmond Agitation-Sedation Scale ≥ 1, chest tube air leak, extracorporeal membrane oxygenation use, and a temperature change > 1°C within 1 hour. Tests were completed using the Q-NRG+ Portable Metabolic Monitor (Baxter) based on RD and patient availability. Test results were compared to calculated needs based on the Penn State Equation (39 tests). Overfeeding/underfeeding was defined as > 15% deviation from the equation results. Analysis of the mean difference in energy needs was performed using a standard paired, two-tailed t-test for ≤ 7 total ventilated days and > 7 ventilated days.
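The comparison step above can be sketched in code. This is a minimal illustration only, assuming the commonly published Penn State (2003b) formulation, which builds on the Mifflin-St Jeor equation, together with the study's > 15% deviation threshold; the function names and example patient are hypothetical, not the study's implementation:

```python
def mifflin_st_jeor(weight_kg, height_cm, age_yr, male):
    """Mifflin-St Jeor resting metabolic rate (kcal/day)."""
    return 10 * weight_kg + 6.25 * height_cm - 5 * age_yr + (5 if male else -161)

def penn_state_2003b(weight_kg, height_cm, age_yr, male, tmax_c, ve_l_min):
    """Penn State (2003b) REE for ventilated patients (kcal/day):
    0.96*MSJ + 167*Tmax(degrees C) + 31*VE(L/min) - 6212."""
    msj = mifflin_st_jeor(weight_kg, height_cm, age_yr, male)
    return 0.96 * msj + 167 * tmax_c + 31 * ve_l_min - 6212

def classify_feeding(measured_ree, calculated_ree, tolerance=0.15):
    """> 15% deviation of measured REE from the equation result:
    measured < 85% of calculated means feeding to the equation would overfeed;
    measured > 115% of calculated means it would underfeed."""
    ratio = measured_ree / calculated_ree
    if ratio < 1 - tolerance:
        return "overfeeding"
    if ratio > 1 + tolerance:
        return "underfeeding"
    return "appropriate"

# Hypothetical ventilated patient: 80 kg, 175 cm, 60 y, male, Tmax 37.5 C, VE 10 L/min.
predicted = penn_state_2003b(80, 175, 60, True, 37.5, 10)  # ≈ 1895 kcal/day
print(classify_feeding(1600, predicted))
```

The "appropriate" band here matches the abstract's 85-115% of calculated REE definition.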
Results: Thirty patients underwent IC testing; a total of 39 tests were completed. There were no complications in RD-led IC testing, and minimal RT involvement was required after the first 5 tests were completed. Overall, 56.4% of all IC tests, regardless of ventilator days, indicated overfeeding. In addition, 33.3% of tests indicated appropriate feeding (85-115% of calculated REE), and 10.3% of tests demonstrated underfeeding. When stratified by ventilator days (> 7 d vs. ≤ 7 d), similar results were found: 66% of IC tests deviated > 15% from calculated caloric needs, with 54.4% vs. 60.0% overfed and 12.5% vs. 6.7% underfed by equation, respectively.
Conclusion: Equations estimating caloric needs provide inconsistent results. Nutritional equations under- and overestimate nutritional needs similarly, regardless of ventilatory days, when compared to IC. Despite the lack of statistical significance, the effects of poor nutrition are well documented and clinically significant. With minimal training, IC can be performed safely by an RD and bedside RN. Utilizing the RD to coordinate and perform IC testing is a feasible process that maximizes personnel efficiency and allows for immediate adjustment of the nutrition plan. IC, as the gold standard for nutrition estimation, should be performed on surgical ICU patients to assist in developing nutritional treatment algorithms.
Dolores Rodríguez1; Mery Guerrero2; María Centeno2; Barbara Maldonado2; Sandra Herrera2; Sergio Santana3
1Ecuadorian Society for the Fight against Cancer, Guayaquil, Guayas; 2SOLCA, Guayaquil, Guayas; 3University of Havana, La Habana, Ciudad de la Habana
Financial Support: None Reported.
Background: In 2022, the International Agency for Research on Cancer-Globocan reported nearly 20 million new cases of cancer worldwide, with 30,888 cases in Ecuador. Breast, prostate, and stomach cancers were the most diagnosed types. Oncohematological diseases (OHD) significantly affect the nutritional status of patients. The ELAN Ecuador-2014 study, involving over 5,000 patients, found malnutrition in 37% of participants overall, rising to 65% among those with OHD. The Latin American Study of Malnutrition in Oncology (LASOMO), conducted by FELANPE between 2019 and 2020, revealed a 59.1% frequency of malnutrition among 1,842 patients across 52 health centers in 10 Latin American countries. This study aims to present the current state of malnutrition associated with OHD among patients treated in Ecuadorian hospitals.
Methods: The Ecuadorian segment of the LASOMO Study was conducted between 2019 and 2020 as part of the previously mentioned regional epidemiological initiative. This study was designed as a one-day, nationwide, multicenter survey involving health centers and specialized services for patients with OHD across five hospitals located in the provinces of Guayas (3), Manabí (1), and Azuay (1). The nutritional status of patients with OHD was assessed using the B + C scores from Detsky et al.'s Subjective Global Assessment (SGA). This study included male and female patients aged 18 years and older, admitted to clinical, surgical, intensive care, and bone marrow transplant (BMT) units during October and November 2019. Participation was voluntary, and patients provided informed consent by signing a consent form. Data were analyzed using location, dispersion, and aggregation statistics based on variable types. The nature and strength of relationships were assessed using chi-square tests for independence, with a significance level of < 5% to identify significant associations. Odds ratios for malnutrition were calculated along with their associated 95% confidence intervals.
Results: The study enrolled 390 patients, with 63.6% women and 36.4% men, averaging 55.3 ± 16.5 years old; 47.2% were aged 60 years or older. The most common tumor locations included kidneys, urinary tract, uterus, ovaries, prostate, and testicles, accounting for 18.7% of all cases (refer to Table 1). Chemotherapy was the predominant oncological treatment, administered to 42.8% of the patients surveyed. Malnutrition affected 49.7% of the patients surveyed, with 14.4% categorized as severely malnourished (see figure 1). The incidence of malnutrition was found to be independent of age, educational level, tumor location, and current cytoreductive treatment (refer to Table 2). Notably, the majority of the malnourished individuals were men.
Conclusion: Malnutrition is highly prevalent in patients treated for OHD in Ecuadorian hospitals.
Table 1. Most Frequent Location of Neoplastic Disease in Hospitalized Ecuadorian Patients and Type of Treatment Received. The number and {in brackets} the percentage of patients included in the corresponding category are presented.
Table 2. Distribution of Malnutrition Associated with Cancer According to Selected Demographic, Clinical, and Health Characteristics of Patients Surveyed During the Oncology Malnutrition Study in Ecuador. The number and {in brackets} the percentage of malnourished patients included in each characteristic category are presented. The frequency of malnutrition was estimated using the Subjective Global Assessment (Detsky et al., 1987).
Figure 1. State of Malnutrition Among Patients Treated for Cancer in Hospitals In Ecuador.
Ranna Modir, MS, RD, CNSC, CDE, CCTD1; Christina Salido, RD1; William Hiesinger, MD2
1Stanford Healthcare, Stanford, CA; 2Stanford Medicine, Stanford, CA
Financial Support: None Reported.
Background: Cardiovascular Intensive Care Unit (CVICU) patients are at high risk for significant nutrient deficits, especially during the early intensive care unit (ICU) phase, when postoperative complications and hemodynamic instability are prevalent. These deficits can exacerbate the catabolic state, leading to muscle wasting, impaired immune function, and delayed recovery. A calorie deficit > 10,000 kcal, together with meeting < 80% of nutritional needs in the early ICU phase (first 14 days), has been linked to worse outcomes, including prolonged intubation, increased ICU length of stay (LOS), and higher risk of organ dysfunction and infections such as central line-associated bloodstream infections (CLABSI). When evaluating CLABSI risk factors, the role of nutrition adequacy and malnutrition is often underestimated and overlooked, with more emphasis placed on the type of nutrition support (NS) provided, whether enteral nutrition (EN) or parenteral nutrition (PN). Historically, there has been a practice of avoiding PN to reduce CLABSI risk, rather than ensuring that nutritional needs are fully met. This practice is based on initial reports from decades ago linking PN with hospital-acquired infections. However, updated guidelines based on modern data now indicate no difference in infectious outcomes with EN vs PN. In fact, adding PN when EN alone is insufficient can help reduce nutritional deficiencies, supporting optimal immune response and resilience to infection. As part of an ongoing NS audit in our CVICU, we reviewed all CLABSI cases over a 23-month period to assess nutrition adequacy and the modality of NS (EN vs PN) provided.
Methods: Data were extracted from electronic medical records for all CLABSI cases from September 2020 to July 2022. Data collected included patient characteristics, clinical and nutrition outcomes (Table 1). Descriptive statistics (means, standard deviations, frequencies) were calculated. Chi-square or Fisher's exact test assessed the association between type of NS and meeting >80% of calorie/protein targets within 14 days and until CLABSI onset. A significance level of 0.05 was used.
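As a sketch of the 2 × 2 association test described above, the two-sided Fisher's exact test can be computed with the standard library alone. The counts below are invented for illustration and are not the study's data:

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher's exact test p-value for the 2x2 table [[a, b], [c, d]].
    Sums hypergeometric probabilities of all tables with the same margins
    that are no more likely than the observed one."""
    row1, row2, col1 = a + b, c + d, a + c
    n = row1 + row2

    def p_table(x):  # probability that the top-left cell equals x
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = p_table(a)
    lo, hi = max(0, col1 - row2), min(col1, row1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# Invented counts: rows = NS modality (EN only, EN + PN),
# columns = (met > 80% of calorie target, did not meet it).
p = fisher_exact_2x2(4, 11, 7, 6)
print(f"p = {p:.3f}")
```

Fisher's exact test is generally preferred over chi-square when expected cell counts are small, which is likely with a cohort of 28 patients.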
Results: Over a 23-month period, 28/51 (54.9%) patients with a CLABSI required exclusive NS throughout their entire CVICU stay; 18 were male (64.3%), median age was 54.5 years, mean BMI was 27.4, and median CVICU LOS was 49.5 days, with a 46.4% mortality rate. Surgical intervention was indicated in 60.7% of patients, with 41.2% requiring preoperative extracorporeal membrane oxygenation (ECMO) and 52.9% postoperative ECMO (Table 1). The majority of patients received exclusive EN (53.6%), with 46.4% receiving EN + PN and 0% exclusive PN. In the first 14 ICU days, 21.4% met > 80% of calorie needs and 32.1% met > 80% of protein needs, while 32.1% had a calorie deficit > 10,000 kcal. There was no difference between type of NS and ability to meet > 80% of nutrient targets in the first 14 days (Table 1, p = 0.372, p = 0.689). The majority of PN (61.5%) was initiated after ICU day 7. From ICU day 1 until CLABSI onset, the EN + PN group was better able to meet > 80% of calorie targets than the exclusive EN group (p = 0.016). Fifty percent were diagnosed with malnutrition; 82% required ECMO cannulas and 42.9% a dialysis triple lumen. Enterococcus faecalis was the most common organism for both the EN (43.7%) and EN + PN groups (35.7%) (Table 2).
Conclusion: This single-center analysis of CVICU CLABSI patients found that the majority of those requiring exclusive NS failed to meet > 80% of nutrition needs during the early ICU phase. Exclusive EN was the primary mode of NS compared to EN + PN or PN alone, challenging the assumption that PN inherently increases CLABSI risk. In fact, EN + PN improved the ability to meet calorie targets until CLABSI onset. These findings suggest that early nutrient deficits may increase CLABSI risk and that the risk is not dependent on the type of NS provided.
Table 1. Patient Characteristics, Clinical and Nutritional Outcomes.
Table 2. Type of Central Access Device and Microorganism in Relation to Modality of Nutrition Support Provided.
Oki Yonatan, MD1; Faya Nuralda Sitompul2
1ASPEN, Jakarta, Jakarta Raya; 2Osaka University, Minoh, Osaka
Financial Support: None Reported.
Background: Ginseng, widely used as a functional food or therapeutic supplement in Asia, contains bioactive compounds such as ginsenosides, which exert a range of biological effects, including hypoglycemic, anti-inflammatory, cardioprotective, and anti-tumor properties. However, studies have indicated that ginseng also has anticoagulant and anti-aggregation effects and may be associated with bleeding. This case report presents a potential case of ginseng-induced bleeding in an elderly patient with advanced pancreatic cancer.
Case Description: A 76-year-old male with stage IV pancreatic cancer and metastases to the liver, lymph nodes, and peritoneum was in home care with BiPAP ventilation, NGT feeding, ascites drainage, and a Foley catheter. He had a history of type A aortic dissection repair, anemia, and thrombocytopenia, with platelet counts consistently below 50,000/µL. Despite no history of anticoagulant use, the patient developed massive gastrointestinal bleeding and hematuria after consuming 100 grams of American Ginseng (AG) per day for five days. Disseminated intravascular coagulation (DIC) was initially suspected, but no signs of bleeding were observed until the third week of care, coinciding with ginseng consumption. Endoscopy was not performed due to the patient's unstable condition and the family's refusal.
Discussion: The consumption of AG may have triggered bleeding given the patient's already unstable condition and low platelet count. Ginsenosides, particularly Rg1, Rg2, and Rg3, have been shown to exert anticoagulant effects, prolonging clotting times and inhibiting platelet aggregation. Studies have demonstrated that AG extracts can significantly extend clotting times and reduce platelet activity, potentially contributing to the observed bleeding.
Conclusion: This case highlights the possible role of AG in inducing severe bleeding in a patient with pancreatic cancer and thrombocytopenia. Given ginseng's known anticoagulant properties, caution should be exercised when administering it to patients with hematological abnormalities or bleeding risks, and further research is warranted to assess its safety in these populations.
Methods: None Reported.
Results: None Reported.
Conclusion: None Reported.
Kursat Gundogan, MD1; Mary Nellis, PhD2; Nurhayat Ozer, PhD3; Sahin Temel, MD3; Recep Yuksel, MD4; Murat Sungar, MD5; Dean Jones, PhD2; Thomas Ziegler, MD6
1Division of Clinical Nutrition, Erciyes University Health Sciences Institute, Kayseri; 2Emory University, Atlanta, GA; 3Erciyes University Health Sciences Institute, Kayseri; 4Department of Internal Medicine, Erciyes University School of Medicine, Kayseri; 5Department of Internal Medicine, Erciyes University School of Medicine, Kayseri; 6Emory Healthcare, Atlanta, GA
Financial Support: Erciyes University Scientific Research Committee (TSG- 2021–10078) and TUBITAK 2219 program (1059B-192000150), each to KG, and National Institutes of Health grant P30 ES019776, to DPJ and TRZ.
Background: Metabolomics represents a promising profiling technique for the investigation of significant metabolic alterations that arise in response to critical illness. The present study utilized plasma high-resolution metabolomics (HRM) analysis to define systemic metabolism associated with common critical illness severity scores in critically ill adults.
Methods: This cross-sectional study was performed at Erciyes University Hospital, Kayseri, Türkiye, and Emory University, Atlanta, GA, USA. Participants were critically ill adults with an expected length of intensive care unit (ICU) stay longer than 48 h. Plasma for metabolomics was obtained on the day of ICU admission. Data were analyzed using regression analysis of two ICU admission illness severity scores (APACHE II and mNUTRIC) against all plasma metabolomic features in metabolome-wide association studies (MWAS). The APACHE II score was analyzed as a continuous variable, and the mNUTRIC score was analyzed as a dichotomous variable [≤ 4 (low) vs. > 4 (high)]. Pathway enrichment analysis was performed on significant metabolites (raw p < 0.05) related to each of the two illness severity scores independently.
Results: A total of 77 patients were included. The mean age was 69 years (range 33-92 years); 65% were female. More than 15,000 metabolomic features were identified for the MWAS of the APACHE II and mNUTRIC scores. Metabolic pathways significantly associated with APACHE II score at ICU admission included C21-steroid hormone biosynthesis as well as urea cycle, vitamin E, seleno amino acid, aspartate/asparagine, and thiamine metabolism. Metabolic pathways associated with mNUTRIC score at ICU admission were N-glycan degradation and metabolism of fructose/mannose, vitamin D3, pentose phosphate, sialic acid, and linoleic acid. Within the significant pathways, the gut microbiome-derived metabolites hippurate and N-acetyl ornithine were downregulated, and creatine and glutamate were upregulated, with increasing APACHE II scores. Metabolites involved in energy metabolism that were altered with a high (> 4) mNUTRIC score included N-acetylglucosamine (increased) and gluconate (decreased).
Conclusion: Plasma HRM identified significant associations between two commonly used illness severity scores and metabolic processes involving steroid biosynthesis, the gut microbiome, skeletal muscle, and amino acid, vitamin, and energy metabolism in adult critically ill patients.
1Duke Health, Durham, NC; 2Duke University School of Medicine- Department of Anesthesiology, Durham, NC; 3Duke University Medical Center - Department of Anesthesiology - Duke Heart Center, Raleigh, NC; 4Duke University School of Medicine, Durham, NC; 5Duke University Medical School, Durham, NC
Financial Support: None Reported.
Background: Little evidence currently exists on the effect of body mass index (BMI) on resting energy expenditure (REE). Current clinical practice guidelines are based primarily on expert opinion and provide a wide range of calorie recommendations without delineations specific to BMI categories, making it difficult for the clinician to accurately prescribe calorie needs for their hospitalized patients. This abstract utilizes metabolic cart data from studies conducted at a large academic healthcare system to investigate trends within BMI and REE.
Methods: A pooled cohort of hospitalized patients was compiled from three clinical trials in which metabolic cart measurements were collected. In all three studies, indirect calorimetry was initially conducted in the intensive care setting, with follow-up measurements conducted as clinically able. Variables included in the analysis were measured resting energy expenditure (mREE) in total kcals as well as kcals per kilogram of body weight on ICU admission, respiratory quotient, days post ICU admission, and demographic and clinical characteristics. ANOVA tests were utilized to analyze continuous data.
Results: A total of 165 patients were included in the final analysis, with 338 indirect calorimetry measurements. Disease groups included COVID-19 pneumonia, non-COVID respiratory failure, surgical ICU, cardiothoracic surgery, and trauma. The average age of patients was 58 years, with 96 males (58.2%) and 69 females (41.8%), and an average BMI of 29.0 kg/m². The metabolic cart measurements on average were taken on day 8 post ICU admission (ranging from day 1 to day 61). See Table 1 for more demographics and clinical characteristics. Indirect calorimetry measurements were grouped into three BMI categories: BMI ≤ 29.9 (normal), BMI 30-39.9 (obese), and BMI ≥ 40 (super obese). ANOVA analyses showed statistical significance among the three BMI groups in both total kcals (p < 0.001) and kcals per kg (p < 0.001). The normal BMI group had an average mREE of 1632 kcals (range 767 to 4023), compared to 1868 kcals (range 1107 to 3754) in the obese BMI group and 2004 kcals (range 1219 to 3458) in the super obese BMI group. Similarly, when analyzing kcals per kg, the normal BMI group averaged 23.3 kcals/kg, the obese BMI group 19.8, and the super obese BMI group 16.3.
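The grouping-and-ANOVA step can be sketched with the standard library alone. The mREE values below are invented for illustration, and a stats package would normally supply the F statistic and p-value:

```python
from statistics import mean

def bmi_group(bmi):
    """The study's grouping: <= 29.9 'normal', 30-39.9 'obese', >= 40 'super obese'."""
    if bmi < 30:
        return "normal"
    if bmi < 40:
        return "obese"
    return "super obese"

def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA across k independent groups."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = mean(x for g in groups for x in g)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Invented mREE measurements (kcal) bucketed by BMI group, for illustration only.
mree = {
    "normal": [1500, 1700, 1632],
    "obese": [1800, 1900, 1868],
    "super obese": [1950, 2100, 2004],
}
print(one_way_anova_f(list(mree.values())))
```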
Conclusion: Without access to a metabolic cart to accurately measure REE, the majority of nutrition clinicians are left to rely on estimations. Current clinical guidelines and published data do not provide the guidance necessary to accurately feed many hospitalized patients. This analysis only scratches the surface of the metabolic demands of different patient populations based on their BMI status, especially given the wide ranges of energy expenditures. Robust studies are needed to further elucidate the relationships between BMI and REE in different disease states.
Table 1. Demographics and Clinical Characteristics.
Figure 1. Average Measured Resting Energy Expenditure (in Total Kcals), by BMI Group.
Figure 2. Average Measured Resting Energy Expenditure (in kcals/kg), by BMI Group.
Carlos Reyes Torres, PhD, MSc1; Daniela Delgado Salgado, Dr2; Sergio Diaz Paredes, Dr1; Sarish Del Real Ordoñez, Dr1; Eva Willars Inman, Dr1
1Hospital Oncológico de Coahuila (Oncological Hospital of Coahuila), Saltillo, Coahuila de Zaragoza; 2ISSSTE, Saltillo, Coahuila de Zaragoza
Financial Support: None Reported.
Background: Chemotherapy is one of the principal treatments for cancer. Some degree of toxicity is described in 98% of patients. Changes in body composition are frequent and are related to worse outcomes. Low muscle mass is associated with chemotherapy toxicity in observational studies. Phase angle (PhA) is an indicator of cell integrity and positively correlates with adequate nutritional status and muscle mass. Few studies have evaluated the association of PhA with chemotherapy toxicity. The aim of this study was to evaluate the association of PhA and body composition with chemotherapy toxicity in cancer patients.
Methods: A prospective cohort study was conducted in adult patients with solid neoplastic disease receiving first-line systemic treatments. The subjects were evaluated at the first chemotherapy treatment using bioelectrical impedance analysis with an RJL device, according to the standardized technique. The outcome was chemotherapy toxicity during the first 4 cycles of chemotherapy and its association with PhA and body composition. Toxicity was evaluated using the National Cancer Institute (NCI) Common Terminology Criteria for Adverse Events, version 5.0. A PhA < 4.7 was considered low, in line with other studies.
Results: A total of 54 patients were evaluated and included in the study. The most common cancer diagnoses were breast cancer (40%), gastrointestinal tumors (33%), and lung cancer (18%). Chemotherapy toxicity was present in 46% of the patients. The most common adverse effects were gastrointestinal (48%), blood disorders (32%), and metabolic disorders (40%). There were statistically significant differences in PhA between patients with chemotherapy toxicity and patients without adverse effects: 4.45º (3.08-4.97) vs 6.07º (5.7-6.2), respectively, p value < 0.001. PhA was associated with the risk of chemotherapy toxicity, HR 8.7 (95% CI 6.1-10.7), log-rank test p = 0.02.
Conclusion: PhA was associated with the risk of chemotherapy toxicity in cancer patients.
Lizl Veldsman, RD, M Nutr, BSc Dietetics1; Guy Richards, MD, PhD2; Carl Lombard, PhD3; Renée Blaauw, PhD, RD1
1Division of Human Nutrition, Department of Global Health, Faculty of Medicine & Health Sciences, Stellenbosch University, Cape Town, Western Cape; 2Department of Surgery, Division of Critical Care, Faculty of Health Sciences, University of the Witwatersrand, Johannesburg, Gauteng; 3Division of Epidemiology and Biostatistics, Department of Global Health, Faculty of Medicine and Health Sciences, Stellenbosch University, Cape Town, Western Cape
Financial Support: Fresenius Kabi JumpStart Research Grant.
Background: Critically ill patients lose a significant amount of muscle mass over the first ICU week. We sought to determine the effect of bolus amino acid (AA) supplementation on the urea-to-creatinine ratio (UCR) trajectory over time, and whether UCR correlates with histology myofiber cross-sectional area (CSA) as a potential surrogate marker of muscle mass.
Methods: This was a secondary analysis of data from a registered clinical trial (ClinicalTrials.gov NCT04099108) undertaken in a predominantly trauma surgical ICU. Participants were randomly assigned into two groups, both of which received standard care nutrition (SCN) and mobilisation. Study participants randomised to the intervention group also received a bolus AA supplement, with a 45-minute in-bed cycling session, from average ICU day 3 for a mean of 6 days. The change in vastus lateralis myofiber CSA, measured from pre-intervention (average ICU day 2) to post-intervention (average ICU day 8), was assessed through biopsy and histological analysis. A linear mixed-effects regression model was used to compare the mean daily UCR profiles between study groups from ICU day 0 to day 10. As a sensitivity analysis, we adjusted for disease severity on admission (APACHE II and SOFA scores), daily fluid balance, and the presence of acute kidney injury (AKI). A Spearman correlation compared the UCR on ICU day 2 (pre-intervention) and ICU day 8 (post-intervention) with the corresponding myofiber CSA.
Results: A total of 50 enrolled participants were randomised to the study intervention and control groups in a 1:1 ratio. The control and intervention groups received, on average, 87.62 ± 32.18 and 85.53 ± 29.29 grams of protein per day (1.26 ± 0.41 and 1.29 ± 0.40 g/kg/day, respectively) from SCN, and the intervention group an additional 30.43 ± 5.62 grams of AA (0.37 ± 0.06 g/kg protein equivalents) from the AA supplement. At baseline, the mean UCR for the control (75.6 ± 31.5) and intervention group (63.8 ± 27.1) were similar. Mean UCR increased daily from baseline in both arms, but at a faster rate in the intervention arm with a significant intervention effect (p = 0.0127). The UCR for the intervention arm at day 7 and 8 was significantly higher by 21 and 22 units compared to controls (p = 0.0214 and p = 0.0215, respectively). After day 7 the mean daily UCR plateaued in the intervention arm, but not in the controls. Adjusting for disease severity, daily fluid balance, and AKI did not alter the intervention effect. A significant negative association was found between UCR and myofiber CSA (r = -0.39, p = 0.011) at ICU day 2 (pre-intervention), but not at day 8 (post-intervention) (r = 0.23, p = 0.153).
Conclusion: Bolus amino acid supplementation significantly increases the UCR during the first ICU week, thereafter plateauing. UCR at baseline may be an indicator of muscle status.
Figure 1. Change in Urea-to-Creatinine Ratio (UCR) Over ICU Days in the Control and Intervention Group. Error bars Represent 95% Confidence Intervals (CIs).
1UNAM, National Autonomous University of Mexico, Mexico City, Distrito Federal; 2National Institute of Respiratory Diseases, Mexico City, Distrito Federal; 3National Institute of Medical Sciences and Nutrition Salvador Zubirán, Mexico City, Distrito Federal
Financial Support: None Reported.
Background: Non-defecation (ND) is highly prevalent in critically ill patients on mechanical ventilation (MV) and has been reported in up to 83% of cases. This disorder is associated with high morbidity and mortality rates. Most existing research has focused on the association between clinical data and non-defecation; however, there is a lack of evidence regarding its association with dietary factors. We aimed to analyze the association of dietary fiber in enteral nutrition, and of the amount of fluids administered through enteral and parenteral routes, with defecation during the first 6 days of MV in critically ill patients with pneumonia and other lung manifestations.
Methods: We conducted a longitudinal analysis of MV patients receiving enteral nutrition (EN) at a tertiary care hospital in Mexico City between May 2023 and April 2024. The inclusion criteria were age > 18 years, MV, admission to the respiratory intensive care unit (ICU) for pneumonia or other lung manifestations, and a nutritional assessment performed during the first 24 h after ICU admission. Patients who required parenteral nutrition, major surgery, traumatic brain injury, or neuromuscular disorders were excluded from this study. A nutritional assessment, including the NUTRIC score, SOFA and APACHE II assessments, and an estimation of energy-protein requirements, was performed by trained dietitians within the first 24 hours after ICU admission. During each day of follow-up (days 0 to 6), we recorded the amount of fiber provided in EN, the volume of infusion fluids, including the enteral and parenteral routes, and medical prescription of opioids, sedatives, neuromuscular blockers, and vasopressors. ND was defined as > 6 days without defecation from ICU admission. The differences between ND and defecation were also assessed. Associations of ND with dietary factors were examined using discrete-time survival analysis.
Results: Seventy-four patients were included; ND was observed in 40 patients (54%). The non-defecation group had a longer ICU length of stay, and 50% of this group did not have their first defecation until day 10. No differences in fiber provision and volume of infusion fluids were observed between the groups. In multivariate analysis, no associations between ND and fiber (fiber intake 10 to 20 g per day, OR 1.17, 95% CI 0.41-3.38, p = 0.29) or total fluids (fluid intake 25 to 30 ml/kg/d, OR 1.85, 95% CI 0.44-7.87, p = 0.404) were observed.
Conclusion: Non-defecation affected 54% of the study population. Although fiber and fluids are considered a treatment for non-defecation, we did not find an association in critically ill patients.
Table 1. Demographic and Clinical Characteristics by Groups.
Table 2. Daily Comparison of Dietary Factors.
Andrea Morand, MS, RDN, LD1; Osman Mohamed Elfadil, MBBS1; Kiah Graber, RDN1; Yash Patel, MBBS1; Suhena Patel, MBBS1; Chloe Loersch, RDN1; Isabelle Wiggins, RDN1; Anna Santoro, MS, RDN1; Natalie Johnson, MS1; Kristin Eckert, MS, RDN1; Dana Twernbold, RDN1; Dacia Talmo, RDN1; Elizabeth Engel, RRT, LRT1; Avery Erickson, MS, RDN1; Alex Kirby, MS, RDN1; Mackenzie Vukelich, RDN1; Kate Sandbakken, RDN1; Victoria Vasquez, RDN1; Manpreet Mundi, MD1
1Mayo Clinic, Rochester, MN
Financial Support: None Reported.
Background: Current guidelines for nutrition support in critically ill patients recommend the utilization of indirect calorimetry (IC) to determine energy needs. However, IC testing is limited at many institutions due to accessibility, labor, and costs, which leads to reliance on predictive equations to determine caloric targets. A quality improvement (QI) initiative was implemented to assess the impact on nutrition care when IC is routinely completed.
Methods: A prospective QI project in selected medical and surgical intensive care units (ICU) included critically ill patients assessed by the dietitian within 24-48 hours of the consult order or by hospital day 4. Patients with contraindications to IC were excluded, including those requiring ECMO, CRRT, MARS, or FiO2 > 55%, as well as spontaneously breathing patients requiring significant supplemental oxygen. The initial caloric target was established utilizing predictive equations, which is the standard of care at our institution. After day 4 of the ICU stay, IC measurements, predictive equations, and weight-based nomograms were collected. The predictive equations utilized included Harris-Benedict (HB) basal, adjusted HB (75% of basal when body mass index (BMI) > 30), Penn State if ventilated, Mifflin-St Jeor (MSJ), revised HB, and revised HB adjusted (75% of basal when BMI > 30). Additional demographic, anthropometric, and clinical data were collected.
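Two of the equations named above have standard published forms, sketched below for reference (weight in kg, height in cm, age in years). The institution-specific "revised" and "adjusted" variants are not reproduced, and the example inputs are illustrative, not patient data:

```python
def mifflin_st_jeor(weight_kg: float, height_cm: float, age_yr: float, male: bool) -> float:
    """Mifflin-St Jeor resting energy expenditure (kcal/day)."""
    ree = 10 * weight_kg + 6.25 * height_cm - 5 * age_yr
    return ree + 5 if male else ree - 161

def harris_benedict(weight_kg: float, height_cm: float, age_yr: float, male: bool) -> float:
    """Original Harris-Benedict basal energy expenditure (kcal/day)."""
    if male:
        return 66.47 + 13.75 * weight_kg + 5.003 * height_cm - 6.755 * age_yr
    return 655.1 + 9.563 * weight_kg + 1.850 * height_cm - 4.676 * age_yr

# Illustrative example: a 61-year-old male, 90 kg, 175 cm
msj = mifflin_st_jeor(90, 175, 61, male=True)
hb = harris_benedict(90, 175, 61, male=True)
```

Both equations estimate resting energy needs from anthropometrics alone, which is why they can diverge substantially from measured IC values in critical illness.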
Results: Patients (n = 85) were majority male (n = 53, 62.4%), admitted to the surgical ICU (n = 57, 67.1%), and overweight (mean BMI 29.8 kg/m²); the average age was 61.3 years (SD 16.5). At the time of the IC test, the median ICU length of stay was 6 days; 77.6% (n = 66) were supported with mechanical ventilation, and the median ventilator duration was 4 days (Table 1). Mean IC-measured REE was compared to the predictive equations: except for the weight-based nomogram for high caloric needs (p = 0.3615), all equations were significantly lower than IC (p < 0.0001). Median absolute differences from IC for the evaluated predictive equations are shown in Figure 1. After REE was measured, caloric goals increased significantly for patients on enteral (EN) and parenteral nutrition (PN) (p = 0.0016 and p = 0.05, respectively). In enterally fed patients, the mean calorie goal was 1,655.4 kcal (SD 588) before REE and 1,917.6 kcal (SD 528.6) after, an average increase of 268.4 kcal/day; in parenterally fed patients, the mean calorie goal was 1,395.2 kcal (SD 313.6) before REE and 1,614.1 kcal (SD 239.3) after, an average increase of 167.5 kcal (Table 2). The mean REE per BMI category per actual body weight was: BMI < 29.9 = 25.7 ± 7.9 kcal/kg, BMI 30-34.9 = 20.3 ± 3.8 kcal/kg, BMI 35-39.9 = 22.8 ± 4.6 kcal/kg, and BMI ≥ 40 = 16.3 ± 2.9 kcal/kg (25.4 ± 10.5 kcal/kg of ideal body weight). Figure 2 illustrates the average daily calorie need broken down by BMI for IC and the examined predictive equations.
Conclusion: There was a significant difference between IC measurements and the various predictive equations, except for the weight-based nomogram for high estimated calorie needs. Nutrition goals changed significantly in response to IC measurements. It is recommended that we expand the use of IC in the critically ill population at our institution. In settings where IC is not possible, weight-based nomograms should be utilized.
Table 1. Baseline Demographics and Clinical Characteristics.
Table 2. Nutrition Support.
Figure 1. Difference in Daily Calorie Estimation Utilizing IC Compared to Predictive Equations.
Figure 2. RMR by IC and Other Predictive Equations by BMI.
GI, Obesity, Metabolic, and Other Nutrition Related Concepts
Suhena Patel, MBBS1; Osman Mohamed Elfadil, MBBS1; Yash Patel, MBBS1; Chanelle Hager, RN1; Manpreet Mundi, MD1; Ryan Hurt, MD, PhD1
1Mayo Clinic, Rochester, MN
Financial Support: None Reported.
Background: Chronic capillary leak syndrome is a rare but potentially fatal side effect of immunotherapy for some malignancies, mainly manifesting as intractable generalized edema and often refractory hypotension. An idiopathic form of the syndrome is also recognized. It is diagnosed by exclusion in patients with a single or recurrent episode of intravascular hypovolemia or generalized edema, primarily manifested by the diagnostic triad of hypotension, hemoconcentration, and hypoalbuminemia in the absence of an identifiable alternative cause. Supportive care, with a role for steroids, remains the standard treatment. In capillary leak syndrome secondary to cancer immunotherapy, discontinuing the offending agent is typically considered. This relatively rare syndrome can be associated with significant clinical challenges. This clinical case report focuses on aspects of nutrition care.
Methods: A 45-year-old male with a past medical history of hypertension, pulmonary tuberculosis in childhood, and rectal cancer was admitted for evaluation of anasarca. He was first diagnosed with moderately differentiated invasive adenocarcinoma of the rectum, stage IIIb (cT3, cN1, cM0), in October 2022. As initial therapy, he was enrolled in a clinical trial. He received 25 cycles of immunotherapy with the study drug Vudalimab (a PD1/CTLA4 bispecific antibody), achieving a complete clinical response without additional chemotherapy, radiation, or surgery. Unfortunately, he developed extensive capillary leak syndrome manifested with recurrent anasarca, chylous ascites, and pleural effusions beginning in November 2023. His treatment was also complicated by the development of thyroiditis and insulin-dependent diabetes. The patient most recently presented with abdominal fullness, ascites, and peripheral edema that did not improve despite diuretic therapy. A diagnostic and therapeutic paracentesis was performed, revealing chylous ascites. Two weeks later, the patient presented with re-accumulation of ascites and worsening anasarca with pleural and pericardial effusions. A PET-CT was negative for malignant lesions but revealed increased uptake along the peritoneal wall, suggestive of peritonitis. A lymphangiogram performed for further evaluation revealed no gross leak or obstruction; however, this study could not rule out a microleak from increased capillary permeability. He required bilateral pleural and peritoneal drains (output ranged from 0.5 to 1 L daily). A diagnosis of capillary leak syndrome was made. In addition to octreotide, immunosuppression therapy was initiated with IV methylprednisolone (40 mg BID) followed by a transition to oral steroids (prednisone 60 mg PO); however, the patient's symptoms reappeared with the dose reduction and transition to oral steroids.
His immunosuppression regimen was modified to include a trial of weekly IVIG and IV albumin twice daily. From a nutritional perspective, he was initially on a routine oral diet; however, his drain output increased, particularly after fatty food consumption, so he was switched to a low-fat (40 g/day), high-protein diet to prevent worsening chylous ascites. In the setting of worsening anasarca and moderate malnutrition based on ASPEN criteria, along with clinically significant muscle loss, he was started on TPN. A no-fat diet was initiated to minimize lymphatic flow, with subsequent improvement in his chest tube output volume, followed by a transition to home parenteral nutrition with a mixed-oil lipid emulsion and an oral diet.
Results: None Reported.
Conclusion: Chronic capillary/lymphatic leak syndrome can be challenging and may necessitate dietary modification. Along with dietary changes to significantly reduce oral fat intake, short- or long-term PN can be considered.
Kishore Iyer, MBBS1; Francisca Joly, MD, PhD2; Donald Kirby, MD, FACG, FASPEN3; Simon Lal, MD, PhD, FRCP4; Kelly Tappenden, PhD, RD, FASPEN2; Palle Jeppesen, MD, PhD5; Nader Youssef, MD, MBA6; Mena Boules, MD, MBA, FACG6; Chang Ming, MS, PhD6; Tomasz Masior, MD6; Susanna Huh, MD, MPH7; Tim Vanuytsel, MD, PhD8
1Icahn School of Medicine at Mount Sinai, New York, NY; 2Nutritional Support, Hôpital Beaujon, Paris, Ile-de-France; 3Department of Intestinal Failure and Liver Diseases, Cleveland, OH; 4Salford Royal NHS Foundation Trust, Salford, England; 5Department of Intestinal Failure and Liver Diseases, Rigshospitalet, Copenhagen, Denmark, Hovedstaden; 6Ironwood Pharmaceuticals, Basel, Basel-Stadt; 7Ironwood Pharmaceuticals, Boston, MA; 8University Hospitals Leuven, Leuven, Brabant Wallon
Encore Poster
Presentation: American College of Gastroenterology 2024, October 25-30, 2024, Philadelphia, Pennsylvania.
Tim Vanuytsel, MD, PhD1; Simon Lal, MD, PhD, FRCP2; Kelly Tappenden, PhD, RD, FASPEN3; Donald Kirby, MD, FACG, FASPEN4; Palle Jeppesen, MD, PhD5; Francisca Joly, MD, PhD3; Tomasz Masior, MD6; Patricia Valencia, PharmD7; Chang Ming, MS, PhD6; Mena Boules, MD, MBA, FACG6; Susanna Huh, MD, MPH7; Kishore Iyer, MBBS8
1University Hospitals Leuven, Leuven, Brabant Wallon; 2Salford Royal NHS Foundation Trust, Salford, England; 3Nutritional Support, Hôpital Beaujon, Paris, Ile-de-France; 4Department of Intestinal Failure and Liver Diseases, Cleveland, OH; 5Department of Intestinal Failure and Liver Diseases, Rigshospitalet, Copenhagen, Denmark, Hovedstaden; 6Ironwood Pharmaceuticals, Basel, Basel-Stadt; 7Ironwood Pharmaceuticals, Boston, MA; 8Icahn School of Medicine at Mount Sinai, New York, NY
Encore Poster
Presentation: American College of Gastroenterology 2024, October 25-30, 2024, Philadelphia, Pennsylvania.
Financial Support: None Reported.
Boram Lee, MD1; Ho-Seong Han, PhD1
1Seoul National University Bundang Hospital, Seoul, Seoul-t'ukpyolsi
Financial Support: None Reported.
Background: Pancreatic cancer is one of the most fatal malignancies, with a 5-year survival rate of less than 10%. Despite advancements in treatment, its incidence is increasing, driven by aging populations and rising obesity rates. Obesity is traditionally considered a negative prognostic factor for many cancers, including pancreatic cancer. However, the "obesity paradox" suggests that obesity might be associated with better outcomes in certain diseases. This study investigated the effect of obesity on the survival of long-term pancreatic cancer survivors after pancreatectomy.
Methods: A retrospective analysis was conducted on 404 patients with pancreatic ductal adenocarcinoma (PDAC) who underwent surgery between January 2004 and June 2022. Patients were classified into non-obese (BMI 18.5-24.9 kg/m²) (n = 313) and obese (BMI ≥ 25.0 kg/m²) (n = 91) groups. The data collected included demographic, clinical, perioperative, and postoperative information. Survival outcomes (overall survival [OS], recurrence-free survival [RFS], and cancer-specific survival [CSS]) were analyzed using Kaplan-Meier curves and Cox regression models. A subgroup analysis examined the impact of the visceral-to-subcutaneous fat ratio (VSR) on survival within the obese cohort.
Results: Obese patients (n = 91) had a significantly better 5-year OS (38.9% vs. 27.9%, p = 0.040) and CSS (41.4% vs. 33%, p = 0.047) than non-obese patients. RFS did not differ significantly between the groups. Within the obese cohort, a lower VSR was associated with improved survival (p = 0.012), indicating the importance of fat distribution in outcomes.
Conclusion: Obesity is associated with improved overall and cancer-specific survival in patients with pancreatic cancer undergoing surgery, highlighting the potential benefits of a nuanced approach to managing obese patients. The distribution of adipose tissue, specifically higher subcutaneous fat relative to visceral fat, further influences survival, suggesting that tailored treatment strategies could enhance the outcomes.
Nicole Nardella, MS1; Nathan Gilchrist, BS1; Adrianna Oraiqat, BS1; Sarah Goodchild, BS1; Dena Berhan, BS1; Laila Stancil, HS1; Jeanine Milano, BS1; Christina Santiago, BS1; Melissa Adams, PA-C1; Pamela Hodul, MD1
1Moffitt Cancer Center, Tampa, FL
Financial Support: None Reported.
Background: Pancreatic cancer (PC) is a devastating diagnosis, with 66,440 new cases and 51,750 deaths estimated in 2024. The prevalence of malnutrition in patients with cancer has been reported to range from 30-85%, depending on patient age, cancer type, and stage of disease. PC patients in particular frequently present with malnutrition, which can negatively affect quality of life and tumor therapy. We hypothesize that increased awareness of early nutritional intervention for PC patients has led to high utilization of dietitian consultations at our tertiary cancer center.
Methods: This IRB-exempt retrospective review included newly diagnosed, treatment-naïve PC patients presenting to our institution in 2021-2023 (n = 701). We defined newly diagnosed as having pathologically confirmed adenocarcinoma within 30 days of clinic presentation. Patients were screened for weight loss and risk of malnutrition using the validated Malnutrition Screening Tool (MST) (89.5% positive predictive value) at the initial consultation and were referred to a dietitian based on risk or patient preference. Data were collected on demographics, disease stage (localized vs metastatic), tumor location (head/neck/uncinate, body/tail, multi-focal), presenting symptoms (percent weight loss, abdominal pain, bloating, nausea/vomiting, fatigue, change in bowel habits), history of jaundice, pancreatitis, and gastric outlet obstruction, and dietitian consultation. Descriptive variables and Fisher's exact test were used to report outcomes.
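As a minimal illustration of the Fisher's exact test used here for categorical comparisons, on an invented 2x2 contingency table (the counts are placeholders, not study data):

```python
from scipy.stats import fisher_exact

# Rows: group A vs group B; columns: symptom present vs absent.
# Counts are illustrative only.
table = [[20, 10],
         [15, 25]]

odds_ratio, p_value = fisher_exact(table)
```

Fisher's exact test is preferred over chi-square when cell counts are small, since it computes the exact hypergeometric probability of the table rather than relying on a large-sample approximation.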
Results: The majority of patients were male (54%), with a median age of 70 (range, 27-95). About half of patients had localized disease (54%), with the primary tumor located in the head/neck/uncinate region (57%). Patients with head/neck/uncinate tumors mostly had localized disease (66%), while patients with body/tail tumors tended to have metastatic disease (63%). See Table 1 for further demographics. Unintentional weight loss was experienced by 66% of patients (n = 466): 69% of localized patients (n = 261) and 64% of metastatic patients (n = 205). Patients with localized disease reported a 12% weight loss over a median of 3 months, while metastatic patients reported a 10% weight loss over a median of 5 months. Of the localized patients, the majority presented with symptoms of abdominal pain (66%), nausea/vomiting/fatigue (61%), and change in bowel habits (44%). Presenting symptoms of the metastatic patients were similar (see Table 2). Tumor location was not significantly associated with presenting symptoms. Dietitian consults occurred for 67% (n = 473) of the patient population: 77% of those with localized disease and 57% of those with metastatic disease. Of those with reported weight loss, 74% (n = 343) had a dietitian consultation.
Conclusion: Overall, a high proportion of newly diagnosed, treatment-naïve PC patients presented with malnutrition. Patients with localized disease and tumors located in the head/neck/uncinate region experienced the greatest gastrointestinal symptom burden of nausea, vomiting, change in bowel habits, and fatigue. Early implementation of a proactive nutritional screening program resulted in increased awareness of malnutrition and referral for nutritional intervention for newly diagnosed PC patients.
Table 1. Demographics and Disease Characteristics.
Table 2. Presenting Symptoms.
Nicole Nardella, MS1; Nathan Gilchrist, BS1; Adrianna Oraiqat, BS1; Sarah Goodchild, BS1; Dena Berhan, BS1; Laila Stancil, HS1; Jeanine Milano, BS1; Christina Santiago, BS1; Melissa Adams, PA-C1; Pamela Hodul, MD1
1Moffitt Cancer Center, Tampa, FL
Financial Support: None Reported.
Background: Pancreatic cancer (PC) is an aggressive disease, with a 5-year survival rate of 13%. Symptoms occur late in the disease course, leading to approximately 50% of patients presenting with metastatic disease. New-onset diabetes is often one of the first signs of PC, with diabetes diagnosis occurring up to 3 years before cancer diagnosis. We hypothesize that increasing awareness of PC prevalence in diabetic patients, both new-onset and pre-existing, may lead to earlier PC diagnosis.
Methods: This IRB-exempt retrospective review included new PC patients presenting to our institution in 2021-2023 with diabetes diagnosis (n = 458). We define new-onset diabetes as having been diagnosed within 3 years prior to pathologically confirmed adenocarcinoma. We define newly diagnosed PC as having pathologically confirmed adenocarcinoma within 30 days of clinic presentation. Data was collected on demographics, staging (localized vs metastatic), tumor location (head/neck/uncinate, body/tail, multi-focal), treatment initiation, diabetes onset (new-onset vs pre-existing), diabetes regimen, and weight loss. Descriptive variables were used to report outcomes.
Results: In the study period, 1,310 patients presented to our institution. Of those, 35% had a diabetes diagnosis (n = 458). The majority of patients were male (61%), with a median age at PC diagnosis of 69 (range, 41-92). Patients mostly had localized disease (57%), with the primary tumor located in the head/neck/uncinate region (59%). New-onset diabetes was present in 31% of diabetics (11% of all new patients), with 63% having localized disease (79% head/neck/uncinate) and 37% metastatic disease (66% body/tail). Of those with pre-existing diabetes (69%), 54% had localized disease (69% head/neck/uncinate) and 46% had metastatic disease (53% body/tail). See Table 1 for further demographic/disease characteristics. Abrupt worsening of diabetes was seen in 10% (n = 31) of patients with pre-existing diabetes, and 12% had a change in their regimen prior to PC diagnosis. Hence, 13% (175/1,310) of all new patients presented with either new-onset or worsening diabetes. Weight loss was present in 75% (n = 108) of patients with new-onset diabetes, with a median of 14% weight loss (3%-38%) over 12 months (1-24). Alternatively, weight loss was present in 66% (n = 206) of patients with pre-existing diabetes, with a median of 14% weight loss (4%-51%) over 6 months (0.5-18). Diabetes medication was as follows: 41% oral, 30% insulin, 20% both oral and insulin, and 10% no medication. Of the patients with new-onset diabetes, 68% were diagnosed within 1 year of PC diagnosis and 32% within 1-3 years. Of those within 1 year of diagnosis, 68% had localized disease, with 81% having head/neck/uncinate tumors; of the metastatic patients (31%), 73% had body/tail tumors. For patients with a diabetes diagnosis within 1-3 years of PC diagnosis, 52% had localized disease (75% head/neck/uncinate) and 48% had metastatic disease (59% body/tail). See Table 2 for further characteristics.
Conclusion: Overall, approximately one-third of new patients presenting with PC at our institution had diabetes, and new-onset diabetes was present in one-third of those patients. The majority of diabetic patients presented with localized head/neck/uncinate tumors. Compared with pre-existing diabetes, patients with new-onset diabetes tended to experience greater weight loss over a longer time and had more localized disease. Patients with a diabetes diagnosis within 1 year of PC diagnosis had more localized (head/neck/uncinate) disease. Hence, increased awareness of diabetes in relation to PC, particularly new-onset and worsening pre-existing diabetes, may lead to earlier diagnosis.
Table 1. Demographics and Disease Characteristics.
Table 2. New-Onset Diabetes Characteristics.
Marcelo Mendes, PhD1; Gabriela Oliveira, RD2; Ana Zanini, RD, MSc2; Hellin dos Santos, RD, MSc2
1Cicatripelli, Belém, Para; 2Prodiet Medical Nutrition, Curitiba, Parana
Encore Poster
Financial Support: None Reported.
Background: According to the NPUAP, a pressure injury (PI) is damage that occurs to the skin and/or underlying soft tissue, primarily over bony prominences, and may also be related to the use of medical devices. PIs can range from intact skin to deeper ulcers, affecting structures such as muscles and bones. The aim of this study was to report the experience of using a specialized supplement for wound healing in the treatment of a PI.
Methods: This is a case report based on the experience of the nurses from Cicatripelli®. Data were collected from May to July 2024 through a review of medical records and photographs documenting the wound's progression. The patient was a 69-year-old woman with COPD, diabetes mellitus, and systemic arterial hypertension, who denied smoking and alcohol consumption. She developed a stage 4 PI in the sacral region after 15 days of mechanical ventilation due to bacterial pneumonia and was admitted to a private clinic for treatment on May 2, 2024. Initial wound assessment: measurements 16.5 x 13 x 4 cm (W x L x D); abundant purulent exudate with a foul odor; intact peripheral skin; mild to moderate pain; 75% granulation tissue and 25% liquefactive necrosis (slough) (Figure 1). The wound was cleansed with 0.1% polyhexamethylene biguanide (PHMB) solution, followed by conservative debridement, primary coverage with hydrofiber with 1.2% silver, and secondary coverage with cotton gauze and transparent film, with dressing changes scheduled every 72 hours. Supplementation (Correctmax – Prodiet Medical Nutrition) started on 05/20/2024 at a dosage of 2 sachets per day, providing 10 g of collagen peptides, 3 g of L-arginine, 612 mg of vitamin A, 16 mg of vitamin E, 508 mg of vitamin C, 30 mcg of selenium, and 16 mg of zinc.
Results: On the 17th day of supplementation, the hydrofiber with silver dressing was replaced with PHMB-impregnated gauze, as the wound showed no signs of infection and demonstrated significant clinical improvement. Measurements: 8 x 6 x 2 cm (W x L x D); moderate serosanguineous exudate; intact peripheral skin; 100% granulation tissue; significant improvement in pain and odor (Figure 2). On the 28th day, the dressing was switched to calcium and sodium alginate to optimize exudate control due to the appearance of mild dermatitis. Low-intensity laser therapy was applied, and a skin protective spray was used. Wound assessment: measurements 7 x 5.5 x 1.5 cm (W x L x D), with maintained characteristics (Figure 3). On the 56th day, the patient returned for a dressing change and discharge instructions, as she could not continue the treatment due to a lack of resources. The approach remained the same, with dressing changes every 3 days. Wound assessment: measurements 5 x 3.5 x 0.5 cm (W x L x D), an approximately 92% reduction in wound area, with epithelialized margins and maintained characteristics (Figure 4).
Conclusion: Nutritional intervention with specific nutrient supplementation can aid in the management of complex wounds, serving as a crucial tool in the healing process and contributing to reduced healing time.
Figure 1. Photo of the wound on the day of the initial assessment on 05/02/2024.
Figure 2. Photo of the wound after 17 days of supplementation on 06/06/2024.
Figure 3. Photo of the wound after 28 days of supplementation on 06/17/2024.
Figure 4. Photo of the wound after 56 days of supplementation on 07/15/2024.
Ludimila Ribeiro, RD, MSc1; Bárbara Gois, RD, PhD2; Ana Zanini, RD, MSc3; Hellin dos Santos, RD, MSc3; Ana Paula Celes, MBA3; Flávia Corgosinho, PhD2; Joao Mota, PhD4
1School of Nutrition, Federal University of Goiás, Goiania, Goias; 2School of Nutrition, Federal University of Goiás, Goiânia, Goias; 3Prodiet Medical Nutrition, Curitiba, Parana; 4Federal University of Goias, Goiania, Goias
Financial Support: None Reported.
Background: Postprandial blood glucose is considered an important risk factor for the development of macrovascular and microvascular diseases. Despite the use of hypoglycemic agents, patients with diabetes often experience postprandial hyperglycemia due to unbalanced meals. The aim of this study was to compare the effects of a low glycemic index formula for glycemic control as a substitute for a standard breakfast in patients with type 2 diabetes.
Methods: This randomized, placebo-controlled, crossover study included 18 individuals with type 2 diabetes. Participants were instructed to consume, in random order, either a nutritional formula or a typical Brazilian breakfast with the same caloric content for three consecutive weekdays in different weeks. The nutritional formula (200 mL) provided 200 kcal, with 20 g of carbohydrates, 8.8 g of protein, 9.4 g of fat (MUFA: 5.6 g, PUFA: 2.0 g), and 3.0 g of fiber (DiamaxIG – Prodiet Medical Nutrition), serving as a breakfast substitute. Both the nutritional formula and the standard breakfast were provided to the participants. During the two weeks of intervention, participants used continuous glucose monitoring sensors (Libre 2). Weight and height were measured to calculate body mass index (BMI), and medication use was monitored.
Results: The sample was 61% female, with a mean age of 50.28 ± 12.58 years. The average blood glucose level was 187.13 ± 77.98 mg/dL, and the mean BMI was 29.67 ± 4.86 kg/m². All participants were taking metformin, and two were taking it concomitantly with insulin. There were no changes in medication doses or regimens during the study. The incremental area under the curve (iAUC) was significantly lower with the nutritional formula than with the standard breakfast (2,794.02 ± 572.98 vs. 4,461.55 ± 2,815.73, p = 0.01).
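The incremental AUC compared above is conventionally computed by the trapezoidal rule on glucose excursions above the time-zero baseline, ignoring area below baseline. A sketch with made-up glucose curves (not study measurements):

```python
import numpy as np

def incremental_auc(times_min, glucose_mg_dl):
    """iAUC by the trapezoidal rule, counting only area above baseline glucose."""
    t = np.asarray(times_min, dtype=float)
    g = np.asarray(glucose_mg_dl, dtype=float)
    inc = np.clip(g - g[0], 0.0, None)  # excursions below baseline contribute zero
    return float(np.sum((inc[:-1] + inc[1:]) / 2.0 * np.diff(t)))

times = [0, 30, 60, 90, 120]              # minutes after the meal
formula_meal = [100, 130, 140, 120, 105]   # flatter postprandial response
standard_meal = [100, 160, 185, 150, 120]  # larger excursion

iauc_formula = incremental_auc(times, formula_meal)
iauc_standard = incremental_auc(times, standard_meal)
```

Clipping at baseline is what makes the measure "incremental": it quantifies only the postprandial rise, not total glucose exposure.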
Conclusion: The low glycemic index formula for glycemic control significantly reduced the postprandial glycemic response compared to a standard Brazilian breakfast in patients with type 2 diabetes. These findings suggest that incorporating low glycemic index meals could be an effective strategy for managing postprandial blood glucose levels in this population, which may help mitigate the risk of developing macro- and microvascular complications.
Background: According to the World Health Organization, obesity is a leading risk factor for global noncommunicable diseases such as diabetes, heart disease, and cancer. Weight cycling, often defined as intentionally losing and unintentionally regaining weight, is observed in people with obesity and can have adverse health and economic consequences. This study develops the first model of the health economic consequences of weight cycling in individuals with obesity, defined by a body mass index (BMI) ≥ 30 kg/m².
Methods: A lifetime state-transition model (STM) with monthly cycles was developed to simulate a cohort of individuals with obesity, comparing “weight cyclers” versus “non-cyclers”. The simulated patient cohort was assumed to have an average BMI of 35.5 kg/m², with 11% of patients having cardiovascular disease and 6% having type 2 diabetes (T2D). Key outcomes were the cost per obesity-associated event avoided, the cost per life-year (LY) gained, and the cost per quality-adjusted life-year (QALY) gained, from a US societal perspective. Transition probabilities for obesity-associated diseases were informed by meta-analyses and were based on US disease-related base risks. Risks were adjusted by BMI-related, weight-cycling-related, and T2D-related relative risks (RR). BMI progression, health utilities, direct and indirect costs, and other population characteristics were informed by published US studies. Future costs and effects were discounted by 3% per year. Deterministic and probabilistic sensitivity analyses were performed to investigate the robustness of the results.
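The structure of such a state-transition model can be sketched as a Markov cohort simulation with monthly cycles and annual discounting. Every state, transition probability, utility, and cost below is an illustrative placeholder, not an input from this study:

```python
import numpy as np

states = ["obese_no_event", "post_event", "dead"]
# Monthly transition probability matrix (each row sums to 1) — illustrative values.
P = np.array([
    [0.995, 0.004, 0.001],  # from obese, event-free
    [0.000, 0.996, 0.004],  # from post-event
    [0.000, 0.000, 1.000],  # dead is absorbing
])
utility = np.array([0.85, 0.70, 0.0])       # annual QALY weight per state
monthly_cost = np.array([150.0, 400.0, 0.0])

cohort = np.array([1.0, 0.0, 0.0])          # everyone starts obese, event-free
annual_discount = 0.03
monthly_discount = (1 + annual_discount) ** (1 / 12) - 1

horizon_months = 12 * 40                    # crude lifetime-horizon approximation
qalys = costs = 0.0
for m in range(horizon_months):
    d = 1 / (1 + monthly_discount) ** m     # discount factor for this cycle
    qalys += d * float(cohort @ utility) / 12  # annual utilities accrued monthly
    costs += d * float(cohort @ monthly_cost)
    cohort = cohort @ P                     # advance the cohort one monthly cycle
```

Comparing two such runs (e.g., with cycling-adjusted relative risks applied to the transition matrix) yields the incremental QALYs, LYs, and costs reported in the Results.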
Results: Over a lifetime horizon, non-cyclers had 0.090 obesity-associated events avoided, 0.602 LYs gained, 0.518 QALYs gained, and reduced total costs of approximately $4,592 ($1,004 direct and $3,588 indirect costs) per person. Sensitivity analysis showed the model was most responsive to changes in patient age and cardiovascular disease morbidity and mortality risks. Non-cycling remained the cost-effective option across sensitivity analyses.
Conclusion: The model shows that weight cycling has a major impact on the health of people with obesity, resulting in increased direct and indirect costs. The approach to chronic weight management programs should focus not only on weight reduction but also on weight maintenance to prevent the enhanced risks of weight cycling.
Avi Toiv, MD1; Arif Sarowar, MSc2; Hope O'Brien, BS2; Thomas Pietrowsky, MS, RD1; Nemie Beltran, RN1; Yakir Muszkat, MD1; Syed-Mohammad Jafri, MD1
1Henry Ford Hospital, Detroit, MI; 2Wayne State University School of Medicine, Detroit, MI
Financial Support: None Reported.
Background: Age is an important factor in transplant evaluation, as age at transplantation has historically been thought to influence outcomes in organ transplant recipients. There are limited data on the impact of age on intestinal (IT) and multivisceral (MVT) transplantation. This study investigates the impact of age on post-transplant outcomes in patients who received an intestinal or multivisceral (including intestine) transplant, comparing those under 40 years old to those aged 40 and above.
Methods: We conducted a retrospective chart review of all patients who underwent IT at an academic transplant center from 2010 to 2023. The primary outcomes were patient survival and graft failure, analyzed with Kaplan-Meier survival analysis.
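The Kaplan-Meier (product-limit) estimator used here can be sketched by hand for two groups; the survival times (months) and event indicators below are simulated, not the transplant cohort's data:

```python
import numpy as np

def kaplan_meier(times, events):
    """Return (event_times, survival_probabilities) via the product-limit method.

    events: 1 = death/failure observed, 0 = censored.
    """
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    surv, out_t, out_s = 1.0, [], []
    n_at_risk = len(times)
    for t in np.unique(times):
        mask = times == t
        d = events[mask].sum()       # events at time t
        if d > 0:
            surv *= 1 - d / n_at_risk
            out_t.append(t)
            out_s.append(surv)
        n_at_risk -= mask.sum()      # events and censored both leave the risk set
    return np.array(out_t), np.array(out_s)

# Illustrative follow-up data for a "<40" and a ">=40" group
t_young = [12, 24, 36, 48, 60, 60]; e_young = [1, 0, 1, 0, 0, 1]
t_older = [6, 18, 30, 42, 54, 66];  e_older = [1, 1, 0, 1, 0, 0]
_, s_young = kaplan_meier(t_young, e_young)
_, s_older = kaplan_meier(t_older, e_older)
```

In practice a log-rank test (or a library such as lifelines) would be used to compare the two curves formally, as in the analysis described.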
Results: Among 50 IT recipients, 11 were < 40 years old and 39 were ≥ 40 years old (Table 1). The median age at transplant was 37 years (range, 17-39) in the < 40 group and 54 years (range, 40-68) in the ≥ 40 group. In both groups, the majority of transplants were exclusively IT; however, both groups also included MVT recipients. Kaplan-Meier survival analysis revealed no significant differences between the two age groups in graft survival or patient mortality. Reoperation within 1 month was significantly linked to decreased survival (p = 0.015) and decreased graft survival (p = 0.003), as was moderate to severe rejection within 1 month (p = 0.009); neither complication differed significantly between the two age groups. The Wilcoxon rank-sum test showed no difference between groups with regard to reoperation or moderate to severe rejection at 1 or 3 months, or the development of chronic kidney disease.
Conclusion: Age at the time of intestinal transplantation (< 40 vs. ≥40 years old) does not appear to significantly impact major transplant outcomes, such as patient mortality, graft survival, or rejection rates. While reoperation and moderate to severe rejection within 1 and 3 months negatively affected overall outcomes, these complications were not more frequent in older or younger patients.
Table 1. Demographic Characteristics of Intestinal Transplant Recipients.
BMI, body mass index; TPN, total parenteral nutrition.
International Poster of Distinction
Gabriela de Oliveira Lemos, MD1; Natasha Mendonça Machado, PhD2; Raquel Torrinhas, PhD3; Dan Linetzky Waitzberg, PhD3
1University of Sao Paulo School of Medicine, Brasília, Distrito Federal; 2University of Sao Paulo School of Medicine, São Paulo; 3Faculty of Medicine of the University of São Paulo, São Paulo
Financial Support: This study is linked to project no. 2011/09612-3 and was funded by the Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP).
Background: Sphingolipids (SLs) are key molecules in cell signaling and play a central role in lipotoxicity. Moreover, they are major constituents of eukaryotic cell membranes. Data on SL remodeling in the gastrointestinal tract (GIT) after Roux-en-Y gastric bypass (RYGB) are lacking and may help to elucidate tissue turnover and metabolism. This protocol aims to evaluate the SL profile and remodeling within the plasma and different segments of the GIT before and 3 months after RYGB in a population of women with obesity and type 2 diabetes mellitus (T2DM), and to correlate the changes across these tissues. This investigation is part of the SURMetaGIT study, registered at www.clinicalTrials.gov (NCT01251016).
Methods: Twenty-eight women with obesity and T2DM who underwent RYGB were enrolled in this protocol. Those on insulin therapy were excluded. We collected plasma (n = 28) and intestinal samples from the gastric pouch (n = 9), duodenum (n = 8), jejunum (n = 9), and ileum (n = 9) for untargeted metabolomics analysis at baseline and 3 months post-surgery. India ink (SPOT®) was used to mark the sites for the follow-up biopsies. SLs were identified using high-performance liquid chromatography coupled to mass spectrometry. Data were processed and analyzed using AnalysisBaseFileConverter and MS-DIAL, respectively. The magnitude of SL changes after RYGB was assessed by the fold change (log2 of the ratio of the post-surgery mean to the pre-surgery mean). The Spearman test was performed for the correlation analysis. A p value < 0.05 was considered significant. Statistics were carried out in the Jamovi software (2.2.5) and MetaboAnalyst 5.0.
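The fold-change metric described above (log2 of the post- to pre-surgery mean ratio) reduces to a one-line calculation; a minimal sketch, using hypothetical lipid intensity values for illustration only:

```python
import math

def fold_change(pre, post):
    """log2 fold change of post-surgery vs. pre-surgery mean abundance.
    Negative values indicate a decrease after surgery."""
    pre_mean = sum(pre) / len(pre)
    post_mean = sum(post) / len(post)
    return math.log2(post_mean / pre_mean)

# Hypothetical sphingomyelin intensities (arbitrary units), not study data.
pre = [120.0, 100.0, 80.0]    # baseline
post = [60.0, 50.0, 40.0]     # 3 months post-RYGB
print(round(fold_change(pre, post), 3))  # -1.0: mean abundance halved
```

A fold change of +1 therefore means a doubling and -1 a halving of the mean abundance after surgery.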
Results: 34 SLs were identified, including sphingomyelins (SM), ceramides (Cer), and glycosphingolipids (GlcSL). SMs were the most common SL found in the plasma (27 SM, 3 Cer, and 4 GlcSL) and in the GIT (16 SM, 13 Cer, and 5 GlcSL). Each GIT tissue presented distinct SL remodeling. The plasma and the jejunum best discriminated SL changes following surgery (Figure 1). The jejunum expressed the most robust changes, followed by the plasma and the duodenum (Figure 2). Figure 3 presents the heatmap of the plasma and the GIT tissues. Correlation analysis showed that the plasmatic SLs, particularly SM(d32:0), GlcCer(d42:1), and SM(d36:1), strongly correlated with jejunal SLs. These lipids showed a strong negative correlation with jejunal sphingomyelins but a strong positive correlation with jejunal ceramides (Table 1).
Conclusion: RYGB was associated with SL remodeling in the plasma and GIT. SM was the main SL found in the plasma and the GIT. The most robust changes occurred in the jejunum and the plasma, and these 2 sample types presented the most relevant correlation. Considering our findings, the role of SM in metabolic changes after RYGB should be investigated.
Table 1. Correlation Analysis of Sphingolipids from the plasma with the Sphingolipids from the Gastrointestinal Tract.
*p < 0.05; **p < 0.01; ***p < 0.001.
The green circles represent samples at baseline and the red circles represent samples 3 months after RYGB.
Figure 1. Principal Component Analysis (PCA) from GIT Tissues and Plasma.
Figure 2. Fold Change of Sphingolipids from the Plasma and Gastrointestinal Tract.
The map under the right top green box represents lipids’ abundance before surgery, and the map under the left top red box represents lipids’ abundance after RYGB.
Figure 3. Heatmap of Sphingolipids from the Plasma and Gastrointestinal Tract.
Lucas Santander1; Gabriela de Oliveira Lemos, MD2; Daiane Mancuzo3; Natasha Mendonça Machado, PhD4; Raquel Torrinhas, PhD5; Dan Linetzky Waitzberg, PhD5
1Universidade Santo Amaro (Santo Amaro University), São Bernardo Do Campo, Sao Paulo; 2University of Sao Paulo School of Medicine, Brasília, Distrito Federal; 3Universidade São Caetano (São Caetano University), São Caetano do Sul, Sao Paulo; 4University of Sao Paulo School of Medicine, São Paulo; 5Faculty of Medicine of the University of São Paulo, São Paulo
Financial Support: Fundação de Amparo a Pesquisa do Estado de São Paulo.
Background: Microalbuminuria (MAL) is an early biomarker of kidney injury, linked to conditions like hypertension, type 2 diabetes (T2DM), and obesity. It is also associated with higher cardiovascular (CV) risk. This protocol examines the impact of Roux-en-Y gastric bypass (RYGB) on MAL and CV markers in patients with obesity, T2DM, and MAL.
Methods: Eight women with grade II-III obesity, T2DM, and MAL who underwent RYGB were included. Patients on insulin therapy were excluded. MAL was defined as a urinary albumin-to-creatinine ratio > 30 mg/g. MAL and glycemic and lipid serum biomarkers were measured at baseline and three months post-surgery. Systolic (SBP) and diastolic (DBP) blood pressures and medical treatments for T2DM and hypertension were also assessed. T2DM remission was defined by ADA 2021 criteria. Categorical variables were reported as absolute and relative frequencies, while continuous variables were expressed as median and IQR based on normality tests. Intragroup and intergroup comparisons were conducted using the Wilcoxon and Mann-Whitney tests for numeric data. Fisher's exact test was performed when necessary to compare dichotomous variables. Data were analyzed in JASP software version 0.18.1.0.
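The study's MAL definition (albumin-to-creatinine ratio > 30 mg/g) can be sketched as a simple classification rule; the helper for computing the ratio from spot-urine measurements and the example values are illustrative assumptions, not study data:

```python
def albumin_creatinine_ratio(urine_albumin_mg_l, urine_creatinine_g_l):
    """Urinary albumin-to-creatinine ratio (ACR) in mg/g,
    from spot-urine albumin (mg/L) and creatinine (g/L)."""
    return urine_albumin_mg_l / urine_creatinine_g_l

def has_microalbuminuria(acr_mg_g, threshold=30.0):
    """MAL as defined in this study: ACR > 30 mg/g."""
    return acr_mg_g > threshold

# Hypothetical spot-urine sample: 60 mg/L albumin, 1.5 g/L creatinine.
acr = albumin_creatinine_ratio(60.0, 1.5)
print(acr, has_microalbuminuria(acr))  # 40.0 True
```

Note that an ACR of exactly 30 mg/g falls below the study's strict "> 30" cutoff.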
Results: Overall, RYGB was associated with weight loss, improved body composition, and better CV markers (Table 1). Post-surgery, MAL decreased by at least 70% in all patients and resolved in half. All patients with MAL resolution had pre-surgery levels ≤ 100 mg/g. Those without resolution had severe pre-surgery MAL (33.8 vs. 667.5, p = 0.029) and higher SBP (193 vs. 149.5, p = 0.029) and DBP (138 vs. 98, p = 0.025). Blood pressure decreased after surgery but remained higher in patients without MAL resolution: SBP (156.0 vs. 129.2, p = 0.069) and DBP (109.5 vs. 76.5, p < 0.001). MAL resolution was not linked to T2DM remission at 3 months (75% vs. 50%, p = 1.0). One patient had worsened MAL (193.0 vs. 386.9 mg/g) after RYGB. Glomerular filtration rate (GFR) tended to increase post-surgery only in the group with MAL resolution (95.4 vs. 108.2 ml/min/1.73 m², p = 0.089), compared to the group without MAL resolution (79.2 vs. 73.7 ml/min/1.73 m², p = 0.6).
Conclusion: RYGB effectively reduced markers of renal dysfunction and cardiovascular risk in this cohort. Patients showed a decrease in MAL, with resolution in half of the patients. The small sample size and short follow-up period may have limited the observed impact of the surgery on renal function. Future studies with larger cohorts and longer follow-up are needed to better understand the effects of bariatric surgery on MAL and its relation to other CV markers.
Table 1. Biochemical and Clinical Data Analysis Following RYGB.
eGFR: estimated glomerular filtration rate; HbA1c: glycated hemoglobin; HDL-c: high-density lipoprotein cholesterol; HOMA-BETA: beta-cell function by the homeostasis model; HOMA-IR: Homeostasis Model Assessment of Insulin Resistance; LDL-c: low-density lipoprotein cholesterol; Non-HDL-c: non-high-density lipoprotein cholesterol; VLDL-c: very-low-density lipoprotein cholesterol; DBP: diastolic blood pressure; SBP: systolic blood pressure; WC: waist circumference.
Michelle Nguyen, BSc, MSc1; Johane P Allard, MD, FRCPC2; Dane Christina Daoud, MD3; Maitreyi Raman, MD, MSc4; Jennifer Jin, MD, FRCPC5; Leah Gramlich, MD6; Jessica Weiss, MSc1; Johnny H. Chen, PhD7; Lidia Demchyshyn, PhD8
1Pentavere Research Group Inc., Toronto, ON; 2Division of Gastroenterology, Department of Medicine, Toronto General Hospital, Toronto, ON; 3Division of Gastroenterology, Centre Hospitalier de l'Université de Montréal (CHUM), Department of Medicine, University of Montreal, Montreal, QC; 4Division of Gastroenterology, University of Calgary, Calgary, AB; 5Department of Medicine, University of Alberta, Division of Gastroenterology, Royal Alexandra Hospital, Edmonton, AB; 6Division of Gastroenterology, Faculty of Medicine and Dentistry, University of Alberta, Edmonton, AB; 7Takeda Canada Inc., Vancouver, BC; 8Takeda Canada Inc., Toronto, ON
Encore Poster
Presentation: 46th European Society for Clinical Nutrition and Metabolism (ESPEN) Congress. 7-10 September 2024, Milan, Italy.
Financial Support: Funding of this study is from Takeda Canada Inc.
Background: Teduglutide is indicated for the treatment of patients with short bowel syndrome (SBS) who are dependent on parenteral support (PS). This study used real-world evidence to evaluate the longer-term effectiveness and safety of teduglutide in Canadian patients with PS-dependent SBS.
Methods: This was an observational, retrospective study, using data from the national Canadian Takeda patient support program, and included adults with SBS. Data were collected 6 months before teduglutide initiation and from initiation to Dec-01-2023, death, or loss of follow-up. Descriptive statistics characterized the population and treatment-emergent adverse events (TEAEs). Changes in parenteral nutrition/intravenous fluid supply (PN/IV) were assessed based on decreases in PN/IV volume from baseline. Statistical significance was set at p < 0.05.
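The volume-reduction outcome described above is a simple percentage calculation against baseline; a minimal sketch, where the ≥20% responder cutoff follows the abstract and the patient volumes are hypothetical:

```python
def pct_reduction(baseline_ml_wk, followup_ml_wk):
    """Percentage reduction in weekly PN/IV volume from baseline.
    Negative values indicate an increase in volume."""
    return 100.0 * (baseline_ml_wk - followup_ml_wk) / baseline_ml_wk

def is_responder(baseline_ml_wk, followup_ml_wk, cutoff=20.0):
    """Study responder criterion: >=20% reduction in weekly PN/IV volume."""
    return pct_reduction(baseline_ml_wk, followup_ml_wk) >= cutoff

# Hypothetical patient: 14,000 mL/week at baseline, 10,100 mL/week at 6 months.
print(round(pct_reduction(14000, 10100), 1))  # 27.9
print(is_responder(14000, 10100))             # True
```

This also explains the negative lower bounds in the reported ranges: patients whose PN/IV volume increased have negative percentage reductions.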
Results: 52 patients (60% women) were included in this study. Median age (range) was 54 (22–81) years, and 50% had Crohn's disease as the etiology of SBS. At 6 months, the median (range) absolute and percentage reductions from baseline in PN/IV volume were 3,900 mL/week (−6,960 to 26,784; p < 0.001) and 28.1% (−82.9 to 100). At 24 months, the median (range) absolute reduction from baseline was 6,650 mL/week (−4,400 to 26,850; p = 0.003), and the proportion of patients who achieved ≥20% reduction in weekly PN/IV was 66.7%. Over the study, 27% achieved independence from PN/IV. TEAEs were reported in 51 (98%) patients (83% were serious TEAEs) during the study period; the 3 most common were weight changes, diarrhea, and fatigue.
Conclusion: Patients showed significant decreases in PN/IV volumes after initiating teduglutide, with no unexpected safety findings. This study demonstrates real-world, longer-term effectiveness and safety of teduglutide in Canadian patients with SBS, complementing previous clinical trials and real-world studies.
Poster of Distinction
Sarah Carter, RD, LDN, CNSC1; Ruth Fisher, RDN, LD, CNSC2
Background: Perceived benefit is one factor determining therapy continuation. Little information is published regarding positive outcomes for patients receiving the GLP-2 analog teduglutide apart from rates of success in weaning HPN and hydration volumes by 20%. Patients who are new to therapy may question how long after initiation they should expect to see results. Patients may collaborate with prescribers to improve therapy tolerance if they have hope for improvements in their quality of life. This data analysis provides details regarding patients receiving teduglutide and their perceived benefits of therapy.
Methods: Dietitians interview patients receiving teduglutide as part of a service agreement, monitoring persistence and helping to eliminate barriers to therapy. Dietitians make weekly and monthly calls based on patients' drug start dates and document interventions in flowsheets in patients' electronic medical records. The interventions are published to data visualization software to monitor compliance with dietitian outreach as part of a quality improvement project. All patients were diagnosed with short bowel syndrome, but baseline characteristics were not exported to this dashboard. This study is a retrospective analysis of the existing dashboard using 3 years of assessments (May 1, 2021-April 30, 2024). Exclusion criteria included therapy dispensed from a pharmacy not using the company's newest computer platform and patients who did not respond to outreach. We analyzed the time on therapy before a positive outcome was reported by the patient to the dietitian and which positive outcomes were most frequently reported.
Results: The data set included 336 patients with 2509 phone assessments. The most frequently reported first positive outcome was improved ostomy output/less diarrhea (72%, n = 243), occurring just after a month on therapy (mean 31 days ± 26.4). The mean time to first positive outcome for all patients who reported one was 32 days ± 28.5 (n = 314). Of the 22 patients who reported no positive outcome, 13 did not answer the dietitians' calls after initial contact. A summary is listed in Table 1. Overall positive outcomes reported were improved ostomy output/less diarrhea (87%, n = 292), weight gain (70%, n = 236), improved appetite/interest in food (33%, n = 112), feeling stronger/more energetic (27%, n = 92), improved quality of life (23%, n = 90), improved lab results (13%, n = 45) and fewer antidiarrheal medications (12%, n = 40). Of the 218 patients receiving parenteral support, 44 patients stopped hydration and HPN completely (20%), with another 92 patients reporting less time or fewer days on hydration and HPN (42%), for a total of 136 patients experiencing a positive outcome of parenteral support weaning (62%). Patients reported improvements in other areas of their lives, including fewer hospitalizations (n = 39), being able to travel (n = 35), tolerating more enteral nutrition volume (n = 19), returning to work/school (n = 14) and improved sleep (n = 13). A summary is diagrammed in Figure 1.
Conclusion: This retrospective analysis indicates that teduglutide is associated with improved symptom control and improved quality of life measures, with most patients seeing a response to therapy within the first 2 months. Patients responded to teduglutide with a decrease in ostomy output and diarrhea as the most frequent recognizable response to therapy. In addition to the goal of weaning parenteral support, clinicians should be cognizant of improvements in patients' clinical status that can have a significant impact on quality of life.
Table 1. Timing of First Reported Positive Outcome by Patients Receiving Teduglutide.
Figure 1. Total Positive Outcomes Reported by Patients (n = 336).
Poster of Distinction
Jennifer Cholewka, RD, CNSC, CDCES, CDN1; Jeffrey Mechanick, MD1
1The Mount Sinai Hospital, New York, NY
Financial Support: None Reported.
Background: Bariatric surgery is a guideline-directed intervention for patients with severe obesity. Adherence to post-operative recommendations is variable and consequent undernutrition complicated by multiple micronutrient deficiencies is prevalent. A post-bariatric surgery syndrome (PBSS) is not well defined and therefore poorly understood, though early detection and intervention would likely decrease clinical and economic burdens. Our current experience with PBSS is presented here as a case series. We will define PBSS based on clinical evidence related to risk factors, interventions, and clinical/biochemical responses.
Methods: Twenty-four consecutive patients referred to our metabolic support service between January 1, 2019 and December 31, 2023 were identified who had been admitted to The Mount Sinai Hospital in New York City with a history of RYGB (Roux-en-Y gastric bypass) or BPDDS (biliopancreatic diversion with duodenal switch) and were found to have failure to thrive, undernutrition, clinical/biochemical features of at least one micronutrient deficiency, and indications for parenteral nutrition. Patients were excluded if they had surgical complications or were not treated with parenteral nutrition. Fifteen patients were included in this case series and de-identified prior to data collection using the electronic health records (EPIC) and coded data collection sheets. Descriptive statistical methods were used to analyze parametric variables as mean ± standard deviation and non-parametric variables as median (interquartile range).
Results: Results are provided in Table 1.
Conclusion: The PBSS is defined by significant decompensation following a bariatric surgery procedure with a malabsorptive component, characterized by failure to thrive, hypoalbuminemia, multiple micronutrient deficiencies, and need for parenteral nutrition. Major risk factors include inadequate protein and micronutrient intake due to either unawareness (e.g., not recommended) or poor adherence, significant alcohol consumption, or a complicating medical/surgical condition. Parenteral nutrition was safe in this population; formulation prioritizes adequate nitrogen, nonprotein calories, and micronutrition. Further analyses on risk factors, responses to therapy, and the role of a multidisciplinary team are in progress.
Table 1. Risks/Presentation.
Table 2. Responses to Parenteral Nutrition Intervention.
1The Ohio State University, Columbus, OH; 2NYC Health + Hospitals, New York City, NY; 3The Ohio State University Wexner Medical Center, Columbus, OH; 4The Ohio State University, Granville, OH
Financial Support: None Reported.
Background: Decompensated cirrhosis increases the risk of fat maldigestion via altered bile synthesis and excretion through the bile canaliculi. Maldigestion increases the risk of vitamin and mineral deficiencies, which, when untreated, contribute to consequential health issues such as metabolic bone disease, xerophthalmia, and hyperkeratosis. There is an absence of comprehensive guidelines for the prevention and treatment of these deficiencies.
Methods: Medical and surgical history, anthropometrics, medications and nutritional supplements, laboratory data, and medical procedures were extracted and analyzed from the electronic medical record.
Results: A patient with congenital biliary atresia and decompensated cirrhosis was seen in a hepatology outpatient clinic. Biochemical assessment revealed severe vitamin A deficiency and suboptimal vitamin D and zinc status. Physical assessment indicated telogen effluvium and transient blurry vision. Despite a history of high-dose oral retinyl acetate ranging from 10,000-50,000 units daily, a 3-day course of 100,000 units via intramuscular injection, and co-treatment of zinc deficiency to ensure adequate circulating retinol-binding protein, normalization of serum retinol was not achieved over the preceding 10 years. The patient's serum vitamin A level normalized following liver transplantation.
Conclusion: In decompensated cirrhosis, there is a lack of sufficient guidelines for micronutrient dosing when traditional treatment strategies are unsuccessful. Furthermore, altered secretion of transport proteins due to underlying liver dysfunction may pose challenges in evaluating laboratory markers of micronutrient status. Collaborations with pharmacy and medicine support a thorough assessment and the establishment of a safe treatment and monitoring plan. Clinical research is needed to identify acceptable and safe dosing strategies for patients with chronic, unresponsive fat-soluble vitamin deficiencies.
Gang Wang, PhD1
1Nimble Science, Calgary, AB
Financial Support: This work was supported by the National Research Council of Canada Industrial Research Assistance Program (NRC IRAP), an Alberta Innovate Accelerating Innovations into CarE (AICE) grant, and partially by a Clinical Problems Incubator Grant from the Snyder Institute for Chronic Disease at the University of Calgary.
Background: The small intestine (SI) microbiome plays a crucial role in nutrient absorption, and emerging evidence indicates that fecal content is insufficient to represent the SI microenvironment. Endoscopic sampling is possible but expensive and not scalable. Ingestible sampling capsule technologies are emerging; however, potential contamination remains a major limitation of these devices.
Methods: We previously reported the Small Intestinal MicroBiome Aspiration (SIMBA) capsule as an effective means of sampling, sealing, and preserving SI luminal contents for 16S rRNA gene sequencing analysis. A subset of the DNA samples, including SI samples collected by SIMBA capsules (CAP) and the matched saliva (SAL), fecal (FEC), and duodenal endoscopic aspirate (ASP) and brush (BRU) samples, from 16 participants recruited for an observational clinical validation study were sent for shotgun metagenomic sequencing. The aims were 1) to compare the sampling performance of the capsule (CAP) with that of endoscopic aspirates (ASP) and with 850 small intestine, large intestine, and fecal samples from the Clinical Microbiomics data warehouse (PRJEB28097), and 2) to characterize samples from the 4 sampling sites in terms of species composition and functional potential.
Results: 4/80 samples (1/16 SAL, 2/16 ASP, 0/16 BRU, 1/16 CAP and 0/16 FEC) failed library preparation, and 76 samples were shotgun sequenced (average of 38.5 M read pairs per sample) (Figure 1). Quality assessment demonstrated that despite the low raw DNA yield of CAP samples, they retained a minimal level of host contamination in comparison to ASP and BRU (mean 5.27% vs. 93.09–96.44% of average total reads per sample) (Figure 2). CAP samples shared the majority of species with ASP samples, as well as a large fraction of the species detected in terminal ileum samples. ASP and CAP sample composition was more similar to duodenum, jejunum, and saliva samples and very different from large intestine and stool samples. Functional genomics further revealed GI region-specific differences: in both ASP and CAP samples we detected a number of Gut Metabolic Modules (GMMs) for carbohydrate digestion and short-chain fatty acids. However, probiotic species, and species and genes involved in bile acid metabolism, were mainly prevalent in CAP and FEC samples and could not be detected in ASP samples.
Conclusion: The CAP and ASP microbiomes are compositionally similar despite the high level of host contamination of ASP samples. CAP appears to be of better quality than ASP for revealing GI region-specific functional potentials. This analysis demonstrates the potential of the SIMBA capsule for unveiling the SI microbiome and supports its prospective use in observational and interventional studies investigating the impact of short-term and long-term biotic food interventions on the gut microbiome (Figure 3). A series of studies utilizing the SIMBA capsules is under way, and the detectability of biotic food intervention effects will be reported in the near future (Table 1).
Table 1. List of Ongoing Observational and Interventional Clinical Studies using SIMBA Capsule.
Figure 1. Shotgun Metagenomic Sequencing: Taxonomy Overview (Relative Abundance of the 10 Most Abundant Species by Sampling Site).
Figure 2. Shotgun Metagenomic Sequencing: High-Quality Non-Host Reads by Sampling Site.
Figure 3. Short-term and Long-term Interventional Study Protocols Using SIMBA Capsules.
Financial Support: University of Global Health Equity.
Background: Rwanda is currently grappling with the double burden of malnutrition and rising obesity, particularly in urban populations. As the country experiences rapid urbanization and dietary shifts, traditional issues of undernutrition coexist with increasing rates of obesity and related metabolic disorders. This study aims to investigate the relationship between these nutrition-related issues and their impact on gastrointestinal (GI) health and metabolic outcomes in urban Rwandan populations.
Methods: A cross-sectional study was conducted in Kigali, Rwanda's capital, involving 1,200 adult participants aged 18 to 65. Data were collected on dietary intake, body mass index (BMI), GI symptoms, and metabolic markers such as fasting glucose, cholesterol levels, and liver enzymes. Participants were categorized into three groups: undernourished, normal weight, and overweight/obese based on their BMI. GI symptoms were assessed using a validated questionnaire, and metabolic markers were evaluated through blood tests. Statistical analyses were performed to assess correlations between dietary patterns, BMI categories, and GI/metabolic health outcomes.
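The three-group BMI categorization described above can be sketched as follows; the abstract does not state its cutoffs, so standard WHO adult thresholds are assumed here for illustration:

```python
def bmi(weight_kg, height_m):
    """Body mass index in kg/m^2."""
    return weight_kg / height_m ** 2

def bmi_category(b):
    """Study groups; WHO adult cutoffs assumed (not stated in the abstract):
    < 18.5 undernourished, 18.5-24.9 normal weight, >= 25 overweight/obese."""
    if b < 18.5:
        return "undernourished"
    if b < 25.0:
        return "normal weight"
    return "overweight/obese"

# Hypothetical participants, for illustration only.
for w, h in [(50, 1.70), (68, 1.75), (90, 1.75)]:
    print(f"BMI {bmi(w, h):.1f} -> {bmi_category(bmi(w, h))}")
```

Note that BMI-based undernutrition screening would miss the micronutrient deficiencies the study reports in normal-weight participants, which is why dietary and biochemical data were collected alongside BMI.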
Results: The study found that 25% of participants were classified as undernourished, while 22% were obese, reflecting the double burden of malnutrition and rising obesity in Rwanda's urban population. Among obese participants, 40% exhibited elevated fasting glucose levels (p < 0.01), and 30% reported significant GI disturbances, such as irritable bowel syndrome (IBS) and non-alcoholic fatty liver disease (NAFLD). In contrast, undernourished individuals reported fewer GI symptoms but showed a higher prevalence of micronutrient deficiencies, including anaemia (28%) and vitamin A deficiency (15%). Dietary patterns characterized by high-fat and low-fiber intake were significantly associated with increased GI disorders and metabolic dysfunction in both obese and normal-weight participants (p < 0.05).
Conclusion: This study highlights the growing public health challenge posed by the coexistence of undernutrition and obesity in Rwanda's urban centres. The dietary shifts associated with urbanization are contributing to both ends of the nutritional spectrum, adversely affecting GI and metabolic health. Addressing these issues requires comprehensive nutrition interventions that consider the dual challenges of undernutrition and obesity, promoting balanced diets and improving access to health services. These findings have important implications for nutrition therapy and metabolic support practices in Rwanda, emphasizing the need for tailored interventions that reflect the country's unique nutritional landscape.
1University of Minnesota, St. Paul, MN; 2University of Minnesota, Minneapolis, MN; 3University of Minnesota, Austin, MN; 4Indiana University School of Medicine, Indianapolis, IN
Financial Support: Achieving Cures Together.
Background: Fecal microbiota transplantation (FMT) is a highly effective treatment for recurrent Clostridioides difficile infection (rCDI). The procedure results in the repair of the gut microbiota following severe antibiotic injury. Recovery from rCDI is associated with high incidence of post-infection irritable bowel syndrome (IBS). In addition, older age, medical comorbidities, and prolonged diarrheal illness contribute to frailty in this patient population. The effect of FMT on IBS symptoms and frailty in patients with rCDI is largely unknown. In this prospective cohort study, we collected IBS symptom and frailty data over the 3 months following FMT treatment for rCDI in patients at two large, academic medical centers.
Methods: Consenting adults who underwent FMT treatment for rCDI were enrolled in this study (n = 113). We excluded patients who developed a recurrence of CDI within the 3-month follow-up period (n = 15) or had incomplete IBS symptom severity scale (IBS-SSS) scores at any timepoint. The IBS-SSS is a 5-item survey measuring symptom intensity, with a score range from 0 to 500; higher scores represent greater severity. IBS-SSS was collected at baseline, 1-week post-FMT, 1-month post-FMT, and 3-months post-FMT. Frailty was assessed at baseline and 3-months using the FRAIL scale (categorical variable: "Robust Health", "Pre-Frail", "Frail"). The Kruskal-Wallis test was used to compare IBS-SSS across timepoints. Post-hoc analysis was performed with pairwise Wilcoxon rank-sum tests using the False Discovery Rate adjustment method. The Friedman test was used to compare frailty distribution between the baseline and 3-month timepoints.
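The False Discovery Rate adjustment used for the post-hoc pairwise comparisons is conventionally the Benjamini-Hochberg procedure; a minimal pure-Python sketch, using hypothetical p-values rather than study results:

```python
def fdr_adjust(pvals):
    """Benjamini-Hochberg adjusted p-values (the 'FDR' method).
    Each raw p-value is scaled by m/rank, then adjusted values are made
    monotone by walking from the largest to the smallest p-value."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])  # indices, ascending p
    adj = [0.0] * m
    prev = 1.0
    for rank in range(m, 0, -1):
        i = order[rank - 1]
        prev = min(prev, pvals[i] * m / rank)  # enforce monotonicity
        adj[i] = prev
    return adj

# Hypothetical p-values from 3 pairwise comparisons against baseline.
print([round(p, 4) for p in fdr_adjust([0.01, 0.04, 0.03])])  # [0.03, 0.04, 0.04]
```

Compared with a Bonferroni correction (multiply every p-value by m), Benjamini-Hochberg is less conservative, which is why it is a common choice for post-hoc pairwise testing.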
Results: Mean age of the cohort was 63.3 (SD 15.4) years; 75% of the patients were female (total n = 58 patients). The IBS-SSS scores across timepoints are presented in Table 1 and Figure 1. The median IBS-SSS score at baseline was 134 [IQR 121], which decreased to a median score of 65 [IQR 174] at 1-week post-FMT. The baseline timepoint was found to differ from the 1-week, 1-month, and 3-month timepoints (p < 0.05). No other differences between timepoints were observed. Frailty was assessed at baseline and 3 months (total n = 52). At baseline, 71% of patients (n = 37) were considered Pre-Frail or Frail, but this percentage decreased to 46% (n = 24) at 3-months (Table 2; p < 0.05).
Conclusion: Findings from this multicenter, prospective cohort study indicate an overall improvement in both IBS symptoms and frailty in the 3-months following FMT therapy for rCDI. Notably, IBS symptom scores were found to improve by 1-week post FMT. Further research is required to understand what predicts IBS symptom improvement following FMT and if nutrition therapy can help support further improvement. It will also be important to further understand how nutrition therapy can help support the improvement in frailty status observed following FMT for rCDI.
Table 1. Distribution of IBS-SSS Scores at Baseline and Following FMT.
Table 2. Frailty Distribution Assessed by FRAIL Scale at Baseline and 3-Months Post-FMT.
Box-plot distributions of IBS-SSS scores across timepoints. The median IBS-SSS score at baseline was 134 and decreased to a median score of 65 at 1-week post-FMT. The baseline timepoint was found to differ from the 1-week, 1-month, and 3-month timepoints (p < 0.05).
Figure 1. Distribution of IBS-SSS Scores by Timepoint.
1The Ohio State University, Columbus, OH; 2The Ohio State University Wexner Medical Center, Columbus, OH; 3Nationwide Children's Hospital, Columbus, OH; 4The Ohio State University, Granville, OH
Financial Support: None Reported.
Background: Cyclic vomiting syndrome (CVS) is a disorder of gut-brain interaction characterized by recurrent spells of nausea, vomiting, and abdominal pain lasting hours to days. Estimated prevalence is approximately 2% in adults and children. Due to recalcitrant symptoms, patients might develop disordered eating patterns, though data are lacking. Identifying patients at risk for disordered eating patterns and malnutrition can help optimize nutritional status and may improve overall patient outcomes. The purpose of this study is to define the nutrient intakes and dietary patterns of those with CVS seeking care at a tertiary care referral clinic and to establish variation in dietary intakes based on disease severity.
Methods: In this ongoing, cross-sectional study of adults diagnosed with CVS based on Rome IV criteria, participants are asked to complete validated surveys including a food frequency questionnaire (FFQ). Baseline demographics and clinical characteristics including disease severity (defined by the number of episodes per year) were ascertained. Healthy eating index (HEI) scores (scale of 0-100) were calculated to assess diet quality with higher scores indicating better diet quality compared to lower scores. Those with complete data were included in this interim analysis.
Results: Data from 33 participants with an average age of 40 ± 16 years and an average BMI of 28.6 ± 7.9 are presented. The cohort was predominantly female (67%) and white (79%), with moderate to severe disease (76%). The malnutrition screening tool indicated that 42% of participants were at risk of malnutrition, independent of BMI status (p = 0.358) and disease severity (p = 0.074). HEI scores were poor among those with CVS (55) and did not differ based on disease severity (58 vs 54; p = 0.452). Energy intakes varied, ranging from 416-3974 kcals/day with a median intake of 1562 kcals/day.
Conclusion: In CVS, dietary intake is poor and there is a high risk of malnutrition regardless of disease severity and BMI. Providers and registered dietitian nutritionists must be aware of the high rates of malnutrition risk and poor dietary intakes in this patient population to improve delivery of dietary interventions. Insight into disordered eating and metabolic derangements may improve the understanding of dietary intakes in CVS.
Hannah Huey, MDN1; Holly Estes-Doetsch, MS, RDN, LD2; Christopher Taylor, PhD, RDN2; Kristen Roberts, PhD, RDN, LD, CNSC, FASPEN, FAND3
1Nationwide Children's Hospital, Columbus, OH; 2The Ohio State University, Columbus, OH; 3The Ohio State University, Granville, OH
Financial Support: None Reported.
Background: Micronutrient deficiencies are common in Crohn's disease (CD) due to poor dietary absorption. Pregnancy increases nutritional demand, and when coupled with a malabsorptive condition like CD, clinicians must closely monitor micronutrient status. However, there is a lack of evidence-based guidelines for managing these complex patients, leaving clinicians to rely on clinical judgement. We present a case of a pregnant female with CD who presented for delivery with an undetected fat-soluble vitamin deficiency. The impact of a vitamin K deficiency on the post-partum management of a patient with CD is described along with potential assessment and treatment strategies. At 25 weeks gestation, the patient presented with a biochemical iron deficiency anemia, vitamin B12 deficiency, and zinc deficiency, for which she was treated with oral supplementation and/or intramuscular injections. No assessment of fat-soluble vitamins was conducted during the gestation period despite a pre-pregnancy history of multiple micronutrient deficiencies. At 33 weeks gestation, the mother was diagnosed with preeclampsia and delivered at 35 weeks. After birth, the infant presented to the NICU with a mediastinal mass, abnormal liver function tests, and initial coagulopathy. The mother experienced a uterine hemorrhage post-cesarean section. At this time her INR was 14.8, with severe prolongation of the PT and PTT and suboptimal levels of blood clotting factors II, VII, IX, and X. The patient was diagnosed with a vitamin K deficiency and was treated initially with 10 mg by mouth daily for 3 days, resulting in an elevated serum vitamin K while PT and INR trended towards normal limits. At discharge she was recommended to take 1 mg of vitamin K by mouth daily to prevent further deficiency. PT and INR were the biochemical assays reassessed every 3 months, since serum vitamin K is more reflective of recent intake.
CD represents a complex disorder, and the impact of pregnancy on micronutrient status is unknown. During pregnancy, patients with CD may require additional micronutrient monitoring, particularly in the case of historical micronutrient deficiencies or other risk factors. This case highlights the need for further research into CD-specific micronutrient deficiencies and the creation of specific supplementation guidelines and treatment algorithms for the detection of micronutrient deficiencies in at-risk patients.
Methods: None Reported.
Results: None Reported.
Conclusion: None Reported.
Gretchen Murray, BS, RDN1; Kristen Roberts, PhD, RDN, LD, CNSC, FASPEN, FAND2; Phil Hart, MD1; Mitchell Ramsey, MD1
1The Ohio State University Wexner Medical Center, Columbus, OH; 2The Ohio State University, Granville, OH
Financial Support: UL1TR002733.
Background: Enteric hyperoxaluria (EH) and resultant lithiasis are well documented in many malabsorptive conditions including inflammatory bowel disease, celiac disease, short bowel syndrome, and post-gastric bypass states. Chronic pancreatitis (CP) often leads to exocrine pancreatic insufficiency (EPI) and subsequent fat malabsorption, increasing the risk of EH as calcium binds dietary fat, leaving oxalate available for colonic absorption. Modulating oxalate intake by reducing whole grains, greens, baked beans, berries, nuts, beer and chocolate while simultaneously improving hydration is the accepted medical nutrition therapy (MNT) for EH and calcium-oxalate stones. Although sources of dietary oxalate are well known, there is limited literature regarding dietary oxalates in the CP diet, leaving a paucity of data to guide MNT for EH and calcium-oxalate lithiasis in these patients.
Methods: A cross-sectional, case-control study was performed comparing subjects with CP to healthy controls. Vioscreen™ food frequency questionnaire was used to assess and quantify total oxalic acid intake in the CP cohort and to describe dietary sources. Descriptive statistics were used to describe dietary intake of oxalic acid and contributing food sources.
Results: A total of 52 subjects with CP were included, with a mean age of 50 ± 15 years. Most subjects were male (n = 35; 67%). Mean BMI was 24 ± 6 kg/m2, and 8 subjects (15%) were classified as underweight by BMI. Median daily caloric intake was 1549 kcal, with a median daily oxalic acid intake of 104 mg (range 11-1428 mg). The top three contributors to dietary oxalate intake were raw and cooked greens such as spinach or lettuce, followed by mixed foods such as pizza, spaghetti, and tacos, and tea. Other significant contributors (>100 mg) to dietary oxalate intake included sports or meal replacement bars, cookies and cakes, potato products (mashed, baked, chips, fried), and refined grains (breads, tortillas, bagels).
Conclusion: In the CP population, the highest contributors to oxalate intake include greens, mixed foods, tea, meal replacement bars, some desserts, potatoes, and refined grains. Many of the identified dietary oxalate sources are not considered for exclusion in a typical oxalate-restricted diet. A personalized approach to dietary oxalate modulation is necessary to drive MNT for EH prevention in those with CP.
Qian Ren, PhD1; Peizhan Chen2
1Department of Clinical Nutrition, Shanghai Sixth People's Hospital Affiliated to Shanghai Jiao Tong University School of Medicine, Shanghai; 2Clinical Research Center, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai
Financial Support: This study was supported by the Youth Cultivation Program of Shanghai Sixth People's Hospital (Grant No. ynqn202223), the Key Laboratory of Trace Element Nutrition, National Health Commission of the Peoples’ Republic of China (Grant No. wlkfz202308), and the Danone Institute China Diet Nutrition Research and Communication (Grant No. DIC2023-06).
Background: Low serum vitamin D status was reported to be associated with reduced muscle mass; however, it is inconclusive whether this relationship is causal. This study used data from the National Health and Nutrition Examination Survey (NHANES) and two-sample Mendelian randomization (MR) analyses to ascertain the causal relationship between serum 25-hydroxyvitamin D [25(OH)D] and appendicular muscle mass (AMM).
Methods: In the NHANES 2011–2018 dataset, 11,242 participants aged 18–59 years old were included, and multivariable linear regression was performed to assess the relationship between 25(OH)D and AMM measured by dual-energy X-ray absorptiometry (Figure 1). In two-sample MR analysis, 167 single nucleotide polymorphisms significantly associated with serum 25(OH)D were applied as instrumental variables (IVs) to assess vitamin D effects on AMM in the UK Biobank (417,580 Europeans) using univariable and multivariable MR models (Figure 2).
Results: In the NHANES 2011–2018 dataset, serum 25(OH)D concentrations were positively associated with AMM (β = 0.013, SE = 0.001, p < 0.001) in all participants, after adjustment for age, race, season of blood collection, education, income, body mass index and physical activity. In stratification analysis by sex, males (β = 0.024, SE = 0.002, p < 0.001) showed more pronounced positive associations than females (β = 0.003, SE = 0.002, p = 0.024). In univariable MR, genetically higher serum 25(OH)D levels were positively associated with AMM in all participants (β = 0.049, SE = 0.024, p = 0.039) and males (β = 0.057, SE = 0.025, p = 0.021), but the association was only marginally significant in females (β = 0.043, SE = 0.025, p = 0.090) based on IVW models. No significant pleiotropy effects were detected for the IVs in the two-sample MR investigations. In MVMR analysis, a positive causal effect of 25(OH)D on AMM was observed in the total population (β = 0.116, SE = 0.051, p = 0.022), males (β = 0.111, SE = 0.053, p = 0.036) and females (β = 0.124, SE = 0.054, p = 0.021).
Conclusion: Our results suggest a positive causal effect of serum 25(OH)D concentration on AMM; however, more research is needed to understand the underlying biological mechanisms.
Figure 1. Working Flowchart of Participant Selection in the Cross-Sectional Study.
Figure 2. The study assumptions of the two-sample Mendelian randomization analysis between serum 25(OH)D and appendicular muscle mass. The assumptions are: (1) the genetic instrumental variables (IVs) exhibit a significant association with serum 25(OH)D; (2) the genetic IVs are not associated with any other potential confounding factors; and (3) the genetic IVs influence appendicular muscle mass only through serum 25(OH)D and not through any other pathway. The dotted lines indicate violations of the assumptions.
Qian Ren, PhD1; Junxian Wu1
1Department of Clinical Nutrition, Shanghai Sixth People's Hospital Affiliated to Shanghai Jiao Tong University School of Medicine, Shanghai
Financial Support: None Reported.
Background: A healthy diet is essential for both preventing and treating type 2 diabetes mellitus (T2DM), which has a negative impact on public health. Whole grains, which are rich in dietary fibre, can serve as a good source of carbohydrates. However, the correlation and causality between whole grain intake and the risk of T2DM and glucose metabolism remain to be clarified.
Methods: First, data from the National Health and Nutrition Examination Survey (NHANES) database 2003-2018 were used to investigate the correlation between dietary whole grain/fibre intake and the risk of T2DM and glucose metabolism. Then, based on the largest publicly published genome-wide association analysis of whole grain intake in the Neale lab database, single nucleotide polymorphisms (SNPs) significantly associated with whole grain intake were selected as instrumental variables (p < 5×10-8, linkage disequilibrium r2 < 0.1). Inverse variance weighted (IVW) analysis, the weighted median method and other methods were used to analyze the causal relationship between whole grain intake and T2DM. Heterogeneity tests, gene pleiotropy tests and sensitivity analyses were performed to evaluate the stability and reliability of the results.
Results: Dietary intakes of whole grains (OR = 0.999, 95% CI: 0.999-1.000, p = 0.004) and fibre (OR = 0.996, 95% CI: 0.993-0.999, p = 0.014) were negatively associated with the risk of T2DM. In the population with normal glucose metabolism, dietary fibre intake was negatively associated with FPG (β = -0.003, SE = 0.001, p < 0.001), FINS (β = -0.023, SE = 0.011, p = 0.044), and HOMA-IR (β = -0.007, SE = 0.003, p = 0.023). In the population with abnormal glucose metabolism, dietary intakes of whole grains (β = -0.001, SE = 0.001, p = 0.036) and fibre (β = -0.006, SE = 0.002, p = 0.005) were negatively associated with HbA1c. In further MR analyses, the IVW method demonstrated that for every one standard deviation increase in whole grain intake, T2DM risk decreased by 1.9% (OR = 0.981, 95% CI: 0.970-0.993, β = -0.019, p = 0.002), with consistent findings for IVW-multiplicative random effects, IVW-fixed effects and IVW-radial. Meanwhile, MR-Egger regression analysis (intercept = -2.7 × 10-5, p = 0.954) showed that gene pleiotropy did not influence the results of the MR analysis. Leave-one-out analysis showed that no individual SNP unduly influenced the results (p-heterogeneity = 0.445).
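The arithmetic behind these MR results can be sketched in a few lines of Python: a fixed-effects inverse-variance-weighted (IVW) estimate combines per-SNP Wald ratios, and the reported per-SD log-odds effect (β = -0.019) converts to the stated odds ratio and percent risk reduction by exponentiation. This is an illustrative sketch, not the authors' analysis code, and the per-SNP numbers in the usage example are hypothetical.

```python
import math

def ivw_estimate(beta_exp, beta_out, se_out):
    """Fixed-effects inverse-variance-weighted (IVW) MR estimate.

    Each SNP contributes a Wald ratio (beta_out / beta_exp), weighted by
    the inverse variance of its outcome association.
    """
    weights = [(bx / so) ** 2 for bx, so in zip(beta_exp, se_out)]
    ratios = [bo / bx for bx, bo in zip(beta_exp, beta_out)]
    beta = sum(w * r for w, r in zip(weights, ratios)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return beta, se

# Hypothetical per-SNP summary statistics (exposure betas, outcome betas, outcome SEs)
beta_ivw, se_ivw = ivw_estimate([0.10, 0.08, 0.12], [-0.002, -0.001, -0.003], [0.01, 0.01, 0.01])

# Converting the reported log-odds effect to an odds ratio and percent change:
beta = -0.019                          # per SD of whole grain intake
odds_ratio = math.exp(beta)            # ~0.981
pct_decrease = (1 - odds_ratio) * 100  # ~1.9% lower odds per SD
```

The same exponentiation explains why a small negative β corresponds to an OR just below 1.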
Conclusion: Dietary intakes of whole grains may reduce the risk of T2DM and improve glucose metabolic homeostasis and insulin resistance. The causal relationship between whole grain intake and T2DM, as well as the optimal daily intake of whole grains, still needs to be explored further through large randomised controlled intervention studies and prospective cohort studies.
1University of Hyogo, Ashiya-shi, Hyogo; 2University of Hyogo, Himezi-shi, Hyogo
Financial Support: None Reported.
Background: In recent years, lifestyle-related diseases such as obesity, diabetes, and dyslipidemia have been recognized as problems, one cause of which is an excessively fatty diet. Obesity and related diseases are known risk factors for the severity of infectious diseases such as sepsis and novel coronavirus infection, but the underlying pathomechanisms have not been clarified. We therefore hypothesized that a high-fat diet might induce functional changes not only in adipocytes but also in macrophages, weakening the immune response and leading to aggravation of infectious diseases. In this study, we performed proteome analysis and RNA sequence analysis to examine what kind of gene and protein expression is induced in macrophages by high-fat diet loading.
Methods: Four-week-old mice were divided into a normal diet (ND) group and a high-fat diet (HFD) group and kept for four weeks. One week before dissection, mice were injected intraperitoneally with 2 mL of thioglycolate medium to promote macrophage proliferation; macrophages were then collected intraperitoneally and incubated for 2 hours at 37°C with 5% CO2 in Roswell Park Memorial Institute (RPMI) medium. Floating cells were removed, and proteome analysis was performed on the recovered macrophages. In addition, RNA sequence analysis was performed on RNA extracted from the macrophages.
Results: Proteome analysis identified more than 4000 proteins in each group. In the HFD group, compared to the ND group, decreased expression of proteins involved in phagocytosis, such as immunoglobulin and eosinophil peroxidase, was observed. RNA sequencing likewise showed decreased expression of the phagocytosis-related genes corresponding to the proteins reduced in the proteome analysis.
Conclusion: These findings suggest that the phagocytic ability of macrophages is reduced by high-fat diet loading. This research is expected to clarify the molecular mechanisms by which high-fat dietary loading alters gene and protein expression and induces immunosuppressive effects.
1The Ohio State University College of Medicine, Columbus, OH; 2The Ohio State University Wexner Medical Center, Columbus, OH; 3The Ohio State University College of Public Health, Columbus, OH; 4The Ohio State University Wexner Medical Center, Dublin, OH
Financial Support: None Reported.
Background: Food insecurity (FI) refers to lack of consistent access to sufficient food for an active, healthy life. This issue, influenced by economic, social, and environmental factors, disproportionately affects disadvantaged populations. According to the United States Department of Agriculture, more than 10% of U.S. households experience FI, leading to physical and mental health consequences. FI has been linked to poorer dietary quality, increased consumption of processed, calorie-dense foods, and a higher prevalence of obesity, diabetes, and cardiovascular diseases. Chronic stress from FI can also trigger or exacerbate mental health disorders, including depression and anxiety, further complicating health outcomes. Affecting ~20% of the global population, dyspepsia significantly diminishes quality of life and increases healthcare costs. Its etiology is multifactorial, involving abnormal gastric motility, visceral hypersensitivity, inflammation, and psychosocial stressors. Although research on the link between FI and disorders of gut-brain interaction (DGBIs) like dyspepsia is limited, emerging evidence suggests a bidirectional relationship. Therefore, this study was designed to examine the association between FI, dyspepsia, and other health-related social needs (HRSN). Our hypotheses include 1) patients with FI have more severe dyspepsia symptoms, and 2) FI is associated with HRSN in other domains.
Methods: Patients presenting to a specialty motility clinic were prospectively enrolled into a registry created with the goal of holistically investigating the pathophysiology of DGBIs. Validated questionnaires for HRSN and dyspepsia were completed prior to their clinic visits. Data were managed with REDCap and statistical analyses were performed using SPSS.
Results: A total of 53 patients completed the questionnaires. 88.7% of patients were White and 73.6% were female, with an average age of 45.6 years (21-72) and BMI of 28.7 kg/m2 (17.8-51.1). FI was present in 13 (24.5%) patients. The overall severity of dyspepsia symptoms was significantly less in the food secure patients (13.8 vs. 18.8, p = 0.042). Of the four subscales of dyspepsia (nausea-vomiting, postprandial fullness, bloating, and loss of appetite), only loss of appetite was significantly greater in those with FI (2.3 vs. 1.3, p = 0.017). Patients with FI were more likely to be at medium (61.5% vs. 5.0%) or high risk (30.8% vs. 2.5%, p < 0.001) of financial hardship, experience unmet transportation needs (38.5% vs. 5.0%, p = 0.019) and housing instability (30.8% vs. 5.0%, p = 0.023) compared to those who were food secure. They were also at higher risk of depression (54% vs. 12.5%, p = 0.005) and of reporting insufficient physical activity (92.3% vs. 55.0%, p = 0.05). After adjusting for age, gender, race, and BMI, FI was not a predictor of global dyspepsia severity. Greater BMI (O.R. 0.89, 95% C.I. 0.81-0.98) was associated with severity of early satiety. Female gender (O.R. 10.0, 95% C.I. 1.3-76.9, p = 0.03) was associated with the severity of nausea. Greater BMI (O.R. 1.10, 95% C.I. 1.001-1.22, p = 0.048) and female gender (O.R. 10.8, 95% C.I. 1.6-72.9, p = 0.015) both correlated with severity of postprandial fullness.
Conclusion: FI affected ~25% of patients seen in our clinic. It was, however, not an independent predictor of overall severity of dyspepsia symptoms. Patients who experienced FI had higher prevalence of other HRSN, and risk of depression and physical inactivity. Our results highlight the importance of considering FI and other HRSN in the management of dyspepsia. Understanding this interaction is essential for improving clinical outcomes and guiding public health interventions.
Background: Farnesoid X receptor (FXR), a gut nuclear receptor, regulates intestine-driven bile acid homeostasis in short bowel syndrome. Chenodeoxycholic acid (CDCA), a primary bile acid, acts as an FXR ligand. Stem cell-derived intestinal enteroids offer a valuable system for studying intestinal FXR function. We hypothesized that transfection of porcine enteroids with small interfering RNA (siRNA) would modulate FXR and further define its mechanistic role.
Methods: We developed a porcine protocol for matrigel-based 3D culture systems to generate enteroids from the small bowel of neonatal Yorkshire pigs. After 7 days, cultures were passaged and expanded. RNA strands for Dicer-substrate siRNAs (DsiRNAs) were synthesized as single-strand RNAs (ssRNAs) by Integrated DNA Technologies and resuspended in an RNase-free buffer. The DsiRNAs targeted the Sus scrofa farnesoid X receptor (FXR) gene (KF597010.1). Three sets of DsiRNA were made for gene-specific silencing of the FXR gene. Porcine enteroids were cultured and transfected with FXR-specific siRNA and control siRNA using Lipofectamine RNAiMAX reagent. They were also treated with escalating CDCA concentrations (25 μM to 50 μM) for 24 and 48 hrs. FXR mRNA levels were quantified by real-time PCR, and functional assays were performed to assess changes in bile acid uptake and efflux following transfection.
Results: Data from 3 separate experiments on intestinal crypts showed similar results, with enhanced FXR expression with CDCA compared with control (p < 0.01). CDCA treatment resulted in a dose-dependent increase in FXR mRNA expression, reaching peak levels after 48 hours of exposure (2.8-fold increase) with enhanced nuclear localization. Functionally, CDCA-treated enteroids displayed increased bile acid uptake and reduced efflux, validating FXR's role in mediating bile acid-driven enterohepatic circulation. Several runs with siRNA were conducted. Using 30 pmol siRNA (sense: 5' GAUCAACAGAAUCUUCUUCAUUATA 3' and antisense: 5' UAUAAUGAAGAAGAUUCUGUUGAUCUG 3'), there was a 68% reduction in FXR expression relative to scrambled control. FXR silencing led to decreased bile acid uptake and increased efflux. No significant effects were observed in enteroids transfected with control siRNA. Paradoxically, CDCA-treated cultures showed a higher proportion of immature to mature enteroids.
Conclusion: In porcine enteroids, CDCA treatment increased FXR gene expression and promoted its nuclear localization. This finding implies the existence of a positive feedback cycle in which CDCA, an FXR ligand, induces further FXR synthesis and bile acid uptake. siRNA transfection significantly decreased FXR activity. This methodology can be employed to effectively examine the function of FXR in ligand-treated or control systems.
Presentation: North American Society for Pediatric Gastroenterology, Hepatology and Nutrition (NASPGHAN) Florida.
Financial Support: None Reported.
Background: Biliary atresia (BA) is a major cause of obstructive neonatal cholestatic disease. Although hepato-portoenterostomy (HPE) is routinely performed for BA patients, more than half eventually require liver transplantation. The intricate mechanisms of bile ductular injury driving its pathogenesis remain elusive and are not recapitulated in current small animal and rodent models that rely on bile duct ligation. Addressing this lacuna, we hypothesized that extra- and intrahepatic bile duct destruction through an endothelial irritant would recapitulate the human condition. We have thus developed a novel neonatal piglet BA model called 'BATTED'. Piglets have hepatic and gastrointestinal homology to human infants and share anatomic and physiological processes, providing a robust platform for BATTED and HPE.
Methods: Six 7-10-day-old piglets were randomized to BATTED (US provisional patent US63/603,995) or sham surgery. BATTED included cholecystectomy, injection of 95% ethanol into the common bile duct and hepatic duct, and a retainer suture for continued ethanol-induced intrahepatic injury. Vascular access ports were placed, and scheduled ultrasound-guided liver biopsies were performed. Six weeks post-BATTED, piglets underwent HPE. Eight weeks after the initial surgery, animals were euthanized. Serology, histology, gene expression and immunohistochemistry were performed.
Results: Serological evaluation revealed a surge in conjugated bilirubin 6 weeks after the BATTED procedure from baseline (mean Δ 0.39 mg/dL to 3.88 mg/dL). Gamma-glutamyl transferase (GGT) also exhibited a several-fold increase (mean Δ 16.3 IU to 89.5 IU). Sham animals did not display these elevations (conjugated bilirubin: Δ 0.39 mg/dL to 0.98 mg/dL; GGT: Δ 9.2 IU to 10.4 IU). Sirius red staining demonstrated significant periportal and diffuse liver fibrosis (16-fold increase), and the bile duct proliferation marker CK-7 increased 9-fold with BATTED. Piglets in the BATTED group demonstrated enhanced CD-3 (7-fold), alpha-SMA (8.85-fold), COL1A1 (11.7-fold) and CYP7A1 (7-fold) vs sham. Successful HPE was accomplished in piglets, with improved nutritional status and a reduction in conjugated bilirubin (Δ 4.89 mg/dL to 2.11 mg/dL).
Conclusion: BATTED replicated BA features, including hyperbilirubinemia, GGT elevation, significant hepatic fibrosis, bile duct proliferation, and inflammatory infiltration with subsequent successful HPE. This model offers substantial opportunities to elucidate the mechanism underlying BA and adaptation post HPE, paving the path for the development of diagnostics and therapeutics.
Sirine Belaid, MBBS, MPH1; Vikram Raghu, MD, MS1
1UPMC, Pittsburgh, PA
Financial Support: None Reported.
Background: At our institution, pediatric residents are responsible for managing patients with intestinal failure (IF) in the inpatient units. However, they have reported feelings of inexperience and anxiety when dealing with these complex cases. This project aimed to identify knowledge gaps and evaluate the confidence levels of pediatric residents in managing IF patients.
Methods: We conducted an online needs assessment survey using Qualtrics, which included Likert-scale, multiple-choice, open-ended and rating questions to assess residents' confidence levels (from 1 to 10) in performing tasks related to IF patient care. This voluntary survey, approved by the IRB as exempt, was distributed to all pediatric residents at the University of Pittsburgh via QR codes and emails.
Results: Thirty-two percent of residents responded to the survey, with nearly 50% having completed a rotation on the Intestinal Rehabilitation (IR) service. Residents reported the lowest confidence in calculating Total Parenteral Nutrition (TPN)-like Intravenous (IV) fluids (solutions administered centrally or peripherally, containing dextrose and major electrolytes designed to match the patients’ home TPN contents), identifying signs of D-lactic acidosis and small intestinal bacterial overgrowth (SIBO), and managing SIBO (average confidence rating of 2/10). They also expressed low confidence in ordering home TPN and TPN-like IV fluids, understanding related anatomy, ensuring proper stoma care, managing central access loss, and addressing poor catheter blood flow (average confidence ratings of 3-4/10). Conversely, residents felt more confident managing feeding intolerance and central line-associated infections (average confidence ratings of 5-6/10). Additionally, they rated their ability to identify signs of septic and hypovolemic shock as 8/10, though they felt less confident in managing these conditions (7/10). Furthermore, 64% of respondents agreed that managing IF patients is educationally valuable and preferred laminated cards or simulated sessions as educational resources (average ratings of 8 and 6.8 out of 10, respectively).
Conclusion: The survey highlights several areas where pediatric residents need further education. Addressing these knowledge gaps through targeted curricular interventions can better equip residents to manage IF patients and potentially increase their interest in this specialty.
CLABSI = Central Line-Associated Bloodstream Infection, TPN = Total Parenteral Nutrition, EHR = Electronic Health Record, IV = Intravenous, SIBO = Small Intestinal Bacterial Overgrowth.
Figure 1. Tasks related to managing patients with intestinal failure (IF), stratified into three categories based on the average confidence rating score (>= 7/10, 5-6/10, <= 4/10) of pediatric residents.
Figure 2. Distribution of pediatric residents’ opinions on the educational value of managing patients with intestinal failure.
1The Hospital for Sick Children, Toronto, ON; 2Hospital Metropolitano de Quito, Quito, Pichincha
Financial Support: Nestle Health Science Canada, North York, Ontario, Canada.
Background: Enteral nutrition provides fluids and nutrients to individuals unable to meet their needs orally. Recent interest in real food-based formulas highlights a shift towards providing nutrition with familiar fruit, vegetable and protein ingredients. This study aimed to evaluate the tolerability and nutritional adequacy of a hypercaloric, plant-based, real food ingredient formula in pediatric tube-fed patients.
Methods: This prospective, single-arm, open-label study evaluated the tolerability and nutritional adequacy of a hypercaloric, plant-based formula, Compleat® Junior 1.5, in medically complex, stable, pediatric tube-fed patients aged 1-13 years. Participants were recruited from outpatient clinics at The Hospital for Sick Children, Toronto from May 2023 to June 2024. Demographic and anthropometric measurements (weight, height) were obtained at baseline. The daily dose of study product was isocaloric compared to routine feeds. Total fluid needs were met by adding water. Participants transitioned to the study formula over 3 days, followed by 14 days of exclusive use of the study product. Caregivers monitored volume of daily intake, feed tolerance, and bowel movements during the transition and study period using an electronic database (Medrio®). An end-of-study visit was conducted to collect weight measurements. Descriptive statistics summarize demographic and clinical characteristics (Table 1). The paired t-test compared weight-for-age and BMI-for-age z-scores between baseline and end of study. Symptoms of intolerance and bowel movements, assessed using either the Bristol Stool Scale or the Brussels Infant and Toddler Stool Scale, were described as frequency of events and compared between baseline and the intervention period. Percent of calorie and protein goals met during the study period was calculated as calories received over calories prescribed, and protein received relative to the dietary reference intake for age and weight.
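The paired comparison of baseline and end-of-study z-scores described above can be sketched in Python; this is an illustrative sketch of the standard paired t statistic, not the study's analysis code, and the z-score values in the usage example are hypothetical, not study data.

```python
import math
import statistics

def paired_t(before, after):
    """Paired t-test statistic for pre/post measurements on the same children.

    Returns the t statistic and degrees of freedom; the p-value is then read
    from a t distribution with n - 1 degrees of freedom.
    """
    diffs = [a - b for b, a in zip(before, after)]
    n = len(diffs)
    t = statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))
    return t, n - 1

# Hypothetical weight-for-age z-scores at baseline and end of study
baseline = [-2.1, -1.8, -2.4, -1.5, -2.0]
end_of_study = [-1.9, -1.7, -2.2, -1.4, -1.9]
t_stat, df = paired_t(baseline, end_of_study)
```

Because each child serves as their own control, the test operates on within-child differences rather than the two group means.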
Results: In total, 27 ambulatory pediatric participants with a median age of 5.5 years (IQR, 2.5-7) were recruited, with 26 completing the study (Table 1). Participant weight-for-age and BMI-for-age z-scores significantly improved between baseline and end of study, from -1.75 ± 1.93 to -1.67 ± 1.88 (p < 0.05), and from -0.47 ± 1.46 to 0.15 ± 0.23 (p < 0.05), respectively. There was no significant difference in the frequency of any GI symptom, including vomiting, gagging/retching, tube venting or perceived pain/discomfort with feeds, between baseline and end of study. There was no significant difference in frequency or type of stool between baseline and end of study. Study participants met 100% of their prescribed energy for the majority (13 ± 1.7 days) of the study period. All participants exceeded protein requirements during the study period. Twenty families (76.9%) indicated a desire to continue using the study product after completing the study.
Conclusion: This prospective study demonstrated that a hypercaloric, plant-based, real food ingredient formula was well tolerated among stable yet medically complex children and was calorically adequate to maintain or facilitate weight gain over the 14-day study period. The majority of caregivers preferred to continue using the study product.
Table 1. Demographic and Clinical Characteristics of Participants (n = 27).
1Northwestern University Feinberg School of Medicine, Chicago, IL; 2University of Alabama Culverhouse College of Business, Tuscaloosa, AL; 3Northwestern University Kellogg School of Business & McCormick School of Engineering, Evanston, IL; 4University of North Carolina School of Medicine, Chapel Hill, NC
1Cohen Children's Medical Center of New York, Port Washington, NY; 2Cohen Children's Medical Center of NY, New Hyde Park, NY
Financial Support: None Reported.
Background: Premature infants in the neonatal intensive care unit (NICU) are at high risk for peripheral intravenous catheter infiltration (PIVI) because of the frequent cannulation of small and fragile vessels. The most common infusate in neonates is parenteral nutrition (PN), followed by antibiotics. Previous reports have suggested that the intrinsic properties of infusates, such as pH, osmolality, and calcium content, determine the severity of PIVIs. This has led to the common practice of restricting the intake of protein and/or calcium to less than what is recommended for optimal growth and bone mineralization.
Methods: Our objective was to identify the characteristics of infants and intravenous (IV) infusates associated with the development of severe neonatal PIVIs. We conducted a retrospective analysis of PIVIs in our level IV NICU from 2018-2022 (n = 120). Each PIVI was evaluated by a wound-certified neonatologist and classified as mild, moderate, or severe using a scoring system based on the Infusion Nurses Society (INS) staging criteria. Comparisons between groups were made using ANOVA and chi-square analysis, or Mann-Whitney tests for non-parametric data.
Results: Infants with severe PIVIs had a lower mean birthweight than those with mild or moderate PIVIs (1413.1 g vs 2116.9 g and 2020.3 g respectively, p = .01) (Table 1). Most PIVIs occurred during infusions of PN and lipids, but the severity was not associated with the infusion rate, osmolality, or the concentration of amino acids (median 3.8 g/dL in the mild group, 3.5 g/dL in the moderate group and 3.4 g/dL in the severe group) or calcium (median 6500 mg/L in all groups) (Table 2, Figure 1). Of note, the infusion of IV medications within 24 hours of PIVI was most common in the severe PIVI group (p = .03) (Table 2). Most PIVIs, including mild, were treated with hyaluronidase. Advanced wound care was required for 4% of moderate and 44% of severe PIVIs, and none required surgical intervention.
Conclusion: Severe PIVIs in the NICU are most likely to occur in infants with a low birthweight and within 24 hours of administration of IV medications. This is most likely because medications must be kept at an acidic or basic pH for stability, and many have high osmolarity and/or intrinsically caustic properties. Thus, medications may induce chemical phlebitis and extravasation with inflammation. In contrast, PN components, including amino acids and calcium, are not related to the severity of extravasations. Our findings suggest that increased surveillance of IV sites in preterm infants following medication administration may decrease the risk of severe PIVIs. Conversely, reducing or withholding parenteral amino acids and/or calcium to mitigate PIVI risk may introduce nutritional deficiencies without decreasing the risk of clinically significant PIVIs.
Table 1. Characteristic Comparison of Mild, Moderate, and Severe PIVIs in the Neonatal ICU. PIVI Severity Was Designated Based on INS Criteria.
Table 2. Association of Medication Administration and Components of Infusates With the Incidence and Severity of PIVI in NICU.
1Cincinnati Children's Hospital Medical Center, Mason, OH; 2University of Cincinnati, Cincinnati, OH; 3Cincinnati Children's Hospital Medical Center, Cincinnati, OH; 4Cincinnati Children's Hospital, Cincinnati, OH; 5Cincinnati Children's Hospital Medical Center, Cincinnati, OH
Financial Support: None Reported.
Background: It is common for children with intestinal failure (IF) on parenteral nutrition (PN) to be fed an elemental enteral formula, as these are believed to be better tolerated because the protein module consists of free amino acids, other allergens are absent, and long chain fatty acids are present. In February 2022, a popular elemental formula on the market was recalled due to bacterial contamination, necessitating an immediate transition to an alternative enteral formula; this included initiating plant-based options for some of our patients. We have experienced growing interest and requests from families to switch to plant-based formulas due to religious practices, cost concerns, and personal preferences. While plant-based formulas lack major allergens and may contain beneficial soluble fiber, they are understudied in this patient population. This study aims to determine whether growth was affected among children with intestinal failure on parenteral nutrition who switched from elemental to plant-based formulas.
Methods: We conducted a retrospective cohort study of IF patients on PN managed by our intestinal rehabilitation program who transitioned from elemental to plant-based formulas during the product recall. Data were collected on demographics, intestinal anatomy, formula intake, parenteral nutrition support, tolerance, stool consistency, and weight gain for 6 months before and after formula transition. Paired analyses were performed for change in growth and nutritional intake using the Wilcoxon signed-rank test. Chi-squared tests were performed to compare formula tolerance. An alpha value < 0.05 was considered significant.
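The paired pre/post comparison described above can be illustrated with a minimal, from-scratch sketch of the Wilcoxon signed-rank statistic (illustrative only: the study used standard statistical software, and this simplified version returns only the test statistic, omitting the p-value and any corrections beyond mid-rank averaging of ties):

```python
def wilcoxon_signed_rank(before, after):
    """Simplified Wilcoxon signed-rank statistic for paired samples.
    Drops zero differences and averages ranks for tied absolute values;
    returns the smaller of the positive and negative rank sums."""
    diffs = [b - a for a, b in zip(before, after) if b - a != 0]
    abs_sorted = sorted(abs(d) for d in diffs)

    def rank(v):
        # mid-rank: average the 1-based positions of tied absolute values
        positions = [i + 1 for i, x in enumerate(abs_sorted) if x == v]
        return sum(positions) / len(positions)

    w_pos = sum(rank(abs(d)) for d in diffs if d > 0)
    w_neg = sum(rank(abs(d)) for d in diffs if d < 0)
    return min(w_pos, w_neg)

# hypothetical paired weight-gain values (g/day) before/after a formula switch
print(wilcoxon_signed_rank([10, 12, 9, 14, 11], [12, 15, 8, 17, 12]))  # 1.5
```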
Results: Eleven patients were included in the study [8 males; median gestational age 33 (IQR = 29, 35.5) weeks; median age at assessment 20.4 (IQR = 18.7, 29.7) months]. All participants had short bowel syndrome (SBS) as their IF category. Residual small bowel length was 28 (IQR = 14.5, 47.5) cm. Overall, there was no statistically significant difference in growth observed after switching to plant-based formulas (p = 0.76) (Figure 1). Both median enteral formula volume and calorie intake were higher on plant-based formula, but not statistically significantly so (p = 0.83 and p = 0.41) (Figure 2). Seven of 11 patients (64%) reported decreased stool count (p = 0.078) and improved stool consistency (p = 0.103) after switching to plant-based formula. Throughout the study, the rates of PN calorie and volume weaning were not different after switching to plant-based formula (calories: p = 0.83; volume: p = 0.52) (Figure 3).
Conclusion: In this small study of children with IF, the switch from free amino acid formula to an intact plant-based formula was well tolerated. Growth was maintained between groups. After switching to plant-based formulas these children tolerated increased enteral volumes, but we were underpowered to demonstrate a statistical difference. There was no evidence of protein allergy among children who switched. Plant-based formulas may be an alternative option to elemental formulas for children with intestinal failure.
Figure 1: Change in Weight Gain (g/day) on Elemental and Plant-Based Formulas.
Figure 2. Change in Enteral Calories (kcal/kg/day) and Volume (mL/kg/day) on Elemental and Plant-Based Formulas.
Figure 3. Change in PN Calories (kcal/kg/day) and Volume (mL/kg/day) on Elemental and Plant-Based Formulas.
Background: In pediatrics, post-pyloric enteral feeding via jejunostomy, gastrojejunostomy or naso-jejunostomy tubes is increasingly utilized to overcome problems related to aspiration, severe gastroesophageal reflux, poor gastric motility, and gastric outlet obstruction. Jejunal enteral feeding bypasses the duodenum, a primary site of absorption for many nutrients. Copper is thought to be primarily absorbed in the stomach and proximal duodenum, thus patients receiving nutritional support in a way that bypasses this region can experience copper deficiency. The prevalence and complications of copper deficiency in the pediatric population are not well documented.
Methods: This was a retrospective case series of two patients treated at Nationwide Children's Hospital (NCH). Medical records were reviewed to collect laboratory data, medication/supplement data, and enteral feeding history. Both patients were receiving Pediasure Peptide® as their enteral formula.
Results: Case 1: A 14-year-old male had received exclusive post-pyloric enteral nutrition for two years and presented with pancytopenia and worsening anemia. Laboratory data drawn in 3/2017 demonstrated deficient levels of copper (< 10 ug/dL) and ceruloplasmin (20 mg/dL). Repletion was initiated with intravenous cupric chloride 38 mcg/kg/day for 3 days, then transitioned to 119 mcg/kg/day via j-tube for 4 months. Labs redrawn 2 months after the initial episode of deficiency indicated overall improvement of the pancytopenia (Table 1). After 4 months, cupric chloride was decreased to 57 mcg/kg/day as a maintenance dose. Laboratory data redrawn two and a half years after the initial episode of deficiency revealed deficient levels of copper (27 ug/dL) and ceruloplasmin (10 mg/dL) despite the ongoing lower-dose supplementation. Case 2: An 8-year-old female had received exclusive post-pyloric enteral nutrition for 3 months. Laboratory data drawn in 3/2019 revealed deficient levels of copper (38 ug/dL) and ceruloplasmin (13 mg/dL). Supplementation of 50 mcg/kg/day cupric chloride was administered daily through the jejunal tube. Copper and ceruloplasmin labs redrawn at 11 and 15 months after initiation of supplementation revealed continued deficiency, though hematologic values remained stable (Table 2).
Conclusion: There are currently no guidelines for clinicians on the prevention, screening, treatment, and maintenance therapy of copper deficiency in pediatric patients receiving post-pyloric enteral feeding. Current dosing for copper repletion in profound copper deficiency is largely based on case series and expert opinion. At NCH, the current standard-of-care supplementation demonstrates inconsistent copper repletion, as evidenced by the cases discussed above. Future research should determine appropriate supplementation regimens and evaluate their efficacy in patients receiving post-pyloric enteral feeding.
Table 1. Laboratory Evaluation of Case 1.
'-' indicates no data available, bolded indicates result below the lower limit of normal for age.
Table 2. Laboratory Evaluation of Case 2.
'-' indicates no data available, bolded indicates result below the lower limit of normal for age.
Background: Parenteral nutrition (PN) remains a high-risk therapy, typically containing more than 40 different ingredients. Incorporation of PN prescribing into the electronic health record (EHR) has been recommended to minimize the risk of transcription errors and allow for implementation of important safety alerts for prescribers. Ambulatory PN prescribing workflows typically require manual transcription of orders in the absence of robust EHR interoperability between systems used for transitions of care. Pediatric patients receiving ambulatory PN have an increased risk of medication events due to the need for weight-based customization and low utilization of ambulatory PN, which leaves many pharmacies inexperienced. Development and implementation of a prescribing system incorporating inpatient and outpatient functionality within the EHR is necessary to improve the quality and safety of ambulatory PN in pediatric patients. The primary goal was to create a workflow that improved transitions of care by minimizing manual transcription and improving medication safety. We describe modification of standard EHR tools to achieve this aim.
Methods: Utilizing a multidisciplinary team, development and incorporation of ambulatory PN prescribing within the EHR at Nationwide Children's Hospital was completed. Inpatient and outpatient variances, safety parameters to provide appropriate alerts to prescribers, and legal requirements were considered and evaluated for optimization within the new system.
Results: The final product successfully incorporated ambulatory PN prescribing while allowing seamless transfer of prescriptions between care settings. The prescriber orders the patient specific ambulatory PN during an inpatient or outpatient encounter. The order is subsequently queued for pharmacist review/verification to assess the adjustments and determine extended stability considerations in the ambulatory setting. After pharmacist review, the prescription prints and is signed by the provider to be faxed to the pharmacy.
Conclusion: To our knowledge, ours is the first institution to develop and incorporate pediatric PN prescribing into the EHR such that prescriptions transfer between inpatient and outpatient settings without manual transcription, while still allowing for customization of PN.
Faith Bala, PhD1; Enas Alshaikh, PhD1; Sudarshan Jadcherla, MD1
1The Research Institute at Nationwide Children's Hospital, Columbus, OH
Financial Support: None Reported.
Background: Extrauterine growth remains a concern among preterm-born infants admitted to the neonatal ICU (NICU), as it rarely matches intrauterine fetal growth rates. Although the reasons are multifactorial, the roles played by the duration of exclusive parenteral nutrition (EPN) and the transition to the exclusive enteral nutrition (EEN) phase remain unclear. Significant nutrient deficits can exist during the critical phase from birth to EEN and thereafter, and these likely impact short- and long-term outcomes. Given this rationale, our aims were to examine the relationship of the duration from birth to EEN with growth and length of hospital stay (LOHS) among convalescing preterm-born infants with oral feeding difficulties.
Methods: This is a retrospective analysis of prospectively collected data from 77 preterm infants admitted to the all-referral level IV NICU at Nationwide Children's Hospital, Columbus, Ohio, who were later referred to our innovative neonatal and infant feeding disorders program for the evaluation and management of severe feeding/aero-digestive difficulties. Inclusion criteria: infants born < 32 weeks gestation, birthweight < 1500 g, absence of chromosomal/genetic disorders, discharged at term equivalent postmenstrual age (37-42 weeks, PMA) on full oral feeding. Growth variables were converted to age- and gender-specific Z-scores using the Fenton growth charts. Using the Academy of Nutrition and Dietetics criteria for neonates and preterm populations, extrauterine growth restriction (EUGR) was defined as weight Z-score decline from birth to discharge > 0.8. Clinical characteristics stratified by EUGR status were compared using the Chi-Square test, Fisher exact test, Mann Whitney U test, and T-test as appropriate. Multivariate regression was used to explore and assess the relationship between the duration from birth to EEN and growth Z-scores at discharge simultaneously. Multiple Linear regression was used to assess the relationship between the duration from birth to EEN and LOHS.
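The EUGR definition used in the Methods reduces to a simple Z-score computation; a minimal sketch follows (the reference means and standard deviations below are hypothetical placeholders, not actual Fenton growth-chart values):

```python
def z_score(value, ref_mean, ref_sd):
    """Convert a raw measurement to a Z-score against a reference
    distribution (the study used Fenton growth-chart references)."""
    return (value - ref_mean) / ref_sd

def has_eugr(weight_birth, ref_birth, weight_discharge, ref_discharge, threshold=0.8):
    """EUGR per the study definition: weight Z-score decline from
    birth to discharge greater than 0.8."""
    decline = z_score(weight_birth, *ref_birth) - z_score(weight_discharge, *ref_discharge)
    return decline > threshold

# hypothetical infant whose Z-score falls from 0.0 at birth to -1.0 at discharge
print(has_eugr(1500, (1500, 300), 2400, (2700, 300)))  # True
```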
Results: Forty-two infants (54.5%) had EUGR at discharge, and those with weight and length percentiles < 10% were significantly greater at discharge than at birth (Table 1). The growth-restricted infants at discharge had significantly lower birth gestational age, a higher proportion required mechanical ventilation at birth, had a higher incidence of sepsis, and took longer days to attain EEN (Table 2). The duration from birth to EEN was significantly negatively associated with weight, length, and head circumference Z scores at discharge. Likewise, the duration from birth to EEN was significantly positively associated with the LOHS (Figure 1).
Conclusion: The duration from birth to exclusive enteral nutrition (EEN) can influence growth outcomes. We speculate that significant gaps between recommended and actual nutrient intake exist during the period before EEN is reached, particularly for those with chronic tube-feeding difficulties. Well-planned and personalized nutrition is relevant until EEN is established, and enteral nutrition advancement strategies using standardized feeding protocols offer prospects for generalizability while still providing opportunities for personalization.
Table 1. Participant Growth Characteristics.
Table 2. Participant Clinical Characteristics.
Figure 1. Relationship between the Duration from Birth to EEN Versus Growth Parameters and Length of Hospital Stay.
1Florida International University, Bloomingdale, GA; 2East Carolina Health, Washington, NC; 3Florida International University, Miami Beach, FL; 4Banner University Medical Center, The University of Arizona, Tucson, AZ
Financial Support: The Rickard Foundation.
Background: The neonatal registered dietitian nutritionist (NICU RDN) plays a crucial role in the care of premature and critically ill infants in the neonatal intensive care unit (NICU). Advanced pediatric competencies are frequently a gap in nutrition degree coursework and dietetic internships. Critical care success with such vulnerable patients requires expertise in patient-centered care, multidisciplinary collaboration, and adaptive clinical problem-solving. This research aimed to identify the needs, engagement levels, and expertise of NICU RDNs, while also providing insights into their job satisfaction and career longevity. Currently in the US, there are approximately 850 Level III and Level IV NICUs in which neonatal dietitians are vital care team members, making it imperative that hospitals provide appropriate compensation, benefits, and educational support.
Methods: This was a cross-sectional examination using a national, online, IRB-approved survey conducted during March 2024 and sent to established Neonatal and Pediatric Dietitian practice groups. A Qualtrics link was provided for current and former NICU RDNs to complete a 10-minute online survey with an optional gift card for completion. The link remained open until 200 gift cards were exhausted, approximately one week after the survey opened. Statistical analyses were performed using Stats IQ Qualtrics. In the descriptive statistics, frequencies of responses were represented as counts and percentages. For comparison of differences, the Chi-squared test and Fisher's exact test were used for categorical analysis.
Results: In total, 253 current (n = 206) and former (n = 47) NICU RDNs completed the online questionnaire. Of the 210 respondents, 84 (40%) reported having pediatric clinical experience, 94 (44%) had clinical pediatric dietetic intern experience, 21 (10%) had previously worked as a WIC nutritionist, 15 (7.1%) had specialized pediatric certification or fellowship, and 12 (5.7%) had no prior experience before starting in the NICU (Table 1). Of 163 respondents, 83 (50.9%) reported receiving financial support or reimbursement for additional NICU training. Respondents who felt valued as team members planned to stay in the NICU RD role for more than 5 years (p = 0.0046). Additionally, they reported having acknowledgement and appreciation (64.4%), motivation (54.1%), and opportunities for advancement (22.9%) (Table 2).
Conclusion: NICU RDNs do not have a clear competency roadmap or career development track. In addition, financial support or reimbursement for continuing education is not consistently an employee benefit, which may play a key role in job satisfaction and retention. These data provide valuable insight not only for managers of dietitians but also for professional societies to build programs and retention opportunities.
Table 1. Question: When You Started as a NICU RD, What Experience Did You Have? (n = 210).
N and Percentages will total more than 210 as respondents could check multiple answers.
Table 2. Comparison of Questions: Do You Feel You Have the Following in Your Role in the NICU and How Long Do You Plan to Stay in Your Role?
Sivan Kinberg, MD1; Christine Hoyer, RD2; Everardo Perez Montoya, RD2; June Chang, MA2; Elizabeth Berg, MD2; Jyneva Pickel, DNP2
1Columbia University Irving Medical Center, New York, NY; 2Columbia University Medical Center, New York, NY
Financial Support: None Reported.
Background: Patients with short bowel syndrome (SBS) can have significant fat malabsorption due to decreased intestinal surface area, bile acid deficiency, and rapid transit time, often leading to feeding intolerance and dependence on parenteral nutrition (PN). Patients with SBS can have deficiency of pancreatic enzymes and/or reduced effectiveness of available pancreatic enzymes, resulting in symptoms of exocrine pancreatic insufficiency (EPI), including weight loss, poor weight gain, abdominal pain, diarrhea, and fat-soluble vitamin deficiencies. Oral pancreatic enzyme replacement therapy (PERT) is often tried but is impractical for use with enteral nutrition (EN) due to inconsistent enzyme delivery and the risk of clogged feeding tubes. When used with continuous EN, oral PERT provides inadequate enzyme delivery, as its ability to hydrolyze fats decreases significantly within 30 minutes of ingestion. The significant need for therapies to improve enteral absorption in this population has led to interest in using an in-line digestive cartridge to treat EPI symptoms in SBS patients on tube feedings. The immobilized lipase cartridge is an FDA-approved in-line digestive cartridge designed to hydrolyze fat in EN before it reaches the gastrointestinal tract, allowing for the delivery of absorbable fats through continuous or bolus feeds. In patients with SBS, this modality may be more effective in improving enteral fat absorption than oral preparations of PERT. Preclinical studies have demonstrated that use of an in-line digestive cartridge in a porcine SBS model increased fat-soluble vitamin absorption, reduced PN dependence, and improved intestinal adaptation. Our study aims to evaluate changes in PN, EN, growth parameters, stool output, and fat-soluble vitamin levels in pediatric SBS patients using an in-line digestive cartridge at our center.
Methods: Single-center retrospective study in pediatric patients with SBS on EN who used an in-line immobilized lipase (RELiZORB) cartridge. Data collection included patient demographics, etiology of SBS, surgical history, PN characteristics (calories, volume, infusion hours/days), EN characteristics (tube type, bolus or continuous feeds, formula, calories, volume, hours), immobilized lipase cartridge use (#cartridges/day, duration), anthropometrics, stool output, gastrointestinal symptoms, medications (including previous PERT use), laboratory assessments (fat-soluble vitamin levels, fatty acid panels, pancreatic elastase), indication to start immobilized lipase cartridge, and any reported side effects. Patients with small intestinal transplant or cystic fibrosis were excluded.
Results: Eleven patients were included in the study (mean age 10.4 years, 55% female). The most common etiology of SBS was necrotizing enterocolitis (45%), and 7 (64%) patients were dependent on PN. Interim analysis showed a mean duration of immobilized lipase cartridge use of 3.9 months, a PN calorie decrease in 43% of patients, weight gain in 100% of patients, and improvement in stool output in 6/9 (67%) patients. Clogging of the cartridges was the most common reported technical difficulty (33%), which was overcome with better mixing of the formula. No adverse events or side effects were reported.
Conclusion: In this single-center study, use of an in-line immobilized lipase digestive cartridge in pediatric patients with SBS demonstrated promising outcomes, including weight gain, improved stool output and reduced dependence on PN. These findings suggest that in-line digestive cartridges may play a role in improving fat malabsorption and decreasing PN dependence in pediatric SBS patients. Larger multicenter studies are needed to further evaluate the efficacy, tolerability, and safety of in-line digestive cartridges in this population.
1University of Pittsburgh School of Medicine, Gibsonia, PA; 2UPMC Children's Hospital of Pittsburgh, Pittsburgh, PA; 3University of Pittsburgh School of Medicine, Pittsburgh, PA
Financial Support: National Center for Advancing Translational Sciences (KL2TR001856.)
Background: Administrative databases can provide unique perspectives in rare disease due to the ability to query multicenter data efficiently. Pediatric intestinal failure can be challenging to study due to the rarity of the condition at single centers and the previous lack of a single diagnosis code. In October 2023, a new diagnosis code for intestinal failure was added to the International Classification of Diseases, 10th revision (ICD-10). We aimed to describe the usage and limitations of this code to identify children with intestinal failure.
Methods: We performed a multicenter cross-sectional study using the Pediatric Health Information Systems database from October 1, 2023, to June 30, 2024. Children with intestinal failure were identified by ICD-10 code (K90.83). Descriptive statistics were used to characterize demographics, diagnoses, utilization, and outcomes.
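The cohort-identification step described above can be sketched with stand-in records (the actual study queried the Pediatric Health Information Systems database; the record layout and non-K90.83 codes here are hypothetical placeholders):

```python
# Flag encounters carrying the intestinal failure ICD-10 code (K90.83)
# and count unique patients; records are hypothetical stand-ins.
encounters = [
    {"patient_id": 1, "codes": ["K90.83", "other"]},
    {"patient_id": 1, "codes": ["other"]},  # readmission without the IF code
    {"patient_id": 2, "codes": ["K90.83"]},
    {"patient_id": 3, "codes": ["other"]},
]

if_encounters = [e for e in encounters if "K90.83" in e["codes"]]
if_patients = {e["patient_id"] for e in if_encounters}
print(len(if_encounters), len(if_patients))  # 2 2
```

Note how patient 1 has one flagged and one unflagged admission, mirroring the study's observation that only a fraction of these patients' encounters carried the code.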
Results: We identified 1804 inpatient encounters from 849 unique patients with a diagnosis code of intestinal failure. Figure 1 shows the trend of code use by month since its inception in October 2023. The 849 patients had a total of 7085 inpatient encounters over that timeframe, meaning only 25% of their encounters included the intestinal failure diagnosis. Among these 849 patients, 638 had at least one encounter over the timeframe in which they received parenteral nutrition; of these, 400 corresponded to an admission that also carried an intestinal failure diagnosis code. Examining only inpatient stays longer than 2 days, 592/701 (84%) patients with such a stay received parenteral nutrition. Central line-associated bloodstream infections accounted for 501 encounters. Patients spent a median of 37 days (IQR 13-96) in the hospital. Patients were predominantly non-Hispanic White (43.7%), non-Hispanic Black (14.8%), or Hispanic (27.9%). Most had government-based insurance (63.5%). Child Opportunity Index was evenly split among all five quintiles. The total standardized cost from all encounters with an intestinal failure diagnosis was $157 million; the total from all encounters with these patients was $259 million. The median cost over those 9 months was $104,890 per patient (IQR $31,149 - $315,167). Death occurred in 28 patients (3.3%) over the study period.
Conclusion: The diagnosis code of intestinal failure has been used inconsistently since implementation in October 2023, perhaps due to varying definitions of intestinal failure. Children with intestinal failure experience high inpatient stay costs and rare but significant mortality. Future work must consider the limitations of using only the new code in identifying these patients.
Figure 1. Number of Encounters With an Intestinal Failure Diagnosis Code.
1Emory University, Atlanta, GA; 2Cincinnati Children's Hospital Medical Center, Cincinnati, OH
Encore Poster
Presentation: 6th Annual Pediatric Early Career Research Conference, August 27, 2024, Health Sciences Research Building I, Emory University.
Financial Support: None Reported.
Background: The neonatal period is a time of rapid growth, and many infants who require intensive care need extra nutritional support. The Academy of Nutrition and Dietetics (AND) published an expert consensus statement to establish criteria for the identification of neonatal malnutrition. There is limited evidence regarding outcomes associated with diagnosis from these opinion-derived criteria. The objective of this study was to compare anthropometric-based malnutrition indicators with direct body composition measurements in infancy.
Methods: Air displacement plethysmography is considered the gold standard for non-invasive body composition measurement, and this was incorporated into routine clinical care at a referral Level IV neonatal intensive care unit. Late preterm and term infants (34-42 weeks gestational age) with body composition measurement available were included in this study. Infants were categorized as having malnutrition per AND criteria. Fat mass, fat-free mass, and body fat percentage z-scores were determined per Norris body composition growth curves. Logistic regression was conducted to ascertain the relationship of fat mass, fat-free mass, and body fat percentage with malnutrition diagnosis. Linear regression was performed to predict body mass index (BMI) at age 18-24 months from each body composition variable.
Results: Eighty-four infants were included; 39% were female and 96% were singletons (Table 1). Fifteen percent were small for gestational age and 12% were large for gestational age at birth. Nearly half had a congenital intestinal anomaly, including gastroschisis and intestinal atresia. Sixty-three percent of the group met at least one malnutrition criterion. Fat-free mass Z-score was negatively associated with a malnutrition diagnosis, with an odds ratio of 0.77 (95% CI 0.59-0.99, p < 0.05). There was no statistically significant association between malnutrition diagnosis and body fat percentage or fat mass. There was no statistically significant relationship between any body composition variable and BMI at 18-24 months, even after removing outliers with a high Cook's distance.
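The reported odds ratio maps directly to the logistic-regression coefficient via OR = exp(β); a small sketch of that interpretation (the 0.77 value comes from the results above; the rest is arithmetic):

```python
import math

# OR = exp(beta): an odds ratio of 0.77 per 1-unit increase in fat-free
# mass Z-score means each additional unit multiplies the odds of a
# malnutrition diagnosis by 0.77 (a 23% reduction in odds).
beta = math.log(0.77)

def odds_multiplier(delta_z):
    """Multiplicative change in the odds for a delta_z change in Z-score."""
    return math.exp(beta * delta_z)

print(round(odds_multiplier(1), 2))  # 0.77
print(round(odds_multiplier(2), 2))  # 0.59
```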
Conclusion: Malnutrition diagnosis is associated with low fat-free mass in critically ill term infants. Body composition is not a predictor of later BMI in this small study.
Table 1. Characteristics of Late Preterm and Term Infants in the Neonatal Intensive Care Unit With a Body Composition Measurement Performed Via Air Displacement Plethysmography.
John Stutts, MD, MPH1; Yong Choe, MAS1
1Abbott, Columbus, OH
Financial Support: Abbott.
Background: The prevalence of obesity in children is rising. Despite awareness and work toward weight reduction, less is known about malnutrition in children with obesity. The purpose of this study was to evaluate the prevalence of obesity in U.S. children and determine which combination of indicators best defines malnutrition in this population.
Methods: The 2017-2018 National Health and Nutrition Examination Survey (NHANES) database was utilized to assess biomarkers (the most recent complete dataset due to the COVID-19 pandemic). Trends in prevalence were obtained from 2013-2018 survey data. Obesity was defined as ≥ 95th percentile of the CDC sex-specific BMI-for-age growth charts. The cohort age range was 12-18 years. Nutrient intake and serum levels were analyzed for vitamin D, vitamin E, vitamin C, potassium, calcium, vitamin B9, vitamin A, total protein, albumin, globulin, high sensitivity C-reactive protein (hs-CRP), iron, hemoglobin, and mean corpuscular volume (MCV). Intake levels of fiber were also analyzed. Children consuming supplements were excluded. Categorical and continuous data analyses were performed using SAS® Proc SURVEYFREQ (with Wald chi-square test) and Proc SURVEYMEANS (with t-test), respectively, in SAS® Version 9.4 and SAS® Enterprise Guide Version 8.3. Hypothesis tests were performed using 2-sided, 0.05-level tests. Results were reported as mean ± standard error (SE) (n = survey sample size) or percent ± SE (n).
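The point estimates behind the survey procedures above reduce to a sample-weighted mean; a minimal sketch (values and weights are hypothetical, and real NHANES analyses also use strata and PSU information to compute the standard errors, which this omits):

```python
def weighted_mean(values, weights):
    """Survey-weighted point estimate: sum(w * x) / sum(w)."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

# hypothetical serum vitamin D values (nmol/L) with NHANES-style sample weights
print(weighted_mean([50.0, 60.0, 70.0], [2.0, 1.0, 1.0]))  # 57.5
```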
Results: The prevalence of obesity in the cohort was 21.3% ± 1.3 (993). Prevalence trended upward annually: 20.3% ± 2.1 (1232) in 2013-2014 and 20.5% ± 2.0 (1129) in 2015-2016. When compared with children without obesity, the mean serum levels of those with obesity were significantly (p ≤ 0.05) lower for vitamin D (50.3 ± 2.4 vs. 61.5 ± 2.1, p < 0.001), iron (14.4 ± 0.5 vs. 16.4 ± 0.4, p < 0.001), albumin (41.7 ± 0.3 vs. 43.3 ± 0.3, p < 0.001), and MCV (83.8 ± 0.5 vs. 85.8 ± 0.3, p = 0.003). When compared with children without obesity, the mean serum levels of those with obesity were significantly (p ≤ 0.05) higher for total protein (73.2 ± 0.4 vs. 72.0 ± 0.3, p = 0.002), globulin (31.5 ± 0.4 vs. 28.7 ± 0.3, p < 0.001), and hs-CRP (3.5 ± 0.3 vs. 1.2 ± 0.2, p < 0.001). A higher prevalence of insufficiency was found for vitamin D (51.9% ± 5.6 vs. 26.8% ± 3.7, p = 0.001), hemoglobin (16.3% ± 3.1 vs. 7.5% ± 1.8, p = 0.034), and the combination of low hemoglobin + low MCV (11.2% ± 2.9 vs. 3.3% ± 1.0, p = 0.049). All other serum levels were not significantly (p > 0.05) different, with no significant difference in intake.
Conclusion: Results indicate a continued increase in the prevalence of obesity in children. Compared with the non-obese pediatric population, children with obesity had different micro- and macronutrient serum levels despite no significant differences in dietary intake of these nutrients. The higher prevalence of low hemoglobin + low MCV supports iron deficiency and adds clinical relevance to the low mean blood iron levels. Children with obesity showed higher mean globulin and hs-CRP levels, consistent with an inflammatory state. The results underscore the existence of malnutrition in children with obesity and the need for nutrition awareness in this pediatric population.
1Reckitt/Mead Johnson, Evansville, IN; 2Data Minded Consulting, LLC, Houston, TX; 3Reckitt/Mead Johnson Nutrition, Henderson, KY; 4Reckitt/Mead Johnson Nutrition, Manchester, England; 5Reckitt/Mead Johnson Nutrition, Newburgh, IN
Financial Support: None Reported.
Background: The objective was to examine whether nutrient intake varied across malnutrition classification among a nationally representative sample of children and adolescents.
Methods: This was a secondary analysis of children and adolescents 1-18 y who participated in the National Health and Nutrition Examination Survey 2001-March 2020. Participants were excluded if they were pregnant or did not provide at least one reliable dietary recall. The degree of malnutrition risk was assessed by weight-for-height and BMI-for-age Z-scores for 1-2 y and 3-18 y, respectively. As per the Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition, malnutrition was classified using Z-scores: none (Z > -1), mild (Z between -1 and -1.9), and moderate/severe (Z ≤ -2). Dietary intake was assessed from one to two 24-hr dietary recalls. Usual intakes of macronutrients from foods and beverages, and of micronutrients from foods, beverages, and supplements were assessed using the National Cancer Institute method. The cut-point approach was used to estimate the proportion of children with intake below the estimated average requirement for micronutrients and outside the acceptable macronutrient distribution range for macronutrients. All analyses were adjusted for sampling methodology and the appropriate sample weights were applied. Independent samples t-tests were conducted to compare estimates within age groups between nutrition status classifications, with no malnutrition as the reference group.
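The classification and cut-point steps in the Methods can be sketched directly (the Z-score thresholds come from the text; the intake values and EAR are hypothetical placeholders):

```python
def malnutrition_class(z):
    """AND/ASPEN-style classification from the Methods:
    none (Z > -1), mild (Z between -1 and -1.9), moderate/severe (Z <= -2)."""
    if z > -1:
        return "none"
    if z > -2:
        return "mild"
    return "moderate/severe"

def cut_point_proportion(usual_intakes, ear):
    """Cut-point approach: proportion of children whose usual intake
    falls below the estimated average requirement (EAR)."""
    return sum(1 for x in usual_intakes if x < ear) / len(usual_intakes)

print(malnutrition_class(-1.5))                            # mild
print(cut_point_proportion([800, 1100, 600, 1300], 1000))  # 0.5
```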
Results: A total of 32,188 participants were analyzed. Of those, 31,689 (98.4%) provided anthropometrics. The majority (91%) did not meet criteria for malnutrition, while 7.4% and 1.6% met criteria for mild and moderate/severe malnutrition, respectively. Dietary supplement use (mean[SE]) was reported among 33.3[0.7]% of all children/adolescents. Of those experiencing moderate/severe malnutrition, inadequate calcium intake was greatest in adolescents 14-18 y (70.6[3.3]%), and older children 9-13 y (69.7[3.4]%), compared to 3-8 y (29.5[3.9]%) and 1-2 y (5.0[1.2]%). In children 9-13 y, risk for inadequate calcium intake was greater among those experiencing mild (65.3[2.1]%) and moderate/severe malnutrition (69.7[2.1]%) compared to those not experiencing malnutrition (61.2[1.1]%). Similarly, in those experiencing moderate/severe malnutrition, inadequate zinc and phosphorus intake was greatest in adolescents 14-18 y (20.9[3.3]% and 30.6[3.4]%, respectively) and 9-13 y (13.4[2.6]% and 33.9[3.6]%, respectively) compared to younger children (range 0.2[0.1]-0.9[0.4]% and 0.2[0.1]-0.4[0.2]% in 1-8 y, respectively). Although the greatest risk for inadequate protein intake was observed among those experiencing moderate/severe malnutrition, percent energy intake from protein and carbohydrate was adequate for most children and adolescents. Total and saturated fat were consumed in excess by all age groups regardless of nutrition status. The greatest risk for excessive saturated fat intake was reported in those experiencing moderate/severe malnutrition (range across all age groups: 85.9-95.1%).
Conclusion: Older children and adolescents experiencing malnutrition were at greatest risk for inadequate intakes of certain micronutrients such as calcium, zinc, and phosphorus. These results may indicate poor diet quality among those at greatest risk for malnutrition, especially adolescents.
Anna Benson, DO1; Louis Martin, PhD2; Katie Huff, MD, MS2
1Indiana University School of Medicine, Carmel, IN; 2Indiana University School of Medicine, Indianapolis, IN
Financial Support: None Reported.
Background: Trace metals are essential for growth and are especially important in the neonatal period. Recommendations are available for neonatal intake of these trace metals; however, they are based on limited data, and there are few available descriptions of trace metal levels in neonates and their influence on outcomes. In addition, monitoring trace metal levels can be difficult, as multiple factors, including inflammation, can affect accuracy. The goal of this project was to evaluate serum levels of zinc, selenium, and copper and related outcomes, including growth and rates of cholestasis, sepsis, bronchopulmonary dysplasia (BPD), and death, in a parenterally dependent cohort admitted to the neonatal intensive care unit (NICU).
Methods: We completed a retrospective chart review of NICU patients who received parenteral nutrition (PN) and had trace metal panels drawn between January 2016 and February 2023. Charts were reviewed for baseline labs, time on PN, trace metal panel level, dose of trace metals in PN, enteral feeds and supplements, and outcomes including morbidities and mortality. Sepsis was diagnosed based on positive blood culture and cholestasis as a direct bilirubin >2 mg/dL. Fisher's Exact Test or Chi square were used to assess association between categorical variables. Spearman correlation was used to assess the correlation between two continuous variables. A p-value of 0.05 was used for significance.
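The Spearman correlation used here (e.g., selenium versus direct bilirubin) is the Pearson correlation of the rank vectors. A minimal pure-Python sketch with hypothetical values; the study's actual data appear in Figure 2, and these numbers are illustrative only.

```python
def ranks(values):
    """Rank values from 1..n, averaging ranks across ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    out = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg_rank = (i + j) / 2 + 1          # ranks are 1-based
        for k in range(i, j + 1):
            out[order[k]] = avg_rank
        i = j + 1
    return out

def spearman_rho(x, y):
    """Spearman's rho: Pearson correlation computed on the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical selenium (mcg/L) and direct bilirubin (mg/dL) pairs:
selenium = [40, 55, 62, 70, 85, 90]
bilirubin = [4.1, 3.0, 2.6, 2.1, 1.2, 0.9]
```

A strictly decreasing relationship such as the hypothetical pairs above yields rho = -1, mirroring in exaggerated form the negative selenium-bilirubin correlation reported below.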
Results: We included 98 patients in the study, with demographic data shown in Table 1. The numbers of patients with trace metal elevation and deficiency are noted in Tables 1 and 2, respectively. The correlation between growth and trace metal levels is shown in Figure 1. Patient outcomes related to the diagnosis of trace metal deficiency are noted in Table 2. Copper deficiency was significantly associated with sepsis (p = 0.010) and BPD (p = 0.033). Selenium deficiency was associated with cholestasis (p = 0.001) and BPD (p = 0.003). To further assess the relation of selenium and cholestasis, Spearman correlation showed a significant negative correlation between selenium levels and direct bilirubin levels (p = 0.002; Figure 2).
Conclusion: Trace metal deficiency was common in our population. In addition, selenium and copper deficiency were associated with neonatal morbidities including sepsis, cholestasis, and BPD. When assessing selenium deficiency and cholestasis, the measured selenium level was found to correlate with the direct bilirubin level. While there was correlation between trace metal levels and growth, the negative association noted is unclear and highlights the need for further assessment to determine the influence of other patient factors and the technique used for growth measurement. Overall, this project highlights the important relation between trace metals and neonatal morbidities. Further research is needed to understand how trace metal supplementation might be used to optimize neonatal outcomes in the future.
Table 1. Patient Demographic and Outcome Information for Entire Population Having Trace Metal Levels Obtained. (Total population n = 98 unless noted).
Table 2. Rate of Trace Metal Deficiency and Association With Patient Outcomes.
(Total n = 98).
Scatter plot of average trace metal level and change in growth over time (growth represented as change in parameter over days). The Spearman correlation coefficient is listed for each graph; significance is noted by symbol: *p-value < 0.05, †p-value < 0.01, ‡p-value < 0.001.
Figure 1. Correlation of Trace Metal Level and Growth.
Scatter plot of individual direct bilirubin levels plotted against selenium levels. The Spearman correlation coefficient indicated a negative correlation (p = 0.002).
Figure 2. Correlation of Selenium Level With Direct Bilirubin Level.
1BC Children's Hospital, North Vancouver, BC; 2BCCHR, Vancouver, BC; 3University of British Columbia, Vancouver, BC; 4UBC/BCCHR, Vancouver, BC
Financial Support: None Reported.
Background: Pediatric critical illness increases demand for several nutrients. Children admitted requiring nutrition support receive enteral nutrition (EN) formula as liquid nutrition through a nasogastric tube. These formulas are developed using dietary reference intakes (DRI) for healthy populations and do not account for altered nutrient needs in critical illness. Imbalanced nutrient delivery during this vulnerable time could lead to malnutrition and negatively impact hospital outcomes. Guidelines published by the American Society for Parenteral and Enteral Nutrition (ASPEN) in 2017 aim to alleviate poor nutrition: use concentrated formula during fluid restriction, meet 2/3 of estimated calories (resting energy expenditure, REE) by the end of the first week, provide a minimum of 1.5 g/kg/day of dietary protein, and provide the DRI for each micronutrient. The objective of this retrospective cohort study was to evaluate nutrition delivery (prescribed vs. delivered) against the 2017 guidelines and correlate it with adequacy in children admitted to a Canadian PICU.
Methods: Three years of charts were included across two retrospective cohorts: September 2018 – December 2020 and February 2022 – March 2023. The first cohort, based on paper charts, included children 1-18 y with tube feeding started within 3 d of admission. The second cohort, after transition to electronic medical records, included children 1-6 y on exclusive tube feeding during the first week of admission. Patient characteristics, daily formula type, rate prescribed (physician order), amount delivered (nursing notes), and interruption hours and reasons were collected. Statistical analysis included descriptive analysis of characteristics and logistic regression for the odds of achieving adequate intake with two exposures: age category and formula type. Pearson correlation was used to relate interruption hours to the percentage of calories met.
Results: The included patients (n = 86) spanned 458 nutrition support days (NSD). Admissions were predominantly for respiratory disease (73%), requiring ventilation support (81.4%). Calorie prescription (WHO REE equation) was met on 20.3% of NSD, and 43.9% met 2/3 of the calorie recommendation (Table 1). Concentrated calories were provided in 34% of patients. Interruption hours and percentage of goal calories met were negatively correlated (r = -.52, p = .002) among those ordered EN without prior EN history (i.e., home tube fed). Patients with more than 4 h of interruptions were more likely to miss the 2/3 calorie goal. Calorie goals were met when standard pediatric formula was used in children 1-8 y and concentrated adult formula in 9-18 y. The odds of meeting the calorie goal increased by 85% per 1-day increase (OR 1.85 [1.52, 2.26], p < .0001), with a median of 4 d after admission to meet 2/3 of calories. Minimum protein intake (1.5 g/kg/d) was met on only 24.9% of all NSD. The micronutrients examined, except vitamin D, met the DRI based on prescribed amounts. Delivered amounts provided suboptimal micronutrient intake, especially for vitamin D (Figure 1).
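On the odds scale, a per-day odds ratio compounds multiplicatively, which is how the reported OR of 1.85 per 1-day increase translates across several days. A small illustrative helper, not taken from the study's analysis:

```python
def cumulative_odds_ratio(or_per_day: float, days: int) -> float:
    """Cumulative multiplier on the odds of meeting the calorie goal
    after `days` additional days, given a constant per-day odds ratio."""
    return or_per_day ** days
```

For example, under the reported OR of 1.85, four additional days multiply the odds of meeting the calorie goal by roughly 11.7.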
Conclusion: Current ASPEN recommendations for EN were not being achieved in critically ill children. Concentrated formula was often not chosen, which decreased the ability to meet calorie goals in younger patients. Prescribing shorter continuous EN duration (20/24 h) may improve the odds of meeting calorie targets. Evaluation of NSD showed an improving trend in calorie intake over the first week toward the 2/3 goal recommendation. However, the results highlight inadequate protein intake even as calorie needs are increasingly met. Attention to micronutrient delivery, with supplementation of vitamin D, is required.
Table 1. Enteral Nutrition Characteristics Per Nutrition Support Days. (N = 458)
Estimated Vitamin D Intake and 95% Confidence Intervals by Age and Formula Groups.
Figure 1. Estimated Vitamin D Intake by Age and Formula Groups.
Dana Steien, MD1; Megan Thorvilson, MD1; Erin Alexander, MD1; Molissa Hager, NP1; Andrea Armellino, RDN1
1Mayo Clinic, Rochester, MN
Financial Support: None Reported.
Background: Home parenteral nutrition (HPN) is a life-sustaining therapy for children with long-term digestive dysfunction. Historically, HPN has been considered a bridge to enteral autonomy or intestinal transplantation. However, thanks to improvements in medical management, HPN is now used for a variety of diagnoses, including intractable feeding intolerance (IFI) in children with severe neurological impairment (SNI). IFI occurs most often near the end of life (EOL) in patients with SNI. Thus, outpatient planning and preparation for HPN in this population vastly differ from historical HPN use.
Methods: Case series of four pediatric patients with SNI who developed IFI and utilized HPN during their EOL care. Data were collected by retrospective chart review. The hospital pediatric palliative care service was heavily involved in the patients’ care when HPN was discussed and planned. The pediatric intestinal rehabilitation (PIR) and palliative care teams worked closely together during discharge planning and throughout the outpatient courses.
Results: The children with SNI in this case series developed IFI between ages 1 and 12 years. Duration of HPN use varied from 5 weeks to 2 years. All patients were enrolled in hospice, but at various stages. Routine outpatient HPN management plans and expectations were modified based on each family's goals of EOL care. The use and timing of laboratory studies, fever plans, central line issues, growth, and follow-up appointments required detailed discussion and planning.
Conclusion: EOL care for children differs from most EOL care in adults. Providing HPN to children with SNI and IFI can provide time, opportunities, and peace for families during their child's EOL journey, if it aligns with their EOL goals. PIR teams can provide valuable HPN expertise for palliative care services and families during these challenging times.
1Nutricia North America, Roseville, CA; 2Nutricia North America, Rockville, MD; 3Nutricia North America, Greenville, NC
Financial Support: This study was conducted by Nutricia North America.
Background: Extensively hydrolyzed formulas (eHFs) are indicated for the management of cow milk allergy (CMA) and related symptoms. This category of formulas is often associated with an unpleasant smell and bitter taste; however, whey-based eHFs are considered more palatable than casein-based eHFs.1-4 The inclusion of lactose, the primary carbohydrate in human milk, in hypoallergenic formula can also promote palatability. Historically, concern about residual protein traces in lactose has resulted in complete avoidance of lactose in CMA. However, “adverse reactions to lactose in CMA are not supported in the literature, and complete avoidance of lactose in CMA is not warranted.”5 Clinicians in the United Kingdom previously reported that taste and acceptance of an eHF is an important consideration when prescribing, as they believe a palatable eHF may result in decreased formula refusal and lead to more content families.1 The objective of this study was to understand caregiver sensory perspectives on an infant whey-based eHF containing lactose.
Methods: Fifteen clinical sites were recruited from across the United States. Clinicians enrolled 132 infants, whose families received the whey based eHF for 2 weeks, based on clinician recommendation. Caregivers completed two surveys: an enrollment survey and 2-week-post-survey characterizing eHF intake, CMA related symptoms, stooling patterns, sensory perspectives and satisfaction with eHF. Data was analyzed using SPSS 27 and descriptive statistics.
Results: One hundred and twenty-two infants completed the study. At enrollment, infants were 22 (±14.7) weeks old. Prior to study initiation, 12.3% of infants were breastfed, 40.2% were on a casein-based eHF, and 18.9% were on a standard formula with intact proteins. Most patients (97.5%) were fed orally and 2.5% were tube fed. Among all parents who responded, 92.5% (n = 86/93) reported better taste and 88.9% (n = 96/108) reported better smell for the whey-based formula containing lactose compared to the previous formula. For caregivers whose child was on a casein-based eHF at enrollment and who responded, 97.6% (n = 41/42) reported better taste and 95.7% (n = 44/46) reported better smell than the previous formula. Additional caregiver-reported perceptions of taste and smell are reported in Figure 1 and Figure 2, respectively. Finally, 89.3% of caregivers said it was easy to start their child on the whey-based eHF containing lactose and 91.8% would recommend it to other caregivers whose child requires a hypoallergenic formula.
Conclusion: The majority of caregivers had a positive sensory experience with the whey-based eHF containing lactose compared to the baseline formulas. Additionally, they found the trial formula easy to transition to and would recommend it to other families. These data support the findings of Maslin et al. and the clinicians' expectation that good palatability would result in better acceptance and more content infants and families.1 Further research is needed to better understand how improved palatability can contribute to decreased waste and health care costs.
Figure 1. Caregiver Ranking: Taste of Whey-Based, Lactose-Containing eHF.
Figure 2. Caregiver Ranking: Smell of Whey-Based, Lactose-Containing eHF.
Background: There is limited information on hyperkalemia in adult patients who received trimethoprim-sulfamethoxazole (TMP-SMX). The mechanism of hyperkalemia is related to the TMP component, which is structurally related to the potassium-sparing diuretic amiloride. In children, there is no information on the clinical impact or monitoring required. We noted that a pediatric patient on total parenteral nutrition (TPN) had a drop in TPN potassium dosing once TMP-SMX was started, and this reduction persisted for two weeks following the last dose of the antibiotic. Case presentation: A 7-month-old was in a cardiac intensive care unit following complex surgical procedures and a requirement for extracorporeal membrane oxygenation. TPN was started due to concerns related to poor perfusion and possible necrotizing enterocolitis and continued for a total of one hundred and ten days. Per protocol while on TPN, electrolytes and renal function (urine output, serum creatinine) were monitored daily. Diuretic therapy, including loop diuretics, chlorothiazide, and spironolactone, was prescribed prior to TPN and continued for the entire duration. Renal function remained stable throughout TPN therapy. Potassium dosing in the TPN was initiated per ASPEN guidelines and adjusted for serum potassium levels. Due to respiratory requirements and positive cultures, TMP-SMX was added to the medication regimen on two separate occasions: TMP-SMX 15 mg/kg/day was ordered twelve days after the start of TPN and continued for three days, and TMP-SMX 15 mg/kg/day was again started on day forty-three of TPN and continued for a five-day duration. Serum potassium was closely monitored for adjustments once TMP-SMX was started. When the TPN was successfully weaned off, we reviewed this information again.
TPN potassium dosing dropped noticeably by day two of both TMP-SMX regimens and did not return to the prior stable dosing until approximately two weeks after the last dose of the antibiotic. This reduction lasted far beyond the projected half-life of TMP-SMX (Table 1). Discussion: TMP-SMX is known to cause hyperkalemia in adult patients with multiple confounding factors, including high dosage, renal dysfunction, congestive heart failure, and concomitant medications known to cause hyperkalemia. Little literature exists on this side effect in pediatrics. The onset of our patient's increased serum potassium levels, and the concurrent decrease in TPN dosing, could be expected, as TMP-SMX's time to peak effect is 1-4 hours. The half-life of TMP in children < 2 years old is 5.9 hours; given this, one would expect TMP-SMX to be cleared approximately thirty hours after the last dose administered. Our patient's potassium dosing took approximately two weeks from the end of TMP-SMX administration to return to the pre-TMP-SMX dosing for both treatment regimens. Potential causes for the extended time to stabilize include concurrent high-dose TMP-SMX and continuation of the potassium-sparing diuretic. Prolonged potassium monitoring should be considered for pediatric patients started on high-dose TMP-SMX while on TPN, and further evaluation explored.
Graph representing TPN Potassium dose (in mEq/kg/day), and addition of TMP-SMX regimen on two separate occasions. Noted drop in TPN potassium dose and delayed return after each TMP-SMX regimen.
Figure 1. TPN Potassium Dose and TMP-SMX Addition.
Financial Support: North American Society for Pediatric Gastroenterology, Hepatology, and Nutrition (NASPGHAN) Foundation.
Background: The prevalence of Avoidant/Restrictive Food Intake Disorder (ARFID) is approximately 3% in the general population and 10% in adults with inflammatory bowel disease (IBD). Up to 10% of adolescents with IBD have reported disordered eating behaviors; however, there have been no prospective studies on the prevalence of ARFID eating behaviors in this population.
Methods: This is a one-time, cross-sectional, non-consecutive study of English-speaking patients with a confirmed diagnosis of IBD, aged 12-18 years. Participants completed the validated Nine Item ARFID Screen (NIAS), the SCOFF eating disorder screen (this screen utilizes an acronym [Sick, Control, One, Fat, and Food] in relation to the five questions on the screen) and answered one question about perceived food intolerances. The NIAS is organized into the three specific ARFID domains: eating restriction due to picky eating, poor appetite/limited interest in eating, and fear of negative consequences from eating, each of which is addressed by three questions. Questions are based on a 6-point Likert scale. Participants scoring ≥23 on the total scale or ≥12 on an individual subscale were considered to meet criteria for ARFID eating behaviors. Since some individuals with a positive NIAS screen may have anorexia nervosa or bulimia, we also used the SCOFF questionnaire to assess the possible presence of an eating disorder. A score of two or more positive answers has a sensitivity of 100% and specificity of 90% for anorexia nervosa or bulimia. Apart from descriptive statistics, Chi-square testing was used to study the prevalence of malnutrition, positive SCOFF screening, or food intolerances in patients with and without positive NIAS screens.
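The NIAS decision rule described above (total score ≥ 23, or ≥ 12 on any three-item subscale) can be sketched as follows. This is an illustration of the stated cut-offs, not the study's scoring code, and 0-5 scoring per item is an assumption about how the 6-point Likert scale was coded.

```python
def nias_positive(item_scores: list) -> bool:
    """Apply the NIAS cut-offs described above: 9 items (assumed 0-5 each),
    grouped into picky eating, appetite, and fear subscales of 3 items.
    Positive if the total is >= 23 or any subscale is >= 12."""
    if len(item_scores) != 9:
        raise ValueError("The NIAS has 9 items")
    subscales = [sum(item_scores[i:i + 3]) for i in (0, 3, 6)]
    return sum(item_scores) >= 23 or max(subscales) >= 12
```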
Results: We enrolled 82 patients whose demographics are shown in Table 1. Twenty percent (16/82) scored positive on the NIAS questionnaire, 16% (13/82) scored positive on the SCOFF questionnaire, and 48% (39/82) noted food intolerances. Of the 16 participants who scored positive on the NIAS, 50% (8/16) were male, 56% (9/16) had a diagnosis of Crohn's Disease, and 69% (11/16) had inactive disease. Twenty-five percent of those with a positive NIAS (4/16) met criteria for malnutrition (1 mild, 2 moderate, and 1 severe). Sixty-nine percent of those who scored positive on the NIAS (11/16) noted food intolerances and 30% (5/16) had a positive SCOFF screener. The prevalence of malnutrition (p = 0.4), positive SCOFF eating disorder screens (p = 0.3), and reported food intolerances (p = 0.6) was similar between participants who did and did not score positive on the NIAS.
Conclusion: Using the NIAS, 20% of adolescents with IBD met criteria for ARFID. Participants were no more likely to have malnutrition, a positive score on the SCOFF eating disorder screen, or reported food intolerances whether or not they met criteria for ARFID. Routine screening of adolescents with IBD for ARFID or other eating disorders may identify patients who would benefit from further evaluation.
Table 1. Demographics.
Qian Wen Sng, RN1; Jacqueline Soo May Ong2; Sin Wee Loh, MB BCh BAO (Ireland), MMed (Paed) (Spore), MRCPCH (RCPCH, UK)1; Joel Kian Boon Lim, MBBS, MRCPCH, MMed (Paeds)1; Li Jia Fan, MBBS, MMed (Paeds), MRCPCH (UK)3; Rehena Sultana4; Chengsi Ong, BS (Dietician), MS (Nutrition and Public Health), PhD1; Charlotte Lin3; Judith Ju Ming Wong, MB BCh BAO, LRCP & SI (Ireland), MRCPCH (Paeds) (RCPCH, UK)1; Ryan Richard Taylor3; Elaine Hor2; Pei Fen Poh, MSc (Nursing), BSN1; Priscilla Cheng2; Jan Hau Lee, MCI, MRCPCH (UK), USMLE, MBBS1
1KK Hospital, Singapore; 2National University Hospital, Singapore; 3National University Hospital Singapore, Singapore; 4Duke-NUS Graduate Medical School, Singapore
Financial Support: This work is supported by the National Medical Research Council, Ministry of Health, Singapore.
Background: Protein-energy malnutrition is pervasive in pediatric intensive care unit (PICU) patients. There remains clinical equipoise on the impact of protein supplementation in critically ill children. Our primary aim was to determine the feasibility of conducting a randomized controlled trial (RCT) of protein supplementation versus standard enteral nutrition (EN) in the PICU.
Methods: An open-labelled pilot RCT was conducted from January 2021 to June 2024 in 2 tertiary pediatric centers in Singapore. Children with body mass index (BMI) z-score < 0 who were expected to require invasive or non-invasive mechanical ventilation for at least 48 hours and required EN support for feeding were included. Patients were randomized (1:1 allocation) to protein supplementation of ≥1.5 g/kg/day in addition to standard EN, or standard EN alone, for 7 days after enrolment or until discharge to the high dependency unit, whichever was earlier. Feasibility was based on 4 outcomes: effective screening (>80% of eligible patients approached for consent), satisfactory enrolment (>1 patient/center/month), timely protocol implementation (>80% of participants receiving protein supplementation within the first 72 hours), and protocol adherence (receiving >80% of protein supplementation as per protocol).
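The four pre-specified feasibility thresholds can be checked mechanically. A sketch with hypothetical argument names (rates as fractions, enrolment as patients/center/month); this is illustrative only, not the trial's analysis code.

```python
def feasibility_met(screening_rate, enrolment_rate, timely_rate, adherence_rate):
    """Check the four pre-specified feasibility outcomes described above."""
    return {
        "effective_screening": screening_rate > 0.80,    # >80% eligible approached
        "satisfactory_enrolment": enrolment_rate > 1.0,  # >1 patient/center/month
        "timely_implementation": timely_rate > 0.80,     # supplemented within 72 h
        "protocol_adherence": adherence_rate > 0.80,     # >80% of protocol doses
    }
```

Plugging in the rates reported below (50/74, 0.45, 15/20, 11/15) returns False for all four outcomes, consistent with the trial's conclusion.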
Results: A total of 20 patients were recruited: 10 (50.0%) in the protein supplementation group and 10 (50.0%) in the standard EN group. Median age was 13.0 [interquartile range (IQR) 4.2, 49.1] months. Respiratory distress was the most common reason for PICU admission [11 (55.0%)]. Median PICU and hospital lengths of stay were 8.0 (IQR 4.5, 16.5) and 19.0 (IQR 11.5, 36.5) days, respectively. There were 3 (15%) deaths, none related to the trial intervention. The screening rate was 50/74 (67.6%). Mean enrollment was 0.45 patients/center/month. Timely protocol implementation was achieved in 15/20 (75%) participants. Protocol adherence was achieved on 11/15 (73.3%) of protein supplementation days.
Conclusion: Satisfactory feasibility outcomes were not met in this pilot RCT. Given the inclusion criteria of this pilot study and the setup of the centers, a larger study in Singapore alone will not be feasible. With revised logistical arrangements incorporated, a larger multi-center feasibility study involving regional countries should be piloted.
Veronica Urbik, MD1; Kera McNelis, MD1
1Emory University, Atlanta, GA
Financial Support: None Reported.
Background: Many clinical management and physiologic knowledge gaps are present in the care of the tiny baby population (neonates born at ≤23 weeks gestation, also labelled periviable gestational age). There is therefore significant center-specific variation in practice, as well as increased morbidity and mortality in infants born at 22-23 weeks compared to those born at later gestational ages.1 The importance of nutrition applies to tiny babies, and appropriate nutrition is perhaps even more critical in this population than in others. Numerous studies have demonstrated the benefits of full early enteral feeding, including decreased rates of central line-associated bloodstream infection and cholestasis.2,3 The risk of developing necrotizing enterocolitis (NEC) is a balancing measure for the advancement of enteral feeds.4 Adequate nutrition is critical for growth, reducing morbidity and mortality, and improving overall outcomes. Current proposed protocols for this population target full enteral feeding volumes by 10-14 days of life.5
Methods: Baseline data collected at two Level III neonatal intensive care units (NICUs) attended by a single group of academic neonatology faculty from January 2020 – January 2024 showed that the average time from birth to achievement of full enteral feeds was 31 days. Using quality improvement (QI) methodology, we identified barriers to advancement to full enteral feeds (defined as 120 cc/kg/day) in babies born at 22 and 23 weeks gestational age admitted to the pediatric resident-staffed Level III NICU.
Results: The Pareto chart displays the primary barriers, including undefined critical illness, vasopressor use, evaluation for NEC or spontaneous intestinal perforation, patent ductus arteriosus treatment, and electrolyte derangements (Figure 1). In many cases, no specific reason for not advancing toward full enteral feeds could be identified on chart review.
Conclusion: In this ongoing QI project, our SMART aim is to reduce the number of days to reach full feeds by 10% over the period January 2024 – June 2025, informed by root cause analysis and the key driver diagram developed from the data thus far (Figure 2). The first plan-do-study-act cycle started on January 16, 2024. Data are analyzed using statistical process control methods.
Financial Support: Some investigators received support from agencies including National Institutes of Health and NASPGHAN which did not directly fund this project.
Background: The widespread shortage of amino acid-based formula in February 2022 highlighted the need for urgent and coordinated hospital response to disseminate accurate information to front-line staff in both inpatient and ambulatory settings.
Methods: An interdisciplinary working group consisting of pediatric gastroenterologists, dietitians, nurses and a quality improvement analyst was established in September 2022. The group met at regular intervals to conduct a needs assessment for all disciplines. The group developed and refined a novel clinical process algorithm to respond to reports of formula shortages and/or recalls. The Clinical Process Map is presented in Figure 1. Plan-do-study-act cycles were implemented to improve the quality of the communication output based on staff feedback. The key performance indicator is time from notification of possible shortage to dissemination of communication to stakeholders, with a goal of < 24 hours.
Results: From September 2022 to August 2024, the group met 18 times for unplanned responses to formula recall/shortage events. Email communication was disseminated within 24 hours for 8/18 (44%) events, within 48 hours for 9/18 (50%), and after 48 hours for 1/18 (6%). Iterative changes included initiation of an urgent huddle for key stakeholders to identify impact and substitution options; development of a preliminary investigation pathway to ensure the validity of reports; development of a structured email format, further refined to a table format including product images (Figure 2); and creation of an email distribution list to disseminate shortage reports. The keys to this project's success were the creation of a multidisciplinary team dedicated to meeting urgently for all events and the real-time drafting and approval of communication within the meeting. Of note, the one communication that was substantially delayed (94.7 hours) was addressed over email only, underscoring the importance of the multidisciplinary working meeting.
Conclusion: Establishing clear lines of communication and assembling key stakeholders resulted in timely, accurate and coordinated communication regarding nutrition recalls/shortage events at our institution.
Background: Early introduction of peanuts and eggs decreases the incidence of peanut and egg allergies in infants at high risk of developing food allergies. Prevention guidelines and clinical practice have shifted with recent studies. However, little is known about introducing common food allergens in infants fed via enteral feeding tubes. Early introduction of allergens could be important for infants working toward a tube feeding wean and those who may benefit from blended tube feeding in the future. We aimed to compare the characteristics of patients with enteral tubes who received education during their gastroenterology visit versus those who did not.
Methods: We performed a single-center retrospective chart review of all patients aged 4 to 24 months with an enteral feeding tube seen at the University of South Florida Pediatric Gastroenterology clinic from August 2020 to July 2024. Corrected age was used for infants born < 37 weeks’ gestational age. All types of enteral access were included (i.e., nasogastric, gastrostomy, and gastrostomy-jejunostomy tubes). Data on demographics, clinical characteristics, and parent-reported food allergen exposure were collected. An exemption waiver was received from the University of South Florida Institutional Review Board for this retrospective chart review. Differences between patients who received education and those who did not were evaluated using Student's t-test for continuous variables and the chi-square test for categorical variables. All analyses were performed using R Statistical Software (v4.4.2). A p-value ≤ 0.05 was considered statistically significant.
Results: A total of 77 patients met inclusion criteria, with 349 total visits. Patient demographics at each visit are shown in Table 1. A food allergy was documented in 12% (43) of total visits. Education on early introduction of common food allergens was provided in 12% (42) of total visits. Patients who received education at their visit were significantly younger than those who did not and were also more likely to have eczema. Table 2 compares nutrition characteristics at visits where education was discussed versus those where it was not. Infants with any percentage of oral intake were more likely to have received education than those who were nil per os (p = 0.013). There was a significant association between starting solids and receiving education (p < 0.001). Reported allergen exposure across all visits was low. Among visits with patients < 8 months of age (n = 103), only 6% (6) reported peanut and 8% (8) egg exposure. Expanding to < 12 months of age at the time of visit (n = 198), reported allergen exposure increased minimally: 7% (14) reported peanut and 8% (15) egg exposure. Oral feeds were the most common reported route of allergen exposure. Only one patient received a commercial early allergen introduction product. Cow's milk was the most commonly reported allergen exposure, at 61% (63) under 8 months and 54% (106) under 12 months of age, the majority from infant formula.
Conclusion: Younger age and any proportion of oral intake were associated with receiving education on common food allergen introduction at a given visit. However, there were missed opportunities for education in infants with enteral feeding tubes. Few visits included reported peanut or egg exposure. Further research and national guidelines are needed on optimal methods of allergen introduction in this population.
1Nationwide Children's Hospital, Grove City, OH; 2Nationwide Children's Hospital, Columbus, OH
Financial Support: None Reported.
Background: Critically ill children with severe acute kidney injury (AKI) requiring continuous renal replacement therapy (CRRT) are at high risk for micronutrient deficiencies, including vitamin C (VC). VC acts as an antioxidant and enzyme cofactor. It cannot be made endogenously and must be derived from the diet. Critically ill patients often enter the hospital with some degree of nutritional debt and may have low VC concentrations upon admission. This is exacerbated in conditions such as sepsis, multi-organ dysfunction, and burns, when VC needs are higher to address increased oxidative and inflammatory stress. When these patients develop a severe AKI requiring CRRT, their kidneys do not reabsorb VC as healthy kidneys would, further increasing losses. Moreover, VC is a small molecule and is filtered out of the blood via CRRT. The combination of increased VC losses and elevated VC requirements in critically ill patients on CRRT results in a high risk of VC deficiency. The true prevalence of VC deficiency in this population is not well known; for comparison, the prevalence of deficiency is 5.9% in the general population and, on average, 18.3% in critically ill children. The aim of this study is to ascertain the prevalence of VC deficiency in critically ill pediatric patients on CRRT by monitoring serum VC levels throughout their CRRT course and correcting noted deficiencies.
Methods: An observational study was conducted from May 2023 through August 2024 in a 54-bed high-acuity PICU offering ECMO, CRRT, Level 1 trauma, burn care, solid organ transplant, and bone marrow transplantation as tertiary care services. Fifteen patients were identified upon initiation of CRRT and followed until transfer from the PICU. Serial serum VC levels were checked after 5-10 days on CRRT and rechecked weekly thereafter as appropriate. Deficiency was defined as VC concentrations < 11 umol/L; inadequacy as concentrations of 11-23 umol/L. Supplementation was initiated for levels < 23 umol/L, with doses ranging from 250 to 500 mg/d depending on age and clinical situation. Most deficient patients received 500 mg/d; those with inadequacy received 250 mg/d.
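The classification thresholds above can be expressed as a small classifier. This is an illustrative simplification under stated assumptions: the function name is ours, and the dose mapping flattens the study's actual practice, where doses varied (250-500 mg/d) with age and clinical situation.

```python
def vitamin_c_status(serum_umol_l: float) -> tuple:
    """Classify a serum vitamin C level (umol/L) per the study's
    definitions and return (status, typical_dose_mg_per_day).

    Thresholds: <11 deficient, 11-23 inadequate, >23 adequate.
    Dose mapping is a simplification of the cohort's usual practice
    (most deficient patients got 500 mg/d; inadequate, 250 mg/d).
    """
    if serum_umol_l < 11:
        return ("deficient", 500)
    if serum_umol_l <= 23:
        return ("inadequate", 250)
    return ("adequate", 0)

print(vitamin_c_status(8))   # deficient
print(vitamin_c_status(18))  # inadequate
print(vitamin_c_status(40))  # adequate
```

A boundary value of exactly 11 umol/L falls into the inadequate band here, since deficiency is defined strictly as < 11.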
Results: Of 15 patients, 9 had VC deficiency and 4 had VC inadequacy (Figure 1). Of those with deficiency, 5 of 9 patients were admitted for septic shock (Figure 2). VC level was rechecked in 8 patients; the level returned to normal in 5 patients, 4 of whom received 500 mg/d supplementation. Levels remained low in 3 patients, all of whom received 250 mg/d supplementation (Figure 3). Supplementation dose changes are noted in Figure 4.
Conclusion: VC deficiency was present in 60% of CRRT patients, suggesting that deficiency is more common in this population and that critically ill patients on CRRT are at higher risk of developing deficiency than those not receiving CRRT. Septic shock and degree of VC deficiency appeared to be correlated: 56% of deficient patients were admitted with septic shock. Together, this suggests a need to start supplementation earlier, perhaps upon CRRT initiation rather than upon admission to the PICU in a septic patient, and to use higher supplementation doses, as our patients with low VC levels at their follow-up check were all receiving 250 mg/d. Study limitations include (1) the potential for VC deficiency prior to CRRT initiation, with critical illness and/or poor intake as confounding factors, and (2) the small sample size. Our results suggest that critically ill children with AKI requiring CRRT are at increased risk of VC deficiency while on CRRT. Future research should focus on identifying at-risk micronutrients for this patient population and creating supplementation regimens to prevent the development of deficiencies. Our institution is currently crafting a quality improvement project with these aims.
Figure 1. Initial Serum Vitamin C Levels of Our Patients on CRRT, Obtained 5-10 Days After CRRT Initiation (N = 15).
Figure 2. Underlying Disease Process of Patients on CRRT (N = 15).
Figure 3. Follow Up Vitamin C Levels After Supplementation (N = 8), Including Supplementation Regimen Prior to Follow-Up Lab (dose/route).
Figure 4. Alterations in Supplementation Regimen (dose) Based on Follow-up Lab Data (N = 6).
1Purdue University College of Pharmacy, West Lafayette, IN; 2Riley Hospital for Children at Indiana University Health, Indianapolis, IN; 3Indiana University, Indianapolis, IN
Financial Support: The Gerber Foundation.
Background: During the hospital-to-home transition period, family members or caregivers of medically complex children are expected to assume the responsibility of managing medication and feeding regimens for the child under their care. However, this transition period represents a time of high vulnerability, including the risk of communication breakdowns and lack of tools tailored to caregivers’ context. This vulnerability is further heightened when a language barrier is present as it introduces additional opportunities for misunderstandings leading to adverse events and errors in the home setting. Hence, it is critical that caregivers are educated to develop skills that aid in successful implementation of post-discharge care plans. Addressing unmet needs for caregivers who use languages other than English (LOE) requires an in-depth understanding of the current challenges associated with educating and preparing caregivers for the post-discharge period.
Methods: In this prospective qualitative study, healthcare workers (HCWs) were recruited from a tertiary care children's hospital in central Indiana and were eligible to participate if they were involved directly or indirectly in preparing and assisting families of children under three years of age with medication and feeding needs around the hospital discharge period and/or outpatient care following discharge. Each HCW completed a brief demographic survey followed by observation on the job for 2-3 hours, and participated in a follow-up semi-structured interview using video conferencing technology to expand on observed behavior. Handwritten field notes were taken during the observations, which were immediately typed and expanded upon post-observation. Audio recordings from the interviews were transcribed and de-identified. Both the typed observation notes and interview transcripts were subjected to thematic content analysis, which was completed using the Dedoose software.
Results: Data collection is ongoing with anticipated completion in October 2024. Fourteen HCW interviews have been completed to date, with a target sample of 20-25 participants. Preliminary analysis presented is from transcripts of seven interviews. Participants included six females and one male with a mean age of 35.4 years (range, 24 - 59). HCWs were from diverse inpatient and outpatient clinical backgrounds including registered dieticians, physicians, pharmacists, and nurses. Four overarching themes describe the challenges that HCWs experience when communicating with caregivers who use LOE during the hospital-to-home transition. These themes include lack of equipment and materials in diverse languages, challenges with people and technologies that assist with translating information, instructions getting lost in translation/uncertainty of translation, and difficulty getting materials translated in a timely manner. Main themes, subthemes, and examples are presented in Figure 1, and themes, subthemes, and quotes are presented in Table 1.
Conclusion: The study is ongoing; however, based on the preliminary analysis, it is evident that the systems and processes in place to aid communication between HCWs and caregivers who use LOE can be improved. Such improvement could ultimately raise the quality of care provided to caregivers who use LOE during the hospital-to-home transition and result in safer home care for medically complex children.
Background: Growth faltering among preterm neonates admitted to NICUs may be managed with hypercaloric and/or hypervolemic feeding. However, in intractable cases of growth faltering or when fluid restriction is indicated, fortification with extensively hydrolyzed liquid protein may offer a solution to meeting the high protein demands for infant growth while limiting overfeeding. Yet, there is limited clinical data regarding supplementation with liquid protein, leaving clinicians to make decisions about dosing and duration on a case-by-case basis.
Methods: We present a case of neonatal growth faltering managed with a liquid protein modular in a level IV NICU in North America.
Results: Case Summary: A male infant, born extremely preterm (GA 24 1/7 weeks), was admitted to the NICU for respiratory distress requiring intubation. The NICU course was complicated by patent ductus arteriosus (PDA), requiring surgery on day of life (DOL) 31, and severe bronchopulmonary dysplasia. Birth anthropometrics: weight 0.78 kg; length 31.5 cm. TPN was initiated at birth, with trophic feeds of donor human milk per gavage (PG), for a total provision of 117 ml/kg, 75 kcal/kg, and 3.5 g/kg of protein. The regimen was advanced per unit protocol; however, on DOL 5, total volume was decreased in the setting of metabolic acidosis and significant PDA. PG feeds of maternal milk were fortified to 24 kcal/oz on DOL 23, and the infant reached full feeds on DOL 26. Feed provision by DOL 28 was ~144 ml/kg/day and 4 g/kg of protein based on estimated dry weight. TPN was first discontinued on DOL 25 but restarted on DOL 32 due to frequent NPO status and clinical instability. Of note, the infant required diuretics during the hospital stay. TPN was again discontinued on DOL 43. At DOL 116, the infant was receiving and tolerating PG feeds fortified to 24 kcal/oz at 151 ml/kg, 121 kcal/kg, and 2.1 g/kg protein. The infant's weight gain rate was 56 g/day; however, linear growth was impaired, with a gain of 0.5 cm over 21 days (~0.2 cm/week). Liquid protein was commenced at DOL 124 to supply an additional 0.5 g/kg of protein. A week after adding liquid protein, the infant's weight gain rate was 39 g/day, and length increased at 2.5 cm/week. Feed fortification was reduced to 22 kcal/oz on DOL 156 due to rapid weight gain, and the liquid protein dose was increased to 0.6 g/kg for a total protein intake of 2.2 g/kg. At DOL 170, calorie fortification of maternal milk was discontinued, and the liquid protein dose was increased to 1 g/kg in the setting of a relapse of poor linear growth, for a total protein intake of 3.1 g/kg.
Liquid protein was provided for two months until discontinuation at DOL 183 per parent request. At the time of discontinuation, the infant's weight and length gain rates for the protein supplementation period (59 days) were 42 g/day and 1.78 cm/week, respectively.
Conclusion: While we observed objective increases in linear growth for the presented case following the addition of a liquid protein modular, it is crucial to note that these findings are not generalizable, and there is limited evidence and guidelines on the use of hydrolyzed liquid protein. Larger, well-controlled studies examining mechanisms of action, appropriate dosage and duration, short- and long-term efficacy, and safety are required to guide best practices for using these modulars.
Background: Accurate assessment of growth and nutritional status is critical for preterm infants. Various growth charts have been developed to track the growth of preterm infants, but differences in reference standards may influence the diagnosis and management of malnutrition. The goal of this study was to compare the rate of malnutrition, defined by a decline in weight-for-age z-score, using the Fenton growth chart, the Olsen growth chart, and the INTERGROWTH-21st Preterm Postnatal Growth Standard.
Methods: All preterm infants born between 24 and 37 weeks of gestational age who were admitted to the neonatal intensive care unit (NICU) in 2022 and had weight at birth and on day 28 recorded were included. Preterm infants were excluded if they were admitted to the NICU ≥seven days after birth. Sex and gestational age were recorded for each infant. Weight and weight-for-age z-score at birth and on day 28 were recorded. Weight-for-age z-score was determined using three growth charts: (1) the Fenton growth chart, which was developed using growth data from uncomplicated preterm infants who were at least 22 weeks of gestational age from the United States (US), Australia, Canada, Germany, Italy, and Scotland; (2) the Olsen growth chart, which was developed using growth data from uncomplicated preterm infants who were at least 23 weeks of gestational age from the US; and (3) the INTERGROWTH-21st Preterm Postnatal Growth Standard, which was developed using growth data from uncomplicated preterm infants who were at least 24 weeks of gestational age from the US, United Kingdom, Brazil, China, India, Italy, Kenya, and Oman. Change in weight-for-age z-score from birth to day 28 was calculated using z-scores from each growth chart. Malnutrition was defined as a decline in weight-for-age z-score of ≥0.8; any infant meeting this cut-point had their malnutrition status further classified into three categories: (1) mild malnutrition was defined as a weight-for-age z-score decline between 0.8-1.1, (2) moderate malnutrition was defined as a weight-for-age z-score decline between 1.2-1.9, and (3) severe malnutrition was defined as a weight-for-age z-score decline of ≥2.0.
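The cut-points defined in the methods can be sketched as a simple grading function. This is an illustrative sketch only (the function name is ours); note that the stated bands leave the interval between 1.1 and 1.2 unaddressed, and this implementation places it in the mild band.

```python
def malnutrition_grade(z_birth: float, z_day28: float) -> str:
    """Grade malnutrition from the decline in weight-for-age z-score
    between birth and day 28, using the study's cut-points:
    decline >=2.0 severe, 1.2-1.9 moderate, 0.8-1.1 mild, else none.

    Declines in the unstated 1.1-1.2 gap fall into "mild" here.
    """
    decline = z_birth - z_day28
    if decline >= 2.0:
        return "severe"
    if decline >= 1.2:
        return "moderate"
    if decline >= 0.8:
        return "mild"
    return "none"

# An infant falling from z = -0.4 at birth to z = -1.7 on day 28
# has a decline of 1.3 and grades as moderate.
print(malnutrition_grade(-0.4, -1.7))
```

Because the grade depends only on the z-score decline, the same infant can receive different grades under the Olsen, Fenton, and INTERGROWTH-21st charts, which is exactly the discrepancy the study quantifies.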
Results: The sample included 102 preterm infants, 58% male, with a mean gestational age of 29.3 weeks. At birth, the average weight was 1,192 grams, and the average weight-for-age z-score was -0.50, -0.36, and -1.14 using the Olsen, Fenton, and INTERGROWTH-21st growth charts, respectively. At 28 days, the average weight was 1,690 grams, and the average weight-for-age z-score was -0.96, -1.00, and -1.43 using the Olsen, Fenton, and INTERGROWTH-21st growth charts, respectively. Using the Olsen growth chart, 29 infants met the criteria for malnutrition; 15 had mild malnutrition and 14 had moderate malnutrition. Using the Fenton growth chart, 33 infants met the criteria for malnutrition; 21 had mild malnutrition and 12 had moderate malnutrition. Using the INTERGROWTH-21st growth chart, 32 infants met the criteria for malnutrition; 14 had mild malnutrition, 14 had moderate malnutrition, and 4 had severe malnutrition. In total, 24 infants met the criteria for malnutrition using all three growth charts, while 19 infants were only categorized as malnourished by one of the three growth charts.
Conclusion: The findings of this study reveal important discrepancies in average z-scores and the classification of malnutrition among preterm infants when using the Olsen, Fenton, and INTERGROWTH-21st growth charts. These differences suggest that the choice of growth chart has important implications for identifying rates of malnutrition. Therefore, standardized guidelines for growth monitoring in preterm infants are necessary to ensure consistent and accurate diagnosis of malnutrition.
1University of Galway, Vancouver, BC; 2BC Children's Hospital, Vancouver, BC
Financial Support: None Reported.
Background: Infants with gastroschisis have variable intestinal function, with some achieving enteral independence within a few weeks and others remaining dependent on parenteral nutrition (PN) for prolonged periods. Establishing enteral feeds can be challenging for many of these neonates due to poor intestinal motility, frequent emesis, and/or abdominal distension. Therefore, many care teams use standardized postnatal nutrition protocols in an attempt to minimize PN exposure and maximize oral feeding. However, it remains unclear whether initiating continuous feeds is advantageous or whether bolus feeding is preferred. Potential benefits of bolus feeding include that it is more physiologic and that feeds can be given orally, but there remain concerns about feed tolerance and prolonged periods of withholding feeds with this approach. The objective of this study was to compare initial feeding strategies in infants with gastroschisis to determine whether bolus feeding is a feasible approach.
Methods: After obtaining REB approval (H24-01052), a retrospective chart review was performed in neonates born with gastroschisis and cared for by a neonatal intestinal rehabilitation team between 2018 and 2023. A continuous feeding protocol was used between 2018-2020 (human milk at 1 ml/h with 10 ml/kg/d advancements, given continuously until 50 ml/kg/d and then trialing bolus feeding), and a bolus protocol was used between 2021-2023 (10-15 ml/kg divided into 8 feeds/d with 15-20 ml/kg/d advancements). Clinical data were collected, and gestational age, gastroschisis prognosis score (GPS), need for intestinal resection, age when feeds were initiated, time to full feeds, route of feeding at full feeds, and hepatic cholestasis were compared between groups. Welch's t-test and the chi-square test were used to compare variables, with p-values < 0.05 considered significant.
Results: Forty-one infants with gastroschisis were reviewed (23 who were managed with continuous feed initiation and 18 with bolus feed initiation). Continuous feed and bolus feed groups had comparable mean gestational age at birth, GPS score, need for intestinal surgery, age at feed initiation, and the incidence of cholestasis (Table 1). Time to achieve enteral independence was similar between both groups with nearly half of infants reaching full feeds by 6 weeks (48% continuous feeds vs. 44% bolus feeds) and most by 9 weeks of life (74% continuous vs. 72% bolus). Significantly more infants in the bolus feeding group were feeding exclusively orally compared to the continuous feeding group at the time of reaching full enteral feeds (50% vs. 17%, p = 0.017).
Conclusion: Initiating bolus enteral feeding for infants with gastroschisis, including those requiring intestinal resection, is safe and does not result in prolonged time to full enteral feeding. Avoiding continuous feeds may improve oral feeding in this population.
Table 1. Clinical Characteristics and Initial Feeding Strategy.
International Poster of Distinction
Matheus Albuquerque1; Diogo Ferreira1; João Victor Maldonado2; Mateus Margato2; Luiz Eduardo Nunes1; Emanuel Sarinho1; Lúcia Cordeiro1; Amanda Fifi3
1Federal University of Pernambuco, Recife, Pernambuco; 2University of Brasilia, Brasília, Distrito Federal; 3University of Miami, Miami, FL
Financial Support: None Reported.
Background: Intestinal failure secondary to short bowel syndrome is a malabsorptive condition, caused by intestinal resection. Patients with intestinal failure require parenteral support to maintain hydration and nutrition. Long-term parenteral nutrition leads to complications. Teduglutide, an analog of GLP-2, may improve intestinal adaptation thereby minimizing reliance on parenteral nutrition. This meta-analysis evaluates the efficacy of teduglutide in reducing parenteral nutrition dependency in pediatric patients with intestinal failure.
Methods: We included randomized controlled trials (RCTs) that assessed the efficacy of teduglutide in reducing parenteral nutrition support and improving anthropometrics in pediatric patients with intestinal failure secondary to short bowel syndrome. Risk of bias was evaluated with the Cochrane RoB-2 tool, and statistical analyses were conducted using RevMan 5.4.1 software. Results are expressed as mean differences with 95% CIs and p-values.
Results: Data were extracted from three clinical trials involving a total of 172 participants. Teduglutide use was associated with a reduction in parenteral nutrition volume (-17.92 mL, 95% CI -24.65 to -11.20, p < 0.00001), with most patients reducing parenteral support by >20% (11.79, 95% CI 2.04 to 68.24, p = 0.006) (Figure 1). Treatment with teduglutide also improved height (0.27 Z-score, 95% CI 0.08 to 0.46, p = 0.005) but did not significantly increase weight compared to the control group (-0.13 Z-score, 95% CI -0.41 to 0.16, p = 0.38) (Figure 2).
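The pooled mean differences with 95% CIs reported above come from RevMan; one standard way such summaries are computed is inverse-variance fixed-effect pooling. The sketch below is illustrative only: the inputs are made-up values, not the trial data, and the abstract does not state which effect model was used.

```python
import math

def pooled_mean_difference(studies):
    """Inverse-variance fixed-effect pooling of per-study mean
    differences for a continuous outcome.

    `studies` is a list of (mean_difference, standard_error) pairs.
    Each study is weighted by 1/SE^2; returns the pooled estimate
    with a normal-approximation 95% CI as (md, ci_low, ci_high).
    """
    weights = [1 / se**2 for _, se in studies]
    pooled = sum(w * md for (md, _), w in zip(studies, weights)) / sum(weights)
    se_pooled = math.sqrt(1 / sum(weights))
    return pooled, pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled

# Hypothetical inputs (NOT the trial data): two studies of equal precision.
md, lo, hi = pooled_mean_difference([(-20.0, 5.0), (-15.0, 5.0)])
print(round(md, 2), round(lo, 2), round(hi, 2))
```

With equal standard errors the pooled estimate is simply the average of the two mean differences, and the pooled CI is narrower than either study's alone.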
Conclusion: This meta-analysis suggests that therapy with teduglutide reduces parenteral nutrition volume in patients with short bowel syndrome and intestinal failure. Reduced parenteral nutrition dependency can minimize complications and improve quality of life in these patients.
Figure 1. Parenteral Nutrition Support Volume Change.
Figure 2. Anthropometric Data (Weight and Height) Change from Baseline.
1Medical College of Wisconsin, Milwaukee, WI; 2Children's Hospital of Wisconsin, Milwaukee, WI
Financial Support: Medical College of Wisconsin, Department of Pediatrics.
Background: Malnutrition is a significant concern in pediatric patients, particularly those critically ill. In children with diabetes mellitus (DM), the presence of malnutrition can exacerbate complications such as unstable blood sugar levels and delayed wound healing, potentially leading to worse clinical outcomes. Despite the known impact of malnutrition on adult patients and critically ill children, established criteria for identifying malnutrition in critically ill children are lacking. This study was designed to determine the relationship between malnutrition, mortality, and length of stay (LOS) in critically ill pediatric patients with diabetes mellitus.
Methods: We conducted a retrospective cohort study using the VPS (Virtual Pediatric Systems, LLC) Database. We categorized critically ill pediatric patients with DM as malnourished or at risk of being malnourished based on admission nutrition screens. We compared mortality rates between malnourished and non-malnourished patients using Fisher's Exact test. We used logistic regression analysis to compare mortality controlling for measures like PRISM3 (a severity of illness measure), demographic, and clinical factors. We compared the LOS in the Pediatric Intensive Care Unit (PICU) between malnourished and non-malnourished patients using the Mann-Whitney-Wilcoxon test. Additionally, we used a general linear model with appropriate transformation to adjust for the severity of illness, demographic, and clinical factors. We considered statistical significance at p < 0.05.
Results: We analyzed data for 4,014 patients, of whom 2,653 were screened for malnutrition. Of these 2,653, 88.5% had type 1 DM, 9.3% had type 2 DM, and the remaining patients had unspecified DM. Of the 2,653 patients, 841 (31.7%) were malnourished based on their nutrition screen at admission to the PICU. Mortality in patients screened as malnourished did not differ from mortality in those who were not malnourished (0.4% vs. 0.2%, p = 0.15). Malnourished patients had longer PICU LOS, with a geometric mean (95% CI) of 1.03 (0.94-1.13) days, compared to 0.91 (0.86-0.96) days for non-malnourished patients. Similarly, malnourished patients had longer hospital LOS, with a geometric mean (95% CI) of 5.31 (4.84-5.83) days, compared to 2.67 (2.53-2.82) days for those who were not malnourished. Both differences were significant at p < 0.0001 after adjusting for age, race/ethnicity, and PRISM3.
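The LOS comparisons above report geometric rather than arithmetic means, the usual choice when a skewed variable like length of stay is log-transformed for modeling and then back-transformed. A minimal sketch of the computation (the example values are hypothetical, not the study data):

```python
import math

def geometric_mean(days):
    """Geometric mean of length-of-stay values (all must be > 0):
    exponentiate the arithmetic mean of the logs. Used because LOS
    is right-skewed, so analyses model log(LOS) and back-transform."""
    return math.exp(sum(math.log(d) for d in days) / len(days))

# Hypothetical LOS values in days (not the study data):
print(round(geometric_mean([1, 2, 4, 8]), 2))
```

Python 3.8+ also provides `statistics.geometric_mean`, which computes the same quantity; note that a zero-day LOS would need special handling, since log(0) is undefined.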
Conclusion: We found no difference in mortality rates, but critically ill children who were screened as malnourished had longer PICU and hospital LOS than those who were not malnourished. This was true even after adjusting for age, race/ethnicity, and PRISM3.
Emily Gutzwiller1; Katie Huff, MD, MS1
1Indiana University School of Medicine, Indianapolis, IN
Financial Support: None Reported.
Background: Neonates with intestinal failure require parenteral nutrition for survival. While life sustaining, it can lead to serious complications, including intestinal failure associated liver disease (IFALD). The etiology of IFALD is likely multifactorial, with intravenous lipid emulsions (ILE) being a large contributor, particularly soybean oil-based lipid emulsions (SO-ILE). Alternate ILEs, including those containing fish oil, can be used to prevent and treat IFALD. Fish oil-based ILE (FO-ILE) is only approved at a dose of 1 g/kg/d, limiting calories prescribed from fat and shifting the calorie delivery to carbohydrate predominance. While FO-ILE was shown to have comparable growth to SO-ILE, a comparison to soy, MCT, olive, fish oil-based ILE (SO,MCT,OO,FO-ILE) has not been conducted to our knowledge. The purpose of this study is to compare the growth and laboratory data associated with SO,MCT,OO,FO-ILE and FO-ILE therapy in the setting of IFALD in a cohort of neonates treated during two time periods.
Methods: We performed a retrospective chart review of patients with IFALD receiving SO,MCT,OO,FO-ILE (SMOFlipid) or FO-ILE (Omegaven) in a level IV neonatal intensive care unit from September 2016 – May 2024. IFALD was defined as direct bilirubin >2 mg/dL after receiving >2 weeks of parenteral nutrition. Patients with underlying genetic or hepatic diagnoses, as well as patients with elevated direct bilirubin prior to two weeks of life, were excluded. Patients were divided based on the period in which they were treated. Data were collected on demographic characteristics, parenteral and enteral nutrition, weekly labs, and growth changes while receiving SO,MCT,OO,FO-ILE or FO-ILE. Rates of change of weight, length, and head circumference and comparisons of z-scores over time were studied for each ILE group. Secondary outcomes included nutritional data in addition to hepatic labs. Nonparametric analysis using the Mann-Whitney U test was conducted to compare ILE groups, and a p-value of < 0.05 was used to define statistical significance.
Results: A total of 51 patients were enrolled, 25 receiving SO,MCT,OO,FO-ILE and 26 FO-ILE. Table 1 notes the demographic and baseline characteristics of the two ILE groups. There was no difference in the rate of OFC (p = 0.984) or length (p = 0.279) growth between the two treatment groups (Table 2). There was, however, a difference in the rate of weight gain between groups (p = 0.002; Table 2), with the FO-ILE group gaining more weight over time. When comparing nutritional outcomes (Table 2), SO,MCT,OO,FO-ILE patients received greater total calories than FO-ILE patients (p = 0.005), including a higher ILE dose (p < 0.001) and more enteral calories (p = 0.029). The FO-ILE group, however, received a higher carbohydrate dose (p = 0.003; Table 2). There was no difference in amino acid dose (p = 0.127) or parenteral nutrition calories (p = 0.821). Hepatic labs differed over time, with the FO-ILE group having a larger decrease in AST, direct bilirubin, and total bilirubin over time compared to the SO,MCT,OO,FO-ILE group (Table 2).
Conclusion: Our results show that FO-ILE patients had significantly greater weight gain than SO,MCT,OO,FO-ILE patients, despite SO,MCT,OO,FO-ILE patients receiving greater total and enteral calories. The FO-ILE group received greater calories only in the form of glucose infusion. With this increased weight gain but similar length growth between groups, concerns arise regarding alterations in body composition and increased fat mass. Further research is needed to determine the influence of these various ILE products on neonatal body composition over time.
Table 1. Demographic and Baseline Lab Data by Lipid Treatment Group.
(All data presented as median and interquartile range, unless specified.)
Table 2. Nutritional, Hepatic Lab, and Growth Outcomes by Lipid Treatment Group.
(All data presented as median and interquartile range, unless specified.)
*z-score change compares z-score at end and beginning of study period
1Emory University Nell Hodgson Woodruff School of Nursing, Atlanta, GA; 2Emory University School of Medicine; Children's Healthcare of Atlanta, Atlanta, GA; 3Emory University Nell Hodgson Woodruff School of Nursing; Children's Healthcare of Atlanta, Atlanta, GA
Financial Support: None Reported.
Background: Many children receiving hematopoietic stem cell transplant (HSCT) are malnourished prior to the start of transplant or develop malnutrition post-infusion. Children are at risk for malnutrition due to the pre-transplant conditioning phase, from chemotherapy treatments for their primary diagnosis, and from acute graft versus host disease (GVHD). These factors can cause impaired oral feeding, vomiting, diarrhea, and mucositis. Enteral nutrition (EN) and parenteral nutrition (PN) are support options for this patient population when oral feeding is impaired. There is currently a paucity of research in evidence-based guidelines of nutrition in this population. The purpose of this integrative literature review was to determine the outcomes of EN and PN in pediatric HSCT and to discuss evidence-based implications for practice to positively affect quality of life for these patients.
Methods: A literature search was conducted using the following databases: PubMed, CINAHL, Cochrane, and Healthsource: Nursing/Academic Edition. The search strategy included randomized controlled trials, prospective and retrospective cohort studies, case-control studies, cross-sectional studies, systematic reviews, and meta-analyses. Papers were included if they discussed patients 0-21 years of age, allogeneic or autologous HSCT, and enteral and/or parenteral nutrition. Papers were excluded if there was no English translation, they did not discuss nutrition, or they had animal subjects.
Results: Initially, 477 papers were identified; after the screening process, 15 papers were utilized for this integrative review. EN and PN affect clinical outcomes, complications, and hospital and survival outcomes. EN was associated with faster platelet engraftment, an improved gut microbiome, and decreased mucositis and GVHD. PN was used more often in severe mucositis because mucositis interferes with feeding tube placement, thereby decreasing the use of EN. Use of PN is more common in severe (grade III-IV) gut GVHD. Initiation of EN later in treatment, such as after conditioning and in the presence of mucositis, can be associated with severe (grade III-IV) gut GVHD, because conditioning can damage the gut, leading to mucosal atrophy and intestinal permeability that alter the gut microbiota. PN can induce gut mucosal atrophy and dysbiosis, allowing bacterial translocation, while EN improves the gut epithelium and microbiota, reducing translocation. Additionally, the increased use of central venous lines with PN can introduce bacterial infections to the bloodstream. Feeding tube placement complications include dislodgement, refusal of replacement, and increased risk of bleeding due to thrombocytopenia. Electrolyte imbalance can be attributed to loss of absorption through the gut. If a gastrostomy tube is present, there can be infection at the site. There is currently no consensus on the appropriate timing of tube placement. There was no significant difference in neutrophil engraftment, and findings on morbidity/mortality and weight gain were variable. Weight gain with PN can be attributed to edema. Length of stay was significantly shorter with EN alone than with PN (p < 0.0001).
Conclusion: This literature review indicates a need for a more comprehensive nutritional assessment to adequately evaluate nutritional status before and after HSCT. EN should be given as a first line therapy and should be considered prior to the conditioning phase. The initiation of a feeding tube prior to conditioning should be considered. Finally, PN may be considered if EN cannot be tolerated. More research is needed for a sensitive nutritional evaluation, earlier administration of EN, and standardized pathways for EN and PN nutrition in pediatric HSCT.
About the journal:
The Journal of Parenteral and Enteral Nutrition (JPEN) is the premier scientific journal of nutrition and metabolic support. It publishes original peer-reviewed studies that define the cutting edge of basic and clinical research in the field. It explores the science of optimizing the care of patients receiving enteral or IV therapies. Also included: reviews, techniques, brief reports, case reports, and abstracts.