Poster Abstracts.

{"title":"海报摘要。","authors":"","doi":"10.1002/jpen.2735","DOIUrl":null,"url":null,"abstract":"<p><b>P1–P34 Parenteral Nutrition Therapy</b></p><p><b>P35–P52 Enteral Nutrition Therapy</b></p><p><b>P53–P83 Malnutrition and Nutrition Assessment</b></p><p><b>P84–P103 Critical Care and Critical Health Issues</b></p><p><b>P104–P131 GI, Obesity, Metabolic, and Other Nutrition Related Concepts</b></p><p><b>P132–P165 Pediatric, Neonatal, Pregnancy, and Lactation</b></p><p><b>Parenteral Nutrition Therapy</b></p><p>Sarah Williams, MD, CNSC<sup>1</sup>; Angela Zimmerman, RD, CNSC<sup>2</sup>; Denise Jezerski, RD, CNSC<sup>2</sup>; Ashley Bestgen, RD, CNSC<sup>2</sup></p><p><sup>1</sup>Cleveland Clinic Foundation, Parma, OH; <sup>2</sup>Cleveland Clinic Foundation, Cleveland, OH</p><p><b>Financial Support:</b> Morrison Healthcare.</p><p><b>Background:</b> Essential fatty acid deficiency (EFAD) is a rare disorder among the general population but can be a concern in patients reliant on home parenteral nutrition (HPN), particularly those who are not receiving intravenous lipid emulsions (ILE). In the US, the only ILE available until 2016 was soybean oil based (SO-ILE), which contains more than adequate amounts of essential fatty acids, including alpha-linolenic acid (ALA, an omega-3 fatty acid) and linoleic acid (LA, an omega-6 fatty acid). In 2016, a mixed ILE containing soybean oil, medium chain triglycerides, olive oil and fish oil, became available (SO, MCT, OO, FO-ILE). However, it contains a lower concentration of essential fatty acids compared to SO-ILE, raising theoretical concerns for development of EFAD if not administered in adequate amounts. Liver dysfunction is a common complication in HPN patients that can occur with soybean based ILE use due to their pro-inflammatory properties. Short-term studies and case reports in patients receiving SO, MCT, OO, FO-ILE have shown improvements in liver dysfunction for some patients. Our study evaluates the long-term impact of SO, MCT, OO, FO-ILE in our HPN patient population.</p><p><b>Methods:</b> This single-center, retrospective cohort study was conducted at the Cleveland Clinic Center for Human Nutrition using data from 2017 to 2020. It involved adult patients who received HPN with SO, MCT, OO, FO-ILE for a minimum of one year. The study assessed changes in essential fatty acid profiles, including triene-tetraene ratios (TTRs) and liver function tests (LFTs) over the year. Data was described as mean and standard deviation for normal distributed continuous variables, medians and interquartile range for non-normally distributed continuous variables and frequency for categorical variables. The Wilcoxon signed rank test was used to compare the baseline and follow-up TTR values (mixed time points). The Wilcoxon signed rank test with pairwise comparisons was used to compare the LFTs at different time points and to determine which time groups were different. P-values were adjusted using Bonferroni corrections. Ordinal logistic regression was used to assess the association between lipid dosing and follow-up TTR level. Analyses were performed using R software and a significance level of 0.05 was assumed for all tests.</p><p><b>Results:</b> Out of 110 patients screened, 26 met the inclusion criteria of having baseline and follow-up TTRs. None of the patients developed EFAD, and there was no significant difference in the distribution of TTR values between baseline and follow-up. 
Additionally, 5.5% of patients reported adverse GI symptoms while receiving SO, MCT, OO, FO-ILE. A separate subgroup of 14 patients who had abnormal LFTs, including bilirubin, alkaline phosphatase (AP), aspartate aminotransferase (AST), or alanine aminotransferase (ALT), was evaluated. There was a statistically significant improvement in AST and ALT, and decreases in bilirubin and AP that were not statistically significant.</p><p><b>Conclusion:</b> We found that using SO, MCT, OO, FO-ILE as the primary lipid source did not result in EFAD in any of our subset of 26 patients, and TTRs remained statistically unchanged after introduction of SO, MCT, OO, FO-ILE. Additionally, there was a statistically significant decrease in AST and ALT following the start of SO, MCT, OO, FO-ILE. While liver dysfunction from PN is multifactorial, the use of fish oil-based lipids has been shown to improve LFT results due to a reduction of phytosterol content as well as less pro-inflammatory omega-6 content when compared to SO-ILEs. A significant limitation was the difficulty in obtaining TTR measurements by home health nursing in the outpatient setting, which considerably reduced the number of patients who could be analyzed for EFAD.</p><p><b>Table 1.</b> Summary Descriptive Statistics of 26 Patients With Baseline and Follow Up TTR.</p><p></p><p><b>Table 2.</b> Change in LFTs From Baseline Levels Compared to 3 Months, 6 Months, 9 Months and 12 Months.</p><p></p><p>Wendy Raissle, RD, CNSC<sup>1</sup>; Hannah Welch, MS, RD<sup>2</sup>; Jan Nguyen, PharmD<sup>3</sup></p><p><sup>1</sup>Optum Infusion Pharmacy, Buckeye, AZ; <sup>2</sup>Optum Infusion Pharmacy, Phoenix, AZ; <sup>3</sup>Optum Infusion Pharmacy, Mesa, AZ</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Aluminum is a non-nutrient contaminant of parenteral nutrition (PN) solution. The additive effects of PN components can contribute to toxicity and cause central nervous system issues as well as contribute to metabolic bone disease, as observed in adults with osteomalacia. When renal function and gastrointestinal mechanisms are impaired, aluminum can accumulate in the body. Aluminum toxicity can result in anemia, dementia, bone disease and encephalopathy. Symptoms of aluminum toxicity may include mental status change, bone pain, muscle weakness, nonhealing fractures and premature osteoporosis. In July 2004, the U.S. Food and Drug Administration (FDA) mandated labeling of aluminum content with a goal to limit exposure to less than 5 mcg/kg/day. Adult and pediatric dialysis patients, as well as patients of all ages receiving PN support, have an increased risk of high aluminum exposure. Reducing PN additives high in aluminum is the most effective way to decrease aluminum exposure and risk of toxicity. This abstract presents a unique case where antiperspirant use contributed to an accumulation of aluminum in an adult PN patient.</p><p><b>Methods:</b> A patient on long-term PN (Table 1) often had results of low ionized calcium of &lt; 3 mg/dL, leading to consideration of other contributing factors. In addition, the patient was taking very high doses of vitamin D daily (by mouth) to stay in the normal range (50,000 IU orally 6 days/week). Risk factors for developing metabolic bone disease include mineral imbalances of calcium, magnesium, phosphorus, and vitamin D, corticosteroid use, long-term PN use and aluminum toxicity (Table 2). The patient, who had a known osteoporosis diagnosis, sustained two stress fractures in the left lower leg.
Aluminum testing was completed in order to identify other factors that may be contributing to low ionized calcium values and osteoporosis. During patient discussion, the patient revealed they used an aluminum-containing antiperspirant one time daily. The range of aluminum content in antiperspirants is unknown, but studies show that minimal absorption may be possible, especially in populations with kidney insufficiency.</p><p><b>Results:</b> After an elevated aluminum value resulted on July 3, 2023 (Figure 1), the patient changed products to a non-aluminum-containing antiperspirant. Aluminum values were rechecked at 3 and 7 months. Results indicate that the patient's antiperspirant choice may have been contributing to aluminum content through skin absorption. Antiperspirant choice may not lead to aluminum toxicity but can contribute to an increased total daily aluminum content.</p><p><b>Conclusion:</b> Preventing aluminum accumulation is vital for patients receiving long-term PN support due to heightened risk of aluminum toxicity. Other potential sources of contamination outside of PN include dialysis, processed food, aluminum foil, cosmetic products (antiperspirants, deodorant, toothpaste), medications (antacids), vaccinations, work environments with aluminum welding, and certain processing industry plants. Aluminum content of medications and PN additives varies based on brand and amount. Clinicians should review all potential aluminum-containing sources and assess ways to reduce aluminum exposure and prevent potential aluminum toxicity in long-term PN patients.</p><p><b>Table 1.</b> Patient Demographics.</p><p></p><p><b>Table 2.</b> Aluminum Content in PN Prescription.</p><p></p><p></p><p><b>Figure 1.</b> Aluminum Lab Value Result.</p><p>Haruka Takayama, RD, PhD<sup>1</sup>; Kazuhiko Fukatsu, MD, PhD<sup>2</sup>; Midori Noguchi, BA<sup>3</sup>; Nana Matsumoto, RD, MS<sup>2</sup>; Tomonori Narita, MD<sup>4</sup>; Reo Inoue, MD, PhD<sup>3</sup>; Satoshi Murakoshi, MD, PhD<sup>5</sup></p><p><sup>1</sup>St. Luke's International Hospital, Chuo-ku, Tokyo; <sup>2</sup>The University of Tokyo, Bunkyo-ku, Tokyo; <sup>3</sup>The University of Tokyo Hospital, Bunkyo-ku, Tokyo; <sup>4</sup>The University of Tokyo, Chuo-City, Tokyo; <sup>5</sup>Kanagawa University of Human Services, Yokosuka-city, Kanagawa</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Our previous study demonstrated that beta-hydroxy-beta-methylbutyrate (HMB)-supplemented total parenteral nutrition (TPN) partially restores the gut-associated lymphoid tissue (GALT) atrophy observed in standard TPN-fed mice. Oral intake of HMB is now popular among bodybuilders and athletes. Herein, we examined whether oral supplementation of HMB could increase GALT mass in mice that eat dietary chow ad libitum.</p><p><b>Methods:</b> Six-week-old male Institute of Cancer Research (ICR) mice were divided into the Control (n = 9), the H600 (n = 9) and the H2000 (n = 9) groups. All mice were allowed to take chow and water ad libitum for 7 days. The H600 or H2000 mice were given water containing Ca-HMB at 3 or 10 mg/mL, while the Controls drank normal tap water. Because these mice drank approximately 6-7 mL water per day, the H600 and H2000 groups took 600 and 2000 mg/kg Ca-HMB per day, respectively. After 7 days of this regimen, all mice were killed by cardiac puncture under general anesthesia, and the whole small intestine was harvested for GALT cell isolation.
GALT cell numbers were evaluated in each tissue (Peyer's patches [PPs], intraepithelial space [IE], and lamina propria [LP]). The nasal washings, bronchoalveolar lavage fluid (BALF), and intestinal washings were also collected for IgA level measurement by ELISA. The Kruskal-Wallis test was used for all parameter analyses, and the significance level was set at less than 5%.</p><p><b>Results:</b> There were no significant differences in the number of GALT cells at any site among the 3 groups (Table 1). Likewise, mucosal IgA levels did not differ between any two of the groups (Table 2).</p><p><b>Conclusion:</b> Oral intake of HMB does not affect GALT cell number or mucosal IgA levels when mice are given a normal diet orally. It appears that the beneficial effects of HMB on the GALT are expected only in parenterally fed mice. We should examine the influence of IV-HMB in an orally fed model in the next study.</p><p><b>Table 1.</b> GALT Cell Number (x10<sup>7</sup>/body).</p><p></p><p><b>Table 2.</b> IgA Levels.</p><p></p><p>Median (interquartile range). Kruskal-Wallis test. n: Control = 9, H600 = 8, H2000 = 9.</p><p>Nahoki Hayashi, MS<sup>1</sup>; Yoshikuni Kawaguchi, MD, PhD, MPH, MMA<sup>2</sup>; Kenta Murotani, PhD<sup>3</sup>; Satoru Kamoshita, BA<sup>1</sup></p><p><sup>1</sup>Medical Affairs Department, Research and Development Center, Chiyoda-ku, Tokyo; <sup>2</sup>Hepato-Biliary-Pancreatic Surgery Division, Bunkyo-ku, Tokyo; <sup>3</sup>School of Medical Technology, Kurume, Fukuoka</p><p><b>Financial Support:</b> Otsuka Pharmaceutical Factory, Inc.</p><p><b>Background:</b> The American Society for Parenteral and Enteral Nutrition guideline recommends a target energy intake of 20 to 30 kcal/kg/day in patients undergoing surgery. Infectious complications reportedly decreased when the target energy and protein intake were achieved in the early period after gastrointestinal cancer surgery. However, no studies have investigated the association of prescribed parenteral energy doses with clinical outcomes in patients who did not receive oral/tube feeding in the early period after gastrointestinal cancer surgery.</p><p><b>Methods:</b> Data of patients who underwent gastrointestinal cancer surgery during 2011–2022 and fasted for 7 days or longer after surgery were extracted from a nationwide medical claims database. The patients were divided into 3 groups based on the mean prescribed parenteral energy doses during the 7 days after surgery as follows: the very-low group (&lt;10 kcal/kg/day), the low group (10–20 kcal/kg/day), and the moderate group (≥20 kcal/kg/day). Multivariable logistic regression model analysis was performed using in-hospital mortality, postoperative complications, length of hospital stay, and total in-hospital medical cost as the objective variables and the 3 groups and confounding factors as the explanatory variables.</p><p><b>Results:</b> Of the 18,294 study patients, the number of patients in the very low, low, and moderate groups was 6,727, 9,760, and 1,807, respectively. The median prescribed energy doses on the 7th day after surgery were 9.2 kcal/kg, 16 kcal/kg, and 27 kcal/kg in the very low, low, and moderate groups, respectively. The adjusted odds ratio (95% confidence interval) for in-hospital mortality with reference to the very low group was 1.060 (1.057–1.062) for the low group and 1.281 (1.275–1.287) for the moderate group. That of postoperative complications was 1.030 (0.940–1.128) and 0.982 (0.842–1.144) for the low and moderate groups, respectively.
The partial regression coefficient (95% confidence interval) for length of hospital stay (day) with reference to the very low group was 2.0 (0.7–3.3) and 3.2 (1.0–5.5), and that of total in-hospital medical cost (US$) was 1,220 (705–1,735) and 2,000 (1,136–2,864), for the low and moderate groups, respectively.</p><p><b>Conclusion:</b> Contrary to the guideline recommendation, prescribed energy doses of ≥10 kcal/kg/day were associated with increases in in-hospital mortality, length of hospital stay, and total in-hospital medical cost. Our findings question the efficacy of guideline-recommended energy intake for patients during the 7 days after gastrointestinal cancer surgery.</p><p>Jayme Scali, BS<sup>1</sup>; Gaby Luna, BS<sup>2</sup>; Kristi Griggs, MSN, FNP-C, CRNI<sup>3</sup>; Kristie Jesionek, MPS, RDN, LDN<sup>4</sup>; Christina Ritchey, MS, RD, LD, CNSC, FASPEN, FNHIA<sup>5</sup></p><p><sup>1</sup>Optum Infusion Pharmacy, Thornton, PA; <sup>2</sup>Optum Infusion Pharmacy, Milford, MA; <sup>3</sup>Optum Infusion Pharmacy, Murphy, NC; <sup>4</sup>Optum Infusion Pharmacy, Franklin, TN; <sup>5</sup>Optum Infusion Pharmacy, Bulverde, TX</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Home parenteral nutrition (HPN) is a life-sustaining nutrition support therapy that is administered through a central venous access device (CVAD). Caregivers of children on HPN are initially trained to provide CVAD care and therapy administration by their clinical team or home infusion nurse. Often there is a gap in training when the patient is ready to assume responsibility for CVAD care. Without proper training, patients are at significant risk of complications such as bloodstream infections and catheter occlusions. The purpose of this study was twofold: 1) to explore the caregiver's perspective about current and future CVAD training practices and 2) to evaluate the need for a proactive formalized CVAD training program when care is transitioned from caregiver to patient.</p><p><b>Methods:</b> An 8-question survey was created using an online software tool. The target audience included caregivers of children receiving HPN. A link to the survey was sent via email and posted on various social media platforms that support the HPN community. The survey was conducted from June 17 to July 18, 2024. Respondents who were not caregivers of a child receiving HPN via a CVAD were excluded.</p><p><b>Results:</b> The survey received 114 responses, but only 86 were included in the analysis based on exclusion criteria. The distribution of children with a CVAD receiving HPN was evenly weighted between 0 and 18 years of age. The majority of the time, initial training regarding HPN therapy and CVAD care was conducted by the HPN clinic/hospital resource/learning center or home infusion pharmacy (Table 1). Forty-eight percent of respondents indicated their HPN team never offers reeducation or shares best practices (Figure 1). Most respondents indicated that the best individual to train their child on CVAD care and safety is the caregiver (Figure 2). In addition, 60% of respondents selected yes, they would want their child to participate in CVAD training if offered (Figure 3).</p><p><b>Conclusion:</b> This survey confirms that most caregivers anticipate training their child to perform CVAD care when it is determined the child is ready for this responsibility.
One challenge to this provision of training is that almost half of the respondents in this survey stated they never receive reeducation or best practice recommendations from their team. This finding demonstrates a need for a formalized training program to assist caregivers when transitioning CVAD care to the patient. Since most respondents reported relying on their intestinal rehab or GI/motility clinic for CVAD related concerns, these centers would be the best place to establish a transition training program. Limitations of the study are as follows: It was only distributed via select social platforms, and users outside of these platforms were not captured. Additional studies would be beneficial in helping to determine the best sequence and cadence for content training.</p><p><b>Table 1.</b> Central Venous Access Device (CVAD) Training and Support Practices.</p><p></p><p></p><p><b>Figure 1.</b> How Often Does Your HPN Team Offer Reeducation or Share Best Practices?</p><p></p><p><b>Figure 2.</b> Who is Best to Train Your Child on CVAD Care Management and Safety?</p><p></p><p><b>Figure 3.</b> If Formalized CVAD Training is Offered, Would You Want Your Child to Participate?</p><p>Laryssa Grguric, MS, RDN, LDN, CNSC<sup>1</sup>; Elena Stoyanova, MSN, RN<sup>2</sup>; Crystal Wilkinson, PharmD<sup>3</sup>; Emma Tillman, PharmD, PhD<sup>4</sup></p><p><sup>1</sup>Nutrishare, Tamarac, FL; <sup>2</sup>Nutrishare, Kansas City, MO; <sup>3</sup>Nutrishare, San Diego, CA; <sup>4</sup>Indiana University, Carmel, IN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Long-term parenteral nutrition (LTPN) within the home is a lifeline for many patients throughout the United States. Patients utilize central venous access devices (CVAD) to administer LTPN. Central line-associated bloodstream infection (CLABSI) is a serious risk associated with patients who require LTPN. Rates of CLABSI in LTPN populations range from 0.9-1.1 per 1000 catheter days. The aim of this study was to determine the incidence of CLABSI in a cohort of patients serviced by a national home infusion provider specializing in LTPN and identify variables associated with an increased incidence of CLABSI.</p><p><b>Methods:</b> A retrospective review of electronic medical records of LTPN patients with intestinal failure was queried from March 2023 to May 2024 for patient demographics, anthropometric data, nursing utilization, parenteral nutrition prescription including lipid type, length of therapy use, geographic distribution, prescriber specialty, history of CLABSI, blood culture results as available, and use of ethanol lock. Patient zip codes were used to determine rural health areas, as defined by the US Department of Health &amp; Human Services. Patients were divided into two groups: 1) patients that had at least one CLABSI and 2) patients with no CLABSI during the study period. Demographic and clinical variables were compared between the two groups. Nominal data were analyzed by Fisher's exact test and continuous data were analyzed with student t-test for normal distributed data and Mann-Whitney U-test was used for non-normal distributed data.</p><p><b>Results:</b> We identified 198 persons that were maintained on LTPN during the study time. The overall CLABSI rate for this cohort during the study period was 0.49 per 1000 catheter days. Forty-four persons with LTPN had one or more CLABSI and 154 persons with LTPN did not have a CLABSI during the study period. 
Persons who experienced CLABSI weighed significantly more, had fewer days of infusing injectable lipid emulsions (ILE), and had a shorter catheter dwell duration compared to those that did not have a CLABSI (Table 1). There was no significant difference between the CLABSI and no CLABSI groups in the length of time on LTPN, location of consumer (rural versus non-rural), utilization of home health services, number of days parenteral nutrition (PN) was infused, or use of ethanol locks (Table 1).</p><p><b>Conclusion:</b> In this retrospective cohort study, we report a CLABSI rate of 0.49 per 1000 catheter days, which is lower than previously published CLABSI rates for similar patient populations. Patient weight, days of infusing ILE, and catheter dwell duration were significantly different between those that did and did not have a CLABSI in this study period. Yet, variables such as use of ethanol lock and proximity to care providers that had previously been reported to impact CLABSI were not significantly different in this cohort. An expanded study with more LTPN patients or a longer study duration may be necessary to confirm these results and their impact on CLABSI rates.</p><p><b>Table 1.</b> Long Term Parenteral Nutrition (LTPN) Characteristics.</p><p></p><p>Silvia Figueiroa, MS, RD, CNSC<sup>1</sup>; Stacie Townsend, MS, RD, CNSC<sup>2</sup></p><p><sup>1</sup>MedStar Washington Hospital Center, Bethesda, MD; <sup>2</sup>National Institutes of Health, Bethesda, MD</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> In hospitalized patients, lipid emulsions constitute an essential component of balanced parenteral nutrition (PN). Soy oil-based lipid injectable emulsions (SO-ILE) were traditionally administered as part of PN formulations primarily as a source of energy and for prevention of essential fatty acid deficiency. Contemporary practice has evolved to incorporate mixtures of different lipid emulsions, including a combination of soy, MCT, olive, and fish oils (SO, MCT, OO, FO-ILE). Evidence suggests that the use of SO, MCT, OO, FO-ILEs may alter essential fatty acid profiles, impacting hepatic metabolism and other processes associated with clinical benefits. The aim of this project was to compare essential fatty acid profile (EFAP), triglycerides (TGL), liver function tests (LFTs), and total bilirubin (TB) levels in adult patients receiving parenteral nutrition with SO-ILE or SO, MCT, OO, FO-ILE in a unique hospital entirely dedicated to conducting clinical research.</p><p><b>Methods:</b> This was a retrospective chart review from 1/1/2019 to 12/31/2023 of adult patients in our hospital who received PN with SO-ILE or SO, MCT, OO, FO-ILE and had EFAP assessed after 7 days of receiving ILE. Data included demographic, clinical, and nutritional parameters. Patients with no laboratory markers, those on propofol, and those who received both ILE products in the 7 days prior to collection of EFAP were excluded. Data was statistically analyzed using Fisher's tests and Mann-Whitney U tests as appropriate.</p><p><b>Results:</b> A total of 42 patient charts were included (14 SO-ILE; 28 SO, MCT, OO, FO-ILE). Group characteristics can be found in Table 1. Patients on SO-ILE received more ILE (0.84 vs 0.79 g/kg/day, p &lt; 0.0001). TGL levels changed significantly after start of ILE (p &lt; 0.0001). 
LFTs were found to be elevated in 57% of patients in the SO-ILE group and 60% in the SO, MCT, OO, FO-ILE group, while TB was increased in 21% and 40% of the patients respectively (Figure 1). Further analysis showed no significant differences in LFTs and TB between the two groups. Assessment of EFAP revealed a significant difference in the levels of DHA, docosenoic acid, and EPA, which were found to be higher in the group receiving SO, MCT, OO, FO-ILE. Conversely, significant differences were also observed in the levels of linoleic acid, homo-g-linolenic acid, and total omega 6 fatty acids, with those being higher in patients administered SO ILE (Figure 2). No differences were observed between groups regarding the presence of essential fatty acid deficiency, as indicated by the triene:tetraene ratio.</p><p><b>Conclusion:</b> In our sample analysis, LFTs and TB levels did not differ significantly between SO-ILE and SO, MCT, OO, FO-ILE groups. Increased levels of DHA, docosenoic acid, and EPA were found in the SO, MCT, OO, FO-ILE group, while linoleic acid, homo-g-linolenic acid, and total omega 6 fatty acids tended to be higher in the SO-ILE group. Although in our sample the SO, MCT, OO, FO-ILE group received a lower dosage of ILE/kg/day, there were no differences in the rate of essential fatty acid deficiency between groups.</p><p><b>Table 1.</b> General Characteristics (N = 42).</p><p></p><p></p><p><b>Figure 1.</b> Liver Function Tests (N = 39).</p><p></p><p><b>Figure 2.</b> Essential Fatty Acid Profile (N = 42).</p><p>Kassandra Samuel, MD, MA<sup>1</sup>; Jody (Lind) Payne, RD, CNSC<sup>2</sup>; Karey Schutte, RD<sup>3</sup>; Kristen Horner, RDN, CNSC<sup>3</sup>; Daniel Yeh, MD, MHPE, FACS, FCCM, FASPEN, CNSC<sup>3</sup></p><p><sup>1</sup>Denver Health, St. Joseph Hospital, Denver, CO; <sup>2</sup>Denver Health, Parker, CO; <sup>3</sup>Denver Health, Denver, CO</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Early nutritional support in hospitalized patients has long been established as a key intervention to improve overall patient outcomes. Patients admitted to the hospital often have barriers to receiving adequate nutrition via enteral route means and may be candidates for parenteral nutrition (PN). Central parenteral nutrition (CPN) requires central access, which has historically led to concerns for central line-associated bloodstream infection (CLABSI). Obtaining central access can be resource-intensive and may result in treatment delays while awaiting access. Conversely, peripheral parenteral nutrition (PPN) can be delivered without central access. In this quality improvement project, we sought to characterize our PPN utilization at a large urban tertiary hospital.</p><p><b>Methods:</b> We performed a retrospective review of adult inpatients receiving PN at our facility from 1/1/23–12/31/23. Patients were excluded from review if they had PN initiated prior to hospitalization. Demographic information, duration of treatment, timely administration status, and information regarding formula nutrition composition were collected.</p><p><b>Results:</b> A total of 128 inpatients received PN for a total of 1302 PN days. The mean age of these patients was 53.8 years old (SD: 17.9) and 65 (50%) were male. Twenty-six (20%) patients received only PPN for a median [IQR] length of 3 [2–4] days, and 61 (48%) patients received only CPN for a median length 6 [3–10] days. 
Thirty-nine (30%) patients were started on PPN with the median time to transition to CPN of 1 [1-3] day(s) and a median total duration of CPN being 8 [5-15.5] days. A small minority of patients received CPN and then transitioned to PPN (2%).</p><p><b>Conclusion:</b> At our institution, PPN is utilized in more than 50% of all inpatient PN, most commonly at PN initiation and then eventually transitioning to CPN for a relatively short duration of one to two weeks. Additional research is required to identify those patients who might avoid central access by increasing PPN volume and macronutrients to provide adequate nutrition therapy.</p><p>Nicole Halton, NP, CNSC<sup>1</sup>; Marion Winkler, PhD, RD, LDN, CNSC, FASPEN<sup>2</sup>; Elizabeth Colgan, MS, RD<sup>3</sup>; Benjamin Hall, MD<sup>4</sup></p><p><sup>1</sup>Brown Surgical Associates, Providence, RI; <sup>2</sup>Department of Surgery and Nutritional Support at Rhode Island Hospital, Providence, RI; <sup>3</sup>Rhode Island Hospital, Providence, RI; <sup>4</sup>Brown Surgical Associates, Brown University School of Medicine, Providence, RI</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Parenteral nutrition (PN) provides adequate nutrition and fluids to patients with impaired gastrointestinal function who cannot meet their nutritional needs orally or enterally. PN requires a venous access device that has associated risks including infection as well as metabolic abnormalities associated with the therapy. Monitoring of PN therapy in the hospital setting involves regular blood work, yet improperly collected samples can lead to abnormal laboratory results and unnecessary medical interventions.</p><p><b>Methods:</b> An IRB exempted quality improvement study was conducted at Rhode Island Hospital by the Surgical Nutrition Service which manages all adult PN. The purpose of the study was to quantify the occurrence of contaminated blood samples among PN patients between January 1, 2024, and August 31, 2024. Demographic data, venous access device, and PN-related diagnoses were collected. Quantification of contaminated blood specimens was determined per patient, per hospital unit, and adjusted for total PN days. Comparisons were made between serum glucose, potassium, and phosphorus levels from contaminated and redrawn blood samples. Descriptive data are reported.</p><p><b>Results:</b> 138 patients received PN for a total of 1840 days with a median length of PN therapy of 8 days (IQR 9, range 2-84). The most common vascular access device was dual lumen peripherally inserted central catheter. The majority (63%) of patients were referred by surgery teams and received care on surgical floors or critical care units. The most frequent PN related diagnoses were ileus, gastric or small bowel obstruction, and short bowel syndrome. There were 74 contaminated blood specimens among 42 (30%) patients receiving TPN for a rate of 4% per total patient days. Of 25 nursing units, 64% had at least one occurrence of contaminated blood specimens among TPN patients on that unit. Contaminated samples showed significantly different serum glucose, potassium, and phosphorus compared to redrawn samples (p &lt; 0.001); glucose in contaminated vs redrawn samples was 922 ± 491 vs 129 ± 44 mg/dL; potassium 6.1 ± 1.6 vs 3.9 ± 0.5 mEq/L; phosphorus 4.9 ± 1.2 vs 3.3 ± 0.6 mg/dL. 
The average time delay between repeated blood samples was 3 hours.</p><p><b>Conclusion:</b> Contaminated blood samples can lead to delays in patient care, discomfort from multiple blood draws, unnecessary medical interventions (insulin; discontinuation of PN), delay in placement of timely PN orders, and increased infection risk. Nursing re-education on proper blood sampling techniques is critical for reducing contamination occurrences. All policies and procedures will be reviewed, and an educational program will be implemented. Following this, occurrences of blood contamination during PN will be reassessed.</p><p>Hassan Dashti, PhD, RD<sup>1</sup>; Priyasahi Saravana<sup>1</sup>; Meghan Lau<sup>1</sup></p><p><sup>1</sup>Massachusetts General Hospital, Boston, MA</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> ASN Nutrition 2024.</p><p><b>Publication:</b> Saravana P, Lau M, Dashti HS. Continuous glucose monitoring in adults with short bowel syndrome receiving overnight infusions of home parenteral nutrition. Eur J Clin Nutr. 2024 Nov 23. doi: 10.1038/s41430-024-01548-z. Online ahead of print. PMID: 39580544.</p><p><b>Financial Support:</b> ASPEN Rhoads Research Foundation.</p><p>Maria Romanova, MD<sup>1</sup>; Azadeh Lankarani-Fard, MD<sup>2</sup></p><p><sup>1</sup>VA Greater Los Angeles Healthcare System, Oak Park, CA; <sup>2</sup>VA Greater Los Angeles Healthcare System, Los Angeles, CA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition is a serious complication of the hospital stay. Parenteral nutrition (PN) is the most sophisticated way of addressing it but requires ongoing monitoring. In our medical center, PN provision is guided by the interdisciplinary Nutrition Support Team (NST). In 2024, we began creating a dashboard to monitor safety and utilization of PN at the Greater Los Angeles VA. Here we discuss the collaborative process of developing the dashboard and its first use.</p><p><b>Methods:</b> A dashboard was constructed using data from the VA electronic health record. The dashboard used Microsoft Power BI technology to customize data visualization. The NST group worked closely with the Data Analytics team at the facility to modify and validate the dashboard to accommodate the needs of the group. The dashboard was maintained behind a VA firewall and was only accessible to members of the NST. The dashboard reviewed patient-level data for patients for whom a Nutrition Support consult was placed over the last 2 years. The variables included were the date of admission, date of consult request, the treating specialty at the time of request, demographics, admission diagnosis, discharge diagnosis, number of orders for PPN/TPN, number of blood sugars &gt;200 mg/dL after admission, number of serum phosphorus values &lt; 2.5 mg/dL, number of serum potassium values &lt; 3.5 mmol/L, any discharge diagnosis of refeeding (ICD 10 E87.8), micronutrient levels during admission, and any discharge diagnosis of infection. The ICD10 codes used to capture infection were: bacteremia (R78.81), sepsis (A41.*), or catheter-associated line infection (ICD10 = T80.211*). The asterisk (*) denotes any number in that ICD10 classification. The dashboard was updated once a week.
The NST validated the information on the dashboard to ensure accuracy and refined information as needed.</p><p><b>Results:</b> The initial data extraction noted duplicate consult requests as patients changed treating specialties during the same admission and duplicate orders for PPN/TPN as the formulations were frequently modified before administration. The Data Analytics team worked to reduce these duplicates. The NST also collaborated with the Data Analytics team to modify their existing documentation to better capture the data needed going forward. Dashboard data was verified by direct chart review. Between April 2022 and 2024, 68 consults were placed from the acute care setting and 58 patients received PPN or TPN during this time period. Thirty-five patients experienced hyperglycemia. Two patients were deemed to have experienced refeeding at the time of discharge. Fourteen episodes of infection were noted in those who received PPN/TPN, but the etiology was unclear from the dashboard alone and required additional chart review.</p><p><b>Conclusion:</b> A dashboard can facilitate monitoring of Nutrition Support services in the hospital. Refinement of the dashboard requires collaboration between the clinical team and the data analytics team to ensure validity and workload capture.</p><p>Michael Fourkas, MS<sup>1</sup>; Julia Rasooly, MS<sup>1</sup>; Gregory Schears, MD<sup>2</sup></p><p><sup>1</sup>PuraCath Medical Inc., Newark, CA; <sup>2</sup>Department of Anesthesiology and Perioperative Medicine, Mayo Clinic, Rochester, MN</p><p><b>Financial Support:</b> Funding of the study has been provided by Puracath Medical.</p><p><b>Background:</b> Intravenous catheters can provide venous access for drug and nutrition delivery in patients for extended periods of time, but risk the occurrence of central line-associated bloodstream infections (CLABSI) due to inadequate asepsis. Needleless connectors (NC), which provide access for injection of medications, are known to be one of the major sources of contamination. Studies demonstrate that current methods of disinfecting connectors, such as a 15-second antiseptic wipe, do not guarantee complete disinfection inside of connectors. With the rise of superbugs such as Candida auris, there is an urgent need for better aseptic technique compliance and non-antibiotic disinfection methods. Ultraviolet light-C (UV-C) is an established technology that is commonly used in hospital settings for disinfection of equipment and rooms. In this study, we investigate the efficacy of our novel UV-C light disinfection device on UV-C light-transmissive NCs inoculated with common CLABSI-associated organisms.</p><p><b>Methods:</b> Staphylococcus aureus (ATCC #6538), Candida albicans (ATCC #10231), Candida auris (CDC B11903), Escherichia coli (ATCC #8739), Pseudomonas aeruginosa (ATCC #9027), and Staphylococcus epidermidis (ATCC #12228) were used as test organisms for this study. A total of 29 NC samples were tested for each organism with 3 positive controls and 1 negative control. Each UV-C light-transmissive NC was inoculated with 10 µl of cultured inoculum (7.00-7.66 log) and was exposed to an average of 48 mW/cm2 of UV light for 1 second using our in-house UV light disinfection device Firefly™. After UV disinfection, 10 mL of 0.9% saline solution was flushed through the NC and filtered through a 0.45 µm membrane. The membrane filter was plated onto an agar medium matched to the organism and was incubated overnight at 37°C for S. aureus, E. coli, S. epidermidis, and P.
aeruginosa, and two days at room temperature for C. albicans and C. auris. Positive controls followed the same procedure without exposure to UV light and diluted by 100x before being spread onto agar plates in triplicates. The negative controls followed the same procedure without inoculation. After plates were incubated, the number of colonies on each plate were counted and recorded. Log reduction was calculated by determining the positive control log concentration over the sample concentration in cfu/mL. 1 cfu/10 mL was used to make calculations for total kills.</p><p><b>Results:</b> Using our UV light generating device, we were able to achieve greater than 4 log reduction average and complete kills for all test organisms. The log reduction for S. aureus, C. albicans, C. auris, E. coli, P. aeruginosa, and S. epidermidis were 5.29, 5.73, 5.05, 5.24, 5.10, and 5.19, respectively.</p><p><b>Conclusion:</b> We demonstrated greater than 4-log reduction in common CLABSI-associated organisms using our UV light disinfection device and UV-C transmissive NCs. By injecting inoculum directly inside the NC, we demonstrated that disinfection inside NCs can be achieved, which is not possible with conventional scrubbing methods. A one second NC disinfection time will allow less disruption in the workflow in hospitals, particularly in intensive care units where highly effective and efficient disinfection rates are essential for adoption of the technology.</p><p><b>Table 1.</b> Log Reduction of Tested Organisms After Exposure to 48 mW/cm2 UV-C for 1 Second.</p><p></p><p>Yaiseli Figueredo, PharmD<sup>1</sup></p><p><sup>1</sup>University of Miami Hospital, Miami, FL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Octreotide belongs to the somatostatin analog class. It is used off-label for malignant bowel obstructions (MBO). Somatostatin analogs (SSA) inhibit the release and action of multiple hormones, reducing gastric secretions, peristalsis, and splanchnic blood flow while enhancing water and electrolyte absorption. National Comprehensive Cancer Network (NCCN) guidelines recommend octreotide 100-300 mcg subcutaneous twice to three times a day or 10-40 mcg/hour continuous infusion for the management of malignant bowel obstructions, and if prognosis is greater than 8 weeks, consider long-acting release (LAR) or depot injection. Using octreotide as an additive to parenteral nutrition solutions has been a debatable topic due to concerns of formation of a glycosyl octreotide conjugate that may decrease the octreotide's efficacy. However, other compatibility studies have concluded little octreotide loss over 48 hours in TPN solutions at room temperature in ambient room light. At the University of Miami Hospital, it is practiced using octreotide as an additive to Total Parenteral Nutrition (TPN) solutions to reduce gastro-intestinal secretions in patients with malignant bowel obstructions. The starting dose is 300 mcg, and dose is increased on 300 mcg increments to a maximum dose of 900 mcg if output remains uncontrolled/elevated. The objective of this study is to evaluate the effectiveness and safety of octreotide as an additive to TPN in hospitalized patients with malignant bowel obstructions.</p><p><b>Methods:</b> A three-year retrospective chart review (June 2021-June 2024) was conducted to evaluate the effectiveness and safety of octreotide as an additive to TPN in hospitalized patients with MBO diagnosis at UMH. 
The following information was obtained from chart review: age, gender, oncologic diagnosis, TPN indication, TPN dependency, octreotide doses used, baseline and final gastrointestinal secretion output recorded, type of venting gastrostomy in place, length of hospital stay, and baseline and final hepatic function tests.</p><p><b>Results:</b> A total of 27 patients were identified to have malignant bowel obstruction requiring TPN which had octreotide additive. All patients were started on octreotide 300 mcg/day added into 2-in-1 TPN solution. The gastrointestinal secretion output was reduced on average by 65% among all patients with a final average daily amount of 540 mL recorded. The baseline average output recorded was 1,518 mL/day. The average length of treatment as an inpatient was 23 days, range 3-98 days. Liver function tests (LFTs) were assessed at baseline and last inpatient value available for the admission. Four out of the 27 patients (15%) reviewed were observed to have a significant rise in liver enzymes greater than three times the upper limit of normal.</p><p><b>Conclusion:</b> Octreotide represents a valuable addition to the limited pharmacological options for managing malignant bowel obstruction. Its ability to reduce gastrointestinal secretions by 65% on average as observed in this retrospective chart review can significantly alleviate symptoms and improve patient care. Using octreotide as an additive to TPN solutions for patients with malignant bowel obstructions who are TPN dependent reduces the number of infusions or subcutaneous injections patients receive per day. According to octreotide's package insert, the incidence of hepato-biliary complications is up to 63%. The finding that 15% of patients from this retrospective chart review had significant liver enzyme elevations remains an important monitoring parameter to evaluate.</p><p>Pavel Tesinsky, Assoc. Prof., MUDr.<sup>1</sup>; Jan Gojda, Prof., MUDr, PhD<sup>2</sup>; Petr Wohl, MUDr, PhD<sup>3</sup>; Katerina Koudelkova, MUDr<sup>4</sup></p><p><sup>1</sup>Department of Medicine, Prague, Hlavni mesto Praha; <sup>2</sup>Department of Medicine, University Hospital, 3rd Faculty of Medicine Charles University in Prague, Praha, Hlavni mesto Praha; <sup>3</sup>Institute for Clinical and Experimental Medicine, Prague, Hlavni mesto Praha; <sup>4</sup>Department of Medicine, University Hospital and 3rd Faculty of Medicine in Prague, Prague, Hlavni mesto Praha</p><p><b>Financial Support:</b> The Registry was supported by Takeda and Baxter scientific grants.</p><p><b>Background:</b> Trends in indications, syndromes, performance, weaning, and complications of patients on total HPN based on the updated 30 years analysis and stratification of patients on home parenteral nutrition (HPN) in Czech Republic.</p><p><b>Methods:</b> Records from the HPN National Registry were analysed for the time period 2007 – 2023, based on the data from the HPN centers. Catheter related sepsis (CRS), catheter occlusions, and thrombotic complications were analyzed for the time–to–event using the competing-risks regression (Fine and Gray) model. Other data is presented as median or mean with 95% CI (p &lt; 0.05 as significant).</p><p><b>Results:</b> The incidence rate of HPN is 1.98 per 100.000 inhabitants (population 10.5 mil.). Lifetime dependency is expected in 20% patients, potential weaning in 40%, and 40% patients are palliative. 
Out of 1838 records representing almost 1.5 million catheter days, short bowel syndrome was present in 672 patients (36.6%), intestinal obstruction in 531 patients (28.9%), malabsorption in 274 patients (14.9%), and the remaining 361 patients (19.6%) were split among fistulas, dysphagia, or unspecified causes. The majority of SBS cases were type I (57.8%) and type II (20.8%). Mean length of residual intestine was 104.3 cm (35.9-173.4 cm), with longer remnants in type I SBS. Dominant indications for HPN were pseudoobstruction (35.8%), non-malignant surgical conditions (8.9%), Crohn disease (7.3%), and mesenteric occlusion (6.8%). Mobility for a substantial part of the day was reported by 77.8% of HPN patients, and economic activity and independence by 162 (24.8%) of 653 potentially economically active patients. A tunneled catheter was primarily used in 49.1% of patients, a PICC in 24.3%, and an IV port in 19.8%. Commercially prepared bags were used in 69.7% of patients, and pharmacy-prepared admixtures in 24.7%. A total of 66.9% of patients were administered 1 bag per day, 7 days a week. The sepsis ratio per 1000 catheter days decreased from 0.84 in 2013 to 0.15 in 2022. The catheter occlusion ratio decreased from 0.152 to 0.10 per 1000 catheter days, and the thrombotic complication ratio from 0.05 to 0.04. Prevalence of metabolic bone disease is 15.6%, and prevalence of PNALD is 22.3%. In the first 12 months, 28% of patients achieved intestinal autonomy, increasing to 45% after 5 years. Patient survival rate is 62% in the first year, 45% at 5 years, and 35% at the 10-year mark. Teduglutide has been indicated in 36 patients to date, with reduction of the daily HPN volume to 60.3% on average.</p><p><b>Conclusion:</b> The prevalence of HPN patients in the Czech Republic has been increasing over the past ten years, corresponding to the incidence rate. The majority of patients are expected to terminate HPN within the first year. Risk of CRS decreased significantly in the past five years and remains low, while catheter occlusion and thrombotic complications have a stable trend. Teduglutide significantly reduced the required IV volume.</p><p></p><p><b>Figure 1.</b> Per-Year Prevalence of HPN Patients and Average Number of Catheter Days Per Patient (2007 - 2022).</p><p></p><p><b>Figure 2.</b> Annual Incidence of HPN Patients (2007 - 2022).</p><p></p><p><b>Figure 3.</b> Catheter-Related Bloodstream Infections (Events per 1,000 Catheter-Days).</p><p>Jill Murphree, MS, RD, CNSC, LDN<sup>1</sup>; Anne Ammons, RD, LDN, CNSC<sup>2</sup>; Vanessa Kumpf, PharmD, BCNSP, FASPEN<sup>2</sup>; Dawn Adams, MD, MS, CNSC<sup>2</sup></p><p><sup>1</sup>Vanderbilt University Medical Center, Nashville, TN; <sup>2</sup>Vanderbilt University Medical Center, Nashville, TN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Determining macronutrient goals in patients requiring home parenteral nutrition (HPN) can be difficult due to various factors. While indirect calorimetry is the gold standard for measuring energy expenditure, it is not readily available in the outpatient setting. Therefore, clinicians typically rely on less accurate weight-based equations for assessment of protein and energy requirements. Energy goals are also impacted by the targeted desire for weight loss, weight gain, or weight maintenance. Patients receiving HPN may consume some oral dietary intake and experience variable degrees of macronutrient absorption.
These factors, as well as underlying clinical conditions, can significantly impact protein and energy requirements and may change over the course of HPN therapy. The purpose of this study was to evaluate the range of protein and energy doses prescribed in patients receiving HPN managed by an interdisciplinary intestinal failure clinic at a large academic medical center.</p><p><b>Methods:</b> Patient demographics including patient age, gender, and PN indication/diagnosis were retrospectively obtained for all patients discharged home with PN between May 2021 to May 2023 utilizing an HPN patient database. Additional information was extracted from the electronic medical record at the start of HPN, then at 2-week, 2 to 3 month, and 6-month intervals following discharge home that included height, actual weight, target weight, HPN energy dose, HPN protein dose, and whether the patient was eating. Data collection ended at completion of HPN therapy or up to 6 months of HPN. All data was entered and stored in an electronic database.</p><p><b>Results:</b> During the study period, 248 patients were started on HPN and 56 of these patients received HPN for at least 6 months. Patient demographics are included in Table 1. At the start of HPN, prescribed energy doses ranged from 344 to 2805 kcal/d (6 kcal/kg/d to 45 kcal/kg/d) and prescribed protein doses ranged from 35 to 190 g/d (0.6 g/kg/d to 2.1 g/kg/d). There continued to be a broad range of prescribed energy and protein doses at 2-week, 2 to 3 month, and 6-month intervals of HPN. Figures 1 and 2 provide the prescribed energy and protein doses for all patients and for those who are eating and not eating. For patients not eating, the prescribed range for energy was 970 to 2791 kcal/d (8 kcal/kg/d to 45 kcal/kg/d) and for protein was 40 to 190 g/d (0.6 g/kg/d to 2.0 g/kg/d) at the start of PN therapy. The difference between actual weight and target weight was assessed at each study interval. Over the study period, patients demonstrated a decrease in the difference between actual and target weight to suggest improvement in reaching target weight (Figure 3).</p><p><b>Conclusion:</b> The results of this study demonstrate a wide range of energy and protein doses prescribed in patients receiving HPN. This differs from use of PN in the inpatient setting, where weight-based macronutrient goals tend to be more defined. Macronutrient adjustments may be necessary in the long-term setting when patients are consuming oral intake, for achievement/maintenance of target weight, or for changes in underlying conditions. Patients receiving HPN require an individualized approach to care that can be provided by interdisciplinary nutrition support teams specializing in intestinal failure.</p><p><b>Table 1.</b> Patient Demographics Over 6-Month Study Period.</p><p></p><p></p><p><b>Figure 1.</b> Parenteral Nutrition (PN) Energy Range.</p><p></p><p><b>Figure 2.</b> Parenteral Nutrition (PN) Protein Range.</p><p></p><p><b>Figure 3.</b> Difference Between Actual Weight and Target Weight.</p><p>Jennifer Lachnicht, RD, CNSC<sup>1</sup>; Christine Miller, PharmD<sup>2</sup>; Jessica Younkman, RD CNSC<sup>2</sup></p><p><sup>1</sup>Soleo Home Infusion, Frisco, TX; <sup>2</sup>Soleo Health, Frisco, TX</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Since the 1990s, initiating parenteral nutrition (PN) at home has been performed, though some clinicians prefer hospital initiation due to risks like refeeding syndrome (RS). 
A key factor in successful home PN initiation is careful evaluation by an experienced nutrition support clinician, particularly assessing RS risk. In 2020, ASPEN published consensus recommendations for identifying patients at risk for RS and guidelines for initiating and advancing nutrition. Home PN initiation offers advantages, including avoiding hospitalization, reducing healthcare costs, minimizing hospital-acquired infections, and improving quality of life. Literature suggests a savings of $2,000 per day when PN is started at home. This study aims to determine the risk and incidence of RS in patients who began PN at home, based on the 2020 ASPEN Consensus Recommendations for RS.</p><p><b>Methods:</b> A national home infusion provider's nutrition support service reviewed medical records for 27 adult patients who initiated home PN between September 2022 and June 2024. Patients were evaluated for RS risk before PN initiation and the actual incidence of RS based on pre- and post-feeding phosphorus, potassium, and magnesium levels using ASPEN 2020 criteria. Initial lab work was obtained before and after PN initiation. Refeeding risk was categorized as mild, moderate, moderate to severe, or severe based on initial nutrition assessment, including BMI, weight loss history, recent caloric intake, pre-feeding lab abnormalities, fat/muscle wasting, and high-risk comorbidities. The percent change in phosphorus, potassium, and magnesium was evaluated and categorized as mild, moderate, or severe if levels decreased after therapy start. Initial PN prescriptions included multivitamins and supplemental thiamin per provider policy and consensus recommendations.</p><p><b>Results:</b> The average baseline BMI for the study population was 19.6 kg/m² (range 12.7-31.8, median 18.9). Weight loss was reported in 88.9% of patients, averaging 22%. Little to no oral intake at least 5-10 days before assessment was reported in 92.3% of patients. Initial lab work was obtained within 5 days of therapy start in 96.2% of cases, with 18.5% showing low prefeeding electrolytes. 100% had high-risk comorbidities. RS risk was categorized as mild (4%), moderate (48%), moderate to severe (11%), and severe (37%). Home PN was successfully initiated in 25 patients (93%). Two patients could not start PN at home: one due to persistently low pre-PN electrolyte levels despite IV repletion, and one due to not meeting home care admission criteria. Starting dextrose averaged 87.2 g/d (range: 50-120, median 100). Average total starting calories were 730 kcals/d, representing 12.5 kcals/kg (range: 9-20, median 12). Initial PN formula electrolyte content included potassium (average 55.4 mEq/d, range: 15-69, median 60), magnesium (average 11.6 mEq/d, range: 4-16, median 12), and phosphorus (average 15.6 mmol/d, range: 8-30, median 15). Labs were drawn on average 4.3 days after therapy start. Potassium, magnesium, and phosphorus levels were monitored for decreases ≥10% of baseline to detect RS. Decreases in magnesium and potassium were classified as mild (10-20%) and experienced by 4% of patients, respectively. Eight patients (32%) had a ≥ 10% decrease in phosphorus: 4 mild (10-20%), 2 moderate (20-30%), 2 severe (&gt;30%).</p><p><b>Conclusion:</b> Home initiation of PN can be safely implemented with careful monitoring and evaluation of RS risk. This review showed a low incidence of RS based on ASPEN criteria, even in patients at moderate to severe risk prior to home PN initiation. 
Close monitoring of labs and patient status, along with adherence to initial prescription recommendations, resulted in successful PN initiation at home in 92.5% of patients.</p><p>Dana Finke, MS, RD, CNSC<sup>1</sup>; Christine Miller, PharmD<sup>1</sup>; Paige Paswaters, RD, CNSC<sup>1</sup>; Jessica Younkman, RD, CNSC<sup>1</sup></p><p><sup>1</sup>Soleo Health, Frisco, TX</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Iron deficiency anemia is common in home parenteral nutrition (HPN) patients (Hwa et al., 2016). Post iron infusion, hypophosphatemia due to fibroblast growth factor 23 is a known side effect of some IV iron formulations (Wolf et al., 2019). Ferric carboxymaltose can cause serum phosphorus levels to drop below 2 mg/dL in 27% of patients (Onken et al., 2014). Hypophosphatemia (&lt; 2.7 mg/dL) can lead to neurological, neuromuscular, cardiopulmonary, and hematologic issues, and long-term effects like osteopenia and osteoporosis (Langley et al., 2017). This case series reviews the occurrence and clinical implications of transient serum phosphorus drops in patients receiving ferric carboxymaltose and HPN.</p><p><b>Methods:</b> A retrospective case series review was performed for three patients who were administered ferric carboxymaltose while on HPN therapy. Serum phosphorus levels were measured at baseline prior to initial iron dose, within 1 week post injection, and at subsequent time intervals up to 4 months post initial injection. Data was collected on the timing, magnitude, and duration of any phosphorus decreases, and any associated clinical symptoms or complications. Patient records were also reviewed to evaluate any correlations with HPN composition or phosphorus dosing.</p><p><b>Results:</b> Among the three patients reviewed, all exhibited short-term drops in serum phosphorus levels following ferric carboxymaltose injection (Table 1). All patients had baseline serum phosphorus levels within normal limits prior to the initial dose of ferric carboxymaltose. All cases involved multiple doses of ferric carboxymaltose, which contributed to the fluctuations in phosphorus levels. The average drop in serum phosphorus from baseline to the lowest point was 50.3%. The lowest recorded phosphorus level among these three patients was 1.4 mg/dL, and this was in a patient who received more than two doses of ferric carboxymaltose. In two cases, increases were made in HPN phosphorus in response to serum levels, and in one case no HPN changes were made. However, all serum phosphorus levels returned to normal despite varied interventions. Despite the low phosphorus levels, none of the patients reported significant symptoms of hypophosphatemia during the monitoring periods. Ferric carboxymaltose significantly impacts serum phosphorus in HPN patients, consistent with existing literature. The need for vigilant monitoring is highlighted, patients receiving HPN are closely monitored by a trained nutrition support team with frequent lab monitoring. Lab monitoring in patients receiving ferric carboxymaltose who are not on HPN may be less common. The lowest level recorded was 1.4 mg/dL, indicating potential severity. Despite significant drops, no clinical symptoms were observed, suggesting subclinical hypophosphatemia may be common. In two of the reviewed cases, hypophosphatemia was addressed by making incremental increases in the patient's HPN formulas. 
Note that there are limitations to phosphorus management in HPN due to compatibility and stability issues, and alternative means of supplementation may be necessary depending upon the patient's individual formula.</p><p><b>Conclusion:</b> Three HPN patients receiving ferric carboxymaltose experienced transient, generally mild reductions in serum phosphorus. Monitoring is crucial, but the results from this case series suggest that clinical complications are rare. Adjustments to HPN or additional supplementation may be needed based on individual patient needs, with some cases self-correcting over time.</p><p><b>Table 1.</b> Timeline of Iron Injections and the Resulting Serum Phosphorus Levels and HPN Formula Adjustments.</p><p></p><p>Danial Nadeem, MD<sup>1</sup>; Stephen Adams, MS, RPh, BCNSP<sup>2</sup>; Bryan Snook<sup>2</sup></p><p><sup>1</sup>Geisinger Wyoming Valley, Bloomsburg, PA; <sup>2</sup>Geisinger, Danville, PA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Ferric carboxymaltose (FC) is a widely used intravenous iron formulation, primarily employed in the treatment of iron deficiency. It offers significant benefits, particularly in cases where oral iron supplementation proves ineffective or is not well-tolerated. However, an important potential adverse effect associated with FC use is hypophosphatemia. This condition has been observed in multiple patients following their treatment with FC. This report discusses the potential mechanisms leading to this adverse effect and its significant implications for patient care.</p><p><b>Methods:</b> A middle-aged female with a history of malnutrition and iron deficiency receiving parenteral nutrition at home had received multiple doses of Venofer in the past, with the last dose given in 2017, to which the patient developed an anaphylactic reaction. She was therefore switched to ferric carboxymaltose (FCM) therapy. However, upon receiving multiple doses of FCM in 2018, the patient developed significant hypophosphatemia. As hypophosphatemia was noted, adjustments were made to the patient's total parenteral nutrition (TPN) regimen to increase the total phosphorus content in an effort to treat the low phosphate levels. The patient also received continued doses of FCM in subsequent years, with persistent hypophosphatemia despite repletion.</p><p><b>Results:</b> Ferric carboxymaltose (FC) is a widely used intravenous iron therapy for the treatment of iron deficiency. It is particularly beneficial in cases where oral iron supplementation is ineffective or not tolerated. FC works by delivering iron directly to the macrophages in the reticuloendothelial system. The iron is then released slowly for use by the body, primarily for the production of hemoglobin. However, recent studies have highlighted the potential adverse effect of hypophosphatemia associated with the use of FC. Hypophosphatemia induced by FC is thought to be caused by an increase in the secretion of the hormone fibroblast growth factor 23 (FGF23). FGF23 is a hormone that regulates phosphate homeostasis. When FGF23 levels rise, the kidneys increase the excretion of phosphate, leading to lower levels of phosphate in the blood. There are many implications of hypophosphatemia with regard to patient care. Symptoms of hypophosphatemia can include muscle weakness, fatigue, bone pain, and confusion. In severe cases, persistent hypophosphatemia can lead to serious complications such as rhabdomyolysis, hemolysis, respiratory failure, and even death.
Therefore, it is crucial for clinicians to be aware of the potential risk of hypophosphatemia when administering FC. Regular monitoring of serum phosphate levels is recommended in patients receiving repeated doses of FC. Further research is needed to fully understand the mechanisms underlying this adverse effect and to develop strategies for its prevention and management.</p><p><b>Conclusion:</b> In conclusion, while FC is an effective treatment for iron deficiency, it is important for clinicians to be aware of the potential risk of hypophosphatemia. Regular monitoring of serum phosphate levels is recommended in patients receiving repeated doses of FC. Further research is needed to fully understand the mechanisms underlying this adverse effect and to develop strategies for its prevention and management.</p><p><b>Table 1.</b> Phosphorous Levels and Iron Administration.</p><p></p><p>Table 1 shows the response to serum phosphorous levels in a patient given multiple doses of intravenous iron over time.</p><p>Ashley Voyles, RD, LD, CNSC<sup>1</sup>; Jill Palmer, RD, LD, CNSC<sup>1</sup>; Kristin Gillespie, MD, RD, LDN, CNSC<sup>1</sup>; Suzanne Mack, MS, MPH, RD, LDN, CNSC<sup>1</sup>; Tricia Laglenne, MS, RD, LDN, CNSC<sup>1</sup>; Jessica Monczka, RD, CNSC, FASPEN<sup>1</sup>; Susan Dietz, PharmD, BCSCP<sup>1</sup>; Kathy Martinez, RD, LD<sup>1</sup></p><p><sup>1</sup>Option Care Health, Bannockburn, IL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Home parenteral nutrition (HPN) can be successfully initiated in the home setting with careful evaluation and management by an experienced nutrition support team (NST).<sup>1,2</sup> Safe candidates for HPN initiation include medically stable patients with an appropriate indication, a safe environment, and the means for reliable follow-up. However, some patients are not appropriate to start directly with HPN due to logistical reasons or the risk of refeeding syndrome (RFS).<sup>2</sup> Consensus Recommendations for RFS provide guidance regarding recognizing and managing risk.<sup>3</sup> An experienced NST that provides individualized care can recommend intravenous (IV) hydration before initiating HPN to expedite the initiation of therapy and normalize blood chemistry to mitigate the risk of RFS. The purpose of this study is to evaluate the impact of IV hydration on adult patients managed by a home infusion NST who received IV hydration prior to initiating HPN. The proportion of patients who received IV hydration prior to HPN, the reason for initiating IV hydration, and the impact this intervention may have had on their care and outcomes will be described.</p><p><b>Methods:</b> This retrospective review includes 200 HPN patients 18 years of age and older initiated on HPN therapy with a national home infusion pharmacy over a 6-month period from January 1, 2024, to June 30, 2024. Data collection included baseline demographics, indication for HPN, risk of RFS, days receiving IV hydration prior to initiating HPN, the number of rehospitalizations within the first 2 weeks, whether they received IV hydration, and if so, the indication for IV hydration and the components of the orders. Data was collected via electronic medical records and deidentified into a standardized data collection form.</p><p><b>Results:</b> Of the 200 total patients, 19 (9.5%) received IV hydration prior to HPN. Of these 19 patients, 16 were female, and 3 were male (Table 1). 
The most common indications for HPN were bariatric surgery complication (5), intestinal failure (4), and oncology diagnosis (4) (Figure 1). Among these patients, 9 (47%) were at moderate RFS risk and 10 (53%) were at high RFS risk. The indications for IV hydration included 7 (37%) due to electrolyte abnormalities/RFS risk, 5 (26%) due to delay in central line placement, and 7 (37%) due to scheduling delays (Figure 2). IV hydration orders included electrolytes in 15 (79%) of the orders. All orders without electrolytes (4) had an indication related to logistical reasons (Figure 3). All 19 patients started HPN within 7 days of receiving IV hydration. Two were hospitalized within the first two weeks of therapy with admitting diagnoses unrelated to HPN.</p><p><b>Conclusion:</b> In this group of patients, HPN was successfully initiated in the home setting when managed by an experienced NST, preventing unnecessary hospitalizations. This study demonstrated that safe initiation of HPN may include IV hydration with or without electrolytes first, either to mitigate RFS or due to logistical reasons, when started on HPN within 7 days. The IV hydration orders were individualized to fit the needs of each patient. This data only reflects IV hydration dispensed through the home infusion pharmacy and does not capture IV hydration received at an outside clinic. This patient population also does not include those deemed medically unstable for HPN or those not conducive to starting in the home setting for other factors. Future research should account for these limitations and detail IV hydration components, dosing, and frequency of orders.</p><p><b>Table 1.</b> Demographics.</p><p></p><p></p><p><b>Figure 1.</b> HPN Indications of IV Hydration.</p><p></p><p><b>Figure 2.</b> Indication for IV Hydration and Refeeding Risk.</p><p></p><p><b>Figure 3.</b> Indications and Types of IV Hydration.</p><p>Emily Boland Kramer, MS, RD, LDN, CNSC<sup>1</sup>; Jessica Monczka, RD, CNSC, FASPEN<sup>1</sup>; Tricia Laglenne, MS, RD, LDN, CNSC<sup>1</sup>; Ashley Voyles, RD, LD, CNSC<sup>1</sup>; Susan Dietz, PharmD, BCSCP<sup>1</sup>; Kathy Martinez, RD, LD<sup>1</sup></p><p><sup>1</sup>Option Care Health, Bannockburn, IL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Parenteral nutrition (PN) is a critical therapy for patients who are unable to absorb nutrients adequately via the gastrointestinal tract.<sup>1</sup> PN is complex, with 10 or more individually dosed components in each order which inherently increases the risk for dosing errors. <sup>2</sup> This study seeks to analyze the PN orders at hospital discharge received by a home infusion provider and identify the incidence of the omission of the standard components, as determined by ASPEN Recommendations on Appropriate Parenteral Nutrition Dosing.<sup>3</sup> The primary objective of this study was to identify missing sodium, potassium, magnesium, calcium, phosphorus, and multivitamin components in hospital discharge orders. The secondary objective was to determine whether identified missing components were added back during the transition of care (TOC) process from hospital to home.</p><p><b>Methods:</b> This multi-center, retrospective chart review analyzed patients referred to a national home infusion provider over a 3-month period. Data was collected from the electronic medical record and internal surveillance software. 
The Registered Dietitian, Certified Nutrition Support Clinician (RD, CNSC) reviewed all PN hospital discharge orders for patients transitioning clinical management and PN therapy from hospital to home. Inclusion criteria were patients 18 years of age or older with PN providing the majority of their nutritional needs, as defined in Table 1, who were missing sodium, potassium, magnesium, calcium, phosphorus, or multivitamin from their PN order at hospital discharge. Exclusion criteria were patients less than 18 years of age, patients receiving supplemental PN not providing majority of nutritional needs, and patients with doses ordered for all electrolytes and multivitamin.</p><p><b>Results:</b> During the 3-month period (April 1, 2024 to June 30, 2024), 267 patients were identified who were greater than 18 years of age, receiving the majority of their nutritional needs via PN, and missing at least one PN component in their hospital discharge order. See Table 2 and Figure 1 for demographics. One hundred seventy-five (65.5%) patients were missing one component and 92 (34.5%) were missing multiple components from the hospital discharge order. One hundred seventy-five (65.5%) patients were missing calcium, 68 (25.5%) phosphorus, 38 (14%) multivitamin, 23 (8.6%) magnesium, 20 (7.5%) potassium, and 20 (7.5%) sodium. During the transition from hospital to home, after discussion with the provider, 94.9% of patients had calcium added back, 94.7% multivitamin, 91.3% magnesium, 90% potassium, 88.2% phosphorus, and 80% sodium.</p><p><b>Conclusion:</b> This study highlights the prevalence of missing components in PN hospital discharge orders, with calcium being the most frequently omitted at a rate of 65.5%. Given that many patients discharging home on PN will require long term therapy, adequate calcium supplementation is essential to prevent bone resorption and complications related to metabolic bone disease. In hospital discharge orders that were identified by the RD, CNSC as missing calcium, 94.9% of the time the provider agreed that it was clinically appropriate to add calcium to the PN order during the TOC process. This underlines the importance of nutrition support clinician review and communication during the transition from hospital to home. Future research should analyze the reasons why components are missing from PN orders and increase awareness of the need for a thorough clinical review of all patients going home on PN to ensure the adequacy of all components required for safe and optimized long term PN.</p><p><b>Table 1.</b> Inclusion and Exclusion Criteria.</p><p></p><p><b>Table 2.</b> Demographics.</p><p></p><p></p><p><b>Figure 1.</b> Primary PN Diagnosis.</p><p></p><p><b>Figure 2.</b> Components Missing from Order and Added Back During TOC Process.</p><p>Avi Toiv, MD<sup>1</sup>; Hope O'Brien, BS<sup>2</sup>; Arif Sarowar, MSc<sup>2</sup>; Thomas Pietrowsky, MS, RD<sup>1</sup>; Nemie Beltran, RN<sup>1</sup>; Yakir Muszkat, MD<sup>1</sup>; Syed-Mohammad Jafri, MD<sup>1</sup></p><p><sup>1</sup>Henry Ford Hospital, Detroit, MI; <sup>2</sup>Wayne State University School of Medicine, Detroit, MI</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Intestinal Failure Associated Liver Disease (IFALD) is a known complication in patients reliant on total parenteral nutrition (TPN), especially those awaiting intestinal transplantation. There is concern that IFALD may negatively impact post-transplant outcomes, including graft survival and overall patient mortality. 
This study aims to evaluate the impact of IFALD, as indicated by liver function test (LFT) abnormalities before intestinal or multivisceral transplant, on transplant outcomes.</p><p><b>Methods:</b> We conducted a retrospective chart review of all patients who underwent intestinal transplantation (IT) at an academic transplant center from 2010 to 2023. The primary outcomes were patient survival and graft failure in transplant recipients.</p><p><b>Results:</b> Among 50 IT recipients, 30 (60%) required TPN before IT. The median age at transplant was 50 years (range, 17-68). In both groups, the majority of transplants were isolated IT, although multivisceral transplants (MVT) were also included. Overall, 87% of patients on TPN developed elevated LFTs before transplant, and 33% had persistently elevated LFTs 1 year after transplant. TPN-associated liver dysfunction in our cohort was associated with mixed liver injury patterns, with both hepatocellular injury (p &lt; 0.001) and cholestatic injury (p &lt; 0.001). No significant associations were found between TPN-related elevated LFTs and major transplant outcomes, including death (p = 0.856), graft failure (p = 0.144), or acute rejection (p = 0.306). Similarly, no significant difference was observed between elevated LFTs and death (p = 0.855), graft failure (p = 0.769), or acute rejection (p = 0.386) in patients who were not on TPN. TPN-associated liver dysfunction before transplant was associated with elevated LFTs at one year after transplant (p &lt; 0.001), but this finding lacked apparent clinical relevance.</p><p><b>Conclusion:</b> Although IFALD related to TPN use is associated with specific liver dysfunction patterns in patients awaiting intestinal or multivisceral transplants, it does not appear to be associated with significant key transplant outcomes such as graft failure, mortality, or acute rejection. However, it is associated with persistently elevated LFTs even 1 year after transplant. These findings suggest that while TPN-related liver injury is common, it may not have a clinically significant effect on long-term transplant success. Further research is needed to explore the long-term implications of IFALD in this patient population.</p><p>Jody (Lind) Payne, RD, CNSC<sup>1</sup>; Kassandra Samuel, MD, MA<sup>2</sup>; Heather Young, MD<sup>3</sup>; Karey Schutte, RD<sup>3</sup>; Kristen Horner, RDN, CNSC<sup>3</sup>; Daniel Yeh, MD, MHPE, FACS, FCCM, FASPEN, CNSC<sup>3</sup></p><p><sup>1</sup>Denver Health, Parker, CO; <sup>2</sup>Denver Health, St. Joseph Hospital, Denver, CO; <sup>3</sup>Denver Health, Denver, CO</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Central line-associated blood stream infection (CLABSI) is associated with increased complications, length of stay and cost of care. The majority of CLABSI studies are focused on home parenteral nutrition (PN) patients and there is a paucity of data documenting the incidence of CLABSI attributable to PN in the inpatient setting. At our institution, we have observed that clinicians are reluctant to initiate PN in patients with clear indications for PN due to concerns about CLABSI. Therefore, we performed a quality improvement project to document our CLABSI rate for new central parenteral nutrition (CPN) initiated during hospitalization.</p><p><b>Methods:</b> We performed a retrospective review of adult inpatients who initiated CPN at our facility from 1/1/23-12/31/23. Patients were excluded if they received PN prior to admission or only received peripheral PN.
The National Healthcare Safety Network (NHSN) definitions were used for CLABSI and secondary attribution. A further, in-depth review of CLABSI cases was performed by an Infectious Disease (ID) consultant to determine if positive cases were attributable to CPN vs other causes. The type of venous access for the positive patients was also reviewed.</p><p><b>Results:</b> A total of 106 inpatients received CPN for a total of 1121 CPN days. The median [IQR] length of CPN infusion was 8 [4-14] days. Mean (standard deviation) age of patients receiving CPN infusion was 53.3 (18.6) years and 65 (61%) were men. The CPN patients who met criteria for CLABSI were further reviewed by the ID consultant, and only four CLABSI cases were judged attributable to CPN. These four cases resulted in an incidence rate of 3.6 cases of CLABSI per 1000 CPN days. Two of these patients were noted for additional causes of infection including gastric ulcer perforation and bowel perforation with anastomotic leak. Three of the patients had CPN infused via a central venous catheter (a port, a femoral line, and a non-tunneled internal jugular catheter) and the fourth patient had CPN infused via a peripherally inserted central catheter. The incidence rate for CLABSI cases per catheter days was not reported in our review.</p><p><b>Conclusion:</b> At our institution, &lt; 4% of patients initiating short-term CPN during hospitalization developed a CLABSI attributable to the CPN. This low rate of infection serves as a benchmark for our institution's quality improvement and quality assurance efforts. Collaboration with ID is recommended for additional deeper review of CPN patients with CLABSI to determine if the infection is more likely to be related to other causes than infusion of CPN.</p><p>Julianne Harcombe, RPh<sup>1</sup>; Jana Mammen, PharmD<sup>1</sup>; Hayato Delellis, PharmD<sup>1</sup>; Stefani Billante, PharmD<sup>1</sup></p><p><sup>1</sup>Baycare, St. Joseph's Hospital, Tampa, FL</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> Florida Residency Conference 2023.</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Refeeding syndrome is defined as potentially fatal shifts in fluids and electrolytes that may occur in malnourished patients receiving enteral or parenteral nutrition. When refeeding syndrome occurs, a reduction in phosphorus/magnesium/potassium levels and thiamine deficiency can be seen shortly after initiation of calorie provision. The American Society for Parenteral and Enteral Nutrition guidelines consider hypophosphatemia the hallmark sign of refeeding syndrome; however, magnesium and potassium have been shown to be equally important. The purpose of this study is to identify the incidence of hypophosphatemia in patients who are at risk of refeeding syndrome and the importance of monitoring phosphorus.</p><p><b>Methods:</b> This study was a multicenter retrospective chart review that was conducted using the BayCare Health System medical records database, Cerner. The study included patients who were 18 years or older, admitted from January 2023 through December 2023, received total parenteral nutrition (TPN), and were at risk of refeeding syndrome. We defined patients at risk of refeeding syndrome as those who met two of the following criteria prior to starting the TPN: body mass index (BMI) prior to starting TPN &lt; 18.5 kg/m2, 5% weight loss in 1 month, no oral intake for 5 days, or low levels of serum phosphorus/magnesium/potassium.
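Returning briefly to the CLABSI incidence reported above: the quoted figure follows the usual device-days convention (attributable cases per 1,000 CPN days). A minimal worked check, using the numbers from that abstract, is sketched below; the variable names are illustrative only.

```python
# Attributable CLABSI cases and total central parenteral nutrition (CPN) days
# as reported in the abstract above.
clabsi_cases = 4
cpn_days = 1121

# Incidence per 1,000 CPN days (device-days convention assumed here).
rate_per_1000 = clabsi_cases / cpn_days * 1000
print(f"{rate_per_1000:.1f} CLABSI per 1,000 CPN days")  # approximately 3.6
```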
COVID-19 patients and patients receiving propofol were excluded from the study. The primary objective of the study was to evaluate the incidence of hypophosphatemia versus hypomagnesemia versus hypokalemia in patients receiving TPN who were at risk of refeeding syndrome. The secondary objective was to evaluate whether the addition of thiamine upon initiation of TPN showed benefit in the incidence of hypophosphatemia.</p><p><b>Results:</b> A total of 83 patients met the criteria for risk of refeeding syndrome. Out of the 83 patients, a total of 53 patients were used to run a pilot study to determine the sample size and 30 patients were included in the study. The results on day 1 and day 2 suggest the incidence of hypomagnesemia differs from that of hypophosphatemia and hypokalemia, with a notably lower occurrence. The Cochran's Q test yielded x2(2) = 9.57 (p-value = 0.008) on day 1 and x2(2) = 4.77 (p-value = 0.097) on day 2, indicating a difference in at least one group compared to the others on only day 1. A post hoc analysis found a difference on day 1 between the incidence of hypophosphatemia vs hypomagnesemia (30%) and hypomagnesemia vs hypokalemia (33.3%). For the secondary outcome, the difference in day 2 versus day 1 phosphorus levels with the addition of thiamine in the TPN was 0.073 (p-value = 0.668, 95% CI [-0.266 – 0.413]).</p><p><b>Conclusion:</b> Among patients who were on parenteral nutrition and were at risk of refeeding syndrome, there was not a statistically significant difference in the incidence of hypophosphatemia vs hypomagnesemia and hypomagnesemia vs hypokalemia. Among patients who were on parenteral nutrition and were at risk of refeeding syndrome, there was not a statistically significant difference on day 2 phosphorus levels vs day 1 phosphorus levels when thiamine was added.</p><p></p><p></p><p>Jennifer McClelland, MS, RN, FNP-BC<sup>1</sup>; Margaret Murphy, PharmD, BCNSP<sup>1</sup>; Matthew Mixdorf<sup>1</sup>; Alexandra Carey, MD<sup>1</sup></p><p><sup>1</sup>Boston Children's Hospital, Boston, MA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Iron deficiency anemia (IDA) is common in patients with intestinal failure (IF) dependent on parenteral nutrition (PN). Treatment with enteral iron is preferred; however, may not be tolerated or efficacious. In these cases, intravenous (IV) iron is a suitable alternative. Complications include adverse reactions and infection, though low when using low-molecular-weight (LMW) formulations. In a home PN (HPN) program, an algorithm (Figure 1) was developed to treat IDA utilizing IV iron.</p><p><b>Methods:</b> A retrospective chart review was conducted in large HPN program (~150 patients annually) from Jan 2019 - April 2024 who were prescribed IV iron following an algorithm. Laboratory studies were analyzed looking for instances of ferritin &gt;500 ng/mL indicating potential iron overload, as well as transferrin saturation 12-20% indicating iron sufficiency. In instances of ferritin levels &gt;500 further review was conducted to understand etiology, clinical significance and if the IV iron algorithm was adhered to.</p><p><b>Results:</b> HPN patients are diagnosed with IDA based on low iron panel (low hemoglobin and/or MCV, low ferritin, high reticulocyte count, serum iron and transferrin saturation and/or high total iron binding capacity (TIBC). If the patient can tolerate enteral iron supplementation, a dose of 3-6 mg/kg/day is initiated. 
If the patient cannot tolerate enteral iron, the IV route is initiated. The initial IV dose is administered in the hospital or infusion center for close monitoring and to establish home maintenance administration post repletion dosing. Iron dextran is preferred as it can be directly added into the PN and run for the duration of the cycle. Addition to the PN eliminates an extra infusion and decreases additional CVC access. Iron dextran is incompatible with IV lipids, so the patient must have one lipid-free day weekly to allow administration. If the patient receives daily IV lipids, iron sucrose is given as a separate infusion from the PN. Maintenance IV iron dosing is 1 mg/kg/week, with dose and frequency titrated based on clinical status, lab studies and trends. Iron panel and C-reactive protein (CRP) are ordered every 2 months. If lab studies are below the desired range and consistent with IDA, IV iron dose is increased by 50% by dose or frequency; if studies are over the desired range, IV iron dose is decreased by 50% by dose or frequency. Maximum home dose is &lt; 3 mg/kg/dose; if a higher dose is needed, the patient is referred to an infusion center. IV iron is suspended if ferritin &gt;500 ng/mL due to risk for iron overload and deposition in the liver. Ferritin results (n = 4165) for all patients in the HPN program from January 2019-April 2024 were reviewed looking for levels &gt;500 ng/mL indicating iron overload. Twenty-nine instances of ferritin &gt;500 ng/mL (0.7% of values reviewed) were identified in 14 unique patients on maintenance IV iron. In 9 instances, the high ferritin level occurred with concomitant acute illness with an elevated CRP; elevated ferritin in these cases was thought to be related to an inflammatory state vs. iron overload. In 2 instances, IV iron dose was given the day before lab draw, rendering a falsely elevated result. Two patients had 12 instances (0.28% of values reviewed) of elevated ferritin thought to be related to IV iron dosing in the absence of inflammation, with normal CRP levels. During this period, there were no recorded adverse events.</p><p><b>Conclusion:</b> IDA is common in patients with IF dependent on PN. Iron is not a standard component or additive in PN. Use of IV iron in this population can increase quality of life by decreasing need for admissions, visits to infusion centers, or need for blood transfusions in cases of severe anemia. IV iron can be safely used for maintenance therapy in HPN patients with appropriate dosing and monitoring.</p><p></p><p><b>Figure 1.</b> Intravenous Iron in the Home Parenteral Nutrition-Dependent Patient Algorithm.</p><p>Lynne Sustersic, MS, RD<sup>1</sup>; Debbie Stevenson, MS, RD, CNSC<sup>2</sup></p><p><sup>1</sup>Amerita Specialty Infusion Services, Thornton, CO; <sup>2</sup>Amerita Specialty Infusion Services, Rochester Hills, MI</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Desmoplastic small round tumor (DSRT) is a soft-tissue sarcoma that causes tumors to form in the abdomen and pelvis. To improve control, cytoreductive surgery with hyperthermic intraperitoneal chemotherapy (CRS/HIPEC) and radiotherapy are often used, which can result in bowel obstruction secondary to sclerosing peritonitis. This necessitates total parenteral nutrition therapy (TPN) due to the inability to consume nutrition orally or enterally. A major complication of parenteral nutrition therapy is parenteral nutrition associated liver disease (PNALD), and the most common site of metastasis for DSRT is the liver.
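As an illustrative aside, the maintenance titration rules in the IV iron algorithm described above can be summarized in a short sketch. This is a simplified, hypothetical rendering (Python, with invented function and variable names), assuming transferrin saturation 12-20% as the sufficiency range, ferritin &gt;500 ng/mL as the hold threshold, ±50% titration steps, and a home dose ceiling below 3 mg/kg; the actual program algorithm also weighs the full iron panel, laboratory trends, and clinical status.

```python
def adjust_home_iv_iron(weekly_dose_mg_per_kg: float,
                        ferritin_ng_ml: float,
                        tsat_percent: float) -> tuple[float, str]:
    """Return (new weekly dose in mg/kg, action) per the simplified rules above."""
    if ferritin_ng_ml > 500:
        # Possible iron overload (or inflammation) -> hold IV iron and recheck with CRP.
        return 0.0, "suspend IV iron; recheck ferritin with CRP"
    if tsat_percent < 12:
        # Below the desired range -> titrate up by 50% (dose or frequency).
        new_dose = weekly_dose_mg_per_kg * 1.5
        if new_dose >= 3.0:
            return weekly_dose_mg_per_kg, "refer to infusion center for higher dosing"
        return new_dose, "increase dose or frequency by 50%"
    if tsat_percent > 20:
        # Above the desired range -> titrate down by 50%.
        return weekly_dose_mg_per_kg * 0.5, "decrease dose or frequency by 50%"
    return weekly_dose_mg_per_kg, "continue current maintenance dose"


# Example: 1 mg/kg/week maintenance, ferritin 180 ng/mL, TSAT 9% -> 1.5 mg/kg/week
print(adjust_home_iv_iron(1.0, 180.0, 9.0))
```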
This case report details the substitution of an olive and soy oil-based intravenous lipid emulsion (OO, SO-ILE) for a soy, MCT, olive, fish oil-based intravenous lipid emulsion (SO, MCT, OO, FO-ILE) to treat high liver function tests (LFTs).</p><p><b>Methods:</b> A 28-year-old male with DSRT metastatic to peritoneum and large hepatic mass complicated by encapsulating peritonitis and enterocutaneous fistula (ECF), following CRS/HIPEC presented to the parenteral nutrition program at Amerita Specialty Infusion Services in 2022. TPN was initiated from January 2022 to March 2023, stopped, and restarted in December 2023 following a biliary obstruction. TPN was initiated and advanced using 1.3 g/kg/day SMOFlipid, (SO, MCT, OO, FO-ILE), known to help mitigate PNALD. The patient developed rising LFTs, with alanine transferase (ALT) peaking at 445 U/L, aspartate transferase (AST) at 606 U/L, and alkaline phosphatase (ALP) at 1265 U/L. Despite transitioning the patient to a cyclic regimen and maximizing calories in dextrose and amino acids, liver function continued to worsen. A switch to Clinolipid, (OO, SO-ILE), at 1.3 g/kg/day was tried.</p><p><b>Results:</b> Following the initiation of OO, SO-ILE, LFTs improved in 12 days with ALT resulting at 263 U/L, AST at 278 U/L, and ALP at 913 U/L. These values continued to improve until the end of therapy in June 2024 with a final ALT value of 224 U/L, AST at 138 U/L, and ALP at 220 U/L. See Figure 1. No significant improvements in total bilirubin were found. The patient was able to successfully tolerate this switch in lipid emulsions and was able to increase his weight from 50 kg to 53.6 kg.</p><p><b>Conclusion:</b> SO, MCT, OO, FO-ILE is well-supported to help prevent and alleviate adverse effects of PNALD, however lipid emulsion impacts on other forms of liver disease need further research. Our case suggests that elevated LFTs were likely cancer induced, rather than associated with prolonged use of parenteral nutrition. A higher olive oil lipid concentration may have beneficial impacts on LFTs that are not associated with PNALD. It is also worth noting that soybean oil has been demonstrated in previous research to have a negative impact on liver function, and the concentration of soy in SO, MCT, OO, FO-ILE is larger (30%) compared to OO, SO-ILE (20%). This may warrant further investigation into specific soy concentrations’ impact on liver function. LFTs should be assessed and treated on a case-by-case basis that evaluates disease mechanisms, medication-drug interactions, parenteral nutrition composition, and patient subjective information.</p><p></p><p><b>Figure 1.</b> OO, SO-ILE Impact on LFTs.</p><p>Shaurya Mehta, BS<sup>1</sup>; Ajay Jain, MD, DNB, MHA<sup>1</sup>; Kento Kurashima, MD, PhD<sup>1</sup>; Chandrashekhara Manithody, PhD<sup>1</sup>; Arun Verma, MD<sup>1</sup>; Marzena Swiderska-Syn<sup>1</sup>; Shin Miyata, MD<sup>1</sup>; Mustafa Nazzal, MD<sup>1</sup>; Miguel Guzman, MD<sup>1</sup>; Sherri Besmer, MD<sup>1</sup>; Matthew Mchale, MD<sup>1</sup>; Jordyn Wray<sup>1</sup>; Chelsea Hutchinson, MD<sup>1</sup>; John Long, DVM<sup>1</sup></p><p><sup>1</sup>Saint Louis University, St. Louis, MO</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> North American Society for Pediatric Gastroenterology, Hepatology and Nutrition (NASPGHAN) Florida.</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Short bowel syndrome (SBS) is a devastating condition. 
In the absence of enteral nutrition (EN), patients are dependent on Total Parenteral Nutrition (TPN) and suffer from intestinal failure associated liver disease and gut atrophy. Intestinal adaptation (IA) and enteral autonomy (EA) remain the clinical goal. We hypothesized that EA can be achieved using our DREAM system (US patent 63/413,988), which allows EN via the stomach and provides a mechanism for cyclical recirculation of nutrient-rich distal intestinal content into the proximal bowel, enabling full EN despite SBS.</p><p><b>Methods:</b> Twenty-four neonatal pigs were randomly allocated to enteral nutrition (EN; n = 8); TPN-SBS (on TPN only, n = 8); or DREAM (n = 8). Liver, gut, and serum were collected for histology and serum biochemistry. Statistical analysis was performed using GraphPad Prism 10.1.2 (324). All tests were 2-tailed using a significance level of 0.05.</p><p><b>Results:</b> TPN-SBS piglets had significant cholestasis vs DREAM (p = 0.001) with no statistical difference in DREAM vs EN (p = 0.14). DREAM transitioned to full EN by day 4. Mean serum conjugated bilirubin for EN was 0.037 mg/dL, TPN-SBS 1.2 mg/dL, and DREAM 0.05 mg/dL. Serum bile acids were significantly elevated in TPN-SBS vs EN (p = 0.007) and DREAM (p = 0.03). Mean GGT, a marker of cholangiocytic injury, was significantly higher in TPN-SBS vs EN (p &lt; 0.001) and DREAM (p &lt; 0.001) with values of EN 21.2 U/L, TPN-SBS 47.9 U/L, and DREAM 22.5 U/L (p = 0.89 DREAM vs EN). To evaluate gut growth, we measured lineal gut mass (LGM), calculated as the weight of the bowel per centimeter. There was significant IA and prevention of gut atrophy with DREAM. Mean proximal gut LGM was EN 0.21 g/cm, TPN-SBS 0.11 g/cm, and DREAM 0.31 g/cm (p = 0.004, TPN-SBS vs DREAM). Distal gut LGM was EN 0.34 g/cm, TPN-SBS 0.13 g/cm, and DREAM 0.43 g/cm (p = 0.006, TPN-SBS vs DREAM). IHC revealed DREAM had hepatic CK-7 (bile duct epithelium marker, p = 0.18) and hepatic Cyp7A1 (p = 0.3) similar to EN. No statistical differences were noted in LGR5 positive intestinal stem cells in EN vs DREAM, p = 0.18. DREAM prevented changes in hepatic CyP7A1, BSEP, FGFR4, SHP, and SREBP-1 and in gut FXR, TGR5, and EGF vs the TPN-SBS group.</p><p><b>Conclusion:</b> DREAM resulted in a significant reduction of hepatic cholestasis, prevented gut atrophy, and presents a novel method enabling early full EN despite SBS. This system, by driving IA and enteral autonomy, highlights a major advancement in SBS management, bringing a paradigm change to life-saving strategies for SBS patients.</p><p>Silvia Figueiroa, MS, RD, CNSC<sup>1</sup>; Paula Delmerico, MS, RD, CNSC<sup>2</sup></p><p><sup>1</sup>MedStar Washington Hospital Center, Bethesda, MD; <sup>2</sup>MedStar Washington Hospital Center, Arlington, VA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Parenteral nutrition (PN) therapy is a vital clinical intervention for patients of all ages and across care settings. The complexity of PN has the potential to cause significant patient harm, especially when errors occur. According to The Institute for Safe Medication Practices (ISMP), PN is classified as a high-alert medication and safety-focused strategies should be formulated to minimize errors and harm. Processing PN is multifactorial and includes prescribing, order review and verification, compounding, labeling, and administration. PN prescription should consider clinical appropriateness and formulation safety.
The PN formulation must be written to provide appropriate amounts of macronutrients and micronutrients based on the patient's clinical condition, laboratory parameters, and nutrition status. The PN admixture should not exceed the total amounts, recommended concentrations, or rate of infusion of these nutrients as it could result in toxicities and formulation incompatibility or instability. The ASPEN Parenteral Nutrition Safety Consensus Recommendations recommend PN be prescribed using standardized electronic orders via a computerized provider order entry (CPOE) system, as handwritten orders have potential for error. Data suggests up to 40% of PN-related errors occur during the prescription and transcription steps. Registered pharmacists (RPh) are tasked with reviewing PN orders for order accuracy and consistency with recommendations made by the nutrition support team. These RPh adjustments ensure formula stability and nutrient optimization. This quality improvement project compares the frequency of RPh PN adjustments following initial provider order after transition from a paper to CPOE ordering system. Our hypothesis is CPOE reduces the need for PN adjustments by pharmacists during processing which increases clinical effectiveness and maximizes resource efficiency.</p><p><b>Methods:</b> This was a retrospective evaluation of PN ordering practices at a large, academic medical center after shifting from paper to electronic orders. PN orders were collected during three-week periods in December 2022 (Paper) and December 2023 (CPOE) and analyzed for the frequency of order adjustments by the RPh. Adjustments were classified into intravascular access, infusion rate, macronutrients, electrolytes, multivitamin (MVI) and trace elements (TE), and medication categories. The total number of adjustments made by the RPh during final PN processing was collected. These adjustments were made per nutrition support team.</p><p><b>Results:</b> Daily PN orders for 106 patients – totaling 694 orders – were reviewed for provider order accuracy at the time of fax (paper) and electronic (CPOE) submission. Order corrections made by the RPh decreased by 96% for infusion rate, 91.5% for macronutrients, 79.6% for electrolytes, 81.4% for MVI and TE, and 50% for medication additives (Table 1).</p><p><b>Conclusion:</b> Transitioning to CPOE led to reduction in the need for PN order adjustments at the time of processing. One reason for this decline is improvement in physician understanding of PN recommendations. With CPOE, the registered dietitian's formula recommendation is viewable within the order template and can be referenced at the time of ordering. The components of the active PN infusion also automatically populate upon ordering a subsequent bag. This information can aid the provider when calculating insulin needs or repleting electrolytes outside the PN, increasing clinical effectiveness. 
A byproduct of this process change is improved efficiency, as CPOE requires less time for provider prescription and RPh processing and verification.</p><p><b>Table 1.</b> RPh Order Adjustments Required During Collection Period.</p><p></p><p>Elaina Szeszycki, BS, PharmD, CNSC<sup>1</sup>; Emily Gray, PharmD<sup>2</sup>; Kathleen Doan, PharmD, BCPPS<sup>3</sup>; Kanika Puri, MD<sup>1</sup></p><p><sup>1</sup>Riley Hospital for Children at Indiana University Health, Indianapolis, IN; <sup>2</sup>Lurie Children's Hospital, Chicago, IL; <sup>3</sup>Riley Hospital for Children at IU Health, Indianapolis, IN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Parenteral nutrition (PN) is ordered daily at Riley Hospital for Children at IU Health by different specialties. Our hospital is a stand-alone pediatric hospital, including labor, delivery, and high-risk maternal care. Historically, the PN orders were due by early afternoon with a hard cut-off by end of the day shift for timely central compounding at a nearby adult hospital. Due to the relocation of staff and equipment to a new sterile compounding facility, a hard deadline was created with an earlier cutoff time and contingency plans for orders received after the deadline. This updated process was created to allow for timely delivery to Riley and subsequently to the patients to meet the standard PN hang-time of 2100. The Nutrition Support Team (NST) and Pharmacy and Therapeutics (P&amp;T) Committee approved an updated PN order process as follows:</p><p>(1) Enforce a hard PN deadline of 1200 for new and current PN orders; (2) if a PN order is not received by 1200, renew the active PN order for the next 24 hours; (3) if the active PN order is not appropriate for the next 24 hours, the providers will need to order IVF in place of PN until the following day; (4) enter PN orders into the PN order software by 1500.</p><p><b>Methods:</b> A quality improvement (QI) process check was performed 3 months after initiation of the updated PN order process. Data collection was performed for 1 month with the following data points: Total PN orders, Missing PN orders at 1200, PN orders re-ordered per P&amp;T policy after 1200 deadline, Lab review, Input and output, Subsequent order changes for 24 hours after renewal of active PN order, Waste of PN, and Service responsible for late PN order.</p><p><b>Results:</b></p><p></p><p><b>Conclusion:</b> The number of late PN orders after the hard deadline was &lt; 5%, and there was a minimal number of renewed active PN orders due to the pharmacists' concern for ensuring the safety of our patients. No clinically significant changes resulted from renewal of active PN, so this was considered a safe process despite the small numbers. The changes made to late PN orders were minor or related to the planned discontinuation of PN.
After review of results by NST and pharmacy administration, it was decided to take the following actions: (1) review data and the process with pharmacy staff to assist with workload flow and education; (2) create a succinct Riley TPN process document for providers, specifically services with late orders, reviewing the PN order entry hard deadline and the need to discontinue the PN order by the deadline, to assist with pharmacy staff workflow and avoid potential PN waste; and (3) repeat the QI analysis in 6-12 months.</p><p><b>International Poster of Distinction</b></p><p>Muna Islami, PharmD, BCNSP<sup>1</sup>; Mohammed Almusawa, PharmD, BCIDP<sup>2</sup>; Nouf Alotaibi, PharmD, BCPS, BCNSP<sup>3</sup>; Jwael Alhamoud, PharmD<sup>1</sup>; Maha Islami, PharmD<sup>4</sup>; Khalid Eljaaly, PharmD, MS, BCIDP, FCCP, FIDSA<sup>4</sup>; Majda Alattas, PharmD, BCPS, BCIDP<sup>1</sup>; Lama Hefni, RN<sup>5</sup>; Basem Alraddadi, MD<sup>1</sup></p><p><sup>1</sup>King Faisal Specialist Hospital, Jeddah, Makkah; <sup>2</sup>Wayne State University, Jeddah, Makkah; <sup>3</sup>Umm al Qura University, Jeddah, Makkah; <sup>4</sup>King Abdulaziz University Hospital, Jeddah, Makkah; <sup>5</sup>King Faisal Specialist Hospital, Jeddah, Makkah</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Parenteral nutrition (PN) is a critical therapy for patients unable to meet their nutritional needs through the gastrointestinal tract. While it offers a life-saving solution, it also carries the risk of central line-associated bloodstream infections (CLABSIs). However, there is a lack of comprehensive studies examining the risk factors for CLABSIs in a more heterogeneous cohort of PN recipients. This study aims to identify the risk factors associated with CLABSIs in patients receiving PN therapy in Saudi Arabia.</p><p><b>Methods:</b> This retrospective cohort multicenter study was conducted in three large tertiary referral centers in Saudi Arabia. The study included all hospitalized patients who received PN therapy through central lines between 2018 and 2022. The purpose of the study was to investigate the association between parenteral nutrition (PN) and central line-associated bloodstream infections (CLABSIs), using both univariate and multivariate analysis.</p><p><b>Results:</b> Out of 662 hospitalized patients who received PN and had central lines, 123 patients (18.6%) developed CLABSI. Among our patients, the duration of parenteral nutrition was a dependent risk factor for CLABSI development (OR, 1.012; 95% CI, 0.9-1.02).
In patients who received PN, the incidence of CLABSI did not change significantly over the course of the study years.</p><p><b>Conclusion:</b> The length of PN therapy is still an important risk factor for CLABSIs; more research is required to determine the best ways to reduce the incidence of CLABSI in patients on PN.</p><p><b>Table 1.</b> Characteristics of Hospitalized Patients Who Received PN.</p><p></p><p>1 n (%); Median (IQR). BMI, body mass index.</p><p><b>Table 2.</b> The Characteristics of Individuals With and Without CLABSI Who Received PN.</p><p></p><p>1 n (%); Median (IQR). 2 Fisher's exact test; Pearson's Chi-squared test; Mann-Whitney U test. PN, parenteral nutrition.</p><p></p><p>CLABSI, central line-associated bloodstream infection; PN, parenteral nutrition.</p><p><b>Figure 1.</b> Percentage of Patients With a Central Line Receiving PN Who Experienced CLABSI.</p><p>Duy Luu, PharmD<sup>1</sup>; Rachel Leong, PharmD<sup>2</sup>; Nisha Dave, PharmD<sup>2</sup>; Thomas Ziegler, MD<sup>2</sup>; Vivian Zhao, PharmD<sup>2</sup></p><p><sup>1</sup>Emory University Hospital - Nutrition Support Team, Lawrenceville, GA; <sup>2</sup>Emory Healthcare, Atlanta, GA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Intravenous lipid emulsion (ILE) is an essential component in parenteral nutrition (PN)-dependent patients because it provides calories and essential fatty acids; however, the use of soybean oil-based ILE (SO-ILE) may contribute to the development of PN-associated liver disease (PNALD) in patients with intestinal failure who require chronic PN. To mitigate this risk, new formulations of ILE, such as a mixture of SO, medium chain triglycerides (MCT), olive oil (OO), and fish oil-based ILE (SO/MCT/OO/FO-ILE), or pure fish oil-based ILE (FO-ILE) are now available in the US. FO-ILE is only approved for pediatric use for PN-associated cholestasis. This patient case highlights the benefits of using a combination of FO-containing ILEs to improve PNALD.</p><p><b>Methods:</b> A 65-year-old female with symptomatic achalasia required robotic-assisted esophagomyotomy with fundoplasty in February 2020. Her postoperative course was complicated by bowel injury that required multiple small bowel resections and total colectomy with end jejunostomy, resulting in short bowel syndrome with 80 centimeters of residual small bowel. Postoperatively, daily PN containing SO-ILE was initiated along with tube feedings (TF) during hospitalization, and she was discharged home with PN and TF. Her surgeon referred her to the Emory University Hospital (EUH) Nutrition Support Team (NST) for management. She received daily cyclic-PN infused over 12 hours, providing 25.4 kcal/kg/day with SO-ILE 70 grams (1.2 g/kg) three times weekly and standard TF 1-2 containers/day. In September 2020, she complained of persistent jaundice and was admitted to EUH. She presented with scleral icterus, hyperbilirubinemia, and elevated liver function tests (LFTs). The EUH NST optimized the PN to provide SO/MCT/OO/FO-ILE (0.8 g/kg/day), which improved blood LFTs, and the dose was then increased to 1 g/kg/day. In the subsequent four months, her LFTs worsened despite optimizing pharmacotherapy, continuing cyclic-TF, and reducing and then discontinuing ILE. She required multiple readmissions at EUH and underwent two liver biopsies that confirmed a diagnosis of PN-induced hepatic fibrosis after 15 months of PN.
Her serum total bilirubin level peaked at 18.5 mg/dL, which led to an intensive care unit admission requiring molecular adsorbent recirculating system therapy. In March 2022, the NST exhausted all options and incorporated FO-ILE (0.84 g/kg/day) three times weekly (separate infusion) and SO/MCT/OO/FO-ILE (1 g/kg/day) weekly.</p><p><b>Results:</b> The patient's LFTs are shown in Figure 1. The blood level of aspartate aminotransferase improved from 123 to 60 units/L, and alanine aminotransferase decreased from 84 to 51 units/L after 2 months and returned to normal after 4 months of the two ILEs. Similarly, the total bilirubin decreased from 5.6 to 2.2 and 1.1 mg/dL by 2 and 6 months, respectively. Both total bilirubin and transaminase levels remained stable. Although her alkaline phosphatase continued to fluctuate and remained elevated over the last two years, this marker decreased from 268 to 104 units/L. All other PNALD-related symptoms resolved.</p><p><b>Conclusion:</b> This case demonstrates that the combination of FO-containing ILEs significantly improved and stabilized LFTs in an adult with PNALD. Additional research is needed to investigate the effect of FO-ILE in adult PN patients to mitigate PNALD.</p><p></p><p>SO: soybean oil; MCT: medium-chain triglyceride; OO: olive oil; FO: fish oil; AST: aspartate aminotransferase; ALT: alanine aminotransferase.</p><p><b>Figure 1.</b> Progression of Liver Enzyme Status in Relation to Lipid Injectable Emulsions.</p><p>Narisorn Lakananurak, MD<sup>1</sup>; Leah Gramlich, MD<sup>2</sup></p><p><sup>1</sup>Department of Medicine, Faculty of Medicine, Chulalongkorn University, Bangkok, Krung Thep; <sup>2</sup>Division of Gastroenterology, Faculty of Medicine and Dentistry, University of Alberta, Edmonton, AB</p><p><b>Financial Support:</b> This research study received a grant from Baxter, Canada.</p><p><b>Background:</b> Pre-operative parenteral nutrition (PN) has been shown to enhance outcomes in malnourished surgical patients. Traditionally, pre-operative PN necessitates hospital admission, which leads to increased length of stay (LOS) and higher hospital costs. Furthermore, inpatient pre-operative PN may not be feasible or prioritized when access to hospital beds is restricted. Outpatient PN presents a potential solution to this issue. To date, the feasibility and impact of outpatient PN for surgical patients have not been investigated. This study aims to assess the outcomes and feasibility of outpatient pre-operative PN in malnourished surgical patients.</p><p><b>Methods:</b> Patients scheduled for major surgery who were identified as at risk of malnutrition using the Canadian Nutrition Screening Tool and classified as malnourished by Subjective Global Assessment (SGA) B or C were enrolled. Exclusion criteria included severe systemic diseases as defined by the American Society of Anesthesiologists (ASA) classification III to V, insulin-dependent diabetes mellitus, and extreme underweight (less than 40 kg). Eligible patients received a peripherally inserted central catheter (PICC) line and outpatient PN (Olimel 7.6%E 1,000 ml) for 5-10 days using the maximum infusion days possible prior to surgery at an infusion clinic. PN was administered via an infusion pump over 4-5 hours by infusion clinic nurses. The outcomes and feasibility of outpatient PN were assessed.
Safety outcomes, including refeeding syndrome, dysglycemia, volume overload, and catheter-related complications, were monitored.</p><p><b>Results:</b> Outpatient PN was administered to eight patients (4 males, 4 females). Pancreatic cancer and Whipple's procedure were the most common diagnoses and operations, accounting for 37.5% of cases. (Table 1) The mean (SD) duration of PN was 5.5 (1.2) days (range 5-8 days). Outpatient PN was completed in 75% of patients, with an 88% completion rate of PN days (44/50 days). Post-PN infusion, mean body weight and body mass index increased by 4.6 kg and 2.1 kg/m², respectively. The mean PG-SGA score improved by 4.9 points, and mean handgrip strength increased from 20 kg to 25.2 kg. Quality of life, as measured by SF-12, improved in both physical and mental health domains (7.3 and 3.8 points, respectively). Patient-reported feasibility scores were high across all aspects (Acceptability, Appropriateness, and Feasibility), with a total score of 55.7/60 (92.8%). Infusion clinic nurses (n = 3) also reported high total feasibility scores (52.7/60, 87.8%). (Table 2) No complications were observed in any of the patients.</p><p><b>Conclusion:</b> Outpatient pre-operative PN was a feasible approach that was associated with improved outcomes in malnourished surgical patients. This novel approach has the potential to enhance outcomes and decrease the necessity for hospital admission in malnourished surgical patients. Future studies involving larger populations are needed to evaluate the efficacy of outpatient PN.</p><p><b>Table 1.</b> Baseline Characteristics of the Participants (n = 8).</p><p></p><p><b>Table 2.</b> Outcomes and Feasibility of Outpatient Preoperative Parenteral Nutrition (n = 8).</p><p></p><p>Adrianna Wierzbicka, MD<sup>1</sup>; Rosmary Carballo Araque, RD<sup>1</sup>; Andrew Ukleja, MD<sup>1</sup></p><p><sup>1</sup>Cleveland Clinic Florida, Weston, FL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Gastroparesis (GP) is a chronic motility disorder marked by delayed gastric emptying, associated with symptoms: nausea, vomiting and abdominal pain. Treatment consists of diet modifications and medications, with nutritional support tailored to disease severity. Severe refractory cases may require enteral or parenteral nutrition (PN). However, the role of home parenteral nutrition (HPN) in managing GP is underexplored. This study aims to enhance nutrition therapy practice by examining the utilization of HPN in GP population, addressing a significant gap in current nutrition support strategies.</p><p><b>Methods:</b> We conducted a retrospective, single-center analysis of patients receiving HPN from August 2022 to August 2024. Data were obtained through a review of electronic medical records as part of a quality improvement monitoring process. Patients' demographics, etiology of GP, indications for HPN, types of central access, duration of therapy, and PN-related complications were analyzed using descriptive statistics. Inclusion criteria were: adults (&gt;18 yrs.), GP diagnosis by gastric scintigraphy, and HPN for a minimum of 2 consecutive months. Among 141 identified HPN patients, 10 were diagnosed with GP as indication for PN.</p><p><b>Results:</b> GP patients constituted 7% (10/141) of our home PN population. In this cohort analysis of 10 patients with GP receiving HPN, the demographic profile was predominantly female (80%); a mean age of 42.6 yrs., all individuals identified as Caucasian. 
All patients had idiopathic GP, severe gastric emptying delay was found in 80% of cases, with all experiencing predominant symptoms of nausea/vomiting. Type of central access: 50% PICC lines, 30% Hickman catheters, 10% Powerlines, and 10% mediport. The mean weight change with PN therapy was an increase of 21.9 lbs. 80% of patients experienced infection-related complications, including bacteremia (Methicillin-Sensitive Staphylococcus Aureus (MSSA), Methicillin-Resistant Staphylococcus Aureus (MRSA)), Pseudomonas, and fungemia. Deep vein thrombosis (DVT) was identified in 20% of patients, alongside one case of a cardiac thrombus. Tube feeding trials were attempted in 70% of cases, but 50% ultimately discontinued due to intolerance, such as abdominal pain or complications like buried bumper syndrome. Chronic pain management was used in 60% of patients, with 40% on opioid therapy (morphine, fentanyl). PN was discontinued in 50% of patients due to recurrent infections (20%), advancement to tube feeding (20%), questionable compliance (20%), or improvement in oral intake (40%).</p><p><b>Conclusion:</b> This retrospective analysis underscores the potential of HPN as a nutritional strategy for GP, particularly in patients with refractory symptoms and severe delay in gastric emptying who previously failed EN or experienced complications related to the enteral access. In addition to the observed mean weight gain, HPN seems to play a crucial role in alleviating debilitating symptoms such as nausea, vomiting, and abdominal pain, thereby improving patients' overall quality of life. Nonetheless, the prevalence of infection-related complications and the requirement for chronic pain management underscore the challenges associated with GP treatment. The variability in patient responses to different nutritional strategies emphasizes the importance of individualized care plans. These findings advocate for further research to optimize HPN protocols and improve comprehensive management strategies in GP.</p><p></p><p><b>Figure 1.</b> Reasons for PN Discontinuation.</p><p></p><p><b>Figure 2.</b> Complication Associated with PN.</p><p>Longchang Huang, MD<sup>1</sup>; Peng Wang<sup>2</sup>; Shuai Liu<sup>3</sup>; Xin Qi<sup>1</sup>; Li Zhang<sup>1</sup>; Xinying Wang<sup>4</sup></p><p><sup>1</sup>Department of General Surgery, Jinling Hospital, Medical School of Nanjing University, Nanjing, Jiangsu; <sup>2</sup>Department of Digestive Disease Research Center, Gastrointestinal Surgery, The First People's Hospital of Foshan, Guangdong, Foshan; <sup>3</sup>Department of General Surgery, Jinling Hospital, Medical School of Nanjing University, Nanjing, Jiangsu; <sup>4</sup>Wang, Department of General Surgery, Jinling Hospital, Medical School of Nanjing University, Nanjing, Jiangsu</p><p><b>Financial Support:</b> National Natural Science Foundation of China, 82170575 and 82370900.</p><p><b>Background:</b> Total parenteral nutrition (TPN) induced gut microbiota dysbiosis is closely linked to intestinal barrier damage, but the mechanism remains unclear.</p><p><b>Methods:</b> Through the application of 16S rRNA gene sequencing and metagenomic analysis, we examined alterations in the gut microbiota of patients with chronic intestinal failure (CIF) and TPN mouse models subjected to parenteral nutrition, subsequently validating these observations in an independent verification cohort. Additionally, we conducted a comprehensive analysis of key metabolites utilizing liquid chromatography-mass spectrometry (LC-MS). 
Moreover, we explored modifications in essential innate-like lymphoid cell populations through RNA sequencing (RNA-seq), flow cytometry, and single-cell RNA sequencing (scRNA-seq).</p><p><b>Results:</b> The gut barrier damage associated with TPN is due to decreased Lactobacillus murinus. L.murinus mitigates TPN-induced intestinal barrier damage through the metabolism of tryptophan into indole-3-carboxylic acid (ICA). Furthermore, ICA stimulates innate lymphoid cells 3 (ILC3) to secrete interleukin-22 by targeting the nuclear receptor Rorc to enhance intestinal barrier protection.</p><p><b>Conclusion:</b> We elucidate the mechanisms driving TPN-associated intestinal barrier damage and indicate that interventions with L. murinus or ICA could effectively ameliorate TPN-induced gut barrier injury.</p><p></p><p><b>Figure 1.</b> TPN Induces Intestinal Barrier Damage in Humans and Mice. (a) the rate of febrile and admission of ICU in the Cohort 1. (b-d) Serum levels of IFABP, CRP, and LPS in patients with CIF. (e) Representative intestinal H&amp;E staining and injury scores (f) (n = 10 mice per group). (g) Results of the electrical resistance of the intestine in mice by Ussing Chamber (n = 5 mice per group). (h) Immunofluorescence experiments in the intestines and livers of mice. (i) The results of Western blot in the Chow and TPN groups.</p><p></p><p><b>Figure 2.</b> TPN Induces Gut Dysbiosis in Humans and Mice. (a) PCoA for 16S rRNA of fecal content from Cohort 1 (n = 16 individuals/group). (b) Significant abundance are identified using linear discriminant analysis (LDA). (c) Top 10 abundant genus. (d) PCoA of the relative genus or species abundances (n = 5 mice per group). (e) LDA for mice. (f) Sankey diagram showing the top 10 abundant genus of humans and mice. (g) The following heatmap illustrates the correlation between the abundance of species in intestinal microbiota and clinical characteristics of patients with CIF.</p><p></p><p><b>Figure 3.</b> Metabolically active L.murinus ameliorate intestinal barrier damage. (a) RT-PCR was conducted to quantify the abundance of L. murinus in feces from L-PN and H-PN patients (Cohorts 1 and 2). (b) Representative intestinal H&amp;E staining and injury scores (c) (n = 10 mice per group). (d) The results of the electrical resistance of the intestine in mice by Ussing Chamber (n = 5 mice per group). (e) The results of Western Blot. (f) 3D-PCA and volcano plot (g) analyses between the Chow and TPN group mice. (h) The metabolome-wide pathways were enriched based on the metabolomics data obtained from fecal content from Chow and TPN group mice (n = 5 mice per group). (i) The heatmap depicts the correlation between the abundance of intestinal microbiota in species level and tryptophan metabolites of the Chow and TPN group mice (n = 5 mice per group). (j) VIP scores of 3D-PCA. A taxon with a variable importance in projection (VIP) score of &gt;1.5 was deemed to be of significant importance in the discrimination process.</p><p></p><p><b>Figure 4.</b> ICA is critical for the effects of L.murinus. (a) The fecal level of ICA from TPN mice treated PBS control or ICA(n = 10 mice per group). (b) Representative intestinal H&amp;E staining and injury scores (c) (n = 10 mice per group). (d) The results of the electrical resistance of the intestine in mice by Ussing Chamber (n = 5 mice per group). (e) The results of Western blot. (f) This metabolic pathway illustrates the production of ICA by the bacterium L. murinus from the tryptophan. 
(g) PLS-DA for the profiles of metabolite in feces from TPN mice receiving ΔArAT or live L. murinus (n = 5 mice per group). (h) The heat map of tryptophan-targeted metabolomics in fecal samples from TPN mice that received either ΔArAT (n = 5) or live L. murinus (n = 5). (i) Representative intestinal H&amp;E staining and injury scores (j)(n = 10 mice per group). (k) The results of Western Blot.</p><p>Callie Rancourt, RDN<sup>1</sup>; Osman Mohamed Elfadil, MBBS<sup>1</sup>; Yash Patel, MBBS<sup>1</sup>; Taylor Dale, MS, RDN<sup>1</sup>; Allison Keller, MS, RDN<sup>1</sup>; Alania Bodi, MS, RDN<sup>1</sup>; Suhena Patel, MBBS<sup>1</sup>; Andrea Morand, MS, RDN, LD<sup>1</sup>; Amanda Engle, PharmD, RPh<sup>1</sup>; Manpreet Mundi, MD<sup>1</sup></p><p><sup>1</sup>Mayo Clinic, Rochester, MN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Although patent foramen ovale (PFO) are generally asymptomatic and cause no health concerns, they can be a risk factor for embolism and stroke. Due to this theoretical risk, some institutions have established protocols requiring most IV solutions to be administered through a small micron filter in patients with PFO. While 2-in-1 dextrose and amino acid solutions can be filtered through a 0.22-micron filter with relative ease, injectable lipid emulsions (ILEs), whether on their own or as part of a total admixture, consist of larger particles, requiring a 1.2-micron or bigger filter size. The use of the larger filter precludes the administration of ILE, an essential source of calories, in patients with PFO. It is unknown if patients who do receive ILE have an increased incidence of lipid embolism and stroke.</p><p><b>Methods:</b> A single-center retrospective review of patients on central parenteral nutrition (CPN) was completed. Demographics, and baseline clinical characteristics including co-morbidities and history of CVA were collected. The outcome of interest is defined as an ischemic cerebrovascular accident (CVA) within 30 days of CPN and, therefore, potentially attributable to it. Other cardiovascular and thromboembolic events were captured. All Patients with a PFO diagnosis and inpatient CPN administration between January 1, 2018, and December 18, 2023, at our quaternary care referral center were included as the case cohort. A 3:1 control group matched to age, gender, duration of inpatient CPN, and clinical co-morbidities was identified and utilized to examine the difference in the outcome of interest.</p><p><b>Results:</b> Patients with PFO who received CPN (n = 38, 53.8% female) had a mean age of 63.5 ± 13.1 years and a mean BMI of 31.1 ± 12.1 at CPN initiation (Table 1). The PFO varied in size, with the majority (38.5%) having a very small/trivial one (Table 2). All patients in this cohort had appropriate size filters placed for CPN and ILE administration. CPN prescription and duration were comparable between both groups. The majority of patients with PFO (53.8%) received mixed oil ILE, followed by soy-olive oil ILE (23.1%), whereas the majority of patients without PFO (51.8%) received soy-olive oil ILE and (42.9%) received mixed oil ILE (Table 3). Case and control groups had cardiovascular risks at comparable prevalence, including obesity, hypertension, diabetes, and dyslipidemia. However, more patients had a history of vascular cardiac events and atrial fibrillation in the PFO group, and more patients were smokers in the non-PFO group (Table 4). 
Patients with PFO received PN for a median of 7 days (IQR: 5,13), and 32 (84.2%) received ILE. Patients without PFO who received CPN (n = 114, 52.6%) had a mean age of 64.9 ± 10 years and a mean BMI of 29.6 ± 8.5 at CPN initiation. Patients in this cohort received PN for a median of 7 days (IQR: 5,13.5), and 113 (99.1%) received ILE. There was no difference in the incidence of ischemic CVA within 30 days of receiving CPN between both groups (2 (5.3%) in the PFO group vs. 1 (0.8%) in the non-PFO group; p = 0.092) (Table 4).</p><p><b>Conclusion:</b> The question of the risk of CVA/stroke with CPN in patients with PFO is clinically relevant and often lacks a definitive answer. Our study revealed no difference in ischemic CVA potentially attributable to CPN between patients with PFO and patients without PFO in a matched control cohort in the first 30 days after administration of PN. This finding demonstrates that CPN with ILE is likely safe for patients with PFO in an inpatient setting.</p><p><b>Table 1.</b> Baseline Demographics and Clinical Characteristics.</p><p></p><p><b>Table 2.</b> PFO Diagnosis.</p><p></p><p>*All received propofol concomitantly.</p><p><b>Table 3.</b> PN Prescription.</p><p></p><p><b>Table 4.</b> Outcomes and Complications.</p><p></p><p><b>Enteral Nutrition Therapy</b></p><p>Osman Mohamed Elfadil, MBBS<sup>1</sup>; Edel Keaveney, PhD<sup>2</sup>; Adele Pattinson, RDN<sup>1</sup>; Danelle Johnson, MS, RDN<sup>1</sup>; Rachael Connolly, BSc.<sup>2</sup>; Suhena Patel, MBBS<sup>1</sup>; Yash Patel, MBBS<sup>1</sup>; Ryan Hurt, MD, PhD<sup>1</sup>; Manpreet Mundi, MD<sup>1</sup></p><p><sup>1</sup>Mayo Clinic, Rochester, MN; <sup>2</sup>Rockfield MD, Galway</p><p><b>Financial Support:</b> Rockfield Medical Devices.</p><p><b>Background:</b> Patients on home enteral nutrition (HEN), many of whom are mobile, can experience significant hardships and reduced quality of life (QoL) due to limitations on mobility, on top of burdens due to underlying disease processes. Improving mobility while feeding could reduce burdens associated with HEN and potentially improve QoL. This prospective cohort study aims to evaluate participants’ perspectives on their mobility, ease of performing physical activities while feeding, and QoL following the use of a novel enteral feeding system (EFS).</p><p><b>Methods:</b> A prospective single-center study was conducted to evaluate a novel EFS, which is an FDA-cleared elastomeric system (Mobility + ®) that consists of a lightweight feeding pouch (reservoir for 500 mL feed), a filling set (used in conjunction with a syringe to fill EFS) and a feeding set to deliver EN formula to an extension set/feeding tube with an ISO 80369-3 compatible connector. Adult HEN-dependent patients were recruited by invitation to use the study EFS for a minimum of 2 feeds a day for 14 days, preceded by a familiarization period of 5-7 days. Participant perspectives on how they rated performing typical daily activities while feeding (e.g., moving, traveling, socializing) and feeding system parameters (ease of use, portability, noise, discretion, performance) were evaluated using HEN-expert validated questionnaires. A score was given for each rating from 1 to 5, with 5 being the most positive response. An overall score was calculated and averaged for the cohort. Participants were followed up during the familiarization period. On days 7 and 14, additional telephone interviews were conducted regarding compliance, enteral feed intake, participant perspectives on study EFS vs. 
current system, and other measures. We excluded those with reduced functional capacity due to their underlying disease(s).</p><p><b>Results:</b> Seventeen participants completed the study (mean age 63.8 ± 12 years; 70.6% male). Participants used various feeding systems, including gravity, bolus method, and pump, with the majority (82.4%) having a G-tube placed (Table 1). Sixteen (94.1%) patients achieved use of study EFS for at least two feeds a day (and majority of daily EN calories) for all study days (Table 2). The ratings for the ability to perform various activities using study EFS were significantly different compared to those of the systems used before the study. An improvement in ratings was noted for the ease of performing common daily activities, including moving between rooms or on stairs, taking short and long walks, traveling by car or public transport, engaging in moderate- to high-intensity activities, sleeping, and socializing with family and friends, between the time point before enrolment and end of study (day 14) (p-value &lt; 0.0001) (Table 3). Ratings of feeding system parameters were significantly different between systems used before the study and the study EFS (p &lt; 0.0001) (Table 3), with the largest increases in positive ratings noted in relation to easiness to carry, noise level, and ability to feed discreetly. Ratings for overall satisfaction with the performance of study EFS did not differ from the ratings for the systems used before the study, with participants reporting that the main influencing factors were the length of time and the effort needed to fill study EFS. No difference was noted in the QoL rating.</p><p><b>Conclusion:</b> The studied EFS is safe and effective as an enteral feeding modality that provides an alternative option for HEN recipients. Participants reported a significant positive impact of study EFS on their activities of daily living. Although the overall QoL rating remained the same, improvements in mobility, discretion, and ease of carrying— aspects of QoL—were associated with the use of study EFS.</p><p><b>Table 1.</b> Baseline Demographics and Clinical Characteristics.</p><p></p><p><b>Table 2.</b> Safety and Effectiveness.</p><p></p><p><b>Table 3.</b> Usability and Impact of the Study EFS.</p><p></p><p>Talal Sharaiha, MD<sup>1</sup>; Martin Croce, MD, FACS<sup>2</sup>; Lisa McKnight, RN, BSN MS<sup>2</sup>; Alejandra Alvarez, ACP, PMP, CPXP<sup>2</sup></p><p><sup>1</sup>Aspisafe Solutions Inc., Brooklyn, NY; <sup>2</sup>Regional One Health, Memphis, TN</p><p><b>Financial Support:</b> Talal Sharaiha is an executive of Aspisafe Solutions Inc. Martin Croce, Lisa McKnight and Alejandra Alvarez are employees of Regional One Health institutions. Regional One Health has a financial interest in Aspisafe Solutions Inc. through services, not cash. Aspisafe provided the products at no charge.</p><p><b>Background:</b> Feeding tube securement has seen minimal innovation over the past decades, leaving medical adhesives to remain as the standard method. However, adhesives frequently fail to maintain secure positioning, with dislodgement rates reported between 36% and 62%, averaging approximately 40%. Dislodgement can lead to adverse outcomes, including aspiration, increased risk of malnutrition, higher healthcare costs, and extended nursing care. We aimed to evaluate the safety and efficacy of a novel feeding tube securement device in preventing NG tube dislodgement compared to standard adhesive tape. 
The device consists of a bracket that sits on the patient's upper lip. The bracket has a non-adhesive mechanism for securing feeding tubes ranging from sizes 10 to 18 French. It extends to two cheek pads that sit on either side of the patient's cheeks and is further supported by a head strap that wraps around the patient's head (Fig. 1 + Fig. 2).</p><p><b>Methods:</b> We conducted a prospective, case-control trial at Regional One Health Center, Memphis, TN, comparing 50 patients using the novel securement device, the NG Guard, against 50 patients receiving standard care with adhesive tape. The primary outcome was the rate of accidental or intentional NG tube dislodgement. Secondary outcomes included the number of new NG tubes required as a result of dislodgement, and device-related complications or adhesive-related skin injuries in the control group. Statistical analyses employed Student's t-test for continuous variables and Fisher's exact test for categorical variables. Significance was set at an alpha level of 0.05. We adjusted for confounding variables, including age, sex, race, and diagnosis codes related to delirium, dementia, and confusion (Table 1).</p><p><b>Results:</b> There were no significant differences between the groups in baseline characteristics, including age, sex, race, or confusion-related diagnoses (Table 2) (p ≥ 0.09). Nasogastric tube dislodgement occurred significantly more often in the adhesive tape group (31%) compared to the intervention group (11%) (p &lt; 0.05). The novel device reduced the risk of tube dislodgement by 65%. Additionally, 12 new tubes were required in the control group compared to 3 in the intervention group (p &lt; 0.05), translating to 18 fewer reinsertion events per 100 tubes inserted in patients secured by the novel device. No device-related complications or adhesive-related injuries were reported in either group.</p><p><b>Conclusion:</b> The novel securement device significantly reduced the incidence of nasogastric tube dislodgement compared to traditional adhesive tape. It is a safe and effective securement method and should be considered for use in patients with nasogastric tubes to reduce the likelihood of dislodgement and the need for reinsertion of new NG tubes.</p><p><b>Table 1.</b> Diagnosis Codes Related to Dementia and Delirium.</p><p></p><p><b>Table 2.</b> Baseline Demographics.</p><p></p><p></p><p><b>Figure 1.</b> Novel Securement Device - Front View.</p><p></p><p><b>Figure 2.</b> Novel Securement Device - Side Profile.</p><p><b>Best of ASPEN-Enteral Nutrition Therapy</b></p><p><b>Poster of Distinction</b></p><p>Alexandra Kimchy, DO<sup>1</sup>; Sophia Dahmani, BS<sup>2</sup>; Sejal Dave, RDN<sup>1</sup>; Molly Good, RDN<sup>1</sup>; Salam Sunna, RDN<sup>1</sup>; Karen Strenger, PA-C<sup>1</sup>; Eshetu Tefera, MS<sup>3</sup>; Alex Montero, MD<sup>1</sup>; Rohit Satoskar, MD<sup>1</sup></p><p><sup>1</sup>MedStar Georgetown University Hospital, Washington, DC; <sup>2</sup>Georgetown University Hospital, Washington, DC; <sup>3</sup>MedStar Health Research Institute, Columbia, MD</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Early nutrition intervention is of high importance in patients with cirrhosis given the faster onset of protein catabolism for gluconeogenesis compared to those without liver disease. Severe malnutrition is associated with frequent complications of cirrhosis such as infection, hepatic encephalopathy, and ascites. 
Furthermore, studies have demonstrated higher mortality rates in cirrhotic patients who are severely malnourished in both the pre- and post-transplant setting. The current practice guidelines encourage the use of enteral nutrition in cirrhotic patients who are unable to meet their intake requirements with an oral diet. The aim of this study was to evaluate the utilization and implications of enteral feeding in hospitalized patients with cirrhosis diagnosed with severe protein calorie malnutrition at our institution.</p><p><b>Methods:</b> This was a retrospective study of patients admitted to the transplant hepatology inpatient service at MedStar Georgetown University Hospital from 2019 to 2023. ICD-10-CM code E43 was then used to identify patients with a diagnosis of severe protein calorie malnutrition. The diagnosis of cirrhosis and pre-transplant status were confirmed by review of the electronic medical record. Patients with the following characteristics were excluded: absence of cirrhosis, history of liver transplant, admission diagnosis of upper gastrointestinal bleed, and/or receipt of total parenteral nutrition. Wilcoxon rank sum and two-sample t-tests were used to examine differences in the averages of continuous variables between the two groups. Chi-square and Fisher exact tests were used to investigate differences for categorical variables. Statistical significance was defined as p-values ≤ 0.05.</p><p><b>Results:</b> Of the 96 patients with cirrhosis and severe protein calorie malnutrition, 31 patients (32%) received enteral nutrition. Time from admission to initiation of enteral feeding was on average 7 days, with an average total duration of enteral nutrition of 10 days. In the group that received enteral nutrition, there was no significant change in weight, BMI, creatinine, total bilirubin or MELD 3.0 score from admission to discharge; however, albumin, sodium and INR levels had significantly increased (Table 1). A comparative analysis between patients with and without enteral nutrition showed a significant increase in length of stay, intensive care requirement, bacteremia, gastrointestinal bleeding, discharge MELD 3.0 score and in-hospital mortality rates among patients with enteral nutrition. There was no significant difference in rates of spontaneous bacterial peritonitis, pneumonia, admission MELD 3.0 score or post-transplant survival duration in patients with enteral nutrition compared to those without enteral nutrition (Table 2).</p><p><b>Conclusion:</b> In this study, only 32% of patients hospitalized with cirrhosis received enteral nutrition despite having a diagnosis of severe protein calorie malnutrition. Initiation of enteral nutrition was found to be delayed by a week, on average, after hospital admission. Prolonged length of stay and higher in-hospital mortality rates suggest a lack of benefit of enteral nutrition when started late in the hospital course. Based on these findings, our institution has implemented a quality improvement initiative to establish earlier enteral feeding in hospitalized patients with cirrhosis and severe protein calorie malnutrition.
Future studies will evaluate the efficacy of this initiative and implications for clinical outcomes.</p><p><b>Table 1.</b> The Change in Clinical End Points from Admission to Discharge Among Patients Who Received Enteral Nutrition.</p><p></p><p>Abbreviations: kg, kilograms; BMI, body mass index; INR, international normalized ratio; Na, sodium; Cr, creatinine; TB, total bilirubin; MELD, model for end stage liver disease; EN, enteral nutrition; Std, standard deviation</p><p><b>Table 2.</b> Comparative Analysis of Clinical Characteristics and Outcomes Between Patients With And Without Enteral Nutrition.</p><p></p><p>Abbreviations: MASLD, metabolic dysfunction-associated steatotic liver disease; HCV, hepatitis C virus; HBV, hepatitis B virus; AIH, autoimmune hepatitis; PSC, primary sclerosing cholangitis; PBC, primary biliary cholangitis; EN, enteral nutrition; RD, registered dietitian; MICU, medical intensive care unit; PNA, pneumonia; SBP, spontaneous bacterial peritonitis; GIB, gastrointestinal bleed; MELD, model for end stage liver disease; d, days; N, number; Std, standard deviation.</p><p>Jesse James, MS, RDN, CNSC<sup>1</sup></p><p><sup>1</sup>Williamson Medical Center, Franklin, TN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Feeding tubes (Tubes) are used to deliver enteral nutrition to patients who are unable to safely ingest nutrients and medications orally, a population at elevated risk of malnutrition and dehydration. Unfortunately, these Tubes have a propensity for becoming clogged. Staff will attempt to unclog Tubes using standard bedside techniques including warm water flushes or chemical enzymes. However, these practices are not only time-consuming but often unsuccessful, requiring tube replacement. An actuated mechanical device for restoring patency in clogged small bore Tubes was evaluated at a level 2 medical center as an alternative declogging method from September 2021 to July 2023. Study objectives were to explore the actuated mechanical device's ability to unclog indwelling Tubes and monitor any potential safety issues.</p><p><b>Methods:</b> The TubeClear® System (actuated mechanical device, Actuated Medical, Inc., Bellefonte, PA, Figure 1) was developed to resolve clogs from various indwelling Tubes. N = 20 patients (Table 1), with n = 16 10Fr, 109 cm long nasogastric (NG) tubes and n = 4 10Fr, 140 cm long nasojejunal (NJ) tubes, underwent clearing attempts with the actuated mechanical device. Initially, patients underwent standard declogging strategies for a minimum of 30 minutes, including warm water flushes and Creon/NaHCO3 slurry. Following unsuccessful patency restoration (n = 17) or patency restoration followed by reclogging (n = 3), the actuated mechanical device was attempted. Procedure time was estimated from the electronic charting system and included setup, use, and cleaning time for the actuated mechanical device, to the closest five minutes. All clearing procedures were completed by three trained registered dietitians.</p><p><b>Results:</b> The average time to restore Tube patency (n = 20) was 26.5 min (25 minutes for NG, 32.5 min for NJ) with 90% success (Table 2) and no significant safety issues reported by the operator or patient. User satisfaction was 100% (20/20), and patient discomfort was reported in 10% (2/20).</p><p><b>Conclusion:</b> Based on the presented results, the actuated mechanical device was significantly more successful at resolving clogs compared to alternative bedside practices.
Operators noted that the "Actuated mechanical device was able to work with clogs when slurries/water can't be flushed." It was noted that actuated mechanical device use prior to formation of a full clog, as a prophylactic approach, "was substantially easier than waiting until the Tube fully clogged." For a partly clogged Tube, "despite it being somewhat patent and useable, a quick pass of the actuated mechanical device essentially restored full patency and likely prevented a full clog." For an NG patient, "no amount of flushing or medication slurry was effective, but the actuated mechanical device worked in just minutes without issue." "Following standard interventions failure after multiple attempts, only the actuated mechanical device was able to restore Tube patency, saving money on not having to replace Tube." For a failed clearance, the operator noted "that despite failure to restore patency, there was clearly no opportunity for flushes to achieve a better result and having this option [actuated mechanical device] was helpful to attempt to avoid tube replacement." For an NJ patient, "there would have been no other conventional method to restore patency of a NJ small bore feeding tube without extensive x-ray exposure and 'guess work,' which would have been impossible for this patient who was critically ill and ventilator dependent." Having an alternative to standard bedside unclogging techniques proved beneficial to this facility, with 90% effectiveness, sparing those patients a Tube replacement, and saving the facility money by avoiding Tube replacement costs.</p><p><b>Table 1.</b> Patient and Feeding Tube Demographics.</p><p></p><p><b>Table 2.</b> Actuated Mechanical Device Uses.</p><p></p><p></p><p><b>Figure 1.</b> Actuated Mechanical Device for Clearing Partial and Fully Clogged Indwelling Feeding Tubes.</p><p>Vicki Emch, MS, RD<sup>1</sup>; Dani Foster<sup>2</sup>; Holly Walsworth, RD<sup>3</sup></p><p><sup>1</sup>Aveanna Medical Solutions, Lakewood, CO; <sup>2</sup>Aveanna Medical Solutions, Chandler, AZ; <sup>3</sup>Aveanna Medical Solutions, Erie, CO</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Homecare providers have managed through multiple formula backorders since the pandemic. Due to creative problem-solving, clinicians have successfully been able to offer substitutions. However, when pump feeding sets are on backorder, the options are limited; feeding sets are specific to the brand of pump. Recently, a backorder of feeding sets used for pumps common in acute and home care resulted in a severe shortage in the home care supply chain. This required fast action to ensure patients were able to continue administering their tube feeding and to prevent readmissions. A solution is to change the patient to a pump brand that is not on backorder. Normally, transitioning a patient to a different brand of pump would require in-person teaching. Due to the urgency of the situation, a more efficient method needed to be established. A regional home care provider determined that 20% of patients using enteral feeding pumps were using backordered sets, and 50% were pediatric patients who tend not to tolerate other methods of feeding. In response, this home care provider assembled a team to create a new educational model for pump training. The team was composed of Registered Nurses, Registered Dietitians, and patient care and distribution representatives.
The hypothesis was that providing high-quality educational material with instructional videos, detailed communication on the issue, and telephonic clinical support would allow for a successful transition.</p><p><b>Methods:</b> To determine urgency of transition, we prioritized patients with a diagnosis of short bowel syndrome, gastroparesis, glycogen storage disease, or vent dependency; those &lt; 2 years of age; and those living in a rural area with a 2-day shipping zip code, and we conducted a clinical review to identify patients with a jejunal feeding tube (see Table 1). A pump conversion team contacted patients/caregivers to review the situation, discuss options for nutrition delivery, determine current inventory of sets, assess urgency for transition, and coordinate delivery of the pump, sets, and educational material. Weekly reporting tracked the number of patients using the impacted pump, transitioned patients, and those requesting to transition back to their original pump.</p><p><b>Results:</b> A total of 2111 patients were using the feeding pump with backordered sets, and 50% of these patients were under 12 years of age. Over a period of 3 months, 1435 patients (68% of this patient population) were successfully transitioned to a different brand of pump, and of those, only 7 patients (0.5%) requested to return to their original pump even though they understood the risk of potentially running short on feeding sets (see Figure 1).</p><p><b>Conclusion:</b> A team approach that included proactive communication with patients/caregivers, prioritization of patient risk level, high-quality educational material with video links, and outbound calls from a clinician resulted in a successful transition to a new brand of feeding pump.</p><p><b>Table 1.</b> Patient Priority Levels for Pump with Backordered Sets.</p><p></p><p></p><p><b>Figure 1.</b> Number of Pump Conversions.</p><p>Desiree Barrientos, DNP, MSN, RN, LEC<sup>1</sup></p><p><sup>1</sup>Coram CVS, Chino, CA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Follow-up care for the enteral nutrition therapy community is essential for good outcomes. No data had been collected regarding home enteral nutrition (HEN) outcomes at a major university medical center. There was no robust program in place to follow up with patients who were discharged on tube feedings. Consequently, there was little information regarding the follow-up care or patient outcomes related to current practice, complications, re-hospitalizations, and equipment issues for this population.</p><p><b>Methods:</b> The tools utilized were the questionnaires for the 48-hour and 30-day post-discharge outreach calls, pre-discharge handouts, and feeding pump handouts.</p><p><b>Results:</b> Education was compared at 48 hours and 30 days. Q1: Can you tell me why you were hospitalized? Q2: Did a provider contact you for education prior to discharging home? Q3: Do you understand your nutrition orders from your doctor? Q4: Can you tell me the steps of how to keep your PEG/TF site clean? Q5: Can you tell me how much water to flush your tube? There was an improvement in patient understanding, self-monitoring, and navigation between the two survey timepoints.
Regarding patient education, understanding of nutrition orders (Q3) improved from 91% to 100%, knowledge of the steps to keep the tube feeding site clean (Q4) improved from 78% to 96%, and knowledge of how much water to flush before and after each feeding (Q5) improved from 81% to 100% between the 48-hour and 30-day timepoints.</p><p><b>Conclusion:</b> There was an improvement in patient understanding, self-monitoring, and navigation between the two survey timepoints. Verbal responses to open-ended and informational questions were aggregated to analyze complications, care gaps, and service failures.</p><p><b>Table 1.</b> Questionnaire Responses At 48 Hours and 30 Days.</p><p></p><p><b>Table 2.</b> Questionnaire Responses At 48 Hours and 30 Days.</p><p></p><p></p><p><b>Figure 1.</b> Education: Comparison at 48-hours and 30-days.</p><p></p><p><b>Figure 2.</b> Self-monitoring and Navigation: Comparison at 48-hours and 30-days.</p><p>Rachel Ludke, MS, RD, CD, CNSC, CCTD<sup>1</sup>; Cayla Marshall, RD, CD<sup>2</sup></p><p><sup>1</sup>Froedtert Memorial Lutheran Hospital, Waukesha, WI; <sup>2</sup>Froedtert Memorial Lutheran Hospital, Big Bend, WI</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Initiation of early enteral nutrition plays an essential role in improving patient outcomes.<sup>1</sup> Historically, feeding tubes have been placed by nurses, doctors and advanced practice providers. Over the past two decades, the prevalence of dietitians (RDNs) placing feeding tubes at the bedside has grown. This practice has been endorsed by the Academy of Nutrition and Dietetics and the American Society for Parenteral and Enteral Nutrition through modifications to the Scope of Practice for the Registered Dietitian Nutritionist and Standards of Professional Performance for Registered Dietitian Nutritionists.<sup>2,3</sup> Feeding tube placement at the bedside by RDNs has the potential to decrease nursing, fluoroscopy and internal transport time, which is of interest to our hospital. In the fall of 2023, we launched a pilot to evaluate the feasibility of an RDN-led bedside tube placement team at our 800-bed level 1 trauma center.</p><p><b>Methods:</b> RDNs first worked closely with nursing leadership to create a tube placement workflow to identify appropriate patients, outline communication needed with the bedside nurse and provider, and establish troubleshooting solutions (Figure 1). Intensive care unit nursing staff then trained RDNs in tube placement using camera-guided technology (IRIS) and deemed RDNs competent after 10 successful tube placements. Given limited literature on RDN-led tube placement, we defined success as &gt;80% of tube placements in an appropriate position within the gastrointestinal tract.</p><p><b>Results:</b> To date, the pilot includes 57 patients; forty-six tubes (80.7%; 39 gastric and 7 post-pyloric) were placed successfully, as confirmed by KUB. Of note, 2 of the 39 gastric tubes were originally ordered to be placed post-pylorically; however, gastric tubes were deemed acceptable for these patients after issues were identified during placement. Eleven (19%) tube placements were unsuccessful due to behavioral issues, blockages in the nasal cavity, and anatomical abnormalities.</p><p><b>Conclusion:</b> This pilot demonstrated that well-trained RDNs can successfully place feeding tubes at the bedside using camera-guided tube placement technology. One limitation of this pilot is the small sample size.
We initially limited the pilot to 2 hospital floors and had trouble educating nursing staff on the availability of the RDN to place tubes. Through evaluation of tube placement orders, we found the floors placed on average 198 tubes during our pilot, indicating 141 missed opportunities for RDNs to place tubes. To address these issues, we created numerous educational bulletins and worked with unit nursing staff to encourage contacting the RDN when feeding tube placement was needed. We also expanded the pilot hospital-wide and are looking into time periods when tubes are most often placed. Anecdotally, bedside feeding tube placement takes 30 to 60 minutes, therefore this pilot saved 1350 to 2700 minutes of nursing time and 180 to 360 minutes of fluoroscopy time necessary to place post-pyloric tubes. Overall, our pilot has demonstrated feasibility in RDN-led bedside feeding tube placement, allowing staff RDNs to practice at the top of their scope and promoting effective use of hospital resources.</p><p></p><p><b>Figure 1.</b> Dietitian Feeding Tube Insertion Pilot: 2NT and 9NT.</p><p>Lauren Murch, MSc, RD<sup>1</sup>; Janet Madill, PhD, RD, FDC<sup>2</sup>; Cindy Steel, MSc, RD<sup>3</sup></p><p><sup>1</sup>Nestle Health Science, Cambridge, ON; <sup>2</sup>Brescia School of Food and Nutritional Sciences, Faculty of Health Sciences, Western University, London, ON; <sup>3</sup>Nestle Health Science, Hamilton, ON</p><p><b>Financial Support:</b> Nestle Health Science.</p><p><b>Background:</b> Continuing education (CE) is a component of professional development which serves two functions: maintaining practice competencies, and translating new knowledge into practice. Understanding registered dietitian (RD) participation and perceptions of CE facilitates creation of more effective CE activities to enhance knowledge acquisition and practice change. This preliminary analysis of a practice survey describes RD participation in CE and evaluates barriers to CE participation.</p><p><b>Methods:</b> This was a cross-sectional survey of clinical RDs working across care settings in Canada. Targeted participants (n = 4667), identified using a convenience sample and in accordance with applicable law, were invited to complete a 25-question online survey, between November 2023 to February 2024. Descriptive statistics and frequencies were reported.</p><p><b>Results:</b> Nationally, 428 RDs working in acute care, long term care and home care, fully or partially completed the survey (9.1% response). Respondents indicated the median ideal number of CE activities per year was 3 in-person, 5 reviews of written materials, and 6 virtual activities. However, participation was most frequently reported as “less often than ideal” for in person activities (74.7% of respondents) and written material (53.6%) and “as often as ideal” for virtual activities (50.7%). Pre-recorded video presentations, live virtual presentations and critically reviewing written materials were the most common types of CE that RDs had participated in at least once in the preceding 12-months. In-person hands-on sessions, multimodal education and simulations were the least common types of CE that RDs had encountered in the preceding 12-months (Figure 1). The most frequent barriers to participation in CE were cost (68% of respondents), scheduling (60%), and location (51%). 
However, encountered barriers that most greatly limited participation in CE were inadequate staff coverage, lack of dedicated education days within role, and lack of dedicated time during work hours (Table 1). When deciding to participate in CE, RDs ranked the most important aspects of the content as 1) from a credible source, 2) specific/narrow topic relevant to practice and 3) enabling use of practical tools/skills at the bedside.</p><p><b>Conclusion:</b> This data suggests there is opportunity for improvement in RD CE participation, with the greatest gaps in in-person and written activities. Although RDs recognize the importance of relevant and practical content, they reported infrequent exposure to types of CE that are well-suited to this, such as simulations and hands-on activities. When planning and executing CE, content should be credible, relevant, and practical, using a format that is both accessible and impactful. Results of this study help benchmark Canadian RD participation in CE and provide convincing evidence to address barriers and maximize optimal participation.</p><p><b>Table 1.</b> Frequent and Impactful Barriers Limiting Participation in CE Activities.</p><p></p><p>Note: 1. Percentages total greater than 100% because all respondents selected the 3 most important barriers impacting their participation in CE activities. 2. Items are ranked based on a weighted score calculated from a 5-point Likert scale, indicating the extent to which the barrier was perceived to have limited participation in CE activities.</p><p></p><p><b>Figure 1.</b> Types of Continuing Education Activities Dietitians Participated In At Least Once, In The Preceding 12-Months.</p><p>Karen Sudders, MS, RDN, LDN<sup>1</sup>; Alyssa Carlson, RD, CSO, LDN, CNSC<sup>2</sup>; Jessica Young, PharmD<sup>3</sup>; Elyse Roel, MS, RDN, LDN, CNSC<sup>2</sup>; Sophia Vainrub, PharmD, BCPS<sup>4</sup></p><p><sup>1</sup>Medtrition, Huntingdon Valley, PA; <sup>2</sup>Endeavor Health/Aramark Healthcare +, Evanston, IL; <sup>3</sup>Parkview Health, Fort Wayne, IN; <sup>4</sup>Endeavor Health, Glenview, IL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Adequate nutrition is a critical component of patient care and plays a significant role in reducing hospital length of stay (LOS). Malnutrition is associated with numerous adverse outcomes, including increased morbidity, delayed recovery, and prolonged hospitalization (Tappenden et al., 2013). Timely nutrition interventions, and strategies for integrating successful nutrition care into hospital protocols to reduce LOS are essential parts of medical nutrition therapy. Modular nutrition often plays a key role in these interventions. A study by Klek et al. (2020) emphasizes the role of modular nutrition in providing personalized nutritional support for the specific needs of critically ill patients. The study suggests that using nutrient modules allows for a more precise adjustment of nutrition based on the metabolic requirements of patients (Klek et al., 2020). Modular nutrition has also been shown to positively impact clinical outcomes in ICU patients. An observational study by Compher et al. (2019) reported that the targeted administration of protein modules to achieve higher protein intake was associated with improved clinical outcomes, such as reduced ICU LOS (Compher et al., 2019).</p><p><b>Methods:</b> Administration of modular nutrition can be a challenge. 
Typically, modular proteins (MP) are ordered through the dietitian and dispensed as part of the diet order. The nursing team is responsible for administration and documentation of the MP. There is commonly a disconnect between MP prescription and administration. In some cases, this is related to the MP not being a tracked task in the electronic health record (EHR). The goal of this evaluation was to review data related to a quality improvement (QI) initiative in which MP (ProSource TF) was added to the medication administration record (MAR), which used a barcode scanning process to track provision and documentation of MP. The objective of this evaluation was to determine a possible correlation between the QI initiative and patients’ ICU LOS. The QI initiative evaluated a pre-implementation timeframe from June 1<sup>st</sup>, 2021 to November 30<sup>th</sup>, 2021, with a post-implementation timeframe from January 1<sup>st</sup>, 2022 to June 30<sup>th</sup>, 2022. There were a total of 1962 ICU encounters in the pre-implementation period and 1844 ICU encounters in the post-implementation period. The data were analyzed using a series of statistical tests.</p><p><b>Results:</b> The t-test for the total sample was significant, t(3804) = 8.35, p &lt; .001, indicating the average LOS was significantly lower post-implementation compared to pre-implementation (Table 1). This finding suggests that improved provision of MP may be related to a reduced LOS in the ICU. In addition to LOS, the data also suggest a relationship between the MAR and MP utilization. Pre-implementation, 1600 doses of MP were dispensed, increasing to 2400 doses post-implementation. The data suggest a correlation between product use and MAR implementation even though the overall encounters post-implementation were reduced. This represents a 50% increase in product utilization post-implementation.</p><p><b>Conclusion:</b> The data suggest a benefit of adding MP to the MAR to help improve provision, streamline documentation and potentially reduce ICU LOS.</p><p><b>Table 1.</b> Comparison of LOS Between Pre and Post Total Encounters.</p><p></p><p>Table 1 displays the t-test comparison of LOS in pre vs post implementation of MP on the MAR.</p><p></p><p><b>Figure 1.</b> Displays Product Utilization and Encounters Pre vs Post Implementation of MP on the MAR.</p><p><b>International Poster of Distinction</b></p><p>Eliana Giuntini, PhD<sup>1</sup>; Ana Zanini, RD, MSc<sup>2</sup>; Hellin dos Santos, RD, MSc<sup>2</sup>; Ana Paula Celes, MBA<sup>2</sup>; Bernadette Franco, PhD<sup>3</sup></p><p><sup>1</sup>Food Research Center/University of São Paulo, São Paulo; <sup>2</sup>Prodiet Medical Nutrition, Curitiba, Parana; <sup>3</sup>Food Research Center/School of Pharmaceutical Sciences/University of São Paulo, São Paulo</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Critically ill patients have an increased need for protein to preserve muscle mass due to anabolic resistance. Additionally, these patients are more prone to developing hyperglycemia, and one of the nutritional strategies that can be adopted is to provide a diet with a low glycemic index. Hypercaloric and high-protein enteral formulas can help meet energy and protein goals for these patients. Due to their reduced carbohydrate content, these formulas can also contribute to lowering the postprandial glycemic response.
The study aimed to evaluate the glycemic index (GI) and the glycemic load (GL) of a specialized high-protein enteral nutrition formula.</p><p><b>Methods:</b> Fifteen healthy volunteers were selected, based on self-reported absence of diseases or regular medication use, aged between 21 and 49 years, with normal glucose tolerance according to fasting and postprandial glucose assessments over 2 hours. The individuals attended after a 10-hour fast, once per week, consuming the glucose solution – reference food – for 3 weeks, and the specialized high-protein enteral formula (Prodiet Medical Nutrition) in the following week, both in amounts equivalent to 25 g of available carbohydrates. The specialized high-protein formula provides 1.5 kcal/ml, 26% protein (98 g/L), 39% carbohydrates, and 35% lipids, including EPA + DHA. Capillary blood sampling was performed at regular intervals, at 0 (before consumption), 15, 30, 45, 60, 90, and 120 minutes. The incremental area under the curve (iAUC) was calculated, excluding areas below the fasting line. Glycemic load (GL) was determined based on the equation GL = [GI (glucose=reference) X grams of available carbohydrates in the portion]/100. Student's t-test were conducted to identify differences (p &lt; 0.05).</p><p><b>Results:</b> To consume 25 g of available carbohydrates, the individuals ingested 140 g of the high-protein formula. The high-protein enteral formula showed a low GI (GI = 23) with a significant difference compared to glucose (p &lt; 0.0001) and a low GL (GL = 8.2). The glycemic curve data showed significant differences at all time points between glucose and the specialized high-protein formula, except at T90, with the glycemic peak occurring at T30 for glucose (126 mg/dL) and at both T30 and T45 for the specialized high-protein enteral formula, with values significantly lower than glucose (102 vs 126 mg/dL). The iAUC was smaller for the specialized high-protein formula compared to glucose (538 ± 91 vs 2061 ± 174 mg/dL x min) (p &lt; 0.0001), exhibiting a curve without high peak, typically observed in foods with a reduced glycemic index.</p><p><b>Conclusion:</b> The specialized high-protein enteral nutrition formula showed a low GI and GL, resulting in a significantly reduced postprandial glycemic response, with lower glucose elevation and variation. This may reduce insulin requirements, and glycemic variability.</p><p></p><p><b>Figure 1.</b> Mean Glycemic Response of Volunteers (N = 15) to 25 G of Available Carbohydrates After Consumption of Reference Food and a Specialized High-Protein Enteral Nutrition Formula, in 120 Min.</p><p>Lisa Epp, RDN, LD, CNSC, FASPEN<sup>1</sup>; Bethaney Wescott, APRN, CNP, MS<sup>2</sup>; Manpreet Mundi, MD<sup>2</sup>; Ryan Hurt, MD, PhD<sup>2</sup></p><p><sup>1</sup>Mayo Clinic Rochester, Rochester, MN; <sup>2</sup>Mayo Clinic, Rochester, MN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Hypnotherapy is the use of hypnosis for the treatment of a medical or psychological disorder. Specifically, gut directed hypnotherapy is a treatment option for functional gastrointestinal disorders and disorders of the gut brain axis. It has been shown to be effective in management of GI symptoms such as abdominal pain, nausea, functional dyspepsia and irritable bowel syndrome symptoms. Evidence suggests that 6%–19% of patients with these GI symptoms exhibit characteristics of Avoidant/restrictive food intake disorder (ARFID). 
Multiple studies show improvement in GI symptoms and ability to maintain that improvement after 1 year. However, there is a paucity of data regarding use of hypnotherapy in home enteral nutrition patients.</p><p><b>Methods:</b> A case report involving a 67-year-old adult female with h/o Irritable bowel syndrome (diarrhea predominant) and new mucinous appendiceal cancer s/p debulking of abdominal tumor, including colostomy and distal gastrectomy is presented. She was on parenteral nutrition (PN) for 1 month post op due to delayed return of bowel function before her oral diet was advanced. Unfortunately, she had difficulty weaning from PN as she was “scared to start eating” due to functional dysphagia with gagging at the sight of food, even on TV. After 4 weeks of PN, a nasojejunal feeding tube was placed and she was dismissed home.</p><p><b>Results:</b> At multidisciplinary outpatient nutrition clinic visit, the patient was dependent on enteral nutrition and reported inability to tolerate oral intake for unclear reasons. Long term enteral access was discussed, however the patient wished to avoid this and asked for alternative interventions she could try to help her eat. She was referred for gut directed hypnotherapy. After 4 in-person sessions over 3 weeks of hypnotherapy the patient was able to tolerate increasing amounts of oral intake and remove her nasal jejunal feeding. Upon follow-up 4 months later, she was still eating well and continued to praise the outcome she received from gut directed hypnotherapy.</p><p><b>Conclusion:</b> Patient-centered treatments for gut-brain axis disorders and disordered eating behaviors and/or eating disorders are important to consider in addition to nutrition support. These include but are not limited to Cognitive Behavior Therapy, mindfulness interventions, acupuncture, biofeedback strategies, and gut directed hypnotherapy. Group, online and therapist directed therapies could be considered for treatment avenues dependent on patient needs and preferences. Additional research is needed to better delineate impact of these treatment modalities in the home enteral nutrition population.</p><p>Allison Krall, MS, RD, LD, CNSC<sup>1</sup>; Cassie Fackler, RD, LD, CNSC<sup>1</sup>; Gretchen Murray, RD, LD, CNSC<sup>1</sup>; Amy Patton, MHI, RD, CNSC, LSSGB<sup>2</sup></p><p><sup>1</sup>The Ohio State University Wexner Medical Center, Columbus, OH; <sup>2</sup>The Ohio State University Wexner Medical Center, Westerville, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> It is well documented that unnecessary hospital admissions can have a negative impact on patient's physical and emotional wellbeing and can increase healthcare costs.<sup>1</sup> Numerous strategies exist to limit unnecessary hospital admissions; one innovative strategy being utilized at our 1000+ bed Academic Medical Center involves Registered Dietitians (RDs). 
Literature demonstrates that feeding tube placement by a dedicated team using electromagnetic tracking reduces patient morbidity and mortality and is a cost-effective solution for this procedure.<sup>2</sup> RDs have been part of feeding tube teams for many years, though exact numbers of RD-only teams are unclear.<sup>3</sup> The Revised 2021 Standards of Practice and Standards of Professional Performance for RDNs (Competent, Proficient, and Expert) in Nutrition Support identifies that dietitians at the “expert” level strive for additional experience and training, and thus may serve a vital role in feeding tube placement teams.<sup>4</sup> Prior to the implementation of the RD-led tube team, there was no uniform process at our hospital for these patients to obtain enteral access in a timely manner.</p><p><b>Methods:</b> In December 2023, an “RD tube team” consult and order set went live within the electronic medical record at our hospital. The original intent of the tube team was to cover the inpatient units, but it soon became apparent that there were opportunities to extend this service to observation areas and the Emergency Department (ED). This case series abstract will outline case studies from three patients with various clinical backgrounds and how the tube team was able to prevent an inpatient admission. Patient 1: An 81-year-old female returned to the ED on POD #4 s/p esophageal repair with a dislodged nasoenteric feeding tube. The RD tube team was consulted and was able to replace her tube and bridle it in place. The patient was discharged from the ED without requiring hospital readmission. Patient 2: An 81-year-old male with a history of ENT cancer was transferred to our ED after the outside hospital ED had no trained staff available to replace his dislodged nasoenteric feeding tube. The RD tube team replaced his tube and bridled it into place, and the patient was discharged from the ED without admission. Patient 3: A 31-year-old female with a complex GI history and multiple prolonged hospitalizations due to PO intolerance returned to the ED 2 days post discharge with a clogged nasoenteric feeding tube. The tube could not be unclogged, so the RD tube team replaced it in the ED and prevented readmission.</p><p><b>Results:</b> Consult volumes validated the need for a tube team service. In the first 8 months of consult and order set implementation, a total of 403 tubes were placed by the RD team. Of those, 24 (6%) were placed in the ED and observation units. In the 3 patient cases described above, and numerous other patient cases, the RD was able to successfully place a tube using an electromagnetic placement device (EMPD), thereby preventing a patient admission.</p><p><b>Conclusion:</b> Creating a feeding tube team can be a complex process to navigate and requires support from senior hospital administration, physician champions, nursing teams and legal/risk management teams.
Within the first year of implementation, our hospital system was able to demonstrate that RD-led tube teams have the potential not only to help establish safe enteral access for patients but also to be an asset to the medical facility by preventing admissions and readmissions.</p><p><b>Table 1.</b> RD Tube Team Consults (December 11, 2023-August 31, 2024).</p><p></p><p>Arina Cazac, RD<sup>1</sup>; Joanne Matthews, RD<sup>2</sup>; Kirsten Willemsen, RD<sup>3</sup>; Paisley Steele, RD<sup>4</sup>; Savannah Zantingh, RD<sup>5</sup>; Sylvia Rinaldi, RD, PhD<sup>2</sup></p><p><sup>1</sup>Internal Equilibrium, King City, ON; <sup>2</sup>London Health Sciences Centre, London, ON; <sup>3</sup>NutritionRx, London, ON; <sup>4</sup>Vanier Children's Mental Wellness, London, ON; <sup>5</sup>Listowel-Wingham and Area Family Health Team, Wingham, ON</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Parkinson's disease is the second most prevalent neurodegenerative disease, and dysphagia is a predominant disease-related symptom. Dysphagia can increase the risk of aspiration, the inhaling of food or liquids into the lungs, potentially instigating the onset of pneumonia, a frequent cause of death in patients with Parkinson's disease. Therefore, feeding tubes are placed to deliver nutrition into the stomach or the small intestine to maintain appropriate nutrition delivery and reduce the risk of aspiration from oral intake. To the best of our knowledge, there is no research comparing the differences in outcomes between gastric (G) and jejunal (J) tube feeding in patients with Parkinson's disease-related dysphagia; however, limited research does exist in critically ill populations comparing these two modalities. The purpose of this study was to compare the differences in hospital readmissions related to aspiration events and differences in mortality rates after placement of a gastric or jejunal feeding tube in patients with Parkinson's disease-related dysphagia.</p><p><b>Methods:</b> This was a retrospective chart review of patients admitted to either the medicine or clinical neurosciences units at University Hospital in London, Ontario, Canada, between January 1, 2010, and December 31, 2022. Patients were included if they had a documented diagnosis of Parkinson's or associated diseases and had a permanent feeding tube placed during hospital admission that was used for enteral nutrition. Patients were excluded from the study if they had a comorbidity that would affect their survival, such as cancer, or if a feeding tube was placed unrelated to Parkinson's disease-related dysphagia, for example, feeding tube placement post-stroke. A p-value &lt; 0.05 was considered statistically significant.</p><p><b>Results:</b> Twenty-five participants were included in this study; 7 had gastric feeding tubes, and 18 had jejunal feeding tubes. Demographic data are shown in Table 1. No statistically significant differences were found in demographic variables between the G- and J-tube groups. Of interest, none of the 28% of participants who had dementia were discharged to home; 5 were discharged to long-term care, 1 was discharged to a complex continuing care facility, and 1 died in hospital. Differences in readmission rates and mortality between groups did not reach significance, likely due to our small sample size in the G-tube group (Figures 1 and 2).
However, we found that 50% of participants were known to have died within 1 year of initiating enteral nutrition via their permanent feeding tube, and there was a trend of higher readmission rates in the G-tube group.</p><p><b>Conclusion:</b> While this study did not yield statistically significant results, it highlights the need for further research with a larger sample size to assess confounding factors, such as concurrent oral intake, that affect the difference in outcomes between G- and J-tube groups. Future research would also benefit from examining the impact on quality of life in these patients. Additional research is necessary to inform clinical practice guidelines and clinical decision-making for clinicians, patients and families when considering a permanent feeding tube.</p><p><b>Table 1.</b> Participant Demographics.</p><p></p><p></p><p>Readmission rates were calculated as a percentage of the number of readmissions to the number of discharges from hospital. If a participant was readmitted more than once within the defined timeframes, subsequent readmissions were counted as a new readmission and new discharge event. Readmission rate calculations did not include participants who died during or after the defined timeframes. Differences in readmission rates between gastric and jejunal feeding tube groups did not reach statistical significance.</p><p><b>Figure 1.</b> Readmission Rate.</p><p></p><p>Mortality rates were calculated from the time that enteral nutrition was initiated through a permanent feeding tube in 30-day, 60-day, 90-day, and 1-year time intervals. Differences in mortality rates between gastric and jejunal feeding tube groups did not reach statistical significance.</p><p><b>Figure 2.</b> Mortality Rate.</p><p>Jennifer Carter, MHA, RD<sup>1</sup></p><p><sup>1</sup>Winchester Medical Center, Valley Health, Winchester, VA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Early enteral nutrition has been shown to improve patient outcomes and can decrease, or attenuate, the progression of malnutrition. Placement of nasoenteric feeding tubes was deemed within the scope of practice (SOP) for Registered Dietitian Nutritionists (RDNs) in 2007, with recent revisions in 2021, by the Academy of Nutrition and Dietetics (AND) and the American Society for Parenteral and Enteral Nutrition (ASPEN). With enhanced order-writing privileges, RDNs are well positioned to identify patients in need of enteral nutrition. The goal of this abstract is to highlight the efficiency of an RDN-led nasoenteric tube placement team.</p><p><b>Methods:</b> A retrospective chart review of the first 70 patients who received a nasoenteric tube placed by an RDN in 2023 was conducted. Data points collected included time from tube order to tube placement and time from tube order to enteral nutrition order.</p><p><b>Results:</b> Out of 70 tubes placed, the average time from tube order to tube placement was 2.28 hours. The longest time from tube order to placement was 19 hours. The average time from tube order to enteral nutrition order was 5.58 hours. The longest time from tube order to enteral nutrition order was 23.6 hours.</p><p><b>Conclusion:</b> This retrospective review reflects the timeliness of placement and provision of enteral nutrition in the acute care setting when performed by an all-RDN team. Overall, placement occurred less than 2.5 hours after the tube placement order, and enteral nutrition orders were entered less than 6 hours after the tube placement order.
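As an illustrative aside, these timing metrics (hours from tube order to placement and from tube order to enteral nutrition order) can be derived from chart-review timestamps along the following lines; the data frame and column names here are hypothetical assumptions for illustration and are not the study's dataset.
<pre>
import pandas as pd

# Minimal sketch with invented timestamps: one row per tube placed.
# Column names (order_time, placement_time, en_order_time) are assumptions, not study data.
df = pd.DataFrame({
    "order_time": pd.to_datetime(["2023-01-05 08:00", "2023-01-07 14:30"]),
    "placement_time": pd.to_datetime(["2023-01-05 10:15", "2023-01-07 15:10"]),
    "en_order_time": pd.to_datetime(["2023-01-05 13:00", "2023-01-08 02:00"]),
})

# Hours from tube order to tube placement, and from tube order to enteral nutrition order.
df["order_to_placement_h"] = (df["placement_time"] - df["order_time"]).dt.total_seconds() / 3600
df["order_to_en_order_h"] = (df["en_order_time"] - df["order_time"]).dt.total_seconds() / 3600

# Average and longest intervals, mirroring the metrics reported above.
print(df["order_to_placement_h"].mean(), df["order_to_placement_h"].max())
print(df["order_to_en_order_h"].mean(), df["order_to_en_order_h"].max())
</pre>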
The RDNs at Winchester Medical Center have been placing nasoenteric feeding tubes since 2013 using an electromagnetic tube placement device (EMPD). As of 2018, it became an all-RDN team. With enhanced support from AND and ASPEN, nasoenteric feeding tube placement continues to be promoted within the SOP for RDNs. Also, the Accreditation Council for Education in Nutrition and Dietetics (ACEND) now requires dietetic interns to learn, observe, and even assist with nasoenteric tube placements. Over time, more RDNs in the acute care setting will likely advance their skillset to include this expertise.</p><p></p><p><b>Figure 1.</b> Time From MD Order to Tube Placement in Hours.</p><p></p><p><b>Figure 2.</b> Time From MD Order of Tube to Tube Feed Order in Hours.</p><p><b>Poster of Distinction</b></p><p>Vanessa Millovich, DCN, MS, RDN, CNSC<sup>1</sup>; Susan Ray, MS, RD, CNSC, CDCES<sup>2</sup>; Robert McMahon, PhD<sup>3</sup>; Christina Valentine, MD, RDN, FAAP, FASPEN<sup>4</sup></p><p><sup>1</sup>Kate Farms, Hemet, CA; <sup>2</sup>Kate Farms, Temecula, CA; <sup>3</sup>Seven Hills Strategies, Columbus, OH; <sup>4</sup>Kate Farms, Cincinnati, OH</p><p><b>Financial Support:</b> Kate Farms provided all financial support.</p><p><b>Background:</b> Whole food plant-based diets have demonstrated metabolic benefits across many populations. The resulting increased intake of dietary fiber and phytonutrients is integral to the success of this dietary pattern due to the positive effect on the digestive system. Patients dependent on tube feeding may not receive dietary fiber or sources of phytonutrients, and the impact of this is unknown. Evidence suggests that the pathways that promote digestive health include more than traditional prebiotic sources from carbohydrate fermentation. Data on protein fermentation metabolites and potential adverse effects on colon epithelial cell integrity are emerging. These lesser-known metabolites, branched-chain fatty acids (BCFAs), are produced through proteolytic fermentation. Emerging research suggests that the overproduction of BCFAs via protein fermentation may be associated with toxic by-products like p-cresol. These resulting by-products may play a role in digestive disease pathogenesis. Enteral formulas are often used to support the nutritional needs of those with digestive conditions. Plant-based formulations made with yellow pea protein have been reported to improve GI tolerance symptoms. However, the underlying mechanisms responsible have yet to be investigated. The purpose of this study was to assess the impact of a mixed food matrix enteral formula containing pea protein, fiber, and phytonutrients on various markers of gut health in healthy children and adults, using an in-vitro model.</p><p><b>Methods:</b> Stool samples from 10 healthy pediatric and 10 healthy adult donors were collected and stored at -80°C. The yellow pea protein formulas (Kate Farms™ Pediatric Standard 1.2 Vanilla-P1, Pediatric Peptide 1.0 Vanilla-P2, and Standard 1.4 Plain-P3) were first predigested using standardized intestinal processing to simulate movement along the digestive tract. The in-vitro model was ProDigest's Colon-on-a-Plate (CoaP®) simulation platform, which has demonstrated in vivo-in vitro correlation. Measurements of microbial metabolic activity included pH, gas production, short-chain fatty acids (SCFAs), BCFAs, ammonia, and microbiota shifts. Paired two-sided t-tests were performed to evaluate differences between treatment and control.
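As a minimal sketch of this paired treatment-versus-control comparison (the per-donor values below are invented for illustration and are not study data):
<pre>
from scipy import stats

# Hypothetical total SCFA production (mM) for one formula and the negative control,
# paired by donor (the same 10 donors in both lists); values are illustrative only.
treatment = [52.1, 48.3, 60.5, 55.0, 47.9, 58.2, 50.4, 61.1, 49.7, 54.6]
control   = [44.0, 45.2, 51.3, 49.8, 42.5, 50.1, 46.7, 53.0, 44.9, 48.8]

# Paired (related-samples), two-sided t-test, as described above.
t_stat, p_value = stats.ttest_rel(treatment, control)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
</pre>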
Differential abundance analysis was performed using LEfSe and treeclimbR. Statistical significance, as compared to the negative control, was indicated by a p-value of &lt; 0.05.</p><p><b>Results:</b> In the pediatric group, the microbial analysis showed significant enrichment of Bifidobacteria as well as the butyrate-producing genera Agathobacter and Agathobaculum with the use of the pediatric formulas when compared to the control. P1 resulted in a statistically significant reduction of BCFA production (p ≤ 0.05). P1 and P2 resulted in statistically significant increases in acetate and propionate. In the adult group, with treatment using P3, microbial analysis showed significant enrichment of Bifidobacteria compared to the control group. P3 also resulted in a reduction of BCFAs, although not statistically significant. Gas production and drop in pH were statistically significant (p ≤ 0.05) for all groups (P1, P2, and P3) compared to control, indicating microbial activity.</p><p><b>Conclusion:</b> All enteral formulas demonstrated a consistent prebiotic effect on the gut microbial community composition in healthy pediatric and adult donors. These findings provide insight into the mechanisms related to digestive health and highlight the importance of designing prospective interventional research to better understand the role of fiber and phytonutrients within enteral products.</p><p>Hill Johnson, MEng<sup>1</sup>; Shanshan Chen, PhD<sup>2</sup>; Garrett Marin<sup>3</sup></p><p><sup>1</sup>Luminoah Inc, Charlottesville, VA; <sup>2</sup>Virginia Commonwealth University, Richmond, VA; <sup>3</sup>Luminoah Inc, San Diego, CA</p><p><b>Financial Support:</b> Research was conducted with the support of VBHRC's VA Catalyst Grant Funding.</p><p><b>Background:</b> Medical devices designed for home use must prioritize user safety, ease of operation, and reliability, especially in critical activities such as enteral feeding. This study aimed to validate the usability, safety, and overall user satisfaction of a novel enteral nutrition system through summative testing and task analysis.</p><p><b>Methods:</b> A simulation-based, human factors summative study was conducted with 36 participants, including both caregivers and direct users of enteral feeding technology. Participants were recruited across three major cities: Houston, Chicago, and Phoenix. Task analysis focused on critical and non-critical functions of the Luminoah FLOW™ Enteral Nutrition System, while user satisfaction was measured using the System Usability Scale (SUS). The study evaluated successful task completion, potential use errors, and qualitative user feedback.</p><p><b>Results:</b> All critical tasks were completed successfully by 100% of users, with the exception of a cleaning task, which had an 89% success rate. Non-critical tasks reached an overall completion rate of 95.7%, demonstrating the ease of use and intuitive design of the system. The SUS score was exceptionally high, with an average score of 91.5, indicating strong user preference for the device over current alternatives. Furthermore, 91% of participants indicated they would choose the new system over other products on the market.</p><p><b>Conclusion:</b> The innovative portable enteral nutrition system demonstrated excellent usability and safety, meeting the design requirements for its intended user population. High completion rates for critical tasks and an overwhelmingly positive SUS score underscore the system's ease of use and desirability.
These findings suggest that the system is a superior option for home enteral feeding, providing both safety and efficiency in real-world scenarios. Further refinements in instructional materials may improve user performance on non-critical tasks.</p><p>Elease Tewalt<sup>1</sup></p><p><sup>1</sup>Phoenix Veterans Affairs Administration, Phoenix, AZ</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Enhanced Recovery After Surgery (ERAS) protocols, including preoperative carbohydrate loading, aim to accelerate recovery by reducing stress responses and insulin resistance. These protocols have been shown to decrease hospital stays, postoperative complications, and healthcare costs. However, there is limited knowledge about the safety and efficacy of ERAS for diabetic patients. Patients with diabetes make up 15% of surgical cases and often have longer hospital stays and more postoperative complications. This study evaluated outcome measures important to patients with diabetes in a non-diabetic population in order to lay the groundwork for future trials that could include diabetic patients in ERAS protocols.</p><p><b>Methods:</b> A retrospective chart review at the Phoenix Veterans Affairs Health Care System compared 24 colorectal surgery patients who received preoperative carbohydrate drinks with 24 who received traditional care. Outcomes assessed included blood glucose (BG) levels, aspiration, and postoperative complications. Additional analyses evaluated adherence, length of hospital stay, and healthcare costs.</p><p><b>Results:</b> The demographics of the two groups were comparable (Table 1). The preoperative BG levels of the carbohydrate loading group (164.6 ± 36.3 mg/dL) were similar to those of the control group (151.8 ± 47.7 mg/dL) (p &gt; 0.05) (Figure 1). The carbohydrate loading group demonstrated lower and more stable postoperative BG levels (139.4 ± 37.5 mg/dL) compared to the control group (157.6 ± 61.9 mg/dL), but this difference was not statistically significant (p &gt; 0.05) (Figure 2). There were no significant differences in aspiration or vomiting between the groups (p &gt; 0.05) (Table 2). The carbohydrate loading group had a shorter average hospital stay by one day, but this difference was not statistically significant (p &gt; 0.05) (Table 2).</p><p><b>Conclusion:</b> Carbohydrate loading as part of ERAS protocols was associated with better postoperative glucose control, no increased risk of complications, and reduced hospital stays. Although diabetic patients were not included in this study, these findings suggest that carbohydrate loading is a safe and effective component of ERAS. Including diabetic patients in ERAS is a logical next step that could significantly improve surgical outcomes for this population. Future research should focus on incorporating diabetic patients to assess the impact of carbohydrate loading on postoperative glucose control, complication rates, length of stay, and healthcare costs.</p><p><b>Table 1.</b> Demographics.</p><p></p><p>The table includes the demographics of the two groups. The study group consists of participants who received the carbohydrate drink, while the control group includes those who received traditional care.</p><p><b>Table 2.</b> Postoperative Outcomes.</p><p></p><p>The table includes the postoperative outcomes of the two groups.
The study group consists of participants who received the carbohydrate drink, while the control group includes those who received traditional care.</p><p></p><p>The figure shows the preoperative BG levels of the carbohydrate and non-carbohydrate groups, along with a trendline for each group (p &gt; 0.05).</p><p><b>Figure 1.</b> Preoperative BG Levels.</p><p></p><p>The figure shows the postoperative BG levels of the carbohydrate and non-carbohydrate groups, along with a trendline for each group (p &gt; 0.05).</p><p><b>Figure 2.</b> Postoperative BG Levels.</p><p><b>Malnutrition and Nutrition Assessment</b></p><p>Amy Patton, MHI, RD, CNSC, LSSGB<sup>1</sup>; Elisabeth Schnicke, RD, LD, CNSC<sup>2</sup>; Sarah Holland, MSc, RD, LD, CNSC<sup>3</sup>; Cassie Fackler, RD, LD, CNSC<sup>2</sup>; Holly Estes-Doetsch, MS, RDN, LD<sup>4</sup>; Kristen Roberts, PhD, RDN, LD, CNSC, FASPEN, FAND<sup>5</sup>; Christopher Taylor, PhD, RDN<sup>4</sup></p><p><sup>1</sup>The Ohio State University Wexner Medical Center, Westerville, OH; <sup>2</sup>The Ohio State University Wexner Medical Center, Columbus, OH; <sup>3</sup>The Ohio State University Wexner Medical Center, Upper Arlington, OH; <sup>4</sup>The Ohio State University, Columbus, OH; <sup>5</sup>The Ohio State University, Granville, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> The unfavorable association of malnutrition with hospital outcomes such as longer length of stay (LOS), increased falls, and increased hospital readmissions has been well documented in the literature. We aimed to see if a different model of care that lowered the Registered Dietitian (RD)-to-patient ratio would lead to increased malnutrition identification and documentation within our facility. We also evaluated the relationship between these metrics and LOS on monitored hospital units.</p><p><b>Methods:</b> In July 2022, two additional RDs were hired at a large 1000+ bed academic medical center as part of a pilot program focused on malnutrition identification, documentation integrity, and staff training. The RD-to-patient ratio was reduced from 1:75 to 1:47 on three units of the hospital. On the pilot units, the RD completed a full nutrition assessment, including a Nutrition Focused Physical Exam (NFPE), for all patients who were identified as “at risk” per hospital nutrition screening policy. Those patients who were not identified as “at risk” received a full RD assessment with NFPE by day 7 of admission or per consult request. A malnutrition dashboard was created with assistance from a Quality Data Manager from the Quality and Operations team. This visual graphic allowed us to track and monitor RD malnutrition identification rates by unit and the percentage of patients who had a malnutrition diagnosis captured by the billing and coding team. Data was also pulled from the Electronic Medical Record (EMR) to look at other patient outcomes. In a retrospective analysis, we compared the new model of care to the standard model on one of these units.</p><p><b>Results:</b> There was an increase in the RD-identified capture rate of malnutrition on the pilot units. On a cardiac care unit, the RD identification rate went from a baseline of 6% in Fiscal Year (FY) 2022 to an average of 12.5% over FY 2023-2024. On two general medicine units, the malnutrition rates identified by RDs nearly doubled during the two-year intervention (Table 1).
LOS was significantly lower on one of the general medicine intervention floors compared to a control unit (p &lt; 0.001, Cohen's D: 13.8) (Table 2). LOS was reduced on all units analyzed between FY22 and FY23/24. Patients with a malnutrition diagnosis on the control unit had a 15% reduction in LOS from FY22 to FY23/24, compared with a 19% reduction in LOS for those identified with malnutrition on the intervention unit. When comparing intervention versus control units for FY23 and FY24 combined, the intervention unit had a much lower LOS than the control unit.</p><p><b>Conclusion:</b> Dietitian assessments and related interventions may contribute to reducing LOS. Reducing RD-to-patient ratios may allow for greater identification of malnutrition and support patient outcomes such as LOS. There is an opportunity to evaluate other patient outcomes for the pilot units including falls, readmission rates and Case Mix Index.</p><p><b>Table 1.</b> RD Identified Malnutrition Rates on Two General Medicine Pilot Units.</p><p></p><p><b>Table 2.</b> Control Unit and Intervention Unit Length of Stay Comparison.</p><p></p><p>Amy Patton, MHI, RD, CNSC, LSSGB<sup>1</sup>; Misty McGiffin, DTR<sup>2</sup></p><p><sup>1</sup>The Ohio State University Wexner Medical Center, Westerville, OH; <sup>2</sup>The Ohio State University Wexner Medical Center, Columbus, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Delays in identifying patients at nutrition risk can impact patient outcomes. Per facility policy, Dietetic Technicians (DTs) and Registered Dietitians (RDs) review patient charts, meet with patients, and assign a nutrition risk level. High-risk patients are then assessed by the RD for malnutrition among other nutrition concerns. Based on a new tracking process implemented in January of 2023, an average of 501 patient nutrition risk assignments were overdue or incomplete per month from January through April of 2023, with a dramatic increase to 835 in May. Missed risk assignments occur daily. If the risk assignment is not completed within the 72-hour parameters of the policy, this can result in late or missed RD assessment opportunities and policy compliance concerns.</p><p><b>Methods:</b> In June 2023, a Lean Six Sigma quality improvement project using the DMAIC (Define, Measure, Analyze, Improve, Control) framework was initiated at a 1000+ bed Academic Medical Center with the goal of improving the efficiency of the nutrition risk assignment (NRA) process for RDs and DTs. A secondary goal was to see if potential improved efficiency would also lead to an increase in RD-identified malnutrition based on Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition Indicators of Malnutrition (AAIM) criteria. A group of RDs and DTs met to walk through each of the quality improvement process steps. The problem was defined, and baseline measurements of malnutrition identification rates and missed nutrition risk assignments were analyzed. A fishbone diagram was used to help with root cause analysis, and later a payoff matrix was used to identify potential interventions. The Improve phase was implemented in October and November 2023 and included changes to the screening policy itself and redistributing clinical nutrition staff on certain patient units.</p><p><b>Results:</b> Identified improvements did have a positive impact on incomplete work and on malnutrition identification rates.
Average malnutrition identification rates were 11.7% from May through October, compared with 12.1% from November through April. The number of missed NRAs decreased from an average of 975 per month from May through October to 783 per month from November through April, a decrease of 192 per month (20%). An additional quality improvement process cycle is currently underway to further improve these metrics.</p><p><b>Conclusion:</b> Fostering a culture of ongoing improvement presents a significant challenge for both clinicians and leaders. Enhancing nutrition care and boosting clinician efficiency are critical goals. Introducing clinical nutrition leaders to tools designed for quality monitoring and enhancement can lead to better performance outcomes and more effective nutrition care. Tools such as those used for this project, along with PDSA (Plan, Do, Study, Act) projects, are valuable in this process. Involving team members from the beginning of these improvement efforts can also help ensure the successful adoption of practice changes.</p><p><b>Table 1.</b> RD Identified Malnutrition Rates.</p><p></p><p><b>Table 2.</b> Incomplete Nutrition Risk Assignments (NRAs).</p><p></p><p>Maurice Jeanne Aguero, RN, MD<sup>1</sup>; Precy Gem Calamba, MD, FPCP, DPBCN<sup>2</sup></p><p><sup>1</sup>Department of Internal Medicine, Prosperidad, Agusan del Sur; <sup>2</sup>Medical Nutrition Department, Tagum City, Davao del Norte</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition is a strong predictor of mortality, of morbidity through poor response to therapy, and of quality of life among gastrointestinal (GI) cancer patients. In a tertiary government hospital in Tagum City where a cancer center is present, although malnutrition screening among patients with cancer is routine, no prior studies had examined the association between nutritional status and quality of life among GI cancer patients. This study aimed to determine whether nutritional status is associated with quality of life among adult Filipino GI cancer patients seeking cancer therapy in a tertiary government hospital.</p><p><b>Methods:</b> A quantitative, observational, cross-sectional, analytical, and predictive survey design was used. The World Health Organization Quality of Life Brief Version (WHOQOL-BREF) questionnaire was used to determine the quality of life of cases. Logistic regression analysis was used to assess the association of the demographic, clinical, and nutritional profiles with quality of life among patients with gastrointestinal cancer.</p><p><b>Results:</b> Among respondents (n = 160, mean age 56.4 ± 12 years), the majority were male (61.9%), married (77.5%), and Roman Catholic (81.1%); 38.1% had finished high school. Almost half were diagnosed cases of colon adenocarcinoma (43.125%), followed by rectal adenocarcinoma (22.5%), rectosigmoid adenocarcinoma (11.875%), then GI stromal tumor (5.625%). On cancer staging, 40.625% were Stage 4, followed by Stage 3b (19.375%), Stage 3c (10%), Stage 3a (5.625%), then Stage 2a (4.375%). Only 2.5% were Stage 4a, while 0.625% were Stage 4b. More than one third received CAPEOX (38.125%), followed by FOLFOX (25.625%), then IMATINIB (5.625%). Among cases, 15.6% were underweight, while 34.4% were overweight or obese. In terms of Subjective Global Assessment (SGA) grading, 38.1% were severely malnourished and 33.8% moderately malnourished, while the rest were normal to mildly malnourished.
For quality of life, mean scores indicated that participants rated their general quality of life as good (3.71 ± 0.93); were generally satisfied with their perception of general health, with themselves, and with their relationships with others (3.46 to 3.86 ± 0.97); were moderately satisfied with having enough energy for daily life, accepting their bodily appearance, the availability of information needed for daily living, and the extent of opportunity for leisure (2.71 to 3.36 ± 1.02); and reported little satisfaction with having enough money to meet their needs (2.38 ± 0.92). Participants, on average, quite often experienced negative feelings such as blue mood, despair, depression, and anxiety (2.81 ± 0.79). Significant associations with quality of life were documented for age (p = 0.047), cancer diagnosis (p = 0.001), BMI status (p = 0.028), and SGA nutritional status (p = 0.010).</p><p><b>Conclusion:</b> Nutritional status was significantly associated with the quality of life among adult Filipino GI cancer patients seeking cancer therapy in a tertiary government hospital. Public health interventions targeting these factors may play a critical role in improving patient survival and outcomes.</p><p>Carmen Kaman Lo, MS, RD, LDN, CNSC<sup>1</sup>; Hannah Jacobs, OTD, OTR/L<sup>2</sup>; Sydney Duong, MS, RD, LDN<sup>3</sup>; Julie DiCarlo, MS<sup>4</sup>; Donna Belcher, EdD, MS, RD, LDN, CDCES, CNSC, FAND<sup>5</sup>; Galina Gheihman, MD<sup>6</sup>; David Lin, MD<sup>7</sup></p><p><sup>1</sup>Massachusetts General Hospital, Sharon, MA; <sup>2</sup>MedStar National Rehabilitation Hospital, Washington, DC; <sup>3</sup>New England Baptist Hospital, Boston, MA; <sup>4</sup>Center for Neurotechnology and Neurorecovery, Mass General Hospital, Boston, MA; <sup>5</sup>Nutrition and Food Services, MGH, Boston, MA; <sup>6</sup>Harvard Medical School and Mass General Hospital, Boston, MA; <sup>7</sup>Neurocritical Care &amp; Neurorecovery, MGH, Boston, MA</p><p><b>Financial Support:</b> Academy of Nutrition and Dietetics, Dietitian in Nutrition Support Member Research Award.</p><p><b>Background:</b> Nutritional status is a known modifiable factor for optimal recovery in brain injury survivors, yet data on specific benchmarks for optimizing clinical outcomes through nutrition are limited. This pilot study aimed to quantify the clinical, nutritional, and functional characteristics of brain injury survivors from ICU admission to 90 days post-discharge.</p><p><b>Methods:</b> Patients admitted to the Massachusetts General Hospital (MGH) Neurosciences ICU over a 12-month period were enrolled based on these criteria: age 18 years or greater, a primary diagnosis of acute brain injury, ICU stay of at least 72 hours, and meeting MGH Neurorecovery Clinic referral criteria for outpatient follow-up with survival beyond 90 days post-discharge. Data were collected from the electronic health record and from the neurorecovery clinic follow-up visit/phone interview. These included patient characteristics, acute clinical outcomes, nutrition intake, and surrogate nutrition and functional scores at admission, discharge, and 90 days post-discharge. Descriptive statistics were used for analysis.</p><p><b>Results:</b> Of 212 admissions during the study period, 50 patients were included in the analysis. The mean APACHE (n = 50), GCS (n = 50), and NIHSS (n = 20) scores were 18, 11, and 15, respectively.
Seventy-eight percent of the patients required ventilation, with a mean duration of 6.3 days. Mean ICU and post-ICU lengths of stay were 17.4 and 15.9 days, respectively. Eighty percent received nutrition (enteral or oral) within 24-48 hours of ICU admission. The first 7-day mean ICU energy and protein intake were 1128 kcal/day and 60.3 g protein/day, respectively, both 63% of estimated needs. When assessed against ASPEN guidelines, patients with a BMI ≥ 30 received less energy (11.6 vs 14.6 kcal/kg/day) but higher protein (1.04 vs 0.7 g protein/kg/day) than those with a BMI &lt; 30. Twelve percent of patients received less than 50% of their estimated nutritional needs for at least 7 days pre-discharge and were considered nutritionally at risk. Forty-six percent were discharged with long-term enteral access. Only 16% of the patients were discharged to home rather than a rehabilitation facility. By 90 days post-discharge, 32% of the patients were readmitted, with 27% due to stroke. Upon admission, patients’ mean MUST (Malnutrition Universal Screening Tool) and MST (Malnutrition Screening Tool) scores were 0.56 and 0.48, respectively, reflecting that they were at low nutritional risk. By discharge, the mean MUST and MST scores of these patients increased to 1.16 and 2.08, respectively, suggesting these patients had become nutritionally at risk. At 90 days post-discharge, both scores returned to a low nutrition risk (MUST 0.48 and MST 0.59). All patients’ functional scores, as measured by the modified Rankin scale (mRS), followed a similar pattern: the mean score was 0.1 at admission, 4.2 at discharge, and 2.8 at 90 days post-discharge. The 90-day post-discharge Barthel index was 64.1, indicating moderate dependence in these patients.</p><p><b>Conclusion:</b> This pilot study highlighted key nutritional, clinical, and functional characteristics of brain injury survivors from ICU admission to 90 days post-discharge. Further statistical analysis will help delineate the relationships between nutritional status and clinical and functional outcomes, which may guide future research and practice in ICU nutrition and neurorehabilitation.</p><p>Lavanya Chhetri, BS<sup>1</sup>; Amanda Van Jacob, MS, RDN, LDN, CCTD<sup>1</sup>; Sandra Gomez, PhD, RD<sup>1</sup>; Pokhraj Suthar, MBBS<sup>1</sup>; Sarah Peterson, PhD, RD<sup>1</sup></p><p><sup>1</sup>Rush University Medical Center, Chicago, IL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Identifying frailty in patients with liver disease provides valuable insights into a patient's nutritional status and physical resilience. However, it is unclear if reduced muscle mass is an important etiology of frailty in liver disease. Identifying the possible connection between frailty and muscle mass may lead to better risk prediction, personalized interventions, and improved outcomes in patient care. The purpose of this study was to determine if frail patients have a lower skeletal muscle index (SMI) compared to not-frail patients with liver disease undergoing liver transplant evaluation.</p><p><b>Methods:</b> A retrospective, cross-sectional study design was utilized. Patients greater than 18 years of age who underwent a liver transplant evaluation from January 1, 2019, through December 31, 2023, were included if they had a Liver Frailty Index (LFI) assessment completed during the initial liver transplant evaluation and a diagnostic abdominal CT scan completed within 30 days of the initial liver transplant evaluation.
Demographic data (age, sex, height, and BMI), etiology of liver disease, MELD-Na score, history of diabetes and hepatocellular carcinoma, liver disease complications (ascites, hepatocellular carcinoma, hepatic encephalopathy &amp; esophageal varices), and LFI score were recorded for each patient. LFI was recorded as both a continuous variable and dichotomized into a categorical variable (frail: defined as LFI ≥ 4.5 versus not frail: defined as LFI ≤ 4.4). Cross-sectional muscle area (cm<sup>2</sup>) from the third lumbar region of the CT was quantified; SMI was calculated (cm<sup>2</sup>/height in meters<sup>2</sup>) and low muscle mass was dichotomized into a categorical variable (low muscle mass: defined as SMI ≤ 50 cm<sup>2</sup>/m<sup>2</sup> for males and ≤39 cm<sup>2</sup>/m<sup>2</sup> for females versus normal muscle mass: defined as SMI &gt; 50 cm<sup>2</sup>/m<sup>2</sup> for males and &gt;39 cm<sup>2</sup>/m<sup>2</sup> for females). An independent t-test analysis was used to determine if there is a difference in SMI between patients who are categorized as frail versus not frail.</p><p><b>Results:</b> A total of 104 patients, 57% male with a mean age of 57 ± 10 years and mean of BMI 28.1 ± 6.4 kg/m<sup>2</sup>, were included. The mean MELD-Na score was 16.5 ± 6.9; 25% had a history of hepatocellular carcinoma and 38% had a history of diabetes. The majority of the sample had at least one liver disease complication (72% had ascites, 54% had hepatic encephalopathy, and 67% had varices). The mean LFI score was 4.5 ± 0.9 and 44% were categorized as frail. The mean SMI was 45.3 ± 12.6 cm<sup>2</sup>/m<sup>2</sup> and 52% were categorized as having low muscle mass (males: 63% and females: 38%). There was no difference in SMI between patients who were frail versus not frail (43.5 ± 10.6 versus 47.3 ± 13.9 cm<sup>2</sup>/m<sup>2</sup>, p = 0.06). The difference between SMI by frailty status was reported for males and females, no significance testing was used due to the small sample size. Both frail males (43.5 ± 12.2 versus 48.4 ± 14.9) and females (43.4 ± 9.3 versus 45.2 ± 11.8) had a lower SMI compared to non-frail patients.</p><p><b>Conclusion:</b> No difference in SMI between frail versus not frail patients was observed; however, based on the p-value of 0.06 a marginal trend and possible difference may exist, but further research is needed to confirm the findings. Additionally, it is concerning that men had a higher rate of low muscle mass and the mean SMI for both frail and not frail men was below the cut-off used to identify low muscle mass (SMI ≤ 50 cm<sup>2</sup>/m<sup>2</sup>). Additional research is needed to explore the underlying factors contributing to low muscle mass in men, particularly in frail populations, and to determine whether targeted interventions aimed at improving muscle mass could mitigate frailty and improve clinical outcomes in patients undergoing liver transplant evaluation.</p><p>Rebekah Preston, MS, RD, LD<sup>1</sup>; Keith Pearson, PhD, RD, LD<sup>2</sup>; Stephanie Dobak, MS, RD, LDN, CNSC<sup>3</sup>; Amy Ellis, PhD, MPH, RD, LD<sup>1</sup></p><p><sup>1</sup>The University of Alabama, Tuscaloosa, AL; <sup>2</sup>The University of Alabama at Birmingham, Birmingham, AL; <sup>3</sup>Thomas Jefferson University, Philadelphia, PA</p><p><b>Financial Support:</b> The ALS Association Quality of Care Grant.</p><p><b>Background:</b> Amyotrophic Lateral Sclerosis (ALS) is a progressive neurodegenerative disease. 
Malnutrition is common in people with ALS (PALS) due to dysphagia, hypermetabolism, self-feeding difficulty and other challenges. In many clinical settings, malnutrition is diagnosed using the Academy of Nutrition and Dietetics/American Society of Parenteral and Enteral Nutrition indicators to diagnose malnutrition (AAIM) or the Global Leadership Initiative on Malnutrition (GLIM) criteria. However, little is known about malnutrition assessment practices in ALS clinics. This qualitative study explored how RDs at ALS clinics are diagnosing malnutrition in PALS.</p><p><b>Methods:</b> Researchers conducted 6 virtual focus groups with 22 RDs working in ALS clinics across the United States. Audio recordings were transcribed verbatim, and transcripts imported to NVivo 14 software (QSR International, 2023, Melbourne, Australia). Two research team members independently analyzed the data using deductive thematic analysis.</p><p><b>Results:</b> The AAIM indicators were identified as the most used malnutrition diagnostic criteria. Two participants described using a combination of AAIM and GLIM criteria. Although all participants described performing thorough nutrition assessments, some said they do not formally document malnutrition in the outpatient setting due to lack of reimbursement and feeling as though the diagnosis would not change the intervention. Conversely, others noted the importance of documenting malnutrition for reimbursement purposes. Across all groups, RDs reported challenges in diagnosing malnutrition because of difficulty differentiating between disease- versus malnutrition-related muscle loss. Consequently, several RDs described adapting current malnutrition criteria to focus on weight loss, decreased energy intake, or fat loss.</p><p><b>Conclusion:</b> Overall, RDs agreed that malnutrition is common among PALS, and they conducted thorough nutrition assessments as part of standard care. Among those who documented malnutrition, most used AAIM indicators to support the diagnosis. However, as muscle loss is a natural consequence of ALS, RDs perceived difficulty in assessing nutrition-related muscle loss. This study highlights the need for malnutrition criteria specific for PALS.</p><p><b>Table 1.</b> Themes Related to Diagnosing Malnutrition in ALS.</p><p></p><p>Carley Rusch, PhD, RDN, LDN<sup>1</sup>; Nicholas Baroun, BS<sup>2</sup>; Katie Robinson, PhD, MPH, RD, LD, CNSC<sup>1</sup>; Maria Geraldine E. Baggs, PhD<sup>1</sup>; Refaat Hegazi, MD, PhD, MPH<sup>1</sup>; Dominique Williams, MD, MPH<sup>1</sup></p><p><sup>1</sup>Abbott Nutrition, Columbus, OH; <sup>2</sup>Miami University, Oxford, OH</p><p><b>Financial Support:</b> This study was supported by Abbott Nutrition.</p><p><b>Background:</b> Malnutrition is increasingly recognized as a condition that is present in all BMI categories. Although much research to date has focused on malnutrition in patients with lower BMIs, there is a need for understanding how nutrition interventions may alter outcomes for those with higher BMIs and existing comorbidities. 
In a post-hoc analysis of the NOURISH trial, which investigated hospitalized older adults with malnutrition, we sought to determine whether consuming a specialized oral nutritional supplement (ONS) containing high energy, protein, and beta-hydroxy-beta-methylbutyrate (ONS+HMB) can improve vitamin D and nutritional status in those with a BMI ≥ 27.</p><p><b>Methods:</b> Using data from the NOURISH trial, a randomized, placebo-controlled, multi-center, double-blind study conducted in hospitalized participants with malnutrition and a primary diagnosis of congestive heart failure, acute myocardial infarction, pneumonia, or chronic obstructive pulmonary disease, a post-hoc analysis was performed. In the trial, participants received standard care with either ONS + HMB or a placebo beverage (target 2 servings/day) during hospitalization and for 90 days post-discharge. Nutritional status was assessed using Subjective Global Assessment (SGA) and handgrip strength at baseline and at 0, 30, 60, and 90 days post-discharge. Vitamin D (25-hydroxyvitamin D) was assessed within 72 hours of admission (baseline) and at 30 and 60 days post-discharge. Participants with a BMI ≥ 27 formed the analysis cohort. Treatment effect was determined using analysis of covariance adjusted for baseline measures.</p><p><b>Results:</b> The post-hoc cohort consisted of 166 patients with a BMI ≥ 27 and a mean age of 76.41 ± 8.4 years; slightly more than half were female (51.2%). Baseline handgrip strength (n = 137) was 22.3 ± 0.8 kg, while the serum concentration of 25-hydroxyvitamin D (n = 138) was 26.0 ± 1.14 ng/mL. At day 90, ONS+HMB improved nutritional status: 64% of the ONS+HMB group were well-nourished (SGA-A) vs. 37% of the control group (p = 0.011). There was a trend towards greater changes in handgrip strength with ONS+HMB during the index hospitalization (baseline to discharge) compared to placebo (least squares means ± standard error: 1.34 kg ± 0.35 vs. 0.41 ± 0.39; p = 0.081), but the difference was not significant at other timepoints. Vitamin D concentrations were significantly higher at day 60 in those receiving ONS + HMB compared to placebo (29.7 ± 0.81 vs. 24.8 ± 0.91; p &lt; 0.001).</p><p><b>Conclusion:</b> Hospitalized older patients with malnutrition and a BMI ≥ 27 had significant improvements in their vitamin D and nutritional status at day 60 and 90, respectively, if they received standard care + ONS+HMB as compared to placebo. This suggests that transitions of care to the post-acute setting should consider continuing nutrition interventions such as ONS+HMB, in combination with standard care, for patients with elevated BMI and malnutrition.</p><p>Aline Dos Santos<sup>1</sup>; Isis Helena Buonso<sup>2</sup>; Marisa Chiconeli Bailer<sup>2</sup>; Maria Fernanda Jensen Kok<sup>2</sup></p><p><sup>1</sup>Hospital Samaritano Higienópolis, São Paulo; <sup>2</sup>Hospital Samaritano Higienopolis, São Paulo</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition negatively impacts length of hospital stay, infection rate, mortality, clinical complications, hospital readmission, and average healthcare costs. It is believed that early nutritional interventions could reduce negative events and generate economic impact.
Therefore, our objective was to evaluate the average cost of hospitalization of patients identified as at nutritional risk on nutritional screening and with an indication for oral nutritional supplementation.</p><p><b>Methods:</b> This retrospective study included 110 adult patients hospitalized in a private institution and admitted between August 2023 and January 2024. Nutritional screening was performed within 24 hours of admission. To classify low muscle mass according to calf circumference (CC), the cutoff points used were 33 cm for women and 34 cm for men, measured within 96 hours of hospital admission. Patients were evaluated in groups: G1, patients with an indication for oral supplementation (OS) that was not started for modifiable reasons; G2, patients with an indication for OS that was started assertively (within 48 hours of the therapeutic indication); G3, patients with an indication for OS that was started late (more than 48 hours after the therapeutic indication); and G4, the combination of G1 and G3, as neither received OS assertively. Patients receiving enteral or parenteral nutritional therapy were excluded.</p><p><b>Results:</b> G2 was the most prevalent group in the studied sample (51%), with an intermediate average length of stay (20.9 days), a lower average daily hospitalization cost, an average age of 71 years, a significant prevalence of low muscle mass (56%), and a lower need for hospitalization in intensive care (IC) (63%), with an average length of stay (SLA) in IC of 13.5 days. G1 had a lower prevalence (9%), a shorter average length of stay (16 days), an average daily cost of hospitalization 41% higher than G2, an average age of 68 years, adequate muscle mass in all patients (100%), and a considerable need for hospitalization in intensive care (70%), but with an SLA in IC of 7.7 days. G3 represented 40% of the sample studied and had a longer average length of stay (21.5 days), an average daily cost of hospitalization 22% higher than G2, an average age of 73 years, a significant prevalence of low muscle mass (50%), and an intermediate need for hospitalization in intensive care (66%), but with an SLA in IC of 16.5 days. Compared to G2, G4 presented a similar sample size (G2: 56 patients and G4: 54 patients) as well as similar mean age (72 years), hospitalization (20.55 days), hospitalization in IC (66%), and SLA in IC (64.23%), but a higher average daily hospitalization cost (39% higher than G2) and a higher prevalence of patients with low muscle mass (59%).</p><p><b>Conclusion:</b> From the results presented, we can conclude that the percentage of patients who did not receive OS and who spent time in the IC was on average 5% higher than in the other groups, with adequate muscle mass in all patients in this group, but with the need for supplementation due to clinical conditions, food acceptance and weight loss. More than 50% of patients among all groups except G1 had low muscle mass. Regarding costs, patients supplemented assertively or late cost, respectively, 45% and 29% less compared to patients who did not receive OS. Comparing G2 with G4, the cost remained 39% lower in patients assertively supplemented.</p><p><b>International Poster of Distinction</b></p><p>Daphnee Lovesley, PhD, RD<sup>1</sup>; Rajalakshmi Paramasivam, MSc, RD<sup>1</sup></p><p><sup>1</sup>Apollo Hospitals, Chennai, Tamil Nadu</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition in hospitals, once an underreported issue, has gained significant attention in recent decades.
This widespread problem negatively impacts recovery, length of stay (LOS), and overall outcomes. This study aimed to evaluate the effectiveness of clinical nutrition practices and the role of a Nutrition Steering Committee (NSC) in addressing and potentially eradicating this persistent problem by improving nutritional care and patient outcomes.</p><p><b>Methods:</b> Consecutive patients admitted to non-critical units of a tertiary care hospital between January 2018 and August 2024 were included in the study. Patient demographics, body mass index (BMI), modified Subjective Global Assessment (mSGA), oral nutritional supplements (ONS) usage, and clinical outcomes were retrospectively extracted from electronic medical records. Data was analyzed using SPSS version 20.0, comparing results before and after the implementation of the NSC.</p><p><b>Results:</b> Out of 239,630 consecutive patients, 139,895 non-critical patients were included, with a mean age of 57.10 ± 15.89 years; 64.3% were men and 35.7% women. The mean BMI was 25.76 ± 4.74 kg/m<sup>2</sup>, and 49.6% of the patients were polymorbid. The majority (25.8%) were admitted with cardiac illness. According to the modified Subjective Global Assessment (mSGA), 87.1% were well-nourished, 12.8% moderately malnourished, and 0.1% severely malnourished. ONS were prescribed for 10% of the population and ONS prescription was highest among underweight (28.4%); Normal BMI (13%); Overweight (9.1%); Obese (7.7%) (p = 0.000) and mSGA – well-nourished (5.5%); moderately malnourished (MM) 41%; severely malnourished (SM) 53.2% (p = 0.000) and pulmonology (23.3%), followed by gastroenterology &amp; Hepatology (19.2%) (p = 0.000). The mean hospital LOS was 4.29 ± 4.03 days, with an overall mortality rate of 1.2%. Severe malnutrition, as rated by mSGA, significantly impacted mortality (0.8% vs. 5.1%, p = 0.000). Mortality risk increased with polymorbidity (0.9% vs. 1.5%) and respiratory illness (2.6%, p = 0.000). Poor nutritional status, as assessed by mSGA (34.7%, 57.4%, 70.9%) and BMI (43.7% vs. 38%), was associated with longer hospital stays (LOS ≥ 4 days, p = 0.000). The implementation of the NSC led to significant improvements - average LOS decreased (4.4 vs. 4.1 days, p = 0.000), and mortality risk was reduced from 1.6% to 0.7% (p = 0.000). No significant changes were observed in baseline nutritional status, indicating effective clinical nutrition practices in assessing patient nutrition. ONS prescriptions increased from 5.2% to 9.7% between 2022 and 2024 (p = 0.000), contributing to the reduction in mortality rates to below 1% after 2022, compared to over 1% before NSC (p = 0.000). A significant negative correlation was found between LOS and ONS usage (p = 0.000). Step-wise binary logistic regression indicated that malnutrition assessed by mSGA predicts mortality with an odds ratio of 2.49 and LOS with an odds ratio of 1.7, followed by ONS prescription and polymorbidity (p = 0.000).</p><p><b>Conclusion:</b> A well-functioning NSC is pivotal in driving successful nutritional interventions and achieving organizational goals. Early identification of malnutrition through mSGA, followed by timely and appropriate nutritional interventions, is essential to closing care gaps and improving clinical outcomes. 
Strong leadership and governance are critical in driving these efforts, ensuring that the patients receive optimal nutritional support to enhance recovery and reduce mortality.</p><p><b>Table 1.</b> Patient Characteristics: Details of Baseline Anthropometric &amp; Nutritional Status.</p><p></p><p>Baseline details of Anthropometric Measurements and Nutrition Status.</p><p><b>Table 2.</b> Logistic Regression to Predict Hospital LOS and Mortality.</p><p></p><p>Step-wise binary logistic regression indicated that malnutrition assessed by mSGA predicts mortality with an odds ratio of 2.49 and LOS with an odds ratio of 1.7, followed by ONS prescription and polymorbidity (p = 0.000).</p><p></p><p>mSGA-rated malnourished patients stayed longer in the hospital compared to the well-nourished category (p = 0.000)</p><p><b>Figure 1.</b> Nutritional Status (mSGA) Vs Hospital LOS (&gt;4days).</p><p>Hannah Welch, MS, RD<sup>1</sup>; Wendy Raissle, RD, CNSC<sup>2</sup>; Maria Karimbakas, RD, CNSC<sup>3</sup></p><p><sup>1</sup>Optum Infusion Pharmacy, Phoenix, AZ; <sup>2</sup>Optum Infusion Pharmacy, Buckeye, AZ; <sup>3</sup>Optum Infusion Pharmacy, Milton, MA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Food insecurity is when people do not have enough food to eat and do not know where their next meal will come from. In the United States approximately 49 million people relied on food assistance charities in 2022 (data from Feeding America). Patients receiving parenteral nutrition (PN), who may be capable of supplementing with oral intake, may experience food insecurity due to chronic health conditions limiting work capability and total family income. Patients may also experience lack of affordable housing, increased utilities and the burden of medical expenses. Signs of food insecurity may present as weight loss, malnourishment, low energy, difficulty concentrating, or other physical indicators such as edema, chronically cracked lips, dry skin, and itchy eyes. The purpose of this abstract is to highlight two unique patient case presentations where food insecurity prompted clinicians to intervene.</p><p><b>Methods:</b> Patient 1: 50-year-old male with short bowel syndrome (SBS) on long-term PN called the registered dietitian (RD) regarding financial difficulties with feeding the family (see Table 1). The patient and clinician relationship allowed the patient to convey sensitive concerns to the RD regarding inability to feed himself and his family, which resulted in the patient relying on the PN for all nutrition. Due to the food insecurity present, the clinician made changes to PN/hydration to help improve patient's clinical status. Patient 2: A 21-year-old male with SBS on long-term PN spoke with his in-home registered nurse (RN) regarding family's difficulties affording food (see Table 2). The RN informed the clinical team of suspected food insecurity and the insurance case manager (CM) was contacted regarding food affordability. The RD reached out to local community resources such as food banks, food boxes and community programs. A community program was able to assist the patient with meals until patient's aunt started cooking meals for him. This patient did not directly share food insecurity with RD; however, the relationship with the in-home RN proved valuable in having these face-to-face conversations with the patient.</p><p><b>Results:</b> In these two patient examples, difficulty obtaining food affected the patients’ clinical status. 
The clinical team identified food insecurity and the need for further education for the interdisciplinary team. A food insecurity informational handout was created by the RD with an in-service to nursing to help aid recognition of signs (Figure 1) to detect possible food insecurity and potential patient resources available in the community. Figure 2 presents suggested questions to ask a patient if an issue is suspected.</p><p><b>Conclusion:</b> Given the prevalence of food insecurity, routine assessment for signs and symptoms is essential. Home nutrition support teams (including RDs, RNs, pharmacists and care technicians) are positioned to assist in this effort as they have frequent phone and in-home contact with patients and together build a trusted relationship with patients and caregivers. Clinicians should be aware regarding potential social situations which can warrant changes to PN formulations. To approach this sensitive issue thoughtfully, PN infusion providers should consider enhancing patient assessments and promote education across the interdisciplinary team to create awareness of accessible community resources.</p><p><b>Table 1.</b> Patient 1 Information.</p><p></p><p><b>Table 2.</b> Suspected Food Insecurity Timeline.</p><p></p><p></p><p><b>Figure 1.</b> Signs to Detect Food Insecurity.</p><p></p><p><b>Figure 2.</b> Questions to Ask.</p><p><b>Poster of Distinction</b></p><p>Christan Bury, MS, RD, LD, CNSC<sup>1</sup>; Amanda Hodge Bode, RDN, LD<sup>2</sup>; David Gardinier, RD, LD<sup>3</sup>; Roshni Sreedharan, MD, FASA, FCCM<sup>3</sup>; Maria Garcia Luis, MS, RD, LD<sup>4</sup></p><p><sup>1</sup>Cleveland Clinic, University Heights, OH; <sup>2</sup>Cleveland Clinic Foundation, Sullivan, OH; <sup>3</sup>Cleveland Clinic, Cleveland, OH; <sup>4</sup>Cleveland Clinic Cancer Center, Cleveland, OH</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> The Society of Critical Care Medicine, Critical Care Congress in Orlando, FL on February 25<sup>th</sup>.</p><p><b>Publication:</b> Critical Care Medicine.2025;53(1):In press.</p><p><b>Financial Support:</b> Project support provided by Morrison Cleveland Clinic Nutrition Research collaborative.</p><p><b>Background:</b> Hospitalized and critically ill patients who have preexisting malnutrition can have worse outcomes and an increased length of stay (LOS). Currently, Registered Dietitians (RDs) assess malnutrition with a nutrition focused physical exam (NFPE). Recent recommendations encourage the use of body composition tools such as computed tomography (CT) along with the NFPE. Trained RDs can use CT scans to evaluate skeletal muscle mass at the third lumbar vertebrae (L3) and then calculate Skeletal Muscle Index (SMI) and mean Hounsfield Units (HU) to determine muscle size and quality, respectively. This has been validated in various clinical populations and may be particularly useful in the critically ill where the NFPE is difficult. We aim to evaluate if using CT scans in the surgical and critical care population can be a supportive tool to capture a missed malnutrition diagnosis.</p><p><b>Methods:</b> One-hundred and twenty patients admitted to the Cleveland Clinic from 2021-2023 with a malnutrition evaluation that included an NFPE within 2 days of an abdominal CT were evaluated. Of those, 59 patients had a major surgery or procedure completed that admission and were included in the final analysis. 
The CT scans were read by a trained RD at L3 using Terarecon, and results were cross-referenced by artificial intelligence (AI) software called Veronai. Age, sex, BMI, SMI &amp; HU were analyzed, along with the malnutrition diagnosis.</p><p><b>Results:</b> Fifty-nine patients were analyzed. Of these, 61% were male, 51% were &gt;65 years old, and 24% had a BMI &gt; 30. Malnutrition was diagnosed in 47% of patients. A total of 24 patients had no muscle wasting based on the NFPE, while CT captured low muscle mass in 58% of that group. Twenty-two percent of patients (13/59) had a higher level of malnutrition severity when using CT. Additionally, poor muscle quality was detected in 71% of patients among all age groups. Notably, there was a 95% agreement between AI and the RD's assessment in detecting low muscle mass.</p><p><b>Conclusion:</b> RDs can effectively analyze CT scans and use SMI and HU with their NFPE. The NFPE alone is not always sensitive enough to detect low muscle mass in surgical and critically ill patients. The degree of malnutrition dictates nutrition interventions, including the timing and type of nutrition support, so it is imperative to accurately diagnose and tailor interventions to improve outcomes.</p><p><b>Table 1.</b> Change in Malnutrition Diagnosis Using CT.</p><p></p><p>The graph shows the change in malnutrition diagnosis when CT was applied in conjunction with the NFPE utilizing ASPEN Guidelines.</p><p><b>Table 2.</b> Muscle Assessment: CT vs NFPE.</p><p></p><p>This graph compares muscle evaluation using both CT and the NFPE.</p><p></p><p>CT scan at the 3rd lumbar vertebra showing normal muscle mass and normal muscle quality in a patient &gt;65 years old.</p><p><b>Figure 1.</b> CT Scans Evaluating Muscle Size and Quality.</p><p></p><p>CT scan at the 3rd lumbar vertebra showing low muscle mass and low muscle quality in a patient with obesity.</p><p><b>Figure 2.</b> CT Scans Evaluating Muscle Size and Quality.</p><p>Elif Aysin, PhD, RDN, LD<sup>1</sup>; Rachel Platts, RDN, LD<sup>1</sup>; Lori Logan, RN<sup>1</sup></p><p><sup>1</sup>Henry Community Health, New Castle, IN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Disease-related malnutrition alters body composition and causes functional decline. In acute care hospitals, 20-50% of inpatients have malnutrition when they are admitted to the hospital. Patients with malnutrition experience higher medical costs, increased mortality, and longer hospital stays. Malnutrition is associated with an increased risk of readmission and complications. Monitoring, diagnosis, treatment, and documentation of malnutrition are important for treating patients. It also contributes to proper Diagnosis Related Group (DRG) coding and accurate CMI (Case Mix Index), which can increase reimbursement.</p><p><b>Methods:</b> After dietitians completed the Nutrition Focused Physical Exam (NFPE) course and sufficient staff had been provided, the malnutrition project was initiated under the leadership of RDNs in our small rural community hospital. The interdisciplinary medical committee approved the project to improve malnutrition screening, diagnosis, treatment practices, and coding. It was decided to use the Academy/American Society for Parenteral and Enteral Nutrition (ASPEN) and Global Leadership Initiative on Malnutrition (GLIM) criteria to diagnose malnutrition. The Malnutrition Screening Tool (MST) was completed by nurses to determine the risk of malnutrition.
The Nutrition and Dietetics department created a new custom report in the nutrition database that lists patients on NPO, clear liquid, or full liquid diets. RDNs check this report, BMI reports, and length of stay (LOS) patient lists on weekdays and weekends. RDNs also perform the NFPE to evaluate nutritional status. If malnutrition is identified, RDNs communicate with providers through the hospital messenger system. Providers add the malnutrition diagnosis to their documentation and plan of care. RDNs created a dataset and shared it with Coders and Clinical Documentation Integrity Specialists/Care Coordination, and they track patients with malnutrition. In addition, RDNs spend more time with malnourished patients and contribute to discharge planning and education.</p><p><b>Results:</b> The prevalence of malnutrition diagnosis and the amount of reimbursement for 2023 were compared to the six months after implementing the malnutrition project. Malnutrition diagnosis increased from 2.6% to 10.8%. Unspecified protein-calorie malnutrition diagnoses decreased from 39% to 1.5%. RDN-diagnosed malnutrition was documented in provider notes for 82% of cases. The malnutrition diagnosis rate increased by 315% and the malnutrition reimbursement rate increased by 158%. Of those patients identified with malnutrition, 59% received a malnutrition DRG code. The remaining 41% of patients received higher major complication and comorbidity (MCC) codes. Our malnutrition reimbursement increased from ~$106,000 to ~$276,000.</p><p><b>Conclusion:</b> The implementation of evidence-based practice guidelines was key in identifying and accurately diagnosing malnutrition. The provision of sufficient staff with the necessary training and multidisciplinary teamwork has improved malnutrition diagnosis documentation in our hospital, increasing malnutrition reimbursement.</p><p><b>Table 1.</b> Before and After Malnutrition Implementation Results.</p><p></p><p></p><p><b>Figure 1.</b> Prevalence of Malnutrition Diagnosis.</p><p>Elisabeth Schnicke, RD, LD, CNSC<sup>1</sup>; Sarah Holland, MSc, RD, LD, CNSC<sup>2</sup></p><p><sup>1</sup>The Ohio State University Wexner Medical Center, Columbus, OH; <sup>2</sup>The Ohio State University Wexner Medical Center, Upper Arlington, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition is associated with increased length of stay, readmissions, mortality and poor outcomes. Early identification and treatment are essential. The Malnutrition Screening Tool (MST) is a quick, easy tool recommended for screening malnutrition in adult hospitalized patients. It is commonly used with or without additional indicators. We aimed to evaluate our current nutrition screening policy, which utilizes MST, age and body mass index (BMI), to improve malnutrition identification.</p><p><b>Methods:</b> Data for this quality improvement project were obtained over a 3-month period on 4 different adult services at a large academic medical center. Services covered included general medicine, hepatology, heart failure and orthopedic surgery. Patients were assessed by a Registered Dietitian (RD) within 72hrs of admission if they met any of the following high-risk criteria: MST score &gt;2 completed by nursing on admission, age ≥65 yrs, or BMI ≤ 18.5 kg/m<sup>2</sup>. If none of the criteria were met, patients were seen within 7 days of admission or sooner by consult request.
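The screening policy just described is essentially a small decision rule. A minimal sketch follows; because the policy text states an MST score greater than 2 while the results group patients as MST below 2 versus at or above 2, the exact cutoff is left as a configurable parameter rather than asserted.

```python
from dataclasses import dataclass


@dataclass
class Admission:
    mst_score: int | None   # None if the MST was not completed on admission
    age_years: int
    bmi_kg_m2: float


def rd_assessment_window(a: Admission, mst_cutoff: int = 2) -> str:
    """Return the RD assessment window under the high-risk screening policy described above.

    Any one criterion triggers assessment within 72 hours:
      - MST at or above the cutoff (cutoff configurable; see the note in the text)
      - age >= 65 years
      - BMI <= 18.5 kg/m^2
    Otherwise the patient is seen within 7 days, or sooner by consult request.
    """
    high_risk = (
        (a.mst_score is not None and a.mst_score >= mst_cutoff)
        or a.age_years >= 65
        or a.bmi_kg_m2 <= 18.5
    )
    return "within 72 hours" if high_risk else "within 7 days"


print(rd_assessment_window(Admission(mst_score=1, age_years=70, bmi_kg_m2=22.0)))
```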
Malnutrition was diagnosed using Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition indicators of malnutrition (AAIM) criteria. Data collected included malnutrition severity and etiology, age, gender, BMI and MST generated on admission.</p><p><b>Results:</b> A total of 239 patients were diagnosed with malnutrition. Table 1 shows detailed characteristics. Malnutrition was seen similarly across gender (51% male, 49% female) and age groups. Age range was 21-92 yrs with an average age of 61 yrs. BMI range was 9.8-50.2 kg/m2 with an average BMI of 24.6 kg/m2. More patients were found to have moderate malnutrition at 61.5% and chronic malnutrition at 54%. When data was stratified by age ≥65 yrs, similar characteristics were seen for malnutrition severity and etiology. Notably, more patients (61.5%) had an MST of &lt; 2 or an incomplete MST compared to patients &lt; 65 yrs of age (56%). There were 181 patients (76%) who met high-risk screening criteria and were seen for an assessment within 72hrs. Seventy patients (39%) were screened only due to age ≥65 yrs. Forty-five (25%) were screened due to MST alone. There were 54 (30%) who met 2 indicators for screening. Only a small number of patients met BMI criteria alone or 3 indicators (6 patients or 3% each).</p><p><b>Conclusion:</b> Utilizing MST alone would have missed over half of patients diagnosed with malnutrition, and there was a higher miss rate with older adults using MST alone. Age alone as a screening criterion caught more patients than MST alone did. Adding BMI to screening criteria added very little, and we still missed 24% of patients with our criteria. A multi-faceted tool should be explored to best capture patients.</p><p><b>Table 1.</b> Malnutrition characteristics.</p><p></p><p>*BMI: excludes those with amputations, paraplegia; all patients n = 219, patients ≥65yo n = 106.</p><p>Robin Nuse Tome, MS, RD, CSP, LDN, CLC, FAND<sup>1</sup></p><p><sup>1</sup>Nemours Children's Hospital, DE, Landenberg, PA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition is a global problem impacting patients of all ages, genders, and races. Malnourished hospitalized patients have poorer outcomes, including longer in-hospital lengths of stay, higher rates of death, greater need for home healthcare services, and higher rates of 30-day readmission. This results in a higher economic burden for the healthcare industry, insurance, and the individual. Malnutrition documentation plays a vital role in capturing the status of the patient and starting the conversation about interventions to address the concern.</p><p><b>Methods:</b> A taskforce made up of physicians, registered dietitians (RDs), and clinical documentation specialists met to discuss strategies to increase documentation of the malnutrition diagnosis to facilitate better conversation about the concern and potential interventions. Options included inbox messages, a best practice alert, a SmartLink between the physician and RD note, and adding the diagnosis to the problem list.
Of these options, the team selected the SmartLink, which was developed within the Electronic Medical Record (EMR) to link text about malnutrition from the RD note to the physician note, capturing the diagnosis of malnutrition, its severity, and its progression over time.</p><p><b>Results:</b> Preliminary data show that physician documentation of the malnutrition diagnosis, as well as the severity and progression of the diagnosis, increased by 20% in the pilot medical team. Anecdotally, physicians were more aware of the patient's nutrition status with documentation linked to their note, and collaboration between the medical team and the RD to treat malnutrition has increased.</p><p><b>Conclusion:</b> We hypothesize that expanding the practice to the entire hospital will increase documentation of the malnutrition diagnosis in the physician note. This will help increase awareness of the nutrition status of the patient, draw attention to and promote collaboration on interventions to treat malnutrition, and increase billable revenue to the hospital by capturing documentation of the degree of malnutrition in the physician note.</p><p>David López-Daza, RD<sup>1</sup>; Cristina Posada-Alvarez, Centro Latinoamericano de Nutrición<sup>1</sup>; Alejandra Agudelo-Martínez, Universidad CES<sup>2</sup>; Ana Rivera-Jaramillo, Boydorr SAS<sup>3</sup>; Yeny Cuellar-Fernández, Centro Latinoamericano de Nutrición<sup>1</sup>; Ricardo Merchán-Chaverra, Centro Latinoamericano de Nutrición<sup>1</sup>; María-Camila Gómez-Univio, Centro Latinoamericano de Nutrición<sup>1</sup>; Patricia Savino-Lloreda, Centro Latinoamericano de Nutrición<sup>1</sup></p><p><sup>1</sup>Centro Latinoamericano de Nutrición (Latin American Nutrition Center), Chía, Cundinamarca; <sup>2</sup>Universidad CES (CES University), Medellín, Antioquia; <sup>3</sup>Boydorr SAS, Chía, Cundinamarca</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> The Malnutrition Screening Tool (MST) is a simple and quick instrument designed to identify the risk of malnutrition in various healthcare settings, including home hospitalization. Its use has become widespread due to its ease of application and ability to preliminarily assess the nutritional status of patients. In the context of home care, where clinical resources may be limited, an efficient screening tool is crucial to ensure early interventions that prevent nutritional complications in vulnerable populations. The objective was to evaluate the diagnostic accuracy of the MST in detecting malnutrition among patients receiving home care.</p><p><b>Methods:</b> A diagnostic test study was conducted, collecting sociodemographic data, MST results, and malnutrition diagnoses based on the Global Leadership Initiative on Malnutrition (GLIM) criteria. A positive MST score was defined as a value of 2 or above. Categorical data were summarized using proportions, while quantitative variables were described using measures of central tendency. Sensitivity, specificity, the area under the receiver operating characteristic curve (AUC), positive likelihood ratio (LR+), and negative likelihood ratio (LR-) were estimated along with their respective 95% confidence intervals.</p><p><b>Results:</b> A total of 676 patients were included, with a median age of 82 years (interquartile range: 68-89 years), and 57.3% were female. According to the GLIM criteria, 59.8% of the patients met the criteria for malnutrition.
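The accuracy metrics reported in the results that follow all derive from the standard 2x2 cross-tabulation of the MST result against the GLIM reference diagnosis. A minimal sketch of those computations, using hypothetical counts rather than this study's data:

```python
def diagnostic_accuracy(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Sensitivity, specificity and likelihood ratios from a 2x2 table
    (index test = MST score of 2 or above, reference = GLIM malnutrition)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    lr_pos = sensitivity / (1 - specificity) if specificity < 1 else float("inf")
    lr_neg = (1 - sensitivity) / specificity if specificity > 0 else float("inf")
    return {"sensitivity": sensitivity, "specificity": specificity,
            "LR+": lr_pos, "LR-": lr_neg}


# Hypothetical counts, not the study data
print(diagnostic_accuracy(tp=30, fp=5, fn=70, tn=95))
```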
The MST classified patients into low risk (62.4%), medium risk (30.8%), and high risk (6.8%). The sensitivity of the MST was 11.4% (95% CI: 8.5-14.9%), while its specificity was 100% (95% CI: 98.7-100%). The positive likelihood ratio (LR+) was 1.75, and the negative likelihood ratio (LR-) was 0.0. The area under the curve was 0.71 (95% CI: 0.69-0.73), indicating moderate discriminative capacity.</p><p><b>Conclusion:</b> While the MST demonstrates extremely high specificity, its low sensitivity limits its effectiveness in accurately identifying malnourished patients in the context of home care. This suggests that, although the tool is highly accurate for confirming the absence of malnutrition, it fails to detect a significant number of patients who are malnourished. As a result, while the MST may be useful as an initial screening tool, its use should be complemented with more comprehensive assessments to ensure more precise detection of malnutrition in this high-risk population.</p><p><b>Poster of Distinction</b></p><p>Colby Teeman, PhD, RDN, CNSC<sup>1</sup>; Kaylee Griffith, BS<sup>2</sup>; Karyn Catrine, MS, RDN, LD<sup>3</sup>; Lauren Murray, MS, RD, CNSC, LD<sup>3</sup>; Amanda Vande Griend, BS, MS<sup>2</sup></p><p><sup>1</sup>University of Dayton, Xenia, OH; <sup>2</sup>University of Dayton, Dayton, OH; <sup>3</sup>Premier Health, Dayton, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> The prevalence of malnutrition in critically ill populations has previously been shown to be between 38% and 78%. Previously published guidelines have stated that patients in the ICU should be screened for malnutrition within 24-48 hours, and all patients in the ICU for &gt;48 hours should be considered at high risk for malnutrition. Patients with a malnutrition diagnosis in the ICU have been shown to have poorer clinical outcomes, including longer length of stay, greater readmission rates, and increased mortality. The purpose of the current study was to see if severity of malnutrition impacted the time to initiate enteral nutrition and the time to reach goal enteral nutrition rate in critically ill patients and to determine the possible impact of malnutrition severity on clinical outcomes.</p><p><b>Methods:</b> A descriptive, retrospective chart review was conducted in multiple ICU units at a large level I trauma hospital in the Midwest. All participants included in analysis had been assessed for malnutrition by a registered dietitian according to ASPEN clinical practice guidelines. Exclusion criteria included patients receiving EN prior to the RDN assessment, those who received EN for &lt; 24 hours total, patients on mixed oral and enteral nutrition diets, and patients receiving any parenteral nutrition. Participants were grouped by malnutrition status: no malnutrition (n = 27), moderate malnutrition (n = 22), and severe malnutrition (n = 32). All data were analyzed using SPSS version 29.</p><p><b>Results:</b> There was no difference in primary outcomes (time to EN initiation, time to EN goal rate) by malnutrition status (both p &gt; 0.05). Multiple regression analysis found that neither moderately nor severely malnourished patients were more likely to have enteral nutrition initiation delayed for &gt;48 hours from admission (p &gt; 0.05). Neither ICU LOS nor hospital LOS was different among malnutrition groups (p &gt; 0.05). Furthermore, neither ICU nor hospital mortality was different among malnutrition groups (p &gt; 0.05).
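The multiple regression noted above, which models the odds of enteral nutrition initiation being delayed more than 48 hours as a function of malnutrition status, corresponds to a simple logistic model. A minimal sketch with hypothetical data and column names (the study's own analysis was run in SPSS):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical analysis dataset: one row per ICU patient
df = pd.DataFrame({
    "en_delayed_48h": [0, 1, 0, 1, 0, 1, 1, 0, 1],   # 1 = EN started more than 48 h after admission
    "malnutrition":   ["none", "none", "none",
                       "moderate", "moderate", "moderate",
                       "severe", "severe", "severe"],
})

# Logistic regression of delayed EN on malnutrition status ("none" as the reference group)
model = smf.logit(
    "en_delayed_48h ~ C(malnutrition, Treatment(reference='none'))", data=df
).fit(disp=False)
print(model.summary())   # odds ratios can be obtained by exponentiating model.params
```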
Among patients who were moderately malnourished, 81.8% required vasopressors, compared to 75% of patients who were severely malnourished, and 44.4% of patients who did not have a malnutrition diagnosis (p = 0.010). 90.9% of moderately malnourished patients required extended time on a ventilator (&gt;72 hours), compared to 59.4% of severely malnourished patients, and 51.9% of patients without a malnutrition diagnosis (p = 0.011).</p><p><b>Conclusion:</b> Although the severity of malnutrition did not impact LOS, readmission, or mortality, malnutrition status did significantly predict greater odds of a patient requiring vasopressors and spending an extended amount of time on a ventilator. Further studies with larger sample sizes are warranted to continue developing a better understanding of the relationship between malnutrition status and clinical outcomes.</p><p>Jamie Grandic, RDN-AP, CNSC<sup>1</sup>; Cindi Stefl, RN, BSN, CCDS<sup>2</sup></p><p><sup>1</sup>Inova Health System, Fairfax Station, VA; <sup>2</sup>Inova Health System, Fairfax, VA</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> Vizient Connections Summit 2024 (Sept 16-19, 2024).</p><p><b>Publication:</b> 2024 Vizient supplement to the American Journal of Medical Quality (AJMQ).</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Research reveals that up to 50% of hospitalized patients are malnourished, yet only 9% of these cases are diagnosed. <sup>(1)</sup> Inadequate diagnosis and intervention of malnutrition can lead to poorer patient outcomes and reduced revenue. Our systemwide malnutrition awareness campaign successfully enhanced dietitian engagement, provider education, and streamlined documentation processes. This initiative resulted in a two-fold increase in the capture of malnutrition codes, a notable rise in malnutrition variable capture, and an average increase in diagnosis-related group relative weight by approximately 0.9. Consequently, there was a ~ 300% increase in revenue associated with accurate malnutrition diagnosis and documentation, alongside improvements in the observed-to-expected (O/E) ratio for mortality and length of stay. Given the critical impact of malnutrition on mortality, length of stay, and costs, enhanced identification programs are essential. Within our health system, a malnutrition identification program has been implemented across five hospitals for several years. System leadership roles for clinical nutrition and clinical documentation integrity (CDI) were established to ensure consistency, implement best practices, and optimize program efficacy. In 2022, a comprehensive analysis identified opportunities for improvement: a low systemwide capture rate (2%), limited awareness of the program's benefits, and inconsistent documentation practices. The leadership team, with the support of our executive sponsors, addressed these issues, engaged with service line leaders, and continue to drive program enhancements.</p><p><b>Methods:</b> A 4-part malnutrition education campaign was implemented: Strengthened collaboration between Clinical Nutrition and CDI, ensuring daily systemwide communication of newly identified malnourished patients. Leadership teams, including coding and compliance, reviewed documentation protocols considering denial risks and regulatory audits. Launched a systemwide dietitian training program with a core RD malnutrition optimization team, a 5-hour comprehensive training, and monthly chart audits aiming for &gt;80% documentation compliance. 
Created a Provider Awareness Campaign featuring interactive presentations on the malnutrition program's benefits and provider documentation recommendations. Developed an electronic health record (EHR) report and a malnutrition EHR tool to standardize documentation. EHR and financial reports were used to monitor program impact and capture rates.</p><p><b>Results:</b> The malnutrition campaign has notably improved outcomes through ongoing education for stakeholders. A malnutrition EHR tool was also created in November 2022. This tool is vital for enhancing documentation, significantly boosting provider and CDI efficiency. Key results include: Dietitian documentation compliance increased from 85% (July 2022) to 95% (2024); RD-identified malnutrition cases increased from 2% (2021) to 16% (2024); Monthly average of final coded malnutrition diagnoses increased from 240 (2021) to 717 (2023); Average DRG relative weight climbed from 1.24 (2021) to 2.17 (2023); Financial impact increased from $5.5 M (2021) to $17.7 M (2024); and LOS O/E improved from 1.04 to 0.94 and mortality O/E improved from 0.77 to 0.62 (2021-2023).</p><p><b>Conclusion:</b> This systemwide initiative not only elevates capture rates and documentation but also enhances overall outcomes. By CDI and RD teams taking on more of a collaborative, leadership role, providers can concentrate more on patient care, allowing these teams to operate at their peak. Looking ahead to 2025, the focus will shift towards leading indicators to refine malnutrition identification and assess the educational campaign's impact further.</p><p>Ryota Sakamoto, MD, PhD<sup>1</sup></p><p><sup>1</sup>Kyoto University, Kyoto</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Growing concern about the environmental impact of meat eating has led to consideration of a shift to a plant-based diet. One nutrient that tends to be particularly deficient in a plant-based diet is vitamin B12. Since vitamin B12 deficiency can cause anemia, limb paresthesia and muscle weakness, and psychiatric symptoms including depression, delirium, and cognitive impairment, it is important to find sustainable local sources of vitamin B12. In this study, we focused on gundruk and sinki, two fermented and preserved vegetables traditionally consumed mainly in Nepal, India, and Bhutan, and investigated their vitamin B12 content. The sinki is mainly made from radish roots, while gundruk is made from green leaves such as mustard leaves, which are preserved through fermentation and sun-drying processes. Previous reports indicated that, in these regions, not only vegetarians and vegans but also a significant number of people may have consistently low meat intake, especially among the poor. The governments and other organizations have been initiating feeding programs to supply fortified foods with vitamins A, B1, B2, B3, B6, B9, B12, iron, and zinc, especially to schools. At this time, however, it is not easy to get fortified foods to residents in the community. It is important to explore the possibility of getting vitamin B12 from locally available products that can be taken by vegetarians, vegans, or the poor in the communities.</p><p><b>Methods:</b> Four samples of gundruk and five samples of sinki were obtained from markets, and the vitamin B12 content in them was determined using Lactobacillus delbrueckii subsp. lactis (Lactobacillus leichmannii) ATCC7830. The lower limit of quantification was set at 0.03 µg/100 g. 
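Concentrations reported in µg/100 g translate directly into the portion of food needed to cover a daily requirement, which is how the variability among samples matters in practice. A small worked example, assuming the WHO/FAO adult recommendation of 2.4 µg/day cited in the conclusion; the concentrations used span the range reported in the results:

```python
def grams_per_day_to_meet_requirement(conc_ug_per_100g: float,
                                      requirement_ug_per_day: float = 2.4) -> float:
    """Grams of the food needed per day to supply the full vitamin B12 requirement."""
    return 100 * requirement_ug_per_day / conc_ug_per_100g


# Concentrations spanning the range reported for gundruk and sinki (ug/100 g)
for conc in (5.0, 1.4, 0.13):
    print(f"{conc:>5} ug/100 g -> {grams_per_day_to_meet_requirement(conc):,.0f} g/day")
```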
The sample with the highest vitamin B12 concentration in the microbial quantification method was also measured for cyanocobalamin using LC-MS/MS (Shimadzu LC system equipped with a Triple Quad 5500 plus AB-Sciex mass spectrometer). The multiple reaction monitoring transition for cyanocobalamin was Q1: 678.3 m/z, Q3: 147.1 m/z.</p><p><b>Results:</b> For gundruk, vitamin B12 was detected in all four samples, with values, from highest to lowest, of 5.0 µg/100 g, 0.13 µg/100 g, 0.12 µg/100 g, and 0.04 µg/100 g. For sinki, it was detected in four of the five samples, with values, from highest to lowest, of 1.4 µg/100 g, 0.41 µg/100 g, 0.34 µg/100 g, and 0.16 µg/100 g. The cyanocobalamin concentration by LC-MS/MS in one sample was estimated to be 1.18 µg/100 g.</p><p><b>Conclusion:</b> According to “Vitamin and mineral requirements in human nutrition (2nd edition) (2004)” by the World Health Organization and the Food and Agriculture Organization of the United Nations, the recommended intake of vitamin B12 is 2.4 µg/day for adults, 2.6 µg/day for pregnant women and 2.8 µg/day for lactating women. The results of this study suggest that gundruk and sinki have potential as a source of vitamin B12, although there is a great deal of variability among samples. In order to use gundruk and sinki as a source of vitamin B12, it may be necessary to find a way to stabilize the vitamin B12 content while focusing on the relationship between vitamin B12 and the different ways of making gundruk and sinki.</p><p>Teresa Capello, MS, RD, LD<sup>1</sup>; Amanda Truex, MS, RRT, RCP, AE-C<sup>1</sup>; Jennifer Curtiss, MS, RD, LD, CLC<sup>1</sup>; Ada Lin, MD<sup>1</sup></p><p><sup>1</sup>Nationwide Children's Hospital, Columbus, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> The metabolic demands of critically ill children are defined by an increase in resting energy expenditure (1,2). Energy needs in the PICU are ever changing, and accurate evaluations are challenging to obtain (3). Predictive equations have been found to be inaccurate, leading to over- or underfeeding, which can cause negative outcomes such as muscle loss and poor healing (underfeeding) or weight gain as adipose tissue (overfeeding) (1,4,5). Indirect calorimetry (IC) is considered the gold standard to assess metabolic demand, especially for critically ill pediatric patients (1,2,4). The use of IC may be limited due to staffing, equipment availability and cost, as well as other patient-related issues and/or cart specifications (6). In our facility, we identified limited use of IC by dietitians. Most tests were ordered by PICU dietitians and rarely outside the critical care division, even though testing would benefit patients in other divisions of the hospital such as the CTICU, rehab, NICU and stepdown areas. Informal polling of non-PICU dietitians revealed that they had significant uncertainty interpreting data and providing recommendations based on test results. Reasons for uncertainty mostly centered on a lack of familiarity with this technology. The purpose of this study was to develop guidelines and a worksheet for consistently evaluating IC results, with the goal of encouraging increased use of indirect calorimetry at our pediatric facility.</p><p><b>Methods:</b> A committee of registered dietitians (RDs) and respiratory therapists (RTs) met in January 2023 and agreed on step-by-step guidelines which were trialed, reviewed, and updated monthly.
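Interpretation of an IC measurement usually comes down to two derived quantities: the respiratory quotient (VCO2/VO2), used as a plausibility and feeding-tolerance check, and the measured REE from the abbreviated Weir equation. The actual content of the committee's worksheet is not described in the abstract; the sketch below is only a generic illustration of those two calculations.

```python
def abbreviated_weir_ree(vo2_l_min: float, vco2_l_min: float) -> float:
    """Resting energy expenditure (kcal/day) from the abbreviated Weir equation:
    REE = (3.941 * VO2 + 1.106 * VCO2) * 1440, with VO2 and VCO2 in L/min."""
    return (3.941 * vo2_l_min + 1.106 * vco2_l_min) * 1440


def respiratory_quotient(vo2_l_min: float, vco2_l_min: float) -> float:
    """RQ = VCO2/VO2; values roughly between 0.67 and 1.3 are physiologically plausible."""
    return vco2_l_min / vo2_l_min


# Illustrative steady-state reading from a metabolic cart
vo2, vco2 = 0.180, 0.150   # L/min
print(f"REE ~ {abbreviated_weir_ree(vo2, vco2):.0f} kcal/day, RQ = {respiratory_quotient(vo2, vco2):.2f}")
```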
Finalized guidelines were transitioned to a worksheet to improve consistency of use and aid in interpretation of IC results. A shared file was established for articles about IC as well as access to the guidelines and worksheet (Figures 1 and 2). For this study, IC data from January 1, 2022 to July 31, 2024 was reviewed. This data included number of tests completed and where the orders originated.</p><p><b>Results:</b> Since the guidelines have been implemented, the non-PICU areas using IC data increased from 16% in 2022 to 30% in 2023 and appears to be on track to be the same in 2024 (Figure 3). RDs report an improved comfort level with evaluating test results as well as making recommendations for test ordering.</p><p><b>Conclusion:</b> The standardized guidelines and worksheet increased RD's level of comfort and interpretation of test results. The PICU RDs have become more proficient and comfortable explaining IC during PICU rounds. It is our hope that with the development of the guidelines/worksheet, more non-PICU RDs will utilize the IC testing outside of the critical care areas where longer lengths of stay may occur. IC allows for more individualized nutrition prescriptions. An additional benefit was the mutual exchange of information between disciplines. The RTs provided education on the use of the machine to the RDs. This enhanced RDs understanding of IC test results from the RT perspective. In return, the RDs educated the RTs as to why certain aspects of the patient's testing environment were helpful to report with the results for the RD to interpret the information correctly. The committee continues to meet and discuss patients’ tests to see how testing can be optimized as well as how results may be used to guide nutrition care.</p><p></p><p><b>Figure 1.</b> Screen Capture of Metabolic Cart Shared File.</p><p></p><p><b>Figure 2.</b> IC Worksheet.</p><p></p><p><b>Figure 3.</b> Carts completed per year by unit: 2022 is pre-intervention; 2023 and 2024 are post intervention. Key: H2B = PICU; other areas are non-PICU (H4A = cardiothoracic stepdown, H4B = cardiothoracic ICU, H5B = burn, H8A = pulmonary, H8B = stable trach/vent unit, H10B = Neurosurgery/Neurology, H11B = Nephrology/GI, H12A = Hematology/Oncology, C4A = NICU, C5B = infectious disease).</p><p>Alfredo Lozornio-Jiménez-de-la-Rosa, MD, MSCN<sup>1</sup>; Minu Rodríguez-Gil, MSCN<sup>2</sup>; Luz Romero-Manriqe, MSCN<sup>2</sup>; Cynthia García-Vargas, MD, MSCN<sup>2</sup>; Rosa Castillo-Valenzuela, PhD<sup>2</sup>; Yolanda Méndez-Romero, MD, MSC<sup>1</sup></p><p><sup>1</sup>Colegio Mexicano de Nutrición Clinica y Terapia Nutricional (Mexican College of Clinical Nutrition and Nutritional Therapy), León, Guanajuato; <sup>2</sup>Colegio Mexicano de Nutrición Clínica y Terapia Nutricionalc (Mexican College of Clinical Nutrition and Nutritional Therapy), León, Guanajuato</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Sarcopenia is a systemic, progressive musculoskeletal disorder associated with an increased risk of adverse events and is highly prevalent in older adults. This condition leads to a decline in functionality, quality of life, and has an economic impact. Sarcopenia is becoming increasingly common due to age-related changes, metabolic changes, obesity, sedentary lifestyle, chronic degenerative diseases, malnutrition, and pro-inflammatory states. 
The objective of this study was to investigate the relationship between strength and muscle mass, measured by calf circumference, both corrected for BMI, in young and older Mexican adults.</p><p><b>Methods:</b> This is a prospective, observational, cross-sectional, population-based clinical study conducted among Mexican men and women aged 30 to 90 years, obtained through convenience sampling. The study was approved by the Ethics Committee of the Aranda de la Parra Hospital in León, Guanajuato, Mexico, and adheres to the Declaration of Helsinki. Informed consent was obtained from all participants after explaining the nature of the study. Inclusion criteria were: Mexican men and women aged 30 to 90 years who were functionally independent (Katz index category "a"). Participants with amputations, movement disorders, or immobilization devices on their extremities were excluded. The research team was previously standardized in anthropometric measurements. Demographic data and measurements of weight, height, BMI, calf circumference, and grip strength using a dynamometer were collected. Data are presented as average and standard deviation. Spearman correlation analysis was used to assess the relationships of BMI, and of calf circumference adjusted for BMI, with grip strength, considering a significance level of p &lt; 0.05.</p><p><b>Results:</b> Results from 1032 subjects were presented: 394 men and 638 women from central Mexico, aged 30 to 90 years, recruited in workplaces, recreation centers, and health facilities. Table 1 shows the distribution of the population in each age category, categorized by sex. Combined obesity and overweight were found in 75.1% of the sample population, with a frequency of 69.2% in men and 78.7% in women; 20% had a normal weight, with 25.6% in men and 16.6% in women, and 4.8% had low BMI, with 5.1% of men and 4.7% of women (Graph 1). The depletion of calf circumference corrected for BMI and age in the female population begins at 50 years, with exacerbation at 65 years and older, while in men a greater depletion can be observed from 70 years onwards (Graph 2). When analyzing strength corrected for BMI and age, grip strength begins to decline at 55 years and continues to decline as age increases, in both genders (Chi-square = 83.5, p &lt; 0.001) (Graph 3). By Spearman correlation, a strong inverse relationship was found in both genders between age and grip strength; that is, as age increases, grip strength decreases (r = -0.530, p &lt; 0.001). A moderate negative correlation was found between age and calf circumference: as age increases, calf circumference decreases, independently of BMI (r = -0.365, p &lt; 0.001). Calf circumference and grip strength are positively and moderately related: as calf circumference decreases, grip strength decreases, independently of BMI (r = 0.447, p &lt; 0.0001).</p><p><b>Conclusion:</b> These results show that the study population exhibited a decrease in grip strength, not related to BMI, from early ages, which may increase the risk of early-onset sarcopenia. These findings encourage early assessment of both grip strength and muscle mass, using simple and accessible measurements such as grip strength and calf circumference, adjusted for BMI.
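The correlation analysis described above can be reproduced with standard statistical libraries once grip strength, calf circumference, and BMI are recorded. A minimal sketch with illustrative data; dividing by BMI is shown here as one possible form of the BMI correction, since the abstract does not spell out the adjustment:

```python
import pandas as pd
from scipy.stats import spearmanr

# Illustrative records: age (years), grip strength (kg), calf circumference (cm), BMI (kg/m^2)
df = pd.DataFrame({
    "age": [34, 42, 51, 58, 63, 70, 76, 84],
    "hgs": [38, 36, 33, 30, 27, 24, 20, 16],
    "cc":  [37, 38, 36, 35, 34, 33, 32, 30],
    "bmi": [27, 29, 28, 30, 26, 27, 25, 24],
})

# One possible BMI correction: divide the raw measurement by BMI
df["cc_adj"] = df["cc"] / df["bmi"]
df["hgs_adj"] = df["hgs"] / df["bmi"]

for x, y in [("age", "hgs_adj"), ("age", "cc_adj"), ("cc_adj", "hgs_adj")]:
    r, p = spearmanr(df[x], df[y])
    print(f"Spearman r({x}, {y}) = {r:.2f}, p = {p:.3f}")
```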
These measurements can be performed in the office during the initial patient encounter or in large populations, as in this study.</p><p><b>Table 1.</b> Distribution of the Population According to Age and Gender.</p><p></p><p>Alison Hannon, Medical Student<sup>1</sup>; Anne McCallister, DNP, CPNP<sup>2</sup>; Kanika Puri, MD<sup>3</sup>; Anthony Perkins, MS<sup>1</sup>; Charles Vanderpool, MD<sup>1</sup></p><p><sup>1</sup>Indiana University School of Medicine, Indianapolis, IN; <sup>2</sup>Indiana University Health, Indianapolis, IN; <sup>3</sup>Riley Hospital for Children at Indiana University Health, Indianapolis, IN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> The Academy of Nutrition and Dietetics and American Society for Parenteral and Enteral Nutrition have published criteria to diagnose a patient with mild, moderate, or severe malnutrition that use either single or multiple data points. Malnutrition is associated with worse clinical outcomes, and previous data from our institution showed that hospitalized children with severe malnutrition are at higher risk of mortality compared to mild and moderate malnutrition. This project aims to determine if differences in clinical outcomes exist in patients with severe malnutrition based on the diagnostic criteria or anthropometrics differences in patients.</p><p><b>Methods:</b> We included all patients discharged from Riley Hospital for Children within the 2023 calendar year diagnosed with severe malnutrition, excluding maternity discharges. Diagnostic criteria used to determine severe malnutrition was collected from registered dietitian (RD) documentation and RD-assigned malnutrition statement within medical records for the admission. Data was collected on readmission rates, mortality, length of stay (LOS), LOS index, cost, operative procedures, pediatric intensive care unit (ICU) admissions, and anthropometrics measured on admission. We used mixed-effects regression or mixed effects logistic regression to test whether the outcome of interest differed by severe malnutrition type. Each model contained a random effect for patient to account for correlation within admissions by the same patient and a fixed effect for severe malnutrition type. All analyses were performed using SAS v9.4.</p><p><b>Results:</b> Data was gathered on 409 patient admissions. 383 admissions had diagnostic criteria clearly defined regarding severity of malnutrition. This represented 327 unique patients (due to readmissions). There was no difference in any measured clinical outcomes based on the criteria used for severe malnutrition, including single or multiple point indicators or patients who met both single and multiple point indicators (Table 1). Anthropometric data was analyzed including weight Z-score (n = 398) and BMI Z-score (n = 180). There was no difference seen in the majority of measured clinical outcomes in admissions with severe malnutrition when comparing based on weight or BMI Z-score categories of Z &lt; -2, -2 &lt; Z &lt; -0.01, or Z &gt; 0 (Table 2). Patients admitted with severe malnutrition and a BMI Z score &gt; 0 had an increase in median cost (p = 0.042) compared to BMI &lt; -2 or between -2 and 0 (Table 2). There was a trend towards increased median cost (p = 0.067) and median LOS (p = 0.104) in patients with weight Z score &gt; 0.</p><p><b>Conclusion:</b> Hospitalized patients with severe malnutrition have similar clinical outcomes regardless of diagnostic criteria used to determine the diagnosis of severe malnutrition. 
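The modeling approach described in the methods above, a fixed effect for severe-malnutrition criteria type plus a random intercept per patient to handle repeated admissions, can be sketched as follows. The column names and toy data are hypothetical, and the study's own analysis was run in SAS v9.4:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical admission-level data: repeated admissions share a patient_id
df = pd.DataFrame({
    "patient_id":  [1, 1, 2, 3, 3, 4, 5, 6, 6, 7],
    "malnut_type": ["single", "multiple", "single", "both", "both",
                    "multiple", "single", "multiple", "single", "both"],
    "los_days":    [4, 6, 3, 10, 8, 5, 7, 12, 9, 6],
})

# Linear mixed model: LOS ~ criteria type used for the severe-malnutrition diagnosis,
# with a random intercept per patient (toy data; convergence warnings are expected)
model = smf.mixedlm("los_days ~ C(malnut_type)", data=df, groups=df["patient_id"]).fit()
print(model.summary())
```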
Fewer admissions with severe malnutrition (n = 180 or 44%) had sufficient anthropometric data to determine BMI. Based on this data, future programs at our institution aimed at intervention prior to and following admission will need to focus on all patients with severe malnutrition and will not be narrowed based on criteria (single, multiple data point) of severe malnutrition or anthropometrics. Quality improvement projects include improving height measurement and BMI determination upon admission, which will allow for future evaluation of impact of anthropometrics on clinical outcomes.</p><p><b>Table 1.</b> Outcomes by Severe Malnutrition Diagnosis Category.</p><p></p><p>Outcomes by severe malnutrition compared based on the diagnostic criteria used to determine malnutrition diagnosis. Patients with severe malnutrition only are represented. Diagnostic criteria determined based on ASPEN/AND guidelines and defined during admission by registered dietitian (RD). OR = operative room; ICU = intensive care unit; LOS = length of stay. Data on 383 admissions presented, total of 327 patients due to readmissions: 284 patients had 1 admission; 33 patients had 2 admissions; 8 patients had 3 admissions; 1 patient had 4 admissions; 1 patient had 5 admissions</p><p><b>Table 2.</b> Outcomes By BMI Z-score Category.</p><p></p><p>Outcomes of patients admitted with severe malnutrition, stratified based on BMI Z-score. Patients with severe malnutrition only are represented. BMI Z-score determined based on weight and height measurement at time of admission, recorded by bedside admission nurse. OR = operative room; ICU = intensive care unit; LOS = length of stay. Due to incomplete height measurements, data on only 180 admissions was available, total of 158 patients: 142 patients had 1 admission; 12 patients had 2 admissions; 3 patients had 3 admissions; 1 patient had 5 admissions</p><p>Claudia Maza, ND MSc<sup>1</sup>; Isabel Calvo, MD, MSc<sup>2</sup>; Andrea Gómez, ND<sup>2</sup>; Tania Abril, MSc<sup>3</sup>; Evelyn Frias-Toral, MD, MSc<sup>4</sup></p><p><sup>1</sup>Centro Médico Militar (Military Medical Center), Guatemala, Santa Rosa; <sup>2</sup>Hospital General de Tijuana (Tijuana General Hospital), Tijuana, Baja California; <sup>3</sup>Universidad Católica de Santiago de Guayaquil (Catholic University of Santiago de Guayaquil), Guayaquil, Guayas; <sup>4</sup>Universidad Espíritu Santo (Holy Spirit University), Dripping Springs, TX</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition is a common and significant issue among hospitalized patients, particularly in older adults or those with multiple comorbidities. The presence of malnutrition in such patients is associated with an increased risk of morbidity, mortality, prolonged hospital stays, and elevated healthcare costs. A crucial indicator of nutritional status is muscle strength, which can be effectively measured using handgrip strength (HGS). This study aimed to describe the relationship between nutritional status and muscle strength reduction in patients from two hospitals in Latin America.</p><p><b>Methods:</b> A retrospective observational study was conducted from February to May 2022. Data were collected from two hospitals: one in Guatemala and one in Mexico. A total of 169 patients aged 19-98 years were initially considered for the study, and 127 met the inclusion criteria. The sample comprised adult patients of both sexes admitted to internal medicine, surgery, and geriatrics departments. 
Handgrip strength, demographic data, baseline medical diagnosis, weight, and height were recorded at admission and on the 14th day of hospitalization. Exclusion criteria included patients with arm or hand movement limitations, those under sedation or mechanical ventilation, and those hospitalized for less than 24 hours. HGS was measured using JAMAR® and Smedley dynamometers, following standard protocols. Statistical analysis was performed using measures of central tendency, and results were presented in tables and figures.</p><p><b>Results:</b> In the first hospital (Mexico), 62 patients participated, with a predominant female sample. The average weight was 69.02 kg, height 1.62 meters, and BMI 26.14 kg/m² (classified as overweight). The most common admission diagnoses were infectious diseases, nervous system disorders, and digestive diseases. (Table 1) A slight increase in HGS (0.49 kg) was observed between the first and second measurements. (Figure 1) In the second hospital (Guatemala), 62 patients also met the inclusion criteria, with a predominant male sample. The average weight was 65.92 kg, height 1.61 meters, and BMI 25.47 kg/m² (classified as overweight). Infectious diseases and musculoskeletal disorders were the most common diagnoses. (Table 1) HGS decreased by 2 kg between the first and second measurements. (Figure 2) Low HGS was associated with underweight patients and those with class II and III obesity. Patients with normal BMI in both centers exhibited significant reductions in muscle strength, indicating that weight alone is not a sufficient indicator of muscle strength preservation.</p><p><b>Conclusion:</b> This multicenter study highlights the significant relationship between nutritional status and decreased muscle strength in hospitalized patients. While underweight patients showed reductions in HGS, those with class II and III obesity also experienced significant strength loss. These findings suggest that HGS is a valuable, non-invasive tool for assessing both nutritional status and muscle strength in hospitalized patients. Early identification of muscle strength deterioration can help healthcare providers implement timely nutritional interventions to improve patient outcomes.</p><p><b>Table 1.</b> Baseline Demographic and Clinical Characteristics of the Study Population.</p><p></p><p>NS: Nervous System, BMI: Body Mass Index</p><p></p><p><b>Figure 1.</b> Relationship Between Nutritional Status and Handgrip Strength (Center 1 - Mexico).</p><p></p><p><b>Figure 2.</b> Relationship Between Nutritional Status and Handgrip Strength (Center 2 - Guatemala).</p><p>Reem Farra, MDS, RD, CNSC, CCTD<sup>1</sup>; Cassie Greene, RD, CNSC, CDCES<sup>2</sup>; Michele Gilson, MDA, RD, CEDS<sup>2</sup>; Mary Englick, MS, RD, CSO, CDCES<sup>2</sup>; Kristine Thornham, MS, RD, CDE<sup>2</sup>; Debbie Andersen, MS, RD, CEDRD-S, CHC<sup>3</sup>; Stephanie Hancock, RD, CSP, CNSC<sup>4</sup></p><p><sup>1</sup>Kaiser Permanente, Lone Tree, CO; <sup>2</sup>Kaiser Permanente, Denver, CO; <sup>3</sup>Kaiser Permanente, Castle Rock, CO; <sup>4</sup>Kaiser Permanente, Littleton, CO</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Currently, there is no standardized screening required by the Centers for Medicare and Medicaid Services for malnutrition in outpatient settings. This raises concerns about early identification of malnutrition, the likelihood of nutrition interventions, and increased healthcare costs. 
While malnutrition screening in hospitalized patients is well-studied, its impact in outpatient care has not been thoroughly examined. Studies show that malnourished patients are 40% more likely to be readmitted within 30 days of discharge, adding over $10,000 to hospitalization costs. This quality improvement project aimed to evaluate the impact of implementing a standardized malnutrition screening tool for all patients as part of nutrition assessments by Registered Dietitians (RDs).</p><p><b>Methods:</b> The Malnutrition Screening Tool (MST) was chosen due to its validity, sensitivity, and specificity in identifying malnutrition risk in outpatient settings. This tool assesses risk by asking about recent unintentional weight loss and decreased intake due to appetite loss. Based on responses, a score of 0-5 indicates the severity of risk. Scores of 0-1 indicate no risk, while scores of 2-5 indicate a risk for malnutrition. This questionnaire was integrated into the nutrition assessment section of the electronic medical record to standardize screening for all patients. Those with scores of 2 or greater were included, with no exclusions for disease states. Patients were scheduled for follow-ups 2-6 weeks after the initial assessment, during which their MST score was recalculated.</p><p><b>Results:</b> A total of 414 patients were screened, with 175 completing follow-up visits. Of these, 131 showed improvements in their MST scores after nutrition interventions, 12 had score increases, and 32 maintained the same score. Two hundred thirty-nine patients were lost to follow-up for various reasons, including lack of response, limited RD scheduling access, changes in insurance, and mortality. Those with improved MST scores experienced an average cost avoidance of $15,000 each in subsequent hospitalizations. This cost avoidance was due to shorter hospitalizations, better treatment responses, and reduced need for medical interventions. The project raised awareness about the importance of early nutrition interventions among the multidisciplinary team.</p><p><b>Conclusion:</b> This project suggests that standardizing malnutrition screening in outpatient settings could lead to cost avoidance for patients and health systems, along with improved overall care. 
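The MST questionnaire embedded in the nutrition assessment described above combines a weight-loss item and an appetite item into a 0-5 score, with a total of 2 or more flagging risk. A minimal sketch follows; the weight-loss point bands shown are the commonly published MST values, included here as an assumption rather than taken from this abstract:

```python
def mst_score(weight_loss_kg: float | None, unsure_about_loss: bool, poor_appetite: bool) -> int:
    """Malnutrition Screening Tool score (0-5); totals of 2-5 indicate risk of malnutrition.

    Weight-loss points follow the commonly published MST bands (assumption, not from the abstract):
    no loss = 0, unsure = 2, 1-5 kg = 1, 6-10 kg = 2, 11-15 kg = 3, more than 15 kg = 4.
    """
    if unsure_about_loss:
        loss_points = 2
    elif weight_loss_kg is None or weight_loss_kg <= 0:
        loss_points = 0
    elif weight_loss_kg <= 5:
        loss_points = 1
    elif weight_loss_kg <= 10:
        loss_points = 2
    elif weight_loss_kg <= 15:
        loss_points = 3
    else:
        loss_points = 4
    return loss_points + (1 if poor_appetite else 0)


score = mst_score(weight_loss_kg=7, unsure_about_loss=False, poor_appetite=True)
print(score, "at risk" if score >= 2 else "not at risk")
```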
Further studies are needed to identify the best tools for outpatient malnutrition screening, optimal follow-up timelines, and effective nutrition interventions for greatest cost avoidance.</p><p>Amy Sharn, MS, RDN, LD<sup>1</sup>; Raissa Sorgho, PhD, MScIH<sup>2</sup>; Suela Sulo, PhD, MSc<sup>3</sup>; Emilio Molina-Molina, PhD, MSc, MEd<sup>4</sup>; Clara Rojas Montenegro, RD<sup>5</sup>; Mary Jean Villa-Real Guno, MD, FPPS, FPSPGHAN, MBA<sup>6</sup>; Sue Abdel-Rahman, PharmD, MA<sup>7</sup></p><p><sup>1</sup>Abbott Nutrition, Columbus, OH; <sup>2</sup>Center for Wellness and Nutrition, Public Health Institute, Sacramento, CA; <sup>3</sup>Global Medical Affairs and Research, Abbott Nutrition, Chicago, IL; <sup>4</sup>Research &amp; Development, Abbott Nutrition, Granada, Andalucia; <sup>5</sup>Universidad del Rosario, Escuela de Medicina, Bogota, Cundinamarca; <sup>6</sup>Ateneo de Manila University, School of Medicine and Public Health, Metro Manila, National Capital Region; <sup>7</sup>Health Data Synthesis Institute, Chicago, IL</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> American Society for Nutrition, June 29-July 2, Chicago, IL, USA; American Academy of Pediatrics, September 27-October 1, Orlando, FL, USA.</p><p><b>Publication:</b> Sharn AR, Sorgho R, Sulo S, Molina-Molina E, Rojas Montenegro C, Villa-Real Guno MJ, Abdel-Rahman S. Using mid-upper arm circumference z-score measurement to support youth malnutrition screening as part of a global sports and wellness program and improve access to nutrition care. Front Nutr. 2024 Aug 12;11:1423978. doi: 10.3389/fnut.2024.1423978. PMID: 39188981; PMCID: PMC11345244.</p><p><b>Financial Support:</b> This study was financially supported by the Abbott Center for Malnutrition Solutions, Chicago, IL, USA.</p><p>Veeradej Pisprasert, MD, PhD<sup>1</sup>; Kittipadh Boonyavarakul, MD<sup>2</sup>; Sornwichate Rattanachaiwong, MD<sup>3</sup>; Thunchanok Kuichanuan, MD<sup>3</sup>; Pranithi Hongsprabhas, MD<sup>3</sup>; Chingching Foocharoen, MD<sup>3</sup></p><p><sup>1</sup>Faculty of Medicine, Khon Kaen University, Muang Khon Kaen, Khon Kaen; <sup>2</sup>Chulalongkorn University, Bangkok, Krung Thep; <sup>3</sup>Department of Medicine, Khon Kaen University, Muang Khon Kaen, Khon Kaen</p><p><b>Financial Support:</b> Grant support provided by Khon Kaen University.</p><p><b>Background:</b> Systemic sclerosis (SSc) is an autoimmune disease in which malnutrition is a common complication, caused by the chronic inflammation of its natural history and/or gastrointestinal tract involvement. Current nutritional assessment tools, e.g. the GLIM criteria, may include data regarding muscle mass measurement for nutritional diagnosis. Anthropometric measurement is a basic method of determining muscle mass; however, data in this condition are limited. This study aimed to determine the utility of anthropometric measures of muscle mass and muscle function for diagnosing malnutrition in SSc patients.</p><p><b>Methods:</b> A cross-sectional diagnostic study was conducted in adult SSc patients at Srinagarind Hospital, Thailand. All patients were assessed for malnutrition based on the Subjective Global Assessment (SGA). Muscle mass was measured by mid-upper-arm muscle circumference (MUAC) and calf circumference (CC); in addition, muscle function was determined by handgrip strength (HGS).</p><p><b>Results:</b> A total of 208 SSc patients were included, of whom 149 (71.6%) were female. Mean age and body mass index were 59.3 ± 11.0 years and 21.1 ± 3.9 kg/m², respectively.
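Cut-off values of the kind proposed in the results that follow are commonly chosen from the ROC curve by maximizing the Youden index (sensitivity plus specificity minus 1); the abstract does not state which rule was used, so that choice is an assumption here. A minimal sketch with scikit-learn and illustrative data:

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

# Illustrative data: 1 = malnourished by SGA; lower MUAC (cm) suggests malnutrition
malnourished = np.array([1, 1, 1, 1, 1, 0, 0, 0, 0, 0])
muac_cm      = np.array([21, 23, 24, 26, 22, 27, 29, 30, 28, 31])

# Negate MUAC so that higher scores indicate malnutrition
auc = roc_auc_score(malnourished, -muac_cm)
fpr, tpr, thresholds = roc_curve(malnourished, -muac_cm)

# Youden index = sensitivity + specificity - 1 = TPR - FPR
youden = tpr - fpr
best = thresholds[np.argmax(youden)]
print(f"AUC = {auc:.3f}; proposed cut-off: MUAC <= {-best:.1f} cm")
```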
Nearly half (95 cases; 45.7%) were malnourished based on SGA. Mean values of MUAC, CC, and HGS were 25.9 ± 3.83 cm, 31.5 ± 3.81 cm, and 19.0 ± 6.99 kg, respectively. The area under the receiver operating characteristic (ROC) curve (AUC) for diagnosing malnutrition was 0.796 for MUAC, 0.759 for CC, and 0.720 for HGS. Proposed cut-off values are shown in Table 1.</p><p><b>Conclusion:</b> Muscle mass and muscle function were associated with malnutrition. Assessment of muscle mass and/or function by anthropometric measurement may be one part of nutritional assessment in patients with systemic sclerosis.</p><p><b>Table 1.</b> Proposed Cut-Off Values of MUAC, CC, and HGS in Patients With Systemic Sclerosis.</p><p></p><p>CC: calf circumference; HGS: handgrip strength; MUAC: mid-upper-arm circumference.</p><p></p><p><b>Figure 1.</b> ROC Curve of MUAC, CC, and HGS in Diagnosing Malnutrition by Subjective Global Assessment (SGA).</p><p>Trevor Sytsma, BS<sup>1</sup>; Megan Beyer, MS, RD, LDN<sup>2</sup>; Hilary Winthrop, MS, RD, LDN, CNSC<sup>3</sup>; William Rice, BS<sup>4</sup>; Jeroen Molinger, PhDc<sup>5</sup>; Suresh Agarwal, MD<sup>3</sup>; Cory Vatsaas, MD<sup>3</sup>; Paul Wischmeyer, MD, EDIC, FCCM, FASPEN<sup>6</sup>; Krista Haines, DO, MA<sup>3</sup></p><p><sup>1</sup>Duke University, Durham, NC; <sup>2</sup>Duke University School of Medicine- Department of Anesthesiology, Durham, NC; <sup>3</sup>Duke University School of Medicine, Durham, NC; <sup>4</sup>Eastern Virginia Medical School, Norfolk, VA; <sup>5</sup>Duke University Medical Center - Department of Anesthesiology - Duke Heart Center, Raleigh, NC; <sup>6</sup>Duke University Medical School, Durham, NC</p><p><b>Financial Support:</b> Baxter.</p><p><b>Background:</b> Achieving acceptable nutritional goals is a crucial but often overlooked component of postoperative care, impacting patient-important outcomes such as infectious complications and ICU length of stay. Predictive resting energy expenditure (pREE) equations correlate poorly with actual measured REE (mREE), leading to potentially harmful over- or under-feeding. International ICU guidelines now recommend the use of indirect calorimetry (IC) to determine mREE and personalize patient nutrition. Surgical stress increases protein catabolism and insulin resistance, but the effect of age on postoperative mREE trends, which commonly used pREE equations do not account for, is not well studied. This study hypothesized that older adults undergoing major abdominal surgery experience slower metabolic recovery than younger patients, as measured by IC.</p><p><b>Methods:</b> This was an IRB-approved prospective trial of adult patients following major abdominal surgery with open abdomens secondary to blunt or penetrating trauma, sepsis, or vascular emergencies. Patients underwent serial IC assessments to guide postoperative nutrition delivery. Assessments were offered within the first 72 hours after surgery, then every 3 ± 2 days during ICU stay, followed by every 7 ± 2 days in stepdown. Patient mREE was determined using the Q-NRG® Metabolic Monitor (COSMED) in ventilator mode for mechanically ventilated patients or the mask or canopy modes, depending on medical feasibility. IC data were selected from ≥ 3-minute intervals that met steady-state conditions, defined by a variance of oxygen consumption and carbon dioxide production of less than 10%. Measurements not meeting these criteria were excluded from the final analysis.
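The trend analysis described in the methods that follow, a least-squares slope of mREE over postoperative day for each patient compared between age groups with a t-test assuming unequal variance, can be sketched as shown below. The data and column values are hypothetical, and the weight-based band shown for pREE is the commonly cited 25-30 kcal/kg/day range rather than a figure from this study:

```python
import numpy as np
from scipy.stats import ttest_ind


def ree_recovery_slope(postop_day: np.ndarray, mree_kcal: np.ndarray) -> float:
    """Least-squares slope (kcal/day per postoperative day) of serial mREE measurements."""
    return np.polyfit(postop_day, mree_kcal, 1)[0]


# Hypothetical serial measurements for one older and one younger patient
older_slope = ree_recovery_slope(np.array([1, 4, 7]), np.array([1450, 1500, 1540]))
younger_slope = ree_recovery_slope(np.array([2, 5, 8]), np.array([1600, 1780, 2000]))

# Group comparison with Welch's t-test (unequal variances); the slope lists are illustrative
older_slopes, younger_slopes = [20, 35, 28, 15], [70, 85, 60, 95]
t_stat, p_value = ttest_ind(older_slopes, younger_slopes, equal_var=False)

# Weight-based predictive band, commonly cited as 25-30 kcal/kg/day when IC is unavailable
weight_kg = 80
pree_band = (25 * weight_kg, 30 * weight_kg)
print(older_slope, younger_slope, t_stat, p_value, pree_band)
```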
Patients without mREE data at two or more time points in the first nine postoperative days were also excluded. Older adult patients were defined as ≥ 65 years and younger patients as ≤ 50 years. Trends in REE were calculated using the method of least squares and compared using t-tests assuming unequal variance (α = 0.05). Patients' pREE values were calculated from admission anthropometric data using ASPEN-SCCM equations and compared against IC measurements.</p><p><b>Results:</b> Eighteen older and 15 younger adults met pre-specified eligibility criteria and were included in the final analysis. Average rates (± standard error) of REE recovery in older and younger adult patients were 28.9 ± 17.1 kcal/day and 75.5 ± 18.9 kcal/day, respectively, a difference that approached, but did not reach, statistical significance (p = 0.07). The lower and upper bands of pREE for the older cohort averaged 1728 ± 332 kcal and 2093 ± 390 kcal, respectively, markedly exceeding the mREE obtained by IC for most patients and failing to capture the observed variability identified using mREE. In younger adults, pREE values were closer to IC measurements, with lower and upper averages of 1705 ± 278 kcal and 2084 ± 323 kcal, respectively.</p><p><b>Conclusion:</b> Our data signal a difference in rates of metabolic recovery after major abdominal surgery between younger and older adult patients, but the difference did not reach statistical significance, possibly due to insufficient sample size. Predictive energy equations do not adequately capture changes in REE and may overestimate postoperative energy requirements in older adult patients, failing to appreciate the increased variability in mREE that our study found in older patients. These findings reinforce the importance of using IC to guide nutrition delivery during the early postoperative recovery period. Larger trials employing IC and quantifying protein metabolism contributions are needed to explore these questions further.</p><p><b>Table 1.</b> Patient Demographics.</p><p></p><p></p><p><b>Figure 1.</b> Postoperative Changes in mREE in Older and Younger Adult Patients Following Major Abdominal Surgery Compared to pREE (ASPEN).</p><p>Amber Foster, BScFN, BSc<sup>1</sup>; Heather Resvick, PhD(c), MScFN, RD<sup>2</sup>; Janet Madill, PhD, RD, FDC<sup>3</sup>; Patrick Luke, MD, FRCSC<sup>2</sup>; Alp Sener, MD, PhD, FRCSC<sup>4</sup>; Max Levine, MD, MSc<sup>5</sup></p><p><sup>1</sup>Western University, Ilderton, ON; <sup>2</sup>LHSC, London, ON; <sup>3</sup>Brescia School of Food and Nutritional Sciences, Faculty of Health Sciences, Western University, London, ON; <sup>4</sup>London Health Sciences Centre, London, ON; <sup>5</sup>University of Alberta, Edmonton, AB</p><p><b>Financial Support:</b> Brescia University College MScFN stipend.</p><p><b>Background:</b> Currently, body mass index (BMI) is used as the sole criterion to determine whether patients with chronic kidney disease (CKD) are eligible for kidney transplantation. However, BMI is not a good measure of health in this patient population, as it does not distinguish between muscle mass, fat mass, and water weight. Individuals with end-stage kidney disease often experience shifts in fluid balance, resulting in fluid retention, swelling, and weight gain. Consequently, the BMI of these patients may be falsely elevated. Therefore, it is vitally important to consider more accurate and objective measures of body composition for this patient population.
The aim of this study was to determine whether there is a difference in body composition between individuals with CKD who are categorized as having healthy body weight, overweight, or obesity.</p><p><b>Methods:</b> This was a cross-sectional study analyzing the body composition of 114 adults with CKD being assessed for kidney transplantation. Participants were placed into one of three BMI groups: healthy weight (group 1, BMI &lt; 24.9 kg/m<sup>2</sup>, n = 29), overweight (group 2, BMI 24.9-29.9 kg/m<sup>2</sup>, n = 39) or with obesity (group 3, BMI ≥ 30 kg/m<sup>2</sup>, n = 45). Fat mass, lean body mass (LBM), and phase angle (PhA) were measured using Bioelectrical Impedance Analysis (BIA). Standardized phase angle (SPhA), a measure of cellular health, was calculated as [(observed PhA - mean PhA)/standard deviation of PhA]. Handgrip strength (HGS) was measured using a Jamar dynamometer, and quadriceps muscle layer thickness (QMLT) was measured using ultrasonography. Normalized HGS (nHGS) was calculated as [HGS/body weight (kg)], and values were compared to age- and sex-specific standardized cutoff values. Fat-free mass index (FFMI) was calculated as [LBM/(height (m))<sup>2</sup>]. Low FFMI, which may identify high risk of malnutrition, was determined using ESPEN cut-off values of &lt; 17 kg/m<sup>2</sup> for males and &lt; 15 kg/m<sup>2</sup> for females. Frailty status was determined using the validated Fried frailty phenotype assessment tool. Statistical analysis: continuous data were analyzed using one-way ANOVA followed by Tukey post hoc tests, while chi-square tests were used for analysis of categorical data (IBM SPSS version 29; significance level p &lt; 0.05).</p><p><b>Results:</b> Participants in group 1 were younger than either group 2 (p = 0.004) or group 3 (p &lt; 0.001). There was no significant difference in the distribution of males and females across the three groups. The proportion of FFMI values below the cutoff was significantly higher in group 1 (13%) than in group 2 (0%) or group 3 (2.1%) (p = 0.02). A significant difference in nHGS was found between groups, with low muscle strength occurring more frequently among participants in group 3 (75%) than in group 2 (48.7%) or group 1 (28.5%) (p &lt; 0.001). No significant differences were seen in QMLT, SPhA, HGS, or frailty status between the three BMI groups.</p><p><b>Conclusion:</b> It appears that there is no difference in body composition parameters such as QMLT, SPhA, or frailty status between the three BMI groups. However, patients with CKD categorized as having a healthy BMI were more likely to be at risk for malnutrition. Furthermore, those individuals categorized as having a healthy BMI appeared to have more muscle strength compared to the other two groups. Taken together, these results provide convincing evidence that BMI should not be the sole criterion for listing patients for kidney transplantation. Further research is needed to confirm these findings.</p><p>Kylie Waynick, BS<sup>1</sup>; Katherine Petersen, MS, RDN, CSO<sup>2</sup>; Julie Kurtz, MS, CDCES, RDN<sup>2</sup>; Maureen McCoy, MS, RDN<sup>3</sup>; Mary Chew, MS, RDN<sup>4</sup></p><p><sup>1</sup>Arizona State University and Veterans Healthcare Administration, Phoenix, AZ; <sup>2</sup>Veterans Healthcare Administration, Phoenix, AZ; <sup>3</sup>Arizona State University, Phoenix, AZ; <sup>4</sup>Phoenix VAHCS, Phoenix, AZ</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition does not have a standardized definition or universal identification criteria.
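The derived indices used in the CKD body-composition study above (standardized phase angle, normalized handgrip strength, and fat-free mass index with the ESPEN cutoffs of 17 and 15 kg/m²) follow directly from the formulas given in its methods. A minimal sketch; the reference phase-angle values in the example are illustrative:

```python
def standardized_phase_angle(pha: float, ref_mean: float, ref_sd: float) -> float:
    """SPhA = (observed PhA - reference mean PhA) / reference SD."""
    return (pha - ref_mean) / ref_sd


def normalized_hgs(hgs_kg: float, weight_kg: float) -> float:
    """nHGS = handgrip strength / body weight."""
    return hgs_kg / weight_kg


def ffmi(lbm_kg: float, height_m: float) -> float:
    """Fat-free mass index = lean body mass / height^2 (kg/m^2)."""
    return lbm_kg / height_m ** 2


def low_ffmi(ffmi_value: float, sex: str) -> bool:
    """ESPEN cutoffs: below 17 kg/m^2 for males, below 15 kg/m^2 for females."""
    return ffmi_value < (17.0 if sex == "M" else 15.0)


# Example: 1.68 m female, 42 kg lean body mass, grip 22 kg at 70 kg body weight, PhA 4.9
f = ffmi(42, 1.68)
print(round(f, 1), low_ffmi(f, "F"),
      round(normalized_hgs(22, 70), 2),
      round(standardized_phase_angle(4.9, ref_mean=5.6, ref_sd=1.0), 2))
```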
Registered dietitian nutritionists (RDNs) most often diagnose malnutrition based on Academy and ASPEN Identification of Malnutrition (AAIM) criteria, while physicians are required to use the International Classification of Diseases, Tenth Revision, Clinical Modification (ICD-10-CM). However, there are major differences in how malnutrition is identified between the two. The malnutrition ICD-10 codes (E43.0, E44.0, E44.1, E50.0-E64.9) have vague diagnostic criteria, leading providers to rely on clinical expertise and prior nutrition education. For dietitians, the AAIM diagnostic criteria are clearly defined and validated to identify malnutrition based on reduced weight and intake, loss of muscle and fat mass, fluid accumulation, and decrease in physical functioning. Due to this lack of standardization, the process of identifying and diagnosing malnutrition is inconsistent. The purpose of this study was to analyze the congruence of malnutrition diagnosis between physicians and dietitians using the two methods and to compare patient outcomes between the congruent and incongruent groups.</p><p><b>Methods:</b> A retrospective chart review was conducted of 668 inpatients assigned a malnutrition diagnostic code, electronically pulled from the Veterans Health Administration's Clinical Data Warehouse for the time periods of April through July in 2019, 2020, and 2021. Length of stay, infection, pressure injury, falls, thirty-day readmissions, and documentation of communication between providers were collected from chart review. Data for cost to the hospital were pulled from Veterans Equitable Resource Allocation (VERA) and paired with matching social security numbers in the sample. Chi-square tests were used to compare differences between incongruency and congruency for infection, pressure injury, falls, and readmissions. Means for length of stay and cost to hospital between the two groups were analyzed using ANOVA in SPSS.</p><p><b>Results:</b> The diagnosis of malnutrition was incongruent between providers. The incongruent group had a higher percentage of adverse patient outcomes than those with congruent diagnoses. Congruent diagnoses were found to be significantly associated with incidence of documented communication (p &lt; 0.001).</p><p><b>Conclusion:</b> This study showcases a gap in malnutrition patient care. Further research needs to be conducted to understand the barriers to congruent diagnosis and communication between providers.</p><p>Nana Matsumoto, RD, MS<sup>1</sup>; Koji Oba, Associate Professor<sup>2</sup>; Tomonori Narita, MD<sup>3</sup>; Reo Inoue, MD<sup>2</sup>; Satoshi Murakoshi, MD, PhD<sup>4</sup>; Yuki Taniguchi, MD<sup>2</sup>; Kenichi Kono, MD<sup>2</sup>; Midori Noguchi, BA<sup>5</sup>; Seiko Tsuihiji<sup>2</sup>; Kazuhiko Fukatsu, MD, PhD<sup>2</sup></p><p><sup>1</sup>The University of Tokyo, Bunkyo-City, Tokyo; <sup>2</sup>The University of Tokyo, Bunkyo-ku, Tokyo; <sup>3</sup>The University of Tokyo, Chuo-City, Tokyo; <sup>4</sup>Kanagawa University of Human Services, Yokosuka-city, Kanagawa; <sup>5</sup>The University of Tokyo Hospital, Bunkyo-ku, Tokyo</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Therapeutic diets are often prescribed for patients with various disorders, for example, diabetes, renal dysfunction and hypertension. However, due to limitations on the amounts of nutrients provided, therapeutic diets might reduce appetite. Hospital meals maintain patients' nutritional status when the meals are fully consumed, regardless of the diet type.
It is possible that therapeutic diets are at least partly associated with malnutrition in hospital. Therefore, we conducted an exploratory retrospective cohort study to investigate whether there are differences in patients' oral consumption between therapeutic and regular diets, taking into account other factors.</p><p><b>Methods:</b> The study protocol was approved by the Ethics Committee of the University of Tokyo under protocol No. 2023396NI-(1). We retrospectively extracted information from the medical records of patients who were admitted to the department of orthopedic and spine surgery at the University of Tokyo Hospital between June and October 2022. Eligible patients were older than 20 years and hospitalized for more than 7 days. These patients were provided oral diets as the main source of nutrition. Patients prescribed a texture-modified, half, or liquid diet were excluded. Measurements included the percentage of oral food intake at various points during hospitalization (e.g., at admission, before and after surgery, and at discharge), sex, and age. Differences in patients' oral consumption rates between therapeutic and regular diets were analyzed using a linear mixed-effects model.</p><p><b>Results:</b> A total of 290 patients were analyzed, with 50 patients receiving a therapeutic diet and 240 patients receiving a regular diet at admission. The mean percentage of oral intake was 83.1% for therapeutic diets and 87.2% for regular diets, and was consistently 4-6% higher for regular diets than for therapeutic diets at each time point during hospitalization (Figure). In a linear mixed-effects model adjusted for sex and age, the mean percentage of oral intake on a regular diet was 4.0% higher (95% confidence interval [CI], -0.8% to 8.9%; p = 0.100) than on a therapeutic diet, although the difference did not reach statistical significance. The mean percentage of oral intake in women was 15.6% lower than in men (95% CI, -19.5% to -11.8%). Likewise, older patients' intake rates were reduced compared with younger patients (difference, -0.2% per year of age; 95% CI, -0.3% to -0.1%).</p><p><b>Conclusion:</b> This exploratory study failed to show that therapeutic diets reduce food intake in orthopedic and spine surgery patients as compared with regular diets. However, sex and age were important factors affecting food intake. Special attention should be paid to increasing oral food intake in female and/or older patients. Future research will increase the number of patients examined, expand the cohort to other departments, and proceed to a prospective study to identify the factors that truly affect patients' oral intake during hospitalization.</p><p></p><p><b>Figure 1.</b> The Percentage of Oral Intake During Hospitalization in Each Diet.</p><p>Lorena Muhaj, MS<sup>1</sup>; Michael Owen-Michaane, MD, MA, CNSC<sup>2</sup></p><p><sup>1</sup>Institute of Human Nutrition, Vagelos College of Physicians and Surgeons, Columbia University Irving Medical Center, New York, NY; <sup>2</sup>Columbia University Irving Medical Center, New York, NY</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Muscle mass is crucial for overall health and well-being. Consequently, accurate estimation of muscle mass is essential for diagnosing malnutrition and conditions such as sarcopenia and cachexia. The Kim equation uses biomarker data to estimate muscle mass, but whether this tool provides an accurate estimate in populations with high BMI and kidney disease remains uncertain.
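For orientation, a minimal Python sketch of the Kim equation as reported in Equation 1 of this abstract (reproduced later with Table 1) is shown below; the coefficient K is left as a parameter and the example inputs are placeholders, not study data.

```python
# Minimal sketch of the Kim equation (Equation 1 of this abstract); the
# coefficient K and the example inputs are placeholders, not study values.
def kim_total_body_muscle_mass(weight_kg: float, creatinine: float,
                               cystatin_c: float, k: float) -> float:
    """Muscle mass (kg) = weight * Cr / (K * weight * CysC + Cr)."""
    return weight_kg * creatinine / ((k * weight_kg * cystatin_c) + creatinine)

# Hypothetical example call (units as reported in the abstract):
estimate = kim_total_body_muscle_mass(weight_kg=85.0, creatinine=2.0,
                                      cystatin_c=0.18, k=1.0)
print(f"Estimated total body muscle mass: {estimate:.1f} kg")
```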
Therefore, the aim of this study is to assess whether the Kim equation is suitable and reliable for estimating muscle mass and predicting malnutrition, sarcopenia, and related outcomes in a cohort with diverse BMI and kidney disease.</p><p><b>Methods:</b> This is a cross-sectional study using data from the All of Us Research Program. Data on demographics, weight, height, creatinine, cystatin C, and diagnoses of malnutrition, hip fractures, and cachexia were obtained from electronic health records (EHR). The Kim equation was derived from creatinine, cystatin C and weight (Table 1) and compared with established sarcopenia cutoffs for appendicular lean mass (ALM), including ALM/BMI and ALM/height<sup>2</sup>. Malnutrition was identified through specific ICD-10-CM codes recorded in the EHR and participants were categorized based on malnutrition status (with or without malnutrition). Muscle mass and biomarker levels were compared between groups with or without severe/moderate malnutrition, and the relationships of BMI with creatinine and cystatin C levels were analyzed using linear regression. Wilcoxon rank-sum tests were used to assess associations between estimated muscle mass and malnutrition diagnosis.</p><p><b>Results:</b> Baseline characteristics were stratified by gender for comparison. The mean age of participants was 58.2 years (SD = 14.7). The mean BMI was 30.4 kg/m<sup>2</sup> (SD = 7.5). Mean serum creatinine and cystatin C levels were 2.01 mg/dL (SD = 1.82) and 0.18 mg/dL (SD = 0.11), respectively. The mean estimated muscle mass was 80.1 kg (SD = 21), with estimated muscle mass as a percentage of body weight being 92.8%. Mean ALM/BMI was 2.38 (SD = 0.32), while ALM/height<sup>2</sup> value was 25.41 kg/m<sup>2</sup> (SD = 5.6). No participant met the cutoffs for sarcopenia. All calculated variables are summarized in Table 1. In this cohort, &lt; 2% were diagnosed with severe malnutrition and &lt; 2% with moderate malnutrition (Table 2). Muscle mass was lower in participants with severe malnutrition compared to those without (W = 2035, p &lt; 0.05) (Figure 1).</p><p><b>Conclusion:</b> This study suggests the Kim equation overestimates muscle mass in populations with high BMI or kidney disease, as no participants met sarcopenia cutoffs despite the expected prevalence in CKD. This overestimation is concerning given the known risk of low muscle mass in CKD. While lower muscle mass was significantly associated with severe malnutrition (p &lt; 0.05), the Kim equation identified fewer malnutrition cases than expected based on clinical data. Though biomarkers like creatinine and cystatin C may help diagnose malnutrition, the Kim equation may not accurately estimate muscle mass or predict malnutrition and sarcopenia in diverse populations. Further research is needed to improve these estimates.</p><p><b>Table 1.</b> Muscle Mass Metrics Calculations and Diagnosis of Sarcopenia Based on FNIH (ALM/BMI) and EWGSOP (ALM/Height2) Cut-off Values.</p><p></p><p>Abbreviations: BMI-Body Mass Index; TBMM-Total Body Muscle Mass (referred as Muscle Mass as well) (calculated using the Kim Equation); ALM-Appendicular Lean Muscle Mass (using the McCarthy equation); ALM/Height2-Appendicular Lean Muscle Mass adjusted for height square (using EWGSOP cutoffs for diagnosing sarcopenia); ALM/BMI-Appendicular Lean Muscle Mass adjusted for BMI (using FNIH cutoffs for diagnosing sarcopenia). 
Equation 1: Kim equation - Calculated body muscle mass = body weight * serum creatinine/((K * body weight * serum cystatin C) + serum creatinine)</p><p><b>Table 2.</b> Prevalence of Severe and Moderate Malnutrition.</p><p></p><p>(Counts less than 20 suppressed to prevent reidentification of participants).</p><p></p><p><b>Figure 1.</b> Muscle Mass in Groups With and Without Severe Malnutrition.</p><p><b>Poster of Distinction</b></p><p>Robert Weimer, BS<sup>1</sup>; Lindsay Plank, PhD<sup>2</sup>; Alisha Rovner, PhD<sup>1</sup>; Carrie Earthman, PhD, RD<sup>1</sup></p><p><sup>1</sup>University of Delaware, Newark, DE; <sup>2</sup>University of Auckland, Auckland</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Loss of skeletal muscle is common in patients with liver cirrhosis and low muscle mass is internationally recognized as a key phenotypic criterion to diagnose malnutrition.<sup>1,2</sup> Muscle can be clinically assessed through various modalities, although reference data and consensus guidance are limited for the interpretation of muscle measures to define malnutrition in this patient population. The aim of this study was to evaluate the sensitivity and specificity of published sarcopenia cutpoints applied to dual-energy X-ray absorptiometry (DXA) muscle measures to diagnose malnutrition by the GLIM criteria, using in vivo neutron activation analysis (IVNAA) measures of total body protein (TBP) as the reference.</p><p><b>Methods:</b> Adults with liver cirrhosis underwent IVNAA and whole body DXA at the Body Composition Laboratory of the University of Auckland. DXA-fat-free mass (FFM) and appendicular skeletal muscle mass (ASMM, with and without correction for wet bone mass<sup>3</sup>) were measured and indexed to height squared (FFMI, ASMI). The ratio of measured to predicted TBP based on healthy reference data matched for age, sex and height was calculated as a protein index; values less than 2 standard deviations below the mean (&lt; 0.77) were defined as protein depletion (malnutrition). Published cut points from recommended guidelines were evaluated (Table 1).<sup>4-9</sup> DXA values below the cut point were interpreted as ‘sarcopenic’. Sensitivity and specificity for each cut point were determined.</p><p><b>Results:</b> Study sample included 350 adults (238 males/112 females, median age 52 years) with liver cirrhosis and median model for end-stage liver disease (MELD) score 12 (range 5-36). Application of published sarcopenia cutpoints to diagnose malnutrition by DXA in patients with liver cirrhosis had sensitivity ranging from 40.8% - 79.0% and specificity ranging from 79.6% to 94.2% (Table 1). Although all of the selected published cutpoints for DXA-measured ASMI were similar, the Baumgartner<sup>4</sup> and Newman<sup>5</sup> ASMI cutpoints when applied to our DXA-measured ASMI, particularly after correction for wet bone mass, yielded the best combination of sensitivity and specificity for diagnosing malnutrition as identified by protein depletion in patients with liver cirrhosis. The Studentski ASMM cutpoints in Table 1 yielded unacceptably low sensitivity (41-55%).</p><p><b>Conclusion:</b> These findings suggest that the use of the DXA-derived Baumgartner/Newman bone-corrected ASMI cutpoints offer acceptable validity in the diagnosis of malnutrition by GLIM in patients with liver cirrhosis. 
However, given that it is not common practice to make this correction for wet bone mass in DXA measures of ASMI, the application of these cutpoints to standard uncorrected measures of ASMI by DXA would likely yield much lower sensitivity, suggesting that many individuals with low muscularity and malnutrition would be misdiagnosed as non-malnourished when applying these cutpoints.</p><p><b>Table 1.</b> Evaluation of Selected Published Cut-Points for Dual-Energy X-Ray Absorptiometry Appendicular Skeletal Muscle Index to Identify Protein Depletion in Patients with Liver Cirrhosis.</p><p></p><p>Abbreviations: DXA, dual-energy X-ray absorptiometry; M, male; F, female; GLIM, Global Leadership Initiative on Malnutrition; EWGSOP, European Working Group on Sarcopenia in Older People; AWGS, Asia Working Group for Sarcopenia; FNIH, Foundation for the National Institutes of Health; ASMM, appendicular skeletal muscle mass in kg determined from DXA-measured lean soft tissue of the arms and legs; ASMI, ASMM indexed to height in meters-squared; ASMM-BC, ASMM corrected for wet bone mass according to Heymsfield et al 1990; ASMI-BC, ASMI corrected for wet bone mass.</p><p><b>Critical Care and Critical Health Issues</b></p><p>Amir Kamel, PharmD, FASPEN<sup>1</sup>; Tori Gray, PharmD<sup>2</sup>; Cara Nys, PharmD, BCIDP<sup>3</sup>; Erin Vanzant, MD, FACS<sup>4</sup>; Martin Rosenthal, MD, FACS, FASPEN<sup>1</sup></p><p><sup>1</sup>University of Florida, Gainesville, FL; <sup>2</sup>Cincinnati Children, Gainesville, FL; <sup>3</sup>Orlando Health, Orlando, FL; <sup>4</sup>Department of Surgery, Division of Trauma and Acute Care Surgery, College of Medicine, University of Florida, Gainesville, FL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Amino acids (AAs) serve different purposes in the body, including structural, enzymatic, and integral cellular functions. Amino acid utilization and demand may vary between healthy and disease states. Certain conditions such as chronic kidney disease or short bowel syndrome can affect plasma AA levels. Previous research has identified citrulline as a marker of intestinal function and absorptive capacity. Stressors such as surgery or trauma can alter AA metabolism, potentially leading to a hypercatabolic state and changes in the available AA pool. The primary objective of this study was to compare AA levels in patients who have undergone abdominal surgery with those who have not. The secondary endpoint was to describe post-surgical complications and correlate plasma AA levels with such complications.</p><p><b>Methods:</b> This study was a single-center retrospective analysis, conducted between January 1, 2007 and March 15, 2019, of patients who were referred to the University of Florida Health Nutrition Support Team (NST) and had a routine metabolic evaluation with amino acid levels as part of the nutrition support consult. Amino acid data were excluded if specimens were deemed contaminated. Patients with genetic disorders were also excluded from the study. During the study period, the amino acid bioassay was performed using Biochrom ion exchange chromatography (ARUP Laboratories, Salt Lake City, UT).</p><p><b>Results:</b> Of the 227 patients screened, 181 patients were included in the study (58 who underwent abdominal surgery and 123 who did not). The mean age, BMI and height of participants were 52.2 years, 25.1 kg/m<sup>2</sup> and 169 cm, respectively.
Baseline characteristics were similar between the two groups: 31% of the surgery arm had undergone a surgical procedure within a year of the index encounter, 86.5% retained their colon, and 69.2% had a bowel resection, with a mean of 147.6 cm of bowel remaining among those with a documented length of remaining bowel (36 out of 58). Postoperative complications of small bowel obstruction, ileus, leak, abscess, bleeding and surgical site infection (SSI) occurred in 12.1%, 24%, 17.2%, 20.7%, 3.4% and 17.2% of patients, respectively. Among the 19 AAs evaluated, median citrulline and methionine levels were significantly different between the 2 groups (23 [14-35] vs 17 [11-23], p = 0.0031, and 27 [20-39] vs 33 [24-51], p = 0.0383, respectively). Alanine and arginine levels were associated with postoperative ileus, leucine levels correlated with SSI, and glutamic acid and glycine levels were linked to postoperative fistula formation.</p><p><b>Conclusion:</b> Most amino acid levels showed no significant differences between patients who underwent abdominal surgery and those who did not, except for citrulline and methionine. Specific amino acids, such as alanine, arginine, leucine, glutamic acid and glycine, may serve as early indicators of post-surgical complications; however, a larger prospective trial is warranted to validate our findings.</p><p>Kento Kurashima, MD, PhD<sup>1</sup>; Grace Trello<sup>1</sup>; James Fox<sup>1</sup>; Edward Portz<sup>1</sup>; Shaurya Mehta, BS<sup>1</sup>; Austin Sims<sup>1</sup>; Arun Verma, MD<sup>1</sup>; Chandrashekhara Manithody, PhD<sup>1</sup>; Yasar Caliskan, MD<sup>1</sup>; Mustafa Nazzal, MD<sup>1</sup>; Ajay Jain, MD, DNB, MHA<sup>1</sup></p><p><sup>1</sup>Saint Louis University, St. Louis, MO</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Marginal donor livers (MDLs) have been used for liver transplantation to address major organ shortages. However, MDLs are notably susceptible to ischemia/reperfusion injury (IRI). Recent investigations have highlighted ferroptosis, a recently described form of programmed cell death, as a potential contributor to IRI. We hypothesized that modulating ferroptosis with the iron chelator deferoxamine (DFO) could alter the course of IRI.</p><p><b>Methods:</b> Using our novel Perfusion Regulated Organ Therapeutics with Enhanced Controlled Testing (PROTECT) model (US provisional patent US63/136,165), six human MDLs (livers A to F) were procured and split into paired lobes. Simultaneous perfusion was performed on both lobes, with one lobe subjected to DFO and the other serving as an internal control. Histology, serum chemistry, expression of ferroptosis-associated genes, determination of iron accumulation, and measurement of lipid peroxidation were performed.</p><p><b>Results:</b> Histological analysis revealed severe macrovesicular steatosis (&gt;30%) in livers A and D, while livers B and E exhibited mild to moderate macrovesicular steatosis. The majority of samples showed mild inflammation, predominantly in zone 3. No significant necrosis was noted during perfusion. Perls' Prussian blue stain and non-heme iron quantification demonstrated a suppression of iron accumulation in livers A to D with DFO treatment (p &lt; 0.05). Based on the degree of iron chelation, the 12 lobes were categorized into two groups: lobes with decreased iron (n = 4) and those with increased iron (n = 8).
Comparative analysis demonstrated that ferroptosis-associated genes (HIF1-alpha, RPL8, IREB2, ACSF2, NQO1) were significantly downregulated in the former (p = 0.0338, p = 0.0085, p = 0.0138, p = 0.0138, p = 0.0209, respectively). Lipid peroxidation was significantly suppressed in lobes with decreased iron (p = 0.02). While serum AST was lower in iron-chelated lobes, this did not reach statistical significance.</p><p><b>Conclusion:</b> This study affirmed that iron accumulation was driven by normothermic perfusion. Reduction of iron content suppressed ferroptosis-associated genes and lipid peroxidation to mitigate IRI. Our results using human MDLs revealed a novel relationship between iron content and ferroptosis, providing a solid foundation for future development of IRI therapeutics.</p><p>Gabriella ten Have, PhD<sup>1</sup>; Macie Mackey, BSc<sup>1</sup>; Carolina Perez, MSc<sup>1</sup>; John Thaden, PhD<sup>1</sup>; Sarah Rice, PhD<sup>1</sup>; Marielle Engelen, PhD<sup>1</sup>; Nicolaas Deutz, PhD, MD<sup>1</sup></p><p><sup>1</sup>Texas A&amp;M University, College Station, TX</p><p><b>Financial Support:</b> Department of Defense: CDMRP PR190829 - ASPEN Rhoads Research Foundation - C. Richard Fleming Grant &amp; Daniel H. Teitelbaum Grant.</p><p><b>Background:</b> Sepsis is a potentially life-threatening complication of infection in critically ill patients and is characterized by severe tissue breakdown in several organs, leading to long-term muscle weakness, fatigue, and reduced physical activity. Recent guidelines suggest increasing protein intake gradually in the first days of recovery from a septic event. It is, however, unclear how this affects whole-body protein turnover. Therefore, in an acute sepsis-recovery pig model, we studied whole-body protein metabolism in the early sepsis recovery phase after restricted feeding with a balanced meal of amino acids (AA).</p><p><b>Methods:</b> In 25 catheterized pigs (± 25 kg), sepsis was induced by IV infusion of live Pseudomonas aeruginosa bacteria (5 × 10<sup>8</sup> CFU/hour). At t = 9 h, recovery was initiated with IV administration of gentamicin. Post sepsis, food was given twice daily and incrementally (Day 1: 25%, Day 2: 50%, Day 3: 75%, Day &gt;4: 100%). The 100% meals contained, per kg BW, 15.4 g CHO and 3.47 g fat, plus a balanced free AA mixture reflecting the muscle AA profile (0.56 g N = 3.9 g AA). Before sepsis (Baseline) and on recovery day 3, an amino acid stable isotope mixture was administered IV as a pulse (post-absorptive). Subsequently, arterial blood samples were collected in the postabsorptive state for 2 hours. Amino acid concentrations and enrichments were determined with LC-MS. Statistics: RM-ANOVA, α = 0.05.</p><p><b>Results:</b> At day 3, animal body weight was decreased (by 2.4 [0.9, 3.9]%, p = 0.0025). Compared to baseline values, plasma AA concentration profiles were changed. Overall, the total non-essential AA plasma concentration did not change. Essential AA plasma concentrations of histidine, leucine, methionine, phenylalanine, tryptophan, and valine were lower (p &lt; 0.05) and lysine was higher (p = 0.0027). Isoleucine did not change. We observed lower whole-body production (WBP) of the non-essential amino acids arginine (p &lt; 0.0001), glutamine (p &lt; 0.0001), glutamate (p &lt; 0.0001), glycine (p &lt; 0.0001), hydroxyproline (p = 0.0041), ornithine (p = 0.0003), taurine (p &lt; 0.0001), and tyrosine (p &lt; 0.0001). Citrulline production did not change.
In addition, lower WBP was observed for the essential amino acids isoleucine (p = 0.0002), leucine (p &lt; 0.0001), valine (p &lt; 0.0001), methionine (p &lt; 0.0001), tryptophan (p &lt; 0.0001), and lysine (p &lt; 0.0001). Whole-body protein breakdown and protein synthesis were also lower (p &lt; 0.0001), while net protein breakdown did not change.</p><p><b>Conclusion:</b> Our sepsis-recovery pig model suggests that food restriction in the early phase of sepsis recovery leads to diminished protein turnover.</p><p>Gabriella ten Have, PhD<sup>1</sup>; Macie Mackey, BSc<sup>1</sup>; Carolina Perez, MSc<sup>1</sup>; John Thaden, PhD<sup>1</sup>; Sarah Rice, PhD<sup>1</sup>; Marielle Engelen, PhD<sup>1</sup>; Nicolaas Deutz, PhD, MD<sup>1</sup></p><p><sup>1</sup>Texas A&amp;M University, College Station, TX</p><p><b>Financial Support:</b> Department of Defense: CDMRP PR190829 - ASPEN Rhoads Research Foundation - C. Richard Fleming Grant &amp; Daniel H. Teitelbaum Grant.</p><p><b>Background:</b> Sepsis is a potentially life-threatening complication of infection in critically ill patients. It is characterized by severe tissue breakdown in several organs, leading to long-term muscle weakness, fatigue, and reduced physical activity (ICU-AW). In an acute sepsis-recovery ICU-AW pig model, we studied whether meals that contain only essential amino acids (EAA) can correct the metabolic dysregulation that occurs during sepsis recovery, as assessed by comprehensive metabolic phenotyping<sup>1</sup>.</p><p><b>Methods:</b> In 49 catheterized pigs (± 25 kg), sepsis was induced by IV infusion of live Pseudomonas aeruginosa bacteria (5 × 10<sup>8</sup> CFU/hour). At t = 9 h, recovery was initiated with IV administration of gentamicin. Post sepsis, food was given blindly, twice daily and incrementally (Day 1: 25%, Day 2: 50%, Day 3: 75%, Day &gt;4: 100%). The 100% meals contained, per kg BW, 15.4 g CHO and 3.47 g fat, plus 0.56 g N of an EAA mixture (reflecting muscle protein EAA, 4.3 g AA) or control (TAA, 3.9 g AA). Before sepsis (Baseline) and on recovery day 7, an amino acid stable isotope mixture was administered IV as a pulse (post-absorptive). Subsequently, arterial blood samples were collected for 2 hours. AA concentrations and enrichments were determined with LC-MS. Statistics: RM-ANOVA, α = 0.05.</p><p><b>Results:</b> A body weight reduction was found after sepsis, which was restored by Day 7 post sepsis. Compared to baseline, the EAA group showed increased muscle fatigue (p &lt; 0.0001), tau-methylhistidine whole-body production (WBP; reflecting myofibrillar muscle breakdown, p &lt; 0.0001), and whole-body net protein breakdown (p &lt; 0.0001); these changes were smaller in the control group (muscle fatigue: p &lt; 0.0001, tau-methylhistidine: p = 0.0531, net protein breakdown: p &lt; 0.0001). In addition, on day 7, lower WBP was observed for glycine (p &lt; 0.0001), hydroxyproline (p &lt; 0.0001), glutamate (p &lt; 0.0001), glutamine (p &lt; 0.0001), and taurine (p &lt; 0.0001); these decreases were smaller (glycine: p = 0.0014; hydroxyproline: p = 0.0007; glutamate: p = 0.0554) or larger (glutamine: p = 0.0497; taurine: p &lt; 0.0001) in the control group. In addition, the WBP of citrulline (p = 0.0011) was increased on day 7, but less so in the control group (p = 0.0078). Higher plasma concentrations of asparagine (p &lt; 0.0001), citrulline (p &lt; 0.0001), glutamine (p = 0.0001), tau-methylhistidine (p = 0.0319), serine (p &lt; 0.0001), taurine (p &lt; 0.0001), and tyrosine (p &lt; 0.0001) were observed in the EAA group.
In the EAA group, the clearance was lower (p &lt; 0.05), except for glycine, tau-methylhistidine, and ornithine.</p><p><b>Conclusion:</b> Our sepsis-recovery pig ICU-AW model shows that feeding EAA-only meals after sepsis is associated with increased muscle and whole-body net protein breakdown and affects non-EAA metabolism. We hypothesize that non-essential amino acids in post-sepsis nutrition are needed to improve protein anabolism.</p><p>Rebecca Wehner, RD, LD, CNSC<sup>1</sup>; Angela Parillo, MS, RD, LD, CNSC<sup>1</sup>; Lauren McGlade, RD, LD, CNSC<sup>1</sup>; Nan Yang, RD, LD, CNSC<sup>1</sup>; Allyson Vasu-Sarver, MSN, APRN-CNP<sup>1</sup>; Michele Weber, DNP, RN, APRN-CNS, APRN-NP, CCRN, CCNS, OCN, AOCNS<sup>1</sup>; Stella Ogake, MD, FCCP<sup>1</sup></p><p><sup>1</sup>The Ohio State University Wexner Medical Center, Columbus, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Patients in the intensive care unit, especially those on mechanical ventilation, frequently receive inadequate enteral nutrition (EN) therapy. Critically ill patients receive on average 40-50% of prescribed nutritional requirements, while ASPEN/SCCM guidelines recommend that efforts be made to provide &gt; 80% of goal energy and protein needs. One method to help achieve these goals is the use of volume-based feedings (VBF). At our institution, an hourly rate-based feeding (RBF) approach is standard. In 2022, our Medical Intensive Care Unit (MICU) operations committee inquired about the potential benefits of implementing VBF. Before changing our practice, we collected data to assess our current performance in meeting EN goals and to identify the reasons for interruptions. While the literature suggests that VBF is relatively safe in terms of EN complications compared with RBF, to our knowledge there is currently no information on the safety of starting VBF in patients at risk of gastrointestinal (GI) intolerance. Therefore, we also sought to determine whether EN is more frequently held due to GI intolerance versus operative procedures.</p><p><b>Methods:</b> We conducted a retrospective evaluation of EN delivery compared to the EN goal, and of the reason for interruption if EN delivery was below goal, in the MICU of a large tertiary academic medical center. We reviewed ten days of information on any MICU patient on EN. One day constituted the total EN volume received, in milliliters, from 0700-0659 hours. Using the QI Assessment Form, we collected the following data: goal EN volume in 24 hours, volume received in 24 hours, percent volume received versus prescribed, and hours feeds were held (or below goal rate) each day. The reasons for holding tube feeds were divided into six categories based on underlying causes: feeding initiation/titration, GI issues (constipation, diarrhea, emesis, nausea, distention, high gastric residual volume), operative procedure, non-operative procedure, mechanical issue, and practice issues. Data were entered into a spreadsheet, and descriptive statistics were used to evaluate results.</p><p><b>Results:</b> MICU patients receiving EN were observed over ten random days in a two-week period in August 2022. Eighty-two patients were receiving EN. Three hundred and four EN days were observed. Average percent EN delivered was 70% among all patients.
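As a rough illustration of the per-day EN adequacy calculation and hold-reason tally described in the Methods above, a minimal Python sketch is shown below; the records and category labels are hypothetical, not audit data.

```python
# Minimal sketch of the per-day EN adequacy calculation and hold-reason tally
# described above; the records below are hypothetical, not audit data.
from collections import Counter

en_days = [
    {"goal_ml": 1800, "received_ml": 1260, "hold_reason": "GI issues"},
    {"goal_ml": 1800, "received_ml": 1800, "hold_reason": None},
    {"goal_ml": 1500, "received_ml": 900,  "hold_reason": "operative procedure"},
]

pct_delivered = [d["received_ml"] / d["goal_ml"] * 100 for d in en_days]
average_pct = sum(pct_delivered) / len(pct_delivered)
hold_reasons = Counter(d["hold_reason"] for d in en_days if d["hold_reason"])

print(f"Average percent EN delivered: {average_pct:.0f}%")
print(hold_reasons.most_common())
```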
EN was withheld for the following reasons: 34 cases (23%) were related to feeding initiation, 55 (37%) GI issues, 19 (13%) operative procedures, 32 (22%) non-operative procedures, 2 (1%) mechanical issues, and 5 (3%) cases were related to practice issues. VBF could have been considered in 51 cases (35%).</p><p><b>Conclusion:</b> These results suggest that EN delivery in our MICU is most often below prescribed amount due to GI issues and feeding initiation. Together, they comprised 89 cases (60%). VBF protocols would not improve delivery in either case. VBF would likely lead to increased discomfort in patients experiencing GI issues, and feeding initiation can be improved with changes to advancement protocols. Due to VBF having potential benefit in only 35% of cases, as well as observing above average EN delivery, this protocol was not implemented in the observed MICU.</p><p>Delaney Adams, PharmD<sup>1</sup>; Brandon Conaway, PharmD<sup>2</sup>; Julie Farrar, PharmD<sup>3</sup>; Saskya Byerly, MD<sup>4</sup>; Dina Filiberto, MD<sup>4</sup>; Peter Fischer, MD<sup>4</sup>; Roland Dickerson, PharmD<sup>3</sup></p><p><sup>1</sup>Regional One Health, Memphis, TN; <sup>2</sup>Veterans Affairs Medical Center, Memphis, TN; <sup>3</sup>University of Tennessee College of Pharmacy, Memphis, TN; <sup>4</sup>University of Tennessee College of Medicine, Memphis, TN</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> Society for Critical Care Medicine 54th Annual Critical Care Congress. February 23 to 25, 2025, Orlando, FL.</p><p><b>Publication:</b> Critical Care Medicine.2025;53(1):In press.</p><p><b>Financial Support:</b> None Reported.</p><p><b>Best of ASPEN-Critical Care and Critical Health Issues</b></p><p>Megan Beyer, MS, RD, LDN<sup>1</sup>; Krista Haines, DO, MA<sup>2</sup>; Suresh Agarwal, MD<sup>2</sup>; Hilary Winthrop, MS, RD, LDN, CNSC<sup>2</sup>; Jeroen Molinger, PhDc<sup>3</sup>; Paul Wischmeyer, MD, EDIC, FCCM, FASPEN<sup>4</sup></p><p><sup>1</sup>Duke University School of Medicine- Department of Anesthesiology, Durham, NC; <sup>2</sup>Duke University School of Medicine, Durham, NC; <sup>3</sup>Duke University Medical Center - Department of Anesthesiology - Duke Heart Center, Raleigh, NC; <sup>4</sup>Duke University Medical School, Durham, NC</p><p><b>Financial Support:</b> Baxter, Abbott.</p><p><b>Background:</b> Resting energy expenditure (REE) is critical in managing nutrition in intensive care unit (ICU) patients. Accurate energy needs assessments are vital for optimizing nutritional interventions. However, little is known about how different disease states specifically influence REE in ICU patients. Existing energy recommendations are generalized and do not account for the metabolic variability across disease types. Indirect calorimetry (IC) is considered the gold standard for measuring REE but is underutilized. This study addresses this gap by analyzing REE across disease states using metabolic cart assessments in a large academic medical center. The findings are expected to inform more precise, disease-specific nutritional recommendations in critical care.</p><p><b>Methods:</b> This is a pooled analysis of patients enrolled in four prospective clinical trials evaluating ICU patients across a range of disease states. Patients included in this analysis were admitted to the ICU with diagnoses of COVID-19, respiratory failure, cardiothoracic (CT) surgery, trauma, or surgical intensive care conditions. 
All patients underwent IC within 72 hours of ICU admission to assess REE, with follow-up measurements conducted when patients were medically stable. In each study, patients were managed under standard ICU care protocols, and nutritional interventions were individualized or standardized based on clinical trial protocols. The primary outcome was the measured REE, expressed in kcal/day and normalized to body weight (kcal/kg/day). Summary statistics for demographic and clinical characteristics, such as age, gender, height, weight, BMI, and comorbidities, were reported. Comparative analyses across the five disease states were performed using ANOVA tests to determine the significance of differences in REE.</p><p><b>Results:</b> The analysis included 165 ICU patients. The cohort had a mean age of 58 years, with 58% male and 42% female. Racial demographics included 36% Black, 52% White, and 10% from other backgrounds. Patients in the surgical ICU group had the lowest caloric requirements, averaging 1503 kcal/day, while COVID-19 patients had the highest calorie needs at 1982 kcal/day. CT surgery patients measured 1644 kcal/day, respiratory failure patients 1763 kcal/day, and trauma patients required 1883 kcal/day. ANOVA demonstrated statistically significant differences in REE between these groups (p &lt; 0.001). When normalized to body weight (kcal/kg/day), REE varied from 20.3 to 23.5 kcal/kg/day, with statistically significant differences between disease states (p &lt; 0.001).</p><p><b>Conclusion:</b> This study reveals significant variability in REE across different disease states in ICU patients, highlighting the need for disease-specific energy recommendations. These findings indicate that specific disease processes, such as COVID-19 and trauma, may increase metabolic demands, while patients recovering from surgical procedures may have comparatively lower energy needs. These findings emphasize the importance of individualized nutritional interventions based on a patient's disease state to optimize recovery and clinical outcomes and to prevent underfeeding or overfeeding, which can adversely affect patient outcomes. The results suggest IC should be more widely implemented in ICU settings to guide precise and effective nutrition delivery based on real-time metabolic data rather than relying on standard predictive equations. Further research is needed to refine these recommendations and explore continuous monitoring of REE and tailored nutrition needs in the ICU.</p><p><b>Table 1.</b> Demographic and Clinical Characteristics.</p><p></p><p><b>Table 2.</b> Disease Group Diagnoses.</p><p></p><p></p><p><b>Figure 1.</b> Average Measured Resting Energy Expenditure by Disease Group.</p><p>Hailee Prieto, MA, RD, LDN, CNSC<sup>1</sup>; Emily McDermott, MS, RD, LDN, CNSC<sup>2</sup></p><p><sup>1</sup>Northwestern Memorial Hospital, Shorewood, IL; <sup>2</sup>Northwestern Memorial Hospital, Chicago, IL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Communication between Registered Dietitians (RDs) and CTICU teams is non-standardized and often ineffective in supporting proper implementation of RD recommendations. Late or absent RD support can impact the quality of nutrition care provided to patients. In FY23, the CTICU nutrition consult/risk turnaround time within 24 hours was 58%, and missed nutrition consults/risks were 9%.
Our goal was to improve the RD consult/risk turnaround time within 24 hours, based on our department goal, from 58% to 75%, and to reduce missed RD consults/risks from 9% to 6%, by standardizing communication between RDs and CTICU APRNs. The outcome metrics were nutrition risk turnaround time and nutrition consult turnaround time. The process metric was the percentage of rounds with RD presence.</p><p><b>Methods:</b> We used the DMAIC methodology to attempt to solve our communication issue in the CTICU. We took the voice of the customer and surveyed the CTICU APRNs, finding that a barrier was the RDs' limited presence in the CTICU. We found that the CTICU APRNs find it valuable to have an RD rounding daily with their team. We then did a literature search on RDs rounding in the ICU, specifically cardiac/thoracic ICUs, and found that critically ill cardiac surgery patients are at high risk of developing malnutrition; however, initiation of medical nutrition therapy and overall adequacy of nutrition provision are lower compared with noncardiac surgical or MICU patients. RD-written orders directly improve outcomes in patients receiving nutrition support. To have the most influence, RDs need to be present in the ICU and be involved when important decisions are being made. Dietitian involvement in the ICU team significantly improves the team's ability to implement prompt, relevant nutrition support interventions. We used process mapping to address the fact that CTICU rounding times overlap with rounding on the step-down cardiac floors and in other ICUs. We optimized the RDs' daily schedules to allow them to attend as many rounds as possible, including the CTICU rounds. We then implemented a new rounding structure within the Cardiac Service Line based on the literature regarding the standard of care and the RD role in ICU rounding.</p><p><b>Results:</b> The percentage of nutrition consults/risks completed within 24 hours increased by 26 percentage points (58% to 84%), and missed consults/risks decreased to 1%, exceeding both goals. The number of nutrition interventions we were able to implement increased with more RDs attending rounds, which was tracked after implementation of an RD rounding structure within the CTICU. The comparison of implemented interventions between 1 and 2 RDs was skewed because, on days when only 1 RD was available, that RD attempted to round with both teams.</p><p><b>Conclusion:</b> Communication between the CTICU team and Clinical Nutrition continues to improve, with consistent positive feedback from the ICU providers regarding the new rounding structure. The new workflow was implemented in the Clinical Nutrition Cardiac Service Line. As a future opportunity, other ICU teams at NMH that do not have a dedicated RD to round with them, due to RD staffing, could also benefit from a dedicated RD in those rounds daily.</p><p><b>Table 1.</b> New Rounding Structure.</p><p></p><p>*Critical Care Rounds; Green: Attend; Gold: Unable to attend.</p><p><b>Table 2.</b> Control Plan.</p><p></p><p></p><p><b>Figure 1.</b> Results Consult Risk Turn Around Time Pre &amp; Post Rounding.</p><p></p><p><b>Figure 2.</b> Number of Implemented RD Nutrition Interventions by Number of RDs Rounding.</p><p>Kenny Ngo, PharmD<sup>1</sup>; Rachel Leong, PharmD<sup>2</sup>; Vivian Zhao, PharmD<sup>2</sup>; Nisha Dave, PharmD<sup>2</sup>; Thomas Ziegler, MD<sup>2</sup></p><p><sup>1</sup>Emory Healthcare, Macon, GA; <sup>2</sup>Emory Healthcare, Atlanta, GA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Micronutrients play a crucial role in biochemical processes in the body.
During critical illness, the status of micronutrients can be affected by factors such as disease severity and medical interventions. Extracorporeal membrane oxygenation (ECMO) is a vital supportive therapy that has seen increased utilization for critically ill patients with acute severe refractory cardiorespiratory failure. The potential alterations in micronutrient status and requirements during ECMO are an area of significant interest, but data are limited. This study aimed to determine the incidence of micronutrient depletion in critically ill patients requiring ECMO.</p><p><b>Methods:</b> A retrospective chart review was conducted for patients with at least one micronutrient level measured in blood while receiving ECMO between January 1, 2015, and September 30, 2023. Chart reviews were completed using the Emory Healthcare electronic medical record system after the Emory University Institutional Review Board approved the study. A full waiver for informed consent and authorization was approved for this study. Data on demographic characteristics, ECMO therapy-related information, and reported micronutrient levels were collected. Descriptive statistics were used to evaluate the data.</p><p><b>Results:</b> A total of 77 of the 128 reviewed patients met inclusion criteria and were included in data analysis (Table 1). The average age of patients was 49, and 55.8% were female. The average duration of ECMO was 14 days, and the average length of stay in the intensive care unit was 46.7 days. Among the included patients, 56 required continuous renal replacement therapy (CRRT) along with ECMO. Patients who required CRRT received empiric standard supplementation of folic acid (1 mg), pyridoxine (200 mg), and thiamine (100 mg) every 48 hours. Of the 77 patients, 44% had below-normal blood levels of at least one of the measured micronutrients. The depletion percentages of various nutrients were as follows: vitamin C (80.6%), vitamin D (75.0%), iron (72.7%), copper (53.3%), carnitine (31.3%), selenium (30.0%), pyridoxine (25.0%), folic acid (18.8%), vitamin A (10.0%), zinc (7.1%), and thiamine (3.7%) (Table 2). Measured vitamin B12, manganese, and vitamin E levels were within normal limits.</p><p><b>Conclusion:</b> This study demonstrated that 60% of patients on ECMO had orders to evaluate at least one micronutrient in their blood. Of these, almost half had at least one micronutrient level below normal limits. These findings underscore the need for regular nutrient monitoring for critically ill patients. 
Prospective studies are needed to understand the impact of ECMO on micronutrient status, determine the optimal time for evaluation, and assess the need for and efficacy of supplementation in these patients.</p><p><b>Table 1.</b> General Demographic and ECMO Characteristics (N = 77).</p><p></p><p><b>Table 2.</b> Observed Micronutrient Status during ECMO for Critically Ill Patients.</p><p></p><p>Diane Nowak, RD, LD, CNSC<sup>1</sup>; Mary Kronik, RD, LD, CNSC<sup>2</sup>; Caroline Couper, RD, LD, CNSC<sup>3</sup>; Mary Rath, MEd, RD, LD, CNSC<sup>4</sup>; Ashley Ratliff, MS, RD, LD, CNSC<sup>4</sup>; Eva Leszczak-Lesko, BS Health Sciences, RRT<sup>4</sup></p><p><sup>1</sup>Cleveland Clinic, Elyria, OH; <sup>2</sup>Cleveland Clinic, Olmsted Twp, OH; <sup>3</sup>Cleveland Clinic, Rocky River, OH; <sup>4</sup>Cleveland Clinic, Cleveland, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Indirect calorimetry (IC) is the gold standard for the accurate determination of energy expenditure. The team performed a comprehensive literature review on current IC practices across the nation which showed facilities employing IC typically follow a standard protocol dictated by length of stay (LOS). Intensive Care Unit (ICU) Registered Dietitians (RD) have directed IC intervention to reduce reliance on inaccurate predictive equations and judiciously identify patients (1, 2) with the assistance of IC order-based practice. While IC candidacy is determined by clinical criteria, implementation has been primarily dictated by RD time constraints. Our project aims to include IC in our standard of care by using a standardized process for implementation.</p><p><b>Methods:</b> To implement IC at our 1,299-bed quaternary care hospital, including 249 ICU beds, a multidisciplinary team including ICU RDs and Respiratory Therapists (RT) partnered with a physician champion. Three Cosmed QNRG+ indirect calorimeters were purchased after a 6-month trial period. Due to the potential for rapid clinical status changes and RD staffing, the ICU team selected an order-based practice as opposed to a protocol. The order is signed by a Licensed Independent Practitioner (LIP) and includes three components: an order for IC with indication, a nutrition assessment consult, and a conditional order for the RD to release to RT once testing approved. After the order is signed, the RD collaborates with the Registered Nurse and RT by verifying standardized clinical criteria to assess IC candidacy. If appropriate, the RD will release the order for RT prior to testing to allow for documentation of ventilator settings. To begin the test, the RD enters patient information and calibrates the pneumotach following which RT secures ventilation connections. Next, RD starts the test and remains at bedside for the standardized 20-minute duration to ensure steady state is not interrupted. Once testing is completed, the best 5-minute average is selected to obtain the measured resting energy expenditure (mREE). The RD interprets the results considering a multitude of factors and, if warranted, modifies nutrition interventions.</p><p><b>Results:</b> Eight ICU registered dietitians completed 87 IC measurements from May 2024 through August 2024 which included patients across various ICUs. All 87 patients were selected by the RD due to concerns for over or underfeeding. Eighty-three percent of the measurements were valid tests and seventy-nine percent of the measurements led to intervention modifications. 
The amount of face-to-face time spent was 66 hours and 45 minutes, or an average of 45 minutes per test. Additional time spent interpreting results and making modifications to interventions ranged from 15 to 30 minutes.</p><p><b>Conclusion:</b> IC has the ability to capture accurate energy expenditures in the critically ill. RD-directed, order-based IC practice has allowed for the successful introduction of IC at our institution. The future transition from limited IC implementation to a standard of care will depend on the consideration of numerous challenges, including RD time constraints and patient volumes amid the ebbs and flows of critical care. To align with the ever-changing dynamics of critical care, staffing levels and workflows are being actively evaluated.</p><p><b>Table 1.</b> Indirect Calorimetry (IC) Checklist.</p><p></p><p></p><p><b>Figure 1.</b> IC Result with Invalid Test.</p><p></p><p><b>Figure 2.</b> IC Result with Valid Test.</p><p></p><p><b>Figure 3.</b> IC Indications and Contraindications.</p><p></p><p><b>Figure 4.</b> IC EPIC Order.</p><p>Rebecca Frazier, MS, RD, CNSC<sup>1</sup>; Chelsea Heisler, MD, MPH<sup>1</sup>; Bryan Collier, DO, FACS, FCCM<sup>1</sup></p><p><sup>1</sup>Carilion Roanoke Memorial Hospital, Roanoke, VA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Adequate energy intake with appropriate macronutrient composition is an essential part of patient recovery; however, predictive equations have been found to be of variable accuracy. Indirect calorimetry (IC) gives insight into the primary nutrition substrate being utilized as metabolic fuel and into caloric needs, often identifying over- and under-feeding. Though IC is considered the gold standard for determining resting energy expenditure, it has challenges with cost, equipment feasibility, and time constraints for personnel, namely respiratory therapy (RT). Our hypothesis: Registered Dietitian (RD)-led IC tests can be conducted in a safe and feasible manner without undue risk of complication. In addition, IC will show a higher caloric need, particularly in patients who require at least seven days of ventilatory support.</p><p><b>Methods:</b> A team of RDs screened surgical ICU patients at a single institution. Patients intubated for at least 3 days were considered eligible for testing. Exclusion criteria included PEEP &gt; 10, fraction of inspired oxygen &gt; 60%, Richmond Agitation Sedation Scale ≥ 1, chest tube air leak, extracorporeal membrane oxygenation use, and &gt; 1°C temperature change in 1 hour. Tests were completed using the Q-NRG+ Portable Metabolic Monitor (Baxter) based on RD and patient availability. Test results were compared to calculated needs based on the Penn State Equation (39 tests). Overfeeding/underfeeding was defined as &gt; 15% deviation from the equation results. The mean difference in energy needs was analyzed using a standard paired, two-tailed t-test for ≤ 7 total ventilated days and &gt; 7 ventilated days.</p><p><b>Results:</b> Thirty patients underwent IC testing; a total of 39 tests were completed. There were no complications in RD-led IC testing, and minimal RT involvement was required after the first 5 tests were completed. Overall, 56.4% of all IC tests, regardless of ventilator days, indicated overfeeding. In addition, 33.3% of tests indicated appropriate feeding (85-115% of calculated REE), and 10.3% of tests demonstrated underfeeding. When stratified by ventilator days (&gt; 7 d vs.
≤7 d), similar results were found: 66% of IC tests deviated by &gt;15% from calculated caloric needs; by equation, 54.4-60.0% were overfed and 12.5-6.7% were underfed, respectively.</p><p><b>Conclusion:</b> Equations estimating caloric needs provide inconsistent results. Nutritional equations under- and overestimate nutritional needs similarly, regardless of ventilatory days, compared to IC. Despite the lack of statistical significance, the effects of poor nutrition are well documented and clinically significant. With minimal training, IC can be performed safely with an RD and bedside RN. Utilizing the RD to coordinate and perform IC testing is a feasible process that maximizes personnel efficiency and allows for immediate adjustment in the nutrition plan. IC as the gold standard for nutrition estimation should be performed on surgical ICU patients to assist in developing nutritional treatment algorithms.</p><p>Dolores Rodríguez<sup>1</sup>; Mery Guerrero<sup>2</sup>; María Centeno<sup>2</sup>; Barbara Maldonado<sup>2</sup>; Sandra Herrera<sup>2</sup>; Sergio Santana<sup>3</sup></p><p><sup>1</sup>Ecuadorian Society for the Fight against Cancer, Guayaquil, Guayas; <sup>2</sup>SOLCA, Guayaquil, Guayas; <sup>3</sup>University of Havana, La Habana, Ciudad de la Habana</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> In 2022, the International Agency for Research on Cancer-Globocan reported nearly 20 million new cases of cancer worldwide, with 30,888 cases in Ecuador. Breast, prostate, and stomach cancers were the most diagnosed types. Oncohematological diseases (OHD) significantly affect the nutritional status of patients. The ELAN Ecuador-2014 study, involving over 5,000 patients, found malnutrition in 37% of participants overall, rising to 65% among those with OHD. The Latin American Study of Malnutrition in Oncology (LASOMO), conducted by FELANPE between 2019 and 2020, revealed a 59.1% frequency of malnutrition among 1,842 patients across 52 health centers in 10 Latin American countries. This study aims to present the current state of malnutrition associated with OHD among patients treated in Ecuadorian hospitals.</p><p><b>Methods:</b> The Ecuadorian segment of the LASOMO Study was conducted between 2019 and 2020, as part of the previously mentioned regional epidemiological initiative. This study was designed as a one-day, nationwide, multicenter survey involving health centers and specialized services for patients with OHD across five hospitals located in the provinces of Guayas (3), Manabí (1), and Azuay (1). The nutritional status of patients with OHD was assessed using the B + C scores from Detsky et al.'s Subjective Global Assessment (SGA). This study included male and female patients aged 18 years and older, admitted to clinical, surgical, intensive care, and bone marrow transplant (BMT) units during October and November 2019. Participation was voluntary, and patients provided informed consent by signing a consent form. Data were analyzed using location, dispersion, and aggregation statistics based on variable types. The nature and strength of relationships were assessed using chi-square tests for independence, with a significance level of &lt; 5% to identify significant associations.
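To illustrate the chi-square test of independence described here, together with the accompanying odds-ratio and 95% confidence interval calculation, a minimal Python sketch follows; the 2×2 counts are hypothetical, not study data.

```python
# Minimal sketch of a chi-square test of independence and an odds ratio with a
# 95% CI, as described in the Methods; the 2x2 counts are hypothetical.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: characteristic present / absent; columns: malnourished / well-nourished.
table = np.array([[120, 130],
                  [74, 66]])

chi2, p_value, dof, expected = chi2_contingency(table)

a, b = table[0]
c, d = table[1]
odds_ratio = (a * d) / (b * c)
se_log_or = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
ci_low, ci_high = np.exp(np.log(odds_ratio) + np.array([-1.96, 1.96]) * se_log_or)

print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}")
print(f"OR = {odds_ratio:.2f} (95% CI {ci_low:.2f} to {ci_high:.2f})")
```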
Odds ratios for malnutrition were calculated along with their associated 95% confidence intervals.</p><p><b>Results:</b> The study enrolled 390 patients, 63.6% women and 36.4% men, with a mean age of 55.3 ± 16.5 years; 47.2% were aged 60 years or older. The most common tumor locations included kidneys, urinary tract, uterus, ovaries, prostate, and testicles, accounting for 18.7% of all cases (refer to Table 1). Chemotherapy was the predominant oncological treatment, administered to 42.8% of the patients surveyed. Malnutrition affected 49.7% of the patients surveyed, with 14.4% categorized as severely malnourished (see Figure 1). The frequency of malnutrition was found to be independent of age, educational level, tumor location, and current cytoreductive treatment (refer to Table 2). Notably, the majority of the malnourished individuals were men.</p><p><b>Conclusion:</b> Malnutrition is highly prevalent in patients treated for OHD in Ecuadorian hospitals.</p><p><b>Table 1.</b> Most Frequent Location of Neoplastic Disease in Hospitalized Ecuadorian Patients and Type of Treatment Received. The number and {in brackets} the percentage of patients included in the corresponding category are presented.</p><p></p><p><b>Table 2.</b> Distribution of Malnutrition Associated with Cancer According to Selected Demographic, Clinical, and Health Characteristics of Patients Surveyed During the Oncology Malnutrition Study in Ecuador. The number and {in brackets} the percentage of malnourished patients included in each characteristic category are presented. The frequency of malnutrition was estimated using the Subjective Global Assessment (Detsky et al., 1987).</p><p></p><p></p><p><b>Figure 1.</b> State of Malnutrition Among Patients Treated for Cancer in Hospitals in Ecuador.</p><p>Ranna Modir, MS, RD, CNSC, CDE, CCTD<sup>1</sup>; Christina Salido, RD<sup>1</sup>; William Hiesinger, MD<sup>2</sup></p><p><sup>1</sup>Stanford Healthcare, Stanford, CA; <sup>2</sup>Stanford Medicine, Stanford, CA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Cardiovascular Intensive Care Unit (CVICU) patients are at high risk for significant nutrient deficits, especially during the early intensive care unit (ICU) phase, when postoperative complications and hemodynamic instability are prevalent. These deficits can exacerbate the catabolic state, leading to muscle wasting, impaired immune function, and delayed recovery. A calorie deficit &gt; 10,000 kcal combined with meeting &lt; 80% of nutritional needs in the early ICU phase (first 14 days) has been linked to worse outcomes, including prolonged intubation, increased ICU length of stay (LOS), and higher risk of organ dysfunction and infections such as central line-associated bloodstream infections (CLABSI). When evaluating CLABSI risk factors, the role of nutrition adequacy and malnutrition is often underestimated and overlooked, with more emphasis placed on the type of nutrition support (NS) provided, whether enteral nutrition (EN) or parenteral nutrition (PN). Historically, there has been a practice of avoiding PN to reduce CLABSI risk, rather than ensuring that nutritional needs are fully met. This practice is based on initial reports from decades ago linking PN with hospital-acquired infections. However, updated guidelines based on modern data now indicate no difference in infectious outcomes with EN vs PN. In fact, adding PN when EN alone is insufficient can help reduce nutritional deficiencies, supporting optimal immune response and resilience to infection.
As part of an ongoing NS audit in our CVICU, we reviewed all CLABSI cases over a 23-month period to assess nutrition adequacy and the modality of NS (EN vs PN) provided.</p><p><b>Methods:</b> Data were extracted from electronic medical records for all CLABSI cases from September 2020 to July 2022. Data collected included patient characteristics and clinical and nutrition outcomes (Table 1). Descriptive statistics (means, standard deviations, frequencies) were calculated. Chi-square or Fisher's exact test assessed the association between type of NS and meeting &gt;80% of calorie/protein targets within 14 days and until CLABSI onset. A significance level of 0.05 was used.</p><p><b>Results:</b> Over the 23-month period, 28/51 (54.9%) patients with a CLABSI required exclusive NS throughout their entire CVICU stay; 18 (64.3%) were male, median age was 54.5 years, mean BMI was 27.4, and median CVICU LOS was 49.5 days, with a 46.4% mortality rate. Surgical intervention was indicated in 60.7% of patients, with 41.2% requiring preoperative extracorporeal membrane oxygenation (ECMO) and 52.9% postoperative ECMO (Table 1). The majority of patients received exclusive EN (53.6%), with 46.4% receiving EN + PN and 0% exclusive PN. In the first 14 ICU days, 21.4% met &gt;80% of calorie needs, 32.1% met &gt;80% of protein needs, and 32.1% had a calorie deficit &gt;10,000 kcal. There was no difference between types of NS in the ability to meet &gt;80% of nutrient targets in the first 14 days (Table 1, p = 0.372, p = 0.689). The majority of PN (61.5%) was initiated after ICU day 7. From ICU day 1 until CLABSI onset, the EN + PN group was better able to meet &gt;80% of calorie targets vs exclusive EN (p = 0.016). 50% were diagnosed with malnutrition. 82% required ECMO cannulas and 42.9% a dialysis triple-lumen catheter. Enterococcus faecalis was the most common organism for the EN (43.7%) and EN + PN groups (35.7%) (Table 2).</p><p><b>Conclusion:</b> This single-center analysis of CVICU CLABSI patients found that the majority of patients requiring exclusive NS failed to meet &gt;80% of nutrition needs during the early ICU phase. Exclusive EN was the primary mode of NS compared to EN + PN or PN alone, challenging the assumption that PN inherently increases CLABSI risk. In fact, EN + PN improved the ability to meet calorie targets until CLABSI onset. These findings suggest that early nutrient deficits may increase CLABSI risk and that the risk is not dependent on the type of NS provided.</p><p><b>Table 1.</b> Patient Characteristics, Clinical and Nutritional Outcomes.</p><p></p><p><b>Table 2.</b> Type of Central Access Device and Microorganism in Relation to Modality of Nutrition Support Provided.</p><p></p><p>Oki Yonatan, MD<sup>1</sup>; Faya Nuralda Sitompul<sup>2</sup></p><p><sup>1</sup>ASPEN, Jakarta, Jakarta Raya; <sup>2</sup>Osaka University, Minoh, Osaka</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Ginseng, widely used as a functional food or therapeutic supplement in Asia, contains bioactive compounds such as ginsenosides, which exert a range of biological effects, including hypoglycemic, anti-inflammatory, cardioprotective, and anti-tumor properties. However, studies have indicated that ginseng also has anticoagulant and anti-aggregation effects and may be associated with bleeding. This case report presents a potential case of ginseng-induced bleeding in an elderly patient with advanced pancreatic cancer.
Case Description: A 76-year-old male with stage IV pancreatic cancer and metastases to the liver, lymph nodes, and peritoneum was in home care with BIPAP ventilation, NGT feed, ascites drainage, and foley catheter. He had a history of type A aortic dissection repair, anemia, and thrombocytopenia, with platelet counts consistently below 50,000/µL. Despite no history of anticoagulant use, the patient developed massive gastrointestinal bleeding and hematuria after consuming 100 grams of American Ginseng (AG) per day for five days. Disseminated intravascular coagulation (DIC) was initially suspected, but no signs of bleeding were observed until the third week of care, coinciding with ginseng consumption. Endoscopy was not performed due to the patient's unstable condition and the family's refusal. Discussion: The consumption of AG may have triggered bleeding due to the patient's already unstable condition and low platelet count. Ginsenosides, particularly Rg1, Rg2, and Rg3, have been shown to exert anticoagulant effects, prolonging clotting times and inhibiting platelet aggregation. Studies have demonstrated that AG extracts can significantly extend clotting times and reduce platelet activity, potentially contributing to the observed bleeding. Conclusion: This case highlights the possible role of AG in inducing severe bleeding in a patient with pancreatic cancer and thrombocytopenia. Given ginseng's known anticoagulant properties, caution should be exercised when administering it to patients with hematological abnormalities or bleeding risks, and further research is warranted to assess its safety in these populations.</p><p><b>Methods:</b> None Reported.</p><p><b>Results:</b> None Reported.</p><p><b>Conclusion:</b> None Reported.</p><p>Kursat Gundogan, MD<sup>1</sup>; Mary Nellis, PhD<sup>2</sup>; Nurhayat Ozer, PhD<sup>3</sup>; Sahin Temel, MD<sup>3</sup>; Recep Yuksel, MD<sup>4</sup>; Murat Sungar, MD<sup>5</sup>; Dean Jones, PhD<sup>2</sup>; Thomas Ziegler, MD<sup>6</sup></p><p><sup>1</sup>Division of Clinical Nutrition, Erciyes University Health Sciences Institute, Kayseri; <sup>2</sup>Emory University, Atlanta, GA; <sup>3</sup>Erciyes University Health Sciences Institute, Kayseri; <sup>4</sup>Department of Internal Medicine, Erciyes University School of Medicine, Kayseri; <sup>5</sup>Department of Internal Medicine, Erciyes University School of Medicine, Kayseri; <sup>6</sup>Emory Healthcare, Atlanta, GA</p><p><b>Financial Support:</b> Erciyes University Scientific Research Committee (TSG- 2021–10078) and TUBITAK 2219 program (1059B-192000150), each to KG, and National Institutes of Health grant P30 ES019776, to DPJ and TRZ.</p><p><b>Background:</b> Metabolomics represents a promising profiling technique for the investigation of significant metabolic alterations that arise in response to critical illness. The present study utilized plasma high-resolution metabolomics (HRM) analysis to define systemic metabolism associated with common critical illness severity scores in critically ill adults.</p><p><b>Methods:</b> This cross-sectional study performed at Erciyes University Hospital, Kayseri, Turkiye and Emory University, Atlanta, GA, USA. Participants were critically ill adults with an expected length of intensive care unit stay longer than 48 h. Plasma for metabolomics was obtained on the day of ICU admission. 
Data were analyzed using regression analysis of two ICU admission illness severity scores (APACHE II and mNUTRIC) against all plasma metabolomic features in metabolome-wide association studies (MWAS). APACHE II score was analyzed as a continuous variable, and mNUTRIC score was analyzed as a dichotomous variable [≤4 (low) vs. &gt; 4 (high)]. Pathway enrichment analysis was performed on significant metabolites (raw p &lt; 0.05) related to each of the two illness severity scores independently.</p><p><b>Results:</b> A total of 77 patients were included. The mean age was 69 years (range 33-92 years); 65% were female. More than 15,000 metabolomic features were identified for the MWAS of the APACHE II and mNUTRIC scores. Metabolic pathways significantly associated with APACHE II score at ICU admission included C21-steroid hormone biosynthesis, the urea cycle, and vitamin E, seleno amino acid, aspartate/asparagine, and thiamine metabolism. Metabolic pathways associated with mNUTRIC score at ICU admission were N-glycan degradation and metabolism of fructose/mannose, vitamin D3, pentose phosphate, sialic acid, and linoleic acid. Within the significant pathways, the gut microbiome-derived metabolites hippurate and N-acetyl ornithine were downregulated, and creatine and glutamate were upregulated, with increasing APACHE II scores. Metabolites involved in energy metabolism that were altered with a high (&gt; 4) mNUTRIC score included N-acetylglucosamine (increased) and gluconate (decreased).</p><p><b>Conclusion:</b> Plasma HRM identified significant associations between two commonly used illness severity scores and metabolic processes involving steroid biosynthesis, the gut microbiome, skeletal muscle, and amino acid, vitamin, and energy metabolism in adult critically ill patients.</p><p>Hilary Winthrop, MS, RD, LDN, CNSC<sup>1</sup>; Megan Beyer, MS, RD, LDN<sup>2</sup>; Jeroen Molinger, PhDc<sup>3</sup>; Suresh Agarwal, MD<sup>4</sup>; Paul Wischmeyer, MD, EDIC, FCCM, FASPEN<sup>5</sup>; Krista Haines, DO, MA<sup>4</sup></p><p><sup>1</sup>Duke Health, Durham, NC; <sup>2</sup>Duke University School of Medicine- Department of Anesthesiology, Durham, NC; <sup>3</sup>Duke University Medical Center - Department of Anesthesiology - Duke Heart Center, Raleigh, NC; <sup>4</sup>Duke University School of Medicine, Durham, NC; <sup>5</sup>Duke University Medical School, Durham, NC</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Little evidence currently exists on the effect of body mass index (BMI) on resting energy expenditure (REE). Current clinical practice guidelines are based primarily on expert opinion and provide a wide range of calorie recommendations without delineations specific to BMI categories, making it difficult for the clinician to accurately prescribe calorie needs for their hospitalized patients. This abstract utilizes metabolic cart data from studies conducted at a large academic healthcare system to investigate trends in BMI and REE.</p><p><b>Methods:</b> A pooled cohort of hospitalized patients was compiled from three clinical trials where metabolic cart measurements were collected. In all three studies, indirect calorimetry was initially conducted in the intensive care setting, with follow-up measurements conducted as clinically able.
Variables included in the analysis were measured resting energy expenditure (mREE) in total kcals as well as kcals per kilogram of body weight on ICU admission, respiratory quotient, days post ICU admission, and demographics and clinical characteristics. ANOVA tests were utilized to analyze continuous data.</p><p><b>Results:</b> A total of 165 patients were included in the final analysis, with 338 indirect calorimetry measurements. Disease groups included COVID-19 pneumonia, non-COVID respiratory failure, surgical ICU, cardiothoracic surgery, and trauma. The average age of patients was 58 years, with 96 males (58.2%) and 69 females (41.8%), and an average BMI of 29.0 kg/m<sup>2</sup>. The metabolic cart measurements on average were taken on day 8 post ICU admission (ranging from day 1 to day 61). See Table 1 for more demographics and clinical characteristics. Indirect calorimetry measurements were grouped into three BMI categories: BMI ≤ 29.9 (normal), BMI 30-39.9 (obese), and BMI ≥ 40 (super obese). ANOVA analyses showed statistical significance amongst the three BMI groups in both total kcals (p &lt; 0.001) and kcals per kg (p &lt; 0.001). The normal BMI group had an average mREE of 1632 kcals (range of 767 to 4023), compared to 1868 kcals (range of 1107 to 3754) in the obese BMI group, and 2004 kcals (range of 1219 to 3458) in the super obese BMI group. Similarly, when analyzing kcals per kg, the normal BMI group averaged 23.3 kcals/kg, the obese BMI group 19.8, and the super obese BMI group 16.3.</p><p><b>Conclusion:</b> Without access to a metabolic cart to accurately measure REE, the majority of nutrition clinicians are left to estimations. Current clinical guidelines and published data do not provide the guidance that is necessary to accurately feed many hospitalized patients. This current analysis only scratches the surface of the metabolic demands of different patient populations based on their BMI status, especially given the wide ranges of energy expenditures. Robust studies are needed to further elucidate the relationships between BMI and REE in different disease states.</p><p><b>Table 1.</b> Demographics and Clinical Characteristics.</p><p></p><p></p><p><b>Figure 1.</b> Average Measured Resting Energy Expenditure (in Total Kcals), by BMI Group.</p><p></p><p><b>Figure 2.</b> Average Measured Resting Energy Expenditure (in kcals/kg), by BMI Group.</p><p>Carlos Reyes Torres, PhD, MSc<sup>1</sup>; Daniela Delgado Salgado, Dr<sup>2</sup>; Sergio Diaz Paredes, Dr<sup>1</sup>; Sarish Del Real Ordoñez, Dr<sup>1</sup>; Eva Willars Inman, Dr<sup>1</sup></p><p><sup>1</sup>Hospital Oncológico de Coahuila (Oncological Hospital of Coahuila), Saltillo, Coahuila de Zaragoza; <sup>2</sup>ISSSTE, Saltillo, Coahuila de Zaragoza</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Chemotherapy is one of the principal treatments for cancer, and some degree of toxicity has been described in 98% of patients. Changes in body composition are frequent and are related to worse outcomes. Low muscle mass is associated with chemotherapy toxicity in observational studies. Phase angle (PhA) is an indicator of cell integrity and positively correlates with adequate nutritional status and muscle mass. Few studies have evaluated the association between PhA and chemotherapy toxicity.
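<p>For orientation only: PhA is derived directly from the raw impedance measurements, so a minimal sketch, in Python, of the standard calculation from whole-body resistance (R) and reactance (Xc) is shown below; the input values are purely illustrative and are not taken from this study:</p>
<pre><code>
import math

def phase_angle(resistance_ohm, reactance_ohm):
    """Phase angle in degrees from whole-body resistance and reactance."""
    return math.degrees(math.atan(reactance_ohm / resistance_ohm))

# Illustrative values; the result (about 4.4 degrees) would fall below the
# 4.7-degree cutoff used to define a low PhA in this study.
print(round(phase_angle(resistance_ohm=620.0, reactance_ohm=48.0), 2))
</code></pre>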
The aim of this study was to evaluate the association of PhA, body composition, and chemotherapy toxicity in cancer patients.</p><p><b>Methods:</b> A prospective cohort study was conducted in adult patients with solid neoplastic disease receiving first-line systemic treatments. The subjects were evaluated at the first chemotherapy treatment using bioelectrical impedance analysis with an RJL device, according to the standardized technique. The outcome was chemotherapy toxicity during the first 4 cycles of chemotherapy and its association with PhA and body composition. Toxicity was evaluated using National Cancer Institute (NCI) common terminology criteria for adverse events version 5.0. A PhA &lt; 4.7 was considered low, consistent with other studies.</p><p><b>Results:</b> A total of 54 patients were evaluated and included in the study. The most common cancer diagnoses were breast cancer (40%), gastrointestinal tumors (33%), and lung cancer (18%). Chemotherapy toxicity was present in 46% of the patients. The most common adverse effects were gastrointestinal (48%), blood disorders (32%), and metabolic disorders (40%). There were statistically significant differences in PhA between patients with chemotherapy toxicity and patients without adverse effects: 4.45º (3.08-4.97) vs 6.07º (5.7-6.2), respectively (p &lt; 0.001). PhA was associated with the risk of chemotherapy toxicity (HR 8.7, 95% CI 6.1-10.7; log-rank test p = 0.02).</p><p><b>Conclusion:</b> PhA was associated with the risk of chemotherapy toxicity in cancer patients.</p><p>Lizl Veldsman, RD, M Nutr, BSc Dietetics<sup>1</sup>; Guy Richards, MD, PhD<sup>2</sup>; Carl Lombard, PhD<sup>3</sup>; Renée Blaauw, PhD, RD<sup>1</sup></p><p><sup>1</sup>Division of Human Nutrition, Department of Global Health, Faculty of Medicine &amp; Health Sciences, Stellenbosch University, Cape Town, Western Cape; <sup>2</sup>Department of Surgery, Division of Critical Care, Faculty of Health Sciences, University of the Witwatersrand, Johannesburg, Gauteng; <sup>3</sup>Division of Epidemiology and Biostatistics, Department of Global Health, Faculty of Medicine and Health Sciences, Stellenbosch University, Cape Town, Western Cape</p><p><b>Financial Support:</b> Fresenius Kabi JumpStart Research Grant.</p><p><b>Background:</b> Critically ill patients lose a significant amount of muscle mass over the first ICU week. We sought to determine the effect of bolus amino acid (AA) supplementation on the urea-to-creatinine ratio (UCR) trajectory over time, and whether UCR correlates with histology myofiber cross-sectional area (CSA) as a potential surrogate marker of muscle mass.</p><p><b>Methods:</b> This was a secondary analysis of data from a registered clinical trial (ClinicalTrials.gov NCT04099108) undertaken in a predominantly trauma surgical ICU. Participants were randomly assigned to two groups, both of which received standard care nutrition (SCN) and mobilisation. Study participants randomised to the intervention group also received a bolus AA supplement, with a 45-minute in-bed cycling session, from average ICU day 3 for a mean of 6 days. The change in vastus lateralis myofiber CSA, measured from pre-intervention (average ICU day 2) to post-intervention (average ICU day 8), was assessed through biopsy and histological analysis. A linear mixed-effects regression model was used to compare the mean daily UCR profiles between study groups from ICU day 0 to day 10.
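<p>A minimal sketch, in Python with statsmodels, of this kind of mixed-effects comparison; the file name, column names, and random-intercept structure are assumptions for illustration and are not the trial's actual analysis code:</p>
<pre><code>
import pandas as pd
import statsmodels.formula.api as smf

# Long-format data: one row per patient per ICU day, with the daily UCR,
# study arm, and a patient identifier (hypothetical file and column names).
df = pd.read_csv("ucr_daily.csv")  # columns: patient_id, day, group, ucr

# Random intercept per patient; fixed effects for day, group, and their
# interaction, which carries the intervention effect on the UCR trajectory.
model = smf.mixedlm("ucr ~ day * group", data=df, groups=df["patient_id"])
result = model.fit()
print(result.summary())
</code></pre>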
As a sensitivity analysis, we adjusted for disease severity on admission (APACHE II and SOFA scores), daily fluid balance, and the presence of acute kidney injury (AKI). A spearman correlation compared the UCR on ICU day 2 (pre-intervention) and ICU day 8 (post-intervention) with the corresponding myofiber CSA.</p><p><b>Results:</b> A total of 50 enrolled participants were randomised to the study intervention and control groups in a 1:1 ratio. The control and intervention groups received, on average, 87.62 ± 32.18 and 85.53 ± 29.29 grams of protein per day (1.26 ± 0.41 and 1.29 ± 0.40 g/kg/day, respectively) from SCN, and the intervention group an additional 30.43 ± 5.62 grams of AA (0.37 ± 0.06 g/kg protein equivalents) from the AA supplement. At baseline, the mean UCR for the control (75.6 ± 31.5) and intervention group (63.8 ± 27.1) were similar. Mean UCR increased daily from baseline in both arms, but at a faster rate in the intervention arm with a significant intervention effect (p = 0.0127). The UCR for the intervention arm at day 7 and 8 was significantly higher by 21 and 22 units compared to controls (p = 0.0214 and p = 0.0215, respectively). After day 7 the mean daily UCR plateaued in the intervention arm, but not in the controls. Adjusting for disease severity, daily fluid balance, and AKI did not alter the intervention effect. A significant negative association was found between UCR and myofiber CSA (r = -0.39, p = 0.011) at ICU day 2 (pre-intervention), but not at day 8 (post-intervention) (r = 0.23, p = 0.153).</p><p><b>Conclusion:</b> Bolus amino acid supplementation significantly increases the UCR during the first ICU week, thereafter plateauing. UCR at baseline may be an indicator of muscle status.</p><p></p><p><b>Figure 1.</b> Change in Urea-to-Creatinine Ratio (UCR) Over ICU Days in the Control and Intervention Group. Error bars Represent 95% Confidence Intervals (CIs).</p><p>Paola Renata Lamoyi Domínguez, MSc<sup>1</sup>; Iván Osuna Padilla, PhD<sup>2</sup>; Lilia Castillo Martínez, PhD<sup>3</sup>; Josué Daniel Cadeza-Aguilar, MD<sup>2</sup>; Martín Ríos-Ayala, MD<sup>2</sup></p><p><sup>1</sup>UNAM, National Autonomous University of Mexico, Mexico City, Distrito Federal; <sup>2</sup>National Institute of Respiratory Diseases, Mexico City, Distrito Federal; <sup>3</sup>National Institute of Medical Sciences and Nutrition Salvador Zubirán, Mexico City, Distrito Federal</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Non-defecation (ND) is highly prevalent in critically ill patients on mechanical ventilation (MV) and has been reported in up to 83% of cases. This disorder is associated with high morbidity and mortality rates. Most existing research has focused on the association between clinical data and non-defecation; however, there is a lack of evidence regarding its association with dietary factors. We aimed to analyze the association between dietary fiber in enteral nutrition and the amount of fluids administered through enteral and parenteral routes with defecation during the first 6-days of MV in critically ill patients with pneumonia and other lung manifestations.</p><p><b>Methods:</b> We conducted a longitudinal analysis of MV patients receiving enteral nutrition (EN) at a tertiary care hospital in Mexico City between May 2023 and April 2024. 
The inclusion criteria were age &gt;18 years, MV, admission to the respiratory intensive care unit (ICU) for pneumonia or other lung manifestations, and nutritional assessment performed during the first 24 h after ICU admission. Patients who required parenteral nutrition or major surgery, or who had traumatic brain injury or neuromuscular disorders, were excluded from this study. A nutritional assessment, including the NUTRIC score, SOFA and APACHE II assessments, and an estimation of energy-protein requirements, was performed by trained dietitians within the first 24 hours after ICU admission. During each day of follow-up (0 to 6 days), we recorded the amount of fiber provided in EN, the volume of infusion fluids administered by the enteral and parenteral routes, and medical prescriptions of opioids, sedatives, neuromuscular blockers, and vasopressors. ND was defined as &gt;6 days without defecation from ICU admission. The differences between ND and defecation were also assessed. The association of ND with dietary factors was examined using discrete-time survival analysis.</p><p><b>Results:</b> Seventy-four patients were included; ND was observed in 40 patients (54%). The non-defecation group had a longer ICU length of stay, and 50% of this group did not have their first defecation until day 10. No differences in fiber provision or volume of infusion fluids were observed between the groups. In multivariate analysis, no associations between ND and fiber (fiber intake 10 to 20 g per day: OR 1.17, 95% CI 0.41-3.38, p = 0.29) or total fluids (fluid intake 25 to 30 ml/kg/d: OR 1.85, 95% CI 0.44-7.87, p = 0.404) were observed.</p><p><b>Conclusion:</b> Non-defecation affected 54% of the study population. Although fiber and fluids are considered a treatment for non-defecation, we did not find an association in critically ill patients.</p><p><b>Table 1.</b> Demographic and Clinical Characteristics by Groups.</p><p></p><p><b>Table 2.</b> Daily Comparison of Dietary Factors.</p><p></p><p>Andrea Morand, MS, RDN, LD<sup>1</sup>; Osman Mohamed Elfadil, MBBS<sup>1</sup>; Kiah Graber, RDN<sup>1</sup>; Yash Patel, MBBS<sup>1</sup>; Suhena Patel, MBBS<sup>1</sup>; Chloe Loersch, RDN<sup>1</sup>; Isabelle Wiggins, RDN<sup>1</sup>; Anna Santoro, MS, RDN<sup>1</sup>; Natalie Johnson, MS<sup>1</sup>; Kristin Eckert, MS, RDN<sup>1</sup>; Dana Twernbold, RDN<sup>1</sup>; Dacia Talmo, RDN<sup>1</sup>; Elizabeth Engel, RRT, LRT<sup>1</sup>; Avery Erickson, MS, RDN<sup>1</sup>; Alex Kirby, MS, RDN<sup>1</sup>; Mackenzie Vukelich, RDN<sup>1</sup>; Kate Sandbakken, RDN<sup>1</sup>; Victoria Vasquez, RDN<sup>1</sup>; Manpreet Mundi, MD<sup>1</sup></p><p><sup>1</sup>Mayo Clinic, Rochester, MN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Current guidelines for nutrition support in critically ill patients recommend the utilization of indirect calorimetry (IC) to determine energy needs. However, IC testing is limited at many institutions due to accessibility, labor, and costs, which leads to reliance on predictive equations to determine caloric targets. A quality improvement (QI) initiative was implemented to assess the impact on nutrition care when IC is routinely completed.</p><p><b>Methods:</b> A prospective QI project in selected medical and surgical intensive care units (ICU) included critically ill patients assessed by the dietitian within 24-48 hours of consult order or by hospital day 4.
Patients with contraindications to IC were excluded, including those requiring ECMO, CRRT, MARS, or FiO2 &gt; 55, as well as spontaneously breathing patients requiring significant supplemental oxygen. The initial caloric target was established utilizing predictive equations, which is the standard of care at our institution. After day 4 of the ICU stay, IC measurements, predictive equations, and weight-based nomograms were collected. The predictive equations utilized included Harris-Benedict (HB) - basal, adjusted HB (75% of basal when body mass index (BMI) &gt; 30), Penn State if ventilated, Mifflin St. Jeor (MSJ), revised HB, and revised HB adjusted (75% of basal when BMI &gt; 30). Additional demographic, anthropometric, and clinical data were collected.</p><p><b>Results:</b> Patients (n = 85) were predominantly male (n = 53, 62.4%), admitted to the surgical ICU (n = 57, 67.1%), and overweight (mean BMI 29.8 kg/m<sup>2</sup>), with an average age of 61.3 years (SD 16.5). At the time of the IC test, the median ICU length of stay was 6 days; 77.6% (n = 66) were supported with mechanical ventilation, and median ventilator days were 4 days (Table 1). Mean IC-measured REE was compared to the predictive equations, showing that, except for the weight-based nomogram for high caloric needs (p = 0.3615), all equations were significantly lower than IC (p &lt; 0.0001). Median absolute differences from IC for the evaluated predictive equations are shown in Figure 1. After REE was measured, caloric goals increased significantly for patients on enteral (EN) and parenteral nutrition (PN) (p = 0.0016 and p = 0.05, respectively). In enterally fed patients, the mean calorie goal before REE was 1655.4 kcal (SD 588) and after REE was 1917.6 kcal (SD 528.6), an average increase of 268.4 kcal/day; in parenterally fed patients, the mean calorie goal before REE was 1395.2 kcal (SD 313.6) and after REE was 1614.1 kcal (SD 239.3), an average increase of 167.5 kcal (Table 2). The mean REE per BMI category per actual body weight was BMI &lt; 29.9 = 25.7 ± 7.9 kcal/kg, BMI 30-34.9 = 20.3 ± 3.8 kcal/kg, BMI 35-39.9 = 22.8 ± 4.6 kcal/kg, and BMI ≥ 40 = 16.3 ± 2.9 kcal/kg (25.4 ± 10.5 kcal/kg of ideal body weight). Figure 2 illustrates the average daily calorie need broken down by BMI for IC and the examined predictive equations.</p><p><b>Conclusion:</b> There was a significant difference between IC measurements and various predictive equations except for weight-based high-estimated calorie needs. Nutrition goals changed significantly in response to IC measurements. It is recommended that we expand the use of IC in the critically ill population at our institution.
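<p>For context on how equation-based estimates are generated when IC is unavailable, a minimal sketch, in Python, of one of the predictive equations examined above (Mifflin-St Jeor) alongside a simple weight-based estimate; the patient values and the 25 kcal/kg target are illustrative assumptions only:</p>
<pre><code>
def mifflin_st_jeor(weight_kg, height_cm, age_yr, male):
    """Resting energy expenditure (kcal/day) by the Mifflin-St Jeor equation."""
    base = 10 * weight_kg + 6.25 * height_cm - 5 * age_yr
    return base + 5 if male else base - 161

def weight_based(weight_kg, kcal_per_kg=25.0):
    """Simple weight-based nomogram estimate (the kcal/kg target is an assumption)."""
    return kcal_per_kg * weight_kg

# Illustrative patient roughly resembling the cohort averages reported above.
print(mifflin_st_jeor(weight_kg=90, height_cm=175, age_yr=61, male=True))  # 1693.75 kcal
print(weight_based(weight_kg=90))                                          # 2250.0 kcal
</code></pre>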
In settings where IC is not possible, weight-based nomograms should be utilized.</p><p><b>Table 1.</b> Baseline Demographics and Clinical Characteristics.</p><p></p><p><b>Table 2.</b> Nutrition Support.</p><p></p><p></p><p><b>Figure 1.</b> Difference in Daily Calorie Estimation Utilizing IC Compared to Predictive Equations.</p><p></p><p><b>Figure 2.</b> RMR by IC and Other Predictive Equations by BMI.</p><p><b>GI, Obesity, Metabolic, and Other Nutrition Related Concepts</b></p><p>Suhena Patel, MBBS<sup>1</sup>; Osman Mohamed Elfadil, MBBS<sup>1</sup>; Yash Patel, MBBS<sup>1</sup>; Chanelle Hager, RN<sup>1</sup>; Manpreet Mundi, MD<sup>1</sup>; Ryan Hurt, MD, PhD<sup>1</sup></p><p><sup>1</sup>Mayo Clinic, Rochester, MN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Chronic capillary leak syndrome is a rare but potentially fatal side effect of immunotherapy for some malignancies, mainly manifested with intractable generalized edema and often refractory hypotension. An idiopathic type of the syndrome is also known. It can be diagnosed by exclusion in patients with a single or recurrent episode of intravascular hypovolemia or generalized edema, primarily manifested by the diagnostic triad of hypotension, hemoconcentration, and hypoalbuminemia in the absence of an identifiable alternative cause. Supportive care with a role for steroids remains the standard treatment. In capillary leak syndrome secondary to cancer immune therapy, discontinuing the offending agent is typically considered. This relatively rare syndrome can be associated with significant clinical challenges. This clinical case report focuses on aspects of nutrition care.</p><p><b>Methods:</b> A 45-year-old male with a past medical history of hypertension, pulmonary tuberculosis in childhood, and rectal cancer was admitted for evaluation of anasarca. He was first diagnosed with moderately differentiated invasive adenocarcinoma of the rectum, stage IIIb (cT3, cN1, cM0), in October 2022. As initial therapy, he was enrolled in a clinical trial. He received 25 cycles of immunotherapy with the study drug Vudalimab (PD1/CTLA4 bispecific antibody), achieving a complete clinical response without additional chemotherapy, radiation, or surgery. Unfortunately, since November 2023 he has developed extensive capillary leak syndrome manifested by recurrent anasarca, chylous ascites, and pleural effusions. His treatment was also complicated by the development of thyroiditis and insulin-dependent diabetes. The patient most recently presented with abdominal fullness, ascites, and peripheral edema that did not improve despite diuretic therapy. A diagnostic and therapeutic paracentesis was performed, and chylous ascites were revealed. Two weeks later, the patient presented with re-accumulation of ascites and worsening anasarca with pleural and pericardial effusion. A PET-CT was then negative for malignant lesions but revealed increased uptake along the peritoneal wall, suggestive of peritonitis. A lymphangiogram performed for further evaluation revealed no gross leak/obstruction; however, this study could not rule out a microleak from increased capillary permeability. He ultimately required bilateral pleural and peritoneal drains (output ranged from 0.5 to 1 L daily). A diagnosis of capillary leak syndrome was made.
In addition to Octreotide, immunosuppression therapy was initiated with IV methylprednisolone (40 mg BID), followed by a transition to oral prednisone (60 mg); however, the patient's symptoms reappeared with the dose reduction and transition to the oral route. His immunosuppression regimen was modified to include a trial of weekly IVIG and IV albumin twice daily. From a nutritional perspective, he was initially on a routine oral diet, but his drain output increased, specifically after fatty food consumption, so he was switched to a low-fat (40 g/day), high-protein diet to prevent worsening chylous ascites. In the setting of worsening anasarca and moderate malnutrition based on ASPEN criteria, along with significant muscle loss clinically, he was started on TPN. A no-fat diet was initiated to minimize lymphatic flow, with subsequent improvement in his chest tube output volume, followed by a transition to home parenteral nutrition with mixed oil and an oral diet.</p><p><b>Results:</b> None Reported.</p><p><b>Conclusion:</b> Chronic capillary/lymphatic leak syndrome can be challenging and may necessitate dietary modification. Along with dietary changes to significantly reduce oral fat intake, short- or long-term PN can be considered.</p><p>Kishore Iyer, MBBS<sup>1</sup>; Francisca Joly, MD, PhD<sup>2</sup>; Donald Kirby, MD, FACG, FASPEN<sup>3</sup>; Simon Lal, MD, PhD, FRCP<sup>4</sup>; Kelly Tappenden, PhD, RD, FASPEN<sup>2</sup>; Palle Jeppesen, MD, PhD<sup>5</sup>; Nader Youssef, MD, MBA<sup>6</sup>; Mena Boules, MD, MBA, FACG<sup>6</sup>; Chang Ming, MS, PhD<sup>6</sup>; Tomasz Masior, MD<sup>6</sup>; Susanna Huh, MD, MPH<sup>7</sup>; Tim Vanuytsel, MD, PhD<sup>8</sup></p><p><sup>1</sup>Icahn School of Medicine at Mount Sinai, New York, NY; <sup>2</sup>Nutritional Support, Hôpital Beaujon, Paris, Ile-de-France; <sup>3</sup>Department of Intestinal Failure and Liver Diseases, Cleveland, OH; <sup>4</sup>Salford Royal NHS Foundation Trust, Salford, England; <sup>5</sup>Department of Intestinal Failure and Liver Diseases, Rigshospitalet, Copenhagen, Denmark, Hovedstaden; <sup>6</sup>Ironwood Pharmaceuticals, Basel, Basel-Stadt; <sup>7</sup>Ironwood Pharmaceuticals, Boston, MA; <sup>8</sup>University Hospitals Leuven, Leuven, Brabant Wallon</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> American College of Gastroenterology 2024, October 25-30, 2024, Philadelphia, Pennsylvania.</p><p><b>Financial Support:</b> None Reported.</p><p><b>International Poster of Distinction</b></p><p>Francisca Joly, MD, PhD<sup>1</sup>; Tim Vanuytsel, MD, PhD<sup>2</sup>; Donald Kirby, MD, FACG, FASPEN<sup>3</sup>; Simon Lal, MD, PhD, FRCP<sup>4</sup>; Kelly Tappenden, PhD, RD, FASPEN<sup>1</sup>; Palle Jeppesen, MD, PhD<sup>5</sup>; Federico Bolognani, MD, PhD<sup>6</sup>; Nader Youssef, MD, MBA<sup>6</sup>; Carrie Li, PhD<sup>6</sup>; Reda Sheik, MPH<sup>6</sup>; Isabelle Statovci, BS, CH<sup>6</sup>; Susanna Huh, MD, MPH<sup>7</sup>; Kishore Iyer, MBBS<sup>8</sup></p><p><sup>1</sup>Nutritional Support, Hôpital Beaujon, Paris, Ile-de-France; <sup>2</sup>University Hospitals Leuven, Leuven, Brabant Wallon; <sup>3</sup>Department of Intestinal Failure and Liver Diseases, Cleveland, OH; <sup>4</sup>Salford Royal NHS Foundation Trust, Salford, England; <sup>5</sup>Department of Intestinal Failure and Liver Diseases, Rigshospitalet, Copenhagen, Denmark, Hovedstaden; <sup>6</sup>Ironwood Pharmaceuticals, Basel, Basel-Stadt; <sup>7</sup>Ironwood Pharmaceuticals, Boston, MA; <sup>8</sup>Icahn School of Medicine at
Mount Sinai, New York, NY</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> Digestive Disease Week 2024, May 18 - 21, 2024, Washington, US.</p><p><b>Financial Support:</b> None Reported.</p><p>Tim Vanuytsel, MD, PhD<sup>1</sup>; Simon Lal, MD, PhD, FRCP<sup>2</sup>; Kelly Tappenden, PhD, RD, FASPEN<sup>3</sup>; Donald Kirby, MD, FACG, FASPEN<sup>4</sup>; Palle Jeppesen, MD, PhD<sup>5</sup>; Francisca Joly, MD, PhD<sup>3</sup>; Tomasz Masior, MD<sup>6</sup>; Patricia Valencia, PharmD<sup>7</sup>; Chang Ming, MS, PhD<sup>6</sup>; Mena Boules, MD, MBA, FACG<sup>6</sup>; Susanna Huh, MD, MPH<sup>7</sup>; Kishore Iyer, MBBS<sup>8</sup></p><p><sup>1</sup>University Hospitals Leuven, Leuven, Brabant Wallon; <sup>2</sup>Salford Royal NHS Foundation Trust, Salford, England; <sup>3</sup>Nutritional Support, Hôpital Beaujon, Paris, Ile-de-France; <sup>4</sup>Department of Intestinal Failure and Liver Diseases, Cleveland, OH; <sup>5</sup>Department of Intestinal Failure and Liver Diseases, Rigshospitalet, Copenhagen, Denmark, Hovedstaden; <sup>6</sup>Ironwood Pharmaceuticals, Basel, Basel-Stadt; <sup>7</sup>Ironwood Pharmaceuticals, Boston, MA; <sup>8</sup>Icahn School of Medicine at Mount Sinai, New York, NY</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> American College of Gastroenterology 2024, October 25-30, 2024, Philadelphia, Pennsylvania.</p><p><b>Financial Support:</b> None Reported.</p><p>Boram Lee, MD<sup>1</sup>; Ho-Seong Han, PhD<sup>1</sup></p><p><sup>1</sup>Seoul National University Bundang Hospital, Seoul, Seoul-t'ukpyolsi</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Pancreatic cancer is one of the most fatal malignancies, with a 5-year survival rate of less than 10%. Despite advancements in treatment, its incidence is increasing, driven by aging populations and increasing obesity rates. Obesity is traditionally considered to be a negative prognostic factor for many cancers, including pancreatic cancer. However, the “obesity paradox” suggests that obesity might be associated with better outcomes in certain diseases. This study investigated the effect of obesity on the survival of long-term pancreatic cancer survivors after pancreatectomy.</p><p><b>Methods:</b> A retrospective analysis was conducted on 404 patients with pancreatic ductal adenocarcinoma (PDAC) who underwent surgery between January 2004 and June 2022. Patients were classified into the non-obese (BMI 18.5-24.9) (n = 313) and obese (BMI ≥ 25.0) (n = 91) groups. The data collected included demographic, clinical, perioperative, and postoperative information. Survival outcomes (overall survival [OS], recurrence-free survival [RFS], and cancer-specific survival [CSS]) were analyzed using Kaplan-Meier curves and Cox regression models. A subgroup analysis examined the impact of the visceral fat to subcutaneous fat ratio (VSR) on survival within the obese cohort.</p><p><b>Results:</b> Obese patients (n = 91) had a significantly better 5-year OS (38.9% vs. 27.9%, p = 0.040) and CSS (41.4% vs. 33%, p = 0.047) than non-obese patients. RFS did not differ significantly between the groups. Within the obese cohort, a lower VSR was associated with improved survival (p = 0.012), indicating the importance of fat distribution in outcomes.</p><p><b>Conclusion:</b> Obesity is associated with improved overall and cancer-specific survival in patients with pancreatic cancer undergoing surgery, highlighting the potential benefits of a nuanced approach to managing obese patients.
The distribution of adipose tissue, specifically higher subcutaneous fat relative to visceral fat, further influences survival, suggesting that tailored treatment strategies could enhance the outcomes.</p><p>Nicole Nardella, MS<sup>1</sup>; Nathan Gilchrist, BS<sup>1</sup>; Adrianna Oraiqat, BS<sup>1</sup>; Sarah Goodchild, BS<sup>1</sup>; Dena Berhan, BS<sup>1</sup>; Laila Stancil, HS<sup>1</sup>; Jeanine Milano, BS<sup>1</sup>; Christina Santiago, BS<sup>1</sup>; Melissa Adams, PA-C<sup>1</sup>; Pamela Hodul, MD<sup>1</sup></p><p><sup>1</sup>Moffitt Cancer Center, Tampa, FL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Pancreatic cancer (PC) is a devastating diagnosis with 66,440 new cases and 51,750 deaths estimated in 2024. The prevalence of malnutrition in patients with cancer has been reported to range from 30-85% depending on patient age, cancer type, and stage of disease. Specifically, PC patients frequently present with malnutrition which can lead to negative effects on quality of life and tumor therapy. We hypothesize that increased awareness of early nutritional intervention for PC patients has led to high utilization of dietitian consultations at our tertiary cancer center.</p><p><b>Methods:</b> This IRB-exempt retrospective review included newly diagnosed, treatment naïve PC patients presenting to our institution in 2021-2023 (n = 701). We define newly diagnosed as having pathologically confirmed adenocarcinoma within 30 days of clinic presentation. Patients were screened for weight loss and risk of malnutrition using the validated Malnutrition Screening Tool (MST) (89.5 positive predictive value) at the initial consultation and were referred to a dietitian based on risk or patient preference. Data was collected on demographics, disease stage (localized vs metastatic), tumor location (head/neck/uncinate, body/tail, multi-focal), presenting symptoms (percent weight loss, abdominal pain, bloating, nausea/vomiting, fatigue, change in bowel habits), experience of jaundice, pancreatitis, and gastric outlet obstruction, and dietitian consultation. Descriptive variables and Fisher's exact test were used to report outcomes.</p><p><b>Results:</b> The majority of patients were male (54%) with median age of 70 (27-95). About half of patients had localized disease (54%) with primary tumor location in the head/neck/uncinate region (57%). Patients with head/neck/uncinate tumor location mostly had localized disease (66%), while patients with body/tail tumors tended to have metastatic disease (63%). See Table 1 for further demographics. Unintentional weight loss was experienced by 66% of patients (n = 466), 69% of localized patients (n = 261) and 64% of metastatic patients (n = 205). Patients with localized disease stated a 12% loss in weight over a median of 3 months, while metastatic patients reported a 10% weight loss over a median of 5 months. Of the localized patients, the majority presented with symptoms of abdominal pain (66%), nausea/vomiting/fatigue (61%), and change in bowel habits (44%). Presenting symptoms of the metastatic patients were similar (see Table 2). There was no statistical significance of tumor location in relation to presenting symptoms. Dietitian consults occurred for 67% (n = 473) of the patient population, 77% for those with localized disease and 57% for those with metastatic disease. 
Of those with reported weight loss, 74% (n = 343) had dietitian consultation.</p><p><b>Conclusion:</b> Overall, it was seen that a high number of newly diagnosed, treatment naïve PC patients present with malnutrition. Patients with localized disease and tumors located in the head/neck/uncinate region experience the greatest gastrointestinal symptoms of nausea, vomiting, change in bowel habits, and fatigue. Early implementation of a proactive nutritional screening program resulted in increased awareness of malnutrition and referral for nutritional intervention for newly diagnosed PC patients.</p><p><b>Table 1.</b> Demographics and Disease Characteristics.</p><p></p><p><b>Table 2.</b> Presenting Symptoms.</p><p></p><p>Nicole Nardella, MS<sup>1</sup>; Nathan Gilchrist, BS<sup>1</sup>; Adrianna Oraiqat, BS<sup>1</sup>; Sarah Goodchild, BS<sup>1</sup>; Dena Berhan, BS<sup>1</sup>; Laila Stancil, HS<sup>1</sup>; Jeanine Milano, BS<sup>1</sup>; Christina Santiago, BS<sup>1</sup>; Melissa Adams, PA-C<sup>1</sup>; Pamela Hodul, MD<sup>1</sup></p><p><sup>1</sup>Moffitt Cancer Center, Tampa, FL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Pancreatic cancer (PC) is an aggressive disease, with 5-year survival rate of 13%. Symptoms occur late in the disease course, leading to approximately 50% of patients presenting with metastatic disease. New-onset diabetes is often one of the first symptoms of PC, with diagnosis occurring up to 3 years before cancer diagnosis. We hypothesize increasing awareness of PC prevalence in diabetic patients, both new-onset and pre-existing, may lead to early PC diagnosis.</p><p><b>Methods:</b> This IRB-exempt retrospective review included new PC patients presenting to our institution in 2021-2023 with diabetes diagnosis (n = 458). We define new-onset diabetes as having been diagnosed within 3 years prior to pathologically confirmed adenocarcinoma. We define newly diagnosed PC as having pathologically confirmed adenocarcinoma within 30 days of clinic presentation. Data was collected on demographics, staging (localized vs metastatic), tumor location (head/neck/uncinate, body/tail, multi-focal), treatment initiation, diabetes onset (new-onset vs pre-existing), diabetes regimen, and weight loss. Descriptive variables were used to report outcomes.</p><p><b>Results:</b> In the study period, 1,310 patients presented to our institution. Of those, 35% had a diabetes diagnosis (n = 458). The majority of patients were male (61%) with age at PC diagnosis of 69 (41-92). Patients mostly had localized disease (57%) with primary tumor location in the head/neck/uncinate region (59%). New-onset diabetes was present in 31% of diabetics, 11% of all new patients, with 63% having localized disease (79% head/neck/uncinate) and 37% metastatic (66% body/tail). Of those with pre-existing diabetes (69%), 54% had localized disease (69% head/neck/uncinate) and 46% had metastatic disease (53% body/tail). See Table 1 for further demographic/disease characteristics. Abrupt worsening of diabetes was seen in 10% (n = 31) of patients with pre-existing diabetes, and 12% had a change in their regimen prior to PC diagnosis. Hence 13% (175/1,310) of all new patients presented with either new-onset or worsening diabetes. Weight loss was present in 75% (n = 108) of patients with new-onset diabetes, with a median of 14% weight loss (3%-38%) over 12 months (1-24). 
Alternatively, weight loss was present in 66% (n = 206) of patients with pre-existing diabetes, with a median of 14% weight loss (4%-51%) over 6 months (0.5-18). Diabetes medication was as follows: 41% oral, 30% insulin, 20% both oral and insulin, 10% no medication. Of those patients with new-onset diabetes, 68% were diagnosed within 1 year of PC diagnosis and 32% were diagnosed within 1-3 years of PC diagnosis. Of those within 1 year of diagnosis, 68% had localized disease with 81% having head/neck/uncinate tumors. Of the metastatic (31%), 73% had body/tail tumors. For patients with diabetes diagnosis within 1-3 years of PC diagnosis, 52% had localized disease (75% head/neck/uncinate) and 48% had metastatic disease (59% body/tail). See Table 2 for further characteristics.</p><p><b>Conclusion:</b> Overall, approximately one-third of new patients presenting with PC at our institution had diabetes, and new-onset diabetes was present in one-third of those patients. The majority of diabetes patients presented with localized head/neck/uncinate tumors. When comparing new-onset vs pre-existing diabetes, new-onset tended to experience greater weight loss over a longer time with more localized disease than pre-existing diabetes patients. Patients with diabetes diagnosis within 1 year of PC diagnosis had more localized disease (head/neck/uncinate). Hence increased awareness of diabetes in relation to PC, particularly new onset and worsening pre-existing, may lead to early diagnosis.</p><p><b>Table 1.</b> Demographics and Disease Characteristics.</p><p></p><p><b>Table 2.</b> New-Onset Diabetes Characteristics.</p><p></p><p>Marcelo Mendes, PhD<sup>1</sup>; Gabriela Oliveira, RD<sup>2</sup>; Ana Zanini, RD, MSc<sup>2</sup>; Hellin dos Santos, RD, MSc<sup>2</sup></p><p><sup>1</sup>Cicatripelli, Belém, Para; <sup>2</sup>Prodiet Medical Nutrition, Curitiba, Parana</p><p><b>Encore Poster</b></p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> According to the NPUAP, a pressure injury (PI) is damage that occurs to the skin and/or underlying soft tissue, primarily over bony prominences, and may also be related to the use of medical devices. PIs can range from intact skin to deeper ulcers, affecting structures such as muscles and bones. The aim of this study was to report the experience of using a specialized supplement for wound healing in the treatment of a PI.</p><p><b>Methods:</b> This is a case report based on the experience of the nurses from Cicatripelli®. Data were collected from May to July 2024 through a review of medical records and photographs showing the wound's progression. The patient was a 69-year-old woman with COPD, diabetes mellitus, and systemic arterial hypertension, who denied smoking and alcohol consumption. She developed a stage 4 PI in the sacral region after 15 days of mechanical ventilation due to bacterial pneumonia and was admitted to a private clinic for treatment on May 2, 2024. Initial wound assessment: Measurements: 16.5x13x4cm (WxLxD); abundant purulent exudate with a foul odor; intact peripheral skin; mild to moderate pain; 75% granulation tissue and 25% liquefactive necrosis (slough) (Figure 1). The wound was cleansed with 0.1% polyhexamethylene biguanide (PHMB) solution, followed by conservative debridement, primary coverage with hydrophiber with 1.2% silver, and secondary coverage with cotton gauze and transparent film, with dressing changes scheduled every 72 hours. 
Supplementation (Correctmax – Prodiet Medical Nutrition) started on 05/20/2024, with a dosage of 2 sachets per day, containing 10 g of collagen peptide, 3 g of L-arginine, 612 mg of vitamin A, 16 mg of vitamin E, 508 mg of vitamin C, 30 mcg of selenium, and 16 mg of zinc.</p><p><b>Results:</b> On the 17th day of supplementation, the hydrophiber with silver dressing was replaced with PHMB-impregnated gauze, as the wound showed no signs of infection and demonstrated significant clinical improvement. Measurements: 8x6x2cm (WxLxD); moderate serohematic exudate; intact peripheral skin; 100% granulation tissue; significant improvement in pain and odor (Figure 2). On the 28th day, the dressing was switched to calcium and sodium alginate to optimize exudate control due to the appearance of mild dermatitis. Low-intensity laser therapy was applied, and a skin protective spray was used. Wound assessment: Measurements: 7x5.5x1.5 cm (WxLxD), with maintained characteristics (Figure 3). On the 56th day, the patient returned for dressing change and discharge instructions, as she could not continue the treatment due to a lack of resources. The approach remained the same with dressing changes every 3 days. Wound assessment: Measurements: 5x3.5x0.5 cm (WxLxD), with approximately 92% reduction in wound area, epithelialized margins, and maintained characteristics (Figure 4).</p><p><b>Conclusion:</b> Nutritional intervention with specific nutrient supplementation can aid in the management of complex wounds, serving as a crucial tool in the healing process and contributing to reduced healing time.</p><p></p><p><b>Figure 1.</b> Photo of the wound on the day of the initial assessment on 05/02/2024.</p><p></p><p><b>Figure 2.</b> Photo of the wound after 17 days of supplementation on 06/06/2024.</p><p></p><p><b>Figure 3.</b> Photo of the wound after 28 days of supplementation on 06/17/2024.</p><p></p><p><b>Figure 4.</b> Photo of the wound after 56 days of supplementation on 07/15/2024.</p><p>Ludimila Ribeiro, RD, MSc<sup>1</sup>; Bárbara Gois, RD, PhD<sup>2</sup>; Ana Zanini, RD, MSc<sup>3</sup>; Hellin dos Santos, RD, MSc<sup>3</sup>; Ana Paula Celes, MBA<sup>3</sup>; Flávia Corgosinho, PhD<sup>2</sup>; Joao Mota, PhD<sup>4</sup></p><p><sup>1</sup>School of Nutrition, Federal University of Goiás, Goiania, Goias; <sup>2</sup>School of Nutrition, Federal University of Goiás, Goiânia, Goias; <sup>3</sup>Prodiet Medical Nutrition, Curitiba, Parana; <sup>4</sup>Federal University of Goias, Goiania, Goias</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Postprandial blood glucose is considered an important risk factor for the development of macrovascular and microvascular diseases. Despite the use of hypoglycemic agents, patients with diabetes often experience postprandial hyperglycemia due to unbalanced meals. The aim of this study was to compare the effects of a low glycemic index formula for glycemic control as a substitute for a standard breakfast in patients with type 2 diabetes.</p><p><b>Methods:</b> This randomized, placebo-controlled, crossover study included 18 individuals with type 2 diabetes. Participants were instructed to consume, in random order, either a nutritional formula or a typical Brazilian breakfast with the same caloric content for three consecutive week days in different weeks. 
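<p>Postprandial responses in studies of this kind are commonly summarized as the incremental area under the glucose curve (iAUC); a minimal sketch, in Python, using the trapezoidal rule with below-baseline increments clipped to zero (one common convention); the sensor readings are entirely hypothetical:</p>
<pre><code>
import numpy as np

def incremental_auc(times_min, glucose_mg_dl):
    """Incremental AUC above the baseline (first) glucose value, trapezoidal rule."""
    glucose = np.asarray(glucose_mg_dl, dtype=float)
    above_baseline = np.clip(glucose - glucose[0], 0, None)
    return np.trapz(above_baseline, x=np.asarray(times_min, dtype=float))

# Hypothetical readings every 30 minutes for 2 hours after each breakfast.
times = [0, 30, 60, 90, 120]
formula_breakfast = [110, 138, 150, 135, 118]
standard_breakfast = [110, 165, 190, 170, 140]
print(incremental_auc(times, formula_breakfast))   # smaller iAUC
print(incremental_auc(times, standard_breakfast))  # larger iAUC
</code></pre>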
The nutritional formula (200 mL) provided 200 kcal, with 20 g of carbohydrates, 8.8 g of protein, 9.4 g of fat (MUFA: 5.6 g, PUFA: 2.0 g), and 3.0 g of fiber (DiamaxIG – Prodiet Medical Nutrition), serving as a breakfast substitute. Both the nutritional formula and the standard breakfast were provided to the participants. During the two weeks of intervention, participants used continuous glucose monitoring sensors (Libre 2). Weight and height were measured to calculate body mass index (BMI), and medication use was monitored.</p><p><b>Results:</b> The sample consisted of 61% females, with a mean age of 50.28 ± 12.58 years. The average blood glucose level was 187.13 ± 77.98 mg/dL and BMI 29.67 ± 4.86 kg/m². All participants were taking metformin, and two were taking it concomitantly with insulin. There were no changes in medication doses or regimens during the study. The incremental area under the curve was significantly lower in the nutritional formula group compared to the standard breakfast group (2,794.02 ± 572.98 vs. 4,461.55 ± 2,815.73, p = 0.01).</p><p><b>Conclusion:</b> The low glycemic index formula for glycemic control significantly reduced postprandial glycemic response compared to a standard Brazilian breakfast in patients with type 2 diabetes. These findings suggest that incorporating low glycemic index meals could be an effective strategy for better managing postprandial blood glucose levels in this population, which may help mitigate the risk of developing macro and microvascular complications.</p><p>Kirk Kerr, PhD<sup>1</sup>; Bjoern Schwander, PhD<sup>2</sup>; Dominique Williams, MD, MPH<sup>1</sup></p><p><sup>1</sup>Abbott Nutrition, Columbus, OH; <sup>2</sup>AHEAD GmbH, Bietigheim-Bissingen, Baden-Wurttemberg</p><p><b>Financial Support:</b> Abbott Nutrition.</p><p><b>Background:</b> According to the World Health Organization, obesity is a leading risk factor for global non communicable diseases like diabetes, heart disease and cancer. Weight cycling, often defined as intentionally losing and unintentionally regaining weight, is observed in people with obesity and can have adverse health and economic consequences. This study develops the first model of the health economic consequences of weight cycling in individuals with obesity, defined by a body-mass index (BMI) ≥ 30 kg/m².</p><p><b>Methods:</b> A lifetime state-transition model (STM) with monthly cycles was developed to simulate a cohort of individuals with obesity, comparing “weight cyclers” versus “non-cyclers”. The simulated patient cohort was assumed to have average BMI 35.5, with 11% of patients with cardiovascular disease, and 6% of patients with type 2 diabetes. Key outcomes were the cost per obesity-associated event avoided, the cost per life-year (LY) gained, and the cost per quality-adjusted life year (QALY) gained, using a US societal perspective. Transition probabilities for obesity-associated diseases were informed by meta-analyses and were based on US disease-related base risks. Risks were adjusted by BMI-related, weight-cycling-related, and T2D-related relative risks (RR). BMI progression, health utilities, direct and indirect costs, and other population characteristics were informed by published US studies. Future costs and effects were discounted by 3% per year. 
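<p>A minimal sketch, in Python, of a monthly-cycle cohort state-transition model with annual discounting of this general kind; the three health states, transition probabilities, costs, and utilities below are entirely hypothetical and are not taken from the model described:</p>
<pre><code>
import numpy as np

# Hypothetical three-state monthly Markov cohort: alive without event,
# alive after an obesity-associated event, dead. All values are illustrative.
P = np.array([[0.990, 0.008, 0.002],
              [0.000, 0.985, 0.015],
              [0.000, 0.000, 1.000]])
monthly_cost = np.array([50.0, 400.0, 0.0])       # cost per state per cycle (assumed)
monthly_qaly = np.array([0.80, 0.60, 0.0]) / 12   # utility per state per cycle (assumed)

state = np.array([1.0, 0.0, 0.0])  # whole cohort starts alive and event-free
annual_discount = 0.03
total_cost = 0.0
total_qaly = 0.0
for month in range(12 * 40):                      # roughly lifetime horizon
    discount = (1 + annual_discount) ** (-month / 12)
    total_cost += discount * state @ monthly_cost
    total_qaly += discount * state @ monthly_qaly
    state = state @ P                             # advance the cohort one cycle
print(round(total_cost), round(total_qaly, 2))
</code></pre>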
Deterministic and probabilistic sensitivity analyses were performed to investigate the robustness of the results.</p><p><b>Results:</b> Simulating a lifetime horizon, non-cyclers had 0.090 obesity-associated events avoided, 0.602 LYs gained, 0.518 QALYs gained and reduced total costs of approximately $4,592 ($1,004 direct and $3,588 indirect costs) per person. Sensitivity analysis showed the model was most responsive to changes in patient age and cardiovascular disease morbidity and mortality risks. Non-cycling as the cost-effective option was robust in sensitivity analyses.</p><p><b>Conclusion:</b> The model shows that weight cycling has a major impact on the health of people with obesity, resulting in increased direct and indirect costs. The approach to chronic weight management programs should focus not only on weight reduction but also on weight maintenance to prevent the enhanced risks of weight cycling.</p><p>Avi Toiv, MD<sup>1</sup>; Arif Sarowar, MSc<sup>2</sup>; Hope O'Brien, BS<sup>2</sup>; Thomas Pietrowsky, MS, RD<sup>1</sup>; Nemie Beltran, RN<sup>1</sup>; Yakir Muszkat, MD<sup>1</sup>; Syed-Mohammad Jafri, MD<sup>1</sup></p><p><sup>1</sup>Henry Ford Hospital, Detroit, MI; <sup>2</sup>Wayne State University School of Medicine, Detroit, MI</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Age is an important factor in the transplant evaluation as age at transplantation is historically thought to influence transplant outcomes in organ transplant recipients. There are limited data on the impact of age on intestinal (IT) and multivisceral (MVT) organ transplantation. This study investigates the impact of age on post-transplant outcomes in patients who received an intestinal or multivisceral (including intestine) transplant, comparing those under 40 years old to those aged 40 and above.</p><p><b>Methods:</b> We conducted a retrospective chart review of all patients who underwent IT at an academic transplant center from 2010 to 2023. The primary outcomes were patient survival and graft failure, analyzed with Kaplan-Meier survival analysis.</p><p><b>Results:</b> Among 50 IT recipients, there were 11 IT recipients &lt; 40 years old and 39 IT recipients ≥40 years old (Table). The median age at transplant in the &lt;40 group was 37 years (range, 17-39) and in the ≥40 group was 54 years (range, 40-68). In both groups, the majority of transplants were exclusively IT; however, both groups also included MVT. Kaplan-Meier survival analysis revealed no significant differences between the two age groups in graft survival or patient mortality. Reoperation within 1 month was significantly linked to decreased patient survival (p = 0.015) and decreased graft survival (p = 0.003), as was moderate to severe rejection within 1 month (p = 0.009), but neither complication differed significantly between the two age groups. The Wilcoxon rank-sum test showed no difference between groups with regard to reoperation or moderate to severe rejection at 1 or 3 months or the development of chronic kidney disease.</p><p><b>Conclusion:</b> Age at the time of intestinal transplantation (&lt; 40 vs. ≥40 years old) does not appear to significantly impact major transplant outcomes, such as patient mortality, graft survival, or rejection rates.
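<p>A minimal sketch, in Python with the lifelines package, of the kind of Kaplan-Meier and log-rank comparison by age group reported here; the file name, column names, and group coding are assumptions for illustration and not the study's actual code:</p>
<pre><code>
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical columns: months of follow-up, death indicator (1 = died),
# and an age-group code ("under_40" or "40_plus").
df = pd.read_csv("it_transplant_followup.csv")
young = df[df["age_group"] == "under_40"]
older = df[df["age_group"] == "40_plus"]

kmf = KaplanMeierFitter()
kmf.fit(young["time_months"], event_observed=young["died"], label="Under 40 years")
ax = kmf.plot_survival_function()
kmf.fit(older["time_months"], event_observed=older["died"], label="40 years and older")
kmf.plot_survival_function(ax=ax)

result = logrank_test(young["time_months"], older["time_months"],
                      event_observed_A=young["died"], event_observed_B=older["died"])
print(result.p_value)
</code></pre>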
While reoperation and moderate to severe rejection within 1 and 3 months negatively affected overall outcomes, these complications were not more frequent in older or younger patients.</p><p><b>Table 1.</b> Demographic Characteristics of Intestinal Transplant Recipients.</p><p></p><p>BMI, body mass index; TPN, total parenteral nutrition.</p><p><b>International Poster of Distinction</b></p><p>Gabriela de Oliveira Lemos, MD<sup>1</sup>; Natasha Mendonça Machado, PhD<sup>2</sup>; Raquel Torrinhas, PhD<sup>3</sup>; Dan Linetzky Waitzberg, PhD<sup>3</sup></p><p><sup>1</sup>University of Sao Paulo School of Medicine, Brasília, Distrito Federal; <sup>2</sup>University of Sao Paulo School of Medicine, São Paulo; <sup>3</sup>Faculty of Medicine of the University of São Paulo, São Paulo</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> Ganepão 2023.</p><p><b>Publication:</b> Braspen Journal. ISSN 2764-1546 | Online Version.</p><p><b>Financial Support:</b> This study is linked to project no. 2011/09612-3 and was funded by the Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP).</p><p><b>Background:</b> Sphingolipids (SLs) are key molecules in cell signaling and play a central role in lipotoxicity. Moreover, they are major constituents of eukaryotic cell membranes. Data on SL remodeling in the gastrointestinal tract after Roux-en-Y gastric bypass (RYGB) are lacking and could help elucidate tissue turnover and metabolism. This protocol aims to evaluate the SL profile and remodeling within the plasma and different segments of the gastrointestinal tract (GIT) before and 3 months after RYGB in a population of women with obesity and type 2 diabetes mellitus (T2DM) and to correlate the changes within these tissues. This investigation is part of the SURMetaGIT study, registered at www.clinicalTrials.gov (NCT01251016).</p><p><b>Methods:</b> Twenty-eight women with obesity and T2DM who underwent RYGB were enrolled in this protocol. Those on insulin therapy were excluded. We collected plasma (n = 28) and intestinal samples from the gastric pouch (n = 9), duodenum (n = 8), jejunum (n = 9), and ileum (n = 9) for untargeted metabolomics analysis at baseline and 3 months post-surgery. Indian ink (SPOT®) was used to mark the sites for the follow-up biopsies. SLs were identified using high-performance liquid chromatography coupled to mass spectrometry. Data were processed and analyzed using AnalysisBaseFileConverter and MS-DIAL, respectively. The magnitude of SL changes after RYGB was assessed by the fold change (log2 post-surgery mean/pre-surgery mean). The Spearman test was performed for the correlation analysis. A p value &lt; 0.05 was considered significant. Statistics were carried out in the Jamovi software (2.2.5) and MetaboAnalyst 5.0.</p><p><b>Results:</b> In total, 34 SLs were identified, including sphingomyelins (SM), ceramides (Cer), and glycosphingolipids (GlcSL). SMs were the most common SLs found in the plasma (27 SM, 3 Cer, and 4 GlcSL) and in the GIT (16 SM, 13 Cer, and 5 GlcSL). Every GIT tissue presented distinct SL remodeling. The plasma and the jejunum best discriminated SL changes following surgery (Figure 1). The jejunum expressed the most robust changes, followed by the plasma and the duodenum (Figure 2). Figure 3 presents the heatmap of the plasma and the GIT tissues. Correlation analysis showed that the plasma SLs, particularly SM(d32:0), GlcCer(d42:1), and SM(d36:1), strongly correlated with jejunal SLs.
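The fold-change metric defined in the Methods (log2 of the post-surgery mean over the pre-surgery mean) can be computed as in the brief sketch below; the lipid names and abundance values shown are hypothetical, not study data.

```python
# Illustrative only: fold change of each lipid computed as
# log2(post-surgery mean / pre-surgery mean), mirroring the Methods definition.
import numpy as np
import pandas as pd

pre = pd.DataFrame({"SM(d32:0)": [4.1, 3.8, 4.4], "Cer(d18:1/16:0)": [1.2, 1.0, 1.4]})
post = pd.DataFrame({"SM(d32:0)": [2.0, 2.3, 1.9], "Cer(d18:1/16:0)": [2.4, 2.1, 2.6]})

fold_change = np.log2(post.mean() / pre.mean())
print(fold_change.round(2))  # negative = lower after RYGB, positive = higher
```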
These plasma lipids showed a strong negative correlation with jejunal sphingomyelins, but a strong positive correlation with jejunal ceramides (Table 1).</p><p><b>Conclusion:</b> RYGB was associated with SL remodeling in the plasma and GIT. SM was the main SL found in the plasma and the GIT. The most robust changes occurred in the jejunum and the plasma, and these two sample types showed the most relevant correlations. Considering our findings, the role of SM in metabolic changes after RYGB should be investigated.</p><p><b>Table 1.</b> Correlation Analysis of Sphingolipids From the Plasma With the Sphingolipids From the Gastrointestinal Tract.</p><p></p><p>*p &lt; 0.05; **p &lt; 0.01; ***p &lt; 0.001.</p><p></p><p>The green circles represent samples at baseline and the red circles represent samples 3 months after RYGB.</p><p><b>Figure 1.</b> Principal Component Analysis (PCA) from GIT Tissues and Plasma.</p><p></p><p>Fold change = log2 post-surgery mean/pre-surgery mean.</p><p><b>Figure 2.</b> Fold Change of Sphingolipids from the Plasma and Gastrointestinal Tract.</p><p></p><p>The map under the top right green box represents lipids’ abundance before surgery, and the map under the top left red box represents lipids’ abundance after RYGB.</p><p><b>Figure 3.</b> Heatmap of Sphingolipids from the Plasma and Gastrointestinal Tract.</p><p>Lucas Santander<sup>1</sup>; Gabriela de Oliveira Lemos, MD<sup>2</sup>; Daiane Mancuzo<sup>3</sup>; Natasha Mendonça Machado, PhD<sup>4</sup>; Raquel Torrinhas, PhD<sup>5</sup>; Dan Linetzky Waitzberg, PhD<sup>5</sup></p><p><sup>1</sup>Universidade Santo Amaro (Santo Amaro University), São Bernardo Do Campo, Sao Paulo; <sup>2</sup>University of Sao Paulo School of Medicine, Brasília, Distrito Federal; <sup>3</sup>Universidade São Caetano (São Caetano University), São Caetano do Sul, Sao Paulo; <sup>4</sup>University of Sao Paulo School of Medicine, São Paulo; <sup>5</sup>Faculty of Medicine of the University of São Paulo, São Paulo</p><p><b>Financial Support:</b> Fundação de Amparo a Pesquisa do Estado de São Paulo.</p><p><b>Background:</b> Microalbuminuria (MAL) is an early biomarker of kidney injury, linked to conditions like hypertension, type 2 diabetes (T2DM), and obesity. It is also associated with higher cardiovascular (CV) risk. This protocol examines the impact of Roux-en-Y gastric bypass (RYGB) on MAL and CV markers in patients with obesity, T2DM, and MAL.</p><p><b>Methods:</b> Eight women with grade II-III obesity, T2DM, and MAL who underwent RYGB were included. Patients on insulin therapy were not included. MAL was defined as a urinary albumin-to-creatinine ratio &gt; 30 mg/g. MAL, glycemic, and lipid serum biomarkers were measured at baseline and three months post-surgery. Systolic (SBP) and diastolic (DBP) blood pressures and medical treatments for T2DM and hypertension were also assessed. T2DM remission was defined by ADA 2021 criteria. Categorical variables were reported as absolute and relative frequencies, while continuous variables were expressed as median and IQR based on normality tests. Intragroup and intergroup comparisons were conducted using the Wilcoxon and Mann-Whitney tests for numeric data. The Fisher test was performed when necessary to compare dichotomous variables. Data were analyzed in JASP software, version 0.18.1.0.</p><p><b>Results:</b> Overall, RYGB was associated with weight loss, improved body composition, and better CV markers (Table 1). Post-surgery, MAL decreased by at least 70% in all patients and resolved in half.
All patients with MAL resolution had pre-surgery levels ≤ 100 mg/g. Those without resolution had severe pre-surgery MAL (33.8 vs. 667.5, p = 0.029), and higher SBP (193 vs. 149.5, p = 0.029) DBP (138 vs. 98, p = 0.025). Blood pressure decreased after surgery but remained higher in patients without MAL resolution: SBP (156.0 vs. 129.2, p = 0.069) and DBP (109.5 vs. 76.5, p &lt; 0.001). MAL resolution was not linked to T2DM remission at 3 months (75% vs. 50%, p = 1.0). One patient had worsened MAL (193.0 vs. 386.9 mg/g) after RYGB. Glomerular filtration rate (GFR) tended to increase post-surgery only in the group with MAL resolution (95.4 vs. 108.2 ml/min/1.73 m², p = 0.089), compared to the group without MAL resolution (79.2 vs. 73.7 ml/min/1.73 m², p = 0.6).</p><p><b>Conclusion:</b> RYGB effectively reduced markers of renal dysfunction and cardiovascular risk in this cohort. Patients showed a decrease in MAL, with resolution in half of the patients. The small sample size and short follow-up period may have limited the overall impact of the surgery on renal function. Future studies with larger cohorts and longer follow-ups are needed to understand better the effects of bariatric surgery on MAL, and its relation to other CV markers.</p><p><b>Table 1.</b> Biochemical and Clinical Data Analysis Following RYGB.</p><p></p><p>eGFR: estimated glomerular filtration rate; HbA1c: glycated hemoglobin; HDL-c: high-density lipoprotein cholesterol; HOMA-BETA: beta-cell function by the homeostasis model; HOMA-IR: Homeostasis Model Assessment of Insulin Resistance; LDL-c: low-density lipoprotein cholesterol; Non-HDL-c: non-high-density lipoprotein cholesterol; VLDL-c: very-low-density lipoprotein cholesterol; DBP: diastolic blood pressure; SBP: systolic blood pressure; WC: waist circumference.</p><p>Michelle Nguyen, BSc, MSc<sup>1</sup>; Johane P Allard, MD, FRCPC<sup>2</sup>; Dane Christina Daoud, MD<sup>3</sup>; Maitreyi Raman, MD, MSc<sup>4</sup>; Jennifer Jin, MD, FRCPC<sup>5</sup>; Leah Gramlich, MD<sup>6</sup>; Jessica Weiss, MSc<sup>1</sup>; Johnny H. Chen, PhD<sup>7</sup>; Lidia Demchyshyn, PhD<sup>8</sup></p><p><sup>1</sup>Pentavere Research Group Inc., Toronto, ON; <sup>2</sup>Division of Gastroenterology, Department of Medicine, Toronto General Hospital, Toronto, ON; <sup>3</sup>Division of Gastroenterology, Centre Hospitalier de l'Université de Montréal (CHUM), Department of Medicine, University of Montreal, Montreal, QC; <sup>4</sup>Division of Gastroenterology, University of Calgary, Calgary, AB; <sup>5</sup>Department of Medicine, University of Alberta, Division of Gastroenterology, Royal Alexandra Hospital, Edmonton, AB; <sup>6</sup>Division of Gastroenterology, Faculty of Medicine and Dentistry, University of Alberta, Edmonton, AB; <sup>7</sup>Takeda Canada Inc., Vancouver, BC; <sup>8</sup>Takeda Canada Inc., Toronto, ON</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> 46th European Society for Clinical Nutrition and Metabolism (ESPEN) Congress. 7-10 September 2024, Milan, Italy.</p><p><b>Financial Support:</b> Funding of this study is from Takeda Canada Inc.</p><p><b>Background:</b> Teduglutide is indicated for the treatment of patients with short bowel syndrome (SBS) who are dependent on parenteral support (PS). 
This study evaluated longer-term teduglutide effectiveness and safety in Canadian patients diagnosed with SBS and dependent on PS, using real-world evidence.</p><p><b>Methods:</b> This was an observational, retrospective study using data from the national Canadian Takeda patient support program and included adults with SBS. Data were collected 6 months before teduglutide initiation and from initiation to Dec-01-2023, death, or loss of follow-up. Descriptive statistics characterized the population and treatment-emergent adverse events (TEAEs). Changes in parenteral nutrition/intravenous fluid supply (PN/IV) were assessed based on decreases in PN/IV volume from baseline. Statistical significance was set at p &lt; 0.05.</p><p><b>Results:</b> A total of 52 patients (60% women) were included in this study. Median age (range) was 54 (22–81) years and 50% had Crohn's disease as their etiology of SBS. At 6 months, median (range) absolute and percentage reductions from baseline in PN/IV volume were 3,900 mL/week (-6,960–26,784; p &lt; 0.001) and 28.1% (-82.9–100). At 24 months, the median (range) absolute reduction from baseline was 6,650 mL/week (-4,400–26,850; p = 0.003), and the proportion of patients who achieved ≥20% reduction in weekly PN/IV was 66.7%. Over the study, 27% achieved independence from PN/IV. TEAEs were reported in 51 (98%) patients (83% were serious TEAEs) during the study period; the 3 most common were weight changes, diarrhea, and fatigue.</p><p><b>Conclusion:</b> Patients showed significant decreases in PN/IV volumes after initiating teduglutide, with no unexpected safety findings. This study demonstrates real-world, longer-term effectiveness and safety of teduglutide in Canadian patients with SBS, complementing previous clinical trials and real-world studies.</p><p><b>Poster of Distinction</b></p><p>Sarah Carter, RD, LDN, CNSC<sup>1</sup>; Ruth Fisher, RDN, LD, CNSC<sup>2</sup></p><p><sup>1</sup>Coram CVS/Specialty Infusion Services, Tullahoma, TN; <sup>2</sup>Coram CVS/Specialty Infusion Services, Saint Hilaire, MN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Perceived benefit is one factor determining therapy continuation. Little information is published regarding positive outcomes for patients receiving the GLP-2 analog teduglutide apart from success rates of weaning HPN and hydration volumes by 20%. Patients who are new to therapy may question how long after initiation they should expect to see results. Patients may collaborate with prescribers to improve therapy tolerance if they have hope for improvements in their quality of life. This data analysis provides details regarding patients receiving teduglutide and their perceived benefits of therapy.</p><p><b>Methods:</b> Dietitians interview patients receiving teduglutide as part of a service agreement, monitoring persistency and helping to eliminate barriers to therapy. Dietitians make weekly and monthly calls based on patients’ drug start dates and document interventions in flowsheets in patients’ electronic medical records. The interventions are published to data visualization software to monitor compliance with dietitian outreach as part of a quality improvement project. All patients were diagnosed with short bowel syndrome, but baseline characteristics were not exported to this dashboard. This study is a retrospective analysis of the existing dashboard using 3 years of assessments (May 1, 2021-April 30, 2024).
Exclusion criteria included therapy dispensed from a pharmacy not using the company's newest computer platform and patients who did not respond to outreach. We analyzed the time on therapy before a positive outcome was reported by the patient to the dietitian and which positive outcomes were most frequently reported.</p><p><b>Results:</b> The data set included 336 patients with 2509 phone assessments. The most frequently reported first positive outcome was improved ostomy output/less diarrhea (72%, n = 243), occurring just after a month on therapy (mean 31 days ± 26.4). The mean time for first positive outcome for all patients who reported one was 32 days ± 28.5 (n = 314). Of the 22 patients who reported no positive outcome, 13 didn't answer the dietitians’ calls after initial contact. A summary is listed in Table 1. Overall positive outcomes reported were improved ostomy output/less diarrhea (87%, n = 292), weight gain (70%, n = 236), improved appetite/interest in food (33%, n = 112), feeling stronger/more energetic (27%, n = 92), improved quality of life (23%, n = 90), improved lab results (13%, n = 45) and fewer antidiarrheal medications (12%, n = 40). Of the 218 patients receiving parenteral support, 44 patients stopped hydration and HPN completely (20%) with another 92 patients reporting less time or days on hydration and HPN (42%) for a total of 136 patients experiencing a positive outcome of parenteral support weaning (62%). Patients reported improvements in other areas of their lives including fewer hospitalizations (n = 39), being able to travel (n = 35), tolerating more enteral nutrition volume (n = 19), returning to work/school (n = 14) and improving sleep (n = 13). A summary is diagramed in Figure 2.</p><p><b>Conclusion:</b> This retrospective analysis indicates that teduglutide is associated with improved symptom control and improved quality of life measures, with most patients seeing a response to therapy within the first 2 months. Patients responded to teduglutide with a decrease in ostomy output and diarrhea as the most frequent recognizable response to therapy. In addition to the goal of weaning parenteral support, clinicians should be cognizant of improvements in patients’ clinical status that can have significant impact in quality of life.</p><p><b>Table 1.</b> Timing of First Reported Positive Outcome by Patients Receiving Teduglutide.</p><p></p><p></p><p><b>Figure 1.</b> Total Positive Outcomes Reported by Patients (n = 336).</p><p><b>Poster of Distinction</b></p><p>Jennifer Cholewka, RD, CNSC, CDCES, CDN<sup>1</sup>; Jeffrey Mechanick, MD<sup>1</sup></p><p><sup>1</sup>The Mount Sinai Hospital, New York, NY</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Bariatric surgery is a guideline-directed intervention for patients with severe obesity. Adherence to post-operative recommendations is variable and consequent undernutrition complicated by multiple micronutrient deficiencies is prevalent. A post-bariatric surgery syndrome (PBSS) is not well defined and therefore poorly understood, though early detection and intervention would likely decrease clinical and economic burdens. Our current experience with PBSS is presented here as a case series. 
We will define PBSS based on clinical evidence related to risk factors, interventions, and clinical/biochemical responses.</p><p><b>Methods:</b> Twenty four consecutive patients referred to our metabolic support service were identified between January 1, 2019 and December 31, 2023 who were admitted to The Mount Sinai Hospital in New York City with a history of RYGB (roux en y gastric bypass) or BPDDS (biliopancreatic diversion with duodenal switch) and found to have failure to thrive, undernutrition, clinical/biochemical features of at least one micronutrient deficiency, and indications for parenteral nutrition. Patients were excluded if they had surgical complications or were not treated with parenteral nutrition. Fifteen patients were included in this case series and de-identified prior to data collection using the electronic health records (EPIC) and coded data collection sheets. Descriptive statistical methods were used to analyze parametric variables as mean ± standard deviation and non-parametric variables as median (interquartile range).</p><p><b>Results:</b> Results are provided in Table 1.</p><p><b>Conclusion:</b> The PBSS is defined by significant decompensation following a bariatric surgery procedure with malabsorptive component characterized by failure to thrive, hypoalbuminemia, multiple micronutrient deficiencies, and need for parenteral nutrition. Major risk factors include inadequate protein and micronutrient intake due to either unawareness (e.g., not recommended) or poor adherence, significant alcohol consumption, or a complicating medical/surgical condition. Parenteral nutrition formulation was safe in this population and prioritizes adequate nitrogen, nonprotein calories, and micronutrition. Further analyses on risk factors, responses to therapy, and role of a multidisciplinary team are in progress.</p><p><b>Table 1.</b> Risks/Presentation.</p><p></p><p><b>Table 2.</b> Responses to Parenteral Nutrition Intervention.</p><p></p><p>Holly Estes-Doetsch, MS, RDN, LD<sup>1</sup>; Aimee Gershberg, RD, CDN, CPT<sup>2</sup>; Megan Smetana, PharmD, BCPS, BCTXP<sup>3</sup>; Lindsay Sobotka, DO<sup>3</sup>; Kristen Roberts, PhD, RDN, LD, CNSC, FASPEN, FAND<sup>4</sup></p><p><sup>1</sup>The Ohio State University, Columbus, OH; <sup>2</sup>NYC Health + Hospitals, New York City, NY; <sup>3</sup>The Ohio State University Wexner Medical Center, Columbus, OH; <sup>4</sup>The Ohio State University, Granville, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Decompensated cirrhosis increases the risk of fat maldigestion through altered bile synthesis and excretion through the bile canaliculi. Maldigestion increases the risk of vitamin and mineral deficiencies which when untreated contribute to consequential health issues such as metabolic bone disease, xerophthalmia, and hyperkeratosis. There is an absence of comprehensive guidelines for prevention and treatment of deficiencies.</p><p><b>Methods:</b> Medical and surgical history, anthropometrics, medications and nutritional supplements, laboratory data, and medical procedures were extracted and analyzed from the electronic medical record.</p><p><b>Results:</b> A patient with congenital biliary atresia and decompensated cirrhosis was seen in a hepatology outpatient clinic. Biochemical assessment revealed severe vitamin A deficiency and suboptimal vitamin D and zinc status. Physical assessment indicated telogen effluvium and transient blurry vision. 
Despite a history of high-dose oral retinyl acetate ranging from 10,000 to 50,000 units daily and a 3-day course of 100,000 units via intramuscular injection, with co-treatment of zinc deficiency to ensure adequate circulating retinol binding protein, normalization of serum retinol was not possible over the preceding 10 years. The patient's serum vitamin A level normalized following liver transplantation.</p><p><b>Conclusion:</b> In decompensated cirrhosis, there is a lack of sufficient guidelines for micronutrient dosing when traditional treatment strategies are unsuccessful. Furthermore, altered secretion of transport proteins due to underlying liver dysfunction may pose challenges in evaluating laboratory markers of micronutrient status. Collaborations with pharmacy and medicine support a thorough assessment and the establishment of a safe treatment and monitoring plan. Clinical research is needed to establish acceptable and safe dosing strategies for patients with chronic, unresponsive fat-soluble vitamin deficiencies.</p><p>Gang Wang, PhD<sup>1</sup></p><p><sup>1</sup>Nimble Science, Calgary, AB</p><p><b>Financial Support:</b> This work was supported by the National Research Council of Canada Industrial Research Assistance Program (NRC IRAP), an Alberta Innovate Accelerating Innovations into CarE (AICE) grant, and partially by a Clinical Problems Incubator Grant from the Snyder Institute for Chronic Disease at the University of Calgary.</p><p><b>Background:</b> The small intestine (SI) microbiome plays a crucial role in nutrient absorption, and emerging evidence indicates that fecal content is insufficient to represent the SI microenvironment. Endoscopic sampling is possible but expensive and not scalable. Ingestible sampling capsule technologies are emerging. However, potential contamination remains a major limitation of these devices.</p><p><b>Methods:</b> We previously reported the Small Intestinal MicroBiome Aspiration (SIMBA) capsules as an effective means of sampling, sealing, and preserving SI luminal contents for 16S rRNA gene sequencing analysis. A subset of the DNA samples, including SI samples collected by SIMBA capsules (CAP) and the matched saliva (SAL), fecal (FEC) and duodenal endoscopic aspirate (ASP) and brush (BRU) samples, from 16 participants recruited for an observational clinical validation study were sent for shotgun metagenomic sequencing. The aims were 1) to compare the sampling performance of the capsule (CAP) with endoscopic aspirates (ASP) and with 850 small intestine, large intestine and fecal samples from the Clinical Microbiomics data warehouse (PRJEB28097), and 2) to characterize samples from the 4 different sampling sites in terms of species composition and functional potential.</p><p><b>Results:</b> 4/80 samples (1/16 SAL, 2/16 ASP, 0/16 BRU, 1/16 CAP and 0/16 FEC) failed library preparation and 76 samples were shotgun sequenced (average of 38.5 M read pairs per sample) (Figure 1). Quality assessment demonstrated that despite the low raw DNA yield of CAP samples, they retained a minimal level of host contamination compared with ASP and BRU (mean 5.27% vs. 93.09-96.44% of average total reads per sample) (Figure 2). CAP samples shared the majority of species with ASP samples, as well as a large fraction of the species detected in the terminal ileum samples. ASP and CAP sample composition was more similar to that of duodenum, jejunum and saliva samples and very different from that of large intestine and stool samples.
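A minimal sketch of how per-sample host-read contamination might be summarized by sampling site is shown below; the read counts, sample names, and column names are hypothetical and are not the study data.

```python
# Illustrative sketch only: per-sample host-read contamination as a percentage
# of total reads, summarized by sampling site. All counts are hypothetical.
import pandas as pd

reads = pd.DataFrame({
    "sample":      ["CAP_01", "CAP_02", "ASP_01", "ASP_02", "FEC_01"],
    "site":        ["CAP", "CAP", "ASP", "ASP", "FEC"],
    "total_reads": [38_500_000, 41_200_000, 39_800_000, 36_400_000, 40_100_000],
    "host_reads":  [2_100_000, 1_800_000, 37_500_000, 34_000_000, 90_000],
})

reads["pct_host"] = 100 * reads["host_reads"] / reads["total_reads"]
print(reads.groupby("site")["pct_host"].mean().round(2))  # mean % host reads per site
```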
Functional genomics further revealed GI region-specific differences: in both ASP and CAP samples we detected a number of Gut Metabolic Modules (GMMs) related to carbohydrate digestion and short-chain fatty acids. However, probiotic species, as well as species and genes involved in bile acid metabolism, were mainly prevalent in CAP and FEC samples and could not be detected in ASP samples.</p><p><b>Conclusion:</b> CAP and ASP microbiomes are compositionally similar despite the high level of host contamination in ASP samples. CAP samples appear to be of better quality than ASP samples for revealing GI region-specific functional potential. This analysis demonstrates the potential of the SIMBA capsule for characterizing the SI microbiome and supports the prospective use of SIMBA capsules in observational and interventional studies to investigate the impact of short-term and long-term biotic food interventions on the gut microbiome (Figure 3). A series of studies utilizing the SIMBA capsules is under way, and the detectability of biotic food intervention effects will be reported in the near future (Table 1).</p><p><b>Table 1.</b> List of Ongoing Observational and Interventional Clinical Studies Using the SIMBA Capsule.</p><p></p><p></p><p><b>Figure 1.</b> Shotgun Metagenomic Sequencing: Taxonomy Overview (Relative Abundance of the 10 Most Abundant Species by Sampling Site).</p><p></p><p><b>Figure 2.</b> Shotgun Metagenomic Sequencing: High-Quality Non-Host Reads by Sampling Site.</p><p></p><p><b>Figure 3.</b> Short-term and Long-term Interventional Study Protocols Using SIMBA Capsules.</p><p>Darius Bazimya, MSc. Nutrition, RN<sup>1</sup>; Francine Mwitende, RN<sup>1</sup>; Theogene Uwizeyimana, Phn<sup>1</sup></p><p><sup>1</sup>University of Global Health Equity, Kigali</p><p><b>Financial Support:</b> University of Global Health Equity.</p><p><b>Background:</b> Rwanda is currently grappling with the double burden of malnutrition and rising obesity, particularly in urban populations. As the country experiences rapid urbanization and dietary shifts, traditional issues of undernutrition coexist with increasing rates of obesity and related metabolic disorders. This study aims to investigate the relationship between these nutrition-related issues and their impact on gastrointestinal (GI) health and metabolic outcomes in urban Rwandan populations.</p><p><b>Methods:</b> A cross-sectional study was conducted in Kigali, Rwanda's capital, involving 1,200 adult participants aged 18 to 65 years. Data were collected on dietary intake, body mass index (BMI), GI symptoms, and metabolic markers such as fasting glucose, cholesterol levels, and liver enzymes. Participants were categorized into three groups based on their BMI: undernourished, normal weight, and overweight/obese. GI symptoms were assessed using a validated questionnaire, and metabolic markers were evaluated through blood tests. Statistical analyses were performed to assess correlations between dietary patterns, BMI categories, and GI/metabolic health outcomes.</p><p><b>Results:</b> The study found that 25% of participants were classified as undernourished, while 22% were obese, reflecting the double burden of malnutrition and rising obesity in Rwanda's urban population. Among obese participants, 40% exhibited elevated fasting glucose levels (p &lt; 0.01), and 30% reported significant GI disturbances, such as irritable bowel syndrome (IBS) and non-alcoholic fatty liver disease (NAFLD).
In contrast, undernourished individuals reported fewer GI symptoms but showed a higher prevalence of micronutrient deficiencies, including anaemia (28%) and vitamin A deficiency (15%). Dietary patterns characterized by high-fat and low-fiber intake were significantly associated with increased GI disorders and metabolic dysfunction in both obese and normal-weight participants (p &lt; 0.05).</p><p><b>Conclusion:</b> This study highlights the growing public health challenge posed by the coexistence of undernutrition and obesity in Rwanda's urban centres. The dietary shifts associated with urbanization are contributing to both ends of the nutritional spectrum, adversely affecting GI and metabolic health. Addressing these issues requires comprehensive nutrition interventions that consider the dual challenges of undernutrition and obesity, promoting balanced diets and improving access to health services. These findings have important implications for nutrition therapy and metabolic support practices in Rwanda, emphasizing the need for tailored interventions that reflect the country's unique nutritional landscape.</p><p>Levi Teigen, PhD, RD<sup>1</sup>; Nataliia Kuchma, MD<sup>2</sup>; Hijab Zehra, BS<sup>1</sup>; Annie Lin, PhD, RD<sup>3</sup>; Sharon Lopez, BS<sup>2</sup>; Amanda Kabage, MS<sup>2</sup>; Monika Fischer, MD<sup>4</sup>; Alexander Khoruts, MD<sup>2</sup></p><p><sup>1</sup>University of Minnesota, St. Paul, MN; <sup>2</sup>University of Minnesota, Minneapolis, MN; <sup>3</sup>University of Minnesota, Austin, MN; <sup>4</sup>Indiana University School of Medicine, Indianapolis, IN</p><p><b>Financial Support:</b> Achieving Cures Together.</p><p><b>Background:</b> Fecal microbiota transplantation (FMT) is a highly effective treatment for recurrent Clostridioides difficile infection (rCDI). The procedure results in the repair of the gut microbiota following severe antibiotic injury. Recovery from rCDI is associated with high incidence of post-infection irritable bowel syndrome (IBS). In addition, older age, medical comorbidities, and prolonged diarrheal illness contribute to frailty in this patient population. The effect of FMT on IBS symptoms and frailty in patients with rCDI is largely unknown. In this prospective cohort study, we collected IBS symptom and frailty data over the 3 months following FMT treatment for rCDI in patients at two large, academic medical centers.</p><p><b>Methods:</b> Consenting adults who underwent FMT treatment for rCDI were enrolled in this study (n = 113). We excluded patients who developed a recurrence of CDI within the 3-month follow-up period (n = 15) or had incomplete IBS symptom severity scale (IBS-SSS) scores at any timepoint. The IBS-SSS is a 5-item survey measuring symptom intensity, with a score range from 0 to 500 with higher scores representing greater severity. IBS-SSS was collected at baseline, 1-week post FMT, 1-month post-FMT, and 3-months post-FMT. Frailty was assessed at baseline and 3-months using the FRAIL scale (categorical variable: “Robust Health”, “Pre-Frail”, “Frail”). Kruskal-Wallis test was used to compare IBS-SSS across timepoints. Post-hoc analysis was performed with the Pairwise Wilcoxon Rank Sum Tests using the False Discovery Rate adjustment method. The Friedman test was used to compare frailty distribution between the baseline and 3-month timepoints.</p><p><b>Results:</b> Mean age of the cohort was 63.3 (SD 15.4) years; 75% of the patients were female sex (total n = 58 patients). 
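The analysis plan described in the Methods above (Kruskal-Wallis across timepoints, followed by pairwise Wilcoxon rank-sum tests with False Discovery Rate adjustment) can be illustrated with the short sketch below; it is not the study code, and the simulated scores are purely illustrative.

```python
# Minimal sketch (not the study code): Kruskal-Wallis across timepoints, then
# pairwise rank-sum tests against baseline with FDR (Benjamini-Hochberg)
# adjustment. The Wilcoxon rank-sum test is equivalent to the Mann-Whitney U test.
import numpy as np
from scipy.stats import kruskal, mannwhitneyu
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(0)
baseline = rng.integers(50, 300, size=40)           # hypothetical IBS-SSS scores
week1    = baseline - rng.integers(0, 120, size=40)
month1   = baseline - rng.integers(0, 140, size=40)
month3   = baseline - rng.integers(0, 150, size=40)

print(kruskal(baseline, week1, month1, month3))     # overall difference across timepoints

raw_p = [mannwhitneyu(baseline, t).pvalue for t in (week1, month1, month3)]
reject, adj_p, _, _ = multipletests(raw_p, method="fdr_bh")
print(adj_p.round(4), reject)                        # FDR-adjusted pairwise p-values
```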
The IBS-SSS scores across timepoints are presented in Table 1 and Figure 1. The median IBS-SSS score at baseline was 134 [IQR 121], which decreased to a median score of 65 [IQR 174] at 1-week post-FMT. The baseline timepoint was found to differ from the 1-week, 1-month, and 3-month timepoints (p &lt; 0.05). No other differences between timepoints were observed. Frailty was assessed at baseline and 3 months (total n = 52). At baseline, 71% of patients (n = 37) were considered Pre-Frail or Frail, but this percentage decreased to 46% (n = 24) at 3 months (Table 2; p &lt; 0.05).</p><p><b>Conclusion:</b> Findings from this multicenter, prospective cohort study indicate an overall improvement in both IBS symptoms and frailty in the 3 months following FMT therapy for rCDI. Notably, IBS symptom scores were found to improve by 1-week post FMT. Further research is required to understand what predicts IBS symptom improvement following FMT and whether nutrition therapy can help support further improvement. It will also be important to further understand how nutrition therapy can help support the improvement in frailty status observed following FMT for rCDI.</p><p><b>Table 1.</b> Distribution of IBS-SSS Scores at Baseline and Following FMT.</p><p></p><p><b>Table 2.</b> Frailty Distribution Assessed by FRAIL Scale at Baseline and 3 Months Post-FMT.</p><p></p><p></p><p>Box-plot distributions of IBS-SSS scores across timepoints. The median IBS-SSS score at baseline was 134 and decreased to a median score of 65 at 1-week post-FMT. The baseline timepoint was found to differ from the 1-week, 1-month, and 3-month timepoints (p &lt; 0.05).</p><p><b>Figure 1.</b> Distribution of IBS-SSS Scores by Timepoint.</p><p>Oshin Khan, BS<sup>1</sup>; Subanandhini Subramaniam Parameshwari, MD<sup>2</sup>; Kristen Heitman, PhD, RDN<sup>1</sup>; Kebire Gofar, MD, MPH<sup>2</sup>; Kristin Goheen, BS, RDN<sup>1</sup>; Gabrielle Vanhouwe, BS<sup>1</sup>; Lydia Forsthoefel, BS<sup>1</sup>; Mahima Vijaybhai Vyas<sup>2</sup>; Saranya Arumugam, MBBS<sup>2</sup>; Peter Madril, MS, RDN<sup>1</sup>; Praveen Goday, MBBS<sup>3</sup>; Thangam Venkatesan, MD<sup>2</sup>; Kristen Roberts, PhD, RDN, LD, CNSC, FASPEN, FAND<sup>4</sup></p><p><sup>1</sup>The Ohio State University, Columbus, OH; <sup>2</sup>The Ohio State University Wexner Medical Center, Columbus, OH; <sup>3</sup>Nationwide Children's Hospital, Columbus, OH; <sup>4</sup>The Ohio State University, Granville, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Cyclic vomiting syndrome (CVS) is a disorder of gut-brain interaction characterized by recurrent spells of nausea, vomiting, and abdominal pain lasting hours to days. Estimated prevalence is approximately 2% in adults and children. Due to recalcitrant symptoms, patients might develop disordered eating patterns, though data are lacking. Identifying patients at risk for disordered eating patterns and malnutrition can help optimize nutritional status and may improve overall patient outcomes. The purpose of this study is to define nutrient intakes and dietary patterns of those with CVS seeking care at a tertiary care referral clinic, and to establish variation in dietary intakes based on disease severity.</p><p><b>Methods:</b> In this ongoing, cross-sectional study of adults diagnosed with CVS based on Rome IV criteria, participants are asked to complete validated surveys including a food frequency questionnaire (FFQ).
Baseline demographics and clinical characteristics including disease severity (defined by the number of episodes per year) were ascertained. Healthy eating index (HEI) scores (scale of 0-100) were calculated to assess diet quality with higher scores indicating better diet quality compared to lower scores. Those with complete data were included in this interim analysis.</p><p><b>Results:</b> Data from 33 participants with an average age of 40 ± 16 years and an average BMI of 28.6 ± 7.9 is presented. The cohort was predominately female (67%), white (79%) and with moderate to severe disease (76%). The malnutrition screening tool supported that 42% of participants were at risk of malnutrition independent of BMI status (p = 0.358) and disease severity (p = 0.074). HEI scores were poor amongst those with CVS (55) and did not differ based on disease severity (58 vs 54; p = 0.452). Energy intakes varied ranging from 416-3974 kcals/day with a median intake of 1562 kcals/day.</p><p><b>Conclusion:</b> In CVS, dietary intake is poor and there is a high risk of malnutrition regardless of disease severity and BMI. Providers and registered dietitian nutritionists must be aware of the high rates of malnutrition risk and poor dietary intakes in this patient population to improve delivery of dietary interventions. Insight into disordered eating and metabolic derangements may improve the understanding of dietary intakes in CVS.</p><p>Hannah Huey, MDN<sup>1</sup>; Holly Estes-Doetsch, MS, RDN, LD<sup>2</sup>; Christopher Taylor, PhD, RDN<sup>2</sup>; Kristen Roberts, PhD, RDN, LD, CNSC, FASPEN, FAND<sup>3</sup></p><p><sup>1</sup>Nationwide Children's Hospital, Columbus, OH; <sup>2</sup>The Ohio State University, Columbus, OH; <sup>3</sup>The Ohio State University, Granville, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Micronutrient deficiencies are common in Crohn's disease (CD) due to poor dietary absorption. Pregnancy has an increased nutritional demand and when coupled with a malabsorptive condition like CD, clinicians must closely monitor micronutrient status. However, there is a lack of evidence-based guidelines for clinicians when managing these complex patients leaving clinicians to use clinical judgement for management. A case study of a pregnant female with CD presents for delivery with an undetected fat-soluble vitamin deficiency. The impact of a vitamin K deficiency on the post-partum management of a patient with CD is presented along with potential assessment and treatment strategies. At 25 weeks gestation, the patient presented with a biochemical iron deficiency anemia, vitamin B12 deficiency, and zinc deficiency for which she was treated with oral supplementation and/or intramuscular injections. No assessment of fat-soluble vitamins during the gestation period was conducted despite a pre-pregnancy history of multiple micronutrient deficiencies. At 33 weeks gestation, the mother was diagnosed with preeclampsia and delivered the fetus at 35 weeks. After birth, the infant presented to the NICU with a mediastinal mass, abnormal liver function tests and initial coagulopathy. The mother experienced a uterine hemorrhage post-cesarean section. At this time her INR was 14.8 with a severe prolongation of the PT and PTT and suboptimal levels of blood clotting factors II, VII, IX, and X. 
The patient was diagnosed with a vitamin K deficiency and was treated initially with 10 mg daily by mouth x 3 days resulting in an elevated serum vitamin K while PT and INR were trending towards normal limits. At discharge she was recommended to take 1 mg daily by mouth of vitamin K to prevent further deficiency. PT and INR were the biochemical assays that were reassessed every 3 months since serum vitamin K is more reflective of recent intake. CD represents a complex disorder and the impact of pregnancy on micronutrient status is unknown. During pregnancy, patients with CD may require additional micronutrient monitoring particularly in the case of historical micronutrient deficiencies or other risk factors. This case presents the need for further research into CD-specific micronutrient deficiencies and the creation of specific supplementation guidelines and treatment algorithms for detection of micronutrient deficiencies in at-risk patients.</p><p><b>Methods:</b> None Reported.</p><p><b>Results:</b> None Reported.</p><p><b>Conclusion:</b> None Reported.</p><p>Gretchen Murray, BS, RDN<sup>1</sup>; Kristen Roberts, PhD, RDN, LD, CNSC, FASPEN, FAND<sup>2</sup>; Phil Hart, MD<sup>1</sup>; Mitchell Ramsey, MD<sup>1</sup></p><p><sup>1</sup>The Ohio State University Wexner Medical Center, Columbus, OH; <sup>2</sup>The Ohio State University, Granville, OH</p><p><b>Financial Support:</b> UL1TR002733.</p><p><b>Background:</b> Enteric hyperoxaluria (EH) and resultant lithiasis is well documented in many malabsorptive conditions including inflammatory bowel disease, celiac disease, short bowel syndrome, and post gastric bypass. Chronic pancreatitis (CP) often leads to exocrine pancreatic insufficiency (EPI) and subsequent fat malabsorption increasing the risk of EH secondary to calcium binding to dietary fat leaving oxalates available for colonic absorption. Modulating oxalate intake by reducing whole grains, greens, baked beans, berries, nuts, beer and chocolate while simultaneously improving hydration is the accepted medical nutrition therapy (MNT) for EH and calcium-oxalate stones. Although sources of dietary oxalate are well-known, there is limited literature regarding dietary oxalates in the CP diet, leaving a paucity of data to guide MNT for EH and calcium-oxalate lithiasis for these patients.</p><p><b>Methods:</b> A cross-sectional, case-control study was performed comparing subjects with CP to healthy controls. Vioscreen™ food frequency questionnaire was used to assess and quantify total oxalic acid intake in the CP cohort and to describe dietary sources. Descriptive statistics were used to describe dietary intake of oxalic acid and contributing food sources.</p><p><b>Results:</b> A total of 52 subjects with CP were included and had a mean age of 50 ± 15 years. Most subjects were male (n = 35; 67%). Mean BMI was 24 ± 6 kg/m2 and 8 subjects (15%) were classified as underweight by BMI. Median daily caloric intake was 1549 kcal with a median daily oxalic acid intake of 104 mg (range 11-1428 mg). The top three contributors to dietary oxalate intake were raw and cooked greens such as spinach or lettuce, followed by mixed foods such as pizza, spaghetti, and tacos and tea. 
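A brief sketch of how FFQ-derived oxalate estimates might be ranked by food group, as in the contributor ranking just described, is shown below; the food groups and milligram values are hypothetical and do not reflect the VioScreen output.

```python
# Illustrative sketch only: ranking food-group contributions to daily oxalate
# intake from FFQ-derived estimates. All values are hypothetical.
import pandas as pd

ffq = pd.DataFrame({
    "food_group": ["greens", "mixed foods", "tea", "potatoes", "refined grains", "greens"],
    "oxalate_mg": [180.0, 95.0, 60.0, 45.0, 40.0, 75.0],
})

by_group = ffq.groupby("food_group")["oxalate_mg"].sum().sort_values(ascending=False)
print(by_group)                        # top contributors first
print("total (mg/day):", by_group.sum())
```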
Other significant contributors (&gt;100 mg) to dietary oxalate intake included sports or meal replacement bars, cookies and cakes, potato products (mashed, baked, chips, fried), and refined grains (breads, tortillas, bagels).</p><p><b>Conclusion:</b> In the CP population, highest contributors to oxalate intake include greens, mixed foods, tea, meal replacement bars, some desserts, potatoes, and refined grains. Many of the identified dietary oxalate sources are not considered for exclusion in a typical oxalate restricted diet. A personalized approach to dietary oxalate modulation is necessary to drive MNT for EH prevention in those with CP.</p><p>Qian Ren, PhD<sup>1</sup>; Peizhan Chen, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine<sup>2</sup></p><p><sup>1</sup>Department of Clinical Nutrition, Shanghai Sixth People's Hospital Affiliated to Shanghai Jiao Tong University School of Medicine, Shanghai; <sup>2</sup>Clinical Research Center, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai</p><p><b>Financial Support:</b> This study was supported by the Youth Cultivation Program of Shanghai Sixth People's Hospital (Grant No. ynqn202223), the Key Laboratory of Trace Element Nutrition, National Health Commission of the Peoples’ Republic of China (Grant No. wlkfz202308), and the Danone Institute China Diet Nutrition Research and Communication (Grant No. DIC2023-06).</p><p><b>Background:</b> Low serum vitamin D status was reported to be associated with reduced muscle mass; however, it is inconclusive whether this relationship is causal. This study used data from the National Health and Nutrition Examination Survey (NHANES) and two-sample Mendelian randomization (MR) analyses to ascertain the causal relationship between serum 25-hydroxyvitamin D [25(OH)D] and appendicular muscle mass (AMM).</p><p><b>Methods:</b> In the NHANES 2011–2018 dataset, 11,242 participants aged 18–59 years old were included, and multivariant linear regression was performed to assess the relationship between 25(OH)D and AMM measured by dual-energy X-ray absorptiometry (Figure 1). In two-sample MR analysis, 167 single nucleotide polymorphisms significantly associated with serum 25(OH)D were applied as instrumental variables (IVs) to assess vitamin D effects on AMM in the UK Biobank (417,580 Europeans) using univariable and multivariable MR models (Figure 2).</p><p><b>Results:</b> In the NHANES 2011–2018 dataset, serum 25(OH)D concentrations were positively associated with AMM (β = 0.013, SE = 0.001, p &lt; 0.001) in all participants, after adjustment for age, race, season of blood collection, education, income, body mass index and physical activity. In stratification analysis by sex, males (β = 0.024, SE = 0.002, p &lt; 0.001) showed more pronounced positive associations than females (β = 0.003, SE = 0.002, p = 0.024). In univariable MR, genetically higher serum 25(OH)D levels were positively associated with AMM in all participants (β = 0.049, SE = 0.024, p = 0.039) and males (β = 0.057, SE = 0.025, p = 0.021), but only marginally significant in females (β = 0.043, SE = 0.025, p = 0.090) based on IVW models were noticed. No significant pleiotropy effects were detected for the IVs in the two-sample MR investigations. 
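For illustration of the univariable MR approach described above, a fixed-effect inverse-variance weighted (IVW) estimate can be computed from SNP-level summary statistics as sketched below; the effect sizes and standard errors are hypothetical, not study data.

```python
# Minimal sketch of a fixed-effect inverse-variance weighted (IVW) MR estimate
# from SNP-level summary statistics (all numbers hypothetical).
# beta_x: SNP effects on 25(OH)D; beta_y: SNP effects on AMM; se_y: SEs of beta_y.
import numpy as np

beta_x = np.array([0.08, 0.05, 0.11, 0.06])
beta_y = np.array([0.004, 0.002, 0.006, 0.003])
se_y   = np.array([0.002, 0.001, 0.003, 0.002])

weights = beta_x**2 / se_y**2
beta_ivw = np.sum(beta_x * beta_y / se_y**2) / np.sum(weights)  # causal effect estimate
se_ivw = np.sqrt(1 / np.sum(weights))                           # its standard error
print(round(beta_ivw, 3), round(se_ivw, 3))
```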
In MVMR analysis, a positive causal effect of 25(OH)D on AMM was observed in the total population (β = 0.116, SE = 0.051, p = 0.022), males (β = 0.111, SE = 0.053, p = 0.036) and females (β = 0.124, SE = 0.054, p = 0.021).</p><p><b>Conclusion:</b> Our results suggested a positive causal effect of serum 25(OH)D concentration on AMM; however, more research is needed to understand the underlying biological mechanisms.</p><p></p><p><b>Figure 1.</b> Working Flowchart of Participant Selection in the Cross-Sectional Study.</p><p></p><p><b>Figure 2.</b> The study assumptions of the two-sample Mendelian Randomization analysis between serum 25(OH)D and appendicular muscle mass. The assumptions include: (1) the genetic instrumental variables (IVs) should exhibit a significant association with serum 25(OH)D; (2) the genetic IVs should not associate with any other potential confounding factors; and (3) the genetic IVs must influence appendicular muscle mass only through serum 25(OH)D and not through any other confounders. The dotted lines indicate violations of the assumptions.</p><p>Qian Ren, PhD<sup>1</sup>; Junxian Wu<sup>1</sup></p><p><sup>1</sup>Department of Clinical Nutrition, Shanghai Sixth People's Hospital Affiliated to Shanghai Jiao Tong University School of Medicine, Shanghai</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> A healthy diet is essential for both preventing and treating type 2 diabetes mellitus (T2DM), which has a negative impact on public health. Whole grains, which are rich in dietary fibre, can serve as a good source of carbohydrates. However, the correlation and causality between whole grain intake and the risk of T2DM and glucose metabolism remain to be clarified.</p><p><b>Methods:</b> First, the National Health and Nutrition Examination Survey database 2003-2018 was used to investigate the correlation between dietary whole grain/fibre intake and the risk of T2DM and glucose metabolism. Then, based on the largest publicly published genome-wide association analysis of whole grain intake in the Neale lab database, single nucleotide polymorphisms (SNPs) that were significantly associated with whole grain intake were selected as instrumental variables (p &lt; 5×10<sup>-8</sup>, linkage disequilibrium r<sup>2</sup> &lt; 0.1). Inverse variance weighted (IVW) analysis, the weighted median method and other methods were used to analyze the causal relationship between whole grain intake and T2DM. Heterogeneity tests, gene pleiotropy tests and sensitivity analyses were performed to evaluate the stability and reliability of the results.</p><p><b>Results:</b> The results showed that dietary intakes of whole grains (OR = 0.999, 95% CI: 0.999 ~ 1.000, p = 0.004) and fibre (OR = 0.996, 95% CI: 0.993 ~ 0.999, p = 0.014) were negatively associated with the risk of T2DM. In the population with normal glucose metabolism, dietary fibre intake was negatively associated with FPG (β = -0.003, SE = 0.001, p &lt; 0.001), FINS (β = -0.023, SE = 0.011, p = 0.044), and HOMA-IR (β = -0.007, SE = 0.003, p = 0.023). In the population with abnormal glucose metabolism, dietary intakes of whole grains (β = -0.001, SE = 0.001, p = 0.036) and fibre (β = -0.006, SE = 0.002, p = 0.005) were negatively associated with HbA1c.
In further MR analyses, the IVW method demonstrated that for every one standard deviation increase in whole grain intake, T2DM risk was decreased by 1.9% (OR = 0.981, 95% CI: 0.970 ~ 0.993, β = -0.019, p = 0.002), with consistent findings for IVW-multiplicative random effects, IVW-fixed effects and IVW-radial. Meanwhile, MR-Egger regression analysis (intercept = -2.7 × 10<sup>-5</sup>, p = 0.954) showed that gene pleiotropy did not influence the results of the MR analysis. Leave-one-out analysis showed that no individual SNP unduly influenced the results (p<sub>heterogeneity</sub> = 0.445).</p><p><b>Conclusion:</b> Dietary intakes of whole grains may reduce the risk of T2DM and improve glucose metabolic homeostasis and insulin resistance. The causal relationship between whole grain intake and T2DM, as well as the optimal daily intake of whole grains, still needs to be explored further through large randomised controlled intervention studies and prospective cohort studies.</p><p>Hikono Sakata, Registered Dietitian<sup>1</sup>; Misa Funaki, Registered Dietitian<sup>2</sup>; Kanae Masuda, Registered Dietitian<sup>2</sup>; Rio Kurihara, Registered Dietitian<sup>2</sup>; Tomomi Komura, Registered Dietitian<sup>2</sup>; Masaru Yoshida, Doctor<sup>2</sup></p><p><sup>1</sup>University of Hyogo, Ashiya-shi, Hyogo; <sup>2</sup>University of Hyogo, Himezi-shi, Hyogo</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> In recent years, lifestyle-related diseases such as obesity, diabetes, and dyslipidemia have been recognized as problems, one of the causes of which is an excessively high-fat diet. Obesity and other related diseases are known to be risk factors for the severity of infectious diseases such as sepsis and novel coronavirus infection, but their pathomechanisms have not been clarified. We therefore hypothesized that a diet high in fat might induce functional changes not only in adipocytes but also in macrophages, weakening the immune response and leading to aggravation of infectious diseases. In this study, we performed proteome analysis and RNA sequencing analysis to examine what kind of gene and protein expression is induced in macrophages by high-fat diet loading.</p><p><b>Methods:</b> Four-week-old mice were divided into a normal diet (ND) group and a high-fat diet (HFD) group and kept for four weeks. Peritoneal macrophages were collected from mice that had been intraperitoneally injected with 2 mL of thioglycolate medium one week before dissection to promote macrophage proliferation, and were incubated at 37°C with 5% CO2 in Roswell Park Memorial Institute (RPMI) medium. After 2 hours of culture, floating cells were removed, and proteome analysis was performed using the recovered macrophages. In addition, RNA sequencing analysis was performed on RNA extracted from the macrophages.</p><p><b>Results:</b> Proteome analysis identified more than 4000 proteins in each group. In the HFD group, compared to the ND group, decreased expression of proteins involved in phagocytosis, such as immunoglobulin and eosinophil peroxidase, was observed. In addition, RNA sequencing analysis also showed a decrease in the expression levels of genes related to phagocytosis, mirroring the decrease observed in the proteome analysis.</p><p><b>Conclusion:</b> From the above, it was suggested that the phagocytic ability of macrophages is reduced by high-fat diet loading.
This research is expected to clarify the molecular mechanisms by which high-fat dietary loading induces the expression of genes and proteins and induces immunosuppressive effects.</p><p>Benjamin Davies, BS<sup>1</sup>; Chloe Amsterdam, BA<sup>1</sup>; Basya Pearlmutter, BS<sup>1</sup>; Jackiethia Butsch, C-CHW<sup>2</sup>; Aldenise Ewing, PhD, MPH, CPH<sup>3</sup>; Erin Holley, MS, RDN, LD<sup>2</sup>; Subhankar Chakraborty, MD, PHD<sup>4</sup></p><p><sup>1</sup>The Ohio State University College of Medicine, Columbus, OH; <sup>2</sup>The Ohio State University Wexner Medical Center, Columbus, OH; <sup>3</sup>The Ohio State University College of Public Health, Columbus, OH; <sup>4</sup>The Ohio State University Wexner Medical Center, Dublin, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Food insecurity (FI) refers to lack of consistent access to sufficient food for an active, healthy life. This issue, influenced by economic, social, and environmental factors, disproportionately affects disadvantaged populations. According to the United States Department of Agriculture, more than 10% of U.S. households experience FI, leading to physical and mental health consequences. FI has been linked to poorer dietary quality, increased consumption of processed, calorie-dense foods, and a higher prevalence of obesity, diabetes, and cardiovascular diseases. Chronic stress from FI can also trigger or exacerbate mental health disorders, including depression and anxiety, further complicating health outcomes. Affecting ~20% of the global population, dyspepsia significantly diminishes quality of life and increases healthcare costs. Its etiology is multifactorial, involving abnormal gastric motility, visceral hypersensitivity, inflammation, and psychosocial stressors. Although research on the link between FI and disorders of gut-brain interaction (DGBIs) like dyspepsia is limited, emerging evidence suggests a bidirectional relationship. Therefore, this study was designed to examine the association between FI, dyspepsia, and other health-related social needs (HRSN). Our hypotheses include 1) patients with FI have more severe dyspepsia symptoms, and 2) FI is associated with HRSN in other domains.</p><p><b>Methods:</b> Patients presenting to a specialty motility clinic were prospectively enrolled into a registry created with the goal of holistically investigating the pathophysiology of DGBIs. Validated questionnaires for HRSN and dyspepsia were completed prior to their clinic visits. Data were managed with REDCap and statistical analyses were performed using SPSS.</p><p><b>Results:</b> 53 patients completed the questionnaires. 88.7% of patients were White and 73.6% were female with an average age of 45.6 years (21-72) and BMI of 28.7 kg/m<sup>2</sup> (17.8-51.1). FI was present in 13 (24.5%) patients. The overall severity of dyspepsia symptoms was significantly less in the food secure patients (13.8 vs. 18.8, p = 0.042). Of the four subscales of dyspepsia (nausea-vomiting, postprandial fullness, bloating, and loss of appetite), only loss of appetite was significantly greater in those with FI (2.3 vs. 1.3, p = 0.017). Patients with FI were more likely to be at medium (61.5% vs. 5.0%) or high risk (30.8% vs. 2.5%, p &lt; 0.001) of financial hardship, experience unmet transportation needs (38.5% vs. 5.0%, p = 0.0.019) and housing instability (30.8% vs. 5.0%, p = 0.023) compared to those who were food secure. They were also at higher risk of depression (54% vs. 
12.5%, p = 0.005) and of reporting insufficient physical activity (92.3% vs. 55.0%, p = 0.05). After adjusting for age, gender, race, and BMI, FI was not a predictor of global dyspepsia severity. Greater BMI (O.R. 0.89, 95% C.I. 0.81-0.98) was associated with severity of early satiety. Female gender (O.R. 10.0, 95% C.I. 1.3-76.9, p = 0.03) was associated with the severity of nausea. Greater BMI (O.R. 1.10, 95% C.I. 1.001-1.22, p = 0.048) and female gender (O.R. 10.8, 95% C.I. 1.6-72.9, p = 0.015) both correlated with severity of postprandial fullness.</p><p><b>Conclusion:</b> FI affected ~25% of patients seen in our clinic. It was, however, not an independent predictor of overall severity of dyspepsia symptoms. Patients who experienced FI had a higher prevalence of other HRSN and a higher risk of depression and physical inactivity. Our results highlight the importance of considering FI and other HRSN in the management of dyspepsia. Understanding this interaction is essential for improving clinical outcomes and guiding public health interventions.</p><p>Ashlesha Bagwe, MD<sup>1</sup>; Chandrashekhara Manithody, PhD<sup>1</sup>; Kento Kurashima, MD, PhD<sup>1</sup>; Shaurya Mehta, BS<sup>1</sup>; Marzena Swiderska-Syn<sup>1</sup>; Arun Verma, MD<sup>1</sup>; Austin Sims<sup>1</sup>; Uthayashanker Ezekiel<sup>1</sup>; Ajay Jain, MD, DNB, MHA<sup>1</sup></p><p><sup>1</sup>Saint Louis University, St. Louis, MO</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Farnesoid X receptor (FXR), a gut nuclear receptor, regulates intestine-driven bile acid homeostasis in short bowel syndrome. Chenodeoxycholic acid (CDCA), a primary bile acid, acts as an FXR ligand. Stem cell-derived intestinal enteroids offer a valuable system to study intestinal FXR function. We hypothesized that transfection of porcine enteroids with small interfering RNA (siRNA) would modulate FXR to further define its mechanistic role.</p><p><b>Methods:</b> We developed a porcine protocol for Matrigel-based 3D culture systems to generate enteroids from the small bowel of neonatal Yorkshire pigs. After 7 days, cultures were passaged and expanded. RNA strands for Dicer-substrate siRNAs (DsiRNAs) were synthesized as single-strand RNAs (ssRNAs) by Integrated DNA Technologies and resuspended in an RNase-free buffer. The DsiRNAs targeted the Sus scrofa farnesoid X receptor (FXR) gene (KF597010.1). Three sets of DsiRNA were made for gene-specific silencing of the FXR gene. Porcine enteroids were cultured and transfected with FXR-specific siRNA and control siRNA using Lipofectamine RNAiMAX reagent. They were also treated with escalating CDCA concentrations (25 μM to 50 μM) for 24 and 48 hours. FXR mRNA levels were quantified by real-time PCR, and functional assays were performed to assess changes in bile acid uptake and efflux following transfection.</p><p><b>Results:</b> Data from 3 separate intestinal crypt experiments consistently showed enhanced FXR expression with CDCA versus control (p &lt; 0.01). CDCA treatment resulted in a dose-dependent increase in FXR mRNA expression, reaching peak levels after 48 hours of exposure (2.8X increase) with enhanced nuclear localization. Functionally, CDCA-treated enteroids displayed increased bile acid uptake and reduced efflux, validating FXR's role in mediating bile acid-driven enterohepatic circulation. Several runs with siRNA were conducted.
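Relative mRNA quantification from real-time PCR is commonly reported with the 2^-ΔΔCt method; the sketch below assumes that approach (the abstract does not specify the calculation), and the Ct values and GAPDH reference gene are hypothetical.

```python
# Illustrative only: relative FXR mRNA quantification with the 2^-ΔΔCt method
# (an assumption; the abstract states real-time PCR but not the calculation),
# and percent knockdown versus the scrambled-siRNA control. Cts are hypothetical.

def relative_expression(ct_target, ct_reference, ct_target_ctrl, ct_reference_ctrl):
    """2^-ΔΔCt relative to the control condition, normalized to a reference gene."""
    delta_ct_treated = ct_target - ct_reference
    delta_ct_control = ct_target_ctrl - ct_reference_ctrl
    return 2 ** -(delta_ct_treated - delta_ct_control)

# FXR-siRNA sample vs. scrambled control, each normalized to GAPDH (hypothetical Cts)
rel = relative_expression(ct_target=26.4, ct_reference=18.1,
                          ct_target_ctrl=24.6, ct_reference_ctrl=18.0)
print(f"relative FXR expression: {rel:.2f}")
print(f"percent knockdown: {100 * (1 - rel):.0f}%")
```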
Using 30 pmol siRNA (sense: 5' GAUCAACAGAAUCUUCUUCAUUATA 3' and antisense: 5' UAUAAUGAAGAAGAUUCUGUUGAUCUG 3'), there was a 68% reduction in FXR expression compared with the scrambled control. FXR silencing led to decreased bile acid uptake and increased efflux. No significant effects were observed in enteroids transfected with control siRNA. Paradoxically, CDCA-treated cultures showed a higher proportion of immature relative to mature enteroids.</p><p><b>Conclusion:</b> In porcine enteroids, CDCA treatment increased FXR gene expression and promoted its nuclear localization. This finding implies the existence of a positive feedback cycle in which CDCA, an FXR ligand, induces further synthesis and uptake. siRNA transfection was able to significantly decrease FXR activity. By employing this innovative methodology, one can effectively examine the function of FXR in ligand-treated or control systems.</p><p><b>Pediatric, Neonatal, Pregnancy, and Lactation</b></p><p>Kento Kurashima, MD, PhD<sup>1</sup>; Si-Min Park, MD<sup>1</sup>; Arun Verma, MD<sup>1</sup>; Marzena Swiderska-Syn<sup>1</sup>; Shaurya Mehta, BS<sup>1</sup>; Austin Sims<sup>1</sup>; Mustafa Nazzal, MD<sup>1</sup>; John Long, DVM<sup>1</sup>; Chandrashekhara Manithody, PhD<sup>1</sup>; Shin Miyata, MD<sup>1</sup>; Ajay Jain, MD, DNB, MHA<sup>1</sup></p><p><sup>1</sup>Saint Louis University, St. Louis, MO</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> North American Society for Pediatric Gastroenterology, Hepatology and Nutrition (NASPGHAN) Florida.</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Biliary atresia (BA) is a major cause of obstructive neonatal cholestatic disease. Although hepato-portoenterostomy (HPE) is routinely performed for BA patients, more than half eventually require liver transplantation. The intricate mechanisms of bile ductular injury driving its pathogenesis remain elusive and are not recapitulated in current small animal and rodent models that rely on bile duct ligation. Addressing prevailing lacunae, we hypothesized that extra- and intrahepatic bile duct destruction through an endothelial irritant would recapitulate the human condition. We have thus developed a novel neonatal piglet BA model called ‘BATTED’. Piglets have liver and gastro-intestinal homology to human infants and share anatomic and physiological processes, providing a robust platform for BATTED and HPE.</p><p><b>Methods:</b> Six 7-10-day-old piglets were randomized to BATTED (US provisional Patent US63/603,995) or sham surgery. BATTED included cholecystectomy, common bile duct and hepatic duct injection of 95% ethanol, and a retainer suture for continued ethanol-induced intrahepatic injury. Vascular access ports were placed and scheduled ultrasound-guided liver biopsies were performed. Six weeks post-BATTED, piglets underwent HPE. Eight weeks after the initial surgery, animals were euthanized. Serology, histology, gene expression and immunohistochemistry were performed.</p><p><b>Results:</b> Serological evaluation revealed a surge in conjugated bilirubin 6 weeks after the BATTED procedure from baseline (mean Δ 0.39 mg/dL to 3.88 mg/dL). Gamma-glutamyl transferase (GGT) also exhibited a several-fold increase (mean: Δ 16.3 IU to 89.5 IU). Sham did not display these elevations (conjugated bilirubin: Δ 0.39 mg/dL to 0.98 mg/dL, GGT: Δ 9.2 IU to 10.4 IU). Sirius red staining demonstrated significant periportal and diffuse liver fibrosis (16-fold increase) and the bile duct proliferation marker CK-7 increased 9-fold with BATTED.
Piglets in the BATTED group demonstrated enhanced CD-3 (7-fold), alpha-SMA (8.85-fold), COL1A1 (11.7-fold) and CYP7A1 (7-fold), versus sham. Successful HPE was accomplished in piglets with improved nutritional status and a reduction in conjugated bilirubin (Δ 4.89 mg/dL to 2.11 mg/dL).</p><p><b>Conclusion:</b> BATTED replicated BA features, including hyperbilirubinemia, GGT elevation, significant hepatic fibrosis, bile duct proliferation, and inflammatory infiltration, with subsequent successful HPE. This model offers substantial opportunities to elucidate the mechanisms underlying BA and adaptation post HPE, paving the way for the development of diagnostics and therapeutics.</p><p>Sirine Belaid, MBBS, MPH<sup>1</sup>; Vikram Raghu, MD, MS<sup>1</sup></p><p><sup>1</sup>UPMC, Pittsburgh, PA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> At our institution, pediatric residents are responsible for managing patients with intestinal failure (IF) in the inpatient units. However, they have reported feelings of inexperience and anxiety when dealing with these complex cases. This project aimed to identify knowledge gaps and evaluate the confidence levels of pediatric residents in managing IF patients.</p><p><b>Methods:</b> We conducted an online needs assessment survey using Qualtrics, which included Likert-scale, multiple-choice, open-ended and rating questions to assess residents' confidence levels (from 1 to 10) in performing tasks related to IF patient care. This voluntary survey, approved by the IRB as exempt, was distributed to all pediatric residents at the University of Pittsburgh via QR codes and emails.</p><p><b>Results:</b> Overall, 32% of residents completed the survey, with nearly 50% having completed a rotation on the Intestinal Rehabilitation (IR) service. Residents reported the lowest confidence in calculating Total Parenteral Nutrition (TPN)-like Intravenous (IV) fluids (solutions administered centrally or peripherally, containing dextrose and major electrolytes designed to match the patients’ home TPN contents), identifying signs of D-lactic acidosis and small intestinal bacterial overgrowth (SIBO), and managing SIBO (average confidence rating of 2/10). They also expressed low confidence in ordering home TPN and TPN-like IV fluids, understanding related anatomy, ensuring proper stoma care, managing central access loss, and addressing poor catheter blood flow (average confidence ratings of 3-4/10). Conversely, residents felt more confident managing feeding intolerance and central line-associated infections (average confidence ratings of 5-6/10). Additionally, they rated their ability to identify signs of septic and hypovolemic shock as 8/10, though they felt less confident in managing these conditions (7/10). Furthermore, 64% of respondents agreed that managing IF patients is educationally valuable and preferred laminated cards or simulated sessions as educational resources (average ratings of 8 and 6.8 out of 10, respectively).</p><p><b>Conclusion:</b> The survey highlights several areas where pediatric residents need further education.
Addressing these knowledge gaps through targeted curricular interventions can better equip residents to manage IF patients and potentially increase their interest in this specialty.</p><p></p><p>CLABSI = Central Line-Associated Bloodstream Infection, TPN = Total Parenteral Nutrition, EHR = Electronic Health Record, IV = Intravenous, SIBO = Small Intestinal Bacterial Overgrowth.</p><p><b>Figure 1.</b> Tasks related to managing patients with Intestinal Failure (IF), stratified into three categories based on the average confidence rating score (&gt;= 7/10, 5-6/10, &lt;=4/10) of pediatric residents.</p><p></p><p><b>Figure 2.</b> Distribution of pediatric residents’ opinions on the educational value of managing patients with intestinal failure.</p><p>Alyssa Ramuscak, MHSc, MSc<sup>1</sup>; Inez Martincevic, MSc<sup>1</sup>; Hebah Assiri, MD<sup>1</sup>; Estefania Carrion, MD<sup>2</sup>; Jessie Hulst, MD, PhD<sup>1</sup></p><p><sup>1</sup>The Hospital for Sick Children, Toronto, ON; <sup>2</sup>Hospital Metropolitano de Quito, Quito, Pichincha</p><p><b>Financial Support:</b> Nestle Health Science Canada, North York, Ontario, Canada.</p><p><b>Background:</b> Enteral nutrition provides fluids and nutrients to individuals unable to meet needs orally. Recent interest in real food-based formulas highlights a shift towards providing nutrition with familiar fruit, vegetable and protein ingredients. This study aimed to evaluate the tolerability and nutritional adequacy of a hypercaloric, plant-based, real food ingredient formula in pediatric tube-fed patients.</p><p><b>Methods:</b> This prospective, single-arm, open-label study evaluated the tolerability and nutritional adequacy of a hypercaloric, plant-based formula, Compleat® Junior 1.5, in medically complex, stable, pediatric tube-fed patients aged 1-13 years. Participants were recruited from outpatient clinics at The Hospital for Sick Children, Toronto from May 2023 to June 2024. Demographic and anthropometric measurements (weight, height) were obtained at baseline. The daily dose of study product was isocaloric compared with routine feeds. Total fluid needs were met by adding water. Participants transitioned to the study formula over 3 days, followed by 14 days of exclusive use of the study product. Caregivers monitored volume of daily intake, feed tolerance, and bowel movements during the transition and study period using an electronic database (Medrio®). An end-of-study visit was conducted to collect weight measurements. Descriptive statistics summarize demographic and clinical characteristics (Table 1). The paired t-test compared weight-for-age and BMI-for-age z-scores between baseline and end of study. Symptoms of intolerance and bowel movements, assessed using either the Bristol Stool Scale or the Brussels Infant and Toddler Stool Scale, were described as frequency of events and compared between baseline and the intervention period. The percentages of calorie and protein goals met during the study period were calculated as the amount of calories received relative to the amount prescribed, and the amount of protein received relative to the dietary reference intake for age and weight.</p><p><b>Results:</b> In total, 27 ambulatory pediatric participants with a median age of 5.5 years (IQR, 2.5-7) were recruited for the study, with 26 completing (Table 1). Participant weight-for-age and BMI-for-age z-scores significantly improved between baseline and end of study, from -1.75 ± 1.93 to -1.67 ± 1.88 (p &lt; 0.05), and from -0.47 ± 1.46 to 0.15 ± 0.23 (p &lt; 0.05), respectively.
There was no significant difference in the frequency of any GI symptom, including vomiting, gagging/retching, tube venting or perceived pain/discomfort with feeds, between baseline and end of study. There was no significant difference in frequency or type of stool between baseline and end of study. Study participants met 100% of their prescribed energy for the majority (13 ± 1.7 days) of the study period. All participants exceeded protein requirements during the study period. Twenty families (76.9%) indicated wanting to continue to use the study product after completing the study.</p><p><b>Conclusion:</b> This prospective study demonstrated that a hypercaloric, plant-based, real food ingredient formula was well tolerated among stable yet medically complex children and was calorically adequate to maintain or facilitate weight gain over a 14-day study period. The majority of caregivers preferred to continue use of the study product.</p><p><b>Table 1.</b> Demographic and Clinical Characteristics of Participants (n = 27).</p><p></p><p><b>Poster of Distinction</b></p><p>Gustave Falciglia, MD, MSCI, MSHQPS<sup>1</sup>; Daniel Robinson, MD, MSCI<sup>1</sup>; Karna Murthy, MD, MSCI<sup>1</sup>; Irem Sengul Orgut, PhD<sup>2</sup>; Karen Smilowitz, PhD, MS<sup>3</sup>; Julie Johnson, MSPH PhD<sup>4</sup></p><p><sup>1</sup>Northwestern University Feinberg School of Medicine, Chicago, IL; <sup>2</sup>University of Alabama Culverhouse College of Business, Tuscaloosa, AL; <sup>3</sup>Northwestern University Kellogg School of Business &amp; McCormick School of Engineering, Evanston, IL; <sup>4</sup>University of North Carolina School of Medicine, Chapel Hill, NC</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> Children's Hospital Neonatal Consortium (CHNC) Annual Conference, November 1, 2021, Houston, TX.</p><p><b>Financial Support:</b> None Reported.</p><p>Lyssa Lamport, MS, RDN, CDN<sup>1</sup>; Abigail O'Rourke, MD<sup>2</sup>; Barry Weinberger, MD<sup>2</sup>; Vitalia Boyar, MD<sup>2</sup></p><p><sup>1</sup>Cohen Children's Medical Center of New York, Port Washington, NY; <sup>2</sup>Cohen Children's Medical Center of NY, New Hyde Park, NY</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Premature infants in the neonatal intensive care unit (NICU) are at high risk for peripheral intravenous catheter infiltration (PIVI) because of the frequent cannulation of small and fragile vessels. The most common infusate in neonates is parenteral nutrition (PN), followed by antibiotics. Previous reports have suggested that the intrinsic properties of infusates, such as pH, osmolality, and calcium content, determine the severity of PIVIs. This has led to the common practice of restricting the intake of protein and/or calcium to less than that which is recommended for optimal growth and bone mineralization.</p><p><b>Methods:</b> Our objective was to identify the characteristics of infants and intravenous (IV) infusates that were associated with the development of severe neonatal PIVIs. We conducted a retrospective analysis of PIVIs in our level IV NICU from 2018-2022 (n = 120). Each PIVI was evaluated by a wound-certified neonatologist and classified as mild, moderate, or severe using a scoring system based on the Infusion Nurses Society (INS) staging criteria.
Comparisons between groups were made using ANOVA and chi-square analysis, or Mann-Whitney tests for non-parametric data.</p><p><b>Results:</b> Infants with severe PIVIs had a lower mean birthweight than those with mild or moderate PIVIs (1413.1 g vs 2116.9 g and 2020.3 g respectively, p = .01) (Table 1). Most PIVIs occurred during infusions of PN and lipids, but the severity was not associated with the infusion rate, osmolality, or with the concentration of amino acids (median 3.8 g/dL in the mild group, 3.5 g/dL in the moderate group and 3.4 g/dL in the severe group) or calcium (median 6500 mg/L in all groups) (Table 2, Figure 1). Of note, the infusion of IV medications within 24 hours of PIVI was most common in the severe PIVI group (p = .03) (Table 2). Most PIVIs, including mild, were treated with hyaluronidase. Advanced wound care was required for 4% of moderate and 44% of severe PIVIs, and none required surgical intervention.</p><p><b>Conclusion:</b> Severe PIVIs in the NICU are most likely to occur in infants with a low birthweight and within 24 hours of administration of IV medications. This is most likely because medications must be kept at acidic or basic pH for stability, and many have high osmolarity and/or intrinsic caustic properties. Thus, medications may induce chemical phlebitis and extravasation with inflammation. In contrast, PN components, including amino acids and calcium, are not related to the severity of extravasations. Our findings suggest that increased surveillance of IV sites for preterm infants following medication administration may decrease the risk of severe PIVIs. Conversely, reducing or withholding parenteral amino acids and/or calcium to mitigate PIVI risk may introduce nutritional deficiencies without decreasing the risk of clinically significant PIVIs.</p><p><b>Table 1.</b> Characteristic Comparison of Mild, Moderate, and Severe PIVIs in Neonatal ICU. PIVI Severity Was Designated Based on INS Criteria.</p><p></p><p><b>Table 2.</b> Association of Medication Administration and Components of Infusates With the Incidence and Severity of PIVI in NICU.</p><p></p><p></p><p><b>Figure 1.</b> Infusate Properties.</p><p>Stephanie Oliveira, MD, CNSC<sup>1</sup>; Josie Shiff<sup>2</sup>; Emily Romantic, RD<sup>3</sup>; Kathryn Hitchcock, RD<sup>4</sup>; Gillian Goddard, MD<sup>4</sup>; Paul Wales, MD<sup>5</sup></p><p><sup>1</sup>Cincinnati Children's Hospital Medical Center, Mason, OH; <sup>2</sup>University of Cincinnati, Cincinnati, OH; <sup>3</sup>Cincinnati Children's Hospital Medical Center, Cincinnati, OH; <sup>4</sup>Cincinnati Children's Hospital, Cincinnati, OH; <sup>5</sup>Cincinnati Children's Hospital Medical Center, Cincinnati, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> It is common for children with intestinal failure on parenteral nutrition to be fed an elemental enteral formula, as these formulas are believed to be better tolerated due to the protein component being free amino acids, the absence of other allergens, and the presence of long chain fatty acids. In February 2022, a popular elemental formula on the market was recalled due to bacterial contamination that necessitated an immediate transition to an alternative enteral formula. This included initiating plant-based options for some of our patients. We have experienced a growing interest and request from families to switch to plant-based formulas due to religious practices, cost concerns, and personal preferences.
While plant-based formulas lack major allergens and may contain beneficial soluble fiber, they are under studied in this patient population. This study aims to determine if growth was affected amongst children with intestinal failure on parenteral nutrition who switched from elemental to plant-based formulas.</p><p><b>Methods:</b> We conducted a retrospective cohort study of IF patients on PN managed by our intestinal rehabilitation program who transitioned from elemental to plant-based formulas during the product recall. Data were collected on demographics, intestinal anatomy, formula intake, parenteral nutrition support, tolerance, stool consistency, and weight gain for 6 months before and after formula transition. Paired analyses were performed for change in growth and nutritional intake, using the Wilcoxon-signed rank test. Chi-squared tests were performed to compare formula tolerance. An alpha value &lt; 0.05 was considered significant.</p><p><b>Results:</b> Eleven patients were included in the study [8 Males; median gestational age 33 (IQR = 29, 35.5) weeks, median age at assessment 20.4 (IQR = 18.7,29.7) months]. All participants had short bowel syndrome (SBS) as their IF category. Residual small bowel length was 28(IQR = 14.5,47.5) cm. Overall, there was no statistically significant difference in growth observed after switching to plant-based formulas (p = 0.76) (Figure 1). Both median enteral formula volume and calorie intake were higher on plant-based formula, but not statistically significant (p = 0.83 and p = 0.41) (Figure 2). 7 of 11 patients (64%) reported decreased stool count (p = 0.078) and improved stool consistency (p = 0.103) after switching to plant-based formula. Throughout the study, the rate of PN calorie and volume weaning were not different after switching to plant-based formula (calories: p = 0.83; volume: p = 0.52) (Figure 3).</p><p><b>Conclusion:</b> In this small study of children with IF, the switch from free amino acid formula to an intact plant-based formula was well tolerated. Growth was maintained between groups. After switching to plant-based formulas these children tolerated increased enteral volumes, but we were underpowered to demonstrate a statistical difference. There was no evidence of protein allergy among children who switched. Plant-based formulas may be an alternative option to elemental formulas for children with intestinal failure.</p><p></p><p><b>Figure 1:</b> Change in Weight Gain (g/day) on Elemental and Plant-Based Formulas.</p><p></p><p><b>Figure 2.</b> Change in Enteral Calories (kcal/kg/day) and Volume (mL/kg/day) on Elemental and Plant-Based Formulas.</p><p></p><p><b>Figure 3.</b> Change in PN Calories (kcal/kg/day) and Volume (mL/kg/day) on Elemental and Plant-Based Formulas.</p><p>Carly McPeak, RD, LD<sup>1</sup>; Amanda Jacobson-Kelly, MD, MSc<sup>1</sup></p><p><sup>1</sup>Nationwide Children's Hospital, Columbus, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> In pediatrics, post-pyloric enteral feeding via jejunostomy, gastrojejunostomy or naso-jejunostomy tubes is increasingly utilized to overcome problems related to aspiration, severe gastroesophageal reflux, poor gastric motility, and gastric outlet obstruction. Jejunal enteral feeding bypasses the duodenum, a primary site of absorption for many nutrients. Copper is thought to be primarily absorbed in the stomach and proximal duodenum, thus patients receiving nutritional support in a way that bypasses this region can experience copper deficiency. 
The prevalence and complications of copper deficiency in the pediatric population are not well documented.</p><p><b>Methods:</b> This was a retrospective case series of two patients treated at Nationwide Children's Hospital (NCH). Medical records were reviewed to collect laboratory, medication/supplement data and enteral feeding history. Both patients were receiving Pediasure Peptide® as their enteral formula.</p><p><b>Results:</b> Case 1: A 14-year-old male who had been receiving exclusive post-pyloric enteral nutrition for two years presented with pancytopenia and worsening anemia. Laboratory data were drawn in 3/2017 and demonstrated deficient levels of copper (&lt; 10 ug/dL) and ceruloplasmin (20 mg/dL). Repletion was initiated with intravenous cupric chloride 38 mcg/kg/day for 3 days and then transitioned to 119 mcg/kg/day via j-tube for 4 months. Labs were redrawn 2 months after the initial episode of deficiency and indicated overall improvement of the pancytopenia (Table 1). After 4 months, cupric chloride was decreased to 57 mcg/kg/day as a maintenance dose. Laboratory data were redrawn two and a half years after the initial episode of deficiency and revealed deficient levels of copper (27 ug/dL) and ceruloplasmin (10 mg/dL) despite the lower-dose supplementation being administered. Case 2: An 8-year-old female had been receiving exclusive post-pyloric enteral nutrition for 3 months. Laboratory data were drawn in 3/2019 and revealed deficient levels of copper (38 ug/dL) and ceruloplasmin (13 mg/dL). Supplementation with cupric chloride 50 mcg/kg/day was administered daily through the jejunal tube. Copper and ceruloplasmin labs were redrawn at 11 months and 15 months after initiation of supplementation and revealed continued deficiency, though hematologic values remained stable (Table 2).</p><p><b>Conclusion:</b> There are currently no clinical guidelines for the prevention, screening, treatment, and maintenance of copper deficiency in post-pyloric enteral feeding in pediatrics. Current dosing for copper repletion in profound copper deficiency is largely based on case series and expert opinions. At NCH, the current standard-of-care supplementation demonstrates inconsistent improvement in copper repletion, as evidenced by the case reports discussed above. Future research should determine appropriate supplementation and evaluate its efficacy in patients with post-pyloric enteral feeding.</p><p><b>Table 1.</b> Laboratory Evaluation of Case 1.</p><p></p><p>'-' indicates no data available; bolded values indicate results below the lower limit of normal for age.</p><p><b>Table 2.</b> Laboratory Evaluation of Case 2.</p><p></p><p>'-' indicates no data available; bolded values indicate results below the lower limit of normal for age.</p><p>Meighan Marlo, PharmD<sup>1</sup>; Ethan Mezoff, MD<sup>1</sup>; Shawn Pierson, PhD, RPh<sup>1</sup>; Zachary Thompson, PharmD, MPH, BCPPS<sup>1</sup></p><p><sup>1</sup>Nationwide Children's Hospital, Columbus, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Parenteral nutrition (PN) remains a high-risk therapy, typically containing more than 40 different ingredients. Incorporation of PN prescribing into the electronic health record (EHR) has been recommended to minimize the risk of transcription errors and allow for implementation of important safety alerts for prescribers. Ambulatory PN prescribing workflows typically require manual transcription of orders in the absence of robust EHR interoperability between systems used for transitions of care.
Pediatric patients receiving ambulatory PN have an increased risk of medication events due to the need for weight-based customization and the low utilization of ambulatory PN, which leaves many pharmacies inexperienced. Development and implementation of a prescribing system incorporating inpatient and outpatient functionality within the EHR is necessary to improve the quality and safety of ambulatory PN in pediatric patients. The primary goal was to create a workflow that improved transitions of care through minimization of manual transcription and improved medication safety. We describe modification of standard EHR tools to achieve this aim.</p><p><b>Methods:</b> A multidisciplinary team developed and incorporated ambulatory PN prescribing within the EHR at Nationwide Children's Hospital. Inpatient and outpatient variances, safety parameters to provide appropriate alerts to prescribers, and legal requirements were considered and evaluated for optimization within the new system.</p><p><b>Results:</b> The final product successfully incorporated ambulatory PN prescribing while allowing seamless transfer of prescriptions between care settings. The prescriber orders the patient-specific ambulatory PN during an inpatient or outpatient encounter. The order is subsequently queued for pharmacist review/verification to assess the adjustments and determine extended stability considerations in the ambulatory setting. After pharmacist review, the prescription prints and is signed by the provider to be faxed to the pharmacy.</p><p><b>Conclusion:</b> To our knowledge, ours is the first institution to develop and incorporate pediatric PN prescribing into the EHR that transfers between the inpatient and outpatient settings independent of manual transcription while still allowing for customization of PN.</p><p>Faith Bala, PhD<sup>1</sup>; Enas Alshaikh, PhD<sup>1</sup>; Sudarshan Jadcherla, MD<sup>1</sup></p><p><sup>1</sup>The Research Institute at Nationwide Children's Hospital, Columbus, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Extrauterine growth remains a concern among preterm-born infants admitted to the neonatal ICU (NICU) as it rarely matches intrauterine fetal growth rates. Although the reasons are multifactorial, the role played by the duration of exclusive parenteral nutrition (EPN) and the transition to reach the exclusive enteral nutrition (EEN) phase remains unclear. Significant nutrient deficits can exist during the critical phase from birth to EEN, and thereafter, and these likely impact short- and long-term outcomes. Given this rationale, our aims were to examine the relationship of the duration from birth to EEN with growth and length of hospital stay (LOHS) among convalescing preterm-born infants with oral feeding difficulties.</p><p><b>Methods:</b> This is a retrospective analysis of prospectively collected data from 77 preterm infants admitted to the all-referral level IV NICU at Nationwide Children's Hospital, Columbus, Ohio, who were later referred to our innovative neonatal and infant feeding disorders program for the evaluation and management of severe feeding/aero-digestive difficulties. Inclusion criteria: infants born &lt; 32 weeks gestation, birthweight &lt; 1500 g, absence of chromosomal/genetic disorders, discharged at term equivalent postmenstrual age (37-42 weeks, PMA) on full oral feeding.
Growth variables were converted to age- and gender-specific Z-scores using the Fenton growth charts. Using the Academy of Nutrition and Dietetics criteria for neonates and preterm populations, extrauterine growth restriction (EUGR) was defined as a weight Z-score decline from birth to discharge of &gt; 0.8. Clinical characteristics stratified by EUGR status were compared using the Chi-Square test, Fisher exact test, Mann-Whitney U test, and T-test as appropriate. Multivariate regression was used to explore and assess the relationship between the duration from birth to EEN and growth Z-scores at discharge simultaneously. Multiple linear regression was used to assess the relationship between the duration from birth to EEN and LOHS.</p><p><b>Results:</b> Forty-two infants (54.5%) had EUGR at discharge, and the proportions of infants with weight and length percentiles &lt; 10% were significantly greater at discharge than at birth (Table 1). The growth-restricted infants at discharge had a significantly lower birth gestational age, more often required mechanical ventilation at birth, had a higher incidence of sepsis, and took longer to attain EEN (Table 2). The duration from birth to EEN was significantly negatively associated with weight, length, and head circumference Z-scores at discharge. Likewise, the duration from birth to EEN was significantly positively associated with the LOHS (Figure 1).</p><p><b>Conclusion:</b> The duration from birth to exclusive enteral nutrition (EEN) can influence growth outcomes. We speculate that significant gaps between recommended and actual nutrient intake exist during the period to EEN, particularly for those with chronic tube-feeding difficulties. Well-planned and personalized nutrition is relevant until EEN is established, and enteral nutrition advancement strategies using standardized feeding protocols offer prospects for generalizability while still providing opportunities for personalization.</p><p><b>Table 1.</b> Participant Growth Characteristics.</p><p></p><p><b>Table 2.</b> Participants' Clinical Characteristics.</p><p></p><p></p><p><b>Figure 1.</b> Relationship Between the Duration from Birth to EEN and Growth Parameters and Length of Hospital Stay.</p><p>Alayne Gatto, MS, MBA, RD, CSP, LD, FAND<sup>1</sup>; Jennifer Fowler, MS, RDN, CSPCC, LDN<sup>2</sup>; Deborah Abel, PhD, RDN, LDN<sup>3</sup>; Christina Valentine, MD, MS, RDN, FAAP, FASPEN<sup>4</sup></p><p><sup>1</sup>Florida International University, Bloomingdale, GA; <sup>2</sup>East Carolina Health, Washington, NC; <sup>3</sup>Florida International University, Miami Beach, FL; <sup>4</sup>Banner University Medical Center, The University of Arizona, Tucson, AZ</p><p><b>Financial Support:</b> The Rickard Foundation.</p><p><b>Background:</b> The neonatal registered dietitian nutritionist (NICU RDN) plays a crucial role in the care of premature and critically ill infants in the neonatal intensive care unit (NICU). Advanced pediatric competencies are frequently a gap in nutrition degree coursework or dietetic internships. Critical care success with such vulnerable patients requires expertise in patient-centered care, multidisciplinary collaboration, and adaptive clinical problem-solving. This research aimed to identify the needs, engagement levels, and expertise of NICU RDNs, while also providing insights into their job satisfaction and career longevity.
Currently in the US, there are approximately 850 Level III and Level IV NICUs in which neonatal dietitians are vital care team members, making it imperative that hospitals provide appropriate compensation, benefits, and educational support.</p><p><b>Methods:</b> This was a cross-sectional examination using a national, online, IRB-approved survey conducted during March 2024 and sent to established Neonatal and Pediatric Dietitian practice groups. A Qualtrics link was provided for current and former NICU RDNs to complete a 10-minute online survey that offered an optional gift card for completion. The link remained open until the 200 gift cards were exhausted, approximately one week after the survey opened. Statistical analyses were performed using Stats IQ Qualtrics. In the descriptive statistics, frequencies of responses were represented as counts and percentages. For comparison of differences, the Chi-Squared test and Fisher's Exact test were used for categorical analysis.</p><p><b>Results:</b> In total, 253 current (n = 206) and former (n = 47) NICU RDNs completed the online questionnaire. Of the 210 respondents, 84 (40%) reported having pediatric clinical experience, 94 (44%) had clinical pediatric dietetic intern experience, 21 (10%) had previously worked as a WIC nutritionist, 15 (7.1%) had specialized pediatric certification or fellowship, and 12 (5.7%) had no prior experience before starting in the NICU (Table 1). Of 163 respondents, 83 (50.9%) reported receiving financial support or reimbursement for additional NICU training. Respondents who felt valued as team members planned to stay in the NICU RD role for more than 5 years (p &gt; 0.0046). Additionally, they reported having acknowledgement and appreciation (64.4%), motivation (54.1%), and opportunities for advancement (22.9%) (Table 2).</p><p><b>Conclusion:</b> NICU RDNs do not have a clear competency roadmap or a career development track. In addition, financial support or reimbursement for continuing education is not consistently an employee benefit, which may play a key role in job satisfaction and retention. These data provide valuable insight not only for managers of dietitians but also for professional societies to build programs and retention opportunities.</p><p><b>Table 1.</b> Question: When You Started as a NICU RD, What Experience Did You Have? (n = 210).</p><p></p><p>N and percentages will total more than 210 as respondents could check multiple answers.</p><p><b>Table 2.</b> Comparison of Questions: Do You Feel You Have the Following in Your Role in the NICU and How Long Do You Plan to Stay in Your Role?</p><p></p><p>Sivan Kinberg, MD<sup>1</sup>; Christine Hoyer, RD<sup>2</sup>; Everardo Perez Montoya, RD<sup>2</sup>; June Chang, MA<sup>2</sup>; Elizabeth Berg, MD<sup>2</sup>; Jyneva Pickel, DNP<sup>2</sup></p><p><sup>1</sup>Columbia University Irving Medical Center, New York, NY; <sup>2</sup>Columbia University Medical Center, New York, NY</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Patients with short bowel syndrome (SBS) can have significant fat malabsorption due to decreased intestinal surface area, bile acid deficiency, and rapid transit time, often leading to feeding intolerance and dependence on parenteral nutrition (PN).
Patients with SBS can have deficiency of pancreatic enzymes and/or reduced effectiveness of available pancreatic enzymes, resulting in symptoms of exocrine pancreatic insufficiency (EPI), including weight loss, poor weight gain, abdominal pain, diarrhea, and fat-soluble vitamin deficiencies. Oral pancreatic enzyme replacement therapy (PERT) is often tried but is impractical for use with enteral nutrition (EN) due to inconsistent enzyme delivery and risk of clogged feeding tubes. When used with continuous EN, oral PERT provides inadequate enzyme delivery as its ability to hydrolyze fats decreases significantly after 30 minutes of ingestion. The significant need for therapies to improve enteral absorption in this population has led to interest in using an in-line digestive cartridge to treat EPI symptoms in SBS patients on tube feedings. Immobilized lipase cartridge is an FDA-approved in-line digestive cartridge designed to hydrolyze fat in EN before it reaches the gastrointestinal tract, allowing for the delivery of absorbable fats through continuous or bolus feeds. In patients with SBS, this modality may be more effective in improving enteral fat absorption compared to oral preparations of PERT. Preclinical studies have demonstrated that use of an in-line digestive cartridge in a porcine SBS model increased fat-soluble vitamin absorption, reduced PN dependence, and improved intestinal adaptation. Our study aims to evaluate changes in PN, EN, growth parameters, stool output, and fat-soluble vitamin levels in pediatric SBS patients using an in-line digestive cartridge at our center.</p><p><b>Methods:</b> Single-center retrospective study in pediatric patients with SBS on EN who used an in-line immobilized lipase (RELiZORB) cartridge. Data collection included patient demographics, etiology of SBS, surgical history, PN characteristics (calories, volume, infusion hours/days), EN characteristics (tube type, bolus or continuous feeds, formula, calories, volume, hours), immobilized lipase cartridge use (#cartridges/day, duration), anthropometrics, stool output, gastrointestinal symptoms, medications (including previous PERT use), laboratory assessments (fat-soluble vitamin levels, fatty acid panels, pancreatic elastase), indication to start immobilized lipase cartridge, and any reported side effects. Patients with small intestinal transplant or cystic fibrosis were excluded.</p><p><b>Results:</b> Eleven patients were included in the study (mean age 10.4 years, 55% female). The most common etiology of SBS was necrotizing enterocolitis (45%) and 7 (64%) of patients were dependent on PN. Results of interim analysis show: mean duration of immobilized lipase cartridge use of 3.9 months, PN calorie decrease in 43% of patients, weight gain in 100% of patients, and improvement in stool output in 6/9 (67%) patients. Clogging of the cartridges was the most common reported technical difficulty (33%), which was overcome with better mixing of the formula. No adverse events or side effects were reported.</p><p><b>Conclusion:</b> In this single-center study, use of an in-line immobilized lipase digestive cartridge in pediatric patients with SBS demonstrated promising outcomes, including weight gain, improved stool output and reduced dependence on PN. These findings suggest that in-line digestive cartridges may play a role in improving fat malabsorption and decreasing PN dependence in pediatric SBS patients. 
Larger multicenter studies are needed to further evaluate the efficacy, tolerability, and safety of in-line digestive cartridges in this population.</p><p>Vikram Raghu, MD, MS<sup>1</sup>; Feras Alissa, MD<sup>2</sup>; Simon Horslen, MB ChB<sup>3</sup>; Jeffrey Rudolph, MD<sup>2</sup></p><p><sup>1</sup>University of Pittsburgh School of Medicine, Gibsonia, PA; <sup>2</sup>UPMC Children's Hospital of Pittsburgh, Pittsburgh, PA; <sup>3</sup>University of Pittsburgh School of Medicine, Pittsburgh, PA</p><p><b>Financial Support:</b> National Center for Advancing Translational Sciences (KL2TR001856).</p><p><b>Background:</b> Administrative databases can provide unique perspectives in rare disease due to the ability to query multicenter data efficiently. Pediatric intestinal failure can be challenging to study due to the rarity of the condition at single centers and the previous lack of a single diagnosis code. In October 2023, a new diagnosis code for intestinal failure was added to the International Classification of Diseases, 10<sup>th</sup> revision (ICD-10). We aimed to describe the usage and limitations of this code to identify children with intestinal failure.</p><p><b>Methods:</b> We performed a multicenter cross-sectional study using the Pediatric Health Information Systems database from October 1, 2023, to June 30, 2024. Children with intestinal failure were identified by ICD-10 code (K90.83). Descriptive statistics were used to characterize demographics, diagnoses, utilization, and outcomes.</p><p><b>Results:</b> We identified 1804 inpatient encounters from 849 unique patients with a diagnosis code of intestinal failure. Figure 1 shows the trend of code use by month since its inception in October 2023. The 849 patients had a total of 7085 inpatient encounters over that timeframe, meaning only 25% of their encounters included the intestinal failure diagnosis. Among these 849 patients, 638 had at least one encounter over the timeframe in which they received parenteral nutrition; 400 corresponded to an admission in which they also had an intestinal failure diagnosis code. Examining only inpatient stays longer than 2 days, 592/701 (84%) patients with such a stay received parenteral nutrition. Central line-associated bloodstream infections accounted for 501 encounters. Patients spent a median of 37 days (IQR 13-96) in the hospital. Patients were predominantly non-Hispanic White (43.7%), non-Hispanic Black (14.8%), or Hispanic (27.9%). Most had government-based insurance (63.5%). Child Opportunity Index was evenly split among all five quintiles. Standardized costs from all encounters with an intestinal failure diagnosis totaled $157 million, and costs from all encounters for these patients totaled $259 million. The median cost over those 9 months per patient was $104,890 (IQR $31,149 - $315,167). Death occurred in 28 patients (3.3%) over the study period.</p><p><b>Conclusion:</b> The diagnosis code of intestinal failure has been used inconsistently since implementation in October 2023, perhaps due to varying definitions of intestinal failure. Children with intestinal failure experience high inpatient stay costs and rare but significant mortality.
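</p><p>To illustrate the cohort-identification step described in the Methods above, a minimal Python/pandas sketch is shown below. The file name and column names (patient_id, icd10_codes, std_cost, los_days) are hypothetical stand-ins rather than the actual Pediatric Health Information Systems schema, and the snippet is not part of the original analysis.</p><pre>
# Minimal sketch: flag encounters carrying the intestinal failure ICD-10 code
# and summarize per-patient utilization. Column names are illustrative only.
import pandas as pd

IF_CODE = "K90.83"  # ICD-10 diagnosis code for intestinal failure (effective October 2023)

encounters = pd.read_csv("phis_encounters.csv")  # hypothetical encounter-level extract

# Flag encounters that carry the intestinal failure diagnosis code
encounters["has_if"] = encounters["icd10_codes"].str.contains(IF_CODE, regex=False, na=False)

if_encounters = encounters[encounters["has_if"]]
if_patients = if_encounters["patient_id"].unique()

# All encounters belonging to patients who carried the code at least once
all_enc_if_patients = encounters[encounters["patient_id"].isin(if_patients)]
coded_share = len(if_encounters) / len(all_enc_if_patients)

# Per-patient totals with median and interquartile range
cost_per_patient = if_encounters.groupby("patient_id")["std_cost"].sum()
days_per_patient = all_enc_if_patients.groupby("patient_id")["los_days"].sum()

print(f"Unique patients with the code: {len(if_patients)}")
print(f"Share of their encounters carrying the code: {coded_share:.0%}")
print(cost_per_patient.quantile([0.25, 0.5, 0.75]))
print(days_per_patient.quantile([0.25, 0.5, 0.75]))
</pre><p>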
Future work must consider the limitations of using only the new code in identifying these patients.</p><p></p><p><b>Figure 1.</b> Number of Encounters With an Intestinal Failure Diagnosis Code.</p><p><b>Poster of Distinction</b></p><p>Kera McNelis, MD, MS<sup>1</sup>; Allison Ta, MD<sup>2</sup>; Ting Ting Fu, MD<sup>2</sup></p><p><sup>1</sup>Emory University, Atlanta, GA; <sup>2</sup>Cincinnati Children's Hospital Medical Center, Cincinnati, OH</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> 6th Annual Pediatric Early Career Research Conference, August 27, 2024, Health Sciences Research Building I, Emory University.</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> The neonatal period is a time of rapid growth, and many infants who require intensive care need extra nutritional support. The Academy of Nutrition and Dietetics (AND) published an expert consensus statement to establish criteria for the identification of neonatal malnutrition. There is limited evidence regarding outcomes associated with diagnosis from these opinion-derived criteria. The objective of this study was to compare anthropometric-based malnutrition indicators with direct body composition measurements in infancy.</p><p><b>Methods:</b> Air displacement plethysmography is considered the gold standard for non-invasive body composition measurement, and this was incorporated into routine clinical care at a referral Level IV neonatal intensive care unit. Late preterm and term infants (34-42 weeks gestational age) with body composition measurement available were included in this study. Infants were categorized as having malnutrition per AND criteria. Fat mass, fat-free mass, and body fat percentage z-scores were determined per Norris body composition growth curves. Logistic regression was conducted to ascertain the relationship of fat mass, fat-free mass, and body fat percentage with malnutrition diagnosis. Linear regression was performed to predict body mass index (BMI) at age 18-24 months from each body composition variable.</p><p><b>Results:</b> Eighty-four infants were included, with 39% female and 96% singleton (Table 1). Fifteen percent were small for gestational age and 12% were large for gestational age at birth. Nearly half had a congenital intestinal anomaly, including gastroschisis and intestinal atresia. Sixty-three percent of the group met at least one malnutrition criterion. Fat-free mass z-score was negatively associated with a malnutrition diagnosis, with an odds ratio 0.77 (95% CI 0.59-0.99, p &lt; 0.05). There was not a statistically significant association between malnutrition diagnosis and body fat percentage or fat mass. There was not a statistically significant relationship between any body composition variable and BMI at 18-24 months, even after removing outliers with a high Cook's distance.</p><p><b>Conclusion:</b> Malnutrition diagnosis is associated with low fat-free mass in critically ill term infants. Body composition is not a predictor of later BMI in this small study.</p><p><b>Table 1.</b> Characteristics of Late Preterm and Term Infants in the Neonatal Intensive Care Unit With a Body Composition Measurement Performed Via Air Displacement Plethysmography.</p><p></p><p>John Stutts, MD, MPH<sup>1</sup>; Yong Choe, MAS<sup>1</sup></p><p><sup>1</sup>Abbott, Columbus, OH</p><p><b>Financial Support:</b> Abbott.</p><p><b>Background:</b> The prevalence of obesity in children is rising. 
Despite the awareness and work toward weight reduction, less is known about malnutrition in children with obesity. The purpose of this study was to evaluate the prevalence of obesity in U.S. children and determine which combination of indicators best define malnutrition in this population.</p><p><b>Methods:</b> The 2017-2018 National Health and Nutrition Examination Survey (NHANES) database was utilized to assess biomarkers (most recent complete dataset due to Covid-19 pandemic). Trends in prevalence were obtained from 2013-2018 survey data. Obesity was defined as ≥ 95th percentile of the CDC sex-specific BMI-for-age growth charts. Cohort age range was 12-18 years. Nutrient intake and serum were analyzed for vitamin D, vitamin E, vitamin C, potassium, calcium, vitamin B9, vitamin A, total protein, albumin, globulin, high sensitivity C-reactive protein (hs-CRP), iron, hemoglobin and mean corpuscular volume (MCV). Intake levels of fiber were also analyzed. Children consuming supplements were excluded. Categorical and continuous data analysis was performed using SAS® Proc SURVEYFREQ (with Wald chi-square test) and Proc SURVEYMEANS (with t-test) respectively in SAS® Version 9.4 and SAS® Enterprise Guide Version 8.3. Hypothesis tests were performed using 2-sided, 0.05 level tests. Results were reported as mean ± standard error (SE) (n = survey sample size) or percent ± SE (n).</p><p><b>Results:</b> The prevalence of obesity in the cohort was 21.3% ± 1.3 (993). Prevalence trended upward annually; 20.3% ± 2.1 (1232) in 2013-2014, 20.5% ± 2.0 (1129) in 2015-2016. When compared with children without obesity, the mean serum levels of those with obesity were significantly (P ≤ 0.05) lower for vitamin D (50.3 ± 2.4 vs. 61.5 ± 2.1, p &lt; 0.001), iron (14.4 ± 0.5 vs. 16.4 ± 0.4, p &lt; 0.001), albumin (41.7 ± 0.3 vs. 43.3 ± 0.3, p &lt; 0.001), and MCV (83.8 ± 0.5 vs. 85.8 ± 0.3, p = 0.003). When compared with children without obesity, the mean serum levels of those with obesity were significantly (p ≤ 0.05) higher for total protein (73.2 ± 0.4 vs. 72.0 ± 0.3, p = 0.002), globulin (31.5 ± 0.4 vs. 28.7 ± 0.3, p &lt; 0.001) and hs-CRP (3.5 ± 0.3 vs. 1.2 ± 0.2, p &lt; 0.001). A higher prevalence of insufficiency was found for Vitamin D (51.9% ± 5.6 vs. 26.8% ± 3.7, p = 0.001), hemoglobin (16.3% ± 3.1 vs. 7.5% ± 1.8, p = 0.034) and the combination of low hemoglobin + low MCV (11.2% ± 2.9 vs. 3.3% ± 1.0, p = 0.049). All other serum levels were not significantly (p &gt; 0.05) different, with no significant difference in intake.</p><p><b>Conclusion:</b> Results indicate a continued increase in prevalence of obesity in children. When comparing with the non-obese pediatric population, it also shows the differences in micro- and macronutrient serum levels, with no significant differences in dietary intake of these nutrients. The higher prevalence of low hemoglobin + low MCV supports iron deficiency and adds clinical relevance to the data surrounding low mean blood levels of iron. Children with obesity show higher mean globulin and hs-CRP levels consistent with an inflammatory state. 
The results underscore the existence of malnutrition in children with obesity and the need for nutrition awareness in this pediatric population.</p><p>Elisha London, BS, RD<sup>1</sup>; Derek Miketinas, PhD, RD<sup>2</sup>; Ariana Bailey, PhD, MS<sup>3</sup>; Thomas Houslay, PhD<sup>4</sup>; Fabiola Gutierrez-Orozco, PhD<sup>1</sup>; Tonya Bender, MS, PMP<sup>5</sup>; Ashley Patterson, PhD<sup>1</sup></p><p><sup>1</sup>Reckitt/Mead Johnson, Evansville, IN; <sup>2</sup>Data Minded Consulting, LLC, Houston, TX; <sup>3</sup>Reckitt/Mead Johnson Nutrition, Henderson, KY; <sup>4</sup>Reckitt/Mead Johnson Nutrition, Manchester, England; <sup>5</sup>Reckitt/Mead Johnson Nutrition, Newburgh, IN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> The objective was to examine whether nutrient intake varied across malnutrition classification among a nationally representative sample of children and adolescents.</p><p><b>Methods:</b> This was a secondary analysis of children and adolescents 1-18 y who participated in the National Health and Nutrition Examination Survey 2001-March 2020. Participants were excluded if they were pregnant or did not provide at least one reliable dietary recall. The degree of malnutrition risk was assessed by weight-for-height and BMI-for-age Z-scores for 1-2 y and 3-18 y, respectively. As per the Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition, malnutrition was classified using Z-scores: none (Z &gt; -1), mild (Z between -1 and -1.9), and moderate/severe (Z ≤ -2). Dietary intake was assessed from one to two 24-hr dietary recalls. Usual intakes of macronutrients from foods and beverages, and of micronutrients from foods, beverages, and supplements were assessed using the National Cancer Institute method. The cut-point approach was used to estimate the proportion of children with intake below the estimated average requirement for micronutrients and outside the acceptable macronutrient distribution range for macronutrients. All analyses were adjusted for sampling methodology and the appropriate sample weights were applied. Independent samples t-tests were conducted to compare estimates within age groups between nutrition status classifications, with no malnutrition as the reference group.</p><p><b>Results:</b> A total of 32,188 participants were analyzed. Of those, 31,689 (98.4%) provided anthropometrics. The majority (91%) did not meet criteria for malnutrition, while 7.4% and 1.6% met criteria for mild and moderate/severe malnutrition, respectively. Dietary supplement use (mean[SE]) was reported among 33.3[0.7]% of all children/adolescents. Of those experiencing moderate/severe malnutrition, inadequate calcium intake was greatest in adolescents 14-18 y (70.6[3.3]%), and older children 9-13 y (69.7[3.4]%), compared to 3-8 y (29.5[3.9]%) and 1-2 y (5.0[1.2]%). In children 9-13 y, risk for inadequate calcium intake was greater among those experiencing mild (65.3[2.1]%) and moderate/severe malnutrition (69.7[2.1]%) compared to those not experiencing malnutrition (61.2[1.1]%). Similarly, in those experiencing moderate/severe malnutrition, inadequate zinc and phosphorus intake was greatest in adolescents 14-18 y (20.9[3.3]% and 30.6[3.4]%, respectively) and 9-13 y (13.4[2.6]% and 33.9[3.6]%, respectively) compared to younger children (range 0.2[0.1]-0.9[0.4]% and 0.2[0.1]-0.4[0.2]% in 1-8 y, respectively). 
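</p><p>As a minimal illustration of the Z-score thresholds described in the Methods above, the Python sketch below classifies an anthropometric Z-score into the three categories used in this analysis. The function is illustrative only and is not part of the original analysis; it assumes the appropriate Z-score (weight-for-height for 1-2 y, BMI-for-age for 3-18 y) has already been derived from a growth reference.</p><pre>
# Minimal sketch of the anthropometric classification stated above:
# none (Z above -1), mild (Z between -1 and -1.9), moderate/severe (Z of -2 or below).

def classify_malnutrition(z_score):
    """Return the degree of malnutrition risk for an anthropometric Z-score."""
    if z_score &lt;= -2.0:
        return "moderate/severe"
    if z_score &lt;= -1.0:
        # Z-scores between -1.9 and -2.0 are grouped with mild here (an assumption,
        # since the stated cutoffs leave that narrow interval unspecified)
        return "mild"
    return "none"

# Example usage
for z in (0.3, -1.4, -2.6):
    print(f"Z = {z:+.1f}: {classify_malnutrition(z)}")
</pre><p>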
Although the greatest risk for inadequate protein intake was observed among those experiencing moderate/severe malnutrition, percent energy intake from protein and carbohydrate was adequate for most children and adolescents. Total and saturated fat were consumed in excess by all age groups regardless of nutrition status. The greatest risk for excessive saturated fat intake was reported in those experiencing moderate/severe malnutrition (range across all age groups: 85.9-95.1%).</p><p><b>Conclusion:</b> Older children and adolescents experiencing malnutrition were at greatest risk for inadequate intakes of certain micronutrients such as calcium, zinc, and phosphorus. These results may indicate poor diet quality among those at greatest risk for malnutrition, especially adolescents.</p><p>Anna Benson, DO<sup>1</sup>; Louis Martin, PhD<sup>2</sup>; Katie Huff, MD, MS<sup>2</sup></p><p><sup>1</sup>Indiana University School of Medicine, Carmel, IN; <sup>2</sup>Indiana University School of Medicine, Indianapolis, IN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Trace metals are essential for growth and are especially important in the neonatal period. Recommendations are available for intake of these trace metals in the neonate. However, these recommendations are based on limited data and there are few available descriptions regarding trace metal levels in neonates and their influence on outcomes. In addition, monitoring trace metal levels can be difficult as multiple factors, including inflammation, can affect accuracy. The goal of this project was to evaluate patient serum levels of zinc, selenium and copper and related outcomes including growth, rate of cholestasis, sepsis, bronchopulmonary dysplasia (BPD), and death in a cohort admitted to the neonatal intensive care unit (NICU) and dependent on parenteral nutrition.</p><p><b>Methods:</b> We completed a retrospective chart review of NICU patients who received parenteral nutrition (PN) and had trace metal panels drawn between January 2016 and February 2023. Charts were reviewed for baseline labs, time on PN, trace metal panel level, dose of trace metals in PN, enteral feeds and supplements, and outcomes including morbidities and mortality. Sepsis was diagnosed based on a positive blood culture and cholestasis as a direct bilirubin &gt; 2 mg/dL. Fisher's Exact Test or Chi-square test was used to assess associations between categorical variables. Spearman correlation was used to assess the correlation between two continuous variables. A p-value of 0.05 was used as the threshold for significance.</p><p><b>Results:</b> We included 98 patients in the study, with demographic data shown in Table 1. The numbers of patients with trace metal elevation and deficiency are noted in Tables 1 and 2, respectively. The correlation between growth and trace metal levels is shown in Figure 1. Patient outcomes related to the diagnosis of trace metal deficiency are noted in Table 2. Copper deficiency was found to be significantly associated with sepsis (p = 0.010) and BPD (p = 0.033). Selenium deficiency was associated with cholestasis (p = 0.001) and BPD (p = 0.003). To further assess the relation between selenium and cholestasis, Spearman correlation showed a significant negative correlation between selenium levels and direct bilirubin levels (p = 0.002; Figure 2).</p><p><b>Conclusion:</b> Trace metal deficiency was common in our population. In addition, selenium and copper deficiency were associated with neonatal morbidities including sepsis, cholestasis, and BPD.
When assessing selenium deficiency and cholestasis, the measured selenium level was found to correlate with the direct bilirubin level. While there was correlation between trace metal levels and growth, the negative association noted is unclear and highlights the need for further assessment to determine the influence of other patient factors and the technique used for growth measurement. Overall, this project highlights the important relation between trace metals and neonatal morbidities. Further research is needed to understand how trace metal supplementation might be used to optimize neonatal outcomes in the future.</p><p><b>Table 1.</b> Patient Demographic and Outcome Information for Entire Population Having Trace Metal Levels Obtained. (Total population n = 98 unless noted).</p><p></p><p><b>Table 2.</b> Rate of Trace Metal Deficiency and Association With Patient Outcomes.</p><p>(Total n = 98).</p><p></p><p>Scatter plot of average trace metal level and change in growth over time (growth represented as change in parameter over days). The Spearman correlation coefficient is listed for each graph, and significance is noted by symbol: *p-value &lt; 0.05, †p-value &lt; 0.01, ‡p-value &lt; 0.001.</p><p><b>Figure 1.</b> Correlation of Trace Metal Level and Growth.</p><p></p><p>Scatter plot of individual direct bilirubin levels plotted against selenium levels. The Spearman correlation coefficient indicated a negative correlation, with a p-value of 0.002.</p><p><b>Figure 2.</b> Correlation of Selenium Level With Direct Bilirubin Level.</p><p>Kaitlin Berris, RD, PhD (student)<sup>1</sup>; Qian Zhang, MPH<sup>2</sup>; Jennifer Ying, BA<sup>3</sup>; Tanvir Jassal, BSc<sup>3</sup>; Rajavel Elango, PhD<sup>4</sup></p><p><sup>1</sup>BC Children's Hospital, North Vancouver, BC; <sup>2</sup>BCCHR, Vancouver, BC; <sup>3</sup>University of British Columbia, Vancouver, BC; <sup>4</sup>UBC/BCCHR, Vancouver, BC</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Pediatric critical illness causes increased demand for several nutrients. Children who are admitted and require nutrition support have a naso-gastric tube placed to deliver enteral nutrition (EN) formula as liquid nutrition. These formulas are developed using dietary reference intakes (DRI) for healthy populations and do not account for altered nutrient needs in critical illness. Imbalanced nutrient delivery during this vulnerable time could lead to malnutrition and negatively impact hospital outcomes. Guidelines published by the American Society for Parenteral and Enteral Nutrition (ASPEN) in 2017 aim to alleviate poor nutrition: use concentrated formula during fluid restriction, meet 2/3 of estimated calories (resting energy expenditure, REE) by the end of the first week, provide a minimum of 1.5 g/kg/day of dietary protein, and provide the DRI for each micronutrient. The objective of this retrospective cohort study was to evaluate nutrition delivery (prescribed vs. delivered) compared to the 2017 guidelines, and to correlate it with adequacy in children admitted to a Canadian PICU.</p><p><b>Methods:</b> Three years of charts were included over two retrospective cohorts: September 2018 to December 2020 and February 2022 to March 2023. The first cohort, paper chart based, included children 1-18 y with tube feeding started within 3 d after admission.
The second cohort, after transition to electronic medical records, included children 1-6 y on exclusive tube feeding during the first week of admission. Patient characteristics, daily formula type, rate prescribed (physician order), amount delivered (nursing notes), and interruption hours and reasons were collected. Statistical analysis included descriptive analysis of characteristics and logistic regression for the odds of achieving adequacy of intake with two exposures: age category and formula type. Pearson correlation was used to relate interruption hours to the percentage of calories met.</p><p><b>Results:</b> The included patients (n = 86) spanned 458 nutrition support days (NSD). Admissions were predominantly for respiratory disease (73%), requiring ventilation support (81.4%). Calorie prescription (WHO REE equation) was met in 20.3% of NSD, and 43.9% met 2/3 of the calorie recommendation (Table 1). Concentrated calories were provided in 34% of patients. Hours of interruption and percentage of goal calories met were negatively correlated (r = -0.52, p = 0.002) among those newly ordered EN without a prior EN history (i.e., not previously home tube fed). Patients with more than 4 h of interruptions were less likely to meet the 2/3 calorie goal. Calorie goals were met when standard pediatric formula was used in children 1-8 y and concentrated adult formula in 9-18 y. Odds of meeting the calorie goal increased by 85% per 1-day increase after admission (OR 1.85 [1.52, 2.26], p &lt; .0001), with a median of 4 d after admission to meet 2/3 of calories. Minimum protein intake (1.5 g/kg/d) was only met in 24.9% of all NSD. Micronutrients examined, except for vitamin D, met the DRI based on prescribed amounts. Delivered amounts provided suboptimal micronutrient intake, especially for vitamin D (Figure 1).</p><p><b>Conclusion:</b> Current ASPEN recommendations for EN were not being achieved in critically ill children. Concentrated formula was often not chosen, which decreased the ability to meet calorie goals in younger patients. Prescribing shorter continuous EN duration (20/24 h) may improve odds of meeting calorie targets. Evaluation of NSD showed an improving trend in calorie intake over the first week to meet the 2/3 goal recommendation. However, results highlight the inadequacy of protein intake even as calorie needs are increasingly met. Attention to micronutrient delivery, with supplementation of vitamin D, is required.</p><p><b>Table 1.</b> Enteral Nutrition Characteristics Per Nutrition Support Days. (N = 458)</p><p></p><p></p><p>Estimated Vitamin D Intake and 95% Confidence Intervals by Age and Formula Groups.</p><p><b>Figure 1.</b> Estimated Vitamin D Intake by Age and Formula Groups.</p><p>Dana Steien, MD<sup>1</sup>; Megan Thorvilson, MD<sup>1</sup>; Erin Alexander, MD<sup>1</sup>; Molissa Hager, NP<sup>1</sup>; Andrea Armellino, RDN<sup>1</sup></p><p><sup>1</sup>Mayo Clinic, Rochester, MN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Home parenteral nutrition (HPN) is a life-sustaining therapy for children with long-term digestive dysfunction. Historically, HPN has been considered a bridge to enteral autonomy or intestinal transplantation. However, thanks to medical and management improvements, HPN is now used for a variety of diagnoses, including intractable feeding intolerance (IFI) in children with severe neurological impairment (SNI). IFI occurs most often near the end-of-life (EOL) in patients with SNI.
Thus, outpatient planning and preparation for HPN in this population differ vastly from historical HPN use.</p><p><b>Methods:</b> Case series of four pediatric patients with SNI who developed IFI and utilized HPN during their EOL care. Data was collected by retrospective chart review. The hospital pediatric palliative care service was heavily involved in the patients’ care when HPN was discussed and planned. The pediatric intestinal rehabilitation (PIR) and palliative care teams worked closely together during discharge planning and throughout the outpatient courses.</p><p><b>Results:</b> The children with SNI in this case series developed IFI between ages 1 and 12 years. Duration of HPN use varied from 5 weeks to 2 years. All patients were enrolled in hospice, but at various stages. Routine outpatient HPN management plans and expectations were modified based on each family's goals of EOL care. The use and timing of laboratory studies, fever plans, central line issues, growth, and follow-up appointments required detailed discussion and planning.</p><p><b>Conclusion:</b> EOL care for children differs from most EOL care in adults. HPN for children with SNI and IFI can provide time, opportunities, and peace for families during their child's EOL journey, if it aligns with their EOL goals. PIR teams can provide valuable HPN expertise for palliative care services and families during these challenging times.</p><p>Jessica Lowe, DCN, MPH, RDN<sup>1</sup>; Carolyn Ricciardi, MS, RD<sup>2</sup>; Melissa Blandford, MS, RD<sup>3</sup></p><p><sup>1</sup>Nutricia North America, Roseville, CA; <sup>2</sup>Nutricia North America, Rockville, MD; <sup>3</sup>Nutricia North America, Greenville, NC</p><p><b>Financial Support:</b> This study was conducted by Nutricia North America.</p><p><b>Background:</b> Extensively hydrolyzed formulas (eHFs) are indicated for the management of cow milk allergy (CMA) and related symptoms. This category of formulas is often associated with an unpleasant smell and bitter taste; however, whey-based eHFs are considered more palatable than casein-based eHFs.<sup>1-4</sup> The inclusion of lactose, the primary carbohydrate in human milk, in hypoallergenic formula can also promote palatability. Historically, concerns about residual protein traces in lactose have resulted in complete avoidance of lactose in CMA. However, “adverse reactions to lactose in CMA are not supported in the literature, and complete avoidance of lactose in CMA is not warranted.”<sup>5</sup> Clinicians in the United Kingdom previously reported that taste and acceptance of an eHF is an important consideration when prescribing, as they believe a palatable eHF may result in decreased formula refusal and lead to more content families.<sup>1</sup> The objective of this study was to understand caregiver sensory perspectives on an infant, whey-based eHF containing lactose.</p><p><b>Methods:</b> Fifteen clinical sites were recruited from across the United States. Clinicians enrolled 132 infants, whose families received the whey-based eHF for 2 weeks, based on clinician recommendation. Caregivers completed two surveys: an enrollment survey and a survey after 2 weeks characterizing eHF intake, CMA-related symptoms, stooling patterns, sensory perspectives, and satisfaction with the eHF. Data was analyzed using SPSS 27 and descriptive statistics.</p><p><b>Results:</b> One hundred and twenty-two infants completed the study. At enrollment, infants were 22 (± 14.7) weeks old.
Prior to study initiation, 12.3% of infants were breastfed, 40.2% were on a casein-based eHF, and 18.9% were on a standard formula with intact proteins. Most patients (97.5%) were fed orally and 2.5% were tube fed. Among all parents who responded, 92.5% (n = 86/93) reported better taste and 88.9% (n = 96/108) reported better smell for the whey-based formula containing lactose compared to the previous formula. For caregivers whose child was on a casein-based eHF at enrollment and responded, 97.6% (n = 41/42) reported better taste and 95.7% (n = 44/46) reported better smell than the previous formula. Additional caregiver-reported perceptions of taste and smell are reported in Figure 1 and Figure 2, respectively. Finally, 89.3% of caregivers said it was easy to start their child on the whey-based eHF containing lactose and 91.8% would recommend it to other caregivers whose child requires a hypoallergenic formula.</p><p><b>Conclusion:</b> The majority of caregivers had a positive sensory experience with the whey-based eHF containing lactose compared to the baseline formulas. Additionally, they found the trial formula easy to transition to and would recommend it to other families. These data are consistent with the findings of Maslin et al. and support the clinicians' expectation that good palatability would result in better acceptance and more content infants and families.<sup>1</sup> Further research is needed to better understand how improved palatability can contribute to decreased waste and health care costs.</p><p></p><p><b>Figure 1.</b> Caregiver Ranking: Taste of Whey-Based, Lactose-Containing eHF.</p><p></p><p><b>Figure 2.</b> Caregiver Ranking: Smell of Whey-Based, Lactose-Containing eHF.</p><p>Michele DiCarlo, PharmD<sup>1</sup>; Emily Barlow, PharmD, BCPPS<sup>1</sup>; Laura Dinnes, PharmD, BCIDP<sup>1</sup></p><p><sup>1</sup>Mayo Clinic, Rochester, MN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> There is limited information on hyperkalemia in adult patients who received trimethoprim-sulfamethoxazole (TMP-SMX). The mechanism for hyperkalemia is related to the TMP component, which is structurally related to the potassium-sparing diuretic amiloride. In children, there is no information on clinical impact or monitoring required. We noted that a pediatric patient on total parenteral nutrition (TPN) had a drop in TPN potassium dosing once TMP-SMX was started. This reduction persisted for two weeks following the last dose of the antibiotic. Case presentation: A 7-month-old was in a cardiac intensive care unit following complex surgical procedures and a requirement for extracorporeal membrane oxygenation. TPN was started due to concerns related to poor perfusion and possible necrotizing enterocolitis. TPN continued for a total of one hundred and ten days. Per protocol while on TPN, electrolytes and renal function (urine output, serum creatinine) were monitored daily. Diuretic therapy, including loop diuretics, chlorothiazide, and spironolactone, was prescribed prior to TPN and continued for the entire duration. Renal function remained stable for the duration of TPN therapy. Dosing of potassium in the TPN was initiated per ASPEN guidelines and adjusted for serum potassium levels. Due to respiratory requirement and positive cultures, TMP-SMX was added to the medication regimen on two separate occasions. TMP-SMX 15 mg/kg/day was ordered twelve days after the start of the TPN and continued for three days.
TMP-SMX 15 mg/kg/day was again started on day forty-three of TPN and continued for a five-day duration. Serum potassium was closely monitored for adjustments once TMP-SMX was started. When the TPN was successfully weaned off, we reviewed this information again. TPN potassium dosing dropped markedly by day two of both TMP-SMX regimens and did not return to the prior stable dosing until approximately two weeks after the last dose of the antibiotic. This reduction lasted far beyond the projected half-life of TMP-SMX (Table 1). Discussion: TMP-SMX is known for potential hyperkalemia in adult patients with multiple confounding factors. Factors include high dosage, renal dysfunction, congestive heart failure, and concomitant medications known to cause hyperkalemia. Little literature exists to note this side effect in pediatrics. The onset of our patient's increased serum potassium levels, with a concurrent decrease in TPN potassium dosing, could be expected, as TMP-SMX's time to peak effect is 1-4 hours. The half-life of TMP in children &lt; 2 years old is 5.9 hours. Given this information, one would expect TMP-SMX to be cleared approximately thirty hours (about five half-lives) from the last dose administered. Our patient's potassium dosing took approximately two weeks from the end of the TMP-SMX administration to return to the pre-TMP-SMX potassium dosing for both treatment regimens. Potential causes for the extended time to stabilize include concurrent high-dose TMP-SMX and continuation of the potassium-sparing diuretic. Prolonged potassium monitoring for pediatric patients started on high-dose TMP-SMX while on TPN should be considered and further evaluation explored.</p><p><b>Methods:</b> None Reported.</p><p><b>Results:</b> None Reported.</p><p><b>Conclusion:</b> None Reported.</p><p></p><p>Graph representing TPN potassium dose (in mEq/kg/day) and the addition of the TMP-SMX regimen on two separate occasions. Note the drop in TPN potassium dose and delayed return after each TMP-SMX regimen.</p><p><b>Figure 1.</b> TPN Potassium Dose and TMP-SMX Addition.</p><p>Jennifer Smith, MS, RD, CSP, LD, LMT<sup>1</sup>; Praveen Goday, MBBS<sup>2</sup>; Lauren Storch, MS, RD, CSP, LD<sup>2</sup>; Kirsten Jones, RD, CSP, LD<sup>2</sup>; Hannah Huey, MDN<sup>2</sup>; Hilary Michel, MD<sup>2</sup></p><p><sup>1</sup>Nationwide Children's Hospital, Dresden, OH; <sup>2</sup>Nationwide Children's Hospital, Columbus, OH</p><p><b>Financial Support:</b> North American Society for Pediatric Gastroenterology, Hepatology, and Nutrition (NASPGHAN) Foundation.</p><p><b>Background:</b> The prevalence of Avoidant/Restrictive Food Intake Disorder (ARFID) is approximately 3% in the general population and 10% in adults with inflammatory bowel disease (IBD). Up to 10% of adolescents with IBD have reported disordered eating behaviors; however, there have been no prospective studies on the prevalence of ARFID eating behaviors in this population.</p><p><b>Methods:</b> This is a one-time, cross-sectional, non-consecutive study of English-speaking patients with a confirmed diagnosis of IBD, aged 12-18 years. Participants completed the validated Nine Item ARFID Screen (NIAS), the SCOFF eating disorder screen (this screen utilizes an acronym [<span>S</span>ick, <span>C</span>ontrol, <span>O</span>ne, <span>F</span>at, and <span>F</span>ood] in relation to the five questions on the screen) and answered one question about perceived food intolerances.
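<p>As a minimal sketch of how the screening cut-offs described in the following sentences might be applied, the Python snippet below scores a hypothetical NIAS response (assuming 0-5 scoring per item) and a hypothetical SCOFF response; the function names and example values are illustrative only and are not part of the study instruments or data.</p><pre>
# Hypothetical illustration of the NIAS and SCOFF screening rules described
# in the methods; item values and function names are illustrative only.
def nias_positive(picky, appetite, fear):
    """Each argument: three 0-5 item scores for one NIAS subscale."""
    subscales = [sum(picky), sum(appetite), sum(fear)]
    total = sum(subscales)
    # Positive screen: total score of 23 or more, or any subscale of 12 or more
    return total >= 23 or max(subscales) >= 12

def scoff_positive(answers):
    """answers: five True/False responses; two or more positives screen positive."""
    return sum(answers) >= 2

print(nias_positive([4, 5, 4], [1, 0, 2], [3, 2, 1]))      # True (picky subscale = 13)
print(scoff_positive([True, False, True, False, False]))   # True (2 positive answers)
</pre>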
The NIAS is organized into the three specific ARFID domains: eating restriction due to picky eating, poor appetite/limited interest in eating, and fear of negative consequences from eating, each of which is addressed by three questions. Questions are based on a 6-point Likert scale. Participants scoring ≥23 on the total scale or ≥12 on an individual subscale were considered to meet criteria for ARFID eating behaviors. Since some individuals with a positive NIAS screen may have anorexia nervosa or bulimia, we also used the SCOFF questionnaire to assess the possible presence of an eating disorder. A score of two or more positive answers has a sensitivity of 100% and specificity of 90% for anorexia nervosa or bulimia. Apart from descriptive statistics, Chi-square testing was used to study the prevalence of malnutrition, positive SCOFF screening, or food intolerances in patients with and without positive NIAS screens.</p><p><b>Results:</b> We enrolled 82 patients whose demographics are shown in Table 1. Twenty percent (16/82) scored positive on the NIAS questionnaire, 16% (13/82) scored positive on the SCOFF questionnaire, and 48% (39/82) noted food intolerances. Of the 16 participants who scored positive on the NIAS, 50% (8/16) were male, 56% (9/16) had a diagnosis of Crohn's Disease, and 69% (11/16) had inactive disease. Twenty-five percent of those with a positive NIAS (4/16) met criteria for malnutrition (1 mild, 2 moderate, and 1 severe). Sixty-nine percent of those who scored positive on the NIAS (11/16) noted food intolerances and 30% (5/16) had a positive SCOFF screener. The prevalence of malnutrition (p = 0.4), positive SCOFF eating disorder screens (p = 0.3), and reported food intolerances (p = 0.6) was similar between participants who did and did not score positive on the NIAS.</p><p><b>Conclusion:</b> Using the NIAS, 20% of adolescents with IBD met criteria for ARFID. Participants were no more likely to have malnutrition, a positive score on the SCOFF eating disorder screen, or reported food intolerances whether or not they met criteria for ARFID. Routine screening of adolescents with IBD for ARFID or other eating disorders may identify patients who would benefit from further evaluation.</p><p><b>Table 1.</b> Demographics.</p><p></p><p>Qian Wen Sng, RN<sup>1</sup>; Jacqueline Soo May Ong<sup>2</sup>; Sin Wee Loh, MB BCh BAO (Ireland), MMed (Paed) (Spore), MRCPCH (RCPCH, UK)<sup>1</sup>; Joel Kian Boon Lim, MBBS, MRCPCH, MMed (Paeds)<sup>1</sup>; Li Jia Fan, MBBS, MMed (Paeds), MRCPCH (UK)<sup>3</sup>; Rehena Sultana<sup>4</sup>; Chengsi Ong, BS (Dietician), MS (Nutrition and Public Health), PhD<sup>1</sup>; Charlotte Lin<sup>3</sup>; Judith Ju Ming Wong, MB BCh BAO, LRCP &amp; SI (Ireland), MRCPCH (Paeds) (RCPCH, UK)<sup>1</sup>; Ryan Richard Taylor<sup>3</sup>; Elaine Hor<sup>2</sup>; Pei Fen Poh, MSc (Nursing), BSN<sup>1</sup>; Priscilla Cheng<sup>2</sup>; Jan Hau Lee, MCI, MRCPCH (UK), USMLE, MBBS<sup>1</sup></p><p><sup>1</sup>KK Hospital, Singapore; <sup>2</sup>National University Hospital, Singapore; <sup>3</sup>National University Hospital Singapore, Singapore; <sup>4</sup>Duke-NUS Graduate Medical School, Singapore</p><p><b>Financial Support:</b> This work is supported by the National Medical Research Council, Ministry of Health, Singapore.</p><p><b>Background:</b> Protein-energy malnutrition is pervasive in pediatric intensive care unit (PICU) patients.
There remains clinical equipoise on the impact of protein supplementation in critically ill children. Our primary aim was to determine the feasibility of conducting a randomized controlled trial (RCT) of protein supplementation versus standard enteral nutrition (EN) in the PICU.</p><p><b>Methods:</b> An open-labelled pilot RCT was conducted from January 2021 to June 2024 in 2 tertiary pediatric centers in Singapore. Children with a body mass index (BMI) z-score &lt; 0 who were expected to require invasive or non-invasive mechanical ventilation for at least 48 hours and who required EN support for feeding were included. Patients were randomized (1:1 allocation) to protein supplementation of ≥1.5 g/kg/day in addition to standard EN or standard EN alone for 7 days after enrolment or discharge to the high dependency unit, whichever was earlier. Feasibility was based on 4 outcomes: effective screening (&gt;80% of eligible patients approached for consent), satisfactory enrolment (&gt;1 patient/center/month), timely protocol implementation (&gt;80% of participants receiving protein supplementation within the first 72 hours), and protocol adherence (receiving &gt;80% of protein supplementation as per protocol).</p><p><b>Results:</b> A total of 20 patients were recruited: 10 (50.0%) in the protein supplementation group and 10 (50.0%) in the standard EN group. Median age was 13.0 [interquartile range (IQR) 4.2, 49.1] months. Respiratory distress was the most common reason for PICU admission [11 (55.0%)]. Median PICU and hospital length of stay were 8.0 (IQR 4.5, 16.5) and 19.0 (IQR 11.5, 36.5) days, respectively. There were 3 (15%) deaths, none of which were related to the trial intervention. Screening rate was 50/74 (67.6%). Mean enrollment was 0.45 patient/center/month. Timely protocol implementation was performed in 15/20 (75%) participants. Protocol adherence was achieved by the participants in 11/15 (73.3%) of protein supplementation days.</p><p><b>Conclusion:</b> Satisfactory feasibility outcomes were not met in this pilot RCT. Based on the inclusion criteria of this pilot study and the setup of the centers, a larger study in Singapore alone will not be feasible. With incorporation of revised logistic arrangements, a larger multi-center feasibility study involving regional countries should be piloted.</p><p>Veronica Urbik, MD<sup>1</sup>; Kera McNelis, MD<sup>1</sup></p><p><sup>1</sup>Emory University, Atlanta, GA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Many clinical management and physiologic knowledge gaps are present in the care of the tiny baby population (neonates born at ≤23 weeks gestation, also labelled periviable gestational age). There is therefore significant center-specific variation in practice, as well as increased morbidity and mortality observed in infants born at 22-23 weeks compared to those born at later gestational ages<sup>1</sup>. The importance of nutrition applies to tiny babies, and appropriate nutrition is perhaps even more critical in this population than others. Numerous studies have demonstrated the benefits of full early enteral feeding, including decreased rates of central-line associated blood infection and cholestasis<sup>2,3</sup>. The risk of developing necrotizing enterocolitis (NEC) is a balancing measure for advancement of enteral feeds<sup>4</sup>. Adequate nutrition is critical for growth, reducing morbidity and mortality, and improving overall outcomes.
Current proposed protocols for this population target full enteral feeding volumes to be reached by 10-14 days of life<sup>5</sup>.</p><p><b>Methods:</b> From baseline data collected at two Level III neonatal intensive care units (NICU) attended by a single group of academic neonatology faculty from January 2020 – January 2024, the average length of time from birth until full enteral feeds were achieved was 31 days. Using quality improvement (QI) methodology, we identified the barriers to advancement to full enteral feeds (defined as 120 cc/kg/day) in babies born at 22 and 23 weeks gestational age admitted to the pediatric resident-staffed Level III NICU.</p><p><b>Results:</b> The Pareto chart displays the primary barriers, including undefined critical illness, vasopressor use, evaluation for NEC or spontaneous intestinal perforation, patent ductus arteriosus treatment, and electrolyte derangements (Figure 1). In many cases, no specific reason could be identified in chart review for not advancing toward full enteral feeds.</p><p><b>Conclusion:</b> In this ongoing QI project, our SMART aim is to reduce the number of days to reach full feeds over the period of January 2024 to June 2025 by 10%, informed by root cause analysis and the key driver diagram developed from the data thus far (Figure 2). The first plan-do-study-act cycle started on January 16, 2024. Data are analyzed using statistical process control methods.</p><p></p><p>Pareto Chart.</p><p><b>Figure 1.</b></p><p></p><p>Key Driver Diagram.</p><p><b>Figure 2.</b></p><p>Bridget Hron, MD, MMSc<sup>1</sup>; Katelyn Ariagno, RD, LDN, CNSC, CSPCC<sup>1</sup>; Matthew Mixdorf<sup>1</sup>; Tara McCarthy, MS, RD, LDN<sup>1</sup>; Lori Hartigan, ND, RN, CPN<sup>1</sup>; Jennifer Lawlor, RN, BSN, CPN<sup>1</sup>; Coleen Liscano, MS, RD, CSP, LDN, CNSC, CLE, FAND<sup>1</sup>; Michelle Raymond, RD, LDN, CDCES<sup>1</sup>; Tyra Bradbury, MPH, RD, CSP, LDN<sup>1</sup>; Erin Keenan, MS, RD, LDN<sup>1</sup>; Christopher Duggan, MD, MPH<sup>1</sup>; Melissa McDonnell, RD, LDN, CSP<sup>1</sup>; Rachel Rosen, MD, MPH<sup>1</sup>; Elizabeth Hait, MD, MPH<sup>1</sup></p><p><sup>1</sup>Boston Children's Hospital, Boston, MA</p><p><b>Financial Support:</b> Some investigators received support from agencies including the National Institutes of Health and NASPGHAN, which did not directly fund this project.</p><p><b>Background:</b> The widespread shortage of amino acid-based formula in February 2022 highlighted the need for an urgent and coordinated hospital response to disseminate accurate information to front-line staff in both inpatient and ambulatory settings.</p><p><b>Methods:</b> An interdisciplinary working group consisting of pediatric gastroenterologists, dietitians, nurses and a quality improvement analyst was established in September 2022. The group met at regular intervals to conduct a needs assessment for all disciplines. The group developed and refined a novel clinical process algorithm to respond to reports of formula shortages and/or recalls. The Clinical Process Map is presented in Figure 1. Plan-do-study-act cycles were implemented to improve the quality of the communication output based on staff feedback. The key performance indicator is time from notification of a possible shortage to dissemination of communication to stakeholders, with a goal of &lt; 24 hours.</p><p><b>Results:</b> From September 2022 to August 2024, the group met 18 times for unplanned responses to formula recall/shortage events.
Email communication was disseminated within 24 hours for 8/18 (44%) events, within 48 hours for 9/18 (50%), and after 48 hours for 1/18 (6%). Iterative changes included the initiation of an urgent huddle for key stakeholders to identify impact and substitution options; development of a preliminary investigation pathway to ensure validity of the report; development of a structured email format that was further refined to a table format including images of products (Figure 2); and creation of an email distribution list to disseminate shortage reports. The keys to this project's success were the creation of a multidisciplinary team dedicated to meeting urgently for all events, and the real-time drafting and approval of communication within the meeting. Of note, the one communication which was substantially delayed (94.7 hours) was addressed over email only, underscoring the importance of the multidisciplinary working meeting.</p><p><b>Conclusion:</b> Establishing clear lines of communication and assembling key stakeholders resulted in timely, accurate and coordinated communication regarding nutrition recalls/shortage events at our institution.</p><p></p><p><b>Figure 1.</b> Formula Recall Communication Algorithm.</p><p></p><p><b>Figure 2.</b></p><p>Nicole Misner, MS, RDN<sup>1</sup>; Michelle Yavelow, MS, RDN, LDN, CNSC, CSP<sup>1</sup>; Athanasios Tsalatsanis, PhD<sup>1</sup>; Racha Khalaf, MD, MSCS<sup>1</sup></p><p><sup>1</sup>University of South Florida, Tampa, FL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Early introduction of peanuts and eggs decreases the incidence of peanut and egg allergies in infants who are at high risk of developing food allergies. Prevention guidelines and clinical practice have shifted with recent studies. However, little is known about introducing common food allergens in infants fed via enteral feeding tubes. Early introduction of allergens could be of importance in infants working toward a tube feeding wean and those who may benefit from blended tube feeding in the future. We aimed to compare the characteristics of patients with enteral tubes who received education during their gastroenterology visit versus those who did not.</p><p><b>Methods:</b> We performed a single-center retrospective chart review involving all patients 4 to 24 months of age with an enteral feeding tube seen at the University of South Florida Pediatric Gastroenterology clinic from August 2020 to July 2024. Corrected age was used for infants born at &lt; 37 weeks’ gestational age. All types of enteral feeding tube were included (i.e., nasogastric, gastrostomy, and gastrostomy-jejunostomy). Data on demographics, clinical characteristics and parent-reported food allergen exposure were collected. An exemption waiver was granted by the University of South Florida Institutional Review Board for this retrospective chart review. Differences between patients who received education and those who did not were evaluated using Student's t-test for continuous variables and the chi-square test for categorical variables. All analysis was performed using R Statistical Software (v4.4.2). A p-value ≤ 0.05 was considered statistically significant.</p><p><b>Results:</b> A total of 77 patients met inclusion criteria, with 349 total visits. Patient demographics at each visit are shown in Table 1. There was a documented food allergy in 12% (43) of total visits. Education on early introduction of common food allergens was provided in 12% (42) of total visits.
Patients who received education at their visit were significantly younger compared to those who did not and were also more likely to have eczema. Table 2 compares nutrition characteristics of patients at visits where education was discussed vs. those where it was not. Infants with any percent of oral intake were more likely to have received education than those who were nil per os (p = 0.013). There was a significant association between starting solids and receiving education (p &lt; 0.001). Reported allergen exposure across all visits was low. For total visits with the patient &lt; 8 months of age (n = 103), only 6% (6) reported peanut and 8% (8) egg exposure. When expanded to &lt; 12 months of age at the time of visit (n = 198), there was a minimal increase in reported allergen exposure: 7% (14) reported peanut and 8% (15) egg exposure. Oral feeds were the most common reported form of allergen exposure. Only one patient received a commercial early allergen introduction product. Cow's milk was the most commonly reported allergen exposure, with 61% (63) under 8 months and 54% (106) under 12 months of age, the majority from infant formula.</p><p><b>Conclusion:</b> Age and any proportion of oral intake were associated with receiving education on common food allergen introduction at the visit. However, there were missed opportunities for education in infants with enteral feeding tubes. There were few visits at which peanut or egg exposure was reported. Further research and national guidelines are needed on optimal methods of introduction in this population.</p><p><b>Table 1.</b> Demographics.</p><p></p><p><b>Table 2.</b> Nutrition Characteristics.</p><p></p><p>Samantha Goedde-Papamihail, MS, RD, LD<sup>1</sup>; Ada Lin, MD<sup>2</sup>; Stephanie Peters, MS, CPNP-PC/AC<sup>2</sup></p><p><sup>1</sup>Nationwide Children's Hospital, Grove City, OH; <sup>2</sup>Nationwide Children's Hospital, Columbus, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Critically ill children with severe acute kidney injury (AKI) requiring continuous renal replacement therapy (CRRT) are at high risk for micronutrient deficiencies, including vitamin C (VC). VC acts as an antioxidant and enzyme cofactor. It cannot be made endogenously and must be derived from the diet. Critically ill patients often enter the hospital with some degree of nutritional debt and may have low VC concentrations upon admission. This is exacerbated in the case of sepsis, multi-organ dysfunction, burns, and similar conditions, when VC needs are higher to address increased oxidative and inflammatory stress. When these patients develop severe AKI requiring CRRT, their kidneys do not reabsorb VC as healthy kidneys would, further increasing losses. Moreover, VC is a small molecule and is filtered out of the blood via CRRT. The combination of increased VC losses and elevated VC requirements in critically ill patients on CRRT results in a high risk of VC deficiency. The true prevalence of VC deficiency in this population is not well known, whereas the prevalence of deficiency is, on average, 5.9% in the general population and 18.3% in critically ill children.
The aim of this study is to ascertain the prevalence of VC deficiency in critically ill pediatric patients on CRRT by monitoring serum VC levels throughout their CRRT course and correcting noted deficiencies.</p><p><b>Methods:</b> An observational study was conducted from May 2023 through August 2024 in a 54-bed high-acuity PICU offering ECMO, CRRT, Level 1 Trauma, burn care, solid organ transplant and bone marrow transplantation as tertiary care services. Fifteen patients were identified upon initiation of CRRT and followed until transfer from the PICU. Serial serum VC levels were checked after 5-10 days on CRRT and rechecked weekly thereafter as appropriate. Deficiency was defined as VC concentrations &lt; 11 umol/L. Inadequacy was defined as concentrations between 11 and 23 umol/L. Supplementation was initiated for levels &lt; 23 umol/L; doses varied from 250 to 500 mg/d depending on age and clinical situation. Most deficient patients received 500 mg/d supplementation. Those with inadequacy received 250 mg/d.</p><p><b>Results:</b> Of 15 patients, 9 had VC deficiency and 4 had VC inadequacy (Figure 1). Of those with deficiency, 5 of 9 patients were admitted for septic shock (Figure 2). VC level was rechecked in 8 patients; the level returned to normal in 5 patients, and 4 of those 5 received 500 mg/d supplementation. Levels remained low in 3 patients; all received 250 mg/d supplementation (Figure 3). Supplementation dose changes are noted in Figure 4.</p><p><b>Conclusion:</b> VC deficiency was present in 60% of CRRT patients, suggesting deficiency is more common in this population and that critically ill patients on CRRT are at higher risk of developing deficiency than those who are not receiving CRRT. Septic shock and degree of VC deficiency appeared to be correlated; 56% of deficient patients were admitted with septic shock. Together, this suggests a need to start supplementation earlier, perhaps upon CRRT initiation, or even upon admission to the PICU in a septic patient, and to utilize higher supplementation doses, as our patients with low VC levels at their follow-up check were all receiving 250 mg/d. Study limitations include (1) the potential for VC deficiency prior to CRRT initiation, with critical illness and/or poor intake as confounding factors, and (2) the small sample size. Our results suggest that critically ill children with AKI requiring CRRT are at increased risk of VC deficiency while on CRRT. Future research should focus on identifying at-risk micronutrients for this patient population and creating supplementation regimens to prevent the development of deficiencies.
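<p>As an illustration of the classification and dosing logic described in the methods above, the short Python sketch below applies the stated thresholds (deficiency &lt; 11 umol/L, inadequacy 11-23 umol/L, supplementation for levels &lt; 23 umol/L); the function name and example values are hypothetical, and actual dosing in the study also depended on age and clinical situation.</p><pre>
# Simplified sketch of the vitamin C classification and supplementation
# approach described above; real dosing also considered age and clinical context.
def classify_vitamin_c(level_umol_per_L):
    if level_umol_per_L >= 23:
        return "adequate", 0          # no supplementation
    elif level_umol_per_L >= 11:
        return "inadequate", 250      # mg/day, per the approach described
    else:
        return "deficient", 500       # mg/day, most deficient patients

for level in (8, 15, 30):             # umol/L, hypothetical values
    status, dose = classify_vitamin_c(level)
    print(level, status, dose)
</pre>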
Our institution is currently crafting a quality improvement project with these aims.</p><p></p><p><b>Figure 1.</b> Initial Serum Vitamin C Levels of Our Patients on CRRT, Obtained 5-10 Days After CRRT Initiation (N = 15).</p><p></p><p><b>Figure 2.</b> Underlying Disease Process of Patients on CRRT (N = 15).</p><p></p><p><b>Figure 3.</b> Follow Up Vitamin C Levels After Supplementation (N = 8), Including Supplementation Regimen Prior to Follow-Up Lab (dose/route).</p><p></p><p><b>Figure 4.</b> Alterations in Supplementation Regimen (dose) Based on Follow-up Lab Data (N = 6).</p><p>Tanner Sergesketter, RN, BSN<sup>1</sup>; Kanika Puri, MD<sup>2</sup>; Emily Israel, PharmD, BCPS, BCPPS<sup>1</sup>; Ryan Pitman, MD, MSc<sup>3</sup>; Elaina Szeszycki, BS, PharmD, CNSC<sup>2</sup>; Ahmad Furqan Kazi, PharmD, MS<sup>1</sup>; Ephrem Abebe, PhD<sup>1</sup></p><p><sup>1</sup>Purdue University College of Pharmacy, West Lafayette, IN; <sup>2</sup>Riley Hospital for Children at Indiana University Health, Indianapolis, IN; <sup>3</sup>Indiana University, Indianapolis, IN</p><p><b>Financial Support:</b> The Gerber Foundation.</p><p><b>Background:</b> During the hospital-to-home transition period, family members or caregivers of medically complex children are expected to assume the responsibility of managing medication and feeding regimens for the child under their care. However, this transition period represents a time of high vulnerability, including the risk of communication breakdowns and lack of tools tailored to caregivers’ context. This vulnerability is further heightened when a language barrier is present as it introduces additional opportunities for misunderstandings leading to adverse events and errors in the home setting. Hence, it is critical that caregivers are educated to develop skills that aid in successful implementation of post-discharge care plans. Addressing unmet needs for caregivers who use languages other than English (LOE) requires an in-depth understanding of the current challenges associated with educating and preparing caregivers for the post-discharge period.</p><p><b>Methods:</b> In this prospective qualitative study, healthcare workers (HCWs) were recruited from a tertiary care children's hospital in central Indiana and were eligible to participate if they were involved directly or indirectly in preparing and assisting families of children under three years of age with medication and feeding needs around the hospital discharge period and/or outpatient care following discharge. Each HCW completed a brief demographic survey followed by observation on the job for 2-3 hours, and participated in a follow-up semi-structured interview using video conferencing technology to expand on observed behavior. Handwritten field notes were taken during the observations, which were immediately typed and expanded upon post-observation. Audio recordings from the interviews were transcribed and de-identified. Both the typed observation notes and interview transcripts were subjected to thematic content analysis, which was completed using the Dedoose software.</p><p><b>Results:</b> Data collection is ongoing with anticipated completion in October 2024. Fourteen HCW interviews have been completed to date, with a target sample of 20-25 participants. Preliminary analysis presented is from transcripts of seven interviews. Participants included six females and one male with a mean age of 35.4 years (range, 24 - 59). 
HCWs were from diverse inpatient and outpatient clinical backgrounds including registered dietitians, physicians, pharmacists, and nurses. Four overarching themes describe the challenges that HCWs experience when communicating with caregivers who use LOE during the hospital-to-home transition. These themes include lack of equipment and materials in diverse languages, challenges with people and technologies that assist with translating information, instructions getting lost in translation/uncertainty of translation, and difficulty getting materials translated in a timely manner. Main themes, subthemes, and examples are presented in Figure 1, and themes, subthemes, and quotes are presented in Table 1.</p><p><b>Conclusion:</b> The study is ongoing; however, based on the preliminary analysis, it is evident that the systems and processes that are in place to aid in communication between HCWs and caregivers who use LOE can be improved. This can ultimately lead to improved quality of care provided to caregivers who use LOE during the hospital-to-home transition and resultant safer care in the home setting for medically complex children.</p><p><b>Table 1.</b> Themes, Subthemes, and Quotes.</p><p></p><p></p><p><b>Figure 1.</b> Main Themes, Subthemes, and Examples.</p><p>Ruthfirst Ayande, PhD, MSc, RD<sup>1</sup>; Shruti Gupta, MD, NABBLM-C<sup>1</sup>; Sarah Taylor, MD, MSCR<sup>1</sup></p><p><sup>1</sup>Yale University, New Haven, CT</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Growth faltering among preterm neonates admitted to NICUs may be managed with hypercaloric and/or hypervolemic feeding. However, in intractable cases of growth faltering or when fluid restriction is indicated, fortification with extensively hydrolyzed liquid protein may offer a solution to meeting the high protein demands for infant growth while limiting overfeeding. Yet, there is limited clinical data regarding supplementation with liquid protein, leaving clinicians to make decisions about dosing and duration on a case-by-case basis.</p><p><b>Methods:</b> We present a case of neonatal growth faltering managed with a liquid protein modular in a level IV NICU in North America.</p><p><b>Results:</b> Case Summary: A male infant was born extremely preterm (GA: 24 1/7) and admitted to the NICU for respiratory distress, requiring intubation. The NICU course was complicated by patent ductus arteriosus (PDA) requiring surgery on day of life (DOL) 31, and by severe bronchopulmonary dysplasia. Birth Anthropometrics: weight: 0.78 kg; height: 31.5 cm. TPN was initiated at birth, with trophic feeds of donor human milk per gavage (PG) for a total provision of 117 ml/kg, 75 kcal/kg, and 3.5 gm/kg of protein. The regimen was advanced per unit protocol; however, on DOL 5, total volume was decreased in the setting of metabolic acidosis and significant PDA. PG feeds of maternal milk were fortified to 24 kcal/oz on DOL 23, and the infant reached full feeds on DOL 26. Feed provision by DOL 28 was ~144 ml/kg/day and 4 g/kg of protein based on estimated dry weight. TPN was first discontinued on DOL 25 but restarted on DOL 32 due to frequent NPO status and clinical instability. Of note, the infant required diuretics during the hospital stay. TPN was again discontinued on DOL 43. At DOL 116, the infant was receiving and tolerating PG feeds fortified to 24 kcal/oz at 151 ml/kg, 121 kcal/kg, and 2.1 gm/kg protein.
The infant's weight gain rate was 56 g/day; however, linear growth was impaired, with a gain of 0.5 cm over 21 days (a rate of ~0.2 cm/week). Liquid protein was commenced at DOL 124 to supply an additional 0.5 gm/kg of protein. A week after adding liquid protein, the infant's weight gain rate was 39 g/day, and height increased by 2.5 cm/week. Feed fortification was reduced to 22 kcal/oz on DOL 156 due to rapid weight gain, and the liquid protein dosage was increased to 0.6 gm/kg for a total protein intake of 2.2 g/kg. At DOL 170, calorie fortification of maternal milk was discontinued, and the liquid protein dosage was increased to 1 g/kg in the setting of a relapse of poor linear growth, for a total protein intake of 3.1 g/kg. Liquid protein was provided for two months until discontinuation (d/c) at DOL 183 per parent request. At the time of d/c of liquid protein, the infant's weight and length gain rates for the protein supplementation period (59 days) were 42 gm/day and 1.78 cm/week, respectively.</p><p><b>Conclusion:</b> While we observed objective increases in linear growth for the presented case following the addition of a liquid protein modular, it is crucial to note that these findings are not generalizable, and evidence and guidelines on the use of hydrolyzed liquid protein are limited. Larger, well-controlled studies examining mechanisms of action, appropriate dosage and duration, short- and long-term efficacy, and safety are required to guide best practices for using these modulars.</p><p>Sarah Peterson, PhD, RD<sup>1</sup>; Nicole Salerno, BS<sup>1</sup>; Hannah Buckley, RDN, LDN<sup>1</sup>; Gretchen Coonrad, RDN, LDN<sup>1</sup></p><p><sup>1</sup>Rush University Medical Center, Chicago, IL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Accurate assessment of growth and nutritional status is critical for preterm infants. Various growth charts have been developed to track the growth of preterm infants, but differences in reference standards may influence the diagnosis and management of malnutrition. The goal of this study was to compare the rate of malnutrition, defined by a decline in weight-for-age z-score, using the Fenton growth chart, the Olsen growth chart, and the INTERGROWTH-21st Preterm Postnatal Growth Standard.</p><p><b>Methods:</b> All preterm infants born between 24 and 37 weeks of gestational age who were admitted to the neonatal intensive care unit (NICU) in 2022 and had weight at birth and on day 28 recorded were included. Preterm infants were excluded if they were admitted to the NICU ≥ seven days after birth. Sex and gestational age were recorded for each infant. Weight and weight-for-age z-score at birth and on day 28 were recorded. Weight-for-age z-score was determined using three growth charts: (1) the Fenton growth chart, which was developed using growth data from uncomplicated preterm infants who were at least 22 weeks of gestational age from the United States (US), Australia, Canada, Germany, Italy, and Scotland; (2) the Olsen growth chart, which was developed using growth data from uncomplicated preterm infants who were at least 23 weeks of gestational age from the US; and (3) the INTERGROWTH-21st Preterm Postnatal Growth Standard, which was developed using growth data from uncomplicated preterm infants who were at least 24 weeks of gestational age from the US, United Kingdom, Brazil, China, India, Italy, Kenya, and Oman. Change in weight-for-age z-score from birth to day 28 was calculated using z-scores from each growth chart.
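<p>As a minimal sketch of the calculation just described, the Python snippet below computes the birth-to-day-28 decline in weight-for-age z-score and applies the severity cut-points defined in the next sentence; the z-scores themselves are assumed to come from one of the three growth references, and the example values are hypothetical, not patients from this cohort.</p><pre>
# Illustrative sketch: classify malnutrition from the decline in
# weight-for-age z-score between birth and day 28. The z-scores come from a
# growth reference (Fenton, Olsen, or INTERGROWTH-21st) and are taken as
# given here; the example values are hypothetical.
def malnutrition_grade(z_birth, z_day28):
    decline = z_birth - z_day28
    if decline >= 2.0:
        return "severe"
    elif decline >= 1.2:
        return "moderate"
    elif decline >= 0.8:
        return "mild"
    return "no malnutrition"

print(malnutrition_grade(-0.36, -1.00))   # decline of 0.64 -> no malnutrition
print(malnutrition_grade(-0.50, -1.45))   # decline of 0.95 -> mild
</pre>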
Malnutrition was defined as a decline in weight-for-age z-score of ≥0.8; any infant meeting this cut-point had their malnutrition status further classified into three categories: (1) mild malnutrition was defined as a weight-for-age z-score decline between 0.8 and 1.1, (2) moderate malnutrition was defined as a weight-for-age z-score decline between 1.2 and 1.9, and (3) severe malnutrition was defined as a weight-for-age z-score decline of ≥2.0.</p><p><b>Results:</b> The sample included 102 preterm infants, 58% male, with a mean gestational age of 29.3 weeks. At birth, the average weight was 1,192 grams, and the average weight-for-age z-score was -0.50, -0.36, and -1.14 using the Olsen, Fenton, and INTERGROWTH-21st growth charts, respectively. At 28 days, the average weight was 1,690 grams, and the average weight-for-age z-score was -0.96, -1.00, and -1.43 using the Olsen, Fenton, and INTERGROWTH-21st growth charts, respectively. Using the Olsen growth chart, 29 infants met the criteria for malnutrition; 15 had mild malnutrition and 14 had moderate malnutrition. Using the Fenton growth chart, 33 infants met the criteria for malnutrition; 21 had mild malnutrition and 12 had moderate malnutrition. Using the INTERGROWTH-21st growth chart, 32 infants met the criteria for malnutrition; 14 had mild malnutrition, 14 had moderate malnutrition, and 4 had severe malnutrition. In total, 24 infants met the criteria for malnutrition using all three growth charts, while 19 infants were only categorized as malnourished by one of the three growth charts.</p><p><b>Conclusion:</b> The findings of this study reveal important discrepancies in average z-scores and the classification of malnutrition among preterm infants when using the Olsen, Fenton, and INTERGROWTH-21st growth charts. These differences suggest that the choice of growth chart has important implications for identifying rates of malnutrition. Therefore, standardized guidelines for growth monitoring in preterm infants are necessary to ensure consistent and accurate diagnosis of malnutrition.</p><p>Emaan Abbasi, BSc<sup>1</sup>; Debby Martins, RD<sup>2</sup>; Hannah Piper, MD<sup>2</sup></p><p><sup>1</sup>University of Galway, Vancouver, BC; <sup>2</sup>BC Children's Hospital, Vancouver, BC</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Infants with gastroschisis have variable intestinal function, with some achieving enteral independence within a few weeks and others remaining dependent on parenteral nutrition (PN) for prolonged periods. Establishing enteral feeds can be challenging for many of these neonates due to poor intestinal motility, frequent emesis and/or abdominal distension. Therefore, many care teams use standardized post-natal nutrition protocols in an attempt to minimize PN exposure and maximize oral feeding. However, it remains unclear whether initiating continuous feeds is advantageous or whether bolus feeding is preferred. Potential benefits of bolus feeding include that it is more physiologic and that the feeds can be given orally, but there remain concerns about feed tolerance and prolonged periods of withholding feeds with this approach. The objective of this study was to compare the initial feeding strategy in infants with gastroschisis to determine whether bolus feeding is a feasible approach.</p><p><b>Methods:</b> After obtaining REB approval (H24-01052), a retrospective chart review was performed in neonates born with gastroschisis, cared for by a neonatal intestinal rehabilitation team between 2018 and 2023.
A continuous feeding protocol was used between 2018 and 2020 (human milk at 1 ml/h with 10 ml/kg/d advancements given continuously until 50 ml/kg/d and then trialing bolus feeding) and a bolus protocol was used between 2021 and 2023 (10-15 ml/kg divided into 8 feeds/d with 15-20 ml/kg/d advancements). Clinical data were collected, including gestational age, gastroschisis prognosis score (GPS), need for intestinal resection, age when feeds were initiated, time to full feeds, route of feeding at full feeds, and hepatic cholestasis; these were compared between groups. Welch's t-test and the chi-square test were performed to compare variables, with p-values &lt; 0.05 considered significant.</p><p><b>Results:</b> Forty-one infants with gastroschisis were reviewed (23 who were managed with continuous feed initiation and 18 with bolus feed initiation). Continuous feed and bolus feed groups had comparable mean gestational age at birth, GPS score, need for intestinal surgery, age at feed initiation, and incidence of cholestasis (Table 1). Time to achieve enteral independence was similar between both groups, with nearly half of infants reaching full feeds by 6 weeks (48% continuous feeds vs. 44% bolus feeds) and most by 9 weeks of life (74% continuous vs. 72% bolus). Significantly more infants in the bolus feeding group were feeding exclusively orally compared to the continuous feeding group at the time of reaching full enteral feeds (50% vs. 17%, p = 0.017).</p><p><b>Conclusion:</b> Initiating bolus enteral feeding for infants with gastroschisis, including those requiring intestinal resection, is safe and does not result in prolonged time to full enteral feeding. Avoiding continuous feeds may improve oral feeding in this population.</p><p><b>Table 1.</b> Clinical Characteristics and Initial Feeding Strategy.</p><p></p><p><b>International Poster of Distinction</b></p><p>Matheus Albuquerque<sup>1</sup>; Diogo Ferreira<sup>1</sup>; João Victor Maldonado<sup>2</sup>; Mateus Margato<sup>2</sup>; Luiz Eduardo Nunes<sup>1</sup>; Emanuel Sarinho<sup>1</sup>; Lúcia Cordeiro<sup>1</sup>; Amanda Fifi<sup>3</sup></p><p><sup>1</sup>Federal University of Pernambuco, Recife, Pernambuco; <sup>2</sup>University of Brasilia, Brasília, Distrito Federal; <sup>3</sup>University of Miami, Miami, FL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Intestinal failure secondary to short bowel syndrome is a malabsorptive condition caused by intestinal resection. Patients with intestinal failure require parenteral support to maintain hydration and nutrition. Long-term parenteral nutrition leads to complications. Teduglutide, an analog of GLP-2, may improve intestinal adaptation, thereby minimizing reliance on parenteral nutrition. This meta-analysis evaluates the efficacy of teduglutide in reducing parenteral nutrition dependency in pediatric patients with intestinal failure.</p><p><b>Methods:</b> We included randomised controlled trials (RCTs) that assessed the efficacy of teduglutide in reducing parenteral nutrition support and improving anthropometrics in pediatric patients with intestinal failure secondary to short bowel syndrome. The RoB-2 tool (Cochrane) evaluated the risk of bias, and statistical analyses were conducted utilizing RevMan 5.4.1 software. The results are expressed as mean differences with 95% CIs and p-values.</p><p><b>Results:</b> Data was extracted from three clinical trials, involving a total of 172 participants.
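<p>For readers unfamiliar with how trial-level effects are combined into the pooled mean differences reported below, the following Python sketch shows a generic fixed-effect, inverse-variance pooling with a 95% CI; the abstract's analysis was performed in RevMan 5.4.1, so this is only an illustration of the underlying arithmetic, and the trial-level values are hypothetical, not the included trials' data.</p><pre>
# Generic fixed-effect, inverse-variance pooling of per-trial mean differences
# with a 95% CI. Trial-level values are hypothetical; the study itself used
# RevMan 5.4.1, so this is an illustration of the calculation only.
import math

# (mean difference, standard error) for each hypothetical trial
trials = [(-15.0, 5.2), (-20.5, 6.1), (-18.0, 4.8)]

weights = [1 / se**2 for _, se in trials]
pooled_md = sum(w * md for (md, _), w in zip(trials, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))
ci_low, ci_high = pooled_md - 1.96 * pooled_se, pooled_md + 1.96 * pooled_se

print(f"Pooled MD = {pooled_md:.2f} mL (95% CI {ci_low:.2f} to {ci_high:.2f})")
</pre>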
Teduglutide use was associated with a reduction in parenteral nutrition volume (-17.92 mL, 95% CI -24.65 to -11.20, p &lt; 0.00001), with most patients reducing parenteral support by &gt;20% (11.79, 95% CI 2.04 to 68.24, p = 0.006) (Figure 1). Treatment with teduglutide also improved height (0.27 Z-score, 95% CI 0.08 to 0.46, p = 0.005) but did not significantly increase weight when compared to the control group (-0.13 Z-score, 95% CI -0.41 to 0.16, p = 0.38) (Figure 2).</p><p><b>Conclusion:</b> This meta-analysis suggests that therapy with teduglutide reduces parenteral nutrition volume in patients with short bowel syndrome and intestinal failure. Reduced parenteral nutrition dependency can minimize complications and improve quality of life in patients with short bowel syndrome and intestinal failure.</p><p></p><p><b>Figure 1.</b> Parenteral Nutrition Support Volume Change.</p><p></p><p><b>Figure 2.</b> Anthropometric Data (Weight and Height) Change from Baseline.</p><p>Korinne Carr<sup>1</sup>; Liyun Zhang, MS<sup>1</sup>; Amy Pan, PhD<sup>1</sup>; Theresa Mikhailov, MD, PhD<sup>2</sup></p><p><sup>1</sup>Medical College of Wisconsin, Milwaukee, WI; <sup>2</sup>Childrens Hospital of Wisconsin, Milwaukee, WI</p><p><b>Financial Support:</b> Medical College of Wisconsin, Department of Pediatrics.</p><p><b>Background:</b> Malnutrition is a significant concern in pediatric patients, particularly those who are critically ill. In children with diabetes mellitus (DM), the presence of malnutrition can exacerbate complications such as unstable blood sugar levels and delayed wound healing, potentially leading to worse clinical outcomes. Despite the known impact of malnutrition on adult patients and critically ill children, established criteria for identifying malnutrition in critically ill children are lacking. This study was designed to determine the relationship between malnutrition, mortality, and length of stay (LOS) in critically ill pediatric patients with diabetes mellitus.</p><p><b>Methods:</b> We conducted a retrospective cohort study using the VPS (Virtual Pediatric Systems, LLC) Database. We categorized critically ill pediatric patients with DM as malnourished or at risk of being malnourished based on admission nutrition screens. We compared mortality rates between malnourished and non-malnourished patients using Fisher's exact test. We used logistic regression analysis to compare mortality, controlling for PRISM3 (a severity of illness measure) and demographic and clinical factors. We compared the LOS in the Pediatric Intensive Care Unit (PICU) between malnourished and non-malnourished patients using the Mann-Whitney-Wilcoxon test. Additionally, we used a general linear model with appropriate transformation to adjust for severity of illness, demographic, and clinical factors. We considered statistical significance at p &lt; 0.05.</p><p><b>Results:</b> We analyzed data for 4,014 patients, of whom 2,653 were screened for malnutrition. Of these 2,653, 88.5% were type 1 DM, 9.3% were type 2 DM, and the remaining patients were unspecified DM. Of the 2,653 patients, 841 (31.7%) were malnourished based on their nutrition screen at admission to the PICU. Mortality in patients who were screened as malnourished did not differ from mortality in those who were not malnourished (0.4% vs. 0.2%, p = 0.15). Malnourished patients also had longer PICU LOS, with a geometric mean and 95% CI of 1.03 (0.94–1.13) days, compared to 0.91 (0.86–0.96) days for non-malnourished patients.
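<p>As a methodological aside, a geometric mean with a 95% CI of the kind reported for LOS here is typically obtained by averaging log-transformed values and back-transforming; the short Python sketch below illustrates that calculation with hypothetical LOS values, not the VPS data or the adjusted model described in the methods.</p><pre>
# Sketch of a geometric mean with a 95% CI computed on the log scale, as is
# common for skewed length-of-stay data; the values below are hypothetical.
import math
import statistics

los_days = [0.5, 0.9, 1.2, 2.0, 0.8, 1.5, 3.1, 0.7]   # hypothetical PICU LOS
logs = [math.log(x) for x in los_days]
mean_log = statistics.mean(logs)
se_log = statistics.stdev(logs) / math.sqrt(len(logs))

geo_mean = math.exp(mean_log)
ci_low = math.exp(mean_log - 1.96 * se_log)
ci_high = math.exp(mean_log + 1.96 * se_log)
print(f"Geometric mean = {geo_mean:.2f} d (95% CI {ci_low:.2f} to {ci_high:.2f})")
</pre>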
Similarly, the malnourished patients had longer hospital LOS, with a geometric mean and 95% CI of 5.31 (4.84–5.83) days, compared to 2.67 (2.53–2.82) days for those who were not malnourished. Both differences were significant with p &lt; 0.0001, after adjusting for age, race/ethnicity, and PRISM3.</p><p><b>Conclusion:</b> We found no difference in mortality rates, but critically ill children who were screened as malnourished had longer PICU and hospital LOS than those who were not malnourished. This was true even after adjusting for age, race/ethnicity, and PRISM3.</p><p>Emily Gutzwiller<sup>1</sup>; Katie Huff, MD, MS<sup>1</sup></p><p><sup>1</sup>Indiana University School of Medicine, Indianapolis, IN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Neonates with intestinal failure require parenteral nutrition for survival. While life-sustaining, it can lead to serious complications, including intestinal failure-associated liver disease (IFALD). The etiology of IFALD is likely multifactorial, with intravenous lipid emulsions (ILE) being a large contributor, particularly soybean oil-based lipid emulsions (SO-ILE). Alternate ILEs, including those containing fish oil, can be used to prevent and treat IFALD. Fish oil-based ILE (FO-ILE) is only approved at a dose of 1 g/kg/d, limiting calories prescribed from fat and shifting the calorie delivery to carbohydrate predominance. While FO-ILE was shown to have comparable growth to SO-ILE, a comparison to soy, MCT, olive, fish oil-based ILE (SO,MCT,OO,FO-ILE) has not been conducted to our knowledge. The purpose of this study is to compare the growth and laboratory data associated with SO,MCT,OO,FO-ILE and FO-ILE therapy in the setting of IFALD in a cohort of neonates treated during two time periods.</p><p><b>Methods:</b> We performed a retrospective chart review of patients with IFALD receiving SO,MCT,OO,FO-ILE (SMOFlipid) or FO-ILE (Omegaven) in a level IV neonatal intensive care unit from September 2016 – May 2024. IFALD was defined as direct bilirubin &gt;2 mg/dL after receiving &gt;2 weeks of parenteral nutrition. Patients with underlying genetic or hepatic diagnoses, as well as patients with elevated direct bilirubin prior to two weeks of life, were excluded. Patients were divided based on the period in which they were treated. Data was collected on demographic characteristics, parenteral and enteral nutrition, weekly labs, and growth changes while receiving SO,MCT,OO,FO-ILE or FO-ILE. Rate of change of weight, length, and head circumference and comparison of z-scores over time were studied for each ILE group. Secondary outcomes included nutritional data in addition to hepatic labs. Nonparametric analysis using the Mann-Whitney U test was conducted to compare ILE groups, and a p-value of &lt; 0.05 was used to define statistical significance.</p><p><b>Results:</b> A total of 51 patients were enrolled, 25 receiving SO,MCT,OO,FO-ILE and 26 FO-ILE. Table 1 notes the demographic and baseline characteristics of the two ILE groups. There was no difference in the rate of OFC (p = 0.984) or length (p = 0.279) growth between the two treatment groups (Table 2). There was a difference, however, in the rate of weight gain between groups (p = 0.002; Table 2), with the FO-ILE group gaining more weight over time. When comparing nutritional outcomes (Table 2), SO,MCT,OO,FO-ILE patients received greater total calories than FO-ILE patients (p = 0.005), including a higher ILE dose (p &lt; 0.001) and enteral calories (p = 0.029).
The FO-ILE group, however, received a higher carbohydrate dose (p = 0.003; Table 2). There was no difference in amino acid dose (p = 0.127) or parenteral nutrition calories (p = 0.821). Hepatic labs differed over time, with the FO-ILE group having a larger decrease in AST, direct bilirubin, and total bilirubin over time compared to the SO,MCT,OO,FO-ILE group (Table 2).</p><p><b>Conclusion:</b> Our results show the FO-ILE patients did have a significant increase in weight gain compared to the SO,MCT,OO,FO-ILE patients. This is despite SO,MCT,OO,FO-ILE patients receiving greater total calories and enteral calories. The FO-ILE group only received greater calories in the form of glucose infusion. With this increased weight gain but similar length growth between groups, concerns regarding alterations in body composition and increased fat mass arise. Further research is needed to determine the influence of these various ILE products on neonatal body composition over time.</p><p><b>Table 1.</b> Demographic and Baseline Lab Data by Lipid Treatment Group.</p><p>(All data presented as median and interquartile range, unless specified.)</p><p></p><p><b>Table 2.</b> Nutritional, Hepatic Lab, and Growth Outcomes by Lipid Treatment Group.</p><p>(All data presented as median and interquartile range, unless specified.)</p><p>*z-score change compares z-score at end and beginning of study period</p><p>OFC = occipitofrontal circumference</p><p></p><p>Rachel Collins, BSN, RN<sup>1</sup>; Brooke Cherven, PhD, MPH, RN, CPON<sup>2</sup>; Ann-Marie Brown, PhD, APRN, CPNP-AC/PC, CCRN, CNE, FCCM, FAANP, FASPEN<sup>1</sup>; Christina Calamaro, PhD, PPCNP-BC, FNP-BC, FAANP, FAAN<sup>3</sup></p><p><sup>1</sup>Emory University Nell Hodgson Woodruff School of Nursing, Atlanta, GA; <sup>2</sup>Emory University School of Medicine; Children's Healthcare of Atlanta, Atlanta, GA; <sup>3</sup>Emory University Nell Hodgson Woodruff School of Nursing; Children's Healthcare of Atlanta, Atlanta, GA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Many children receiving hematopoietic stem cell transplant (HSCT) are malnourished prior to the start of transplant or develop malnutrition post-infusion. Children are at risk for malnutrition due to the pre-transplant conditioning phase, from chemotherapy treatments for their primary diagnosis, and from acute graft versus host disease (GVHD). These factors can cause impaired oral feeding, vomiting, diarrhea, and mucositis. Enteral nutrition (EN) and parenteral nutrition (PN) are support options for this patient population when oral feeding is impaired. There is currently a paucity of research and evidence-based guidance on nutrition in this population. The purpose of this integrative literature review was to determine the outcomes of EN and PN in pediatric HSCT and to discuss evidence-based implications for practice to positively affect quality of life for these patients.</p><p><b>Methods:</b> A literature search was conducted using the following databases: PubMed, CINAHL, Cochrane, and Healthsource: Nursing/Academic Edition. The search strategy included randomized controlled trials, prospective and retrospective cohort studies, case control studies, cross sectional studies, systematic reviews, and meta-analyses. Relevant papers were utilized if they discussed patients 0-21 years of age, allogeneic or autologous HSCT, and enteral and/or parenteral nutrition.
Papers were excluded if there was no English translation, they did not discuss nutrition, or they had animal subjects.</p><p><b>Results:</b> Initially, 477 papers were identified, and after the screening process 15 papers were utilized for this integrative review. EN and PN have effects on clinical outcomes, complications, and hospital and survival outcomes. EN was associated with faster platelet engraftment, improved gut microbiome, and decreased mucositis and GVHD. PN was used more often in severe mucositis, because mucositis interferes with feeding tube placement, thereby decreasing the use of EN. Use of PN is more common in severe grades III-IV of gut GVHD. Initiation of EN later in treatment, such as after conditioning and in the presence of mucositis, can be associated with severe grades III-IV of gut GVHD. This is because conditioning can cause damage to the gut, leading to mucosal atrophy and intestinal permeability that alter the gut microbiota. PN can induce gut mucosal atrophy and dysbiosis, allowing for bacterial translocation, while EN improves the gut epithelium and microbiota, reducing translocation. Additionally, the increased central venous line access required for PN can introduce bacterial infections into the bloodstream. Feeding tube placement complications include dislodgement, refusal of replacement, and increased risk of bleeding due to thrombocytopenia. Electrolyte imbalance can be attributed to loss of absorption through the gut. If a gastrostomy tube is present, there can be infection at the site. There is currently no consensus on the appropriate timing of tube placement. There was no significant difference in neutrophil engraftment, and findings for morbidity/mortality and weight gain were variable. Weight gain with PN can be attributed to edema. Length of stay was significantly shorter with EN alone than with PN (p &lt; 0.0001).</p><p><b>Conclusion:</b> This literature review indicates a need for a more comprehensive nutritional assessment to adequately evaluate nutritional status before and after HSCT. EN should be given as a first-line therapy and should be considered prior to the conditioning phase. The initiation of a feeding tube prior to conditioning should be considered. Finally, PN may be considered if EN cannot be tolerated.
More research is needed for a sensitive nutritional evaluation, earlier administration of EN, and standardized pathways for EN and PN nutrition in pediatric HSCT.</p>
<p>Aluminum testing was completed in order to identify other factors that may be contributing to low ionized calcium values and osteoporosis. During patient discussion, the patient revealed they used an aluminum-containing antiperspirant once daily. The range of aluminum content in antiperspirants is unknown, but studies show that minimal absorption may be possible, especially in populations with kidney insufficiency.</p><p><b>Results:</b> After an elevated aluminum value was found on July 3, 2023 (Figure 1), the patient changed products to a non-aluminum-containing antiperspirant. Aluminum values were rechecked at 3 and 7 months. Results indicate that the patient's antiperspirant choice may have been contributing to aluminum exposure through skin absorption. Antiperspirant choice may not lead to aluminum toxicity but can contribute to an increased total daily aluminum exposure.</p><p><b>Conclusion:</b> Preventing aluminum accumulation is vital for patients receiving long-term PN support due to heightened risk of aluminum toxicity. Other potential sources of contamination outside of PN include dialysis, processed food, aluminum foil, cosmetic products (antiperspirants, deodorant, toothpaste), medications (antacids), vaccinations, work environments involving aluminum welding, and certain processing industry plants. Aluminum content of medications and PN additives varies based on brand and amount. Clinicians should review all potential aluminum-containing sources and assess ways to reduce aluminum exposure and prevent potential aluminum toxicity in long-term PN patients.</p><p><b>Table 1.</b> Patient Demographics.</p><p></p><p><b>Table 2.</b> Aluminum Content in PN Prescription.</p><p></p><p></p><p><b>Figure 1.</b> Aluminum Lab Value Result.</p><p>Haruka Takayama, RD, PhD<sup>1</sup>; Kazuhiko Fukatsu, MD, PhD<sup>2</sup>; Midori Noguchi, BA<sup>3</sup>; Nana Matsumoto, RD, MS<sup>2</sup>; Tomonori Narita, MD<sup>4</sup>; Reo Inoue, MD, PhD<sup>3</sup>; Satoshi Murakoshi, MD, PhD<sup>5</sup></p><p><sup>1</sup>St. Luke's International Hospital, Chuo-ku, Tokyo; <sup>2</sup>The University of Tokyo, Bunkyo-ku, Tokyo; <sup>3</sup>The University of Tokyo Hospital, Bunkyo-ku, Tokyo; <sup>4</sup>The University of Tokyo, Chuo-City, Tokyo; <sup>5</sup>Kanagawa University of Human Services, Yokosuka-city, Kanagawa</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Our previous study demonstrated that beta-hydroxy-beta-methylbutyrate (HMB)-supplemented total parenteral nutrition (TPN) partially restores the gut-associated lymphoid tissue (GALT) atrophy observed in standard TPN-fed mice. Oral intake of HMB is now popular among bodybuilders and athletes.
Herein, we examined whether oral supplementation of HMB could increase GALT mass in mice that eat dietary chow ad libitum.</p><p><b>Methods:</b> Six-week-old male Institute of Cancer Research (ICR) mice were divided into the Control (n = 9), the H600 (n = 9) and the H2000 (n = 9) groups. All mice were allowed to take chow and water ad libitum for 7 days. The H600 or H2000 mice were given water containing Ca-HMB at 3 or 10 mg/mL, while the Controls drank normal tap water. Because these mice drank approximately 6-7 mL of water per day, the H600 and H2000 groups took approximately 600 and 2000 mg/kg Ca-HMB per day, respectively. After 7 days of this regimen, all mice were killed by cardiac puncture under general anesthesia, and the whole small intestine was harvested for GALT cell isolation. GALT cell numbers were evaluated in each tissue (Peyer's patches, PPs; intraepithelial spaces, IE; and lamina propria, LP). The nasal washings, bronchoalveolar lavage fluid (BALF), and intestinal washings were also collected for IgA level measurement by ELISA. The Kruskal-Wallis test was used for all parameter analyses, and the significance level was set at less than 5%.</p><p><b>Results:</b> There were no significant differences in the number of GALT cells at any site among the 3 groups (Table 1). Likewise, mucosal IgA levels did not differ between any two groups (Table 2).</p><p><b>Conclusion:</b> Oral intake of HMB does not affect GALT cell number or mucosal IgA levels when mice are given a normal diet orally. It appears that beneficial effects of HMB on the GALT are expected only in parenterally fed mice. We should examine the influence of IV HMB in an orally fed model in the next study.</p><p><b>Table 1.</b> GALT Cell Number (x10<sup>7</sup>/body).</p><p></p><p><b>Table 2.</b> IgA Levels.</p><p></p><p>Median (interquartile range). Kruskal-Wallis test. n: Control = 9, H600 = 8, H2000 = 9.</p><p>Nahoki Hayashi, MS<sup>1</sup>; Yoshikuni Kawaguchi, MD, PhD, MPH, MMA<sup>2</sup>; Kenta Murotani, PhD<sup>3</sup>; Satoru Kamoshita, BA<sup>1</sup></p><p><sup>1</sup>Medical Affairs Department, Research and Development Center, Chiyoda-ku, Tokyo; <sup>2</sup>Hepato-Biliary-Pancreatic Surgery Division, Bunkyo-ku, Tokyo; <sup>3</sup>School of Medical Technology, Kurume, Fukuoka</p><p><b>Financial Support:</b> Otsuka Pharmaceutical Factory, Inc.</p><p><b>Background:</b> The American Society for Parenteral and Enteral Nutrition guideline recommends a target energy intake of 20 to 30 kcal/kg/day in patients undergoing surgery. Infectious complications reportedly decreased when the target energy and protein intake were achieved in the early period after gastrointestinal cancer surgery. However, no studies have investigated the association of prescribed parenteral energy doses with clinical outcomes in patients who did not receive oral/tube feeding in the early period after gastrointestinal cancer surgery.</p><p><b>Methods:</b> Data from patients who underwent gastrointestinal cancer surgery during 2011–2022 and fasted for 7 days or longer after surgery were extracted from a nationwide medical claims database. The patients were divided into 3 groups based on the mean prescribed parenteral energy doses during 7 days after surgery as follows: the very-low group (&lt;10 kcal/kg/day), the low group (10–20 kcal/kg/day), and the moderate group (≥20 kcal/kg/day).
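<p>The three-way grouping by mean prescribed parenteral energy dose just described can be reproduced with a simple binning step. A minimal sketch with hypothetical per-patient doses; the bin edges follow the abstract (&lt;10, 10–20, and ≥20 kcal/kg/day):</p>

```python
import pandas as pd

# Hypothetical mean prescribed parenteral energy doses over postoperative days 1-7 (kcal/kg/day)
doses = pd.Series([6.4, 9.2, 12.7, 16.0, 19.9, 21.5, 27.0], name="kcal_per_kg_day")

# right=False makes the bins [0, 10), [10, 20), [20, inf), matching <10, 10-20, and >=20
groups = pd.cut(doses, bins=[0, 10, 20, float("inf")],
                labels=["very-low", "low", "moderate"], right=False)
print(groups.value_counts())
```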
Multivariable logistic regression model analyses were performed using in-hospital mortality, postoperative complications, length of hospital stay, and total in-hospital medical cost as the objective variables, with the 3 groups and confounding factors as the explanatory variables.</p><p><b>Results:</b> Of the 18,294 study patients, the number of patients in the very low, low, and moderate groups was 6,727, 9,760, and 1,807, respectively. The median prescribed energy doses on the 7th day after surgery were 9.2 kcal/kg, 16 kcal/kg, and 27 kcal/kg in the very low, low, and moderate groups, respectively. The adjusted odds ratio (95% confidence interval) for in-hospital mortality with reference to the very low group was 1.060 (1.057–1.062) for the low group and 1.281 (1.275–1.287) for the moderate group. That for postoperative complications was 1.030 (0.940–1.128) and 0.982 (0.842–1.144) for the low and moderate groups, respectively. The partial regression coefficient (95% confidence interval) for length of hospital stay (days) with reference to the very low group was 2.0 (0.7–3.3) and 3.2 (1.0–5.5), and that for total in-hospital medical cost (US$) was 1,220 (705–1,735) and 2,000 (1,136–2,864), for the low and moderate groups, respectively.</p><p><b>Conclusion:</b> Contrary to the guideline recommendation, prescribed energy doses of ≥10 kcal/kg/day were associated with increases in in-hospital mortality, length of hospital stay, and total in-hospital medical cost. Our findings question the efficacy of the guideline-recommended energy intake for patients during the first 7 days after gastrointestinal cancer surgery.</p><p>Jayme Scali, BS<sup>1</sup>; Gaby Luna, BS<sup>2</sup>; Kristi Griggs, MSN, FNP-C, CRNI<sup>3</sup>; Kristie Jesionek, MPS, RDN, LDN<sup>4</sup>; Christina Ritchey, MS, RD, LD, CNSC, FASPEN, FNHIA<sup>5</sup></p><p><sup>1</sup>Optum Infusion Pharmacy, Thornton, PA; <sup>2</sup>Optum Infusion Pharmacy, Milford, MA; <sup>3</sup>Optum Infusion Pharmacy, Murphy, NC; <sup>4</sup>Optum Infusion Pharmacy, Franklin, TN; <sup>5</sup>Optum Infusion Pharmacy, Bulverde, TX</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Home parenteral nutrition (HPN) is a life-sustaining nutrition support therapy that is administered through a central venous access device (CVAD). Caregivers of children on HPN are initially trained to provide CVAD care and therapy administration by their clinical team or home infusion nurse. Often there is a gap in training when the patient is ready to assume responsibility for CVAD care. Without proper training, patients are at significant risk of complications such as bloodstream infections and catheter occlusions. The purpose of this study was twofold: 1) to explore the caregiver's perspective about current and future CVAD training practices and 2) to evaluate the need for a proactive formalized CVAD training program when care is transitioned from caregiver to patient.</p><p><b>Methods:</b> An 8-question survey was created using an online software tool. The target audience included caregivers of children receiving HPN. A link to the survey was sent via email and posted on various social media platforms that support the HPN community. The survey was conducted from June 17 to July 18, 2024. Respondents who were not caregivers of a child receiving HPN via a CVAD were excluded.
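<p>A minimal sketch of the kind of adjusted model reported for the energy-dose study above, using statsmodels on simulated data. All values, covariates, and effect sizes below are hypothetical; the actual confounders used in that analysis are not listed in the abstract.</p>

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 2000
df = pd.DataFrame({
    "group": rng.choice(["very-low", "low", "moderate"], size=n, p=[0.37, 0.53, 0.10]),
    "age": rng.normal(68, 10, size=n).round(),
    "male": rng.integers(0, 2, size=n),
})
# Simulated in-hospital mortality with a modest age effect (illustration only)
logit_p = -3.5 + 0.03 * (df["age"] - 68)
df["died"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

# Dose group plus confounders as explanatory variables, very-low group as the reference
model = smf.logit("died ~ C(group, Treatment(reference='very-low')) + age + male", data=df).fit()
print(np.exp(model.params))      # adjusted odds ratios relative to the very-low group
print(np.exp(model.conf_int()))  # 95% confidence intervals
```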
The distribution of children with a CVAD receiving HPN was evenly weighted between 0 and 18 years of age. The majority of the time, initial training regarding HPN therapy and CVAD care was conducted by the HPN clinic/hospital resource/learning center or home infusion pharmacy (Table 1). Forty-eight percent of respondents indicated their HPN team never offers reeducation or shares best practices (Figure 1). Most respondents selected the best individual to train their child on CVAD care and safety is the caregiver (Figure 2). In addition, 60% of respondents selected yes, they would want their child to participate in CVAD training if offered (Figure 3).</p><p><b>Conclusion:</b> This survey confirms that most caregivers anticipate training their child to perform CVAD care when it is determined the child is ready for this responsibility. One challenge to this provision of training is that almost half of the respondents in this survey stated they never receive reeducation or best practice recommendations from their team. This finding demonstrates a need for a formalized training program to assist caregivers when transitioning CVAD care to the patient. Since most respondents reported relying on their intestinal rehab or GI/motility clinic for CVAD related concerns, these centers would be the best place to establish a transition training program. Limitations of the study are as follows: It was only distributed via select social platforms, and users outside of these platforms were not captured. Additional studies would be beneficial in helping to determine the best sequence and cadence for content training.</p><p><b>Table 1.</b> Central Venous Access Device (CVAD) Training and Support Practices.</p><p></p><p></p><p><b>Figure 1.</b> How Often Does Your HPN Team Offer Reeducation or Share Best Practices?</p><p></p><p><b>Figure 2.</b> Who is Best to Train Your Child on CVAD Care Management and Safety?</p><p></p><p><b>Figure 3.</b> If Formalized CVAD Training is Offered, Would You Want Your Child to Participate?</p><p>Laryssa Grguric, MS, RDN, LDN, CNSC<sup>1</sup>; Elena Stoyanova, MSN, RN<sup>2</sup>; Crystal Wilkinson, PharmD<sup>3</sup>; Emma Tillman, PharmD, PhD<sup>4</sup></p><p><sup>1</sup>Nutrishare, Tamarac, FL; <sup>2</sup>Nutrishare, Kansas City, MO; <sup>3</sup>Nutrishare, San Diego, CA; <sup>4</sup>Indiana University, Carmel, IN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Long-term parenteral nutrition (LTPN) within the home is a lifeline for many patients throughout the United States. Patients utilize central venous access devices (CVAD) to administer LTPN. Central line-associated bloodstream infection (CLABSI) is a serious risk associated with patients who require LTPN. Rates of CLABSI in LTPN populations range from 0.9-1.1 per 1000 catheter days. The aim of this study was to determine the incidence of CLABSI in a cohort of patients serviced by a national home infusion provider specializing in LTPN and identify variables associated with an increased incidence of CLABSI.</p><p><b>Methods:</b> A retrospective review of electronic medical records of LTPN patients with intestinal failure was queried from March 2023 to May 2024 for patient demographics, anthropometric data, nursing utilization, parenteral nutrition prescription including lipid type, length of therapy use, geographic distribution, prescriber specialty, history of CLABSI, blood culture results as available, and use of ethanol lock. 
Patient zip codes were used to determine rural health areas, as defined by the US Department of Health &amp; Human Services. Patients were divided into two groups: 1) patients that had at least one CLABSI and 2) patients with no CLABSI during the study period. Demographic and clinical variables were compared between the two groups. Nominal data were analyzed by Fisher's exact test and continuous data were analyzed with student t-test for normal distributed data and Mann-Whitney U-test was used for non-normal distributed data.</p><p><b>Results:</b> We identified 198 persons that were maintained on LTPN during the study time. The overall CLABSI rate for this cohort during the study period was 0.49 per 1000 catheter days. Forty-four persons with LTPN had one or more CLABSI and 154 persons with LTPN did not have a CLABSI during the study period. Persons who experienced CLABSI weighed significantly more, had fewer days of infusing injectable lipid emulsions (ILE), and had a shorter catheter dwell duration compared to those that did not have a CLABSI (Table 1). There was no significant difference between the CLABSI and no CLABSI groups in the length of time on LTPN, location of consumer (rural versus non-rural), utilization of home health services, number of days parenteral nutrition (PN) was infused, or use of ethanol locks (Table 1).</p><p><b>Conclusion:</b> In this retrospective cohort study, we report a CLABSI rate of 0.49 per 1000 catheter days, which is lower than previously published CLABSI rates for similar patient populations. Patient weight, days of infusing ILE, and catheter dwell duration were significantly different between those that did and did not have a CLABSI in this study period. Yet, variables such as use of ethanol lock and proximity to care providers that had previously been reported to impact CLABSI were not significantly different in this cohort. An expanded study with more LTPN patients or a longer study duration may be necessary to confirm these results and their impact on CLABSI rates.</p><p><b>Table 1.</b> Long Term Parenteral Nutrition (LTPN) Characteristics.</p><p></p><p>Silvia Figueiroa, MS, RD, CNSC<sup>1</sup>; Stacie Townsend, MS, RD, CNSC<sup>2</sup></p><p><sup>1</sup>MedStar Washington Hospital Center, Bethesda, MD; <sup>2</sup>National Institutes of Health, Bethesda, MD</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> In hospitalized patients, lipid emulsions constitute an essential component of balanced parenteral nutrition (PN). Soy oil-based lipid injectable emulsions (SO-ILE) were traditionally administered as part of PN formulations primarily as a source of energy and for prevention of essential fatty acid deficiency. Contemporary practice has evolved to incorporate mixtures of different lipid emulsions, including a combination of soy, MCT, olive, and fish oils (SO, MCT, OO, FO-ILE). Evidence suggests that the use of SO, MCT, OO, FO-ILEs may alter essential fatty acid profiles, impacting hepatic metabolism and other processes associated with clinical benefits. 
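<p>Returning to the CLABSI analysis above: the reported 0.49 per 1000 catheter days is an incidence-density figure. A minimal sketch of the arithmetic; the event and catheter-day counts below are hypothetical, since the abstract reports only the resulting rate.</p>

```python
def rate_per_1000_catheter_days(n_events: int, catheter_days: int) -> float:
    """Incidence density: CLABSI events per 1,000 catheter days."""
    return n_events / catheter_days * 1000

# Hypothetical counts chosen only to illustrate how a rate near 0.49 arises
print(rate_per_1000_catheter_days(55, 112_000))
```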
The aim of this project was to compare essential fatty acid profile (EFAP), triglycerides (TGL), liver function tests (LFTs), and total bilirubin (TB) levels in adult patients receiving parenteral nutrition with SO-ILE or SO, MCT, OO, FO-ILE in a unique hospital entirely dedicated to conducting clinical research.</p><p><b>Methods:</b> This was a retrospective chart review from 1/1/2019 to 12/31/2023 of adult patients in our hospital who received PN with SO-ILE or SO, MCT, OO, FO-ILE and had EFAP assessed after 7 days of receiving ILE. Data included demographic, clinical, and nutritional parameters. Patients with no laboratory markers, those on propofol, and those who received both ILE products in the 7 days prior to collection of EFAP were excluded. Data was statistically analyzed using Fisher's tests and Mann-Whitney U tests as appropriate.</p><p><b>Results:</b> A total of 42 patient charts were included (14 SO-ILE; 28 SO, MCT, OO, FO-ILE). Group characteristics can be found in Table 1. Patients on SO-ILE received more ILE (0.84 vs 0.79 g/kg/day, p &lt; 0.0001). TGL levels changed significantly after start of ILE (p &lt; 0.0001). LFTs were found to be elevated in 57% of patients in the SO-ILE group and 60% in the SO, MCT, OO, FO-ILE group, while TB was increased in 21% and 40% of the patients respectively (Figure 1). Further analysis showed no significant differences in LFTs and TB between the two groups. Assessment of EFAP revealed a significant difference in the levels of DHA, docosenoic acid, and EPA, which were found to be higher in the group receiving SO, MCT, OO, FO-ILE. Conversely, significant differences were also observed in the levels of linoleic acid, homo-g-linolenic acid, and total omega 6 fatty acids, with those being higher in patients administered SO ILE (Figure 2). No differences were observed between groups regarding the presence of essential fatty acid deficiency, as indicated by the triene:tetraene ratio.</p><p><b>Conclusion:</b> In our sample analysis, LFTs and TB levels did not differ significantly between SO-ILE and SO, MCT, OO, FO-ILE groups. Increased levels of DHA, docosenoic acid, and EPA were found in the SO, MCT, OO, FO-ILE group, while linoleic acid, homo-g-linolenic acid, and total omega 6 fatty acids tended to be higher in the SO-ILE group. Although in our sample the SO, MCT, OO, FO-ILE group received a lower dosage of ILE/kg/day, there were no differences in the rate of essential fatty acid deficiency between groups.</p><p><b>Table 1.</b> General Characteristics (N = 42).</p><p></p><p></p><p><b>Figure 1.</b> Liver Function Tests (N = 39).</p><p></p><p><b>Figure 2.</b> Essential Fatty Acid Profile (N = 42).</p><p>Kassandra Samuel, MD, MA<sup>1</sup>; Jody (Lind) Payne, RD, CNSC<sup>2</sup>; Karey Schutte, RD<sup>3</sup>; Kristen Horner, RDN, CNSC<sup>3</sup>; Daniel Yeh, MD, MHPE, FACS, FCCM, FASPEN, CNSC<sup>3</sup></p><p><sup>1</sup>Denver Health, St. Joseph Hospital, Denver, CO; <sup>2</sup>Denver Health, Parker, CO; <sup>3</sup>Denver Health, Denver, CO</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Early nutritional support in hospitalized patients has long been established as a key intervention to improve overall patient outcomes. Patients admitted to the hospital often have barriers to receiving adequate nutrition via enteral route means and may be candidates for parenteral nutrition (PN). 
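<p>For reference, the triene:tetraene ratio used in the preceding lipid-emulsion comparison is conventionally the ratio of Mead acid (20:3n-9) to arachidonic acid (20:4n-6). The 0.2 cutoff in this sketch is a commonly cited threshold and an assumption here, since the abstract does not state which cutoff was applied; the plasma values are hypothetical.</p>

```python
def triene_tetraene_ratio(mead_acid: float, arachidonic_acid: float) -> float:
    """Holman index: Mead acid (triene) divided by arachidonic acid (tetraene), same units."""
    return mead_acid / arachidonic_acid

ttr = triene_tetraene_ratio(8.0, 520.0)   # hypothetical plasma values, umol/L
efad_suspected = ttr > 0.2                # assumed cutoff; laboratories vary
print(round(ttr, 3), efad_suspected)
```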
Central parenteral nutrition (CPN) requires central access, which has historically led to concerns for central line-associated bloodstream infection (CLABSI). Obtaining central access can be resource-intensive and may result in treatment delays while awaiting access. Conversely, peripheral parenteral nutrition (PPN) can be delivered without central access. In this quality improvement project, we sought to characterize our PPN utilization at a large urban tertiary hospital.</p><p><b>Methods:</b> We performed a retrospective review of adult inpatients receiving PN at our facility from 1/1/23–12/31/23. Patients were excluded from review if they had PN initiated prior to hospitalization. Demographic information, duration of treatment, timely administration status, and information regarding formula nutrition composition were collected.</p><p><b>Results:</b> A total of 128 inpatients received PN for a total of 1302 PN days. The mean age of these patients was 53.8 years old (SD: 17.9) and 65 (50%) were male. Twenty-six (20%) patients received only PPN for a median [IQR] length of 3 [2–4] days, and 61 (48%) patients received only CPN for a median length 6 [3–10] days. Thirty-nine (30%) patients were started on PPN with the median time to transition to CPN of 1 [1-3] day(s) and a median total duration of CPN being 8 [5-15.5] days. A small minority of patients received CPN and then transitioned to PPN (2%).</p><p><b>Conclusion:</b> At our institution, PPN is utilized in more than 50% of all inpatient PN, most commonly at PN initiation and then eventually transitioning to CPN for a relatively short duration of one to two weeks. Additional research is required to identify those patients who might avoid central access by increasing PPN volume and macronutrients to provide adequate nutrition therapy.</p><p>Nicole Halton, NP, CNSC<sup>1</sup>; Marion Winkler, PhD, RD, LDN, CNSC, FASPEN<sup>2</sup>; Elizabeth Colgan, MS, RD<sup>3</sup>; Benjamin Hall, MD<sup>4</sup></p><p><sup>1</sup>Brown Surgical Associates, Providence, RI; <sup>2</sup>Department of Surgery and Nutritional Support at Rhode Island Hospital, Providence, RI; <sup>3</sup>Rhode Island Hospital, Providence, RI; <sup>4</sup>Brown Surgical Associates, Brown University School of Medicine, Providence, RI</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Parenteral nutrition (PN) provides adequate nutrition and fluids to patients with impaired gastrointestinal function who cannot meet their nutritional needs orally or enterally. PN requires a venous access device that has associated risks including infection as well as metabolic abnormalities associated with the therapy. Monitoring of PN therapy in the hospital setting involves regular blood work, yet improperly collected samples can lead to abnormal laboratory results and unnecessary medical interventions.</p><p><b>Methods:</b> An IRB exempted quality improvement study was conducted at Rhode Island Hospital by the Surgical Nutrition Service which manages all adult PN. The purpose of the study was to quantify the occurrence of contaminated blood samples among PN patients between January 1, 2024, and August 31, 2024. Demographic data, venous access device, and PN-related diagnoses were collected. Quantification of contaminated blood specimens was determined per patient, per hospital unit, and adjusted for total PN days. Comparisons were made between serum glucose, potassium, and phosphorus levels from contaminated and redrawn blood samples. 
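<p>A minimal sketch of one way to run the contaminated-versus-redrawn comparison just described, assuming each contaminated draw is paired with its redraw. The values are hypothetical, and the abstract does not state which statistical test was actually used.</p>

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical paired glucose values (mg/dL): specimen drawn during PN infusion vs. the redraw
contaminated = np.array([850, 1220, 640, 990, 760, 1100])
redrawn = np.array([132, 118, 141, 95, 150, 122])

stat, p_value = wilcoxon(contaminated, redrawn)   # paired, nonparametric comparison
print((contaminated - redrawn).mean(), p_value)
```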
Descriptive data are reported.</p><p><b>Results:</b> 138 patients received PN for a total of 1840 days with a median length of PN therapy of 8 days (IQR 9, range 2-84). The most common vascular access device was a dual-lumen peripherally inserted central catheter. The majority (63%) of patients were referred by surgery teams and received care on surgical floors or critical care units. The most frequent PN-related diagnoses were ileus, gastric or small bowel obstruction, and short bowel syndrome. There were 74 contaminated blood specimens among 42 (30%) patients receiving TPN, for a rate of 4% of total PN days. Of 25 nursing units, 64% had at least one occurrence of contaminated blood specimens among TPN patients on that unit. Contaminated samples showed significantly different serum glucose, potassium, and phosphorus compared to redrawn samples (p &lt; 0.001); glucose in contaminated vs redrawn samples was 922 ± 491 vs 129 ± 44 mg/dL; potassium 6.1 ± 1.6 vs 3.9 ± 0.5 mEq/L; phosphorus 4.9 ± 1.2 vs 3.3 ± 0.6 mg/dL. The average time delay between repeated blood samples was 3 hours.</p><p><b>Conclusion:</b> Contaminated blood samples can lead to delays in patient care, discomfort from multiple blood draws, unnecessary medical interventions (insulin; discontinuation of PN), delay in placement of timely PN orders, and increased infection risk. Nursing re-education on proper blood sampling techniques is critical for reducing contamination occurrences. All policies and procedures will be reviewed, and an educational program will be implemented. Following this, occurrences of blood contamination during PN will be reassessed.</p><p>Hassan Dashti, PhD, RD<sup>1</sup>; Priyasahi Saravana<sup>1</sup>; Meghan Lau<sup>1</sup></p><p><sup>1</sup>Massachusetts General Hospital, Boston, MA</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> ASN Nutrition 2024.</p><p><b>Publication:</b> Saravana P, Lau M, Dashti HS. Continuous glucose monitoring in adults with short bowel syndrome receiving overnight infusions of home parenteral nutrition. Eur J Clin Nutr. 2024 Nov 23. doi: 10.1038/s41430-024-01548-z. Online ahead of print. PMID: 39580544.</p><p><b>Financial Support:</b> ASPEN Rhoads Research Foundation.</p><p>Maria Romanova, MD<sup>1</sup>; Azadeh Lankarani-Fard, MD<sup>2</sup></p><p><sup>1</sup>VA Greater Los Angeles Healthcare System, Oak Park, CA; <sup>2</sup>VA Greater Los Angeles Healthcare System, Los Angeles, CA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition is a serious complication of the hospital stay. Parenteral nutrition (PN) is the most sophisticated way of addressing it but requires ongoing monitoring. In our medical center, PN provision is guided by the interdisciplinary Nutrition Support Team (NST). In 2024, we began creating a dashboard to monitor the safety and utilization of PN at the Greater Los Angeles VA. Here we discuss the collaborative process of developing the dashboard and its first use.</p><p><b>Methods:</b> A dashboard was constructed using data from the VA electronic health record. The dashboard used Microsoft Power BI technology to customize data visualization. The NST group worked closely with the Data Analytics team at the facility to modify and validate the dashboard to accommodate the needs of the group. The dashboard was maintained behind a VA firewall and only accessible to members of the NST. The dashboard reviewed patient-level data for all patients for whom a Nutrition Support consult was placed over the last 2 years.
The variables included were the date of admission, date of consult request, the treating specialty at the time of request, demographics, admission diagnosis, discharge diagnosis, number of orders for PPN/TPN, number of blood sugars &gt;200 mg/dL after admission, number of serum phosphorus values &lt; 2.5 mg/dL, number of serum potassium values &lt; 3.5 mmol/L, any discharge diagnosis of refeeding (ICD 10 E87.8), micronutrient levels during admission, and any discharge diagnosis of infection. The ICD10 codes used to capture infection were for: bacteremia (R78.81), sepsis (A41.*), or catheter-associated line infection (ICD10 = T80.211*). The asterisk (*) denotes any number in that ICD10 classification. The dashboard was updated once a week. The NST validated the information on the dashboard and refined it as needed.</p><p><b>Results:</b> The initial data extraction noted duplicate consult requests as patients changed treating specialties during the same admission, and duplicate orders for PPN/TPN as the formulations were frequently modified before administration. The Data Analytics team worked to reduce these duplicates. The NST also collaborated with the Data Analytics team to modify their existing documentation to better capture the data needed going forward. Dashboard data was verified by direct chart review. Between April 2022 and April 2024, 68 consults were placed from the acute care setting, and 58 patients received PPN or TPN during this time period. Thirty-five patients experienced hyperglycemia. Two patients were deemed to have experienced refeeding at the time of discharge. Fourteen episodes of infection were noted in those who received PPN/TPN, but the etiology was unclear from the dashboard alone and required additional chart review.</p><p><b>Conclusion:</b> A dashboard can facilitate monitoring of Nutrition Support services in the hospital. Refinement of the dashboard requires collaboration between the clinical team and the data analytics team to ensure validity and workload capture.</p><p>Michael Fourkas, MS<sup>1</sup>; Julia Rasooly, MS<sup>1</sup>; Gregory Schears, MD<sup>2</sup></p><p><sup>1</sup>PuraCath Medical Inc., Newark, CA; <sup>2</sup>Department of Anesthesiology and Perioperative Medicine, Mayo Clinic, Rochester, MN</p><p><b>Financial Support:</b> Funding of the study has been provided by Puracath Medical.</p><p><b>Background:</b> Intravenous catheters can provide venous access for drug and nutrition delivery in patients for extended periods of time, but risk the occurrence of central line-associated bloodstream infections (CLABSI) due to inadequate asepsis. Needleless connectors (NC), which provide access for injection of medications, are known to be one of the major sources of contamination. Studies demonstrate that current methods of disinfecting connectors, such as a 15-second antiseptic wipe, do not guarantee complete disinfection inside of connectors. With the rise of superbugs such as Candida auris, there is an urgent need for better aseptic technique compliance and non-antibiotic disinfection methods. Ultraviolet light-C (UV-C) is an established technology that is commonly used in hospital settings for disinfection of equipment and rooms.
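<p>As an illustration of the wildcard logic used by the dashboard described above (A41.* and T80.211* capture any code in those families), a minimal prefix-matching sketch. The helper function and example codes are illustrative only and are not part of the dashboard itself.</p>

```python
# Prefixes mirror the abstract: bacteremia R78.81, sepsis A41.*, catheter-associated infection T80.211*
INFECTION_PREFIXES = ("R78.81", "A41.", "T80.211")

def is_infection_code(icd10: str) -> bool:
    """True if a discharge code falls in any of the infection families tracked by the dashboard."""
    return any(icd10.startswith(prefix) for prefix in INFECTION_PREFIXES)

print(is_infection_code("A41.9"))     # True
print(is_infection_code("T80.211A"))  # True
print(is_infection_code("E87.8"))     # False (refeeding is tracked as a separate flag)
```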
In this study, we investigate the efficacy of our novel UV-C light disinfection device on UV-C light-transmissive NCs inoculated with common CLABSI-associated organisms.</p><p><b>Methods:</b> Staphylococcus aureus (ATCC #6538), Candida albicans (ATCC #10231), Candida auris (CDC B11903), Escherichia coli (ATCC #8739), Pseudomonas aeruginosa (ATCC #9027), and Staphylococcus epidermidis (ATCC #12228) were used as test organisms for this study. A total of 29 NC samples were tested for each organism with 3 positive controls and 1 negative control. Each UV-C light-transmissive NC was inoculated with 10 µl of cultured inoculum (7.00-7.66 log) and were exposed to an average of 48 mW/cm2 of UV light for 1 second using our in-house UV light disinfection device FireflyTM. After UV disinfection, 10 mL of 0.9% saline solution was flushed through the NC and filtered through a 0.45 µm membrane. The membrane filter was plated onto an agar medium matched to the organism and was incubated overnight at 37°C for S. aureus, E. coli, S. epidermidis, and P. aeruginosa, and two days at room temperature for C. albicans and C. auris. Positive controls followed the same procedure without exposure to UV light and diluted by 100x before being spread onto agar plates in triplicates. The negative controls followed the same procedure without inoculation. After plates were incubated, the number of colonies on each plate were counted and recorded. Log reduction was calculated by determining the positive control log concentration over the sample concentration in cfu/mL. 1 cfu/10 mL was used to make calculations for total kills.</p><p><b>Results:</b> Using our UV light generating device, we were able to achieve greater than 4 log reduction average and complete kills for all test organisms. The log reduction for S. aureus, C. albicans, C. auris, E. coli, P. aeruginosa, and S. epidermidis were 5.29, 5.73, 5.05, 5.24, 5.10, and 5.19, respectively.</p><p><b>Conclusion:</b> We demonstrated greater than 4-log reduction in common CLABSI-associated organisms using our UV light disinfection device and UV-C transmissive NCs. By injecting inoculum directly inside the NC, we demonstrated that disinfection inside NCs can be achieved, which is not possible with conventional scrubbing methods. A one second NC disinfection time will allow less disruption in the workflow in hospitals, particularly in intensive care units where highly effective and efficient disinfection rates are essential for adoption of the technology.</p><p><b>Table 1.</b> Log Reduction of Tested Organisms After Exposure to 48 mW/cm2 UV-C for 1 Second.</p><p></p><p>Yaiseli Figueredo, PharmD<sup>1</sup></p><p><sup>1</sup>University of Miami Hospital, Miami, FL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Octreotide belongs to the somatostatin analog class. It is used off-label for malignant bowel obstructions (MBO). Somatostatin analogs (SSA) inhibit the release and action of multiple hormones, reducing gastric secretions, peristalsis, and splanchnic blood flow while enhancing water and electrolyte absorption. National Comprehensive Cancer Network (NCCN) guidelines recommend octreotide 100-300 mcg subcutaneous twice to three times a day or 10-40 mcg/hour continuous infusion for the management of malignant bowel obstructions, and if prognosis is greater than 8 weeks, consider long-acting release (LAR) or depot injection. 
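<p>The log-reduction arithmetic described in the UV-C Methods above reduces to a base-10 ratio of the positive-control titer to the recovered titer, with 1 CFU per 10 mL flush substituted as the detection limit when no colonies grow. A minimal sketch; the control titer shown is hypothetical.</p>

```python
import math

def log_reduction(control_cfu_per_mL: float, recovered_cfu_per_mL: float) -> float:
    """log10 of the positive-control concentration over the post-treatment concentration."""
    return math.log10(control_cfu_per_mL / recovered_cfu_per_mL)

DETECTION_LIMIT = 1 / 10   # 1 CFU recovered in the 10 mL flush = 0.1 CFU/mL

# Complete kill: no colonies recovered, so the detection limit stands in for the sample count
print(round(log_reduction(2.0e4, DETECTION_LIMIT), 2))   # roughly a 5.3 log reduction
```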
Using octreotide as an additive to parenteral nutrition solutions has been a debatable topic due to concerns of formation of a glycosyl octreotide conjugate that may decrease the octreotide's efficacy. However, other compatibility studies have concluded little octreotide loss over 48 hours in TPN solutions at room temperature in ambient room light. At the University of Miami Hospital, it is practiced using octreotide as an additive to Total Parenteral Nutrition (TPN) solutions to reduce gastro-intestinal secretions in patients with malignant bowel obstructions. The starting dose is 300 mcg, and dose is increased on 300 mcg increments to a maximum dose of 900 mcg if output remains uncontrolled/elevated. The objective of this study is to evaluate the effectiveness and safety of octreotide as an additive to TPN in hospitalized patients with malignant bowel obstructions.</p><p><b>Methods:</b> A three-year retrospective chart review (June 2021-June 2024) was conducted to evaluate the effectiveness and safety of octreotide as an additive to TPN in hospitalized patients with MBO diagnosis at UMH. The following information was obtained from chart review: age, gender, oncologic diagnosis, TPN indication, TPN dependency, octreotide doses used, baseline and final gastrointestinal secretion output recorded, type of venting gastrostomy in place, length of hospital stay, and baseline and final hepatic function tests.</p><p><b>Results:</b> A total of 27 patients were identified to have malignant bowel obstruction requiring TPN which had octreotide additive. All patients were started on octreotide 300 mcg/day added into 2-in-1 TPN solution. The gastrointestinal secretion output was reduced on average by 65% among all patients with a final average daily amount of 540 mL recorded. The baseline average output recorded was 1,518 mL/day. The average length of treatment as an inpatient was 23 days, range 3-98 days. Liver function tests (LFTs) were assessed at baseline and last inpatient value available for the admission. Four out of the 27 patients (15%) reviewed were observed to have a significant rise in liver enzymes greater than three times the upper limit of normal.</p><p><b>Conclusion:</b> Octreotide represents a valuable addition to the limited pharmacological options for managing malignant bowel obstruction. Its ability to reduce gastrointestinal secretions by 65% on average as observed in this retrospective chart review can significantly alleviate symptoms and improve patient care. Using octreotide as an additive to TPN solutions for patients with malignant bowel obstructions who are TPN dependent reduces the number of infusions or subcutaneous injections patients receive per day. According to octreotide's package insert, the incidence of hepato-biliary complications is up to 63%. The finding that 15% of patients from this retrospective chart review had significant liver enzyme elevations remains an important monitoring parameter to evaluate.</p><p>Pavel Tesinsky, Assoc. 
Prof., MUDr.<sup>1</sup>; Jan Gojda, Prof., MUDr, PhD<sup>2</sup>; Petr Wohl, MUDr, PhD<sup>3</sup>; Katerina Koudelkova, MUDr<sup>4</sup></p><p><sup>1</sup>Department of Medicine, Prague, Hlavni mesto Praha; <sup>2</sup>Department of Medicine, University Hospital, 3rd Faculty of Medicine Charles University in Prague, Praha, Hlavni mesto Praha; <sup>3</sup>Institute for Clinical and Experimental Medicine, Prague, Hlavni mesto Praha; <sup>4</sup>Department of Medicine, University Hospital and 3rd Faculty of Medicine in Prague, Prague, Hlavni mesto Praha</p><p><b>Financial Support:</b> The Registry was supported by Takeda and Baxter scientific grants.</p><p><b>Background:</b> We describe trends in indications, syndromes, performance, weaning, and complications of patients on total HPN, based on an updated 30-year analysis and stratification of patients on home parenteral nutrition (HPN) in the Czech Republic.</p><p><b>Methods:</b> Records from the HPN National Registry were analysed for the time period 2007–2023, based on data from the HPN centers. Catheter-related sepsis (CRS), catheter occlusions, and thrombotic complications were analyzed for time to event using the competing-risks regression (Fine and Gray) model. Other data are presented as median or mean with 95% CI (p &lt; 0.05 as significant).</p><p><b>Results:</b> The incidence rate of HPN is 1.98 per 100,000 inhabitants (population 10.5 mil.). Lifetime dependency is expected in 20% of patients, potential weaning in 40%, and 40% of patients are palliative. Out of 1838 records representing almost 1.5 million catheter days, short bowel syndrome was present in 672 patients (36.6%), intestinal obstruction in 531 patients (28.9%), malabsorption in 274 patients (14.9%), and the remaining 361 patients (19.6%) were split among fistulas, dysphagia, or unspecified indications. The majority of SBS were type I (57.8%) and II (20.8%). Mean length of residual intestine was 104.3 cm (35.9 - 173.4 cm), with longer remnants in type I SBS. Dominant indications for HPN were pseudoobstruction (35.8%), non-malignant surgical conditions (8.9%), Crohn disease (7.3%), and mesenteric occlusion (6.8%). Mobility for a substantial part of the day was reported by 77.8% of HPN patients, and economic activity and independence by 162 (24.8%) of 653 potentially economically active patients. A tunneled catheter was primarily used in 49.1%, PICC in 24.3%, and IV port in 19.8% of patients. Commercially prepared bags were used in 69.7%, and pharmacy-prepared admixtures in 24.7% of patients. A total of 66.9% of patients were administered 1 bag per day/7 days a week. The sepsis ratio per 1000 catheter days decreased from 0.84 in 2013 to 0.15 in 2022. The catheter occlusion ratio decreased from 0.152 to 0.10 per 1000 catheter days, and the thrombotic complication ratio from 0.05 to 0.04. Prevalence of metabolic bone disease is 15.6%, and prevalence of PNALD is 22.3%. In the first 12 months, 28% of patients achieved intestinal autonomy, increasing to 45% after 5 years. The patient survival rate is 62% in the first year, 45% at 5 years, and 35% at the 10-year mark. Teduglutide has been indicated in 36 patients to date, with reduction of the daily HPN volume to 60.3% on average.</p><p><b>Conclusion:</b> The prevalence of HPN patients in the Czech Republic has been increasing over the past ten years, corresponding to the incidence rate. The majority of patients are expected to terminate HPN within the first year.
Risk of CRS decreased significantly in the past five years and remains low, while catheter occlusion and thrombotic complications have a stable trend. Teduglutide significantly reduced the required IV volume.</p><p></p><p><b>Figure 1.</b> Per-Year Prevalence of HPN Patients and Average Number of Catheter Days Per Patient (2007 - 2022).</p><p></p><p><b>Figure 2.</b> Annual Incidence of HPN Patients (2007 - 2022).</p><p></p><p><b>Figure 3.</b> Catheter-Related Bloodstream Infections (Events per 1,000 Catheter-Days).</p><p>Jill Murphree, MS, RD, CNSC, LDN<sup>1</sup>; Anne Ammons, RD, LDN, CNSC<sup>2</sup>; Vanessa Kumpf, PharmD, BCNSP, FASPEN<sup>2</sup>; Dawn Adams, MD, MS, CNSC<sup>2</sup></p><p><sup>1</sup>Vanderbilt University Medical Center, Nashville, TN; <sup>2</sup>Vanderbilt University Medical Center, Nashville, TN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Determining macronutrient goals in patients requiring home parenteral nutrition (HPN) can be difficult due to various factors. While indirect calorimetry is the gold standard for measuring energy expenditure, it is not readily available in the outpatient setting. Therefore, clinicians typically rely on less accurate weight-based equations for assessment of protein and energy requirements. Energy goals are also impacted by the targeted desire for weight loss, weight gain, or weight maintenance. Patients receiving HPN may consume some oral dietary intake and experience variable degrees of macronutrient absorption. These factors, as well as underlying clinical conditions, can significantly impact protein and energy requirements and may change over the course of HPN therapy. The purpose of this study was to evaluate the range of protein and energy doses prescribed in patients receiving HPN managed by an interdisciplinary intestinal failure clinic at a large academic medical center.</p><p><b>Methods:</b> Patient demographics, including age, gender, and PN indication/diagnosis, were retrospectively obtained for all patients discharged home with PN between May 2021 and May 2023 utilizing an HPN patient database. Additional information extracted from the electronic medical record at the start of HPN, then at 2-week, 2- to 3-month, and 6-month intervals following discharge home, included height, actual weight, target weight, HPN energy dose, HPN protein dose, and whether the patient was eating. Data collection ended at completion of HPN therapy or up to 6 months of HPN. All data was entered and stored in an electronic database.</p><p><b>Results:</b> During the study period, 248 patients were started on HPN and 56 of these patients received HPN for at least 6 months. Patient demographics are included in Table 1. At the start of HPN, prescribed energy doses ranged from 344 to 2805 kcal/d (6 kcal/kg/d to 45 kcal/kg/d) and prescribed protein doses ranged from 35 to 190 g/d (0.6 g/kg/d to 2.1 g/kg/d). There continued to be a broad range of prescribed energy and protein doses at the 2-week, 2- to 3-month, and 6-month intervals of HPN. Figures 1 and 2 provide the prescribed energy and protein doses for all patients and for those who are eating and not eating. For patients not eating, the prescribed range for energy was 970 to 2791 kcal/d (8 kcal/kg/d to 45 kcal/kg/d) and for protein was 40 to 190 g/d (0.6 g/kg/d to 2.0 g/kg/d) at the start of PN therapy. The difference between actual weight and target weight was assessed at each study interval.
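<p>The weight-based ranges quoted above are simple conversions of the absolute prescriptions. A minimal sketch with a hypothetical patient, not one of the study subjects:</p>

```python
def per_kg(daily_amount: float, weight_kg: float) -> float:
    """Convert an absolute daily prescription to a weight-based dose."""
    return daily_amount / weight_kg

weight_kg = 70.0
energy_kcal_per_kg = per_kg(2100, weight_kg)   # 2100 kcal/d -> 30 kcal/kg/d
protein_g_per_kg = per_kg(90, weight_kg)       # 90 g/d protein -> ~1.3 g/kg/d
print(energy_kcal_per_kg, round(protein_g_per_kg, 2))
```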
Over the study period, patients demonstrated a decrease in the difference between actual and target weight to suggest improvement in reaching target weight (Figure 3).</p><p><b>Conclusion:</b> The results of this study demonstrate a wide range of energy and protein doses prescribed in patients receiving HPN. This differs from use of PN in the inpatient setting, where weight-based macronutrient goals tend to be more defined. Macronutrient adjustments may be necessary in the long-term setting when patients are consuming oral intake, for achievement/maintenance of target weight, or for changes in underlying conditions. Patients receiving HPN require an individualized approach to care that can be provided by interdisciplinary nutrition support teams specializing in intestinal failure.</p><p><b>Table 1.</b> Patient Demographics Over 6-Month Study Period.</p><p></p><p></p><p><b>Figure 1.</b> Parenteral Nutrition (PN) Energy Range.</p><p></p><p><b>Figure 2.</b> Parenteral Nutrition (PN) Protein Range.</p><p></p><p><b>Figure 3.</b> Difference Between Actual Weight and Target Weight.</p><p>Jennifer Lachnicht, RD, CNSC<sup>1</sup>; Christine Miller, PharmD<sup>2</sup>; Jessica Younkman, RD CNSC<sup>2</sup></p><p><sup>1</sup>Soleo Home Infusion, Frisco, TX; <sup>2</sup>Soleo Health, Frisco, TX</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Since the 1990s, initiating parenteral nutrition (PN) at home has been performed, though some clinicians prefer hospital initiation due to risks like refeeding syndrome (RS). A key factor in successful home PN initiation is careful evaluation by an experienced nutrition support clinician, particularly assessing RS risk. In 2020, ASPEN published consensus recommendations for identifying patients at risk for RS and guidelines for initiating and advancing nutrition. Home PN initiation offers advantages, including avoiding hospitalization, reducing healthcare costs, minimizing hospital-acquired infections, and improving quality of life. Literature suggests a savings of $2,000 per day when PN is started at home. This study aims to determine the risk and incidence of RS in patients who began PN at home, based on the 2020 ASPEN Consensus Recommendations for RS.</p><p><b>Methods:</b> A national home infusion provider's nutrition support service reviewed medical records for 27 adult patients who initiated home PN between September 2022 and June 2024. Patients were evaluated for RS risk before PN initiation and the actual incidence of RS based on pre- and post-feeding phosphorus, potassium, and magnesium levels using ASPEN 2020 criteria. Initial lab work was obtained before and after PN initiation. Refeeding risk was categorized as mild, moderate, moderate to severe, or severe based on initial nutrition assessment, including BMI, weight loss history, recent caloric intake, pre-feeding lab abnormalities, fat/muscle wasting, and high-risk comorbidities. The percent change in phosphorus, potassium, and magnesium was evaluated and categorized as mild, moderate, or severe if levels decreased after therapy start. Initial PN prescriptions included multivitamins and supplemental thiamin per provider policy and consensus recommendations.</p><p><b>Results:</b> The average baseline BMI for the study population was 19.6 kg/m² (range 12.7-31.8, median 18.9). Weight loss was reported in 88.9% of patients, averaging 22%. Little to no oral intake at least 5-10 days before assessment was reported in 92.3% of patients. 
Initial lab work was obtained within 5 days of therapy start in 96.2% of cases, with 18.5% showing low prefeeding electrolytes. 100% had high-risk comorbidities. RS risk was categorized as mild (4%), moderate (48%), moderate to severe (11%), and severe (37%). Home PN was successfully initiated in 25 patients (93%). Two patients could not start PN at home: one due to persistently low pre-PN electrolyte levels despite IV repletion, and one due to not meeting home care admission criteria. Starting dextrose averaged 87.2 g/d (range: 50-120, median 100). Average total starting calories were 730 kcals/d, representing 12.5 kcals/kg (range: 9-20, median 12). Initial PN formula electrolyte content included potassium (average 55.4 mEq/d, range: 15-69, median 60), magnesium (average 11.6 mEq/d, range: 4-16, median 12), and phosphorus (average 15.6 mmol/d, range: 8-30, median 15). Labs were drawn on average 4.3 days after therapy start. Potassium, magnesium, and phosphorus levels were monitored for decreases ≥10% of baseline to detect RS. Decreases in magnesium and potassium were classified as mild (10-20%) and experienced by 4% of patients, respectively. Eight patients (32%) had a ≥ 10% decrease in phosphorus: 4 mild (10-20%), 2 moderate (20-30%), 2 severe (&gt;30%).</p><p><b>Conclusion:</b> Home initiation of PN can be safely implemented with careful monitoring and evaluation of RS risk. This review showed a low incidence of RS based on ASPEN criteria, even in patients at moderate to severe risk prior to home PN initiation. Close monitoring of labs and patient status, along with adherence to initial prescription recommendations, resulted in successful PN initiation at home in 92.5% of patients.</p><p>Dana Finke, MS, RD, CNSC<sup>1</sup>; Christine Miller, PharmD<sup>1</sup>; Paige Paswaters, RD, CNSC<sup>1</sup>; Jessica Younkman, RD, CNSC<sup>1</sup></p><p><sup>1</sup>Soleo Health, Frisco, TX</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Iron deficiency anemia is common in home parenteral nutrition (HPN) patients (Hwa et al., 2016). Post iron infusion, hypophosphatemia due to fibroblast growth factor 23 is a known side effect of some IV iron formulations (Wolf et al., 2019). Ferric carboxymaltose can cause serum phosphorus levels to drop below 2 mg/dL in 27% of patients (Onken et al., 2014). Hypophosphatemia (&lt; 2.7 mg/dL) can lead to neurological, neuromuscular, cardiopulmonary, and hematologic issues, and long-term effects like osteopenia and osteoporosis (Langley et al., 2017). This case series reviews the occurrence and clinical implications of transient serum phosphorus drops in patients receiving ferric carboxymaltose and HPN.</p><p><b>Methods:</b> A retrospective case series review was performed for three patients who were administered ferric carboxymaltose while on HPN therapy. Serum phosphorus levels were measured at baseline prior to initial iron dose, within 1 week post injection, and at subsequent time intervals up to 4 months post initial injection. Data was collected on the timing, magnitude, and duration of any phosphorus decreases, and any associated clinical symptoms or complications. Patient records were also reviewed to evaluate any correlations with HPN composition or phosphorus dosing.</p><p><b>Results:</b> Among the three patients reviewed, all exhibited short-term drops in serum phosphorus levels following ferric carboxymaltose injection (Table 1). 
All patients had baseline serum phosphorus levels within normal limits prior to the initial dose of ferric carboxymaltose. All cases involved multiple doses of ferric carboxymaltose, which contributed to the fluctuations in phosphorus levels. The average drop in serum phosphorus from baseline to the lowest point was 50.3%. The lowest recorded phosphorus level among these three patients was 1.4 mg/dL, and this was in a patient who received more than two doses of ferric carboxymaltose. In two cases, increases were made in HPN phosphorus in response to serum levels, and in one case no HPN changes were made. However, all serum phosphorus levels returned to normal despite varied interventions. Despite the low phosphorus levels, none of the patients reported significant symptoms of hypophosphatemia during the monitoring periods. Ferric carboxymaltose significantly impacts serum phosphorus in HPN patients, consistent with existing literature. The need for vigilant monitoring is highlighted; patients receiving HPN are closely monitored by a trained nutrition support team with frequent lab monitoring. Lab monitoring in patients receiving ferric carboxymaltose who are not on HPN may be less common. The lowest level recorded was 1.4 mg/dL, indicating potential severity. Despite significant drops, no clinical symptoms were observed, suggesting subclinical hypophosphatemia may be common. In two of the reviewed cases, hypophosphatemia was addressed by making incremental increases in the patient's HPN formulas. Note that there are limitations to phosphorus management in HPN due to compatibility and stability issues, and alternative means of supplementation may be necessary depending upon the patient's individual formula.</p><p><b>Conclusion:</b> Three HPN patients receiving ferric carboxymaltose experienced transient, generally mild reductions in serum phosphorus. Monitoring is crucial, but the results from this case series suggest that clinical complications are rare. Adjustments to HPN or additional supplementation may be needed based on individual patient needs, with some cases self-correcting over time.</p><p><b>Table 1.</b> Timeline of Iron Injections and the Resulting Serum Phosphorus Levels and HPN Formula Adjustments.</p><p></p><p>Danial Nadeem, MD<sup>1</sup>; Stephen Adams, MS, RPh, BCNSP<sup>2</sup>; Bryan Snook<sup>2</sup></p><p><sup>1</sup>Geisinger Wyoming Valley, Bloomsburg, PA; <sup>2</sup>Geisinger, Danville, PA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Ferric carboxymaltose (FC) is a widely used intravenous iron formulation, primarily employed in the treatment of iron deficiency. It offers significant benefits, particularly in cases where oral iron supplementation proves ineffective or is not well-tolerated. However, an important potential adverse effect associated with FC use is hypophosphatemia. This condition has been observed in multiple patients following their treatment with FC. This report discusses the potential mechanisms leading to this adverse effect and its significant implications for patient care.</p><p><b>Methods:</b> A middle-aged female with a history of malnutrition and iron deficiency receiving parenteral nutrition at home had received multiple doses of Venofer in the past, with the last dose given in 2017, to which the patient developed an anaphylactic reaction. She was therefore switched to ferric carboxymaltose (FCM) therapy. However, upon receiving multiple doses of FCM in 2018, the patient developed significant hypophosphatemia.
As hypophosphatemia was noted, adjustments were made to the patient's total parenteral nutrition (TPN) regimen to increase the total phosphorus content in an effort to treat the low phosphate levels. The patient also received continued doses of FCM in subsequent years, with persistent hypophosphatemia despite repletion.</p><p><b>Results:</b> Ferric carboxymaltose (FC) is a widely used intravenous iron therapy for the treatment of iron deficiency. It is particularly beneficial in cases where oral iron supplementation is ineffective or not tolerated. FC works by delivering iron directly to the macrophages in the reticuloendothelial system. The iron is then released slowly for use by the body, primarily for the production of hemoglobin. However, recent studies have highlighted the potential adverse effect of hypophosphatemia associated with the use of FC. Hypophosphatemia induced by FC is thought to be caused by an increase in the secretion of the hormone fibroblast growth factor 23 (FGF23). FGF23 is a hormone that regulates phosphate homeostasis. When FGF23 levels rise, the kidneys increase the excretion of phosphate, leading to lower levels of phosphate in the blood. There are many implications of hypophosphatemia in regards to patient care. Symptoms of hypophosphatemia can include muscle weakness, fatigue, bone pain, and confusion. In severe cases, persistent hypophosphatemia can lead to serious complications such as rhabdomyolysis, hemolysis, respiratory failure, and even death. Therefore, it is crucial for clinicians to be aware of the potential risk of hypophosphatemia when administering FC. Regular monitoring of serum phosphate levels is recommended in patients receiving repeated doses of FC. Further research is needed to fully understand the mechanisms underlying this adverse effect and to develop strategies for its prevention and management.</p><p><b>Conclusion:</b> In conclusion, while FC is an effective treatment for iron deficiency, it is important for clinicians to be aware of the potential risk of hypophosphatemia. Regular monitoring of serum phosphate levels is recommended in patients receiving repeated doses of FC. Further research is needed to fully understand the mechanisms underlying this adverse effect and to develop strategies for its prevention and management.</p><p><b>Table 1.</b> Phosphorous Levels and Iron Administration.</p><p></p><p>Table 1 shows the response to serum phosphorous levels in a patient given multiple doses of intravenous iron over time.</p><p>Ashley Voyles, RD, LD, CNSC<sup>1</sup>; Jill Palmer, RD, LD, CNSC<sup>1</sup>; Kristin Gillespie, MD, RD, LDN, CNSC<sup>1</sup>; Suzanne Mack, MS, MPH, RD, LDN, CNSC<sup>1</sup>; Tricia Laglenne, MS, RD, LDN, CNSC<sup>1</sup>; Jessica Monczka, RD, CNSC, FASPEN<sup>1</sup>; Susan Dietz, PharmD, BCSCP<sup>1</sup>; Kathy Martinez, RD, LD<sup>1</sup></p><p><sup>1</sup>Option Care Health, Bannockburn, IL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Home parenteral nutrition (HPN) can be successfully initiated in the home setting with careful evaluation and management by an experienced nutrition support team (NST).<sup>1,2</sup> Safe candidates for HPN initiation include medically stable patients with an appropriate indication, a safe environment, and the means for reliable follow-up. 
However, some patients are not appropriate to start directly with HPN due to logistical reasons or the risk of refeeding syndrome (RFS).<sup>2</sup> Consensus Recommendations for RFS provide guidance regarding recognizing and managing risk.<sup>3</sup> An experienced NST that provides individualized care can recommend intravenous (IV) hydration before initiating HPN to expedite the initiation of therapy and normalize blood chemistry to mitigate the risk of RFS. The purpose of this study is to evaluate the impact of IV hydration on adult patients managed by a home infusion NST who received IV hydration prior to initiating HPN. The proportion of patients who received IV hydration prior to HPN, the reason for initiating IV hydration, and the impact this intervention may have had on their care and outcomes will be described.</p><p><b>Methods:</b> This retrospective review includes 200 HPN patients 18 years of age and older initiated on HPN therapy with a national home infusion pharmacy over a 6-month period from January 1, 2024, to June 30, 2024. Data collection included baseline demographics, indication for HPN, risk of RFS, days receiving IV hydration prior to initiating HPN, the number of rehospitalizations within the first 2 weeks, whether they received IV hydration, and if so, the indication for IV hydration and the components of the orders. Data was collected via electronic medical records and deidentified into a standardized data collection form.</p><p><b>Results:</b> Of the 200 total patients, 19 (9.5%) received IV hydration prior to HPN. Of these 19 patients, 16 were female, and 3 were male (Table 1). The most common indications for HPN were bariatric surgery complication (5), intestinal failure (4), and oncology diagnosis (4) (Figure 1). Among these patients, 9 (47%) were at moderate RFS risk and 10 (53%) were at high RFS risk. The indications for IV hydration included 7 (37%) due to electrolyte abnormalities/RFS risk, 5 (26%) due to delay in central line placement, and 7 (37%) due to scheduling delays (Figure 2). IV hydration orders included electrolytes in 15 (79%) of the orders. All orders without electrolytes (4) had an indication related to logistical reasons (Figure 3). All 19 patients started HPN within 7 days of receiving IV hydration. Two were hospitalized within the first two weeks of therapy with admitting diagnoses unrelated to HPN.</p><p><b>Conclusion:</b> In this group of patients, HPN was successfully initiated in the home setting when managed by an experienced NST, preventing unnecessary hospitalizations. This study demonstrated that safe initiation of HPN may include IV hydration with or without electrolytes first, either to mitigate RFS or due to logistical reasons, when started on HPN within 7 days. The IV hydration orders were individualized to fit the needs of each patient. This data only reflects IV hydration dispensed through the home infusion pharmacy and does not capture IV hydration received at an outside clinic. This patient population also does not include those deemed medically unstable for HPN or those not conducive to starting in the home setting for other factors. 
Future research should account for these limitations and detail IV hydration components, dosing, and frequency of orders.</p><p><b>Table 1.</b> Demographics.</p><p></p><p></p><p><b>Figure 1.</b> HPN Indications of IV Hydration.</p><p></p><p><b>Figure 2.</b> Indication for IV Hydration and Refeeding Risk.</p><p></p><p><b>Figure 3.</b> Indications and Types of IV Hydration.</p><p>Emily Boland Kramer, MS, RD, LDN, CNSC<sup>1</sup>; Jessica Monczka, RD, CNSC, FASPEN<sup>1</sup>; Tricia Laglenne, MS, RD, LDN, CNSC<sup>1</sup>; Ashley Voyles, RD, LD, CNSC<sup>1</sup>; Susan Dietz, PharmD, BCSCP<sup>1</sup>; Kathy Martinez, RD, LD<sup>1</sup></p><p><sup>1</sup>Option Care Health, Bannockburn, IL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Parenteral nutrition (PN) is a critical therapy for patients who are unable to absorb nutrients adequately via the gastrointestinal tract.<sup>1</sup> PN is complex, with 10 or more individually dosed components in each order which inherently increases the risk for dosing errors. <sup>2</sup> This study seeks to analyze the PN orders at hospital discharge received by a home infusion provider and identify the incidence of the omission of the standard components, as determined by ASPEN Recommendations on Appropriate Parenteral Nutrition Dosing.<sup>3</sup> The primary objective of this study was to identify missing sodium, potassium, magnesium, calcium, phosphorus, and multivitamin components in hospital discharge orders. The secondary objective was to determine whether identified missing components were added back during the transition of care (TOC) process from hospital to home.</p><p><b>Methods:</b> This multi-center, retrospective chart review analyzed patients referred to a national home infusion provider over a 3-month period. Data was collected from the electronic medical record and internal surveillance software. The Registered Dietitian, Certified Nutrition Support Clinician (RD, CNSC) reviewed all PN hospital discharge orders for patients transitioning clinical management and PN therapy from hospital to home. Inclusion criteria were patients 18 years of age or older with PN providing the majority of their nutritional needs, as defined in Table 1, who were missing sodium, potassium, magnesium, calcium, phosphorus, or multivitamin from their PN order at hospital discharge. Exclusion criteria were patients less than 18 years of age, patients receiving supplemental PN not providing majority of nutritional needs, and patients with doses ordered for all electrolytes and multivitamin.</p><p><b>Results:</b> During the 3-month period (April 1, 2024 to June 30, 2024), 267 patients were identified who were greater than 18 years of age, receiving the majority of their nutritional needs via PN, and missing at least one PN component in their hospital discharge order. See Table 2 and Figure 1 for demographics. One hundred seventy-five (65.5%) patients were missing one component and 92 (34.5%) were missing multiple components from the hospital discharge order. One hundred seventy-five (65.5%) patients were missing calcium, 68 (25.5%) phosphorus, 38 (14%) multivitamin, 23 (8.6%) magnesium, 20 (7.5%) potassium, and 20 (7.5%) sodium. 
During the transition from hospital to home, after discussion with the provider, 94.9% of patients had calcium added back, 94.7% multivitamin, 91.3% magnesium, 90% potassium, 88.2% phosphorus, and 80% sodium.</p><p><b>Conclusion:</b> This study highlights the prevalence of missing components in PN hospital discharge orders, with calcium being the most frequently omitted at a rate of 65.5%. Given that many patients discharging home on PN will require long term therapy, adequate calcium supplementation is essential to prevent bone resorption and complications related to metabolic bone disease. In hospital discharge orders that were identified by the RD, CNSC as missing calcium, 94.9% of the time the provider agreed that it was clinically appropriate to add calcium to the PN order during the TOC process. This underlines the importance of nutrition support clinician review and communication during the transition from hospital to home. Future research should analyze the reasons why components are missing from PN orders and increase awareness of the need for a thorough clinical review of all patients going home on PN to ensure the adequacy of all components required for safe and optimized long term PN.</p><p><b>Table 1.</b> Inclusion and Exclusion Criteria.</p><p></p><p><b>Table 2.</b> Demographics.</p><p></p><p></p><p><b>Figure 1.</b> Primary PN Diagnosis.</p><p></p><p><b>Figure 2.</b> Components Missing from Order and Added Back During TOC Process.</p><p>Avi Toiv, MD<sup>1</sup>; Hope O'Brien, BS<sup>2</sup>; Arif Sarowar, MSc<sup>2</sup>; Thomas Pietrowsky, MS, RD<sup>1</sup>; Nemie Beltran, RN<sup>1</sup>; Yakir Muszkat, MD<sup>1</sup>; Syed-Mohammad Jafri, MD<sup>1</sup></p><p><sup>1</sup>Henry Ford Hospital, Detroit, MI; <sup>2</sup>Wayne State University School of Medicine, Detroit, MI</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Intestinal Failure Associated Liver Disease (IFALD) is a known complication in patients reliant on total parenteral nutrition (TPN), especially those awaiting intestinal transplantation. There is concern that IFALD may negatively impact post-transplant outcomes, including graft survival and overall patient mortality. This study aims to evaluate the impact of IFALD, as indicated by liver function test (LFT) abnormalities before intestinal or multivisceral transplant, on transplant outcomes.</p><p><b>Methods:</b> We conducted a retrospective chart review of all patients who underwent intestinal transplantation (IT) at an academic transplant center from 2010 to 2023. The primary outcomes were patient survival and graft failure in transplant recipients.</p><p><b>Results:</b> Among 50 IT recipients, 30 (60%) required TPN before IT. The median age at transplant was 50 years (range, 17-68). In both groups, the majority of transplants were exclusively IT; however, multivisceral transplants (MVT) were also included. 87% of patients on TPN developed elevated LFTs before transplant. 33% had persistently elevated LFTs 1 year after transplant. TPN-associated liver dysfunction in our cohort was associated with mixed liver injury patterns, with both hepatocellular injury (p &lt; 0.001) and cholestatic injury (p &lt; 0.001). No significant associations were found between TPN-related elevated LFTs and major transplant outcomes, including death (p = 0.856), graft failure (p = 0.144), or acute rejection (p = 0.306).
Similarly, no significant difference was observed between elevated LFTs and death (p = 0.855), graft failure (p = 0.769), or acute rejection (p = 0.386) in patients who were not on TPN. TPN-associated liver dysfunction before transplant was associated with elevated LFTs at one year after transplant (p &lt; 0.001) but lacked clinical relevance.</p><p><b>Conclusion:</b> Although IFALD related to TPN use is associated with specific liver dysfunction patterns in patients awaiting intestinal or multivisceral transplants, it does not appear to be associated with significant key transplant outcomes such as graft failure, mortality, or acute rejection. However, it is associated with persistently elevated LFTs even 1 year after transplant. These findings suggest that while TPN-related liver injury is common, it may not have a clinically significant effect on long-term transplant success. Further research is needed to explore the long-term implications of IFALD in this patient population.</p><p>Jody (Lind) Payne, RD, CNSC<sup>1</sup>; Kassandra Samuel, MD, MA<sup>2</sup>; Heather Young, MD<sup>3</sup>; Karey Schutte, RD<sup>3</sup>; Kristen Horner, RDN, CNSC<sup>3</sup>; Daniel Yeh, MD, MHPE, FACS, FCCM, FASPEN, CNSC<sup>3</sup></p><p><sup>1</sup>Denver Health, Parker, CO; <sup>2</sup>Denver Health, St. Joseph Hospital, Denver, CO; <sup>3</sup>Denver Health, Denver, CO</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Central line-associated blood stream infection (CLABSI) is associated with increased complications, length of stay and cost of care. The majority of CLABSI studies are focused on home parenteral nutrition (PN) patients and there is a paucity of data documenting the incidence of CLABSI attributable to PN in the inpatient setting. At our institution, we have observed that clinicians are reluctant to initiate PN in patients with clear indications for PN due to concerns about CLABSI. Therefore, we performed a quality improvement project to document our incidence of CLABSI rate for new central parenteral nutrition (CPN) initiated during hospitalization.</p><p><b>Methods:</b> We performed a retrospective review of adult inpatients who initiated CPN at our facility from 1/1/23-12/31/23. Patients were excluded if they received PN prior to admission or only received peripheral PN. The National Healthcare Safety Network (NHSN) definitions were used for CLABSI and secondary attribution. Further deeper review of CLABSI cases was provided by an Infectious Disease (ID) consultant to determine if positive cases were attributable to CPN vs other causes. The type of venous access for the positive patients was also reviewed.</p><p><b>Results:</b> A total of 106 inpatients received CPN for a total of 1121 CPN days. The median [IQR] length of CPN infusion was 8 [4-14] days. Mean (standard deviation) age of patients receiving CPN infusion was 53.3 (18.6) years and 65 (61%) were men. The CPN patients who met criteria for CLABSI were further reviewed by ID consultant and resulted in only four CLABSI cases being attributable to CPN. These four cases resulted in an incidence rate of 3.6 cases of CLABSI per 1000 CPN days. Two of these patients were noted for additional causes of infection including gastric ulcer perforation and bowel perforation with anastomotic leak. Three of the patients had CPN infused via a central venous catheter (a port, a femoral line, and a non-tunneled internal jugular catheter) and the fourth patient had CPN infused via a peripherally inserted central catheter. 
The incidence rate for CLABSI cases per catheter days was not reported in our review.</p><p><b>Conclusion:</b> At our institution, &lt; 4% of patients initiating short-term CPN during hospitalization developed a CLABSI attributable to the CPN. This low rate of infection serves as a benchmark for our institution's quality improvement and quality assurance efforts. Collaboration with ID is recommended for a deeper review of CPN patients with CLABSI to determine whether the infection is more likely related to causes other than infusion of CPN.</p><p>Julianne Harcombe, RPh<sup>1</sup>; Jana Mammen, PharmD<sup>1</sup>; Hayato Delellis, PharmD<sup>1</sup>; Stefani Billante, PharmD<sup>1</sup></p><p><sup>1</sup>Baycare, St. Joseph's Hospital, Tampa, FL</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> Florida Residency Conference 2023.</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Refeeding syndrome is defined as potentially fatal shifts in fluids and electrolytes that may occur in malnourished patients receiving enteral or parenteral nutrition. When refeeding syndrome occurs, a reduction in phosphorus/magnesium/potassium levels and thiamine deficiency can be seen shortly after initiation of calorie provision. The American Society for Parenteral and Enteral Nutrition guidelines consider hypophosphatemia the hallmark sign of refeeding syndrome; however, magnesium and potassium have been shown to be equally important. The purpose of this study is to identify the incidence of hypophosphatemia in patients who are at risk of refeeding syndrome and the importance of monitoring phosphorus.</p><p><b>Methods:</b> This study was a multicenter retrospective chart review that was conducted using the BayCare Health System medical records database, Cerner. The study included patients who were 18 years or older, admitted from January 2023 through December 2023, received total parenteral nutrition (TPN), and were at risk of refeeding syndrome. We defined patients at risk of refeeding syndrome as patients who met two of the following criteria prior to starting the TPN: body mass index (BMI) prior to starting TPN &lt; 18.5 kg/m2, 5% weight loss in 1 month, no oral intake for 5 days, or low levels of serum phosphorus/magnesium/potassium. COVID-19 patients and patients receiving propofol were excluded from the study. The primary objective of the study was to evaluate the incidence of hypophosphatemia versus hypomagnesemia versus hypokalemia in patients receiving TPN who were at risk of refeeding syndrome. The secondary objective was to evaluate whether the addition of thiamine upon initiation of TPN showed benefit in the incidence of hypophosphatemia.</p><p><b>Results:</b> A total of 83 patients met the criteria for risk of refeeding syndrome. Out of the 83 patients, a total of 53 patients were used to run a pilot study to determine the sample size, and 30 patients were included in the study. The results on day 1 and day 2 suggest the incidence of hypomagnesemia differs from that of hypophosphatemia and hypokalemia, with a notably lower occurrence. The Cochran's Q test yielded χ²(2) = 9.57 (p-value = 0.008) on day 1 and χ²(2) = 4.77 (p-value = 0.097) on day 2, indicating a difference in at least one group compared to the others on only day 1. A post hoc analysis found a difference on day 1 between the incidence of hypophosphatemia vs hypomagnesemia (30%) and hypomagnesemia vs hypokalemia (33.3%).
For the secondary outcome, the difference in day 2 versus day 1 phosphorus levels with the addition of thiamine in the TPN was 0.073 (p-value = 0.668, 95% CI [-0.266, 0.413]).</p><p><b>Conclusion:</b> Among patients who were on parenteral nutrition and were at risk of refeeding syndrome, there was not a statistically significant difference in the incidence of hypophosphatemia vs hypomagnesemia and hypomagnesemia vs hypokalemia. Among patients who were on parenteral nutrition and were at risk of refeeding syndrome, there was not a statistically significant difference in day 2 phosphorus levels vs day 1 phosphorus levels when thiamine was added.</p><p></p><p></p><p>Jennifer McClelland, MS, RN, FNP-BC<sup>1</sup>; Margaret Murphy, PharmD, BCNSP<sup>1</sup>; Matthew Mixdorf<sup>1</sup>; Alexandra Carey, MD<sup>1</sup></p><p><sup>1</sup>Boston Children's Hospital, Boston, MA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Iron deficiency anemia (IDA) is common in patients with intestinal failure (IF) dependent on parenteral nutrition (PN). Treatment with enteral iron is preferred; however, it may not be tolerated or efficacious. In these cases, intravenous (IV) iron is a suitable alternative. Complications include adverse reactions and infection, though these are low when using low-molecular-weight (LMW) formulations. In a home PN (HPN) program, an algorithm (Figure 1) was developed to treat IDA utilizing IV iron.</p><p><b>Methods:</b> A retrospective chart review was conducted in a large HPN program (~150 patients annually) of patients who were prescribed IV iron following the algorithm from January 2019 to April 2024. Laboratory studies were analyzed looking for instances of ferritin &gt;500 ng/mL indicating potential iron overload, as well as transferrin saturation of 12-20% indicating iron sufficiency. In instances of ferritin levels &gt;500 ng/mL, further review was conducted to understand etiology and clinical significance and to determine whether the IV iron algorithm was adhered to.</p><p><b>Results:</b> HPN patients are diagnosed with IDA based on an iron panel consistent with deficiency (low hemoglobin and/or MCV, low ferritin, high reticulocyte count, serum iron and transferrin saturation, and/or high total iron binding capacity [TIBC]). If the patient can tolerate enteral iron supplementation, a dose of 3-6 mg/kg/day is initiated. If the patient cannot tolerate enteral iron, the IV route is initiated. The initial IV dose is administered in the hospital or infusion center for close monitoring and to establish home maintenance administration after repletion dosing. Iron dextran is preferred as it can be directly added into the PN and run for the duration of the cycle. Addition to the PN eliminates an extra infusion and decreases additional CVC access. Iron dextran is incompatible with IV lipids, so the patient must have one lipid-free day weekly to allow administration. If the patient receives daily IV lipids, iron sucrose is given as a separate infusion from the PN. Maintenance IV iron dosing is 1 mg/kg/week, with dose and frequency titrated based on clinical status, lab studies, and trends. Iron panel and C-reactive protein (CRP) are ordered every 2 months. If lab studies are below the desired range and consistent with IDA, the IV iron dose is increased by 50% by dose or frequency; if studies are over the desired range, the IV iron dose is decreased by 50% by dose or frequency. The maximum home dose is &lt; 3 mg/kg/dose; if a higher dose is needed, the patient is referred to an infusion center.
IV iron is suspended if ferritin &gt;500 ng/mL due to risk for iron overload and deposition in the liver. Ferritin results (n = 4165) for all patients in the HPN program from January 2019 to April 2024 were reviewed looking for levels &gt;500 ng/mL indicating iron overload. Twenty-nine instances of ferritin &gt;500 ng/mL (0.7% of values reviewed) were identified in 14 unique patients on maintenance IV iron. In 9 instances, the high ferritin level occurred with concomitant acute illness with an elevated CRP; elevated ferritin in these cases was thought to be related to an inflammatory state vs. iron overload. In 2 instances, the IV iron dose was given the day before the lab draw, rendering a falsely elevated result. Two patients had 12 instances (0.28% of values reviewed) of elevated ferritin thought to be related to IV iron dosing in the absence of inflammation, with normal CRP levels. During this period, there were no recorded adverse events.</p><p><b>Conclusion:</b> IDA is common in patients with IF dependent on PN. Iron is not a standard component or additive in PN. Use of IV iron in this population can increase quality of life by decreasing the need for admissions, visits to infusion centers, or blood transfusions in cases of severe anemia. IV iron can be safely used for maintenance therapy in HPN patients with appropriate dosing and monitoring.</p><p></p><p><b>Figure 1.</b> Intravenous Iron in the Home Parenteral Nutrition-Dependent Patient Algorithm.</p><p>Lynne Sustersic, MS, RD<sup>1</sup>; Debbie Stevenson, MS, RD, CNSC<sup>2</sup></p><p><sup>1</sup>Amerita Specialty Infusion Services, Thornton, CO; <sup>2</sup>Amerita Specialty Infusion Services, Rochester Hills, MI</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Desmoplastic small round tumor (DSRT) is a soft-tissue sarcoma that causes tumors to form in the abdomen and pelvis. To improve control, cytoreductive surgery with hyperthermic intraperitoneal chemotherapy (CRS/HIPEC) and radiotherapy are often used, which can result in bowel obstruction secondary to sclerosing peritonitis. This necessitates total parenteral nutrition therapy (TPN) due to the inability to consume nutrition orally or enterally. A major complication of parenteral nutrition therapy is parenteral nutrition associated liver disease (PNALD), and the most common site of metastasis for DSRT is the liver. This case report details the substitution of an olive and soy oil-based intravenous lipid emulsion (OO, SO-ILE) for a soy, MCT, olive, fish oil-based intravenous lipid emulsion (SO, MCT, OO, FO-ILE) to treat high liver function tests (LFTs).</p><p><b>Methods:</b> A 28-year-old male with DSRT metastatic to the peritoneum and a large hepatic mass, complicated by encapsulating peritonitis and enterocutaneous fistula (ECF) following CRS/HIPEC, presented to the parenteral nutrition program at Amerita Specialty Infusion Services in 2022. TPN was initiated from January 2022 to March 2023, stopped, and restarted in December 2023 following a biliary obstruction. TPN was initiated and advanced using 1.3 g/kg/day SMOFlipid (SO, MCT, OO, FO-ILE), known to help mitigate PNALD. The patient developed rising LFTs, with alanine aminotransferase (ALT) peaking at 445 U/L, aspartate aminotransferase (AST) at 606 U/L, and alkaline phosphatase (ALP) at 1265 U/L. Despite transitioning the patient to a cyclic regimen and maximizing calories in dextrose and amino acids, liver function continued to worsen.
A switch to Clinolipid (OO, SO-ILE) at 1.3 g/kg/day was tried.</p><p><b>Results:</b> Following the initiation of OO, SO-ILE, LFTs improved within 12 days, with ALT at 263 U/L, AST at 278 U/L, and ALP at 913 U/L. These values continued to improve until the end of therapy in June 2024, with a final ALT value of 224 U/L, AST at 138 U/L, and ALP at 220 U/L. See Figure 1. No significant improvements in total bilirubin were found. The patient was able to successfully tolerate this switch in lipid emulsions and was able to increase his weight from 50 kg to 53.6 kg.</p><p><b>Conclusion:</b> SO, MCT, OO, FO-ILE is well-supported to help prevent and alleviate adverse effects of PNALD; however, lipid emulsion impacts on other forms of liver disease need further research. Our case suggests that elevated LFTs were likely cancer-induced, rather than associated with prolonged use of parenteral nutrition. A higher olive oil lipid concentration may have beneficial impacts on LFTs that are not associated with PNALD. It is also worth noting that soybean oil has been demonstrated in previous research to have a negative impact on liver function, and the concentration of soy in SO, MCT, OO, FO-ILE is higher (30%) than in OO, SO-ILE (20%). This may warrant further investigation into specific soy concentrations' impact on liver function. LFTs should be assessed and treated on a case-by-case basis that evaluates disease mechanisms, medication-drug interactions, parenteral nutrition composition, and patient subjective information.</p><p></p><p><b>Figure 1.</b> OO, SO-ILE Impact on LFTs.</p><p>Shaurya Mehta, BS<sup>1</sup>; Ajay Jain, MD, DNB, MHA<sup>1</sup>; Kento Kurashima, MD, PhD<sup>1</sup>; Chandrashekhara Manithody, PhD<sup>1</sup>; Arun Verma, MD<sup>1</sup>; Marzena Swiderska-Syn<sup>1</sup>; Shin Miyata, MD<sup>1</sup>; Mustafa Nazzal, MD<sup>1</sup>; Miguel Guzman, MD<sup>1</sup>; Sherri Besmer, MD<sup>1</sup>; Matthew Mchale, MD<sup>1</sup>; Jordyn Wray<sup>1</sup>; Chelsea Hutchinson, MD<sup>1</sup>; John Long, DVM<sup>1</sup></p><p><sup>1</sup>Saint Louis University, St. Louis, MO</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> North American Society for Pediatric Gastroenterology, Hepatology and Nutrition (NASPGHAN) Florida.</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Short bowel syndrome (SBS) is a devastating condition. In the absence of enteral nutrition (EN), patients are dependent on Total Parenteral Nutrition (TPN) and suffer from intestinal failure associated liver disease and gut atrophy. Intestinal adaptation (IA) and enteral autonomy (EA) remain the clinical goal. We hypothesized that EA can be achieved using our DREAM system (US patent 63/413,988), which allows EN via the stomach and provides a mechanism for cyclical recirculation of nutrient-rich distal intestinal content into the proximal bowel, enabling full EN despite SBS.</p><p><b>Methods:</b> Twenty-four neonatal pigs were randomly allocated to enteral nutrition (EN; n = 8); TPN-SBS (on TPN only, n = 8); or DREAM (n = 8). Liver, gut, and serum were collected for histology and serum biochemistry. Statistical analysis was performed using GraphPad Prism 10.1.2 (324). All tests were 2-tailed using a significance level of 0.05.</p><p><b>Results:</b> TPN-SBS piglets had significant cholestasis vs DREAM (p = 0.001), with no statistical difference between DREAM and EN (p = 0.14). DREAM transitioned to full EN by day 4. Mean serum conjugated bilirubin for EN was 0.037 mg/dL, TPN-SBS 1.2 mg/dL, and DREAM 0.05 mg/dL.
Serum bile acids were significantly elevated in TPN-SBS vs EN (p = 0.007) and DREAM (p = 0.03). Mean GGT, a marker of cholangiocytic injury, was significantly higher in TPN-SBS vs EN (p &lt; 0.001) and DREAM (p &lt; 0.001), with values of EN 21.2 U/L, TPN-SBS 47.9 U/L, and DREAM 22.5 U/L (p = 0.89, DREAM vs EN). To evaluate gut growth, we measured lineal gut mass (LGM), calculated as the weight of the bowel per centimeter. There was significant IA and prevention of gut atrophy with DREAM. Mean proximal gut LGM was EN 0.21 g/cm, TPN-SBS 0.11 g/cm, and DREAM 0.31 g/cm (p = 0.004, TPN-SBS vs DREAM). Distal gut LGM was EN 0.34 g/cm, TPN-SBS 0.13 g/cm, and DREAM 0.43 g/cm (p = 0.006, TPN-SBS vs DREAM). IHC revealed DREAM had similar hepatic CK-7 (bile duct epithelium marker; p = 0.18) and hepatic Cyp7A1 (p = 0.3) vs EN. No statistical differences were noted in LGR5-positive intestinal stem cells in EN vs DREAM (p = 0.18). DREAM prevented changes in hepatic CyP7A1, BSEP, FGFR4, SHP, and SREBP-1 and in gut FXR, TGR5, and EGF vs the TPN-SBS group.</p><p><b>Conclusion:</b> DREAM resulted in a significant reduction of hepatic cholestasis, prevented gut atrophy, and presents a novel method enabling early full EN despite SBS. This system, by driving IA and EA, highlights a major advancement in SBS management, bringing a paradigm change to life-saving strategies for SBS patients.</p><p>Silvia Figueiroa, MS, RD, CNSC<sup>1</sup>; Paula Delmerico, MS, RD, CNSC<sup>2</sup></p><p><sup>1</sup>MedStar Washington Hospital Center, Bethesda, MD; <sup>2</sup>MedStar Washington Hospital Center, Arlington, VA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Parenteral nutrition (PN) therapy is a vital clinical intervention for patients of all ages and across care settings. The complexity of PN has the potential to cause significant patient harm, especially when errors occur. According to The Institute for Safe Medication Practices (ISMP), PN is classified as a high-alert medication, and safety-focused strategies should be formulated to minimize errors and harm. Processing PN is multifactorial and includes prescribing, order review and verification, compounding, labeling, and administration. PN prescription should consider clinical appropriateness and formulation safety. The PN formulation must be written to provide appropriate amounts of macronutrients and micronutrients based on the patient's clinical condition, laboratory parameters, and nutrition status. The PN admixture should not exceed the total amounts, recommended concentrations, or rate of infusion of these nutrients as it could result in toxicities and formulation incompatibility or instability. The ASPEN Parenteral Nutrition Safety Consensus Recommendations recommend PN be prescribed using standardized electronic orders via a computerized provider order entry (CPOE) system, as handwritten orders have potential for error. Data suggest up to 40% of PN-related errors occur during the prescription and transcription steps. Registered pharmacists (RPh) are tasked with reviewing PN orders for order accuracy and consistency with recommendations made by the nutrition support team. These RPh adjustments ensure formula stability and nutrient optimization. This quality improvement project compares the frequency of RPh PN adjustments following the initial provider order after the transition from a paper to a CPOE ordering system.
Our hypothesis is CPOE reduces the need for PN adjustments by pharmacists during processing which increases clinical effectiveness and maximizes resource efficiency.</p><p><b>Methods:</b> This was a retrospective evaluation of PN ordering practices at a large, academic medical center after shifting from paper to electronic orders. PN orders were collected during three-week periods in December 2022 (Paper) and December 2023 (CPOE) and analyzed for the frequency of order adjustments by the RPh. Adjustments were classified into intravascular access, infusion rate, macronutrients, electrolytes, multivitamin (MVI) and trace elements (TE), and medication categories. The total number of adjustments made by the RPh during final PN processing was collected. These adjustments were made per nutrition support team.</p><p><b>Results:</b> Daily PN orders for 106 patients – totaling 694 orders – were reviewed for provider order accuracy at the time of fax (paper) and electronic (CPOE) submission. Order corrections made by the RPh decreased by 96% for infusion rate, 91.5% for macronutrients, 79.6% for electrolytes, 81.4% for MVI and TE, and 50% for medication additives (Table 1).</p><p><b>Conclusion:</b> Transitioning to CPOE led to reduction in the need for PN order adjustments at the time of processing. One reason for this decline is improvement in physician understanding of PN recommendations. With CPOE, the registered dietitian's formula recommendation is viewable within the order template and can be referenced at the time of ordering. The components of the active PN infusion also automatically populate upon ordering a subsequent bag. This information can aid the provider when calculating insulin needs or repleting electrolytes outside the PN, increasing clinical effectiveness. A byproduct of this process change is improved efficiency, as CPOE requires less time for provider prescription and RPh processing and verification.</p><p><b>Table 1.</b> RPh Order Adjustments Required During Collection Period.</p><p></p><p>Elaina Szeszycki, BS, PharmD, CNSC<sup>1</sup>; Emily Gray, PharmD<sup>2</sup>; Kathleen Doan, PharmD, BCPPS<sup>3</sup>; Kanika Puri, MD<sup>1</sup></p><p><sup>1</sup>Riley Hospital for Children at Indiana University Health, Indianapolis, IN; <sup>2</sup>Lurie Children's Hospital, Chicago, IL; <sup>3</sup>Riley Hospital for Children at IU Health, Indianapolis, IN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Parenteral nutrition (PN) is ordered daily at Riley Hospital for Children at IU Health by different specialties. Our hospital is a stand-alone pediatric hospital, including labor, delivery, and high-risk maternal care. Historically, the PN orders were due by early afternoon with a hard cut-off by end of the day shift for timely central compounding at a nearby adult hospital. Due to the relocation of staff and equipment to a new sterile compounding facility, a hard deadline was created with an earlier cutoff time and contingency plans for orders received after the deadline. 
This updated process was created to allow for timely delivery to Riley and subsequently to the patients to meet the standard PN hang-time of 2100. The Nutrition Support Team (NST) and Pharmacy and Therapeutics (P&amp;T) Committee approved an updated PN order process as follows:</p><p>(1) Enforce a hard PN deadline of 1200 for new and current PN orders; (2) if a PN order is not received by 1200, renew the active PN order for the next 24 hours; (3) if the active PN order is not appropriate for the next 24 hours, the providers will need to order IVF in place of PN until the following day; (4) enter PN orders into the PN order software by 1500.</p><p><b>Methods:</b> A quality improvement (QI) process check was performed 3 months after initiation of the updated PN order process. Data collection was performed for 1 month with the following data points: total PN orders, missing PN orders at 1200, PN orders re-ordered per P&amp;T policy after the 1200 deadline, lab review, input and output, subsequent order changes for 24 hours after renewal of the active PN order, waste of PN, and service responsible for the late PN order.</p><p><b>Results:</b></p><p></p><p><b>Conclusion:</b> The number of late PN orders after the hard deadline was &lt; 5%, and there was a minimal number of renewed active PN orders due to the pharmacists' concern for ensuring the safety of our patients. No clinically significant changes resulted from renewal of active PN, so this was considered a safe process despite the small numbers. The changes made to late PN orders were minor or related to the planned discontinuation of PN. After review of the results by the NST and pharmacy administration, it was decided to take the following actions: (1) review the data and process with pharmacy staff to assist with workload flow and education; (2) create a succinct Riley TPN process document for providers, specifically services with late orders, reviewing the PN order entry hard deadline and the need for a DC (discontinue) PN order by the deadline, to assist with pharmacy staff workflow and avoidance of potential PN waste; and (3) repeat the QI analysis in 6-12 months.</p><p><b>International Poster of Distinction</b></p><p>Muna Islami, PharmD, BCNSP<sup>1</sup>; Mohammed Almusawa, PharmD, BCIDP<sup>2</sup>; Nouf Alotaibi, PharmD, BCPS, BCNSP<sup>3</sup>; Jwael Alhamoud, PharmD<sup>1</sup>; Maha Islami, PharmD<sup>4</sup>; Khalid Eljaaly, PharmD, MS, BCIDP, FCCP, FIDSA<sup>4</sup>; Majda Alattas, PharmD, BCPS, BCIDP<sup>1</sup>; Lama Hefni, RN<sup>5</sup>; Basem Alraddadi, MD<sup>1</sup></p><p><sup>1</sup>King Faisal Specialist Hospital, Jeddah, Makkah; <sup>2</sup>Wayne State University, Jeddah, Makkah; <sup>3</sup>Umm al Qura University, Jeddah, Makkah; <sup>4</sup>King Abdulaziz University Hospital, Jeddah, Makkah; <sup>5</sup>King Faisal Specialist Hospital, Jeddah, Makkah</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Parenteral nutrition (PN) is a critical therapy for patients unable to meet their nutritional needs through the gastrointestinal tract. While it offers a life-saving solution, it also carries the risk of central line-associated bloodstream infections (CLABSIs). However, there is a lack of comprehensive studies examining the risk factors for CLABSIs in a more heterogeneous cohort of PN recipients. This study aims to identify the risk factors associated with CLABSIs in patients receiving PN therapy in Saudi Arabia.</p><p><b>Methods:</b> This retrospective cohort multicenter study was conducted in three large tertiary referral centers in Saudi Arabia.
The study included all hospitalized patients who received PN therapy through central lines between 2018 and 2022. The purpose of the study was to investigate the association between parenteral nutrition (PN) and central line-associated bloodstream infections (CLABSIs), using both univariate and multivariate analysis.</p><p><b>Results:</b> Out of 662 hospitalized patients who received PN and had central lines, 123 patients (18.6%) developed CLABSI. Among our patients, the duration of parenteral nutrition was a dependent risk factor for CLABSI development (OR, 1.012; 95% CI, 0.9-1.02). In patients receiving PN, the incidence of CLABSI did not change significantly over the study years.</p><p><b>Conclusion:</b> The length of PN therapy is still an important risk factor for CLABSIs; more research is required to determine the best ways to reduce the incidence of CLABSI in patients on PN.</p><p><b>Table 1.</b> Characteristics of Hospitalized Patients Who Received PN.</p><p></p><p>1 n (%); Median (IQR). BMI, Body Mass Index.</p><p><b>Table 2.</b> The Characteristics of Individuals With and Without CLABSI Who Received PN.</p><p></p><p>1 n (%); Median (IQR). 2 Fisher's exact test; Pearson's Chi-squared test; Mann-Whitney U test. PN, Parenteral Nutrition.</p><p></p><p>CLABSI, central line-associated bloodstream infections; PN, parenteral nutrition.</p><p><b>Figure 1.</b> Percentage of Patients With a Central Line Receiving PN Who Experienced CLABSI.</p><p>Duy Luu, PharmD<sup>1</sup>; Rachel Leong, PharmD<sup>2</sup>; Nisha Dave, PharmD<sup>2</sup>; Thomas Ziegler, MD<sup>2</sup>; Vivian Zhao, PharmD<sup>2</sup></p><p><sup>1</sup>Emory University Hospital - Nutrition Support Team, Lawrenceville, GA; <sup>2</sup>Emory Healthcare, Atlanta, GA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Intravenous lipid emulsion (ILE) is an essential component in parenteral nutrition (PN)-dependent patients because it provides calories and essential fatty acids; however, the use of soybean oil-based ILE (SO-ILE) may contribute to the development of PN-associated liver disease (PNALD) in patients with intestinal failure who require chronic PN. To mitigate this risk, new formulations of ILE, such as a mixture of SO, medium chain triglycerides (MCT), olive oil (OO), and fish oil-based ILE (SO/MCT/OO/FO-ILE), or pure fish oil-based ILE (FO-ILE), are now available in the US. FO-ILE is only approved for pediatric use for PN-associated cholestasis. This patient case highlights the benefits of using a combination of FO-containing ILEs to improve PNALD.</p><p><b>Methods:</b> A 65-year-old female with symptomatic achalasia required robotic-assisted esophagomyotomy with fundoplasty in February 2020. Her postoperative course was complicated by bowel injury that required multiple small bowel resections and total colectomy with end jejunostomy, resulting in short bowel syndrome with 80 centimeters of residual small bowel. Postoperatively, daily PN containing SO-ILE was initiated along with tube feedings (TF) during hospitalization, and she was discharged home with PN and TF. Her surgeon referred her to the Emory University Hospital (EUH) Nutrition Support Team (NST) for management. She received daily cyclic-PN infused over 12 hours, providing 25.4 kcal/kg/day with SO-ILE 70 grams (1.2 g/kg) three times weekly and standard TF 1-2 containers/day. In September 2020, she complained of persistent jaundice and was admitted to EUH.
She presented with scleral icterus, hyperbilirubinemia, and elevated liver function tests (LFTs). The EUH NST optimized PN to provide SO/MCT/OO/FO-ILE (0.8 g/kg/day), which improved blood LFTs, and the dose was then increased to 1 g/kg/day. In the subsequent four months, her LFTs worsened despite optimizing pharmacotherapy, continuing cyclic-TF, and reducing and then discontinuing ILE. She required multiple readmissions at EUH and underwent two liver biopsies that confirmed a diagnosis of PN-induced hepatic fibrosis after 15 months of PN. Her serum total bilirubin level peaked at 18.5 mg/dL, which led to an intensive care unit admission and required molecular adsorbent recirculating system therapy. In March 2022, the NST exhausted all options and incorporated FO-ILE (0.84 g/kg/day) three times weekly (separate infusion) and SO/MCT/OO/FO-ILE (1 g/kg/day) weekly.</p><p><b>Results:</b> The patient's LFTs are shown in Figure 1. The blood level of aspartate aminotransferase improved from 123 to 60 units/L, and alanine aminotransferase decreased from 84 to 51 units/L after 2 months and returned to normal after 4 months of the two ILEs. Similarly, the total bilirubin decreased from 5.6 to 2.2 and 1.1 mg/dL by 2 and 6 months, respectively. Both total bilirubin and transaminase levels remained stable. Although her alkaline phosphatase continued to fluctuate and remained elevated over the last two years, this marker decreased from 268 to 104 units/L. All other PNALD-related symptoms resolved.</p><p><b>Conclusion:</b> This case demonstrates that the combination of FO-containing ILEs significantly improved and stabilized LFTs in an adult with PNALD. Additional research is needed to investigate the effect of FO-ILE in adult PN patients to mitigate PNALD.</p><p></p><p>SO: soybean oil; MCT: medium-chain triglyceride; OO: olive oil; FO: fish oil; AST: Aspartate Aminotransferase; ALT: Alanine Aminotransferase.</p><p><b>Figure 1.</b> Progression of Liver Enzyme Status in Relation to Lipid Injectable Emulsions.</p><p>Narisorn Lakananurak, MD<sup>1</sup>; Leah Gramlich, MD<sup>2</sup></p><p><sup>1</sup>Department of Medicine, Faculty of Medicine, Chulalongkorn University, Bangkok, Krung Thep; <sup>2</sup>Division of Gastroenterology, Faculty of Medicine and Dentistry, University of Alberta, Edmonton, AB</p><p><b>Financial Support:</b> This research study received a grant from Baxter, Canada.</p><p><b>Background:</b> Pre-operative parenteral nutrition (PN) has been shown to enhance outcomes in malnourished surgical patients. Traditionally, pre-operative PN necessitates hospital admission, which leads to increased length of stay (LOS) and higher hospital costs. Furthermore, inpatient pre-operative PN may not be feasible or prioritized when access to hospital beds is restricted. Outpatient PN presents a potential solution to this issue. To date, the feasibility and impact of outpatient PN for surgical patients have not been investigated. This study aims to assess the outcomes and feasibility of outpatient pre-operative PN in malnourished surgical patients.</p><p><b>Methods:</b> Patients scheduled for major surgery who were identified as at risk of malnutrition using the Canadian Nutrition Screening Tool and classified as malnourished by Subjective Global Assessment (SGA) B or C were enrolled. Exclusion criteria included severe systemic diseases as defined by the American Society of Anesthesiologists (ASA) classification III to V, insulin-dependent diabetes mellitus, and extreme underweight (less than 40 kg).
Eligible patients received a peripherally inserted central catheter (PICC) line and outpatient PN (Olimel 7.6%E 1,000 ml) for 5-10 days using the maximum infusion days possible prior to surgery at an infusion clinic. PN was administered via an infusion pump over 4-5 hours by infusion clinic nurses. The outcomes and feasibility of outpatient PN were assessed. Safety outcomes, including refeeding syndrome, dysglycemia, volume overload, and catheter-related complications, were monitored.</p><p><b>Results:</b> Outpatient PN was administered to eight patients (4 males, 4 females). Pancreatic cancer and Whipple's procedure were the most common diagnoses and operations, accounting for 37.5% of cases. (Table 1) The mean (SD) duration of PN was 5.5 (1.2) days (range 5-8 days). Outpatient PN was completed in 75% of patients, with an 88% completion rate of PN days (44/50 days). Post-PN infusion, mean body weight and body mass index increased by 4.6 kg and 2.1 kg/m², respectively. The mean PG-SGA score improved by 4.9 points, and mean handgrip strength increased from 20 kg to 25.2 kg. Quality of life, as measured by SF-12, improved in both physical and mental health domains (7.3 and 3.8 points, respectively). Patient-reported feasibility scores were high across all aspects (Acceptability, Appropriateness, and Feasibility), with a total score of 55.7/60 (92.8%). Infusion clinic nurses (n = 3) also reported high total feasibility scores (52.7/60, 87.8%). (Table 2) No complications were observed in any of the patients.</p><p><b>Conclusion:</b> Outpatient pre-operative PN was a feasible approach that was associated with improved outcomes in malnourished surgical patients. This novel approach has the potential to enhance outcomes and decrease the necessity for hospital admission in malnourished surgical patients. Future studies involving larger populations are needed to evaluate the efficacy of outpatient PN.</p><p><b>Table 1.</b> Baseline Characteristics of the Participants (n = 8).</p><p></p><p><b>Table 2.</b> Outcomes and Feasibility of Outpatient Preoperative Parenteral Nutrition (n = 8).</p><p></p><p>Adrianna Wierzbicka, MD<sup>1</sup>; Rosmary Carballo Araque, RD<sup>1</sup>; Andrew Ukleja, MD<sup>1</sup></p><p><sup>1</sup>Cleveland Clinic Florida, Weston, FL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Gastroparesis (GP) is a chronic motility disorder marked by delayed gastric emptying, associated with symptoms: nausea, vomiting and abdominal pain. Treatment consists of diet modifications and medications, with nutritional support tailored to disease severity. Severe refractory cases may require enteral or parenteral nutrition (PN). However, the role of home parenteral nutrition (HPN) in managing GP is underexplored. This study aims to enhance nutrition therapy practice by examining the utilization of HPN in GP population, addressing a significant gap in current nutrition support strategies.</p><p><b>Methods:</b> We conducted a retrospective, single-center analysis of patients receiving HPN from August 2022 to August 2024. Data were obtained through a review of electronic medical records as part of a quality improvement monitoring process. Patients' demographics, etiology of GP, indications for HPN, types of central access, duration of therapy, and PN-related complications were analyzed using descriptive statistics. Inclusion criteria were: adults (&gt;18 yrs.), GP diagnosis by gastric scintigraphy, and HPN for a minimum of 2 consecutive months. 
Among 141 identified HPN patients, 10 had GP as the indication for PN.</p><p><b>Results:</b> GP patients constituted 7% (10/141) of our home PN population. In this cohort analysis of 10 patients with GP receiving HPN, the group was predominantly female (80%), with a mean age of 42.6 years; all individuals identified as Caucasian. All patients had idiopathic GP; severe gastric emptying delay was found in 80% of cases, and all experienced predominant symptoms of nausea/vomiting. Central access consisted of PICC lines (50%), Hickman catheters (30%), Powerlines (10%), and a mediport (10%). The mean weight change with PN therapy was an increase of 21.9 lbs. Eighty percent of patients experienced infection-related complications, including bacteremia (Methicillin-Sensitive Staphylococcus Aureus (MSSA), Methicillin-Resistant Staphylococcus Aureus (MRSA)), Pseudomonas, and fungemia. Deep vein thrombosis (DVT) was identified in 20% of patients, alongside one case of a cardiac thrombus. Tube feeding trials were attempted in 70% of cases, but 50% ultimately discontinued due to intolerance, such as abdominal pain, or complications like buried bumper syndrome. Chronic pain management was used in 60% of patients, with 40% on opioid therapy (morphine, fentanyl). PN was discontinued in 50% of patients due to recurrent infections (20%), advancement to tube feeding (20%), questionable compliance (20%), or improvement in oral intake (40%).</p><p><b>Conclusion:</b> This retrospective analysis underscores the potential of HPN as a nutritional strategy for GP, particularly in patients with refractory symptoms and severe delay in gastric emptying who previously failed EN or experienced complications related to enteral access. In addition to the observed mean weight gain, HPN seems to play a crucial role in alleviating debilitating symptoms such as nausea, vomiting, and abdominal pain, thereby improving patients' overall quality of life. Nonetheless, the prevalence of infection-related complications and the requirement for chronic pain management underscore the challenges associated with GP treatment. The variability in patient responses to different nutritional strategies emphasizes the importance of individualized care plans.
These findings advocate for further research to optimize HPN protocols and improve comprehensive management strategies in GP.</p><p></p><p><b>Figure 1.</b> Reasons for PN Discontinuation.</p><p></p><p><b>Figure 2.</b> Complications Associated with PN.</p><p>Longchang Huang, MD<sup>1</sup>; Peng Wang<sup>2</sup>; Shuai Liu<sup>3</sup>; Xin Qi<sup>1</sup>; Li Zhang<sup>1</sup>; Xinying Wang<sup>4</sup></p><p><sup>1</sup>Department of General Surgery, Jinling Hospital, Medical School of Nanjing University, Nanjing, Jiangsu; <sup>2</sup>Department of Digestive Disease Research Center, Gastrointestinal Surgery, The First People's Hospital of Foshan, Guangdong, Foshan; <sup>3</sup>Department of General Surgery, Jinling Hospital, Medical School of Nanjing University, Nanjing, Jiangsu; <sup>4</sup>Department of General Surgery, Jinling Hospital, Medical School of Nanjing University, Nanjing, Jiangsu</p><p><b>Financial Support:</b> National Natural Science Foundation of China, 82170575 and 82370900.</p><p><b>Background:</b> Total parenteral nutrition (TPN)-induced gut microbiota dysbiosis is closely linked to intestinal barrier damage, but the mechanism remains unclear.</p><p><b>Methods:</b> Through the application of 16S rRNA gene sequencing and metagenomic analysis, we examined alterations in the gut microbiota of patients with chronic intestinal failure (CIF) and TPN mouse models subjected to parenteral nutrition, subsequently validating these observations in an independent verification cohort. Additionally, we conducted a comprehensive analysis of key metabolites utilizing liquid chromatography-mass spectrometry (LC-MS). Moreover, we explored modifications in essential innate-like lymphoid cell populations through RNA sequencing (RNA-seq), flow cytometry, and single-cell RNA sequencing (scRNA-seq).</p><p><b>Results:</b> The gut barrier damage associated with TPN is due to decreased Lactobacillus murinus. L. murinus mitigates TPN-induced intestinal barrier damage through the metabolism of tryptophan into indole-3-carboxylic acid (ICA). Furthermore, ICA stimulates group 3 innate lymphoid cells (ILC3) to secrete interleukin-22 by targeting the nuclear receptor Rorc to enhance intestinal barrier protection.</p><p><b>Conclusion:</b> We elucidate the mechanisms driving TPN-associated intestinal barrier damage and indicate that interventions with L. murinus or ICA could effectively ameliorate TPN-induced gut barrier injury.</p><p></p><p><b>Figure 1.</b> TPN Induces Intestinal Barrier Damage in Humans and Mice. (a) Rates of fever and ICU admission in Cohort 1. (b-d) Serum levels of IFABP, CRP, and LPS in patients with CIF. (e) Representative intestinal H&amp;E staining and injury scores (f) (n = 10 mice per group). (g) Intestinal electrical resistance in mice measured by Ussing chamber (n = 5 mice per group). (h) Immunofluorescence experiments in the intestines and livers of mice. (i) Western blot results in the Chow and TPN groups.</p><p></p><p><b>Figure 2.</b> TPN Induces Gut Dysbiosis in Humans and Mice. (a) PCoA for 16S rRNA of fecal content from Cohort 1 (n = 16 individuals/group). (b) Significantly different abundances identified using linear discriminant analysis (LDA). (c) Top 10 most abundant genera. (d) PCoA of the relative genus or species abundances (n = 5 mice per group). (e) LDA for mice. (f) Sankey diagram showing the top 10 most abundant genera in humans and mice.
(g) The following heatmap illustrates the correlation between the abundance of species in intestinal microbiota and clinical characteristics of patients with CIF.</p><p></p><p><b>Figure 3.</b> Metabolically active L.murinus ameliorate intestinal barrier damage. (a) RT-PCR was conducted to quantify the abundance of L. murinus in feces from L-PN and H-PN patients (Cohorts 1 and 2). (b) Representative intestinal H&amp;E staining and injury scores (c) (n = 10 mice per group). (d) The results of the electrical resistance of the intestine in mice by Ussing Chamber (n = 5 mice per group). (e) The results of Western Blot. (f) 3D-PCA and volcano plot (g) analyses between the Chow and TPN group mice. (h) The metabolome-wide pathways were enriched based on the metabolomics data obtained from fecal content from Chow and TPN group mice (n = 5 mice per group). (i) The heatmap depicts the correlation between the abundance of intestinal microbiota in species level and tryptophan metabolites of the Chow and TPN group mice (n = 5 mice per group). (j) VIP scores of 3D-PCA. A taxon with a variable importance in projection (VIP) score of &gt;1.5 was deemed to be of significant importance in the discrimination process.</p><p></p><p><b>Figure 4.</b> ICA is critical for the effects of L.murinus. (a) The fecal level of ICA from TPN mice treated PBS control or ICA(n = 10 mice per group). (b) Representative intestinal H&amp;E staining and injury scores (c) (n = 10 mice per group). (d) The results of the electrical resistance of the intestine in mice by Ussing Chamber (n = 5 mice per group). (e) The results of Western blot. (f) This metabolic pathway illustrates the production of ICA by the bacterium L. murinus from the tryptophan. (g) PLS-DA for the profiles of metabolite in feces from TPN mice receiving ΔArAT or live L. murinus (n = 5 mice per group). (h) The heat map of tryptophan-targeted metabolomics in fecal samples from TPN mice that received either ΔArAT (n = 5) or live L. murinus (n = 5). (i) Representative intestinal H&amp;E staining and injury scores (j)(n = 10 mice per group). (k) The results of Western Blot.</p><p>Callie Rancourt, RDN<sup>1</sup>; Osman Mohamed Elfadil, MBBS<sup>1</sup>; Yash Patel, MBBS<sup>1</sup>; Taylor Dale, MS, RDN<sup>1</sup>; Allison Keller, MS, RDN<sup>1</sup>; Alania Bodi, MS, RDN<sup>1</sup>; Suhena Patel, MBBS<sup>1</sup>; Andrea Morand, MS, RDN, LD<sup>1</sup>; Amanda Engle, PharmD, RPh<sup>1</sup>; Manpreet Mundi, MD<sup>1</sup></p><p><sup>1</sup>Mayo Clinic, Rochester, MN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Although patent foramen ovale (PFO) are generally asymptomatic and cause no health concerns, they can be a risk factor for embolism and stroke. Due to this theoretical risk, some institutions have established protocols requiring most IV solutions to be administered through a small micron filter in patients with PFO. While 2-in-1 dextrose and amino acid solutions can be filtered through a 0.22-micron filter with relative ease, injectable lipid emulsions (ILEs), whether on their own or as part of a total admixture, consist of larger particles, requiring a 1.2-micron or bigger filter size. The use of the larger filter precludes the administration of ILE, an essential source of calories, in patients with PFO. It is unknown if patients who do receive ILE have an increased incidence of lipid embolism and stroke.</p><p><b>Methods:</b> A single-center retrospective review of patients on central parenteral nutrition (CPN) was completed. 
Demographics, and baseline clinical characteristics including co-morbidities and history of CVA were collected. The outcome of interest is defined as an ischemic cerebrovascular accident (CVA) within 30 days of CPN and, therefore, potentially attributable to it. Other cardiovascular and thromboembolic events were captured. All Patients with a PFO diagnosis and inpatient CPN administration between January 1, 2018, and December 18, 2023, at our quaternary care referral center were included as the case cohort. A 3:1 control group matched to age, gender, duration of inpatient CPN, and clinical co-morbidities was identified and utilized to examine the difference in the outcome of interest.</p><p><b>Results:</b> Patients with PFO who received CPN (n = 38, 53.8% female) had a mean age of 63.5 ± 13.1 years and a mean BMI of 31.1 ± 12.1 at CPN initiation (Table 1). The PFO varied in size, with the majority (38.5%) having a very small/trivial one (Table 2). All patients in this cohort had appropriate size filters placed for CPN and ILE administration. CPN prescription and duration were comparable between both groups. The majority of patients with PFO (53.8%) received mixed oil ILE, followed by soy-olive oil ILE (23.1%), whereas the majority of patients without PFO (51.8%) received soy-olive oil ILE and (42.9%) received mixed oil ILE (Table 3). Case and control groups had cardiovascular risks at comparable prevalence, including obesity, hypertension, diabetes, and dyslipidemia. However, more patients had a history of vascular cardiac events and atrial fibrillation in the PFO group, and more patients were smokers in the non-PFO group (Table 4). Patients with PFO received PN for a median of 7 days (IQR: 5,13), and 32 (84.2%) received ILE. Patients without PFO who received CPN (n = 114, 52.6%) had a mean age of 64.9 ± 10 years and a mean BMI of 29.6 ± 8.5 at CPN initiation. Patients in this cohort received PN for a median of 7 days (IQR: 5,13.5), and 113 (99.1%) received ILE. There was no difference in the incidence of ischemic CVA within 30 days of receiving CPN between both groups (2 (5.3%) in the PFO group vs. 1 (0.8%) in the non-PFO group; p = 0.092) (Table 4).</p><p><b>Conclusion:</b> The question of the risk of CVA/stroke with CPN in patients with PFO is clinically relevant and often lacks a definitive answer. Our study revealed no difference in ischemic CVA potentially attributable to CPN between patients with PFO and patients without PFO in a matched control cohort in the first 30 days after administration of PN. 
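For readers who want to see how such a 2 × 2 comparison can be reproduced, a minimal illustrative sketch follows, using the event counts quoted above (2 of 38 in the PFO group vs. 1 of 114 in the matched non-PFO group). This is not the authors' analysis code, and the abstract does not state which test produced the reported p-value.

```python
# Illustrative sketch only: comparing 30-day ischemic CVA incidence between the
# PFO cohort (2 of 38) and the matched non-PFO cohort (1 of 114).
from scipy.stats import chi2_contingency, fisher_exact

table = [[2, 38 - 2],     # PFO group: events, non-events
         [1, 114 - 1]]    # non-PFO group: events, non-events

chi2, p_chi2, dof, expected = chi2_contingency(table, correction=False)
_, p_fisher = fisher_exact(table, alternative="two-sided")

print(f"Chi-square (no continuity correction): p = {p_chi2:.3f}")
print(f"Fisher's exact test: p = {p_fisher:.3f}")
# With only 3 events in total, an exact test is usually preferred for a table this sparse.
```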
This finding demonstrates that CPN with ILE is likely safe for patients with PFO in an inpatient setting.</p><p><b>Table 1.</b> Baseline Demographics and Clinical Characteristics.</p><p></p><p><b>Table 2.</b> PFO Diagnosis.</p><p></p><p>*All received propofol concomitantly.</p><p><b>Table 3.</b> PN Prescription.</p><p></p><p><b>Table 4.</b> Outcomes and Complications.</p><p></p><p><b>Enteral Nutrition Therapy</b></p><p>Osman Mohamed Elfadil, MBBS<sup>1</sup>; Edel Keaveney, PhD<sup>2</sup>; Adele Pattinson, RDN<sup>1</sup>; Danelle Johnson, MS, RDN<sup>1</sup>; Rachael Connolly, BSc.<sup>2</sup>; Suhena Patel, MBBS<sup>1</sup>; Yash Patel, MBBS<sup>1</sup>; Ryan Hurt, MD, PhD<sup>1</sup>; Manpreet Mundi, MD<sup>1</sup></p><p><sup>1</sup>Mayo Clinic, Rochester, MN; <sup>2</sup>Rockfield MD, Galway</p><p><b>Financial Support:</b> Rockfield Medical Devices.</p><p><b>Background:</b> Patients on home enteral nutrition (HEN), many of whom are mobile, can experience significant hardships and reduced quality of life (QoL) due to limitations on mobility, on top of burdens due to underlying disease processes. Improving mobility while feeding could reduce burdens associated with HEN and potentially improve QoL. This prospective cohort study aims to evaluate participants’ perspectives on their mobility, ease of performing physical activities while feeding, and QoL following the use of a novel enteral feeding system (EFS).</p><p><b>Methods:</b> A prospective single-center study was conducted to evaluate a novel EFS, which is an FDA-cleared elastomeric system (Mobility + ®) that consists of a lightweight feeding pouch (reservoir for 500 mL feed), a filling set (used in conjunction with a syringe to fill EFS) and a feeding set to deliver EN formula to an extension set/feeding tube with an ISO 80369-3 compatible connector. Adult HEN-dependent patients were recruited by invitation to use the study EFS for a minimum of 2 feeds a day for 14 days, preceded by a familiarization period of 5-7 days. Participant perspectives on how they rated performing typical daily activities while feeding (e.g., moving, traveling, socializing) and feeding system parameters (ease of use, portability, noise, discretion, performance) were evaluated using HEN-expert validated questionnaires. A score was given for each rating from 1 to 5, with 5 being the most positive response. An overall score was calculated and averaged for the cohort. Participants were followed up during the familiarization period. On days 7 and 14, additional telephone interviews were conducted regarding compliance, enteral feed intake, participant perspectives on study EFS vs. current system, and other measures. We excluded those with reduced functional capacity due to their underlying disease(s).</p><p><b>Results:</b> Seventeen participants completed the study (mean age 63.8 ± 12 years; 70.6% male). Participants used various feeding systems, including gravity, bolus method, and pump, with the majority (82.4%) having a G-tube placed (Table 1). Sixteen (94.1%) patients achieved use of study EFS for at least two feeds a day (and majority of daily EN calories) for all study days (Table 2). The ratings for the ability to perform various activities using study EFS were significantly different compared to those of the systems used before the study. 
An improvement in ratings was noted for the ease of performing common daily activities, including moving between rooms or on stairs, taking short and long walks, traveling by car or public transport, engaging in moderate- to high-intensity activities, sleeping, and socializing with family and friends, between the time point before enrollment and the end of the study (day 14) (p-value &lt; 0.0001) (Table 3). Ratings of feeding system parameters were significantly different between systems used before the study and the study EFS (p &lt; 0.0001) (Table 3), with the largest increases in positive ratings noted in relation to ease of carrying, noise level, and ability to feed discreetly. Ratings for overall satisfaction with the performance of study EFS did not differ from the ratings for the systems used before the study, with participants reporting that the main influencing factors were the length of time and the effort needed to fill study EFS. No difference was noted in the QoL rating.</p><p><b>Conclusion:</b> The studied EFS is safe and effective as an enteral feeding modality that provides an alternative option for HEN recipients. Participants reported a significant positive impact of study EFS on their activities of daily living. Although the overall QoL rating remained the same, improvements in mobility, discretion, and ease of carrying, all aspects of QoL, were associated with the use of study EFS.</p><p><b>Table 1.</b> Baseline Demographics and Clinical Characteristics.</p><p></p><p><b>Table 2.</b> Safety and Effectiveness.</p><p></p><p><b>Table 3.</b> Usability and Impact of the Study EFS.</p><p></p><p>Talal Sharaiha, MD<sup>1</sup>; Martin Croce, MD, FACS<sup>2</sup>; Lisa McKnight, RN, BSN MS<sup>2</sup>; Alejandra Alvarez, ACP, PMP, CPXP<sup>2</sup></p><p><sup>1</sup>Aspisafe Solutions Inc., Brooklyn, NY; <sup>2</sup>Regional One Health, Memphis, TN</p><p><b>Financial Support:</b> Talal Sharaiha is an executive of Aspisafe Solutions Inc. Martin Croce, Lisa McKnight and Alejandra Alvarez are employees of Regional One Health institutions. Regional One Health has a financial interest in Aspisafe Solutions Inc. through services, not cash. Aspisafe provided the products at no charge.</p><p><b>Background:</b> Feeding tube securement has seen minimal innovation over the past decades, leaving medical adhesives as the standard method. However, adhesives frequently fail to maintain secure positioning, with dislodgement rates reported between 36% and 62%, averaging approximately 40%. Dislodgement can lead to adverse outcomes, including aspiration, increased risk of malnutrition, higher healthcare costs, and extended nursing care. We aimed to evaluate the safety and efficacy of a novel feeding tube securement device in preventing NG tube dislodgement compared to standard adhesive tape. The device consists of a bracket that sits on the patient's upper lip. The bracket has a non-adhesive mechanism for securing feeding tubes ranging from sizes 10 to 18 French. It extends to two cheek pads that sit on either side of the patient's cheeks and is further supported by a head strap that wraps around the patient's head (Figures 1 and 2).</p><p><b>Methods:</b> We conducted a prospective, case-control trial at Regional One Health Center, Memphis, TN, comparing 50 patients using the novel securement device, the NG Guard, against 50 patients receiving standard care with adhesive tape. The primary outcome was the rate of accidental or intentional NG tube dislodgement.
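As a hedged illustration of how the risk measures reported in the results below can be derived (the 65% relative risk reduction and the 18 fewer reinsertion events per 100 tubes), a minimal arithmetic sketch follows; it is not the study's analysis code.

```python
# Minimal arithmetic sketch of the risk measures reported in the results below;
# rates and counts are taken from the abstract (31% vs. 11% dislodgement,
# 12 vs. 3 replacement tubes among 50 patients per arm).
control_rate, device_rate = 0.31, 0.11

absolute_risk_reduction = control_rate - device_rate       # 0.20
relative_risk_reduction = 1 - device_rate / control_rate   # ~0.65, i.e., "reduced the risk by 65%"

# Reinsertions expressed per 100 tubes inserted.
reinsertions_per_100_control = 12 / 50 * 100               # 24
reinsertions_per_100_device = 3 / 50 * 100                 # 6
fewer_per_100 = reinsertions_per_100_control - reinsertions_per_100_device  # 18

print(f"Relative risk reduction: {relative_risk_reduction:.0%}")
print(f"Fewer reinsertions per 100 tubes: {fewer_per_100:.0f}")
```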
Secondary outcomes included the number of new NG tubes required as a result of dislodgement, and device-related complications or adhesive-related skin injuries in the control group. Statistical analyses employed Student's t-test for continuous variables and Fisher's exact test for categorical variables. Significance was set at an alpha level of 0.05. We adjusted for confounding variables, including age, sex, race, and diagnosis codes related to delirium, dementia, and confusion (Table 1).</p><p><b>Results:</b> There were no significant differences between the groups in baseline characteristics, including age, sex, race, or confusion-related diagnoses (Table 2) (p ≥ 0.09). Nasogastric tube dislodgement occurred significantly more often in the adhesive tape group (31%) compared to the intervention group (11%) (p &lt; 0.05). The novel device reduced the risk of tube dislodgement by 65%. Additionally, 12 new tubes were required in the control group compared to 3 in the intervention group (p &lt; 0.05), translating to 18 fewer reinsertion events per 100 tubes inserted in patients secured by the novel device. No device-related complications or adhesive-related injuries were reported in either group.</p><p><b>Conclusion:</b> The novel securement device significantly reduced the incidence of nasogastric tube dislodgement compared to traditional adhesive tape. It is a safe and effective securement method and should be considered for use in patients with nasogastric tubes to reduce the likelihood of dislodgement and the need for reinsertion of new NG tubes.</p><p><b>Table 1.</b> Diagnosis Codes Related to Dementia and Delirium.</p><p></p><p><b>Table 2.</b> Baseline Demographics.</p><p></p><p></p><p><b>Figure 1.</b> Novel Securement Device - Front View.</p><p></p><p><b>Figure 2.</b> Novel Securement Device - Side Profile.</p><p><b>Best of ASPEN-Enteral Nutrition Therapy</b></p><p><b>Poster of Distinction</b></p><p>Alexandra Kimchy, DO<sup>1</sup>; Sophia Dahmani, BS<sup>2</sup>; Sejal Dave, RDN<sup>1</sup>; Molly Good, RDN<sup>1</sup>; Salam Sunna, RDN<sup>1</sup>; Karen Strenger, PA-C<sup>1</sup>; Eshetu Tefera, MS<sup>3</sup>; Alex Montero, MD<sup>1</sup>; Rohit Satoskar, MD<sup>1</sup></p><p><sup>1</sup>MedStar Georgetown University Hospital, Washington, DC; <sup>2</sup>Georgetown University Hospital, Washington, DC; <sup>3</sup>MedStar Health Research Institute, Columbia, MD</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Early nutrition intervention is of high importance in patients with cirrhosis given the faster onset of protein catabolism for gluconeogenesis compared to those without liver disease. Severe malnutrition is associated with frequent complications of cirrhosis such as infection, hepatic encephalopathy, and ascites. Furthermore, studies have demonstrated higher mortality rates in cirrhotic patients who are severely malnourished in both the pre- and post-transplant setting. The current practice guidelines encourage the use of enteral nutrition in cirrhotic patients who are unable to meet their intake requirements with an oral diet. The aim of this study was to evaluate the utilization and implications of enteral feeding in hospitalized patients with cirrhosis diagnosed with severe protein calorie malnutrition at our institution.</p><p><b>Methods:</b> This was a retrospective study of patients admitted to the transplant hepatology inpatient service at MedStar Georgetown University Hospital from 2019-2023. 
ICD-10-CM code E43 was then used to identify patients with a diagnosis of severe protein calorie malnutrition. The diagnosis of cirrhosis and pre-transplant status were confirmed by review of the electronic medical record. Patients with the following characteristics were excluded: absence of cirrhosis, history of liver transplant, admission diagnosis of upper gastrointestinal bleed, and/or receipt of total parenteral nutrition. Wilcoxon rank sum and two-sample t-tests were used to examine differences in the averages of continuous variables between the two groups. Chi-square and Fisher exact tests were used to investigate differences for categorical variables. Statistical significance was defined as p-values ≤ 0.05.</p><p><b>Results:</b> Of the 96 patients with cirrhosis and severe protein calorie malnutrition, 31 patients (32%) received enteral nutrition. Time from admission to initiation of enteral feeding was on average 7 days, with an average total duration of enteral nutrition of 10 days. In the group that received enteral nutrition, there was no significant change in weight, BMI, creatinine, total bilirubin or MELD 3.0 score from admission to discharge; however, albumin, sodium and INR levels had significantly increased (Table 1). A comparative analysis between patients with and without enteral nutrition showed a significant increase in length of stay, intensive care requirement, bacteremia, gastrointestinal bleeding, discharge MELD 3.0 score and in-hospital mortality rates among patients with enteral nutrition. There was no significant difference in rates of spontaneous bacterial peritonitis, pneumonia, admission MELD 3.0 score or post-transplant survival duration in patients with enteral nutrition compared to those without enteral nutrition (Table 2).</p><p><b>Conclusion:</b> In this study, fewer than half of the patients hospitalized with cirrhosis received enteral nutrition despite having a diagnosis of severe protein calorie malnutrition. Initiation of enteral nutrition was delayed by a week, on average, after hospital admission. Prolonged length of stay and higher in-hospital mortality rates suggest a lack of benefit of enteral nutrition when started late in the hospital course. Based on these findings, our institution has implemented a quality improvement initiative to establish earlier enteral feeding in hospitalized patients with cirrhosis and severe protein calorie malnutrition.
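As an illustrative aside to the statistical approach named in the methods above (Wilcoxon rank sum tests for continuous outcomes and Fisher exact tests for categorical outcomes), the sketch below shows how such between-group comparisons are typically run. Only the group sizes (31 with and 65 without enteral nutrition) come from the abstract; all values are simulated.

```python
# Illustrative sketch of the between-group comparisons named in the methods above.
# Only the group sizes (31 EN vs. 65 no-EN) come from the abstract; all values are simulated.
import numpy as np
from scipy.stats import ranksums, fisher_exact

rng = np.random.default_rng(0)
los_en = rng.gamma(shape=3.0, scale=6.0, size=31)     # hypothetical length of stay, EN group (days)
los_no_en = rng.gamma(shape=3.0, scale=3.5, size=65)  # hypothetical length of stay, no-EN group (days)

# Continuous outcome: Wilcoxon rank sum test.
_, p_los = ranksums(los_en, los_no_en)

# Categorical outcome (e.g., in-hospital mortality): Fisher exact test on a 2x2 table.
mortality_table = [[9, 31 - 9],   # hypothetical deaths / survivors, EN group
                   [5, 65 - 5]]   # hypothetical deaths / survivors, no-EN group
_, p_mortality = fisher_exact(mortality_table)

print(f"LOS comparison p = {p_los:.3f}; mortality comparison p = {p_mortality:.3f}")
```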
Future studies will evaluate the efficacy of this initiative and implications for clinical outcomes.</p><p><b>Table 1.</b> The Change in Clinical End Points from Admission to Discharge Among Patients Who Received Enteral Nutrition.</p><p></p><p>Abbreviations: kg, kilograms; BMI, body mass index; INR, international normalized ratio; Na, sodium; Cr, creatinine; TB, total bilirubin; MELD, model for end stage liver disease; EN, enteral nutrition; Std, standard deviation.</p><p><b>Table 2.</b> Comparative Analysis of Clinical Characteristics and Outcomes Between Patients With and Without Enteral Nutrition.</p><p></p><p>Abbreviations: MASLD, metabolic dysfunction-associated steatotic liver disease; HCV, hepatitis C virus; HBV, hepatitis B virus; AIH, autoimmune hepatitis; PSC, primary sclerosing cholangitis; PBC, primary biliary cholangitis; EN, enteral nutrition; RD, registered dietitian; MICU, medical intensive care unit; PNA, pneumonia; SBP, spontaneous bacterial peritonitis; GIB, gastrointestinal bleed; MELD, model for end stage liver disease; d, days; N, number; Std, standard deviation.</p><p>Jesse James, MS, RDN, CNSC<sup>1</sup></p><p><sup>1</sup>Williamson Medical Center, Franklin, TN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Feeding tubes (Tubes) are used to deliver enteral nutrition to patients who are unable to safely ingest nutrients and medications orally, a population at elevated risk of malnutrition and dehydration. Unfortunately, these Tubes have a propensity for becoming clogged. Staff will attempt to unclog Tubes using standard bedside techniques including warm water flushes or chemical enzymes. However, these practices are not only time-consuming but often unsuccessful, requiring tube replacement. An actuated mechanical device for restoring patency in clogged small bore Tubes was evaluated at a level 2 medical center as an alternative declogging method from September 2021 to July 2023. Study objectives were to explore the actuated mechanical device's ability to unclog indwelling Tubes and monitor any potential safety issues.</p><p><b>Methods:</b> The TubeClear® System (actuated mechanical device, Actuated Medical, Inc., Bellefonte, PA, Figure 1) was developed to resolve clogs from various indwelling Tubes. N = 20 patients (Table 1), n = 16 with 10Fr, 109 cm long nasogastric (NG) tubes and n = 4 with 10Fr, 140 cm long nasojejunal (NJ) tubes, underwent clearing attempts with the actuated mechanical device. Initially, patients underwent standard declogging strategies for a minimum of 30 minutes, including warm water flushes and Creon/NaHCO3 slurry. Following unsuccessful patency restoration (n = 17) or patency restoration and reclogging occurring (n = 3), the actuated mechanical device was attempted. Procedure time was estimated from the electronic charting system and included set up, use, and cleaning time for the actuated mechanical device, to the closest five minutes. All clearing procedures were completed by three trained registered dietitians.</p><p><b>Results:</b> The average time to restore Tube patency (n = 20) was 26.5 min (25 minutes for NG, 32.5 min for NJ) with 90% success (Table 2), and no significant safety issues were reported by the operator or patient. User satisfaction was 100% (20/20), and patient discomfort was reported in 10% (2/20).</p><p><b>Conclusion:</b> Based on presented results, the actuated mechanical device was significantly more successful at resolving clogs compared to alternative bedside practices.
Operators noted that the “actuated mechanical device was able to work with clogs when slurries/water can't be flushed.” It was noted that actuated mechanical device use prior to formation of a full clog, utilizing a prophylactic approach, “was substantially easier than waiting until the Tube fully clogged.” For a partly clogged Tube, “despite it being somewhat patent and useable, a quick pass of the actuated mechanical device essentially restored full patency and likely prevented a full clog.” For an NG patient, “no amount of flushing or medication slurry was effective, but the actuated mechanical device worked in just minutes without issue.” “Following standard intervention failure after multiple attempts, only the actuated mechanical device was able to restore Tube patency, saving money by not having to replace the Tube.” For a failed clearance, the operator noted that “despite failure to restore patency, there was clearly no opportunity for flushes to achieve a better result and having this option [actuated mechanical device] was helpful to attempt to avoid tube replacement.” For an NJ patient, “there would have been no other conventional method to restore patency of a NJ small bore feeding tube without extensive x-ray exposure and ‘guess work,’ which would have been impossible for this patient who was critically ill and ventilator dependent.” Having an alternative to standard bedside unclogging techniques proved beneficial to this facility: the device was 90% effective, spared those patients a Tube replacement, and saved the facility the associated replacement costs.</p><p><b>Table 1.</b> Patient and Feeding Tube Demographics.</p><p></p><p><b>Table 2.</b> Actuated Mechanical Device Uses.</p><p></p><p></p><p><b>Figure 1.</b> Actuated Mechanical Device for Clearing Partial and Fully Clogged Indwelling Feeding Tubes.</p><p>Vicki Emch, MS, RD<sup>1</sup>; Dani Foster<sup>2</sup>; Holly Walsworth, RD<sup>3</sup></p><p><sup>1</sup>Aveanna Medical Solutions, Lakewood, CO; <sup>2</sup>Aveanna Medical Solutions, Chandler, AZ; <sup>3</sup>Aveanna Medical Solutions, Erie, CO</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Homecare providers have managed through multiple formula backorders since the pandemic. Due to creative problem-solving, clinicians have successfully been able to offer substitutions. However, when pump feeding sets are on backorder, the options are limited; feeding sets are specific to the brand of pump. Recently, a backorder of feeding sets used for pumps common in acute and home care resulted in a severe shortage in the home care supply chain. This required fast action to ensure patients were able to continue administering their tube feeding and to prevent readmissions. A solution is to change the patient to a pump brand which is not on backorder. Normally, transitioning a patient to a different brand of pump would require in-person teaching. Due to the urgency of the situation, a more efficient method needed to be established. A regional home care provider determined that 20% of patients using enteral feeding pumps were using backordered sets, and 50% were pediatric patients who tend not to tolerate other methods of feeding. In response, this home care provider assembled a team to create a new educational model for pump training. The team was composed of Registered Nurses, Registered Dietitians, and patient care and distribution representatives.
The hypothesis was that providing high-quality educational material with instructional videos, detailed communication on the issue, and telephonic clinical support would allow for a successful transition.</p><p><b>Methods:</b> To determine urgency of transition, we prioritized patients with a diagnosis of short bowel syndrome, gastroparesis, glycogen storage disease, or vent dependency, those &lt; 2 years of age, and those living in a rural area with a 2-day shipping zip code, and conducted a clinical review to identify patients with a jejunal feeding tube (see Table 1). A pump conversion team contacted patients/caregivers to review the situation, discuss options for nutrition delivery, determine current inventory of sets, assess urgency for transition, and coordinate delivery of the pump, sets, and educational material. Weekly reporting tracked the number of patients using the impacted pump, transitioned patients, and those requesting to transition back to their original pump.</p><p><b>Results:</b> A total of 2111 patients were using the feeding pump with backordered sets, and 50% of these patients were under the age of 12 years. Over a period of 3 months, 1435 patients (68% of this patient population) were successfully transitioned to a different brand of pump, and of those, only 7 patients (0.5%) requested to return to their original pump even though they understood the risk of potentially running short on feeding sets (see Figure 1).</p><p><b>Conclusion:</b> A team approach which included proactively communicating with patients/caregivers, prioritizing patient risk level, providing high-quality educational material with video links and outbound calls from a clinician resulted in a successful transition to a new brand of feeding pump.</p><p><b>Table 1.</b> Patient Priority Levels for Pump with Backordered Sets.</p><p></p><p></p><p><b>Figure 1.</b> Number of Pump Conversions.</p><p>Desiree Barrientos, DNP, MSN, RN, LEC<sup>1</sup></p><p><sup>1</sup>Coram CVS, Chino, CA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Follow-up care for the enteral nutrition therapy community is essential for good outcomes. No data had been collected regarding home enteral nutrition (HEN) outcomes at a major university medical center. There was no robust program in place to follow up with patients who were discharged on tube feedings. Consequently, there was little information regarding the follow-up care or patient outcomes related to current practice, complications, re-hospitalizations, and equipment issues for this population.</p><p><b>Methods:</b> The tools utilized were the questionnaire for the 48-hour and 30-day post-discharge outreach calls, pre-discharge handouts, and feeding pump handouts.</p><p><b>Results:</b> Education: Comparison of 48 Hours and 30 Days. Q1: Can you tell me why you were hospitalized? Q2: Did a provider contact you for education prior to discharging home? Q3: Do you understand your nutrition orders from your doctor? Q4: Can you tell me the steps of how to keep your PEG/TF site clean? Q5: Can you tell me how much water to flush your tube? There was an improvement in patient understanding, self-monitoring, and navigation between the two survey timepoints.
Regarding patient education, understanding of nutrition orders (Q3) improved from 91% to 100%, knowledge of the steps to keep the tube feeding site clean (Q4) improved from 78% to 96%, and knowledge of how much water to flush before and after each feeding improved from 81% to 100% between the 48-hour and 30-day timepoints.</p><p><b>Conclusion:</b> There was an improvement in patient understanding, self-monitoring, and navigation between the two survey timepoints. Verbal responses to open-ended and informational questions were aggregated to analyze complications, care gaps, and service failures.</p><p><b>Table 1.</b> Questionnaire Responses At 48 Hours and 30 Days.</p><p></p><p><b>Table 2.</b> Questionnaire Responses At 48 Hours and 30 Days.</p><p></p><p></p><p><b>Figure 1.</b> Education: Comparison at 48 Hours and 30 Days.</p><p></p><p><b>Figure 2.</b> Self-monitoring and Navigation: Comparison at 48 Hours and 30 Days.</p><p>Rachel Ludke, MS, RD, CD, CNSC, CCTD<sup>1</sup>; Cayla Marshall, RD, CD<sup>2</sup></p><p><sup>1</sup>Froedtert Memorial Lutheran Hospital, Waukesha, WI; <sup>2</sup>Froedtert Memorial Lutheran Hospital, Big Bend, WI</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Initiation of early enteral nutrition plays an essential role in improving patient outcomes.<sup>1</sup> Historically, feeding tubes have been placed by nurses, doctors and advanced practice providers. Over the past two decades, the prevalence of dietitians (RDNs) placing feeding tubes at the bedside has grown. This practice has been endorsed by the Academy of Nutrition and Dietetics and the American Society for Parenteral and Enteral Nutrition through modifications to the Scope of Practice for the Registered Dietitian Nutritionist and Standards of Professional Performance for Registered Dietitian Nutritionists.<sup>2,3</sup> Feeding tube placement at the bedside by RDNs has the potential to decrease nursing, fluoroscopy and internal transport time, which is of interest to our hospital. In fall of 2023, we launched a pilot to evaluate the feasibility of an RDN-led bedside tube placement team at our 800-bed level 1 trauma center.</p><p><b>Methods:</b> RDNs first worked closely with nursing leadership to create a tube placement workflow to identify appropriate patients, outline communication needed with the bedside nurse and provider, and establish troubleshooting solutions (Figure 1). Intensive care unit nursing staff then trained RDNs in tube placement using camera-guided technology (IRIS) and deemed RDNs competent after 10 successful tube placements. Given limited literature on RDN-led tube placement, we defined success as &gt;80% of tube placements in an appropriate position within the gastrointestinal tract.</p><p><b>Results:</b> To date, the pilot includes 57 patients; 46 tubes (80.7%; 39 gastric and 7 post-pyloric) were placed successfully, as confirmed by KUB. Of note, 2 of the 39 gastric tubes were originally ordered to be placed post-pylorically; however, gastric tubes were deemed acceptable for these patients after issues were identified during placement. Eleven (19%) tube placements were unsuccessful due to behavioral issues, blockages in the nasal cavity, and anatomical abnormalities.</p><p><b>Conclusion:</b> This pilot demonstrated that well-trained RDNs can successfully place feeding tubes at the bedside using camera-guided tube placement technology. One limitation to this pilot is the small sample size.
We initially limited the pilot to 2 hospital floors and had trouble educating nursing staff on the availability of the RDN to place tubes. Through evaluation of tube placement orders, we found the floors placed a total of 198 tubes during our pilot, indicating 141 missed opportunities for RDNs to place tubes. To address these issues, we created numerous educational bulletins and worked with unit nursing staff to encourage contacting the RDN when feeding tube placement was needed. We also expanded the pilot hospital-wide and are looking into time periods when tubes are most often placed. Anecdotally, bedside feeding tube placement takes 30 to 60 minutes; therefore, this pilot saved an estimated 1350 to 2700 minutes of nursing time and 180 to 360 minutes of fluoroscopy time that would have been necessary to place post-pyloric tubes. Overall, our pilot has demonstrated the feasibility of RDN-led bedside feeding tube placement, allowing staff RDNs to practice at the top of their scope and promoting effective use of hospital resources.</p><p></p><p><b>Figure 1.</b> Dietitian Feeding Tube Insertion Pilot: 2NT and 9NT.</p><p>Lauren Murch, MSc, RD<sup>1</sup>; Janet Madill, PhD, RD, FDC<sup>2</sup>; Cindy Steel, MSc, RD<sup>3</sup></p><p><sup>1</sup>Nestle Health Science, Cambridge, ON; <sup>2</sup>Brescia School of Food and Nutritional Sciences, Faculty of Health Sciences, Western University, London, ON; <sup>3</sup>Nestle Health Science, Hamilton, ON</p><p><b>Financial Support:</b> Nestle Health Science.</p><p><b>Background:</b> Continuing education (CE) is a component of professional development which serves two functions: maintaining practice competencies and translating new knowledge into practice. Understanding registered dietitian (RD) participation and perceptions of CE facilitates creation of more effective CE activities to enhance knowledge acquisition and practice change. This preliminary analysis of a practice survey describes RD participation in CE and evaluates barriers to CE participation.</p><p><b>Methods:</b> This was a cross-sectional survey of clinical RDs working across care settings in Canada. Targeted participants (n = 4667), identified using a convenience sample and in accordance with applicable law, were invited to complete a 25-question online survey between November 2023 and February 2024. Descriptive statistics and frequencies were reported.</p><p><b>Results:</b> Nationally, 428 RDs working in acute care, long-term care, and home care fully or partially completed the survey (9.1% response rate). Respondents indicated the median ideal number of CE activities per year was 3 in-person, 5 reviews of written materials, and 6 virtual activities. However, participation was most frequently reported as “less often than ideal” for in-person activities (74.7% of respondents) and written material (53.6%) and “as often as ideal” for virtual activities (50.7%). Pre-recorded video presentations, live virtual presentations and critically reviewing written materials were the most common types of CE that RDs had participated in at least once in the preceding 12 months. In-person hands-on sessions, multimodal education and simulations were the least common types of CE that RDs had encountered in the preceding 12 months (Figure 1). The most frequent barriers to participation in CE were cost (68% of respondents), scheduling (60%), and location (51%).
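A hypothetical sketch of the weighted-score ranking referenced in the note under Table 1 below: each barrier is ranked by a weighted score derived from 5-point Likert responses. The survey's exact weighting is not specified, so the score-times-respondent-count approach shown here is an assumption.

```python
# Hypothetical sketch of a weighted-score ranking from 5-point Likert responses;
# the survey's exact weighting is not specified, so score x respondent count is an assumption.
from collections import Counter

def weighted_score(ratings):
    """Sum of (Likert point value x number of respondents giving that value)."""
    counts = Counter(ratings)
    return sum(point * n for point, n in counts.items())

# Hypothetical responses: 1 = did not limit participation, 5 = greatly limited participation.
barriers = {
    "Inadequate staff coverage": [5, 5, 4, 4, 5, 3],
    "No dedicated education days": [4, 4, 5, 3, 4, 4],
    "Cost": [3, 2, 4, 3, 3, 2],
}

for name, score in sorted(((b, weighted_score(r)) for b, r in barriers.items()),
                          key=lambda item: item[1], reverse=True):
    print(f"{name}: weighted score {score}")
```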
However, encountered barriers that most greatly limited participation in CE were inadequate staff coverage, lack of dedicated education days within role, and lack of dedicated time during work hours (Table 1). When deciding to participate in CE, RDs ranked the most important aspects of the content as 1) from a credible source, 2) specific/narrow topic relevant to practice and 3) enabling use of practical tools/skills at the bedside.</p><p><b>Conclusion:</b> This data suggests there is opportunity for improvement in RD CE participation, with the greatest gaps in in-person and written activities. Although RDs recognize the importance of relevant and practical content, they reported infrequent exposure to types of CE that are well-suited to this, such as simulations and hands-on activities. When planning and executing CE, content should be credible, relevant, and practical, using a format that is both accessible and impactful. Results of this study help benchmark Canadian RD participation in CE and provide convincing evidence to address barriers and maximize optimal participation.</p><p><b>Table 1.</b> Frequent and Impactful Barriers Limiting Participation in CE Activities.</p><p></p><p>Note: 1. Percentages total greater than 100% because all respondents selected the 3 most important barriers impacting their participation in CE activities. 2. Items are ranked based on a weighted score calculated from a 5-point Likert scale, indicating the extent to which the barrier was perceived to have limited participation in CE activities.</p><p></p><p><b>Figure 1.</b> Types of Continuing Education Activities Dietitians Participated In At Least Once, In The Preceding 12-Months.</p><p>Karen Sudders, MS, RDN, LDN<sup>1</sup>; Alyssa Carlson, RD, CSO, LDN, CNSC<sup>2</sup>; Jessica Young, PharmD<sup>3</sup>; Elyse Roel, MS, RDN, LDN, CNSC<sup>2</sup>; Sophia Vainrub, PharmD, BCPS<sup>4</sup></p><p><sup>1</sup>Medtrition, Huntingdon Valley, PA; <sup>2</sup>Endeavor Health/Aramark Healthcare +, Evanston, IL; <sup>3</sup>Parkview Health, Fort Wayne, IN; <sup>4</sup>Endeavor Health, Glenview, IL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Adequate nutrition is a critical component of patient care and plays a significant role in reducing hospital length of stay (LOS). Malnutrition is associated with numerous adverse outcomes, including increased morbidity, delayed recovery, and prolonged hospitalization (Tappenden et al., 2013). Timely nutrition interventions, and strategies for integrating successful nutrition care into hospital protocols to reduce LOS are essential parts of medical nutrition therapy. Modular nutrition often plays a key role in these interventions. A study by Klek et al. (2020) emphasizes the role of modular nutrition in providing personalized nutritional support for the specific needs of critically ill patients. The study suggests that using nutrient modules allows for a more precise adjustment of nutrition based on the metabolic requirements of patients (Klek et al., 2020). Modular nutrition has also been shown to positively impact clinical outcomes in ICU patients. An observational study by Compher et al. (2019) reported that the targeted administration of protein modules to achieve higher protein intake was associated with improved clinical outcomes, such as reduced ICU LOS (Compher et al., 2019).</p><p><b>Methods:</b> Administration of modular nutrition can be a challenge. 
Typically, modular proteins (MP) are ordered through the dietitian and dispensed as part of the diet order. The nursing team is responsible for administration and documentation of the MP. There is commonly a disconnect between MP prescription and administration. In some cases, this is related to the MP not being a tracked task in the electronic health record (EHR). The goal of this evaluation was to review data related to a quality improvement (QI) initiative where MP (ProSource TF) was added to the medication administration record (MAR), which used a barcode scanning process to track provision and documentation of MP. The objective of this evaluation was to determine a possible association between the QI initiative and patients’ ICU LOS. The QI initiative evaluated a pre-implementation timeframe from June 1<sup>st</sup>, 2021, to November 30<sup>th</sup>, 2021, and a post-implementation timeframe from January 1<sup>st</sup>, 2022, to June 30<sup>th</sup>, 2022. There were a total of 1962 ICU encounters in the pre-implementation period and 1844 ICU encounters in the post-implementation period. The data were analyzed using a series of statistical tests.</p><p><b>Results:</b> The t-test for the total sample was significant, t(3804) = 8.35, p &lt; .001, indicating the average LOS was significantly lower post-implementation compared to pre-implementation (Table 1). This association suggests that improved provision of MP may be related to a reduced LOS in the ICU. In addition to LOS, the data also suggest a relationship between the MAR and MP utilization. Pre-implementation, 1600 doses of MP were obtained, increasing to 2400 doses post-implementation. The data suggest a relationship between product use and MAR implementation even though the overall number of encounters post-implementation was lower. There was a 50% increase in product utilization post-implementation compared to the pre-implementation period.</p><p><b>Conclusion:</b> The data suggest a benefit of adding MP to the MAR to help improve provision, streamline documentation, and potentially reduce ICU LOS.</p><p><b>Table 1.</b> Comparison of LOS Between Pre and Post Total Encounters.</p><p></p><p>Table 1 displays the t-test comparison of LOS in pre vs post implementation of MP on the MAR.</p><p></p><p><b>Figure 1.</b> Product Utilization and Encounters Pre vs Post Implementation of MP on the MAR.</p><p><b>International Poster of Distinction</b></p><p>Eliana Giuntini, PhD<sup>1</sup>; Ana Zanini, RD, MSc<sup>2</sup>; Hellin dos Santos, RD, MSc<sup>2</sup>; Ana Paula Celes, MBA<sup>2</sup>; Bernadette Franco, PhD<sup>3</sup></p><p><sup>1</sup>Food Research Center/University of São Paulo, São Paulo; <sup>2</sup>Prodiet Medical Nutrition, Curitiba, Parana; <sup>3</sup>Food Research Center/School of Pharmaceutical Sciences/University of São Paulo, São Paulo</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Critically ill patients present an increased need for protein to preserve muscle mass due to anabolic resistance. Additionally, these patients are more prone to developing hyperglycemia, and one of the nutritional strategies that can be adopted is to provide a diet with a low glycemic index. Hypercaloric and high-protein enteral formulas can help meet energy and protein goals for these patients. Due to their reduced carbohydrate content, these formulas can also contribute to lowering the postprandial glycemic response.
The study aimed to evaluate the glycemic index (GI) and the glycemic load (GL) of a specialized high-protein enteral nutrition formula.</p><p><b>Methods:</b> Fifteen healthy volunteers were selected, based on self-reported absence of diseases or regular medication use, aged between 21 and 49 years, with normal glucose tolerance according to fasting and postprandial glucose assessments over 2 hours. The individuals attended after a 10-hour fast, once per week, consuming the glucose solution – reference food – for 3 weeks, and the specialized high-protein enteral formula (Prodiet Medical Nutrition) in the following week, both in amounts equivalent to 25 g of available carbohydrates. The specialized high-protein formula provides 1.5 kcal/ml, 26% protein (98 g/L), 39% carbohydrates, and 35% lipids, including EPA + DHA. Capillary blood sampling was performed at regular intervals, at 0 (before consumption), 15, 30, 45, 60, 90, and 120 minutes. The incremental area under the curve (iAUC) was calculated, excluding areas below the fasting line. Glycemic load (GL) was determined based on the equation GL = [GI (glucose=reference) X grams of available carbohydrates in the portion]/100. Student's t-test were conducted to identify differences (p &lt; 0.05).</p><p><b>Results:</b> To consume 25 g of available carbohydrates, the individuals ingested 140 g of the high-protein formula. The high-protein enteral formula showed a low GI (GI = 23) with a significant difference compared to glucose (p &lt; 0.0001) and a low GL (GL = 8.2). The glycemic curve data showed significant differences at all time points between glucose and the specialized high-protein formula, except at T90, with the glycemic peak occurring at T30 for glucose (126 mg/dL) and at both T30 and T45 for the specialized high-protein enteral formula, with values significantly lower than glucose (102 vs 126 mg/dL). The iAUC was smaller for the specialized high-protein formula compared to glucose (538 ± 91 vs 2061 ± 174 mg/dL x min) (p &lt; 0.0001), exhibiting a curve without high peak, typically observed in foods with a reduced glycemic index.</p><p><b>Conclusion:</b> The specialized high-protein enteral nutrition formula showed a low GI and GL, resulting in a significantly reduced postprandial glycemic response, with lower glucose elevation and variation. This may reduce insulin requirements, and glycemic variability.</p><p></p><p><b>Figure 1.</b> Mean Glycemic Response of Volunteers (N = 15) to 25 G of Available Carbohydrates After Consumption of Reference Food and a Specialized High-Protein Enteral Nutrition Formula, in 120 Min.</p><p>Lisa Epp, RDN, LD, CNSC, FASPEN<sup>1</sup>; Bethaney Wescott, APRN, CNP, MS<sup>2</sup>; Manpreet Mundi, MD<sup>2</sup>; Ryan Hurt, MD, PhD<sup>2</sup></p><p><sup>1</sup>Mayo Clinic Rochester, Rochester, MN; <sup>2</sup>Mayo Clinic, Rochester, MN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Hypnotherapy is the use of hypnosis for the treatment of a medical or psychological disorder. Specifically, gut directed hypnotherapy is a treatment option for functional gastrointestinal disorders and disorders of the gut brain axis. It has been shown to be effective in management of GI symptoms such as abdominal pain, nausea, functional dyspepsia and irritable bowel syndrome symptoms. Evidence suggests that 6%–19% of patients with these GI symptoms exhibit characteristics of Avoidant/restrictive food intake disorder (ARFID). 
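As a supplement to the glycemic index abstract above, a minimal sketch of the two calculations it describes follows: a trapezoidal iAUC that excludes area below the fasting baseline, GI expressed relative to the glucose reference, and GL = GI × available carbohydrate per portion/100. The glucose values are hypothetical, and the per-subject averaging used in practice is omitted.

```python
# Illustrative sketch of the calculations described above: trapezoidal iAUC that ignores
# area below the fasting (t = 0) value, GI relative to the glucose reference, and
# GL = GI x available carbohydrate per portion / 100. Glucose values are hypothetical.

def iauc(times_min, glucose_mg_dl):
    """Trapezoidal area of glucose above the fasting value; segments below baseline contribute zero."""
    baseline = glucose_mg_dl[0]
    increments = [max(g - baseline, 0.0) for g in glucose_mg_dl]
    area = 0.0
    for i in range(1, len(times_min)):
        dt = times_min[i] - times_min[i - 1]
        area += (increments[i] + increments[i - 1]) / 2.0 * dt
    return area

times = [0, 15, 30, 45, 60, 90, 120]
glucose_reference = [85, 115, 126, 118, 108, 95, 88]  # hypothetical curve, 25 g glucose solution
glucose_formula = [85, 95, 102, 102, 97, 92, 88]      # hypothetical curve, test formula

gi = 100 * iauc(times, glucose_formula) / iauc(times, glucose_reference)
gl = gi * 25 / 100  # 25 g available carbohydrate per test portion

print(f"GI ≈ {gi:.0f} (glucose = 100); GL ≈ {gl:.1f}")
# The study's reported GI of 23 comes from the actual measured curves averaged across subjects.
```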
Multiple studies show improvement in GI symptoms and ability to maintain that improvement after 1 year. However, there is a paucity of data regarding use of hypnotherapy in home enteral nutrition patients.</p><p><b>Methods:</b> A case report involving a 67-year-old adult female with h/o Irritable bowel syndrome (diarrhea predominant) and new mucinous appendiceal cancer s/p debulking of abdominal tumor, including colostomy and distal gastrectomy is presented. She was on parenteral nutrition (PN) for 1 month post op due to delayed return of bowel function before her oral diet was advanced. Unfortunately, she had difficulty weaning from PN as she was “scared to start eating” due to functional dysphagia with gagging at the sight of food, even on TV. After 4 weeks of PN, a nasojejunal feeding tube was placed and she was dismissed home.</p><p><b>Results:</b> At multidisciplinary outpatient nutrition clinic visit, the patient was dependent on enteral nutrition and reported inability to tolerate oral intake for unclear reasons. Long term enteral access was discussed, however the patient wished to avoid this and asked for alternative interventions she could try to help her eat. She was referred for gut directed hypnotherapy. After 4 in-person sessions over 3 weeks of hypnotherapy the patient was able to tolerate increasing amounts of oral intake and remove her nasal jejunal feeding. Upon follow-up 4 months later, she was still eating well and continued to praise the outcome she received from gut directed hypnotherapy.</p><p><b>Conclusion:</b> Patient-centered treatments for gut-brain axis disorders and disordered eating behaviors and/or eating disorders are important to consider in addition to nutrition support. These include but are not limited to Cognitive Behavior Therapy, mindfulness interventions, acupuncture, biofeedback strategies, and gut directed hypnotherapy. Group, online and therapist directed therapies could be considered for treatment avenues dependent on patient needs and preferences. Additional research is needed to better delineate impact of these treatment modalities in the home enteral nutrition population.</p><p>Allison Krall, MS, RD, LD, CNSC<sup>1</sup>; Cassie Fackler, RD, LD, CNSC<sup>1</sup>; Gretchen Murray, RD, LD, CNSC<sup>1</sup>; Amy Patton, MHI, RD, CNSC, LSSGB<sup>2</sup></p><p><sup>1</sup>The Ohio State University Wexner Medical Center, Columbus, OH; <sup>2</sup>The Ohio State University Wexner Medical Center, Westerville, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> It is well documented that unnecessary hospital admissions can have a negative impact on patient's physical and emotional wellbeing and can increase healthcare costs.<sup>1</sup> Numerous strategies exist to limit unnecessary hospital admissions; one innovative strategy being utilized at our 1000+ bed Academic Medical Center involves Registered Dietitians (RDs). 
Literature demonstrates that feeding tube placement by a dedicated team using electromagnetic tracking reduces patient morbidity and mortality and is a cost-effective solution for this procedure.<sup>2</sup> RDs have been part of feeding tube teams for many years, though exact numbers of RD-only teams are unclear.<sup>3</sup> The Revised 2021 Standards of Practice and Standards of Professional Performance for RDNs (Competent, Proficient, and Expert) in Nutrition Support identifies that dietitians at the “expert” level strive for additional experience and training, and thus may serve a vital role in feeding tube placement teams.<sup>4</sup> Prior to the implementation of the RD-led tube team, there was no uniform process at our hospital for these patients to obtain enteral access in a timely manner.</p><p><b>Methods:</b> In December 2023, an “RD tube team” consult and order set went live within the electronic medical record at our hospital. The original intent of the tube team was to cover the inpatient units, but it soon became apparent that there were opportunities to extend this service to observation areas and the Emergency Department (ED). This case series abstract will outline case studies from three patients with various clinical backgrounds and how the tube team was able to prevent an inpatient admission. Patient 1: An 81-year-old female who returned to the ED on POD# 4 s/p esophageal repair with a dislodged nasoenteric feeding tube. The RD tube team was consulted and was able to replace her tube and bridle it in place. The patient was discharged from the ED without requiring hospital readmission. Patient 2: An 81-year-old male with a history of ENT cancer who transferred to our ED after an outside hospital ED had no trained staff available to replace his dislodged nasoenteric feeding tube. The RD tube team replaced his tube and bridled it into place. The patient was able to discharge from the ED without admission. Patient 3: A 31-year-old female with a complex GI history and multiple prolonged hospitalizations due to PO intolerance. The patient returned to the ED 2 days post discharge with a clogged nasoenteric feeding tube. The tube could not be unclogged, thus the RD tube team was able to replace the tube in the ED and prevent readmission.</p><p><b>Results:</b> Consult volumes validated the need for a tube team service. In the first 8 months of consult and order set implementation, a total of 403 tubes were placed by the RD team. Of those, 24 (6%) were placed in the ED and observation units. In the 3 patient cases described above, and numerous other patient cases, the RD was able to successfully place a tube using an electromagnetic placement device (EMPD), thereby preventing a patient admission.</p><p><b>Conclusion:</b> Creating a feeding tube team can be a complex process to navigate and requires support from senior hospital administration, physician champions, nursing teams and legal/risk management teams.
Within the first year of implementation, our hospital system was able to demonstrate that RD-led tube teams have the potential not only to help establish safe enteral access for patients but also to be an asset to the medical facility by preventing admissions and readmissions.</p><p><b>Table 1.</b> RD Tube Team Consults (December 11, 2023-August 31, 2024).</p><p></p><p>Arina Cazac, RD<sup>1</sup>; Joanne Matthews, RD<sup>2</sup>; Kirsten Willemsen, RD<sup>3</sup>; Paisley Steele, RD<sup>4</sup>; Savannah Zantingh, RD<sup>5</sup>; Sylvia Rinaldi, RD, PhD<sup>2</sup></p><p><sup>1</sup>Internal Equilibrium, King City, ON; <sup>2</sup>London Health Sciences Centre, London, ON; <sup>3</sup>NutritionRx, London, ON; <sup>4</sup>Vanier Children's Mental Wellness, London, ON; <sup>5</sup>Listowel-Wingham and Area Family Health Team, Wingham, ON</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Parkinson's disease is the second most prevalent neurodegenerative disease, and dysphagia is one of its predominant disease-related symptoms. Dysphagia can increase the risk of aspiration, the inhaling of food or liquids into the lungs, potentially triggering pneumonia, a frequent cause of death in patients with Parkinson's disease. Therefore, feeding tubes are placed to deliver nutrition into the stomach or the small intestine, maintaining appropriate nutrition delivery and reducing the risk of aspiration from oral intake. To the best of our knowledge, there is no research comparing the differences in outcomes between gastric (G) and jejunal (J) tube feeding in patients with Parkinson's disease-related dysphagia; however, limited research does exist comparing these two modalities in critically ill populations. The purpose of this study is to compare differences in hospital readmissions related to aspiration events and differences in mortality rates after placement of a gastric or jejunal feeding tube in patients with Parkinson's disease-related dysphagia.</p><p><b>Methods:</b> This was a retrospective chart review of patients admitted to either the medicine or clinical neurosciences units at University Hospital in London, Ontario, Canada, between January 1, 2010, and December 31, 2022. Patients were included if they had a documented diagnosis of Parkinson's or associated diseases and had a permanent feeding tube placed during hospital admission that was used for enteral nutrition. Patients were excluded from the study if they had a comorbidity that would affect their survival, such as cancer, or if a feeding tube was placed for reasons unrelated to Parkinson's disease-related dysphagia, for example, feeding tube placement post-stroke. A p-value &lt; 0.05 was considered statistically significant.</p><p><b>Results:</b> 25 participants were included in this study; 7 had gastric feeding tubes, and 18 had jejunal feeding tubes. Demographic data are shown in Table 1. No statistically significant differences were found in demographic variables between the G- and J-tube groups. Of interest, none of the 28% of participants who had dementia were discharged to home; 5 were discharged to long-term care, 1 was discharged to a complex continuing care facility, and 1 died in hospital. Differences in readmission rates and mortality between groups did not reach significance, likely due to our small sample size in the G-tube group (Figures 1 and 2). 
However, we found that 50% of participants were known to have died within 1 year of initiating enteral nutrition via their permanent feeding tube, and there was a trend toward higher readmission rates in the G-tube group.</p><p><b>Conclusion:</b> While this study did not yield statistically significant results, it highlights the need for further research with a larger sample size to assess confounding factors, such as concurrent oral intake, that affect the difference in outcomes between G- and J-tube groups. Future research would also benefit from examining the impact on quality of life in these patients. Additional research is necessary to inform clinical practice guidelines and clinical decision-making for clinicians, patients and families when considering a permanent feeding tube.</p><p><b>Table 1.</b> Participant Demographics.</p><p></p><p></p><p>Readmission rates were calculated as a percentage of the number of readmissions to the number of discharges from hospital. If a participant was readmitted more than once within the defined timeframes, subsequent readmissions were counted as a new readmission and new discharge event. Readmission rate calculations did not include participants who died during or after the defined timeframes. Differences in readmission rates between gastric and jejunal feeding tube groups did not reach statistical significance.</p><p><b>Figure 1.</b> Readmission Rate.</p><p></p><p>Mortality rates were calculated from the time that enteral nutrition was initiated through a permanent feeding tube in 30-day, 60-day, 90-day, and 1-year time intervals. Differences in mortality rates between gastric and jejunal feeding tube groups did not reach statistical significance.</p><p><b>Figure 2.</b> Mortality Rate.</p><p>Jennifer Carter, MHA, RD<sup>1</sup></p><p><sup>1</sup>Winchester Medical Center, Valley Health, Winchester, VA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Early enteral nutrition is shown to improve patient outcomes and can decrease, or attenuate, the progression of malnutrition. Placement of nasoenteric feeding tubes was deemed within the scope of practice (SOP) for Registered Dietitian Nutritionists (RDNs) in 2007, with recent revisions in 2021, by the Academy of Nutrition and Dietetics (AND) and the American Society for Parenteral and Enteral Nutrition (ASPEN). Due to enhanced order-writing privileges, RDNs are knowledgeable about and aware of those in need of enteral nutrition recommendations. The goal of this abstract is to highlight the efficiency of an RDN-led nasoenteric tube placement team.</p><p><b>Methods:</b> A retrospective chart review of the first 70 patients who received a nasoenteric tube placed by an RDN in 2023 was conducted. Data points collected included time from tube order to tube placement and time from tube order to enteral nutrition order.</p><p><b>Results:</b> Out of 70 tubes placed, the average time from tube order to tube placement was 2.28 hours. The longest time from tube order to placement was 19 hours. The average time from tube order to enteral nutrition order was 5.58 hours. The longest time from tube order to enteral nutrition order was 23.6 hours.</p><p><b>Conclusion:</b> This retrospective review reflects the timeliness of placement and provision of enteral nutrition in the acute care setting when performed by an all-RDN team. On average, placement occurred within less than 2.5 hours of the tube placement order, and enteral nutrition orders were entered within less than 6 hours of the tube placement order. 
The RDNs at Winchester Medical Center have been placing nasoenteric-feeding tubes since 2013 using an electromagnetic tube placement device (EMPD). As of 2018, it became an all RDN team. With the enhanced support from AND and ASPEN, nasoenteric feeding tube placement continues to be promoted within the SOP for RDNS. Also, the Accreditation Council for Education in Nutrition and Dietetics (ACEND) now requires dietetic interns to learn, observe, and even assist with nasoenteric tube placements. Over time, more RDNs in the acute care setting will likely advance their skillset to include this expertise.</p><p></p><p><b>Figure 1.</b> Time From MD Order to Tube Placement in Hours.</p><p></p><p><b>Figure 2.</b> Time From MD Order of Tube to Tube Feed Order in Hours.</p><p><b>Poster of Distinction</b></p><p>Vanessa Millovich, DCN, MS, RDN, CNSC<sup>1</sup>; Susan Ray, MS, RD, CNSC, CDCES<sup>2</sup>; Robert McMahon, PhD<sup>3</sup>; Christina Valentine, MD, RDN, FAAP, FASPEN<sup>4</sup></p><p><sup>1</sup>Kate Farms, Hemet, CA; <sup>2</sup>Kate Farms, Temecula, CA; <sup>3</sup>Seven Hills Strategies, Columbus, OH; <sup>4</sup>Kate Farms, Cincinnati, OH</p><p><b>Financial Support:</b> Kate Farms provided all financial support.</p><p><b>Background:</b> Whole food plant-based diets have demonstrated metabolic benefits across many populations. The resulting increased intake of dietary fiber and phytonutrients is integral to the success of this dietary pattern due to the positive effect on the digestive system. Patients dependent on tube feeding may not receive dietary fiber or sources of phytonutrients, and the impact of this is unknown. Evidence suggests that the pathways that promote digestive health include more than traditional prebiotic sources from carbohydrate fermentation. Data on protein fermentation metabolites and potential adverse effects on colon epithelial cell integrity are emerging. These lesser-known metabolites, branched-chain fatty acids (BCFAs), are produced through proteolytic fermentation. Emerging research suggests that the overproduction of BCFAs via protein fermentation may be associated with toxic by-products like p-cresol. These resulting by-products may play a role in digestive disease pathogenesis. Enteral formulas are often used to support the nutritional needs of those with digestive conditions. Plant-based formulations made with yellow pea protein have been reported to improve GI tolerance symptoms. However, the underlying mechanisms responsible have yet to be investigated. The purpose of this study was to assess the impact of a mixed food matrix enteral formula containing pea protein, fiber, and phytonutrients on various markers of gut health in healthy children and adults, using an in-vitro model.</p><p><b>Methods:</b> Stool samples of ten healthy pediatric and 10 adult donors were collected and stored at -80°C. The yellow pea protein formulas (Kate Farms™ Pediatric Standard 1.2 Vanilla-P1, Pediatric Peptide 1.0 Vanilla-P2, and Standard 1.4 Plain-P3) were first predigested using standardized intestinal processing to simulate movement along the digestive tract. The in-vitro model was ProDigest's Colon-on-a-Plate (CoaP®) simulation platform which has demonstrated in vivo-in vitro correlation. Measurements of microbial metabolic activity included pH, production of gas, SCFAs, BCFA, ammonia, and microbiota shifts. Paired two-sided t-tests were performed to evaluate differences between treatment and control. 
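</p><p>As a minimal sketch of this statistical step only, and not the authors' analysis code, a paired two-sided t-test of one readout against the negative control might look like the example below; the donor values and variable names are hypothetical.</p><pre><code># Minimal sketch (hypothetical data): paired two-sided t-test of one readout
# (e.g., total SCFA per donor) for a formula treatment versus the negative control.
from scipy import stats

# The same ten donors measured under both conditions (values are illustrative).
control   = [42.1, 38.5, 45.0, 40.2, 39.8, 44.3, 41.7, 37.9, 43.5, 40.0]
treatment = [55.4, 49.2, 58.1, 51.0, 50.3, 57.6, 53.9, 48.8, 56.2, 52.4]

t_stat, p_value = stats.ttest_rel(treatment, control)  # paired samples, two-sided by default
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")          # a p-value below 0.05 indicates a significant shift
</code></pre><p>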
Differential abundance analysis was performed using LEfSe and treeclimbR. Statistical significance, as compared to negative control, is indicated by a p-value of &lt; 0.05.</p><p><b>Results:</b> In the pediatric group, the microbial analysis showed significant enrichment of Bifidobacteria as well as butyrate-producing genera Agathobacter and Agathobaculum with the use of the pediatric formulas when compared to the control. P1 resulted in a statistically significant reduction of BCFA production (p &lt; = 0.05). P1 and P2 resulted in statistically significant increases in acetate and propionate. In the adult group, with treatment using P3, microbial analysis showed significant enrichment of Bifidobacteria compared to the control group. P3 also resulted in a reduction of BCFAs, although not statistically significant. Gas production and drop in pH were statistically significant (p &lt; = 0.05) for all groups P1, P2, and P3 compared to control, which indicates microbial activity.</p><p><b>Conclusion:</b> All enteral formulas demonstrated a consistent prebiotic effect on the gut microbial community composition in healthy pediatric and adult donors. These findings provide insight into the mechanisms related to digestive health and highlight the importance of designing prospective interventional research to better understand the role of fiber and phytonutrients within enteral products.</p><p>Hill Johnson, MEng<sup>1</sup>; Shanshan Chen, PhD<sup>2</sup>; Garrett Marin<sup>3</sup></p><p><sup>1</sup>Luminoah Inc, Charlottesville, VA; <sup>2</sup>Virginia Commonwealth University, Richmond, VA; <sup>3</sup>Luminoah Inc, San Diego, CA</p><p><b>Financial Support:</b> Research was conducted with the support of VBHRC's VA Catalyst Grant Funding.</p><p><b>Background:</b> Medical devices designed for home use must prioritize user safety, ease of operation, and reliability, especially in critical activities such as enteral feeding. This study aimed to validate the usability, safety, and overall user satisfaction of a novel enteral nutrition system through summative testing and task analysis.</p><p><b>Methods:</b> A simulation-based, human factors summative study was conducted with 36 participants, including both caregivers and direct users of enteral feeding technology. Participants were recruited across three major cities: Houston, Chicago, and Phoenix. Task analysis focused on critical and non-critical functions of the Luminoah FLOW™ Enteral Nutrition System, while user satisfaction was measured using the System Usability Scale (SUS). The study evaluated successful task completion, potential use errors, and qualitative user feedback.</p><p><b>Results:</b> All critical tasks were completed successfully by 100% of users, with the exception of a cleaning task, which had an 89% success rate. Non-critical tasks reached an overall completion rate of 95.7%, demonstrating the ease of use and intuitive design of the system. The SUS score was exceptionally high, with an average score of 91.5, indicating strong user preference for the device over current alternatives. Furthermore, 91% of participants indicated they would choose the new system over other products in the market.</p><p><b>Conclusion:</b> The innovative portable enteral nutrition system demonstrated excellent usability and safety, meeting the design requirements for its intended user population. High completion rates for critical tasks and an overwhelmingly positive SUS score underscore the system's ease of use and desirability. 
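</p><p>For reference, a 0-100 System Usability Scale value such as the one reported above is conventionally derived from the 10 questionnaire items as in the generic sketch below; the responses shown are hypothetical and are not the study's data.</p><pre><code># Generic SUS scoring sketch (hypothetical respondent): 10 items rated 1-5;
# odd-numbered items contribute (response - 1), even-numbered items (5 - response),
# and the summed contributions are scaled by 2.5 to give a 0-100 score.
def sus_score(responses):
    assert len(responses) == 10
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)   # 0-based index: even i = odd-numbered item
        for i, r in enumerate(responses)
    ]
    return 2.5 * sum(contributions)

print(sus_score([5, 1, 5, 2, 5, 1, 3, 1, 5, 1]))   # 92.5 for this hypothetical respondent
</code></pre><p>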
These findings suggest that the system is a superior option for home enteral feeding, providing both safety and efficiency in real-world scenarios. Further refinements in instructional materials may improve user performance on non-critical tasks.</p><p>Elease Tewalt<sup>1</sup></p><p><sup>1</sup>Phoenix Veterans Affairs Administration, Phoenix, AZ</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Enhanced Recovery After Surgery (ERAS) protocols, including preoperative carbohydrate loading, aim to accelerate recovery by reducing stress responses and insulin resistance. These protocols have been shown to decrease hospital stays, postoperative complications, and healthcare costs. However, there is limited knowledge about the safety and efficacy of ERAS for diabetic patients. Patients with diabetes make up 15% of surgical cases and often have longer hospital stays and more postoperative complications. This study evaluated outcome measures important to patients with diabetes in a non-diabetic population in order to lay the groundwork for future trials that could include diabetic patients in ERAS protocols.</p><p><b>Methods:</b> A retrospective chart review at the Phoenix Veterans Affairs Health Care System compared 24 colorectal surgery patients who received preoperative carbohydrate drinks with 24 who received traditional care. Outcomes assessed included blood glucose (BG) levels, aspiration, and postoperative complications. Additional analyses evaluated adherence, length of hospital stay, and healthcare costs.</p><p><b>Results:</b> The demographics of the two groups were comparable (Table 1). The preoperative BG levels of the carbohydrate loading group (164.6 ± 36.3 mg/dL) were similar to those of the control group (151.8 ± 47.7 mg/dL) (p &gt; 0.05) (Figure 1). The carbohydrate loading group demonstrated lower and more stable postoperative BG levels (139.4 ± 37.5 mg/dL) compared to the control group (157.6 ± 61.9 mg/dL), but this difference was not statistically significant (p &gt; 0.05) (Figure 2). There were no significant differences in aspiration or vomiting between the groups (p &gt; 0.05) (Table 2). The carbohydrate loading group had a shorter average hospital stay by one day, but this difference was not statistically significant (p &gt; 0.05) (Table 2).</p><p><b>Conclusion:</b> Carbohydrate loading as part of ERAS protocols was associated with numerically better postoperative glucose control, no increased risk of complications, and a shorter hospital stay, although these differences did not reach statistical significance. Although diabetic patients were not included in this study, these findings suggest that carbohydrate loading is a safe and potentially effective component of ERAS. Including diabetic patients in ERAS is a logical next step that could significantly improve surgical outcomes for this population. Future research should focus on incorporating diabetic patients to assess the impact of carbohydrate loading on postoperative glucose control, complication rates, length of stay, and healthcare costs.</p><p><b>Table 1.</b> Demographics.</p><p></p><p>The table includes the demographics of the two groups. The study group consists of participants who received the carbohydrate drink, while the control group includes those who received traditional care.</p><p><b>Table 2.</b> Postoperative Outcomes.</p><p></p><p>The table includes the postoperative outcomes of the two groups. 
The study group consists of participants who received the carbohydrate drink, while the control group includes those who received traditional care.</p><p></p><p>The figure shows the preoperative BG levels of the carbohydrate and non-carbohydrate groups, along with a trendline for each group (p &gt; 0.05).</p><p><b>Figure 1.</b> Preoperative BG Levels.</p><p></p><p>The figure shows the postoperative BG levels of the carbohydrate and non-carbohydrate groups, along with a trendline for each group (p &gt; 0.05).</p><p><b>Figure 2.</b> Postoperative BG Levels.</p><p><b>Malnutrition and Nutrition Assessment</b></p><p>Amy Patton, MHI, RD, CNSC, LSSGB<sup>1</sup>; Elisabeth Schnicke, RD, LD, CNSC<sup>2</sup>; Sarah Holland, MSc, RD, LD, CNSC<sup>3</sup>; Cassie Fackler, RD, LD, CNSC<sup>2</sup>; Holly Estes-Doetsch, MS, RDN, LD<sup>4</sup>; Kristen Roberts, PhD, RDN, LD, CNSC, FASPEN, FAND<sup>5</sup>; Christopher Taylor, PhD, RDN<sup>4</sup></p><p><sup>1</sup>The Ohio State University Wexner Medical Center, Westerville, OH; <sup>2</sup>The Ohio State University Wexner Medical Center, Columbus, OH; <sup>3</sup>The Ohio State University Wexner Medical Center, Upper Arlington, OH; <sup>4</sup>The Ohio State University, Columbus, OH; <sup>5</sup>The Ohio State University, Granville, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> The unfavorable impact of malnutrition on hospital outcomes such as longer lengths of stay (LOS), increased falls, and increased hospital readmissions has been well documented in the literature. We aimed to determine whether a different model of care that lowered Registered Dietitian (RD) to patient ratios would lead to increased malnutrition identification and documentation within our facility. We also evaluated the relationship between these metrics and LOS on monitored hospital units.</p><p><b>Methods:</b> In July 2022, two additional RDs were hired at a large 1000+ bed academic medical center as part of a pilot program focused on malnutrition identification, documentation integrity, and staff training. The RD-to-patient ratio was reduced from 1:75 to 1:47 on three units of the hospital. On the pilot units, the RD completed a full nutrition assessment, including a Nutrition Focused Physical Exam (NFPE), for all patients who were identified as “at risk” per hospital nutrition screening policy. Patients who were not identified as “at risk” received a full RD assessment with NFPE by day 7 of admission or per consult request. A malnutrition dashboard was created with assistance from a Quality Data Manager from the Quality and Operations team. This visual graphic allowed us to track and monitor RD malnutrition identification rates by unit and the percentage of patients who had a malnutrition diagnosis captured by the billing and coding team. Data was also pulled from the Electronic Medical Record (EMR) to look at other patient outcomes. In a retrospective analysis we compared the new model of care to the standard model on one of these units.</p><p><b>Results:</b> There was an increase in the RD-identified malnutrition capture rate on the pilot units. On a cardiac care unit, the RD identification rate went from a baseline of 6% in Fiscal Year (FY) 2022 to an average of 12.5% over FY 2023-2024. On two general medicine units, the malnutrition rates identified by RDs nearly doubled during the two-year intervention (Table 1). 
LOS was significantly lower on one of the general medicine intervention floors compared to a control unit (p &lt; 0.001, Cohen's d: 13.8) (Table 2). LOS was reduced on all units analyzed between FY22 and FY23/24. Patients with a malnutrition diagnosis on the control unit had a 15% reduction in LOS from FY22 to FY23/24, compared with a 19% reduction in LOS for those identified with malnutrition on the intervention unit. When comparing intervention versus control units for FY23 and FY24 combined, the intervention unit had a much lower LOS than the control unit.</p><p><b>Conclusion:</b> Dietitian assessments and related interventions may contribute to reducing LOS. Reducing RD-to-patient ratios may allow for greater identification of malnutrition and support patient outcomes such as LOS. There is an opportunity to evaluate other patient outcomes for the pilot units, including falls, readmission rates and Case Mix Index.</p><p><b>Table 1.</b> RD Identified Malnutrition Rates on Two General Medicine Pilot Units.</p><p></p><p><b>Table 2.</b> Control Unit and Intervention Unit Length of Stay Comparison.</p><p></p><p>Amy Patton, MHI, RD, CNSC, LSSGB<sup>1</sup>; Misty McGiffin, DTR<sup>2</sup></p><p><sup>1</sup>The Ohio State University Wexner Medical Center, Westerville, OH; <sup>2</sup>The Ohio State University Wexner Medical Center, Columbus, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Delays in identifying patients at nutrition risk can impact patient outcomes. Per facility policy, Dietetic Technicians (DTs) and Registered Dietitians (RDs) review patient charts, meet with patients, and assign a nutrition risk level. High-risk patients are then assessed by the RD for malnutrition among other nutrition concerns. Based on a new tracking process implemented in January of 2023, an average of 501 patient nutrition risk assignments were overdue or incomplete per month from January through April of 2023, with a dramatic increase to 835 in May. Missed risk assignments occur daily. If the risk assignment is not completed within the 72-hour parameters of the policy, this can result in late or missed RD assessment opportunities and policy compliance concerns.</p><p><b>Methods:</b> In June 2023, a Lean Six Sigma quality improvement project using the DMAIC (Define, Measure, Analyze, Improve, Control) framework was initiated at a 1000+ bed Academic Medical Center with the goal of improving the efficiency of the nutrition risk assignment (NRA) process for RDs and DTs. A secondary goal was to determine whether improved efficiency would also lead to an increase in RD-identified malnutrition based on Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition Indicators of Malnutrition (AAIM) criteria. A group of RDs and DTs met to walk through each of the quality improvement process steps. The problem was defined, and baseline measurements of malnutrition identification rates and missed nutrition risk assignments were analyzed. A fishbone diagram was used to help with root cause analysis, and later a payoff matrix was used to identify potential interventions. The improve phase was implemented in October and November 2023 and included changes to the screening policy itself and redistribution of clinical nutrition staff on certain patient units.</p><p><b>Results:</b> Identified improvements did have a positive impact on incomplete work and on malnutrition identification rates. 
Malnutrition identification rates averaged 11.7% from May through October, compared with 12.1% from November through April. The number of missed NRAs decreased from an average of 975 per month (May through October) to 783 per month (November through April), a decrease of 192 per month (20%). An additional quality improvement process cycle is currently underway to further improve these metrics.</p><p><b>Conclusion:</b> Fostering a culture of ongoing improvement presents a significant challenge for both clinicians and leaders. Enhancing nutrition care and boosting clinician efficiency are critical goals. Introducing clinical nutrition leaders to tools designed for quality monitoring and enhancement can lead to better performance outcomes and more effective nutrition care. Tools such as those used for this project, along with PDSA (Plan, Do, Study, Act) projects, are valuable in this process. Involving team members from the beginning of these improvement efforts can also help ensure the successful adoption of practice changes.</p><p><b>Table 1.</b> RD Identified Malnutrition Rates.</p><p></p><p><b>Table 2.</b> Incomplete Nutrition Risk Assignments (NRAs).</p><p></p><p>Maurice Jeanne Aguero, RN, MD<sup>1</sup>; Precy Gem Calamba, MD, FPCP, DPBCN<sup>2</sup></p><p><sup>1</sup>Department of Internal Medicine, Prosperidad, Agusan del Sur; <sup>2</sup>Medical Nutrition Department, Tagum City, Davao del Norte</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition is a strong predictor of mortality and morbidity, poor response to therapy, and reduced quality of life among gastrointestinal (GI) cancer patients. In a tertiary government hospital in Tagum City where a cancer center is present, malnutrition screening among patients with cancer is routine; however, no prior studies have examined the association between nutritional status and quality of life among GI cancer patients at this site. This study aimed to determine whether nutritional status is associated with quality of life among adult Filipino GI cancer patients seeking cancer therapy in a tertiary government hospital.</p><p><b>Methods:</b> A quantitative, observational, cross-sectional, analytical, and predictive survey design was used. The World Health Organization Quality of Life Brief Version (WHOQOL-BREF) questionnaire was used to determine the quality of life of cases. Logistic regression analysis was used to assess the association of demographic, clinical, and nutritional profiles with quality of life among patients with gastrointestinal cancer.</p><p><b>Results:</b> Among respondents (n = 160, mean age 56.4 ± 12 years), the majority were male (61.9%), married (77.5%), and Roman Catholic (81.1%); 38.1% had finished high school. Almost half were diagnosed cases of colon adenocarcinoma (43.125%), followed by rectal adenocarcinoma (22.5%), rectosigmoid adenocarcinoma (11.875%), then GI stromal tumor (5.625%). On cancer staging, 40.625% were Stage 4, followed by Stage 3b (19.375%), Stage 3c (10%), Stage 3a (5.625%), then Stage 2a (4.375%). Only 2.5% were Stage 4a, while 0.625% were Stage 4b. More than one fourth received CAPEOX (38.125%), followed by FOLFOX (25.625%), then IMATINIB (5.625%). Among cases, 15.6% were underweight, while 34.4% were overweight or obese. In terms of SGA grading, 38.1% were severely malnourished and 33.8% moderately malnourished, while the rest were normal to mildly malnourished. 
On quality of life, mean scores per domain were as follows: general quality of life was rated as generally good (3.71 ± 0.93); participants were generally satisfied with their perception of general health, with themselves, and with their relationships with others (3.46 to 3.86 ± 0.97); they reported moderate satisfaction with having enough energy for daily life, accepting their bodily appearance, the availability of information needed for daily living, and the extent of opportunity for leisure (2.71 to 3.36 ± 1.02); and they reported only a little satisfaction with having enough money to meet their needs (2.38 ± 0.92). On average, participants quite often experienced negative feelings such as blue mood, despair, depression and anxiety (2.81 ± 0.79). Significant associations of age (p = 0.047), cancer diagnosis (p = 0.001), BMI status (p = 0.028), and SGA nutritional status (p = 0.010) with quality of life among adult cancer patients were documented.</p><p><b>Conclusion:</b> Nutritional status was significantly associated with quality of life among adult Filipino GI cancer patients seeking cancer therapy in a tertiary government hospital. Public health interventions targeting these factors may play a critical role in improving patient survival and outcomes.</p><p>Carmen Kaman Lo, MS, RD, LDN, CNSC<sup>1</sup>; Hannah Jacobs, OTD, OTR/L<sup>2</sup>; Sydney Duong, MS, RD, LDN<sup>3</sup>; Julie DiCarlo, MS<sup>4</sup>; Donna Belcher, EdD, MS, RD, LDN, CDCES, CNSC, FAND<sup>5</sup>; Galina Gheihman, MD<sup>6</sup>; David Lin, MD<sup>7</sup></p><p><sup>1</sup>Massachusetts General Hospital, Sharon, MA; <sup>2</sup>MedStar National Rehabilitation Hospital, Washington, DC; <sup>3</sup>New England Baptist Hospital, Boston, MA; <sup>4</sup>Center for Neurotechnology and Neurorecovery, Mass General Hospital, Boston, MA; <sup>5</sup>Nutrition and Food Services, MGH, Boston, MA; <sup>6</sup>Harvard Medical School and Mass General Hospital, Boston, MA; <sup>7</sup>Neurocritical Care &amp; Neurorecovery, MGH, Boston, MA</p><p><b>Financial Support:</b> Academy of Nutrition and Dietetics, Dietitian in Nutrition Support Member Research Award.</p><p><b>Background:</b> Nutritional status is a known modifiable factor for optimal recovery in brain injury survivors, yet data on specific benchmarks for optimizing clinical outcomes through nutrition are limited. This pilot study aimed to quantify the clinical, nutritional, and functional characteristics of brain injury survivors from ICU admission to 90 days post-discharge.</p><p><b>Methods:</b> Patients admitted to the Massachusetts General Hospital (MGH) Neurosciences ICU over a 12-month period were enrolled based on these criteria: age 18 years or greater, primary diagnosis of acute brain injury, ICU stay of at least 72 hours, and meeting MGH Neurorecovery Clinic referral criteria for outpatient follow-up with survival beyond 90 days post-discharge. Data were collected from the electronic health record and from the neurorecovery clinic follow-up visit/phone interview. These included patient characteristics, acute clinical outcomes, nutrition intake, and surrogate nutrition and functional scores at admission, discharge, and 90 days post-discharge. Descriptive statistics were used for analysis.</p><p><b>Results:</b> Of 212 admissions during the study period, 50 patients were included in the analysis. The mean APACHE (n = 50), GCS (n = 50), and NIHSS (n = 20) scores were 18, 11 and 15, respectively. 
Seventy-eight percent of the patients required ventilation, with a mean duration of 6.3 days. Mean ICU and post-ICU lengths of stay were 17.4 and 15.9 days, respectively. Eighty percent received nutrition (enteral or oral) within 24-48 hours of ICU admission. Mean ICU energy and protein intakes over the first 7 days were 1128 kcal/day and 60.3 g protein/day, respectively, both 63% of estimated needs. When assessed against ASPEN guidelines, patients with a BMI ≥ 30 received less energy (11.6 vs 14.6 kcal/kg/day) but higher protein (1.04 vs 0.7 g protein/kg/day) than those with a BMI &lt; 30. Twelve percent of patients received less than 50% of their estimated nutritional needs for at least 7 days pre-discharge and were considered nutritionally at risk. Forty-six percent were discharged with long-term enteral access. Only 16% of the patients were discharged to home rather than a rehabilitation facility. By 90 days post-discharge, 32% of the patients were readmitted, with 27% due to stroke. Upon admission, patients’ mean MUST (Malnutrition Universal Screening Tool) and MST (Malnutrition Screening Tool) scores were 0.56 and 0.48, respectively, reflecting that they were at low nutritional risk. By discharge, the mean MUST and MST scores of these patients increased to 1.16 and 2.08, respectively, suggesting these patients had become nutritionally at risk. At 90 days post-discharge, both scores returned to low nutrition risk levels (MUST 0.48 and MST 0.59). All patients’ functional scores, as measured by the modified Rankin scale (mRS), followed a similar pattern: the mean score was 0.1 at admission, 4.2 at discharge, and 2.8 at 90 days post-discharge. The 90 days post-discharge Barthel index was 64.1, indicating moderate dependence in these patients.</p><p><b>Conclusion:</b> This pilot study highlighted key nutritional, clinical, and functional characteristics of brain injury survivors from ICU admission to 90 days post-discharge. Further statistical analysis will help delineate the relationships between nutritional status and clinical and functional outcomes, which may guide future research and practice in ICU nutrition and neurorehabilitation.</p><p>Lavanya Chhetri, BS<sup>1</sup>; Amanda Van Jacob, MS, RDN, LDN, CCTD<sup>1</sup>; Sandra Gomez, PhD, RD<sup>1</sup>; Pokhraj Suthar, MBBS<sup>1</sup>; Sarah Peterson, PhD, RD<sup>1</sup></p><p><sup>1</sup>Rush University Medical Center, Chicago, IL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Identifying frailty in patients with liver disease provides valuable insights into a patient's nutritional status and physical resilience. However, it is unclear whether reduced muscle mass is an important etiology of frailty in liver disease. Identifying the possible connection between frailty and muscle mass may lead to better risk prediction, personalized interventions, and improved outcomes in patient care. The purpose of this study is to determine whether frail patients have a lower skeletal muscle index (SMI) compared to not-frail patients with liver disease undergoing liver transplant evaluation.</p><p><b>Methods:</b> A retrospective, cross-sectional study design was utilized. Patients greater than 18 years of age who underwent a liver transplant evaluation from January 1, 2019, until December 31, 2023, were included if they had a liver frailty index (LFI) assessment completed during the initial liver transplant evaluation and a diagnostic abdominal CT scan completed within 30 days of the initial liver transplant evaluation. 
Demographic data (age, sex, height, and BMI), etiology of liver disease, MELD-Na score, history of diabetes and hepatocellular carcinoma, liver disease complications (ascites, hepatocellular carcinoma, hepatic encephalopathy &amp; esophageal varices), and LFI score were recorded for each patient. LFI was recorded as both a continuous variable and dichotomized into a categorical variable (frail: defined as LFI ≥ 4.5 versus not frail: defined as LFI ≤ 4.4). Cross-sectional muscle area (cm<sup>2</sup>) from the third lumbar region of the CT was quantified; SMI was calculated (cm<sup>2</sup>/height in meters<sup>2</sup>) and low muscle mass was dichotomized into a categorical variable (low muscle mass: defined as SMI ≤ 50 cm<sup>2</sup>/m<sup>2</sup> for males and ≤39 cm<sup>2</sup>/m<sup>2</sup> for females versus normal muscle mass: defined as SMI &gt; 50 cm<sup>2</sup>/m<sup>2</sup> for males and &gt;39 cm<sup>2</sup>/m<sup>2</sup> for females). An independent t-test analysis was used to determine if there is a difference in SMI between patients who are categorized as frail versus not frail.</p><p><b>Results:</b> A total of 104 patients, 57% male with a mean age of 57 ± 10 years and mean of BMI 28.1 ± 6.4 kg/m<sup>2</sup>, were included. The mean MELD-Na score was 16.5 ± 6.9; 25% had a history of hepatocellular carcinoma and 38% had a history of diabetes. The majority of the sample had at least one liver disease complication (72% had ascites, 54% had hepatic encephalopathy, and 67% had varices). The mean LFI score was 4.5 ± 0.9 and 44% were categorized as frail. The mean SMI was 45.3 ± 12.6 cm<sup>2</sup>/m<sup>2</sup> and 52% were categorized as having low muscle mass (males: 63% and females: 38%). There was no difference in SMI between patients who were frail versus not frail (43.5 ± 10.6 versus 47.3 ± 13.9 cm<sup>2</sup>/m<sup>2</sup>, p = 0.06). The difference between SMI by frailty status was reported for males and females, no significance testing was used due to the small sample size. Both frail males (43.5 ± 12.2 versus 48.4 ± 14.9) and females (43.4 ± 9.3 versus 45.2 ± 11.8) had a lower SMI compared to non-frail patients.</p><p><b>Conclusion:</b> No difference in SMI between frail versus not frail patients was observed; however, based on the p-value of 0.06 a marginal trend and possible difference may exist, but further research is needed to confirm the findings. Additionally, it is concerning that men had a higher rate of low muscle mass and the mean SMI for both frail and not frail men was below the cut-off used to identify low muscle mass (SMI ≤ 50 cm<sup>2</sup>/m<sup>2</sup>). Additional research is needed to explore the underlying factors contributing to low muscle mass in men, particularly in frail populations, and to determine whether targeted interventions aimed at improving muscle mass could mitigate frailty and improve clinical outcomes in patients undergoing liver transplant evaluation.</p><p>Rebekah Preston, MS, RD, LD<sup>1</sup>; Keith Pearson, PhD, RD, LD<sup>2</sup>; Stephanie Dobak, MS, RD, LDN, CNSC<sup>3</sup>; Amy Ellis, PhD, MPH, RD, LD<sup>1</sup></p><p><sup>1</sup>The University of Alabama, Tuscaloosa, AL; <sup>2</sup>The University of Alabama at Birmingham, Birmingham, AL; <sup>3</sup>Thomas Jefferson University, Philadelphia, PA</p><p><b>Financial Support:</b> The ALS Association Quality of Care Grant.</p><p><b>Background:</b> Amyotrophic Lateral Sclerosis (ALS) is a progressive neurodegenerative disease. 
Malnutrition is common in people with ALS (PALS) due to dysphagia, hypermetabolism, self-feeding difficulty and other challenges. In many clinical settings, malnutrition is diagnosed using the Academy of Nutrition and Dietetics/American Society of Parenteral and Enteral Nutrition indicators to diagnose malnutrition (AAIM) or the Global Leadership Initiative on Malnutrition (GLIM) criteria. However, little is known about malnutrition assessment practices in ALS clinics. This qualitative study explored how RDs at ALS clinics are diagnosing malnutrition in PALS.</p><p><b>Methods:</b> Researchers conducted 6 virtual focus groups with 22 RDs working in ALS clinics across the United States. Audio recordings were transcribed verbatim, and transcripts imported to NVivo 14 software (QSR International, 2023, Melbourne, Australia). Two research team members independently analyzed the data using deductive thematic analysis.</p><p><b>Results:</b> The AAIM indicators were identified as the most used malnutrition diagnostic criteria. Two participants described using a combination of AAIM and GLIM criteria. Although all participants described performing thorough nutrition assessments, some said they do not formally document malnutrition in the outpatient setting due to lack of reimbursement and feeling as though the diagnosis would not change the intervention. Conversely, others noted the importance of documenting malnutrition for reimbursement purposes. Across all groups, RDs reported challenges in diagnosing malnutrition because of difficulty differentiating between disease- versus malnutrition-related muscle loss. Consequently, several RDs described adapting current malnutrition criteria to focus on weight loss, decreased energy intake, or fat loss.</p><p><b>Conclusion:</b> Overall, RDs agreed that malnutrition is common among PALS, and they conducted thorough nutrition assessments as part of standard care. Among those who documented malnutrition, most used AAIM indicators to support the diagnosis. However, as muscle loss is a natural consequence of ALS, RDs perceived difficulty in assessing nutrition-related muscle loss. This study highlights the need for malnutrition criteria specific for PALS.</p><p><b>Table 1.</b> Themes Related to Diagnosing Malnutrition in ALS.</p><p></p><p>Carley Rusch, PhD, RDN, LDN<sup>1</sup>; Nicholas Baroun, BS<sup>2</sup>; Katie Robinson, PhD, MPH, RD, LD, CNSC<sup>1</sup>; Maria Geraldine E. Baggs, PhD<sup>1</sup>; Refaat Hegazi, MD, PhD, MPH<sup>1</sup>; Dominique Williams, MD, MPH<sup>1</sup></p><p><sup>1</sup>Abbott Nutrition, Columbus, OH; <sup>2</sup>Miami University, Oxford, OH</p><p><b>Financial Support:</b> This study was supported by Abbott Nutrition.</p><p><b>Background:</b> Malnutrition is increasingly recognized as a condition that is present in all BMI categories. Although much research to date has focused on malnutrition in patients with lower BMIs, there is a need for understanding how nutrition interventions may alter outcomes for those with higher BMIs and existing comorbidities. 
In a post-hoc analysis of the NOURISH trial, which investigated hospitalized older adults with malnutrition, we sought to determine whether consuming a specialized ONS containing high energy, protein and beta-hydroxy-beta-methylbutyrate (ONS+HMB) can improve vitamin D and nutritional status in those with a BMI ≥ 27.</p><p><b>Methods:</b> Using data from the NOURISH trial, a randomized, placebo-controlled, multi-center, double-blind study conducted in hospitalized participants with malnutrition and a primary diagnosis of congestive heart failure, acute myocardial infarction, pneumonia, or chronic obstructive pulmonary disease, post-hoc analysis was conducted. In the trial, participants received standard care with either ONS + HMB or a placebo beverage (target 2 servings/day) during hospitalization and for 90 days post-discharge. Nutritional status was assessed using Subjective Global Assessment (SGA) and handgrip strength at baseline, 0-, 30-, 60- and 90-days post-discharge. Vitamin D (25-hydroxyvitamin D) was assessed within 72 hours of admission (baseline), 30- and 60-days post-discharge. Participants with a BMI ≥ 27 formed the analysis cohort. Treatment effect was determined using analysis of covariance adjusted for baseline measures.</p><p><b>Results:</b> The post-hoc cohort consisted of 166 patients with a BMI ≥ 27, mean age 76.41 ± 8.4 years, and who were predominantly female (51.2%). Baseline handgrip strength (n = 137) was 22.3 ± 0.8 kg while serum concentrations of 25-hydroxyvitamin D (n = 138) was 26.0 ± 1.14 ng/mL. At day 90, ONS+HMB improved nutritional status in which 64% of ONS+HMB group were well-nourished (SGA-A) vs. 37% of the control group (p = 0.011). There was a trend towards higher changes in handgrip strength with ONS+HMB during index hospitalization (baseline to discharge) compared to placebo (least squares means ± standard error: 1.34 kg ± 0.35 vs. 0.41 ± 0.39; p = 0.081) but was not significant at all other timepoints. Vitamin D concentrations were significantly higher at day 60 in those receiving ONS + HMB compared to placebo (29.7 ± 0.81 vs. 24.8 ± 0.91; p &lt; 0.001).</p><p><b>Conclusion:</b> Hospitalized older patients with malnutrition and a BMI ≥ 27 had significant improvements in their vitamin D and nutritional status at day 60 and 90, respectively, if they received standard care + ONS+HMB as compared to placebo. This suggests transitions of care post-acute setting should consider continuation of nutrition for patients with elevated BMI and malnutrition using interventions such as ONS+HMB in combination with standard care.</p><p>Aline Dos Santos<sup>1</sup>; Isis Helena Buonso<sup>2</sup>; Marisa Chiconeli Bailer<sup>2</sup>; Maria Fernanda Jensen Kok<sup>2</sup></p><p><sup>1</sup>Hospital Samaritano Higienópolis, São Paulo; <sup>2</sup>Hospital Samaritano Higienopolis, São Paulo</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition negatively impacts the length of hospital stay, infection rate, mortality, clinical complications, hospital readmission and also on average healthcare costs. It is believed that early nutritional interventions could reduce negative events and generate economic impact. 
Therefore, our objective was to evaluate the average cost of hospitalization of patients at nutritional risk through nutritional screening with indication of oral nutritional supplementation.</p><p><b>Methods:</b> Retrospective study including 110 adult patients hospitalized in a private institution admitted between August 2023 and January 2024. Nutritional screening was performed within 24 hours of admission. To classify low muscle mass according to calf circumference (CC), the cutoff points were considered: 33 cm for women and 34 cm for men, measured within 96 hours of hospital admission. They were evaluated in a grouped manner considering G1 patients with an indication for oral supplementation (OS) but not started for modifiable reasons, G2 patients with an indication for OS and started assertively (within 48 hours of the therapeutic indication) and G3 patients with an indication for OS but started late (after 48 hours of therapeutic indication) and G4 the joining of patients from G1 and G3 as both did not receive OS assertively. Patients receiving enteral or parenteral nutritional therapy were excluded.</p><p><b>Results:</b> G2 was prevalent in the studied sample (51%), had an intermediate average length of stay (20.9 days), lower average daily hospitalization cost, average age of 71 years, significant prevalence of low muscle mass (56%) and lower need for hospitalization in intensive care (IC) (63%) with an average length of stay (SLA) in IC of 13.5 days. G1 had a lower prevalence (9%), shorter average length of stay (16 days), average daily cost of hospitalization 41% higher than G2, average age of 68 years, unanimity of adequate muscle mass (100%) and considerable need for hospitalization in intensive care (70%), but with a SLA in IC of 7.7 days. G3 represented 40% of the sample studied, had a longer average length of stay (21.5 days), average daily cost of hospitalization 22% higher than G2, average age of 73 years, significant prevalence of low muscle mass (50%) and intermediate need for hospitalization in intensive care (66%) but with SLA in IC of 16.5 days. Compared to G2, G4 presented a similar sample group (G2: 56 patients and G4: 54 patients) as well as mean age (72 years), hospitalization (20.55 days), hospitalization in IC (66%), SLA in IC (64.23%) but higher average daily hospitalization cost (39% higher than G2) and higher prevalence of patients with low muscle mass (59%).</p><p><b>Conclusion:</b> From the results presented, we can conclude that the percentage of patients who did not receive OS and who spent time in the IC was on average 5% higher than the other groups, with unanimously adequate muscle mass in this group, but with the need for supplementation due to clinical conditions, food acceptance and weight loss. More than 50% of patients among all groups except G1 had low muscle mass. Regarding costs, patients supplemented assertively or late cost, respectively, 45% and 29% less compared to patients who did not receive OS. Comparing G2 with G4, the cost remains 39% lower in patients assertively supplemented.</p><p><b>International Poster of Distinction</b></p><p>Daphnee Lovesley, PhD, RD<sup>1</sup>; Rajalakshmi Paramasivam, MSc, RD<sup>1</sup></p><p><sup>1</sup>Apollo Hospitals, Chennai, Tamil Nadu</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition in hospitals, once an underreported issue, has gained significant attention in recent decades. 
This widespread problem negatively impacts recovery, length of stay (LOS), and overall outcomes. This study aimed to evaluate the effectiveness of clinical nutrition practices and the role of a Nutrition Steering Committee (NSC) in addressing and potentially eradicating this persistent problem by improving nutritional care and patient outcomes.</p><p><b>Methods:</b> Consecutive patients admitted to non-critical units of a tertiary care hospital between January 2018 and August 2024 were included in the study. Patient demographics, body mass index (BMI), modified Subjective Global Assessment (mSGA), oral nutritional supplements (ONS) usage, and clinical outcomes were retrospectively extracted from electronic medical records. Data was analyzed using SPSS version 20.0, comparing results before and after the implementation of the NSC.</p><p><b>Results:</b> Out of 239,630 consecutive patients, 139,895 non-critical patients were included, with a mean age of 57.10 ± 15.89 years; 64.3% were men and 35.7% women. The mean BMI was 25.76 ± 4.74 kg/m<sup>2</sup>, and 49.6% of the patients were polymorbid. The majority (25.8%) were admitted with cardiac illness. According to the modified Subjective Global Assessment (mSGA), 87.1% were well-nourished, 12.8% moderately malnourished, and 0.1% severely malnourished. ONS were prescribed for 10% of the population and ONS prescription was highest among underweight (28.4%); Normal BMI (13%); Overweight (9.1%); Obese (7.7%) (p = 0.000) and mSGA – well-nourished (5.5%); moderately malnourished (MM) 41%; severely malnourished (SM) 53.2% (p = 0.000) and pulmonology (23.3%), followed by gastroenterology &amp; Hepatology (19.2%) (p = 0.000). The mean hospital LOS was 4.29 ± 4.03 days, with an overall mortality rate of 1.2%. Severe malnutrition, as rated by mSGA, significantly impacted mortality (0.8% vs. 5.1%, p = 0.000). Mortality risk increased with polymorbidity (0.9% vs. 1.5%) and respiratory illness (2.6%, p = 0.000). Poor nutritional status, as assessed by mSGA (34.7%, 57.4%, 70.9%) and BMI (43.7% vs. 38%), was associated with longer hospital stays (LOS ≥ 4 days, p = 0.000). The implementation of the NSC led to significant improvements - average LOS decreased (4.4 vs. 4.1 days, p = 0.000), and mortality risk was reduced from 1.6% to 0.7% (p = 0.000). No significant changes were observed in baseline nutritional status, indicating effective clinical nutrition practices in assessing patient nutrition. ONS prescriptions increased from 5.2% to 9.7% between 2022 and 2024 (p = 0.000), contributing to the reduction in mortality rates to below 1% after 2022, compared to over 1% before NSC (p = 0.000). A significant negative correlation was found between LOS and ONS usage (p = 0.000). Step-wise binary logistic regression indicated that malnutrition assessed by mSGA predicts mortality with an odds ratio of 2.49 and LOS with an odds ratio of 1.7, followed by ONS prescription and polymorbidity (p = 0.000).</p><p><b>Conclusion:</b> A well-functioning NSC is pivotal in driving successful nutritional interventions and achieving organizational goals. Early identification of malnutrition through mSGA, followed by timely and appropriate nutritional interventions, is essential to closing care gaps and improving clinical outcomes. 
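</p><p>As a generic illustration of how odds ratios such as those reported above arise from a binary logistic regression (a synthetic sketch, not the study's model or data), the fitted coefficients are exponentiated:</p><pre><code># Synthetic sketch only: exponentiating logistic regression coefficients yields odds ratios.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000
malnourished = rng.integers(0, 2, n)          # 1 = moderately/severely malnourished (illustrative flag)
polymorbid   = rng.integers(0, 2, n)          # 1 = polymorbid (illustrative flag)
log_odds = -3.0 + 0.9 * malnourished + 0.4 * polymorbid   # simulated true OR for malnutrition of about exp(0.9), i.e. 2.5
died = rng.binomial(1, 1 / (1 + np.exp(-log_odds)))

X = sm.add_constant(np.column_stack([malnourished, polymorbid]))
fit = sm.Logit(died, X).fit(disp=0)
print(np.exp(fit.params))   # exponentiated coefficients: (intercept, malnutrition OR, polymorbidity OR)
</code></pre><p>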
Strong leadership and governance are critical in driving these efforts, ensuring that the patients receive optimal nutritional support to enhance recovery and reduce mortality.</p><p><b>Table 1.</b> Patient Characteristics: Details of Baseline Anthropometric &amp; Nutritional Status.</p><p></p><p>Baseline details of Anthropometric Measurements and Nutrition Status.</p><p><b>Table 2.</b> Logistic Regression to Predict Hospital LOS and Mortality.</p><p></p><p>Step-wise binary logistic regression indicated that malnutrition assessed by mSGA predicts mortality with an odds ratio of 2.49 and LOS with an odds ratio of 1.7, followed by ONS prescription and polymorbidity (p = 0.000).</p><p></p><p>mSGA-rated malnourished patients stayed longer in the hospital compared to the well-nourished category (p = 0.000)</p><p><b>Figure 1.</b> Nutritional Status (mSGA) Vs Hospital LOS (&gt;4days).</p><p>Hannah Welch, MS, RD<sup>1</sup>; Wendy Raissle, RD, CNSC<sup>2</sup>; Maria Karimbakas, RD, CNSC<sup>3</sup></p><p><sup>1</sup>Optum Infusion Pharmacy, Phoenix, AZ; <sup>2</sup>Optum Infusion Pharmacy, Buckeye, AZ; <sup>3</sup>Optum Infusion Pharmacy, Milton, MA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Food insecurity is when people do not have enough food to eat and do not know where their next meal will come from. In the United States approximately 49 million people relied on food assistance charities in 2022 (data from Feeding America). Patients receiving parenteral nutrition (PN), who may be capable of supplementing with oral intake, may experience food insecurity due to chronic health conditions limiting work capability and total family income. Patients may also experience lack of affordable housing, increased utilities and the burden of medical expenses. Signs of food insecurity may present as weight loss, malnourishment, low energy, difficulty concentrating, or other physical indicators such as edema, chronically cracked lips, dry skin, and itchy eyes. The purpose of this abstract is to highlight two unique patient case presentations where food insecurity prompted clinicians to intervene.</p><p><b>Methods:</b> Patient 1: 50-year-old male with short bowel syndrome (SBS) on long-term PN called the registered dietitian (RD) regarding financial difficulties with feeding the family (see Table 1). The patient and clinician relationship allowed the patient to convey sensitive concerns to the RD regarding inability to feed himself and his family, which resulted in the patient relying on the PN for all nutrition. Due to the food insecurity present, the clinician made changes to PN/hydration to help improve patient's clinical status. Patient 2: A 21-year-old male with SBS on long-term PN spoke with his in-home registered nurse (RN) regarding family's difficulties affording food (see Table 2). The RN informed the clinical team of suspected food insecurity and the insurance case manager (CM) was contacted regarding food affordability. The RD reached out to local community resources such as food banks, food boxes and community programs. A community program was able to assist the patient with meals until patient's aunt started cooking meals for him. This patient did not directly share food insecurity with RD; however, the relationship with the in-home RN proved valuable in having these face-to-face conversations with the patient.</p><p><b>Results:</b> In these two patient examples, difficulty obtaining food affected the patients’ clinical status. 
The clinical team identified food insecurity and the need for further education for the interdisciplinary team. A food insecurity informational handout was created by the RD with an in-service to nursing to help aid recognition of signs (Figure 1) to detect possible food insecurity and potential patient resources available in the community. Figure 2 presents suggested questions to ask a patient if an issue is suspected.</p><p><b>Conclusion:</b> Given the prevalence of food insecurity, routine assessment for signs and symptoms is essential. Home nutrition support teams (including RDs, RNs, pharmacists and care technicians) are positioned to assist in this effort as they have frequent phone and in-home contact with patients and together build a trusted relationship with patients and caregivers. Clinicians should be aware regarding potential social situations which can warrant changes to PN formulations. To approach this sensitive issue thoughtfully, PN infusion providers should consider enhancing patient assessments and promote education across the interdisciplinary team to create awareness of accessible community resources.</p><p><b>Table 1.</b> Patient 1 Information.</p><p></p><p><b>Table 2.</b> Suspected Food Insecurity Timeline.</p><p></p><p></p><p><b>Figure 1.</b> Signs to Detect Food Insecurity.</p><p></p><p><b>Figure 2.</b> Questions to Ask.</p><p><b>Poster of Distinction</b></p><p>Christan Bury, MS, RD, LD, CNSC<sup>1</sup>; Amanda Hodge Bode, RDN, LD<sup>2</sup>; David Gardinier, RD, LD<sup>3</sup>; Roshni Sreedharan, MD, FASA, FCCM<sup>3</sup>; Maria Garcia Luis, MS, RD, LD<sup>4</sup></p><p><sup>1</sup>Cleveland Clinic, University Heights, OH; <sup>2</sup>Cleveland Clinic Foundation, Sullivan, OH; <sup>3</sup>Cleveland Clinic, Cleveland, OH; <sup>4</sup>Cleveland Clinic Cancer Center, Cleveland, OH</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> The Society of Critical Care Medicine, Critical Care Congress in Orlando, FL on February 25<sup>th</sup>.</p><p><b>Publication:</b> Critical Care Medicine.2025;53(1):In press.</p><p><b>Financial Support:</b> Project support provided by Morrison Cleveland Clinic Nutrition Research collaborative.</p><p><b>Background:</b> Hospitalized and critically ill patients who have preexisting malnutrition can have worse outcomes and an increased length of stay (LOS). Currently, Registered Dietitians (RDs) assess malnutrition with a nutrition focused physical exam (NFPE). Recent recommendations encourage the use of body composition tools such as computed tomography (CT) along with the NFPE. Trained RDs can use CT scans to evaluate skeletal muscle mass at the third lumbar vertebrae (L3) and then calculate Skeletal Muscle Index (SMI) and mean Hounsfield Units (HU) to determine muscle size and quality, respectively. This has been validated in various clinical populations and may be particularly useful in the critically ill where the NFPE is difficult. We aim to evaluate if using CT scans in the surgical and critical care population can be a supportive tool to capture a missed malnutrition diagnosis.</p><p><b>Methods:</b> One-hundred and twenty patients admitted to the Cleveland Clinic from 2021-2023 with a malnutrition evaluation that included an NFPE within 2 days of an abdominal CT were evaluated. Of those, 59 patients had a major surgery or procedure completed that admission and were included in the final analysis. 
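</p><p>A minimal sketch of the two quantities described above, not the Terarecon or Veronai workflow: given an already-segmented L3 muscle mask, SMI and mean HU can be computed as follows, with array sizes and values purely illustrative.</p><pre><code># Minimal sketch (illustrative values): SMI and mean HU from a segmented L3 slice.
import numpy as np

def smi_and_mean_hu(muscle_mask, ct_slice_hu, pixel_area_cm2, height_m):
    """muscle_mask: boolean L3 muscle segmentation; ct_slice_hu: same slice in Hounsfield units."""
    muscle_area_cm2 = float(muscle_mask.sum()) * pixel_area_cm2   # cross-sectional muscle area
    smi = muscle_area_cm2 / (height_m ** 2)                       # skeletal muscle index, cm^2/m^2
    mean_hu = float(ct_slice_hu[muscle_mask].mean())              # muscle quality (radiodensity)
    return smi, mean_hu

mask = np.zeros((512, 512), dtype=bool)
mask[200:300, 150:350] = True                                     # toy segmentation
hu_slice = np.full((512, 512), 35.0)                              # toy attenuation values
print(smi_and_mean_hu(mask, hu_slice, pixel_area_cm2=0.0075, height_m=1.70))
</code></pre><p>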
The CT scans were read by a trained RD at L3 using Terarecon, and results were cross-referenced by an artificial intelligence software (AI) called Veronai. Age, sex, BMI, SMI &amp; HU were analyzed, along with the malnutrition diagnosis.</p><p><b>Results:</b> Fifty-nine patients were analyzed. Of these, 61% were male, 51% were &gt;65 years old, and 24% had a BMI &gt; 30. Malnutrition was diagnosed in 47% of patients. A total of 24 patients had no muscle wasting based on the NFPE while CT captured low muscle mass in 58% in that group. Twenty-two percent of patients (13/59) had a higher level of malnutrition severity when using CT. Additionally, poor muscle quality was detected in 71% of patients among all age groups. Notably, there was a 95% agreement between AI and the RD's assessment in detecting low muscle mass.</p><p><b>Conclusion:</b> RDs can effectively analyze CT scans and use SMI and HU with their NFPE. The NFPE alone is not always sensitive enough to detect low muscle in surgical and critically ill patients. The degree of malnutrition dictates nutrition interventions, including the timing and type of nutrition support, so it is imperative to accurately diagnose and tailor interventions to improve outcomes.</p><p><b>Table 1.</b> Change in Malnutrition Diagnosis Using CT.</p><p></p><p>The graph shows the change in Malnutrition Diagnosis when CT was applied in conjunction with the NFPE utilizing ASPEN Guidelines.</p><p><b>Table 2.</b> Muscle Assessment: CT vs NFPE.</p><p></p><p>This graph compares muscle evaluation using both CT and the NFPE.</p><p></p><p>CT scan at 3rd lumbar vertebrae showing normal muscle mass and normal muscle quality in a pt &gt;65 years old.</p><p><b>Figure 1.</b> CT Scans Evaluating Muscle Size and Quality.</p><p></p><p>CT scan at 3rd lumbar vetebrae showing low muscle mass and low muscle quality in a patient with obesity.</p><p><b>Figure 2.</b> CT Scans Evaluating Muscle Size and Quality.</p><p>Elif Aysin, PhD, RDN, LD<sup>1</sup>; Rachel Platts, RDN, LD<sup>1</sup>; Lori Logan, RN<sup>1</sup></p><p><sup>1</sup>Henry Community Health, New Castle, IN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Disease-related malnutrition alters body composition and causes functional decline. In acute care hospitals, 20-50% of inpatients have malnutrition when they are admitted to the hospital. Patients with malnutrition experience higher medical costs, increased mortality, and longer hospital stays. Malnutrition is associated with an increased risk of readmission and complications. Monitoring, diagnosis, treatment, and documentation of malnutrition are important for treating patients. It also contributes to proper Diagnosis Related Group (DRG) coding and accurate CMI (Case Mix Index), which can increase reimbursement.</p><p><b>Methods:</b> After dietitians completed the Nutrition Focused Physical Exam (NFPE) course and sufficient staff had been provided, the malnutrition project was initiated under the leadership of RDNs in our small rural community hospital. The interdisciplinary medical committee approved the project to improve malnutrition screening, diagnosis, treatment practices, and coding. It was decided to use the Academy/American Society of Parenteral and Enteral Nutrition (ASPEN) and Global Leadership Initiative on Malnutrition (GLIM) criteria to diagnose malnutrition. The Malnutrition Screening Tool (MST) was completed by nurses to determine the risk of malnutrition. 
The Nutrition and Dietetics department created a new custom report that provides NPO-Clear-Full liquids patients' reports using the nutrition database. RDNs check NPO-Clear-Full liquids patients' reports, BMI reports, and length of stay (LOS) patient lists on weekdays and weekends. RDNs also performed the NFPE exam to evaluate nutritional status. If malnutrition is identified, RDNs communicate with providers through the hospital messenger system. Providers add malnutrition diagnosis in their documentation and plan of care. RDNs created a dataset and shared it with Coders and Clinical Documentation Integrity Specialists/Care Coordination. We keep track of malnutrition patients. In addition, RDNs spent more time with malnourished patients. They contributed to discharge planning and education.</p><p><b>Results:</b> The prevalence of malnutrition diagnosis and the amount of reimbursement for 2023 were compared to the six months after implementing the malnutrition project. Malnutrition diagnosis increased from 2.6% to 10.8%. Unspecified protein-calorie malnutrition diagnoses decreased from 39% to 1.5%. RDN-diagnosed malnutrition has been documented in provider notes for 82% of cases. The malnutrition diagnosis rate increased by 315% and the malnutrition reimbursement rate increased by 158%. Of those patients identified with malnutrition, 59% received malnutrition DRG code. The remaining 41% of patients received higher major complications and comorbidities (MCCs) codes. Our malnutrition reimbursement increased from ~$106,000 to ~$276,000.</p><p><b>Conclusion:</b> The implementation of evidence-based practice guidelines was key in identifying and accurately diagnosing malnutrition. The provision of sufficient staff with the necessary training and multidisciplinary teamwork has improved malnutrition diagnosis documentation in our hospital, increasing malnutrition reimbursement.</p><p><b>Table 1.</b> Before and After Malnutrition Implementation Results.</p><p></p><p></p><p><b>Figure 1.</b> Prevalence of Malnutrition Diagnosis.</p><p>Elisabeth Schnicke, RD, LD, CNSC<sup>1</sup>; Sarah Holland, MSc, RD, LD, CNSC<sup>2</sup></p><p><sup>1</sup>The Ohio State University Wexner Medical Center, Columbus, OH; <sup>2</sup>The Ohio State University Wexner Medical Center, Upper Arlington, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition is associated with increased length of stay, readmissions, mortality and poor outcomes. Early identification and treatment are essential. The Malnutrition Screening Tool (MST) is a quick, easy tool recommended for screening malnutrition in adult hospitalized patients. This is commonly used with or without additional indicators. We aimed to evaluate our current nutrition screening policy, which utilizes MST, age and body mass index (BMI), to improve malnutrition identification.</p><p><b>Methods:</b> This quality improvement project data was obtained over a 3-month period on 4 different adult services at a large academic medical center. Services covered included general medicine, hepatology, heart failure and orthopedic surgery. Patients were assessed by a Registered Dietitian (RD) within 72hrs of admission if they met the following high-risk criteria: MST score &gt;2 completed by nursing on admission, age ≥65 yrs or older, BMI ≤ 18.5 kg/m<sup>2</sup>. If none of the criteria were met, patients were seen within 7 days of admission or sooner by consult request. 
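A minimal sketch of the high-risk triage logic described above, assuming an MST cutoff of ≥ 2 (the conventional at-risk threshold; the Methods above write MST score &gt;2) and the stated age and BMI thresholds; this is an illustration, not the site's screening implementation:

```python
from dataclasses import dataclass

@dataclass
class Admission:
    mst_score: int      # Malnutrition Screening Tool score from the nursing admission screen
    age_years: int
    bmi_kg_m2: float

def rd_assessment_window_hours(adm: Admission) -> int:
    """Return the target window (hours) for the initial RD assessment.

    High-risk patients (any criterion met) are seen within 72 hours;
    all others within 7 days (168 hours) or sooner by consult request.
    The MST cutoff of >= 2 is an assumption made for this sketch.
    """
    high_risk = (
        adm.mst_score >= 2
        or adm.age_years >= 65
        or adm.bmi_kg_m2 <= 18.5
    )
    return 72 if high_risk else 168

# A 70-year-old with MST 0 and BMI 27 is still flagged as high risk by age alone.
print(rd_assessment_window_hours(Admission(mst_score=0, age_years=70, bmi_kg_m2=27.0)))  # 72
```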
Malnutrition was diagnosed using Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition indicators of malnutrition (AAIM) criteria. Data collected included malnutrition severity and etiology, age, gender, BMI and MST generated on admission.</p><p><b>Results:</b> A total of 239 patients were diagnosed with malnutrition. Table 1 shows detailed characteristics. Malnutrition was seen similarly across gender (51% male, 49% female) and age groups. Age range was 21-92 yrs with an average age of 61 yrs. BMI range was 9.8-50.2 kg/m2 with an average BMI of 24.6 kg/m2. More patients were found to have moderate malnutrition at 61.5% and chronic malnutrition at 54%. When data was stratified by age ≥65 yrs, similar characteristics were seen for malnutrition severity and etiology. Notably, more patients (61.5%) had an MST of &lt; 2 or an incomplete MST compared to patients &lt; 65 yrs of age (56%). There were 181 patients (76%) who met high-risk screening criteria and were seen for an assessment within 72hrs. Seventy patients (39%) were screened only due to age ≥65 yrs. Forty-five (25%) were screened due to MST alone. There were 54 (30%) who met 2 indicators for screening. Only a small number of patients met BMI criteria alone or 3 indicators (6 patients or 3% each).</p><p><b>Conclusion:</b> Utilizing MST alone would have missed over half of patients diagnosed with malnutrition, and the miss rate with MST alone was higher among older adults. Age alone as a screening criterion caught more patients than MST alone did. Adding BMI to screening criteria added very little, and we still missed 24% of patients with our criteria. A multi-faceted tool should be explored to best capture patients.</p><p><b>Table 1.</b> Malnutrition characteristics.</p><p></p><p>*BMI: excludes those with amputations, paraplegia; all patients n = 219, patients ≥65yo n = 106.</p><p>Robin Nuse Tome, MS, RD, CSP, LDN, CLC, FAND<sup>1</sup></p><p><sup>1</sup>Nemours Children's Hospital, DE, Landenberg, PA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition is a global problem impacting patients of all ages, genders, and races. Malnourished hospitalized patients have poorer outcomes, including longer in-hospital lengths of stay, higher rates of death, greater need for home healthcare services, and a higher rate of 30-day readmission. This results in a higher economic burden for the healthcare industry, insurance, and the individual. Malnutrition documentation plays a vital role in capturing the status of the patient and starting the conversation about interventions to address the concern.</p><p><b>Methods:</b> A taskforce made up of physicians, registered dietitians (RDs), and clinical documentation specialists met to discuss strategies to increase documentation of the malnutrition diagnosis to facilitate better conversation about the concern and potential interventions. Options included inbox messages, a best practice alert, a SmartLink between the physician and RD note, and adding the diagnosis to the problem list. 
Of these options, the team selected a SmartLink, which was developed within the Electronic Medical Record (EMR) to link text from the RD note about malnutrition to the physician note and capture the diagnosis of malnutrition, its severity, and the progression of the diagnosis over time.</p><p><b>Results:</b> Preliminary data show that physician documentation of the malnutrition diagnosis, as well as the severity and progression of the diagnosis, increased by 20% in the pilot medical team. Anecdotally, physicians were more aware of the patient's nutrition status with documentation linked to their note, and collaboration between the medical team and the RD to treat malnutrition has increased.</p><p><b>Conclusion:</b> We hypothesize that expanding the practice to the entire hospital will increase documentation of the malnutrition diagnosis in the physician note. This will help increase awareness of the nutrition status of the patient, draw attention and promote collaboration on interventions to treat, and increase billable revenue to the hospital by capturing the documentation of the degree of malnutrition in the physician note.</p><p>David López-Daza, RD<sup>1</sup>; Cristina Posada-Alvarez, Centro Latinoamericano de Nutrición<sup>1</sup>; Alejandra Agudelo-Martínez, Universidad CES<sup>2</sup>; Ana Rivera-Jaramillo, Boydorr SAS<sup>3</sup>; Yeny Cuellar-Fernández, Centro Latinoamericano de Nutrición<sup>1</sup>; Ricardo Merchán-Chaverra, Centro Latinoamericano de Nutrición<sup>1</sup>; María-Camila Gómez-Univio, Centro Latinoamericano de Nutrición<sup>1</sup>; Patricia Savino-Lloreda, Centro Latinoamericano de Nutrición<sup>1</sup></p><p><sup>1</sup>Centro Latinoamericano de Nutrición (Latin American Nutrition Center), Chía, Cundinamarca; <sup>2</sup>Universidad CES (CES University), Medellín, Antioquia; <sup>3</sup>Boydorr SAS, Chía, Cundinamarca</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> The Malnutrition Screening Tool (MST) is a simple and quick instrument designed to identify the risk of malnutrition in various healthcare settings, including home hospitalization. Its use has become widespread due to its ease of application and ability to preliminarily assess the nutritional status of patients. In the context of home care, where clinical resources may be limited, an efficient screening tool is crucial to ensure early interventions that prevent nutritional complications in vulnerable populations. The objective was to evaluate the diagnostic accuracy of the MST in detecting malnutrition among patients receiving home care.</p><p><b>Methods:</b> A diagnostic test study was conducted, collecting sociodemographic data, MST results, and malnutrition diagnoses based on the Global Leadership Initiative on Malnutrition (GLIM) criteria. A positive MST score was defined as a value of 2 or above. Categorical data were summarized using proportions, while quantitative variables were described using measures of central tendency. Sensitivity, specificity, the area under the receiver operating characteristic curve (AUC), positive likelihood ratio (LR+), and negative likelihood ratio (LR-) were estimated along with their respective 95% confidence intervals.</p><p><b>Results:</b> A total of 676 patients were included, with a median age of 82 years (interquartile range: 68-89 years), and 57.3% were female. According to the GLIM criteria, 59.8% of the patients met the criteria for malnutrition. 
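For orientation only, a minimal sketch of how the accuracy metrics reported below follow from a 2×2 table of index test (MST ≥ 2) versus reference standard (GLIM); the counts are hypothetical and are not the study data:

```python
def diagnostic_accuracy(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Sensitivity, specificity, and likelihood ratios from a 2x2 table."""
    sens = tp / (tp + fn)                                     # true positives / all with malnutrition
    spec = tn / (tn + fp)                                     # true negatives / all without malnutrition
    lr_pos = sens / (1 - spec) if spec < 1 else float("inf")  # undefined (infinite) at 100% specificity
    lr_neg = (1 - sens) / spec
    return {"sensitivity": sens, "specificity": spec, "LR+": lr_pos, "LR-": lr_neg}

# Hypothetical counts for illustration only.
print(diagnostic_accuracy(tp=30, fp=5, fn=20, tn=45))
# {'sensitivity': 0.6, 'specificity': 0.9, 'LR+': 6.0..., 'LR-': 0.44...}
```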
The MST classified patients into low risk (62.4%), medium risk (30.8%), and high risk (6.8%). The sensitivity of the MST was 11.4% (95% CI: 8.5-14.9%), while its specificity was 100% (95% CI: 98.7-100%). The positive likelihood ratio (LR+) was 1.75, and the negative likelihood ratio (LR-) was 0.0. The area under the curve was 0.71 (95% CI: 0.69-0.73), indicating moderate discriminative capacity.</p><p><b>Conclusion:</b> While the MST demonstrates extremely high specificity, its low sensitivity limits its effectiveness in accurately identifying malnourished patients in the context of home care. This suggests that, although a positive result is highly reliable, the tool fails to detect a significant number of patients who are malnourished. As a result, while the MST may be useful as an initial screening tool, its use should be complemented with more comprehensive assessments to ensure more precise detection of malnutrition in this high-risk population.</p><p><b>Poster of Distinction</b></p><p>Colby Teeman, PhD, RDN, CNSC<sup>1</sup>; Kaylee Griffith, BS<sup>2</sup>; Karyn Catrine, MS, RDN, LD<sup>3</sup>; Lauren Murray, MS, RD, CNSC, LD<sup>3</sup>; Amanda Vande Griend, BS, MS<sup>2</sup></p><p><sup>1</sup>University of Dayton, Xenia, OH; <sup>2</sup>University of Dayton, Dayton, OH; <sup>3</sup>Premier Health, Dayton, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> The prevalence of malnutrition in critically ill populations has previously been shown to be between 38% and 78%. Previously published guidelines have stated that patients in the ICU should be screened for malnutrition within 24-48 hours, and all patients in the ICU for &gt;48 hours should be considered at high risk for malnutrition. Patients with a malnutrition diagnosis in the ICU have been shown to have poorer clinical outcomes, including longer length of stay, greater readmission rates, and increased mortality. The purpose of the current study was to see if severity of malnutrition impacted the time to initiate enteral nutrition and the time to reach goal enteral nutrition rate in critically ill patients and determine the possible impact of malnutrition severity on clinical outcomes.</p><p><b>Methods:</b> A descriptive, retrospective chart review was conducted in multiple ICU units at a large level I trauma hospital in the Midwest. All participants included in the analysis had been assessed for malnutrition by a registered dietitian according to ASPEN clinical practice guidelines. Exclusion criteria included patients receiving EN prior to the RDN assessment, those who received EN for &lt; 24 hours total, patients on mixed oral and enteral nutrition diets, and patients receiving any parenteral nutrition. Participants were grouped by malnutrition status: no malnutrition (n = 27), moderate malnutrition (n = 22), and severe malnutrition (n = 32). All data were analyzed using SPSS version 29.</p><p><b>Results:</b> There was no difference in primary outcomes (time to EN initiation, time to EN goal rate) by malnutrition status (both p &gt; 0.05). Multiple regression analysis found that neither moderately nor severely malnourished patients were more likely to have enteral nutrition initiation delayed for &gt;48 hours from admission (p &gt; 0.05). Neither ICU LOS nor hospital LOS was different among malnutrition groups (p &gt; 0.05). Furthermore, neither ICU nor hospital mortality was different among malnutrition groups (p &gt; 0.05). 
Among patients who were moderately malnourished, 81.8% required vasopressors, compared to 75% of patients who were severely malnourished, and 44.4% of patients who did not have a malnutrition diagnosis (p = 0.010). 90.9% of moderately malnourished patients required extended time on a ventilator (&gt;72 hours), compared to 59.4% of severely malnourished patients, and 51.9% of patients without a malnutrition diagnosis (p = 0.011).</p><p><b>Conclusion:</b> Although the severity of malnutrition did not impact LOS, readmission, or mortality, malnutrition status did significantly predict greater odds of a patient requiring vasopressors and spending an extended amount of time on a ventilator. Further studies with larger sample sizes are warranted to continue developing a better understanding of the relationship between malnutrition status and clinical outcomes.</p><p>Jamie Grandic, RDN-AP, CNSC<sup>1</sup>; Cindi Stefl, RN, BSN, CCDS<sup>2</sup></p><p><sup>1</sup>Inova Health System, Fairfax Station, VA; <sup>2</sup>Inova Health System, Fairfax, VA</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> Vizient Connections Summit 2024 (Sept 16-19, 2024).</p><p><b>Publication:</b> 2024 Vizient supplement to the American Journal of Medical Quality (AJMQ).</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Research reveals that up to 50% of hospitalized patients are malnourished, yet only 9% of these cases are diagnosed. <sup>(1)</sup> Inadequate diagnosis and intervention of malnutrition can lead to poorer patient outcomes and reduced revenue. Our systemwide malnutrition awareness campaign successfully enhanced dietitian engagement, provider education, and streamlined documentation processes. This initiative resulted in a two-fold increase in the capture of malnutrition codes, a notable rise in malnutrition variable capture, and an average increase in diagnosis-related group relative weight by approximately 0.9. Consequently, there was a ~ 300% increase in revenue associated with accurate malnutrition diagnosis and documentation, alongside improvements in the observed-to-expected (O/E) ratio for mortality and length of stay. Given the critical impact of malnutrition on mortality, length of stay, and costs, enhanced identification programs are essential. Within our health system, a malnutrition identification program has been implemented across five hospitals for several years. System leadership roles for clinical nutrition and clinical documentation integrity (CDI) were established to ensure consistency, implement best practices, and optimize program efficacy. In 2022, a comprehensive analysis identified opportunities for improvement: a low systemwide capture rate (2%), limited awareness of the program's benefits, and inconsistent documentation practices. The leadership team, with the support of our executive sponsors, addressed these issues, engaged with service line leaders, and continue to drive program enhancements.</p><p><b>Methods:</b> A 4-part malnutrition education campaign was implemented: Strengthened collaboration between Clinical Nutrition and CDI, ensuring daily systemwide communication of newly identified malnourished patients. Leadership teams, including coding and compliance, reviewed documentation protocols considering denial risks and regulatory audits. Launched a systemwide dietitian training program with a core RD malnutrition optimization team, a 5-hour comprehensive training, and monthly chart audits aiming for &gt;80% documentation compliance. 
Created a Provider Awareness Campaign featuring interactive presentations on the malnutrition program's benefits and provider documentation recommendations. Developed an electronic health record (EHR) report and a malnutrition EHR tool to standardize documentation. EHR and financial reports were used to monitor program impact and capture rates.</p><p><b>Results:</b> The malnutrition campaign has notably improved outcomes through ongoing education for stakeholders. A malnutrition EHR tool was also created in November 2022. This tool is vital for enhancing documentation, significantly boosting provider and CDI efficiency. Key results include: Dietitian documentation compliance increased from 85% (July 2022) to 95% (2024); RD-identified malnutrition cases increased from 2% (2021) to 16% (2024); Monthly average of final coded malnutrition diagnoses increased from 240 (2021) to 717 (2023); Average DRG relative weight climbed from 1.24 (2021) to 2.17 (2023); Financial impact increased from $5.5 M (2021) to $17.7 M (2024); and LOS O/E improved from 1.04 to 0.94 and mortality O/E improved from 0.77 to 0.62 (2021-2023).</p><p><b>Conclusion:</b> This systemwide initiative not only elevates capture rates and documentation but also enhances overall outcomes. By CDI and RD teams taking on more of a collaborative, leadership role, providers can concentrate more on patient care, allowing these teams to operate at their peak. Looking ahead to 2025, the focus will shift towards leading indicators to refine malnutrition identification and assess the educational campaign's impact further.</p><p>Ryota Sakamoto, MD, PhD<sup>1</sup></p><p><sup>1</sup>Kyoto University, Kyoto</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Growing concern about the environmental impact of meat eating has led to consideration of a shift to a plant-based diet. One nutrient that tends to be particularly deficient in a plant-based diet is vitamin B12. Since vitamin B12 deficiency can cause anemia, limb paresthesia and muscle weakness, and psychiatric symptoms including depression, delirium, and cognitive impairment, it is important to find sustainable local sources of vitamin B12. In this study, we focused on gundruk and sinki, two fermented and preserved vegetables traditionally consumed mainly in Nepal, India, and Bhutan, and investigated their vitamin B12 content. The sinki is mainly made from radish roots, while gundruk is made from green leaves such as mustard leaves, which are preserved through fermentation and sun-drying processes. Previous reports indicated that, in these regions, not only vegetarians and vegans but also a significant number of people may have consistently low meat intake, especially among the poor. The governments and other organizations have been initiating feeding programs to supply fortified foods with vitamins A, B1, B2, B3, B6, B9, B12, iron, and zinc, especially to schools. At this time, however, it is not easy to get fortified foods to residents in the community. It is important to explore the possibility of getting vitamin B12 from locally available products that can be taken by vegetarians, vegans, or the poor in the communities.</p><p><b>Methods:</b> Four samples of gundruk and five samples of sinki were obtained from markets, and the vitamin B12 content in them was determined using Lactobacillus delbrueckii subsp. lactis (Lactobacillus leichmannii) ATCC7830. The lower limit of quantification was set at 0.03 µg/100 g. 
The sample with the highest vitamin B12 concentration in the microbial quantification method was also measured for cyanocobalamin using LC-MS/MS (Shimadzu LC system equipped with Triple Quad 5500 plus AB-Sciex mass spectrometer). The multiple reaction monitoring transition for cyanocobalamin was Q1: 678.3 m/z, Q3: 147.1 m/z.</p><p><b>Results:</b> For gundruk, vitamin B12 was detected in all four samples, with values (from highest to lowest) of 5.0 µg/100 g, 0.13 µg/100 g, 0.12 µg/100 g, and 0.04 µg/100 g. For sinki, it was detected in four of the five samples, with values (from highest to lowest) of 1.4 µg/100 g, 0.41 µg/100 g, 0.34 µg/100 g, and 0.16 µg/100 g. The cyanocobalamin concentration by LC-MS/MS in one sample was estimated to be 1.18 µg/100 g.</p><p><b>Conclusion:</b> According to “Vitamin and mineral requirements in human nutrition (2nd edition) (2004)” by the World Health Organization and the Food and Agriculture Organization of the United Nations, the recommended intake of vitamin B12 is 2.4 µg/day for adults, 2.6 µg/day for pregnant women and 2.8 µg/day for lactating women. The results of this study suggest that gundruk and sinki have potential as a source of vitamin B12, although there is a great deal of variability among samples. In order to use gundruk and sinki as a source of vitamin B12, it may be necessary to find a way to stabilize the vitamin B12 content while focusing on the relationship between vitamin B12 and the different ways of making gundruk and sinki.</p><p>Teresa Capello, MS, RD, LD<sup>1</sup>; Amanda Truex, MS, RRT, RCP, AE-C<sup>1</sup>; Jennifer Curtiss, MS, RD, LD, CLC<sup>1</sup>; Ada Lin, MD<sup>1</sup></p><p><sup>1</sup>Nationwide Children's Hospital, Columbus, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> The metabolic demands of critically ill children are defined by an increase in resting energy expenditure (1,2). Energy needs in the PICU are ever changing, and accurate evaluations are challenging to obtain (3). Predictive equations have been found to be inaccurate, leading to over- or underfeeding, which can cause negative outcomes such as muscle loss and poor healing (underfeeding) or weight gain as adipose tissue (overfeeding) (1,4,5). Indirect calorimetry (IC) is considered the gold standard to assess metabolic demand, especially for critically ill pediatric patients (1,2,4). The use of IC may be limited due to staffing, equipment availability and cost as well as other patient related issues and/or cart specifications (6). In our facility, we identified limited use of IC by dietitians. Most tests were ordered by PICU dietitians and rarely outside the critical care division, even though testing would benefit patients in other divisions of the hospital such as the CTICU, rehab, NICU and stepdown areas. Informal polling of non-PICU dietitians revealed that they had significant uncertainty interpreting data and providing recommendations based on test results. Reasons for uncertainty mostly centered on a lack of familiarity with this technology. The purpose of this study was to develop guidelines and a worksheet for consistently evaluating IC results with the goal of encouraging increased use of indirect calorimetry at our pediatric facility.</p><p><b>Methods:</b> A committee of registered dietitians (RDs) and respiratory therapists (RTs) met in January 2023 and agreed on step-by-step guidelines, which were trialed, reviewed, and updated monthly. 
Finalized guidelines were transitioned to a worksheet to improve consistency of use and aid in interpretation of IC results. A shared file was established for articles about IC as well as access to the guidelines and worksheet (Figures 1 and 2). For this study, IC data from January 1, 2022 to July 31, 2024 was reviewed. This data included number of tests completed and where the orders originated.</p><p><b>Results:</b> Since the guidelines have been implemented, the non-PICU areas using IC data increased from 16% in 2022 to 30% in 2023 and appears to be on track to be the same in 2024 (Figure 3). RDs report an improved comfort level with evaluating test results as well as making recommendations for test ordering.</p><p><b>Conclusion:</b> The standardized guidelines and worksheet increased RD's level of comfort and interpretation of test results. The PICU RDs have become more proficient and comfortable explaining IC during PICU rounds. It is our hope that with the development of the guidelines/worksheet, more non-PICU RDs will utilize the IC testing outside of the critical care areas where longer lengths of stay may occur. IC allows for more individualized nutrition prescriptions. An additional benefit was the mutual exchange of information between disciplines. The RTs provided education on the use of the machine to the RDs. This enhanced RDs understanding of IC test results from the RT perspective. In return, the RDs educated the RTs as to why certain aspects of the patient's testing environment were helpful to report with the results for the RD to interpret the information correctly. The committee continues to meet and discuss patients’ tests to see how testing can be optimized as well as how results may be used to guide nutrition care.</p><p></p><p><b>Figure 1.</b> Screen Capture of Metabolic Cart Shared File.</p><p></p><p><b>Figure 2.</b> IC Worksheet.</p><p></p><p><b>Figure 3.</b> Carts completed per year by unit: 2022 is pre-intervention; 2023 and 2024 are post intervention. Key: H2B = PICU; other areas are non-PICU (H4A = cardiothoracic stepdown, H4B = cardiothoracic ICU, H5B = burn, H8A = pulmonary, H8B = stable trach/vent unit, H10B = Neurosurgery/Neurology, H11B = Nephrology/GI, H12A = Hematology/Oncology, C4A = NICU, C5B = infectious disease).</p><p>Alfredo Lozornio-Jiménez-de-la-Rosa, MD, MSCN<sup>1</sup>; Minu Rodríguez-Gil, MSCN<sup>2</sup>; Luz Romero-Manriqe, MSCN<sup>2</sup>; Cynthia García-Vargas, MD, MSCN<sup>2</sup>; Rosa Castillo-Valenzuela, PhD<sup>2</sup>; Yolanda Méndez-Romero, MD, MSC<sup>1</sup></p><p><sup>1</sup>Colegio Mexicano de Nutrición Clinica y Terapia Nutricional (Mexican College of Clinical Nutrition and Nutritional Therapy), León, Guanajuato; <sup>2</sup>Colegio Mexicano de Nutrición Clínica y Terapia Nutricionalc (Mexican College of Clinical Nutrition and Nutritional Therapy), León, Guanajuato</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Sarcopenia is a systemic, progressive musculoskeletal disorder associated with an increased risk of adverse events and is highly prevalent in older adults. This condition leads to a decline in functionality, quality of life, and has an economic impact. Sarcopenia is becoming increasingly common due to age-related changes, metabolic changes, obesity, sedentary lifestyle, chronic degenerative diseases, malnutrition, and pro-inflammatory states. 
The objective of this study was to investigate the relationship between strength and muscle mass, measured by calf circumference, both corrected for BMI, in young and older Mexican adults.</p><p><b>Methods:</b> This is a prospective, observational, cross-sectional, population-based clinical study conducted among Mexican men and women aged 30 to 90 years old, recruited through convenience sampling. The study was approved by the Ethics Committee of the Aranda de la Parra Hospital in León, Guanajuato, Mexico, and adheres to the Declaration of Helsinki. Informed consent was obtained from all participants after explaining the nature of the study. Inclusion criteria were: Mexican men and women aged 30 to 90 years old who were functionally independent (Katz index category "a"). Participants with amputations, movement disorders, or immobilization devices on their extremities were excluded. The research team was previously standardized in anthropometric measurements. Demographic data and measurements of weight, height, BMI, calf circumference, and grip strength using a dynamometer were collected. Data are presented as average and standard deviation. Spearman's correlation analysis was used to assess the relationship between BMI and calf circumference adjusted for BMI, with grip strength, considering a significance level of p &lt; 0.05.</p><p><b>Results:</b> The results of 1032 subjects were presented, 394 men and 638 women from central Mexico, located in workplaces, recreation centers, and health facilities, aged between 30 and 90 years old. Table 1 shows the distribution of the population in each age category, categorized by sex. Combined obesity and overweight were found in 75.1% of the sample population, with a frequency of 69.2% in men and 78.7% in women; 20% had a normal weight, with 25.6% in men and 16.6% in women, and 4.8% had low BMI, with 5.1% of men and 4.7% of women (Graph 1). The depletion of calf circumference corrected for BMI and age in the female population begins at 50 years old, with exacerbation at 65 years old and older, while in men a greater depletion can be observed from 70 years old onwards (Graph 2). When analyzing strength corrected for BMI and age, grip strength declines from 55 years old, decreasing further as age increases, in both genders (Chi-square = 83.5, p &lt; 0.001) (Graph 3). By Spearman correlation, a strong inverse relationship was found in both genders between age and grip strength; that is, as age increases, grip strength decreases (r = -0.530, p &lt; 0.001). A moderate, negative correlation was found between age and calf circumference: as age increases, calf circumference decreases, independently of BMI (r = -0.365, p &lt; 0.001). Calf circumference and grip strength were positively and moderately related: as calf circumference decreases, grip strength decreases, independently of BMI (r = 0.447, p &lt; 0.0001).</p><p><b>Conclusion:</b> These results show that the study population exhibited a decrease in grip strength, not related to BMI, from early ages, which may increase the risk of early-onset sarcopenia. These findings encourage early assessment of both grip strength and muscle mass, using simple and accessible measurements such as grip strength and calf circumference, adjusted for BMI. 
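As a brief, purely illustrative sketch of the Spearman analysis reported above (synthetic values, not the study data):

```python
import numpy as np
from scipy import stats

# Hypothetical paired measurements: age in years and handgrip strength in kg
# for a handful of participants (illustration only).
age = np.array([35, 42, 50, 58, 66, 74, 82])
grip = np.array([38, 36, 33, 30, 26, 22, 18])

# Rank-based (Spearman) correlation; a negative rho indicates that grip
# strength falls as age rises, as the abstract reports.
rho, p_value = stats.spearmanr(age, grip)
print(f"Spearman rho = {rho:.3f}, p = {p_value:.4f}")
```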
These measurements can be performed in the office during the initial patient encounter or in large populations, as in this study.</p><p><b>Table 1.</b> Distribution of the Population According to Age and Gender.</p><p></p><p>Alison Hannon, Medical Student<sup>1</sup>; Anne McCallister, DNP, CPNP<sup>2</sup>; Kanika Puri, MD<sup>3</sup>; Anthony Perkins, MS<sup>1</sup>; Charles Vanderpool, MD<sup>1</sup></p><p><sup>1</sup>Indiana University School of Medicine, Indianapolis, IN; <sup>2</sup>Indiana University Health, Indianapolis, IN; <sup>3</sup>Riley Hospital for Children at Indiana University Health, Indianapolis, IN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> The Academy of Nutrition and Dietetics and American Society for Parenteral and Enteral Nutrition have published criteria to diagnose a patient with mild, moderate, or severe malnutrition that use either single or multiple data points. Malnutrition is associated with worse clinical outcomes, and previous data from our institution showed that hospitalized children with severe malnutrition are at higher risk of mortality compared to mild and moderate malnutrition. This project aims to determine if differences in clinical outcomes exist in patients with severe malnutrition based on the diagnostic criteria or anthropometrics differences in patients.</p><p><b>Methods:</b> We included all patients discharged from Riley Hospital for Children within the 2023 calendar year diagnosed with severe malnutrition, excluding maternity discharges. Diagnostic criteria used to determine severe malnutrition was collected from registered dietitian (RD) documentation and RD-assigned malnutrition statement within medical records for the admission. Data was collected on readmission rates, mortality, length of stay (LOS), LOS index, cost, operative procedures, pediatric intensive care unit (ICU) admissions, and anthropometrics measured on admission. We used mixed-effects regression or mixed effects logistic regression to test whether the outcome of interest differed by severe malnutrition type. Each model contained a random effect for patient to account for correlation within admissions by the same patient and a fixed effect for severe malnutrition type. All analyses were performed using SAS v9.4.</p><p><b>Results:</b> Data was gathered on 409 patient admissions. 383 admissions had diagnostic criteria clearly defined regarding severity of malnutrition. This represented 327 unique patients (due to readmissions). There was no difference in any measured clinical outcomes based on the criteria used for severe malnutrition, including single or multiple point indicators or patients who met both single and multiple point indicators (Table 1). Anthropometric data was analyzed including weight Z-score (n = 398) and BMI Z-score (n = 180). There was no difference seen in the majority of measured clinical outcomes in admissions with severe malnutrition when comparing based on weight or BMI Z-score categories of Z &lt; -2, -2 &lt; Z &lt; -0.01, or Z &gt; 0 (Table 2). Patients admitted with severe malnutrition and a BMI Z score &gt; 0 had an increase in median cost (p = 0.042) compared to BMI &lt; -2 or between -2 and 0 (Table 2). There was a trend towards increased median cost (p = 0.067) and median LOS (p = 0.104) in patients with weight Z score &gt; 0.</p><p><b>Conclusion:</b> Hospitalized patients with severe malnutrition have similar clinical outcomes regardless of diagnostic criteria used to determine the diagnosis of severe malnutrition. 
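As a rough illustration of the mixed-effects approach described in the Methods above (a random intercept per patient to absorb correlation across readmissions), a minimal Python sketch; the data, variable names, and use of statsmodels rather than SAS are all hypothetical:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical admissions: LOS in days, the severe-malnutrition criterion type,
# and a patient ID shared across readmissions (illustration only).
data = pd.DataFrame({
    "los":        [6, 8, 5, 12, 9, 7, 10, 4, 11, 6, 8, 13, 5, 9],
    "criterion":  ["single", "multiple", "single", "both", "multiple", "single",
                   "both", "single", "multiple", "both", "single", "multiple",
                   "both", "single"],
    "patient_id": [1, 1, 2, 3, 3, 4, 5, 6, 6, 7, 8, 8, 9, 10],
})

# Linear mixed-effects model: fixed effect for criterion type, random intercept
# per patient. On a toy dataset this size the random-effect variance may be
# estimated near zero and a convergence warning is possible.
model = smf.mixedlm("los ~ C(criterion)", data, groups=data["patient_id"])
result = model.fit()
print(result.summary())
```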
Fewer admissions with severe malnutrition (n = 180 or 44%) had sufficient anthropometric data to determine BMI. Based on this data, future programs at our institution aimed at intervention prior to and following admission will need to focus on all patients with severe malnutrition and will not be narrowed based on criteria (single, multiple data point) of severe malnutrition or anthropometrics. Quality improvement projects include improving height measurement and BMI determination upon admission, which will allow for future evaluation of impact of anthropometrics on clinical outcomes.</p><p><b>Table 1.</b> Outcomes by Severe Malnutrition Diagnosis Category.</p><p></p><p>Outcomes by severe malnutrition compared based on the diagnostic criteria used to determine malnutrition diagnosis. Patients with severe malnutrition only are represented. Diagnostic criteria determined based on ASPEN/AND guidelines and defined during admission by registered dietitian (RD). OR = operative room; ICU = intensive care unit; LOS = length of stay. Data on 383 admissions presented, total of 327 patients due to readmissions: 284 patients had 1 admission; 33 patients had 2 admissions; 8 patients had 3 admissions; 1 patient had 4 admissions; 1 patient had 5 admissions</p><p><b>Table 2.</b> Outcomes By BMI Z-score Category.</p><p></p><p>Outcomes of patients admitted with severe malnutrition, stratified based on BMI Z-score. Patients with severe malnutrition only are represented. BMI Z-score determined based on weight and height measurement at time of admission, recorded by bedside admission nurse. OR = operative room; ICU = intensive care unit; LOS = length of stay. Due to incomplete height measurements, data on only 180 admissions was available, total of 158 patients: 142 patients had 1 admission; 12 patients had 2 admissions; 3 patients had 3 admissions; 1 patient had 5 admissions</p><p>Claudia Maza, ND MSc<sup>1</sup>; Isabel Calvo, MD, MSc<sup>2</sup>; Andrea Gómez, ND<sup>2</sup>; Tania Abril, MSc<sup>3</sup>; Evelyn Frias-Toral, MD, MSc<sup>4</sup></p><p><sup>1</sup>Centro Médico Militar (Military Medical Center), Guatemala, Santa Rosa; <sup>2</sup>Hospital General de Tijuana (Tijuana General Hospital), Tijuana, Baja California; <sup>3</sup>Universidad Católica de Santiago de Guayaquil (Catholic University of Santiago de Guayaquil), Guayaquil, Guayas; <sup>4</sup>Universidad Espíritu Santo (Holy Spirit University), Dripping Springs, TX</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition is a common and significant issue among hospitalized patients, particularly in older adults or those with multiple comorbidities. The presence of malnutrition in such patients is associated with an increased risk of morbidity, mortality, prolonged hospital stays, and elevated healthcare costs. A crucial indicator of nutritional status is muscle strength, which can be effectively measured using handgrip strength (HGS). This study aimed to describe the relationship between nutritional status and muscle strength reduction in patients from two hospitals in Latin America.</p><p><b>Methods:</b> A retrospective observational study was conducted from February to May 2022. Data were collected from two hospitals: one in Guatemala and one in Mexico. A total of 169 patients aged 19-98 years were initially considered for the study, and 127 met the inclusion criteria. The sample comprised adult patients of both sexes admitted to internal medicine, surgery, and geriatrics departments. 
Handgrip strength, demographic data, baseline medical diagnosis, weight, and height were recorded at admission and on the 14th day of hospitalization. Exclusion criteria included patients with arm or hand movement limitations, those under sedation or mechanical ventilation, and those hospitalized for less than 24 hours. HGS was measured using JAMAR® and Smedley dynamometers, following standard protocols. Statistical analysis was performed using measures of central tendency, and results were presented in tables and figures.</p><p><b>Results:</b> In the first hospital (Mexico), 62 patients participated, with a predominant female sample. The average weight was 69.02 kg, height 1.62 meters, and BMI 26.14 kg/m² (classified as overweight). The most common admission diagnoses were infectious diseases, nervous system disorders, and digestive diseases. (Table 1) A slight increase in HGS (0.49 kg) was observed between the first and second measurements. (Figure 1) In the second hospital (Guatemala), 62 patients also met the inclusion criteria, with a predominant male sample. The average weight was 65.92 kg, height 1.61 meters, and BMI 25.47 kg/m² (classified as overweight). Infectious diseases and musculoskeletal disorders were the most common diagnoses. (Table 1) HGS decreased by 2 kg between the first and second measurements. (Figure 2) Low HGS was associated with underweight patients and those with class II and III obesity. Patients with normal BMI in both centers exhibited significant reductions in muscle strength, indicating that weight alone is not a sufficient indicator of muscle strength preservation.</p><p><b>Conclusion:</b> This multicenter study highlights the significant relationship between nutritional status and decreased muscle strength in hospitalized patients. While underweight patients showed reductions in HGS, those with class II and III obesity also experienced significant strength loss. These findings suggest that HGS is a valuable, non-invasive tool for assessing both nutritional status and muscle strength in hospitalized patients. Early identification of muscle strength deterioration can help healthcare providers implement timely nutritional interventions to improve patient outcomes.</p><p><b>Table 1.</b> Baseline Demographic and Clinical Characteristics of the Study Population.</p><p></p><p>NS: Nervous System, BMI: Body Mass Index</p><p></p><p><b>Figure 1.</b> Relationship Between Nutritional Status and Handgrip Strength (Center 1 - Mexico).</p><p></p><p><b>Figure 2.</b> Relationship Between Nutritional Status and Handgrip Strength (Center 2 - Guatemala).</p><p>Reem Farra, MDS, RD, CNSC, CCTD<sup>1</sup>; Cassie Greene, RD, CNSC, CDCES<sup>2</sup>; Michele Gilson, MDA, RD, CEDS<sup>2</sup>; Mary Englick, MS, RD, CSO, CDCES<sup>2</sup>; Kristine Thornham, MS, RD, CDE<sup>2</sup>; Debbie Andersen, MS, RD, CEDRD-S, CHC<sup>3</sup>; Stephanie Hancock, RD, CSP, CNSC<sup>4</sup></p><p><sup>1</sup>Kaiser Permanente, Lone Tree, CO; <sup>2</sup>Kaiser Permanente, Denver, CO; <sup>3</sup>Kaiser Permanente, Castle Rock, CO; <sup>4</sup>Kaiser Permanente, Littleton, CO</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Currently, there is no standardized screening required by the Centers for Medicare and Medicaid Services for malnutrition in outpatient settings. This raises concerns about early identification of malnutrition, the likelihood of nutrition interventions, and increased healthcare costs. 
While malnutrition screening in hospitalized patients is well-studied, its impact in outpatient care has not been thoroughly examined. Studies show that malnourished patients are 40% more likely to be readmitted within 30 days of discharge, adding over $10,000 to hospitalization costs. This quality improvement project aimed to evaluate the impact of implementing a standardized malnutrition screening tool for all patients as part of nutrition assessments by Registered Dietitians (RDs).</p><p><b>Methods:</b> The Malnutrition Screening Tool (MST) was chosen due to its validity, sensitivity, and specificity in identifying malnutrition risk in outpatient settings. This tool assesses risk by asking about recent unintentional weight loss and decreased intake due to appetite loss. Based on responses, a score of 0-5 indicates the severity of risk. Scores of 0-1 indicate no risk, while scores of 2-5 indicate a risk for malnutrition. This questionnaire was integrated into the nutrition assessment section of the electronic medical record to standardize screening for all patients. Those with scores of 2 or greater were included, with no exclusions for disease states. Patients were scheduled for follow-ups 2-6 weeks after the initial assessment, during which their MST score was recalculated.</p><p><b>Results:</b> A total of 414 patients were screened, with 175 completing follow-up visits. Of these, 131 showed improvements in their MST scores after nutrition interventions, 12 had score increases, and 32 maintained the same score. Two hundred thirty-nine patients were lost to follow-up for various reasons, including lack of response, limited RD scheduling access, changes in insurance, and mortality. Those with improved MST scores experienced an average cost avoidance of $15,000 each in subsequent hospitalizations. This cost avoidance was due to shorter hospitalizations, better treatment responses, and reduced need for medical interventions. The project raised awareness about the importance of early nutrition interventions among the multidisciplinary team.</p><p><b>Conclusion:</b> This project suggests that standardizing malnutrition screening in outpatient settings could lead to cost avoidance for patients and health systems, along with improved overall care. 
Further studies are needed to identify the best tools for outpatient malnutrition screening, optimal follow-up timelines, and effective nutrition interventions for greatest cost avoidance.</p><p>Amy Sharn, MS, RDN, LD<sup>1</sup>; Raissa Sorgho, PhD, MScIH<sup>2</sup>; Suela Sulo, PhD, MSc<sup>3</sup>; Emilio Molina-Molina, PhD, MSc, MEd<sup>4</sup>; Clara Rojas Montenegro, RD<sup>5</sup>; Mary Jean Villa-Real Guno, MD, FPPS, FPSPGHAN, MBA<sup>6</sup>; Sue Abdel-Rahman, PharmD, MA<sup>7</sup></p><p><sup>1</sup>Abbott Nutrition, Columbus, OH; <sup>2</sup>Center for Wellness and Nutrition, Public Health Institute, Sacramento, CA; <sup>3</sup>Global Medical Affairs and Research, Abbott Nutrition, Chicago, IL; <sup>4</sup>Research &amp; Development, Abbott Nutrition, Granada, Andalucia; <sup>5</sup>Universidad del Rosario, Escuela de Medicina, Bogota, Cundinamarca; <sup>6</sup>Ateneo de Manila University, School of Medicine and Public Health, Metro Manila, National Capital Region; <sup>7</sup>Health Data Synthesis Institute, Chicago, IL</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> American Society for Nutrition, June 29-July 2, Chicago, IL, USA; American Academy of Pediatrics, September 27-October 1, Orlando, FL, USA.</p><p><b>Publication:</b> Sharn AR, Sorgho R, Sulo S, Molina-Molina E, Rojas Montenegro C, Villa-Real Guno MJ, Abdel-Rahman S. Using mid-upper arm circumference z-score measurement to support youth malnutrition screening as part of a global sports and wellness program and improve access to nutrition care. Front Nutr. 2024 Aug 12;11:1423978. doi: 10.3389/fnut.2024.1423978. PMID: 39188981; PMCID: PMC11345244.</p><p><b>Financial Support:</b> This study was financially supported by the Abbott Center for Malnutrition Solutions, Chicago, IL, USA.</p><p>Veeradej Pisprasert, MD, PhD<sup>1</sup>; Kittipadh Boonyavarakul, MD<sup>2</sup>; Sornwichate Rattanachaiwong, MD<sup>3</sup>; Thunchanok Kuichanuan, MD<sup>3</sup>; Pranithi Hongsprabhas, MD<sup>3</sup>; Chingching Foocharoen, MD<sup>3</sup></p><p><sup>1</sup>Faculty of Medicine, Khon Kaen University, Muang Khon Kaen, Khon Kaen; <sup>2</sup>Chulalongkorn University, Bangkok, Krung Thep; <sup>3</sup>Department of Medicine, Khon Kaen University, Muang Khon Kaen, Khon Kaen</p><p><b>Financial Support:</b> Grant supported by Khon Kaen University.</p><p><b>Background:</b> Systemic sclerosis (SSc) is an autoimmune disease in which malnutrition is a common complication, caused by the chronic inflammation of the disease's natural history and/or gastrointestinal tract involvement. Current nutritional assessment tools, e.g., the GLIM criteria, may include data regarding muscle mass measurement for nutritional diagnosis. Anthropometric measurement is a basic method of determining muscle mass; however, data in this condition are limited. This study aimed to determine the utility of assessing muscle mass and muscle function by anthropometric measurement for diagnosing malnutrition in SSc patients.</p><p><b>Methods:</b> A cross-sectional diagnostic study was conducted in adult SSc patients at Srinagarind Hospital, Thailand. All patients were assessed for malnutrition based on the Subjective Global Assessment (SGA). Muscle mass was measured by mid-upper-arm muscle circumference (MUAC) and calf circumference (CC); in addition, muscle function was determined by handgrip strength (HGS).</p><p><b>Results:</b> A total of 208 SSc patients were included, of which 149 were females (71.6%). The mean age and body mass index were 59.3 ± 11.0 years and 21.1 ± 3.9 kg/m², respectively. 
Nearly half (95 cases; 45.7%) were malnourished based on SGA. Mean values of MUAC, CC, and HGS were 25.9 ± 3.83, 31.5 ± 3.81, and 19.0 ± 6.99 kg, respectively. Area under the curve (AUC) of receiver operating characteristic (ROC) curves of MUAC for diagnosing malnutrition was 0.796, of CC was 0.759, and HGS was 0.720. Proposed cut-off values were shown in table 1.</p><p><b>Conclusion:</b> Muscle mass and muscle function were associated with malnutrition. Assessment of muscle mass and/or function by anthropometric measurement may be one part of nutritional assessment in patients with systemic sclerosis.</p><p><b>Table 1.</b> Proposed Cut-Off Values of MUAC, CC, and HGS in Patients With Systemic Sclerosis.</p><p></p><p>CC; calf circumference, HGS; handgrip strength, MUAC; mid-upper-arm circumference.</p><p></p><p><b>Figure 1.</b> ROC Curve of MUAC, CC, and HGS in Diagnosing Malnutrition by Subjective Global Assessment (SGA).</p><p>Trevor Sytsma, BS<sup>1</sup>; Megan Beyer, MS, RD, LDN<sup>2</sup>; Hilary Winthrop, MS, RD, LDN, CNSC<sup>3</sup>; William Rice, BS<sup>4</sup>; Jeroen Molinger, PhDc<sup>5</sup>; Suresh Agarwal, MD<sup>3</sup>; Cory Vatsaas, MD<sup>3</sup>; Paul Wischmeyer, MD, EDIC, FCCM, FASPEN<sup>6</sup>; Krista Haines, DO, MA<sup>3</sup></p><p><sup>1</sup>Duke University, Durham, NC; <sup>2</sup>Duke University School of Medicine- Department of Anesthesiology, Durham, NC; <sup>3</sup>Duke University School of Medicine, Durham, NC; <sup>4</sup>Eastern Virginia Medical School, Norfolk, VA; <sup>5</sup>Duke University Medical Center - Department of Anesthesiology - Duke Heart Center, Raleigh, NC; <sup>6</sup>Duke University Medical School, Durham, NC</p><p><b>Financial Support:</b> Baxter.</p><p><b>Background:</b> Achieving acceptable nutritional goals is a crucial but often overlooked component of postoperative care, impacting patient-important outcomes like reducing infectious complications and shortening ICU length of stay. Predictive resting energy expenditure (pREE) equations poorly correlate with actual measured REE (mREE), leading to potentially harmful over- or under-feeding. International ICU guidelines now recommend the use of indirect calorimetry (IC) to determine mREE and personalized patient nutrition. Surgical stress increases protein catabolism and insulin resistance, but the effect of age on postoperative mREE trends, which commonly used pREE equations do not account for, is not well studied. This study hypothesized that older adults undergoing major abdominal surgery experience slower metabolic recovery than younger patients, as measured by IC.</p><p><b>Methods:</b> This was an IRB-approved prospective trial of adult patients following major abdominal surgery with open abdomens secondary to blunt or penetrating trauma, sepsis, or vascular emergencies. Patients underwent serial IC assessments to guide postoperative nutrition delivery. Assessments were offered within the first 72 hours after surgery, then every 3 ± 2 days during ICU stay, followed by every 7 ± 2 days in stepdown. Patient mREE was determined using the Q-NRG® Metabolic Monitor (COSMED) in ventilator mode for mechanically ventilated patients or the mask or canopy modes, depending on medical feasibility. IC data were selected from ≥ 3-minute intervals that met steady-state conditions, defined by a variance of oxygen consumption and carbon dioxide production of less than 10%. Measurements not meeting these criteria were excluded from the final analysis. 
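A minimal sketch of that steady-state selection step, assuming the stated 10% threshold is applied as a coefficient of variation over rolling ≥ 3-minute windows (an interpretation for illustration, not the authors' exact algorithm):

```python
import numpy as np

def steady_state_mask(vo2: np.ndarray, vco2: np.ndarray,
                      window_min: int = 3, cv_limit: float = 0.10) -> np.ndarray:
    """Flag minutes that fall inside at least one steady-state window.

    A window of `window_min` consecutive minutes is treated as steady state
    when the coefficient of variation (SD/mean) of both VO2 and VCO2 is below
    `cv_limit`; the abstract's "variance ... of less than 10%" is interpreted
    here as a CV threshold (assumption).
    """
    n = len(vo2)
    keep = np.zeros(n, dtype=bool)
    for start in range(n - window_min + 1):
        sl = slice(start, start + window_min)
        cv_o2 = np.std(vo2[sl]) / np.mean(vo2[sl])
        cv_co2 = np.std(vco2[sl]) / np.mean(vco2[sl])
        if cv_o2 < cv_limit and cv_co2 < cv_limit:
            keep[sl] = True
    return keep

# Hypothetical minute-by-minute gas exchange (mL/min): only the stable stretch
# in the middle would contribute to the reported mREE.
vo2 = np.array([310, 240, 238, 241, 239, 242, 300])
vco2 = np.array([260, 200, 198, 201, 199, 202, 250])
print(steady_state_mask(vo2, vco2))
```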
Patients without mREE data at a minimum of two time points in the first nine postoperative days were also excluded. Older adult patients were defined as ≥ 65 years and younger patients as ≤ 50 years. Trends in REE were calculated using the method of least squares and compared using t-tests assuming unequal variance (α = 0.05). Patients' pREE values were calculated from admission anthropometric data using ASPEN-SCCM equations and compared against IC measurements.</p><p><b>Results:</b> Eighteen older and 15 younger adults met pre-specified eligibility criteria and were included in the final analysis. Average rates (± standard error) of REE recovery in older and younger adult patients were 28.9 ± 17.1 kcal/day and 75.5 ± 18.9 kcal/day, respectively, a difference that approached, but did not reach, statistical significance (p = 0.07). The lower and upper bounds of pREE for the older cohort averaged 1728 ± 332 kcal and 2093 ± 390 kcal, respectively, markedly exceeding the mREE obtained by IC for most patients and failing to capture the observed variability identified using mREE. In younger adults, pREE values were closer to IC measurements, with lower and upper averages of 1705 ± 278 kcal and 2084 ± 323 kcal, respectively.</p><p><b>Conclusion:</b> Our data signal a difference in rates of metabolic recovery after major abdominal surgery between younger and older adult patients, but the difference did not reach statistical significance, possibly due to insufficient sample size. Predictive energy equations do not adequately capture changes in REE and may overestimate postoperative energy requirements in older adult patients, failing to appreciate the increased variability in mREE that our study found in older patients. These findings reinforce the importance of using IC to guide nutrition delivery during the early recovery period post-operatively. Larger trials employing IC and quantifying protein metabolism contributions are needed to explore these questions further.</p><p><b>Table 1.</b> Patient Demographics.</p><p></p><p></p><p><b>Figure 1.</b> Postoperative Changes in mREE in Older and Younger Adult Patients Following Major Abdominal Surgery Compared to pREE (ASPEN).</p><p>Amber Foster, BScFN, BSc<sup>1</sup>; Heather Resvick, PhD(c), MScFN, RD<sup>2</sup>; Janet Madill, PhD, RD, FDC<sup>3</sup>; Patrick Luke, MD, FRCSC<sup>2</sup>; Alp Sener, MD, PhD, FRCSC<sup>4</sup>; Max Levine, MD, MSc<sup>5</sup></p><p><sup>1</sup>Western University, Ilderton, ON; <sup>2</sup>LHSC, London, ON; <sup>3</sup>Brescia School of Food and Nutritional Sciences, Faculty of Health Sciences, Western University, London, ON; <sup>4</sup>London Health Sciences Centre, London, ON; <sup>5</sup>University of Alberta, Edmonton, AB</p><p><b>Financial Support:</b> Brescia University College MScFN stipend.</p><p><b>Background:</b> Currently, body mass index (BMI) is used as the sole criterion to determine whether patients with chronic kidney disease (CKD) are eligible for kidney transplantation. However, BMI is not a good measure of health in this patient population, as it does not distinguish between muscle mass, fat mass, and water weight. Individuals with end-stage kidney disease often experience shifts in fluid balance, resulting in fluid retention, swelling, and weight gain. Consequently, the BMI of these patients may be falsely elevated. Therefore, it is vitally important to consider more accurate and objective measures of body composition for this patient population. 
The aim of this study is to determine whether there is a difference in body composition between individuals with CKD who are categorized as having healthy body weight, overweight, or obesity.</p><p><b>Methods:</b> This was a cross-sectional study analyzing body composition of 114 adult individuals with CKD being assessed for kidney transplantation. Participants were placed into one of three BMI groups: healthy weight (group 1, BMI &lt; 24.9 kg/m<sup>2</sup>, n = 29), overweight (group 2, BMI ≥ 24.9-29.9 kg/m<sup>2</sup>, n = 39), or with obesity (group 3, BMI ≥ 30 kg/m<sup>2</sup>, n = 45). Fat mass, lean body mass (LBM), and phase angle (PhA) were measured using bioelectrical impedance analysis (BIA). Standardized phase angle (SPhA), a measure of cellular health, was calculated as [(observed PhA − mean PhA)/standard deviation of PhA]. Handgrip strength (HGS) was measured using a Jamar dynamometer, and quadriceps muscle layer thickness (QMLT) was measured using ultrasonography. Normalized HGS (nHGS) was calculated as [HGS/body weight (kg)], and values were compared to age- and sex-specific standardized cutoff values. Fat-free mass index (FFMI) was calculated as [LBM/(height (m))<sup>2</sup>]. Low FFMI, which may identify high risk of malnutrition, was determined using ESPEN cut-off values of &lt; 17 kg/m<sup>2</sup> for males and &lt; 15 kg/m<sup>2</sup> for females. Frailty status was determined using the validated Fried frailty phenotype assessment tool. Statistical analysis: continuous data were analyzed using one-way ANOVA followed by Tukey post hoc tests, while chi-square tests were used for categorical data (IBM SPSS version 29; significance p &lt; 0.05).</p><p><b>Results:</b> Participants in group 1 were younger than either group 2 (p = 0.004) or group 3 (p &lt; 0.001). There was no significant difference in the distribution of males and females across the three groups. The proportion of participants with FFMI below the cutoff was significantly higher in group 1 (13%) than in group 2 (0%) and group 3 (2.1%) (p = 0.02). A significant difference in nHGS was found between groups, with lower muscle strength occurring more frequently among participants in group 3 (75%) vs 48.7% in group 2 and 28.5% in group 1 (p &lt; 0.001). No significant differences were seen in QMLT, SPhA, HGS, or frailty status between the three BMI groups.</p><p><b>Conclusion:</b> It appears that there is no difference in body composition parameters such as QMLT, SPhA, or frailty status between the three BMI groups. However, patients with CKD categorized as having a healthy BMI were more likely to be at risk for malnutrition. Furthermore, those individuals categorized as having a healthy BMI appeared to have more muscle strength compared to the other two groups. Taken together, these results provide convincing evidence that BMI should not be the sole criterion for listing patients for kidney transplantation. Further research is needed to confirm these findings.</p><p>Kylie Waynick, BS<sup>1</sup>; Katherine Petersen, MS, RDN, CSO<sup>2</sup>; Julie Kurtz, MS, CDCES, RDN<sup>2</sup>; Maureen McCoy, MS, RDN<sup>3</sup>; Mary Chew, MS, RDN<sup>4</sup></p><p><sup>1</sup>Arizona State University and Veterans Healthcare Administration, Phoenix, AZ; <sup>2</sup>Veterans Healthcare Administration, Phoenix, AZ; <sup>3</sup>Arizona State University, Phoenix, AZ; <sup>4</sup>Phoenix VAHCS, Phoenix, AZ</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Malnutrition does not have a standardized definition nor universal identification criteria.
Registered dietitian nutritionists (RDNs) most often diagnose malnutrition based on Academy and ASPEN Identification of Malnutrition (AAIM) criteria, while physicians are required to use International Classification of Diseases Version 10 (ICD-10-CM) codes. However, there are major differences in how malnutrition is identified between the two. The malnutrition ICD-10 codes (E43.0, E44.0, E44.1, E50.0-E64.9) have vague diagnostic criteria, leading providers to rely on clinical expertise and prior nutrition education. For dietitians, AAIM's diagnostic criteria are clearly defined and validated to identify malnutrition based on reduced weight and intake, loss of muscle and fat mass, fluid accumulation, and decrease in physical functioning. Due to this lack of standardization, the process of identifying and diagnosing malnutrition is inconsistent. The purpose of this study was to analyze the congruence of malnutrition diagnosis between physicians and dietitians using the two methods and to compare patient outcomes between the congruent and incongruent groups.</p><p><b>Methods:</b> For this retrospective chart review, records of 668 inpatients assigned a malnutrition diagnostic code were electronically pulled from the Veteran Health Administration's Clinical Data Warehouse for the time periods of April through July in 2019, 2020, and 2021. Length of stay, infection, pressure injury, falls, thirty-day readmissions, and documentation of communication between providers were collected from chart review. Data for cost to the hospital were pulled from Veterans Equitable Resource Allocation (VERA) and matched to the sample by social security number. Chi-square tests were used to compare incongruent and congruent diagnoses with respect to infection, pressure injury, falls, and readmissions. Mean length of stay and cost to the hospital were compared between the two groups using ANOVA in SPSS.</p><p><b>Results:</b> The diagnosis of malnutrition is incongruent between providers. The incongruent group had a higher percentage of adverse patient outcomes than those with a congruent diagnosis. Congruent diagnoses were found to be significantly associated with incidence of documented communication (p &lt; 0.001).</p><p><b>Conclusion:</b> This study showcases a gap in malnutrition patient care. Further research needs to be conducted to understand the barriers to congruent diagnosis and communication between providers.</p><p>Nana Matsumoto, RD, MS<sup>1</sup>; Koji Oba, Associate Professor<sup>2</sup>; Tomonori Narita, MD<sup>3</sup>; Reo Inoue, MD<sup>2</sup>; Satoshi Murakoshi, MD, PhD<sup>4</sup>; Yuki Taniguchi, MD<sup>2</sup>; Kenichi Kono, MD<sup>2</sup>; Midori Noguchi, BA<sup>5</sup>; Seiko Tsuihiji<sup>2</sup>; Kazuhiko Fukatsu, MD, PhD<sup>2</sup></p><p><sup>1</sup>The University of Tokyo, Bunkyo-City, Tokyo; <sup>2</sup>The University of Tokyo, Bunkyo-ku, Tokyo; <sup>3</sup>The University of Tokyo, Chuo-City, Tokyo; <sup>4</sup>Kanagawa University of Human Services, Yokosuka-city, Kanagawa; <sup>5</sup>The University of Tokyo Hospital, Bunkyo-ku, Tokyo</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Therapeutic diets are often prescribed for patients with various disorders, for example, diabetes, renal dysfunction, and hypertension. However, because they restrict the amounts of certain nutrients, therapeutic diets might reduce appetite. Hospital meals maintain patients’ nutritional status when the meals are fully consumed, regardless of diet type.
It is possible that therapeutic diets are at least partly associated with malnutrition in hospital. Therefore, we conducted an exploratory retrospective cohort study to investigate whether there are differences in patients’ oral consumption between therapeutic and regular diets, taking into account other factors.</p><p><b>Methods:</b> The study protocol was approved by the Ethics Committee of the University of Tokyo under protocol No.2023396NI-(1). We retrospectively extracted information from the medical records of patients admitted to the department of orthopedic and spine surgery at the University of Tokyo Hospital between June and October 2022. Eligible patients were older than 20 years and hospitalized for more than 7 days. These patients were provided oral diets as their main source of nutrition. Patients prescribed texture-modified, half, or liquid diets were excluded. The measurements included the percentage of oral food intake at various points during hospitalization (e.g., at admission, before and after surgery, and at discharge), sex, and age. Differences in patients’ oral consumption rates between therapeutic and regular diets were analyzed using a linear mixed-effects model.</p><p><b>Results:</b> A total of 290 patients were analyzed, with 50 patients receiving a therapeutic diet and 240 patients receiving a regular diet at admission. The mean percentage of oral intake was 83.1% for the therapeutic diet and 87.2% for the regular diet, and was consistently 4-6% higher for regular diets than for therapeutic diets at each time point during hospitalization (Figure 1). In a linear mixed-effects model adjusted for sex and age, the mean percentage of oral intake with a regular diet was 4.0% higher than with a therapeutic diet (95% confidence interval [CI], -0.8% to 8.9%; p = 0.100), although the difference did not reach statistical significance. The mean percentage of oral intake in women was 15.6% lower than in men (95% CI, -19.5% to -11.8%). Likewise, older patients’ intake was lower than that of younger patients (difference, -0.2% per year of age; 95% CI, -0.3% to -0.1%).</p><p><b>Conclusion:</b> This exploratory study did not show that therapeutic diets reduce food intake in orthopedic and spine surgery patients compared with regular diets. However, sex and age were important factors affecting food intake. Special attention should be paid to increasing oral food intake in female and/or older patients. Future research will increase the number of patients examined, expand the cohort to other departments, and proceed to a prospective study to identify the factors that truly affect patients’ oral intake during hospitalization.</p><p></p><p><b>Figure 1.</b> The Percentage of Oral Intake During Hospitalization in Each Diet.</p><p>Lorena Muhaj, MS<sup>1</sup>; Michael Owen-Michaane, MD, MA, CNSC<sup>2</sup></p><p><sup>1</sup>Institute of Human Nutrition, Vagelos College of Physicians and Surgeons, Columbia University Irving Medical Center, New York, NY; <sup>2</sup>Columbia University Irving Medical Center, New York, NY</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Muscle mass is crucial for overall health and well-being. Consequently, accurate estimation of muscle mass is essential for diagnosing malnutrition and conditions such as sarcopenia and cachexia. The Kim equation uses biomarker data to estimate muscle mass, but whether this tool provides an accurate estimate in populations with high BMI and kidney disease remains uncertain.
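<p>For orientation only, the sketch below illustrates this type of biomarker-based estimate, using the functional form reproduced later as Equation 1 under Table 1; the constant K belongs to the published equation and is shown here only as a placeholder, and the laboratory values are hypothetical.</p>

```python
# Hypothetical illustration of a Kim-equation-style muscle mass estimate:
#   TBMM = weight * creatinine / (K * weight * cystatin_C + creatinine)
# K is the equation's published constant (not restated in this abstract);
# the value below is a placeholder, as are the laboratory inputs.
def kim_muscle_mass(weight_kg: float, serum_creatinine: float,
                    serum_cystatin_c: float, k: float) -> float:
    """Estimated total body muscle mass in kg."""
    return (weight_kg * serum_creatinine
            / (k * weight_kg * serum_cystatin_c + serum_creatinine))

weight, creatinine, cystatin_c, K = 85.0, 1.2, 1.0, 0.0012   # all placeholders
tbmm = kim_muscle_mass(weight, creatinine, cystatin_c, K)
print(f"estimated muscle mass: {tbmm:.1f} kg "
      f"({100 * tbmm / weight:.0f}% of body weight)")
```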
Therefore, the aim of this study is to assess whether the Kim equation is suitable and reliable for estimating muscle mass and predicting malnutrition, sarcopenia, and related outcomes in a cohort with diverse BMI and kidney disease.</p><p><b>Methods:</b> This is a cross-sectional study using data from the All of Us Research Program. Data on demographics, weight, height, creatinine, cystatin C, and diagnoses of malnutrition, hip fractures, and cachexia were obtained from electronic health records (EHR). The Kim equation was derived from creatinine, cystatin C and weight (Table 1) and compared with established sarcopenia cutoffs for appendicular lean mass (ALM), including ALM/BMI and ALM/height<sup>2</sup>. Malnutrition was identified through specific ICD-10-CM codes recorded in the EHR and participants were categorized based on malnutrition status (with or without malnutrition). Muscle mass and biomarker levels were compared between groups with or without severe/moderate malnutrition, and the relationships of BMI with creatinine and cystatin C levels were analyzed using linear regression. Wilcoxon rank-sum tests were used to assess associations between estimated muscle mass and malnutrition diagnosis.</p><p><b>Results:</b> Baseline characteristics were stratified by gender for comparison. The mean age of participants was 58.2 years (SD = 14.7). The mean BMI was 30.4 kg/m<sup>2</sup> (SD = 7.5). Mean serum creatinine and cystatin C levels were 2.01 mg/dL (SD = 1.82) and 0.18 mg/dL (SD = 0.11), respectively. The mean estimated muscle mass was 80.1 kg (SD = 21), with estimated muscle mass as a percentage of body weight being 92.8%. Mean ALM/BMI was 2.38 (SD = 0.32), while ALM/height<sup>2</sup> value was 25.41 kg/m<sup>2</sup> (SD = 5.6). No participant met the cutoffs for sarcopenia. All calculated variables are summarized in Table 1. In this cohort, &lt; 2% were diagnosed with severe malnutrition and &lt; 2% with moderate malnutrition (Table 2). Muscle mass was lower in participants with severe malnutrition compared to those without (W = 2035, p &lt; 0.05) (Figure 1).</p><p><b>Conclusion:</b> This study suggests the Kim equation overestimates muscle mass in populations with high BMI or kidney disease, as no participants met sarcopenia cutoffs despite the expected prevalence in CKD. This overestimation is concerning given the known risk of low muscle mass in CKD. While lower muscle mass was significantly associated with severe malnutrition (p &lt; 0.05), the Kim equation identified fewer malnutrition cases than expected based on clinical data. Though biomarkers like creatinine and cystatin C may help diagnose malnutrition, the Kim equation may not accurately estimate muscle mass or predict malnutrition and sarcopenia in diverse populations. Further research is needed to improve these estimates.</p><p><b>Table 1.</b> Muscle Mass Metrics Calculations and Diagnosis of Sarcopenia Based on FNIH (ALM/BMI) and EWGSOP (ALM/Height2) Cut-off Values.</p><p></p><p>Abbreviations: BMI-Body Mass Index; TBMM-Total Body Muscle Mass (referred as Muscle Mass as well) (calculated using the Kim Equation); ALM-Appendicular Lean Muscle Mass (using the McCarthy equation); ALM/Height2-Appendicular Lean Muscle Mass adjusted for height square (using EWGSOP cutoffs for diagnosing sarcopenia); ALM/BMI-Appendicular Lean Muscle Mass adjusted for BMI (using FNIH cutoffs for diagnosing sarcopenia). 
Equation 1: Kim equation - Calculated body muscle mass = body weight * serum creatinine/((K * body weight * serum cystatin C) + serum creatinine)</p><p><b>Table 2.</b> Prevalence of Severe and Moderate Malnutrition.</p><p></p><p>(Counts less than 20 suppressed to prevent reidentification of participants).</p><p></p><p><b>Figure 1.</b> Muscle Mass in Groups With and Without Severe Malnutrition.</p><p><b>Poster of Distinction</b></p><p>Robert Weimer, BS<sup>1</sup>; Lindsay Plank, PhD<sup>2</sup>; Alisha Rovner, PhD<sup>1</sup>; Carrie Earthman, PhD, RD<sup>1</sup></p><p><sup>1</sup>University of Delaware, Newark, DE; <sup>2</sup>University of Auckland, Auckland</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Loss of skeletal muscle is common in patients with liver cirrhosis and low muscle mass is internationally recognized as a key phenotypic criterion to diagnose malnutrition.<sup>1,2</sup> Muscle can be clinically assessed through various modalities, although reference data and consensus guidance are limited for the interpretation of muscle measures to define malnutrition in this patient population. The aim of this study was to evaluate the sensitivity and specificity of published sarcopenia cutpoints applied to dual-energy X-ray absorptiometry (DXA) muscle measures to diagnose malnutrition by the GLIM criteria, using in vivo neutron activation analysis (IVNAA) measures of total body protein (TBP) as the reference.</p><p><b>Methods:</b> Adults with liver cirrhosis underwent IVNAA and whole body DXA at the Body Composition Laboratory of the University of Auckland. DXA-fat-free mass (FFM) and appendicular skeletal muscle mass (ASMM, with and without correction for wet bone mass<sup>3</sup>) were measured and indexed to height squared (FFMI, ASMI). The ratio of measured to predicted TBP based on healthy reference data matched for age, sex and height was calculated as a protein index; values less than 2 standard deviations below the mean (&lt; 0.77) were defined as protein depletion (malnutrition). Published cut points from recommended guidelines were evaluated (Table 1).<sup>4-9</sup> DXA values below the cut point were interpreted as ‘sarcopenic’. Sensitivity and specificity for each cut point were determined.</p><p><b>Results:</b> Study sample included 350 adults (238 males/112 females, median age 52 years) with liver cirrhosis and median model for end-stage liver disease (MELD) score 12 (range 5-36). Application of published sarcopenia cutpoints to diagnose malnutrition by DXA in patients with liver cirrhosis had sensitivity ranging from 40.8% - 79.0% and specificity ranging from 79.6% to 94.2% (Table 1). Although all of the selected published cutpoints for DXA-measured ASMI were similar, the Baumgartner<sup>4</sup> and Newman<sup>5</sup> ASMI cutpoints when applied to our DXA-measured ASMI, particularly after correction for wet bone mass, yielded the best combination of sensitivity and specificity for diagnosing malnutrition as identified by protein depletion in patients with liver cirrhosis. The Studentski ASMM cutpoints in Table 1 yielded unacceptably low sensitivity (41-55%).</p><p><b>Conclusion:</b> These findings suggest that the use of the DXA-derived Baumgartner/Newman bone-corrected ASMI cutpoints offer acceptable validity in the diagnosis of malnutrition by GLIM in patients with liver cirrhosis. 
However, given that it is not common practice to make this correction for wet bone mass in DXA measures of ASMI, the application of these cutpoints to standard uncorrected measures of ASMI by DXA would likely yield much lower sensitivity, suggesting that many individuals with low muscularity and malnutrition would be misdiagnosed as non-malnourished when applying these cutpoints.</p><p><b>Table 1.</b> Evaluation of Selected Published Cut-Points for Dual-Energy X-Ray Absorptiometry Appendicular Skeletal Muscle Index to Identify Protein Depletion in Patients with Liver Cirrhosis.</p><p></p><p>Abbreviations: DXA, dual-energy X-ray absorptiometry; M, male; F, female; GLIM, Global Leadership Initiative on Malnutrition; EWGSOP, European Working Group on Sarcopenia in Older People; AWGS, Asia Working Group for Sarcopenia; FNIH, Foundation for the National Institutes of Health; ASMM, appendicular skeletal muscle mass in kg determined from DXA-measured lean soft tissue of the arms and legs; ASMI, ASMM indexed to height in meters-squared; ASMM-BC, ASMM corrected for wet bone mass according to Heymsfield et al 1990; ASMI-BC, ASMI corrected for wet bone mass.</p><p><b>Critical Care and Critical Health Issues</b></p><p>Amir Kamel, PharmD, FASPEN<sup>1</sup>; Tori Gray, PharmD<sup>2</sup>; Cara Nys, PharmD, BCIDP<sup>3</sup>; Erin Vanzant, MD, FACS<sup>4</sup>; Martin Rosenthal, MD, FACS, FASPEN<sup>1</sup></p><p><sup>1</sup>University of Florida, Gainesville, FL; <sup>2</sup>Cincinnati Children, Gainesville, FL; <sup>3</sup>Orlando Health, Orlando, FL; <sup>4</sup>Department of Surgery, Division of Trauma and Acute Care Surgery, College of Medicine, University of Florida, Gainesville, FL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Amino acids (AAs) serve many purposes in the body, including structural, enzymatic, and integral cellular functions. Amino acid utilization and demand may vary between healthy and disease states. Certain conditions such as chronic kidney disease or short bowel syndrome can affect plasma AA levels. Previous research has identified citrulline as a marker of intestinal function and absorptive capacity. Stressors such as surgery or trauma can alter AA metabolism, potentially leading to a hypercatabolic state and changes in the available AA pool. The primary objective of this study was to compare AA levels in patients who have undergone abdominal surgery with those who have not. The secondary endpoint was to describe post-surgical complications and correlate plasma AA levels with such complications.</p><p><b>Methods:</b> This study was a single-center retrospective analysis, conducted between January 1, 2007, and March 15, 2019, of patients who were referred to the University of Florida Health Nutrition Support Team (NST) and had a routine metabolic evaluation with amino acid levels as part of the nutrition support consult. Amino acid data were excluded if the specimen was deemed contaminated. Patients with genetic disorders were also excluded from the study. During the study period, the amino acid bioassay was performed using Bio-chrome ion-exchange chromatography (ARUP Laboratories, Salt Lake City, UT).</p><p><b>Results:</b> Of the 227 patients screened, 181 patients were included in the study (58 who underwent abdominal surgery and 123 who did not). The mean age, BMI, and height of participants were 52.2 years, 25.1 kg/m<sup>2</sup>, and 169 cm, respectively.
Baseline characteristics were similar between the two groups: 31% of the surgery arm had undergone a surgical procedure within a year of the index encounter, 86.5% retained their colon, and 69.2% had a bowel resection, with a mean of 147.6 cm of bowel remaining among those with a documented length of remaining bowel (36 out of 58). Rates of postoperative small bowel obstruction, ileus, leak, abscess, bleeding, and surgical site infection (SSI) were 12.1%, 24%, 17.2%, 20.7%, 3.4%, and 17.2%, respectively. Among the 19 AAs evaluated, median citrulline and methionine levels were significantly different between the 2 groups (23 [14-35] vs 17 [11-23], p = 0.0031, and 27 [20-39] vs 33 [24-51], p = 0.0383, respectively). Alanine and arginine levels were associated with postoperative ileus, leucine levels correlated with SSI, and glutamic acid and glycine levels were linked to postoperative fistula formation.</p><p><b>Conclusion:</b> Most amino acid levels showed no significant differences between patients who underwent abdominal surgery and those who did not, except for citrulline and methionine. Specific amino acids, such as alanine, arginine, leucine, glutamic acid, and glycine, may serve as early indicators of post-surgical complications; however, a larger prospective trial is warranted to validate our findings.</p><p>Kento Kurashima, MD, PhD<sup>1</sup>; Grace Trello<sup>1</sup>; James Fox<sup>1</sup>; Edward Portz<sup>1</sup>; Shaurya Mehta, BS<sup>1</sup>; Austin Sims<sup>1</sup>; Arun Verma, MD<sup>1</sup>; Chandrashekhara Manithody, PhD<sup>1</sup>; Yasar Caliskan, MD<sup>1</sup>; Mustafa Nazzal, MD<sup>1</sup>; Ajay Jain, MD, DNB, MHA<sup>1</sup></p><p><sup>1</sup>Saint Louis University, St. Louis, MO</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Marginal donor livers (MDLs) have been used for liver transplantation to address major organ shortages. However, MDLs are notably susceptible to ischemia/reperfusion injury (IRI). Recent investigations have highlighted ferroptosis, a newly described type of programmed cell death, as a potential contributor to IRI. We hypothesized that modulating ferroptosis with the iron chelator deferoxamine (DFO) could alter the course of IRI.</p><p><b>Methods:</b> Using our novel Perfusion Regulated Organ Therapeutics with Enhanced Controlled Testing (PROTECT) model (US provisional patent, US63/136,165), six human MDLs (livers A to F) were procured and split into paired lobes. Simultaneous perfusion was performed on both lobes, with one lobe subjected to DFO while the other served as an internal control. Histology, serum chemistry, expression of ferroptosis-associated genes, determination of iron accumulation, and measurement of lipid peroxidation were performed.</p><p><b>Results:</b> Histological analysis revealed severe macrovesicular steatosis (&gt;30%) in livers A and D, while livers B and E exhibited mild to moderate macrovesicular steatosis. The majority of samples showed mild inflammation, predominantly in zone 3. No significant necrosis was noted during perfusion. Perls' Prussian blue staining and non-heme iron quantification demonstrated a suppression of iron accumulation in livers A to D with DFO treatment (p &lt; 0.05). Based on the degree of iron chelation, 12 lobes were categorized into two groups: lobes with decreased iron (n = 4) and those with increased iron (n = 8).
Comparative analysis demonstrated that ferroptosis-associated genes (HIF1-alpha, RPL8, IREB2, ACSF2, NQO1) were significantly downregulated in the former (p = 0.0338, p = 0.0085, p = 0.0138, p = 0.0138, p = 0.0209, respectively). Lipid peroxidation was significantly suppressed in lobes with decreased iron (p = 0.02). While serum AST was lower in iron-chelated lobes, this did not reach statistical significance.</p><p><b>Conclusion:</b> This study confirmed that iron accumulation is driven by normothermic perfusion. Reduction of iron content suppressed ferroptosis-associated genes and lipid peroxidation to mitigate IRI. Our results using human MDLs revealed a novel relationship between iron content and ferroptosis, providing a solid foundation for future development of IRI therapeutics.</p><p>Gabriella ten Have, PhD<sup>1</sup>; Macie Mackey, BSc<sup>1</sup>; Carolina Perez, MSc<sup>1</sup>; John Thaden, PhD<sup>1</sup>; Sarah Rice, PhD<sup>1</sup>; Marielle Engelen, PhD<sup>1</sup>; Nicolaas Deutz, PhD, MD<sup>1</sup></p><p><sup>1</sup>Texas A&amp;M University, College Station, TX</p><p><b>Financial Support:</b> Department of Defense: CDMRP PR190829 - ASPEN Rhoads Research Foundation - C. Richard Fleming Grant &amp; Daniel H. Teitelbaum Grant.</p><p><b>Background:</b> Sepsis is a potentially life-threatening complication of infection in critically ill patients and is characterized by severe tissue breakdown in several organs, leading to long-term muscle weakness, fatigue, and reduced physical activity. Recent guidelines suggest increasing protein intake gradually in the first days of recovery from a septic event. It is, however, unclear how this affects whole-body protein turnover. Therefore, in an acute sepsis-recovery pig model, we studied whole-body protein metabolism in the early sepsis recovery phase after restricted feeding with a balanced meal of amino acids (AA).</p><p><b>Methods:</b> In 25 catheterized pigs (± 25 kg), sepsis was induced by IV infusion of live Pseudomonas aeruginosa bacteria (5 × 10<sup>8</sup> CFU/hour). At t = 9 h, recovery was initiated with IV administration of gentamicin. Post sepsis, food was given twice daily in incremental amounts (day 1: 25%, day 2: 50%, day 3: 75%, day &gt;4: 100%). The 100% meals contained, per kg BW, 15.4 g CHO, 3.47 g fat, and a balanced free AA mixture reflecting the muscle AA profile (0.56 g N = 3.9 g AA). Before sepsis (baseline) and on recovery day 3, an amino acid stable isotope mixture was administered IV as a pulse (post-absorptive). Subsequently, arterial blood samples were collected in the postabsorptive state for 2 hours. Amino acid concentrations and enrichments were determined with LCMS. Statistics: RM-ANOVA, <i><b>α</b></i> = 0.05.</p><p><b>Results:</b> At day 3, animal body weight was decreased (2.4 [0.9, 3.9]%, p = 0.0025). Compared to baseline values, plasma AA concentration profiles changed. Overall, the total non-essential AA plasma concentration did not change. Essential AA plasma concentrations of histidine, leucine, methionine, phenylalanine, tryptophan, and valine were lower (p &lt; 0.05) and lysine was higher (p = 0.0027); isoleucine was unchanged. We observed lower whole-body production (WBP) of the non-essential amino acids arginine (p &lt; 0.0001), glutamine (p &lt; 0.0001), glutamate (p &lt; 0.0001), glycine (p &lt; 0.0001), hydroxyproline (p = 0.0041), ornithine (p = 0.0003), taurine (p &lt; 0.0001), and tyrosine (p &lt; 0.0001). Citrulline production did not change.
In addition, lower WBP was observed for the essential amino acids isoleucine (p = 0.0002), leucine (p &lt; 0.0001), valine (p &lt; 0.0001), methionine (p &lt; 0.0001), tryptophan (p &lt; 0.0001), and lysine (p &lt; 0.0001). Whole-body protein breakdown and protein synthesis were also lower (p &lt; 0.0001), while net protein breakdown did not change.</p><p><b>Conclusion:</b> Our sepsis-recovery pig model suggests that food restriction in the early phase of sepsis recovery leads to diminished protein turnover.</p><p>Gabriella ten Have, PhD<sup>1</sup>; Macie Mackey, BSc<sup>1</sup>; Carolina Perez, MSc<sup>1</sup>; John Thaden, PhD<sup>1</sup>; Sarah Rice, PhD<sup>1</sup>; Marielle Engelen, PhD<sup>1</sup>; Nicolaas Deutz, PhD, MD<sup>1</sup></p><p><sup>1</sup>Texas A&amp;M University, College Station, TX</p><p><b>Financial Support:</b> Department of Defense: CDMRP PR190829 - ASPEN Rhoads Research Foundation - C. Richard Fleming Grant &amp; Daniel H. Teitelbaum Grant.</p><p><b>Background:</b> Sepsis is a potentially life-threatening complication of infection in critically ill patients. It is characterized by severe tissue breakdown in several organs, leading to long-term muscle weakness, fatigue, and reduced physical activity (ICU-AW). In an acute sepsis-recovery ICU-AW pig model, we studied whether meals that contain only essential amino acids (EAA) can restore the metabolic deregulations during sepsis recovery, as assessed by comprehensive metabolic phenotyping<sup>1</sup>.</p><p><b>Methods:</b> In 49 catheterized pigs (± 25 kg), sepsis was induced by IV infusion of live Pseudomonas aeruginosa bacteria (5 × 10<sup>8</sup> CFU/hour). At t = 9 h, recovery was initiated with IV administration of gentamicin. Post sepsis, food was given twice daily, blindly and in incremental amounts (day 1: 25%, day 2: 50%, day 3: 75%, day &gt;4: 100%). The 100% meals contained, per kg BW, 15.4 g CHO, 3.47 g fat, and 0.56 g N of either an EAA mixture (reflecting muscle protein EAA; 4.3 g AA) or a control mixture (TAA; 3.9 g AA). Before sepsis (baseline) and on recovery day 7, an amino acid stable isotope mixture was administered IV as a pulse (post-absorptive). Subsequently, arterial blood samples were collected for 2 hours. AA concentrations and enrichments were determined with LCMS. Statistics: RM-ANOVA, <i><b>α</b></i> = 0.05.</p><p><b>Results:</b> A body weight reduction was found after sepsis, which was restored by day 7 post sepsis. Compared to baseline, in the EAA group, increased muscle fatigue (p &lt; 0.0001), tau-methylhistidine whole-body production (WBP; reflecting myofibrillar muscle breakdown, p &lt; 0.0001), and whole-body net protein breakdown (p &lt; 0.0001) were observed, but to a lesser extent in the control group (muscle fatigue: p &lt; 0.0001; tau-methylhistidine: p = 0.0531; net protein breakdown: p &lt; 0.0001). In addition, on day 7, lower WBP was observed for glycine (p &lt; 0.0001), hydroxyproline (p &lt; 0.0001), glutamate (p &lt; 0.0001), glutamine (p &lt; 0.0001), and taurine (p &lt; 0.0001); these reductions were smaller (glycine: p = 0.0014; hydroxyproline: p = 0.0007; glutamate: p = 0.0554) or larger (glutamine: p = 0.0497; taurine: p &lt; 0.0001) in the control group. The WBP of citrulline was increased on day 7 (p = 0.0011), but less so in the control group (p = 0.0078). Higher plasma concentrations of asparagine (p &lt; 0.0001), citrulline (p &lt; 0.0001), glutamine (p = 0.0001), tau-methylhistidine (p = 0.0319), serine (p &lt; 0.0001), taurine (p &lt; 0.0001), and tyrosine (p &lt; 0.0001) were observed in the EAA group.
In the EAA group, amino acid clearance was lower (p &lt; 0.05), except for glycine, tau-methylhistidine, and ornithine.</p><p><b>Conclusion:</b> Our sepsis-recovery pig ICU-AW model shows that feeding EAA-only meals after sepsis is associated with increased muscle and whole-body net protein breakdown and affects non-EAA metabolism. We hypothesize that non-essential amino acids in post-sepsis nutrition are needed to improve protein anabolism.</p><p>Rebecca Wehner, RD, LD, CNSC<sup>1</sup>; Angela Parillo, MS, RD, LD, CNSC<sup>1</sup>; Lauren McGlade, RD, LD, CNSC<sup>1</sup>; Nan Yang, RD, LD, CNSC<sup>1</sup>; Allyson Vasu-Sarver, MSN, APRN-CNP<sup>1</sup>; Michele Weber, DNP, RN, APRN-CNS, APRN-NP, CCRN, CCNS, OCN, AOCNS<sup>1</sup>; Stella Ogake, MD, FCCP<sup>1</sup></p><p><sup>1</sup>The Ohio State University Wexner Medical Center, Columbus, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Patients in the intensive care unit, especially those on mechanical ventilation, frequently receive inadequate enteral nutrition (EN) therapy. Critically ill patients receive on average 40-50% of prescribed nutritional requirements, while ASPEN/SCCM guidelines encourage efforts to provide &gt; 80% of goal energy and protein needs. One method to help achieve this is the use of volume-based feedings (VBF). At our institution, an hourly rate-based feeding (RBF) approach is standard. In 2022, our Medical Intensive Care Unit (MICU) operations committee inquired about the potential benefits of implementing VBF. Before changing our practice, we collected data to assess our current performance in meeting EN goals and to identify the reasons for interruptions. While the literature suggests that VBF is relatively safe in terms of EN complications compared with RBF, to our knowledge there is currently no information on the safety of starting VBF in patients at risk of gastrointestinal (GI) intolerance. Therefore, we also sought to determine whether EN is more frequently held due to GI intolerance versus operative procedures.</p><p><b>Methods:</b> We conducted a retrospective evaluation of EN delivery compared to the EN goal, and of the reason for interruption if EN delivery was below goal, in the MICU of a large tertiary academic medical center. We reviewed ten days of information on any MICU patient on EN. One EN day was defined as the total EN volume received, in milliliters, from 0700 to 0659 hours. Using the QI Assessment Form, we collected the following data: goal EN volume in 24 hours, volume received in 24 hours, percent volume received versus prescribed, and hours feeds were held (or below goal rate) each day. The reasons for holding tube feeds were divided into six categories based on underlying causes: feeding initiation/titration, GI issues (constipation, diarrhea, emesis, nausea, distention, high gastric residual volume), operative procedure, non-operative procedure, mechanical issue, and practice issues. Data were entered into a spreadsheet, and descriptive statistics were used to evaluate results.</p><p><b>Results:</b> MICU patients receiving EN were observed over ten random days in a two-week period in August 2022. Eighty-two patients were receiving EN. Three hundred and four EN days were observed. The average percent of EN delivered was 70% among all patients.
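<p>As a small illustrative aside, the sketch below reproduces the audit arithmetic described in the Methods above (percent of prescribed EN volume delivered per 0700-0659 day and a tally of hold-reason categories) using hypothetical records rather than study data.</p>

```python
# Hypothetical EN-delivery audit: percent of goal volume delivered per day
# and a count of the categorized reasons for holding feeds.
from collections import Counter

def percent_delivered(received_ml: float, goal_ml: float) -> float:
    """Percent of prescribed EN volume actually received in a 0700-0659 day."""
    return 100.0 * received_ml / goal_ml if goal_ml else 0.0

en_days = [  # made-up records
    {"goal_ml": 1440, "received_ml": 1080, "hold_reason": "GI issues"},
    {"goal_ml": 1680, "received_ml": 1680, "hold_reason": None},
    {"goal_ml": 1200, "received_ml": 600,  "hold_reason": "operative procedure"},
]

percents = [percent_delivered(d["received_ml"], d["goal_ml"]) for d in en_days]
hold_reasons = Counter(d["hold_reason"] for d in en_days if d["hold_reason"])
print(f"average % of goal EN delivered: {sum(percents) / len(percents):.0f}%")
print(dict(hold_reasons))
```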
EN was withheld for the following reasons: 34 cases (23%) were related to feeding initiation, 55 (37%) GI issues, 19 (13%) operative procedures, 32 (22%) non-operative procedures, 2 (1%) mechanical issues, and 5 (3%) cases were related to practice issues. VBF could have been considered in 51 cases (35%).</p><p><b>Conclusion:</b> These results suggest that EN delivery in our MICU is most often below prescribed amount due to GI issues and feeding initiation. Together, they comprised 89 cases (60%). VBF protocols would not improve delivery in either case. VBF would likely lead to increased discomfort in patients experiencing GI issues, and feeding initiation can be improved with changes to advancement protocols. Due to VBF having potential benefit in only 35% of cases, as well as observing above average EN delivery, this protocol was not implemented in the observed MICU.</p><p>Delaney Adams, PharmD<sup>1</sup>; Brandon Conaway, PharmD<sup>2</sup>; Julie Farrar, PharmD<sup>3</sup>; Saskya Byerly, MD<sup>4</sup>; Dina Filiberto, MD<sup>4</sup>; Peter Fischer, MD<sup>4</sup>; Roland Dickerson, PharmD<sup>3</sup></p><p><sup>1</sup>Regional One Health, Memphis, TN; <sup>2</sup>Veterans Affairs Medical Center, Memphis, TN; <sup>3</sup>University of Tennessee College of Pharmacy, Memphis, TN; <sup>4</sup>University of Tennessee College of Medicine, Memphis, TN</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> Society for Critical Care Medicine 54th Annual Critical Care Congress. February 23 to 25, 2025, Orlando, FL.</p><p><b>Publication:</b> Critical Care Medicine.2025;53(1):In press.</p><p><b>Financial Support:</b> None Reported.</p><p><b>Best of ASPEN-Critical Care and Critical Health Issues</b></p><p>Megan Beyer, MS, RD, LDN<sup>1</sup>; Krista Haines, DO, MA<sup>2</sup>; Suresh Agarwal, MD<sup>2</sup>; Hilary Winthrop, MS, RD, LDN, CNSC<sup>2</sup>; Jeroen Molinger, PhDc<sup>3</sup>; Paul Wischmeyer, MD, EDIC, FCCM, FASPEN<sup>4</sup></p><p><sup>1</sup>Duke University School of Medicine- Department of Anesthesiology, Durham, NC; <sup>2</sup>Duke University School of Medicine, Durham, NC; <sup>3</sup>Duke University Medical Center - Department of Anesthesiology - Duke Heart Center, Raleigh, NC; <sup>4</sup>Duke University Medical School, Durham, NC</p><p><b>Financial Support:</b> Baxter, Abbott.</p><p><b>Background:</b> Resting energy expenditure (REE) is critical in managing nutrition in intensive care unit (ICU) patients. Accurate energy needs assessments are vital for optimizing nutritional interventions. However, little is known about how different disease states specifically influence REE in ICU patients. Existing energy recommendations are generalized and do not account for the metabolic variability across disease types. Indirect calorimetry (IC) is considered the gold standard for measuring REE but is underutilized. This study addresses this gap by analyzing REE across disease states using metabolic cart assessments in a large academic medical center. The findings are expected to inform more precise, disease-specific nutritional recommendations in critical care.</p><p><b>Methods:</b> This is a pooled analysis of patients enrolled in four prospective clinical trials evaluating ICU patients across a range of disease states. Patients included in this analysis were admitted to the ICU with diagnoses of COVID-19, respiratory failure, cardiothoracic (CT) surgery, trauma, or surgical intensive care conditions. 
All patients underwent IC within 72 hours of ICU admission to assess REE, with follow-up measurements conducted when patients were medically stable. In each study, patients were managed under standard ICU care protocols, and nutritional interventions were individualized or standardized based on clinical trial protocols. The primary outcome was the measured REE, expressed in kcal/day and normalized to body weight (kcal/kg/day). Summary statistics for demographic and clinical characteristics, such as age, gender, height, weight, BMI, and comorbidities, were reported. Comparative analyses across the five disease states were performed using ANOVA tests to determine the significance of differences in REE.</p><p><b>Results:</b> The analysis included 165 ICU patients. The cohort had a mean age of 58 years, with 58% male and 42% female. Racial demographics included 36% black, 52% white, and 10% from other backgrounds. Patients in the surgical ICU group had the lowest caloric requirements, averaging 1503 kcal/day, while COVID-19 patients had the highest calorie needs, at 1982 kcal/day. CT surgery patients measured 1644 kcal/day, respiratory failure patients measured 1763 kcal/day, and trauma patients required 1883 kcal/day. ANOVA demonstrated statistically significant differences in REE between these groups (p &lt; 0.001). When normalized to body weight, REE ranged from 20.3 to 23.5 kcal/kg/day, with statistically significant differences between disease states (p &lt; 0.001).</p><p><b>Conclusion:</b> This study reveals significant variability in REE across different disease states in ICU patients, highlighting the need for disease-specific energy recommendations. These findings indicate that specific disease processes, such as COVID-19 and trauma, may increase metabolic demands, while patients recovering from surgical procedures may have comparatively lower energy needs. These findings emphasize the importance of individualized nutritional interventions based on a patient's disease state to optimize recovery and clinical outcomes and to prevent underfeeding or overfeeding, which can adversely affect patient outcomes. The results suggest IC should be more widely implemented in ICU settings to guide precise and effective nutrition delivery based on real-time metabolic data rather than relying on standard predictive equations. Further research is needed to refine these recommendations and explore continuous monitoring of REE and tailored nutrition needs in the ICU.</p><p><b>Table 1.</b> Demographic and Clinical Characteristics.</p><p></p><p><b>Table 2.</b> Disease Group Diagnoses.</p><p></p><p></p><p><b>Figure 1.</b> Average Measured Resting Energy Expenditure by Disease Group.</p><p>Hailee Prieto, MA, RD, LDN, CNSC<sup>1</sup>; Emily McDermott, MS, RD, LDN, CNSC<sup>2</sup></p><p><sup>1</sup>Northwestern Memorial Hospital, Shorewood, IL; <sup>2</sup>Northwestern Memorial Hospital, Chicago, IL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Communication between Registered Dietitians (RDs) and CTICU teams is non-standardized and often ineffective in supporting proper implementation of RD recommendations. Late or absent RD support can impact the quality of nutrition care provided to patients. In FY23, 58% of CTICU nutrition consults/risks were addressed within 24 hours, and 9% of nutrition consults/risks were missed.
Our goal was to improve the percentage of RD consults/risks addressed within 24 hours from 58% to 75%, in line with our department goal, and to reduce missed RD consults/risks from 9% to 6%, by standardizing communication between RDs and CTICU APRNs. The outcome metrics were nutrition risk turnaround time and nutrition consult turnaround time. The process metric was the percentage of rounds with an RD present.</p><p><b>Methods:</b> We used the DMAIC framework to address our communication issue in the CTICU. We gathered the voice of the customer by surveying the CTICU APRNs and found that a barrier was the RDs' limited presence in the CTICU; the APRNs reported that they find it valuable to have an RD rounding daily with their team. We then conducted a literature search on RDs rounding in the ICU, specifically cardiac/thoracic ICUs, and found that critically ill cardiac surgery patients are at high risk of developing malnutrition; however, initiation of medical nutrition therapy and overall adequacy of nutrition provision are lower compared with noncardiac surgical or MICU patients. RD-written orders directly improve outcomes in patients receiving nutrition support. To have the most influence, RDs need to be present in the ICU and be involved when important decisions are being made. Dietitian involvement in the ICU team significantly improves the team's ability to implement prompt, relevant nutrition support interventions. We used process mapping to address the overlap between CTICU rounding times and rounding on the step-down cardiac floors. We optimized the RDs' daily schedules so they could attend as many rounds as possible, including CTICU rounds. We then implemented a new rounding structure within the Cardiac Service Line based on the literature on the standard of care and the RD role in ICU rounding.</p><p><b>Results:</b> The percentage of nutrition consults/risks addressed within 24 hours increased by 26 percentage points (from 58% to 84%), and missed consults/risks decreased to 1%, exceeding both goals. The number of nutrition interventions we were able to implement increased with more RDs attending rounds, which was tracked after implementation of an RD rounding structure within the CTICU. The comparison of implemented interventions between 1 and 2 RDs was skewed because, on days with only 1 RD, that RD attempted to round with both teams.</p><p><b>Conclusion:</b> Communication between the CTICU team and Clinical Nutrition continues to improve, with consistent positive feedback from the ICU providers regarding the new rounding structure. The new workflow was implemented in the Clinical Nutrition Cardiac Service Line. As a future opportunity, other ICU teams at NMH that do not currently have a dedicated RD rounding with them, due to RD staffing, could also benefit from a dedicated RD in daily rounds.</p><p><b>Table 1.</b> New Rounding Structure.</p><p></p><p>*Critical Care Rounds; Green: Attend; Gold: Unable to attend.</p><p><b>Table 2.</b> Control Plan.</p><p></p><p></p><p><b>Figure 1.</b> Results Consult Risk Turn Around Time Pre &amp; Post Rounding.</p><p></p><p><b>Figure 2.</b> Number of Implemented RD Nutrition Interventions by Number of RDs Rounding.</p><p>Kenny Ngo, PharmD<sup>1</sup>; Rachel Leong, PharmD<sup>2</sup>; Vivian Zhao, PharmD<sup>2</sup>; Nisha Dave, PharmD<sup>2</sup>; Thomas Ziegler, MD<sup>2</sup></p><p><sup>1</sup>Emory Healthcare, Macon, GA; <sup>2</sup>Emory Healthcare, Atlanta, GA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Micronutrients play a crucial role in biochemical processes in the body.
During critical illness, the status of micronutrients can be affected by factors such as disease severity and medical interventions. Extracorporeal membrane oxygenation (ECMO) is a vital supportive therapy that has seen increased utilization for critically ill patients with acute severe refractory cardiorespiratory failure. The potential alterations in micronutrient status and requirements during ECMO are an area of significant interest, but data are limited. This study aimed to determine the incidence of micronutrient depletion in critically ill patients requiring ECMO.</p><p><b>Methods:</b> A retrospective chart review was conducted for patients with at least one micronutrient level measured in blood while receiving ECMO between January 1, 2015, and September 30, 2023. Chart reviews were completed using the Emory Healthcare electronic medical record system after the Emory University Institutional Review Board approved the study. A full waiver for informed consent and authorization was approved for this study. Data on demographic characteristics, ECMO therapy-related information, and reported micronutrient levels were collected. Descriptive statistics were used to evaluate the data.</p><p><b>Results:</b> A total of 77 of the 128 reviewed patients met inclusion criteria and were included in data analysis (Table 1). The average age of patients was 49, and 55.8% were female. The average duration of ECMO was 14 days, and the average length of stay in the intensive care unit was 46.7 days. Among the included patients, 56 required continuous renal replacement therapy (CRRT) along with ECMO. Patients who required CRRT received empiric standard supplementation of folic acid (1 mg), pyridoxine (200 mg), and thiamine (100 mg) every 48 hours. Of the 77 patients, 44% had below-normal blood levels of at least one of the measured micronutrients. The depletion percentages of various nutrients were as follows: vitamin C (80.6%), vitamin D (75.0%), iron (72.7%), copper (53.3%), carnitine (31.3%), selenium (30.0%), pyridoxine (25.0%), folic acid (18.8%), vitamin A (10.0%), zinc (7.1%), and thiamine (3.7%) (Table 2). Measured vitamin B12, manganese, and vitamin E levels were within normal limits.</p><p><b>Conclusion:</b> This study demonstrated that 60% of patients on ECMO had orders to evaluate at least one micronutrient in their blood. Of these, almost half had at least one micronutrient level below normal limits. These findings underscore the need for regular nutrient monitoring for critically ill patients. 
Prospective studies are needed to understand the impact of ECMO on micronutrient status, determine the optimal time for evaluation, and assess the need for and efficacy of supplementation in these patients.</p><p><b>Table 1.</b> General Demographic and ECMO Characteristics (N = 77).</p><p></p><p><b>Table 2.</b> Observed Micronutrient Status during ECMO for Critically Ill Patients.</p><p></p><p>Diane Nowak, RD, LD, CNSC<sup>1</sup>; Mary Kronik, RD, LD, CNSC<sup>2</sup>; Caroline Couper, RD, LD, CNSC<sup>3</sup>; Mary Rath, MEd, RD, LD, CNSC<sup>4</sup>; Ashley Ratliff, MS, RD, LD, CNSC<sup>4</sup>; Eva Leszczak-Lesko, BS Health Sciences, RRT<sup>4</sup></p><p><sup>1</sup>Cleveland Clinic, Elyria, OH; <sup>2</sup>Cleveland Clinic, Olmsted Twp, OH; <sup>3</sup>Cleveland Clinic, Rocky River, OH; <sup>4</sup>Cleveland Clinic, Cleveland, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Indirect calorimetry (IC) is the gold standard for the accurate determination of energy expenditure. The team performed a comprehensive literature review on current IC practices across the nation which showed facilities employing IC typically follow a standard protocol dictated by length of stay (LOS). Intensive Care Unit (ICU) Registered Dietitians (RD) have directed IC intervention to reduce reliance on inaccurate predictive equations and judiciously identify patients (1, 2) with the assistance of IC order-based practice. While IC candidacy is determined by clinical criteria, implementation has been primarily dictated by RD time constraints. Our project aims to include IC in our standard of care by using a standardized process for implementation.</p><p><b>Methods:</b> To implement IC at our 1,299-bed quaternary care hospital, including 249 ICU beds, a multidisciplinary team including ICU RDs and Respiratory Therapists (RT) partnered with a physician champion. Three Cosmed QNRG+ indirect calorimeters were purchased after a 6-month trial period. Due to the potential for rapid clinical status changes and RD staffing, the ICU team selected an order-based practice as opposed to a protocol. The order is signed by a Licensed Independent Practitioner (LIP) and includes three components: an order for IC with indication, a nutrition assessment consult, and a conditional order for the RD to release to RT once testing approved. After the order is signed, the RD collaborates with the Registered Nurse and RT by verifying standardized clinical criteria to assess IC candidacy. If appropriate, the RD will release the order for RT prior to testing to allow for documentation of ventilator settings. To begin the test, the RD enters patient information and calibrates the pneumotach following which RT secures ventilation connections. Next, RD starts the test and remains at bedside for the standardized 20-minute duration to ensure steady state is not interrupted. Once testing is completed, the best 5-minute average is selected to obtain the measured resting energy expenditure (mREE). The RD interprets the results considering a multitude of factors and, if warranted, modifies nutrition interventions.</p><p><b>Results:</b> Eight ICU registered dietitians completed 87 IC measurements from May 2024 through August 2024 which included patients across various ICUs. All 87 patients were selected by the RD due to concerns for over or underfeeding. Eighty-three percent of the measurements were valid tests and seventy-nine percent of the measurements led to intervention modifications. 
The total face-to-face time spent was 66 hours and 45 minutes, an average of approximately 45 minutes per test. Additional time spent interpreting results and making modifications to interventions ranged from 15 to 30 minutes.</p><p><b>Conclusion:</b> IC has the ability to capture accurate energy expenditures in the critically ill. RD-directed, order-based IC practice has allowed for the successful introduction of IC at our institution. The future transition from limited IC implementation to a standard of care will depend on numerous considerations, including RD time constraints and patient volumes amid the ebbs and flows of critical care. To align with the ever-changing dynamics of critical care, staffing levels and workflows are being actively evaluated.</p><p><b>Table 1.</b> Indirect Calorimetry (IC) Checklist.</p><p></p><p></p><p><b>Figure 1.</b> IC Result with Invalid Test.</p><p></p><p><b>Figure 2.</b> IC Result with Valid Test.</p><p></p><p><b>Figure 3.</b> IC Indications and Contraindications.</p><p></p><p><b>Figure 4.</b> IC EPIC Order.</p><p>Rebecca Frazier, MS, RD, CNSC<sup>1</sup>; Chelsea Heisler, MD, MPH<sup>1</sup>; Bryan Collier, DO, FACS, FCCM<sup>1</sup></p><p><sup>1</sup>Carilion Roanoke Memorial Hospital, Roanoke, VA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Adequate energy intake with appropriate macronutrient composition is an essential part of patient recovery; however, predictive equations have been found to be of variable accuracy. Indirect calorimetry (IC) gives insight into caloric needs and the primary nutrition substrate being utilized as metabolic fuel, often identifying over- and underfeeding. Though IC is considered the gold standard for determining resting energy expenditure, it has challenges with cost, equipment feasibility, and time constraints for personnel, namely respiratory therapy (RT). Our hypothesis: Registered Dietitian (RD)-led IC tests can be conducted in a safe and feasible manner without undue risk of complication. In addition, IC will show a higher caloric need, particularly in patients who require at least seven days of ventilatory support.</p><p><b>Methods:</b> A team of RDs screened surgical ICU patients at a single institution. Patients intubated for at least 3 days were considered eligible for testing. Exclusion criteria included PEEP &gt; 10, fraction of inspired oxygen &gt; 60%, Richmond Agitation-Sedation Scale ≥ 1, chest tube air leak, extracorporeal membrane oxygenation use, and &gt; 1°C temperature change in 1 hour. Tests were completed using the Q-NRG+ Portable Metabolic Monitor (Baxter) based on RD and patient availability. Test results were compared to calculated needs based on the Penn State Equation (39 tests). Overfeeding/underfeeding was defined as &gt; 15% deviation from the equation results. The mean difference in energy needs was analyzed using a standard paired, two-tailed t-test for ≤ 7 total ventilated days and &gt; 7 ventilated days.</p><p><b>Results:</b> Thirty patients underwent IC testing; a total of 39 tests were completed. There were no complications with RD-led IC testing, and minimal RT involvement was required after the first 5 tests were completed. Overall, 56.4% of all IC tests, regardless of ventilator days, indicated overfeeding. In addition, 33.3% of tests indicated appropriate feeding (85-115% of calculated REE), and 10.3% of tests demonstrated underfeeding. When stratified by ventilator days (&gt; 7 d vs ≤ 7 d), similar results were found: approximately 66% of IC tests deviated by &gt; 15% from calculated caloric needs, with 54.4% and 60.0% overfed and 12.5% and 6.7% underfed, respectively.</p><p><b>Conclusion:</b> Equations estimating caloric needs provide inconsistent results. Compared with IC, predictive equations under- and overestimate nutritional needs similarly, regardless of ventilator days. Despite the lack of statistical significance, the effects of poor nutrition are well documented and clinically significant. With minimal training, IC can be performed safely with an RD and bedside RN. Utilizing the RD to coordinate and perform IC testing is a feasible process that maximizes personnel efficiency and allows for immediate adjustment of the nutrition plan. IC, as the gold standard for nutrition estimation, should be performed on surgical ICU patients to assist in developing nutritional treatment algorithms.</p><p>Dolores Rodríguez<sup>1</sup>; Mery Guerrero<sup>2</sup>; María Centeno<sup>2</sup>; Barbara Maldonado<sup>2</sup>; Sandra Herrera<sup>2</sup>; Sergio Santana<sup>3</sup></p><p><sup>1</sup>Ecuadorian Society for the Fight against Cancer, Guayaquil, Guayas; <sup>2</sup>SOLCA, Guayaquil, Guayas; <sup>3</sup>University of Havana, La Habana, Ciudad de la Habana</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> In 2022, the International Agency for Research on Cancer (GLOBOCAN) reported nearly 20 million new cases of cancer worldwide, with 30,888 cases in Ecuador. Breast, prostate, and stomach cancers were the most diagnosed types. Oncohematological diseases (OHD) significantly affect the nutritional status of patients. The ELAN Ecuador-2014 study, involving over 5,000 patients, found malnutrition in 37% of participants overall, rising to 65% among those with OHD. The Latin American Study of Malnutrition in Oncology (LASOMO), conducted by FELANPE between 2019 and 2020, revealed a 59.1% frequency of malnutrition among 1,842 patients across 52 health centers in 10 Latin American countries. This study aims to present the current state of malnutrition associated with OHD among patients treated in Ecuadorian hospitals.</p><p><b>Methods:</b> The Ecuadorian segment of the LASOMO Study was conducted between 2019 and 2020, as part of the previously mentioned regional epidemiological initiative. The study was designed as a one-day, nationwide, multicenter survey involving health centers and specialized services for patients with OHD across five hospitals located in the provinces of Guayas (3), Manabí (1), and Azuay (1). The nutritional status of patients with OHD was assessed using the B + C scores from Detsky et al.'s Subjective Global Assessment (SGA). The study included male and female patients aged 18 years and older, admitted to clinical, surgical, intensive care, and bone marrow transplant (BMT) units during October and November 2019. Participation was voluntary, and patients provided informed consent by signing a consent form. Data were analyzed using location, dispersion, and aggregation statistics based on variable types. The nature and strength of relationships were assessed using chi-square tests for independence, with a significance level of &lt; 5% to identify significant associations.
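<p>A minimal, generic illustration of this kind of analysis (a chi-square test of independence on a 2x2 table together with an odds ratio and a Woolf-type 95% confidence interval) is sketched below with hypothetical counts; it assumes SciPy is available and is not the study's actual analysis code.</p>

```python
# Hypothetical 2x2 association analysis: chi-square test of independence
# plus an odds ratio with a Woolf (log-scale) 95% confidence interval.
import math
from scipy.stats import chi2_contingency

#                  malnourished  well-nourished
table = [[70, 60],   # characteristic present
         [45, 90]]   # characteristic absent

chi2, p_value, dof, _expected = chi2_contingency(table)

a, b = table[0]
c, d = table[1]
odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}, "
      f"OR = {odds_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```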
Odds ratios for malnutrition were calculated along with their associated 95% confidence intervals.</p><p><b>Results:</b> The study enrolled 390 patients, with 63.6% women and 36.4% men, averaging 55.3 ± 16.5 years old; 47.2% were aged 60 years or older. The most common tumor locations included kidneys, urinary tract, uterus, ovaries, prostate, and testicles, accounting for 18.7% of all cases (refer to Table 1). Chemotherapy was the predominant oncological treatment, administered to 42.8% of the patients surveyed. Malnutrition affected 49.7% of the patients surveyed, with 14.4% categorized as severely malnourished (see Figure 1). The incidence of malnutrition was found to be independent of age, educational level, tumor location, and current cytoreductive treatment (refer to Table 2). Notably, the majority of the malnourished individuals were men.</p><p><b>Conclusion:</b> Malnutrition is highly prevalent in patients treated for OHD in Ecuadorian hospitals.</p><p><b>Table 1.</b> Most Frequent Location of Neoplastic Disease in Hospitalized Ecuadorian Patients and Type of Treatment Received. The number and {in brackets} the percentage of patients included in the corresponding category are presented.</p><p></p><p><b>Table 2.</b> Distribution of Malnutrition Associated with Cancer According to Selected Demographic, Clinical, and Health Characteristics of Patients Surveyed During the Oncology Malnutrition Study in Ecuador. The number and {in brackets} the percentage of malnourished patients included in each characteristic category are presented. The frequency of malnutrition was estimated using the Subjective Global Assessment (Detsky et al., 1987).</p><p></p><p></p><p><b>Figure 1.</b> State of Malnutrition Among Patients Treated for Cancer in Hospitals in Ecuador.</p><p>Ranna Modir, MS, RD, CNSC, CDE, CCTD<sup>1</sup>; Christina Salido, RD<sup>1</sup>; William Hiesinger, MD<sup>2</sup></p><p><sup>1</sup>Stanford Healthcare, Stanford, CA; <sup>2</sup>Stanford Medicine, Stanford, CA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Cardiovascular Intensive Care Unit (CVICU) patients are at high risk for significant nutrient deficits, especially during the early intensive care unit (ICU) phase, when postoperative complications and hemodynamic instability are prevalent. These deficits can exacerbate the catabolic state, leading to muscle wasting, impaired immune function, and delayed recovery. A calorie deficit of &gt;10,000 kcal combined with meeting &lt; 80% of nutritional needs in the early ICU phase (first 14 days) has been linked to worse outcomes, including prolonged intubation, increased ICU length of stay (LOS), and higher risk of organ dysfunction and infections such as central line-associated bloodstream infections (CLABSI). When evaluating CLABSI risk factors, the role of nutrition adequacy and malnutrition is often underestimated and overlooked, with more emphasis placed on the type of nutrition support (NS) provided, whether enteral nutrition (EN) or parenteral nutrition (PN). Historically, there has been a practice of avoiding PN to reduce CLABSI risk, rather than ensuring that nutritional needs are fully met. This practice is based on initial reports from decades ago linking PN with hospital-acquired infections. However, updated guidelines based on modern data now indicate no difference in infectious outcomes with EN vs PN. In fact, adding PN when EN alone is insufficient can help reduce nutritional deficiencies, supporting optimal immune response and resilience to infection.
As part of an ongoing NS audit in our CVICU, we reviewed all CLABSI cases over a 23-month period to assess nutrition adequacy and the modality of NS (EN vs PN) provided.</p><p><b>Methods:</b> Data were extracted from electronic medical records for all CLABSI cases from September 2020 to July 2022. Data collected included patient characteristics and clinical and nutrition outcomes (Table 1). Descriptive statistics (means, standard deviations, frequencies) were calculated. Chi-square or Fisher's exact test assessed the association between type of NS and meeting &gt;80% of calorie/protein targets within 14 days and until CLABSI onset. A significance level of 0.05 was used.</p><p><b>Results:</b> Over the 23-month period, 28/51 (54.9%) patients with a CLABSI required exclusive NS throughout their entire CVICU stay (n = 18 male, 64.3%); median age was 54.5 years, mean BMI 27.4, and median CVICU LOS 49.5 days, with a 46.4% mortality rate. Surgical intervention was indicated in 60.7% of patients, with 41.2% requiring preoperative extracorporeal membrane oxygenation (ECMO) and 52.9% postoperative ECMO (Table 1). The majority of patients received exclusive EN (53.6%), with 46.4% receiving EN + PN and 0% exclusive PN. In the first 14 ICU days, 21.4% met &gt;80% of calorie needs, 32.1% met &gt;80% of protein needs, and 32.1% had a calorie deficit &gt;10,000 kcal. There was no difference between type of NS and ability to meet &gt;80% of nutrient targets in the first 14 days (Table 1, p = 0.372, p = 0.689). The majority of PN (61.5%) was initiated after ICU day 7. From ICU day 1 until CLABSI onset, the EN + PN group was better able to meet &gt;80% of calorie targets vs exclusive EN (p = 0.016). 50% were diagnosed with malnutrition. 82% required ECMO cannulas and 42.9% a dialysis triple-lumen catheter. Enterococcus faecalis was the most common organism in the EN (43.7%) and EN + PN (35.7%) groups (Table 2).</p><p><b>Conclusion:</b> This single-center analysis of CVICU CLABSI patients found that the majority of patients requiring exclusive NS failed to meet &gt;80% of nutrition needs during the early ICU phase. Exclusive EN was the primary mode of NS compared to EN + PN or PN alone, challenging the assumption that PN inherently increases CLABSI risk. In fact, EN + PN improved the ability to meet calorie targets until CLABSI onset. These findings suggest that early nutrient deficits may increase CLABSI risk and that the risk is not dependent on the type of NS provided.</p><p><b>Table 1.</b> Patient Characteristics, Clinical and Nutritional Outcomes.</p><p></p><p><b>Table 2.</b> Type of Central Access Device and Microorganism in Relation to Modality of Nutrition Support Provided.</p><p></p><p>Oki Yonatan, MD<sup>1</sup>; Faya Nuralda Sitompul<sup>2</sup></p><p><sup>1</sup>ASPEN, Jakarta, Jakarta Raya; <sup>2</sup>Osaka University, Minoh, Osaka</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Ginseng, widely used as a functional food or therapeutic supplement in Asia, contains bioactive compounds such as ginsenosides, which exert a range of biological effects, including hypoglycemic, anti-inflammatory, cardioprotective, and anti-tumor properties. However, studies have indicated that ginseng also has anticoagulant and anti-aggregation effects and may be associated with bleeding. This case report presents a potential case of ginseng-induced bleeding in an elderly patient with advanced pancreatic cancer.
Case Description: A 76-year-old male with stage IV pancreatic cancer and metastases to the liver, lymph nodes, and peritoneum was in home care with BiPAP ventilation, NGT feeding, ascites drainage, and a Foley catheter. He had a history of type A aortic dissection repair, anemia, and thrombocytopenia, with platelet counts consistently below 50,000/µL. Despite no history of anticoagulant use, the patient developed massive gastrointestinal bleeding and hematuria after consuming 100 grams of American Ginseng (AG) per day for five days. Disseminated intravascular coagulation (DIC) was initially suspected, but no signs of bleeding were observed until the third week of care, coinciding with ginseng consumption. Endoscopy was not performed due to the patient's unstable condition and the family's refusal. Discussion: The consumption of AG may have triggered bleeding due to the patient's already unstable condition and low platelet count. Ginsenosides, particularly Rg1, Rg2, and Rg3, have been shown to exert anticoagulant effects, prolonging clotting times and inhibiting platelet aggregation. Studies have demonstrated that AG extracts can significantly extend clotting times and reduce platelet activity, potentially contributing to the observed bleeding. Conclusion: This case highlights the possible role of AG in inducing severe bleeding in a patient with pancreatic cancer and thrombocytopenia. Given ginseng's known anticoagulant properties, caution should be exercised when administering it to patients with hematological abnormalities or bleeding risks, and further research is warranted to assess its safety in these populations.</p><p><b>Methods:</b> None Reported.</p><p><b>Results:</b> None Reported.</p><p><b>Conclusion:</b> None Reported.</p><p>Kursat Gundogan, MD<sup>1</sup>; Mary Nellis, PhD<sup>2</sup>; Nurhayat Ozer, PhD<sup>3</sup>; Sahin Temel, MD<sup>3</sup>; Recep Yuksel, MD<sup>4</sup>; Murat Sungar, MD<sup>5</sup>; Dean Jones, PhD<sup>2</sup>; Thomas Ziegler, MD<sup>6</sup></p><p><sup>1</sup>Division of Clinical Nutrition, Erciyes University Health Sciences Institute, Kayseri; <sup>2</sup>Emory University, Atlanta, GA; <sup>3</sup>Erciyes University Health Sciences Institute, Kayseri; <sup>4</sup>Department of Internal Medicine, Erciyes University School of Medicine, Kayseri; <sup>5</sup>Department of Internal Medicine, Erciyes University School of Medicine, Kayseri; <sup>6</sup>Emory Healthcare, Atlanta, GA</p><p><b>Financial Support:</b> Erciyes University Scientific Research Committee (TSG- 2021–10078) and TUBITAK 2219 program (1059B-192000150), each to KG, and National Institutes of Health grant P30 ES019776, to DPJ and TRZ.</p><p><b>Background:</b> Metabolomics represents a promising profiling technique for the investigation of significant metabolic alterations that arise in response to critical illness. The present study utilized plasma high-resolution metabolomics (HRM) analysis to define systemic metabolism associated with common critical illness severity scores in critically ill adults.</p><p><b>Methods:</b> This cross-sectional study was performed at Erciyes University Hospital, Kayseri, Turkiye, and Emory University, Atlanta, GA, USA. Participants were critically ill adults with an expected length of intensive care unit stay longer than 48 h. Plasma for metabolomics was obtained on the day of ICU admission.
Data were analyzed using regression analysis of two ICU admission illness severity scores (APACHE II and mNUTRIC) against all plasma metabolomic features in metabolome-wide association studies (MWAS). APACHE II score was analyzed as a continuous variable, and mNUTRIC score was analyzed as a dichotomous variable [≤4 (low) vs. &gt; 4 (high)]. Pathway enrichment analysis was performed on significant metabolites (raw p &lt; 0.05) related to each of the two illness severity scores independently.</p><p><b>Results:</b> A total of 77 patients were included. The mean age was 69 years (range 33-92 years); 65% were female. More than 15,000 metabolomic features were identified for the MWAS of the APACHE II and mNUTRIC scores. Metabolic pathways significantly associated with APACHE II score at ICU admission included C21-steroid hormone biosynthesis and urea cycle, vitamin E, selenoamino acid, aspartate/asparagine, and thiamine metabolism. Metabolic pathways associated with mNUTRIC score at ICU admission were N-glycan degradation and metabolism of fructose/mannose, vitamin D3, pentose phosphate, sialic acid, and linoleic acid. Within the significant pathways, the gut microbiome-derived metabolites hippurate and N-acetyl ornithine were downregulated, and creatine and glutamate were upregulated, with increasing APACHE II scores. Metabolites involved in energy metabolism that were altered with a high (&gt; 4) mNUTRIC score included N-acetylglucosamine (increased) and gluconate (decreased).</p><p><b>Conclusion:</b> Plasma HRM identified significant associations between two commonly used illness severity scores and metabolic processes involving steroid biosynthesis, the gut microbiome, skeletal muscle, and amino acid, vitamin, and energy metabolism in adult critically ill patients.</p><p>Hilary Winthrop, MS, RD, LDN, CNSC<sup>1</sup>; Megan Beyer, MS, RD, LDN<sup>2</sup>; Jeroen Molinger, PhDc<sup>3</sup>; Suresh Agarwal, MD<sup>4</sup>; Paul Wischmeyer, MD, EDIC, FCCM, FASPEN<sup>5</sup>; Krista Haines, DO, MA<sup>4</sup></p><p><sup>1</sup>Duke Health, Durham, NC; <sup>2</sup>Duke University School of Medicine - Department of Anesthesiology, Durham, NC; <sup>3</sup>Duke University Medical Center - Department of Anesthesiology - Duke Heart Center, Raleigh, NC; <sup>4</sup>Duke University School of Medicine, Durham, NC; <sup>5</sup>Duke University Medical School, Durham, NC</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Little evidence currently exists on the effect of body mass index (BMI) on resting energy expenditure (REE). Current clinical practice guidelines are based primarily on expert opinion and provide a wide range of calorie recommendations without delineations specific to BMI categories, making it difficult for the clinician to accurately prescribe calorie needs for their hospitalized patients. This abstract utilizes metabolic cart data from studies conducted at a large academic healthcare system to investigate trends between BMI and REE.</p><p><b>Methods:</b> A pooled cohort of hospitalized patients was compiled from three clinical trials where metabolic cart measurements were collected. In all three studies, indirect calorimetry was initially conducted in the intensive care setting, with follow-up measurements conducted as clinically able.
Variables included in the analysis were measured resting energy expenditure (mREE) in total kcals and in kcals per kilogram of body weight on ICU admission, respiratory quotient, days post ICU admission, and demographics and clinical characteristics. ANOVA tests were utilized to analyze continuous data.</p><p><b>Results:</b> A total of 165 patients were included in the final analysis with 338 indirect calorimetry measurements. Disease groups included COVID-19 pneumonia, non-COVID respiratory failure, surgical ICU, cardiothoracic surgery, and trauma. The average age of patients was 58 years old, with 96 males (58.2%) and 69 females (41.8%), and an average BMI of 29.0 kg/m<sup>2</sup>. The metabolic cart measurements on average were taken on day 8 post ICU admission (ranging from day 1 to day 61). See Table 1 for more demographics and clinical characteristics. Indirect calorimetry measurements were grouped into three BMI categories: BMI ≤ 29.9 (normal), BMI 30-39.9 (obese), and BMI ≥ 40 (super obese). ANOVA analyses showed statistical significance amongst the three BMI groups in both total kcals (p &lt; 0.001) and kcals per kg (p &lt; 0.001). The normal BMI group had an average mREE of 1632 kcals (range of 767 to 4023), compared to 1868 kcals (range of 1107 to 3754) in the obese BMI group, and 2004 kcals (range of 1219 to 3458) in the super obese BMI group. Similarly, when analyzing kcals per kg, the normal BMI group averaged 23.3 kcals/kg, the obese BMI group 19.8, and the super obese BMI group 16.3.</p><p><b>Conclusion:</b> Without access to a metabolic cart to accurately measure REE, the majority of nutrition clinicians are left to estimations. Current clinical guidelines and published data do not provide the guidance that is necessary to accurately feed many hospitalized patients. This current analysis only scratches the surface of the metabolic demands of different patient populations based on their BMI status, especially given the wide ranges of energy expenditures. Robust studies are needed to further elucidate the relationships between BMI and REE in different disease states.</p><p><b>Table 1.</b> Demographics and Clinical Characteristics.</p><p></p><p></p><p><b>Figure 1.</b> Average Measured Resting Energy Expenditure (in Total Kcals), by BMI Group.</p><p></p><p><b>Figure 2.</b> Average Measured Resting Energy Expenditure (in kcals/kg), by BMI Group.</p><p>Carlos Reyes Torres, PhD, MSc<sup>1</sup>; Daniela Delgado Salgado, Dr<sup>2</sup>; Sergio Diaz Paredes, Dr<sup>1</sup>; Sarish Del Real Ordoñez, Dr<sup>1</sup>; Eva Willars Inman, Dr<sup>1</sup></p><p><sup>1</sup>Hospital Oncológico de Coahuila (Oncological Hospital of Coahuila), Saltillo, Coahuila de Zaragoza; <sup>2</sup>ISSSTE, Saltillo, Coahuila de Zaragoza</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Chemotherapy is one of the principal treatments for cancer. Some degree of toxicity has been described in 98% of patients. Changes in body composition are frequent and are related to worse outcomes. Low muscle mass is associated with chemotherapy toxicity in observational studies. Phase angle (PhA) is an indicator of cell integrity and positively correlates with adequate nutritional status and muscle mass. Limited studies have evaluated the association of PhA with chemotherapy toxicity.
The aim of this study was to evaluate the association of PhA and body composition with chemotherapy toxicity in cancer patients.</p><p><b>Methods:</b> A prospective cohort study was conducted in adult patients with solid neoplastic disease receiving first-line systemic treatments. The subjects were evaluated at the first chemotherapy treatment using bioelectrical impedance analysis with an RJL device and according to the standardized technique. The outcome was chemotherapy toxicity during the first 4 cycles of chemotherapy, assessed for its association with PhA and body composition. Toxicity was evaluated using the National Cancer Institute (NCI) Common Terminology Criteria for Adverse Events, version 5.0. A PhA &lt; 4.7 was considered low, in accordance with other studies.</p><p><b>Results:</b> A total of 54 patients were evaluated and included in the study. The most common cancer diagnoses were breast cancer (40%), gastrointestinal tumors (33%), and lung cancer (18%). Chemotherapy toxicity was present in 46% of the patients. The most common adverse effects were gastrointestinal (48%), blood disorders (32%), and metabolic disorders (40%). There were statistically significant differences in PhA between patients with chemotherapy toxicity and patients without adverse effects: 4.45º (3.08-4.97) vs 6.07º (5.7-6.2), respectively; p value &lt; 0.001. PhA was associated with the risk of chemotherapy toxicity (HR 8.7, 95% CI 6.1-10.7; log-rank test p = 0.02).</p><p><b>Conclusion:</b> PhA was associated with the risk of chemotherapy toxicity in cancer patients.</p><p>Lizl Veldsman, RD, M Nutr, BSc Dietetics<sup>1</sup>; Guy Richards, MD, PhD<sup>2</sup>; Carl Lombard, PhD<sup>3</sup>; Renée Blaauw, PhD, RD<sup>1</sup></p><p><sup>1</sup>Division of Human Nutrition, Department of Global Health, Faculty of Medicine &amp; Health Sciences, Stellenbosch University, Cape Town, Western Cape; <sup>2</sup>Department of Surgery, Division of Critical Care, Faculty of Health Sciences, University of the Witwatersrand, Johannesburg, Gauteng; <sup>3</sup>Division of Epidemiology and Biostatistics, Department of Global Health, Faculty of Medicine and Health Sciences, Stellenbosch University, Cape Town, Western Cape</p><p><b>Financial Support:</b> Fresenius Kabi JumpStart Research Grant.</p><p><b>Background:</b> Critically ill patients lose a significant amount of muscle mass over the first ICU week. We sought to determine the effect of bolus amino acid (AA) supplementation on the urea-to-creatinine ratio (UCR) trajectory over time, and whether UCR correlates with histological myofiber cross-sectional area (CSA) as a potential surrogate marker of muscle mass.</p><p><b>Methods:</b> This was a secondary analysis of data from a registered clinical trial (ClinicalTrials.gov NCT04099108) undertaken in a predominantly trauma surgical ICU. Participants were randomly assigned to two groups, both of which received standard care nutrition (SCN) and mobilisation. Study participants randomised to the intervention group also received a bolus AA supplement, with a 45-minute in-bed cycling session, from average ICU day 3 for a mean of 6 days. The change in vastus lateralis myofiber CSA, measured from pre-intervention (average ICU day 2) to post-intervention (average ICU day 8), was assessed through biopsy and histological analysis. A linear mixed-effects regression model was used to compare the mean daily UCR profiles between study groups from ICU day 0 to day 10.
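<p>For illustration only (not the trial's actual code), a minimal sketch of such a mixed-effects comparison in Python is shown below, assuming a hypothetical long-format table with columns patient_id, group, icu_day, and ucr.</p>

```python
# Minimal sketch of a linear mixed-effects comparison of daily UCR profiles between
# study arms; the data frame `df` and its column names are hypothetical placeholders.
import pandas as pd
import statsmodels.formula.api as smf

def fit_ucr_trajectory(df: pd.DataFrame):
    # Fixed effects: group, ICU day, and their interaction (the intervention effect
    # over time); a random intercept per participant accounts for repeated measures.
    model = smf.mixedlm("ucr ~ C(group) * C(icu_day)", data=df, groups=df["patient_id"])
    return model.fit()

# Example usage with a hypothetical data frame:
# result = fit_ucr_trajectory(df)
# print(result.summary())
```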
As a sensitivity analysis, we adjusted for disease severity on admission (APACHE II and SOFA scores), daily fluid balance, and the presence of acute kidney injury (AKI). A Spearman correlation compared the UCR on ICU day 2 (pre-intervention) and ICU day 8 (post-intervention) with the corresponding myofiber CSA.</p><p><b>Results:</b> A total of 50 enrolled participants were randomised to the study intervention and control groups in a 1:1 ratio. The control and intervention groups received, on average, 87.62 ± 32.18 and 85.53 ± 29.29 grams of protein per day (1.26 ± 0.41 and 1.29 ± 0.40 g/kg/day, respectively) from SCN, and the intervention group received an additional 30.43 ± 5.62 grams of AA (0.37 ± 0.06 g/kg protein equivalents) from the AA supplement. At baseline, the mean UCR values for the control (75.6 ± 31.5) and intervention (63.8 ± 27.1) groups were similar. Mean UCR increased daily from baseline in both arms, but at a faster rate in the intervention arm, with a significant intervention effect (p = 0.0127). The UCR for the intervention arm at days 7 and 8 was significantly higher, by 21 and 22 units, compared to controls (p = 0.0214 and p = 0.0215, respectively). After day 7, the mean daily UCR plateaued in the intervention arm, but not in the controls. Adjusting for disease severity, daily fluid balance, and AKI did not alter the intervention effect. A significant negative association was found between UCR and myofiber CSA (r = -0.39, p = 0.011) at ICU day 2 (pre-intervention), but not at day 8 (post-intervention) (r = 0.23, p = 0.153).</p><p><b>Conclusion:</b> Bolus amino acid supplementation significantly increases the UCR during the first ICU week, after which it plateaus. UCR at baseline may be an indicator of muscle status.</p><p></p><p><b>Figure 1.</b> Change in Urea-to-Creatinine Ratio (UCR) Over ICU Days in the Control and Intervention Group. Error Bars Represent 95% Confidence Intervals (CIs).</p><p>Paola Renata Lamoyi Domínguez, MSc<sup>1</sup>; Iván Osuna Padilla, PhD<sup>2</sup>; Lilia Castillo Martínez, PhD<sup>3</sup>; Josué Daniel Cadeza-Aguilar, MD<sup>2</sup>; Martín Ríos-Ayala, MD<sup>2</sup></p><p><sup>1</sup>UNAM, National Autonomous University of Mexico, Mexico City, Distrito Federal; <sup>2</sup>National Institute of Respiratory Diseases, Mexico City, Distrito Federal; <sup>3</sup>National Institute of Medical Sciences and Nutrition Salvador Zubirán, Mexico City, Distrito Federal</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Non-defecation (ND) is highly prevalent in critically ill patients on mechanical ventilation (MV) and has been reported in up to 83% of cases. This disorder is associated with high morbidity and mortality rates. Most existing research has focused on the association between clinical data and non-defecation; however, there is a lack of evidence regarding its association with dietary factors. We aimed to analyze the association of dietary fiber in enteral nutrition and the amount of fluids administered through enteral and parenteral routes with defecation during the first 6 days of MV in critically ill patients with pneumonia and other lung manifestations.</p><p><b>Methods:</b> We conducted a longitudinal analysis of MV patients receiving enteral nutrition (EN) at a tertiary care hospital in Mexico City between May 2023 and April 2024.
The inclusion criteria were age &gt;18 years, MV, admission to the respiratory intensive care unit (ICU) for pneumonia or other lung manifestations, and nutritional assessment performed during the first 24 h after ICU admission. Patients who required parenteral nutrition or major surgery, or who had traumatic brain injury or neuromuscular disorders, were excluded from this study. A nutritional assessment, including the NUTRIC score, SOFA and APACHE II assessments, and an estimation of energy-protein requirements, was performed by trained dietitians within the first 24 hours after ICU admission. During each day of follow-up (0 to 6 days), we recorded the amount of fiber provided in EN, the volume of infusion fluids administered by the enteral and parenteral routes, and the medical prescription of opioids, sedatives, neuromuscular blockers, and vasopressors. ND was defined as &gt;6 days without defecation from ICU admission. The differences between ND and defecation were also assessed. The association of ND with dietary factors was examined using discrete-time survival analysis.</p><p><b>Results:</b> Seventy-four patients were included; ND was observed in 40 patients (54%). The non-defecation group had a longer ICU length of stay, and 50% of this group did not have their first defecation until day 10. No differences in fiber provision and volume of infusion fluids were observed between the groups. In multivariate analysis, no associations between ND and fiber (fiber intake 10 to 20 g per day, OR 1.17, 95% CI 0.41-3.38, p = 0.29) or total fluids (fluid intake 25 to 30 mL/kg/d, OR 1.85, 95% CI 0.44-7.87, p = 0.404) were observed.</p><p><b>Conclusion:</b> Non-defecation affected 54% of the study population. Although fiber and fluids are considered a treatment for non-defecation, we did not find an association in critically ill patients.</p><p><b>Table 1.</b> Demographic and Clinical Characteristics by Groups.</p><p></p><p><b>Table 2.</b> Daily Comparison of Dietary Factors.</p><p></p><p>Andrea Morand, MS, RDN, LD<sup>1</sup>; Osman Mohamed Elfadil, MBBS<sup>1</sup>; Kiah Graber, RDN<sup>1</sup>; Yash Patel, MBBS<sup>1</sup>; Suhena Patel, MBBS<sup>1</sup>; Chloe Loersch, RDN<sup>1</sup>; Isabelle Wiggins, RDN<sup>1</sup>; Anna Santoro, MS, RDN<sup>1</sup>; Natalie Johnson, MS<sup>1</sup>; Kristin Eckert, MS, RDN<sup>1</sup>; Dana Twernbold, RDN<sup>1</sup>; Dacia Talmo, RDN<sup>1</sup>; Elizabeth Engel, RRT, LRT<sup>1</sup>; Avery Erickson, MS, RDN<sup>1</sup>; Alex Kirby, MS, RDN<sup>1</sup>; Mackenzie Vukelich, RDN<sup>1</sup>; Kate Sandbakken, RDN<sup>1</sup>; Victoria Vasquez, RDN<sup>1</sup>; Manpreet Mundi, MD<sup>1</sup></p><p><sup>1</sup>Mayo Clinic, Rochester, MN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Current guidelines for nutrition support in critically ill patients recommend the utilization of indirect calorimetry (IC) to determine energy needs. However, IC testing is limited at many institutions due to accessibility, labor, and costs, which leads to reliance on predictive equations to determine caloric targets. A quality improvement (QI) initiative was implemented to assess the impact on nutrition care when IC is routinely completed.</p><p><b>Methods:</b> A prospective QI project in selected medical and surgical intensive care units (ICU) included critically ill patients assessed by the dietitian within 24-48 hours of consult order or by hospital day 4.
Patients with contraindications to IC were excluded, including those requiring ECMO, CRRT, MARS, or FiO2 &gt; 55%, as well as spontaneously breathing patients requiring significant supplemental oxygen. The initial caloric target was established utilizing predictive equations, which is the standard of care at our institution. After day 4 of the ICU stay, IC measurements, predictive equations, and weight-based nomograms were collected. The predictive equations utilized included Harris-Benedict (HB) - basal, adjusted HB (75% of basal when body mass index (BMI) &gt; 30), Penn State if ventilated, Mifflin St. Jeor (MSJ), revised HB, and revised HB adjusted (75% of basal when BMI &gt; 30). Additional demographic, anthropometric, and clinical data were collected.</p><p><b>Results:</b> Patients (n = 85) were majority male (n = 53, 62.4%), admitted to the surgical ICU (n = 57, 67.1%), and overweight (mean BMI 29.8 kg/m<sup>2</sup>), and the average age was 61.3 years old (SD 16.5). At the time of the IC test, the median ICU length of stay was 6 days; 77.6% (n = 66) were supported with mechanical ventilation, and median ventilator days were 4 days (Table 1). Mean IC-measured REE was compared to predictive equations, showing that, except for the weight-based nomogram high caloric needs estimate (p = 0.3615), all equations were significantly lower than IC (p &lt; 0.0001). Median absolute differences from IC for the evaluated predictive equations are shown in Figure 1. After REE was measured, caloric goals increased significantly for patients on enteral (EN) and parenteral nutrition (PN) (p = 0.0016 and p = 0.05, respectively). In enterally fed patients, the mean calorie goal before REE was 1655.4 kcal (SD 588) and after REE 1917.6 kcal (SD 528.6), an average increase of 268.4 kcal/day; in parenterally fed patients, the mean calorie goal before REE was 1395.2 kcal (SD 313.6) and after REE 1614.1 kcal (SD 239.3), an average increase of 167.5 kcal (Table 2). The mean REE per BMI category per actual body weight was BMI &lt; 29.9 = 25.7 ± 7.9 kcal/kg, BMI 30-34.9 = 20.3 ± 3.8 kcal/kg, BMI 35-39.9 = 22.8 ± 4.6 kcal/kg, and BMI ≥ 40 = 16.3 ± 2.9 kcal/kg (25.4 ± 10.5 kcal/kg of ideal body weight). Figure 2 illustrates the average daily calorie need broken down by BMI for IC and the examined predictive equations.</p><p><b>Conclusion:</b> There was a significant difference between IC measurements and various predictive equations, except for weight-based high-estimated calorie needs. Nutrition goals changed significantly in response to IC measurements. It is recommended that we expand the use of IC in the critically ill population at our institution.
In settings where IC is not possible, weight-based nomograms should be utilized.</p><p><b>Table 1.</b> Baseline Demographics and Clinical Characteristics.</p><p></p><p><b>Table 2.</b> Nutrition Support.</p><p></p><p></p><p><b>Figure 1.</b> Difference in Daily Calorie Estimation Utilizing IC Compared to Predictive Equations.</p><p></p><p><b>Figure 2.</b> RMR by IC and Other Predictive Equations by BMI.</p><p><b>GI, Obesity, Metabolic, and Other Nutrition Related Concepts</b></p><p>Suhena Patel, MBBS<sup>1</sup>; Osman Mohamed Elfadil, MBBS<sup>1</sup>; Yash Patel, MBBS<sup>1</sup>; Chanelle Hager, RN<sup>1</sup>; Manpreet Mundi, MD<sup>1</sup>; Ryan Hurt, MD, PhD<sup>1</sup></p><p><sup>1</sup>Mayo Clinic, Rochester, MN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Chronic capillary leak syndrome is a rare but potentially fatal side effect of immunotherapy for some malignancies, mainly manifesting with intractable generalized edema and often refractory hypotension. An idiopathic form of the syndrome is also known. It can be diagnosed by exclusion in patients with a single or recurrent episode of intravascular hypovolemia or generalized edema, primarily manifested by the diagnostic triad of hypotension, hemoconcentration, and hypoalbuminemia in the absence of an identifiable alternative cause. Supportive care, with a role for steroids, remains the standard treatment. In capillary leak syndrome secondary to cancer immunotherapy, discontinuing the offending agent is typically considered. This relatively rare syndrome can be associated with significant clinical challenges. This clinical case report focuses on aspects of nutrition care.</p><p><b>Methods:</b> A 45-year-old male with a past medical history of hypertension, pulmonary tuberculosis in childhood, and rectal cancer was admitted for evaluation of anasarca. He was first diagnosed with moderately differentiated invasive adenocarcinoma of the rectum, stage IIIb (cT3, cN1, cM0), in October 2022. As initial therapy, he was enrolled in a clinical trial. He received 25 cycles of immunotherapy with the study drug Vudalimab (a PD1/CTLA4 bispecific antibody), achieving a complete clinical response without additional chemotherapy, radiation, or surgery. Unfortunately, he developed extensive capillary leak syndrome, manifested by recurrent anasarca, chylous ascites, and pleural effusions, beginning in November 2023. His treatment was also complicated by the development of thyroiditis and insulin-dependent diabetes. The patient most recently presented with abdominal fullness, ascites, and peripheral edema that did not improve despite diuretic therapy. A diagnostic and therapeutic paracentesis was performed, and chylous ascites was revealed. Two weeks later, the patient presented with re-accumulation of ascites and worsening anasarca with pleural and pericardial effusion. A PET-CT was then negative for malignant lesions but revealed increased uptake along the peritoneal wall, suggestive of peritonitis. A lymphangiogram performed for further evaluation revealed no gross leak/obstruction; however, this study could not rule out a microleak from increased capillary permeability. He nonetheless required bilateral pleural and peritoneal drains (output ranged from 0.5 to 1 L daily). A diagnosis of capillary leak syndrome was made.
In addition to octreotide, immunosuppression therapy was initiated with IV methylprednisolone (40 mg BID), followed by a transition to oral steroids (60 mg PO); however, the patient's symptoms reappeared with the reduction in steroid dose and the transition to oral therapy. His immunosuppression regimen was modified to include a trial of weekly IVIG and IV albumin twice daily. From a nutritional perspective, he was initially on a routine oral diet. However, his drain output increased specifically after fatty food consumption, so he was switched to a low-fat (40 g/day), high-protein diet to prevent worsening chylous ascites. In the setting of worsening anasarca and moderate malnutrition based on ASPEN criteria, along with clinically significant muscle loss, he was started on TPN. A no-fat diet was initiated to minimize lymphatic flow, with subsequent improvement in his chest tube output volume, followed by a transition to home parenteral nutrition with a mixed-oil lipid emulsion and an oral diet.</p><p><b>Results:</b> None Reported.</p><p><b>Conclusion:</b> Chronic capillary/lymphatic leak syndrome can be challenging and may necessitate dietary modification. Along with dietary changes to significantly reduce oral fat intake, short- or long-term PN can be considered.</p><p>Kishore Iyer, MBBS<sup>1</sup>; Francisca Joly, MD, PhD<sup>2</sup>; Donald Kirby, MD, FACG, FASPEN<sup>3</sup>; Simon Lal, MD, PhD, FRCP<sup>4</sup>; Kelly Tappenden, PhD, RD, FASPEN<sup>2</sup>; Palle Jeppesen, MD, PhD<sup>5</sup>; Nader Youssef, MD, MBA<sup>6</sup>; Mena Boules, MD, MBA, FACG<sup>6</sup>; Chang Ming, MS, PhD<sup>6</sup>; Tomasz Masior, MD<sup>6</sup>; Susanna Huh, MD, MPH<sup>7</sup>; Tim Vanuytsel, MD, PhD<sup>8</sup></p><p><sup>1</sup>Icahn School of Medicine at Mount Sinai, New York, NY; <sup>2</sup>Nutritional Support, Hôpital Beaujon, Paris, Ile-de-France; <sup>3</sup>Department of Intestinal Failure and Liver Diseases, Cleveland, OH; <sup>4</sup>Salford Royal NHS Foundation Trust, Salford, England; <sup>5</sup>Department of Intestinal Failure and Liver Diseases, Rigshospitalet, Copenhagen, Denmark, Hovedstaden; <sup>6</sup>Ironwood Pharmaceuticals, Basel, Basel-Stadt; <sup>7</sup>Ironwood Pharmaceuticals, Boston, MA; <sup>8</sup>University Hospitals Leuven, Leuven, Brabant Wallon</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> American College of Gastroenterology 2024, October 25-30, 2024, Philadelphia, Pennsylvania.</p><p><b>Financial Support:</b> None Reported.</p><p><b>International Poster of Distinction</b></p><p>Francisca Joly, MD, PhD<sup>1</sup>; Tim Vanuytsel, MD, PhD<sup>2</sup>; Donald Kirby, MD, FACG, FASPEN<sup>3</sup>; Simon Lal, MD, PhD, FRCP<sup>4</sup>; Kelly Tappenden, PhD, RD, FASPEN<sup>1</sup>; Palle Jeppesen, MD, PhD<sup>5</sup>; Federico Bolognani, MD, PhD<sup>6</sup>; Nader Youssef, MD, MBA<sup>6</sup>; Carrie Li, PhD<sup>6</sup>; Reda Sheik, MPH<sup>6</sup>; Isabelle Statovci, BS, CH<sup>6</sup>; Susanna Huh, MD, MPH<sup>7</sup>; Kishore Iyer, MBBS<sup>8</sup></p><p><sup>1</sup>Nutritional Support, Hôpital Beaujon, Paris, Ile-de-France; <sup>2</sup>University Hospitals Leuven, Leuven, Brabant Wallon; <sup>3</sup>Department of Intestinal Failure and Liver Diseases, Cleveland, OH; <sup>4</sup>Salford Royal NHS Foundation Trust, Salford, England; <sup>5</sup>Department of Intestinal Failure and Liver Diseases, Rigshospitalet, Copenhagen, Denmark, Hovedstaden; <sup>6</sup>Ironwood Pharmaceuticals, Basel, Basel-Stadt; <sup>7</sup>Ironwood Pharmaceuticals, Boston, MA; <sup>8</sup>Icahn School of Medicine at
Mount Sinai, New York, NY</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> Digestive Disease Week 2024, May 18-21, 2024, Washington, US.</p><p><b>Financial Support:</b> None Reported.</p><p>Tim Vanuytsel, MD, PhD<sup>1</sup>; Simon Lal, MD, PhD, FRCP<sup>2</sup>; Kelly Tappenden, PhD, RD, FASPEN<sup>3</sup>; Donald Kirby, MD, FACG, FASPEN<sup>4</sup>; Palle Jeppesen, MD, PhD<sup>5</sup>; Francisca Joly, MD, PhD<sup>3</sup>; Tomasz Masior, MD<sup>6</sup>; Patricia Valencia, PharmD<sup>7</sup>; Chang Ming, MS, PhD<sup>6</sup>; Mena Boules, MD, MBA, FACG<sup>6</sup>; Susanna Huh, MD, MPH<sup>7</sup>; Kishore Iyer, MBBS<sup>8</sup></p><p><sup>1</sup>University Hospitals Leuven, Leuven, Brabant Wallon; <sup>2</sup>Salford Royal NHS Foundation Trust, Salford, England; <sup>3</sup>Nutritional Support, Hôpital Beaujon, Paris, Ile-de-France; <sup>4</sup>Department of Intestinal Failure and Liver Diseases, Cleveland, OH; <sup>5</sup>Department of Intestinal Failure and Liver Diseases, Rigshospitalet, Copenhagen, Denmark, Hovedstaden; <sup>6</sup>Ironwood Pharmaceuticals, Basel, Basel-Stadt; <sup>7</sup>Ironwood Pharmaceuticals, Boston, MA; <sup>8</sup>Icahn School of Medicine at Mount Sinai, New York, NY</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> American College of Gastroenterology 2024, October 25-30, 2024, Philadelphia, Pennsylvania.</p><p><b>Financial Support:</b> None Reported.</p><p>Boram Lee, MD<sup>1</sup>; Ho-Seong Han, PhD<sup>1</sup></p><p><sup>1</sup>Seoul National University Bundang Hospital, Seoul, Seoul-t'ukpyolsi</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Pancreatic cancer is one of the most fatal malignancies, with a 5-year survival rate of less than 10%. Despite advancements in treatment, its incidence is increasing, driven by aging populations and increasing obesity rates. Obesity is traditionally considered to be a negative prognostic factor for many cancers, including pancreatic cancer. However, the “obesity paradox” suggests that obesity might be associated with better outcomes in certain diseases. This study investigated the effect of obesity on the survival of long-term pancreatic cancer survivors after pancreatectomy.</p><p><b>Methods:</b> A retrospective analysis was conducted on 404 patients with pancreatic ductal adenocarcinoma (PDAC) who underwent surgery between January 2004 and June 2022. Patients were classified into the non-obese (BMI 18.5-24.9) (n = 313) and obese (BMI ≥ 25.0) (n = 91) groups. The data collected included demographic, clinical, perioperative, and postoperative information. Survival outcomes (overall survival [OS], recurrence-free survival [RFS], and cancer-specific survival [CSS]) were analyzed using Kaplan-Meier curves and Cox regression models. A subgroup analysis examined the impact of the visceral fat to subcutaneous fat ratio (VSR) on survival within the obese cohort.</p><p><b>Results:</b> Obese patients (n = 91) had a significantly better 5-year OS (38.9% vs. 27.9%, p = 0.040) and CSS (41.4% vs. 33%, p = 0.047) than non-obese patients. RFS did not differ significantly between the groups. Within the obese cohort, a lower VSR was associated with improved survival (p = 0.012), indicating the importance of fat distribution in outcomes.</p><p><b>Conclusion:</b> Obesity is associated with improved overall and cancer-specific survival in patients with pancreatic cancer undergoing surgery, highlighting the potential benefits of a nuanced approach to managing obese patients.
The distribution of adipose tissue, specifically higher subcutaneous fat relative to visceral fat, further influences survival, suggesting that tailored treatment strategies could enhance outcomes.</p><p>Nicole Nardella, MS<sup>1</sup>; Nathan Gilchrist, BS<sup>1</sup>; Adrianna Oraiqat, BS<sup>1</sup>; Sarah Goodchild, BS<sup>1</sup>; Dena Berhan, BS<sup>1</sup>; Laila Stancil, HS<sup>1</sup>; Jeanine Milano, BS<sup>1</sup>; Christina Santiago, BS<sup>1</sup>; Melissa Adams, PA-C<sup>1</sup>; Pamela Hodul, MD<sup>1</sup></p><p><sup>1</sup>Moffitt Cancer Center, Tampa, FL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Pancreatic cancer (PC) is a devastating diagnosis, with 66,440 new cases and 51,750 deaths estimated in 2024. The prevalence of malnutrition in patients with cancer has been reported to range from 30-85% depending on patient age, cancer type, and stage of disease. Specifically, PC patients frequently present with malnutrition, which can lead to negative effects on quality of life and tumor therapy. We hypothesize that increased awareness of early nutritional intervention for PC patients has led to high utilization of dietitian consultations at our tertiary cancer center.</p><p><b>Methods:</b> This IRB-exempt retrospective review included newly diagnosed, treatment-naïve PC patients presenting to our institution in 2021-2023 (n = 701). We define newly diagnosed as having pathologically confirmed adenocarcinoma within 30 days of clinic presentation. Patients were screened for weight loss and risk of malnutrition using the validated Malnutrition Screening Tool (MST) (89.5% positive predictive value) at the initial consultation and were referred to a dietitian based on risk or patient preference. Data were collected on demographics, disease stage (localized vs metastatic), tumor location (head/neck/uncinate, body/tail, multi-focal), presenting symptoms (percent weight loss, abdominal pain, bloating, nausea/vomiting, fatigue, change in bowel habits), experience of jaundice, pancreatitis, and gastric outlet obstruction, and dietitian consultation. Descriptive variables and Fisher's exact test were used to report outcomes.</p><p><b>Results:</b> The majority of patients were male (54%) with a median age of 70 (27-95). About half of patients had localized disease (54%), with primary tumor location in the head/neck/uncinate region (57%). Patients with head/neck/uncinate tumor location mostly had localized disease (66%), while patients with body/tail tumors tended to have metastatic disease (63%). See Table 1 for further demographics. Unintentional weight loss was experienced by 66% of patients (n = 466), 69% of localized patients (n = 261) and 64% of metastatic patients (n = 205). Patients with localized disease reported a 12% weight loss over a median of 3 months, while metastatic patients reported a 10% weight loss over a median of 5 months. Of the localized patients, the majority presented with symptoms of abdominal pain (66%), nausea/vomiting/fatigue (61%), and change in bowel habits (44%). Presenting symptoms of the metastatic patients were similar (see Table 2). There was no statistically significant association between tumor location and presenting symptoms. Dietitian consults occurred for 67% (n = 473) of the patient population, 77% for those with localized disease and 57% for those with metastatic disease.
Of those with reported weight loss, 74% (n = 343) had a dietitian consultation.</p><p><b>Conclusion:</b> Overall, a high number of newly diagnosed, treatment-naïve PC patients presented with malnutrition. Patients with localized disease and tumors located in the head/neck/uncinate region experienced the greatest burden of gastrointestinal symptoms, including nausea, vomiting, change in bowel habits, and fatigue. Early implementation of a proactive nutritional screening program resulted in increased awareness of malnutrition and referral for nutritional intervention for newly diagnosed PC patients.</p><p><b>Table 1.</b> Demographics and Disease Characteristics.</p><p></p><p><b>Table 2.</b> Presenting Symptoms.</p><p></p><p>Nicole Nardella, MS<sup>1</sup>; Nathan Gilchrist, BS<sup>1</sup>; Adrianna Oraiqat, BS<sup>1</sup>; Sarah Goodchild, BS<sup>1</sup>; Dena Berhan, BS<sup>1</sup>; Laila Stancil, HS<sup>1</sup>; Jeanine Milano, BS<sup>1</sup>; Christina Santiago, BS<sup>1</sup>; Melissa Adams, PA-C<sup>1</sup>; Pamela Hodul, MD<sup>1</sup></p><p><sup>1</sup>Moffitt Cancer Center, Tampa, FL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Pancreatic cancer (PC) is an aggressive disease, with a 5-year survival rate of 13%. Symptoms occur late in the disease course, leading to approximately 50% of patients presenting with metastatic disease. New-onset diabetes is often one of the first symptoms of PC, with diagnosis occurring up to 3 years before cancer diagnosis. We hypothesize that increasing awareness of PC prevalence in patients with diabetes, both new-onset and pre-existing, may lead to earlier PC diagnosis.</p><p><b>Methods:</b> This IRB-exempt retrospective review included new PC patients presenting to our institution in 2021-2023 with a diabetes diagnosis (n = 458). We define new-onset diabetes as having been diagnosed within 3 years prior to pathologically confirmed adenocarcinoma. We define newly diagnosed PC as having pathologically confirmed adenocarcinoma within 30 days of clinic presentation. Data were collected on demographics, staging (localized vs metastatic), tumor location (head/neck/uncinate, body/tail, multi-focal), treatment initiation, diabetes onset (new-onset vs pre-existing), diabetes regimen, and weight loss. Descriptive variables were used to report outcomes.</p><p><b>Results:</b> In the study period, 1,310 patients presented to our institution. Of those, 35% had a diabetes diagnosis (n = 458). The majority of patients were male (61%), with an age at PC diagnosis of 69 (41-92). Patients mostly had localized disease (57%), with primary tumor location in the head/neck/uncinate region (59%). New-onset diabetes was present in 31% of diabetics (11% of all new patients), with 63% having localized disease (79% head/neck/uncinate) and 37% metastatic (66% body/tail). Of those with pre-existing diabetes (69%), 54% had localized disease (69% head/neck/uncinate) and 46% had metastatic disease (53% body/tail). See Table 1 for further demographic/disease characteristics. Abrupt worsening of diabetes was seen in 10% (n = 31) of patients with pre-existing diabetes, and 12% had a change in their regimen prior to PC diagnosis. Hence, 13% (175/1,310) of all new patients presented with either new-onset or worsening diabetes. Weight loss was present in 75% (n = 108) of patients with new-onset diabetes, with a median of 14% weight loss (3%-38%) over 12 months (1-24).
Alternatively, weight loss was present in 66% (n = 206) of patients with pre-existing diabetes, with a median of 14% weight loss (4%-51%) over 6 months (0.5-18). Diabetes medication was as follows: 41% oral, 30% insulin, 20% both oral and insulin, and 10% no medication. Of those patients with new-onset diabetes, 68% were diagnosed within 1 year of PC diagnosis and 32% were diagnosed within 1-3 years of PC diagnosis. Of those within 1 year of diagnosis, 68% had localized disease, with 81% having head/neck/uncinate tumors. Of the metastatic patients (31%), 73% had body/tail tumors. For patients with a diabetes diagnosis within 1-3 years of PC diagnosis, 52% had localized disease (75% head/neck/uncinate) and 48% had metastatic disease (59% body/tail). See Table 2 for further characteristics.</p><p><b>Conclusion:</b> Overall, approximately one-third of new patients presenting with PC at our institution had diabetes, and new-onset diabetes was present in one-third of those patients. The majority of diabetes patients presented with localized head/neck/uncinate tumors. When comparing new-onset vs pre-existing diabetes, patients with new-onset diabetes tended to experience greater weight loss over a longer time and had more localized disease than those with pre-existing diabetes. Patients with a diabetes diagnosis within 1 year of PC diagnosis had more localized disease (head/neck/uncinate). Hence, increased awareness of diabetes in relation to PC, particularly new-onset and worsening pre-existing diabetes, may lead to earlier diagnosis.</p><p><b>Table 1.</b> Demographics and Disease Characteristics.</p><p></p><p><b>Table 2.</b> New-Onset Diabetes Characteristics.</p><p></p><p>Marcelo Mendes, PhD<sup>1</sup>; Gabriela Oliveira, RD<sup>2</sup>; Ana Zanini, RD, MSc<sup>2</sup>; Hellin dos Santos, RD, MSc<sup>2</sup></p><p><sup>1</sup>Cicatripelli, Belém, Para; <sup>2</sup>Prodiet Medical Nutrition, Curitiba, Parana</p><p><b>Encore Poster</b></p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> According to the NPUAP, a pressure injury (PI) is damage that occurs to the skin and/or underlying soft tissue, primarily over bony prominences, and may also be related to the use of medical devices. PIs can range from intact skin to deeper ulcers, affecting structures such as muscles and bones. The aim of this study was to report the experience of using a specialized supplement for wound healing in the treatment of a PI.</p><p><b>Methods:</b> This is a case report based on the experience of the nurses from Cicatripelli®. Data were collected from May to July 2024 through a review of medical records and photographs showing the wound's progression. The patient was a 69-year-old woman with COPD, diabetes mellitus, and systemic arterial hypertension, who denied smoking and alcohol consumption. She developed a stage 4 PI in the sacral region after 15 days of mechanical ventilation due to bacterial pneumonia and was admitted to a private clinic for treatment on May 2, 2024. Initial wound assessment: measurements of 16.5 x 13 x 4 cm (W x L x D); abundant purulent exudate with a foul odor; intact peripheral skin; mild to moderate pain; 75% granulation tissue and 25% liquefactive necrosis (slough) (Figure 1). The wound was cleansed with 0.1% polyhexamethylene biguanide (PHMB) solution, followed by conservative debridement, primary coverage with hydrofiber with 1.2% silver, and secondary coverage with cotton gauze and transparent film, with dressing changes scheduled every 72 hours.
Supplementation (Correctmax – Prodiet Medical Nutrition) started on 05/20/2024, with a dosage of 2 sachets per day, containing 10 g of collagen peptide, 3 g of L-arginine, 612 mg of vitamin A, 16 mg of vitamin E, 508 mg of vitamin C, 30 mcg of selenium, and 16 mg of zinc.</p><p><b>Results:</b> On the 17th day of supplementation, the hydrofiber with silver dressing was replaced with PHMB-impregnated gauze, as the wound showed no signs of infection and demonstrated significant clinical improvement. Measurements were 8 x 6 x 2 cm (W x L x D), with moderate serosanguineous exudate, intact peripheral skin, 100% granulation tissue, and significant improvement in pain and odor (Figure 2). On the 28th day, the dressing was switched to calcium and sodium alginate to optimize exudate control due to the appearance of mild dermatitis. Low-intensity laser therapy was applied, and a skin protective spray was used. Wound assessment showed measurements of 7 x 5.5 x 1.5 cm (W x L x D), with maintained characteristics (Figure 3). On the 56th day, the patient returned for a dressing change and discharge instructions, as she could not continue the treatment due to a lack of resources. The approach remained the same, with dressing changes every 3 days. Wound assessment showed measurements of 5 x 3.5 x 0.5 cm (W x L x D), with an approximately 92% reduction in wound area, epithelialized margins, and maintained characteristics (Figure 4).</p><p><b>Conclusion:</b> Nutritional intervention with specific nutrient supplementation can aid in the management of complex wounds, serving as a crucial tool in the healing process and contributing to reduced healing time.</p><p></p><p><b>Figure 1.</b> Photo of the wound on the day of the initial assessment on 05/02/2024.</p><p></p><p><b>Figure 2.</b> Photo of the wound after 17 days of supplementation on 06/06/2024.</p><p></p><p><b>Figure 3.</b> Photo of the wound after 28 days of supplementation on 06/17/2024.</p><p></p><p><b>Figure 4.</b> Photo of the wound after 56 days of supplementation on 07/15/2024.</p><p>Ludimila Ribeiro, RD, MSc<sup>1</sup>; Bárbara Gois, RD, PhD<sup>2</sup>; Ana Zanini, RD, MSc<sup>3</sup>; Hellin dos Santos, RD, MSc<sup>3</sup>; Ana Paula Celes, MBA<sup>3</sup>; Flávia Corgosinho, PhD<sup>2</sup>; Joao Mota, PhD<sup>4</sup></p><p><sup>1</sup>School of Nutrition, Federal University of Goiás, Goiania, Goias; <sup>2</sup>School of Nutrition, Federal University of Goiás, Goiânia, Goias; <sup>3</sup>Prodiet Medical Nutrition, Curitiba, Parana; <sup>4</sup>Federal University of Goias, Goiania, Goias</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Postprandial blood glucose is considered an important risk factor for the development of macrovascular and microvascular diseases. Despite the use of hypoglycemic agents, patients with diabetes often experience postprandial hyperglycemia due to unbalanced meals. The aim of this study was to compare the effects of a low glycemic index formula for glycemic control, as a substitute for a standard breakfast, in patients with type 2 diabetes.</p><p><b>Methods:</b> This randomized, placebo-controlled, crossover study included 18 individuals with type 2 diabetes. Participants were instructed to consume, in random order, either a nutritional formula or a typical Brazilian breakfast with the same caloric content for three consecutive weekdays in different weeks.
The nutritional formula (200 mL) provided 200 kcal, with 20 g of carbohydrates, 8.8 g of protein, 9.4 g of fat (MUFA: 5.6 g, PUFA: 2.0 g), and 3.0 g of fiber (DiamaxIG – Prodiet Medical Nutrition), serving as a breakfast substitute. Both the nutritional formula and the standard breakfast were provided to the participants. During the two weeks of intervention, participants used continuous glucose monitoring sensors (Libre 2). Weight and height were measured to calculate body mass index (BMI), and medication use was monitored.</p><p><b>Results:</b> The sample consisted of 61% females, with a mean age of 50.28 ± 12.58 years. The average blood glucose level was 187.13 ± 77.98 mg/dL and BMI 29.67 ± 4.86 kg/m². All participants were taking metformin, and two were taking it concomitantly with insulin. There were no changes in medication doses or regimens during the study. The incremental area under the curve was significantly lower with the nutritional formula than with the standard breakfast (2,794.02 ± 572.98 vs. 4,461.55 ± 2,815.73, p = 0.01).</p><p><b>Conclusion:</b> The low glycemic index formula for glycemic control significantly reduced the postprandial glycemic response compared to a standard Brazilian breakfast in patients with type 2 diabetes. These findings suggest that incorporating low glycemic index meals could be an effective strategy for better managing postprandial blood glucose levels in this population, which may help mitigate the risk of developing macro- and microvascular complications.</p><p>Kirk Kerr, PhD<sup>1</sup>; Bjoern Schwander, PhD<sup>2</sup>; Dominique Williams, MD, MPH<sup>1</sup></p><p><sup>1</sup>Abbott Nutrition, Columbus, OH; <sup>2</sup>AHEAD GmbH, Bietigheim-Bissingen, Baden-Wurttemberg</p><p><b>Financial Support:</b> Abbott Nutrition.</p><p><b>Background:</b> According to the World Health Organization, obesity is a leading risk factor for global noncommunicable diseases such as diabetes, heart disease, and cancer. Weight cycling, often defined as intentionally losing and unintentionally regaining weight, is observed in people with obesity and can have adverse health and economic consequences. This study develops the first model of the health economic consequences of weight cycling in individuals with obesity, defined by a body mass index (BMI) ≥ 30 kg/m².</p><p><b>Methods:</b> A lifetime state-transition model (STM) with monthly cycles was developed to simulate a cohort of individuals with obesity, comparing “weight cyclers” versus “non-cyclers”. The simulated patient cohort was assumed to have an average BMI of 35.5 kg/m², with 11% of patients having cardiovascular disease and 6% having type 2 diabetes (T2D). Key outcomes were the cost per obesity-associated event avoided, the cost per life-year (LY) gained, and the cost per quality-adjusted life year (QALY) gained, using a US societal perspective. Transition probabilities for obesity-associated diseases were informed by meta-analyses and were based on US disease-related base risks. Risks were adjusted by BMI-related, weight-cycling-related, and T2D-related relative risks (RR). BMI progression, health utilities, direct and indirect costs, and other population characteristics were informed by published US studies. Future costs and effects were discounted by 3% per year.
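<p>To make the mechanics of such a model concrete, the following is a minimal sketch of a monthly-cycle state-transition (Markov) cohort simulation with 3% annual discounting; the states, transition probabilities, and costs are hypothetical placeholders, not the published model inputs.</p>

```python
# Minimal sketch of a monthly-cycle state-transition (Markov) cohort model with
# annual discounting; all inputs below are hypothetical, illustrative values.
import numpy as np

states = ["obese, event-free", "post-event", "dead"]
# Hypothetical monthly transition probabilities (each row sums to 1)
P = np.array([[0.9960, 0.0030, 0.0010],
              [0.0000, 0.9950, 0.0050],
              [0.0000, 0.0000, 1.0000]])
monthly_cost = np.array([50.0, 400.0, 0.0])        # hypothetical per-state costs ($/month)
annual_discount = 0.03
monthly_discount = (1 + annual_discount) ** (1 / 12) - 1

cohort = np.array([1.0, 0.0, 0.0])                 # whole cohort starts event-free
total_cost = 0.0
life_years = 0.0
for month in range(12 * 40):                       # 40-year horizon as a lifetime proxy
    disc = 1 / (1 + monthly_discount) ** month
    total_cost += disc * (cohort @ monthly_cost)   # discounted expected cost this cycle
    life_years += disc * (cohort[0] + cohort[1]) / 12
    cohort = cohort @ P                            # advance the cohort one monthly cycle

print(f"Discounted cost: {total_cost:.0f}, discounted LYs: {life_years:.2f}")
```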
Deterministic and probabilistic sensitivity analyses were performed to investigate the robustness of the results.</p><p><b>Results:</b> Simulating a lifetime horizon, non-cyclers had 0.090 obesity-associated events avoided, 0.602 LYs gained, 0.518 QALYs gained and reduced total costs of approximately $4,592 ($1,004 direct and $3,588 indirect costs) per person. Sensitivity analysis showed the model was most responsive to changes in patient age and cardiovascular disease morbidity and mortality risks. Non-cycling as the cost-effective option was robust in sensitivity analyses.</p><p><b>Conclusion:</b> The model shows that weight cycling has a major impact on the health of people with obesity, resulting in increased direct and indirect costs. The approach to chronic weight management programs should focus not only on weight reduction but also on weight maintenance to prevent the enhanced risks of weight cycling.</p><p>Avi Toiv, MD<sup>1</sup>; Arif Sarowar, MSc<sup>2</sup>; Hope O'Brien, BS<sup>2</sup>; Thomas Pietrowsky, MS, RD<sup>1</sup>; Nemie Beltran, RN<sup>1</sup>; Yakir Muszkat, MD<sup>1</sup>; Syed-Mohammad Jafri, MD<sup>1</sup></p><p><sup>1</sup>Henry Ford Hospital, Detroit, MI; <sup>2</sup>Wayne State University School of Medicine, Detroit, MI</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Age is an important factor in the transplant evaluation as age at transplantation is historically thought to influence transplant outcomes in organ transplant recipients. There is limited data on the impact of age on intestinal (IT) and multivisceral (MVT) organ transplantation. This study investigates the impact of age on post-transplant outcomes in patients who received an intestinal or multivisceral (including intestine) transplant, comparing those under 40 years old to those aged 40 and above.</p><p><b>Methods:</b> We conducted a retrospective chart review of all patients who underwent IT at an academic transplant center from 2010 to 2023. The primary outcome was patient survival and graft failure analyzed with Kaplan-Meier survival analysis.</p><p><b>Results:</b> Among 50 IT recipients, there were 11 IT recipients &lt; 40years old and 39 IT recipients ≥40 years old (Table). The median age at transplant in the &lt;40 group was 37 years (range, 17-39) and in the ≥40 group was 54 years (range, 40-68). In both groups, the majority of transplants were exclusively IT, however they included MVT as well. Kaplan-Meier survival analysis revealed no significant differences between the two age groups in graft survival or patient mortality. Reoperation within 1 month was significantly linked to decreased survival (p = 0.015) and decreased graft survival (p = 0.003) as was moderate to severe rejection within 1 month (p = 0.009) but was not significantly different between the two age groups. Wilcoxon rank-sum test showed no difference between groups in regard to reoperation or moderate to severe rejection at 1 or 3 months or the development of chronic kidney disease.</p><p><b>Conclusion:</b> Age at the time of intestinal transplantation (&lt; 40 vs. ≥40 years old) does not appear to significantly impact major transplant outcomes, such as patient mortality, graft survival, or rejection rates. 
While reoperation and moderate to severe rejection within 1 and 3 months negatively affected overall outcomes, these complications were not more frequent in older or younger patients.</p><p><b>Table 1.</b> Demographic Characteristics of Intestinal Transplant Recipients.</p><p></p><p>BMI, body mass index; TPN, total parenteral nutrition.</p><p><b>International Poster of Distinction</b></p><p>Gabriela de Oliveira Lemos, MD<sup>1</sup>; Natasha Mendonça Machado, PhD<sup>2</sup>; Raquel Torrinhas, PhD<sup>3</sup>; Dan Linetzky Waitzberg, PhD<sup>3</sup></p><p><sup>1</sup>University of Sao Paulo School of Medicine, Brasília, Distrito Federal; <sup>2</sup>University of Sao Paulo School of Medicine, São Paulo; <sup>3</sup>Faculty of Medicine of the University of São Paulo, São Paulo</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> Ganepão 2023.</p><p><b>Publication:</b> Braspen Journal. ISSN 2764-1546 | Online Version.</p><p><b>Financial Support:</b> This study is linked to project no. 2011/09612-3 and was funded by the Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP).</p><p><b>Background:</b> Sphingolipids (SLs) are key molecules in cell signaling and play a central role in lipotoxicity. Moreover, they are major constituents of eukaryotic cell membranes. Data on SL remodeling in the gastrointestinal tract after Roux-en-Y gastric bypass (RYGB) are lacking and could help elucidate tissue turnover and metabolism. This protocol aims to evaluate the SL profile and remodeling within the plasma and different segments of the gastrointestinal tract (GIT) before and 3 months after RYGB in a population of women with obesity and type 2 diabetes mellitus (T2DM) and to correlate the changes within these tissues. This investigation is part of the SURMetaGIT study, registered at www.clinicalTrials.gov (NCT01251016).</p><p><b>Methods:</b> Twenty-eight women with obesity and T2DM who underwent RYGB were enrolled in this protocol. Those on insulin therapy were excluded. We collected plasma (n = 28) and intestinal samples from the gastric pouch (n = 9), duodenum (n = 8), jejunum (n = 9), and ileum (n = 9) for untargeted metabolomics analysis at baseline and 3 months post-surgery. Indian ink (SPOT®) was used to mark the sites for the follow-up biopsies. SLs were identified using high-performance liquid chromatography coupled to mass spectrometry. Data were processed and analyzed using AnalysisBaseFileConverter and MS-DIAL, respectively. The magnitude of SL changes after RYGB was assessed by the fold change (log2 post-surgery mean/pre-surgery mean). The Spearman test was performed for the correlation analysis. A p value &lt; 0.05 was considered significant. Statistics were carried out in the Jamovi software (2.2.5) and MetaboAnalyst 5.0.</p><p><b>Results:</b> A total of 34 SLs were identified, including sphingomyelins (SM), ceramides (Cer), and glycosphingolipids (GlcSL). SM was the most common SL class found in the plasma (27 SM, 3 Cer, and 4 GlcSL) and in the GIT (16 SM, 13 Cer, and 5 GlcSL). Every GIT tissue presented distinct SL remodeling. The plasma and the jejunum best discriminated SL changes following surgery (Figure 1). The jejunum expressed the most robust changes, followed by the plasma and the duodenum (Figure 2). Figure 3 presents the heatmap of the plasma and the GIT tissues. Correlation analysis showed that the plasmatic SLs, particularly SM(d32:0), GlcCer(d42:1), and SM(d36:1), strongly correlated with jejunum SLs. 
These lipids showed a negative-strong correlation with jejunal sphingomyelins, but a positive-strong correlation with jejunal ceramides (Table 1).</p><p><b>Conclusion:</b> RYGB was associated with SL remodeling in the plasma and GIT. SM was the main SL found in the plasma and the GIT. The most robust changes occurred in the jejunum and the plasma, and these 2 samples presented the more relevant correlation. Considering our findings, the role of SM in metabolic changes after RYGB should be investigated.</p><p><b>Table 1.</b> Correlation Analysis of Sphingolipids from the plasma with the Sphingolipids from the Gastrointestinal Tract.</p><p></p><p>*p &lt; ,05; **p &lt; ,01; ***p &lt; 0,001.</p><p></p><p>The green circle represents samples at baseline and the red circles represent the samples 3 months after RYGB.</p><p><b>Figure 1.</b> Principal Component Analysis (PCA) from GIT Tissues and Plasma.</p><p></p><p>Fold change = log2 post-surgery mean/pre-surgery mean.</p><p><b>Figure 2.</b> Fold Change of Sphingolipids from the Plasma and Gastrointestinal Tract.</p><p></p><p>The map under the right top green box represents lipids’ abundance before surgery, and the map under the left top red box represents lipids’ abundance after RYGB.</p><p><b>Figure 3.</b> Heatmap of Sphingolipids from the Plasma and Gastrointestinal Tract.</p><p>Lucas Santander<sup>1</sup>; Gabriela de Oliveira Lemos, MD<sup>2</sup>; Daiane Mancuzo<sup>3</sup>; Natasha Mendonça Machado, PhD<sup>4</sup>; Raquel Torrinhas, PhD<sup>5</sup>; Dan Linetzky Waitzberg, PhD<sup>5</sup></p><p><sup>1</sup>Universidade Santo Amaro (Santo Amaro University), São Bernardo Do Campo, Sao Paulo; <sup>2</sup>University of Sao Paulo School of Medicine, Brasília, Distrito Federal; <sup>3</sup>Universidade São Caetano (São Caetano University), São Caetano do Sul, Sao Paulo; <sup>4</sup>University of Sao Paulo School of Medicine, São Paulo; <sup>5</sup>Faculty of Medicine of the University of São Paulo, São Paulo</p><p><b>Financial Support:</b> Fundação de Amparo a Pesquisa do Estado de São Paulo.</p><p><b>Background:</b> Microalbuminuria (MAL) is an early biomarker of kidney injury, linked to conditions like hypertension, type 2 diabetes (T2DM), and obesity. It is also associated with higher cardiovascular (CV) risk. This protocol examines the impact of Roux-en-Y gastric bypass (RYGB) on MAL and CV markers in patients with obesity, T2DM, and MAL.</p><p><b>Methods:</b> 8 women with grade II-III obesity, T2DM, and MAL who underwent RYGB were included. Patients on insulin therapy were not included. MAL was defined as a urinary albumin-to-creatinine ratio &gt; 30 mg/g. MAL, glycemic, and lipid serum biomarkers were measured at baseline and three months post-surgery. Systolic (SBP) and diastolic (DBP) blood pressures and medical treatments for T2DM and hypertension were also assessed. T2DM remission was defined by ADA 2021 criteria. Categorical variables were reported as absolute and relative frequencies, while continuous variables were expressed as median and IQR based on normality tests. Intragroup and intergroup comparisons were conducted using the Wilcoxon and Mann-Whitney tests for numeric data. The Fisher test was performed when necessary to compare dichotomic variables. Data were analyzed in the JASP software version 0.18.1.0.</p><p><b>Results:</b> Overall, RYGB was associated with weight loss, improved body composition, and better CV markers (Table 1). Post-surgery, MAL decreased by at least 70% in all patients and resolved in half. 
All patients with MAL resolution had pre-surgery levels ≤ 100 mg/g. Those without resolution had severe pre-surgery MAL (33.8 vs. 667.5, p = 0.029), and higher SBP (193 vs. 149.5, p = 0.029) DBP (138 vs. 98, p = 0.025). Blood pressure decreased after surgery but remained higher in patients without MAL resolution: SBP (156.0 vs. 129.2, p = 0.069) and DBP (109.5 vs. 76.5, p &lt; 0.001). MAL resolution was not linked to T2DM remission at 3 months (75% vs. 50%, p = 1.0). One patient had worsened MAL (193.0 vs. 386.9 mg/g) after RYGB. Glomerular filtration rate (GFR) tended to increase post-surgery only in the group with MAL resolution (95.4 vs. 108.2 ml/min/1.73 m², p = 0.089), compared to the group without MAL resolution (79.2 vs. 73.7 ml/min/1.73 m², p = 0.6).</p><p><b>Conclusion:</b> RYGB effectively reduced markers of renal dysfunction and cardiovascular risk in this cohort. Patients showed a decrease in MAL, with resolution in half of the patients. The small sample size and short follow-up period may have limited the overall impact of the surgery on renal function. Future studies with larger cohorts and longer follow-ups are needed to understand better the effects of bariatric surgery on MAL, and its relation to other CV markers.</p><p><b>Table 1.</b> Biochemical and Clinical Data Analysis Following RYGB.</p><p></p><p>eGFR: estimated glomerular filtration rate; HbA1c: glycated hemoglobin; HDL-c: high-density lipoprotein cholesterol; HOMA-BETA: beta-cell function by the homeostasis model; HOMA-IR: Homeostasis Model Assessment of Insulin Resistance; LDL-c: low-density lipoprotein cholesterol; Non-HDL-c: non-high-density lipoprotein cholesterol; VLDL-c: very-low-density lipoprotein cholesterol; DBP: diastolic blood pressure; SBP: systolic blood pressure; WC: waist circumference.</p><p>Michelle Nguyen, BSc, MSc<sup>1</sup>; Johane P Allard, MD, FRCPC<sup>2</sup>; Dane Christina Daoud, MD<sup>3</sup>; Maitreyi Raman, MD, MSc<sup>4</sup>; Jennifer Jin, MD, FRCPC<sup>5</sup>; Leah Gramlich, MD<sup>6</sup>; Jessica Weiss, MSc<sup>1</sup>; Johnny H. Chen, PhD<sup>7</sup>; Lidia Demchyshyn, PhD<sup>8</sup></p><p><sup>1</sup>Pentavere Research Group Inc., Toronto, ON; <sup>2</sup>Division of Gastroenterology, Department of Medicine, Toronto General Hospital, Toronto, ON; <sup>3</sup>Division of Gastroenterology, Centre Hospitalier de l'Université de Montréal (CHUM), Department of Medicine, University of Montreal, Montreal, QC; <sup>4</sup>Division of Gastroenterology, University of Calgary, Calgary, AB; <sup>5</sup>Department of Medicine, University of Alberta, Division of Gastroenterology, Royal Alexandra Hospital, Edmonton, AB; <sup>6</sup>Division of Gastroenterology, Faculty of Medicine and Dentistry, University of Alberta, Edmonton, AB; <sup>7</sup>Takeda Canada Inc., Vancouver, BC; <sup>8</sup>Takeda Canada Inc., Toronto, ON</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> 46th European Society for Clinical Nutrition and Metabolism (ESPEN) Congress. 7-10 September 2024, Milan, Italy.</p><p><b>Financial Support:</b> Funding of this study is from Takeda Canada Inc.</p><p><b>Background:</b> Teduglutide is indicated for the treatment of patients with short bowel syndrome (SBS) who are dependent on parenteral support (PS). 
This study evaluated longer-term teduglutide effectiveness and safety in Canadian patients diagnosed with SBS who were dependent on PS, using real-world evidence.</p><p><b>Methods:</b> This was an observational, retrospective study using data from the national Canadian Takeda patient support program, and included adults with SBS. Data were collected 6 months before teduglutide initiation and from initiation to Dec-01-2023, death, or loss of follow-up. Descriptive statistics characterized the population and treatment-emergent adverse events (TEAEs). Changes in parenteral nutrition/intravenous fluid supply (PN/IV) were assessed based on decreases in PN/IV volume from baseline. Statistical significance was set at p &lt; 0.05.</p><p><b>Results:</b> A total of 52 patients (60% women) were included in this study. Median age (range) was 54 (22–81) years, and 50% had Crohn's disease as their etiology of SBS. At 6 months, the median (range) absolute and percentage reductions from baseline in PN/IV volume were 3,900 mL/week (-6,960–26,784; p &lt; 0.001) and 28.1% (-82.9–100). At 24 months, the median (range) absolute reduction from baseline was 6,650 mL/week (-4,400–26,850; p = 0.003), and the proportion of patients who achieved ≥20% reduction in weekly PN/IV was 66.7%. Over the study, 27% achieved independence from PN/IV. TEAEs were reported in 51 (98%) patients (83% were serious TEAEs) during the study period; the 3 most common were weight changes, diarrhea, and fatigue.</p><p><b>Conclusion:</b> Patients showed significant decreases in PN/IV volumes after initiating teduglutide, with no unexpected safety findings. This study demonstrates the real-world, longer-term effectiveness and safety of teduglutide in Canadian patients with SBS, complementing previous clinical trials and real-world studies.</p><p><b>Poster of Distinction</b></p><p>Sarah Carter, RD, LDN, CNSC<sup>1</sup>; Ruth Fisher, RDN, LD, CNSC<sup>2</sup></p><p><sup>1</sup>Coram CVS/Specialty Infusion Services, Tullahoma, TN; <sup>2</sup>Coram CVS/Specialty Infusion Services, Saint Hilaire, MN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Perceived benefit is one factor determining therapy continuation. Little information is published regarding positive outcomes for patients receiving the GLP-2 analog teduglutide apart from rates of success in weaning HPN and hydration volumes by 20%. Patients who are new to therapy may question how long after initiation they should expect to see results. Patients may collaborate with prescribers to improve therapy tolerance if they have hope for improvements in their quality of life. This data analysis provides details regarding patients receiving teduglutide and their perceived benefits of therapy.</p><p><b>Methods:</b> Dietitians interview patients receiving teduglutide as part of a service agreement, monitoring persistence and helping to eliminate barriers to therapy. Dietitians make weekly and monthly calls based on patients’ drug start dates and document interventions in flowsheets in patients’ electronic medical records. The interventions are published to data visualization software to monitor compliance with dietitian outreach as part of a quality improvement project. All patients were diagnosed with short bowel syndrome, but baseline characteristics were not exported to this dashboard. This study is a retrospective analysis of the existing dashboard using 3 years of assessments (May 1, 2021-April 30, 2024). 
Exclusion criteria included therapy dispensed from a pharmacy not using the company's newest computer platform and patients who did not respond to outreach. We analyzed the time on therapy before a positive outcome was reported by the patient to the dietitian and which positive outcomes were most frequently reported.</p><p><b>Results:</b> The data set included 336 patients with 2509 phone assessments. The most frequently reported first positive outcome was improved ostomy output/less diarrhea (72%, n = 243), occurring just after a month on therapy (mean 31 ± 26.4 days). The mean time to first positive outcome for all patients who reported one was 32 ± 28.5 days (n = 314). Of the 22 patients who reported no positive outcome, 13 did not answer the dietitians’ calls after initial contact. A summary is listed in Table 1. Overall positive outcomes reported were improved ostomy output/less diarrhea (87%, n = 292), weight gain (70%, n = 236), improved appetite/interest in food (33%, n = 112), feeling stronger/more energetic (27%, n = 92), improved quality of life (23%, n = 90), improved lab results (13%, n = 45) and fewer antidiarrheal medications (12%, n = 40). Of the 218 patients receiving parenteral support, 44 (20%) stopped hydration and HPN completely, and another 92 (42%) reported less time or fewer days on hydration and HPN, for a total of 136 patients (62%) experiencing a positive outcome of parenteral support weaning. Patients reported improvements in other areas of their lives including fewer hospitalizations (n = 39), being able to travel (n = 35), tolerating more enteral nutrition volume (n = 19), returning to work/school (n = 14) and improving sleep (n = 13). A summary is diagrammed in Figure 1.</p><p><b>Conclusion:</b> This retrospective analysis indicates that teduglutide is associated with improved symptom control and improved quality of life measures, with most patients seeing a response to therapy within the first 2 months. A decrease in ostomy output and diarrhea was the most frequent recognizable response to therapy. In addition to the goal of weaning parenteral support, clinicians should be cognizant of improvements in patients’ clinical status that can have a significant impact on quality of life.</p><p><b>Table 1.</b> Timing of First Reported Positive Outcome by Patients Receiving Teduglutide.</p><p></p><p></p><p><b>Figure 1.</b> Total Positive Outcomes Reported by Patients (n = 336).</p><p><b>Poster of Distinction</b></p><p>Jennifer Cholewka, RD, CNSC, CDCES, CDN<sup>1</sup>; Jeffrey Mechanick, MD<sup>1</sup></p><p><sup>1</sup>The Mount Sinai Hospital, New York, NY</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Bariatric surgery is a guideline-directed intervention for patients with severe obesity. Adherence to post-operative recommendations is variable, and consequent undernutrition complicated by multiple micronutrient deficiencies is prevalent. A post-bariatric surgery syndrome (PBSS) is not well defined and therefore poorly understood, though early detection and intervention would likely decrease clinical and economic burdens. Our current experience with PBSS is presented here as a case series. 
We will define PBSS based on clinical evidence related to risk factors, interventions, and clinical/biochemical responses.</p><p><b>Methods:</b> Twenty four consecutive patients referred to our metabolic support service were identified between January 1, 2019 and December 31, 2023 who were admitted to The Mount Sinai Hospital in New York City with a history of RYGB (roux en y gastric bypass) or BPDDS (biliopancreatic diversion with duodenal switch) and found to have failure to thrive, undernutrition, clinical/biochemical features of at least one micronutrient deficiency, and indications for parenteral nutrition. Patients were excluded if they had surgical complications or were not treated with parenteral nutrition. Fifteen patients were included in this case series and de-identified prior to data collection using the electronic health records (EPIC) and coded data collection sheets. Descriptive statistical methods were used to analyze parametric variables as mean ± standard deviation and non-parametric variables as median (interquartile range).</p><p><b>Results:</b> Results are provided in Table 1.</p><p><b>Conclusion:</b> The PBSS is defined by significant decompensation following a bariatric surgery procedure with malabsorptive component characterized by failure to thrive, hypoalbuminemia, multiple micronutrient deficiencies, and need for parenteral nutrition. Major risk factors include inadequate protein and micronutrient intake due to either unawareness (e.g., not recommended) or poor adherence, significant alcohol consumption, or a complicating medical/surgical condition. Parenteral nutrition formulation was safe in this population and prioritizes adequate nitrogen, nonprotein calories, and micronutrition. Further analyses on risk factors, responses to therapy, and role of a multidisciplinary team are in progress.</p><p><b>Table 1.</b> Risks/Presentation.</p><p></p><p><b>Table 2.</b> Responses to Parenteral Nutrition Intervention.</p><p></p><p>Holly Estes-Doetsch, MS, RDN, LD<sup>1</sup>; Aimee Gershberg, RD, CDN, CPT<sup>2</sup>; Megan Smetana, PharmD, BCPS, BCTXP<sup>3</sup>; Lindsay Sobotka, DO<sup>3</sup>; Kristen Roberts, PhD, RDN, LD, CNSC, FASPEN, FAND<sup>4</sup></p><p><sup>1</sup>The Ohio State University, Columbus, OH; <sup>2</sup>NYC Health + Hospitals, New York City, NY; <sup>3</sup>The Ohio State University Wexner Medical Center, Columbus, OH; <sup>4</sup>The Ohio State University, Granville, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Decompensated cirrhosis increases the risk of fat maldigestion through altered bile synthesis and excretion through the bile canaliculi. Maldigestion increases the risk of vitamin and mineral deficiencies which when untreated contribute to consequential health issues such as metabolic bone disease, xerophthalmia, and hyperkeratosis. There is an absence of comprehensive guidelines for prevention and treatment of deficiencies.</p><p><b>Methods:</b> Medical and surgical history, anthropometrics, medications and nutritional supplements, laboratory data, and medical procedures were extracted and analyzed from the electronic medical record.</p><p><b>Results:</b> A patient with congenital biliary atresia and decompensated cirrhosis was seen in a hepatology outpatient clinic. Biochemical assessment revealed severe vitamin A deficiency and suboptimal vitamin D and zinc status. Physical assessment indicated telogen effluvium and transient blurry vision. 
Despite a history of high-dose oral retinyl acetate ranging from 10,000-50,000 units daily, a 3-day course of 100,000 units via intramuscular injection, and co-treatment of zinc deficiency to ensure adequate circulating retinol binding protein, normalization of serum retinol could not be achieved over the preceding 10 years. The patient's serum vitamin A level normalized following liver transplantation.</p><p><b>Conclusion:</b> In decompensated cirrhosis, there is a lack of sufficient guidelines for micronutrient dosing when traditional treatment strategies are unsuccessful. Furthermore, altered secretion of transport proteins due to underlying liver dysfunction may pose challenges in evaluating laboratory markers of micronutrient status. Collaboration with pharmacy and medicine supports a thorough assessment and the establishment of a safe treatment and monitoring plan. Clinical research is needed to establish acceptable and safe dosing strategies for patients with chronic, unresponsive fat-soluble vitamin deficiencies.</p><p>Gang Wang, PhD<sup>1</sup></p><p><sup>1</sup>Nimble Science, Calgary, AB</p><p><b>Financial Support:</b> This work was supported by the National Research Council of Canada Industrial Research Assistance Program (NRC IRAP), an Alberta Innovate Accelerating Innovations into CarE (AICE) grant, and partially by a Clinical Problems Incubator Grant from the Snyder Institute for Chronic Disease at the University of Calgary.</p><p><b>Background:</b> The small intestine (SI) microbiome plays a crucial role in nutrient absorption, and emerging evidence indicates that fecal content is insufficient to represent the SI microenvironment. Endoscopic sampling is possible but expensive and not scalable. Ingestible sampling capsule technologies are emerging; however, potential contamination remains a major limitation of these devices.</p><p><b>Methods:</b> We previously reported the Small Intestinal MicroBiome Aspiration (SIMBA) capsule as an effective means of sampling, sealing and preserving SI luminal contents for 16S rRNA gene sequencing analysis. A subset of the DNA samples, including SI samples collected by SIMBA capsules (CAP) and the matched saliva (SAL), fecal (FEC) and duodenal endoscopic aspirate (ASP) and brush (BRU) samples, from 16 participants recruited for an observational clinical validation study were sent for shotgun metagenomic sequencing. The aims were 1) to compare the sampling performance of the capsule (CAP) with endoscopic aspirates (ASP) and with 850 small intestine, large intestine and fecal samples from the Clinical Microbiomics data warehouse (PRJEB28097), and 2) to characterize samples from the 4 different sampling sites in terms of species composition and functional potential.</p><p><b>Results:</b> Four of 80 samples (1/16 SAL, 2/16 ASP, 0/16 BRU, 1/16 CAP and 0/16 FEC) failed library preparation, and 76 samples were shotgun sequenced (average of 38.5 M read pairs per sample) (Figure 1). Quality assessment demonstrated that despite the low raw DNA yield of CAP samples, they retained a minimal level of host contamination compared with ASP and BRU (mean 5.27% vs. 93.09-96.44% of average total reads per sample) (Figure 2). CAP samples shared the majority of species with ASP samples, as well as a large fraction of the species detected in terminal ileum samples. ASP and CAP sample composition was more similar to that of the duodenum, jejunum, and saliva and very different from that of large intestine and stool samples. 
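Compositional similarity statements of this kind are typically based on a between-sample dissimilarity measure; the abstract does not name the metric used, so the sketch below (in Python, with invented relative-abundance profiles) uses Bray-Curtis dissimilarity purely for illustration:

# Hypothetical relative-abundance profiles (fractions of three species summing to 1)
from scipy.spatial.distance import braycurtis

cap = [0.50, 0.30, 0.20]   # capsule (CAP) sample, assumed values
asp = [0.45, 0.35, 0.20]   # endoscopic aspirate (ASP), assumed values
fec = [0.05, 0.15, 0.80]   # fecal (FEC) sample, assumed values

# Lower Bray-Curtis dissimilarity means more similar community composition.
print(braycurtis(cap, asp))   # small value: CAP resembles ASP
print(braycurtis(cap, fec))   # larger value: CAP differs from stool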
Functional genomics further revealed GI region-specific differences: in both ASP and CAP samples we detected a number of Gut Metabolic Modules (GMMs) for carbohydrate digestion and short-chain fatty acid production. However, probiotic species, and species and genes involved in bile acid metabolism, were mainly prevalent in CAP and FEC samples and could not be detected in ASP samples.</p><p><b>Conclusion:</b> The CAP and ASP microbiomes are compositionally similar despite the high level of host contamination in ASP samples. CAP samples appear better suited than ASP samples for revealing GI region-specific functional potential. This analysis demonstrates the potential of the SIMBA capsule for characterizing the SI microbiome and supports its prospective use in observational and interventional studies investigating the impact of short-term and long-term biotic food interventions on the gut microbiome (Figure 3). A series of studies utilizing the SIMBA capsule is under way, and the detectability of biotic food intervention effects will be reported in the near future (Table 1).</p><p><b>Table 1.</b> List of Ongoing Observational and Interventional Clinical Studies using SIMBA Capsule.</p><p></p><p></p><p><b>Figure 1.</b> Shotgun Metagenomic Sequencing: Taxonomy Overview (Relative Abundance of the 10 Most Abundant Species by Sampling Site).</p><p></p><p><b>Figure 2.</b> Shotgun Metagenomic Sequencing: High-Quality Non-Host Reads by Sampling Site.</p><p></p><p><b>Figure 3.</b> Short-term and Long-term Interventional Study Protocols Using SIMBA Capsules.</p><p>Darius Bazimya, MSc. Nutrition, RN<sup>1</sup>; Francine Mwitende, RN<sup>1</sup>; Theogene Uwizeyimana, Phn<sup>1</sup></p><p><sup>1</sup>University of Global Health Equity, Kigali</p><p><b>Financial Support:</b> University of Global Health Equity.</p><p><b>Background:</b> Rwanda is currently grappling with the double burden of malnutrition and rising obesity, particularly in urban populations. As the country experiences rapid urbanization and dietary shifts, traditional issues of undernutrition coexist with increasing rates of obesity and related metabolic disorders. This study aims to investigate the relationship between these nutrition-related issues and their impact on gastrointestinal (GI) health and metabolic outcomes in urban Rwandan populations.</p><p><b>Methods:</b> A cross-sectional study was conducted in Kigali, Rwanda's capital, involving 1,200 adult participants aged 18 to 65 years. Data were collected on dietary intake, body mass index (BMI), GI symptoms, and metabolic markers such as fasting glucose, cholesterol levels, and liver enzymes. Participants were categorized into three groups (undernourished, normal weight, and overweight/obese) based on their BMI. GI symptoms were assessed using a validated questionnaire, and metabolic markers were evaluated through blood tests. Statistical analyses were performed to assess correlations between dietary patterns, BMI categories, and GI/metabolic health outcomes.</p><p><b>Results:</b> The study found that 25% of participants were classified as undernourished, while 22% were obese, reflecting the double burden of malnutrition and rising obesity in Rwanda's urban population. Among obese participants, 40% exhibited elevated fasting glucose levels (p &lt; 0.01), and 30% reported significant GI disturbances, such as irritable bowel syndrome (IBS) and non-alcoholic fatty liver disease (NAFLD). 
In contrast, undernourished individuals reported fewer GI symptoms but showed a higher prevalence of micronutrient deficiencies, including anaemia (28%) and vitamin A deficiency (15%). Dietary patterns characterized by high-fat and low-fiber intake were significantly associated with increased GI disorders and metabolic dysfunction in both obese and normal-weight participants (p &lt; 0.05).</p><p><b>Conclusion:</b> This study highlights the growing public health challenge posed by the coexistence of undernutrition and obesity in Rwanda's urban centres. The dietary shifts associated with urbanization are contributing to both ends of the nutritional spectrum, adversely affecting GI and metabolic health. Addressing these issues requires comprehensive nutrition interventions that consider the dual challenges of undernutrition and obesity, promoting balanced diets and improving access to health services. These findings have important implications for nutrition therapy and metabolic support practices in Rwanda, emphasizing the need for tailored interventions that reflect the country's unique nutritional landscape.</p><p>Levi Teigen, PhD, RD<sup>1</sup>; Nataliia Kuchma, MD<sup>2</sup>; Hijab Zehra, BS<sup>1</sup>; Annie Lin, PhD, RD<sup>3</sup>; Sharon Lopez, BS<sup>2</sup>; Amanda Kabage, MS<sup>2</sup>; Monika Fischer, MD<sup>4</sup>; Alexander Khoruts, MD<sup>2</sup></p><p><sup>1</sup>University of Minnesota, St. Paul, MN; <sup>2</sup>University of Minnesota, Minneapolis, MN; <sup>3</sup>University of Minnesota, Austin, MN; <sup>4</sup>Indiana University School of Medicine, Indianapolis, IN</p><p><b>Financial Support:</b> Achieving Cures Together.</p><p><b>Background:</b> Fecal microbiota transplantation (FMT) is a highly effective treatment for recurrent Clostridioides difficile infection (rCDI). The procedure results in the repair of the gut microbiota following severe antibiotic injury. Recovery from rCDI is associated with high incidence of post-infection irritable bowel syndrome (IBS). In addition, older age, medical comorbidities, and prolonged diarrheal illness contribute to frailty in this patient population. The effect of FMT on IBS symptoms and frailty in patients with rCDI is largely unknown. In this prospective cohort study, we collected IBS symptom and frailty data over the 3 months following FMT treatment for rCDI in patients at two large, academic medical centers.</p><p><b>Methods:</b> Consenting adults who underwent FMT treatment for rCDI were enrolled in this study (n = 113). We excluded patients who developed a recurrence of CDI within the 3-month follow-up period (n = 15) or had incomplete IBS symptom severity scale (IBS-SSS) scores at any timepoint. The IBS-SSS is a 5-item survey measuring symptom intensity, with a score range from 0 to 500 with higher scores representing greater severity. IBS-SSS was collected at baseline, 1-week post FMT, 1-month post-FMT, and 3-months post-FMT. Frailty was assessed at baseline and 3-months using the FRAIL scale (categorical variable: “Robust Health”, “Pre-Frail”, “Frail”). Kruskal-Wallis test was used to compare IBS-SSS across timepoints. Post-hoc analysis was performed with the Pairwise Wilcoxon Rank Sum Tests using the False Discovery Rate adjustment method. The Friedman test was used to compare frailty distribution between the baseline and 3-month timepoints.</p><p><b>Results:</b> Mean age of the cohort was 63.3 (SD 15.4) years; 75% of the patients were female sex (total n = 58 patients). 
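As context for the comparisons reported below, a minimal sketch of the analysis pipeline described in the Methods (Kruskal-Wallis across timepoints, then pairwise Wilcoxon rank-sum tests with false discovery rate adjustment); the score vectors here are invented, not study data:

# Sketch of the timepoint comparison described in the Methods; scores are made up.
from itertools import combinations
from scipy.stats import kruskal, ranksums
from statsmodels.stats.multitest import multipletests

scores = {
    "baseline": [134, 180, 90, 210, 150],
    "1_week": [65, 120, 40, 150, 80],
    "1_month": [70, 110, 50, 140, 90],
    "3_months": [60, 100, 45, 130, 85],
}

print(kruskal(*scores.values()))                     # omnibus test across all timepoints

pairs = list(combinations(scores, 2))                # post-hoc pairwise comparisons
pvals = [ranksums(scores[a], scores[b]).pvalue for a, b in pairs]
adjusted = multipletests(pvals, method="fdr_bh")[1]  # Benjamini-Hochberg FDR adjustment
for (a, b), p in zip(pairs, adjusted):
    print(a, "vs", b, "adjusted p =", round(p, 3))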
The IBS-SSS scores across timepoints are presented in Table 1 and Figure 1. The median IBS-SSS score at baseline was 134 [IQR 121], which decreased to a median of 65 [IQR 174] at 1 week post-FMT. The baseline timepoint was found to differ from the 1-week, 1-month, and 3-month timepoints (p &lt; 0.05). No other differences between timepoints were observed. Frailty was assessed at baseline and 3 months (total n = 52). At baseline, 71% of patients (n = 37) were considered Pre-Frail or Frail, but this percentage decreased to 46% (n = 24) at 3 months (Table 2; p &lt; 0.05).</p><p><b>Conclusion:</b> Findings from this multicenter, prospective cohort study indicate an overall improvement in both IBS symptoms and frailty in the 3 months following FMT therapy for rCDI. Notably, IBS symptom scores were found to improve by 1 week post-FMT. Further research is required to understand what predicts IBS symptom improvement following FMT and whether nutrition therapy can help support further improvement. It will also be important to further understand how nutrition therapy can help support the improvement in frailty status observed following FMT for rCDI.</p><p><b>Table 1.</b> Distribution of IBS-SSS Scores at Baseline and Following FMT.</p><p></p><p><b>Table 2.</b> Frailty Distribution Assessed by FRAIL Scale at Baseline and 3-Months Post-FMT.</p><p></p><p></p><p>Box-plot distributions of IBS-SSS scores across timepoints. The median IBS-SSS score at baseline was 134 and decreased to a median of 65 at 1 week post-FMT. The baseline timepoint was found to differ from the 1-week, 1-month, and 3-month timepoints (p &lt; 0.05).</p><p><b>Figure 1.</b> Distribution of IBS-SSS Scores by Timepoint.</p><p>Oshin Khan, BS<sup>1</sup>; Subanandhini Subramaniam Parameshwari, MD<sup>2</sup>; Kristen Heitman, PhD, RDN<sup>1</sup>; Kebire Gofar, MD, MPH<sup>2</sup>; Kristin Goheen, BS, RDN<sup>1</sup>; Gabrielle Vanhouwe, BS<sup>1</sup>; Lydia Forsthoefel, BS<sup>1</sup>; Mahima Vijaybhai Vyas<sup>2</sup>; Saranya Arumugam, MBBS<sup>2</sup>; Peter Madril, MS, RDN<sup>1</sup>; Praveen Goday, MBBS<sup>3</sup>; Thangam Venkatesan, MD<sup>2</sup>; Kristen Roberts, PhD, RDN, LD, CNSC, FASPEN, FAND<sup>4</sup></p><p><sup>1</sup>The Ohio State University, Columbus, OH; <sup>2</sup>The Ohio State University Wexner Medical Center, Columbus, OH; <sup>3</sup>Nationwide Children's Hospital, Columbus, OH; <sup>4</sup>The Ohio State University, Granville, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Cyclic vomiting syndrome (CVS) is a disorder of gut-brain interaction characterized by recurrent spells of nausea, vomiting, and abdominal pain lasting hours to days. Estimated prevalence is approximately 2% in adults and children. Due to recalcitrant symptoms, patients might develop disordered eating patterns, though data are lacking. Identifying patients at risk for disordered eating patterns and malnutrition can help optimize nutritional status and may improve overall patient outcomes. The purpose of this study is to define nutrient intakes and dietary patterns of those with CVS seeking care at a tertiary care referral clinic, and to establish variation in dietary intakes based on disease severity.</p><p><b>Methods:</b> In this ongoing, cross-sectional study of adults diagnosed with CVS based on Rome IV criteria, participants are asked to complete validated surveys including a food frequency questionnaire (FFQ). 
Baseline demographics and clinical characteristics including disease severity (defined by the number of episodes per year) were ascertained. Healthy eating index (HEI) scores (scale of 0-100) were calculated to assess diet quality with higher scores indicating better diet quality compared to lower scores. Those with complete data were included in this interim analysis.</p><p><b>Results:</b> Data from 33 participants with an average age of 40 ± 16 years and an average BMI of 28.6 ± 7.9 is presented. The cohort was predominately female (67%), white (79%) and with moderate to severe disease (76%). The malnutrition screening tool supported that 42% of participants were at risk of malnutrition independent of BMI status (p = 0.358) and disease severity (p = 0.074). HEI scores were poor amongst those with CVS (55) and did not differ based on disease severity (58 vs 54; p = 0.452). Energy intakes varied ranging from 416-3974 kcals/day with a median intake of 1562 kcals/day.</p><p><b>Conclusion:</b> In CVS, dietary intake is poor and there is a high risk of malnutrition regardless of disease severity and BMI. Providers and registered dietitian nutritionists must be aware of the high rates of malnutrition risk and poor dietary intakes in this patient population to improve delivery of dietary interventions. Insight into disordered eating and metabolic derangements may improve the understanding of dietary intakes in CVS.</p><p>Hannah Huey, MDN<sup>1</sup>; Holly Estes-Doetsch, MS, RDN, LD<sup>2</sup>; Christopher Taylor, PhD, RDN<sup>2</sup>; Kristen Roberts, PhD, RDN, LD, CNSC, FASPEN, FAND<sup>3</sup></p><p><sup>1</sup>Nationwide Children's Hospital, Columbus, OH; <sup>2</sup>The Ohio State University, Columbus, OH; <sup>3</sup>The Ohio State University, Granville, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Micronutrient deficiencies are common in Crohn's disease (CD) due to poor dietary absorption. Pregnancy has an increased nutritional demand and when coupled with a malabsorptive condition like CD, clinicians must closely monitor micronutrient status. However, there is a lack of evidence-based guidelines for clinicians when managing these complex patients leaving clinicians to use clinical judgement for management. A case study of a pregnant female with CD presents for delivery with an undetected fat-soluble vitamin deficiency. The impact of a vitamin K deficiency on the post-partum management of a patient with CD is presented along with potential assessment and treatment strategies. At 25 weeks gestation, the patient presented with a biochemical iron deficiency anemia, vitamin B12 deficiency, and zinc deficiency for which she was treated with oral supplementation and/or intramuscular injections. No assessment of fat-soluble vitamins during the gestation period was conducted despite a pre-pregnancy history of multiple micronutrient deficiencies. At 33 weeks gestation, the mother was diagnosed with preeclampsia and delivered the fetus at 35 weeks. After birth, the infant presented to the NICU with a mediastinal mass, abnormal liver function tests and initial coagulopathy. The mother experienced a uterine hemorrhage post-cesarean section. At this time her INR was 14.8 with a severe prolongation of the PT and PTT and suboptimal levels of blood clotting factors II, VII, IX, and X. 
The patient was diagnosed with a vitamin K deficiency and was treated initially with 10 mg daily by mouth x 3 days resulting in an elevated serum vitamin K while PT and INR were trending towards normal limits. At discharge she was recommended to take 1 mg daily by mouth of vitamin K to prevent further deficiency. PT and INR were the biochemical assays that were reassessed every 3 months since serum vitamin K is more reflective of recent intake. CD represents a complex disorder and the impact of pregnancy on micronutrient status is unknown. During pregnancy, patients with CD may require additional micronutrient monitoring particularly in the case of historical micronutrient deficiencies or other risk factors. This case presents the need for further research into CD-specific micronutrient deficiencies and the creation of specific supplementation guidelines and treatment algorithms for detection of micronutrient deficiencies in at-risk patients.</p><p><b>Methods:</b> None Reported.</p><p><b>Results:</b> None Reported.</p><p><b>Conclusion:</b> None Reported.</p><p>Gretchen Murray, BS, RDN<sup>1</sup>; Kristen Roberts, PhD, RDN, LD, CNSC, FASPEN, FAND<sup>2</sup>; Phil Hart, MD<sup>1</sup>; Mitchell Ramsey, MD<sup>1</sup></p><p><sup>1</sup>The Ohio State University Wexner Medical Center, Columbus, OH; <sup>2</sup>The Ohio State University, Granville, OH</p><p><b>Financial Support:</b> UL1TR002733.</p><p><b>Background:</b> Enteric hyperoxaluria (EH) and resultant lithiasis is well documented in many malabsorptive conditions including inflammatory bowel disease, celiac disease, short bowel syndrome, and post gastric bypass. Chronic pancreatitis (CP) often leads to exocrine pancreatic insufficiency (EPI) and subsequent fat malabsorption increasing the risk of EH secondary to calcium binding to dietary fat leaving oxalates available for colonic absorption. Modulating oxalate intake by reducing whole grains, greens, baked beans, berries, nuts, beer and chocolate while simultaneously improving hydration is the accepted medical nutrition therapy (MNT) for EH and calcium-oxalate stones. Although sources of dietary oxalate are well-known, there is limited literature regarding dietary oxalates in the CP diet, leaving a paucity of data to guide MNT for EH and calcium-oxalate lithiasis for these patients.</p><p><b>Methods:</b> A cross-sectional, case-control study was performed comparing subjects with CP to healthy controls. Vioscreen™ food frequency questionnaire was used to assess and quantify total oxalic acid intake in the CP cohort and to describe dietary sources. Descriptive statistics were used to describe dietary intake of oxalic acid and contributing food sources.</p><p><b>Results:</b> A total of 52 subjects with CP were included and had a mean age of 50 ± 15 years. Most subjects were male (n = 35; 67%). Mean BMI was 24 ± 6 kg/m2 and 8 subjects (15%) were classified as underweight by BMI. Median daily caloric intake was 1549 kcal with a median daily oxalic acid intake of 104 mg (range 11-1428 mg). The top three contributors to dietary oxalate intake were raw and cooked greens such as spinach or lettuce, followed by mixed foods such as pizza, spaghetti, and tacos and tea. 
Other significant contributors (&gt;100 mg) to dietary oxalate intake included sports or meal replacement bars, cookies and cakes, potato products (mashed, baked, chips, fried), and refined grains (breads, tortillas, bagels).</p><p><b>Conclusion:</b> In the CP population, the highest contributors to oxalate intake include greens, mixed foods, tea, meal replacement bars, some desserts, potatoes, and refined grains. Many of the identified dietary oxalate sources are not considered for exclusion in a typical oxalate-restricted diet. A personalized approach to dietary oxalate modulation is necessary to drive MNT for EH prevention in those with CP.</p><p>Qian Ren, PhD<sup>1</sup>; Peizhan Chen, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine<sup>2</sup></p><p><sup>1</sup>Department of Clinical Nutrition, Shanghai Sixth People's Hospital Affiliated to Shanghai Jiao Tong University School of Medicine, Shanghai; <sup>2</sup>Clinical Research Center, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai</p><p><b>Financial Support:</b> This study was supported by the Youth Cultivation Program of Shanghai Sixth People's Hospital (Grant No. ynqn202223), the Key Laboratory of Trace Element Nutrition, National Health Commission of the Peoples’ Republic of China (Grant No. wlkfz202308), and the Danone Institute China Diet Nutrition Research and Communication (Grant No. DIC2023-06).</p><p><b>Background:</b> Low serum vitamin D status has been reported to be associated with reduced muscle mass; however, it is inconclusive whether this relationship is causal. This study used data from the National Health and Nutrition Examination Survey (NHANES) and two-sample Mendelian randomization (MR) analyses to ascertain the causal relationship between serum 25-hydroxyvitamin D [25(OH)D] and appendicular muscle mass (AMM).</p><p><b>Methods:</b> In the NHANES 2011–2018 dataset, 11,242 participants aged 18–59 years were included, and multivariable linear regression was performed to assess the relationship between 25(OH)D and AMM measured by dual-energy X-ray absorptiometry (Figure 1). In two-sample MR analysis, 167 single nucleotide polymorphisms significantly associated with serum 25(OH)D were applied as instrumental variables (IVs) to assess vitamin D effects on AMM in the UK Biobank (417,580 Europeans) using univariable and multivariable MR models (Figure 2).</p><p><b>Results:</b> In the NHANES 2011–2018 dataset, serum 25(OH)D concentrations were positively associated with AMM (β = 0.013, SE = 0.001, p &lt; 0.001) in all participants, after adjustment for age, race, season of blood collection, education, income, body mass index and physical activity. In stratified analysis by sex, males (β = 0.024, SE = 0.002, p &lt; 0.001) showed more pronounced positive associations than females (β = 0.003, SE = 0.002, p = 0.024). In univariable MR based on IVW models, genetically higher serum 25(OH)D levels were positively associated with AMM in all participants (β = 0.049, SE = 0.024, p = 0.039) and in males (β = 0.057, SE = 0.025, p = 0.021), but the association was only marginally significant in females (β = 0.043, SE = 0.025, p = 0.090). No significant pleiotropy effects were detected for the IVs in the two-sample MR investigations. 
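For orientation, the inverse-variance weighted (IVW) estimate used in these MR models pools per-SNP Wald ratios, weighting each by its inverse variance; a minimal sketch with invented per-SNP summary statistics (not the study data) is:

# Fixed-effect IVW estimate: beta_IVW = sum(w_j * ratio_j) / sum(w_j),
# where ratio_j = beta_outcome_j / beta_exposure_j and w_j is the inverse of its first-order variance.
import numpy as np

beta_exposure = np.array([0.10, 0.08, 0.12])     # SNP effects on 25(OH)D (assumed)
beta_outcome = np.array([0.006, 0.004, 0.005])   # SNP effects on AMM (assumed)
se_outcome = np.array([0.002, 0.002, 0.003])     # standard errors of the outcome effects (assumed)

ratio = beta_outcome / beta_exposure             # per-SNP Wald ratio estimates
weights = (beta_exposure / se_outcome) ** 2      # inverse of each ratio's first-order variance
beta_ivw = np.sum(weights * ratio) / np.sum(weights)
se_ivw = np.sqrt(1.0 / np.sum(weights))
print(beta_ivw, se_ivw)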
In MVMR analysis, a positive causal effect of 25(OH)D on AMM was observed in the total population (β = 0.116, SE = 0.051, p = 0.022), in males (β = 0.111, SE = 0.053, p = 0.036) and in females (β = 0.124, SE = 0.054, p = 0.021).</p><p><b>Conclusion:</b> Our results suggest a positive causal effect of serum 25(OH)D concentration on AMM; however, more research is needed to understand the underlying biological mechanisms.</p><p></p><p><b>Figure 1.</b> Working Flowchart of Participant Selection in the Cross-Sectional Study.</p><p></p><p><b>Figure 2.</b> The study assumptions of the two-sample Mendelian Randomization analysis between serum 25(OH)D and appendicular muscle mass. The assumptions include: (1) the genetic instrumental variables (IVs) should exhibit a significant association with serum 25(OH)D; (2) the genetic IVs should not be associated with any other potential confounding factors; and (3) the genetic IVs must influence appendicular muscle mass only through serum 25(OH)D and not through any other confounders. The dotted lines indicate violations of the assumptions.</p><p>Qian Ren, PhD<sup>1</sup>; Junxian Wu<sup>1</sup></p><p><sup>1</sup>Department of Clinical Nutrition, Shanghai Sixth People's Hospital Affiliated to Shanghai Jiao Tong University School of Medicine, Shanghai</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> A healthy diet is essential for both preventing and treating type 2 diabetes mellitus (T2DM), which has a negative impact on public health. Whole grains, which are rich in dietary fibre, can serve as a good source of carbohydrates. However, the correlation and causality between whole grain intake and the risk of T2DM and glucose metabolism remain to be clarified.</p><p><b>Methods:</b> First, the National Health and Nutrition Examination Survey database 2003-2018 was used to investigate the correlation between dietary whole grain/fibre intake and the risk of T2DM and glucose metabolism. Then, based on the largest publicly published genome-wide association analysis of whole grain intake in the Neale lab database, single nucleotide polymorphisms (SNPs) that were statistically significantly associated with whole grain intake were selected as instrumental variables (p &lt; 5×10<sup>-8</sup>, linkage disequilibrium r<sup>2</sup> &lt; 0.1). Inverse variance weighted analysis (IVW), the weighted median method and other methods were used to analyze the causal relationship between whole grain intake and T2DM. Heterogeneity tests, gene pleiotropy tests and sensitivity analyses were performed to evaluate the stability and reliability of the results.</p><p><b>Results:</b> The results showed that dietary intakes of whole grains (OR = 0.999, 95%CI: 0.999 ~ 1.000, p = 0.004) and fibre (OR = 0.996, 95% CI: 0.993 ~ 0.999, p = 0.014) were negatively associated with the risk of T2DM. In the population with normal glucose metabolism, dietary fibre intake was negatively associated with FPG (β = -0.003, SE = 0.001, p &lt; 0.001), FINS (β = -0.023, SE = 0.011, p = 0.044), and HOMA-IR (β = -0.007, SE = 0.003, p = 0.023). In the population with abnormal glucose metabolism, dietary intakes of whole grains (β = -0.001, SE = 0.001, p = 0.036) and fibre (β = -0.006, SE = 0.002, p = 0.005) were negatively associated with HbA1c. 
In further MR analyses, the IVW method demonstrated that for every one standard deviation increase in whole grain intake, T2DM risk decreased by 1.9% (OR = 0.981, 95%CI: 0.970 ~ 0.993, β = -0.019, p = 0.002), with consistent findings for IVW-multiplicative random effects, IVW-fixed effects and IVW-radial. Meanwhile, MR-Egger regression analysis (intercept = -2.7 × 10<sup>-5</sup>, p = 0.954) showed that gene pleiotropy did not influence the results of the MR analysis. Leave-one-out analysis showed that no individual SNP unduly influenced the results (p<sub>heterogeneity</sub> = 0.445).</p><p><b>Conclusion:</b> Dietary intake of whole grains may reduce the risk of T2DM and improve glucose metabolic homeostasis and insulin resistance. The causal relationship between whole grain intake and T2DM, as well as the optimal daily intake of whole grains, still needs to be explored further through large randomised controlled intervention studies and prospective cohort studies.</p><p>Hikono Sakata, Registered Dietitian<sup>1</sup>; Misa Funaki, Registered Dietitian<sup>2</sup>; Kanae Masuda, Registered Dietitian<sup>2</sup>; Rio Kurihara, Registered Dietitian<sup>2</sup>; Tomomi Komura, Registered Dietitian<sup>2</sup>; Masaru Yoshida, Doctor<sup>2</sup></p><p><sup>1</sup>University of Hyogo, Ashiya-shi, Hyogo; <sup>2</sup>University of Hyogo, Himezi-shi, Hyogo</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> In recent years, lifestyle-related diseases such as obesity, diabetes, and dyslipidemia have been recognized as major problems, and an excessively fatty diet is one of their causes. Obesity and related diseases are known to be risk factors for the severity of infectious diseases such as sepsis and novel coronavirus infection, but the underlying pathomechanisms have not been clarified. We therefore hypothesized that a diet high in fat might induce functional changes not only in adipocytes but also in macrophages, weakening the immune response and leading to aggravation of infectious diseases. In this study, we performed proteome analysis and RNA sequencing analysis to examine what kinds of gene and protein expression changes are induced in macrophages by high-fat diet loading.</p><p><b>Methods:</b> Four-week-old mice were divided into a normal diet (ND) group and a high-fat diet (HFD) group and kept for four weeks. Macrophages were collected from the peritoneal cavity of mice that had been intraperitoneally injected with 2 mL of thioglycolate medium one week before dissection to promote macrophage proliferation, and were incubated at 37°C with 5% CO<sub>2</sub> in Roswell Park Memorial Institute medium (RPMI). After 2 hours of culture under these conditions, floating cells were removed, and proteome analysis was performed using the recovered macrophages. In addition, RNA sequencing analysis was performed on RNA extracted from the macrophages.</p><p><b>Results:</b> Proteome analysis identified more than 4000 proteins in each group. In the HFD group, compared to the ND group, decreased expression of proteins involved in phagocytosis, such as immunoglobulin and eosinophil peroxidase, was observed. RNA sequencing analysis likewise showed decreased expression of phagocytosis-related genes, consistent with the proteome findings.</p><p><b>Conclusion:</b> These findings suggest that the phagocytic ability of macrophages is reduced by high-fat diet loading. 
This research is expected to clarify the molecular mechanisms by which high-fat dietary loading alters gene and protein expression and induces immunosuppressive effects.</p><p>Benjamin Davies, BS<sup>1</sup>; Chloe Amsterdam, BA<sup>1</sup>; Basya Pearlmutter, BS<sup>1</sup>; Jackiethia Butsch, C-CHW<sup>2</sup>; Aldenise Ewing, PhD, MPH, CPH<sup>3</sup>; Erin Holley, MS, RDN, LD<sup>2</sup>; Subhankar Chakraborty, MD, PHD<sup>4</sup></p><p><sup>1</sup>The Ohio State University College of Medicine, Columbus, OH; <sup>2</sup>The Ohio State University Wexner Medical Center, Columbus, OH; <sup>3</sup>The Ohio State University College of Public Health, Columbus, OH; <sup>4</sup>The Ohio State University Wexner Medical Center, Dublin, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Food insecurity (FI) refers to a lack of consistent access to sufficient food for an active, healthy life. This issue, influenced by economic, social, and environmental factors, disproportionately affects disadvantaged populations. According to the United States Department of Agriculture, more than 10% of U.S. households experience FI, leading to physical and mental health consequences. FI has been linked to poorer dietary quality, increased consumption of processed, calorie-dense foods, and a higher prevalence of obesity, diabetes, and cardiovascular diseases. Chronic stress from FI can also trigger or exacerbate mental health disorders, including depression and anxiety, further complicating health outcomes. Affecting ~20% of the global population, dyspepsia significantly diminishes quality of life and increases healthcare costs. Its etiology is multifactorial, involving abnormal gastric motility, visceral hypersensitivity, inflammation, and psychosocial stressors. Although research on the link between FI and disorders of gut-brain interaction (DGBIs) like dyspepsia is limited, emerging evidence suggests a bidirectional relationship. Therefore, this study was designed to examine the association between FI, dyspepsia, and other health-related social needs (HRSN). Our hypotheses were that 1) patients with FI have more severe dyspepsia symptoms, and 2) FI is associated with HRSN in other domains.</p><p><b>Methods:</b> Patients presenting to a specialty motility clinic were prospectively enrolled into a registry created with the goal of holistically investigating the pathophysiology of DGBIs. Validated questionnaires for HRSN and dyspepsia were completed prior to the clinic visits. Data were managed with REDCap, and statistical analyses were performed using SPSS.</p><p><b>Results:</b> A total of 53 patients completed the questionnaires; 88.7% were White and 73.6% were female, with an average age of 45.6 years (21-72) and a BMI of 28.7 kg/m<sup>2</sup> (17.8-51.1). FI was present in 13 (24.5%) patients. The overall severity of dyspepsia symptoms was significantly lower in food-secure patients (13.8 vs. 18.8, p = 0.042). Of the four subscales of dyspepsia (nausea-vomiting, postprandial fullness, bloating, and loss of appetite), only loss of appetite was significantly greater in those with FI (2.3 vs. 1.3, p = 0.017). Patients with FI were more likely to be at medium (61.5% vs. 5.0%) or high risk (30.8% vs. 2.5%, p &lt; 0.001) of financial hardship, to experience unmet transportation needs (38.5% vs. 5.0%, p = 0.019) and housing instability (30.8% vs. 5.0%, p = 0.023) compared to those who were food secure. They were also at higher risk of depression (54% vs. 
12.5%, p = 0.005) and of reporting insufficient physical activity (92.3% vs. 55.0%, p = 0.05). After adjusting for age, gender, race, and BMI, FI was not a predictor of global dyspepsia severity. Greater BMI (O.R. 0.89, 95% C.I. 0.81-0.98) was associated with severity of early satiety. Female gender (O.R. 10.0, 95% C.I. 1.3-76.9, p = 0.03) was associated with the severity of nausea. Greater BMI (O.R. 1.10, 95% C.I. 1.001-1.22, p = 0.048) and female gender (O.R. 10.8, 95% C.I. 1.6-72.9, p = 0.015) both correlated with severity of postprandial fullness.</p><p><b>Conclusion:</b> FI affected ~25% of patients seen in our clinic. It was, however, not an independent predictor of overall severity of dyspepsia symptoms. Patients who experienced FI had a higher prevalence of other HRSN and a higher risk of depression and physical inactivity. Our results highlight the importance of considering FI and other HRSN in the management of dyspepsia. Understanding this interaction is essential for improving clinical outcomes and guiding public health interventions.</p><p>Ashlesha Bagwe, MD<sup>1</sup>; Chandrashekhara Manithody, PhD<sup>1</sup>; Kento Kurashima, MD, PhD<sup>1</sup>; Shaurya Mehta, BS<sup>1</sup>; Marzena Swiderska-Syn<sup>1</sup>; Arun Verma, MD<sup>1</sup>; Austin Sims<sup>1</sup>; Uthayashanker Ezekiel<sup>1</sup>; Ajay Jain, MD, DNB, MHA<sup>1</sup></p><p><sup>1</sup>Saint Louis University, St. Louis, MO</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Farnesoid X receptor (FXR), a gut nuclear receptor, regulates intestine-driven bile acid homeostasis in short bowel syndrome. Chenodeoxycholic acid (CDCA), a primary bile acid, acts as an FXR ligand. Stem cell-derived intestinal enteroids offer a valuable system to study intestinal FXR function. We hypothesized that transfection of porcine enteroids with small interfering RNA (siRNA) would modulate FXR to further define its mechanistic role.</p><p><b>Methods:</b> We developed a porcine protocol for Matrigel-based 3D culture systems to generate enteroids from the small bowel of neonatal Yorkshire pigs. After 7 days, cultures were passaged and expanded. RNA strands for Dicer-substrate siRNAs (DsiRNAs) were synthesized as single-strand RNAs (ssRNAs) by Integrated DNA Technologies and resuspended in an RNase-free buffer. The DsiRNAs targeted the Sus scrofa farnesoid X receptor (FXR) gene (KF597010.1). Three sets of DsiRNA were made for gene-specific silencing of the FXR gene. Porcine enteroids were cultured and transfected with FXR-specific siRNA and control siRNA using Lipofectamine RNAiMAX reagent. They were also treated with escalating CDCA concentrations (25 μM to 50 μM) for 24 and 48 hrs. FXR mRNA levels were quantified by real-time PCR, and functional assays were performed to assess changes in bile acid uptake and efflux following transfection.</p><p><b>Results:</b> Data from 3 separate experiments on intestinal crypts showed consistent enhancement of FXR expression with CDCA versus control (p &lt; 0.01). CDCA treatment resulted in a dose-dependent increase in FXR mRNA expression, reaching peak levels after 48 hours of exposure (2.8X increase) with enhanced nuclear localization. Functionally, CDCA-treated enteroids displayed increased bile acid uptake and reduced efflux, validating FXR's role in mediating bile acid-driven enterohepatic circulation. Several runs with siRNA were conducted. 
Using 30 pmol siRNA (sense: 5' GAUCAACAGAAUCUUCUUCAUUATA 3' and antisense: 5' UAUAAUGAAGAAGAUUCUGUUGAUCUG 3'), there was a 68% reduction in FXR expression versus the scrambled control. FXR silencing led to decreased bile acid uptake and increased efflux. No significant effects were observed in enteroids transfected with control siRNA. Paradoxically, CDCA-treated cultures showed a higher ratio of immature to mature enteroids.</p><p><b>Conclusion:</b> In porcine enteroids, CDCA treatment increased FXR gene expression and promoted its nuclear localization. This finding implies the existence of a positive feedback cycle in which CDCA, an FXR ligand, induces further synthesis and uptake. siRNA transfection was able to significantly decrease FXR activity. By employing this innovative methodology, one can effectively examine the function of FXR in ligand-treated or control systems.</p><p><b>Pediatric, Neonatal, Pregnancy, and Lactation</b></p><p>Kento Kurashima, MD, PhD<sup>1</sup>; Si-Min Park, MD<sup>1</sup>; Arun Verma, MD<sup>1</sup>; Marzena Swiderska-Syn<sup>1</sup>; Shaurya Mehta, BS<sup>1</sup>; Austin Sims<sup>1</sup>; Mustafa Nazzal, MD<sup>1</sup>; John Long, DVM<sup>1</sup>; Chandrashekhara Manithody, PhD<sup>1</sup>; Shin Miyata, MD<sup>1</sup>; Ajay Jain, MD, DNB, MHA<sup>1</sup></p><p><sup>1</sup>Saint Louis University, St. Louis, MO</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> North American Society for Pediatric Gastroenterology, Hepatology and Nutrition (NASPGHAN) Florida.</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Biliary atresia (BA) is a major cause of obstructive neonatal cholestatic disease. Although hepato-portoenterostomy (HPE) is routinely performed for BA patients, more than half eventually require liver transplantation. The intricate mechanisms of bile ductular injury driving its pathogenesis remain elusive and are not recapitulated in current small animal and rodent models that rely on bile duct ligation. Addressing these gaps, we hypothesized that extra- and intrahepatic bile duct destruction through an endothelial irritant would recapitulate the human condition. We therefore developed a novel neonatal piglet BA model called 'BATTED'. Piglets have liver and gastro-intestinal homology to human infants and share anatomic and physiological processes, providing a robust platform for BATTED and HPE.</p><p><b>Methods:</b> Six 7-10-day-old piglets were randomized to BATTED (US provisional Patent US63/603,995) or sham surgery. BATTED included cholecystectomy, common bile duct and hepatic duct injection of 95% ethanol, and a retainer suture for continued ethanol-induced intrahepatic injury. Vascular access ports were placed, and scheduled ultrasound-guided liver biopsies were performed. Six weeks post-BATTED, piglets underwent HPE. Eight weeks after the initial surgery, animals were euthanized. Serology, histology, gene expression and immunohistochemistry were performed.</p><p><b>Results:</b> Serological evaluation revealed a surge in conjugated bilirubin 6 weeks after the BATTED procedure from baseline (mean Δ 0.39 mg/dL to 3.88 mg/dL). Gamma-glutamyl transferase (GGT) also exhibited a several-fold increase (mean Δ 16.3 IU to 89.5 IU). Sham did not display these elevations (conjugated bilirubin: Δ 0.39 mg/dL to 0.98 mg/dL; GGT: Δ 9.2 IU to 10.4 IU). Sirius red staining demonstrated significant periportal and diffuse liver fibrosis (16-fold increase), and the bile duct proliferation marker CK-7 increased 9-fold with BATTED. 
Piglets in the BATTED group demonstrated enhanced CD-3 (7-fold), alpha-SMA (8.85-fold), COL1A1 (11.7-fold) and CYP7A1 (7-fold), vs. sham. Successful HPE was accomplished in piglets with improved nutritional status and a reduction in conjugated bilirubin (Δ 4.89 mg/dL to 2.11 mg/dL).</p><p><b>Conclusion:</b> BATTED replicated BA features, including hyperbilirubinemia, GGT elevation, significant hepatic fibrosis, bile duct proliferation, and inflammatory infiltration, with subsequent successful HPE. This model offers substantial opportunities to elucidate the mechanisms underlying BA and adaptation post-HPE, paving the way for the development of diagnostics and therapeutics.</p><p>Sirine Belaid, MBBS, MPH<sup>1</sup>; Vikram Raghu, MD, MS<sup>1</sup></p><p><sup>1</sup>UPMC, Pittsburgh, PA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> At our institution, pediatric residents are responsible for managing patients with intestinal failure (IF) in the inpatient units. However, they have reported feelings of inexperience and anxiety when dealing with these complex cases. This project aimed to identify knowledge gaps and evaluate the confidence levels of pediatric residents in managing IF patients.</p><p><b>Methods:</b> We conducted an online needs assessment survey using Qualtrics, which included Likert-scale, multiple-choice, open-ended and rating questions to assess residents' confidence levels (from 1 to 10) in performing tasks related to IF patient care. This voluntary survey, approved by the IRB as exempt, was distributed to all pediatric residents at the University of Pittsburgh via QR codes and emails.</p><p><b>Results:</b> The survey response rate was 32%, and nearly 50% of respondents had completed a rotation on the Intestinal Rehabilitation (IR) service. Residents reported the lowest confidence in calculating Total Parenteral Nutrition (TPN)-like Intravenous (IV) fluids (solutions administered centrally or peripherally, containing dextrose and major electrolytes designed to match the patients’ home TPN contents), identifying signs of D-lactic acidosis and small intestinal bacterial overgrowth (SIBO), and managing SIBO (average confidence rating of 2/10). They also expressed low confidence in ordering home TPN and TPN-like IV fluids, understanding related anatomy, ensuring proper stoma care, managing central access loss, and addressing poor catheter blood flow (average confidence ratings of 3-4/10). Conversely, residents felt more confident managing feeding intolerance and central line-associated infections (average confidence ratings of 5-6/10). Additionally, they rated their ability to identify signs of septic and hypovolemic shock as 8/10, though they felt less confident in managing these conditions (7/10). Furthermore, 64% of respondents agreed that managing IF patients is educationally valuable and preferred laminated cards or simulated sessions as educational resources (average ratings of 8 and 6.8 out of 10, respectively).</p><p><b>Conclusion:</b> The survey highlights several areas where pediatric residents need further education. 
Addressing these knowledge gaps through targeted curricular interventions can better equip residents to manage IF patients and potentially increase their interest in this specialty.</p><p></p><p>CLABSI = Central Line-Associated Bloodstream Infection, TPN = Total Parenteral Nutrition, EHR = Electronic Health Record, IV = Intravenous, SIBO = Small Intestinal Bacterial Overgrowth.</p><p><b>Figure 1.</b> Tasks Related to Managing Patients With Intestinal Failure (IF), Stratified Into Three Categories Based on Pediatric Residents' Average Confidence Rating Scores (&gt;= 7/10, 5-6/10, &lt;= 4/10).</p><p></p><p><b>Figure 2.</b> Distribution of Pediatric Residents’ Opinions on the Educational Value of Managing Patients With Intestinal Failure.</p><p>Alyssa Ramuscak, MHSc, MSc<sup>1</sup>; Inez Martincevic, MSc<sup>1</sup>; Hebah Assiri, MD<sup>1</sup>; Estefania Carrion, MD<sup>2</sup>; Jessie Hulst, MD, PhD<sup>1</sup></p><p><sup>1</sup>The Hospital for Sick Children, Toronto, ON; <sup>2</sup>Hospital Metropolitano de Quito, Quito, Pichincha</p><p><b>Financial Support:</b> Nestle Health Science Canada, North York, Ontario, Canada.</p><p><b>Background:</b> Enteral nutrition provides fluids and nutrients to individuals unable to meet needs orally. Recent interest in real food-based formulas highlights a shift towards providing nutrition with familiar fruit, vegetable and protein ingredients. This study aimed to evaluate the tolerability and nutritional adequacy of a hypercaloric, plant-based, real food ingredient formula in pediatric tube-fed patients.</p><p><b>Methods:</b> This prospective, single-arm, open-label study evaluated the tolerability and nutritional adequacy of a hypercaloric, plant-based formula, Compleat® Junior 1.5, in medically complex, stable, pediatric tube-fed patients aged 1-13 years. Participants were recruited from outpatient clinics at The Hospital for Sick Children, Toronto from May 2023 to June 2024. Demographic and anthropometric measurements (weight, height) were obtained at baseline. The daily dose of study product was isocaloric compared with routine feeds. Total fluid needs were met by adding water. Participants transitioned to the study formula over 3 days, followed by 14 days of exclusive use of the study product. Caregivers monitored volume of daily intake, feed tolerance, and bowel movements during the transition and study period using an electronic database (Medrio®). An end-of-study visit was conducted to collect weight measurements. Descriptive statistics summarized demographic and clinical characteristics (Table 1). The paired t-test compared weight-for-age and BMI-for-age z-scores between baseline and the end of study. Symptoms of intolerance and bowel movements, assessed using either the Bristol Stool Scale or the Brussels Infant and Toddler Stool Scale, were described as frequency of events and compared between baseline and the intervention period. Percent of calorie and protein goals met during the study period was calculated as calories received relative to calories prescribed, and protein received relative to the dietary reference intake for age and weight.</p><p><b>Results:</b> In total, 27 ambulatory pediatric participants with a median age of 5.5 years (IQR, 2.5-7) were recruited for the study, with 26 completing it (Table 1). Participant weight-for-age and BMI-for-age z-scores significantly improved between baseline and end of study, from -1.75 ± 1.93 to -1.67 ± 1.88 (p &lt; 0.05), and from -0.47 ± 1.46 to 0.15 ± 0.23 (p &lt; 0.05), respectively. 
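<p>As an illustrative sketch only (not the authors' analysis code), the paired comparison named in the Methods above could be run as follows; the z-score vectors are hypothetical stand-ins rather than study data, and Python with scipy is assumed:</p><pre>
# Hedged sketch: paired t-test on baseline vs. end-of-study z-scores (hypothetical values).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
baseline_waz = rng.normal(-1.75, 1.9, size=26)            # hypothetical weight-for-age z-scores at baseline
end_waz = baseline_waz + rng.normal(0.08, 0.2, size=26)   # hypothetical end-of-study values with a small shift

t_stat, p_value = stats.ttest_rel(end_waz, baseline_waz)  # paired t-test, as named in the Methods
print(f"paired t = {t_stat:.2f}, p = {p_value:.3f}")
</pre>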
There was no significant difference in the frequency of any GI symptom, including vomiting, gagging/retching, tube venting or perceived pain/discomfort with feeds, between baseline and end of study. There was no significant difference in frequency or type of stool between baseline and end of study. Study participants met 100% of their prescribed energy for the majority (13 ± 1.7 days) of the study period. All participants exceeded protein requirements during the study period. Twenty families (76.9%) indicated wanting to continue using the study product after completing the study.</p><p><b>Conclusion:</b> This prospective study demonstrated that, among stable yet medically complex children, a hypercaloric, plant-based, real food ingredient formula was well tolerated and calorically adequate to maintain or facilitate weight gain over a 14-day study period. The majority of caregivers preferred to continue use of the study product.</p><p><b>Table 1.</b> Demographic and Clinical Characteristics of Participants (n = 27).</p><p></p><p><b>Poster of Distinction</b></p><p>Gustave Falciglia, MD, MSCI, MSHQPS<sup>1</sup>; Daniel Robinson, MD, MSCI<sup>1</sup>; Karna Murthy, MD, MSCI<sup>1</sup>; Irem Sengul Orgut, PhD<sup>2</sup>; Karen Smilowitz, PhD, MS<sup>3</sup>; Julie Johnson, MSPH PhD<sup>4</sup></p><p><sup>1</sup>Northwestern University Feinberg School of Medicine, Chicago, IL; <sup>2</sup>University of Alabama Culverhouse College of Business, Tuscaloosa, AL; <sup>3</sup>Northwestern University Kellogg School of Business &amp; McCormick School of Engineering, Evanston, IL; <sup>4</sup>University of North Carolina School of Medicine, Chapel Hill, NC</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> Children's Hospital Neonatal Consortium (CHNC) Annual Conference, November 1, 2021, Houston, TX.</p><p><b>Financial Support:</b> None Reported.</p><p>Lyssa Lamport, MS, RDN, CDN<sup>1</sup>; Abigail O'Rourke, MD<sup>2</sup>; Barry Weinberger, MD<sup>2</sup>; Vitalia Boyar, MD<sup>2</sup></p><p><sup>1</sup>Cohen Children's Medical Center of New York, Port Washington, NY; <sup>2</sup>Cohen Children's Medical Center of NY, New Hyde Park, NY</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Premature infants in the neonatal intensive care unit (NICU) are at high risk for peripheral intravenous catheter infiltration (PIVI) because of the frequent cannulation of small and fragile vessels. The most common infusate in neonates is parenteral nutrition (PN), followed by antibiotics. Previous reports have suggested that the intrinsic properties of infusates, such as pH, osmolality, and calcium content, determine the severity of PIVIs. This has led to the common practice of restricting the intake of protein and/or calcium to less than what is recommended for optimal growth and bone mineralization.</p><p><b>Methods:</b> Our objective was to identify the characteristics of infants and intravenous (IV) infusates that were associated with the development of severe neonatal PIVIs. We conducted a retrospective analysis of PIVIs in our level IV NICU from 2018-2022 (n = 120). Each PIVI was evaluated by a wound-certified neonatologist and classified as mild, moderate, or severe using a scoring system based on the Infusion Nurses Society (INS) staging criteria. 
Comparisons between groups were done using ANOVA and chi-square analysis, or Mann-Whitney tests for non-parametric data.</p><p><b>Results:</b> Infants with severe PIVIs had a lower mean birthweight than those with mild or moderate PIVIs (1413.1 g vs. 2116.9 g and 2020.3 g, respectively, p = .01) (Table 1). Most PIVIs occurred during infusions of PN and lipids, but the severity was not associated with the infusion rate, osmolality, or with the concentration of amino acids (median 3.8 g/dL in the mild group, 3.5 g/dL in the moderate group and 3.4 g/dL in the severe group) or calcium (median 6500 mg/L in all groups) (Table 2, Figure 1). Of note, the infusion of IV medications within 24 hours of PIVI was most common in the severe PIVI group (p = .03) (Table 2). Most PIVIs, including mild, were treated with hyaluronidase. Advanced wound care was required for 4% of moderate and 44% of severe PIVIs, and none required surgical intervention.</p><p><b>Conclusion:</b> Severe PIVIs in the NICU are most likely to occur in infants with a low birthweight and within 24 hours of administration of IV medications. This is most likely because medications must be kept at acidic or basic pH for stability, and many have high osmolarity and/or intrinsic caustic properties. Thus, medications may induce chemical phlebitis and extravasation with inflammation. In contrast, PN components, including amino acids and calcium, are not related to the severity of extravasations. Our findings suggest that increased surveillance of IV sites for preterm infants following medication administration may decrease the risk of severe PIVIs. Conversely, reducing or withholding parenteral amino acids and/or calcium to mitigate PIVI risk may introduce nutritional deficiencies without decreasing the risk of clinically significant PIVIs.</p><p><b>Table 1.</b> Characteristic Comparison of Mild, Moderate, and Severe PIVIs in the Neonatal ICU. PIVI Severity Was Designated Based on INS Criteria.</p><p></p><p><b>Table 2.</b> Association of Medication Administration and Components of Infusates With the Incidence and Severity of PIVI in the NICU.</p><p></p><p></p><p><b>Figure 1.</b> Infusate Properties.</p><p>Stephanie Oliveira, MD, CNSC<sup>1</sup>; Josie Shiff<sup>2</sup>; Emily Romantic, RD<sup>3</sup>; Kathryn Hitchcock, RD<sup>4</sup>; Gillian Goddard, MD<sup>4</sup>; Paul Wales, MD<sup>5</sup></p><p><sup>1</sup>Cincinnati Children's Hospital Medical Center, Mason, OH; <sup>2</sup>University of Cincinnati, Cincinnati, OH; <sup>3</sup>Cincinnati Children's Hospital Medical Center, Cincinnati, OH; <sup>4</sup>Cincinnati Children's Hospital, Cincinnati, OH; <sup>5</sup>Cincinnati Children's Hospital Medical Center, Cincinnati, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> It is common for children with intestinal failure on parenteral nutrition to be fed an elemental enteral formula, as these are believed to be better tolerated due to the protein module being free amino acids, the absence of other allergens, and the presence of long chain fatty acids. In February 2022, a popular elemental formula on the market was recalled due to bacterial contamination, which necessitated an immediate transition to an alternative enteral formula. This included initiating plant-based options for some of our patients. We have experienced growing interest and requests from families to switch to plant-based formulas due to religious practices, cost concerns, and personal preferences. 
While plant-based formulas lack major allergens and may contain beneficial soluble fiber, they are understudied in this patient population. This study aimed to determine whether growth was affected among children with intestinal failure on parenteral nutrition who switched from elemental to plant-based formulas.</p><p><b>Methods:</b> We conducted a retrospective cohort study of IF patients on PN managed by our intestinal rehabilitation program who transitioned from elemental to plant-based formulas during the product recall. Data were collected on demographics, intestinal anatomy, formula intake, parenteral nutrition support, tolerance, stool consistency, and weight gain for 6 months before and after formula transition. Paired analyses were performed for change in growth and nutritional intake, using the Wilcoxon signed-rank test. Chi-squared tests were performed to compare formula tolerance. An alpha value &lt; 0.05 was considered significant.</p><p><b>Results:</b> Eleven patients were included in the study [8 males; median gestational age 33 (IQR = 29, 35.5) weeks; median age at assessment 20.4 (IQR = 18.7, 29.7) months]. All participants had short bowel syndrome (SBS) as their IF category. Residual small bowel length was 28 (IQR = 14.5, 47.5) cm. Overall, there was no statistically significant difference in growth observed after switching to plant-based formulas (p = 0.76) (Figure 1). Both median enteral formula volume and calorie intake were higher on plant-based formula, but the differences were not statistically significant (p = 0.83 and p = 0.41) (Figure 2). Seven of 11 patients (64%) reported decreased stool count (p = 0.078) and improved stool consistency (p = 0.103) after switching to plant-based formula. Throughout the study, the rates of PN calorie and volume weaning were not different after switching to plant-based formula (calories: p = 0.83; volume: p = 0.52) (Figure 3).</p><p><b>Conclusion:</b> In this small study of children with IF, the switch from free amino acid formula to an intact plant-based formula was well tolerated. Growth was maintained between groups. After switching to plant-based formulas, these children tolerated increased enteral volumes, but we were underpowered to demonstrate a statistical difference. There was no evidence of protein allergy among children who switched. Plant-based formulas may be an alternative option to elemental formulas for children with intestinal failure.</p><p></p><p><b>Figure 1.</b> Change in Weight Gain (g/day) on Elemental and Plant-Based Formulas.</p><p></p><p><b>Figure 2.</b> Change in Enteral Calories (kcal/kg/day) and Volume (mL/kg/day) on Elemental and Plant-Based Formulas.</p><p></p><p><b>Figure 3.</b> Change in PN Calories (kcal/kg/day) and Volume (mL/kg/day) on Elemental and Plant-Based Formulas.</p><p>Carly McPeak, RD, LD<sup>1</sup>; Amanda Jacobson-Kelly, MD, MSc<sup>1</sup></p><p><sup>1</sup>Nationwide Children's Hospital, Columbus, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> In pediatrics, post-pyloric enteral feeding via jejunostomy, gastrojejunostomy or naso-jejunostomy tubes is increasingly utilized to overcome problems related to aspiration, severe gastroesophageal reflux, poor gastric motility, and gastric outlet obstruction. Jejunal enteral feeding bypasses the duodenum, a primary site of absorption for many nutrients. Copper is thought to be primarily absorbed in the stomach and proximal duodenum; thus, patients receiving nutritional support in a way that bypasses this region can experience copper deficiency. 
The prevalence and complications of copper deficiency in the pediatric population are not well documented.</p><p><b>Methods:</b> This was a retrospective case series of two patients treated at Nationwide Children's Hospital (NCH). Medical records were reviewed to collect laboratory, medication/supplement data and enteral feeding history. Both patients were receiving Pediasure Peptide® as their enteral formula.</p><p><b>Results:</b> Case 1: A 14-year-old male had received exclusive post-pyloric enteral nutrition for two years and presented with pancytopenia and worsening anemia. Laboratory data drawn in 3/2017 demonstrated deficient levels of copper (&lt; 10 ug/dL) and ceruloplasmin (20 mg/dL). Repletion was initiated with intravenous cupric chloride 38 mcg/kg/day for 3 days and then transitioned to 119 mcg/kg/day via j-tube for 4 months. Labs redrawn 2 months after the initial episode of deficiency indicated overall improvement of the pancytopenia (Table 1). After 4 months, cupric chloride was decreased to 57 mcg/kg/day as a maintenance dose. Laboratory data redrawn two and a half years after the initial episode of deficiency revealed deficient levels of copper (27 ug/dL) and ceruloplasmin (10 mg/dL) despite the lower-dose supplementation being administered. Case 2: An 8-year-old female had received exclusive post-pyloric enteral nutrition for 3 months. Laboratory data drawn in 3/2019 revealed deficient levels of copper (38 ug/dL) and ceruloplasmin (13 mg/dL). Supplementation of 50 mcg/kg/day cupric chloride was administered through the jejunal tube. Copper and ceruloplasmin labs redrawn at 11 and 15 months after initiation of supplementation revealed continued deficiency, though hematologic values remained stable (Table 2).</p><p><b>Conclusion:</b> There are currently no clinical guidelines for the prevention, screening, treatment, and maintenance of copper deficiency in post-pyloric enteral feeding in pediatrics. Current dosing for copper repletion in profound copper deficiency is largely based on case series and expert opinion. At NCH, the current standard-of-care supplementation demonstrates inconsistent improvement in copper repletion, as evidenced by the case reports discussed above. Future research should determine appropriate supplementation and evaluate its efficacy in patients with post-pyloric enteral feeding.</p><p><b>Table 1.</b> Laboratory Evaluation of Case 1.</p><p></p><p>'-' indicates no data available, bolded indicates result below the lower limit of normal for age.</p><p><b>Table 2.</b> Laboratory Evaluation of Case 2.</p><p></p><p>'-' indicates no data available, bolded indicates result below the lower limit of normal for age.</p><p>Meighan Marlo, PharmD<sup>1</sup>; Ethan Mezoff, MD<sup>1</sup>; Shawn Pierson, PhD, RPh<sup>1</sup>; Zachary Thompson, PharmD, MPH, BCPPS<sup>1</sup></p><p><sup>1</sup>Nationwide Children's Hospital, Columbus, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Parenteral nutrition (PN) remains a high-risk therapy, typically containing more than 40 different ingredients. Incorporation of PN prescribing into the electronic health record (EHR) has been recommended to minimize the risk of transcription errors and allow for implementation of important safety alerts for prescribers. Ambulatory PN prescribing workflows typically require manual transcription of orders in the absence of robust EHR interoperability between systems used for transitions of care. 
Pediatric patients receiving ambulatory PN have an increased risk of medication events due to the need for weight-based customization and the low utilization of ambulatory PN, which leads to inexperience at many pharmacies. Development and implementation of a prescribing system incorporating inpatient and outpatient functionality within the EHR is necessary to improve the quality and safety of ambulatory PN in pediatric patients. The primary goal was to create a workflow that improved transitions of care by minimizing manual transcription and improving medication safety. We describe modification of standard EHR tools to achieve this aim.</p><p><b>Methods:</b> A multidisciplinary team developed and incorporated ambulatory PN prescribing within the EHR at Nationwide Children's Hospital. Inpatient and outpatient variances, safety parameters to provide appropriate alerts to prescribers, and legal requirements were considered and evaluated for optimization within the new system.</p><p><b>Results:</b> The final product successfully incorporated ambulatory PN prescribing while allowing seamless transfer of prescriptions between care settings. The prescriber orders the patient-specific ambulatory PN during an inpatient or outpatient encounter. The order is subsequently queued for pharmacist review/verification to assess the adjustments and determine extended stability considerations in the ambulatory setting. After pharmacist review, the prescription is printed, signed by the provider, and faxed to the pharmacy.</p><p><b>Conclusion:</b> To our knowledge, ours is the first institution to develop and incorporate pediatric PN prescribing into the EHR such that prescriptions transfer between inpatient and outpatient settings without manual transcription while still allowing for customization of PN.</p><p>Faith Bala, PhD<sup>1</sup>; Enas Alshaikh, PhD<sup>1</sup>; Sudarshan Jadcherla, MD<sup>1</sup></p><p><sup>1</sup>The Research Institute at Nationwide Children's Hospital, Columbus, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Extrauterine growth remains a concern among preterm-born infants admitted to the neonatal ICU (NICU) as it rarely matches intrauterine fetal growth rates. Although the reasons are multifactorial, the role played by the duration of exclusive parenteral nutrition (EPN) and the transition to the exclusive enteral nutrition (EEN) phase remains unclear. Significant nutrient deficits can exist during the critical phase from birth to EEN, and thereafter, and these likely impact short- and long-term outcomes. Given this rationale, our aims were to examine the relationship of the duration from birth to EEN with growth and length of hospital stay (LOHS) among convalescing preterm-born infants with oral feeding difficulties.</p><p><b>Methods:</b> This is a retrospective analysis of prospectively collected data from 77 preterm infants admitted to the all-referral level IV NICU at Nationwide Children's Hospital, Columbus, Ohio, who were later referred to our innovative neonatal and infant feeding disorders program for the evaluation and management of severe feeding/aero-digestive difficulties. Inclusion criteria: infants born &lt; 32 weeks gestation, birthweight &lt; 1500 g, absence of chromosomal/genetic disorders, discharged at term-equivalent postmenstrual age (37-42 weeks PMA) on full oral feeding. 
Growth variables were converted to age- and gender-specific Z-scores using the Fenton growth charts. Using the Academy of Nutrition and Dietetics criteria for neonates and preterm populations, extrauterine growth restriction (EUGR) was defined as a weight Z-score decline from birth to discharge of &gt; 0.8. Clinical characteristics stratified by EUGR status were compared using the chi-square test, Fisher exact test, Mann-Whitney U test, and t-test as appropriate. Multivariate regression was used to assess the relationship between the duration from birth to EEN and the growth Z-scores at discharge. Multiple linear regression was used to assess the relationship between the duration from birth to EEN and LOHS.</p><p><b>Results:</b> Forty-two infants (54.5%) had EUGR at discharge, and the proportions of infants with weight and length percentiles &lt; 10% were significantly greater at discharge than at birth (Table 1). Infants who were growth restricted at discharge had a significantly lower gestational age at birth, more often required mechanical ventilation at birth, had a higher incidence of sepsis, and took longer to attain EEN (Table 2). The duration from birth to EEN was significantly negatively associated with weight, length, and head circumference Z-scores at discharge, and significantly positively associated with LOHS (Figure 1).</p><p><b>Conclusion:</b> The duration from birth to exclusive enteral nutrition (EEN) can influence growth outcomes. We speculate that significant gaps between recommended and actual nutrient intake exist during the period to EEN, particularly for those with chronic tube-feeding difficulties. Well-planned and personalized nutrition is relevant until EEN is established, and enteral nutrition advancement strategies using standardized feeding protocols offer prospects for generalizability while still providing opportunities for personalization.</p><p><b>Table 1.</b> Participant Growth Characteristics.</p><p></p><p><b>Table 2.</b> Participants' Clinical Characteristics.</p><p></p><p></p><p><b>Figure 1.</b> Relationship Between the Duration from Birth to EEN and Growth Parameters and Length of Hospital Stay.</p><p>Alayne Gatto, MS, MBA, RD, CSP, LD, FAND<sup>1</sup>; Jennifer Fowler, MS, RDN, CSPCC, LDN<sup>2</sup>; Deborah Abel, PhD, RDN, LDN<sup>3</sup>; Christina Valentine, MD, MS, RDN, FAAP, FASPEN<sup>4</sup></p><p><sup>1</sup>Florida International University, Bloomingdale, GA; <sup>2</sup>East Carolina Health, Washington, NC; <sup>3</sup>Florida International University, Miami Beach, FL; <sup>4</sup>Banner University Medical Center, The University of Arizona, Tucson, AZ</p><p><b>Financial Support:</b> The Rickard Foundation.</p><p><b>Background:</b> The neonatal registered dietitian nutritionist (NICU RDN) plays a crucial role in the care of premature and critically ill infants in the neonatal intensive care unit (NICU). Advanced pediatric competencies are frequently a gap in nutrition degree coursework or dietetic internships. Critical care success with such vulnerable patients requires expertise in patient-centered care, multidisciplinary collaboration, and adaptive clinical problem-solving. This research aimed to identify the needs, engagement levels, and expertise of NICU RDNs, while also providing insights into their job satisfaction and career longevity. 
Currently in the US, there are approximately 850 Level III and Level IV NICUs in which neonatal dietitians are vital care team members, making it imperative that hospitals provide appropriate compensation, benefits, and educational support.</p><p><b>Methods:</b> This was a cross-sectional examination using a national, online, IRB-approved survey conducted during March 2024 and sent to established Neonatal and Pediatric Dietitian practice groups. A Qualtrics link was provided for current and former NICU RDNs to complete a 10-minute online survey that offered an optional gift card for completion. The link remained open until 200 gift cards were exhausted, approximately one week after the survey opened. Statistical analyses were performed using Stats IQ Qualtrics. In the descriptive statistics, frequencies of responses were represented as counts and percentages. For comparison of differences, the Chi-Squared test and Fisher's Exact test were used for categorical analysis.</p><p><b>Results:</b> In total, 253 current (n = 206) and former (n = 47) NICU RDNs completed the online questionnaire. Of the 210 respondents, 84 (40%) reported having pediatric clinical experience, 94 (44%) had clinical pediatric dietetic intern experience, 21 (10%) had previously worked as a WIC nutritionist, 15 (7.1%) had specialized pediatric certification or fellowship, and 12 (5.7%) had no prior experience before starting in the NICU (Table 1). Of 163 respondents, 83 (50.9%) reported receiving financial support or reimbursement for additional NICU training. Respondents who felt valued as team members planned to stay in the NICU RD role for more than 5 years (p &gt; 0.0046). Additionally, they reported having acknowledgement and appreciation (64.4%), motivation (54.1%), and opportunities for advancement (22.9%) (Table 2).</p><p><b>Conclusion:</b> NICU RDNs have neither a clear competency roadmap nor a career development track. In addition, financial support or reimbursement for continuing education is not consistently an employee benefit, which may play a key role in job satisfaction and retention. These data provide valuable insight not only for managers of dietitians but also for professional societies to build programs and retention opportunities.</p><p><b>Table 1.</b> Question: When You Started as a NICU RD, What Experience Did You Have? (n = 210).</p><p></p><p>N and Percentages will total more than 210 as respondents could check multiple answers.</p><p><b>Table 2.</b> Comparison of Questions: Do You Feel You Have the Following in Your Role in the NICU and How Long Do You Plan to Stay in Your Role?</p><p></p><p>Sivan Kinberg, MD<sup>1</sup>; Christine Hoyer, RD<sup>2</sup>; Everardo Perez Montoya, RD<sup>2</sup>; June Chang, MA<sup>2</sup>; Elizabeth Berg, MD<sup>2</sup>; Jyneva Pickel, DNP<sup>2</sup></p><p><sup>1</sup>Columbia University Irving Medical Center, New York, NY; <sup>2</sup>Columbia University Medical Center, New York, NY</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Patients with short bowel syndrome (SBS) can have significant fat malabsorption due to decreased intestinal surface area, bile acid deficiency, and rapid transit time, often leading to feeding intolerance and dependence on parenteral nutrition (PN). 
Patients with SBS can have deficiency of pancreatic enzymes and/or reduced effectiveness of available pancreatic enzymes, resulting in symptoms of exocrine pancreatic insufficiency (EPI), including weight loss, poor weight gain, abdominal pain, diarrhea, and fat-soluble vitamin deficiencies. Oral pancreatic enzyme replacement therapy (PERT) is often tried but is impractical for use with enteral nutrition (EN) due to inconsistent enzyme delivery and risk of clogged feeding tubes. When used with continuous EN, oral PERT provides inadequate enzyme delivery as its ability to hydrolyze fats decreases significantly after 30 minutes of ingestion. The significant need for therapies to improve enteral absorption in this population has led to interest in using an in-line digestive cartridge to treat EPI symptoms in SBS patients on tube feedings. Immobilized lipase cartridge is an FDA-approved in-line digestive cartridge designed to hydrolyze fat in EN before it reaches the gastrointestinal tract, allowing for the delivery of absorbable fats through continuous or bolus feeds. In patients with SBS, this modality may be more effective in improving enteral fat absorption compared to oral preparations of PERT. Preclinical studies have demonstrated that use of an in-line digestive cartridge in a porcine SBS model increased fat-soluble vitamin absorption, reduced PN dependence, and improved intestinal adaptation. Our study aims to evaluate changes in PN, EN, growth parameters, stool output, and fat-soluble vitamin levels in pediatric SBS patients using an in-line digestive cartridge at our center.</p><p><b>Methods:</b> Single-center retrospective study in pediatric patients with SBS on EN who used an in-line immobilized lipase (RELiZORB) cartridge. Data collection included patient demographics, etiology of SBS, surgical history, PN characteristics (calories, volume, infusion hours/days), EN characteristics (tube type, bolus or continuous feeds, formula, calories, volume, hours), immobilized lipase cartridge use (#cartridges/day, duration), anthropometrics, stool output, gastrointestinal symptoms, medications (including previous PERT use), laboratory assessments (fat-soluble vitamin levels, fatty acid panels, pancreatic elastase), indication to start immobilized lipase cartridge, and any reported side effects. Patients with small intestinal transplant or cystic fibrosis were excluded.</p><p><b>Results:</b> Eleven patients were included in the study (mean age 10.4 years, 55% female). The most common etiology of SBS was necrotizing enterocolitis (45%) and 7 (64%) of patients were dependent on PN. Results of interim analysis show: mean duration of immobilized lipase cartridge use of 3.9 months, PN calorie decrease in 43% of patients, weight gain in 100% of patients, and improvement in stool output in 6/9 (67%) patients. Clogging of the cartridges was the most common reported technical difficulty (33%), which was overcome with better mixing of the formula. No adverse events or side effects were reported.</p><p><b>Conclusion:</b> In this single-center study, use of an in-line immobilized lipase digestive cartridge in pediatric patients with SBS demonstrated promising outcomes, including weight gain, improved stool output and reduced dependence on PN. These findings suggest that in-line digestive cartridges may play a role in improving fat malabsorption and decreasing PN dependence in pediatric SBS patients. 
Larger multicenter studies are needed to further evaluate the efficacy, tolerability, and safety of in-line digestive cartridges in this population.</p><p>Vikram Raghu, MD, MS<sup>1</sup>; Feras Alissa, MD<sup>2</sup>; Simon Horslen, MB ChB<sup>3</sup>; Jeffrey Rudolph, MD<sup>2</sup></p><p><sup>1</sup>University of Pittsburgh School of Medicine, Gibsonia, PA; <sup>2</sup>UPMC Children's Hospital of Pittsburgh, Pittsburgh, PA; <sup>3</sup>University of Pittsburgh School of Medicine, Pittsburgh, PA</p><p><b>Financial Support:</b> National Center for Advancing Translational Sciences (KL2TR001856.)</p><p><b>Background:</b> Administrative databases can provide unique perspectives in rare disease due to the ability to query multicenter data efficiently. Pediatric intestinal failure can be challenging to study due to the rarity of the condition at single centers and the previous lack of a single diagnosis code. In October 2023, a new diagnosis code for intestinal failure was added to the International Classification of Diseases, 10<sup>th</sup> revision (ICD-10). We aimed to describe the usage and limitations of this code to identify children with intestinal failure.</p><p><b>Methods:</b> We performed a multicenter cross-sectional study using the Pediatric Health Information Systems database from October 1, 2023, to June 30, 2024. Children with intestinal failure were identified by ICD-10 code (K90.83). Descriptive statistics were used to characterize demographics, diagnoses, utilization, and outcomes.</p><p><b>Results:</b> We identified 1804 inpatient encounters from 849 unique patients with a diagnosis code of intestinal failure. Figure 1 shows the trend of code use by month since its inception in October 2023. The 849 patients had a total of 7085 inpatient encounters over that timeframe, meaning only 25% of their encounters included the intestinal failure diagnosis. Among these 849 patients, 638 had at least one encounter over the timeframe in which they received parenteral nutrition; 400 corresponded to an admission in which they also had an intestinal failure diagnosis code. Examining only inpatient stays longer than 2 days, 592/701 (84%) patients with such a stay received parenteral nutrition. Central line-associated bloodstream infections accounted for 501 encounters. Patients spent a median of 37 days (IQR 13-96) in the hospital. Patients were predominantly non-Hispanic White (43.7%), non-Hispanic Black (14.8%), or Hispanic (27.9%). Most had government-based insurance (63.5%). Child Opportunity Index was even split among all five quintiles. The total standardized cost from all encounters with an intestinal failure diagnosis totaled $157 million with the total from all encounters with these patients totaling $259 million. The median cost over those 9 months per patients was $104,890 (IQR $31,149 - $315,167). Death occurred in 28 patients (3.3%) over the study period.</p><p><b>Conclusion:</b> The diagnosis code of intestinal failure has been used inconsistently since implementation in October 2023, perhaps due to varying definitions of intestinal failure. Children with intestinal failure experience high inpatient stay costs and rare but significant mortality. 
Future work must consider the limitations of using only the new code in identifying these patients.</p><p></p><p><b>Figure 1.</b> Number of Encounters With an Intestinal Failure Diagnosis Code.</p><p><b>Poster of Distinction</b></p><p>Kera McNelis, MD, MS<sup>1</sup>; Allison Ta, MD<sup>2</sup>; Ting Ting Fu, MD<sup>2</sup></p><p><sup>1</sup>Emory University, Atlanta, GA; <sup>2</sup>Cincinnati Children's Hospital Medical Center, Cincinnati, OH</p><p><b>Encore Poster</b></p><p><b>Presentation:</b> 6th Annual Pediatric Early Career Research Conference, August 27, 2024, Health Sciences Research Building I, Emory University.</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> The neonatal period is a time of rapid growth, and many infants who require intensive care need extra nutritional support. The Academy of Nutrition and Dietetics (AND) published an expert consensus statement to establish criteria for the identification of neonatal malnutrition. There is limited evidence regarding outcomes associated with diagnosis from these opinion-derived criteria. The objective of this study was to compare anthropometric-based malnutrition indicators with direct body composition measurements in infancy.</p><p><b>Methods:</b> Air displacement plethysmography is considered the gold standard for non-invasive body composition measurement, and this was incorporated into routine clinical care at a referral Level IV neonatal intensive care unit. Late preterm and term infants (34-42 weeks gestational age) with body composition measurement available were included in this study. Infants were categorized as having malnutrition per AND criteria. Fat mass, fat-free mass, and body fat percentage z-scores were determined per Norris body composition growth curves. Logistic regression was conducted to ascertain the relationship of fat mass, fat-free mass, and body fat percentage with malnutrition diagnosis. Linear regression was performed to predict body mass index (BMI) at age 18-24 months from each body composition variable.</p><p><b>Results:</b> Eighty-four infants were included, with 39% female and 96% singleton (Table 1). Fifteen percent were small for gestational age and 12% were large for gestational age at birth. Nearly half had a congenital intestinal anomaly, including gastroschisis and intestinal atresia. Sixty-three percent of the group met at least one malnutrition criterion. Fat-free mass z-score was negatively associated with a malnutrition diagnosis, with an odds ratio 0.77 (95% CI 0.59-0.99, p &lt; 0.05). There was not a statistically significant association between malnutrition diagnosis and body fat percentage or fat mass. There was not a statistically significant relationship between any body composition variable and BMI at 18-24 months, even after removing outliers with a high Cook's distance.</p><p><b>Conclusion:</b> Malnutrition diagnosis is associated with low fat-free mass in critically ill term infants. Body composition is not a predictor of later BMI in this small study.</p><p><b>Table 1.</b> Characteristics of Late Preterm and Term Infants in the Neonatal Intensive Care Unit With a Body Composition Measurement Performed Via Air Displacement Plethysmography.</p><p></p><p>John Stutts, MD, MPH<sup>1</sup>; Yong Choe, MAS<sup>1</sup></p><p><sup>1</sup>Abbott, Columbus, OH</p><p><b>Financial Support:</b> Abbott.</p><p><b>Background:</b> The prevalence of obesity in children is rising. 
Despite the awareness and work toward weight reduction, less is known about malnutrition in children with obesity. The purpose of this study was to evaluate the prevalence of obesity in U.S. children and determine which combination of indicators best defines malnutrition in this population.</p><p><b>Methods:</b> The 2017-2018 National Health and Nutrition Examination Survey (NHANES) database was utilized to assess biomarkers (the most recent complete dataset available due to the COVID-19 pandemic). Trends in prevalence were obtained from 2013-2018 survey data. Obesity was defined as ≥ 95th percentile of the CDC sex-specific BMI-for-age growth charts. The cohort age range was 12-18 years. Nutrient intake and serum levels were analyzed for vitamin D, vitamin E, vitamin C, potassium, calcium, vitamin B9, vitamin A, total protein, albumin, globulin, high sensitivity C-reactive protein (hs-CRP), iron, hemoglobin and mean corpuscular volume (MCV). Intake levels of fiber were also analyzed. Children consuming supplements were excluded. Categorical and continuous data analysis was performed using SAS® Proc SURVEYFREQ (with Wald chi-square test) and Proc SURVEYMEANS (with t-test), respectively, in SAS® Version 9.4 and SAS® Enterprise Guide Version 8.3. Hypothesis tests were performed using 2-sided, 0.05 level tests. Results were reported as mean ± standard error (SE) (n = survey sample size) or percent ± SE (n).</p><p><b>Results:</b> The prevalence of obesity in the cohort was 21.3% ± 1.3 (993). Prevalence trended upward annually: 20.3% ± 2.1 (1232) in 2013-2014 and 20.5% ± 2.0 (1129) in 2015-2016. When compared with children without obesity, the mean serum levels of those with obesity were significantly (p ≤ 0.05) lower for vitamin D (50.3 ± 2.4 vs. 61.5 ± 2.1, p &lt; 0.001), iron (14.4 ± 0.5 vs. 16.4 ± 0.4, p &lt; 0.001), albumin (41.7 ± 0.3 vs. 43.3 ± 0.3, p &lt; 0.001), and MCV (83.8 ± 0.5 vs. 85.8 ± 0.3, p = 0.003). When compared with children without obesity, the mean serum levels of those with obesity were significantly (p ≤ 0.05) higher for total protein (73.2 ± 0.4 vs. 72.0 ± 0.3, p = 0.002), globulin (31.5 ± 0.4 vs. 28.7 ± 0.3, p &lt; 0.001) and hs-CRP (3.5 ± 0.3 vs. 1.2 ± 0.2, p &lt; 0.001). A higher prevalence of insufficiency was found for vitamin D (51.9% ± 5.6 vs. 26.8% ± 3.7, p = 0.001), hemoglobin (16.3% ± 3.1 vs. 7.5% ± 1.8, p = 0.034) and the combination of low hemoglobin + low MCV (11.2% ± 2.9 vs. 3.3% ± 1.0, p = 0.049). All other serum levels were not significantly (p &gt; 0.05) different, with no significant difference in intake.</p><p><b>Conclusion:</b> Results indicate a continued increase in the prevalence of obesity in children. Compared with the non-obese pediatric population, the data also show differences in micro- and macronutrient serum levels, with no significant differences in dietary intake of these nutrients. The higher prevalence of low hemoglobin + low MCV supports iron deficiency and adds clinical relevance to the data surrounding low mean blood levels of iron. Children with obesity show higher mean globulin and hs-CRP levels, consistent with an inflammatory state. 
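<p>As an illustrative sketch only (not the SAS code used by the authors), the survey-weighted group comparison described in the Methods could be approximated as below with hypothetical NHANES-style columns; note that this simple weighting ignores the strata and cluster (PSU) design variables that the SAS SURVEY procedures account for:</p><pre>
# Hedged sketch: weighted mean serum vitamin D by obesity status (hypothetical data).
import numpy as np
import pandas as pd
from statsmodels.stats.weightstats import DescrStatsW, ttest_ind

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "obese": rng.integers(0, 2, 500),          # hypothetical obesity indicator
    "vit_d": rng.normal(55, 15, 500),          # hypothetical serum 25(OH)D values
    "exam_wt": rng.uniform(5000, 50000, 500),  # hypothetical examination sample weights
})

obese, non_obese = df[df.obese == 1], df[df.obese == 0]
for name, grp in [("obese", obese), ("non-obese", non_obese)]:
    d = DescrStatsW(grp.vit_d, weights=grp.exam_wt)
    print(f"{name}: weighted mean = {d.mean:.1f} +/- {d.std_mean:.2f} SE")

t, p, dof = ttest_ind(obese.vit_d, non_obese.vit_d, usevar="unequal",
                      weights=(obese.exam_wt, non_obese.exam_wt))
print(f"weighted t-test: t = {t:.2f}, p = {p:.3f}")
</pre>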
The results underscore the existence of malnutrition in children with obesity and the need for nutrition awareness in this pediatric population.</p><p>Elisha London, BS, RD<sup>1</sup>; Derek Miketinas, PhD, RD<sup>2</sup>; Ariana Bailey, PhD, MS<sup>3</sup>; Thomas Houslay, PhD<sup>4</sup>; Fabiola Gutierrez-Orozco, PhD<sup>1</sup>; Tonya Bender, MS, PMP<sup>5</sup>; Ashley Patterson, PhD<sup>1</sup></p><p><sup>1</sup>Reckitt/Mead Johnson, Evansville, IN; <sup>2</sup>Data Minded Consulting, LLC, Houston, TX; <sup>3</sup>Reckitt/Mead Johnson Nutrition, Henderson, KY; <sup>4</sup>Reckitt/Mead Johnson Nutrition, Manchester, England; <sup>5</sup>Reckitt/Mead Johnson Nutrition, Newburgh, IN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> The objective was to examine whether nutrient intake varied across malnutrition classification among a nationally representative sample of children and adolescents.</p><p><b>Methods:</b> This was a secondary analysis of children and adolescents 1-18 y who participated in the National Health and Nutrition Examination Survey 2001-March 2020. Participants were excluded if they were pregnant or did not provide at least one reliable dietary recall. The degree of malnutrition risk was assessed by weight-for-height and BMI-for-age Z-scores for 1-2 y and 3-18 y, respectively. As per the Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition, malnutrition was classified using Z-scores: none (Z &gt; -1), mild (Z between -1 and -1.9), and moderate/severe (Z ≤ -2). Dietary intake was assessed from one to two 24-hr dietary recalls. Usual intakes of macronutrients from foods and beverages, and of micronutrients from foods, beverages, and supplements were assessed using the National Cancer Institute method. The cut-point approach was used to estimate the proportion of children with intake below the estimated average requirement for micronutrients and outside the acceptable macronutrient distribution range for macronutrients. All analyses were adjusted for sampling methodology and the appropriate sample weights were applied. Independent samples t-tests were conducted to compare estimates within age groups between nutrition status classifications, with no malnutrition as the reference group.</p><p><b>Results:</b> A total of 32,188 participants were analyzed. Of those, 31,689 (98.4%) provided anthropometrics. The majority (91%) did not meet criteria for malnutrition, while 7.4% and 1.6% met criteria for mild and moderate/severe malnutrition, respectively. Dietary supplement use (mean[SE]) was reported among 33.3[0.7]% of all children/adolescents. Of those experiencing moderate/severe malnutrition, inadequate calcium intake was greatest in adolescents 14-18 y (70.6[3.3]%), and older children 9-13 y (69.7[3.4]%), compared to 3-8 y (29.5[3.9]%) and 1-2 y (5.0[1.2]%). In children 9-13 y, risk for inadequate calcium intake was greater among those experiencing mild (65.3[2.1]%) and moderate/severe malnutrition (69.7[2.1]%) compared to those not experiencing malnutrition (61.2[1.1]%). Similarly, in those experiencing moderate/severe malnutrition, inadequate zinc and phosphorus intake was greatest in adolescents 14-18 y (20.9[3.3]% and 30.6[3.4]%, respectively) and 9-13 y (13.4[2.6]% and 33.9[3.6]%, respectively) compared to younger children (range 0.2[0.1]-0.9[0.4]% and 0.2[0.1]-0.4[0.2]% in 1-8 y, respectively). 
Although the greatest risk for inadequate protein intake was observed among those experiencing moderate/severe malnutrition, percent energy intake from protein and carbohydrate was adequate for most children and adolescents. Total and saturated fat were consumed in excess by all age groups regardless of nutrition status. The greatest risk for excessive saturated fat intake was reported in those experiencing moderate/severe malnutrition (range across all age groups: 85.9-95.1%).</p><p><b>Conclusion:</b> Older children and adolescents experiencing malnutrition were at greatest risk for inadequate intakes of certain micronutrients such as calcium, zinc, and phosphorus. These results may indicate poor diet quality among those at greatest risk for malnutrition, especially adolescents.</p><p>Anna Benson, DO<sup>1</sup>; Louis Martin, PhD<sup>2</sup>; Katie Huff, MD, MS<sup>2</sup></p><p><sup>1</sup>Indiana University School of Medicine, Carmel, IN; <sup>2</sup>Indiana University School of Medicine, Indianapolis, IN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Trace metals are essential for growth and are especially important in the neonatal period. Recommendations are available for intake of these trace metals in the neonate. However, these recommendations are based on limited data and there are few available descriptions regarding trace metal levels in neonates and their influence on outcomes. In addition, monitoring trace metal levels can be difficult as multiple factors, including inflammation, can affect accuracy. The goal of this project was to evaluate patient serum levels of zinc, selenium and copper and related outcomes including growth, rate of cholestasis, sepsis, bronchopulmonary dysplasia (BPD), and death in a parenterally dependent cohort admitted to the neonatal intensive care unit (NICU).</p><p><b>Methods:</b> We completed a retrospective chart review of NICU patients who received parenteral nutrition (PN) and had trace metal panels drawn between January 2016 and February 2023. Charts were reviewed for baseline labs, time on PN, trace metal panel level, dose of trace metals in PN, enteral feeds and supplements, and outcomes including morbidities and mortality. Sepsis was diagnosed based on a positive blood culture, and cholestasis as a direct bilirubin &gt; 2 mg/dL. Fisher's exact test or chi-square test was used to assess associations between categorical variables. Spearman correlation was used to assess the correlation between two continuous variables. A p-value threshold of 0.05 was used for significance.</p><p><b>Results:</b> We included 98 patients in the study, with demographic data shown in Table 1. The numbers of patients with trace metal elevation and deficiency are noted in Tables 1 and 2, respectively. The correlation between growth and trace metal levels is shown in Figure 1. Patient outcomes related to the diagnosis of trace metal deficiency are noted in Table 2. Copper deficiency was found to be significantly associated with sepsis (p = 0.010) and BPD (p = 0.033). Selenium deficiency was associated with cholestasis (p = 0.001) and BPD (p = 0.003). To further assess the relationship between selenium and cholestasis, Spearman correlation showed a significant negative correlation between selenium levels and direct bilirubin levels (p = 0.002; Figure 2).</p><p><b>Conclusion:</b> Trace metal deficiency was common in our population. In addition, selenium and copper deficiency were associated with neonatal morbidities including sepsis, cholestasis, and BPD. 
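<p>For illustration only, the association and correlation tests named in the Methods above (Fisher's exact test and Spearman correlation) could be run as in the following sketch; the counts and laboratory values are hypothetical, not study data:</p><pre>
# Hedged sketch: Fisher's exact test for a deficiency-outcome table and Spearman correlation
# for two continuous labs (all values hypothetical).
import numpy as np
from scipy import stats

# Hypothetical 2x2 table: copper deficiency (rows) vs. sepsis (columns)
table = np.array([[8, 12],
                  [5, 73]])
odds_ratio, p_fisher = stats.fisher_exact(table)
print(f"Fisher's exact: OR = {odds_ratio:.2f}, p = {p_fisher:.3f}")

# Hypothetical selenium and direct bilirubin values for 98 patients
rng = np.random.default_rng(2)
selenium = rng.normal(80, 20, 98)
direct_bili = 5 - 0.03 * selenium + rng.normal(0, 1, 98)
rho, p_spearman = stats.spearmanr(selenium, direct_bili)
print(f"Spearman rho = {rho:.2f}, p = {p_spearman:.3f}")
</pre>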
When assessing selenium deficiency and cholestasis, the measured selenium level was found to correlate with direct bilirubin level. While there was correlation between trace metal levels and growth, the negative association noted is unclear and highlights the need for further assessment to determine the influence of other patient factors and the technique used for growth measurement. Overall, this project highlights the important relation between trace metals and neonatal morbidities. Further research is needed to understand how trace metal supplementation might be used to optimize neonatal outcomes in the future.</p><p><b>Table 1.</b> Patient Demographic and Outcome Information for Entire Population Having Trace Metal Levels Obtained. (Total population n = 98 unless noted).</p><p></p><p>Patient demographic and outcome information for entire population having trace metal levels obtained.</p><p><b>Table 2.</b> Rate of Trace Metal Deficiency and Association With Patient Outcomes.</p><p>(Total n = 98).</p><p></p><p>Rate of trace metal deficiency and association with patient outcomes.</p><p></p><p>Scatter plot of average trace metal level and change in growth over time. (Growth represented as change in parameter over days). The Spearman correlation coefficient is listed for each graph. Significance is noted by symbol: *p-value &lt; 0.05, †p-value &lt; 0.01, ‡p-value &lt; 0.001.</p><p><b>Figure 1.</b> Correlation of Trace Metal Level and Growth.</p><p></p><p>Scatter plot of individual direct bilirubin levels plotted by selenium levels. The Spearman correlation coefficient indicated a negative correlation (p = 0.002).</p><p><b>Figure 2.</b> Correlation of Selenium Level With Direct Bilirubin Level.</p><p>Kaitlin Berris, RD, PhD (student)<sup>1</sup>; Qian Zhang, MPH<sup>2</sup>; Jennifer Ying, BA<sup>3</sup>; Tanvir Jassal, BSc<sup>3</sup>; Rajavel Elango, PhD<sup>4</sup></p><p><sup>1</sup>BC Children's Hospital, North Vancouver, BC; <sup>2</sup>BCCHR, Vancouver, BC; <sup>3</sup>University of British Columbia, Vancouver, BC; <sup>4</sup>UBC/BCCHR, Vancouver, BC</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Pediatric critical illness causes increased demand for several nutrients. Children admitted who require nutrition support have a nasogastric tube placed to deliver enteral nutrition (EN) formula as liquid nutrition. These formulas are developed using dietary reference intakes (DRI) for healthy populations and do not account for altered nutrient needs in critical illness. Imbalanced nutrient delivery during this vulnerable time could lead to malnutrition and negatively impact hospital outcomes. Published guidelines from the American Society for Parenteral and Enteral Nutrition (ASPEN) in 2017 aim to alleviate poor nutrition: use concentrated formula during fluid restriction, meet 2/3 of estimated calories (resting energy expenditure, REE) by the end of the first week, provide a minimum of 1.5 g/kg/day of dietary protein, and provide the DRI for each micronutrient. The objective of this retrospective cohort study was to evaluate nutrition delivery (prescribed vs. delivered) against the 2017 guidelines and to correlate delivery with adequacy in children admitted to a Canadian PICU.</p><p><b>Methods:</b> Three years of charts were included over two retrospective cohorts: September 2018 – December 2020 and February 2022 – March 2023. The first cohort, which was paper chart based, included children 1-18 y with tube feeding started within 3 d after admission.
The second cohort, after transition to electronic medical records, included children 1-6 y on exclusive tube feeding during the first week of admission. Patient characteristics, daily formula type, rate prescribed (physician order), amount delivered (nursing notes), and interruption hours and reasons were collected. Statistical analysis included descriptive analysis of characteristics and logistic regression for the odds of achieving adequacy of intake with two exposures: age category and formula type. Pearson correlation was used to relate interruption hours to the percentage of calories met.</p><p><b>Results:</b> The included patients (n = 86) spanned 458 nutrition support days (NSD). Admissions were predominantly for respiratory disease (73%), with 81.4% requiring ventilation support. The calorie prescription (WHO REE equation) was met in 20.3% of NSD, and 43.9% met 2/3 of the calorie recommendation (Table 1). Concentrated calories were provided in 34% of patients. Hours of interruption and percentage of goal calories met were negatively correlated (r = -0.52, p = 0.002) among those ordered EN without a prior EN history (i.e., home tube feeding). Patients with more than 4 h of interruptions were more likely not to meet the 2/3 calorie goal. Calorie goals were met when standard pediatric formula was used in children 1-8 y and concentrated adult formula in 9-18 y. The odds of meeting the calorie goal increased by 85% per 1-day increase (OR 1.85 [1.52, 2.26], p &lt; .0001), with a median of 4 d after admission to meet 2/3 of calories. Minimum protein intake (1.5 g/kg/d) was only met in 24.9% of all NSD. The micronutrients examined, except for vitamin D, met the DRI based on prescribed amounts. Delivered amounts provided suboptimal micronutrient intake, especially for vitamin D (Figure 1).</p><p><b>Conclusion:</b> Current ASPEN recommendations for EN were not being achieved in critically ill children. Concentrated formula was often not chosen, which decreased the ability to meet calorie goals in younger patients. Prescribing a shorter continuous EN duration (20/24 h) may improve the odds of meeting calorie targets. Evaluation of NSD showed an improving trend in calorie intake over the first week toward meeting the 2/3 goal recommendation. However, the results highlight the inadequacy of protein intake even as calorie needs are increasingly met. Attention to micronutrient delivery, with supplementation of vitamin D, is required.</p><p><b>Table 1.</b> Enteral Nutrition Characteristics Per Nutrition Support Days. (N = 458)</p><p></p><p></p><p>Estimated Vitamin D Intake and 95% Confidence Intervals by Age and Formula Groups.</p><p><b>Figure 1.</b> Estimated Vitamin D Intake by Age and Formula Groups.</p><p>Dana Steien, MD<sup>1</sup>; Megan Thorvilson, MD<sup>1</sup>; Erin Alexander, MD<sup>1</sup>; Molissa Hager, NP<sup>1</sup>; Andrea Armellino, RDN<sup>1</sup></p><p><sup>1</sup>Mayo Clinic, Rochester, MN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Home parenteral nutrition (HPN) is a life-sustaining therapy for children with long-term digestive dysfunction. Historically, HPN has been considered a bridge to enteral autonomy or intestinal transplantation. However, thanks to medical and management improvements, HPN is now used for a variety of diagnoses, including intractable feeding intolerance (IFI) in children with severe neurological impairment (SNI). IFI occurs most often near the end-of-life (EOL) in patients with SNI.
Thus, outpatient planning and preparation for HPN in this population vastly differ from historical HPN use.</p><p><b>Methods:</b> This is a case series of four pediatric patients with SNI who developed IFI and utilized HPN during their EOL care. Data was collected by retrospective chart review. The hospital pediatric palliative care service was heavily involved in the patients’ care when HPN was discussed and planned. The pediatric intestinal rehabilitation (PIR) and palliative care teams worked closely together during discharge planning and throughout the outpatient courses.</p><p><b>Results:</b> The children with SNI in this case series developed IFI between ages 1 and 12 years. Duration of HPN use varied from 5 weeks to 2 years. All patients were enrolled in hospice, but at various stages. Routine outpatient HPN management plans and expectations were modified based on each family's goals of EOL care. The use and timing of laboratory studies, fever plans, central line issues, growth, and follow-up appointments required detailed discussion and planning.</p><p><b>Conclusion:</b> EOL care for children differs from most EOL care in adults. Providing HPN to children with SNI and IFI can offer time, opportunities, and peace for families during their child's EOL journey, if it aligns with their EOL goals. PIR teams can provide valuable HPN expertise for palliative care services and families during these challenging times.</p><p>Jessica Lowe, DCN, MPH, RDN<sup>1</sup>; Carolyn Ricciardi, MS, RD<sup>2</sup>; Melissa Blandford, MS, RD<sup>3</sup></p><p><sup>1</sup>Nutricia North America, Roseville, CA; <sup>2</sup>Nutricia North America, Rockville, MD; <sup>3</sup>Nutricia North America, Greenville, NC</p><p><b>Financial Support:</b> This study was conducted by Nutricia North America.</p><p><b>Background:</b> Extensively hydrolyzed formulas (eHFs) are indicated for the management of cow milk allergy (CMA) and related symptoms. This category of formulas is often associated with an unpleasant smell and bitter taste; however, whey-based eHFs are considered more palatable than casein-based eHFs.<sup>1-4</sup> The inclusion of lactose, the primary carbohydrate in human milk, in hypoallergenic formula can also promote palatability. Historically, concerns for residual protein traces in lactose have resulted in complete avoidance of lactose in CMA. However, “adverse reactions to lactose in CMA are not supported in the literature, and complete avoidance of lactose in CMA is not warranted.”<sup>5</sup> Clinicians in the United Kingdom previously reported that taste and acceptance of an eHF is an important consideration when prescribing, as they believe a palatable eHF may result in decreased formula refusal and lead to more content families.<sup>1</sup> The objective of this study was to understand caregiver sensory perspectives on an infant whey-based eHF containing lactose.</p><p><b>Methods:</b> Fifteen clinical sites were recruited from across the United States. Clinicians enrolled 132 infants, whose families received the whey-based eHF for 2 weeks, based on clinician recommendation. Caregivers completed two surveys: an enrollment survey and a 2-week follow-up survey characterizing eHF intake, CMA-related symptoms, stooling patterns, sensory perspectives, and satisfaction with the eHF. Data was analyzed using SPSS 27 and descriptive statistics.</p><p><b>Results:</b> One hundred and twenty-two infants completed the study. At enrollment, infants were 22 (±14.7) weeks old.
Prior to study initiation, 12.3% of infants were breastfed, 40.2% were on a casein-based eHF, and 18.9% were on a standard formula with intact proteins. Most patients (97.5%) were fed orally and 2.5% were tube fed. Among all parents who responded, 92.5% (n = 86/93) reported better taste and 88.9% (n = 96/108) reported better smell for the whey-based formula containing lactose compared to the previous formula. For caregivers whose child was on a casein-based eHF at enrollment and responded, 97.6% (n = 41/42) reported better taste and 95.7% (n = 44/46) reported better smell than the previous formula. Additional caregiver-reported perceptions of taste and smell are shown in Figure 1 and Figure 2, respectively. Finally, 89.3% of caregivers said it was easy to start their child on the whey-based eHF containing lactose and 91.8% would recommend it to other caregivers whose child requires a hypoallergenic formula.</p><p><b>Conclusion:</b> The majority of caregivers had a positive sensory experience with the whey-based eHF containing lactose compared to the baseline formulas. Additionally, they found the trial formula easy to transition to and would recommend it to other families. These data support the findings of Maslin et al. and support the clinicians' expectation that good palatability would result in better acceptance and more content infants and families.<sup>1</sup> Further research is needed to better understand how improved palatability can contribute to decreased waste and health care costs.</p><p></p><p><b>Figure 1.</b> Caregiver Ranking: Taste of Whey-Based, Lactose-Containing eHF.</p><p></p><p><b>Figure 2.</b> Caregiver Ranking: Smell of Whey-Based, Lactose-Containing eHF.</p><p>Michele DiCarlo, PharmD<sup>1</sup>; Emily Barlow, PharmD, BCPPS<sup>1</sup>; Laura Dinnes, PharmD, BCIDP<sup>1</sup></p><p><sup>1</sup>Mayo Clinic, Rochester, MN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> There is limited information on hyperkalemia in adult patients who received trimethoprim-sulfamethoxazole (TMP-SMX). The mechanism for hyperkalemia is related to the TMP component, which is structurally related to the potassium-sparing diuretic amiloride. In children, there is no information on clinical impact or monitoring required. We noted that a pediatric patient on total parenteral nutrition (TPN) had a drop in the TPN potassium dosing once the TMP-SMX was started. This reduction persisted for two weeks following the last dose of the antibiotic. Case presentation: A 7-month-old was in a cardiac intensive care unit following complex surgical procedures and an extracorporeal membrane oxygenation requirement. TPN was started due to concerns related to poor perfusion and possible necrotizing enterocolitis. TPN continued for a total of one hundred and ten days. Per protocol while on TPN, electrolytes and renal function (urine output, serum creatinine) were monitored daily. Diuretic therapy, including loop diuretics, chlorothiazide and spironolactone, was prescribed prior to TPN and continued for the entire duration. Renal function remained stable for the duration of TPN therapy. Dosing of potassium in the TPN was initiated per ASPEN guidelines and adjusted for serum potassium levels. Due to respiratory requirement and positive cultures, TMP-SMX was added to the medication regimen on two separate occasions. TMP-SMX 15 mg/kg/day was ordered twelve days after the start of the TPN and continued for three days.
TMP-SMX 15 mg/kg/day was again started on day forty-three of TPN and continued for a five-day duration. Serum potassium was closely monitored for adjustments once TMP-SMX was started. When the TPN was successfully weaned off, we reviewed this information again. There was an obvious drop in TPN potassium dosing by day two of both TMP-SMX regimens, and dosing did not return to the prior stable level until approximately two weeks after the last dose of the antibiotic. This reduction lasted far beyond the projected half-life of TMP-SMX (Table 1). Discussion: TMP-SMX is known for potential hyperkalemia in adult patients with multiple confounding factors. Factors include high dosage, renal dysfunction, congestive heart failure, and concomitant medications known to cause hyperkalemia. Little literature exists to note this side effect in pediatrics. The onset of our patient's increased serum potassium levels, and the concurrent decrease in TPN potassium dosing, could be expected, as TMP-SMX's time to peak effect is 1-4 hours. The half-life of TMP in children &lt; 2 years old is 5.9 hours. Given this information, one would expect TMP-SMX to be cleared approximately thirty hours from the last dose administered. Our patient's potassium dosing took approximately two weeks from the end of TMP-SMX administration to return to the pre-TMP-SMX potassium dosing for both treatment regimens. Potential causes for the extended time to stabilize include concurrent high-dose TMP-SMX and continuation of the potassium-sparing diuretic. Prolonged potassium monitoring for pediatric patients started on high-dose TMP-SMX while on TPN should be considered and further evaluation explored.</p><p><b>Methods:</b> None Reported.</p><p><b>Results:</b> None Reported.</p><p><b>Conclusion:</b> None Reported.</p><p></p><p>Graph representing TPN potassium dose (in mEq/kg/day) and the addition of the TMP-SMX regimen on two separate occasions. A drop in TPN potassium dose and a delayed return after each TMP-SMX regimen are noted.</p><p><b>Figure 1.</b> TPN Potassium Dose and TMP-SMX Addition.</p><p>Jennifer Smith, MS, RD, CSP, LD, LMT<sup>1</sup>; Praveen Goday, MBBS<sup>2</sup>; Lauren Storch, MS, RD, CSP, LD<sup>2</sup>; Kirsten Jones, RD, CSP, LD<sup>2</sup>; Hannah Huey, MDN<sup>2</sup>; Hilary Michel, MD<sup>2</sup></p><p><sup>1</sup>Nationwide Children's Hospital, Dresden, OH; <sup>2</sup>Nationwide Children's Hospital, Columbus, OH</p><p><b>Financial Support:</b> North American Society for Pediatric Gastroenterology, Hepatology, and Nutrition (NASPGHAN) Foundation.</p><p><b>Background:</b> The prevalence of Avoidant/Restrictive Food Intake Disorder (ARFID) is approximately 3% in the general population and 10% in adults with inflammatory bowel disease (IBD). Up to 10% of adolescents with IBD have reported disordered eating behaviors; however, there have been no prospective studies on the prevalence of ARFID eating behaviors in this population.</p><p><b>Methods:</b> This is a one-time, cross-sectional, non-consecutive study of English-speaking patients with a confirmed diagnosis of IBD, aged 12-18 years. Participants completed the validated Nine Item ARFID Screen (NIAS), the SCOFF eating disorder screen (this screen utilizes an acronym [<span>S</span>ick, <span>C</span>ontrol, <span>O</span>ne, <span>F</span>at, and <span>F</span>ood] in relation to the five questions on the screen) and answered one question about perceived food intolerances.
The NIAS is organized into three specific ARFID domains: eating restriction due to picky eating, poor appetite/limited interest in eating, and fear of negative consequences from eating, each of which is addressed by three questions. Questions are based on a 6-point Likert scale. Participants scoring ≥23 on the total scale or ≥12 on an individual subscale were considered to meet criteria for ARFID eating behaviors. Since some individuals with a positive NIAS screen may have anorexia nervosa or bulimia, we also used the SCOFF questionnaire to assess the possible presence of an eating disorder. A score of two or more positive answers has a sensitivity of 100% and specificity of 90% for anorexia nervosa or bulimia. Apart from descriptive statistics, Chi-square testing was used to study the prevalence of malnutrition, positive SCOFF screening, or food intolerances in patients with and without positive NIAS screens.</p><p><b>Results:</b> We enrolled 82 patients whose demographics are shown in Table 1. Twenty percent (16/82) scored positive on the NIAS questionnaire, 16% (13/82) scored positive on the SCOFF questionnaire, and 48% (39/82) noted food intolerances. Of the 16 participants who scored positive on the NIAS, 50% (8/16) were male, 56% (9/16) had a diagnosis of Crohn's Disease, and 69% (11/16) had inactive disease. Twenty-five percent of those with a positive NIAS (4/16) met criteria for malnutrition (1 mild, 2 moderate, and 1 severe). Sixty-nine percent of those who scored positive on the NIAS (11/16) noted food intolerances and 30% (5/16) had a positive SCOFF screener. The prevalence of malnutrition (p = 0.4), positive SCOFF eating disorder screens (p = 0.3), and reported food intolerances (p = 0.6) was similar in participants who scored positive on the NIAS vs. those who did not.</p><p><b>Conclusion:</b> Using the NIAS, 20% of adolescents with IBD met criteria for ARFID. Participants were no more likely to have malnutrition, a positive score on the SCOFF eating disorder screen, or reported food intolerances whether or not they met criteria for ARFID. Routine screening of adolescents with IBD for ARFID or other eating disorders may identify patients who would benefit from further evaluation.</p><p><b>Table 1.</b> Demographics.</p><p></p><p>Qian Wen Sng, RN<sup>1</sup>; Jacqueline Soo May Ong<sup>2</sup>; Sin Wee Loh, MB BCh BAO (Ireland), MMed (Paed) (Spore), MRCPCH (RCPCH, UK)<sup>1</sup>; Joel Kian Boon Lim, MBBS, MRCPCH, MMed (Paeds)<sup>1</sup>; Li Jia Fan, MBBS, MMed (Paeds), MRCPCH (UK)<sup>3</sup>; Rehena Sultana<sup>4</sup>; Chengsi Ong, BS (Dietician), MS (Nutrition and Public Health), PhD<sup>1</sup>; Charlotte Lin<sup>3</sup>; Judith Ju Ming Wong, MB BCh BAO, LRCP &amp; SI (Ireland), MRCPCH (Paeds) (RCPCH, UK)<sup>1</sup>; Ryan Richard Taylor<sup>3</sup>; Elaine Hor<sup>2</sup>; Pei Fen Poh, MSc (Nursing), BSN<sup>1</sup>; Priscilla Cheng<sup>2</sup>; Jan Hau Lee, MCI, MRCPCH (UK), USMLE, MBBS<sup>1</sup></p><p><sup>1</sup>KK Hospital, Singapore; <sup>2</sup>National University Hospital, Singapore; <sup>3</sup>National University Hospital Singapore, Singapore; <sup>4</sup>Duke-NUS Graduate Medical School, Singapore</p><p><b>Financial Support:</b> This work is supported by the National Medical Research Council, Ministry of Health, Singapore.</p><p><b>Background:</b> Protein-energy malnutrition is pervasive in pediatric intensive care unit (PICU) patients.
There remains clinical equipoise on the impact of protein supplementation in critically ill children. Our primary aim was to determine the feasibility of conducting a randomized controlled trial (RCT) of protein supplementation versus standard enteral nutrition (EN) in the PICU.</p><p><b>Methods:</b> An open-labelled pilot RCT was conducted from January 2021 to June 2024 in 2 tertiary pediatric centers in Singapore. Children with a body mass index (BMI) z-score &lt; 0 who were expected to require invasive or non-invasive mechanical ventilation for at least 48 hours and required EN support for feeding were included. Patients were randomized (1:1 allocation) to protein supplementation of ≥1.5 g/kg/day in addition to standard EN or standard EN alone for 7 days after enrolment or until discharge to the high dependency unit, whichever was earlier. Feasibility was based on 4 outcomes: effective screening (&gt;80% of eligible patients approached for consent), satisfactory enrolment (&gt;1 patient/center/month), timely protocol implementation (&gt;80% of participants receiving protein supplementation within the first 72 hours) and protocol adherence (receiving &gt;80% of protein supplementation as per protocol).</p><p><b>Results:</b> A total of 20 patients were recruited - 10 (50.0%) and 10 (50.0%) in the protein supplementation and standard EN groups, respectively. Median age was 13.0 [Interquartile range (IQR) 4.2, 49.1] months. Respiratory distress was the most common reason for PICU admission [11 (55.0%)]. Median PICU and hospital length of stay were 8.0 (IQR 4.5, 16.5) and 19.0 (IQR 11.5, 36.5) days, respectively. There were 3 (15%) deaths, which were not related to the trial intervention. The screening rate was 50/74 (67.6%). Mean enrollment was 0.45 patients/center/month. Timely protocol implementation was performed in 15/20 (75%) participants. Protocol adherence was achieved in 11/15 (73.3%) of protein supplementation days.</p><p><b>Conclusion:</b> Satisfactory feasibility outcomes were not met in this pilot RCT. Based on the inclusion criteria of this pilot study and the setup of the centers, a larger study in Singapore alone will not be feasible. With incorporation of revised logistic arrangements, a larger multi-center feasibility study involving regional countries should be piloted.</p><p>Veronica Urbik, MD<sup>1</sup>; Kera McNelis, MD<sup>1</sup></p><p><sup>1</sup>Emory University, Atlanta, GA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Many clinical management and physiologic knowledge gaps are present in the care of the tiny baby population (neonates born at ≤23 weeks gestation, also labelled periviable gestational age). There is therefore significant center-specific variation in practice, as well as increased morbidity and mortality observed in infants born at 22-23 weeks compared to those born at later gestational ages<sup>1</sup>. The importance of nutrition applies to tiny babies, and appropriate nutrition is perhaps even more critical in this population than others. Numerous studies have demonstrated the benefits of full early enteral feeding, including decreased rates of central line-associated bloodstream infection and cholestasis<sup>2,3</sup>. The risk of developing necrotizing enterocolitis (NEC) is a balancing measure for advancement of enteral feeds<sup>4</sup>. Adequate nutrition is critical for growth, reducing morbidity and mortality, and improving overall outcomes.
Current proposed protocols for this population target full enteral feeding volumes to be reached by 10-14 days of life<sup>5</sup>.</p><p><b>Methods:</b> Based on baseline data collected at two Level III neonatal intensive care units (NICUs) attended by a single group of academic neonatology faculty from January 2020 – January 2024, the average time from birth until full enteral feeds were achieved was 31 days. Using quality improvement (QI) methodology, we identified the barriers to advancement to full enteral feeds (defined as 120 cc/kg/day) in babies born at 22 and 23 weeks gestational age admitted to the pediatric resident-staffed Level III NICU.</p><p><b>Results:</b> The Pareto chart displays the primary barriers, including undefined critical illness, vasopressor use, evaluation for NEC or spontaneous intestinal perforation, patent ductus arteriosus treatment, and electrolyte derangements (Figure 1). In many cases, no specific reason for not advancing toward full enteral feeds could be identified in chart review.</p><p><b>Conclusion:</b> In this ongoing QI project, our SMART aim is to reduce the number of days to reach full feeds over the period of January 2024 – June 2025 by 10%, informed by root cause analysis and the key driver diagram developed from the data thus far (Figure 2). The first plan-do-study-act cycle started on January 16, 2024. Data are analyzed using statistical process control methods.</p><p></p><p>Pareto Chart.</p><p><b>Figure 1.</b></p><p></p><p>Key Driver Diagram.</p><p><b>Figure 2.</b></p><p>Bridget Hron, MD, MMSc<sup>1</sup>; Katelyn Ariagno, RD, LDN, CNSC, CSPCC<sup>1</sup>; Matthew Mixdorf<sup>1</sup>; Tara McCarthy, MS, RD, LDN<sup>1</sup>; Lori Hartigan, ND, RN, CPN<sup>1</sup>; Jennifer Lawlor, RN, BSN, CPN<sup>1</sup>; Coleen Liscano, MS, RD, CSP, LDN, CNSC, CLE, FAND<sup>1</sup>; Michelle Raymond, RD, LDN, CDCES<sup>1</sup>; Tyra Bradbury, MPH, RD, CSP, LDN<sup>1</sup>; Erin Keenan, MS, RD, LDN<sup>1</sup>; Christopher Duggan, MD, MPH<sup>1</sup>; Melissa McDonnell, RD, LDN, CSP<sup>1</sup>; Rachel Rosen, MD, MPH<sup>1</sup>; Elizabeth Hait, MD, MPH<sup>1</sup></p><p><sup>1</sup>Boston Children's Hospital, Boston, MA</p><p><b>Financial Support:</b> Some investigators received support from agencies including the National Institutes of Health and NASPGHAN, which did not directly fund this project.</p><p><b>Background:</b> The widespread shortage of amino acid-based formula in February 2022 highlighted the need for an urgent and coordinated hospital response to disseminate accurate information to front-line staff in both inpatient and ambulatory settings.</p><p><b>Methods:</b> An interdisciplinary working group consisting of pediatric gastroenterologists, dietitians, nurses and a quality improvement analyst was established in September 2022. The group met at regular intervals to conduct a needs assessment for all disciplines. The group developed and refined a novel clinical process algorithm to respond to reports of formula shortages and/or recalls. The Clinical Process Map is presented in Figure 1. Plan-do-study-act cycles were implemented to improve the quality of the communication output based on staff feedback. The key performance indicator is time from notification of possible shortage to dissemination of communication to stakeholders, with a goal of &lt; 24 hours.</p><p><b>Results:</b> From September 2022 to August 2024, the group met 18 times for unplanned responses to formula recall/shortage events.
Email communication was disseminated within 24 hours for 8/18 (44%) events, within 48 hours for 9/18 (50%), and after 48 hours for 1/18 (6%). Iterative changes included the initiation of an urgent huddle for key stakeholders to identify impact and substitution options; development of a preliminary investigation pathway to ensure the validity of the report; development of a structured email format that was further refined to a table format including images of products (Figure 2); and creation of an email distribution list to disseminate shortage reports. The keys to this project's success were the creation of a multidisciplinary team dedicated to meeting urgently for all events, and the real-time drafting and approval of communication within the meeting. Of note, the one communication which was substantially delayed (94.7 hours) was addressed over email only, underscoring the importance of the multidisciplinary working meeting.</p><p><b>Conclusion:</b> Establishing clear lines of communication and assembling key stakeholders resulted in timely, accurate and coordinated communication regarding nutrition recalls/shortage events at our institution.</p><p></p><p><b>Figure 1.</b> Formula Recall Communication Algorithm.</p><p></p><p><b>Figure 2.</b></p><p>Nicole Misner, MS, RDN<sup>1</sup>; Michelle Yavelow, MS, RDN, LDN, CNSC, CSP<sup>1</sup>; Athanasios Tsalatsanis, PhD<sup>1</sup>; Racha Khalaf, MD, MSCS<sup>1</sup></p><p><sup>1</sup>University of South Florida, Tampa, FL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Early introduction of peanuts and eggs decreases the incidence of peanut and egg allergies in infants who are at high risk of developing food allergies. Prevention guidelines and clinical practice have shifted with recent studies. However, little is known about introducing common food allergens in infants fed via enteral feeding tubes. Early introduction of allergens could be of importance in infants working toward a tube feeding wean and those who may benefit from blended tube feeding in the future. We aimed to compare the characteristics of patients with enteral tubes who received education during their gastroenterology visit versus those who did not.</p><p><b>Methods:</b> We performed a single-center retrospective chart review involving all patients 4 to 24 months of age with an enteral feeding tube seen at the University of South Florida Pediatric Gastroenterology clinic from August 2020 to July 2024. Corrected age was used for infants born &lt; 37 weeks’ gestational age. All types of enteral feeding tubes were included, i.e., nasogastric, gastrostomy, and gastrostomy-jejunostomy tubes. Data on demographics, clinical characteristics and parent-reported food allergen exposure were collected. An exception waiver was received from the University of South Florida Institutional Review Board for this retrospective chart review. Differences between patients who received education and those who did not were evaluated using Student's t-test for continuous variables and the chi-square test for categorical variables. All analysis was performed using R Statistical Software (v4.4.2). A p-value ≤ 0.05 was considered statistically significant.</p><p><b>Results:</b> A total of 77 patients met inclusion criteria, with 349 total visits. Patient demographics at each visit are shown in Table 1. There was a documented food allergy in 12% (43) of total visits. Education on early introduction of common food allergens was provided in 12% (42) of total visits.
Patients who received education at their visit were significantly younger compared to those who did not and were also more likely to have eczema. Table 2 compares nutrition characteristics of patients at visits where education was discussed vs. those where it was not. Infants with any percent of oral intake were more likely to have received education than those who were nil per os (p = 0.013). There was a significant association between starting solids and receiving education (p &lt; 0.001). Reported allergen exposure across all visits was low. For total visits with the patient &lt; 8 months of age (n = 103), only 6% (6) reported peanut and 8% (8) egg exposure. Expanded to &lt; 12 months of age at the time of visit (n = 198), there was a minimal increase in reported allergen exposure: 7% (14) reported peanut and 8% (15) egg exposure. Oral feeds were the most commonly reported form of allergen exposure. Only one patient received a commercial early allergen introduction product. Cow's milk was the most commonly reported allergen exposure, with 61% (63) under 8 months and 54% (106) under 12 months of age, the majority from infant formula.</p><p><b>Conclusion:</b> Age and any proportion of oral intake were associated with receiving education on common food allergen introduction at the visit. However, there were missed opportunities for education in infants with enteral feeding tubes. There were few visits at which patients reported peanut or egg exposure. Further research and national guidelines are needed on optimal methods of introduction in this population.</p><p><b>Table 1.</b> Demographics.</p><p></p><p><b>Table 2.</b> Nutrition Characteristics.</p><p></p><p>Samantha Goedde-Papamihail, MS, RD, LD<sup>1</sup>; Ada Lin, MD<sup>2</sup>; Stephanie Peters, MS, CPNP-PC/AC<sup>2</sup></p><p><sup>1</sup>Nationwide Children's Hospital, Grove City, OH; <sup>2</sup>Nationwide Children's Hospital, Columbus, OH</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Critically ill children with severe acute kidney injury (AKI) requiring continuous renal replacement therapy (CRRT) are at high risk for micronutrient deficiencies, including vitamin C (VC). VC acts as an antioxidant and enzyme cofactor. It cannot be made endogenously and must be derived from the diet. Critically ill patients often enter the hospital with some degree of nutritional debt and may have low VC concentrations upon admission. This is exacerbated in the case of sepsis, multi-organ dysfunction, burns, etc., when VC needs are higher to address increased oxidative and inflammatory stress. When these patients develop a severe AKI requiring CRRT, their kidneys do not reabsorb VC as healthy kidneys would, further increasing losses. Moreover, VC is a small molecule and is filtered out of the blood via CRRT. The combination of increased VC losses and elevated VC requirements in critically ill patients on CRRT results in a high risk of VC deficiency. The true prevalence of VC deficiency in this population is not well known, whereas the prevalence of deficiency is, on average, 5.9% in the general population and 18.3% in critically ill children.
The aim of this study is to ascertain the prevalence of VC deficiency in critically ill pediatric patients on CRRT by monitoring serum VC levels throughout their CRRT course and correcting noted deficiencies.</p><p><b>Methods:</b> An observational study was conducted from May 2023 through August 2024 in a 54-bed high-acuity PICU offering ECMO, CRRT, Level 1 trauma care, burn care, solid organ transplant and bone marrow transplantation as tertiary care services. Fifteen patients were identified upon initiation of CRRT and followed until transfer from the PICU. Serial serum VC levels were checked after 5-10 days on CRRT and rechecked weekly thereafter as appropriate. Deficiency was defined as VC concentrations &lt; 11 µmol/L. Inadequacy was defined as concentrations between 11 and 23 µmol/L. Supplementation was initiated for levels &lt; 23 µmol/L; doses varied from 250 to 500 mg/d depending on age and clinical situation. Most deficient patients received 500 mg/d supplementation. Those with inadequacy received 250 mg/d.</p><p><b>Results:</b> Of 15 patients, 9 had VC deficiency and 4 had VC inadequacy (Figure 1). Of those with deficiency, 5 of 9 patients were admitted for septic shock (Figure 2). VC level was rechecked in 8 patients; the level returned to normal in 5 patients, and 4 of those 5 received 500 mg/d supplementation. Levels remained low in 3 patients; all received 250 mg/d supplementation (Figure 3). Supplementation dose changes are noted in Figure 4.</p><p><b>Conclusion:</b> VC deficiency was present in 60% of CRRT patients, suggesting deficiency is more common in this population and that critically ill patients on CRRT are at higher risk of developing deficiency than those who are not receiving CRRT. Septic shock and degree of VC deficiency appeared to be correlated; 56% of deficient patients were admitted with septic shock. Together, this suggests a need to start supplementation earlier, perhaps upon CRRT initiation or upon admission to the PICU in a septic patient, and to use higher supplementation doses, as our patients with low VC levels at their follow-up check were all receiving 250 mg/d. Study limitations include (1) the potential for VC deficiency prior to CRRT initiation, with critical illness and/or poor intake as confounding factors, and (2) the small sample size. Our results suggest that critically ill children with AKI requiring CRRT are at increased risk of VC deficiency while on CRRT. Future research should focus on identifying at-risk micronutrients for this patient population and creating supplementation regimens to prevent the development of deficiencies.
Our institution is currently crafting a quality improvement project with these aims.</p><p></p><p><b>Figure 1.</b> Initial Serum Vitamin C Levels of Our Patients on CRRT, Obtained 5-10 Days After CRRT Initiation (N = 15).</p><p></p><p><b>Figure 2.</b> Underlying Disease Process of Patients on CRRT (N = 15).</p><p></p><p><b>Figure 3.</b> Follow Up Vitamin C Levels After Supplementation (N = 8), Including Supplementation Regimen Prior to Follow-Up Lab (dose/route).</p><p></p><p><b>Figure 4.</b> Alterations in Supplementation Regimen (dose) Based on Follow-up Lab Data (N = 6).</p><p>Tanner Sergesketter, RN, BSN<sup>1</sup>; Kanika Puri, MD<sup>2</sup>; Emily Israel, PharmD, BCPS, BCPPS<sup>1</sup>; Ryan Pitman, MD, MSc<sup>3</sup>; Elaina Szeszycki, BS, PharmD, CNSC<sup>2</sup>; Ahmad Furqan Kazi, PharmD, MS<sup>1</sup>; Ephrem Abebe, PhD<sup>1</sup></p><p><sup>1</sup>Purdue University College of Pharmacy, West Lafayette, IN; <sup>2</sup>Riley Hospital for Children at Indiana University Health, Indianapolis, IN; <sup>3</sup>Indiana University, Indianapolis, IN</p><p><b>Financial Support:</b> The Gerber Foundation.</p><p><b>Background:</b> During the hospital-to-home transition period, family members or caregivers of medically complex children are expected to assume the responsibility of managing medication and feeding regimens for the child under their care. However, this transition period represents a time of high vulnerability, including the risk of communication breakdowns and lack of tools tailored to caregivers’ context. This vulnerability is further heightened when a language barrier is present as it introduces additional opportunities for misunderstandings leading to adverse events and errors in the home setting. Hence, it is critical that caregivers are educated to develop skills that aid in successful implementation of post-discharge care plans. Addressing unmet needs for caregivers who use languages other than English (LOE) requires an in-depth understanding of the current challenges associated with educating and preparing caregivers for the post-discharge period.</p><p><b>Methods:</b> In this prospective qualitative study, healthcare workers (HCWs) were recruited from a tertiary care children's hospital in central Indiana and were eligible to participate if they were involved directly or indirectly in preparing and assisting families of children under three years of age with medication and feeding needs around the hospital discharge period and/or outpatient care following discharge. Each HCW completed a brief demographic survey followed by observation on the job for 2-3 hours, and participated in a follow-up semi-structured interview using video conferencing technology to expand on observed behavior. Handwritten field notes were taken during the observations, which were immediately typed and expanded upon post-observation. Audio recordings from the interviews were transcribed and de-identified. Both the typed observation notes and interview transcripts were subjected to thematic content analysis, which was completed using the Dedoose software.</p><p><b>Results:</b> Data collection is ongoing with anticipated completion in October 2024. Fourteen HCW interviews have been completed to date, with a target sample of 20-25 participants. Preliminary analysis presented is from transcripts of seven interviews. Participants included six females and one male with a mean age of 35.4 years (range, 24 - 59). 
HCWs were from diverse inpatient and outpatient clinical backgrounds including registered dietitians, physicians, pharmacists, and nurses. Four overarching themes describe the challenges that HCWs experience when communicating with caregivers who use LOE during the hospital-to-home transition. These themes include lack of equipment and materials in diverse languages, challenges with people and technologies that assist with translating information, instructions getting lost in translation/uncertainty of translation, and difficulty getting materials translated in a timely manner. Main themes, subthemes, and examples are presented in Figure 1, and themes, subthemes, and quotes are presented in Table 1.</p><p><b>Conclusion:</b> The study is ongoing; however, based on the preliminary analysis, it is evident that the systems and processes that are in place to aid in communication between HCWs and caregivers who use LOE can be improved. This can ultimately lead to improved quality of care provided to caregivers who use LOE during the hospital-to-home transition and resultant safer care in the home setting for medically complex children.</p><p><b>Table 1.</b> Themes, Subthemes, and Quotes.</p><p></p><p></p><p><b>Figure 1.</b> Main Themes, Subthemes, and Examples.</p><p>Ruthfirst Ayande, PhD, MSc, RD<sup>1</sup>; Shruti Gupta, MD, NABBLM-C<sup>1</sup>; Sarah Taylor, MD, MSCR<sup>1</sup></p><p><sup>1</sup>Yale University, New Haven, CT</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Growth faltering among preterm neonates admitted to NICUs may be managed with hypercaloric and/or hypervolemic feeding. However, in intractable cases of growth faltering or when fluid restriction is indicated, fortification with extensively hydrolyzed liquid protein may offer a solution to meeting the high protein demands for infant growth while limiting overfeeding. Yet, there is limited clinical data regarding supplementation with liquid protein, leaving clinicians to make decisions about dosing and duration on a case-by-case basis.</p><p><b>Methods:</b> We present a case of neonatal growth faltering managed with a liquid protein modular in a level IV NICU in North America.</p><p><b>Results:</b> Case Summary: A male infant was born extremely preterm (GA 24 1/7 weeks) and admitted to the NICU for respiratory distress requiring intubation. The NICU course was complicated by a patent ductus arteriosus (PDA) requiring surgery on day of life (DOL) 31 and by severe bronchopulmonary dysplasia. Birth anthropometrics: weight 0.78 kg; height 31.5 cm. TPN was initiated at birth, with trophic feeds of donor human milk per gavage (PG) for a total provision of 117 ml/kg, 75 kcal/kg, and 3.5 gm/kg of protein. The regimen was advanced per unit protocol; however, on DOL 5, total volume was decreased in the setting of metabolic acidosis and significant PDA. PG feeds of maternal milk were fortified to 24 kcal/oz on DOL 23, and the infant reached full feeds on DOL 26. Feed provision by DOL 28 was ~144 ml/kg/day and 4 g/kg of protein based on estimated dry weight. TPN was first discontinued on DOL 25 but restarted on DOL 32 due to frequent NPO status and clinical instability. Of note, the infant required diuretics during the hospital stay. TPN was again discontinued on DOL 43. At DOL 116, the infant was receiving and tolerating PG feeds fortified to 24 kcal/oz at 151 ml/kg, 121 kcal/kg, and 2.1 gm/kg protein.
The infant's weight gain rate was 56 g/day; however, linear growth was impaired, with a gain of 0.5 cm over 21 days (rate of ~0.2 cm/week). Liquid protein was commenced at DOL 124 to supply an additional 0.5 gm/kg of protein. A week after adding liquid protein, the infant's weight gain rate was 39 g/day, and height increased by 2.5 cm/week. Feed fortification was reduced to 22 kcal/oz on DOL 156 due to rapid weight gain, and liquid protein dosage increased to 0.6 gm/kg for a total protein intake of 2.2 g/kg. At DOL 170, calorie fortification of maternal milk was discontinued, and liquid protein dosage was increased to 1 g/kg in the setting of a relapse of poor linear growth, for a total protein intake of 3.1 g/kg. Liquid protein was provided for two months until discontinuation (d/c) at DOL 183 per parent request. At the time of d/c of liquid protein, the infant's weight and length gain rates for the protein supplementation period (59 days) were 42 gm/day and 1.78 cm/week, respectively.</p><p><b>Conclusion:</b> While we observed objective increases in linear growth for the presented case following the addition of a liquid protein modular, it is crucial to note that these findings are not generalizable, and evidence and guidelines on the use of hydrolyzed liquid protein are limited. Larger, well-controlled studies examining mechanisms of action, appropriate dosage and duration, short- and long-term efficacy, and safety are required to guide best practices for using these modulars.</p><p>Sarah Peterson, PhD, RD<sup>1</sup>; Nicole Salerno, BS<sup>1</sup>; Hannah Buckley, RDN, LDN<sup>1</sup>; Gretchen Coonrad, RDN, LDN<sup>1</sup></p><p><sup>1</sup>Rush University Medical Center, Chicago, IL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Accurate assessment of growth and nutritional status is critical for preterm infants. Various growth charts have been developed to track the growth of preterm infants, but differences in reference standards may influence the diagnosis and management of malnutrition. The goal of this study was to compare the rate of malnutrition, defined by a decline in weight-for-age z-score, using the Fenton growth chart, the Olsen growth chart, and the INTERGROWTH-21st Preterm Postnatal Growth Standard.</p><p><b>Methods:</b> All preterm infants born between 24 and 37 weeks of gestational age who were admitted to the neonatal intensive care unit (NICU) in 2022 and had weight at birth and on day 28 recorded were included. Preterm infants were excluded if they were admitted to the NICU ≥7 days after birth. Sex and gestational age were recorded for each infant. Weight and weight-for-age z-score at birth and on day 28 were recorded. Weight-for-age z-score was determined using three growth charts: (1) the Fenton growth chart, which was developed using growth data from uncomplicated preterm infants who were at least 22 weeks of gestational age from the United States (US), Australia, Canada, Germany, Italy, and Scotland; (2) the Olsen growth chart, which was developed using growth data from uncomplicated preterm infants who were at least 23 weeks of gestational age from the US; and (3) the INTERGROWTH-21st Preterm Postnatal Growth Standard, which was developed using growth data from uncomplicated preterm infants who were at least 24 weeks of gestational age from the US, United Kingdom, Brazil, China, India, Italy, Kenya, and Oman. Change in weight-for-age z-score from birth to day 28 was calculated using z-scores from each growth chart.
Malnutrition was defined as a decline in weight-for-age z-score of ≥0.8; any infant meeting this cut-point had their malnutrition status further classified into three categories: (1) mild malnutrition was defined as a weight-for-age z-score decline between 0.8 and 1.1, (2) moderate malnutrition was defined as a weight-for-age z-score decline between 1.2 and 1.9, and (3) severe malnutrition was defined as a weight-for-age z-score decline of ≥2.0.</p><p><b>Results:</b> The sample included 102 preterm infants, 58% male, with a mean gestational age of 29.3 weeks. At birth, the average weight was 1,192 grams, and the average weight-for-age z-score was -0.50, -0.36, and -1.14 using the Olsen, Fenton, and INTERGROWTH-21st growth charts, respectively. At 28 days, the average weight was 1,690 grams, and the average weight-for-age z-score was -0.96, -1.00, and -1.43 using the Olsen, Fenton, and INTERGROWTH-21st growth charts, respectively. Using the Olsen growth chart, 29 infants met the criteria for malnutrition; 15 had mild malnutrition and 14 had moderate malnutrition. Using the Fenton growth chart, 33 infants met the criteria for malnutrition; 21 had mild malnutrition and 12 had moderate malnutrition. Using the INTERGROWTH-21st growth chart, 32 infants met the criteria for malnutrition; 14 had mild malnutrition, 14 had moderate malnutrition, and 4 had severe malnutrition. In total, 24 infants met the criteria for malnutrition using all three growth charts, while 19 infants were only categorized as malnourished by one of the three growth charts.</p><p><b>Conclusion:</b> The findings of this study reveal important discrepancies in average z-scores and the classification of malnutrition among preterm infants when using the Olsen, Fenton, and INTERGROWTH-21st growth charts. These differences suggest that the choice of growth chart has important implications for identifying rates of malnutrition. Therefore, standardized guidelines for growth monitoring in preterm infants are necessary to ensure consistent and accurate diagnosis of malnutrition.</p><p>Emaan Abbasi, BSc<sup>1</sup>; Debby Martins, RD<sup>2</sup>; Hannah Piper, MD<sup>2</sup></p><p><sup>1</sup>University of Galway, Vancouver, BC; <sup>2</sup>BC Children's Hospital, Vancouver, BC</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Infants with gastroschisis have variable intestinal function, with some achieving enteral independence within a few weeks and others remaining dependent on parenteral nutrition (PN) for prolonged periods. Establishing enteral feeds can be challenging for many of these neonates due to poor intestinal motility, frequent emesis and/or abdominal distension. Therefore, many care teams use standardized post-natal nutrition protocols in an attempt to minimize PN exposure and maximize oral feeding. However, it remains unclear whether initiating continuous feeds is advantageous or whether bolus feeding is preferred. Potential benefits of bolus feeding include that it is more physiologic and that the feeds can be given orally, but there remain concerns about feed tolerance and prolonged periods of withholding feeds with this approach. The objective of this study was to compare the initial feeding strategy in infants with gastroschisis to determine whether bolus feeding is a feasible approach.</p><p><b>Methods:</b> After obtaining REB approval (H24-01052), a retrospective chart review was performed in neonates born with gastroschisis, cared for by a neonatal intestinal rehabilitation team between 2018 and 2023.
A continuous feeding protocol was used between 2018 and 2020 (human milk at 1 ml/h with 10 ml/kg/d advancements given continuously until 50 ml/kg/d and then trialing bolus feeding) and a bolus protocol was used between 2021 and 2023 (10-15 ml/kg divided into 8 feeds/d with 15-20 ml/kg/d advancements). Clinical data were collected, including gestational age, gastroschisis prognosis score (GPS), need for intestinal resection, age when feeds were initiated, time to full feeds, route of feeding at full feeds, and hepatic cholestasis; these variables were compared between groups. Welch's t-test and the chi-square test were performed to compare variables, with p-values &lt; 0.05 considered significant.</p><p><b>Results:</b> Forty-one infants with gastroschisis were reviewed (23 who were managed with continuous feed initiation and 18 with bolus feed initiation). The continuous feed and bolus feed groups had comparable mean gestational age at birth, GPS score, need for intestinal surgery, age at feed initiation, and incidence of cholestasis (Table 1). Time to achieve enteral independence was similar between the two groups, with nearly half of infants reaching full feeds by 6 weeks (48% continuous feeds vs. 44% bolus feeds) and most by 9 weeks of life (74% continuous vs. 72% bolus). Significantly more infants in the bolus feeding group were feeding exclusively orally compared to the continuous feeding group at the time of reaching full enteral feeds (50% vs. 17%, p = 0.017).</p><p><b>Conclusion:</b> Initiating bolus enteral feeding for infants with gastroschisis, including those requiring intestinal resection, is safe and does not result in prolonged time to full enteral feeding. Avoiding continuous feeds may improve oral feeding in this population.</p><p><b>Table 1.</b> Clinical Characteristics and Initial Feeding Strategy.</p><p></p><p><b>International Poster of Distinction</b></p><p>Matheus Albuquerque<sup>1</sup>; Diogo Ferreira<sup>1</sup>; João Victor Maldonado<sup>2</sup>; Mateus Margato<sup>2</sup>; Luiz Eduardo Nunes<sup>1</sup>; Emanuel Sarinho<sup>1</sup>; Lúcia Cordeiro<sup>1</sup>; Amanda Fifi<sup>3</sup></p><p><sup>1</sup>Federal University of Pernambuco, Recife, Pernambuco; <sup>2</sup>University of Brasilia, Brasília, Distrito Federal; <sup>3</sup>University of Miami, Miami, FL</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Intestinal failure secondary to short bowel syndrome is a malabsorptive condition caused by intestinal resection. Patients with intestinal failure require parenteral support to maintain hydration and nutrition. Long-term parenteral nutrition leads to complications. Teduglutide, an analog of GLP-2, may improve intestinal adaptation, thereby minimizing reliance on parenteral nutrition. This meta-analysis evaluates the efficacy of teduglutide in reducing parenteral nutrition dependency in pediatric patients with intestinal failure.</p><p><b>Methods:</b> We included randomised controlled trials (RCTs) that assessed the efficacy of teduglutide in reducing parenteral nutrition support and improving anthropometrics in pediatric patients with intestinal failure secondary to short bowel syndrome. The RoB-2 tool (Cochrane) was used to evaluate the risk of bias, and statistical analyses were conducted utilizing RevMan 5.4.1 software. The results are expressed as mean differences with 95% CIs and p-values.</p><p><b>Results:</b> Data was extracted from three clinical trials involving a total of 172 participants.
Teduglutide use was associated with a reduction in parenteral nutrition volume (-17.92 mL, 95% CI -24.65 to -11.20, p &lt; 0.00001), with most patients reducing parenteral support by &gt;20% (11.79, 95% CI 2.04 to 68.24, p = 0.006) (Figure 2). Treatment with teduglutide also improved height (0.27 Z-score, 95% CI 0.08 to 0.46, p = 0.005) but did not significantly increase weight when compared to the control group (-0.13 Z-score, 95% CI -0.41 to 0.16, p = 0.38) (Figure 3).</p><p><b>Conclusion:</b> This meta-analysis suggests that therapy with teduglutide reduces parenteral nutrition volume in patients with short bowel syndrome and intestinal failure. Reduced parenteral nutrition dependency can minimize complications and improve quality of life in patients with short bowel syndrome and intestinal failure.</p><p></p><p><b>Figure 1.</b> Parenteral Nutrition Support Volume Change.</p><p></p><p><b>Figure 2.</b> Anthropometric Data (Weight and Height) Change from Baseline.</p><p>Korinne Carr<sup>1</sup>; Liyun Zhang, MS<sup>1</sup>; Amy Pan, PhD<sup>1</sup>; Theresa Mikhailov, MD, PhD<sup>2</sup></p><p><sup>1</sup>Medical College of Wisconsin, Milwaukee, WI; <sup>2</sup>Children's Hospital of Wisconsin, Milwaukee, WI</p><p><b>Financial Support:</b> Medical College of Wisconsin, Department of Pediatrics.</p><p><b>Background:</b> Malnutrition is a significant concern in pediatric patients, particularly those who are critically ill. In children with diabetes mellitus (DM), the presence of malnutrition can exacerbate complications such as unstable blood sugar levels and delayed wound healing, potentially leading to worse clinical outcomes. Despite the known impact of malnutrition on adult patients and critically ill children, established criteria for identifying malnutrition in critically ill children are lacking. This study was designed to determine the relationship between malnutrition, mortality, and length of stay (LOS) in critically ill pediatric patients with diabetes mellitus.</p><p><b>Methods:</b> We conducted a retrospective cohort study using the VPS (Virtual Pediatric Systems, LLC) Database. We categorized critically ill pediatric patients with DM as malnourished or at risk of being malnourished based on admission nutrition screens. We compared mortality rates between malnourished and non-malnourished patients using Fisher's Exact test. We used logistic regression analysis to compare mortality, controlling for PRISM3 (a severity of illness measure) and demographic and clinical factors. We compared the LOS in the Pediatric Intensive Care Unit (PICU) between malnourished and non-malnourished patients using the Mann-Whitney-Wilcoxon test. Additionally, we used a general linear model with appropriate transformation to adjust for severity of illness, demographic, and clinical factors. We considered statistical significance at p &lt; 0.05.</p><p><b>Results:</b> We analyzed data for 4,014 patients, of whom 2,653 were screened for malnutrition. Of these 2,653, 88.5% had type 1 DM, 9.3% had type 2 DM, and the remaining patients had unspecified DM. Of the 2,653 patients, 841 (31.7%) were malnourished based on their nutrition screen at admission to the PICU. Mortality in patients who were screened as malnourished did not differ from mortality in those who were not malnourished (0.4% vs. 0.2%, p = 0.15). Malnourished patients also had longer PICU LOS, with a geometric mean and 95% CI of 1.03 (0.94–1.13) days, compared to 0.91 (0.86–0.96) days for non-malnourished patients.
Similarly, the malnourished patients had longer hospital LOS, with a geometric mean and 95% CI of 5.31 (4.84–5.83) days, compared to 2.67 (2.53–2.82) days for those who were not malnourished. Both differences were significant with p &lt; 0.0001 after adjusting for age, race/ethnicity, and PRISM3.</p><p><b>Conclusion:</b> We found no difference in mortality rates, but critically ill children who were screened as malnourished had longer PICU and hospital LOS than those who were not malnourished. This was true even after adjusting for age, race/ethnicity, and PRISM3.</p><p>Emily Gutzwiller<sup>1</sup>; Katie Huff, MD, MS<sup>1</sup></p><p><sup>1</sup>Indiana University School of Medicine, Indianapolis, IN</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Neonates with intestinal failure require parenteral nutrition for survival. While life-sustaining, parenteral nutrition can lead to serious complications, including intestinal failure associated liver disease (IFALD). The etiology of IFALD is likely multifactorial, with intravenous lipid emulsions (ILE) being a large contributor, particularly soybean oil-based lipid emulsions (SO-ILE). Alternative ILEs, including those containing fish oil, can be used to prevent and treat IFALD. Fish oil-based ILE (FO-ILE) is only approved at a dose of 1 g/kg/d, limiting the calories prescribed from fat and shifting calorie delivery to carbohydrate predominance. While FO-ILE has been shown to support growth comparable to SO-ILE, a comparison to soy, MCT, olive, fish oil-based ILE (SO,MCT,OO,FO-ILE) has not, to our knowledge, been conducted. The purpose of this study is to compare the growth and laboratory data associated with SO,MCT,OO,FO-ILE and FO-ILE therapy in the setting of IFALD in a cohort of neonates treated during two time periods.</p><p><b>Methods:</b> We performed a retrospective chart review of patients with IFALD receiving SO,MCT,OO,FO-ILE (SMOFlipid) or FO-ILE (Omegaven) in a level IV neonatal intensive care unit from September 2016 – May 2024. IFALD was defined as direct bilirubin &gt;2 mg/dL after receiving &gt;2 weeks of parenteral nutrition. Patients with underlying genetic or hepatic diagnoses, as well as patients with elevated direct bilirubin prior to two weeks of life, were excluded. Patients were divided based on the period during which they were treated. Data was collected on demographic characteristics, parenteral and enteral nutrition, weekly labs, and growth changes while receiving SO,MCT,OO,FO-ILE or FO-ILE. The rate of change of weight, length, and head circumference and the comparison of z-scores over time were studied for each ILE group. Secondary outcomes included nutritional data in addition to hepatic labs. Nonparametric analysis using the Mann-Whitney U test was conducted to compare ILE groups, and a p-value of &lt; 0.05 was used to define statistical significance.</p><p><b>Results:</b> A total of 51 patients were enrolled, 25 receiving SO,MCT,OO,FO-ILE and 26 FO-ILE. Table 1 notes the demographic and baseline characteristics of the two ILE groups. There was no difference in the rate of OFC (p = 0.984) or length (p = 0.279) growth between the two treatment groups (Table 2). There was a difference, however, in the rate of weight gain between groups (p = 0.002; Table 2), with the FO-ILE group gaining more weight over time. When comparing nutritional outcomes (Table 2), SO,MCT,OO,FO-ILE patients received greater total calories than FO-ILE patients (p = 0.005), including a higher ILE dose (p &lt; 0.001) and enteral calories (p = 0.029).
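For readers unfamiliar with the nonparametric two-group comparison named in the Methods above, the following is a minimal sketch of a Mann-Whitney U test; the weight-gain values are hypothetical placeholders, not the study data.

```python
from scipy import stats

# Hypothetical weight-gain rates (g/kg/day) for the two lipid groups --
# placeholders only, not the data from this study.
smof_rate = [8.2, 10.1, 9.5, 11.0, 7.8]    # SO,MCT,OO,FO-ILE group
fo_rate = [11.5, 12.3, 10.9, 13.0, 12.1]   # FO-ILE group

u_stat, p_value = stats.mannwhitneyu(smof_rate, fo_rate, alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_value:.3f}")
```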
The FO-ILE group, however, received a higher carbohydrate dose (p = 0.003; Table 2). There was no difference in amino acid dose (p = 0.127) or parenteral nutrition calories (p = 0.821). Hepatic labs differed over time, with the FO-ILE group having a larger decrease in AST, direct bilirubin, and total bilirubin over time compared to the SO,MCT,OO,FO-ILE group (Table 2).</p><p><b>Conclusion:</b> Our results show that the FO-ILE patients had significantly greater weight gain compared to the SO,MCT,OO,FO-ILE patients. This is despite SO,MCT,OO,FO-ILE patients receiving greater total calories and enteral calories; the FO-ILE group received greater calories only in the form of glucose infusion. With this increased weight gain but similar length growth between groups, concerns arise regarding alterations in body composition and increased fat mass. Further research is needed to determine the influence of these various ILE products on neonatal body composition over time.</p><p><b>Table 1.</b> Demographic and Baseline Lab Data by Lipid Treatment Group.</p><p>(All data presented as median and interquartile range, unless specified.)</p><p></p><p><b>Table 2.</b> Nutritional, Hepatic Lab, and Growth Outcomes by Lipid Treatment Group.</p><p>(All data presented as median and interquartile range unless specified.)</p><p>*z-score change compares z-scores at the end and beginning of the study period</p><p>OFC = occipitofrontal circumference</p><p></p><p>Rachel Collins, BSN, RN<sup>1</sup>; Brooke Cherven, PhD, MPH, RN, CPON<sup>2</sup>; Ann-Marie Brown, PhD, APRN, CPNP-AC/PC, CCRN, CNE, FCCM, FAANP, FASPEN<sup>1</sup>; Christina Calamaro, PhD, PPCNP-BC, FNP-BC, FAANP, FAAN<sup>3</sup></p><p><sup>1</sup>Emory University Nell Hodgson Woodruff School of Nursing, Atlanta, GA; <sup>2</sup>Emory University School of Medicine; Children's Healthcare of Atlanta, Atlanta, GA; <sup>3</sup>Emory University Nell Hodgson Woodruff School of Nursing; Children's Healthcare of Atlanta, Atlanta, GA</p><p><b>Financial Support:</b> None Reported.</p><p><b>Background:</b> Many children receiving hematopoietic stem cell transplant (HSCT) are malnourished prior to the start of transplant or develop malnutrition post-infusion. Children are at risk for malnutrition due to the pre-transplant conditioning phase, chemotherapy treatments for their primary diagnosis, and acute graft versus host disease (GVHD). These factors can cause impaired oral feeding, vomiting, diarrhea, and mucositis. Enteral nutrition (EN) and parenteral nutrition (PN) are support options for this patient population when oral feeding is impaired. There is currently a paucity of research to support evidence-based nutrition guidelines in this population. The purpose of this integrative literature review was to determine the outcomes of EN and PN in pediatric HSCT and to discuss evidence-based implications for practice to positively affect quality of life for these patients.</p><p><b>Methods:</b> A literature search was conducted using the following databases: PubMed, CINAHL, Cochrane, and Healthsource: Nursing/Academic Edition. The search strategy included randomized controlled trials, prospective and retrospective cohort studies, case control studies, cross sectional studies, systematic reviews, and meta-analyses. Relevant papers were utilized if they discussed patients 0-21 years of age, allogeneic or autologous HSCT, and enteral and/or parenteral nutrition.
Papers were excluded if there was no English translation, they did not discuss nutrition, or they used animal subjects.</p><p><b>Results:</b> Initially, 477 papers were identified, and after the screening process, 15 papers were included in this integrative review. EN and PN affect clinical outcomes, complications, and hospital and survival outcomes. EN was associated with faster platelet engraftment, an improved gut microbiome, and decreased mucositis and GVHD. PN was used more often in severe mucositis because mucositis interferes with feeding tube placement, therefore decreasing the use of EN. Use of PN is also more common in severe grade III-IV gut GVHD. Initiation of EN later in treatment, such as after conditioning and in the presence of mucositis, can be associated with severe grade III-IV gut GVHD. This is because conditioning can damage the gut, leading to mucosal atrophy and intestinal permeability that alter the gut microbiota. PN can induce gut mucosal atrophy and dysbiosis, allowing for bacterial translocation, while EN improves the gut epithelium and microbiota, reducing translocation. Additionally, the increased central venous line access required for PN can introduce bacterial infections into the bloodstream. Feeding tube placement complications include dislodgement, refusal of replacement, and increased risk of bleeding due to thrombocytopenia. Electrolyte imbalance can be attributed to loss of absorption through the gut. If a gastrostomy tube is present, there can be infection at the site. There is currently no consensus on the appropriate timing of tube placement. There was no significant difference in neutrophil engraftment, and findings on morbidity/mortality and weight gain were variable. Weight gain with PN can be attributed to edema. Length of stay was significantly shorter with EN alone than with PN (p &lt; 0.0001).</p><p><b>Conclusion:</b> This literature review indicates a need for a more comprehensive nutritional assessment to adequately evaluate nutritional status before and after HSCT. EN should be given as first-line therapy and should be considered prior to the conditioning phase. Placement of a feeding tube prior to conditioning should be considered. Finally, PN may be considered if EN cannot be tolerated.
More research is needed on sensitive nutritional evaluation, earlier administration of EN, and standardized pathways for EN and PN in pediatric HSCT.</p>
dpbcn21内科,普洛斯彼里达,阿古桑德尔南;2北达沃塔古姆市医疗营养科财政支持:无报告。背景:在胃肠道(GI)癌症患者中,营养不良是死亡率和发病率的一个强有力的预测因素,因为对治疗的反应较差,生活质量也较差。在设有癌症中心的塔古姆市的一家三级政府医院,虽然对癌症患者进行营养不良筛查是常规的,但过去没有进行过着重于确定胃肠道癌症患者营养状况与生活质量之间关系的研究。本研究的主要目的是确定营养状况是否与在三级政府医院寻求癌症治疗的成年胃肠道癌症菲律宾患者的生活质量有关。方法:采用定量、观察、横断面、调查分析和预测型研究方法。采用世界卫生组织生活质量简要版问卷(WHOQOL-BREF)确定病例的生活质量。采用Logistic回归分析胃肠癌患者的人口学、临床和营养状况之间的关系。结果160人,平均年龄56.4±12岁,男性占61.9%,已婚占77.5%,天主教占81.1%,高中学历占38.1%。结肠腺癌占诊断病例的近一半(43.125%),其次为直肠腺癌(22.5%)、直肠乙状结肠腺癌(11.875%)、胃肠道间质瘤(5.625%)。在分期方面,4期占40.625%,其次是3b期(19.375%)、3c期(10%)、3a期(5.625%)和2a期(4.375%)。只有2.5%为4a期,而0.625%为4b期。超过四分之一的患者接受CAPEOX(38.125%),其次是FOLFOX(25.625%),然后是IMATINIB(5.625%)。其中体重过轻或肥胖占15.6%,超重占34.4%。SGA分级中,重度38.1%,中度33.8%,其余为正常至轻度。在生活质量方面,每个变量的平均得分为:总体生活质量一般好(3.71±0.93),总体健康感知一般满意(3.46 ~ 3.86±0)。 97分),对是否有足够的精力维持日常生活、是否满意自己的外表、是否能获得日常生活所需的信息、是否有休闲的机会等方面的满意程度一般(2.71 ~ 3.36±1.02分),对是否有足够的钱满足自己的需要的满意程度略高(2.38±0.92分)。平均而言,参与者经常经历消极情绪,如情绪低落、绝望、抑郁和焦虑(2.81±0.79)。年龄(p = 0.047)、癌症诊断(p = 0.001)、BMI (p = 0.028)和SGA营养状况(p = 0.010)与成年癌症患者的生活质量之间存在显著关联。结论:营养状况与在三级政府医院寻求癌症治疗的成年胃肠道癌症菲律宾患者的生活质量显著相关。公共卫生干预可能在这些因素中发挥关键作用,以改善患者的生存和预后。罗嘉文,MS, RD, LDN, CNSC1;汉娜·雅各布斯,OTD, OTR/L2;张秀玲,MS, RD, LDN3;朱莉·迪卡洛,MS4;Donna Belcher, EdD, MS, RD, LDN, CDCES, CNSC, FAND5;加林娜·盖曼,MD6;大卫·林,md71马萨诸塞州总医院,沙伦,马萨诸塞州;2MedStart国家康复医院,华盛顿特区;3新英格兰浸信会医院,马萨诸塞州波士顿;4马萨诸塞州总医院神经技术与神经康复中心,马萨诸塞州波士顿;5 .营养与食品服务部,波士顿,马萨诸塞州;6哈佛医学院和麻省总医院,波士顿,马萨诸塞州;神经危重症护理;神经恢复,波士顿MGH,财政支持:营养与饮食学会,营养支持成员研究奖营养师。背景:营养状况是脑损伤幸存者最佳恢复的已知可改变因素,然而,通过营养优化临床结果的具体基准数据有限。本初步研究旨在量化从ICU入院至出院后90天脑损伤幸存者的临床、营养和功能特征。方法:在马萨诸塞州总医院(MGH)神经科学ICU住院12个月的患者根据以下标准入组:年龄18岁及以上,初步诊断为急性脑损伤,ICU住院时间至少72小时,出院后生存时间超过90天,符合MGH神经康复诊所门诊随访转诊标准。数据从电子健康记录和神经恢复诊所随访/电话访谈中收集。这些包括患者特征、急性临床结果、营养摄入、替代营养和入院、出院和出院后90天的功能评分。采用描述性统计进行分析。结果:在研究期间的212例入院患者中,有50例患者被纳入分析。APACHE (n = 50)、GCS (n = 50)和NIHSS (n = 20)的平均评分分别为18分、11分和15分。78%的患者需要通气,平均持续时间为6.3天。ICU和ICU后平均住院时间分别为17.4天和15.9天。80%的患者在24-48小时内接受了营养(肠内或口服)。前7天ICU的平均能量和蛋白质摄入量分别为1128千卡/天和60.3克蛋白质/天,均为估计需求的63%。根据ASPEN指南进行评估,BMI≥30的患者比BMI≥30的患者获得更少的能量(11.6 vs 14.6 kcal/kg/day),但更高的蛋白质(1.04 vs 0.7 g protein/kg/day)。12%的患者在出院前至少7天营养摄入量低于50%,被认为有营养风险。46%的患者出院时采用了长期的肠内通路。只有16%的病人出院回家,而不是去康复机构。出院后90天,32%的患者再次入院,其中27%是因为中风。入院时,患者的平均MUST(营养不良普遍筛查工具)和MST(营养不良筛查工具)评分分别为0.56和0.48,反映其营养风险较低。出院时,这些患者的平均MUST和MST评分分别增加到1.16和2.08,表明这些患者已经处于营养风险中。出院后90天,两项评分均恢复到低营养风险(MUST 0.48, MST 0.59)。所有患者的功能评分,通过改进的Rankin量表(mRS)测量,遵循类似的模式:入院时平均得分为0.1,出院时为4.2,出院后90天为2.8。出院后90天Barthel指数为64.1,中度依赖。结论:这项初步研究突出了从ICU入院到出院后90天脑损伤幸存者的关键营养、临床和功能特征。 本研究的目的有两个:1)探索护理者对当前和未来CVAD培训实践的看法;2)评估当护理从护理者转变为患者时,积极主动的形式化CVAD培训计划的必要性。方法:采用在线软件工具进行问卷调查,共8个问题。目标受众包括接受HPN治疗儿童的护理人员。该调查的链接通过电子邮件发送,并发布在支持HPN社区的各种社交媒体平台上。该调查于2024年6月17日至7月18日进行。通过CVAD接受HPN的儿童的非照护者被排除在外。结果:本次调查共收到114份回复,但根据排除标准,仅有86份被纳入分析。CVAD患儿接受HPN的年龄分布在0 - 18岁之间。大多数情况下,关于HPN治疗和CVAD护理的初始培训是由HPN诊所/医院资源/学习中心或家庭输液药房进行的(表1)。48%的受访者表示,他们的HPN团队从未提供再教育或分享最佳实践(图1)。大多数受访者选择了最佳个人来培训他们的孩子进行CVAD护理和安全(图2)。此外,60%的受访者选择了是。如果提供的话,他们会希望他们的孩子参加CVAD培训(图3)。结论:这项调查证实,当确定孩子准备好承担这项责任时,大多数照顾者期望训练他们的孩子执行CVAD护理。这项培训的一个挑战是,在这项调查中,几乎一半的受访者表示,他们从未接受过团队的再教育或最佳实践建议。这一发现表明,需要一个正式的培训计划,以协助护理人员过渡到病人的心血管疾病护理。由于大多数受访者报告依靠他们的肠道康复或胃肠道/运动诊所来解决与CVAD相关的问题,这些中心将是建立过渡训练计划的最佳场所。本研究的局限性如下:仅通过选定的社交平台进行分发,未捕获这些平台之外的用户。额外的研究将有助于确定内容训练的最佳顺序和节奏。表1。中心静脉通路装置(CVAD)培训和支持实践。图1所示。您的HPN团队多久提供一次再教育或分享最佳实践?图2。谁是最好的CVAD护理管理和安全培训您的孩子?图3。如果提供正式的CVAD培训,您希望您的孩子参加吗?Laryssa graphic, MS, RDN, LDN, CNSC1;埃琳娜·斯托扬诺娃,MSN, RN2;Crystal Wilkinson, PharmD3;Emma Tillman, PharmD, phd41 nutrshare, Tamarac, FL;2 .密苏里州堪萨斯城nutrshare;3 
nutrshare,圣地亚哥,加州;4印第安纳大学,卡梅尔,in财政支持:无报道。背景:家庭内长期肠外营养(LTPN)是美国许多患者的生命线。患者使用中心静脉通路装置(CVAD)来实施LTPN。中心线相关血流感染(CLABSI)是与需要LTPN的患者相关的严重风险。LTPN人群CLABSI发生率为每1000个导管天0.9-1.1例。本研究的目的是确定由专门从事LTPN的国家家庭输液提供者服务的患者队列中CLABSI的发生率,并确定与CLABSI发生率增加相关的变量。方法:回顾性分析2023年3月至2024年5月LTPN肠衰竭患者的电子病历,查询患者人口统计学、人体测量数据、护理利用、肠外营养处方(包括脂类)、治疗时间、地理分布、处方医师专业、CLABSI病史、可用血培养结果和乙醇锁的使用情况。患者的邮政编码被用来确定美国卫生部定义的农村卫生区域;人类服务。患者被分为两组:1)至少有一个CLABSI的患者和2)研究期间没有CLABSI的患者。比较两组患者的人口学和临床指标。标称资料采用Fisher精确检验,正态分布资料采用学生t检验,非正态分布资料采用Mann-Whitney u检验。结果:我们确定了198名在研究期间维持LTPN的人。在研究期间,该队列的总CLABSI率为0.49 / 1000导管天。在研究期间,44名LTPN患者有一个或多个CLABSI, 154名LTPN患者没有CLABSI。 进一步的统计分析将有助于描述营养状况与临床和功能结果之间的关系,这可能指导未来ICU营养和神经康复的研究和实践。Lavanya Chhetri, BS1;阿曼达·范·雅各布,MS, RDN, LDN, CCTD1;桑德拉·戈麦斯博士,RD1;Pokhraj Suthar, MBBS1;Sarah Peterson,博士,rd11拉什大学医学中心,芝加哥,伊利诺斯州背景:识别肝病患者的虚弱提供了对患者营养状况和身体恢复能力的有价值的见解。然而,目前尚不清楚肌肉减少是否是肝病虚弱的一个重要病因。确定虚弱和肌肉量之间的可能联系可能会导致更好的风险预测、个性化干预和改善患者护理的结果。本研究的目的是确定与接受肝移植评估的非虚弱肝病患者相比,虚弱患者的骨骼肌指数(SMI)是否较低。方法:采用回顾性、横断面研究设计。在2019年1月1日至2023年12月31日期间接受肝移植评估的年龄大于18岁的患者,如果在初始肝移植评估期间完成了肝脆弱指数(LFI)评估,并在初始肝移植评估后30天内完成了诊断性腹部CT扫描,则纳入研究。人口统计资料(年龄、性别、身高和BMI)、肝病病因、MELD-Na评分、糖尿病和肝细胞癌病史、肝脏疾病并发症(腹水、肝细胞癌、肝性脑病等);食管静脉曲张),并记录LFI评分。LFI被记录为连续变量,并被分为分类变量(虚弱:定义为LFI≥4.5;不虚弱:定义为LFI≤4.4)。第三腰椎CT截肌面积(cm2)量化;计算SMI (cm2/身高,单位为米2),将低肌肉质量分为分类变量(低肌肉质量:定义为男性SMI≤50 cm2/m2,女性≤39 cm2/m2;正常肌肉质量:定义为男性SMI≤50 cm2/m2,女性SMI≤39 cm2/m2)。采用独立t检验分析来确定体弱与非体弱患者之间的重度精神分裂症是否存在差异。结果:共纳入104例患者,其中男性占57%,平均年龄57±10岁,平均BMI 28.1±6.4 kg/m2。MELD-Na平均评分为16.5±6.9分;25%有肝细胞癌史,38%有糖尿病史。大多数样本至少有一种肝脏疾病并发症(72%有腹水,54%有肝性脑病,67%有静脉曲张)。平均LFI评分为4.5±0.9,44%为体弱。平均SMI为45.3±12.6 cm2/m2, 52%被归类为低肌肉质量(男性63%,女性38%)。体弱与非体弱患者的SMI无差异(43.5±10.6 vs 47.3±13.9 cm2/m2, p = 0.06)。报告了男性和女性因虚弱状态导致的SMI之间的差异,由于样本量小,未使用显著性检验。体弱男性(43.5±12.2比48.4±14.9)和女性(43.4±9.3比45.2±11.8)的SMI均低于非体弱患者。结论:体弱多病与非体弱多病患者的SMI无显著差异;然而,基于0.06的p值,可能存在边际趋势和可能的差异,但需要进一步的研究来证实这一发现。此外,令人担忧的是,男性的低肌肉质量率较高,体弱和非体弱男性的平均SMI低于用于识别低肌肉质量的截止值(SMI≤50 cm2/m2)。需要进一步的研究来探索导致男性肌肉质量低的潜在因素,特别是在虚弱人群中,并确定旨在改善肌肉质量的有针对性的干预措施是否可以减轻肝移植评估患者的虚弱和改善临床结果。丽贝卡·普莱斯顿,MS, RD, LD1;Keith Pearson, PhD, RD, LD2;Stephanie Dobak, MS, RD, LDN, CNSC3;艾米·埃利斯,博士,公共卫生硕士,RD, ld11阿拉巴马州塔斯卡卢萨大学;2阿拉巴马大学伯明翰分校,阿拉巴马州伯明翰;3费城托马斯杰斐逊大学,资金支持:ALS协会护理质量补助金。背景:肌萎缩侧索硬化症(ALS)是一种进行性神经退行性疾病。由于吞咽困难、高代谢、自食困难等挑战,营养不良在ALS (PALS)患者中很常见。 在许多临床环境中,营养不良的诊断使用营养与饮食学会/美国肠外和肠内营养学会诊断营养不良的指标(AAIM)或全球营养不良领导倡议(GLIM)标准。然而,对ALS诊所的营养不良评估实践知之甚少。这项定性研究探讨了ALS诊所的rd如何诊断pal的营养不良。方法:研究人员对22名在美国ALS诊所工作的rd进行了6个虚拟焦点小组。录音逐字转录,并导入NVivo 14软件(QSR International, 2023, Melbourne, Australia)。两名研究小组成员使用演绎主题分析法独立分析数据。结果:AAIM指标被确定为最常用的营养不良诊断标准。两位参与者使用AAIM和GLIM标准的组合来描述。尽管所有的参与者都描述了他们进行了彻底的营养评估,但有些人说他们没有正式记录门诊患者的营养不良情况,因为缺乏补偿,而且感觉诊断结果不会改变干预措施。相反,另一些人指出了为报销目的记录营养不良情况的重要性。在所有组中,rd报告在诊断营养不良方面存在挑战,因为难以区分疾病与营养不良相关的肌肉损失。因此,一些rd描述了调整当前的营养不良标准,以关注体重减轻、减少能量摄入或脂肪减少。结论:总体而言,研究人员一致认为,营养不良在PALS中很常见,他们进行了全面的营养评估,作为标准治疗的一部分。在记录营养不良的人中,大多数人使用AAIM指标来支持诊断。然而,由于肌肉损失是ALS的自然结果,rd认为很难评估营养相关的肌肉损失。这项研究强调需要制定针对pal的营养不良标准。表1。与ALS患者营养不良诊断相关的主题。Carley Rusch,博士,RDN, LDN1;Nicholas Baroun, BS2;Katie Robinson, PhD, MPH, RD, LD, CNSC1;Maria Geraldine E. 
Baggs博士;Refaat Hegazi, MD, PhD, MPH1;多米尼克·威廉姆斯,医学博士,硕士11雅培营养,哥伦布,俄亥俄州;2迈阿密大学,牛津,俄亥俄财政支持:这项研究得到了雅培营养公司的支持。背景:营养不良越来越被认为是存在于所有BMI类别中的一种状况。尽管迄今为止的许多研究都集中在低bmi患者的营养不良上,但仍有必要了解营养干预如何改变高bmi患者和现有合并症的结果。在一项针对营养不良住院老年人的试验的事后分析中,我们试图确定食用含有高能量、蛋白质和β -羟基- β -甲基丁酸盐(ONS+HMB)的特殊ONS是否可以改善BMI≥27的老年人的维生素D和营养状况。方法:使用来自滋养试验(一项随机、安慰剂对照、多中心、双盲研究)的数据进行事后分析,该研究在住院的营养不良患者中进行,初步诊断为充血性心力衰竭、急性心肌梗死、肺炎或慢性阻塞性肺疾病。在试验中,参与者在住院期间和出院后90天内接受ONS + HMB或安慰剂饮料(目标2份/天)的标准治疗。在基线、出院后0、30、60和90天,采用主观总体评估(SGA)和握力评估营养状况。在入院72小时(基线)、出院后30天和60天内评估维生素D(25-羟基维生素D)。BMI≥27的参与者组成分析队列。采用基线测量调整后的协方差分析确定治疗效果。结果:纳入166例BMI≥27的患者,平均年龄76.41±8.4岁,以女性为主(51.2%)。基线握力(n = 137)为22.3±0.8 kg,血清25-羟基维生素D (n = 138)浓度为26.0±1.14 ng/mL。在第90天,ONS+HMB改善了营养状况,其中64%的ONS+HMB组营养良好(SGA-A),而对照组为37% (p = 0.011)。在指数住院期间(基线至出院),与安慰剂相比,ONS+HMB组的握力有更高的变化趋势(最小二乘法平均值±标准误差:1.34 kg±0.35 vs 0.41±0.39;P = 0.081),但在其他时间点均无统计学意义。在第60天,接受ONS + HMB组的维生素D浓度显著高于安慰剂组(29.7±0.81 vs 24.8±0.91;p &lt; 0.001)。 结论:与安慰剂相比,接受标准治疗+ ONS+HMB治疗的营养不良且BMI≥27的住院老年患者在第60天和第90天的维生素D和营养状况分别有显著改善。这表明,对于BMI升高和营养不良的患者,急性后护理过渡应考虑使用ONS+HMB等干预措施与标准护理相结合,继续给予营养。艾琳·多斯·桑托斯;伊希斯·海伦娜·布恩索;玛丽莎·奇科内利·贝勒;Maria Fernanda Jensen kok21医院Samaritano Higienópolis, <e:1>圣保罗;2 . Samaritano Higienopolis医院,<s:1>圣保罗州财政支持:无报告。背景:营养不良会对住院时间、感染率、死亡率、临床并发症、再入院率以及平均医疗费用产生负面影响。人们认为,早期营养干预可以减少负面事件并产生经济影响。因此,我们的目的是通过营养筛查评估有营养风险的患者的平均住院费用,并伴有口服营养补充的指征。方法:对2023年8月至2024年1月在某私立医院住院的110例成人患者进行回顾性研究。入院24小时内进行营养筛查。为了根据小腿围(CC)对低肌肉量进行分类,考虑了分界点:入院96小时内测量的女性33厘米和男性34厘米。他们以分组方式进行评估,考虑G1患者有口服补充(OS)的指征,但由于可修改的原因没有开始,G2患者有口服补充的指征,并果断开始(在治疗指征48小时内),G3患者有口服补充的指征,但开始较晚(在治疗指征48小时后),G4患者加入G1和G3,因为两者都没有果断接受口服补充。排除接受肠内或肠外营养治疗的患者。结果:G2在研究样本中普遍存在(51%),平均住院时间中等(20.9天),平均每日住院费用较低,平均年龄为71岁,明显存在低肌肉质量(56%),重症监护(IC)住院需求较低(63%),icu的平均住院时间(SLA)为13.5天。G1的患病率较低(9%),平均住院时间较短(16天),平均每日住院费用比G2高41%,平均年龄68岁,一致认为肌肉质量充足(100%),相当需要在重症监护室住院(70%),但IC的SLA为7.7天。G3占研究样本的40%,平均住院时间更长(21.5天),平均每日住院费用比G2高22%,平均年龄73岁,明显存在低肌肉量(50%)和中度重症监护住院需求(66%),但IC的SLA为16.5天。与G2相比,G4的样本组相似(G2: 56例,G4: 54例),平均年龄(72岁)、住院时间(20.55天)、IC住院时间(66%)、IC SLA(64.23%),但平均每日住院费用较高(比G2高39%),低肌量患者患病率较高(59%)。结论:根据所呈现的结果,我们可以得出结论,未接受OS并在IC中度过时间的患者的百分比平均比其他组高5%,该组一致具有足够的肌肉量,但由于临床条件,食物接受度和体重减轻而需要补充。除G1组外,所有组中50%以上的患者肌肉质量低。在费用方面,与未接受OS的患者相比,患者果断补充或延迟补充的费用分别减少了45%和29%。与G4相比,主动补充的患者的成本仍低39%。国际杰出海报达芙妮·洛夫斯利博士,RD1;Rajalakshmi Paramasivam,理学硕士,rd11阿波罗医院,金奈,泰米尔纳德邦背景:医院营养不良曾经是一个未被充分报道的问题,但近几十年来已经引起了极大的关注。这个普遍存在的问题对康复、住院时间(LOS)和总体结果产生了负面影响。本研究旨在评估临床营养实践的有效性,以及营养指导委员会(NSC)在通过改善营养护理和患者预后来解决和潜在地根除这一持续存在的问题方面的作用。方法:纳入2018年1月至2024年8月连续入住某三级医院非危重病房的患者。 从电子病历中回顾性提取患者人口统计资料、体重指数(BMI)、修正主观总体评估(mSGA)、口服营养补充剂(ONS)的使用情况和临床结果。采用SPSS 20.0对数据进行分析,比较实施NSC前后的结果。结果:在239,630例连续患者中,包括139,895例非危重患者,平均年龄为57.10±15.89岁;男性占64.3%,女性占35.7%。平均BMI为25.76±4.74 kg/m2,多病发生率为49.6%。大多数患者(25.8%)入院时伴有心脏病。根据修正的主观总体评价(mSGA), 87.1%营养良好,12.8%中度营养不良,0.1%严重营养不良。10%的人服用了ONS处方,其中体重过轻者的ONS处方最高(28.4%);BMI正常(13%);超重(9.1%);肥胖(7.7%)(p = 0.000)和mSGA -营养不良(5.5%);中度营养不良(MM) 41%;严重营养不良(SM)占53.2% (p = 0.000),肺科占23.3%,其次是胃肠病学;肝病学(19.2%)(p = 0.000)。平均住院时间为4.29±4.03天,总死亡率为1.2%。根据mSGA评分,严重营养不良显著影响死亡率(0.8%比5.1%,p = 0.000)。多发病(0.9% vs. 
1.5%)和呼吸系统疾病(2.6%,p = 0.000)增加了死亡风险。由mSGA(34.7%, 57.4%, 70.9%)和BMI(43.7%, 38%)评估的不良营养状况与较长的住院时间(LOS≥4天,p = 0.000)相关。NSC的实施带来了显著的改善——平均生存时间减少(4.4天vs 4.1天,p = 0.000),死亡风险从1.6%降低到0.7% (p = 0.000)。基线营养状况未见明显变化,表明临床营养实践在评估患者营养方面是有效的。国家统计局的处方在2022年至2024年间从5.2%增加到9.7% (p = 0.000),使死亡率在2022年之后降至1%以下,而在国家安全委员会之前,死亡率超过1% (p = 0.000)。LOS与ONS的使用呈显著负相关(p = 0.000)。逐步二元logistic回归显示,mSGA评估的营养不良预测死亡率的比值比为2.49,LOS的比值比为1.7,其次是ONS处方和多病(p = 0.000)。结论:一个功能良好的NSC是推动成功的营养干预和实现组织目标的关键。通过mSGA及早发现营养不良,然后采取及时和适当的营养干预措施,对于缩小护理差距和改善临床结果至关重要。强有力的领导和治理对于推动这些努力至关重要,确保患者获得最佳营养支持,以促进康复并降低死亡率。表1。患者特征:基线人体测量细节;营养状况。人体测量和营养状况的基线细节。表2。Logistic回归预测医院LOS和死亡率。逐步二元logistic回归显示,mSGA评估的营养不良预测死亡率的比值比为2.49,LOS的比值比为1.7,其次是ONS处方和多病(p = 0.000)。与营养良好的患者相比,msga评级的营养不良患者住院时间更长(p = 0.000)。营养状况(mSGA) Vs医院LOS(4天)。汉娜·韦尔奇,MS, RD1;Wendy Raissle, RD, CNSC2;Maria Karimbakas, RD, CNSC31Optum输液药房,凤凰城,AZ;2Optum输液药房,七叶树,AZ;3Optum Infusion Pharmacy, Milton,财务支持:无报告。背景:粮食不安全是指人们没有足够的食物吃,不知道下一顿饭从哪里来。2022年,美国约有4900万人依赖食品援助慈善机构。接受肠外营养(PN)的患者可能能够通过口服摄入补充营养,但由于慢性健康状况限制了工作能力和家庭总收入,可能会出现粮食不安全。患者还可能面临缺乏负担得起的住房、水电费增加和医疗费用负担。粮食不安全的迹象可能表现为体重减轻、营养不良、精力不足、注意力难以集中或其他身体指标,如水肿、长期干裂的嘴唇、皮肤干燥和眼睛发痒。本摘要的目的是突出两个独特的病人的情况介绍,其中粮食不安全促使临床医生进行干预。方法:患者1:50岁男性,患有短肠综合征(SBS),长期PN,因家庭经济困难而致电注册营养师(RD)(见表1)。 患者和临床医生的关系允许患者向RD传达关于无法养活自己和家人的敏感担忧,这导致患者依赖PN提供所有营养。由于目前的食物不安全,临床医生对PN/水合作用进行了改变,以帮助改善患者的临床状况。患者2:一名患有长期PN的21岁男性SBS患者与他的家庭注册护士(RN)讨论了家庭在支付食物方面的困难(见表2)。注册护士告知临床团队疑似食物不安全,并联系了保险病例经理(CM)讨论食物负担能力。研发部利用了当地的社区资源,如食品银行、食品盒和社区项目。一个社区项目能够帮助病人吃饭,直到病人的阿姨开始为他做饭。该患者没有直接与RD共享食物不安全;然而,与家庭注册护士的关系在与患者进行面对面交谈时证明是有价值的。结果:在这2例患者中,获取食物困难影响了患者的临床状态。临床小组确定了跨学科小组的粮食不安全和进一步教育的需要。RD与在职护理人员一起制作了一份食品不安全信息讲义,以帮助识别迹象(图1),以发现社区中可能存在的食品不安全状况和潜在的患者资源。图2给出了在怀疑有问题时询问患者的建议问题。结论:鉴于粮食不安全的普遍性,对体征和症状进行常规评估是必要的。家庭营养支持团队(包括注册营养师、注册护士、药剂师和护理技术人员)可以协助这项工作,因为他们经常与患者进行电话和在家联系,并与患者和护理人员建立信任的关系。临床医生应该意识到潜在的社会情况,可以保证改变PN配方。为了深思熟虑地解决这一敏感问题,PN输液提供者应考虑加强患者评估,并促进跨学科团队的教育,以提高对可获取社区资源的认识。表1。患者1信息。表2。疑似粮食不安全时间表。图1所示。检测粮食不安全的迹象。图2。问问题。christian Bury, MS, RD, LD, CNSC1;阿曼达·霍奇·博德,RDN, LD2;David Gardinier, RD, LD3;Roshni sredharan, MD, FASA, FCCM3;玛丽亚加西亚路易斯,MS, RD, ld41克利夫兰诊所,大学高地,OH;2克利夫兰诊所基金会,俄亥俄州沙利文;3克利夫兰诊所,俄亥俄州克利夫兰;2月25日,在佛罗里达州奥兰多举行的重症监护医学学会重症监护大会上,再次介绍:克利夫兰诊所癌症中心。出版:重症监护医学。2025;53(1):出版中。资金支持:项目支持由莫里森克利夫兰诊所营养研究合作提供。背景:先前存在营养不良的住院和危重患者预后更差,住院时间(LOS)更长。目前,注册营养师(rd)通过营养重点体检(NFPE)评估营养不良。最近的建议鼓励使用身体成分工具,如计算机断层扫描(CT)和NFPE。训练有素的rd可以使用CT扫描评估第三腰椎(L3)的骨骼肌质量,然后计算骨骼肌指数(SMI)和平均Hounsfield单位(HU),分别确定肌肉大小和质量。这已经在不同的临床人群中得到了验证,在NFPE困难的危重病人中可能特别有用。我们的目的是评估在外科和重症监护人群中使用CT扫描是否可以作为一种辅助工具来捕捉遗漏的营养不良诊断。方法:对克利夫兰诊所2021-2023年收治的120例患者进行营养不良评估,包括在腹部CT检查后2天内进行NFPE评估。其中,59名患者在入院时完成了大手术或手术,并被纳入最终分析。CT扫描由训练有素的L3 RD使用Terarecon读取,结果由名为Veronai的人工智能软件(AI)交叉参考。年龄、性别、身体质量指数、重度精神障碍指数对HU进行分析,并进行营养不良诊断。结果:对59例患者进行分析。其中61%为男性,51%为65岁,24% BMI为30。47%的患者被诊断为营养不良。根据NFPE,共有24名患者没有肌肉萎缩,而CT显示该组58%的患者肌肉质量低。22%的患者(13/59)在使用CT时营养不良严重程度较高。此外,在所有年龄组中,71%的患者检测到肌肉质量差。 值得注意的是,人工智能和RD在检测低肌肉量方面的评估有95%的一致性。结论:rd可以有效地分析CT扫描,并将SMI和HU与NFPE结合使用。单靠NFPE并不总是足够灵敏地检测外科和危重病人的低肌。营养不良的程度决定了营养干预措施,包括营养支持的时间和类型,因此必须准确诊断和调整干预措施,以改善结果。表1。CT诊断营养不良的变化。该图显示了当CT与NFPE结合使用ASPEN指南时,营养不良诊断的变化。表2。肌肉评估:CT vs NFPE。该图比较了CT和NFPE对肌肉的评估。第三腰椎CT扫描显示肌肉量和肌肉质量正常,患者65岁。图1所示。CT扫描评估肌肉大小和质量。第三腰椎CT扫描显示肥胖患者低肌肉质量和低肌肉质量。图2。CT扫描评估肌肉大小和质量。Elif Aysin, PhD, RDN, LD1;Rachel Platts, RDN, 
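The CT-based muscle assessment described just above derives a skeletal muscle index (SMI) from the cross-sectional muscle area at the third lumbar vertebra, and the liver-transplant abstract earlier in this section defines SMI the same way (muscle area in cm² divided by height in m²), with low-muscle-mass cutoffs of ≤50 cm²/m² for men and ≤39 cm²/m² for women. A minimal sketch of that index calculation, with illustrative inputs and the cutoffs passed in as parameters rather than taken as universal:

```python
# Hedged sketch: skeletal muscle index (SMI) from an L3 cross-sectional muscle
# area and height, with a sex-specific low-muscle-mass check. The example
# inputs are illustrative; the cutoffs shown are those quoted in the
# liver-transplant abstract in this section and may differ between studies.
def skeletal_muscle_index(l3_muscle_area_cm2: float, height_m: float) -> float:
    """SMI = L3 skeletal muscle cross-sectional area (cm^2) / height^2 (m^2)."""
    return l3_muscle_area_cm2 / height_m ** 2


def low_muscle_mass(smi: float, sex: str,
                    male_cutoff: float = 50.0, female_cutoff: float = 39.0) -> bool:
    cutoff = male_cutoff if sex == "male" else female_cutoff
    return smi <= cutoff


smi = skeletal_muscle_index(l3_muscle_area_cm2=135.0, height_m=1.70)
print(round(smi, 1), "cm^2/m^2;",
      "low" if low_muscle_mass(smi, "male") else "normal", "muscle mass")
```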
LD1;洛里·洛根,rn11亨利社区卫生,纽卡斯尔,in财政支持:无报道。背景:疾病相关的营养不良会改变身体成分并导致功能衰退。在急诊医院,20-50%的住院病人在入院时营养不良。营养不良患者的医疗费用更高,死亡率更高,住院时间更长。营养不良与再入院和并发症的风险增加有关。监测、诊断、治疗和记录营养不良对治疗患者很重要。它还有助于正确的诊断相关组(DRG)编码和准确的CMI(病例组合指数),从而增加报销。方法:在营养师完成营养重点体检(NFPE)课程并配备足够的工作人员后,在我们的小型农村社区医院开展由注册营养师领导的营养不良项目。跨学科医学委员会批准了改善营养不良筛查、诊断、治疗做法和编码的项目。决定使用学会/美国肠外和肠内营养学会(ASPEN)和全球营养不良领导倡议(GLIM)标准来诊断营养不良。营养不良筛查工具(MST)由护士完成,以确定营养不良的风险。营养和饮食部门创建了一个新的定制报告,使用营养数据库提供NPO-Clear-Full液体患者报告。rdn在工作日和周末检查NPO-Clear-Full liquid患者报告、BMI报告和住院时间(LOS)患者名单。rdn还进行NFPE检查以评估营养状况。如果发现营养不良,rdn通过医院信使系统与提供者沟通。提供者在他们的文件和护理计划中增加营养不良诊断。rdn创建了一个数据集,并与编码员和临床文档完整性专家/护理协调共享。我们追踪营养不良的病人。此外,注册营养师在营养不良的病人身上花了更多的时间。他们为出院计划和教育做出了贡献。结果:比较了实施营养不良项目6个月后的2023年营养不良诊断率和报销金额。营养不良诊断率从2.6%上升到10.8%。未指明的蛋白质卡路里营养不良诊断从39%下降到1.5%。rdn诊断的营养不良在82%的病例中被记录在提供者笔记中。营养不良诊断率提高315%,营养不良报销率提高158%。在确定为营养不良的患者中,59%接受了营养不良DRG代码。其余41%的患者有较高的主要并发症和合并症(mcs)代码。我们的营养不良赔偿从10.6万美元增加到27.6万美元。结论:实施循证实践指南是识别和准确诊断营养不良的关键。提供足够的工作人员进行必要的培训和多学科合作,改善了我们医院的营养不良诊断记录,增加了营养不良的报销。表1。营养不良前后的实施结果。图1所示。营养不良诊断的患病率。Elisabeth Schnicke, RD, LD, CNSC1;Sarah Holland,理学硕士,RD, LD, cnsc21俄亥俄州立大学韦克斯纳医学中心,俄亥俄州哥伦布市;2俄亥俄州立大学韦克斯纳医学中心,上阿灵顿,俄亥俄州财政支持:无报告。背景:营养不良与住院时间延长、再入院、死亡率和预后不良有关。早期发现和治疗至关重要。 结果:营养不良状况对主要结局(至EN起始时间、至EN目标率时间)无差异(p &gt; 0.05)。多元回归分析发现,中度营养不良和重度营养不良患者在入院后48小时内开始肠内营养的可能性均不高(p &gt; 0.05)。营养不良组间ICU和医院LOS均无差异(p &gt; 0.05)。营养不良组间ICU死亡率和住院死亡率均无差异(p &lt; 0.05)。在中度营养不良的患者中,81.8%的患者需要血管加压剂,而重度营养不良的患者为75%,未诊断为营养不良的患者为44.4% (p = 0.010)。90.9%的中度营养不良患者需要延长呼吸机时间(72小时),而59.4%的严重营养不良患者和51.9%的无营养诊断的患者需要延长呼吸机时间(p = 0.011)。结论:尽管营养不良的严重程度不影响LOS、再入院或死亡率,但营养不良状况确实显著预示着患者需要血管加压剂和延长呼吸机时间的可能性更大。为了更好地了解营养不良状况与临床结果之间的关系,进一步的研究需要更大的样本量。Jamie Grandic, RDN-AP, CNSC1;Cindi Stefl, RN, BSN, CCDS21Inova卫生系统,费尔法克斯站,弗吉尼亚州;2Inova Health System, Fairfax, VAEncore after presentation: Vizient Connections Summit 2024(2024年9月16日至19日)。出版物:美国医疗质量杂志(AJMQ) 2024年Vizient增刊。资金支持:无报告。背景:研究表明,高达50%的住院患者营养不良,但这些病例中只有9%被诊断出来。(1)对营养不良的诊断和干预不足会导致患者预后较差,并减少收入。我们的全系统营养不良宣传活动成功地加强了营养师的参与,提供者教育,并简化了文件编制流程。这一举措使营养不良代码的捕获量增加了两倍,营养不良变量的捕获量显著增加,与诊断相关的群体相对权重平均增加了约0.9。因此,与准确的营养不良诊断和记录相关的收入增加了约300%,同时死亡率和住院时间的观察与预期(O/E)比也有所改善。鉴于营养不良对死亡率、住院时间和费用的重大影响,加强识别规划至关重要。在我们的卫生系统内,营养不良鉴定项目已经在五家医院实施了几年。建立了临床营养和临床文献完整性(CDI)的系统领导角色,以确保一致性,实施最佳实践并优化项目疗效。在2022年,一项全面的分析确定了改进的机会:低系统捕获率(2%),对项目好处的认识有限,以及不一致的文档实践。领导团队,在我们的执行发起人的支持下,解决了这些问题,与服务部门的领导接触,并继续推动项目的改进。方法:开展营养不良教育活动:加强临床营养与CDI之间的合作,确保新发现的营养不良患者的日常全系统沟通。领导团队,包括编码和遵从性,审查了考虑拒绝风险和监管审计的文档协议。启动了一项全系统的营养师培训计划,包括RD的核心营养不良优化团队、5小时的综合培训和每月的图表审核,目标是达到80%的文件符合性。开展了一项提供者意识宣传活动,以互动演示形式介绍营养不良项目的好处和提供者文件建议。开发电子健康记录(EHR)报告和营养不良电子健康记录工具,使文件标准化。电子病历和财务报告用于监测项目影响和捕获率。结果:通过对利益相关者的持续教育,营养不良运动显著改善了成果。2022年11月还创建了营养不良电子病历工具。该工具对于增强文档、显著提高提供商和CDI的效率至关重要。主要结果包括:营养师文件合规性从85%(2022年7月)增加到95%(2024年);rd确定的营养不良病例从2%(2021年)增加到16%(2024年);最终编码营养不良诊断的月平均值从240例(2021年)增加到717例(2023年);平均DRG相对权重从1.24(2021年)攀升至2.17(2023年);财务影响从550万美元(2021年)增加到1770万美元(2024年);LOS O/E由1.04提高到0。 死亡率O/E从0.77提高到0.62(2021-2023)。结论:这一系统范围的倡议不仅提高了捕获率和文档记录,而且提高了总体结果。通过CDI和RD团队更多地发挥协作和领导作用,提供者可以更多地专注于患者护理,使这些团队能够在最佳状态下运作。展望2025年,重点将转向领先指标,以完善营养不良的识别,并进一步评估教育运动的影响。ryyota Sakamoto, MD, phd11京都大学,京都资金支持:无报道。背景:人们越来越关注吃肉对环境的影响,已经开始考虑转向植物性饮食。植物性饮食中特别缺乏的一种营养素是维生素B12。由于维生素B12缺乏会导致贫血、肢体感觉异常和肌肉无力,以及包括抑郁、谵妄和认知障碍在内的精神症状,因此寻找可持续的当地维生素B12来源非常重要。在这项研究中,我们主要研究了尼泊尔、印度和不丹两种传统上主要食用的发酵和腌制蔬菜gundruk和sinki,并调查了它们的维生素B12含量。辛基主要由萝卜根制成,而冈德鲁克则由芥菜叶等绿叶制成,这些绿叶经过发酵和晒干过程保存下来。以前的报告表明,在这些地区,不仅素食者和纯素食者,而且相当多的人,特别是穷人,可能一直很少吃肉。政府和其他组织已经启动了喂养计划,特别是向学校提供含有维生素A、B1、B2、B3、B6、B9、B12、铁和锌的强化食品。然而,在这个时候,向社区居民提供强化食品并不容易。探索从当地可用的产品中获取维生素B12的可能性是很重要的,这些产品可以由素食者、纯素食者或社区中的穷人服用。方法:从市场上获取4份腊肠和5份腊肠,采用德勃鲁氏乳杆菌亚种测定其维生素B12含量。乳酸杆菌(莱希曼乳杆菌)ATCC7830。定量下限设为0.03µg/100 
g。采用LC-MS/MS(配备Triple Quad 5500 + AB-Sciex质谱仪的岛津LC系统)对微生物定量方法中维生素B12浓度最高的样品进行氰钴胺测定。氰钴胺素的多反应监测转变模式为Q1: 678.3 m/z, Q3: 147.1 m/z。结果:四种样品中均检出维生素B12,从高到低分别为5.0µg/100 g、0.13µg/100 g、0.12µg/100 g和0.04µg/100 g。对于sinki,在5个样品中有4个样品中检测到,其值从高到低分别为1.4µg/100 g, 0.41µg/100 g, 0.34µg/100 g和0.16µg/100 g。LC-MS/MS估计样品中氰钴胺素的浓度为1.18µg/100 g。结论:根据世界卫生组织和联合国粮食及农业组织《人体营养中的维生素和矿物质需求(第二版)》(2004年),成人维生素B12的推荐摄入量为2.4微克/天,孕妇为2.6微克/天,哺乳期妇女为2.8微克/天。这项研究的结果表明,尽管样本之间存在很大的差异,但甘露和辛基有可能作为维生素B12的来源。为了将山楂和辛基作为维生素B12的来源,可能有必要在关注维生素B12与山楂和辛基不同制作方法之间的关系的同时,找到一种稳定维生素B12含量的方法。特蕾莎·卡佩罗,MS, RD, LD1;阿曼达·特鲁克斯,MS, RRT, RCP, AE-C1;Jennifer curtis, MS, RD, LD, CLC1;Ada Lin, md11国家儿童医院,哥伦布市,俄亥俄州经济支持:无报告。背景:危重儿童的代谢需求是由静息能量消耗的增加来定义的。(1、2)。PICU的能量需求是不断变化的,准确的评估是具有挑战性的。(3)由于患者进食过多或不足,预测方程被发现是不准确的,这可能导致负面结果,如肌肉损失和愈合不良(进食不足)以及体重增加(过度进食)(1,4,5)。间接量热法(IC)被认为是评估代谢需求的金标准,特别是对危重儿科患者(1,2,4)。由于人员配备、设备可用性和成本以及其他与患者相关的问题和/或推车规格,IC的使用可能受到限制(6)。在我们的设施中,我们确定营养师使用IC的限制。 与没有CLABSI的患者相比,经历CLABSI的患者体重明显增加,输注可注射脂质乳(ILE)的天数更少,导管停留时间更短(表1)。CLABSI组和没有CLABSI组在LTPN的时间长度、消费者所在地(农村与非农村)、家庭卫生服务的利用、输注肠外营养(PN)的天数等方面没有显著差异。结论:在这项回顾性队列研究中,我们报告的CLABSI率为每1000个导管天0.49个,低于之前发表的类似患者群体的CLABSI率。在本研究期间,有CLABSI和没有CLABSI的患者体重、输注ILE天数和导管停留时间有显著差异。然而,先前报道的影响CLABSI的变量,如乙醇锁的使用和与护理提供者的接近程度,在本队列中没有显着差异。可能需要更多LTPN患者或更长的研究时间来证实这些结果及其对CLABSI发生率的影响。表1。长期肠外营养(LTPN)的特点。Silvia Figueiroa, MS, RD, CNSC1;Stacie Townsend, MS, RD, CNSC21MedStar华盛顿医院中心,马里兰州贝塞斯达;2马里兰州贝塞斯达国立卫生研究院资金支持:无报道。背景:在住院患者中,脂质乳剂是平衡肠外营养(PN)的重要组成部分。传统上,豆油基脂质注射乳剂(SO-ILE)作为PN配方的一部分,主要作为能量来源和预防必需脂肪酸缺乏症。当代的做法已经发展到结合不同脂质乳液的混合物,包括大豆、MCT、橄榄油和鱼油的组合(SO、MCT、OO、FO-ILE)。有证据表明,使用SO、MCT、OO、FO-ILEs可能会改变必需脂肪酸谱,影响肝脏代谢和其他与临床益处相关的过程。该项目的目的是比较在一家完全致力于临床研究的医院接受肠外营养的SO- ile或SO、MCT、OO、FO-ILE的成年患者的必需脂肪酸谱(EFAP)、甘油三酯(TGL)、肝功能检查(LFTs)和总胆红素(TB)水平。方法:回顾性分析2019年1月1日至2023年12月31日在我院接受PN合并SO-ILE或SO、MCT、OO、FO-ILE的成人患者,并在接受ILE 7天后进行EFAP评估。数据包括人口统计、临床和营养参数。没有实验室标记物的患者、使用异丙酚的患者以及在收集EFAP前7天内同时使用ILE产品的患者被排除在外。采用Fisher检验和Mann-Whitney U检验对数据进行统计分析。结果:共纳入42张患者病历(14张SO-ILE;so, MCT, oo, o - ile)。组特征见表1。SO-ILE组患者接受更多的ILE (0.84 vs 0.79 g/kg/天,p &lt; 0.0001)。在ILE开始后,TGL水平发生显著变化(p &lt; 0.0001)。SO- ile组中57%的患者LFTs升高,SO、MCT、OO、FO-ILE组中60%的患者LFTs升高,而TB分别升高21%和40%(图1)。进一步分析显示,两组之间LFTs和TB无显著差异。EFAP的评估显示DHA、二十二酚酸和EPA的水平有显著差异,在接受SO、MCT、OO、FO-ILE的组中,这些水平更高。相反,在亚油酸、同型g-亚麻酸和总omega - 6脂肪酸水平上也观察到显著差异,在接受SO ILE的患者中这些水平更高(图2)。在必需脂肪酸缺乏方面,各组之间没有观察到差异,如三烯:四烯比率所示。结论:在我们的样本分析中,SO- ile组与SO、MCT、OO、FO-ILE组之间的LFTs和TB水平无显著差异。在SO、MCT、OO、FO-ILE组中发现DHA、二十二酚酸和EPA的水平增加,而在SO- ile组中亚油酸、同型g-亚麻酸和总omega - 6脂肪酸的水平往往更高。虽然在我们的样本中,SO、MCT、OO、FO-ILE组接受的ILE/kg/天剂量较低,但各组之间必需脂肪酸缺乏率没有差异。表1。一般特征(N = 42)。图1所示。肝功能检查(N = 39)。图2。必需脂肪酸谱(N = 42)。Kassandra Samuel, MD, ma;乔迪(林德)佩恩,RD, CNSC2;凯莉·舒特,RD3;克里斯汀·霍纳,RDN, CNSC3;Daniel Yeh, MD, MHPE, FACS, FCCM, FASPEN, cnsc31丹佛健康,圣约瑟夫医院,丹佛,科罗拉多州;2丹佛健康中心,帕克,科罗拉多州;3 . 
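Several of the retrospective comparisons above, including the LTPN CLABSI cohort (0.49 infections per 1000 catheter days) and the lipid-emulsion comparison, normalize infection counts to catheter days and compare groups with Fisher's exact and Mann-Whitney U tests. The sketch below illustrates those calculations under assumed, illustrative counts; it is not the studies' data or code.

```python
# Hedged sketch: infection-rate normalization and the nonparametric group
# comparisons named in the preceding abstracts. All numbers are placeholders.
from scipy.stats import fisher_exact, mannwhitneyu


def rate_per_1000_catheter_days(events: int, catheter_days: int) -> float:
    """Standard normalization used for CLABSI reporting."""
    return 1000 * events / catheter_days


print(rate_per_1000_catheter_days(events=44, catheter_days=90_000))

# Fisher's exact test on a 2x2 contingency table
# (e.g., ethanol-lock use vs. CLABSI status; counts are illustrative).
odds_ratio, p_fisher = fisher_exact([[12, 32], [58, 96]])

# Mann-Whitney U test on a non-normally distributed continuous variable
# (e.g., catheter dwell time in days; values are illustrative).
dwell_with_clabsi = [45, 80, 120, 210, 300]
dwell_without_clabsi = [290, 365, 400, 510, 600]
u_stat, p_mwu = mannwhitneyu(dwell_with_clabsi, dwell_without_clabsi)

print(odds_ratio, p_fisher, u_stat, p_mwu)
```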
Denver Health, Denver, co .财务支持:无报告。 大多数测试都是由重症监护病房的营养师安排的,很少在重症监护病房之外进行,尽管测试会使医院其他科室(如CTICU、康复、NICU和降压区)的患者受益。对非picu营养师的非正式调查显示,他们在解释数据和提供基于测试结果的建议方面存在很大的不确定性。不确定的原因主要集中在对这项技术缺乏熟悉。本研究的目的是制定指南和工作表,以一致地评估IC结果,目的是鼓励在我们的儿科设施中增加间接量热仪的使用。方法:由注册营养师(rd)和呼吸治疗师(RTs)组成的委员会于2023年1月召开会议,商定了循序渐进的指导方针,每月进行试验、审查和更新。最终确定的指南已转换为工作表,以提高使用的一致性并有助于解释IC结果。建立了一个共享文件,用于存储有关IC的文章以及指南和工作表的访问(图1和2)。对于本研究,回顾了2022年1月1日至2024年7月31日的IC数据。这些数据包括完成的测试数量和订单的来源。结果:自指南实施以来,使用IC数据的非picu区域从2022年的16%增加到2023年的30%,并且似乎有望在2024年保持不变(图3)。研发人员报告在评估测试结果以及提出测试订购建议方面的舒适度有所提高。结论:标准化的指南和工作表提高了RD对测试结果的舒适度和解释水平。PICU rd在PICU查房时已经变得更加熟练和舒适地解释IC。我们希望随着指南/工作表的发展,更多的非picu rd将在重症监护区域之外使用IC测试,因为重症监护区域可能会出现更长的住院时间。IC允许更个性化的营养处方。另一个好处是学科之间的信息交流。RTs向rd提供如何使用机器的教育。这增强了rd从RT的角度对IC测试结果的理解。作为回报,注册医生教育注册医生,让他们了解为什么患者测试环境的某些方面有助于报告结果,以便注册医生正确解释信息。委员会继续开会并讨论患者的测试,以了解如何优化测试以及如何使用结果来指导营养护理。图1所示。代谢车共享文件的屏幕截图。图2。IC工作表。图3。按单位每年完成的购物车:2022年为预干预;2023年和2024年是干预后。关键字:H2B = PICU;其他科室为非picu (H4A =心胸内科降压科,H4B =心胸内科ICU, H5B =烧伤科,H8A =肺科,H8B =稳定气管/通气科,H10B =神经外科/神经内科,H11B =肾病科/胃肠道,H12A =血液科/肿瘤科,C4A = NICU, C5B =感染性疾病)。Alfredo lozornio - jimsamnez -de-la- rosa, MD, MSCN1;Minu Rodríguez-Gil, MSCN2;Luz Romero-Manriqe, MSCN2;Cynthia García-Vargas, MD, MSCN2;Rosa Castillo-Valenzuela博士;Yolanda m<s:1> nade - romero,医学博士,msc11墨西哥学院Nutrición临床营养与营养治疗学院,León,瓜纳华托;2墨西哥学院Nutrición Clínica y Terapia nutritionalc(墨西哥临床营养与营养治疗学院),León,瓜纳华背景:骨骼肌减少症是一种系统性进行性肌肉骨骼疾病,与不良事件风险增加相关,在老年人中非常普遍。这种情况导致功能下降,生活质量下降,并对经济产生影响。由于年龄相关的变化、代谢变化、肥胖、久坐不动的生活方式、慢性退行性疾病、营养不良和促炎状态,肌肉减少症正变得越来越普遍。这项研究的目的是调查力量和肌肉质量之间的关系,用小腿围测量,都校正BMI,在年轻和年长的墨西哥成年人中。方法:这是一项前瞻性、观察性、横断面、基于人群的临床研究,研究对象为年龄在30至90岁之间的墨西哥男性和女性,通过方便抽样获得。这项研究得到了墨西哥瓜纳华托León的阿兰达德拉帕拉医院伦理委员会的批准,并遵守《赫尔辛基宣言》。在解释了研究的性质后,获得了所有参与者的知情同意。纳入标准为:30 - 90岁功能独立的墨西哥男性和女性(Katz指数类别“a”)。排除截肢、运动障碍或四肢有固定装置的参与者。该研究小组以前在人体测量方面是标准化的。 当比较体重或BMI Z-评分类别Z &lt; -2、-2 &lt; Z &lt; -0.01或Z &gt; 0时,入院的严重营养不良患者的大多数测量临床结果没有差异(表2)。与BMI &lt; -2或在-2和0之间相比,0的中位成本增加(p = 0.042)(表2)。在体重Z评分&gt的患者中,中位成本(p = 0.067)和中位LOS (p = 0.104)有增加的趋势;0.结论:不论用于诊断严重营养不良的诊断标准如何,严重营养不良住院患者具有相似的临床结果。少数入院的严重营养不良患者(n = 180或44%)有足够的人体测量数据来确定BMI。基于这些数据,我们机构未来针对入院前和入院后干预的项目将需要关注所有严重营养不良患者,而不会根据严重营养不良或人体测量的标准(单个或多个数据点)缩小范围。质量改善项目包括改善入院时的身高测量和BMI测定,这将允许将来评估人体测量对临床结果的影响。表1。严重营养不良诊断类别的结果。根据用于确定营养不良诊断的诊断标准,比较严重营养不良的结果。仅代表严重营养不良的患者。诊断标准根据ASPEN/AND指南确定,并在入院时由注册营养师(RD)定义。OR =手术室;ICU =重症监护病房;LOS =停留时间。报告了383例入院患者的数据,共327例患者因再入院:284例患者有1例入院;2次入院33例;8例入院3次;1例患者入院4次;1例住院5次。BMI z评分分类结果。入院的严重营养不良患者的结局,根据BMI z评分分层。仅代表严重营养不良的患者。BMI z评分根据入院时的体重和身高测量确定,由床边住院护士记录。OR =手术室;ICU =重症监护病房;LOS =停留时间。由于身高测量不完整,只有180例入院患者的数据可用,共158例患者:142例患者有1次入院;2次入院12例;3例患者入院3次;1例5次入院。claudia Maza, ND MSc1;Isabel Calvo, MD, MSc2;Andrea Gómez, ND2;Tania Abril, MSc3;Evelyn frias - tore,医学博士,MSc41Centro m<s:1> dico Militar(军事医疗中心),危地马拉,圣罗莎;2 Tijuana总医院(Tijuana总医院),下加利福尼亚州蒂华纳;3 Guayas Guayaquil天主教大学Católica de Santiago de Guayaquil;4 universsidad Espíritu Santo(圣灵大学),得克萨斯州滴泉市背景:营养不良是住院患者中常见且重要的问题,特别是在老年人或有多种合并症的患者中。这些患者营养不良的存在与发病率、死亡率、住院时间延长和医疗费用增加的风险有关。肌肉力量是营养状况的一个重要指标,可以用握力(HGS)有效地测量。本研究旨在描述拉丁美洲两家医院患者的营养状况与肌肉力量下降之间的关系。方法:于2022年2月至5月进行回顾性观察研究。数据是从两家医院收集的:一家在危地马拉,一家在墨西哥。共有169例年龄在19-98岁的患者被纳入研究,127例符合纳入标准。样本包括在内科、外科和老年科住院的成年男女患者。入院时和住院第14天分别记录了手握力、人口统计数据、基线医学诊断、体重和身高。排除标准包括手臂或手部活动受限的患者、镇静或机械通气患者以及住院时间少于24小时的患者。HGS采用JAMAR®和Smedley测功机,按照标准方案进行测量。采用集中趋势的方法进行统计分析,结果以表格和图表的形式呈现。结果:在第一医院(墨西哥),62例患者参与,以女性样本为主。平均体重为69.02公斤,身高为1.62米,BMI为26.14公斤/平方米(超重)。最常见的入院诊断是传染病、神经系统疾病和消化系统疾病。(表1)在第一次和第二次测量之间观察到HGS略有增加(0.49 kg)。 
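The handgrip-strength survey above works from a few simple anthropometric quantities: BMI derived from measured weight and height, the usual WHO BMI categories, and the change in grip strength between admission and day 14. A minimal sketch of those calculations follows; the input values are illustrative, not the cohort means.

```python
# Hedged sketch: BMI calculation, WHO category, and change in handgrip
# strength (HGS) between two time points. Input values are illustrative.
def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2


def who_bmi_category(bmi_value: float) -> str:
    # Standard WHO adult cutoffs, as used in the abstract's classification.
    if bmi_value < 18.5:
        return "underweight"
    if bmi_value < 25:
        return "normal"
    if bmi_value < 30:
        return "overweight"
    return "obese"


admission = {"weight_kg": 69.0, "height_m": 1.62, "hgs_kg": 22.5}
day_14 = {"hgs_kg": 20.5}

patient_bmi = bmi(admission["weight_kg"], admission["height_m"])
print(round(patient_bmi, 2), who_bmi_category(patient_bmi))
print("HGS change (kg):", day_14["hgs_kg"] - admission["hgs_kg"])
```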
(图1)在第二家医院(危地马拉),62名患者也符合纳入标准,其中男性样本占主导地位。平均体重为65.92公斤,身高为1.61米,BMI为25.47公斤/平方米(超重)。传染病和肌肉骨骼疾病是最常见的诊断。(表1)在第一次和第二次测量之间,HGS降低了2kg。(图2)低HGS与体重过轻患者以及II级和III级肥胖患者相关。两个中心BMI正常的患者都表现出肌肉力量的显著下降,这表明体重本身并不是肌肉力量保持的充分指标。结论:这项多中心研究强调了住院患者营养状况与肌肉力量下降之间的重要关系。虽然体重过轻的患者HGS下降,但II级和III级肥胖患者也经历了显著的力量损失。这些发现表明HGS是一种评估住院患者营养状况和肌肉力量的有价值的非侵入性工具。早期识别肌肉力量退化可以帮助医疗保健提供者实施及时的营养干预措施,以改善患者的预后。表1。研究人群的基线人口学和临床特征。NS:神经系统,BMI:身体质量指数营养状况与握力的关系(中心1 -墨西哥)。图2。营养状况与握力的关系(中心2 -危地马拉)。Reem Farra, MDS, RD, CNSC, CCTD1;Cassie Greene, RD, CNSC, CDCES2;Michele Gilson, MDA, RD, CEDS2;玛丽·英格利克,MS, RD, CSO, CDCES2;克里斯汀·索纳姆,MS, RD, CDE2;戴比·安德森,MS, RD, CEDRD-S, CHC3;Stephanie Hancock, RD, CSP, CNSC41Kaiser Permanente, Lone Tree, CO;2Kaiser Permanente, Denver, CO;3Kaiser Permanente, Castle Rock, CO .;4 . kaiser Permanente, Littleton, co .财务支持:无报道。背景:目前,医疗保险和医疗补助服务中心对门诊的营养不良没有标准化的筛查要求。这引起了人们对营养不良的早期识别、营养干预的可能性以及医疗保健费用增加的关注。虽然对住院患者的营养不良筛查研究得很好,但其对门诊护理的影响尚未得到彻底检查。研究表明,营养不良的患者在出院后30天内再次入院的可能性要高出40%,住院费用增加了1万多美元。作为注册营养师(rd)营养评估的一部分,本质量改进项目旨在评估对所有患者实施标准化营养不良筛查工具的影响。方法:选择营养不良筛查工具(MST),因为它的有效性,敏感性和特异性,以确定在门诊设置的营养不良风险。这个工具通过询问最近无意的体重减轻和因食欲不振而减少的摄入量来评估风险。根据回答,0-5分表示风险的严重程度。0-1分表示没有风险,2-5分表示有营养不良风险。该问卷被整合到电子病历的营养评估部分,以标准化所有患者的筛查。那些得分在2分或以上的人被纳入,没有排除疾病状态。患者在初始评估后2-6周进行随访,在此期间重新计算其MST评分。结果:共筛选414例患者,175例完成随访。其中,131名儿童在营养干预后的MST得分有所改善,12名儿童得分增加,32名儿童得分保持不变。239名患者因各种原因失去随访,包括缺乏反应、RD安排有限、保险变更和死亡率。那些MST得分提高的患者在随后的住院治疗中平均每人节省了15,000美元的费用。这种成本的避免是由于住院时间较短,治疗反应较好,并且减少了对医疗干预的需求。该项目提高了多学科团队对早期营养干预重要性的认识。结论:该项目表明,在门诊环境中标准化营养不良筛查可以为患者和卫生系统节省成本,同时改善整体护理。需要进一步的研究来确定门诊营养不良筛查的最佳工具、最佳随访时间表和有效的营养干预措施,以最大限度地避免成本。 Amy Sharn, MS, RDN, LD1;Raissa Sorgho,博士,MScIH2;Suela Sulo,博士,硕士3;Emilio Molina-Molina, PhD, MSc, MEd4;克拉拉·罗哈斯·黑山,RD5;Mary Jean Villa-Real Guno, MD, FPPS, FPSPGHAN, MBA6;Sue Abdel-Rahman,制药博士,MA71Abbott Nutrition, Columbus, OH;2加州萨克拉门托公共卫生研究所健康与营养中心;3全球医疗事务与研究,雅培营养,芝加哥,伊利诺伊州;4研究,开发,雅培营养,格拉纳达,安达卢西亚;5罗萨里奥大学,埃斯奎拉德梅迪奇纳,波哥大,昆迪纳马卡;6马尼拉雅典耀大学医学和公共卫生学院,马尼拉大都会,国家首都区;演讲会:美国营养学会,6月29日至7月2日,芝加哥,伊利诺伊州,美国;美国儿科学会,9月27日至10月1日,美国佛罗里达州奥兰多。出版物:Sharn AR, Sorgho R, Sulo S, Molina-Molina E, Rojas Montenegro C, Villa-Real Guno MJ, Abdel-Rahman S.使用中上臂围z-score测量支持青少年营养不良筛查作为全球体育和健康计划的一部分,并改善营养保健的可及性。2024年8月12日;11:14 . 
23978。doi: 10.3389 / fnut.2024.1423978。PMID: 39188981;PMCID: PMC11345244。资金支持:本研究由美国伊利诺伊州芝加哥雅培营养不良解决方案中心提供资金支持。Veeradej Pisprasert, MD, phd;Kittipadh Boonyavarakul, MD2;Sornwichate Rattanachaiwong, MD3;Thunchanok Kuichanuan, MD3;Pranithi Hongsprabhas, MD3;1庆庆大学医学院,庆庆大学,庆庆;2泰国曼谷朱拉隆功大学;3孔敬大学医学系,孔敬大学资助。背景:系统性硬化症(SSc)是一种自身免疫性疾病,营养不良是其自然史和/或胃肠道慢性炎症引起的常见并发症。目前的营养评估工具,如GLIM标准,可能包括关于营养诊断的肌肉质量测量数据。人体测量是测定肌肉质量的基本方法,然而,在这种情况下的数据是有限的。本研究旨在确定通过人体测量测定肌肉质量和肌肉功能对诊断SSc患者营养不良的效用。方法:对泰国斯利那加林医院的成年SSc患者进行横断面诊断研究。所有患者均根据主观整体评估(SGA)评估营养不良。通过上臂中肌围(MUAC)和小腿围(CC)测量肌肉质量,通过握力(HGS)测定肌肉功能。结果:共纳入SSc患者208例,其中女性149例(71.6%)。平均年龄59.3±11.0岁,体重指数21.1±3.9 kg/m²。近一半(95例);45.7%)营养不良。MUAC、CC、HGS平均值分别为25.9±3.83、31.5±3.81、19.0±6.99 kg。MUAC诊断营养不良的受试者工作特征(ROC)曲线下面积(AUC)为0.796,CC为0.759,HGS为0.720。建议的截止值见表1。结论:肌肉质量和肌肉功能与营养不良有关。通过人体测量来评估肌肉质量和/或功能可能是系统性硬化症患者营养评估的一部分。表1。系统性硬化症患者MUAC、CC和HGS的建议临界值小腿围,HGS;握力MUAC;mid-upper-arm周长。图1所示。主观总体评价(SGA)诊断营养不良的MUAC、CC和HGS的ROC曲线。Trevor systemma, BS1;Megan Beyer, MS, RD, LDN2;希拉里·温斯洛普,MS, RD, LDN, CNSC3;威廉·赖斯,BS4;Jeroen Molinger, PhDc5;Suresh Agarwal, MD3;科里·瓦萨斯,MD3;Paul Wischmeyer, MD, EDIC, FCCM, FASPEN6;Krista Haines, DO, MA31Duke University, Durham, NC;2杜克大学医学院麻醉科,北卡罗来纳州达勒姆;3杜克大学医学院,北卡罗来纳州达勒姆;4东弗吉尼亚医学院,弗吉尼亚州诺福克;5杜克大学医学中心麻醉科-杜克心脏中心,北卡罗来纳州罗利;6Duke University Medical School, Durham, nc .财政支持:Baxter。背景:实现可接受的营养目标是术后护理的一个关键但经常被忽视的组成部分,影响患者的重要结果,如减少感染并发症和缩短ICU住院时间。预测静息能量消耗(pREE)方程与实际测量的REE (mREE)相关性较差,导致潜在的有害的过量或不足摄食。国际ICU指南现在推荐使用间接量热法(IC)来确定mREE和个性化患者营养。 手术应激增加了蛋白质分解代谢和胰岛素抵抗,但年龄对术后mREE趋势的影响尚未得到很好的研究,而通常使用的pREE方程并没有考虑到这一点。本研究假设接受腹部大手术的老年人代谢恢复比年轻患者慢,如ic测量。方法:这是一项irb批准的前瞻性试验,研究对象是接受腹部大手术后继发于钝性或穿透性创伤、败血症或血管急症的腹部开放的成年患者。患者接受连续IC评估以指导术后营养输送。术后72小时内进行评估,ICU期间每3±2天进行一次评估,出院后每7±2天进行一次评估。根据医学可行性,使用Q-NRG®代谢监测仪(COSMED)在机械通气患者的呼吸机模式或面罩或冠层模式下测定患者的mREE。IC数据取自满足稳态条件的≥3分钟间隔,即氧气消耗和二氧化碳产生的变化小于10%。不符合这些标准的测量结果被排除在最终分析之外。在术后前9天至少两个时间点没有mREE数据的患者也被排除在外。老年患者定义为≥65岁,年轻患者定义为≤50岁。采用最小二乘法计算稀土元素的变化趋势,假设方差不等(α = 0.05),采用t检验进行比较。使用ASPEN-SCCM方程从入院人体测量数据计算患者的pREE值,并与IC测量值进行比较。结果:18名老年人和15名年轻人符合预先指定的资格标准,并被纳入最终分析。老年人和年轻人REE恢复的平均率和标准误差分别为28.9±17.1 kcal/d和75.5±18.9 kcal/d,接近但未达到统计学意义(p = 0.07)。老年队列pREE的下限和上限平均分别为1728±332 kcal和2093±390 kcal,明显超过了大多数患者通过IC获得的mREE,并且未能捕捉到使用mREE识别的观察到的变异性。在年轻人中,pREE值更接近IC测量值,平均值分别为1705±278 kcal和2084±323 kcal。结论:我们的数据表明,腹部大手术后代谢恢复率在年轻和老年成人患者之间存在差异,但可能由于样本量不足,没有达到统计学意义。预测能量方程不能充分捕捉到REE的变化,可能高估了老年患者术后的能量需求,未能认识到我们的研究在老年患者中发现的mREE增加的可变性。这些发现加强了在术后早期恢复期使用IC指导营养输送的重要性。为了进一步探索这些问题,需要更大规模的试验来使用IC和量化蛋白质代谢的贡献。表1。病人的人口统计数据。图1所示。与pREE (ASPEN)相比,大腹手术后老年和年轻成人患者mREE的术后变化。Amber Foster, BScFN, BSc1;Heather Resvick, PhD(c), MScFN, RD2;Janet Madill, PhD, RD, FDC3;帕特里克·卢克,医学博士,FRCSC2;Alp Sener, MD, PhD, FRCSC4;Max Levine, MD, MSc51Western University, Ilderton, ON;2LHSC,伦敦,ON;3西安大略大学健康科学学院布雷西亚食品与营养科学学院,伦敦;4伦敦健康科学中心,伦敦,安大略;财政支持:布雷西亚大学学院MScFN助学金。背景:目前,体重指数(BMI)是衡量慢性肾脏疾病(CKD)患者是否适合肾移植的唯一标准。然而,BMI并不能很好地衡量这类患者的健康状况,因为它不能区分肌肉质量、脂肪质量和水的重量。终末期肾病患者经常经历体液平衡的改变,导致体液潴留、肿胀和体重增加。因此,这些患者的BMI可能会被错误地升高。因此,为这一患者群体考虑更准确和客观的身体成分测量是至关重要的。本研究的目的是确定CKD患者被分类为健康体重、超重或肥胖之间的身体组成是否存在差异。方法:这是一项横断面研究,分析114名成年CKD患者的身体组成,评估肾脏移植。参与者被分为三个BMI组:健康体重组(1组,BMI &lt; 24.9 kg/m2, n = 29),超重组(2组,BMI≥24.9-29)。 9 kg/m2, n = 39)或肥胖(3组,BMI≥30 kg/m2, n = 45)。采用生物电阻抗分析法(BIA)测定脂肪质量、瘦体重(LBM)和相位角(PhA)。标准相位角(SPhA)是衡量细胞健康状况的一种指标,计算方法为[(观察到的PhA-平均PhA)/ PhA的标准差]。采用Jamar测力仪测量握力(HGS),超声测量股四头肌肌层厚度(QMLT)。归一化HGS (nHGS)计算为[HGS/体重(kg)],并与年龄和性别特异性的标准化临界值进行比较。无脂质量指数(FFMI)采用[LBM/(身高(m))2]计算。低FFMI可以识别营养不良的高风险,使用ESPEN的临界值&lt;17公斤/平方米男性和&lt;雌性15公斤/平方米。使用经过验证的Fried脆弱表型评估工具确定脆弱状态。统计分析:连续资料采用单因素方差分析,后设Tukey事后检验,分类资料采用卡方检验。IBM SPSS version 29,显著性p &lt; 0.05。结果:组1的参与者比组2 (p = 0.004)或组3 (p &lt; 
0.001)年轻。男性和女性在三组之间没有显著差异。组1低于临界值的FFMI值(13%)显著高于组2(0%)和组3 (2.1%)(p = 0.02)。nHGS在两组之间存在显著差异,第3组(75%)的参与者更频繁地出现较低的肌肉力量,而第2组和第1组分别为48.7%和28.5% (p &lt; 0.001)。在三个BMI组之间,QMLT、SPhA、HGS或虚弱状态均无显著差异。结论:在三个BMI组之间,QMLT、SPhA或虚弱状态等身体成分参数似乎没有差异。然而,被归类为具有健康BMI的CKD患者更有可能存在营养不良的风险。此外,与其他两组相比,那些被归类为具有健康BMI的人似乎拥有更多的肌肉力量。综上所述,这些结果提供了令人信服的证据,表明BMI不应成为列出肾移植患者的唯一标准。需要进一步的研究来证实这些发现。凯莉·维尼克,BS1;凯瑟琳·彼得森,MS, RDN, CSO2;Julie Kurtz, MS, CDCES, RDN2;莫琳·麦考伊,MS, RDN3;Mary Chew, MS, rdn41亚利桑那州立大学和退伍军人医疗管理局,凤凰城,亚利桑那州;2退伍军人医疗管理局,亚利桑那州凤凰城;3亚利桑那州立大学,亚利桑那州凤凰城;4凤凰VAHCS,凤凰,azphoenix财务支持:无报告。背景:营养不良没有一个标准化的定义,也没有普遍的识别标准。注册营养师(rdn)通常根据学会和ASPEN营养不良鉴定(AAIM)标准进行诊断,而医生则需要使用国际疾病分类第10版(ICD-10-CM)。然而,在如何确定两者之间的营养不良存在重大差异。营养不良ICD-10代码(E43.0, E44.0, E44.1, E50.0-E64.9)的诊断标准模糊,导致提供者使用临床专业知识和先前的营养教育。对于营养师来说,AAIM的诊断标准是明确定义和验证的,可以根据体重和摄入量减少、肌肉和脂肪量减少、液体积聚和身体功能下降来识别营养不良。由于缺乏标准化,识别和诊断营养不良的过程不一致。本研究的目的是分析医生和营养师使用两种方法对营养不良诊断的一致性,并比较一致组和不一致组之间的患者结果。方法:通过电子方式从退伍军人健康管理局的临床数据仓库中提取2019年、2020年和2021年4月至7月期间分配营养不良诊断代码的668名住院患者的回顾性图表。住院时间、感染、压伤、跌倒、30天再入院以及提供者之间的沟通记录从图表回顾中收集。医院的费用数据来自退伍军人公平资源分配(VERA),并与样本中匹配的社会安全号码配对。用卡方比较感染、压伤、跌倒和再入院的不一致性和一致性的差异。两组患者住院时间和住院费用均值采用方差分析(ANOVA)进行统计分析。结果:供方对营养不良的诊断不一致。诊断不一致组的不良预后比例高于诊断一致组。一致的诊断发现与记录沟通的发生率显著相关(p &lt; 0.001)。 结论:本研究显示了营养不良患者护理的差距。需要进行进一步的研究,以了解一致诊断和提供者之间沟通的障碍。松本奈奈,RD, MS1;大叶幸二副教授2;成田智则,MD3;井上荣,MD2;Satoshi Murakoshi, MD, PhD4;谷口幸,MD2;河野健一,MD2;野口美纪BA5;精工Tsuihiji2;Kazuhiko Fukatsu, MD, phd21东京大学,文京城,东京;2东京大学文京区,东京;3 .东京大学,东京中央城;4神奈川县横须贺市神奈川县人类服务大学;5 .东京大学医院,文京区,东京。资金支持:无报道。背景:治疗性饮食常用于各种疾病的患者,如糖尿病、肾功能不全和高血压。然而,由于给予的营养量的限制,治疗性饮食可能会降低食欲。医院的膳食不论何种饮食类型,只要病人吃饱了,就能维持病人的营养状况。治疗性饮食可能至少在一定程度上与医院营养不良有关。因此,我们进行了一项探索性回顾性队列研究,在考虑其他因素的情况下,探讨治疗性饮食和常规饮食是否存在住院患者口腔消耗的差异。方法:本研究方案经东京大学伦理委员会批准,协议号为2023396ni -(1)。我们回顾性地提取了2022年6月至10月在东京大学医院骨科和脊柱外科住院的患者的病历信息。符合条件的患者年龄大于20岁且住院7天以上。这些患者的主要营养来源是口服膳食。给予质地改良饮食、半食或流食者排除。测量包括在住院期间(例如入院时、手术前后和出院时)、性别和年龄的不同时间点口服食物摄入量的百分比。通过线性混合效应模型分析治疗饮食与常规饮食对患者口服食用量的差异。结果:共分析290例患者,其中50例患者在入院时接受治疗性饮食,240例患者在入院时接受常规饮食。在每个住院时间,治疗饮食的平均口服摄入百分比为83.1%,常规饮食的平均口服摄入百分比为87.2%,与治疗饮食相比,常规饮食的平均口服摄入百分比始终高出4-6%(图)。在调整性别和年龄的线性混合效应模型中,常规饮食组口服摄入的平均百分比比治疗饮食组高4.0%(95%可信区间[CI], -0.8%至8.9%,p = 0.100),但差异未达到统计学意义。女性口服摄入的平均百分比比男性低15.6% (95%CI, -19.5%至-11.8%)。同样,与年轻患者相比,老年患者的摄入率降低(差异,每年龄-0.2%,95%CI -0.3%至-0.1%)。结论:本探索性研究未能表明,与常规饮食相比,治疗性饮食可以减少骨科和仰卧手术患者的食物摄入量。然而,性别和年龄是影响食物摄入量的重要因素。我们需要特别注意女性和/或老年患者增加口服食物的摄入量。未来的研究将增加检查的患者数量,将队列扩展到其他科室,并进行前瞻性研究,以找出哪些因素真实影响患者住院期间的口服摄入。图1所示。住院期间每种饮食中口服摄入量的百分比。罗瑞娜·穆哈吉,MS1;Michael Owen-Michaane, MD, MA, CNSC211人类营养研究所,Vagelos内科和外科医生学院,哥伦比亚大学欧文医学中心,纽约,NY;2哥伦比亚大学欧文医学中心,纽约,纽约资金支持:无报道。背景:肌肉质量对整体健康和幸福至关重要。因此,准确估计肌肉质量对于诊断营养不良和肌肉减少症和恶病质等疾病至关重要。Kim方程使用生物标志物数据来估计肌肉质量,但该工具是否能在高BMI和肾病人群中提供准确的估计仍不确定。因此,本研究的目的是评估Kim方程在具有不同BMI和肾脏疾病的队列中是否适合和可靠地估计肌肉质量并预测营养不良、肌肉减少症和相关结局。方法:这是一项横断面研究,使用来自我们所有人研究计划的数据。统计数据、体重、身高、肌酐、胱抑素C、营养不良诊断、髋部骨折和恶病质均来自电子健康记录(EHR)。 Kim方程由肌酐、胱抑素C和体重导出(表1),并与已建立的阑尾瘦质量(ALM)肌少症临界值进行比较,包括ALM/BMI和ALM/身高2。通过记录在EHR中的特定ICD-10-CM代码确定营养不良,并根据营养不良状况(有无营养不良)对参与者进行分类。比较重度/中度营养不良组和非重度/中度营养不良组的肌肉质量和生物标志物水平,并使用线性回归分析BMI与肌酐和胱抑素C水平的关系。使用Wilcoxon秩和检验来评估估计肌肉质量与营养不良诊断之间的关系。结果:基线特征按性别分层进行比较。参与者的平均年龄为58.2岁(SD = 14.7)。平均BMI为30.4 kg/m2 (SD = 7.5)。平均血清肌酐和胱抑素C水平分别为2.01 mg/dL (SD = 1.82)和0.18 mg/dL (SD = 0.11)。平均估计肌肉质量为80.1 kg (SD = 21),估计肌肉质量占体重的百分比为92.8%。平均ALM/BMI为2.38 (SD = 0.32), ALM/height2为25.41 kg/m2 (SD = 5.6)。没有参与者达到肌肉减少症的临界值。所有计算变量汇总如表1所示。在这个队列中,&lt;2%的人被诊断为严重营养不良。2%为中度营养不良(表2)。与没有营养不良的人相比,严重营养不良的参与者的肌肉质量更低(W = 2035, p &lt; 0.05)(图1)。结论:该研究表明,Kim方程高估了高BMI或肾脏疾病人群的肌肉质量,因为尽管预期CKD患病率较高,但没有参与者达到肌肉减少症的临界值。考虑到CKD中已知的低肌肉量风险,这种高估值得关注。虽然较低的肌肉量与严重营养不良显著相关(p &lt; 
0.05),但Kim方程确定的营养不良病例比基于临床数据的预期要少。虽然像肌酸酐和胱抑素C这样的生物标志物可能有助于诊断营养不良,但金方程可能无法准确估计不同人群的肌肉量或预测营养不良和肌肉减少症。需要进一步的研究来改进这些估计。表1。基于FNIH (ALM/BMI)和EWGSOP (ALM/Height2)临界值的肌肉质量度量计算和肌少症诊断缩写:bmi -身体质量指数;tbmm -全身肌肉质量(也称为肌肉质量)(使用Kim方程计算);alm -阑尾瘦肌肉质量(使用麦卡锡方程);ALM/ height2 - appendical Lean Muscle Mass调整为高度平方(使用EWGSOP截止值诊断肌肉减少症);ALM/BMI-根据BMI调整的阑尾瘦肌肉质量(使用FNIH截止值诊断肌肉减少症)。公式1:Kim公式-计算体肌质量=体重*血清肌酐/(K *体重*血清胱抑素C) +血清肌酐)严重和中度营养不良的患病率。(小于20的计数被抑制,以防止参与者的重新识别)。图1所示。有和没有严重营养不良群体的肌肉质量。海报区分罗伯特·韦默,BS1;Lindsay Plank博士;Alisha Rovner博士;Carrie Earthman,博士,rd11特拉华大学,纽瓦克,特拉华;2奥克兰大学,奥克兰资金支持:无报道。背景:骨骼肌丧失在肝硬化患者中很常见,而低肌肉量是国际公认的诊断营养不良的关键表型标准。1,2肌肉可以通过各种方式进行临床评估,尽管参考数据和共识指南在解释肌肉测量来定义这一患者群体的营养不良方面是有限的。本研究的目的是利用体内中子活化分析(IVNAA)测量的总蛋白(TBP)作为参考,评估已发表的肌少症切点应用于双能x射线吸收仪(DXA)肌肉测量以GLIM标准诊断营养不良的敏感性和特异性。方法:成年肝硬化患者在奥克兰大学身体成分实验室行IVNAA和全身DXA。测量dda -free mass (FFM)和阑尾骨骼肌mass (ASMM,有和没有校正湿骨量3)并以身高平方为指标(FFMI, ASMI)。根据年龄、性别和身高匹配的健康参考数据计算TBP实测值与预测值之比,作为蛋白质指数;低于平均值2个标准差的值(&lt;0.77)定义为蛋白质消耗(营养不良)。评估推荐指南中公布的切点(表1)。低于切点的DXA值被解释为“肌肉减少”。测定每个切点的敏感性和特异性。 结果:研究样本包括350名肝硬化成人(238名男性/112名女性,中位年龄52岁),终末期肝病模型(MELD)中位评分为12(范围5-36)。应用已公布的肌少症切点诊断肝硬化患者的DXA营养不良,其敏感性为40.8% - 79.0%,特异性为79.6% - 94.2%(表1)。尽管所有已公布的DXA测量ASMI的切点都是相似的,但将Baumgartner4和Newman5 ASMI切点应用于DXA测量的ASMI时,特别是在校正湿骨量后,在肝硬化患者中,通过蛋白质耗竭诊断营养不良,获得了敏感性和特异性的最佳结合。表1中的Studentski ASMM切点产生了不可接受的低灵敏度(41-55%)。结论:这些发现表明,使用dxa衍生的Baumgartner/Newman骨矫正ASMI切点在肝硬化患者GLIM诊断营养不良方面具有可接受的有效性。然而,考虑到在DXA测量ASMI时对湿骨量进行这种校正的做法并不常见,将这些切点应用于DXA未校正的ASMI标准测量可能会产生更低的敏感性,这表明许多低肌肉和营养不良的个体在应用这些切点时可能会被误诊为非营养不良。表1。双能x线骨骼肌指数测定法测定肝硬化患者蛋白质耗竭的选定切点评价缩写:DXA,双能x射线吸收仪;米,男;F,女;全球营养不良领导倡议;欧洲老年人肌肉减少症工作组;亚洲肌肉减少症工作组;美国国立卫生研究院基金会;ASMM,由dxa测量的手臂和腿部瘦软组织测定的附骨骼肌质量(kg);ASMI,以平方米为单位索引高度的ASMM;ASMM- bc,根据Heymsfield等人1990年修正湿骨量的ASMM;ASMI- bc, ASMI校正湿骨量。重症监护和关键健康问题amir Kamel,药学博士,FASPEN1;Tori Gray, pharm2;Cara Nys,药学博士,BCIDP3;Erin Vanzant, MD, FACS4;马丁罗森塔尔,医学博士,FACS, faspen11佛罗里达大学,盖恩斯维尔,佛罗里达州;2 .佛罗里达州盖恩斯维尔市辛辛那提儿童基金会;3奥兰多健康中心,佛罗里达州奥兰多;4佛罗里达州盖恩斯维尔市佛罗里达大学医学院创伤和急性护理外科外科学系资金支持:无报道背景:氨基酸(AAs)在我们的身体中有不同的用途,包括结构、酶和整体细胞功能。氨基酸的利用和需求在健康和疾病状态之间可能有所不同。某些情况下,如慢性肾病或短肠综合征会影响血浆AA水平。先前的研究已经确定瓜氨酸是肠道功能和吸收能力的标志。手术或创伤等应激源可以改变AAs的代谢,可能导致高分解代谢状态和可用AAs池的变化。本研究的主要目的是比较腹部手术患者与未手术患者的AA水平。次要目的是描述术后并发症,并将血浆AAs水平与这些并发症联系起来。方法:本研究是一项单中心回顾性分析,于2007年1月1日至2019年3月15日期间对转介到佛罗里达大学健康营养支持团队(NST)的患者进行了单中心回顾性分析,并将氨基酸水平作为营养支持咨询的一部分进行了常规代谢评估。如果样品被认为受到污染,则排除氨基酸数据。患有遗传疾病的患者也被排除在研究之外。在研究期间,采用生物铬离子交换色谱法(ARUP实验室,盐湖城,UT)进行氨基酸生物测定。结果:在筛选的227例患者中,181例患者纳入研究(58例接受腹部手术,123例未接受手术)。参与者的平均年龄、BMI和身高分别为52.2岁、25.1 kg/m2和169 cm。两组患者的基线特征相似,31%的手术组患者在术后一年内接受了手术,86.5%的患者保留了结肠,69.2%的患者进行了肠切除术,记录在案的肠长度为147.6厘米(58人中有36人)。术后并发症小肠梗阻、肠梗阻、漏肠、脓肿、出血和手术部位感染(SSI)分别为12.1%、24%、17.2%、20.7%、3.4%和17.2%。在评估的19个aa中,两组之间瓜氨酸和蛋氨酸的中位数水平有显著差异(23 [14-35]vs 17 [11-23];P = 0.0031和27 [20-39]vs . 
33[24-51];p = 0.0383。 背景:长期以来,住院患者的早期营养支持一直被认为是改善患者整体预后的关键干预措施。入院的患者通常难以通过肠内途径获得足够的营养,可能需要肠外营养(PN)。中央肠外营养(CPN)需要中央通路,这历来导致对中央静脉相关血流感染(CLABSI)的担忧。获得集中访问可能需要大量资源,并可能在等待访问期间导致治疗延误。相反,外周肠外营养(PPN)可以在没有中心通道的情况下提供。在这个质量改进项目中,我们试图描述我们在一家大型城市三级医院的PPN使用情况。方法:我们对23年1月1日至23年12月31日在我院接受PN治疗的成年住院患者进行回顾性分析。如果患者在住院前有PN,则排除在审查之外。收集人口统计信息、治疗持续时间、及时给药状况和配方营养成分信息。结果:128例住院患者接受了共1302个PN天的PN治疗。患者平均年龄53.8岁(SD: 17.9),男性65例(50%)。26例(20%)患者仅接受PPN治疗,中位[IQR]长度为3[2-4]天,61例(48%)患者仅接受CPN治疗,中位长度为6[3 - 10]天。39例(30%)患者开始使用PPN,过渡到CPN的中位时间为1[1-3]天,CPN的中位总持续时间为8[5-15.5]天。少数患者接受CPN,然后过渡到PPN(2%)。结论:在我们的机构,超过50%的住院PN患者使用PPN,最常见的是在PN开始时,然后最终过渡到CPN,持续时间相对较短,为一到两周。需要进一步的研究来确定那些可能通过增加PPN量和宏量营养素来提供适当营养治疗的患者。Nicole Halton, NP, CNSC1;Marion Winkler, PhD, RD, LDN, CNSC, FASPEN2;Elizabeth Colgan, MS, RD3;Benjamin Hall, MD41Brown Surgical Associates, Providence, RI;2罗德岛州普罗维登斯罗得岛医院外科和营养支助科;3罗德岛医院,普罗维登斯,罗德岛;4布朗外科协会,布朗大学医学院,普罗维登斯,rid财政支持:无报道。背景:肠外营养(PN)为不能满足口服或肠内营养需求的胃肠功能受损患者提供足够的营养和液体。PN需要静脉通路装置,这有相关的风险,包括感染以及与治疗相关的代谢异常。在医院环境中对PN治疗的监测包括定期的血液工作,但不正确采集的样本可能导致异常的实验室结果和不必要的医疗干预。方法:在罗德岛医院进行了一项IRB豁免的质量改进研究,由管理所有成人PN的外科营养服务中心进行。本研究的目的是量化2024年1月1日至2024年8月31日期间PN患者中受污染血液样本的发生情况。收集人口统计资料、静脉通路装置和pn相关诊断。每个病人、每个医院单位的污染血液标本定量测定,并根据总PN天进行调整。比较了污染和重新抽取的血液样本的血清葡萄糖、钾和磷水平。报告描述性数据。结果:138例患者共接受了1840天的PN治疗,中位PN治疗时间为8天(IQR 9,范围2-84)。最常见的血管通路装置是双腔外周置管中心导管。大多数(63%)患者由外科团队转诊,并在外科楼层或重症监护病房接受治疗。最常见的PN相关诊断为肠梗阻、胃或小肠梗阻和短肠综合征。42例(30%)接受TPN的患者中有74例血液标本被污染,每总患者日的发生率为4%。在25个护理单位中,64%的TPN患者至少有一次血液标本被污染。与重新绘制的样本相比,污染样本显示血清葡萄糖、钾和磷显著不同(p &lt; 0.001);污染和重绘样品的葡萄糖分别为922±491和129±44 mg/dL;钾6.1±1.6 vs 3.9±0.5 mEq/L;磷4.9±1.2 vs 3.3±0.6 mg/dL。重复血样的平均间隔时间为3小时。结论:受污染的血样可导致患者护理延误、多次抽血带来的不适、不必要的医疗干预(胰岛素;停止PN),延迟及时放置PN订单,并增加感染风险。 丙氨酸和精氨酸水平与术后肠梗阻有关,亮氨酸水平与SSI相关,谷氨酸和甘氨酸水平与术后瘘管形成有关。结论:除瓜氨酸和蛋氨酸外,大多数氨基酸水平在接受腹部手术的患者和未接受腹部手术的患者之间没有显著差异。特定的氨基酸,如丙氨酸、精氨酸、亮氨酸、谷氨酸和甘氨酸可以作为术后并发症的早期指标,但需要更大规模的前瞻性试验来验证我们的发现。仓岛健人,MD, phd;恩典Trello1;詹姆斯Fox1;爱德华Portz1;Shaurya Mehta, BS1;奥斯汀Sims1;阿伦·维尔马,MD1;钱德拉谢哈拉哲学博士;Yasar Caliskan, MD1;穆斯塔法·纳扎尔,MD1;Ajay Jain, MD, DNB, mha11圣路易斯大学,圣路易斯,美国背景:边缘供肝(mdl)已被用于肝移植,以解决主要器官短缺问题。然而,mdl对缺血/再灌注损伤(IRI)非常敏感。最近的研究表明,铁下垂是一种新型的程序性细胞死亡,是IRI的潜在诱因。我们假设通过铁螯合剂去铁胺(DFO)调节铁上吊可以改变IRI的过程。方法:使用我们的新型灌注调节器官治疗与增强控制测试(PROTECT)模型(美国提供专利,US63/136,165),获得6个人mdl(肝脏A至F)并分成成对的叶。同时对两叶进行灌注,其中一叶进行DFO,另一叶作为内控。进行组织学、血清化学、死铁相关基因表达、铁积累测定和脂质过氧化测定。结果:组织学分析显示A肝和D肝大泡性脂肪变性严重(30%),B肝和E肝大泡性脂肪变性表现为轻至中度。大多数样本显示轻度炎症,主要在3区。灌注过程中未见明显坏死。Perl's普鲁士蓝染色和非血红素铁定量显示,DFO处理抑制了肝脏a到D的铁积累(p &lt; 0.05)。根据铁螯合程度,将12个叶状体分为铁含量减少(n = 4)和铁含量增加(n = 8)两组。对比分析显示,前者的HIF1-alpha、RPL8、IREB2、ACSF2、NQO1相关基因显著下调(p = 0.0338、p = 0.0085、p = 0.0138、p = 0.0138、p = 0.0209)。铁含量降低显著抑制叶内脂质过氧化(p = 0.02)。而铁螯合叶的血清AST较低,这没有达到统计学意义。结论:本研究证实了铁的积累是由恒温灌注驱动的。铁含量的降低抑制了死铁相关基因和脂质过氧化以减轻IRI。我们使用人类mdl的结果揭示了铁含量与铁下垂之间的新关系,为IRI治疗的未来发展提供了坚实的基础。Gabriella ten Have博士;Macie Mackey, BSc1;卡罗莱纳·佩雷斯,msc;John Thaden, phd;Sarah Rice, phd;Marielle Engelen博士;资金支持:国防部:CDMRP PR190829 - ASPEN Rhoads研究基金会- C. 
Richard Fleming Grant;丹尼尔·h·泰特尔鲍姆·格兰特。背景:脓毒症是危重患者感染的潜在威胁生命的并发症,其特征是几个器官严重的组织破坏,导致长期肌肉无力、疲劳和体力活动减少。最近的指南建议在脓毒性事件恢复后的头几天逐渐增加蛋白质摄入量。然而,目前尚不清楚这是如何影响全身蛋白质周转的。因此,在急性败血症-恢复猪模型中,我们研究了限制饲喂均衡氨基酸餐(AA)后败血症恢复早期的全身蛋白质代谢。方法:25头猪(±25 kg),静脉滴注铜绿假单胞菌活菌(5*108 CFU/h)致脓毒症。在t = 9 h时,开始静脉注射庆大霉素恢复。脓毒症后,每日两次递增进食(Day 1:25%, 2:50%, 3:75%, &gt;4:100%)。100%饲粮中每公斤体重含有15.4克CHO和3.47克脂肪,以及平衡的游离AA(反映肌肉AA分布)混合物(0.56克n = 3.9克AA)。在败血症前(基线)和恢复第3天,静脉注射氨基酸稳定同位素混合物作为脉冲(吸收后)。随后,吸收后2小时采集动脉血样本。用LCMS测定氨基酸浓度和富集程度。统计学:RM-ANOVA, α = 0.05。 结果:第3天,动物体重下降(2.4 [0.9,3.9]%,p = 0.0025)。与基线值相比,血浆AA浓度谱发生了变化。总体而言,非必需AA总血浆浓度没有变化。组氨酸、亮氨酸、蛋氨酸、苯丙氨酸、色氨酸和缬氨酸的血浆必需氨基酸浓度较低(p &lt; 0.05),赖氨酸较高(p = 0.0027)。异亮氨酸没有变化。我们观察到非必需氨基酸的全身产量(WBP)较低,如精氨酸(p &lt; 0.0001)、谷氨酰胺(p &lt; 0.0001)、谷氨酸(p &lt; 0.0001)、甘氨酸(p &lt; 0.0001)、脯氨酸(p &lt; 0.0001)、鸟氨酸(p = 0.0003)、牛磺酸(p &lt; 0.0001)和酪氨酸(p &lt; 0.0001)。瓜氨酸的产量没有改变。此外,必需氨基酸、异亮氨酸(p = 0.0002)、亮氨酸(p &lt; 0.0001)、缬氨酸(p &lt; 0.0001)、蛋氨酸(p &lt; 0.0001)、色氨酸(p &lt; 0.0001)和赖氨酸(p &lt; 0.0001)的WBP均较低。全身蛋白质分解和蛋白质合成也较低(p &lt; 0.0001),而净蛋白质分解没有变化。结论:我们的脓毒症-恢复猪模型表明,在脓毒症恢复早期限制食物会导致蛋白质周转减少。Gabriella ten Have博士;Macie Mackey, BSc1;卡罗莱纳·佩雷斯,msc;John Thaden, phd;Sarah Rice, phd;Marielle Engelen博士;资金支持:国防部:CDMRP PR190829 - ASPEN Rhoads研究基金会- C. Richard Fleming Grant;丹尼尔·h·泰特尔鲍姆·格兰特。背景:脓毒症是危重患者感染的潜在威胁生命的并发症。其特征是几个器官严重的组织破坏,导致长期肌肉无力、疲劳和体力活动减少(ICU-AW)。在急性脓毒症恢复的ICU-AW猪模型中,我们研究了仅含有必需氨基酸(EAA)的膳食是否可以恢复脓毒症恢复期间的代谢失调,并通过综合代谢表型进行评估1。方法:采用铜绿假单胞菌活菌(5*108 CFU/h)静脉滴注致脓毒症49头(±25 kg)。在t = 9 h时,开始静脉注射庆大霉素恢复。脓毒症后,每日2次盲目加食量(Day 1:25%, 2:50%, 3:75%, &gt;4:100%)。100%饲粮中每公斤体重含有15.4克CHO和3.47克脂肪,0.56克氮分别为EAA混合物(反映肌肉蛋白EAA, 4.3克AA)和对照组(TAA, 3.9克AA)。在败血症前(基线)和恢复第7天,静脉注射氨基酸稳定同位素混合物作为脉冲(吸收后)。随后采集动脉血2小时。用LCMS测定AA浓度和富集程度。统计学:RM-ANOVA, α = 0.05。结果:脓毒症后体重减轻,在脓毒症后第7天恢复。与基线相比,EAA组肌肉疲劳(p &lt; 0.0001)、tau-甲基组氨酸全身生成(WBP)(反映肌原纤维肌肉分解,p &lt; 0.0001)和全身净蛋白质分解(p &lt; 0.0001)增加,但对照组(肌肉疲劳:p &lt; 0.0001, tau-甲基组氨酸:p = 0.0531,净蛋白质分解(p &lt; 0.0001)减少。此外,在第7天,甘氨酸(p &lt; 0.0001)、羟脯氨酸(p &lt; 0.0001)、谷氨酸(p &lt; 0.0001)、谷氨酰胺(p &lt; 0.0001)和牛磺酸(p &lt; 0.0001)的WBP均降低,但较低(甘氨酸:p = 0.0014;羟脯氨酸(p = 0.0007);谷氨酸p = 0.0554)或更高(谷氨酰胺:p = 0.0497;牛磺酸:p &lt; 0.0001)。另外,瓜氨酸的WBP在第7天升高(p = 0.0011),而对照组的WBP较对照组低(p = 0.0078)。EAA组血浆中天冬酰胺(p &lt; 0.0001)、瓜氨酸(p &lt; 0.0001)、谷氨酰胺(p = 0.0001)、tau-甲基组氨酸(0.0319)、丝氨酸(p &lt; 0.0001)、牛磺酸(p &lt; 0.0001)和酪氨酸(p &lt; 0.0001)浓度较高。除甘氨酸、甲基组氨酸和鸟氨酸外,EAA组的清除率较低(p &lt; 0.05)。结论:我们的败血症恢复猪ICU-AW模型显示,败血症后仅饲喂eaa饲料与肌肉和全身净蛋白分解增加有关,并影响非eaa代谢。我们假设脓毒症后营养中非必需氨基酸是改善蛋白质合成代谢所必需的。Rebecca Wehner, RD, LD, CNSC1;Angela Parillo, MS, RD, LD, CNSC1;Lauren McGlade, RD, LD, CNSC1;南阳,RD, LD, CNSC1;allison Vasu-Sarver, MSN, APRN-CNP1;米歇尔·韦伯,DNP, RN, APRN-CNS, APRN-NP, CCRN, CCNS, OCN, AOCNS1;Stella Ogake, MD, fccp11俄亥俄州立大学韦克斯纳医学中心,哥伦布市,俄亥俄州,资金支持:无报道。背景:重症监护病房的患者,特别是那些机械通气的患者,经常接受不适当的肠内营养(EN)治疗。 
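The enteral-nutrition delivery audit that begins here (its methods are detailed just below) expresses, for each 24-hour window, the EN volume received as a percentage of the prescribed goal and groups the reasons feeds were held into a handful of categories. The sketch below illustrates that per-day calculation with assumed numbers; the category labels are paraphrases, not the audit's exact wording.

```python
# Hedged sketch: percent of prescribed enteral nutrition (EN) volume actually
# delivered in a 24-h window, and a simple tally of interruption reasons,
# mirroring the QI audit described below. All values are illustrative.
from collections import Counter


def percent_of_goal(delivered_ml: float, goal_ml: float) -> float:
    return 100 * delivered_ml / goal_ml if goal_ml else 0.0


# One hypothetical EN day (07:00 to 06:59), volumes in mL.
goal_volume_ml = 1500
delivered_volume_ml = 1050
print(f"{percent_of_goal(delivered_volume_ml, goal_volume_ml):.0f}% of goal")

# Reasons feeds were held, grouped into the audit's categories.
hold_reasons = ["GI issue", "procedure (surgical)", "feed initiation/titration",
                "GI issue", "procedure (non-surgical)"]
print(Counter(hold_reasons).most_common())
```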
危重病人平均只得到规定营养需求的40-50%,而ASPEN/SCCM指南鼓励努力提供&gt;80%的目标能量和蛋白质需求。帮助实现这些努力的一种方法是使用基于体积的喂养(VBF)。在我们的机构,每小时收费喂养(RBF)的方法是标准的。2022年,我们的医疗重症监护室(MICU)业务委员会询问了实施VBF的潜在好处。在改变我们的实践之前,我们收集数据来评估我们在满足EN目标方面的当前表现,并确定中断的原因。虽然文献表明,与RBF相比,VBF在EN并发症方面被认为是相对安全的,但据我们所知,目前还没有关于有胃肠(GI)不耐受风险的患者开始使用VBF的安全性的信息。因此,我们也试图确定EN是否更频繁地发生是由于胃肠道不耐受还是手术。方法:我们回顾性评价某大型三级学术医疗中心MICU的EN分娩与EN目标的比较,以及EN分娩低于目标时中断的原因。我们回顾了十天内所有MICU病人的信息。一天为从早上7点至早上6点59分接收的总EN量,以毫升计。使用QI评估表,我们收集了以下数据:24小时内的目标EN量,24小时内收到的体积,收到的体积与规定的百分比,每天喂养的小时数(或低于目标率)。将持续管饲的原因根据潜在原因分为6类:喂食起始/滴定、胃肠道问题(便秘、腹泻、呕吐、恶心、胃胀、胃残量高)、手术问题、非手术问题、机械问题和实践问题。数据输入到电子表格中,并使用描述性统计来评估结果。结果:在2022年8月为期两周的时间内,随机对接受EN治疗的MICU患者进行了10天的观察。82例患者接受EN治疗。观察了384个EN日。所有患者的平均EN交付率为70%。保留EN的原因如下:34例(23%)与喂养开始有关,55例(37%)与GI问题有关,19例(13%)与手术程序有关,32例(22%)与非手术程序有关,2例(1%)与机械问题有关,5例(3%)与实践问题有关。51例(35%)可以考虑VBF。结论:这些结果表明,由于胃肠道问题和喂养开始,我们的MICU中EN的给药量通常低于规定量。共89例(60%)。在这两种情况下,VBF协议都不会改善交付。VBF可能会导致患有胃肠道问题的患者增加不适,并且可以通过改变进展方案来改善喂养开始。由于VBF仅在35%的病例中具有潜在的益处,并且观察到高于平均水平的EN分娩,因此该方案未在观察到的MICU中实施。Delaney Adams, pharm1;Brandon Conaway, pharm2;Julie Farrar, pharm3;Saskya Byerly, MD4;迪娜·菲利贝托,MD4;Peter Fischer, MD4;Roland Dickerson, PharmD31Regional One Health, Memphis, TN;2退伍军人事务医疗中心,田纳西州孟菲斯;3田纳西大学药学院,田纳西州孟菲斯;4田纳西大学医学院,孟斐斯,TNEncore后报告:第54届重症医学学会年会。2025年2月23日至25日,奥兰多,佛罗里达州。出版物:重症监护医学。2025;53(1):在印刷中。资金支持:无报告。megan Beyer, MS, RD, LDN1;Krista Haines, DO, MA2;Suresh Agarwal, MD2;希拉里·温斯洛普,MS, RD, LDN, CNSC2;Jeroen Molinger, PhDc3;Paul Wischmeyer, MD, EDIC, FCCM, faspen41杜克大学医学院麻醉系,北卡罗来纳州达勒姆;2杜克大学医学院,北卡罗来纳州达勒姆;3杜克大学医学中心麻醉科-杜克心脏中心,北卡罗来纳州罗利;财政支持:百特,雅培。背景:静息能量消耗(REE)对重症监护病房(ICU)患者的营养管理至关重要。准确的能量需求评估对于优化营养干预措施至关重要。然而,对于不同疾病状态如何特异性影响ICU患者的REE,我们所知甚少。现有的能量建议是泛化的,没有考虑到不同疾病类型的代谢变异性。间接量热法(IC)被认为是测量稀土元素的金标准,但尚未得到充分利用。本研究通过在大型学术医疗中心使用代谢车评估分析疾病状态的REE来解决这一差距。研究结果有望为重症监护提供更精确、针对特定疾病的营养建议。 方法:这是一项汇总分析,纳入了四项前瞻性临床试验,评估了一系列疾病状态的ICU患者。本分析纳入的患者被诊断为COVID-19、呼吸衰竭、心胸(CT)手术、创伤或外科重症监护条件而入住ICU。所有患者在ICU入院72小时内进行IC以评估REE,并在患者病情稳定时进行随访测量。在每项研究中,患者都在标准的ICU护理方案下进行管理,并根据临床试验方案进行个性化或标准化的营养干预。主要结果是测量的REE,以kcal/day表示,并与体重(kcal/kg/day)标准化。报告了人口统计学和临床特征的汇总统计数据,如年龄、性别、身高、体重、BMI和合并症。采用方差分析对五种疾病状态进行比较分析,以确定REE差异的显著性。结果:本组共纳入165例ICU患者。该队列的平均年龄为58岁,其中58%为男性,42%为女性。种族统计数据包括36%的黑人,52%的白人,10%的其他背景。外科ICU组患者的热量需求最低,平均为1503千卡/天,而COVID-19患者的热量需求最高,为1982千卡/天。CT手术患者测量为1644千卡/天,呼吸衰竭患者测量为1763千卡/天,创伤患者需要1883千卡/天。方差分析显示,两组间REE差异有统计学意义(p &lt; 0.001)。当与体重(千卡/公斤/天)归一化时,REE的范围从20.3到23.5千卡/公斤/天不等,不同疾病状态之间的差异具有统计学意义(p &lt; 0.001)。结论:本研究揭示了ICU患者不同疾病状态下REE的显著差异,强调了疾病特异性能量推荐的必要性。这些发现表明,特定的疾病过程,如COVID-19和创伤,可能会增加代谢需求,而从外科手术中恢复的患者可能具有相对较低的能量需求。这些发现强调了基于患者疾病状态的个性化营养干预的重要性,以优化康复和临床结果,并防止可能对患者预后产生不利影响的喂养不足或过度喂养。结果表明,IC应在ICU环境中更广泛地实施,以指导基于实时代谢数据的精确和有效的营养输送,而不是依赖于标准的预测方程。需要进一步的研究来完善这些建议,并探索在ICU中持续监测REE和量身定制的营养需求。表1。人口学和临床特征。表2。疾病组诊断。图1所示。疾病组平均静息能量消耗。Hailee Prieto, MA, RD, LDN, CNSC1;艾米丽·麦克德莫特,MS, RD, LDN, cnsc21西北纪念医院,肖尔伍德,伊利诺伊州;2西北纪念医院,芝加哥,伊利诺伊州财政支持:无报告。背景:注册营养师和CTICU团队之间的沟通是非标准化的,在支持RD建议的正确实施方面往往无效。后期或缺乏RD支持会影响向患者提供的营养护理的质量。在23财年,CTICU营养咨询/风险周转时间在24小时内为58%,遗漏的营养咨询/风险为9%。我们的目标是通过规范研发与CTICU APRNs之间的沟通,将研发咨询/风险周转时间从58%提高到75%,将研发咨询/风险漏报率从9%降低到6%,这是基于我们部门目标的。结果指标营养风险周转时间和营养咨询周转时间。过程度量是我们在回合中研发出现的百分比。方法:我们使用DMAIC模块来尝试解决我们在CTICU中的通信问题。我们听取了客户的意见,调查了CTICU的arpn,发现一个障碍是rd在CTICU中的有限存在。我们发现CTICU aprn发现每天与他们的团队进行RD round是有价值的。然后,我们对ICU的RDs进行了文献检索,特别是心脏/胸部ICU,发现心脏手术危重患者发生营养不良的风险很高,然而,与非心脏手术或MICU患者相比,医学营养治疗的开始和营养供应的总体充分性较低。RD书面医嘱直接改善了接受营养支持的患者的预后。为了发挥最大的影响力,rd需要出现在ICU,并在做出重要决策时参与其中。营养师在ICU团队中的参与显著提高了团队实施及时、相关的营养支持干预的能力。 Diane Nowak, RD, LD, CNSC1;Mary Kronik, RD, LD, CNSC2;卡洛琳·库珀,RD, LD, CNSC3;Mary Rath, MEd, RD, LD, CNSC4;Ashley Ratliff, MS, RD, LD, CNSC4;Eva 
Leszczak-Lesko,健康科学学士,rrt41克利夫兰诊所,伊利里亚,俄亥俄州;2克利夫兰诊所,奥姆斯特德,俄亥俄州;3克利夫兰诊所,洛基河,俄亥俄州;4 .克利夫兰诊所,俄亥俄州克利夫兰。背景:间接量热法(IC)是准确测定能量消耗的金标准。该团队对目前全国范围内的IC实践进行了全面的文献回顾,表明采用IC的设施通常遵循由停留时间(LOS)决定的标准协议。重症监护室(ICU)注册营养师(RD)指导IC干预,以减少对不准确的预测方程的依赖,并在IC订单实践的帮助下明智地识别患者(1,2)。虽然候选资格由临床标准决定,但实施主要受RD时间限制。我们的项目旨在通过使用标准化的实施过程,将集成电路纳入我们的护理标准。方法:在我们拥有1299张床位的第四护理医院实施IC,包括249张ICU床位,由ICU rd和呼吸治疗师(RT)组成的多学科团队与一名医师冠军合作。在6个月的试用期后,购买了3台Cosmed QNRG+间接量热仪。由于潜在的快速临床状态变化和RD人员配置,ICU团队选择了基于订单的实践,而不是协议。该订单由有执照的独立医生(LIP)签署,包括三个部分:有指征的IC订单,营养评估咨询,以及一旦测试批准,RD向RT释放的有条件订单。订单签署后,注册护士与注册护士和注册护士合作,通过验证标准化临床标准来评估IC候选资格。如果合适,RD将在测试之前发布RT订单,以便记录呼吸机设置。为了开始测试,RD输入患者信息并校准气速,随后RT确保通气连接。接下来,RD开始测试,并在床边保持标准化的20分钟时间,以确保稳定状态不被中断。测试完成后,选择最佳5分钟平均值来获得测量的静息能量消耗(mREE)。RD考虑了多种因素来解释结果,如果有必要,还会修改营养干预措施。结果:2024年5月至2024年8月,8名ICU注册营养师完成了87项IC测量,其中包括不同ICU的患者。所有87例患者都是由RD选择的,因为他们担心进食过量或不足。83%的测量是有效的测试,79%的测量导致干预修改。面对面的时间为66小时45分钟,平均每次测试45分钟。用于解释结果和修改干预措施的额外时间为15-30分钟。结论:IC能够准确捕捉危重病人的能量消耗。研发导向的基于订单的集成电路实践使我们的机构能够成功地引入集成电路。未来从有限的IC实施到护理标准的过渡将取决于对众多挑战的考虑,包括研发时间限制和危重病护理起起落落期间的患者数量。为了配合不断变化的重症监护动态,正在积极评估人员配备水平和工作流程。表1。间接量热法(IC)核对表。图1所示。测试无效的IC结果。图2。具有有效测试的IC结果。图3。IC适应症和禁忌症。图4。IC史诗订单。Rebecca Frazier, MS, RD, CNSC1;Chelsea Heisler, MD, MPH1;Bryan Collier, DO, FACS, FCCM11Carilion Roanoke Memorial Hospital, Roanoke, vnc,财务支持:无报告。背景:充足的能量摄入和适当的常量营养素组成是患者康复的重要组成部分;然而,人们发现预测方程的精度是可变的。间接量热法(IC)可以深入了解作为代谢燃料和热量需求的主要营养基质,通常可以识别喂养过量和不足。虽然IC被认为是确定静息能量消耗的金标准,但它在成本、设备可行性和人员时间限制方面存在挑战,即呼吸治疗(RT)。我们的假设是:注册营养师(RD)主导的IC测试可以以一种安全可行的方式进行,而不会出现不必要的并发症风险。此外,IC将显示出更高的热量需求,特别是需要至少7天呼吸支持的患者。方法:一组rd在单一机构筛选外科ICU患者。 插管至少3天的患者被认为有资格进行检测。排除标准包括PEEP≥10,吸入氧分数≥60%,Richmond躁动镇静量表≥1,胸管漏气,体外膜氧合使用,1小时内1°C变化。测试使用基于RD和患者可用性的Q-NRG+便携式代谢监测仪(Baxter)完成。将测试结果与基于宾夕法尼亚州立大学方程(39次测试)的计算需求进行比较。过喂/欠喂定义为与方程结果偏差&gt;15%。能量需求的平均差异分析采用标准配对双尾t检验计算&lt;/= 7总通风天数和&gt;7通风天数。结果:30例患者行IC检测;总共完成了39次测试。RD引导的IC测试没有并发症,并且在完成5次测试后需要最小的RT介入。总体而言,56.4%的IC与呼吸机天数无关,显示过度喂养。此外,33.3%的试验表明适宜摄食(稀土含量计算值的85-115%),10.3%的试验表明摄食不足。当按呼吸天数分层时(&gt;7天vs≤7天),发现相似的结果,66%的IC测试与计算的热量需求有15%;54.4 ~ 60.0%为过食,12.5 ~ 6.7%为欠食。结论:估算热量需求的公式提供了不一致的结果。与IC相比,无论通气天数如何,营养方程式都类似地低估和高估了营养需求。尽管缺乏统计学意义,但营养不良的影响已被充分记录并具有广泛的临床意义。只需最少的培训,IC就可以在RD和床边注册护士的陪同下安全地进行。利用研发来协调和执行IC测试是一个可行的过程,可以最大限度地提高人员效率,并允许立即调整营养计划。IC作为营养评估的金标准,应在外科ICU患者中进行,以协助制定营养治疗算法。多洛雷斯Rodriguez1;Mery Guerrero2;玛丽亚Centeno2;芭芭拉Maldonado2;桑德拉Herrera2;Sergio santana31厄瓜多尔抗癌协会,瓜亚基尔,瓜亚亚斯;2瓜亚斯瓜亚基尔solca;3 .哈瓦那大学,La Habana, Ciudad de La Habana背景:2022年,国际癌症研究机构(International Agency for Research on cancer - globocan)报告了全球近2000万例新发癌症病例,其中厄瓜多尔有30888例。乳腺癌、前列腺癌和胃癌是诊断最多的癌症类型。血液病(OHD)显著影响患者的营养状况。ELAN厄瓜多尔2014年的研究涉及5000多名患者,发现37%的参与者营养不良,在多动症患者中这一比例上升至65%。由FELANPE在2019年至2020年期间进行的拉丁美洲肿瘤营养不良研究(LASOMO)显示,在10个拉丁美洲国家的52个卫生中心的1842名患者中,营养不良的发生率为59.1%。本研究旨在介绍目前在厄瓜多尔医院接受治疗的患者中与OHD相关的营养不良状况。方法:LASOMO研究的厄瓜多尔部分在2019年至2020年期间进行,作为前面提到的区域流行病学倡议的一部分。本研究是一项为期一天的全国性多中心调查,涉及Guayas省(3)、Manabí省(1)和Azuay省(1)五家医院的血液病(OHD)患者的卫生中心和专业服务。血液病(OHD)患者的营养状况使用Detsky等人主观整体评估(SGA)的B + C评分进行评估。该研究包括2019年10月至11月期间入住临床、外科、重症监护和骨髓移植(BMT)部门的18岁及以上的男性和女性患者。参与是自愿的,患者通过签署同意书提供知情同意。使用基于变量类型的位置、分散和聚合统计来分析数据。关系的性质和强度采用卡方检验进行独立性评估,显著性水平为&lt;5%用于识别显著关联。计算营养不良的优势比及其相关的95%置信区间。结果:共入组390例患者,其中女性63.6%,男性36.4%,平均年龄55.3±16.5岁;47.2%的患者年龄在60岁及以上。最常见的肿瘤部位包括肾脏、泌尿道、子宫、卵巢、前列腺和睾丸,占所有病例的18.7%(见表1)。化疗是主要的肿瘤治疗方法,接受化疗的患者占42.8%。49.7%的受访患者营养不良,其中14.4%被归类为严重营养不良(见图1)。 这些发现表明,早期营养缺乏可能会增加CLABSI的风险,而且这种风险与所提供的营养补充类型无关。表1。患者特征、临床和营养结果。表2。中心接入装置类型和微生物与提供营养支持方式的关系。Yonatan Oki, MD1;Faya Nuralda Sitompul21ASPEN,雅加达,雅加达;2 . 
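The Latin American oncology malnutrition survey above evaluates associations with chi-square tests of independence and reports odds ratios for malnutrition with 95% confidence intervals. The sketch below shows both calculations on a 2×2 table, using the standard log-odds (Woolf) interval; the cell counts are assumed for illustration and are not the survey's data.

```python
# Hedged sketch: chi-square test of independence and an odds ratio with a
# Woolf (log) 95% confidence interval for a 2x2 table. Counts are placeholders.
import math
from scipy.stats import chi2_contingency

# Rows: exposed / not exposed; columns: malnourished / well nourished.
a, b, c, d = 60, 35, 45, 60
table = [[a, b], [c, d]]

chi2, p_value, dof, expected = chi2_contingency(table)

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"chi2={chi2:.2f}, p={p_value:.3f}")
print(f"OR={odds_ratio:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```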
Poster Abstracts

P1–P34 Parenteral Nutrition Therapy

P35–P52 Enteral Nutrition Therapy

P53–P83 Malnutrition and Nutrition Assessment

P84–P103 Critical Care and Critical Health Issues

P104–P131 GI, Obesity, Metabolic, and Other Nutrition Related Concepts

P132–P165 Pediatric, Neonatal, Pregnancy, and Lactation

Parenteral Nutrition Therapy

Sarah Williams, MD, CNSC1; Angela Zimmerman, RD, CNSC2; Denise Jezerski, RD, CNSC2; Ashley Bestgen, RD, CNSC2

1Cleveland Clinic Foundation, Parma, OH; 2Cleveland Clinic Foundation, Cleveland, OH

Financial Support: Morrison Healthcare.

Background: Essential fatty acid deficiency (EFAD) is a rare disorder among the general population but can be a concern in patients reliant on home parenteral nutrition (HPN), particularly those who are not receiving intravenous lipid emulsions (ILE). In the US, the only ILE available until 2016 was soybean oil based (SO-ILE), which contains more than adequate amounts of essential fatty acids, including alpha-linolenic acid (ALA, an omega-3 fatty acid) and linoleic acid (LA, an omega-6 fatty acid). In 2016, a mixed ILE containing soybean oil, medium chain triglycerides, olive oil and fish oil, became available (SO, MCT, OO, FO-ILE). However, it contains a lower concentration of essential fatty acids compared to SO-ILE, raising theoretical concerns for development of EFAD if not administered in adequate amounts. Liver dysfunction is a common complication in HPN patients that can occur with soybean based ILE use due to their pro-inflammatory properties. Short-term studies and case reports in patients receiving SO, MCT, OO, FO-ILE have shown improvements in liver dysfunction for some patients. Our study evaluates the long-term impact of SO, MCT, OO, FO-ILE in our HPN patient population.

Methods: This single-center, retrospective cohort study was conducted at the Cleveland Clinic Center for Human Nutrition using data from 2017 to 2020. It involved adult patients who received HPN with SO, MCT, OO, FO-ILE for a minimum of one year. The study assessed changes in essential fatty acid profiles, including triene-tetraene ratios (TTRs) and liver function tests (LFTs) over the year. Data was described as mean and standard deviation for normal distributed continuous variables, medians and interquartile range for non-normally distributed continuous variables and frequency for categorical variables. The Wilcoxon signed rank test was used to compare the baseline and follow-up TTR values (mixed time points). The Wilcoxon signed rank test with pairwise comparisons was used to compare the LFTs at different time points and to determine which time groups were different. P-values were adjusted using Bonferroni corrections. Ordinal logistic regression was used to assess the association between lipid dosing and follow-up TTR level. Analyses were performed using R software and a significance level of 0.05 was assumed for all tests.

Results: Out of 110 patients screened, 26 met the inclusion criteria of having baseline and follow-up TTRs. None of the patients developed EFAD, and there was no significant difference in the distribution of TTR values between baseline and follow-up. Additionally, 5.5% of patients reported adverse GI symptoms while receiving SO, MCT, OO, FO-ILE. A separate subgroup of 14 patients who had abnormal LFTs, including bilirubin, alkaline phosphatase (AP), aspartate aminotransferase (AST) or alanine aminotransferase (ALT), were evaluated. There was a statistically significant improvement of AST and ALT and decreases in bilirubin and AP that were not statistically significant.

Conclusion: We found that using SO, MCT, OO, FO-ILE as the primary lipid source did not result in EFAD in any of our subset of 26 patients, and TTRs remained statistically unchanged after introduction of SO, MCT, OO, FO-ILE. Additionally, there was a statistically significant decrease in AST and ALT following the start of SO, MCT, OO, FO-ILE. While liver dysfunction from PN is multifactorial, the use of fish oil based lipids has been shown to improve LFT results due to a reduction of phytosterol content as well as less pro-inflammatory omega-6 content when compared to SO-ILEs. A significant limitation was the difficulty in obtaining TTR measurements by home health nursing in the outpatient setting, which considerably reduced the number of patients who could be analyzed for EFAD.

Table 1. Summary Descriptive Statistics of 26 Patients With Baseline and Follow Up TTR.

Table 2. Change in LFTs From Baseline Levels Compared to 3 Months, 6 Months, 9 Months and 12 Months.

Wendy Raissle, RD, CNSC1; Hannah Welch, MS, RD2; Jan Nguyen, PharmD3

1Optum Infusion Pharmacy, Buckeye, AZ; 2Optum Infusion Pharmacy, Phoenix, AZ; 3Optum Infusion Pharmacy, Mesa, AZ

Financial Support: None Reported.

Background: Aluminum is a non-nutrient contaminant of parenteral nutrition (PN) solution. The additive effects of PN components can contribute to toxicity and cause central nervous system issues as well as contribute to metabolic bone disease as observed in adults with osteomalacia. When renal function and gastrointestinal mechanisms are impaired, aluminum can accumulate in the body. Aluminum toxicity can result in anemia, dementia, bone disease and encephalopathy. Symptoms of aluminum toxicity may include mental status change, bone pain, muscle weakness, nonhealing fractures and premature osteoporosis. In July 2004, the U.S. Food and Drug Administration (FDA) mandated labeling of aluminum content with a goal to limit exposure to less than 5mCg/kg/day. Adult and pediatric dialysis patients, as well as patients of all ages receiving PN support, have an increased risk of high aluminum exposure. Reducing PN additives high in aluminum is the most effective way to decrease aluminum exposure and risk of toxicity. This abstract presents a unique case where antiperspirant use contributed to an accumulation of aluminum in an adult PN patient.

Methods: A patient on long-term PN (Table 1) often had results of low ionized calcium of < 3 mg/dL, leading to consideration of other contributing factors. In addition, patient was taking very high doses of vitamin D daily (by mouth) to stay in normal range (50,000IU orally 6 days/week). Risk factors for developing metabolic bone disease include mineral imbalances of calcium, magnesium, phosphorus, vitamin D, corticosteroid use, long-term PN use and aluminum toxicity (Table 2). A patient with known osteoporosis diagnosis had two stress fractures in left lower leg. Aluminum testing was completed in order to identify other factors that may be contributing to low ionized calcium values and osteoporosis. During patient discussion, the patient revealed they used an aluminum-containing antiperspirant one time daily. The range of aluminum content in antiperspirants is unknown, but studies show that minimal absorption may be possible, especially in populations with kidney insufficiency.

Results: After an elevated aluminum value was reported on July 3, 2023 (Figure 1), the patient switched to a non-aluminum-containing antiperspirant. Aluminum values were rechecked at 3 and 7 months. The results indicate that the patient's antiperspirant may have been contributing to the aluminum burden through skin absorption. Antiperspirant use alone may not lead to aluminum toxicity, but it can add to the total daily aluminum exposure.

Conclusion: Preventing aluminum accumulation is vital for patients receiving long-term PN support because of their heightened risk of aluminum toxicity. Potential sources of exposure outside of PN include dialysis, processed food, aluminum foil, cosmetic products (antiperspirants, deodorant, toothpaste), medications (antacids), vaccinations, and occupational exposure such as aluminum welding and certain processing plants. The aluminum content of medications and PN additives varies by brand and amount. Clinicians should review all potential aluminum-containing sources and assess ways to reduce aluminum exposure and prevent potential aluminum toxicity in long-term PN patients.
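
As a rough illustration of the kind of review suggested above, the sketch below sums labeled aluminum content across PN additives and compares the per-kilogram exposure with the FDA limit of 5 mcg/kg/day. The component names, concentrations, volumes, and patient weight are hypothetical placeholders, not values from this case (the actual prescription is summarized in Table 2).

```python
# Illustrative sketch only: estimating total daily aluminum exposure from
# labeled PN additive content and comparing it with the FDA limit of
# 5 mcg/kg/day. All component names, aluminum concentrations, volumes, and
# the patient weight are hypothetical placeholders.

FDA_LIMIT_MCG_PER_KG = 5.0

# (labeled aluminum content in mcg/L, daily volume used in L) -- hypothetical
pn_additives = {
    "calcium gluconate": (5000.0, 0.010),
    "potassium phosphate": (1500.0, 0.015),
    "multivitamin injection": (70.0, 0.010),
    "trace element solution": (500.0, 0.001),
}

patient_weight_kg = 60.0  # hypothetical

total_mcg = sum(conc * vol for conc, vol in pn_additives.values())
exposure_mcg_per_kg = total_mcg / patient_weight_kg

print(f"Estimated aluminum exposure: {exposure_mcg_per_kg:.2f} mcg/kg/day")
if exposure_mcg_per_kg > FDA_LIMIT_MCG_PER_KG:
    print("Exceeds the 5 mcg/kg/day FDA labeling threshold")
else:
    print("Within the 5 mcg/kg/day FDA labeling threshold")
```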

Table 1. Patient Demographics.

Table 2. Aluminum Content in PN Prescription.

Figure 1. Aluminum Lab Value Result.

Haruka Takayama, RD, PhD1; Kazuhiko Fukatsu, MD, PhD2; Midori Noguchi, BA3; Nana Matsumoto, RD, MS2; Tomonori Narita, MD4; Reo Inoue, MD, PhD3; Satoshi Murakoshi, MD, PhD5

1St. Luke's International Hospital, Chuo-ku, Tokyo; 2The University of Tokyo, Bunkyo-ku, Tokyo; 3The University of Tokyo Hospital, Bunkyo-ku, Tokyo; 4The University of Tokyo, Chuo-City, Tokyo; 5Kanagawa University of Human Services, Yokosuka-city, Kanagawa

Financial Support: None Reported.

Background: Our previous study demonstrated that beta-hydroxy-beta-methylbutyrate (HMB)-supplemented total parenteral nutrition (TPN) partially restores the gut-associated lymphoid tissue (GALT) atrophy observed in standard TPN-fed mice. Oral intake of HMB is now popular among bodybuilders and athletes. Herein, we examined whether oral supplementation with HMB could increase GALT mass in mice eating dietary chow ad libitum.

Methods: Six-week-old male Institute of Cancer Research (ICR) mice were divided into the Control (n = 9), H600 (n = 9), and H2000 (n = 9) groups. All mice were allowed chow and water ad libitum for 7 days. The H600 and H2000 mice were given water containing Ca-HMB at 3 mg/mL or 10 mg/mL, respectively, while the Controls drank normal tap water. Because these mice drank approximately 6-7 mL of water per day, the H600 and H2000 groups received approximately 600 and 2000 mg/kg of Ca-HMB per day, respectively. After 7 days of treatment, all mice were euthanized by cardiac puncture under general anesthesia, and the whole small intestine was harvested for GALT cell isolation. GALT cell numbers were evaluated in each compartment (Peyer's patches, PPs; intraepithelial space, IE; and lamina propria, LP). Nasal washings, bronchoalveolar lavage fluid (BALF), and intestinal washings were also collected for IgA measurement by ELISA. The Kruskal-Wallis test was used for all parameter analyses, and the significance level was set at less than 5%.
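
A minimal sketch of the dosing arithmetic and group comparison described above is given below. It assumes a body weight of roughly 0.03 kg (typical for a 6-week-old ICR mouse but not stated in the abstract), and the GALT cell counts are placeholder numbers, not study data.

```python
# Sketch of the dose arithmetic and the three-group Kruskal-Wallis comparison.
# Body weight (~0.03 kg) is an assumption; cell counts are placeholders.
from scipy.stats import kruskal

water_intake_ml_per_day = 6.0
body_weight_kg = 0.03  # assumed typical 6-week-old ICR mouse weight

for label, conc_mg_per_ml in [("H600", 3.0), ("H2000", 10.0)]:
    dose_mg_per_kg = water_intake_ml_per_day * conc_mg_per_ml / body_weight_kg
    print(f"{label}: ~{dose_mg_per_kg:.0f} mg/kg/day Ca-HMB")

# Kruskal-Wallis test across the three groups (placeholder counts, x10^7/body)
control = [2.1, 2.4, 1.9, 2.2, 2.0, 2.3, 2.5, 1.8, 2.1]
h600 = [2.2, 2.0, 2.6, 1.9, 2.3, 2.1, 2.4, 2.0]
h2000 = [2.0, 2.5, 2.2, 1.8, 2.4, 2.1, 2.3, 1.9, 2.2]
stat, p = kruskal(control, h600, h2000)
print(f"Kruskal-Wallis H = {stat:.2f}, p = {p:.3f} (significance threshold 0.05)")
```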

Results: There were no significant differences in the number of GALT cells at any site among the 3 groups (Table 1). Likewise, mucosal IgA levels did not differ between any two groups (Table 2).

Conclusion: Oral intake of HMB does not affect GALT cell numbers or mucosal IgA levels when mice are fed a normal diet orally. It appears that the beneficial effects of HMB on the GALT can be expected only in parenterally fed mice. In a future study, we should examine the influence of intravenous HMB in an orally fed model.

Table 1. GALT Cell Number (x10^7/body).

Table 2. IgA Levels.

Median (interquartile range). Kruskal-Wallis test. n: Control = 9, H600 = 8, H2000 = 9.

Nahoki Hayashi, MS1; Yoshikuni Kawaguchi, MD, PhD, MPH, MMA2; Kenta Murotani, PhD3; Satoru Kamoshita, BA1

1Medical Affairs Department, Research and Development Center, Chiyoda-ku, Tokyo; 2Hepato-Biliary-Pancreatic Surgery Division, Bunkyo-ku, Tokyo; 3School of Medical Technology, Kurume, Fukuoka

Financial Support: Otsuka Pharmaceutical Factory, Inc.

Background: The American Society for Parenteral and Enteral Nutrition guideline recommends a target energy intake of 20 to 30 kcal/kg/day in patients undergoing surgery. Infectious complications reportedly decreased when target energy and protein intakes were achieved in the early period after gastrointestinal cancer surgery. However, no studies have investigated the association of prescribed parenteral energy doses with clinical outcomes in patients who did not receive oral or tube feeding in the early period after gastrointestinal cancer surgery.

Methods: Data of patients who underwent gastrointestinal cancer surgery during 2011–2022 and fasted for 7 days or longer after surgery were extracted from a nationwide medical claims database. The patients were divided into 3 groups based on the mean prescribed parenteral energy dose during the 7 days after surgery, as follows: the very-low group (<10 kcal/kg/day), the low group (10–20 kcal/kg/day), and the moderate group (≥20 kcal/kg/day). Multivariable logistic regression model analyses were performed using in-hospital mortality, postoperative complications, length of hospital stay, and total in-hospital medical cost as the objective variables, and the 3 groups and confounding factors as the explanatory variables.
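
The sketch below illustrates, on synthetic data, how patients could be binned into the three dose groups and how a multivariable logistic regression for a binary outcome such as in-hospital mortality yields adjusted odds ratios relative to the very-low group. The DataFrame, column names, and confounders (age, sex) are assumptions for illustration, not the claims-database schema or the study's full covariate set.

```python
# Illustrative sketch only: dose grouping and multivariable logistic
# regression on synthetic data. Adjusted odds ratios are obtained by
# exponentiating the fitted coefficients.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "mean_kcal_kg_day": rng.uniform(2, 30, n),  # mean prescribed dose, days 1-7
    "age": rng.integers(40, 90, n),
    "sex": rng.choice(["M", "F"], n),
})
df["died_in_hospital"] = (rng.random(n) < 0.05).astype(int)  # synthetic outcome

def dose_group(kcal_kg_day):
    if kcal_kg_day < 10:
        return "very_low"
    if kcal_kg_day < 20:
        return "low"
    return "moderate"

df["dose_group"] = df["mean_kcal_kg_day"].apply(dose_group)

model = smf.logit(
    "died_in_hospital ~ C(dose_group, Treatment(reference='very_low')) + age + C(sex)",
    data=df,
).fit(disp=0)

print(np.exp(model.params))      # adjusted odds ratios vs. the very-low group
print(np.exp(model.conf_int()))  # 95% confidence intervals
```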

Results: Of the 18,294 study patients, the number of patients in the very low, low, and moderate groups was 6,727, 9,760, and 1,807, respectively. The median prescribed energy doses on the 7th day after surgery were 9.2 kcal/kg, 16 kcal/kg, and 27 kcal/kg in the very low, low, and moderate groups, respectively. The adjusted odds ratio (95% confidence interval) for in-hospital mortality with reference to the very low group was 1.060 (1.057–1.062) for the low group and 1.281 (1.275–1.287) for the moderate group. That of postoperative complications was 1.030 (0.940–1.128) and 0.982 (0.842–1.144) for the low and moderate groups, respectively. The partial regression coefficient (95% confidence interval) for length of hospital stay (day) with reference to the very low group was 2.0 (0.7–3.3) and 3.2 (1.0–5.5), and that of total in-hospital medical cost (US$) was 1,220 (705–1,735) and 2,000 (1,136–2,864), for the low and moderate groups, respectively.

Conclusion: Contrary to the guideline recommendation, prescribed energy doses of ≥10 kcal/kg/day were associated with increases in in-hospital mortality, length of hospital stay, and total in-hospital medical cost. Our findings question the efficacy of the guideline-recommended energy intake for patients during the first 7 days after gastrointestinal cancer surgery.

Jayme Scali, BS1; Gaby Luna, BS2; Kristi Griggs, MSN, FNP-C, CRNI3; Kristie Jesionek, MPS, RDN, LDN4; Christina Ritchey, MS, RD, LD, CNSC, FASPEN, FNHIA5

1Optum Infusion Pharmacy, Thornton, PA; 2Optum Infusion Pharmacy, Milford, MA; 3Optum Infusion Pharmacy, Murphy, NC; 4Optum Infusion Pharmacy, Franklin, TN; 5Optum Infusion Pharmacy, Bulverde, TX

Financial Support: None Reported.

Background: Home parenteral nutrition (HPN) is a life-sustaining nutrition support therapy that is administered through a central venous access device (CVAD). Caregivers of children on HPN are initially trained to provide CVAD care and therapy administration by their clinical team or home infusion nurse. Often there is a gap in training when the patient is ready to assume responsibility for CVAD care. Without proper training, patients are at significant risk of complications such as bloodstream infections and catheter occlusions. The purpose of this study was twofold: 1) to explore the caregiver's perspective about current and future CVAD training practices and 2) to evaluate the need for a proactive formalized CVAD training program when care is transitioned from caregiver to patient.

Methods: An 8-question survey was created using an online software tool. The target audience included caregivers of children receiving HPN. A link to the survey was sent via email and posted on various social media platforms that support the HPN community. The survey was conducted from June 17 to July 18, 2024. Respondents who were not caregivers of a child receiving HPN via a CVAD were excluded.

Results: The survey received 114 responses, of which 86 met the inclusion criteria and were analyzed. The distribution of children with a CVAD receiving HPN was evenly weighted between 0 and 18 years of age. In the majority of cases, initial training regarding HPN therapy and CVAD care was conducted by the HPN clinic/hospital resource/learning center or home infusion pharmacy (Table 1). Forty-eight percent of respondents indicated their HPN team never offers reeducation or shares best practices (Figure 1). Most respondents indicated that the caregiver is the best individual to train their child on CVAD care and safety (Figure 2). In addition, 60% of respondents said they would want their child to participate in CVAD training if offered (Figure 3).

Conclusion: This survey confirms that most caregivers anticipate training their child to perform CVAD care when it is determined the child is ready for this responsibility. One challenge to this provision of training is that almost half of the respondents stated they never receive reeducation or best practice recommendations from their team. This finding demonstrates a need for a formalized training program to assist caregivers when transitioning CVAD care to the patient. Since most respondents reported relying on their intestinal rehabilitation or GI/motility clinic for CVAD-related concerns, these centers would be the best place to establish a transition training program. A limitation of the study is that it was distributed only via select social platforms, so users outside of these platforms were not captured. Additional studies would be beneficial in determining the best sequence and cadence for training content.

Table 1. Central Venous Access Device (CVAD) Training and Support Practices.

Figure 1. How Often Does Your HPN Team Offer Reeducation or Share Best Practices?

Figure 2. Who is Best to Train Your Child on CVAD Care Management and Safety?

Figure 3. If Formalized CVAD Training is Offered, Would You Want Your Child to Participate?

Laryssa Grguric, MS, RDN, LDN, CNSC1; Elena Stoyanova, MSN, RN2; Crystal Wilkinson, PharmD3; Emma Tillman, PharmD, PhD4

1Nutrishare, Tamarac, FL; 2Nutrishare, Kansas City, MO; 3Nutrishare, San Diego, CA; 4Indiana University, Carmel, IN

Financial Support: None Reported.

Background: Long-term parenteral nutrition (LTPN) within the home is a lifeline for many patients throughout the United States. Patients utilize central venous access devices (CVAD) to administer LTPN. Central line-associated bloodstream infection (CLABSI) is a serious risk associated with patients who require LTPN. Rates of CLABSI in LTPN populations range from 0.9-1.1 per 1000 catheter days. The aim of this study was to determine the incidence of CLABSI in a cohort of patients serviced by a national home infusion provider specializing in LTPN and identify variables associated with an increased incidence of CLABSI.

Methods: Electronic medical records of LTPN patients with intestinal failure were retrospectively reviewed from March 2023 to May 2024 for patient demographics, anthropometric data, nursing utilization, parenteral nutrition prescription including lipid type, length of therapy, geographic distribution, prescriber specialty, history of CLABSI, blood culture results as available, and use of ethanol locks. Patient zip codes were used to identify rural health areas, as defined by the US Department of Health & Human Services. Patients were divided into two groups: 1) patients who had at least one CLABSI and 2) patients with no CLABSI during the study period. Demographic and clinical variables were compared between the two groups. Nominal data were analyzed with Fisher's exact test; continuous data were analyzed with Student's t-test when normally distributed and with the Mann-Whitney U test otherwise.
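As an illustration of the statistical plan described above (not the authors' analysis code), the sketch below applies Fisher's exact test, Student's t-test, and the Mann-Whitney U test with scipy; all variable names and values are hypothetical.

```python
# Minimal sketch of the group comparisons described in the Methods.
from scipy import stats

# Nominal variable, e.g. ethanol lock use (rows) by CLABSI status (columns)
table = [[12, 40],
         [32, 114]]
odds_ratio, p_fisher = stats.fisher_exact(table)

# Approximately normal continuous variable, e.g. patient weight (kg)
weight_clabsi = [72.1, 80.5, 68.3, 90.2, 77.8]
weight_no_clabsi = [61.0, 64.2, 70.5, 58.9, 66.7]
t_stat, p_ttest = stats.ttest_ind(weight_clabsi, weight_no_clabsi)

# Non-normal continuous variable, e.g. catheter dwell duration (days)
dwell_clabsi = [120, 300, 95, 410, 60]
dwell_no_clabsi = [800, 650, 900, 400, 1200]
u_stat, p_mwu = stats.mannwhitneyu(dwell_clabsi, dwell_no_clabsi)

print(f"Fisher p={p_fisher:.3f}, t-test p={p_ttest:.3f}, Mann-Whitney p={p_mwu:.3f}")
```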

Results: We identified 198 persons maintained on LTPN during the study period. The overall CLABSI rate for this cohort was 0.49 per 1000 catheter days. Forty-four persons on LTPN had one or more CLABSI and 154 did not have a CLABSI during the study period. Persons who experienced a CLABSI weighed significantly more, had fewer days of infusing injectable lipid emulsions (ILE), and had a shorter catheter dwell duration compared with those who did not have a CLABSI (Table 1). There was no significant difference between the CLABSI and no-CLABSI groups in length of time on LTPN, location of consumer (rural versus non-rural), utilization of home health services, number of days parenteral nutrition (PN) was infused, or use of ethanol locks (Table 1).

Conclusion: In this retrospective cohort study, we report a CLABSI rate of 0.49 per 1000 catheter days, which is lower than previously published CLABSI rates for similar patient populations. Patient weight, days of infusing ILE, and catheter dwell duration were significantly different between those that did and did not have a CLABSI in this study period. Yet, variables such as use of ethanol lock and proximity to care providers that had previously been reported to impact CLABSI were not significantly different in this cohort. An expanded study with more LTPN patients or a longer study duration may be necessary to confirm these results and their impact on CLABSI rates.

Table 1. Long Term Parenteral Nutrition (LTPN) Characteristics.

Silvia Figueiroa, MS, RD, CNSC1; Stacie Townsend, MS, RD, CNSC2

1MedStar Washington Hospital Center, Bethesda, MD; 2National Institutes of Health, Bethesda, MD

Financial Support: None Reported.

Background: In hospitalized patients, lipid emulsions constitute an essential component of balanced parenteral nutrition (PN). Soy oil-based lipid injectable emulsions (SO-ILE) were traditionally administered as part of PN formulations primarily as a source of energy and for prevention of essential fatty acid deficiency. Contemporary practice has evolved to incorporate mixtures of different lipid emulsions, including a combination of soy, MCT, olive, and fish oils (SO, MCT, OO, FO-ILE). Evidence suggests that the use of SO, MCT, OO, FO-ILEs may alter essential fatty acid profiles, impacting hepatic metabolism and other processes associated with clinical benefits. The aim of this project was to compare essential fatty acid profile (EFAP), triglycerides (TGL), liver function tests (LFTs), and total bilirubin (TB) levels in adult patients receiving parenteral nutrition with SO-ILE or SO, MCT, OO, FO-ILE in a unique hospital entirely dedicated to conducting clinical research.

Methods: This was a retrospective chart review from 1/1/2019 to 12/31/2023 of adult patients in our hospital who received PN with SO-ILE or SO, MCT, OO, FO-ILE and had EFAP assessed after 7 days of receiving ILE. Data included demographic, clinical, and nutritional parameters. Patients with no laboratory markers, those on propofol, and those who received both ILE products in the 7 days prior to collection of EFAP were excluded. Data was statistically analyzed using Fisher's tests and Mann-Whitney U tests as appropriate.

Results: A total of 42 patient charts were included (14 SO-ILE; 28 SO, MCT, OO, FO-ILE). Group characteristics can be found in Table 1. Patients on SO-ILE received more ILE (0.84 vs 0.79 g/kg/day, p < 0.0001). TGL levels changed significantly after start of ILE (p < 0.0001). LFTs were found to be elevated in 57% of patients in the SO-ILE group and 60% in the SO, MCT, OO, FO-ILE group, while TB was increased in 21% and 40% of the patients respectively (Figure 1). Further analysis showed no significant differences in LFTs and TB between the two groups. Assessment of EFAP revealed a significant difference in the levels of DHA, docosenoic acid, and EPA, which were found to be higher in the group receiving SO, MCT, OO, FO-ILE. Conversely, significant differences were also observed in the levels of linoleic acid, homo-γ-linolenic acid, and total omega-6 fatty acids, with those being higher in patients administered SO-ILE (Figure 2). No differences were observed between groups regarding the presence of essential fatty acid deficiency, as indicated by the triene:tetraene ratio.

Conclusion: In our sample analysis, LFTs and TB levels did not differ significantly between SO-ILE and SO, MCT, OO, FO-ILE groups. Increased levels of DHA, docosenoic acid, and EPA were found in the SO, MCT, OO, FO-ILE group, while linoleic acid, homo-γ-linolenic acid, and total omega-6 fatty acids tended to be higher in the SO-ILE group. Although in our sample the SO, MCT, OO, FO-ILE group received a lower dosage of ILE/kg/day, there were no differences in the rate of essential fatty acid deficiency between groups.

Table 1. General Characteristics (N = 42).

Figure 1. Liver Function Tests (N = 39).

Figure 2. Essential Fatty Acid Profile (N = 42).

Kassandra Samuel, MD, MA1; Jody (Lind) Payne, RD, CNSC2; Karey Schutte, RD3; Kristen Horner, RDN, CNSC3; Daniel Yeh, MD, MHPE, FACS, FCCM, FASPEN, CNSC3

1Denver Health, St. Joseph Hospital, Denver, CO; 2Denver Health, Parker, CO; 3Denver Health, Denver, CO

Financial Support: None Reported.

Background: Early nutritional support in hospitalized patients has long been established as a key intervention to improve overall patient outcomes. Patients admitted to the hospital often have barriers to receiving adequate nutrition via the enteral route and may be candidates for parenteral nutrition (PN). Central parenteral nutrition (CPN) requires central access, which has historically led to concerns for central line-associated bloodstream infection (CLABSI). Obtaining central access can be resource-intensive and may result in treatment delays while awaiting access. Conversely, peripheral parenteral nutrition (PPN) can be delivered without central access. In this quality improvement project, we sought to characterize PPN utilization at a large urban tertiary hospital.

Methods: We performed a retrospective review of adult inpatients receiving PN at our facility from 1/1/23–12/31/23. Patients were excluded from review if PN had been initiated prior to hospitalization. Demographic information, duration of treatment, timely administration status, and information regarding PN formula composition were collected.

Results: A total of 128 inpatients received PN for a total of 1302 PN days. The mean age of these patients was 53.8 years (SD: 17.9) and 65 (50%) were male. Twenty-six (20%) patients received only PPN for a median [IQR] length of 3 [2–4] days, and 61 (48%) patients received only CPN for a median length of 6 [3–10] days. Thirty-nine (30%) patients were started on PPN, with a median time to transition to CPN of 1 [1–3] day and a median total duration of CPN of 8 [5–15.5] days. A small minority of patients (2%) received CPN and then transitioned to PPN.

Conclusion: At our institution, PPN is utilized in more than 50% of all inpatient PN, most commonly at PN initiation and then eventually transitioning to CPN for a relatively short duration of one to two weeks. Additional research is required to identify those patients who might avoid central access by increasing PPN volume and macronutrients to provide adequate nutrition therapy.

Nicole Halton, NP, CNSC1; Marion Winkler, PhD, RD, LDN, CNSC, FASPEN2; Elizabeth Colgan, MS, RD3; Benjamin Hall, MD4

1Brown Surgical Associates, Providence, RI; 2Department of Surgery and Nutritional Support at Rhode Island Hospital, Providence, RI; 3Rhode Island Hospital, Providence, RI; 4Brown Surgical Associates, Brown University School of Medicine, Providence, RI

Financial Support: None Reported.

Background: Parenteral nutrition (PN) provides adequate nutrition and fluids to patients with impaired gastrointestinal function who cannot meet their nutritional needs orally or enterally. PN requires a venous access device, which carries associated risks including infection, and the therapy itself is associated with metabolic abnormalities. Monitoring of PN therapy in the hospital setting involves regular blood work, yet improperly collected samples can lead to abnormal laboratory results and unnecessary medical interventions.

Methods: An IRB-exempt quality improvement study was conducted at Rhode Island Hospital by the Surgical Nutrition Service, which manages all adult PN. The purpose of the study was to quantify the occurrence of contaminated blood samples among PN patients between January 1, 2024, and August 31, 2024. Demographic data, venous access device, and PN-related diagnoses were collected. Quantification of contaminated blood specimens was determined per patient, per hospital unit, and adjusted for total PN days. Comparisons were made between serum glucose, potassium, and phosphorus levels from contaminated and redrawn blood samples. Descriptive data are reported.

Results: 138 patients received PN for a total of 1840 days with a median length of PN therapy of 8 days (IQR 9, range 2-84). The most common vascular access device was dual lumen peripherally inserted central catheter. The majority (63%) of patients were referred by surgery teams and received care on surgical floors or critical care units. The most frequent PN related diagnoses were ileus, gastric or small bowel obstruction, and short bowel syndrome. There were 74 contaminated blood specimens among 42 (30%) patients receiving TPN for a rate of 4% per total patient days. Of 25 nursing units, 64% had at least one occurrence of contaminated blood specimens among TPN patients on that unit. Contaminated samples showed significantly different serum glucose, potassium, and phosphorus compared to redrawn samples (p < 0.001); glucose in contaminated vs redrawn samples was 922 ± 491 vs 129 ± 44 mg/dL; potassium 6.1 ± 1.6 vs 3.9 ± 0.5 mEq/L; phosphorus 4.9 ± 1.2 vs 3.3 ± 0.6 mg/dL. The average time delay between repeated blood samples was 3 hours.

Conclusion: Contaminated blood samples can lead to delays in patient care, discomfort from multiple blood draws, unnecessary medical interventions (insulin; discontinuation of PN), delay in placement of timely PN orders, and increased infection risk. Nursing re-education on proper blood sampling techniques is critical for reducing contamination occurrences. All policies and procedures will be reviewed, and an educational program will be implemented. Following this, occurrences of blood contamination during PN will be reassessed.

Hassan Dashti, PhD, RD1; Priyasahi Saravana1; Meghan Lau1

1Massachusetts General Hospital, Boston, MA

Encore Poster

Presentation: ASN Nutrition 2024.

Publication: Saravana P, Lau M, Dashti HS. Continuous glucose monitoring in adults with short bowel syndrome receiving overnight infusions of home parenteral nutrition. Eur J Clin Nutr. 2024 Nov 23. doi: 10.1038/s41430-024-01548-z. Online ahead of print. PMID: 39580544.

Financial Support: ASPEN Rhoads Research Foundation.

Maria Romanova, MD1; Azadeh Lankarani-Fard, MD2

1VA Greater Los Angeles Healthcare System, Oak Park, CA; 2VA Greater Los Angeles Healthcare System, Los Angeles, CA

Financial Support: None Reported.

Background: Malnutrition is a serious complication of the hospital stay. Parenteral nutrition (PN) is the most sophisticated way of addressing it but requires ongoing monitoring. In our medical center, PN provision is guided by an interdisciplinary Nutrition Support Team (NST). In 2024 we began creating a dashboard to monitor the safety and utilization of PN at the Greater Los Angeles VA. Here we discuss the collaborative process of developing the dashboard and its first use.

Methods: A dashboard was constructed using data from the VA electronic health record, with Microsoft Power BI technology used to customize data visualization. The NST worked closely with the facility's Data Analytics team to modify and validate the dashboard to accommodate the needs of the group. The dashboard was maintained behind a VA firewall and was accessible only to members of the NST. It presented patient-level data for all patients for whom a Nutrition Support consult had been placed during the previous 2 years. The variables included the date of admission, date of consult request, the treating specialty at the time of request, demographics, admission diagnosis, discharge diagnosis, number of orders for PPN/TPN, number of blood glucose values >200 mg/dL after admission, number of serum phosphorus values < 2.5 mg/dL, number of serum potassium values < 3.5 mmol/L, any discharge diagnosis of refeeding syndrome (ICD-10 E87.8), micronutrient levels during admission, and any discharge diagnosis of infection. The ICD-10 codes used to capture infection were bacteremia (R78.81), sepsis (A41.*), and catheter-associated line infection (T80.211*); the asterisk (*) denotes any code within that ICD-10 classification. The dashboard was updated once a week. The NST reviewed the information on the dashboard to ensure validity and refined it as needed.
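The dashboard itself was built in Microsoft Power BI, but as a rough illustration of the variables described above, the hypothetical Python sketch below shows how the laboratory flags and wildcard ICD-10 infection codes could be computed from extracted records; the column names and example data are assumptions for illustration only.

```python
# Illustrative sketch only; not the dashboard's actual logic or data model.
import pandas as pd

labs = pd.DataFrame({
    "patient_id": [1, 1, 2, 2],
    "test":       ["glucose", "phosphorus", "glucose", "potassium"],
    "value":      [245.0, 2.1, 150.0, 3.2],
})

dx = pd.DataFrame({
    "patient_id": [1, 2],
    "icd10":      ["A41.9", "T80.211A"],   # discharge diagnosis codes
})

# Lab-based counts mirroring the dashboard variables
n_hyperglycemia = labs.query("test == 'glucose' and value > 200").groupby("patient_id").size()
n_low_phos      = labs.query("test == 'phosphorus' and value < 2.5").groupby("patient_id").size()
n_low_k         = labs.query("test == 'potassium' and value < 3.5").groupby("patient_id").size()

# Infection flag: bacteremia (R78.81), sepsis (A41.*), or catheter-associated
# line infection (T80.211*); the regex mirrors the wildcard on the code family
infection = dx["icd10"].str.match(r"(R78\.81|A41\.|T80\.211)")
print(dx.assign(infection_flag=infection))
print(n_hyperglycemia)
```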

Results: The initial data extraction identified duplicate consult requests (as patients changed treating specialties during the same admission) and duplicate PPN/TPN orders (as formulations were frequently modified before administration). The Data Analytics team worked to reduce these duplicates. The NST also collaborated with the Data Analytics team to modify their existing documentation to better capture the data needed going forward. Dashboard data were verified by direct chart review. Between April 2022 and April 2024, 68 consults were placed from the acute care setting and 58 patients received PPN or TPN during this period. Thirty-five patients experienced hyperglycemia. Two patients were deemed to have experienced refeeding syndrome at the time of discharge. Fourteen episodes of infection were noted in those who received PPN/TPN, but the etiology was unclear from the dashboard alone and required additional chart review.

Conclusion: A dashboard can facilitate monitoring of Nutrition Support services in the hospital. Refinement of the dashboard requires collaboration between the clinical team and the data analytics team to ensure validity and workload capture.

Michael Fourkas, MS1; Julia Rasooly, MS1; Gregory Schears, MD2

1PuraCath Medical Inc., Newark, CA; 2Department of Anesthesiology and Perioperative Medicine, Mayo Clinic, Rochester, MN

Financial Support: Funding of the study has been provided by Puracath Medical.

Background: Intravenous catheters can provide venous access for drug and nutrition delivery over extended periods but carry the risk of central line-associated bloodstream infections (CLABSI) due to inadequate asepsis. Needleless connectors (NC), which provide access for injection of medications, are known to be one of the major sources of contamination. Studies demonstrate that current methods of disinfecting connectors, such as a 15-second antiseptic wipe, do not guarantee complete disinfection inside the connectors. With the rise of superbugs such as Candida auris, there is an urgent need for better aseptic technique compliance and non-antibiotic disinfection methods. Ultraviolet-C (UV-C) light is an established technology commonly used in hospital settings for disinfection of equipment and rooms. In this study, we investigate the efficacy of our novel UV-C light disinfection device on UV-C light-transmissive NCs inoculated with common CLABSI-associated organisms.

Methods: Staphylococcus aureus (ATCC #6538), Candida albicans (ATCC #10231), Candida auris (CDC B11903), Escherichia coli (ATCC #8739), Pseudomonas aeruginosa (ATCC #9027), and Staphylococcus epidermidis (ATCC #12228) were used as test organisms. A total of 29 NC samples were tested for each organism, with 3 positive controls and 1 negative control. Each UV-C light-transmissive NC was inoculated with 10 µL of cultured inoculum (7.00–7.66 log) and exposed to an average of 48 mW/cm2 of UV light for 1 second using our in-house UV light disinfection device (Firefly™). After UV disinfection, 10 mL of 0.9% saline solution was flushed through the NC and filtered through a 0.45 µm membrane. The membrane filter was plated onto an agar medium matched to the organism and incubated overnight at 37°C for S. aureus, E. coli, S. epidermidis, and P. aeruginosa, and for two days at room temperature for C. albicans and C. auris. Positive controls followed the same procedure without exposure to UV light and were diluted 100-fold before being spread onto agar plates in triplicate. The negative controls followed the same procedure without inoculation. After incubation, the number of colonies on each plate was counted and recorded. Log reduction was calculated as the log10 ratio of the positive control concentration to the sample concentration in CFU/mL; a value of 1 CFU per 10 mL was used for samples with no growth (complete kills).
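A minimal sketch of the log-reduction arithmetic described above, assuming hypothetical colony counts and the 1 CFU/10 mL detection-limit convention for complete kills; it is not the study's analysis code.

```python
# Illustrative log-reduction calculation; all counts are hypothetical.
import math

def log_reduction(positive_control_cfu_per_ml: float, sample_cfu_per_ml: float) -> float:
    """Log10 reduction of the treated sample relative to the untreated positive control."""
    return math.log10(positive_control_cfu_per_ml / sample_cfu_per_ml)

# Positive control: colonies counted on a 100x-diluted plate, back-calculated to CFU/mL
positive_control = 196 * 100            # hypothetical count -> 1.96e4 CFU/mL

# Treated sample with no detectable growth: assume the 1 CFU per 10 mL flush
# detection limit described in the Methods, i.e. 0.1 CFU/mL
complete_kill = 1 / 10

print(round(log_reduction(positive_control, complete_kill), 2))
```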

Results: Using our UV light disinfection device, we achieved an average log reduction greater than 4 and complete kills for all test organisms. The log reductions for S. aureus, C. albicans, C. auris, E. coli, P. aeruginosa, and S. epidermidis were 5.29, 5.73, 5.05, 5.24, 5.10, and 5.19, respectively.

Conclusion: We demonstrated a greater than 4-log reduction in common CLABSI-associated organisms using our UV light disinfection device and UV-C transmissive NCs. By injecting inoculum directly inside the NC, we demonstrated that disinfection inside NCs can be achieved, which is not possible with conventional scrubbing methods. A one-second NC disinfection time would cause less disruption to hospital workflow, particularly in intensive care units, where highly effective and efficient disinfection is essential for adoption of the technology.

Table 1. Log Reduction of Tested Organisms After Exposure to 48 mW/cm2 UV-C for 1 Second.

Yaiseli Figueredo, PharmD1

1University of Miami Hospital, Miami, FL

Financial Support: None Reported.

Background: Octreotide belongs to the somatostatin analog class and is used off-label for malignant bowel obstruction (MBO). Somatostatin analogs (SSA) inhibit the release and action of multiple hormones, reducing gastric secretions, peristalsis, and splanchnic blood flow while enhancing water and electrolyte absorption. National Comprehensive Cancer Network (NCCN) guidelines recommend octreotide 100-300 mcg subcutaneously two to three times a day or 10-40 mcg/hour by continuous infusion for the management of malignant bowel obstruction and, if prognosis is greater than 8 weeks, consideration of long-acting release (LAR) or depot injection. Using octreotide as an additive to parenteral nutrition solutions has been debated because of concerns that formation of a glycosyl octreotide conjugate may decrease octreotide's efficacy. However, other compatibility studies have found little octreotide loss over 48 hours in TPN solutions at room temperature in ambient room light. At the University of Miami Hospital, octreotide is used as an additive to total parenteral nutrition (TPN) solutions to reduce gastrointestinal secretions in patients with malignant bowel obstruction. The starting dose is 300 mcg, and the dose is increased in 300 mcg increments to a maximum of 900 mcg if output remains uncontrolled/elevated. The objective of this study is to evaluate the effectiveness and safety of octreotide as an additive to TPN in hospitalized patients with malignant bowel obstruction.

Methods: A three-year retrospective chart review (June 2021-June 2024) was conducted to evaluate the effectiveness and safety of octreotide as an additive to TPN in hospitalized patients with MBO diagnosis at UMH. The following information was obtained from chart review: age, gender, oncologic diagnosis, TPN indication, TPN dependency, octreotide doses used, baseline and final gastrointestinal secretion output recorded, type of venting gastrostomy in place, length of hospital stay, and baseline and final hepatic function tests.

Results: A total of 27 patients with malignant bowel obstruction requiring TPN with an octreotide additive were identified. All patients were started on octreotide 300 mcg/day added to a 2-in-1 TPN solution. Gastrointestinal secretion output was reduced on average by 65% among all patients, with a final average daily output of 540 mL; the baseline average output was 1,518 mL/day. The average length of inpatient treatment was 23 days (range 3-98 days). Liver function tests (LFTs) were assessed at baseline and at the last available inpatient value for the admission. Four of the 27 patients (15%) were observed to have a significant rise in liver enzymes, greater than three times the upper limit of normal.

Conclusion: Octreotide represents a valuable addition to the limited pharmacological options for managing malignant bowel obstruction. Its ability to reduce gastrointestinal secretions by 65% on average as observed in this retrospective chart review can significantly alleviate symptoms and improve patient care. Using octreotide as an additive to TPN solutions for patients with malignant bowel obstructions who are TPN dependent reduces the number of infusions or subcutaneous injections patients receive per day. According to octreotide's package insert, the incidence of hepato-biliary complications is up to 63%. The finding that 15% of patients from this retrospective chart review had significant liver enzyme elevations remains an important monitoring parameter to evaluate.

Pavel Tesinsky, Assoc. Prof., MUDr.1; Jan Gojda, Prof., MUDr, PhD2; Petr Wohl, MUDr, PhD3; Katerina Koudelkova, MUDr4

1Department of Medicine, Prague, Hlavni mesto Praha; 2Department of Medicine, University Hospital, 3rd Faculty of Medicine Charles University in Prague, Praha, Hlavni mesto Praha; 3Institute for Clinical and Experimental Medicine, Prague, Hlavni mesto Praha; 4Department of Medicine, University Hospital and 3rd Faculty of Medicine in Prague, Prague, Hlavni mesto Praha

Financial Support: The Registry was supported by Takeda and Baxter scientific grants.

Background: We describe trends in indications, syndromes, performance, weaning, and complications of patients on total HPN, based on an updated 30-year analysis and stratification of patients on home parenteral nutrition (HPN) in the Czech Republic.

Methods: Records from the HPN National Registry, based on data from the HPN centers, were analyzed for the period 2007–2023. Catheter-related sepsis (CRS), catheter occlusions, and thrombotic complications were analyzed for time to event using the competing-risks regression (Fine and Gray) model. Other data are presented as medians or means with 95% CI (p < 0.05 considered significant).

Results: The incidence rate of HPN is 1.98 per 100,000 inhabitants (population 10.5 million). Lifetime dependency is expected in 20% of patients, potential weaning in 40%, and 40% of patients are palliative. Of 1838 records representing almost 1.5 million catheter days, short bowel syndrome (SBS) was present in 672 patients (36.6%), intestinal obstruction in 531 patients (28.9%), and malabsorption in 274 patients (14.9%); the remaining 361 patients (19.6%) were split among fistulas, dysphagia, or unspecified conditions. The majority of SBS cases were type I (57.8%) and type II (20.8%). Mean length of residual intestine was 104.3 cm (35.9–173.4 cm), with longer remnants in type I SBS. Dominant indications for HPN were pseudoobstruction (35.8%), non-malignant surgical conditions (8.9%), Crohn disease (7.3%), and mesenteric occlusion (6.8%). Mobility for a substantial part of the day was reported by 77.8% of HPN patients, and economic activity and independence by 162 (24.8%) of 653 economically capable patients. A tunneled catheter was primarily used in 49.1%, a PICC in 24.3%, and an IV port in 19.8% of patients. Commercially prepared bags were used in 69.7% and pharmacy-prepared admixtures in 24.7% of patients. A total of 66.9% of patients were administered 1 bag per day, 7 days a week. The sepsis rate per 1000 catheter days decreased from 0.84 in 2013 to 0.15 in 2022. The catheter occlusion rate decreased from 0.152 to 0.10 per 1000 catheter days, and the thrombotic complication rate from 0.05 to 0.04. Prevalence of metabolic bone disease is 15.6%, and prevalence of PNALD is 22.3%. In the first 12 months, 28% of patients achieved intestinal autonomy, increasing to 45% after 5 years. Patient survival is 62% at 1 year, 45% at 5 years, and 35% at 10 years. Teduglutide has been indicated in 36 patients to date, with reduction of the daily HPN volume to 60.3% on average.

Conclusion: The prevalence of HPN patients in the Czech Republic has increased over the past ten years, in keeping with the incidence rate. The majority of patients are expected to terminate HPN within the first year. The risk of CRS decreased significantly in the past five years and remains low, while catheter occlusion and thrombotic complications show a stable trend. Teduglutide significantly reduced the required IV volume.

Figure 1. Per-Year Prevalence of HPN Patients and Average Number of Catheter Days Per Patient (2007 - 2022).

Figure 2. Annual Incidence of HPN Patients (2007 - 2022).

Figure 3. Catheter-Related Bloodstream Infections (Events per 1,000 Catheter-Days).

Jill Murphree, MS, RD, CNSC, LDN1; Anne Ammons, RD, LDN, CNSC2; Vanessa Kumpf, PharmD, BCNSP, FASPEN2; Dawn Adams, MD, MS, CNSC2

1Vanderbilt University Medical Center, Nashville, TN; 2Vanderbilt University Medical Center, Nashville, TN

Financial Support: None Reported.

Background: Determining macronutrient goals in patients requiring home parenteral nutrition (HPN) can be difficult due to various factors. While indirect calorimetry is the gold standard for measuring energy expenditure, it is not readily available in the outpatient setting. Therefore, clinicians typically rely on less accurate weight-based equations for assessment of protein and energy requirements. Energy goals are also impacted by the targeted desire for weight loss, weight gain, or weight maintenance. Patients receiving HPN may consume some oral dietary intake and experience variable degrees of macronutrient absorption. These factors, as well as underlying clinical conditions, can significantly impact protein and energy requirements and may change over the course of HPN therapy. The purpose of this study was to evaluate the range of protein and energy doses prescribed in patients receiving HPN managed by an interdisciplinary intestinal failure clinic at a large academic medical center.

Methods: Patient demographics, including age, gender, and PN indication/diagnosis, were retrospectively obtained for all patients discharged home with PN between May 2021 and May 2023 utilizing an HPN patient database. Additional information was extracted from the electronic medical record at the start of HPN and at 2-week, 2 to 3 month, and 6-month intervals following discharge home, including height, actual weight, target weight, HPN energy dose, HPN protein dose, and whether the patient was eating. Data collection ended at completion of HPN therapy or after up to 6 months of HPN. All data were entered and stored in an electronic database.

Results: During the study period, 248 patients were started on HPN and 56 of these patients received HPN for at least 6 months. Patient demographics are included in Table 1. At the start of HPN, prescribed energy doses ranged from 344 to 2805 kcal/d (6 kcal/kg/d to 45 kcal/kg/d) and prescribed protein doses ranged from 35 to 190 g/d (0.6 g/kg/d to 2.1 g/kg/d). There continued to be a broad range of prescribed energy and protein doses at 2-week, 2 to 3 month, and 6-month intervals of HPN. Figures 1 and 2 provide the prescribed energy and protein doses for all patients and for those who are eating and not eating. For patients not eating, the prescribed range for energy was 970 to 2791 kcal/d (8 kcal/kg/d to 45 kcal/kg/d) and for protein was 40 to 190 g/d (0.6 g/kg/d to 2.0 g/kg/d) at the start of PN therapy. The difference between actual weight and target weight was assessed at each study interval. Over the study period, patients demonstrated a decrease in the difference between actual and target weight to suggest improvement in reaching target weight (Figure 3).

Conclusion: The results of this study demonstrate a wide range of energy and protein doses prescribed in patients receiving HPN. This differs from use of PN in the inpatient setting, where weight-based macronutrient goals tend to be more defined. Macronutrient adjustments may be necessary in the long-term setting when patients are consuming oral intake, for achievement/maintenance of target weight, or for changes in underlying conditions. Patients receiving HPN require an individualized approach to care that can be provided by interdisciplinary nutrition support teams specializing in intestinal failure.

Table 1. Patient Demographics Over 6-Month Study Period.

Figure 1. Parenteral Nutrition (PN) Energy Range.

Figure 2. Parenteral Nutrition (PN) Protein Range.

Figure 3. Difference Between Actual Weight and Target Weight.

Jennifer Lachnicht, RD, CNSC1; Christine Miller, PharmD2; Jessica Younkman, RD, CNSC2

1Soleo Home Infusion, Frisco, TX; 2Soleo Health, Frisco, TX

Financial Support: None Reported.

Background: Parenteral nutrition (PN) has been initiated at home since the 1990s, though some clinicians prefer hospital initiation due to risks such as refeeding syndrome (RS). A key factor in successful home PN initiation is careful evaluation by an experienced nutrition support clinician, particularly assessment of RS risk. In 2020, ASPEN published consensus recommendations for identifying patients at risk for RS and guidelines for initiating and advancing nutrition. Home PN initiation offers advantages, including avoiding hospitalization, reducing healthcare costs, minimizing hospital-acquired infections, and improving quality of life. Literature suggests a savings of $2,000 per day when PN is started at home. This study aims to determine the risk and incidence of RS in patients who began PN at home, based on the 2020 ASPEN Consensus Recommendations for RS.

Methods: A national home infusion provider's nutrition support service reviewed medical records for 27 adult patients who initiated home PN between September 2022 and June 2024. Patients were evaluated for RS risk before PN initiation and the actual incidence of RS based on pre- and post-feeding phosphorus, potassium, and magnesium levels using ASPEN 2020 criteria. Initial lab work was obtained before and after PN initiation. Refeeding risk was categorized as mild, moderate, moderate to severe, or severe based on initial nutrition assessment, including BMI, weight loss history, recent caloric intake, pre-feeding lab abnormalities, fat/muscle wasting, and high-risk comorbidities. The percent change in phosphorus, potassium, and magnesium was evaluated and categorized as mild, moderate, or severe if levels decreased after therapy start. Initial PN prescriptions included multivitamins and supplemental thiamin per provider policy and consensus recommendations.

Results: The average baseline BMI for the study population was 19.6 kg/m² (range 12.7-31.8, median 18.9). Weight loss was reported in 88.9% of patients, averaging 22%. Little to no oral intake for at least 5-10 days before assessment was reported in 92.3% of patients. Initial lab work was obtained within 5 days of therapy start in 96.2% of cases, with 18.5% showing low prefeeding electrolytes. All patients had high-risk comorbidities. RS risk was categorized as mild (4%), moderate (48%), moderate to severe (11%), and severe (37%). Home PN was successfully initiated in 25 patients (93%). Two patients could not start PN at home: one due to persistently low pre-PN electrolyte levels despite IV repletion, and one due to not meeting home care admission criteria. Starting dextrose averaged 87.2 g/day (range 50-120, median 100). Average total starting calories were 730 kcal/day, representing 12.5 kcal/kg (range 9-20, median 12). Initial PN formula electrolyte content included potassium (average 55.4 mEq/day, range 15-69, median 60), magnesium (average 11.6 mEq/day, range 4-16, median 12), and phosphorus (average 15.6 mmol/day, range 8-30, median 15). Labs were drawn on average 4.3 days after therapy start. Potassium, magnesium, and phosphorus levels were monitored for decreases ≥10% of baseline to detect RS. Decreases in magnesium and potassium were classified as mild (10-20%) and were each experienced by 4% of patients. Eight patients (32%) had a ≥10% decrease in phosphorus: 4 mild (10-20%), 2 moderate (20-30%), and 2 severe (>30%).
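As a rough illustration of the severity bands used above (decreases ≥10% flagged; 10-20% mild, 20-30% moderate, >30% severe), the hypothetical sketch below classifies a percent drop in serum phosphorus from baseline; the function name and example values are assumptions, not part of the study protocol.

```python
# Illustrative classification of a post-PN electrolyte drop by percent decrease
# from baseline, following the bands described in the Results.
def classify_refeeding_drop(baseline: float, follow_up: float) -> str:
    """Return the severity band for a decrease from baseline, or 'none'."""
    if follow_up >= baseline:
        return "none"
    pct_drop = (baseline - follow_up) / baseline * 100
    if pct_drop < 10:
        return "none"          # < 10% decrease: not flagged
    if pct_drop <= 20:
        return "mild"          # 10-20% decrease
    if pct_drop <= 30:
        return "moderate"      # 20-30% decrease
    return "severe"            # > 30% decrease

# Example: phosphorus falling from 3.8 mg/dL at baseline to 2.5 mg/dL after PN start
print(classify_refeeding_drop(3.8, 2.5))   # -> "severe" (about a 34% decrease)
```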

Conclusion: Home initiation of PN can be safely implemented with careful monitoring and evaluation of RS risk. This review showed a low incidence of RS based on ASPEN criteria, even in patients at moderate to severe risk prior to home PN initiation. Close monitoring of labs and patient status, along with adherence to initial prescription recommendations, resulted in successful PN initiation at home in 92.5% of patients.

Dana Finke, MS, RD, CNSC1; Christine Miller, PharmD1; Paige Paswaters, RD, CNSC1; Jessica Younkman, RD, CNSC1

1Soleo Health, Frisco, TX

Financial Support: None Reported.

Background: Iron deficiency anemia is common in home parenteral nutrition (HPN) patients (Hwa et al., 2016). Post iron infusion, hypophosphatemia due to fibroblast growth factor 23 is a known side effect of some IV iron formulations (Wolf et al., 2019). Ferric carboxymaltose can cause serum phosphorus levels to drop below 2 mg/dL in 27% of patients (Onken et al., 2014). Hypophosphatemia (< 2.7 mg/dL) can lead to neurological, neuromuscular, cardiopulmonary, and hematologic issues, and long-term effects like osteopenia and osteoporosis (Langley et al., 2017). This case series reviews the occurrence and clinical implications of transient serum phosphorus drops in patients receiving ferric carboxymaltose and HPN.

Methods: A retrospective case series review was performed for three patients who were administered ferric carboxymaltose while on HPN therapy. Serum phosphorus levels were measured at baseline prior to initial iron dose, within 1 week post injection, and at subsequent time intervals up to 4 months post initial injection. Data was collected on the timing, magnitude, and duration of any phosphorus decreases, and any associated clinical symptoms or complications. Patient records were also reviewed to evaluate any correlations with HPN composition or phosphorus dosing.

Results: All three patients reviewed exhibited short-term drops in serum phosphorus levels following ferric carboxymaltose injection (Table 1). All had baseline serum phosphorus levels within normal limits prior to the initial dose of ferric carboxymaltose. All cases involved multiple doses of ferric carboxymaltose, which contributed to the fluctuations in phosphorus levels. The average drop in serum phosphorus from baseline to the lowest point was 50.3%. The lowest recorded phosphorus level among these three patients was 1.4 mg/dL, in a patient who received more than two doses of ferric carboxymaltose. In two cases, HPN phosphorus was increased in response to serum levels, and in one case no HPN changes were made; serum phosphorus levels returned to normal in all cases despite the varied interventions. Despite the low phosphorus levels, none of the patients reported significant symptoms of hypophosphatemia during the monitoring periods. Ferric carboxymaltose substantially affects serum phosphorus in HPN patients, consistent with existing literature. The need for vigilant monitoring is highlighted; patients receiving HPN are closely monitored by a trained nutrition support team with frequent lab monitoring, whereas lab monitoring in patients receiving ferric carboxymaltose who are not on HPN may be less common. The lowest level recorded was 1.4 mg/dL, indicating potential severity. Despite significant drops, no clinical symptoms were observed, suggesting subclinical hypophosphatemia may be common. In two of the reviewed cases, hypophosphatemia was addressed with incremental increases in the patient's HPN formula. Note that there are limitations to phosphorus management in HPN due to compatibility and stability issues, and alternative means of supplementation may be necessary depending on the patient's individual formula.

Conclusion: Three HPN patients receiving ferric carboxymaltose experienced transient, generally mild reductions in serum phosphorus. Monitoring is crucial, but the results from this case series suggest that clinical complications are rare. Adjustments to HPN or additional supplementation may be needed based on individual patient needs, with some cases self-correcting over time.

Table 1. Timeline of Iron Injections and the Resulting Serum Phosphorus Levels and HPN Formula Adjustments.

Danial Nadeem, MD1; Stephen Adams, MS, RPh, BCNSP2; Bryan Snook2

1Geisinger Wyoming Valley, Bloomsburg, PA; 2Geisinger, Danville, PA

Financial Support: None Reported.

Background: Ferric carboxymaltose (FC) is a widely used intravenous iron formulation, primarily employed in the treatment of iron deficiency. It offers significant benefits, particularly in cases where oral iron supplementation proves ineffective or is not well-tolerated. However, an important potential adverse effect associated with FC use is hypophosphatemia. This condition has been observed in multiple patients following their treatment with FC. The paper discusses the potential mechanisms leading to this adverse effect and its significant implications for patient care.

Methods: A middle-aged female with a history of malnutrition and iron deficiency receiving parenteral nutrition at home had received multiple doses of Venofer in the past, with the last dose given in 2017, to which she developed an anaphylactic reaction. She was therefore switched to ferric carboxymaltose (FCM) therapy. However, upon receiving multiple doses of FCM in 2018, the patient developed significant hypophosphatemia. Once hypophosphatemia was noted, adjustments were made to the patient's total parenteral nutrition (TPN) regimen to increase the total phosphorus content in an effort to treat the low phosphate levels. The patient also received continued doses of FCM in subsequent years, with persistent hypophosphatemia despite repletion.

Results: Ferric carboxymaltose (FC) is a widely used intravenous iron therapy for the treatment of iron deficiency. It is particularly beneficial in cases where oral iron supplementation is ineffective or not tolerated. FC works by delivering iron directly to the macrophages in the reticuloendothelial system. The iron is then released slowly for use by the body, primarily for the production of hemoglobin. However, recent studies have highlighted the potential adverse effect of hypophosphatemia associated with the use of FC. Hypophosphatemia induced by FC is thought to be caused by an increase in the secretion of the hormone fibroblast growth factor 23 (FGF23). FGF23 is a hormone that regulates phosphate homeostasis. When FGF23 levels rise, the kidneys increase the excretion of phosphate, leading to lower levels of phosphate in the blood. There are many implications of hypophosphatemia for patient care. Symptoms of hypophosphatemia can include muscle weakness, fatigue, bone pain, and confusion. In severe cases, persistent hypophosphatemia can lead to serious complications such as rhabdomyolysis, hemolysis, respiratory failure, and even death. Therefore, it is crucial for clinicians to be aware of the potential risk of hypophosphatemia when administering FC. Regular monitoring of serum phosphate levels is recommended in patients receiving repeated doses of FC. Further research is needed to fully understand the mechanisms underlying this adverse effect and to develop strategies for its prevention and management.

Conclusion: In conclusion, while FC is an effective treatment for iron deficiency, it is important for clinicians to be aware of the potential risk of hypophosphatemia. Regular monitoring of serum phosphate levels is recommended in patients receiving repeated doses of FC. Further research is needed to fully understand the mechanisms underlying this adverse effect and to develop strategies for its prevention and management.

Table 1. Phosphorus Levels and Iron Administration.

Table 1 shows the response of serum phosphorus levels in a patient given multiple doses of intravenous iron over time.

Ashley Voyles, RD, LD, CNSC1; Jill Palmer, RD, LD, CNSC1; Kristin Gillespie, MD, RD, LDN, CNSC1; Suzanne Mack, MS, MPH, RD, LDN, CNSC1; Tricia Laglenne, MS, RD, LDN, CNSC1; Jessica Monczka, RD, CNSC, FASPEN1; Susan Dietz, PharmD, BCSCP1; Kathy Martinez, RD, LD1

1Option Care Health, Bannockburn, IL

Financial Support: None Reported.

Background: Home parenteral nutrition (HPN) can be successfully initiated in the home setting with careful evaluation and management by an experienced nutrition support team (NST).1,2 Safe candidates for HPN initiation include medically stable patients with an appropriate indication, a safe environment, and the means for reliable follow-up. However, some patients are not appropriate to start directly with HPN due to logistical reasons or the risk of refeeding syndrome (RFS).2 Consensus Recommendations for RFS provide guidance regarding recognizing and managing risk.3 An experienced NST that provides individualized care can recommend intravenous (IV) hydration before initiating HPN to expedite the initiation of therapy and normalize blood chemistry to mitigate the risk of RFS. The purpose of this study is to evaluate the impact of IV hydration on adult patients managed by a home infusion NST who received IV hydration prior to initiating HPN. The proportion of patients who received IV hydration prior to HPN, the reason for initiating IV hydration, and the impact this intervention may have had on their care and outcomes will be described.

Methods: This retrospective review includes 200 HPN patients 18 years of age and older initiated on HPN therapy with a national home infusion pharmacy over a 6-month period from January 1, 2024, to June 30, 2024. Data collection included baseline demographics, indication for HPN, risk of RFS, days receiving IV hydration prior to initiating HPN, the number of rehospitalizations within the first 2 weeks, whether they received IV hydration, and if so, the indication for IV hydration and the components of the orders. Data was collected via electronic medical records and deidentified into a standardized data collection form.

Results: Of the 200 total patients, 19 (9.5%) received IV hydration prior to HPN. Of these 19 patients, 16 were female, and 3 were male (Table 1). The most common indications for HPN were bariatric surgery complication (5), intestinal failure (4), and oncology diagnosis (4) (Figure 1). Among these patients, 9 (47%) were at moderate RFS risk and 10 (53%) were at high RFS risk. The indications for IV hydration included 7 (37%) due to electrolyte abnormalities/RFS risk, 5 (26%) due to delay in central line placement, and 7 (37%) due to scheduling delays (Figure 2). IV hydration orders included electrolytes in 15 (79%) of the orders. All orders without electrolytes (4) had an indication related to logistical reasons (Figure 3). All 19 patients started HPN within 7 days of receiving IV hydration. Two were hospitalized within the first two weeks of therapy with admitting diagnoses unrelated to HPN.

Conclusion: In this group of patients, HPN was successfully initiated in the home setting when managed by an experienced NST, preventing unnecessary hospitalizations. This study demonstrated that safe initiation of HPN may include IV hydration with or without electrolytes first, either to mitigate RFS or due to logistical reasons, when started on HPN within 7 days. The IV hydration orders were individualized to fit the needs of each patient. This data only reflects IV hydration dispensed through the home infusion pharmacy and does not capture IV hydration received at an outside clinic. This patient population also does not include those deemed medically unstable for HPN or those not conducive to starting in the home setting for other factors. Future research should account for these limitations and detail IV hydration components, dosing, and frequency of orders.

Table 1. Demographics.

Figure 1. HPN Indications of Patients Receiving IV Hydration.

Figure 2. Indication for IV Hydration and Refeeding Risk.

Figure 3. Indications and Types of IV Hydration.

Emily Boland Kramer, MS, RD, LDN, CNSC1; Jessica Monczka, RD, CNSC, FASPEN1; Tricia Laglenne, MS, RD, LDN, CNSC1; Ashley Voyles, RD, LD, CNSC1; Susan Dietz, PharmD, BCSCP1; Kathy Martinez, RD, LD1

1Option Care Health, Bannockburn, IL

Financial Support: None Reported.

Background: Parenteral nutrition (PN) is a critical therapy for patients who are unable to absorb nutrients adequately via the gastrointestinal tract.1 PN is complex, with 10 or more individually dosed components in each order, which inherently increases the risk of dosing errors.2 This study analyzes the PN orders at hospital discharge received by a home infusion provider and identifies the incidence of omission of standard components, as determined by the ASPEN Recommendations on Appropriate Parenteral Nutrition Dosing.3 The primary objective was to identify missing sodium, potassium, magnesium, calcium, phosphorus, and multivitamin components in hospital discharge orders. The secondary objective was to determine whether identified missing components were added back during the transition of care (TOC) process from hospital to home.

Methods: This multi-center, retrospective chart review analyzed patients referred to a national home infusion provider over a 3-month period. Data was collected from the electronic medical record and internal surveillance software. The Registered Dietitian, Certified Nutrition Support Clinician (RD, CNSC) reviewed all PN hospital discharge orders for patients transitioning clinical management and PN therapy from hospital to home. Inclusion criteria were patients 18 years of age or older with PN providing the majority of their nutritional needs, as defined in Table 1, who were missing sodium, potassium, magnesium, calcium, phosphorus, or multivitamin from their PN order at hospital discharge. Exclusion criteria were patients less than 18 years of age, patients receiving supplemental PN not providing majority of nutritional needs, and patients with doses ordered for all electrolytes and multivitamin.

Results: During the 3-month period (April 1, 2024 to June 30, 2024), 267 patients were identified who were greater than 18 years of age, receiving the majority of their nutritional needs via PN, and missing at least one PN component in their hospital discharge order. See Table 2 and Figure 1 for demographics. One hundred seventy-five (65.5%) patients were missing one component and 92 (34.5%) were missing multiple components from the hospital discharge order. One hundred seventy-five (65.5%) patients were missing calcium, 68 (25.5%) phosphorus, 38 (14%) multivitamin, 23 (8.6%) magnesium, 20 (7.5%) potassium, and 20 (7.5%) sodium. During the transition from hospital to home, after discussion with the provider, 94.9% of patients had calcium added back, 94.7% multivitamin, 91.3% magnesium, 90% potassium, 88.2% phosphorus, and 80% sodium.

Conclusion: This study highlights the prevalence of missing components in PN hospital discharge orders, with calcium being the most frequently omitted at a rate of 65.5%. Given that many patients discharging home on PN will require long term therapy, adequate calcium supplementation is essential to prevent bone resorption and complications related to metabolic bone disease. In hospital discharge orders that were identified by the RD, CNSC as missing calcium, 94.9% of the time the provider agreed that it was clinically appropriate to add calcium to the PN order during the TOC process. This underlines the importance of nutrition support clinician review and communication during the transition from hospital to home. Future research should analyze the reasons why components are missing from PN orders and increase awareness of the need for a thorough clinical review of all patients going home on PN to ensure the adequacy of all components required for safe and optimized long term PN.

Table 1. Inclusion and Exclusion Criteria.

Table 2. Demographics.

Figure 1. Primary PN Diagnosis.

Figure 2. Components Missing from Order and Added Back During TOC Process.

Avi Toiv, MD1; Hope O'Brien, BS2; Arif Sarowar, MSc2; Thomas Pietrowsky, MS, RD1; Nemie Beltran, RN1; Yakir Muszkat, MD1; Syed-Mohammad Jafri, MD1

1Henry Ford Hospital, Detroit, MI; 2Wayne State University School of Medicine, Detroit, MI

Financial Support: None Reported.

Background: Intestinal Failure Associated Liver Disease (IFALD) is a known complication in patients reliant on total parenteral nutrition (TPN), especially those awaiting intestinal transplantation. There is concern that IFALD may negatively impact post-transplant outcomes, including graft survival and overall patient mortality. This study aims to evaluate the impact of IFALD, as indicated by liver function test (LFT) abnormalities before intestinal or multivisceral transplant, on transplant outcomes.

Methods: We conducted a retrospective chart review of all patients who underwent intestinal transplantation (IT) at an academic transplant center from 2010 to 2023. The primary outcomes were patient survival and graft failure in transplant recipients.

Results: Among 50 IT recipients, 30 (60%) required TPN before IT. The median age at transplant was 50 years (range, 17-68). In both groups, the majority of transplants were exclusively IT, although both groups also included multivisceral transplants (MVT). Eighty-seven percent of patients on TPN developed elevated LFTs before transplant, and 33% had persistently elevated LFTs 1 year after transplant. TPN-associated liver dysfunction in our cohort was associated with mixed liver injury patterns, with both hepatocellular injury (p < 0.001) and cholestatic injury (p < 0.001). No significant associations were found between TPN-related elevated LFTs and major transplant outcomes, including death (p = 0.856), graft failure (p = 0.144), or acute rejection (p = 0.306). Similarly, no significant association was observed between elevated LFTs and death (p = 0.855), graft failure (p = 0.769), or acute rejection (p = 0.386) in patients who were not on TPN. TPN-associated liver dysfunction before transplant was associated with elevated LFTs at one year after transplant (p < 0.001) but lacked clinical relevance.

Conclusion: Although IFALD related to TPN use is associated with specific liver dysfunction patterns in patients awaiting intestinal or multivisceral transplants, it does not appear to be associated with significant key transplant outcomes such as graft failure, mortality, or acute rejection. However, it is associated with persistently elevated LFTs even 1 year after transplant. These findings suggest that while TPN-related liver injury is common, it may not have a clinically significant effect on long-term transplant success. Further research is needed to explore the long-term implications of IFALD in this patient population.

Jody (Lind) Payne, RD, CNSC1; Kassandra Samuel, MD, MA2; Heather Young, MD3; Karey Schutte, RD3; Kristen Horner, RDN, CNSC3; Daniel Yeh, MD, MHPE, FACS, FCCM, FASPEN, CNSC3

1Denver Health, Parker, CO; 2Denver Health, St. Joseph Hospital, Denver, CO; 3Denver Health, Denver, CO

Financial Support: None Reported.

Background: Central line-associated blood stream infection (CLABSI) is associated with increased complications, length of stay and cost of care. The majority of CLABSI studies are focused on home parenteral nutrition (PN) patients, and there is a paucity of data documenting the incidence of CLABSI attributable to PN in the inpatient setting. At our institution, we have observed that clinicians are reluctant to initiate PN in patients with clear indications for PN due to concerns about CLABSI. Therefore, we performed a quality improvement project to document our CLABSI incidence rate for new central parenteral nutrition (CPN) initiated during hospitalization.

Methods: We performed a retrospective review of adult inpatients who initiated CPN at our facility from 1/1/23-12/31/23. Patients were excluded if they received PN prior to admission or only received peripheral PN. The National Healthcare Safety Network (NHSN) definitions were used for CLABSI and secondary attribution. Additional in-depth review of CLABSI cases was performed by an Infectious Disease (ID) consultant to determine whether positive cases were attributable to CPN versus other causes. The type of venous access for the positive patients was also reviewed.

Results: A total of 106 inpatients received CPN for a total of 1121 CPN days. The median [IQR] length of CPN infusion was 8 [4-14] days. Mean (standard deviation) age of patients receiving CPN was 53.3 (18.6) years, and 65 (61%) were men. The CPN patients who met criteria for CLABSI were further reviewed by the ID consultant, and only four CLABSI cases were deemed attributable to CPN. These four cases yielded an incidence rate of 3.6 CLABSI cases per 1000 CPN days. Two of these patients had additional potential causes of infection, including gastric ulcer perforation and bowel perforation with anastomotic leak. Three of the patients had CPN infused via a central venous catheter (a port, a femoral line, and a non-tunneled internal jugular catheter), and the fourth patient had CPN infused via a peripherally inserted central catheter. The incidence rate of CLABSI per catheter day was not reported in our review.
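The incidence rate reported above follows directly from the counts in this abstract; the short sketch below (Python, using only the 4 attributable cases and 1121 CPN days stated here) is an illustrative arithmetic check, not part of the study's analysis.

```python
# Illustrative check of the CLABSI incidence rate per 1000 CPN days,
# using the counts reported in this abstract.
clabsi_cases = 4
cpn_days = 1121

incidence_per_1000_cpn_days = clabsi_cases / cpn_days * 1000
print(f"{incidence_per_1000_cpn_days:.1f} CLABSI cases per 1000 CPN days")  # ~3.6
```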

Conclusion: At our institution, < 4% of patients initiating short-term CPN during hospitalization developed a CLABSI attributable to the CPN. This low rate of infection serves as a benchmark for our institution's quality improvement and quality assurance efforts. Collaboration with ID is recommended for additional deeper review of CPN patients with CLABSI to determine if the infection is more likely to be related to other causes than infusion of CPN.

Julianne Harcombe, RPh1; Jana Mammen, PharmD1; Hayato Delellis, PharmD1; Stefani Billante, PharmD1

1Baycare, St. Joseph's Hospital, Tampa, FL

Encore Poster

Presentation: Florida Residency Conference 2023.

Financial Support: None Reported.

Background: Refeeding syndrome is defined as potentially fatal shifts in fluids and electrolytes that may occur in malnourished patients receiving enteral or parenteral nutrition. When refeeding syndrome occurs, a reduction in phosphorus/magnesium/potassium levels and thiamine deficiency can be seen shortly after initiation of calorie provision. The American Society for Parenteral and Enteral Nutrition (ASPEN) guidelines consider hypophosphatemia the hallmark sign of refeeding syndrome; however, magnesium and potassium have been shown to be equally important. The purpose of this study was to identify the incidence of hypophosphatemia in patients who are at risk of refeeding syndrome and the importance of monitoring phosphorus.

Methods: This study was a multicenter retrospective chart review conducted using the BayCare Health System medical records database, Cerner. The study included patients who were 18 years or older, were admitted between January 2023 and December 2023, received total parenteral nutrition (TPN), and were at risk of refeeding syndrome. We defined patients at risk of refeeding syndrome as those who met two of the following criteria prior to starting TPN: body mass index (BMI) < 18.5 kg/m2 prior to starting TPN, 5% weight loss in 1 month, no oral intake for 5 days, or low levels of serum phosphorus/magnesium/potassium. COVID-19 patients and patients receiving propofol were excluded from the study. The primary objective of the study was to evaluate the incidence of hypophosphatemia versus hypomagnesemia versus hypokalemia in patients receiving TPN who were at risk of refeeding syndrome. The secondary objective was to evaluate whether the addition of thiamine upon initiation of TPN showed benefit in the incidence of hypophosphatemia.
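As a rough illustration of the "two of the following criteria" screen described above, the sketch below flags a hypothetical patient record; the function name, field names, and threshold encodings are assumptions made for illustration and are not taken from the study's data dictionary.

```python
# Hedged sketch of the refeeding-risk screen described above:
# a patient is flagged when at least two criteria are met before TPN starts.
def at_risk_of_refeeding(bmi, pct_weight_loss_1mo, days_no_oral_intake,
                         low_phos, low_mag, low_k):
    criteria = [
        bmi < 18.5,                    # BMI < 18.5 kg/m2 prior to starting TPN
        pct_weight_loss_1mo >= 5,      # 5% weight loss in 1 month
        days_no_oral_intake >= 5,      # no oral intake for 5 days
        low_phos or low_mag or low_k,  # low serum phosphorus/magnesium/potassium
    ]
    return sum(criteria) >= 2

# Hypothetical example record
print(at_risk_of_refeeding(bmi=17.9, pct_weight_loss_1mo=6, days_no_oral_intake=2,
                           low_phos=False, low_mag=False, low_k=False))  # True
```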

Results: A total of 83 patients met the criteria for risk of refeeding syndrome. Of the 83 patients, 53 were used in a pilot analysis to determine the sample size, and 30 were included in the study. The results on day 1 and day 2 suggest the incidence of hypomagnesemia differs from that of hypophosphatemia and hypokalemia, with a notably lower occurrence. Cochran's Q test yielded χ²(2) = 9.57 (p = 0.008) on day 1 and χ²(2) = 4.77 (p = 0.097) on day 2, indicating a difference in at least one group compared to the others on day 1 only. A post hoc analysis found a difference on day 1 between the incidence of hypophosphatemia vs hypomagnesemia (30%) and hypomagnesemia vs hypokalemia (33.3%). For the secondary outcome, the difference in day 2 versus day 1 phosphorus levels with the addition of thiamine in the TPN was 0.073 (p = 0.668; 95% CI, -0.266 to 0.413).
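For readers unfamiliar with Cochran's Q, it compares the proportions of several related binary outcomes (here, the three electrolyte abnormalities) measured in the same patients; a minimal sketch of the calculation is shown below with made-up 0/1 data, not the study's dataset.

```python
# Hedged sketch: Cochran's Q test for k related binary outcomes
# (e.g., hypophosphatemia, hypomagnesemia, hypokalemia in the same patients).
import numpy as np
from scipy.stats import chi2

def cochrans_q(x):
    """x: patients-by-outcomes array of 0/1 indicators."""
    x = np.asarray(x)
    k = x.shape[1]              # number of outcomes compared
    col = x.sum(axis=0)         # per-outcome totals
    row = x.sum(axis=1)         # per-patient totals
    n = x.sum()                 # grand total
    q = (k - 1) * (k * (col ** 2).sum() - n ** 2) / (k * n - (row ** 2).sum())
    return q, chi2.sf(q, df=k - 1)

# Made-up example data (30 patients x 3 outcomes), for illustration only
rng = np.random.default_rng(0)
print(cochrans_q(rng.integers(0, 2, size=(30, 3))))
```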

Conclusion: Among patients receiving parenteral nutrition who were at risk of refeeding syndrome, there was no statistically significant difference in the incidence of hypophosphatemia vs hypomagnesemia or hypomagnesemia vs hypokalemia. Likewise, there was no statistically significant difference in day 2 versus day 1 phosphorus levels when thiamine was added.

Jennifer McClelland, MS, RN, FNP-BC1; Margaret Murphy, PharmD, BCNSP1; Matthew Mixdorf1; Alexandra Carey, MD1

1Boston Children's Hospital, Boston, MA

Financial Support: None Reported.

Background: Iron deficiency anemia (IDA) is common in patients with intestinal failure (IF) dependent on parenteral nutrition (PN). Treatment with enteral iron is preferred; however, it may not be tolerated or efficacious. In these cases, intravenous (IV) iron is a suitable alternative. Complications include adverse reactions and infection, though rates are low when using low-molecular-weight (LMW) formulations. In a home PN (HPN) program, an algorithm (Figure 1) was developed to treat IDA utilizing IV iron.

Methods: A retrospective chart review was conducted in a large HPN program (~150 patients annually) of patients who were prescribed IV iron following the algorithm from January 2019 to April 2024. Laboratory studies were analyzed for instances of ferritin >500 ng/mL, indicating potential iron overload, as well as transferrin saturation of 12-20%, indicating iron sufficiency. In instances of ferritin levels >500 ng/mL, further review was conducted to understand etiology, clinical significance, and whether the IV iron algorithm was adhered to.

Results: HPN patients are diagnosed with IDA based on an iron panel consistent with deficiency (low hemoglobin and/or MCV; low ferritin, serum iron, and transferrin saturation; high reticulocyte count; and/or high total iron binding capacity (TIBC)). If the patient can tolerate enteral iron supplementation, a dose of 3-6 mg/kg/day is initiated. If the patient cannot tolerate enteral iron, the IV route is initiated. The initial IV dose is administered in the hospital or infusion center for close monitoring and to establish home maintenance administration after repletion dosing. Iron dextran is preferred as it can be added directly into the PN and run for the duration of the cycle. Addition to the PN eliminates an extra infusion and decreases additional CVC access. Iron dextran is incompatible with IV lipids, so the patient must have one lipid-free day weekly to administer it. If the patient receives daily IV lipids, iron sucrose is given as a separate infusion from the PN. Maintenance IV iron dosing is 1 mg/kg/week, with dose and frequency titrated based on clinical status, lab studies, and trends. An iron panel and C-reactive protein (CRP) are ordered every 2 months. If lab studies are below the desired range and consistent with IDA, the IV iron dose is increased by 50% in dose or frequency; if studies are above the desired range, the IV iron dose is decreased by 50% in dose or frequency. The maximum home dose is < 3 mg/kg/dose; if a higher dose is needed, the patient is referred to an infusion center. IV iron is suspended if ferritin is >500 ng/mL due to the risk of iron overload and deposition in the liver. Ferritin results (n = 4165) for all patients in the HPN program from January 2019 to April 2024 were reviewed for levels >500 ng/mL suggesting iron overload. Twenty-nine instances of ferritin >500 ng/mL (0.7% of values reviewed) were identified in 14 unique patients on maintenance IV iron. In 9 instances, the high ferritin level occurred with concomitant acute illness and an elevated CRP; elevated ferritin in these cases was thought to reflect an inflammatory state rather than iron overload. In 2 instances, the IV iron dose was given the day before the lab draw, rendering a falsely elevated result. Two patients had 12 instances (0.28% of values reviewed) of elevated ferritin thought to be related to IV iron dosing in the absence of inflammation, with normal CRP levels. During this period, there were no recorded adverse events.
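The maintenance titration rules described above can be read as a simple decision procedure; the function below is a hypothetical rendering of that logic for illustration only (the parameter names and return messages are assumptions, not the program's actual algorithm document or Figure 1).

```python
# Hedged sketch of the IV iron maintenance titration described above:
# start at 1 mg/kg/week; adjust dose or frequency by 50% based on labs;
# suspend if ferritin >500 ng/mL; refer out if >=3 mg/kg/dose would be needed.
def adjust_weekly_iron_dose(current_dose_mg_per_kg, ferritin_ng_ml,
                            labs_below_range_consistent_with_ida, labs_above_range):
    if ferritin_ng_ml > 500:
        return 0.0, "suspend IV iron (possible overload); recheck ferritin with CRP"
    if labs_below_range_consistent_with_ida:
        new_dose = current_dose_mg_per_kg * 1.5   # increase by 50% (dose or frequency)
    elif labs_above_range:
        new_dose = current_dose_mg_per_kg * 0.5   # decrease by 50% (dose or frequency)
    else:
        new_dose = current_dose_mg_per_kg         # within desired range
    if new_dose >= 3.0:
        return new_dose, "exceeds home maximum (< 3 mg/kg/dose); refer to infusion center"
    return new_dose, "continue home maintenance dosing"

print(adjust_weekly_iron_dose(1.0, ferritin_ng_ml=180,
                              labs_below_range_consistent_with_ida=True,
                              labs_above_range=False))  # (1.5, 'continue home maintenance dosing')
```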

Conclusion: IDA is common in patients with IF dependent on PN. Iron is not a standard component or additive in PN. Use of IV iron in this population can increase quality of life by decreasing need for admissions, visits to infusion centers, or need for blood transfusions in cases of severe anemia. IV iron can be safely used for maintenance therapy in HPN patients with appropriate dosing and monitoring.

Figure 1. Intravenous Iron in the Home Parenteral Nutrition dependent patient Algorithm.

Lynne Sustersic, MS, RD1; Debbie Stevenson, MS, RD, CNSC2

1Amerita Specialty Infusion Services, Thornton, CO; 2Amerita Specialty Infusion Services, Rochester Hills, MI

Financial Support: None Reported.

Background: Desmoplastic small round tumor (DSRT) is a soft-tissue sarcoma that causes tumors to form in the abdomen and pelvis. To improve disease control, cytoreductive surgery with hyperthermic intraperitoneal chemotherapy (CRS/HIPEC) and radiotherapy are often used, which can result in bowel obstruction secondary to sclerosing peritonitis. This necessitates total parenteral nutrition therapy (TPN) due to the inability to consume nutrition orally or enterally. A major complication of parenteral nutrition therapy is parenteral nutrition-associated liver disease (PNALD), and the most common site of metastasis for DSRT is the liver. This case report details the substitution of an olive and soy oil-based intravenous lipid emulsion (OO, SO-ILE) for a soy, MCT, olive, fish oil-based intravenous lipid emulsion (SO, MCT, OO, FO-ILE) to treat elevated liver function tests (LFTs).

Methods: A 28-year-old male with DSRT metastatic to the peritoneum and a large hepatic mass, complicated by encapsulating peritonitis and enterocutaneous fistula (ECF) following CRS/HIPEC, presented to the parenteral nutrition program at Amerita Specialty Infusion Services in 2022. TPN was provided from January 2022 to March 2023, stopped, and restarted in December 2023 following a biliary obstruction. TPN was initiated and advanced using 1.3 g/kg/day SMOFlipid (SO, MCT, OO, FO-ILE), known to help mitigate PNALD. The patient developed rising LFTs, with alanine aminotransferase (ALT) peaking at 445 U/L, aspartate aminotransferase (AST) at 606 U/L, and alkaline phosphatase (ALP) at 1265 U/L. Despite transitioning the patient to a cyclic regimen and maximizing calories from dextrose and amino acids, liver function continued to worsen. A switch to Clinolipid (OO, SO-ILE) at 1.3 g/kg/day was then tried.

Results: Following the initiation of OO, SO-ILE, LFTs improved within 12 days, with ALT at 263 U/L, AST at 278 U/L, and ALP at 913 U/L. These values continued to improve until the end of therapy in June 2024, with a final ALT of 224 U/L, AST of 138 U/L, and ALP of 220 U/L (see Figure 1). No significant improvements in total bilirubin were found. The patient tolerated this switch in lipid emulsions and increased his weight from 50 kg to 53.6 kg.

Conclusion: SO, MCT, OO, FO-ILE is well supported to help prevent and alleviate adverse effects of PNALD; however, the impact of lipid emulsions on other forms of liver disease needs further research. Our case suggests that the elevated LFTs were likely cancer-induced rather than associated with prolonged use of parenteral nutrition. A higher olive oil lipid concentration may have beneficial impacts on LFTs that are not associated with PNALD. It is also worth noting that soybean oil has been demonstrated in previous research to have a negative impact on liver function, and the concentration of soy in SO, MCT, OO, FO-ILE is higher (30%) compared to OO, SO-ILE (20%). This may warrant further investigation into the impact of specific soy concentrations on liver function. LFTs should be assessed and treated on a case-by-case basis that evaluates disease mechanisms, medication-drug interactions, parenteral nutrition composition, and patient subjective information.

Figure 1. OO, SO-ILE Impact on LFTs.

Shaurya Mehta, BS1; Ajay Jain, MD, DNB, MHA1; Kento Kurashima, MD, PhD1; Chandrashekhara Manithody, PhD1; Arun Verma, MD1; Marzena Swiderska-Syn1; Shin Miyata, MD1; Mustafa Nazzal, MD1; Miguel Guzman, MD1; Sherri Besmer, MD1; Matthew Mchale, MD1; Jordyn Wray1; Chelsea Hutchinson, MD1; John Long, DVM1

1Saint Louis University, St. Louis, MO

Encore Poster

Presentation: North American Society for Pediatric Gastroenterology, Hepatology and Nutrition (NASPGHAN) Florida.

Financial Support: None Reported.

Background: Short bowel syndrome (SBS) is a devastating condition. In the absence of enteral nutrition (EN), patients are dependent on total parenteral nutrition (TPN) and suffer from intestinal failure-associated liver disease and gut atrophy. Intestinal adaptation (IA) and enteral autonomy (EA) remain the clinical goals. We hypothesized that EA can be achieved using our DREAM system (US patent 63/413,988), which allows EN via the stomach and provides a mechanism for cyclical recirculation of nutrient-rich distal intestinal content into the proximal bowel, enabling full EN despite SBS.

Methods: Twenty-four neonatal pigs were randomly allocated to enteral nutrition (EN; n = 8), TPN-SBS (TPN only, n = 8), or DREAM (n = 8). Liver, gut, and serum were collected for histology and serum biochemistry. Statistical analysis was performed using GraphPad Prism 10.1.2 (324). All tests were 2-tailed, using a significance level of 0.05.

Results: TPN-SBS piglets had significant cholestasis vs DREAM (p = 0.001), with no statistical difference between DREAM and EN (p = 0.14). DREAM transitioned to full EN by day 4. Mean serum conjugated bilirubin was 0.037 mg/dL for EN, 1.2 mg/dL for TPN-SBS, and 0.05 mg/dL for DREAM. Serum bile acids were significantly elevated in TPN-SBS vs EN (p = 0.007) and DREAM (p = 0.03). Mean GGT, a marker of cholangiocytic injury, was significantly higher in TPN-SBS vs EN (p < 0.001) and DREAM (p < 0.001), with values of 21.2 U/L for EN, 47.9 U/L for TPN-SBS, and 22.5 U/L for DREAM (p = 0.89, DREAM vs EN). To evaluate gut growth, we measured lineal gut mass (LGM), calculated as the weight of the bowel per centimeter. There was significant IA and prevention of gut atrophy with DREAM. Mean proximal gut LGM was 0.21 g/cm for EN, 0.11 g/cm for TPN-SBS, and 0.31 g/cm for DREAM (p = 0.004, TPN-SBS vs DREAM). Distal gut LGM was 0.34 g/cm for EN, 0.13 g/cm for TPN-SBS, and 0.43 g/cm for DREAM (p = 0.006, TPN-SBS vs DREAM). IHC revealed that DREAM had hepatic CK-7 (bile duct epithelium marker; p = 0.18) and hepatic Cyp7A1 (p = 0.3) similar to EN. No statistical differences were noted in LGR5-positive intestinal stem cells between EN and DREAM (p = 0.18). DREAM prevented changes in hepatic CyP7A1, BSEP, FGFR4, SHP, and SREBP-1 and gut FXR, TGR5, and EGF vs the TPN-SBS group.

Conclusion: DREAM resulted in a significant reduction of hepatic cholestasis, prevented gut atrophy, and presents a novel method enabling early full EN despite SBS. By driving IA and enteral autonomy, this system represents a major advancement in SBS management, bringing a paradigm change to life-saving strategies for SBS patients.

Silvia Figueiroa, MS, RD, CNSC1; Paula Delmerico, MS, RD, CNSC2

1MedStar Washington Hospital Center, Bethesda, MD; 2MedStar Washington Hospital Center, Arlington, VA

Financial Support: None Reported.

Background: Parenteral nutrition (PN) therapy is a vital clinical intervention for patients of all ages and across care settings. The complexity of PN has the potential to cause significant patient harm, especially when errors occur. According to The Institute for Safe Medication Practices (ISMP), PN is classified as a high-alert medication, and safety-focused strategies should be formulated to minimize errors and harm. Processing PN is multifactorial and includes prescribing, order review and verification, compounding, labeling, and administration. PN prescription should consider clinical appropriateness and formulation safety. The PN formulation must be written to provide appropriate amounts of macronutrients and micronutrients based on the patient's clinical condition, laboratory parameters, and nutrition status. The PN admixture should not exceed the total amounts, recommended concentrations, or rate of infusion of these nutrients, as this could result in toxicities and formulation incompatibility or instability. The ASPEN Parenteral Nutrition Safety Consensus Recommendations recommend that PN be prescribed using standardized electronic orders via a computerized provider order entry (CPOE) system, as handwritten orders have potential for error. Data suggest that up to 40% of PN-related errors occur during the prescription and transcription steps. Registered pharmacists (RPh) are tasked with reviewing PN orders for order accuracy and consistency with recommendations made by the nutrition support team. These RPh adjustments ensure formula stability and nutrient optimization. This quality improvement project compares the frequency of RPh PN adjustments following the initial provider order after transitioning from a paper to a CPOE ordering system. Our hypothesis was that CPOE reduces the need for PN adjustments by pharmacists during processing, which increases clinical effectiveness and maximizes resource efficiency.

Methods: This was a retrospective evaluation of PN ordering practices at a large, academic medical center after shifting from paper to electronic orders. PN orders were collected during three-week periods in December 2022 (Paper) and December 2023 (CPOE) and analyzed for the frequency of order adjustments by the RPh. Adjustments were classified into intravascular access, infusion rate, macronutrients, electrolytes, multivitamin (MVI) and trace elements (TE), and medication categories. The total number of adjustments made by the RPh during final PN processing was collected. These adjustments were made per nutrition support team recommendations.

Results: Daily PN orders for 106 patients – totaling 694 orders – were reviewed for provider order accuracy at the time of fax (paper) and electronic (CPOE) submission. Order corrections made by the RPh decreased by 96% for infusion rate, 91.5% for macronutrients, 79.6% for electrolytes, 81.4% for MVI and TE, and 50% for medication additives (Table 1).

Conclusion: Transitioning to CPOE led to a reduction in the need for PN order adjustments at the time of processing. One reason for this decline is improved physician understanding of PN recommendations. With CPOE, the registered dietitian's formula recommendation is viewable within the order template and can be referenced at the time of ordering. The components of the active PN infusion also automatically populate upon ordering a subsequent bag. This information can aid the provider when calculating insulin needs or repleting electrolytes outside the PN, increasing clinical effectiveness. A byproduct of this process change is improved efficiency, as CPOE requires less time for provider prescription and RPh processing and verification.

Table 1. RPh Order Adjustments Required During Collection Period.

Elaina Szeszycki, BS, PharmD, CNSC1; Emily Gray, PharmD2; Kathleen Doan, PharmD, BCPPS3; Kanika Puri, MD1

1Riley Hospital for Children at Indiana University Health, Indianapolis, IN; 2Lurie Children's Hospital, Chicago, IL; 3Riley Hospital for Children at IU Health, Indianapolis, IN

Financial Support: None Reported.

Background: Parenteral nutrition (PN) is ordered daily at Riley Hospital for Children at IU Health by different specialties. Our hospital is a stand-alone pediatric hospital, including labor, delivery, and high-risk maternal care. Historically, the PN orders were due by early afternoon with a hard cut-off by end of the day shift for timely central compounding at a nearby adult hospital. Due to the relocation of staff and equipment to a new sterile compounding facility, a hard deadline was created with an earlier cutoff time and contingency plans for orders received after the deadline. This updated process was created to allow for timely delivery to Riley and subsequently to the patients to meet the standard PN hang-time of 2100. The Nutrition Support Team (NST) and Pharmacy and Therapeutics (P&T) Committee approved an updated PN order process as follows:

1. Enforce a hard PN deadline of 1200 for new and current PN orders.
2. If a PN order is not received by 1200, renew the active PN order for the next 24 hours.
3. If the active PN order is not appropriate for the next 24 hours, providers will need to order IVF in place of PN until the following day.
4. PN orders are entered into the PN order software by 1500.

Methods: A quality improvement (QI) process check was performed 3 months after initiation of the updated PN order process. Data collection was performed for 1 month with the following data points: total PN orders, missing PN orders at 1200, PN orders re-ordered per P&T policy after the 1200 deadline, lab review, input and output, subsequent order changes for 24 hours after renewal of an active PN order, waste of PN, and service responsible for the late PN order.

Results:

Conclusion: The number of late PN orders after the hard deadline was < 5%, and there was a minimal number of renewed active PN orders due to the pharmacists' concern for ensuring the safety of our patients. No clinically significant changes resulted from renewal of active PN orders, so the process was considered safe despite the small numbers. The changes made to late PN orders were minor or related to the planned discontinuation of PN. After review of the results by the NST and pharmacy administration, it was decided to take the following actions: (1) review the data and process with pharmacy staff to assist with workflow and education; (2) create a succinct Riley TPN process document for providers, specifically services with late orders, reviewing the PN order entry hard deadline and the need for a DC PN order by the deadline, to assist with pharmacy staff workflow and avoid potential PN waste; and (3) repeat the QI analysis in 6-12 months.

International Poster of Distinction

Muna Islami, PharmD, BCNSP1; Mohammed Almusawa, PharmD, BCIDP2; Nouf Alotaibi, PharmD, BCPS, BCNSP3; Jwael Alhamoud, PharmD1; Maha Islami, PharmD4; Khalid Eljaaly, PharmD, MS, BCIDP, FCCP, FIDSA4; Majda Alattas, PharmD, BCPS, BCIDP1; Lama Hefni, RN5; Basem Alraddadi, MD1

1King Faisal Specialist Hospital, Jeddah, Makkah; 2Wayne State University, Jeddah, Makkah; 3Umm al Qura University, Jeddah, Makkah; 4King Abdulaziz University Hospital, Jeddah, Makkah; 5King Faisal Specialist Hospital, Jeddah, Makkah

Financial Support: None Reported.

Background: Parenteral nutrition (PN) is a critical therapy for patients unable to meet their nutritional needs through the gastrointestinal tract. While it offers a life-saving solution, it also carries the risk of central line-associated bloodstream infections (CLABSIs). However, there is a lack of comprehensive studies examining the risk factors for CLABSIs in a more heterogeneous cohort of PN recipients. This study aims to identify the risk factors associated with CLABSIs in patients receiving PN therapy in Saudi Arabia.

Methods: This retrospective, multicenter cohort study was conducted in three large tertiary referral centers in Saudi Arabia. The study included all hospitalized patients who received PN therapy through central lines between 2018 and 2022. The purpose of the study was to investigate the association between PN and CLABSIs using both univariate and multivariate analysis.

Results: Out of 662 hospitalized patients who received PN and had central lines, 123 patients (18.6%) developed CLABSI. Among our patients, the duration of parenteral nutrition was identified as a risk factor for CLABSI development (OR, 1.012; 95% CI, 0.9-1.02). In patients receiving PN, the incidence of CLABSI did not change significantly over the course of the study years.

Conclusion: The length of PN therapy is still an important risk factor for CLABSIs; more research is required to determine the best ways to reduce the incidence of CLABSI in patients on PN.

Table 1. Characteristics of Hospitalized Patients Who Received PN.

1 n (%); median (IQR). BMI, body mass index.

Table 2. The Characteristics of Individuals With and Without CLABSI Who Received PN.

1 n (%); median (IQR). 2 Fisher's exact test; Pearson's chi-squared test; Mann-Whitney U test. PN, parenteral nutrition.

CLABSI, central line-associated bloodstream infection; PN, parenteral nutrition.

Figure 1. Percentage of Patients With a Central Line Receiving PN Who Experienced CLABSI.

Duy Luu, PharmD1; Rachel Leong, PharmD2; Nisha Dave, PharmD2; Thomas Ziegler, MD2; Vivian Zhao, PharmD2

1Emory University Hospital - Nutrition Support Team, Lawrenceville, GA; 2Emory Healthcare, Atlanta, GA

Financial Support: None Reported.

Background: Intravenous lipid emulsion (ILE) is an essential component in parenteral nutrition (PN)-dependent patients because it provides calories and essential fatty acids; however, the use of soybean oil-based ILE (SO-ILE) may contribute to the development of PN-associated liver disease (PNALD) in patients with intestinal failure who require chronic PN. To mitigate this risk, new formulations of ILE, such as a mixture of SO, medium chain triglycerides (MCT), olive oil (OO), and fish oil-based ILE (SO/MCT/OO/FO-ILE), or pure fish oil-based ILE (FO-ILE) are now available in the US. FO-ILE is only approved for pediatric use for PN-associated cholestasis. This patient case highlights the benefits of using a combination of FO-containing ILEs to improve PNALD.

Methods: A 65-year-old female with symptomatic achalasia required robotic-assisted esophagomyotomy with fundoplasty in February 2020. Her postoperative course was complicated by bowel injury that required multiple small bowel resections and total colectomy with end jejunostomy, resulting in short bowel syndrome with 80 centimeters of residual small bowel. Postoperatively, daily PN containing SO-ILE was initiated along with tube feedings (TF) during hospitalization, and she was discharged home with PN and TF. Her surgeon referred her to the Emory University Hospital (EUH) Nutrition Support Team (NST) for management. She received daily cyclic PN infused over 12 hours, providing 25.4 kcal/kg/day with SO-ILE 70 grams (1.2 g/kg) three times weekly and standard TF 1-2 containers/day. In September 2020, she complained of persistent jaundice and was admitted to EUH. She presented with scleral icterus, hyperbilirubinemia, and elevated liver function tests (LFTs). The EUH NST optimized the PN to provide SO/MCT/OO/FO-ILE (0.8 g/kg/day), which improved LFTs, and the dose was then increased to 1 g/kg/day. In the subsequent four months, her LFTs worsened despite optimizing pharmacotherapy, continuing cyclic TF, and reducing and then discontinuing ILE. She required multiple readmissions to EUH and underwent two liver biopsies that confirmed a diagnosis of PN-induced hepatic fibrosis after 15 months of PN. Her serum total bilirubin level peaked at 18.5 mg/dL, which led to an intensive care unit admission and required molecular adsorbent recirculating system therapy. In March 2022, having exhausted all other options, the NST incorporated FO-ILE (0.84 g/kg/day) three times weekly (as a separate infusion) and SO/MCT/OO/FO-ILE (1 g/kg/day) weekly.

Results: The patient's LFTs are shown in Figure 1. The blood level of aspartate aminotransferase improved from 123 to 60 units/L, and alanine aminotransferase decreased from 84 to 51 units/L after 2 months and returned to normal after 4 months of the two ILEs. Similarly, the total bilirubin decreased from 5.6 to 2.2 and 1.1 mg/dL by 2 and 6 months, respectively. Both total bilirubin and transaminase levels remained stable. Although her alkaline phosphatase continued to fluctuate and remained elevated over the last two years, this marker decreased from 268 to 104 units/L. All other PNALD-related symptoms resolved.

Conclusion: This case demonstrates that the combination of FO-containing ILEs significantly improved and stabilized LFTs in an adult with PNALD. Additional research is needed to investigate the effect of FO-ILE in adult PN patients to mitigate PNALD.

SO: soybean oil; MCT: medium-chain triglyceride; OO: olive oil; FO: fish oil; AST: aspartate aminotransferase; ALT: alanine aminotransferase.

Figure 1. Progression of Liver Enzymes Status in Relation to Lipid Injectable Emulsions.

Narisorn Lakananurak, MD1; Leah Gramlich, MD2

1Department of Medicine, Faculty of Medicine, Chulalongkorn University, Bangkok, Krung Thep; 2Division of Gastroenterology, Faculty of Medicine and Dentistry, University of Alberta, Edmonton, AB

Financial Support: This research study received a grant from Baxter, Canada.

Background: Pre-operative parenteral nutrition (PN) has been shown to enhance outcomes in malnourished surgical patients. Traditionally, pre-operative PN necessitates hospital admission, which leads to increased length of stay (LOS) and higher hospital costs. Furthermore, inpatient pre-operative PN may not be feasible or prioritized when access to hospital beds is restricted. Outpatient PN presents a potential solution to this issue. To date, the feasibility and impact of outpatient PN for surgical patients have not been investigated. This study aims to assess the outcomes and feasibility of outpatient pre-operative PN in malnourished surgical patients.

Methods: Patients scheduled for major surgery who were identified as at risk of malnutrition using the Canadian Nutrition Screening Tool and classified as malnourished by Subjective Global Assessment (SGA) B or C were enrolled. Exclusion criteria included severe systemic diseases as defined by the American Society of Anesthesiologists (ASA) classification III to V, insulin-dependent diabetes mellitus, and extreme underweight (less than 40 kg). Eligible patients received a peripherally inserted central catheter (PICC) line and outpatient PN (Olimel 7.6%E, 1,000 ml) at an infusion clinic for 5-10 days, using the maximum number of infusion days possible prior to surgery. PN was administered via an infusion pump over 4-5 hours by infusion clinic nurses. The outcomes and feasibility of outpatient PN were assessed. Safety outcomes, including refeeding syndrome, dysglycemia, volume overload, and catheter-related complications, were monitored.

Results: Outpatient PN was administered to eight patients (4 males, 4 females). Pancreatic cancer and Whipple's procedure were the most common diagnosis and operation, accounting for 37.5% of cases (Table 1). The mean (SD) duration of PN was 5.5 (1.2) days (range 5-8 days). Outpatient PN was completed in 75% of patients, with an 88% completion rate of PN days (44/50 days). Post-PN infusion, mean body weight and body mass index increased by 4.6 kg and 2.1 kg/m², respectively. The mean PG-SGA score improved by 4.9 points, and mean handgrip strength increased from 20 kg to 25.2 kg. Quality of life, as measured by SF-12, improved in both physical and mental health domains (7.3 and 3.8 points, respectively). Patient-reported feasibility scores were high across all aspects (Acceptability, Appropriateness, and Feasibility), with a total score of 55.7/60 (92.8%). Infusion clinic nurses (n = 3) also reported high total feasibility scores (52.7/60, 87.8%) (Table 2). No complications were observed in any of the patients.

Conclusion: Outpatient pre-operative PN was a feasible approach that was associated with improved outcomes in malnourished surgical patients. This novel approach has the potential to enhance outcomes and decrease the necessity for hospital admission in malnourished surgical patients. Future studies involving larger populations are needed to evaluate the efficacy of outpatient PN.

Table 1. Baseline Characteristics of the Participants (n = 8).

Table 2. Outcomes and Feasibility of Outpatient Preoperative Parenteral Nutrition (n = 8).

Adrianna Wierzbicka, MD1; Rosmary Carballo Araque, RD1; Andrew Ukleja, MD1

1Cleveland Clinic Florida, Weston, FL

Financial Support: None Reported.

Background: Gastroparesis (GP) is a chronic motility disorder marked by delayed gastric emptying, associated with symptoms such as nausea, vomiting, and abdominal pain. Treatment consists of diet modifications and medications, with nutritional support tailored to disease severity. Severe refractory cases may require enteral or parenteral nutrition (PN). However, the role of home parenteral nutrition (HPN) in managing GP is underexplored. This study aims to enhance nutrition therapy practice by examining the utilization of HPN in the GP population, addressing a significant gap in current nutrition support strategies.

Methods: We conducted a retrospective, single-center analysis of patients receiving HPN from August 2022 to August 2024. Data were obtained through a review of electronic medical records as part of a quality improvement monitoring process. Patients' demographics, etiology of GP, indications for HPN, types of central access, duration of therapy, and PN-related complications were analyzed using descriptive statistics. Inclusion criteria were: adults (>18 yrs.), GP diagnosis by gastric scintigraphy, and HPN for a minimum of 2 consecutive months. Among 141 identified HPN patients, 10 were diagnosed with GP as indication for PN.

Results: GP patients constituted 7% (10/141) of our home PN population. In this cohort analysis of 10 patients with GP receiving HPN, the cohort was predominantly female (80%), with a mean age of 42.6 years; all individuals identified as Caucasian. All patients had idiopathic GP; severe gastric emptying delay was found in 80% of cases, with all experiencing predominant symptoms of nausea/vomiting. Types of central access were 50% PICC lines, 30% Hickman catheters, 10% Powerlines, and 10% mediports. The mean weight change with PN therapy was an increase of 21.9 lbs. Eighty percent of patients experienced infection-related complications, including bacteremia (methicillin-sensitive Staphylococcus aureus (MSSA), methicillin-resistant Staphylococcus aureus (MRSA)), Pseudomonas, and fungemia. Deep vein thrombosis (DVT) was identified in 20% of patients, alongside one case of a cardiac thrombus. Tube feeding trials were attempted in 70% of cases, but 50% ultimately discontinued due to intolerance, such as abdominal pain, or complications like buried bumper syndrome. Chronic pain management was used in 60% of patients, with 40% on opioid therapy (morphine, fentanyl). PN was discontinued in 50% of patients due to recurrent infections (20%), advancement to tube feeding (20%), questionable compliance (20%), or improvement in oral intake (40%).

Conclusion: This retrospective analysis underscores the potential of HPN as a nutritional strategy for GP, particularly in patients with refractory symptoms and severe delay in gastric emptying who previously failed EN or experienced complications related to the enteral access. In addition to the observed mean weight gain, HPN seems to play a crucial role in alleviating debilitating symptoms such as nausea, vomiting, and abdominal pain, thereby improving patients' overall quality of life. Nonetheless, the prevalence of infection-related complications and the requirement for chronic pain management underscore the challenges associated with GP treatment. The variability in patient responses to different nutritional strategies emphasizes the importance of individualized care plans. These findings advocate for further research to optimize HPN protocols and improve comprehensive management strategies in GP.

Figure 1. Reasons for PN Discontinuation.

Figure 2. Complications Associated with PN.

Longchang Huang, MD1; Peng Wang2; Shuai Liu3; Xin Qi1; Li Zhang1; Xinying Wang4

1Department of General Surgery, Jinling Hospital, Medical School of Nanjing University, Nanjing, Jiangsu; 2Department of Digestive Disease Research Center, Gastrointestinal Surgery, The First People's Hospital of Foshan, Guangdong, Foshan; 3Department of General Surgery, Jinling Hospital, Medical School of Nanjing University, Nanjing, Jiangsu; 4Department of General Surgery, Jinling Hospital, Medical School of Nanjing University, Nanjing, Jiangsu

Financial Support: National Natural Science Foundation of China, 82170575 and 82370900.

Background: Total parenteral nutrition (TPN) induced gut microbiota dysbiosis is closely linked to intestinal barrier damage, but the mechanism remains unclear.

Methods: Through the application of 16S rRNA gene sequencing and metagenomic analysis, we examined alterations in the gut microbiota of patients with chronic intestinal failure (CIF) and TPN mouse models subjected to parenteral nutrition, subsequently validating these observations in an independent verification cohort. Additionally, we conducted a comprehensive analysis of key metabolites utilizing liquid chromatography-mass spectrometry (LC-MS). Moreover, we explored modifications in essential innate-like lymphoid cell populations through RNA sequencing (RNA-seq), flow cytometry, and single-cell RNA sequencing (scRNA-seq).

Results: The gut barrier damage associated with TPN is due to decreased Lactobacillus murinus. L. murinus mitigates TPN-induced intestinal barrier damage through the metabolism of tryptophan into indole-3-carboxylic acid (ICA). Furthermore, ICA stimulates group 3 innate lymphoid cells (ILC3) to secrete interleukin-22 by targeting the nuclear receptor Rorc, thereby enhancing intestinal barrier protection.

Conclusion: We elucidate the mechanisms driving TPN-associated intestinal barrier damage and indicate that interventions with L. murinus or ICA could effectively ameliorate TPN-induced gut barrier injury.

Figure 1. TPN Induces Intestinal Barrier Damage in Humans and Mice. (a) The rates of fever and ICU admission in Cohort 1. (b-d) Serum levels of IFABP, CRP, and LPS in patients with CIF. (e) Representative intestinal H&E staining and injury scores (f) (n = 10 mice per group). (g) Results of the electrical resistance of the intestine in mice by Ussing chamber (n = 5 mice per group). (h) Immunofluorescence experiments in the intestines and livers of mice. (i) Results of Western blot in the Chow and TPN groups.

Figure 2. TPN Induces Gut Dysbiosis in Humans and Mice. (a) PCoA of 16S rRNA of fecal content from Cohort 1 (n = 16 individuals/group). (b) Significantly abundant taxa identified using linear discriminant analysis (LDA). (c) Top 10 abundant genera. (d) PCoA of the relative genus or species abundances (n = 5 mice per group). (e) LDA for mice. (f) Sankey diagram showing the top 10 abundant genera of humans and mice. (g) Heatmap illustrating the correlation between the abundance of species in the intestinal microbiota and clinical characteristics of patients with CIF.

Figure 3. Metabolically Active L. murinus Ameliorates Intestinal Barrier Damage. (a) RT-PCR was conducted to quantify the abundance of L. murinus in feces from L-PN and H-PN patients (Cohorts 1 and 2). (b) Representative intestinal H&E staining and injury scores (c) (n = 10 mice per group). (d) Results of the electrical resistance of the intestine in mice by Ussing chamber (n = 5 mice per group). (e) Results of Western blot. (f) 3D-PCA and volcano plot (g) analyses between the Chow and TPN group mice. (h) Metabolome-wide pathways enriched based on the metabolomics data obtained from fecal content of Chow and TPN group mice (n = 5 mice per group). (i) Heatmap depicting the correlation between the abundance of intestinal microbiota at the species level and tryptophan metabolites of the Chow and TPN group mice (n = 5 mice per group). (j) VIP scores of 3D-PCA. A taxon with a variable importance in projection (VIP) score of >1.5 was deemed to be of significant importance in the discrimination process.

Figure 4. ICA Is Critical for the Effects of L. murinus. (a) The fecal level of ICA from TPN mice treated with PBS control or ICA (n = 10 mice per group). (b) Representative intestinal H&E staining and injury scores (c) (n = 10 mice per group). (d) Results of the electrical resistance of the intestine in mice by Ussing chamber (n = 5 mice per group). (e) Results of Western blot. (f) Metabolic pathway illustrating the production of ICA by the bacterium L. murinus from tryptophan. (g) PLS-DA of the metabolite profiles in feces from TPN mice receiving ΔArAT or live L. murinus (n = 5 mice per group). (h) Heat map of tryptophan-targeted metabolomics in fecal samples from TPN mice that received either ΔArAT (n = 5) or live L. murinus (n = 5). (i) Representative intestinal H&E staining and injury scores (j) (n = 10 mice per group). (k) Results of Western blot.

Callie Rancourt, RDN1; Osman Mohamed Elfadil, MBBS1; Yash Patel, MBBS1; Taylor Dale, MS, RDN1; Allison Keller, MS, RDN1; Alania Bodi, MS, RDN1; Suhena Patel, MBBS1; Andrea Morand, MS, RDN, LD1; Amanda Engle, PharmD, RPh1; Manpreet Mundi, MD1

1Mayo Clinic, Rochester, MN

Financial Support: None Reported.

Background: Although a patent foramen ovale (PFO) is generally asymptomatic and causes no health concerns, it can be a risk factor for embolism and stroke. Due to this theoretical risk, some institutions have established protocols requiring most IV solutions to be administered through a small micron filter in patients with PFO. While 2-in-1 dextrose and amino acid solutions can be filtered through a 0.22-micron filter with relative ease, injectable lipid emulsions (ILEs), whether on their own or as part of a total admixture, consist of larger particles, requiring a 1.2-micron or larger filter. The requirement for a small filter therefore precludes the administration of ILE, an essential source of calories, in patients with PFO. It is unknown if patients who do receive ILE have an increased incidence of lipid embolism and stroke.

Methods: A single-center retrospective review of patients on central parenteral nutrition (CPN) was completed. Demographics and baseline clinical characteristics, including co-morbidities and history of CVA, were collected. The outcome of interest was defined as an ischemic cerebrovascular accident (CVA) within 30 days of CPN and, therefore, potentially attributable to it. Other cardiovascular and thromboembolic events were captured. All patients with a PFO diagnosis and inpatient CPN administration between January 1, 2018, and December 18, 2023, at our quaternary care referral center were included as the case cohort. A 3:1 control group matched on age, gender, duration of inpatient CPN, and clinical co-morbidities was identified and used to examine the difference in the outcome of interest.

Results: Patients with PFO who received CPN (n = 38, 53.8% female) had a mean age of 63.5 ± 13.1 years and a mean BMI of 31.1 ± 12.1 at CPN initiation (Table 1). The PFO varied in size, with the largest proportion (38.5%) having a very small/trivial PFO (Table 2). All patients in this cohort had appropriately sized filters placed for CPN and ILE administration. CPN prescription and duration were comparable between both groups. The majority of patients with PFO (53.8%) received mixed oil ILE, followed by soy-olive oil ILE (23.1%), whereas the majority of patients without PFO (51.8%) received soy-olive oil ILE and 42.9% received mixed oil ILE (Table 3). Case and control groups had cardiovascular risks at comparable prevalence, including obesity, hypertension, diabetes, and dyslipidemia. However, more patients had a history of vascular cardiac events and atrial fibrillation in the PFO group, and more patients were smokers in the non-PFO group (Table 4). Patients with PFO received PN for a median of 7 days (IQR: 5,13), and 32 (84.2%) received ILE. Patients without PFO who received CPN (n = 114, 52.6% female) had a mean age of 64.9 ± 10 years and a mean BMI of 29.6 ± 8.5 at CPN initiation. Patients in this cohort received PN for a median of 7 days (IQR: 5,13.5), and 113 (99.1%) received ILE. There was no difference in the incidence of ischemic CVA within 30 days of receiving CPN between both groups (2 (5.3%) in the PFO group vs. 1 (0.8%) in the non-PFO group; p = 0.092) (Table 4).

Conclusion: The question of the risk of CVA/stroke with CPN in patients with PFO is clinically relevant and often lacks a definitive answer. Our study revealed no difference in ischemic CVA potentially attributable to CPN between patients with PFO and patients without PFO in a matched control cohort in the first 30 days after administration of PN. This finding demonstrates that CPN with ILE is likely safe for patients with PFO in an inpatient setting.

Table 1. Baseline Demographics and Clinical Characteristics.

Table 2. PFO Diagnosis.

*All received propofol concomitantly.

Table 3. PN Prescription.

Table 4. Outcomes and Complications.

Enteral Nutrition Therapy

Osman Mohamed Elfadil, MBBS1; Edel Keaveney, PhD2; Adele Pattinson, RDN1; Danelle Johnson, MS, RDN1; Rachael Connolly, BSc.2; Suhena Patel, MBBS1; Yash Patel, MBBS1; Ryan Hurt, MD, PhD1; Manpreet Mundi, MD1

1Mayo Clinic, Rochester, MN; 2Rockfield MD, Galway

Financial Support: Rockfield Medical Devices.

Background: Patients on home enteral nutrition (HEN), many of whom are mobile, can experience significant hardships and reduced quality of life (QoL) due to limitations on mobility, on top of burdens due to underlying disease processes. Improving mobility while feeding could reduce burdens associated with HEN and potentially improve QoL. This prospective cohort study aims to evaluate participants’ perspectives on their mobility, ease of performing physical activities while feeding, and QoL following the use of a novel enteral feeding system (EFS).

Methods: A prospective single-center study was conducted to evaluate a novel EFS, which is an FDA-cleared elastomeric system (Mobility + ®) that consists of a lightweight feeding pouch (reservoir for 500 mL feed), a filling set (used in conjunction with a syringe to fill EFS) and a feeding set to deliver EN formula to an extension set/feeding tube with an ISO 80369-3 compatible connector. Adult HEN-dependent patients were recruited by invitation to use the study EFS for a minimum of 2 feeds a day for 14 days, preceded by a familiarization period of 5-7 days. Participant perspectives on how they rated performing typical daily activities while feeding (e.g., moving, traveling, socializing) and feeding system parameters (ease of use, portability, noise, discretion, performance) were evaluated using HEN-expert validated questionnaires. A score was given for each rating from 1 to 5, with 5 being the most positive response. An overall score was calculated and averaged for the cohort. Participants were followed up during the familiarization period. On days 7 and 14, additional telephone interviews were conducted regarding compliance, enteral feed intake, participant perspectives on study EFS vs. current system, and other measures. We excluded those with reduced functional capacity due to their underlying disease(s).

Results: Seventeen participants completed the study (mean age 63.8 ± 12 years; 70.6% male). Participants used various feeding systems, including gravity, bolus method, and pump, with the majority (82.4%) having a G-tube placed (Table 1). Sixteen (94.1%) patients achieved use of the study EFS for at least two feeds a day (and the majority of daily EN calories) on all study days (Table 2). The ratings for the ability to perform various activities using the study EFS were significantly different compared to those of the systems used before the study. An improvement in ratings was noted for the ease of performing common daily activities, including moving between rooms or on stairs, taking short and long walks, traveling by car or public transport, engaging in moderate- to high-intensity activities, sleeping, and socializing with family and friends, between the time point before enrollment and the end of the study (day 14) (p < 0.0001) (Table 3). Ratings of feeding system parameters were significantly different between the systems used before the study and the study EFS (p < 0.0001) (Table 3), with the largest increases in positive ratings noted for ease of carrying, noise level, and ability to feed discreetly. Ratings for overall satisfaction with the performance of the study EFS did not differ from the ratings for the systems used before the study, with participants reporting that the main influencing factors were the length of time and the effort needed to fill the study EFS. No difference was noted in the QoL rating.

Conclusion: The studied EFS is safe and effective as an enteral feeding modality that provides an alternative option for HEN recipients. Participants reported a significant positive impact of the study EFS on their activities of daily living. Although the overall QoL rating remained the same, improvements in mobility, discretion, and ease of carrying (aspects of QoL) were associated with the use of the study EFS.

Table 1. Baseline Demographics and Clinical Characteristics.

Table 2. Safety and Effectiveness.

Table 3. Usability and Impact of the Study EFS.

Talal Sharaiha, MD1; Martin Croce, MD, FACS2; Lisa McKnight, RN, BSN MS2; Alejandra Alvarez, ACP, PMP, CPXP2

1Aspisafe Solutions Inc., Brooklyn, NY; 2Regional One Health, Memphis, TN

Financial Support: Talal Sharaiha is an executive of Aspisafe Solutions Inc. Martin Croce, Lisa McKnight and Alejandra Alvarez are employees of Regional One Health institutions. Regional One Health has a financial interest in Aspisafe Solutions Inc. through services, not cash. Aspisafe provided the products at no charge.

Background: Feeding tube securement has seen minimal innovation over the past decades, leaving medical adhesives as the standard method. However, adhesives frequently fail to maintain secure positioning, with dislodgement rates reported between 36% and 62%, averaging approximately 40%. Dislodgement can lead to adverse outcomes, including aspiration, increased risk of malnutrition, higher healthcare costs, and extended nursing care. We aimed to evaluate the safety and efficacy of a novel feeding tube securement device in preventing NG tube dislodgement compared to standard adhesive tape. The device consists of a bracket that sits on the patient's upper lip. The bracket has a non-adhesive mechanism for securing feeding tubes ranging from sizes 10 to 18 French. It extends to two cheek pads that sit on either side of the patient's cheeks and is further supported by a head strap that wraps around the patient's head (Figures 1 and 2).

Methods: We conducted a prospective, case-control trial at Regional One Health Center, Memphis, TN, comparing 50 patients using the novel securement device, the NG Guard, against 50 patients receiving standard care with adhesive tape. The primary outcome was the rate of accidental or intentional NG tube dislodgement. Secondary outcomes included the number of new NG tubes required as a result of dislodgement, and device-related complications or adhesive-related skin injuries in the control group. Statistical analyses employed Student's t-test for continuous variables and Fisher's exact test for categorical variables. Significance was set at an alpha level of 0.05. We adjusted for confounding variables, including age, sex, race, and diagnosis codes related to delirium, dementia, and confusion (Table 1).

Results: There were no significant differences between the groups in baseline characteristics, including age, sex, race, or confusion-related diagnoses (Table 2) (p ≥ 0.09). Nasogastric tube dislodgement occurred significantly more often in the adhesive tape group (31%) compared to the intervention group (11%) (p < 0.05). The novel device reduced the risk of tube dislodgement by 65%. Additionally, 12 new tubes were required in the control group compared to 3 in the intervention group (p < 0.05), translating to 18 fewer reinsertion events per 100 tubes inserted in patients secured by the novel device. No device-related complications or adhesive-related injuries were reported in either group.
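The effect sizes quoted above follow from the reported proportions; the brief arithmetic check below (Python, using only numbers stated in this abstract) illustrates how the 65% relative risk reduction and the "18 fewer reinsertions per 100 tubes" figures are obtained.

```python
# Illustrative arithmetic check of the effect sizes reported above.
dislodgement_tape = 0.31      # adhesive tape group
dislodgement_device = 0.11    # novel securement device group

rrr = (dislodgement_tape - dislodgement_device) / dislodgement_tape
print(f"relative risk reduction ~ {rrr:.0%}")             # ~65%

# New tubes required: 12 vs 3 per 50 patients, expressed per 100 tubes inserted
tubes_tape_per_100 = 12 / 50 * 100                        # 24
tubes_device_per_100 = 3 / 50 * 100                       # 6
print(tubes_tape_per_100 - tubes_device_per_100)          # 18 fewer reinsertions
```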

Conclusion: The novel securement device significantly reduced the incidence of nasogastric tube dislodgement compared to traditional adhesive tape. It is a safe and effective securement method and should be considered for use in patients with nasogastric tubes to reduce the likelihood of dislodgement and the need for reinsertion of new NG tubes.

Table 1. Diagnosis Codes Related to Dementia and Delirium.

Table 2. Baseline Demographics.

Figure 1. Novel Securement Device - Front View.

Figure 2. Novel Securement Device - Side Profile.

Best of ASPEN-Enteral Nutrition Therapy

Poster of Distinction

Alexandra Kimchy, DO1; Sophia Dahmani, BS2; Sejal Dave, RDN1; Molly Good, RDN1; Salam Sunna, RDN1; Karen Strenger, PA-C1; Eshetu Tefera, MS3; Alex Montero, MD1; Rohit Satoskar, MD1

1MedStar Georgetown University Hospital, Washington, DC; 2Georgetown University Hospital, Washington, DC; 3MedStar Health Research Institute, Columbia, MD

Financial Support: None Reported.

Background: Early nutrition intervention is of high importance in patients with cirrhosis given the faster onset of protein catabolism for gluconeogenesis compared to those without liver disease. Severe malnutrition is associated with frequent complications of cirrhosis such as infection, hepatic encephalopathy, and ascites. Furthermore, studies have demonstrated higher mortality rates in cirrhotic patients who are severely malnourished in both the pre- and post-transplant setting. The current practice guidelines encourage the use of enteral nutrition in cirrhotic patients who are unable to meet their intake requirements with an oral diet. The aim of this study was to evaluate the utilization and implications of enteral feeding in hospitalized patients with cirrhosis diagnosed with severe protein calorie malnutrition at our institution.

Methods: This was a retrospective study of patients admitted to the transplant hepatology inpatient service at MedStar Georgetown University Hospital from 2019-2023. ICD-10-CM code E43 was then used to identify patients with a diagnosis of severe protein calorie malnutrition. The diagnosis of cirrhosis and pre-transplant status were confirmed by review of the electronic medical record. Patients with the following characteristics were excluded: absence of cirrhosis, history of liver transplant, admission diagnosis of upper gastrointestinal bleed, and/or receipt of total parenteral nutrition. Wilcoxon rank sum and two-sample t-tests were used to examine differences in the averages of continuous variables between the two groups. Chi-square and Fisher exact tests were used to investigate differences for categorical variables. Statistical significance was defined as p-values ≤ 0.05.
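For readers who want to reproduce the style of comparisons described above, a minimal sketch using standard SciPy test functions is shown below; the arrays and counts are placeholders, not study data, and the test choices simply mirror the abstract's description (rank-sum or t-test for continuous variables, chi-square or Fisher exact for categorical variables).

```python
# Hedged sketch of the two-group comparisons described above, with placeholder data.
import numpy as np
from scipy.stats import ranksums, ttest_ind, chi2_contingency, fisher_exact

en_group = np.array([12.0, 9.5, 14.2, 11.1])   # e.g., length of stay (days), EN group
no_en_group = np.array([7.0, 8.2, 6.5, 9.1])   # e.g., length of stay (days), no-EN group

print(ranksums(en_group, no_en_group))          # Wilcoxon rank-sum test
print(ttest_ind(en_group, no_en_group))         # two-sample t-test

# Categorical outcome (made-up 2x2 counts): deaths vs survivors by EN status
table = np.array([[6, 25],    # EN group
                  [3, 62]])   # no-EN group
print(chi2_contingency(table)[1])               # chi-square test p-value
print(fisher_exact(table))                      # Fisher exact test
```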

Results: Of the 96 patients with cirrhosis and severe protein calorie malnutrition, 31 patients (32%) received enteral nutrition. Time from admission to initiation of enteral feeding was, on average, 7 days, with an average total duration of enteral nutrition of 10 days. In the group that received enteral nutrition, there was no significant change in weight, BMI, creatinine, total bilirubin or MELD 3.0 score from admission to discharge; however, albumin, sodium and INR levels had significantly increased (Table 1). A comparative analysis between patients with and without enteral nutrition showed a significant increase in length of stay, intensive care requirement, bacteremia, gastrointestinal bleeding, discharge MELD 3.0 score and in-hospital mortality rates among patients with enteral nutrition. There was no significant difference in rates of spontaneous bacterial peritonitis, pneumonia, admission MELD 3.0 score or post-transplant survival duration in patients with enteral nutrition compared to those without enteral nutrition (Table 2).

Conclusion: In this study, only about one-third (32%) of patients hospitalized with cirrhosis received enteral nutrition despite having a diagnosis of severe protein calorie malnutrition. Initiation of enteral nutrition was delayed by an average of one week after hospital admission. Prolonged length of stay and higher in-hospital mortality rates suggest a lack of benefit of enteral nutrition when started late in the hospital course. Based on these findings, our institution has implemented a quality improvement initiative to establish earlier enteral feeding in hospitalized patients with cirrhosis and severe protein calorie malnutrition. Future studies will evaluate the efficacy of this initiative and implications for clinical outcomes.

Table 1. The Change in Clinical End Points from Admission to Discharge Among Patients Who Received Enteral Nutrition.

Abbreviations: kg, kilograms; BMI, body mass index; INR, international normalized ratio; Na, sodium; Cr, creatinine; TB, total bilirubin; MELD, model for end stage liver disease; EN, enteral nutrition; Std, standard deviation.

Table 2. Comparative Analysis of Clinical Characteristics and Outcomes Between Patients With And Without Enteral Nutrition.

Abbreviations: MASLD, metabolic dysfunction-associated steatotic liver disease; HCV, hepatitis C virus; HBV, hepatitis B virus; AIH, autoimmune hepatitis; PSC, primary sclerosing cholangitis; PBC, primary biliary cholangitis; EN, enteral nutrition; RD, registered dietitian; MICU, medical intensive care unit; PNA, pneumonia; SBP, spontaneous bacterial peritonitis; GIB, gastrointestinal bleed; MELD, model for end stage liver disease; d, days; N, number; Std, standard deviation.

Jesse James, MS, RDN, CNSC1

1Williamson Medical Center, Franklin, TN

Financial Support: None Reported.

Background: Feeding tubes (Tubes) are used to deliver enteral nutrition to patients who are unable to safely ingest nutrients and medications orally, a population at elevated risk of malnutrition and dehydration. Unfortunately, these Tubes have a propensity for becoming clogged. Staff will attempt to unclog Tubes using standard bedside techniques, including warm water flushes or chemical enzymes. However, these practices are not only time-consuming but often unsuccessful, requiring Tube replacement. An actuated mechanical device for restoring patency in clogged small-bore Tubes was evaluated at a level 2 medical center as an alternative declogging method from September 2021 to July 2023. Study objectives were to explore the actuated mechanical device's ability to unclog indwelling Tubes and monitor any potential safety issues.

Methods: The TubeClear® System (actuated mechanical device, Actuated Medical, Inc., Bellefonte, PA, Figure 1) was developed to resolve clogs from various indwelling Tubes. N = 20 patients (Table 1), with n = 16 10Fr, 109-cm nasogastric (NG) tubes and n = 4 10Fr, 140-cm nasojejunal (NJ) tubes, underwent clearing attempts with the actuated mechanical device. Initially, patients underwent standard declogging strategies for a minimum of 30 minutes, including warm water flushes and Creon/NaHCO3 slurry. Following unsuccessful patency restoration (n = 17) or patency restoration followed by reclogging (n = 3), the actuated mechanical device was attempted. Procedure time was estimated from the electronic charting system and included setup, use, and cleaning time for the actuated mechanical device, to the nearest five minutes. All clearing procedures were completed by three trained registered dietitians.

Results: The average time to restore Tube patency (n = 20) was 26.5 minutes (25 minutes for NG, 32.5 minutes for NJ) with 90% success (Table 2), and no significant safety issues were reported by the operator or patient. User satisfaction was 100% (20/20), and patient discomfort was reported in 10% (2/20).

Conclusion: Based on the presented results, the actuated mechanical device was markedly more successful at resolving clogs compared with alternative bedside practices. Operators noted that the “Actuated mechanical device was able to work with clogs when slurries/water can't be flushed.” It was noted that using the actuated mechanical device before a full clog formed, as a prophylactic approach, “was substantially easier than waiting until the Tube fully clogged.” For a partly clogged Tube, “despite it being somewhat patent and useable, a quick pass of the actuated mechanical device essentially restored full patency and likely prevented a full clog.” For an NG patient, “no amount of flushing or medication slurry was effective, but the actuated mechanical device worked in just minutes without issue.” “Following standard interventions failure after multiple attempts, only the actuated mechanical device was able to restore Tube patency, saving money on not having to replace Tube.” For a failed clearance, the operator noted “that despite failure to restore patency, there was clearly no opportunity for flushes to achieve a better result and having this option [actuated mechanical device] was helpful to attempt to avoid tube replacement.” For an NJ patient, “there would have been no other conventional method to restore patency of a NJ small bore feeding tube without extensive x-ray exposure and "guess work," which would have been impossible for this patient who was critically ill and ventilator dependent.” Having an alternative to standard bedside unclogging techniques proved beneficial to this facility: the device was 90% effective, spared those patients a Tube replacement, and saved the facility the cost of replacement Tubes.

Table 1. Patient and Feeding Tube Demographics.

Table 2. Actuated Mechanical Device Uses.

Figure 1. Actuated Mechanical Device for Clearing Partial and Fully Clogged Indwelling Feeding Tubes.

Vicki Emch, MS, RD1; Dani Foster2; Holly Walsworth, RD3

1Aveanna Medical Solutions, Lakewood, CO; 2Aveanna Medical Solutions, Chandler, AZ; 3Aveanna Medical Solutions, Erie, CO

Financial Support: None Reported.

Background: Homecare providers have managed through multiple formula backorders since the pandemic. Through creative problem-solving, clinicians have successfully been able to offer substitutions. However, when pump feeding sets are on backorder, the options are limited; feeding sets are specific to the brand of pump. Recently, a backorder of feeding sets used for pumps common in acute and home care resulted in a severe shortage in the home care supply chain. This required fast action to ensure patients were able to continue administering their tube feeding and to prevent readmissions. A solution is to change the patient to a pump brand that is not on backorder. Normally, transitioning a patient to a different brand of pump would require in-person teaching. Due to the urgency of the situation, a more efficient method needed to be established. A regional home care provider determined that 20% of patients using enteral feeding pumps were using backordered sets, and 50% were pediatric patients, who tend not to tolerate other methods of feeding. In response, this home care provider assembled a team to create a new educational model for pump training. The team was composed of Registered Nurses, Registered Dietitians, and patient care and distribution representatives. The hypothesis was that providing high-quality educational material with instructional videos, detailed communication on the issue, and telephonic clinical support would allow for a successful transition.

Methods: To determine urgency of transition, we prioritized patients with a diagnosis of short bowel syndrome, gastroparesis, glycogen storage disease, or vent dependency; those < 2 years of age; and those living in a rural area with a 2-day shipping zip code, and we conducted a clinical review to identify patients with a jejunal feeding tube (see Table 1). A pump conversion team contacted patients/caregivers to review the situation, discuss options for nutrition delivery, determine current inventory of sets, assess urgency for transition, and coordinate delivery of the pump, sets, and educational material. Weekly reporting tracked the number of patients using the impacted pump, transitioned patients, and those requesting to transition back to their original pump.

Results: A total of 2111 patients were using the feeding pump with backordered sets, and 50% of these patients were under 12 years of age. Over a period of 3 months, 1435 patients (68% of this patient population) were successfully transitioned to a different brand of pump, and of those, only 7 patients (0.5%) requested to return to their original pump even though they understood the risk of potentially running short on feeding sets (see Figure 1).

Conclusion: A team approach which included proactively communicating with patients/caregivers, prioritizing patient risk level, providing high-quality educational material with video links and outbound calls from a clinician resulted in a successful transition to a new brand of feeding pump.

Table 1. Patient Priority Levels for Pump with Backordered Sets.

Figure 1. Number of Pump Conversions.

Desiree Barrientos, DNP, MSN, RN, LEC1

1Coram CVS, Chino, CA

Financial Support: None Reported.

Background: Follow-up care for the enteral nutrition therapy community is essential for good outcomes. No data had been collected regarding home enteral nutrition (HEN) outcomes at a major university medical center. There was no robust program in place to follow up with patients who were discharged on tube feedings. Consequently, there was little information regarding the follow-up care or patient outcomes related to current practice, complications, re-hospitalizations, and equipment issues for this population.

Methods: The tools utilized were the questionnaire for the 48-hour and 30-day post-discharge outreach calls, pre-discharge handouts, and feeding pump handouts.

Results: Education: comparison of 48 hours and 30 days. Q1: Can you tell me why you were hospitalized? Q2: Did a provider contact you for education prior to discharging home? Q3: Do you understand your nutrition orders from your doctor? Q4: Can you tell me the steps of how to keep your PEG/TF site clean? Q5: Can you tell me how much water to flush your tube? There was an improvement in patient understanding, self-monitoring, and navigation between the two survey timepoints. Regarding patient education, understanding of nutrition orders (Q3) improved from 91% to 100%, knowledge of the steps to keeping the tube feeding site clean (Q4) improved from 78% to 96%, and knowledge of the water to flush before and after each feeding (Q5) improved from 81% to 100% at the 48-hour and 30-day timepoints, respectively.

Conclusion: There was an improvement in patient understanding, self-monitoring, and navigation between the two survey timepoints. Verbal responses to open-ended and informational questions were aggregated to analyze complications, care gaps, and service failures.

Table 1. Questionnaire Responses At 48 Hours and 30 Days.

Table 2. Questionnaire Responses At 48 Hours and 30 Days.

Figure 1. Education: Comparison at 48-hours and 30-days.

Figure 2. Self-monitoring and Navigation: Comparison at 48-hours and 30-days.

Rachel Ludke, MS, RD, CD, CNSC, CCTD1; Cayla Marshall, RD, CD2

1Froedtert Memorial Lutheran Hospital, Waukesha, WI; 2Froedtert Memorial Lutheran Hospital, Big Bend, WI

Financial Support: None Reported.

Background: Initiation of early enteral nutrition plays an essential role in improving patient outcomes1. Historically, feeding tubes have been placed by nurses, doctors, and advanced practice providers. Over the past two decades, the prevalence of dietitians (RDNs) placing feeding tubes at the bedside has grown. This practice has been endorsed by the Academy of Nutrition and Dietetics and the American Society for Parenteral and Enteral Nutrition through modifications to the Scope of Practice for the Registered Dietitian Nutritionist and Standards of Professional Performance for Registered Dietitian Nutritionists.2,3 Bedside feeding tube placement by RDNs has the potential to decrease nursing, fluoroscopy, and internal transport time, which is of interest to our hospital. In the fall of 2023, we launched a pilot to evaluate the feasibility of an RDN-led bedside tube placement team at our 800-bed level 1 trauma center.

Methods: RDNs first worked closely with nursing leadership to create a tube placement workflow to identify appropriate patients, outline the communication needed with the bedside nurse and provider, and establish troubleshooting solutions (Figure 1). Intensive care unit nursing staff then trained RDNs in tube placement using camera-guided technology (IRIS) and deemed RDNs competent after 10 successful tube placements. Given the limited literature on RDN-led tube placement, we defined success as >80% of tube placements in an appropriate position within the gastrointestinal tract.

Results: To date, the pilot includes 57 patients. Forty-six tubes (80.7%; 39 gastric and 7 post-pyloric) were placed successfully, as confirmed by KUB. Of note, 2 of the 39 gastric tubes were originally ordered to be placed post-pylorically; however, gastric tubes were deemed acceptable for these patients after issues were identified during placement. Eleven (19%) tube placements were unsuccessful due to behavioral issues, blockages in the nasal cavity, and anatomical abnormalities.

Conclusion: This pilot demonstrated that well-trained RDNs can successfully place feeding tubes at the bedside using camera-guided tube placement technology. One limitation of this pilot is the small sample size. We initially limited the pilot to 2 hospital floors and had trouble educating nursing staff on the availability of the RDN to place tubes. Through evaluation of tube placement orders, we found the floors placed 198 tubes during our pilot, indicating 141 missed opportunities for RDNs to place tubes. To address these issues, we created numerous educational bulletins and worked with unit nursing staff to encourage contacting the RDN when feeding tube placement was needed. We also expanded the pilot hospital-wide and are looking into the time periods when tubes are most often placed. Anecdotally, bedside feeding tube placement takes 30 to 60 minutes; therefore, this pilot saved 1350 to 2700 minutes of nursing time and 180 to 360 minutes of fluoroscopy time that would have been necessary to place post-pyloric tubes. Overall, our pilot has demonstrated the feasibility of RDN-led bedside feeding tube placement, allowing staff RDNs to practice at the top of their scope and promoting effective use of hospital resources.

Figure 1. Dietitian Feeding Tube Insertion Pilot: 2NT and 9NT.

Lauren Murch, MSc, RD1; Janet Madill, PhD, RD, FDC2; Cindy Steel, MSc, RD3

1Nestle Health Science, Cambridge, ON; 2Brescia School of Food and Nutritional Sciences, Faculty of Health Sciences, Western University, London, ON; 3Nestle Health Science, Hamilton, ON

Financial Support: Nestle Health Science.

Background: Continuing education (CE) is a component of professional development which serves two functions: maintaining practice competencies, and translating new knowledge into practice. Understanding registered dietitian (RD) participation and perceptions of CE facilitates creation of more effective CE activities to enhance knowledge acquisition and practice change. This preliminary analysis of a practice survey describes RD participation in CE and evaluates barriers to CE participation.

Methods: This was a cross-sectional survey of clinical RDs working across care settings in Canada. Targeted participants (n = 4667), identified using a convenience sample and in accordance with applicable law, were invited to complete a 25-question online survey between November 2023 and February 2024. Descriptive statistics and frequencies were reported.

Results: Nationally, 428 RDs working in acute care, long-term care, and home care fully or partially completed the survey (9.1% response rate). Respondents indicated the median ideal number of CE activities per year was 3 in-person activities, 5 reviews of written materials, and 6 virtual activities. However, participation was most frequently reported as “less often than ideal” for in-person activities (74.7% of respondents) and written material (53.6%) and “as often as ideal” for virtual activities (50.7%). Pre-recorded video presentations, live virtual presentations, and critically reviewing written materials were the most common types of CE that RDs had participated in at least once in the preceding 12 months. In-person hands-on sessions, multimodal education, and simulations were the least common types of CE that RDs had encountered in the preceding 12 months (Figure 1). The most frequent barriers to participation in CE were cost (68% of respondents), scheduling (60%), and location (51%). However, the encountered barriers that most greatly limited participation in CE were inadequate staff coverage, lack of dedicated education days within the role, and lack of dedicated time during work hours (Table 1). When deciding to participate in CE, RDs ranked the most important aspects of the content as 1) from a credible source, 2) a specific/narrow topic relevant to practice, and 3) enabling use of practical tools/skills at the bedside.

Conclusion: These data suggest there is opportunity for improvement in RD CE participation, with the greatest gaps in in-person and written activities. Although RDs recognize the importance of relevant and practical content, they reported infrequent exposure to the types of CE that are well suited to this, such as simulations and hands-on activities. When planning and executing CE, content should be credible, relevant, and practical, using a format that is both accessible and impactful. Results of this study help benchmark Canadian RD participation in CE and provide evidence to address barriers and support optimal participation.

Table 1. Frequent and Impactful Barriers Limiting Participation in CE Activities.

Note: 1. Percentages total greater than 100% because all respondents selected the 3 most important barriers impacting their participation in CE activities. 2. Items are ranked based on a weighted score calculated from a 5-point Likert scale, indicating the extent to which the barrier was perceived to have limited participation in CE activities.

Figure 1. Types of Continuing Education Activities Dietitians Participated In At Least Once, In The Preceding 12-Months.

Karen Sudders, MS, RDN, LDN1; Alyssa Carlson, RD, CSO, LDN, CNSC2; Jessica Young, PharmD3; Elyse Roel, MS, RDN, LDN, CNSC2; Sophia Vainrub, PharmD, BCPS4

1Medtrition, Huntingdon Valley, PA; 2Endeavor Health/Aramark Healthcare +, Evanston, IL; 3Parkview Health, Fort Wayne, IN; 4Endeavor Health, Glenview, IL

Financial Support: None Reported.

Background: Adequate nutrition is a critical component of patient care and plays a significant role in reducing hospital length of stay (LOS). Malnutrition is associated with numerous adverse outcomes, including increased morbidity, delayed recovery, and prolonged hospitalization (Tappenden et al., 2013). Timely nutrition interventions, and strategies for integrating successful nutrition care into hospital protocols to reduce LOS are essential parts of medical nutrition therapy. Modular nutrition often plays a key role in these interventions. A study by Klek et al. (2020) emphasizes the role of modular nutrition in providing personalized nutritional support for the specific needs of critically ill patients. The study suggests that using nutrient modules allows for a more precise adjustment of nutrition based on the metabolic requirements of patients (Klek et al., 2020). Modular nutrition has also been shown to positively impact clinical outcomes in ICU patients. An observational study by Compher et al. (2019) reported that the targeted administration of protein modules to achieve higher protein intake was associated with improved clinical outcomes, such as reduced ICU LOS (Compher et al., 2019).

Methods: Administration of modular nutrition can be a challenge. Typically, modular proteins (MP) are ordered through the dietitian and dispensed as part of the diet order. The nursing team is responsible for administration and documentation of the MP. There is commonly a disconnect between MP prescription and administration. In some cases, this is related to the MP not being a tracked task in the electronic health record (EHR). The goal of this evaluation was to review data related to a quality improvement (QI) initiative in which MP (ProSource TF) was added to the medication administration record (MAR), which used a barcode scanning process to track provision and documentation of MP. The objective of this evaluation was to determine a possible correlation between the QI initiative and patients’ ICU LOS. The QI initiative evaluated a pre-implementation timeframe from June 1, 2021 to November 30, 2021, and a post-implementation timeframe from January 1, 2022 to June 30, 2022. There were a total of 1962 ICU encounters in the pre-implementation period and 1844 ICU encounters in the post-implementation period. The data was analyzed using a series of statistical tests.
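A minimal sketch of the pre/post LOS comparison reported below, assuming a pooled two-sample t-test (consistent with the reported degrees of freedom, 1962 + 1844 − 2 = 3804). The LOS values are simulated placeholders, not study data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pre_los = rng.exponential(scale=4.0, size=1962)    # hypothetical ICU LOS (days), pre-implementation
post_los = rng.exponential(scale=3.5, size=1844)   # hypothetical ICU LOS (days), post-implementation

# Pooled-variance two-sample t-test comparing mean LOS between periods
t_stat, p_value = stats.ttest_ind(pre_los, post_los, equal_var=True)
df = len(pre_los) + len(post_los) - 2
print(f"t({df}) = {t_stat:.2f}, p = {p_value:.4f}")
```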

Results: The t-test for the total sample was significant, t(3804) = 8.35, p < .001, indicating the average LOS was significantly lower post-implementation compared with pre-implementation (Table 1). This association suggests that improved provision of MP may be related to a reduced ICU LOS. In addition to LOS, we can also suggest a relationship between the MAR change and MP utilization. Pre-implementation, 1600 doses of MP were obtained, increasing to 2400 doses post-implementation. The data suggest a correlation between product use and MAR implementation even though the overall encounters post-implementation were reduced. There was a 50% increase in product utilization post-implementation compared with pre-implementation.

Conclusion: The data provided suggests the benefit for adding MP on the MAR to help improve provision, streamline documentation and potentially reduce ICU LOS.

Table 1. Comparison of LOS Between Pre and Post Total Encounters.

Table 1 displays the t-test comparison of LOS in pre vs post implementation of MP on the MAR.

Figure 1. Product Utilization and Encounters Pre vs Post Implementation of MP on the MAR.

International Poster of Distinction

Eliana Giuntini, PhD1; Ana Zanini, RD, MSc2; Hellin dos Santos, RD, MSc2; Ana Paula Celes, MBA2; Bernadette Franco, PhD3

1Food Research Center/University of São Paulo, São Paulo; 2Prodiet Medical Nutrition, Curitiba, Parana; 3Food Research Center/School of Pharmaceutical Sciences/University of São Paulo, São Paulo

Financial Support: None Reported.

Background: Critically ill patients present an increased need for protein to preserve muscle mass due to anabolic resistance. Additionally, these patients are more prone to developing hyperglycemia, and one of the nutritional strategies that can be adopted is to provide a diet with a low glycemic index. Hypercaloric and high-protein enteral formulas can help meet energy and protein goals for these patients. Due to their reduced carbohydrate content, these formulas can also contribute to lowering the postprandial glycemic response. The study aimed to evaluate the glycemic index (GI) and the glycemic load (GL) of a specialized high-protein enteral nutrition formula.

Methods: Fifteen healthy volunteers, aged between 21 and 49 years, were selected based on self-reported absence of disease or regular medication use and normal glucose tolerance according to fasting and postprandial glucose assessments over 2 hours. The individuals attended after a 10-hour fast, once per week, consuming the glucose solution (reference food) for 3 weeks and the specialized high-protein enteral formula (Prodiet Medical Nutrition) in the following week, both in amounts equivalent to 25 g of available carbohydrates. The specialized high-protein formula provides 1.5 kcal/mL, 26% protein (98 g/L), 39% carbohydrates, and 35% lipids, including EPA + DHA. Capillary blood sampling was performed at regular intervals: at 0 (before consumption), 15, 30, 45, 60, 90, and 120 minutes. The incremental area under the curve (iAUC) was calculated, excluding areas below the fasting line. Glycemic load (GL) was determined using the equation GL = [GI (glucose as reference) × grams of available carbohydrate in the portion]/100. Student's t-tests were conducted to identify differences (p < 0.05).
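The iAUC, GI, and GL calculations described above can be sketched as follows. This is an assumed implementation using one common convention for excluding area below the fasting line (clipping increments at zero before trapezoidal integration); the glucose values are made up for illustration and are not the study data.

```python
import numpy as np

times = np.array([0, 15, 30, 45, 60, 90, 120])            # sampling times, minutes
test_curve = np.array([88, 95, 102, 102, 98, 92, 89])      # mg/dL, hypothetical formula curve
ref_curve = np.array([88, 112, 126, 118, 108, 95, 90])     # mg/dL, hypothetical glucose curve

def iauc(t, glucose):
    """Incremental AUC above the fasting value; areas below the baseline are excluded."""
    increments = np.clip(glucose - glucose[0], 0, None)
    return np.trapz(increments, t)                          # mg/dL x min

gi = 100 * iauc(times, test_curve) / iauc(times, ref_curve)  # glycemic index (glucose = 100)
available_carb_g = 25                                         # g of available carbohydrate tested
gl = gi * available_carb_g / 100                              # glycemic load, per the stated equation
print(f"GI = {gi:.0f}, GL = {gl:.1f}")
```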

Results: To consume 25 g of available carbohydrates, the individuals ingested 140 g of the high-protein formula. The high-protein enteral formula showed a low GI (GI = 23), with a significant difference compared to glucose (p < 0.0001), and a low GL (GL = 8.2). The glycemic curve data showed significant differences at all time points between glucose and the specialized high-protein formula, except at T90, with the glycemic peak occurring at T30 for glucose (126 mg/dL) and at both T30 and T45 for the specialized high-protein enteral formula, with values significantly lower than glucose (102 vs 126 mg/dL). The iAUC was smaller for the specialized high-protein formula compared to glucose (538 ± 91 vs 2061 ± 174 mg/dL x min) (p < 0.0001), exhibiting a curve without a high peak, as typically observed in foods with a low glycemic index.

Conclusion: The specialized high-protein enteral nutrition formula showed a low GI and GL, resulting in a significantly reduced postprandial glycemic response, with lower glucose elevation and variation. This may reduce insulin requirements and glycemic variability.

Figure 1. Mean Glycemic Response of Volunteers (N = 15) to 25 g of Available Carbohydrates After Consumption of Reference Food and a Specialized High-Protein Enteral Nutrition Formula, Over 120 Minutes.

Lisa Epp, RDN, LD, CNSC, FASPEN1; Bethaney Wescott, APRN, CNP, MS2; Manpreet Mundi, MD2; Ryan Hurt, MD, PhD2

1Mayo Clinic Rochester, Rochester, MN; 2Mayo Clinic, Rochester, MN

Financial Support: None Reported.

Background: Hypnotherapy is the use of hypnosis for the treatment of a medical or psychological disorder. Specifically, gut directed hypnotherapy is a treatment option for functional gastrointestinal disorders and disorders of the gut brain axis. It has been shown to be effective in management of GI symptoms such as abdominal pain, nausea, functional dyspepsia and irritable bowel syndrome symptoms. Evidence suggests that 6%–19% of patients with these GI symptoms exhibit characteristics of Avoidant/restrictive food intake disorder (ARFID). Multiple studies show improvement in GI symptoms and ability to maintain that improvement after 1 year. However, there is a paucity of data regarding use of hypnotherapy in home enteral nutrition patients.

Methods: We present a case report of a 67-year-old female with a history of irritable bowel syndrome (diarrhea predominant) and new mucinous appendiceal cancer, status post debulking of an abdominal tumor including colostomy and distal gastrectomy. She was on parenteral nutrition (PN) for 1 month postoperatively due to delayed return of bowel function before her oral diet was advanced. Unfortunately, she had difficulty weaning from PN as she was “scared to start eating” due to functional dysphagia with gagging at the sight of food, even on TV. After 4 weeks of PN, a nasojejunal feeding tube was placed and she was dismissed home.

Results: At a multidisciplinary outpatient nutrition clinic visit, the patient was dependent on enteral nutrition and reported an inability to tolerate oral intake for unclear reasons. Long-term enteral access was discussed; however, the patient wished to avoid this and asked for alternative interventions she could try to help her eat. She was referred for gut directed hypnotherapy. After 4 in-person hypnotherapy sessions over 3 weeks, the patient was able to tolerate increasing amounts of oral intake and remove her nasojejunal feeding tube. Upon follow-up 4 months later, she was still eating well and continued to praise the outcome she received from gut directed hypnotherapy.

Conclusion: Patient-centered treatments for gut-brain axis disorders and disordered eating behaviors and/or eating disorders are important to consider in addition to nutrition support. These include, but are not limited to, Cognitive Behavioral Therapy, mindfulness interventions, acupuncture, biofeedback strategies, and gut directed hypnotherapy. Group, online, and therapist-directed therapies could be considered as treatment avenues depending on patient needs and preferences. Additional research is needed to better delineate the impact of these treatment modalities in the home enteral nutrition population.

Allison Krall, MS, RD, LD, CNSC1; Cassie Fackler, RD, LD, CNSC1; Gretchen Murray, RD, LD, CNSC1; Amy Patton, MHI, RD, CNSC, LSSGB2

1The Ohio State University Wexner Medical Center, Columbus, OH; 2The Ohio State University Wexner Medical Center, Westerville, OH

Financial Support: None Reported.

Background: It is well documented that unnecessary hospital admissions can have a negative impact on patients' physical and emotional wellbeing and can increase healthcare costs.1 Numerous strategies exist to limit unnecessary hospital admissions; one innovative strategy being utilized at our 1000+ bed Academic Medical Center involves Registered Dietitians (RDs). Literature demonstrates that feeding tube placement by a dedicated team using electromagnetic tracking improves patient morbidity and mortality and is a cost-effective solution for this procedure.2 RDs have been part of feeding tube teams for many years, though exact numbers of RD-only teams are unclear.3 The Revised 2021 Standards of Practice and Standards of Professional Performance for RDNs (Competent, Proficient, and Expert) in Nutrition Support identifies that dietitians at the “expert” level strive for additional experience and training, and thus may serve a vital role in feeding tube placement teams.4 Prior to the implementation of the RD-led tube team, there was no uniform process at our hospital for these patients to obtain enteral access in a timely manner.

Methods: In December 2023, an “RD tube team” consult and order set went live within the electronic medical record at our hospital. The original intent of the tube team was to cover the inpatient units, but it soon became apparent that there were opportunities to extend this service to observation areas and the Emergency Department (ED). This case series abstract outlines case studies from three patients with various clinical backgrounds and how the tube team was able to prevent an inpatient admission. Patient 1: An 81-year-old female who returned to the ED on POD #4 s/p esophageal repair with a dislodged nasoenteric feeding tube. The RD tube team was consulted and was able to replace her tube and bridle it in place. The patient discharged from the ED without requiring hospital readmission. Patient 2: An 81-year-old male with a history of ENT cancer who transferred to our ED after an outside hospital ED had no trained staff available to replace his dislodged nasoenteric feeding tube. The RD tube team replaced his tube and bridled it into place. The patient was able to discharge from the ED without admission. Patient 3: A 31-year-old female with a complex GI history and multiple prolonged hospitalizations due to PO intolerance. The patient returned to the ED 2 days post discharge with a clogged nasoenteric feeding tube. The tube was unable to be unclogged, so the RD tube team replaced the tube in the ED and prevented readmission.

Results: Consult volumes validated the need for a tube team service. In the first 8 months of consult and order set implementation, a total of 403 tubes were placed by the RD team. Of those, 24 (6%) were placed in the ED and observation units. In the 3 patient cases described above, and in numerous other patient cases, the RD was able to successfully place a tube using an electromagnetic placement device (EMPD), thereby preventing a patient admission.

Conclusion: Creating a feeding tube team can be a complex process to navigate and requires support from senior hospital administration, physician champions, nursing teams and legal/risk management teams. Within the first year of implementation, our hospital system was able to demonstrate that RD led tube teams have the potential to not only help with establishing safe enteral access for patients, but also can be an asset to the medical facility by preventing admissions and readmissions.

Table 1. RD Tube Team Consults (December 11, 2023-August 31, 2024).

Arina Cazac, RD1; Joanne Matthews, RD2; Kirsten Willemsen, RD3; Paisley Steele, RD4; Savannah Zantingh, RD5; Sylvia Rinaldi, RD, PhD2

1Internal Equilibrium, King City, ON; 2London Health Sciences Centre, London, ON; 3NutritionRx, London, ON; 4Vanier Children's Mental Wellness, London, ON; 5Listowel-Wingham and Area Family Health Team, Wingham, ON

Financial Support: None Reported.

Background: Parkinson's disease is the second most prevalent neurodegenerative disease, and dysphagia is a predominant disease-related symptom. Dysphagia can increase the risk of aspiration, the inhaling of food or liquids into the lungs, potentially instigating the onset of pneumonia, a frequent cause of death in patients with Parkinson's disease. Therefore, feeding tubes are placed to deliver nutrition into the stomach or the small intestine to maintain appropriate nutrition delivery and reduce the risk of aspiration from oral intake. To the best of our knowledge, there is no research comparing the differences in outcomes between gastric (G) and jejunal (J) tube feeding in patients with Parkinson's disease-related dysphagia; however, limited research does exist in critically ill populations comparing these two modalities. The purpose of this study was to compare the differences in hospital readmissions related to aspiration events and in mortality rates after placement of a gastric or jejunal feeding tube in patients with Parkinson's disease-related dysphagia.

Methods: This was a retrospective chart review of patients admitted to either the medicine or clinical neurosciences units at University Hospital in London, Ontario, Canada, between January 1, 2010, and December 31, 2022. Patients were included if they had a documented diagnosis of Parkinson's or associated diseases and had a permanent feeding tube placed during hospital admission that was used for enteral nutrition. Patients were excluded from the study if they had a comorbidity that would affect their survival, such as cancer, or if a feeding tube was placed for a reason unrelated to Parkinson's disease-related dysphagia, for example, feeding tube placement post-stroke. A p-value < 0.05 was considered statistically significant.

Results: Twenty-five participants were included in this study; 7 had gastric feeding tubes, and 18 had jejunal feeding tubes. Demographic data are shown in Table 1. No statistically significant differences were found in demographic variables between the G- and J-tube groups. Of interest, none of the 28% of participants who had dementia were discharged home; 5 were discharged to long-term care, 1 was discharged to a complex continuing care facility, and 1 died in hospital. Differences in readmission rates and mortality between groups did not reach significance, likely due to our small sample size in the G-tube group (Figures 1 and 2). However, we found that 50% of participants were known to have died within 1 year of initiating enteral nutrition via their permanent feeding tube, and there was a trend toward higher readmission rates in the G-tube group.

Conclusion: While this study did not yield statistically significant results, it highlights the need for further research with a larger sample size to assess confounding factors, such as concurrent oral intake, that affect the difference in outcomes between G- and J-tube groups. Future research would also benefit from examining the influence on quality of life in these patients. Additional research is necessary to inform clinical practice guidelines and clinical decision-making for clinicians, patients, and families when considering a permanent feeding tube.

Table 1. Participant Demographics.

Readmission rates were calculated as a percentage of the number of readmissions to the number of discharges from hospital. If a participant was readmitted more than once within the defined timeframes, subsequent readmissions were counted as a new readmission and new discharge event. Readmission rate calculations did not include participants who died during or after the defined timeframes. Differences in readmission rates between gastric and jejunal feeding tube groups did not reach statistical significance.

Figure 1. Readmission Rate.

Mortality rates were calculated from the time that enteral nutrition was initiated through a permanent feeding tube in 30-day, 60-day, 90-day, and 1-year time intervals. Differences in mortality rates between gastric and jejunal feeding tube groups did not reach statistical significance.

Figure 2. Mortality Rate.

Jennifer Carter, MHA, RD1

1Winchester Medical Center, Valley Health, Winchester, VA

Financial Support: None Reported.

Background: Early enteral nutrition has been shown to improve patient outcomes and can decrease, or attenuate, the progression of malnutrition. Placement of nasoenteric feeding tubes was deemed within the scope of practice (SOP) for Registered Dietitian Nutritionists (RDNs) in 2007, with recent revisions in 2021, by the Academy of Nutrition and Dietetics (AND) and the American Society for Parenteral and Enteral Nutrition (ASPEN). With enhanced order-writing privileges, RDNs are knowledgeable about and aware of patients in need of enteral nutrition recommendations. The goal of this abstract is to highlight the efficiency of an RDN-led nasoenteric tube placement team.

Methods: A retrospective chart review of the first 70 patients who received a nasoenteric tube placed by a RDN in 2023 was conducted. Data points collected include time of tube order to tube placement and time of tube order to enteral nutrition order.

Results: Out of 70 tubes placed, the average time from tube order to tube placement was 2.28 hours. The longest time from tube order to placement was 19 hours. The average time from tube order to enteral nutrition order was 5.58 hours. The longest time from tube order to enteral nutrition order was 23.6 hours.

Conclusion: This retrospective review reflects the timeliness of placement and provision of enteral nutrition in the acute care setting when performed by an all-RDN team. Overall, placement occurred, on average, within less than 2.5 hours of the tube placement order, and enteral nutrition orders were entered within less than 6 hours of the tube placement order. The RDNs at Winchester Medical Center have been placing nasoenteric feeding tubes since 2013 using an electromagnetic tube placement device (EMPD). As of 2018, it became an all-RDN team. With the enhanced support from AND and ASPEN, nasoenteric feeding tube placement continues to be promoted within the SOP for RDNs. Also, the Accreditation Council for Education in Nutrition and Dietetics (ACEND) now requires dietetic interns to learn, observe, and even assist with nasoenteric tube placements. Over time, more RDNs in the acute care setting will likely advance their skill set to include this expertise.

Figure 1. Time From MD Order to Tube Placement in Hours.

Figure 2. Time From MD Order of Tube to Tube Feed Order in Hours.

Poster of Distinction

Vanessa Millovich, DCN, MS, RDN, CNSC1; Susan Ray, MS, RD, CNSC, CDCES2; Robert McMahon, PhD3; Christina Valentine, MD, RDN, FAAP, FASPEN4

1Kate Farms, Hemet, CA; 2Kate Farms, Temecula, CA; 3Seven Hills Strategies, Columbus, OH; 4Kate Farms, Cincinnati, OH

Financial Support: Kate Farms provided all financial support.

Background: Whole food plant-based diets have demonstrated metabolic benefits across many populations. The resulting increased intake of dietary fiber and phytonutrients is integral to the success of this dietary pattern due to the positive effect on the digestive system. Patients dependent on tube feeding may not receive dietary fiber or sources of phytonutrients, and the impact of this is unknown. Evidence suggests that the pathways that promote digestive health include more than traditional prebiotic sources from carbohydrate fermentation. Data on protein fermentation metabolites and potential adverse effects on colon epithelial cell integrity are emerging. These lesser-known metabolites, branched-chain fatty acids (BCFAs), are produced through proteolytic fermentation. Emerging research suggests that the overproduction of BCFAs via protein fermentation may be associated with toxic by-products like p-cresol. These resulting by-products may play a role in digestive disease pathogenesis. Enteral formulas are often used to support the nutritional needs of those with digestive conditions. Plant-based formulations made with yellow pea protein have been reported to improve GI tolerance symptoms. However, the underlying mechanisms responsible have yet to be investigated. The purpose of this study was to assess the impact of a mixed food matrix enteral formula containing pea protein, fiber, and phytonutrients on various markers of gut health in healthy children and adults, using an in-vitro model.

Methods: Stool samples of 10 healthy pediatric and 10 adult donors were collected and stored at -80°C. The yellow pea protein formulas (Kate Farms™ Pediatric Standard 1.2 Vanilla-P1, Pediatric Peptide 1.0 Vanilla-P2, and Standard 1.4 Plain-P3) were first predigested using standardized intestinal processing to simulate movement along the digestive tract. The in-vitro model was ProDigest's Colon-on-a-Plate (CoaP®) simulation platform, which has demonstrated in vivo–in vitro correlation. Measurements of microbial metabolic activity included pH, gas production, SCFAs, BCFAs, ammonia, and microbiota shifts. Paired two-sided t-tests were performed to evaluate differences between treatment and control. Differential abundance analysis was performed using LEfSe and treeclimbR. Statistical significance, as compared with the negative control, was indicated by a p-value of < 0.05.

Results: In the pediatric group, the microbial analysis showed significant enrichment of Bifidobacteria as well as the butyrate-producing genera Agathobacter and Agathobaculum with the use of the pediatric formulas when compared to the control. P1 resulted in a statistically significant reduction of BCFA production (p ≤ 0.05). P1 and P2 resulted in statistically significant increases in acetate and propionate. In the adult group, with treatment using P3, microbial analysis showed significant enrichment of Bifidobacteria compared to the control group. P3 also resulted in a reduction of BCFAs, although not statistically significant. Gas production and the drop in pH were statistically significant (p ≤ 0.05) for all groups P1, P2, and P3 compared to control, indicating microbial activity.

Conclusion: All enteral formulas demonstrated a consistent prebiotic effect on the gut microbial community composition in healthy pediatric and adult donors. These findings provide insight into the mechanisms related to digestive health and highlight the importance of designing prospective interventional research to better understand the role of fiber and phytonutrients within enteral products.

Hill Johnson, MEng1; Shanshan Chen, PhD2; Garrett Marin3

1Luminoah Inc, Charlottesville, VA; 2Virginia Commonwealth University, Richmond, VA; 3Luminoah Inc, San Diego, CA

Financial Support: Research was conducted with the support of VBHRC's VA Catalyst Grant Funding.

Background: Medical devices designed for home use must prioritize user safety, ease of operation, and reliability, especially in critical activities such as enteral feeding. This study aimed to validate the usability, safety, and overall user satisfaction of a novel enteral nutrition system through summative testing and task analysis.

Methods: A simulation-based, human factors summative study was conducted with 36 participants, including both caregivers and direct users of enteral feeding technology. Participants were recruited across three major cities: Houston, Chicago, and Phoenix. Task analysis focused on critical and non-critical functions of the Luminoah FLOW™ Enteral Nutrition System, while user satisfaction was measured using the System Usability Scale (SUS). The study evaluated successful task completion, potential use errors, and qualitative user feedback.

Results: All critical tasks were completed successfully by 100% of users, with the exception of a cleaning task, which had an 89% success rate. Non-critical tasks reached an overall completion rate of 95.7%, demonstrating the ease of use and intuitive design of the system. The SUS score was exceptionally high, with an average score of 91.5, indicating strong user preference for the device over current alternatives. Furthermore, 91% of participants indicated they would choose the new system over other products in the market.

Conclusion: The innovative portable enteral nutrition system demonstrated excellent usability and safety, meeting the design requirements for its intended user population. High completion rates for critical tasks and an overwhelmingly positive SUS score underscore the system's ease of use and desirability. These findings suggest that the system is a superior option for home enteral feeding, providing both safety and efficiency in real-world scenarios. Further refinements in instructional materials may improve user performance on non-critical tasks.

Elease Tewalt1

1Phoenix Veterans Affairs Administration, Phoenix, AZ

Financial Support: None Reported.

Background: Enhanced Recovery After Surgery (ERAS) protocols, including preoperative carbohydrate loading, aim to accelerate recovery by reducing the stress response and insulin resistance. These protocols have been shown to decrease hospital stays, postoperative complications, and healthcare costs. However, there is limited knowledge about the safety and efficacy of ERAS for diabetic patients. Patients with diabetes make up 15% of surgical cases and often have longer hospital stays and more postoperative complications. This study evaluated outcome measures important to patients with diabetes in a non-diabetic population in order to lay the groundwork for future trials that could include diabetic patients in ERAS protocols.

Methods: A retrospective chart review at the Phoenix Veterans Affairs Health Care System compared 24 colorectal surgery patients who received preoperative carbohydrate drinks with 24 who received traditional care. Outcomes assessed included blood glucose (BG) levels, aspiration, and postoperative complications. Additional analyses evaluated adherence, length of hospital stay, and healthcare costs.

Results: The demographics of the two groups were comparable (Table 1). The preoperative BG levels of the carbohydrate loading group were similar (164.6 ± 36.3 mg/dL) to the control group (151.8 ± 47.7 mg/dL) (p > 0.05) (Figure 1). The carbohydrate loading group demonstrated lower and more stable postoperative BG levels (139.4 ± 37.5 mg/dL) compared to the control group (157.6 ± 61.9 mg/dL), but this difference was not statistically significant (p > 0.05) (Figure 2). There were no significant differences in aspiration or vomiting between the groups (p > 0.05) (Table 2). The carbohydrate loading group had a shorter average hospital stay by one day, but this difference was not statistically significant (p > 0.05) (Table 2).

Conclusion: Carbohydrate loading as part of ERAS protocols was associated with better postoperative glucose control, no increased risk of complications, and reduced hospital stays. Although diabetic patients were not included in this study, these findings suggest that carbohydrate loading is a safe and effective component of ERAS. Including diabetic patients in ERAS is a logical next step that could significantly improve surgical outcomes for this population. Future research should focus on incorporating diabetic patients to assess the impact of carbohydrate loading on postoperative glucose control, complication rates, length of stay, and healthcare costs.

Table 1. Demographics.

The table includes the demographics of the two groups. The study group consists of participants who received the carbohydrate drink, while the control group includes those who received traditional care.

Table 2. Postoperative Outcomes.

The table includes the postoperative outcomes of the two groups. The study group consists of participants who received the carbohydrate drink, while the control group includes those who received traditional care.

The figure shows the preoperative BG levels of the carbohydrate and non-carbohydrate groups, along with a trendline for each group (p > 0.05).

Figure 1. Preoperative BG Levels.

The figure shows the postoperative BG levels of the carbohydrate and non-carbohydrate groups, along with a trendline for each group (p > 0.05).

Figure 2. Postoperative BG Levels.

Malnutrition and Nutrition Assessment

Amy Patton, MHI, RD, CNSC, LSSGB1; Elisabeth Schnicke, RD, LD, CNSC2; Sarah Holland, MSc, RD, LD, CNSC3; Cassie Fackler, RD, LD, CNSC2; Holly Estes-Doetsch, MS, RDN, LD4; Kristen Roberts, PhD, RDN, LD, CNSC, FASPEN, FAND5; Christopher Taylor, PhD, RDN4

1The Ohio State University Wexner Medical Center, Westerville, OH; 2The Ohio State University Wexner Medical Center, Columbus, OH; 3The Ohio State University Wexner Medical Center, Upper Arlington, OH; 4The Ohio State University, Columbus, OH; 5The Ohio State University, Granville, OH

Financial Support: None Reported.

Background: The unfavorable association of malnutrition with hospital outcomes such as longer length of stay (LOS), increased falls, and increased hospital readmissions has been well documented in the literature. We aimed to determine whether a different model of care that lowered Registered Dietitian (RD)-to-patient ratios would lead to increased malnutrition identification and documentation within our facility. We also evaluated the relationship between these metrics and LOS on monitored hospital units.

Methods: In July 2022, two additional RDs were hired at a large 1000+ bed academic medical center as part of a pilot program focused on malnutrition identification, documentation integrity, and staff training. The RD to patient ratio was reduced from 1:75 to 1:47 on three units of the hospital. On the pilot units, the RD completed a full nutrition assessment, including a Nutrition Focused Physical Exam (NFPE), for all patients who were identified as "at risk" per hospital nutrition screening policy. Those patients that were not identified as “at risk” received a full RD assessment with NFPE by day 7 of admission or per consult request. A malnutrition dashboard was created with assistance from a Quality Data Manager from the Quality and Operations team. This visual graphic allowed us to track and monitor RD malnutrition identification rates by unit and the percentage of patients that had a malnutrition diagnosis captured by the billing and coding team. Data was also pulled from the Electronic Medical Record (EMR) to look at other patient outcomes. In a retrospective analysis we compared the new model of care to the standard model on one of these units.

Results: There was an increase in the RD-identified capture rate of malnutrition on the pilot units. On a cardiac care unit, the RD identification rate went from a baseline of 6% in Fiscal Year (FY) 2022 to an average of 12.5% over FY 2023-2024. On two general medicine units, the malnutrition rates identified by the RD nearly doubled during the two-year intervention (Table 1). LOS was significantly lower on one of the general medicine intervention floors compared with a control unit (p < 0.001, Cohen's d: 13.8) (Table 2). LOS was reduced on all units analyzed between FY22 and FY23/24. Patients with a malnutrition diagnosis had a 15% reduction in LOS from FY22 to FY23/24 on the control unit, compared with a 19% reduction in LOS for those identified with malnutrition on the intervention unit. When comparing intervention versus control units for FY23 and FY24 combined, the intervention unit had a much lower LOS than the control unit.
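For reference, the effect size reported above is conventionally computed with the pooled-standard-deviation form of Cohen's d; the abstract does not state which variant was used, so the definition below is an assumption.

```latex
% Conventional pooled-SD definition of Cohen's d (assumed; not stated in the abstract)
d = \frac{\bar{x}_{\mathrm{intervention}} - \bar{x}_{\mathrm{control}}}{s_p},
\qquad
s_p = \sqrt{\frac{(n_1 - 1)\,s_1^{2} + (n_2 - 1)\,s_2^{2}}{n_1 + n_2 - 2}}
```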

Conclusion: Dietitian assessments and related interventions may contribute to reducing LOS. Reducing RD-to-patient ratios may allow for greater identification of malnutrition and support patient outcomes such as LOS. There is an opportunity to evaluate other patient outcomes for the pilot units, including falls, readmission rates, and Case Mix Index.

Table 1. RD Identified Malnutrition Rates on Two General Medicine Pilot Units.

Table 2. Control Unit and Intervention Unit Length of Stay Comparison.

Amy Patton, MHI, RD, CNSC, LSSGB1; Misty McGiffin, DTR2

1The Ohio State University Wexner Medical Center, Westerville, OH; 2The Ohio State University Wexner Medical Center, Columbus, OH

Financial Support: None Reported.

Background: Delays in identifying patients at nutrition risk can impact patient outcomes. Per facility policy, Dietetic Technicians (DTs) and Registered Dietitians (RDs) review patient charts, meet with patients, and assign a nutrition risk level. High-risk patients are then assessed by the RD for malnutrition, among other nutrition concerns. Based on a new tracking process implemented in January 2023, an average of 501 patient nutrition risk assignments were overdue or incomplete per month from January through April of 2023, with a dramatic increase to 835 in May. Missed risk assignments occur daily. If the risk assignment is not completed within the 72-hour parameter of the policy, this can result in late or missed RD assessment opportunities and policy compliance concerns.

Methods: In June 2023, a Lean Six Sigma quality improvement project using the DMAIC (Define, Measure, Analyze, Improve, Control) framework was initiated at a 1000+ bed Academic Medical Center with the goal of improving efficiency of the nutrition risk assignment (NRA) process for RDs and DTs. A secondary goal was to see if improved efficiency would also lead to an increase in RD-identified malnutrition based on Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition Indicators of Malnutrition (AAIM) criteria. A group of RDs and DTs met to walk through each of the quality improvement process steps. The problem was defined, and baseline measurements of malnutrition identification rates and missed nutrition risk assignments were analyzed. A fishbone diagram was used to help with root cause analysis, and later a payoff matrix was used to identify potential interventions. The Improve phase was implemented in October and November 2023 and included changes to the screening policy itself and redistribution of clinical nutrition staff on certain patient units.

Results: The identified improvements had a positive impact both on incomplete work and on malnutrition identification rates. Malnutrition identification rates averaged 11.7% from May through October, compared with 12.1% from November through April. The number of missed NRAs decreased from an average of 975 per month from May through October to 783 per month from November through April, a decrease of 192 per month (20%). An additional quality improvement process cycle is currently underway to further improve these metrics.

Conclusion: Fostering a culture of ongoing improvement presents a significant challenge for both clinicians and leaders. Enhancing nutrition care and boosting clinician efficiency are critical goals. Introducing clinical nutrition leaders to tools designed for quality monitoring and enhancement can lead to better performance outcomes and more effective nutrition care. Tools such as those used for this project along with PDSA (Plan, Do, Study, Act) projects are valuable in this process. Involving team members from the beginning of these improvement efforts can also help ensure the successful adoption of practice changes.

Table 1. RD Identified Malnutrition Rates.

Table 2. Incomplete Nutrition Risk Assignments (NRAs).

Maurice Jeanne Aguero, RN, MD1; Precy Gem Calamba, MD, FPCP, DPBCN2

1Department of Internal Medicine, Prosperidad, Agusan del Sur; 2Medical Nutrition Department, Tagum City, Davao del Norte

Financial Support: None Reported.

Background: Malnutrition is a strong predictor of mortality, of morbidity through poor response to therapy, and of quality of life among gastrointestinal (GI) cancer patients. In a tertiary government hospital in Tagum City where a cancer center is present, malnutrition screening among patients with cancer is routine; however, no studies determining the association between nutritional status and quality of life among GI cancer patients had been conducted there. This study aimed to determine whether nutritional status is associated with quality of life among adult Filipino GI cancer patients seeking cancer therapy in a tertiary government hospital.

Methods: A quantitative, observational, cross-sectional, analytical survey design was used. The World Health Organization Quality of Life Brief Version (WHOQOL-BREF) questionnaire was used to determine the quality of life of cases. Logistic regression analysis was used to assess the association of demographic, clinical, and nutritional profiles with quality of life among patients with gastrointestinal cancer.
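
For illustration only, the type of logistic regression described above can be sketched as follows. This is not the authors' analysis code; the data are synthetic and the column names (age, sex, sga_grade, good_qol) are hypothetical stand-ins for the survey variables.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Synthetic stand-in data; a real analysis would use the survey responses
    rng = np.random.default_rng(1)
    n = 160
    df = pd.DataFrame({
        "age": rng.normal(56, 12, n),
        "sex": rng.choice(["male", "female"], n),
        "sga_grade": rng.choice(["normal_mild", "moderate", "severe"], n),
        "good_qol": rng.integers(0, 2, n),  # 1 = general QoL rated good or better
    })

    # Binary QoL outcome regressed on demographic, clinical, and nutritional predictors
    model = smf.logit("good_qol ~ age + C(sex) + C(sga_grade)", data=df).fit()
    print(np.exp(model.params))  # coefficients expressed as odds ratios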

Results: Among respondents (n = 160, mean age 56.4 ± 12 years), the majority were male (61.9%), married (77.5%), and Roman Catholic (81.1%), and 38.1% had finished high school. Almost half were diagnosed with colon adenocarcinoma (43.125%), followed by rectal adenocarcinoma (22.5%), rectosigmoid adenocarcinoma (11.875%), and GI stromal tumor (5.625%). On cancer staging, 40.625% were Stage 4, followed by Stage 3b (19.375%), Stage 3c (10%), Stage 3a (5.625%), and Stage 2a (4.375%); only 2.5% were Stage 4a and 0.625% Stage 4b. More than one third received CAPEOX (38.125%), followed by FOLFOX (25.625%) and imatinib (5.625%). By BMI, 15.6% of cases were underweight and 34.4% were overweight or obese. By SGA grading, 38.1% were severely malnourished, 33.8% moderately malnourished, and the remainder normal to mildly malnourished. For quality of life, mean scores were generally good for overall quality of life (3.71 ± 0.93); generally satisfied for perception of general health, satisfaction with one's self, and relationships with others (3.46 to 3.86 ± 0.97); moderately satisfied with having enough energy for daily life, accepting bodily appearance, the availability of information needed for daily living, and the opportunity for leisure (2.71 to 3.36 ± 1.02); and only slightly satisfied with having enough money to meet their needs (2.38 ± 0.92). Participants, on average, quite often experienced negative feelings such as low mood, despair, depression, and anxiety (2.81 ± 0.79). Significant associations with quality of life were documented for age (p = 0.047), cancer diagnosis (p = 0.001), BMI status (p = 0.028), and SGA nutritional status (p = 0.010).

Conclusion: Nutritional status was significantly associated with quality of life among adult Filipino GI cancer patients seeking cancer therapy in a tertiary government hospital. Public health interventions targeting these factors may play a critical role in improving patient survival and outcomes.

Carmen Kaman Lo, MS, RD, LDN, CNSC1; Hannah Jacobs, OTD, OTR/L2; Sydney Duong, MS, RD, LDN3; Julie DiCarlo, MS4; Donna Belcher, EdD, MS, RD, LDN, CDCES, CNSC, FAND5; Galina Gheihman, MD6; David Lin, MD7

1Massachusetts General Hospital, Sharon, MA; 2MedStar National Rehabilitation Hospital, Washington, DC; 3New England Baptist Hospital, Boston, MA; 4Center for Neurotechnology and Neurorecovery, Mass General Hospital, Boston, MA; 5Nutrition and Food Services, MGH, Boston, MA; 6Harvard Medical School and Mass General Hospital, Boston, MA; 7Neurocritical Care & Neurorecovery, MGH, Boston, MA

Financial Support: Academy of Nutrition and Dietetics, Dietitian in Nutrition Support Member Research Award.

Background: Nutritional status is a known modifiable factor for optimal recovery in brain injury survivors, yet data on specific benchmarks for optimizing clinical outcomes through nutrition are limited. This pilot study aimed to quantify the clinical, nutritional, and functional characteristics of brain injury survivors from ICU admission to 90 days post-discharge.

Methods: Patients admitted to the Massachusetts General Hospital (MGH) Neurosciences ICU over a 12-month period were enrolled based on the following criteria: age 18 years or older, primary diagnosis of acute brain injury, ICU stay of at least 72 hours, meeting MGH Neurorecovery Clinic referral criteria for outpatient follow-up, and survival beyond 90 days post-discharge. Data were collected from the electronic health record and from the Neurorecovery Clinic follow-up visit/phone interview. These included patient characteristics, acute clinical outcomes, nutrition intake, and surrogate nutrition and functional scores at admission, discharge, and 90 days post-discharge. Descriptive statistics were used for analysis.

Results: Of 212 admissions during the study period, 50 patients were included in the analysis. The mean APACHE (n = 50), GCS (n = 50), and NIHSS (n = 20) scores were 18, 11, and 15, respectively. Seventy-eight percent of patients required ventilation, with a mean duration of 6.3 days. Mean ICU and post-ICU lengths of stay were 17.4 and 15.9 days, respectively. Eighty percent received nutrition (enteral or oral) within 24-48 hours of ICU admission. Mean ICU energy and protein intake over the first 7 days were 1128 kcal/day and 60.3 g protein/day, respectively, both 63% of estimated needs. When assessed against ASPEN guidelines, patients with a BMI ≥ 30 received less energy (11.6 vs 14.6 kcal/kg/day) but more protein (1.04 vs 0.7 g protein/kg/day) than those with a BMI < 30. Twelve percent of patients had less than 50% of their nutritional intake for at least 7 days pre-discharge and were considered nutritionally at risk. Forty-six percent were discharged with long-term enteral access. Only 16% of patients were discharged to home rather than a rehabilitation facility. By 90 days post-discharge, 32% of patients had been readmitted, with 27% due to stroke. Upon admission, patients’ mean MUST (Malnutrition Universal Screening Tool) and MST (Malnutrition Screening Tool) scores were 0.56 and 0.48, respectively, reflecting low nutritional risk. By discharge, the mean MUST and MST scores had increased to 1.16 and 2.08, respectively, suggesting these patients had become nutritionally at risk. At 90 days post-discharge, both scores returned to low nutritional risk (MUST 0.48 and MST 0.59). Functional scores, as measured by the modified Rankin scale (mRS), followed a similar pattern: the mean score was 0.1 at admission, 4.2 at discharge, and 2.8 at 90 days post-discharge. The mean Barthel Index at 90 days post-discharge was 64.1, indicating moderate dependence in these patients.

Conclusion: This pilot study highlighted key nutritional, clinical, and functional characteristics of brain injury survivors from ICU admission to 90 days post-discharge. Further statistical analysis will help delineate the relationships between nutritional status and clinical and functional outcomes, which may guide future research and practice in ICU nutrition and neurorehabilitation.

Lavanya Chhetri, BS1; Amanda Van Jacob, MS, RDN, LDN, CCTD1; Sandra Gomez, PhD, RD1; Pokhraj Suthar, MBBS1; Sarah Peterson, PhD, RD1

1Rush University Medical Center, Chicago, IL

Financial Support: None Reported.

Background: Identifying frailty in patients with liver disease provides valuable insights into a patient's nutritional status and physical resilience. However, it is unclear whether reduced muscle mass is an important etiology of frailty in liver disease. Identifying the possible connection between frailty and muscle mass may lead to better risk prediction, personalized interventions, and improved outcomes in patient care. The purpose of this study was to determine whether frail patients have a lower skeletal muscle index (SMI) than not-frail patients with liver disease undergoing liver transplant evaluation.

Methods: A retrospective, cross-sectional study design was utilized. Patients greater than 18 years of age who underwent a liver transplant evaluation from January 1, 2019 through December 31, 2023 were included if they had a Liver Frailty Index (LFI) assessment completed during the initial liver transplant evaluation and a diagnostic abdominal CT scan completed within 30 days of that evaluation. Demographic data (age, sex, height, and BMI), etiology of liver disease, MELD-Na score, history of diabetes and hepatocellular carcinoma, liver disease complications (ascites, hepatocellular carcinoma, hepatic encephalopathy, and esophageal varices), and LFI score were recorded for each patient. LFI was recorded both as a continuous variable and dichotomized into a categorical variable (frail: LFI ≥ 4.5 versus not frail: LFI ≤ 4.4). Cross-sectional muscle area (cm2) at the third lumbar vertebra on CT was quantified; SMI was calculated (cm2/height in meters2), and low muscle mass was dichotomized into a categorical variable (low muscle mass: SMI ≤ 50 cm2/m2 for males and ≤39 cm2/m2 for females versus normal muscle mass: SMI > 50 cm2/m2 for males and >39 cm2/m2 for females). An independent t-test was used to determine whether SMI differed between patients categorized as frail versus not frail.
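
For readers who want the calculation spelled out, the SMI formula and the cutoffs described above can be restated as a short sketch. The example values are hypothetical; the L3 muscle area would come from the CT analysis software, and this is not the study's analysis code.

    def skeletal_muscle_index(l3_muscle_area_cm2, height_m):
        """SMI = L3 cross-sectional muscle area (cm2) / height squared (m2)."""
        return l3_muscle_area_cm2 / height_m ** 2

    def low_muscle_mass(smi, sex):
        """Low muscle mass: SMI <= 50 cm2/m2 for males, <= 39 cm2/m2 for females."""
        return smi <= (50 if sex == "male" else 39)

    def frail(lfi):
        """Frail: Liver Frailty Index >= 4.5."""
        return lfi >= 4.5

    # Hypothetical patient: 1.70 m male with 130 cm2 of L3 muscle and LFI 4.8
    smi = skeletal_muscle_index(130, 1.70)   # ~45.0 cm2/m2
    print(round(smi, 1), low_muscle_mass(smi, "male"), frail(4.8))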

Results: A total of 104 patients, 57% male, with a mean age of 57 ± 10 years and a mean BMI of 28.1 ± 6.4 kg/m2, were included. The mean MELD-Na score was 16.5 ± 6.9; 25% had a history of hepatocellular carcinoma and 38% had a history of diabetes. The majority of the sample had at least one liver disease complication (72% had ascites, 54% hepatic encephalopathy, and 67% varices). The mean LFI score was 4.5 ± 0.9, and 44% were categorized as frail. The mean SMI was 45.3 ± 12.6 cm2/m2, and 52% were categorized as having low muscle mass (males: 63%; females: 38%). There was no difference in SMI between patients who were frail versus not frail (43.5 ± 10.6 versus 47.3 ± 13.9 cm2/m2, p = 0.06). SMI by frailty status was also reported separately for males and females; no significance testing was performed due to the small sample sizes. Both frail males (43.5 ± 12.2 versus 48.4 ± 14.9) and frail females (43.4 ± 9.3 versus 45.2 ± 11.8) had a lower SMI than their non-frail counterparts.

Conclusion: No difference in SMI between frail and not-frail patients was observed; however, the p-value of 0.06 suggests a marginal trend toward a possible difference, and further research is needed to confirm these findings. Additionally, it is concerning that men had a higher rate of low muscle mass and that the mean SMI for both frail and not-frail men was below the cut-off used to identify low muscle mass (SMI ≤ 50 cm2/m2). Additional research is needed to explore the underlying factors contributing to low muscle mass in men, particularly in frail populations, and to determine whether targeted interventions aimed at improving muscle mass could mitigate frailty and improve clinical outcomes in patients undergoing liver transplant evaluation.

Rebekah Preston, MS, RD, LD1; Keith Pearson, PhD, RD, LD2; Stephanie Dobak, MS, RD, LDN, CNSC3; Amy Ellis, PhD, MPH, RD, LD1

1The University of Alabama, Tuscaloosa, AL; 2The University of Alabama at Birmingham, Birmingham, AL; 3Thomas Jefferson University, Philadelphia, PA

Financial Support: The ALS Association Quality of Care Grant.

Background: Amyotrophic Lateral Sclerosis (ALS) is a progressive neurodegenerative disease. Malnutrition is common in people with ALS (PALS) due to dysphagia, hypermetabolism, self-feeding difficulty, and other challenges. In many clinical settings, malnutrition is diagnosed using the Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition indicators to diagnose malnutrition (AAIM) or the Global Leadership Initiative on Malnutrition (GLIM) criteria. However, little is known about malnutrition assessment practices in ALS clinics. This qualitative study explored how RDs at ALS clinics are diagnosing malnutrition in PALS.

Methods: Researchers conducted 6 virtual focus groups with 22 RDs working in ALS clinics across the United States. Audio recordings were transcribed verbatim, and transcripts imported to NVivo 14 software (QSR International, 2023, Melbourne, Australia). Two research team members independently analyzed the data using deductive thematic analysis.

Results: The AAIM indicators were identified as the most used malnutrition diagnostic criteria. Two participants described using a combination of AAIM and GLIM criteria. Although all participants described performing thorough nutrition assessments, some said they do not formally document malnutrition in the outpatient setting due to lack of reimbursement and feeling as though the diagnosis would not change the intervention. Conversely, others noted the importance of documenting malnutrition for reimbursement purposes. Across all groups, RDs reported challenges in diagnosing malnutrition because of difficulty differentiating between disease- versus malnutrition-related muscle loss. Consequently, several RDs described adapting current malnutrition criteria to focus on weight loss, decreased energy intake, or fat loss.

Conclusion: Overall, RDs agreed that malnutrition is common among PALS, and they conducted thorough nutrition assessments as part of standard care. Among those who documented malnutrition, most used AAIM indicators to support the diagnosis. However, as muscle loss is a natural consequence of ALS, RDs perceived difficulty in assessing nutrition-related muscle loss. This study highlights the need for malnutrition criteria specific for PALS.

Table 1. Themes Related to Diagnosing Malnutrition in ALS.

Carley Rusch, PhD, RDN, LDN1; Nicholas Baroun, BS2; Katie Robinson, PhD, MPH, RD, LD, CNSC1; Maria Geraldine E. Baggs, PhD1; Refaat Hegazi, MD, PhD, MPH1; Dominique Williams, MD, MPH1

1Abbott Nutrition, Columbus, OH; 2Miami University, Oxford, OH

Financial Support: This study was supported by Abbott Nutrition.

Background: Malnutrition is increasingly recognized as a condition that is present in all BMI categories. Although much research to date has focused on malnutrition in patients with lower BMIs, there is a need to understand how nutrition interventions may alter outcomes for those with higher BMIs and existing comorbidities. In a post-hoc analysis of the NOURISH trial, which investigated hospitalized older adults with malnutrition, we sought to determine whether consuming a specialized oral nutritional supplement (ONS) containing high energy, protein, and beta-hydroxy-beta-methylbutyrate (ONS+HMB) can improve vitamin D and nutritional status in those with a BMI ≥ 27.

Methods: A post-hoc analysis was conducted using data from the NOURISH trial, a randomized, placebo-controlled, multi-center, double-blind study of hospitalized participants with malnutrition and a primary diagnosis of congestive heart failure, acute myocardial infarction, pneumonia, or chronic obstructive pulmonary disease. In the trial, participants received standard care with either ONS+HMB or a placebo beverage (target 2 servings/day) during hospitalization and for 90 days post-discharge. Nutritional status was assessed using Subjective Global Assessment (SGA) and handgrip strength at baseline and at 0, 30, 60, and 90 days post-discharge. Vitamin D (25-hydroxyvitamin D) was assessed within 72 hours of admission (baseline) and at 30 and 60 days post-discharge. Participants with a BMI ≥ 27 formed the analysis cohort. Treatment effect was determined using analysis of covariance adjusted for baseline measures.
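
For readers unfamiliar with this approach, the analysis-of-covariance treatment-effect model described above can be sketched as follows. The data below are synthetic and the variable names (group, grip_baseline, grip_day90) are assumptions for illustration; this is not the trial's analysis code.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Synthetic stand-in data; the real analysis would use the trial dataset
    rng = np.random.default_rng(2)
    n = 166
    group = rng.choice(["ONS_HMB", "placebo"], n)
    grip_baseline = rng.normal(22.3, 8, n)
    grip_day90 = grip_baseline + rng.normal(1, 3, n) + (group == "ONS_HMB") * 0.9

    df = pd.DataFrame({"group": group,
                       "grip_baseline": grip_baseline,
                       "grip_day90": grip_day90})

    # ANCOVA-style model: follow-up value ~ treatment group + baseline value
    model = smf.ols("grip_day90 ~ C(group, Treatment('placebo')) + grip_baseline",
                    data=df).fit()
    print(model.params)  # the group coefficient is the baseline-adjusted treatment effect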

Results: The post-hoc cohort consisted of 166 patients with a BMI ≥ 27 and a mean age of 76.41 ± 8.4 years; slightly more than half were female (51.2%). Baseline handgrip strength (n = 137) was 22.3 ± 0.8 kg, while the serum concentration of 25-hydroxyvitamin D (n = 138) was 26.0 ± 1.14 ng/mL. At day 90, ONS+HMB improved nutritional status: 64% of the ONS+HMB group were well-nourished (SGA-A) vs. 37% of the control group (p = 0.011). There was a trend toward a greater change in handgrip strength with ONS+HMB during the index hospitalization (baseline to discharge) compared to placebo (least squares means ± standard error: 1.34 ± 0.35 kg vs. 0.41 ± 0.39; p = 0.081), but differences were not significant at the other timepoints. Vitamin D concentrations were significantly higher at day 60 in those receiving ONS+HMB compared to placebo (29.7 ± 0.81 vs. 24.8 ± 0.91 ng/mL; p < 0.001).

Conclusion: Hospitalized older patients with malnutrition and a BMI ≥ 27 who received standard care plus ONS+HMB had significant improvements in vitamin D status at day 60 and in nutritional status at day 90 compared to placebo. This suggests that transitions of care to the post-acute setting should consider continuing nutrition interventions such as ONS+HMB, in combination with standard care, for patients with elevated BMI and malnutrition.

Aline Dos Santos1; Isis Helena Buonso2; Marisa Chiconeli Bailer2; Maria Fernanda Jensen Kok2

1Hospital Samaritano Higienópolis, São Paulo; 2Hospital Samaritano Higienopolis, São Paulo

Financial Support: None Reported.

Background: Malnutrition negatively impacts length of hospital stay, infection rates, mortality, clinical complications, hospital readmission, and average healthcare costs. It is believed that early nutritional interventions could reduce negative events and generate economic impact. Therefore, our objective was to evaluate the average cost of hospitalization of patients identified as being at nutritional risk by nutritional screening and with an indication for oral nutritional supplementation.

Methods: Retrospective study including 110 adult patients hospitalized in a private institution and admitted between August 2023 and January 2024. Nutritional screening was performed within 24 hours of admission. Low muscle mass was classified according to calf circumference (CC), measured within 96 hours of hospital admission, using cutoff points of 33 cm for women and 34 cm for men. Patients were evaluated in groups: G1, patients with an indication for oral supplementation (OS) that was not started for modifiable reasons; G2, patients with an indication for OS that was started promptly (within 48 hours of the therapeutic indication); G3, patients with an indication for OS that was started late (more than 48 hours after the therapeutic indication); and G4, the pooling of G1 and G3, since neither received OS promptly. Patients receiving enteral or parenteral nutritional therapy were excluded.
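
The grouping rule above can be summarized in a short sketch. It simply restates the definitions given (the 48-hour threshold and the G4 pooling); the function and field names are hypothetical and this is not the study's code.

    def classify(os_started, hours_from_indication=None):
        """Group labels for a patient with an indication for oral supplementation (OS)."""
        if not os_started:
            groups = ["G1"]                   # indicated but never started
        elif hours_from_indication <= 48:
            groups = ["G2"]                   # started promptly (within 48 h)
        else:
            groups = ["G3"]                   # started late (after 48 h)
        if groups[0] in ("G1", "G3"):
            groups.append("G4")               # G4 pools G1 and G3
        return groups

    print(classify(True, 24))    # ['G2']
    print(classify(True, 72))    # ['G3', 'G4']
    print(classify(False))       # ['G1', 'G4']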

Results: G2 was the most prevalent group in the studied sample (51%), with an intermediate average length of stay (20.9 days), the lowest average daily hospitalization cost, an average age of 71 years, a high prevalence of low muscle mass (56%), and the lowest need for intensive care (IC) admission (63%), with an average IC length of stay (LOS) of 13.5 days. G1 had the lowest prevalence (9%), the shortest average length of stay (16 days), an average daily hospitalization cost 41% higher than G2, an average age of 68 years, uniformly adequate muscle mass (100%), and a considerable need for IC admission (70%), but an IC LOS of 7.7 days. G3 represented 40% of the sample, with the longest average length of stay (21.5 days), an average daily hospitalization cost 22% higher than G2, an average age of 73 years, a high prevalence of low muscle mass (50%), and an intermediate need for IC admission (66%), but an IC LOS of 16.5 days. Compared to G2, G4 was similar in sample size (G2: 56 patients; G4: 54 patients), mean age (72 years), length of stay (20.55 days), IC admission (66%), and IC LOS (64.23%), but had a higher average daily hospitalization cost (39% higher than G2) and a higher prevalence of low muscle mass (59%).

Conclusion: From the results presented, the percentage of patients who did not receive OS and who required intensive care was on average 5% higher than in the other groups; all patients in this group had adequate muscle mass but still required supplementation because of their clinical condition, poor food acceptance, and weight loss. More than 50% of patients in all groups except G1 had low muscle mass. Regarding costs, patients supplemented promptly or late cost, respectively, 45% and 29% less than patients who did not receive OS. Comparing G2 with G4, the cost remained 39% lower in promptly supplemented patients.

International Poster of Distinction

Daphnee Lovesley, PhD, RD1; Rajalakshmi Paramasivam, MSc, RD1

1Apollo Hospitals, Chennai, Tamil Nadu

Financial Support: None Reported.

Background: Malnutrition in hospitals, once an underreported issue, has gained significant attention in recent decades. This widespread problem negatively impacts recovery, length of stay (LOS), and overall outcomes. This study aimed to evaluate the effectiveness of clinical nutrition practices and the role of a Nutrition Steering Committee (NSC) in addressing and potentially eradicating this persistent problem by improving nutritional care and patient outcomes.

Methods: Consecutive patients admitted to non-critical units of a tertiary care hospital between January 2018 and August 2024 were included in the study. Patient demographics, body mass index (BMI), modified Subjective Global Assessment (mSGA), oral nutritional supplement (ONS) usage, and clinical outcomes were retrospectively extracted from electronic medical records. Data were analyzed using SPSS version 20.0, comparing results before and after the implementation of the NSC.

Results: Out of 239,630 consecutive patients, 139,895 non-critical patients were included, with a mean age of 57.10 ± 15.89 years; 64.3% were men and 35.7% women. The mean BMI was 25.76 ± 4.74 kg/m2, and 49.6% of patients were polymorbid. The largest share (25.8%) were admitted with cardiac illness. According to the modified Subjective Global Assessment (mSGA), 87.1% were well-nourished, 12.8% moderately malnourished, and 0.1% severely malnourished. ONS were prescribed for 10% of the population. ONS prescription was highest among underweight patients (28.4%), followed by those with normal BMI (13%), overweight (9.1%), and obese (7.7%) (p < 0.001); by mSGA, prescriptions were 5.5% in the well-nourished, 41% in the moderately malnourished (MM), and 53.2% in the severely malnourished (SM) (p < 0.001); by specialty, prescriptions were highest in pulmonology (23.3%), followed by gastroenterology and hepatology (19.2%) (p < 0.001). The mean hospital LOS was 4.29 ± 4.03 days, with an overall mortality rate of 1.2%. Severe malnutrition, as rated by mSGA, significantly impacted mortality (0.8% vs. 5.1%, p < 0.001). Mortality risk increased with polymorbidity (0.9% vs. 1.5%) and respiratory illness (2.6%, p < 0.001). Poor nutritional status, as assessed by mSGA (34.7%, 57.4%, 70.9%) and BMI (43.7% vs. 38%), was associated with longer hospital stays (LOS ≥ 4 days, p < 0.001). The implementation of the NSC led to significant improvements: average LOS decreased (4.4 vs. 4.1 days, p < 0.001), and mortality risk was reduced from 1.6% to 0.7% (p < 0.001). No significant changes were observed in baseline nutritional status, indicating effective clinical nutrition practices in assessing patient nutrition. ONS prescriptions increased from 5.2% to 9.7% between 2022 and 2024 (p < 0.001), contributing to the reduction in mortality rates to below 1% after 2022, compared with over 1% before the NSC (p < 0.001). A significant negative correlation was found between LOS and ONS usage (p < 0.001). Step-wise binary logistic regression indicated that malnutrition assessed by mSGA predicted mortality with an odds ratio of 2.49 and LOS with an odds ratio of 1.7, followed by ONS prescription and polymorbidity (p < 0.001).

Conclusion: A well-functioning NSC is pivotal in driving successful nutritional interventions and achieving organizational goals. Early identification of malnutrition through mSGA, followed by timely and appropriate nutritional interventions, is essential to closing care gaps and improving clinical outcomes. Strong leadership and governance are critical in driving these efforts, ensuring that the patients receive optimal nutritional support to enhance recovery and reduce mortality.

Table 1. Patient Characteristics: Details of Baseline Anthropometric & Nutritional Status.

Baseline details of Anthropometric Measurements and Nutrition Status.

Table 2. Logistic Regression to Predict Hospital LOS and Mortality.

Step-wise binary logistic regression indicated that malnutrition assessed by mSGA predicts mortality with an odds ratio of 2.49 and LOS with an odds ratio of 1.7, followed by ONS prescription and polymorbidity (p < 0.001).

mSGA-rated malnourished patients stayed longer in the hospital compared to the well-nourished category (p < 0.001).

Figure 1. Nutritional Status (mSGA) vs Hospital LOS (>4 days).

Hannah Welch, MS, RD1; Wendy Raissle, RD, CNSC2; Maria Karimbakas, RD, CNSC3

1Optum Infusion Pharmacy, Phoenix, AZ; 2Optum Infusion Pharmacy, Buckeye, AZ; 3Optum Infusion Pharmacy, Milton, MA

Financial Support: None Reported.

Background: Food insecurity is when people do not have enough food to eat and do not know where their next meal will come from. In the United States approximately 49 million people relied on food assistance charities in 2022 (data from Feeding America). Patients receiving parenteral nutrition (PN), who may be capable of supplementing with oral intake, may experience food insecurity due to chronic health conditions limiting work capability and total family income. Patients may also experience lack of affordable housing, increased utilities and the burden of medical expenses. Signs of food insecurity may present as weight loss, malnourishment, low energy, difficulty concentrating, or other physical indicators such as edema, chronically cracked lips, dry skin, and itchy eyes. The purpose of this abstract is to highlight two unique patient case presentations where food insecurity prompted clinicians to intervene.

Methods: Patient 1: 50-year-old male with short bowel syndrome (SBS) on long-term PN called the registered dietitian (RD) regarding financial difficulties with feeding the family (see Table 1). The patient and clinician relationship allowed the patient to convey sensitive concerns to the RD regarding inability to feed himself and his family, which resulted in the patient relying on the PN for all nutrition. Due to the food insecurity present, the clinician made changes to PN/hydration to help improve patient's clinical status. Patient 2: A 21-year-old male with SBS on long-term PN spoke with his in-home registered nurse (RN) regarding family's difficulties affording food (see Table 2). The RN informed the clinical team of suspected food insecurity and the insurance case manager (CM) was contacted regarding food affordability. The RD reached out to local community resources such as food banks, food boxes and community programs. A community program was able to assist the patient with meals until patient's aunt started cooking meals for him. This patient did not directly share food insecurity with RD; however, the relationship with the in-home RN proved valuable in having these face-to-face conversations with the patient.

Results: In these two patient examples, difficulty obtaining food affected the patients’ clinical status. The clinical team identified food insecurity and the need for further education for the interdisciplinary team. The RD created a food insecurity informational handout and provided an in-service to nursing to aid recognition of signs of possible food insecurity (Figure 1) and to share potential patient resources available in the community. Figure 2 presents suggested questions to ask a patient if an issue is suspected.

Conclusion: Given the prevalence of food insecurity, routine assessment for signs and symptoms is essential. Home nutrition support teams (including RDs, RNs, pharmacists, and care technicians) are positioned to assist in this effort, as they have frequent phone and in-home contact with patients and together build a trusted relationship with patients and caregivers. Clinicians should be aware of potential social situations that can warrant changes to PN formulations. To approach this sensitive issue thoughtfully, PN infusion providers should consider enhancing patient assessments and promoting education across the interdisciplinary team to create awareness of accessible community resources.

Table 1. Patient 1 Information.

Table 2. Suspected Food Insecurity Timeline.

Figure 1. Signs to Detect Food Insecurity.

Figure 2. Questions to Ask.

Poster of Distinction

Christan Bury, MS, RD, LD, CNSC1; Amanda Hodge Bode, RDN, LD2; David Gardinier, RD, LD3; Roshni Sreedharan, MD, FASA, FCCM3; Maria Garcia Luis, MS, RD, LD4

1Cleveland Clinic, University Heights, OH; 2Cleveland Clinic Foundation, Sullivan, OH; 3Cleveland Clinic, Cleveland, OH; 4Cleveland Clinic Cancer Center, Cleveland, OH

Encore Poster

Presentation: The Society of Critical Care Medicine, Critical Care Congress in Orlando, FL on February 25th.

Publication: Critical Care Medicine. 2025;53(1): In press.

Financial Support: Project support provided by Morrison Cleveland Clinic Nutrition Research collaborative.

Background: Hospitalized and critically ill patients who have preexisting malnutrition can have worse outcomes and an increased length of stay (LOS). Currently, Registered Dietitians (RDs) assess malnutrition with a nutrition focused physical exam (NFPE). Recent recommendations encourage the use of body composition tools such as computed tomography (CT) along with the NFPE. Trained RDs can use CT scans to evaluate skeletal muscle mass at the third lumbar vertebra (L3) and then calculate Skeletal Muscle Index (SMI) and mean Hounsfield Units (HU) to determine muscle size and quality, respectively. This approach has been validated in various clinical populations and may be particularly useful in the critically ill, where the NFPE is difficult. We aim to evaluate whether CT scans can serve as a supportive tool to capture a missed malnutrition diagnosis in the surgical and critical care population.

Methods: One hundred twenty patients admitted to the Cleveland Clinic from 2021-2023 with a malnutrition evaluation that included an NFPE within 2 days of an abdominal CT were evaluated. Of those, 59 patients had a major surgery or procedure completed during that admission and were included in the final analysis. The CT scans were read by a trained RD at L3 using Terarecon, and results were cross-referenced with an artificial intelligence (AI) software called Veronai. Age, sex, BMI, SMI, and HU were analyzed, along with the malnutrition diagnosis.
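
To make the two CT-derived measures concrete, below is a minimal illustrative sketch (synthetic arrays only, not the study's imaging pipeline): the mean HU within an L3 muscle segmentation as a muscle-quality proxy, and the percent agreement between RD and AI low-muscle-mass calls.

    import numpy as np

    rng = np.random.default_rng(3)
    hu_slice = rng.normal(30, 15, size=(512, 512))    # synthetic CT slice, values in HU
    muscle_mask = np.zeros((512, 512), dtype=bool)
    muscle_mask[200:300, 150:350] = True              # synthetic L3 muscle segmentation

    mean_hu = hu_slice[muscle_mask].mean()            # lower mean HU suggests poorer muscle quality
    print(f"Mean HU in muscle region: {mean_hu:.1f}")

    rd_calls = np.array([1, 0, 1, 1, 0])              # RD low-muscle-mass calls (synthetic)
    ai_calls = np.array([1, 0, 1, 0, 0])              # AI low-muscle-mass calls (synthetic)
    print(f"RD-AI agreement: {(rd_calls == ai_calls).mean():.0%}")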

Results: Fifty-nine patients were analyzed. Of these, 61% were male, 51% were >65 years old, and 24% had a BMI > 30. Malnutrition was diagnosed in 47% of patients. A total of 24 patients had no muscle wasting based on the NFPE, while CT captured low muscle mass in 58% of that group. Twenty-two percent of patients (13/59) had a higher level of malnutrition severity when CT was used. Additionally, poor muscle quality was detected in 71% of patients across all age groups. Notably, there was 95% agreement between the AI and the RD's assessment in detecting low muscle mass.

Conclusion: RDs can effectively analyze CT scans and use SMI and HU with their NFPE. The NFPE alone is not always sensitive enough to detect low muscle in surgical and critically ill patients. The degree of malnutrition dictates nutrition interventions, including the timing and type of nutrition support, so it is imperative to accurately diagnose and tailor interventions to improve outcomes.

Table 1. Change in Malnutrition Diagnosis Using CT.

The graph shows the change in Malnutrition Diagnosis when CT was applied in conjunction with the NFPE utilizing ASPEN Guidelines.

Table 2. Muscle Assessment: CT vs NFPE.

This graph compares muscle evaluation using both CT and the NFPE.

CT scan at the 3rd lumbar vertebra showing normal muscle mass and normal muscle quality in a patient >65 years old.

Figure 1. CT Scans Evaluating Muscle Size and Quality.

CT scan at the 3rd lumbar vertebra showing low muscle mass and low muscle quality in a patient with obesity.

Figure 2. CT Scans Evaluating Muscle Size and Quality.

Elif Aysin, PhD, RDN, LD1; Rachel Platts, RDN, LD1; Lori Logan, RN1

1Henry Community Health, New Castle, IN

Financial Support: None Reported.

Background: Disease-related malnutrition alters body composition and causes functional decline. In acute care hospitals, 20-50% of inpatients have malnutrition when they are admitted to the hospital. Patients with malnutrition experience higher medical costs, increased mortality, and longer hospital stays. Malnutrition is associated with an increased risk of readmission and complications. Monitoring, diagnosis, treatment, and documentation of malnutrition are important for treating patients. It also contributes to proper Diagnosis Related Group (DRG) coding and accurate CMI (Case Mix Index), which can increase reimbursement.

Methods: After dietitians completed the Nutrition Focused Physical Exam (NFPE) course and sufficient staff had been provided, the malnutrition project was initiated under the leadership of RDNs in our small rural community hospital. The interdisciplinary medical committee approved the project to improve malnutrition screening, diagnosis, treatment practices, and coding. The Academy/American Society for Parenteral and Enteral Nutrition (ASPEN) and Global Leadership Initiative on Malnutrition (GLIM) criteria were used to diagnose malnutrition. The Malnutrition Screening Tool (MST) was completed by nurses to determine the risk of malnutrition. The Nutrition and Dietetics department created a new custom report from the nutrition database listing patients on NPO, clear liquid, or full liquid diets. RDNs checked these reports, BMI reports, and length of stay (LOS) patient lists on weekdays and weekends, and performed the NFPE to evaluate nutritional status. If malnutrition was identified, RDNs communicated with providers through the hospital messenger system, and providers added the malnutrition diagnosis to their documentation and plan of care. RDNs created a dataset, shared it with coders and Clinical Documentation Integrity Specialists/Care Coordination, and tracked patients with malnutrition. In addition, RDNs spent more time with malnourished patients and contributed to discharge planning and education.

Results: The prevalence of malnutrition diagnosis and the amount of reimbursement for 2023 were compared with the six months after implementing the malnutrition project. Malnutrition diagnoses increased from 2.6% to 10.8%. Unspecified protein-calorie malnutrition diagnoses decreased from 39% to 1.5%. RDN-diagnosed malnutrition was documented in provider notes for 82% of cases. The malnutrition diagnosis rate increased by 315%, and malnutrition reimbursement increased by 158%. Of the patients identified with malnutrition, 59% received a malnutrition DRG code; the remaining 41% received higher-weighted major complication and comorbidity (MCC) codes. Our malnutrition reimbursement increased from ~$106,000 to ~$276,000.

Conclusion: The implementation of evidence-based practice guidelines was key in identifying and accurately diagnosing malnutrition. The provision of sufficient staff with the necessary training and multidisciplinary teamwork has improved malnutrition diagnosis documentation in our hospital, increasing malnutrition reimbursement.

Table 1. Before and After Malnutrition Implementation Results.

Figure 1. Prevalence of Malnutrition Diagnosis.

Elisabeth Schnicke, RD, LD, CNSC1; Sarah Holland, MSc, RD, LD, CNSC2

1The Ohio State University Wexner Medical Center, Columbus, OH; 2The Ohio State University Wexner Medical Center, Upper Arlington, OH

Financial Support: None Reported.

Background: Malnutrition is associated with increased length of stay, readmissions, mortality and poor outcomes. Early identification and treatment are essential. The Malnutrition Screening Tool (MST) is a quick, easy tool recommended for screening malnutrition in adult hospitalized patients. This is commonly used with or without additional indicators. We aimed to evaluate our current nutrition screening policy, which utilizes MST, age and body mass index (BMI), to improve malnutrition identification.

Methods: Data for this quality improvement project were obtained over a 3-month period on 4 different adult services at a large academic medical center. Services covered included general medicine, hepatology, heart failure, and orthopedic surgery. Patients were assessed by a Registered Dietitian (RD) within 72 hours of admission if they met any of the following high-risk criteria: MST score ≥ 2 on the nursing admission screen, age ≥ 65 years, or BMI ≤ 18.5 kg/m2. If none of the criteria were met, patients were seen within 7 days of admission, or sooner by consult request. Malnutrition was diagnosed using the Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition indicators of malnutrition (AAIM) criteria. Data collected included malnutrition severity and etiology, age, gender, BMI, and the MST generated on admission.
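
Stated as a rule, the high-risk screening criteria above amount to the following sketch. It is a plain restatement of the policy for illustration, not the site's actual EHR logic, and the function name and argument names are assumptions.

    def high_risk(mst_score, age_years, bmi_kg_m2):
        """High risk (RD assessment within 72 hours) if any criterion is met;
        otherwise the patient is seen within 7 days or sooner by consult."""
        mst_flag = mst_score is not None and mst_score >= 2
        return mst_flag or age_years >= 65 or bmi_kg_m2 <= 18.5

    print(high_risk(3, 45, 22.0))    # True  (MST)
    print(high_risk(0, 70, 27.5))    # True  (age)
    print(high_risk(1, 50, 17.9))    # True  (BMI)
    print(high_risk(None, 50, 24.0)) # False (incomplete MST, no other criterion)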

Results: A total of 239 patients were diagnosed with malnutrition. Table 1 shows detailed characteristics. Malnutrition was distributed similarly across gender (51% male, 49% female) and age groups. The age range was 21-92 years, with an average age of 61 years. The BMI range was 9.8-50.2 kg/m2, with an average BMI of 24.6 kg/m2. Most patients had moderate malnutrition (61.5%) and chronic malnutrition (54%). When data were stratified by age ≥ 65 years, similar characteristics were seen for malnutrition severity and etiology. Notably, more of these patients (61.5%) had an MST < 2 or an incomplete MST compared with patients < 65 years of age (56%). There were 181 patients (76%) who met high-risk screening criteria and were seen for an assessment within 72 hours. Seventy patients (39%) were screened only because of age ≥ 65 years, and 45 (25%) because of MST alone. Fifty-four (30%) met 2 indicators for screening. Only a small number of patients met BMI criteria alone or all 3 indicators (6 patients, or 3%, each).

Conclusion: Utilizing the MST alone would have missed over half of the patients diagnosed with malnutrition, and the miss rate was higher among older adults. Age alone as a screening criterion caught more patients than MST alone did. Adding BMI to the screening criteria added very little, and we still missed 24% of patients with our criteria. A multi-faceted tool should be explored to best capture patients.

Table 1. Malnutrition characteristics.

*BMI: excludes those with amputations, paraplegia; all patients n = 219, patients ≥65yo n = 106.

Robin Nuse Tome, MS, RD, CSP, LDN, CLC, FAND1

1Nemours Children's Hospital, DE, Landenberg, PA

Financial Support: None Reported.

Background: Malnutrition is a global problem impacting patients of all ages, genders, and races. Malnourished hospitalized patients have poorer outcomes, including longer in-hospital length of stay, a higher rate of death, a higher need for home healthcare services, and a higher rate of 30-day readmission. This results in a higher economic burden for the healthcare industry, insurance, and the individual. Malnutrition documentation plays a vital role in capturing the status of the patient and starting the conversation about interventions to address the concern.

Methods: A taskforce made up of physicians, registered dietitians (RDs), and clinical documentation specialists met to discuss strategies to increase documentation of the malnutrition diagnosis and to facilitate better conversation about the concern and potential interventions. Options included inbox messages, a best practice alert, a SmartLink between the physician and RD notes, and adding the diagnosis to the problem list. Of these options, the team selected the SmartLink, which was developed within the Electronic Medical Record (EMR) to link text from the RD note about malnutrition into the physician note, capturing the diagnosis of malnutrition, its severity, and its progression over time.

Results: Preliminary data show that physician documentation of the malnutrition diagnosis, as well as its severity and progression, increased by 20% in the pilot medical team. Anecdotally, physicians were more aware of the patient's nutrition status with documentation linked to their note, and collaboration between the medical team and the RD to treat malnutrition increased.

Conclusion: We hypothesize that expanding the practice to the entire hospital will increase documentation of the malnutrition diagnosis in the physician note. This will help increase awareness of the patient's nutrition status, draw attention to and promote collaboration on interventions to treat malnutrition, and increase billable revenue to the hospital by capturing the degree of malnutrition in the physician note.

David López-Daza, RD1; Cristina Posada-Alvarez, Centro Latinoamericano de Nutrición1; Alejandra Agudelo-Martínez, Universidad CES2; Ana Rivera-Jaramillo, Boydorr SAS3; Yeny Cuellar-Fernández, Centro Latinoamericano de Nutrición1; Ricardo Merchán-Chaverra, Centro Latinoamericano de Nutrición1; María-Camila Gómez-Univio, Centro Latinoamericano de Nutrición1; Patricia Savino-Lloreda, Centro Latinoamericano de Nutrición1

1Centro Latinoamericano de Nutrición (Latin American Nutrition Center), Chía, Cundinamarca; 2Universidad CES (CES University), Medellín, Antioquia; 3Boydorr SAS, Chía, Cundinamarca

Financial Support: None Reported.

Background: The Malnutrition Screening Tool (MST) is a simple and quick instrument designed to identify the risk of malnutrition in various healthcare settings, including home hospitalization. Its use has become widespread due to its ease of application and ability to preliminarily assess the nutritional status of patients. In the context of home care, where clinical resources may be limited, an efficient screening tool is crucial to ensure early interventions that prevent nutritional complications in vulnerable populations. The objective was to evaluate the diagnostic accuracy of the MST in detecting malnutrition among patients receiving home care.

Methods: A diagnostic test study was conducted, collecting sociodemographic data, MST results, and malnutrition diagnoses based on the Global Leadership Initiative on Malnutrition (GLIM) criteria. A positive MST score was defined as a value of 2 or above. Categorical data were summarized using proportions, while quantitative variables were described using measures of central tendency. Sensitivity, specificity, the area under the receiver operating characteristic curve (AUC), positive likelihood ratio (LR+), and negative likelihood ratio (LR-) were estimated along with their respective 95% confidence intervals.
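
As a refresher on the metrics named above, the following sketch computes them from a 2x2 table of index test (MST) versus reference standard (GLIM). The counts are arbitrary placeholders for illustration, not the study data.

    # Arbitrary placeholder counts: true positives, false positives, false negatives, true negatives
    tp, fp, fn, tn = 20, 5, 60, 80

    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    lr_pos = sensitivity / (1 - specificity) if specificity < 1 else float("inf")
    lr_neg = (1 - sensitivity) / specificity

    print(f"Sensitivity {sensitivity:.1%}, specificity {specificity:.1%}")
    print(f"LR+ {lr_pos:.2f}, LR- {lr_neg:.2f}")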

Results: A total of 676 patients were included, with a median age of 82 years (interquartile range: 68-89 years); 57.3% were female. According to the GLIM criteria, 59.8% of the patients met the criteria for malnutrition. The MST classified patients into low risk (62.4%), medium risk (30.8%), and high risk (6.8%). The sensitivity of the MST was 11.4% (95% CI: 8.5-14.9%), while its specificity was 100% (95% CI: 98.7-100%). The positive likelihood ratio (LR+) was 1.75, and the negative likelihood ratio (LR-) was 0.0. The area under the curve was 0.71 (95% CI: 0.69-0.73), indicating moderate discriminative capacity.

Conclusion: While the MST demonstrates extremely high specificity, its low sensitivity limits its effectiveness in accurately identifying malnourished patients in the context of home care. This suggests that, although the tool is highly accurate for confirming the absence of malnutrition, it fails to detect a significant number of patients who are malnourished. As a result, while the MST may be useful as an initial screening tool, its use should be complemented with more comprehensive assessments to ensure more precise detection of malnutrition in this high-risk population.

Poster of Distinction

Colby Teeman, PhD, RDN, CNSC1; Kaylee Griffith, BS2; Karyn Catrine, MS, RDN, LD3; Lauren Murray, MS, RD, CNSC, LD3; Amanda Vande Griend, BS, MS2

1University of Dayton, Xenia, OH; 2University of Dayton, Dayton, OH; 3Premier Health, Dayton, OH

Financial Support: None Reported.

Background: The prevalence of malnutrition in critically ill populations has previously been reported to be between 38-78%. Published guidelines state that patients in the ICU should be screened for malnutrition within 24-48 hours and that all patients in the ICU for >48 hours should be considered at high risk for malnutrition. Patients with a malnutrition diagnosis in the ICU have been shown to have poorer clinical outcomes, including longer length of stay, greater readmission rates, and increased mortality. The purpose of the current study was to determine whether severity of malnutrition impacted the time to initiate enteral nutrition and the time to reach the goal enteral nutrition rate in critically ill patients, and to determine the possible impact of malnutrition severity on clinical outcomes.

Methods: A descriptive, retrospective chart review was conducted in multiple ICUs at a large level I trauma hospital in the Midwest. All participants included in the analysis had been assessed for malnutrition by a registered dietitian according to ASPEN clinical practice guidelines. Exclusion criteria included patients receiving enteral nutrition (EN) prior to the RDN assessment, those who received EN for < 24 hours total, patients on mixed oral and enteral diets, and patients receiving any parenteral nutrition. Participants were grouped by malnutrition status: no malnutrition, n = 27; moderate malnutrition, n = 22; and severe malnutrition, n = 32. All data were analyzed using SPSS version 29.

Results: There was no difference in the primary outcomes (time to EN initiation, time to EN goal rate) by malnutrition status (both p > 0.05). Multiple regression analysis found that neither moderately nor severely malnourished patients were more likely to have enteral nutrition initiation delayed >48 hours from admission (p > 0.05). Neither ICU LOS nor hospital LOS differed among malnutrition groups (p > 0.05), and neither ICU nor hospital mortality differed among malnutrition groups (p > 0.05). Among patients who were moderately malnourished, 81.8% required vasopressors, compared with 75% of severely malnourished patients and 44.4% of patients without a malnutrition diagnosis (p = 0.010). Extended time on a ventilator (>72 hours) was required by 90.9% of moderately malnourished patients, compared with 59.4% of severely malnourished patients and 51.9% of patients without a malnutrition diagnosis (p = 0.011).

Conclusion: Although the severity of malnutrition did not impact LOS, readmission, or mortality, malnutrition status did significantly predict greater odds of a patient requiring vasopressors and spending an extended amount of time on a ventilator. Further studies with larger sample sizes are warranted to continue developing a better understanding of the relationship between malnutrition status and clinical outcomes.

Jamie Grandic, RDN-AP, CNSC1; Cindi Stefl, RN, BSN, CCDS2

1Inova Health System, Fairfax Station, VA; 2Inova Health System, Fairfax, VA

Encore Poster

Presentation: Vizient Connections Summit 2024 (Sept 16-19, 2024).

Publication: 2024 Vizient supplement to the American Journal of Medical Quality (AJMQ).

Financial Support: None Reported.

Background: Research reveals that up to 50% of hospitalized patients are malnourished, yet only 9% of these cases are diagnosed (1). Inadequate diagnosis and intervention of malnutrition can lead to poorer patient outcomes and reduced revenue. Our systemwide malnutrition awareness campaign successfully enhanced dietitian engagement and provider education and streamlined documentation processes. This initiative resulted in a two-fold increase in the capture of malnutrition codes, a notable rise in malnutrition variable capture, and an average increase in diagnosis-related group relative weight of approximately 0.9. Consequently, there was a ~300% increase in revenue associated with accurate malnutrition diagnosis and documentation, alongside improvements in the observed-to-expected (O/E) ratios for mortality and length of stay. Given the critical impact of malnutrition on mortality, length of stay, and costs, enhanced identification programs are essential. Within our health system, a malnutrition identification program has been implemented across five hospitals for several years. System leadership roles for clinical nutrition and clinical documentation integrity (CDI) were established to ensure consistency, implement best practices, and optimize program efficacy. In 2022, a comprehensive analysis identified opportunities for improvement: a low systemwide capture rate (2%), limited awareness of the program's benefits, and inconsistent documentation practices. The leadership team, with the support of our executive sponsors, addressed these issues, engaged with service line leaders, and continues to drive program enhancements.

Methods: A 4-part malnutrition education campaign was implemented: (1) collaboration between Clinical Nutrition and CDI was strengthened, ensuring daily systemwide communication of newly identified malnourished patients, with leadership teams, including coding and compliance, reviewing documentation protocols in light of denial risks and regulatory audits; (2) a systemwide dietitian training program was launched with a core RD malnutrition optimization team, a 5-hour comprehensive training, and monthly chart audits aiming for >80% documentation compliance; (3) a Provider Awareness Campaign was created featuring interactive presentations on the malnutrition program's benefits and provider documentation recommendations; and (4) an electronic health record (EHR) report and a malnutrition EHR tool were developed to standardize documentation. EHR and financial reports were used to monitor program impact and capture rates.

Results: The malnutrition campaign has notably improved outcomes through ongoing education for stakeholders. A malnutrition EHR tool was also created in November 2022. This tool is vital for enhancing documentation, significantly boosting provider and CDI efficiency. Key results include: Dietitian documentation compliance increased from 85% (July 2022) to 95% (2024); RD-identified malnutrition cases increased from 2% (2021) to 16% (2024); Monthly average of final coded malnutrition diagnoses increased from 240 (2021) to 717 (2023); Average DRG relative weight climbed from 1.24 (2021) to 2.17 (2023); Financial impact increased from $5.5 M (2021) to $17.7 M (2024); and LOS O/E improved from 1.04 to 0.94 and mortality O/E improved from 0.77 to 0.62 (2021-2023).

Conclusion: This systemwide initiative not only elevates capture rates and documentation but also enhances overall outcomes. By CDI and RD teams taking on more of a collaborative, leadership role, providers can concentrate more on patient care, allowing these teams to operate at their peak. Looking ahead to 2025, the focus will shift towards leading indicators to refine malnutrition identification and assess the educational campaign's impact further.

Ryota Sakamoto, MD, PhD1

1Kyoto University, Kyoto

Financial Support: None Reported.

Background: Growing concern about the environmental impact of meat eating has led to consideration of a shift to a plant-based diet. One nutrient that tends to be particularly deficient in a plant-based diet is vitamin B12. Since vitamin B12 deficiency can cause anemia, limb paresthesia and muscle weakness, and psychiatric symptoms including depression, delirium, and cognitive impairment, it is important to find sustainable local sources of vitamin B12. In this study, we focused on gundruk and sinki, two fermented and preserved vegetables traditionally consumed mainly in Nepal, India, and Bhutan, and investigated their vitamin B12 content. Sinki is mainly made from radish roots, while gundruk is made from green leaves such as mustard leaves; both are preserved through fermentation and sun-drying. Previous reports indicate that, in these regions, not only vegetarians and vegans but also a significant share of the wider population, especially the poor, may have consistently low meat intake. Governments and other organizations have initiated feeding programs to supply foods fortified with vitamins A, B1, B2, B3, B6, B9, and B12, iron, and zinc, especially to schools. At present, however, it is not easy to get fortified foods to residents in the community. It is therefore important to explore the possibility of obtaining vitamin B12 from locally available products that can be consumed by vegetarians, vegans, and the poor in these communities.

Methods: Four samples of gundruk and five samples of sinki were obtained from markets, and their vitamin B12 content was determined using Lactobacillus delbrueckii subsp. lactis (Lactobacillus leichmannii) ATCC 7830. The lower limit of quantification was set at 0.03 µg/100 g. The sample with the highest vitamin B12 concentration by the microbial assay was also measured for cyanocobalamin using LC-MS/MS (Shimadzu LC system equipped with a Triple Quad 5500 plus AB-Sciex mass spectrometer). The Multiple Reaction Monitoring transition for cyanocobalamin was Q1: 678.3 m/z, Q3: 147.1 m/z.

Results: For gundruk, vitamin B12 was detected in all four samples, with values of 5.0, 0.13, 0.12, and 0.04 µg/100 g from highest to lowest. For sinki, it was detected in four of the five samples, with values of 1.4, 0.41, 0.34, and 0.16 µg/100 g from highest to lowest. The cyanocobalamin concentration by LC-MS/MS in the one sample measured was estimated to be 1.18 µg/100 g.

Conclusion: According to “Vitamin and mineral requirements in human nutrition (2nd edition) (2004)” by the World Health Organization and the Food and Agriculture Organization of the United Nations, the recommended intake of vitamin B12 is 2.4 µg/day for adults, 2.6 µg/day for pregnant women and 2.8 µg/day for lactating women. The results of this study suggest that gundruk and sinki have potential as a source of vitamin B12, although there is a great deal of variability among samples. In order to use gundruk and sinki as a source of vitamin B12, it may be necessary to find a way to stabilize the vitamin B12 content while focusing on the relationship between vitamin B12 and the different ways of making gundruk and sinki.
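
To put the measured concentrations in context, the serving sizes implied by the 2.4 µg/day adult recommendation can be worked out with simple arithmetic. This is an illustration only, based on the concentrations reported above; it ignores bioavailability and batch-to-batch variability.

    RDA_UG_PER_DAY = 2.4   # WHO/FAO recommended vitamin B12 intake for adults

    samples_ug_per_100g = {
        "gundruk (highest)": 5.0,
        "gundruk (lowest detected)": 0.04,
        "sinki (highest)": 1.4,
        "sinki (lowest detected)": 0.16,
    }

    for name, conc in samples_ug_per_100g.items():
        grams_per_day = RDA_UG_PER_DAY / conc * 100   # grams of product to reach 2.4 ug
        print(f"{name}: ~{grams_per_day:,.0f} g/day to meet 2.4 ug/day")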

Teresa Capello, MS, RD, LD1; Amanda Truex, MS, RRT, RCP, AE-C1; Jennifer Curtiss, MS, RD, LD, CLC1; Ada Lin, MD1

1Nationwide Children's Hospital, Columbus, OH

Financial Support: None Reported.

Background: The metabolic demands of critically ill children are defined by an increase in resting energy expenditure (1,2). Energy needs in the PICU are ever changing, and accurate evaluations are challenging to obtain (3). Predictive equations have been found to be inaccurate, leading to over- or underfeeding, which can cause negative outcomes such as muscle loss and poor healing (underfeeding) or weight gain as adipose tissue (overfeeding) (1,4,5). Indirect calorimetry (IC) is considered the gold standard for assessing metabolic demand, especially for critically ill pediatric patients (1,2,4). The use of IC may be limited by staffing, equipment availability, and cost, as well as other patient-related issues and/or cart specifications (6). In our facility, we identified limited use of IC by dietitians. Most tests were ordered by PICU dietitians and rarely outside the critical care division, even though testing would benefit patients in other divisions of the hospital such as the CTICU, rehab, NICU, and stepdown areas. Informal polling of non-PICU dietitians revealed significant uncertainty in interpreting data and providing recommendations based on test results, mostly stemming from a lack of familiarity with this technology. The purpose of this study was to develop guidelines and a worksheet for consistently evaluating IC results, with the goal of encouraging increased use of indirect calorimetry at our pediatric facility.

Methods: A committee of registered dietitians (RDs) and respiratory therapists (RTs) met in January 2023 and agreed on step-by-step guidelines, which were trialed, reviewed, and updated monthly. The finalized guidelines were transitioned to a worksheet to improve consistency of use and aid in the interpretation of IC results. A shared file was established for articles about IC as well as access to the guidelines and worksheet (Figures 1 and 2). For this study, IC data from January 1, 2022 to July 31, 2024 were reviewed, including the number of tests completed and where the orders originated.

Results: Since the guidelines were implemented, IC use in non-PICU areas increased from 16% in 2022 to 30% in 2023 and appears to be on track to remain at that level in 2024 (Figure 3). RDs report improved comfort with evaluating test results as well as with making recommendations for test ordering.

Conclusion: The standardized guidelines and worksheet increased RDs' level of comfort with interpreting test results. The PICU RDs have become more proficient and comfortable explaining IC during PICU rounds. It is our hope that, with the development of the guidelines/worksheet, more non-PICU RDs will utilize IC testing outside of the critical care areas, where longer lengths of stay may occur. IC allows for more individualized nutrition prescriptions. An additional benefit was the mutual exchange of information between disciplines. The RTs provided education on the use of the machine to the RDs, which enhanced the RDs' understanding of IC test results from the RT perspective. In return, the RDs educated the RTs on why certain aspects of the patient's testing environment were helpful to report with the results so the RD could interpret the information correctly. The committee continues to meet and discuss patients' tests to see how testing can be optimized and how results may be used to guide nutrition care.

Figure 1. Screen Capture of Metabolic Cart Shared File.

Figure 2. IC Worksheet.

Figure 3. Carts completed per year by unit: 2022 is pre-intervention; 2023 and 2024 are post intervention. Key: H2B = PICU; other areas are non-PICU (H4A = cardiothoracic stepdown, H4B = cardiothoracic ICU, H5B = burn, H8A = pulmonary, H8B = stable trach/vent unit, H10B = Neurosurgery/Neurology, H11B = Nephrology/GI, H12A = Hematology/Oncology, C4A = NICU, C5B = infectious disease).

Alfredo Lozornio-Jiménez-de-la-Rosa, MD, MSCN1; Minu Rodríguez-Gil, MSCN2; Luz Romero-Manriqe, MSCN2; Cynthia García-Vargas, MD, MSCN2; Rosa Castillo-Valenzuela, PhD2; Yolanda Méndez-Romero, MD, MSC1

1Colegio Mexicano de Nutrición Clinica y Terapia Nutricional (Mexican College of Clinical Nutrition and Nutritional Therapy), León, Guanajuato; 2Colegio Mexicano de Nutrición Clínica y Terapia Nutricional (Mexican College of Clinical Nutrition and Nutritional Therapy), León, Guanajuato

Financial Support: None Reported.

Background: Sarcopenia is a systemic, progressive musculoskeletal disorder associated with an increased risk of adverse events and is highly prevalent in older adults. This condition leads to a decline in functionality and quality of life and has an economic impact. Sarcopenia is becoming increasingly common due to age-related changes, metabolic changes, obesity, sedentary lifestyle, chronic degenerative diseases, malnutrition, and pro-inflammatory states. The objective of this study was to investigate the relationship between strength and muscle mass, measured by calf circumference, both corrected for BMI, in young and older Mexican adults.

Methods: This is a prospective, observational, cross-sectional, population-based clinical study conducted among Mexican men and women aged 30 to 90 years old, obtained through convenience sampling. The study was approved by the Ethics Committee of the Aranda de la Parra Hospital in León, Guanajuato, Mexico, and adheres to the Declaration of Helsinki. Informed consent was obtained from all participants after explaining the nature of the study. Inclusion criteria were: Mexican men and women aged 30 to 90 years old who were functionally independent (Katz index category "a"). Participants with amputations, movement disorders, or immobilization devices on their extremities were excluded. The research team was previously standardized in anthropometric measurements. Demographic data and measurements of weight, height, BMI, calf circumference, and grip strength using a dynamometer were collected. Data are presented as average and standard deviation. Spearman's correlation analysis was used to assess the relationship between BMI and calf circumference adjusted for BMI, with grip strength, considering a significance level of p < 0.05.
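For illustration only, a minimal Python sketch of the correlation step described above; the column names and the divide-by-BMI adjustment are assumptions (the abstract does not specify how the adjustment was performed), and this is not the study's own analysis code.

```python
# Illustrative sketch only -- not the study's code. Assumes a pandas DataFrame `df`
# with columns "age", "grip_strength_kg", "calf_circumference_cm", and "bmi".
import pandas as pd
from scipy.stats import spearmanr

def bmi_adjusted(series: pd.Series, bmi: pd.Series) -> pd.Series:
    # One simple way to "adjust" a measurement for BMI is to divide by BMI;
    # the abstract does not state its adjustment method, so this is an assumption.
    return series / bmi

def correlations(df: pd.DataFrame) -> dict:
    adj_cc = bmi_adjusted(df["calf_circumference_cm"], df["bmi"])
    adj_grip = bmi_adjusted(df["grip_strength_kg"], df["bmi"])
    return {
        "age_vs_grip": spearmanr(df["age"], adj_grip),
        "age_vs_calf": spearmanr(df["age"], adj_cc),
        "calf_vs_grip": spearmanr(adj_cc, adj_grip),
    }
```

Each `spearmanr` call returns the correlation coefficient and p-value, which is the form in which the study reports its results.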

Results: The results of 1032 subjects were presented, 394 men and 638 women from central Mexico, recruited at workplaces, recreation centers, and health facilities, aged between 30 and 90 years old. Table 1 shows the distribution of the population in each age category, by sex. Combined obesity and overweight were found in 75.1% of the sample population, with a frequency of 69.2% in men and 78.7% in women; 20% had a normal weight (25.6% of men and 16.6% of women), and 4.8% had low BMI (5.1% of men and 4.7% of women) (Graph 1). The depletion of calf circumference corrected for BMI and age in the female population begins at 50 years old, with exacerbation at 65 years old and older, while in men a greater depletion can be observed from 70 years old onwards (Graph 2). When analyzing strength corrected for BMI and age, grip strength declines from 55 years old and continues to decline with increasing age in both sexes (Chi-square = 83.5, p < 0.001) (Graph 3). By Spearman correlation, an inverse and high correlation was found in both sexes between age and grip strength; that is, as age increases, grip strength decreases (r = -0.530, p < 0.001). A moderate and negative correlation was found between age and calf circumference: as age increases, calf circumference decreases, independently of BMI (r = -0.365, p < 0.001). Calf circumference and grip strength were positively and moderately correlated: as calf circumference decreases, grip strength decreases, independently of BMI (r = 0.447, p < 0.0001).

Conclusion: These results show that the study population exhibited a decrease in grip strength, not related to BMI, from early ages, which may increase the risk of early-onset sarcopenia. These findings encourage early assessment of both strength and muscle mass, using simple and accessible measurements such as grip strength and calf circumference, adjusted for BMI. These measurements can be performed in the office during the initial patient encounter or in large populations, as in this study.

Table 1. Distribution of the Population According to Age and Gender.

Alison Hannon, Medical Student1; Anne McCallister, DNP, CPNP2; Kanika Puri, MD3; Anthony Perkins, MS1; Charles Vanderpool, MD1

1Indiana University School of Medicine, Indianapolis, IN; 2Indiana University Health, Indianapolis, IN; 3Riley Hospital for Children at Indiana University Health, Indianapolis, IN

Financial Support: None Reported.

Background: The Academy of Nutrition and Dietetics and American Society for Parenteral and Enteral Nutrition have published criteria to diagnose a patient with mild, moderate, or severe malnutrition that use either single or multiple data points. Malnutrition is associated with worse clinical outcomes, and previous data from our institution showed that hospitalized children with severe malnutrition are at higher risk of mortality compared to those with mild or moderate malnutrition. This project aims to determine whether differences in clinical outcomes exist in patients with severe malnutrition based on the diagnostic criteria used or on anthropometric differences.

Methods: We included all patients discharged from Riley Hospital for Children within the 2023 calendar year diagnosed with severe malnutrition, excluding maternity discharges. The diagnostic criteria used to determine severe malnutrition were collected from registered dietitian (RD) documentation and the RD-assigned malnutrition statement within the medical record for the admission. Data were collected on readmission rates, mortality, length of stay (LOS), LOS index, cost, operative procedures, pediatric intensive care unit (ICU) admissions, and anthropometrics measured on admission. We used mixed-effects regression or mixed-effects logistic regression to test whether the outcome of interest differed by severe malnutrition type. Each model contained a random effect for patient, to account for correlation within admissions by the same patient, and a fixed effect for severe malnutrition type. All analyses were performed using SAS v9.4.

Results: Data was gathered on 409 patient admissions. 383 admissions had diagnostic criteria clearly defined regarding severity of malnutrition. This represented 327 unique patients (due to readmissions). There was no difference in any measured clinical outcomes based on the criteria used for severe malnutrition, including single or multiple point indicators or patients who met both single and multiple point indicators (Table 1). Anthropometric data was analyzed including weight Z-score (n = 398) and BMI Z-score (n = 180). There was no difference seen in the majority of measured clinical outcomes in admissions with severe malnutrition when comparing based on weight or BMI Z-score categories of Z < -2, -2 < Z < -0.01, or Z > 0 (Table 2). Patients admitted with severe malnutrition and a BMI Z score > 0 had an increase in median cost (p = 0.042) compared to BMI < -2 or between -2 and 0 (Table 2). There was a trend towards increased median cost (p = 0.067) and median LOS (p = 0.104) in patients with weight Z score > 0.

Conclusion: Hospitalized patients with severe malnutrition have similar clinical outcomes regardless of the diagnostic criteria used to determine the diagnosis of severe malnutrition. Fewer admissions with severe malnutrition (n = 180, or 44%) had sufficient anthropometric data to determine BMI. Based on these data, future programs at our institution aimed at intervention prior to and following admission will need to focus on all patients with severe malnutrition and will not be narrowed based on the criteria (single or multiple data point) of severe malnutrition or on anthropometrics. Quality improvement projects include improving height measurement and BMI determination upon admission, which will allow for future evaluation of the impact of anthropometrics on clinical outcomes.

Table 1. Outcomes by Severe Malnutrition Diagnosis Category.

Outcomes by severe malnutrition compared based on the diagnostic criteria used to determine malnutrition diagnosis. Patients with severe malnutrition only are represented. Diagnostic criteria determined based on ASPEN/AND guidelines and defined during admission by registered dietitian (RD). OR = operative room; ICU = intensive care unit; LOS = length of stay. Data on 383 admissions presented, total of 327 patients due to readmissions: 284 patients had 1 admission; 33 patients had 2 admissions; 8 patients had 3 admissions; 1 patient had 4 admissions; 1 patient had 5 admissions

Table 2. Outcomes By BMI Z-score Category.

Outcomes of patients admitted with severe malnutrition, stratified based on BMI Z-score. Patients with severe malnutrition only are represented. BMI Z-score determined based on weight and height measurement at time of admission, recorded by bedside admission nurse. OR = operative room; ICU = intensive care unit; LOS = length of stay. Due to incomplete height measurements, data on only 180 admissions was available, total of 158 patients: 142 patients had 1 admission; 12 patients had 2 admissions; 3 patients had 3 admissions; 1 patient had 5 admissions

Claudia Maza, ND MSc1; Isabel Calvo, MD, MSc2; Andrea Gómez, ND2; Tania Abril, MSc3; Evelyn Frias-Toral, MD, MSc4

1Centro Médico Militar (Military Medical Center), Guatemala, Santa Rosa; 2Hospital General de Tijuana (Tijuana General Hospital), Tijuana, Baja California; 3Universidad Católica de Santiago de Guayaquil (Catholic University of Santiago de Guayaquil), Guayaquil, Guayas; 4Universidad Espíritu Santo (Holy Spirit University), Dripping Springs, TX

Financial Support: None Reported.

Background: Malnutrition is a common and significant issue among hospitalized patients, particularly in older adults or those with multiple comorbidities. The presence of malnutrition in such patients is associated with an increased risk of morbidity, mortality, prolonged hospital stays, and elevated healthcare costs. A crucial indicator of nutritional status is muscle strength, which can be effectively measured using handgrip strength (HGS). This study aimed to describe the relationship between nutritional status and muscle strength reduction in patients from two hospitals in Latin America.

Methods: A retrospective observational study was conducted from February to May 2022. Data were collected from two hospitals: one in Guatemala and one in Mexico. A total of 169 patients aged 19-98 years were initially considered for the study, and 127 met the inclusion criteria. The sample comprised adult patients of both sexes admitted to internal medicine, surgery, and geriatrics departments. Handgrip strength, demographic data, baseline medical diagnosis, weight, and height were recorded at admission and on the 14th day of hospitalization. Exclusion criteria included patients with arm or hand movement limitations, those under sedation or mechanical ventilation, and those hospitalized for less than 24 hours. HGS was measured using JAMAR® and Smedley dynamometers, following standard protocols. Statistical analysis was performed using measures of central tendency, and results were presented in tables and figures.

Results: In the first hospital (Mexico), 62 patients participated, with a predominantly female sample. The average weight was 69.02 kg, height 1.62 meters, and BMI 26.14 kg/m² (classified as overweight). The most common admission diagnoses were infectious diseases, nervous system disorders, and digestive diseases (Table 1). A slight increase in HGS (0.49 kg) was observed between the first and second measurements (Figure 1). In the second hospital (Guatemala), 62 patients also met the inclusion criteria, with a predominantly male sample. The average weight was 65.92 kg, height 1.61 meters, and BMI 25.47 kg/m² (classified as overweight). Infectious diseases and musculoskeletal disorders were the most common diagnoses (Table 1). HGS decreased by 2 kg between the first and second measurements (Figure 2). Low HGS was associated with underweight patients and those with class II and III obesity. Patients with normal BMI in both centers exhibited significant reductions in muscle strength, indicating that weight alone is not a sufficient indicator of muscle strength preservation.

Conclusion: This multicenter study highlights the significant relationship between nutritional status and decreased muscle strength in hospitalized patients. While underweight patients showed reductions in HGS, those with class II and III obesity also experienced significant strength loss. These findings suggest that HGS is a valuable, non-invasive tool for assessing both nutritional status and muscle strength in hospitalized patients. Early identification of muscle strength deterioration can help healthcare providers implement timely nutritional interventions to improve patient outcomes.

Table 1. Baseline Demographic and Clinical Characteristics of the Study Population.

NS: Nervous System, BMI: Body Mass Index

Figure 1. Relationship Between Nutritional Status and Handgrip Strength (Center 1 - Mexico).

Figure 2. Relationship Between Nutritional Status and Handgrip Strength (Center 2 - Guatemala).

Reem Farra, MDS, RD, CNSC, CCTD1; Cassie Greene, RD, CNSC, CDCES2; Michele Gilson, MDA, RD, CEDS2; Mary Englick, MS, RD, CSO, CDCES2; Kristine Thornham, MS, RD, CDE2; Debbie Andersen, MS, RD, CEDRD-S, CHC3; Stephanie Hancock, RD, CSP, CNSC4

1Kaiser Permanente, Lone Tree, CO; 2Kaiser Permanente, Denver, CO; 3Kaiser Permanente, Castle Rock, CO; 4Kaiser Permanente, Littleton, CO

Financial Support: None Reported.

Background: Currently, there is no standardized screening required by the Centers for Medicare and Medicaid Services for malnutrition in outpatient settings. This raises concerns about early identification of malnutrition, the likelihood of nutrition interventions, and increased healthcare costs. While malnutrition screening in hospitalized patients is well-studied, its impact in outpatient care has not been thoroughly examined. Studies show that malnourished patients are 40% more likely to be readmitted within 30 days of discharge, adding over $10,000 to hospitalization costs. This quality improvement project aimed to evaluate the impact of implementing a standardized malnutrition screening tool for all patients as part of nutrition assessments by Registered Dietitians (RDs).

Methods: The Malnutrition Screening Tool (MST) was chosen due to its validity, sensitivity, and specificity in identifying malnutrition risk in outpatient settings. This tool assesses risk by asking about recent unintentional weight loss and decreased intake due to appetite loss. Based on responses, a score of 0-5 indicates the severity of risk. Scores of 0-1 indicate no risk, while scores of 2-5 indicate a risk for malnutrition. This questionnaire was integrated into the nutrition assessment section of the electronic medical record to standardize screening for all patients. Those with scores of 2 or greater were included, with no exclusions for disease states. Patients were scheduled for follow-ups 2-6 weeks after the initial assessment, during which their MST score was recalculated.
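For illustration, a minimal Python sketch of MST scoring consistent with the description above. Only the 0-5 range and the ≥ 2 risk threshold are stated in the abstract; the per-answer weights for the weight-loss question follow the commonly published version of the tool and should be verified against the validated instrument.

```python
# Illustrative sketch of MST scoring; the weight-loss sub-scores are assumptions taken
# from the commonly published tool, not restated in the abstract.
from typing import Optional

def mst_score(weight_loss_kg: Optional[float], unsure_weight_loss: bool,
              poor_appetite: bool) -> int:
    if unsure_weight_loss:
        loss_score = 2
    elif weight_loss_kg is None or weight_loss_kg <= 0:
        loss_score = 0
    elif weight_loss_kg <= 5:
        loss_score = 1
    elif weight_loss_kg <= 10:
        loss_score = 2
    elif weight_loss_kg <= 15:
        loss_score = 3
    else:
        loss_score = 4
    # Appetite question contributes 0 (eating normally) or 1 (eating poorly due to decreased appetite).
    return loss_score + (1 if poor_appetite else 0)

def at_risk(score: int) -> bool:
    # Per the abstract: scores of 0-1 indicate no risk; scores of 2-5 indicate risk of malnutrition.
    return score >= 2
```

For example, a patient reporting an unintentional 7 kg loss plus poor appetite would score 3 and be flagged for nutrition follow-up.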

Results: A total of 414 patients were screened, with 175 completing follow-up visits. Of these, 131 showed improvements in their MST scores after nutrition interventions, 12 had score increases, and 32 maintained the same score. Two hundred thirty-nine patients were lost to follow-up for various reasons, including lack of response, limited RD scheduling access, changes in insurance, and mortality. Those with improved MST scores experienced an average cost avoidance of $15,000 each in subsequent hospitalizations. This cost avoidance was due to shorter hospitalizations, better treatment responses, and reduced need for medical interventions. The project raised awareness about the importance of early nutrition interventions among the multidisciplinary team.

Conclusion: This project suggests that standardizing malnutrition screening in outpatient settings could lead to cost avoidance for patients and health systems, along with improved overall care. Further studies are needed to identify the best tools for outpatient malnutrition screening, optimal follow-up timelines, and effective nutrition interventions for greatest cost avoidance.

Amy Sharn, MS, RDN, LD1; Raissa Sorgho, PhD, MScIH2; Suela Sulo, PhD, MSc3; Emilio Molina-Molina, PhD, MSc, MEd4; Clara Rojas Montenegro, RD5; Mary Jean Villa-Real Guno, MD, FPPS, FPSPGHAN, MBA6; Sue Abdel-Rahman, PharmD, MA7

1Abbott Nutrition, Columbus, OH; 2Center for Wellness and Nutrition, Public Health Institute, Sacramento, CA; 3Global Medical Affairs and Research, Abbott Nutrition, Chicago, IL; 4Research & Development, Abbott Nutrition, Granada, Andalucia; 5Universidad del Rosario, Escuela de Medicina, Bogota, Cundinamarca; 6Ateneo de Manila University, School of Medicine and Public Health, Metro Manila, National Capital Region; 7Health Data Synthesis Institute, Chicago, IL

Encore Poster

Presentation: American Society for Nutrition, June 29-July 2, Chicago, IL, USA; American Academy of Pediatrics, September 27-October 1, Orlando, FL, USA.

Publication: Sharn AR, Sorgho R, Sulo S, Molina-Molina E, Rojas Montenegro C, Villa-Real Guno MJ, Abdel-Rahman S. Using mid-upper arm circumference z-score measurement to support youth malnutrition screening as part of a global sports and wellness program and improve access to nutrition care. Front Nutr. 2024 Aug 12;11:1423978. doi: 10.3389/fnut.2024.1423978. PMID: 39188981; PMCID: PMC11345244.

Financial Support: This study was financially supported by the Abbott Center for Malnutrition Solutions, Chicago, IL, USA.

Veeradej Pisprasert, MD, PhD1; Kittipadh Boonyavarakul, MD2; Sornwichate Rattanachaiwong, MD3; Thunchanok Kuichanuan, MD3; Pranithi Hongsprabhas, MD3; Chingching Foocharoen, MD3

1Faculty of Medicine, Khon Kaen University, Muang Khon Kaen, Khon Kaen; 2Chulalongkorn University, Bangkok, Krung Thep; 3Department of Medicine, Khon Kaen University, Muang Khon Kaen, Khon Kaen

Financial Support: Grant supported by Khon Kaen University.

Background: Systemic sclerosis (SSc) is an autoimmune disease in which malnutrition is a common complication, caused by the chronic inflammation of its natural history and/or gastrointestinal tract involvement. Current nutritional assessment tools, e.g., the GLIM criteria, may include data regarding muscle mass measurement for nutritional diagnosis. Anthropometric measurement is a basic method of determining muscle mass; however, data in this condition are limited. This study aimed to determine the utility of anthropometric measures of muscle mass and muscle function for diagnosing malnutrition in SSc patients.

Methods: A cross-sectional diagnostic study was conducted in adult SSc patients at Srinagarind Hospital, Thailand. All patients were assessed for malnutrition based on the Subjective Global Assessment (SGA). Muscle mass was measured by mid-upper-arm muscle circumference (MUAC) and calf circumference (CC); in addition, muscle function was determined by handgrip strength (HGS).

Results: A total of 208 SSc patients were included, of whom 149 were female (71.6%). The mean age and body mass index were 59.3 ± 11.0 years and 21.1 ± 3.9 kg/m², respectively. Nearly half (95 cases; 45.7%) were malnourished based on SGA. Mean values of MUAC, CC, and HGS were 25.9 ± 3.83 cm, 31.5 ± 3.81 cm, and 19.0 ± 6.99 kg, respectively. The area under the curve (AUC) of the receiver operating characteristic (ROC) curve for diagnosing malnutrition was 0.796 for MUAC, 0.759 for CC, and 0.720 for HGS. Proposed cut-off values are shown in Table 1.
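For illustration, a hedged Python sketch of how an AUC and a proposed cut-off of this kind are commonly derived; the abstract does not state the cut-off selection criterion, so Youden's J and the array names below are assumptions, not the study's method.

```python
# Illustrative sketch only. Assumes numpy arrays `values` (e.g., MUAC in cm) and
# `malnourished` (1 = malnourished by SGA, 0 = well nourished).
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

def auc_and_cutoff(values: np.ndarray, malnourished: np.ndarray):
    scores = -values                   # lower MUAC/CC/HGS suggests malnutrition, so negate
    auc = roc_auc_score(malnourished, scores)
    fpr, tpr, thresholds = roc_curve(malnourished, scores)
    j = tpr - fpr                      # Youden's J at each candidate threshold (assumption)
    best = int(np.argmax(j))
    cutoff = -thresholds[best]         # convert back to the original measurement scale
    return auc, cutoff, tpr[best], 1 - fpr[best]   # AUC, cut-off, sensitivity, specificity
```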

Conclusion: Muscle mass and muscle function were associated with malnutrition. Assessment of muscle mass and/or function by anthropometric measurement may be one part of nutritional assessment in patients with systemic sclerosis.

Table 1. Proposed Cut-Off Values of MUAC, CC, and HGS in Patients With Systemic Sclerosis.

CC: calf circumference; HGS: handgrip strength; MUAC: mid-upper-arm circumference.

Figure 1. ROC Curve of MUAC, CC, and HGS in Diagnosing Malnutrition by Subjective Global Assessment (SGA).

Trevor Sytsma, BS1; Megan Beyer, MS, RD, LDN2; Hilary Winthrop, MS, RD, LDN, CNSC3; William Rice, BS4; Jeroen Molinger, PhDc5; Suresh Agarwal, MD3; Cory Vatsaas, MD3; Paul Wischmeyer, MD, EDIC, FCCM, FASPEN6; Krista Haines, DO, MA3

1Duke University, Durham, NC; 2Duke University School of Medicine- Department of Anesthesiology, Durham, NC; 3Duke University School of Medicine, Durham, NC; 4Eastern Virginia Medical School, Norfolk, VA; 5Duke University Medical Center - Department of Anesthesiology - Duke Heart Center, Raleigh, NC; 6Duke University Medical School, Durham, NC

Financial Support: Baxter.

Background: Achieving acceptable nutritional goals is a crucial but often overlooked component of postoperative care, impacting patient-important outcomes like reducing infectious complications and shortening ICU length of stay. Predictive resting energy expenditure (pREE) equations poorly correlate with actual measured REE (mREE), leading to potentially harmful over- or under-feeding. International ICU guidelines now recommend the use of indirect calorimetry (IC) to determine mREE and personalized patient nutrition. Surgical stress increases protein catabolism and insulin resistance, but the effect of age on postoperative mREE trends, which commonly used pREE equations do not account for, is not well studied. This study hypothesized that older adults undergoing major abdominal surgery experience slower metabolic recovery than younger patients, as measured by IC.

Methods: This was an IRB-approved prospective trial of adult patients following major abdominal surgery with open abdomens secondary to blunt or penetrating trauma, sepsis, or vascular emergencies. Patients underwent serial IC assessments to guide postoperative nutrition delivery. Assessments were offered within the first 72 hours after surgery, then every 3 ± 2 days during ICU stay, followed by every 7 ± 2 days in stepdown. Patient mREE was determined using the Q-NRG® Metabolic Monitor (COSMED) in ventilator mode for mechanically ventilated patients or the mask or canopy modes, depending on medical feasibility. IC data were selected from ≥ 3-minute intervals that met steady-state conditions, defined by a variance of oxygen consumption and carbon dioxide production of less than 10%. Measurements not meeting these criteria were excluded from the final analysis. Patients without mREE data during at least two time points in the first nine postoperative days were also excluded. Older adult patients were defined as ≥ 65 years and younger patients as ≤ 50. Trends in REE were calculated using the method of least squares and compared using t-tests assuming unequal variance (α = 0.05). Patients' pREE values were calculated from admission anthropometric data using ASPEN-SCCM equations and compared against IC measurements.
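For illustration, a minimal Python sketch of the analysis steps described above: a steady-state screen on VO2/VCO2 variability, a per-patient least-squares slope of mREE over postoperative day, and a Welch (unequal-variance) t-test between age groups. Variable and function names are assumptions, not the study's code.

```python
# Illustrative sketch only -- not the study's analysis code.
import numpy as np
from scipy import stats

def steady_state(vo2: np.ndarray, vco2: np.ndarray, max_cv: float = 0.10) -> bool:
    # Accept an interval when VO2 and VCO2 each vary by less than 10% (coefficient of variation).
    return (np.std(vo2) / np.mean(vo2) < max_cv) and (np.std(vco2) / np.mean(vco2) < max_cv)

def ree_slope(postop_day: np.ndarray, mree_kcal: np.ndarray) -> float:
    # Least-squares slope of measured REE versus postoperative day (kcal/day per day).
    slope, _intercept = np.polyfit(postop_day, mree_kcal, 1)
    return slope

def compare_groups(older_slopes, younger_slopes):
    # Welch's t-test (unequal variances) comparing per-patient recovery slopes.
    return stats.ttest_ind(older_slopes, younger_slopes, equal_var=False)
```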

Results: Eighteen older and 15 younger adults met pre-specified eligibility criteria and were included in the final analysis. Average rates and standard error of REE recovery in older and younger adult patients were 28.9 ± 17.1 kcal/day and 75.5 ± 18.9 kcal/day, respectively, which approached but did not reach statistical significance (p = 0.07). The lower and upper bounds of pREE for the older cohort averaged 1728 ± 332 kcal and 2093 ± 390 kcal, respectively, markedly exceeding the mREE obtained by IC for most patients and failing to capture the observed variability identified using mREE. In younger adults, pREE values were closer to IC measurements, with lower and upper averages of 1705 ± 278 kcal and 2084 ± 323 kcal, respectively.

Conclusion: Our data signal a difference in rates of metabolic recovery after major abdominal surgery between younger and older adult patients but did not reach statistical significance, possibly due to insufficient sample size. Predictive energy equations do not adequately capture changes in REE and may overestimate postoperative energy requirements in older adult patients, failing to appreciate the increased variability in mREE that our study found in older patients. These findings reinforce the importance of using IC to guide nutrition delivery during the early postoperative recovery period. Larger trials employing IC and quantifying protein metabolism contributions are needed to explore these questions further.

Table 1. Patient Demographics.

Figure 1. Postoperative Changes in mREE in Older and Younger Adult Patients Following Major Abdominal Surgery Compared to pREE (ASPEN).

Amber Foster, BScFN, BSc1; Heather Resvick, PhD(c), MScFN, RD2; Janet Madill, PhD, RD, FDC3; Patrick Luke, MD, FRCSC2; Alp Sener, MD, PhD, FRCSC4; Max Levine, MD, MSc5

1Western University, Ilderton, ON; 2LHSC, London, ON; 3Brescia School of Food and Nutritional Sciences, Faculty of Health Sciences, Western University, London, ON; 4London Health Sciences Centre, London, ON; 5University of Alberta, Edmonton, AB

Financial Support: Brescia University College MScFN stipend.

Background: Currently, body mass index (BMI) is used as the sole criterion to determine whether patients with chronic kidney disease (CKD) are eligible for kidney transplantation. However, BMI is not a good measure of health in this patient population, as it does not distinguish between muscle mass, fat mass, and water weight. Individuals with end-stage kidney disease often experience shifts in fluid balance, resulting in fluid retention, swelling, and weight gain. Consequently, the BMI of these patients may be falsely elevated. Therefore, it is vitally important to consider more accurate and objective measures of body composition for this patient population. The aim of this study was to determine whether there is a difference in body composition between individuals with CKD who are categorized as having healthy body weight, overweight, or obesity.

Methods: This was a cross-sectional study analyzing body composition of 114 adult individuals with CKD being assessed for kidney transplantation. Participants were placed into one of three BMI groups: healthy weight (group 1, BMI < 24.9 kg/m2, n = 29), overweight (group 2, BMI ≥ 24.9-29.9 kg/m2, n = 39) or with obesity (group 3, BMI ≥ 30 kg/m2, n = 45). Fat mass, lean body mass (LBM), and phase angle (PhA) were measured using Bioelectrical Impedance Analysis (BIA). Standardized phase angle (SPhA), a measure of cellular health, was calculated by [(observed PhA-mean PhA)/standard deviation of PhA]. Handgrip strength (HGS) was measured using a Jamar dynamometer, and quadriceps muscle layer thickness (QMLT) was measured using ultrasonography. Normalized HGS (nHGS) was calculated as [HGS/body weight (kg)], and values were compared to age- and sex-specific standardized cutoff values. Fat free mass index (FFMI) was calculated using [LBM/(height (m))2]. Low FFMI, which may identify high risk of malnutrition, was determined using ESPEN cut-off values of < 17 kg/m2 for males and < 15 kg/m2 for females. Frailty status was determined using the validated Fried frailty phenotype assessment tool. Statistical analysis: continuous data was analyzed using one-way ANOVA followed by Tukey post hoc, while chi-square tests were used for analysis of categorical data. IBM SPSS version 29, significance p < 0.05.
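For illustration, a short Python sketch of the derived measures defined above (FFMI, standardized phase angle, normalized handgrip strength). The cut-offs are those quoted in the abstract; the reference phase-angle mean and standard deviation must come from the sources the study used, and all function and parameter names are illustrative.

```python
# Illustrative sketch of the derived body-composition measures described above.
def ffmi(lean_body_mass_kg: float, height_m: float) -> float:
    # Fat-free mass index = LBM / height^2
    return lean_body_mass_kg / height_m ** 2

def low_ffmi(ffmi_value: float, male: bool) -> bool:
    # ESPEN cut-offs quoted in the abstract: < 17 kg/m2 (males), < 15 kg/m2 (females).
    return ffmi_value < (17.0 if male else 15.0)

def standardized_phase_angle(pha: float, ref_mean_pha: float, ref_sd_pha: float) -> float:
    # SPhA = (observed PhA - reference mean PhA) / reference SD of PhA
    return (pha - ref_mean_pha) / ref_sd_pha

def normalized_hgs(hgs_kg: float, body_weight_kg: float) -> float:
    # nHGS = handgrip strength / body weight; compared against age- and sex-specific cut-offs.
    return hgs_kg / body_weight_kg
```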

Results: Participants in group 1 were younger than those in either group 2 (p = 0.004) or group 3 (p < 0.001). There was no significant difference in the distribution of males and females across the three groups. The proportion of participants with FFMI below the cutoff was significantly higher in group 1 (13%) versus group 2 (0%) and group 3 (2.1%) (p = 0.02). A significant difference in nHGS was found between groups, with low muscle strength occurring more frequently among participants in group 3 (75%) versus 48.7% in group 2 and 28.5% in group 1 (p < 0.001). No significant differences were seen in QMLT, SPhA, HGS, or frailty status between the three BMI groups.

Conclusion: It appears that there is no difference in body composition parameters such as QMLT, SPhA, or frailty status between the three BMI groups. However, patients with CKD categorized as having a healthy BMI were more likely to be at risk for malnutrition. Furthermore, those individuals categorized as having a healthy BMI appeared to have more muscle strength compared to the other two groups. Taken together, these results provide convincing evidence that BMI should not be the sole criterion for listing patients for kidney transplantation. Further research is needed to confirm these findings.

Kylie Waynick, BS1; Katherine Petersen, MS, RDN, CSO2; Julie Kurtz, MS, CDCES, RDN2; Maureen McCoy, MS, RDN3; Mary Chew, MS, RDN4

1Arizona State University and Veterans Healthcare Administration, Phoenix, AZ; 2Veterans Healthcare Administration, Phoenix, AZ; 3Arizona State University, Phoenix, AZ; 4Phoenix VAHCS, Phoenix, AZ

Financial Support: None Reported.

Background: Malnutrition does not have a standardized definition nor universal identification criteria. Registered dietitian nutritionists (RDNs) most often diagnose based on Academy and ASPEN Identification of Malnutrition (AAIM) criteria, while physicians are required to use the International Classification of Diseases Version 10 (ICD-10-CM). However, there are major differences in how malnutrition is identified between the two. The malnutrition ICD-10 codes (E43.0, E44.0, E44.1, E50.0-E64.9) have vague diagnostic criteria, leading providers to rely on clinical expertise and prior nutrition education. For dietitians, AAIM's diagnostic criteria are clearly defined and validated to identify malnutrition based on reduced weight and intake, loss of muscle and fat mass, fluid accumulation, and decrease in physical functioning. Due to this lack of standardization, the process of identifying and diagnosing malnutrition is inconsistent. The purpose of this study was to analyze the congruence of malnutrition diagnosis between physicians and dietitians using the two methods and to compare patient outcomes between the congruent and incongruent groups.

Methods: A retrospective chart review was conducted of 668 inpatients assigned a malnutrition diagnostic code, electronically pulled from the Veterans Health Administration's Clinical Data Warehouse for the time periods of April through July in 2019, 2020, and 2021. Length of stay, infection, pressure injury, falls, thirty-day readmissions, and documentation of communication between providers were collected from chart review. Data on cost to the hospital were pulled from Veterans Equitable Resource Allocation (VERA) and paired with matching social security numbers in the sample. Chi-square tests were used to compare differences between incongruency and congruency for infection, pressure injury, falls, and readmissions. Means for length of stay and cost to hospital between the two groups were analyzed using ANOVA in SPSS.

Results: The diagnosis of malnutrition was incongruent between providers. The incongruent group had a higher percentage of adverse patient outcomes than the group with congruent diagnoses. Congruent diagnoses were significantly associated with the incidence of documented communication (p < 0.001).

Conclusion: This study showcases a gap in malnutrition patient care. Further research needs to be conducted to understand the barriers to congruent diagnosis and communication between providers.

Nana Matsumoto, RD, MS1; Koji Oba, Associate Professor2; Tomonori Narita, MD3; Reo Inoue, MD2; Satoshi Murakoshi, MD, PhD4; Yuki Taniguchi, MD2; Kenichi Kono, MD2; MIdori Noguchi, BA5; Seiko Tsuihiji2; Kazuhiko Fukatsu, MD, PhD2

1The University of Tokyo, Bunkyo-City, Tokyo; 2The University of Tokyo, Bunkyo-ku, Tokyo; 3The University of Tokyo, Chuo-City, Tokyo; 4Kanagawa University of Human Services, Yokosuka-city, Kanagawa; 5The University of Tokyo Hospital, Bunkyo-ku, Tokyo

Financial Support: None Reported.

Background: Therapeutic diets are often prescribed for patients with various disorders, for example, diabetes, renal dysfunction, and hypertension. However, due to limits on the amounts of nutrients provided, therapeutic diets might reduce appetite. Hospital meals maintain patients' nutritional status when the meals are fully consumed, regardless of the diet type. It is possible that therapeutic diets are at least partly associated with malnutrition in hospital. Therefore, we conducted an exploratory retrospective cohort study to investigate whether there are differences in patients' oral consumption between therapeutic and regular diets, taking into account other factors.

Methods: The study protocol was approved by the Ethics Committee of the University of Tokyo under protocol No. 2023396NI-(1). We retrospectively extracted information from the medical records of patients admitted to the department of orthopedic and spine surgery at the University of Tokyo Hospital between June and October 2022. Eligible patients were older than 20 years, were hospitalized for more than 7 days, and were provided oral diets as their main source of nutrition. Patients prescribed texture-modified, half, or liquid diets were excluded. The measurements included percentages of oral food intake at various points during hospitalization (e.g., at admission, before and after surgery, and at discharge), sex, and age. Differences in patients' oral consumption rate between therapeutic and regular diets were analyzed with a linear mixed-effects model.
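For illustration, a hedged Python sketch of a linear mixed-effects model of this form using statsmodels; the study's own software and exact model specification are not stated beyond the description above, and the DataFrame column names are assumptions.

```python
# Illustrative sketch only -- not the study's analysis code.
# Assumes a long-format DataFrame `df` with one row per patient per time point and
# columns: patient_id, intake_pct, regular_diet (0/1), female (0/1), age.
import statsmodels.formula.api as smf

def fit_intake_model(df):
    # Linear mixed-effects model: fixed effects for diet type, sex, and age;
    # random intercept per patient to account for repeated measurements.
    model = smf.mixedlm("intake_pct ~ regular_diet + female + age",
                        data=df, groups=df["patient_id"])
    result = model.fit()
    return result.summary()
```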

Results: A total of 290 patients were analyzed, with 50 patients receiving a therapeutic diet and 240 patients receiving a regular diet at admission. The mean percentage of oral intake was 83.1% with a therapeutic diet and 87.2% with a regular diet, and was consistently 4-6% higher for regular diets than for therapeutic diets at each time point during hospitalization (Figure). In a linear mixed-effects model adjusted for sex and age, the mean percentage of oral intake with a regular diet was 4.0% higher (95% confidence interval [CI], -0.8% to 8.9%, p = 0.100) than with a therapeutic diet, although the difference did not reach statistical significance. The mean percentage of oral intake in women was 15.6% lower than in men (95% CI, -19.5% to -11.8%). Likewise, older patients' intake rate was reduced compared with that of younger patients (difference, -0.2% per year of age, 95% CI -0.3% to -0.1%).

Conclusion: This exploratory study failed to show that therapeutic diets reduce food intake in orthopedic and spine surgery patients compared with regular diets. However, sex and age were important factors affecting food intake. We need to pay special attention to female and/or aged patients to increase oral food intake. Future research will increase the number of patients examined, expand the cohort to other departments, and proceed to a prospective study to determine which factors truly affect patients' oral intake during hospitalization.

Figure 1. The Percentage of Oral Intake During Hospitalization in Each Diet.

Lorena Muhaj, MS1; Michael Owen-Michaane, MD, MA, CNSC2

1Institute of Human Nutrition, Vagelos College of Physicians and Surgeons, Columbia University Irving Medical Center, New York, NY; 2Columbia University Irving Medical Center, New York, NY

Financial Support: None Reported.

Background: Muscle mass is crucial for overall health and well-being. Consequently, accurate estimation of muscle mass is essential for diagnosing malnutrition and conditions such as sarcopenia and cachexia. The Kim equation uses biomarker data to estimate muscle mass, but whether this tool provides an accurate estimate in populations with high BMI and kidney disease remains uncertain. Therefore, the aim of this study is to assess whether the Kim equation is suitable and reliable for estimating muscle mass and predicting malnutrition, sarcopenia, and related outcomes in a cohort with diverse BMI and kidney disease.

Methods: This is a cross-sectional study using data from the All of Us Research Program. Data on demographics, weight, height, creatinine, cystatin C, and diagnoses of malnutrition, hip fractures, and cachexia were obtained from electronic health records (EHR). The Kim equation was derived from creatinine, cystatin C and weight (Table 1) and compared with established sarcopenia cutoffs for appendicular lean mass (ALM), including ALM/BMI and ALM/height2. Malnutrition was identified through specific ICD-10-CM codes recorded in the EHR and participants were categorized based on malnutrition status (with or without malnutrition). Muscle mass and biomarker levels were compared between groups with or without severe/moderate malnutrition, and the relationships of BMI with creatinine and cystatin C levels were analyzed using linear regression. Wilcoxon rank-sum tests were used to assess associations between estimated muscle mass and malnutrition diagnosis.
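For illustration, a minimal Python sketch of the Kim equation as quoted in the Table 1 footnote, together with a Wilcoxon rank-sum comparison like the one described above. The constant K comes from the original Kim publication and is not restated in this abstract, so it is left as a required parameter; all names are illustrative.

```python
# Illustrative sketch only -- not the study's analysis code.
from scipy.stats import ranksums

def kim_muscle_mass(weight_kg: float, creatinine: float, cystatin_c: float, k: float) -> float:
    # Total body muscle mass = weight * creatinine / (K * weight * cystatin C + creatinine),
    # as quoted in the Table 1 footnote; K must be supplied from the original publication.
    return weight_kg * creatinine / (k * weight_kg * cystatin_c + creatinine)

def compare_muscle_mass(mm_severe_malnutrition, mm_no_malnutrition):
    # Wilcoxon rank-sum test of estimated muscle mass between groups with and
    # without severe malnutrition.
    return ranksums(mm_severe_malnutrition, mm_no_malnutrition)
```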

Results: Baseline characteristics were stratified by gender for comparison. The mean age of participants was 58.2 years (SD = 14.7). The mean BMI was 30.4 kg/m2 (SD = 7.5). Mean serum creatinine and cystatin C levels were 2.01 mg/dL (SD = 1.82) and 0.18 mg/dL (SD = 0.11), respectively. The mean estimated muscle mass was 80.1 kg (SD = 21), with estimated muscle mass as a percentage of body weight being 92.8%. Mean ALM/BMI was 2.38 (SD = 0.32), while ALM/height2 value was 25.41 kg/m2 (SD = 5.6). No participant met the cutoffs for sarcopenia. All calculated variables are summarized in Table 1. In this cohort, < 2% were diagnosed with severe malnutrition and < 2% with moderate malnutrition (Table 2). Muscle mass was lower in participants with severe malnutrition compared to those without (W = 2035, p < 0.05) (Figure 1).

Conclusion: This study suggests the Kim equation overestimates muscle mass in populations with high BMI or kidney disease, as no participants met sarcopenia cutoffs despite the expected prevalence in CKD. This overestimation is concerning given the known risk of low muscle mass in CKD. While lower muscle mass was significantly associated with severe malnutrition (p < 0.05), the Kim equation identified fewer malnutrition cases than expected based on clinical data. Though biomarkers like creatinine and cystatin C may help diagnose malnutrition, the Kim equation may not accurately estimate muscle mass or predict malnutrition and sarcopenia in diverse populations. Further research is needed to improve these estimates.

Table 1. Muscle Mass Metrics Calculations and Diagnosis of Sarcopenia Based on FNIH (ALM/BMI) and EWGSOP (ALM/Height2) Cut-off Values.

Abbreviations: BMI-Body Mass Index; TBMM-Total Body Muscle Mass (referred as Muscle Mass as well) (calculated using the Kim Equation); ALM-Appendicular Lean Muscle Mass (using the McCarthy equation); ALM/Height2-Appendicular Lean Muscle Mass adjusted for height square (using EWGSOP cutoffs for diagnosing sarcopenia); ALM/BMI-Appendicular Lean Muscle Mass adjusted for BMI (using FNIH cutoffs for diagnosing sarcopenia). Equation 1: Kim equation - Calculated body muscle mass = body weight * serum creatinine/((K * body weight * serum cystatin C) + serum creatinine)

Table 2. Prevalence of Severe and Moderate Malnutrition.

(Counts less than 20 suppressed to prevent reidentification of participants).

Figure 1. Muscle Mass in Groups With and Without Severe Malnutrition.

Poster of Distinction

Robert Weimer, BS1; Lindsay Plank, PhD2; Alisha Rovner, PhD1; Carrie Earthman, PhD, RD1

1University of Delaware, Newark, DE; 2University of Auckland, Auckland

Financial Support: None Reported.

Background: Loss of skeletal muscle is common in patients with liver cirrhosis and low muscle mass is internationally recognized as a key phenotypic criterion to diagnose malnutrition.1,2 Muscle can be clinically assessed through various modalities, although reference data and consensus guidance are limited for the interpretation of muscle measures to define malnutrition in this patient population. The aim of this study was to evaluate the sensitivity and specificity of published sarcopenia cutpoints applied to dual-energy X-ray absorptiometry (DXA) muscle measures to diagnose malnutrition by the GLIM criteria, using in vivo neutron activation analysis (IVNAA) measures of total body protein (TBP) as the reference.

Methods: Adults with liver cirrhosis underwent IVNAA and whole body DXA at the Body Composition Laboratory of the University of Auckland. DXA-fat-free mass (FFM) and appendicular skeletal muscle mass (ASMM, with and without correction for wet bone mass3) were measured and indexed to height squared (FFMI, ASMI). The ratio of measured to predicted TBP based on healthy reference data matched for age, sex and height was calculated as a protein index; values less than 2 standard deviations below the mean (< 0.77) were defined as protein depletion (malnutrition). Published cut points from recommended guidelines were evaluated (Table 1).4-9 DXA values below the cut point were interpreted as ‘sarcopenic’. Sensitivity and specificity for each cut point were determined.
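For illustration, a short Python sketch of the validity calculation described above: each patient is classified as 'sarcopenic' when the DXA index falls below a published cut-point, and sensitivity and specificity are computed against protein depletion (protein index < 0.77) as the reference. The single-cut-point interface and array names are assumptions; in practice the published cut-points are sex-specific.

```python
# Illustrative sketch only -- not the study's analysis code.
import numpy as np

def sensitivity_specificity(dxa_index: np.ndarray, cutpoint: float,
                            protein_index: np.ndarray, depletion_cut: float = 0.77):
    test_positive = dxa_index < cutpoint                  # 'sarcopenic' by the DXA cut-point
    reference_positive = protein_index < depletion_cut    # protein depleted (malnourished)
    tp = np.sum(test_positive & reference_positive)
    fn = np.sum(~test_positive & reference_positive)
    tn = np.sum(~test_positive & ~reference_positive)
    fp = np.sum(test_positive & ~reference_positive)
    return tp / (tp + fn), tn / (tn + fp)                 # (sensitivity, specificity)
```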

Results: The study sample included 350 adults (238 males/112 females, median age 52 years) with liver cirrhosis and a median model for end-stage liver disease (MELD) score of 12 (range 5-36). Application of published sarcopenia cutpoints to diagnose malnutrition by DXA in patients with liver cirrhosis had sensitivity ranging from 40.8% to 79.0% and specificity ranging from 79.6% to 94.2% (Table 1). Although all of the selected published cutpoints for DXA-measured ASMI were similar, the Baumgartner4 and Newman5 ASMI cutpoints, when applied to our DXA-measured ASMI, particularly after correction for wet bone mass, yielded the best combination of sensitivity and specificity for diagnosing malnutrition as identified by protein depletion in patients with liver cirrhosis. The Studentski ASMM cutpoints in Table 1 yielded unacceptably low sensitivity (41-55%).

Conclusion: These findings suggest that the use of the DXA-derived Baumgartner/Newman bone-corrected ASMI cutpoints offer acceptable validity in the diagnosis of malnutrition by GLIM in patients with liver cirrhosis. However, given that it is not common practice to make this correction for wet bone mass in DXA measures of ASMI, the application of these cutpoints to standard uncorrected measures of ASMI by DXA would likely yield much lower sensitivity, suggesting that many individuals with low muscularity and malnutrition would be misdiagnosed as non-malnourished when applying these cutpoints.

Table 1. Evaluation of Selected Published Cut-Points for Dual-Energy X-Ray Absorptiometry Appendicular Skeletal Muscle Index to Identify Protein Depletion in Patients with Liver Cirrhosis.

Abbreviations: DXA, dual-energy X-ray absorptiometry; M, male; F, female; GLIM, Global Leadership Initiative on Malnutrition; EWGSOP, European Working Group on Sarcopenia in Older People; AWGS, Asia Working Group for Sarcopenia; FNIH, Foundation for the National Institutes of Health; ASMM, appendicular skeletal muscle mass in kg determined from DXA-measured lean soft tissue of the arms and legs; ASMI, ASMM indexed to height in meters-squared; ASMM-BC, ASMM corrected for wet bone mass according to Heymsfield et al 1990; ASMI-BC, ASMI corrected for wet bone mass.

Critical Care and Critical Health Issues

Amir Kamel, PharmD, FASPEN1; Tori Gray, PharmD2; Cara Nys, PharmD, BCIDP3; Erin Vanzant, MD, FACS4; Martin Rosenthal, MD, FACS, FASPEN1

1University of Florida, Gainesville, FL; 2Cincinnati Children, Gainesville, FL; 3Orlando Health, Orlando, FL; 4Department of Surgery, Division of Trauma and Acute Care Surgery, College of Medicine, University of Florida, Gainesville, FL

Financial Support: None Reported.

Background: Amino acids (AAs) serve different purposes in the body, including structural, enzymatic, and integral cellular functions. Amino acid utilization and demand may vary between healthy and disease states. Certain conditions such as chronic kidney disease or short bowel syndrome can affect plasma AA levels. Previous research has identified citrulline as a marker of intestinal function and absorptive capacity. Stressors such as surgery or trauma can alter AA metabolism, potentially leading to a hypercatabolic state and changes in the available AA pool. The primary objective of this study was to compare AA levels in patients who have undergone abdominal surgery with those who have not. The secondary endpoint was to describe post-surgical complications and correlate plasma AA levels with such complications.

Methods: This study was a single-center retrospective analysis, conducted between January 1, 2007 and March 15, 2019, of patients who were referred to the University of Florida Health Nutrition Support Team (NST) and had a routine metabolic evaluation with amino acid levels as part of the nutrition support consult. Amino acid data were excluded if specimens were deemed contaminated. Patients with genetic disorders were also excluded from the study. During the study period, the amino acid bioassay was performed using Bio-chrome ion exchange chromatography (ARUP Laboratories, Salt Lake City, UT).

Results: Of the 227 patients screened, 181 patients were included in the study (58 who underwent abdominal surgery and 123 who did not). The mean age, BMI, and height of participants were 52.2 years, 25.1 kg/m2, and 169 cm, respectively. Baseline characteristics were similar between the two groups: 31% of the surgery arm had undergone a surgical procedure within a year of the index encounter, 86.5% retained their colon, and 69.2% had a bowel resection, with a mean of 147.6 cm of bowel left for those with a documented length of remaining bowel (36 out of 58). Rates of the postoperative complications small bowel obstruction, ileus, leak, abscess, bleeding, and surgical site infection (SSI) were 12.1%, 24%, 17.2%, 20.7%, 3.4%, and 17.2%, respectively. Among the 19 AAs evaluated, median citrulline and methionine levels were significantly different between the 2 groups (23 [14-35] vs 17 [11-23], p = 0.0031, and 27 [20-39] vs 33 [24-51], p = 0.0383, respectively). Alanine and arginine levels were associated with postoperative ileus, leucine levels correlated with SSI, and glutamic acid and glycine levels were linked to postoperative fistula formation.

Conclusion: Most amino acid levels showed no significant differences between patients who underwent abdominal surgery and those who did not, except for citrulline and methionine. Specific amino acids, such as alanine, arginine, leucine, glutamic acid, and glycine, may serve as early indicators of post-surgical complications; however, a larger prospective trial is warranted to validate our findings.

Kento Kurashima, MD, PhD1; Grace Trello1; James Fox1; Edward Portz1; Shaurya Mehta, BS1; Austin Sims1; Arun Verma, MD1; Chandrashekhara Manithody, PhD1; Yasar Caliskan, MD1; Mustafa Nazzal, MD1; Ajay Jain, MD, DNB, MHA1

1Saint Louis University, St. Louis, MO

Financial Support: None Reported.

Background: Marginal donor livers (MDLs) have been used for liver transplantation to address major organ shortages. However, MDLs are notably susceptible to Ischemia/Reperfusion injury (IRI). Recent investigations have highlighted Ferroptosis, a new type of programmed cell death, as a potential contributor to IRI. We hypothesized that modulating ferroptosis by the iron chelator deferoxamine (DFO) could alter the course of IRI.

Methods: Using our novel Perfusion Regulated Organ Therapeutics with Enhanced Controlled Testing (PROTECT) model (US provisional patent US63/136,165), six human MDLs (livers A to F) were procured and split into paired lobes. Simultaneous perfusion was performed on both lobes, with one lobe subjected to DFO while the other served as an internal control. Histology, serum chemistry, expression of ferroptosis-associated genes, determination of iron accumulation, and measurement of lipid peroxidation were performed.

Results: Histological analysis revealed severe macrovesicular steatosis (>30%) in livers A and D, while livers B and E exhibited mild to moderate macrovesicular steatosis. The majority of samples showed mild inflammation, predominantly in zone 3. No significant necrosis was noted during perfusion. Perl's Prussian blue staining and non-heme iron quantification demonstrated suppression of iron accumulation in livers A to D with DFO treatment (p < 0.05). Based on the degree of iron chelation, the 12 lobes were categorized into two groups: lobes with decreased iron (n = 4) and those with increased iron (n = 8). Comparative analysis demonstrated that ferroptosis-associated genes (HIF1-alpha, RPL8, IREB2, ACSF2, NQO1) were significantly downregulated in the former (p = 0.0338, p = 0.0085, p = 0.0138, p = 0.0138, p = 0.0209, respectively). Lipid peroxidation was significantly suppressed in lobes with decreased iron (p = 0.02). While serum AST was lower in iron-chelated lobes, this did not reach statistical significance.

Conclusion: This study affirmed that iron accumulation was driven by normothermic perfusion. Reduction of iron content suppressed ferroptosis-associated genes and lipid peroxidation to mitigate IRI. Our results using human MDLs revealed a novel relationship between iron content and ferroptosis, providing a solid foundation for future development of IRI therapeutics.

Gabriella ten Have, PhD1; Macie Mackey, BSc1; Carolina Perez, MSc1; John Thaden, PhD1; Sarah Rice, PhD1; Marielle Engelen, PhD1; Nicolaas Deutz, PhD, MD1

1Texas A&M University, College Station, TX

Financial Support: Department of Defense: CDMRP PR190829 - ASPEN Rhoads Research Foundation - C. Richard Fleming Grant & Daniel H. Teitelbaum Grant.

Background: Sepsis is a potentially life-threatening complication of infection in critically ill patients and is characterized by severe tissue breakdown in several organs, leading to long-term muscle weakness, fatigue, and reduced physical activity. Recent guidelines suggest increasing protein intake gradually in the first days of recovery from a septic event. It is, however, unclear how this affects whole-body protein turnover. Therefore in an acute sepsis-recovery pig model, we studied whole-body protein metabolism in the early sepsis recovery phase after restricted feeding with a balanced meal of amino acids (AA).

Methods: In 25 catheterized pigs (± 25 kg), sepsis was induced by IV infusion of live Pseudomonas aeruginosa bacteria (5 × 10^8 CFU/hour). At t = 9 h, recovery was initiated with IV administration of gentamicin. Post sepsis, food was given twice daily and incrementally (Day 1: 25%, 2: 50%, 3: 75%, >4: 100%). The 100% meals contained, per kg BW, 15.4 gr CHO, 3.47 gr fat, and a balanced free AA mixture reflecting the muscle AA profile (0.56 gr N = 3.9 gr AA). Before sepsis (baseline) and on recovery day 3, an amino acid stable isotope mixture was administered IV as a pulse (post-absorptive). Subsequently, arterial blood samples were collected post-absorptively for 2 hours. Amino acid concentrations and enrichments were determined with LC-MS. Statistics: RM-ANOVA, α = 0.05.

Results: At day 3, animal body weight was decreased (2.4 [0.9, 3.9]%, p = 0.0025). Compared to baseline values, plasma AA concentration profiles were changed. Overall, the total non-essential AA plasma concentration did not change. Essential AA plasma concentrations of histidine, leucine, methionine, phenylalanine, tryptophan, and valine were lower (p < 0.05) and lysine was higher (p = 0.0027); there was no change in isoleucine. We observed lower whole-body production (WBP) of the non-essential amino acids arginine (p < 0.0001), glutamine (p < 0.0001), glutamate (p < 0.0001), glycine (p < 0.0001), hydroxyproline (p = 0.0041), ornithine (p = 0.0003), taurine (p < 0.0001), and tyrosine (p < 0.0001). Citrulline production did not change. In addition, lower WBP was observed for the essential amino acids isoleucine (p = 0.0002), leucine (p < 0.0001), valine (p < 0.0001), methionine (p < 0.0001), tryptophan (p < 0.0001), and lysine (p < 0.0001). Whole-body protein breakdown and protein synthesis were also lower (p < 0.0001), while net protein breakdown did not change.

Conclusion: Our sepsis-recovery pig model suggests that food restriction in the early phase of sepsis recovery leads to diminished protein turnover.

Gabriella ten Have, PhD1; Macie Mackey, BSc1; Carolina Perez, MSc1; John Thaden, PhD1; Sarah Rice, PhD1; Marielle Engelen, PhD1; Nicolaas Deutz, PhD, MD1

1Texas A&M University, College Station, TX

Financial Support: Department of Defense: CDMRP PR190829 - ASPEN Rhoads Research Foundation - C. Richard Fleming Grant & Daniel H. Teitelbaum Grant.

Background: Sepsis is a potentially life-threatening complication of infection in critically ill patients. It is characterized by severe tissue breakdown in several organs, leading to long-term muscle weakness, fatigue, and reduced physical activity (ICU-AW). In an acute sepsis-recovery ICU-AW pig model, we studied whether meals that only contain essential amino acids (EAA) can restore the metabolic deregulations during sepsis recovery, as assessed by comprehensive metabolic phenotyping1.

Methods: In 49 catheterized pigs (± 25 kg), sepsis was induced by IV infusion of live Pseudomonas aeruginosa bacteria (5 × 10^8 CFU/hour). At t = 9 h, recovery was initiated with IV administration of gentamicin. Post sepsis, food was given twice daily, blindly and incrementally (Day 1: 25%, 2: 50%, 3: 75%, >4: 100%). The 100% meals contained, per kg BW, 15.4 gr CHO, 3.47 gr fat, and 0.56 gr N of an EAA mixture (reflecting muscle protein EAA, 4.3 gr AA) or control (TAA, 3.9 gr AA). Before sepsis (baseline) and on recovery day 7, an amino acid stable isotope mixture was administered IV as a pulse (post-absorptive). Subsequently, arterial blood samples were collected for 2 hours. AA concentrations and enrichments were determined with LC-MS. Statistics: RM-ANOVA, α = 0.05.

Results: A body weight reduction was found after sepsis, which was restored by day 7 post sepsis. Compared to baseline, in the EAA group, increases in muscle fatigue (p < 0.0001), tau-methylhistidine whole-body production (WBP; reflecting myofibrillar muscle breakdown, p < 0.0001), and whole-body net protein breakdown (p < 0.0001) were observed; these changes were smaller in the control group (muscle fatigue: p < 0.0001; tau-methylhistidine: p = 0.0531; net protein breakdown: p < 0.0001). In addition, on day 7 lower WBP was observed for glycine (p < 0.0001), hydroxyproline (p < 0.0001), glutamate (p < 0.0001), glutamine (p < 0.0001), and taurine (p < 0.0001); these decreases were smaller (glycine: p = 0.0014; hydroxyproline: p = 0.0007; glutamate: p = 0.0554) or larger (glutamine: p = 0.0497; taurine: p < 0.0001) in the control group. The WBP of citrulline was increased on day 7 (p = 0.0011), but less so in the control group (p = 0.0078). Higher plasma concentrations of asparagine (p < 0.0001), citrulline (p < 0.0001), glutamine (p = 0.0001), tau-methylhistidine (p = 0.0319), serine (p < 0.0001), taurine (p < 0.0001), and tyrosine (p < 0.0001) were observed in the EAA group. In the EAA group, clearance was lower (p < 0.05), except for glycine, tau-methylhistidine, and ornithine.

Conclusion: Our sepsis-recovery pig ICU-AW model shows that feeding EAA-only meals after sepsis is associated with increased muscle and whole-body net protein breakdown and affects non-EAA metabolism. We hypothesize that non-essential amino acids are needed in post-sepsis nutrition to improve protein anabolism.

Rebecca Wehner, RD, LD, CNSC1; Angela Parillo, MS, RD, LD, CNSC1; Lauren McGlade, RD, LD, CNSC1; Nan Yang, RD, LD, CNSC1; Allyson Vasu-Sarver, MSN, APRN-CNP1; Michele Weber, DNP, RN, APRN-CNS, APRN-NP, CCRN, CCNS, OCN, AOCNS1; Stella Ogake, MD, FCCP1

1The Ohio State University Wexner Medical Center, Columbus, OH

Financial Support: None Reported.

Background: Patients in the intensive care unit, especially those on mechanical ventilation, frequently receive inadequate enteral nutrition (EN) therapy. Critically ill patients receive on average 40-50% of prescribed nutritional requirements, while ASPEN/SCCM guidelines encourage efforts be made to provide > 80% of goal energy and protein needs. One method to help achieve these efforts is the use of volume-based feedings (VBF). At our institution, an hourly rate-based feeding (RBF) approach is standard. In 2022, our Medical Intensive Care Unit (MICU) operations committee inquired about the potential benefits of implementing VBF. Before changing our practice, we collected data to assess our current performance in meeting EN goals and to identify the reasons for interruptions. While literature suggests that VBF is considered relatively safe in terms of EN complications compared to RBF, to our knowledge, there is currently no information on the safety of starting VBF in patients at risk of gastrointestinal (GI) intolerance. Therefore, we also sought to determine whether EN is more frequently held due to GI intolerance versus operative procedures.

Methods: We conducted a retrospective evaluation of EN delivery compared to EN goal, and the reason for interruption if EN delivery was below goal, in the MICU of a large tertiary academic medical center. We reviewed ten days of information on any MICU patient on EN. One day was defined as the total EN volume received, in milliliters, from 0700 to 0659 hours. Using the QI Assessment Form, we collected the following data: goal EN volume in 24 hours, volume received in 24 hours, percent volume received versus prescribed, and hours feeds were held (or below goal rate) each day. The reasons for holding tube feeds were divided into six categories based on underlying causes: feeding initiation/titration, GI issues (constipation, diarrhea, emesis, nausea, distention, high gastric residual volume), operative procedure, non-operative procedure, mechanical issue, and practice issues. Data were entered on a spreadsheet, and descriptive statistics were used to evaluate results.
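
For illustration of the daily QI calculation described above, here is a minimal sketch in Python; the function name and example volumes are hypothetical and not part of the QI Assessment Form.

    # Hypothetical sketch of the daily EN-delivery calculation recorded on the QI form.
    def percent_delivered(volume_received_ml: float, goal_volume_ml: float) -> float:
        """Percent of the prescribed 24-hour EN volume (0700-0659) actually received."""
        return 100.0 * volume_received_ml / goal_volume_ml

    # Example: a patient prescribed 1800 mL/day who received 1260 mL.
    print(round(percent_delivered(1260, 1800)))  # -> 70, i.e., 70% of goal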

Results: MICU patients receiving EN were observed over ten random days in a two-week period in August 2022. Eighty-two patients were receiving EN. Three hundred and four EN days were observed. Average percent EN delivered was 70% among all patients. EN was withheld for the following reasons: 34 cases (23%) were related to feeding initiation, 55 (37%) GI issues, 19 (13%) operative procedures, 32 (22%) non-operative procedures, 2 (1%) mechanical issues, and 5 (3%) cases were related to practice issues. VBF could have been considered in 51 cases (35%).

Conclusion: These results suggest that EN delivery in our MICU most often falls below the prescribed amount because of GI issues and feeding initiation; together, these comprised 89 cases (60%). VBF protocols would not improve delivery in either case: VBF would likely increase discomfort in patients experiencing GI issues, and feeding initiation can be improved with changes to advancement protocols. Because VBF had potential benefit in only 35% of cases, and because our observed EN delivery was above the averages reported in the literature, this protocol was not implemented in the observed MICU.

Delaney Adams, PharmD1; Brandon Conaway, PharmD2; Julie Farrar, PharmD3; Saskya Byerly, MD4; Dina Filiberto, MD4; Peter Fischer, MD4; Roland Dickerson, PharmD3

1Regional One Health, Memphis, TN; 2Veterans Affairs Medical Center, Memphis, TN; 3University of Tennessee College of Pharmacy, Memphis, TN; 4University of Tennessee College of Medicine, Memphis, TN

Encore Poster

Presentation: Society of Critical Care Medicine 54th Annual Critical Care Congress. February 23 to 25, 2025, Orlando, FL.

Publication: Critical Care Medicine. 2025;53(1): In press.

Financial Support: None Reported.

Best of ASPEN-Critical Care and Critical Health Issues

Megan Beyer, MS, RD, LDN1; Krista Haines, DO, MA2; Suresh Agarwal, MD2; Hilary Winthrop, MS, RD, LDN, CNSC2; Jeroen Molinger, PhDc3; Paul Wischmeyer, MD, EDIC, FCCM, FASPEN4

1Duke University School of Medicine- Department of Anesthesiology, Durham, NC; 2Duke University School of Medicine, Durham, NC; 3Duke University Medical Center - Department of Anesthesiology - Duke Heart Center, Raleigh, NC; 4Duke University Medical School, Durham, NC

Financial Support: Baxter, Abbott.

Background: Resting energy expenditure (REE) is critical in managing nutrition in intensive care unit (ICU) patients. Accurate energy needs assessments are vital for optimizing nutritional interventions. However, little is known about how different disease states specifically influence REE in ICU patients. Existing energy recommendations are generalized and do not account for the metabolic variability across disease types. Indirect calorimetry (IC) is considered the gold standard for measuring REE but is underutilized. This study addresses this gap by analyzing REE across disease states using metabolic cart assessments in a large academic medical center. The findings are expected to inform more precise, disease-specific nutritional recommendations in critical care.

Methods: This is a pooled analysis of patients enrolled in four prospective clinical trials evaluating ICU patients across a range of disease states. Patients included in this analysis were admitted to the ICU with diagnoses of COVID-19, respiratory failure, cardiothoracic (CT) surgery, trauma, or surgical intensive care conditions. All patients underwent IC within 72 hours of ICU admission to assess REE, with follow-up measurements conducted when patients were medically stable. Patients were managed under standard ICU care protocols in each study, and nutritional interventions were individualized or standardized based on clinical trial protocols. The primary outcome was measured REE, expressed in kcal/day and normalized to body weight (kcal/kg/day). Summary statistics for demographic and clinical characteristics, such as age, gender, height, weight, BMI, and comorbidities, were reported. Comparative analyses across the five disease states were performed using ANOVA tests to determine the significance of differences in REE.
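
A minimal sketch of the between-group comparison described above, using illustrative per-patient REE values (kcal/day); scipy is assumed to be available and the numbers are hypothetical.

    # One-way ANOVA of measured REE across the five disease groups.
    from scipy import stats

    ree_by_group = {
        "COVID-19":            [1950, 2010, 1985],  # hypothetical kcal/day values
        "Respiratory failure": [1720, 1790, 1760],
        "CT surgery":          [1600, 1655, 1680],
        "Trauma":              [1850, 1905, 1890],
        "Surgical ICU":        [1480, 1520, 1510],
    }
    f_stat, p_value = stats.f_oneway(*ree_by_group.values())
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")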

Results: The analysis included 165 ICU patients. The cohort had a mean age of 58 years, with 58% male and 42% female. Racial demographics included 36% Black, 52% White, and 10% from other backgrounds. Patients in the surgical ICU group had the lowest caloric requirements, averaging 1503 kcal/day, while COVID-19 patients had the highest, at 1982 kcal/day. CT surgery patients measured 1644 kcal/day, respiratory failure 1763 kcal/day, and trauma patients 1883 kcal/day. ANOVA analysis demonstrated statistically significant differences in REE between these groups (p < 0.001). When normalized to body weight (kcal/kg/day), REE ranged from 20.3 to 23.5 kcal/kg/day, with statistically significant differences between disease states (p < 0.001).

Conclusion: This study reveals significant variability in REE across different disease states in ICU patients, highlighting the need for disease-specific energy recommendations. These findings indicate that specific disease processes, such as COVID-19 and trauma, may increase metabolic demands, while patients recovering from surgical procedures may have comparatively lower energy needs. They also emphasize the importance of individualizing nutritional interventions based on a patient's disease state to optimize recovery and clinical outcomes and to prevent underfeeding or overfeeding, which can adversely affect patient outcomes. The results suggest IC should be more widely implemented in ICU settings to guide precise and effective nutrition delivery based on real-time metabolic data rather than relying on standard predictive equations. Further research is needed to refine these recommendations and explore continuous monitoring of REE and tailored nutrition needs in the ICU.

Table 1. Demographic and Clinical Characteristics.

Table 2. Disease Group Diagnoses.

Figure 1. Average Measured Resting Energy Expenditure by Disease Group.

Hailee Prieto, MA, RD, LDN, CNSC1; Emily McDermott, MS, RD, LDN, CNSC2

1Northwestern Memorial Hospital, Shorewood, IL; 2Northwestern Memorial Hospital, Chicago, IL

Financial Support: None Reported.

Background: Communication between Registered Dietitians (RDs) and CTICU teams is non-standardized and often ineffective in supporting proper implementation of RD recommendations. Late or absent RD support can affect the quality of nutrition care provided to patients. In FY23, the CTICU nutrition consult/risk turnaround time within 24 hrs was 58%, and missed nutrition consults/risks were 9%. Our goal was to improve RD consult/risk turnaround time within 24 hrs from 58% to 75%, based on our department goal, and missed RD consults/risks from 9% to 6%, by standardizing communication between RDs and CTICU APRNs. The outcome metrics were nutrition risk turnaround time and nutrition consult turnaround time. The process metric was the percentage of rounds with RD presence.

Methods: We used the DMAIC (Define, Measure, Analyze, Improve, Control) framework to address our communication issue in the CTICU. We gathered the voice of the customer by surveying the CTICU APRNs and found that a barrier was the RDs' limited presence in the CTICU; the APRNs reported that having an RD round daily with their team would be valuable. We then performed a literature search on RDs rounding in the ICU, specifically cardiac/thoracic ICUs, and found that critically ill cardiac surgery patients are at high risk of developing malnutrition, yet initiation of medical nutrition therapy and overall adequacy of nutrition provision are lower compared with noncardiac surgical or MICU patients. RD-written orders directly improve outcomes in patients receiving nutrition support. To have the most influence, RDs need to be present in the ICU and involved when important decisions are being made; dietitian involvement in the ICU team significantly improves the team's ability to implement prompt, relevant nutrition support interventions. We used process mapping to address the overlap between rounding times on the step-down cardiac floors and in the ICU, and we optimized the RDs' daily schedules so they could attend as many rounds as possible, including the CTICU rounds. We then implemented a new rounding structure within the Cardiac Service Line based on the literature regarding the standard of care and the RD role in ICU rounding.

Results: Our turnaround time for nutrition consults/risks within 24 hrs increased by 26 percentage points (58% to 84%), and missed consults/risks decreased to 1%, exceeding both goals. The number of nutrition interventions we were able to implement increased as more RDs attended rounds, which was tracked after implementation of the RD rounding structure within the CTICU. The comparison of implemented interventions between 1 and 2 RDs was skewed because, on days when only 1 RD was available, that RD attempted to round with both teams.

Conclusion: Communication between the CTICU team and Clinical Nutrition continues to improve, with consistently positive feedback from the ICU providers regarding the new rounding structure. The new workflow was implemented in the Clinical Nutrition Cardiac Service Line. As a future opportunity, other ICU teams at NMH that, because of RD staffing, do not have a dedicated RD rounding with them could also benefit from a dedicated RD in daily rounds.

Table 1. New Rounding Structure.

*Critical Care Rounds; Green: Attend; Gold: Unable to attend.

Table 2. Control Plan.

Figure 1. Results Consult Risk Turn Around Time Pre & Post Rounding.

Figure 2. Number of Implemented RD Nutrition Interventions by Number of RDs Rounding.

Kenny Ngo, PharmD1; Rachel Leong, PharmD2; Vivian Zhao, PharmD2; Nisha Dave, PharmD2; Thomas Ziegler, MD2

1Emory Healthcare, Macon, GA; 2Emory Healthcare, Atlanta, GA

Financial Support: None Reported.

Background: Micronutrients play a crucial role in biochemical processes in the body. During critical illness, the status of micronutrients can be affected by factors such as disease severity and medical interventions. Extracorporeal membrane oxygenation (ECMO) is a vital supportive therapy that has seen increased utilization for critically ill patients with acute severe refractory cardiorespiratory failure. The potential alterations in micronutrient status and requirements during ECMO are an area of significant interest, but data are limited. This study aimed to determine the incidence of micronutrient depletion in critically ill patients requiring ECMO.

Methods: A retrospective chart review was conducted for patients with at least one micronutrient level measured in blood while receiving ECMO between January 1, 2015, and September 30, 2023. Chart reviews were completed using the Emory Healthcare electronic medical record system after the Emory University Institutional Review Board approved the study. A full waiver for informed consent and authorization was approved for this study. Data on demographic characteristics, ECMO therapy-related information, and reported micronutrient levels were collected. Descriptive statistics were used to evaluate the data.

Results: A total of 77 of the 128 reviewed patients met inclusion criteria and were included in the data analysis (Table 1). The average age of patients was 49 years, and 55.8% were female. The average duration of ECMO was 14 days, and the average length of stay in the intensive care unit was 46.7 days. Among the included patients, 56 required continuous renal replacement therapy (CRRT) along with ECMO. Patients who required CRRT received empiric standard supplementation of folic acid (1 mg), pyridoxine (200 mg), and thiamine (100 mg) every 48 hours. Of the 77 patients, 44% had below-normal blood levels of at least one of the measured micronutrients. The depletion percentages of the various nutrients were as follows: vitamin C (80.6%), vitamin D (75.0%), iron (72.7%), copper (53.3%), carnitine (31.3%), selenium (30.0%), pyridoxine (25.0%), folic acid (18.8%), vitamin A (10.0%), zinc (7.1%), and thiamine (3.7%) (Table 2). Measured vitamin B12, manganese, and vitamin E levels were within normal limits.

Conclusion: This study demonstrated that 60% of patients on ECMO had orders to evaluate at least one micronutrient in their blood. Of these, almost half had at least one micronutrient level below normal limits. These findings underscore the need for regular nutrient monitoring for critically ill patients. Prospective studies are needed to understand the impact of ECMO on micronutrient status, determine the optimal time for evaluation, and assess the need for and efficacy of supplementation in these patients.

Table 1. General Demographic and ECMO Characteristics (N = 77).

Table 2. Observed Micronutrient Status during ECMO for Critically Ill Patients.

Diane Nowak, RD, LD, CNSC1; Mary Kronik, RD, LD, CNSC2; Caroline Couper, RD, LD, CNSC3; Mary Rath, MEd, RD, LD, CNSC4; Ashley Ratliff, MS, RD, LD, CNSC4; Eva Leszczak-Lesko, BS Health Sciences, RRT4

1Cleveland Clinic, Elyria, OH; 2Cleveland Clinic, Olmsted Twp, OH; 3Cleveland Clinic, Rocky River, OH; 4Cleveland Clinic, Cleveland, OH

Financial Support: None Reported.

Background: Indirect calorimetry (IC) is the gold standard for accurate determination of energy expenditure. The team performed a comprehensive literature review of current IC practices across the nation, which showed that facilities employing IC typically follow a standard protocol dictated by length of stay (LOS). Intensive Care Unit (ICU) Registered Dietitians (RDs) have directed IC intervention to reduce reliance on inaccurate predictive equations and to judiciously identify patients (1, 2) with the assistance of an IC order-based practice. While IC candidacy is determined by clinical criteria, implementation has been dictated primarily by RD time constraints. Our project aims to include IC in our standard of care by using a standardized process for implementation.

Methods: To implement IC at our 1,299-bed quaternary care hospital, including 249 ICU beds, a multidisciplinary team of ICU RDs and Respiratory Therapists (RTs) partnered with a physician champion. Three Cosmed QNRG+ indirect calorimeters were purchased after a 6-month trial period. Because of the potential for rapid clinical status changes and RD staffing, the ICU team selected an order-based practice rather than a protocol. The order is signed by a Licensed Independent Practitioner (LIP) and includes three components: an order for IC with indication, a nutrition assessment consult, and a conditional order for the RD to release to RT once testing is approved. After the order is signed, the RD collaborates with the Registered Nurse and RT by verifying standardized clinical criteria to assess IC candidacy. If appropriate, the RD releases the order to RT prior to testing to allow documentation of ventilator settings. To begin the test, the RD enters patient information and calibrates the pneumotach, after which the RT secures ventilator connections. Next, the RD starts the test and remains at the bedside for the standardized 20-minute duration to ensure steady state is not interrupted. Once testing is completed, the best 5-minute average is selected to obtain the measured resting energy expenditure (mREE). The RD interprets the results considering a multitude of factors and, if warranted, modifies nutrition interventions.
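
The numerical steps behind the reported mREE are selecting the best 5-minute window from the 20-minute recording and converting gas exchange to energy. Below is a minimal sketch assuming minute-by-minute VO2 and VCO2 in L/min and the abbreviated Weir equation; the steady-state criterion used here (lowest coefficient of variation of VO2) is an assumption for illustration, not the calorimeter's documented algorithm.

    import statistics

    def weir_kcal_per_day(vo2_l_min: float, vco2_l_min: float) -> float:
        # Abbreviated Weir equation: kcal/min scaled to kcal/day.
        return (3.941 * vo2_l_min + 1.106 * vco2_l_min) * 1440

    def best_5min_mree(vo2: list, vco2: list) -> float:
        """Return the mean REE over the most stable (lowest-CV VO2) 5-minute window."""
        best_cv, best_ree = None, None
        for i in range(len(vo2) - 4):
            window_vo2, window_vco2 = vo2[i:i + 5], vco2[i:i + 5]
            cv = statistics.stdev(window_vo2) / statistics.mean(window_vo2)
            ree = statistics.mean(
                weir_kcal_per_day(o, c) for o, c in zip(window_vo2, window_vco2)
            )
            if best_cv is None or cv < best_cv:
                best_cv, best_ree = cv, ree
        return best_ree

    # Example with 20 hypothetical one-minute readings:
    # best_5min_mree(vo2=[0.25]*20, vco2=[0.20]*20)  -> about 1737 kcal/day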

Results: Eight ICU registered dietitians completed 87 IC measurements from May 2024 through August 2024, including patients across various ICUs. All 87 patients were selected by the RD because of concerns for over- or underfeeding. Eighty-three percent of the measurements were valid tests, and seventy-nine percent led to intervention modifications. Total face-to-face time was 66 hours and 45 minutes, or an average of 45 minutes per test. Additional time spent interpreting results and modifying interventions ranged from 15 to 30 minutes.

Conclusion: IC has the ability to capture accurate energy expenditures in the critically ill. RD-directed, order-based IC practice has allowed for the successful introduction of IC at our institution. The future transition from limited IC implementation to a standard of care will depend on numerous challenges, including RD time constraints and patient volumes amid the ebbs and flows of critical care. To align with the ever-changing dynamics of critical care, staffing levels and workflows are being actively evaluated.

Table 1. Indirect Calorimetry (IC) Checklist.

Figure 1. IC Result with Invalid Test.

Figure 2. IC Result with Valid Test.

Figure 3. IC Indications and Contraindications.

Figure 4. IC EPIC Order.

Rebecca Frazier, MS, RD, CNSC1; Chelsea Heisler, MD, MPH1; Bryan Collier, DO, FACS, FCCM1

1Carilion Roanoke Memorial Hospital, Roanoke, VA

Financial Support: None Reported.

Background: Adequate energy intake with appropriate macronutrient composition is an essential part of patient recovery; however, predictive equations have been found to be of variable accuracy. Indirect calorimetry (IC) gives insight into caloric needs and the primary nutrition substrate being utilized as metabolic fuel, often identifying over- and underfeeding. Though IC is considered the gold standard for determining resting energy expenditure, it has challenges with cost, equipment feasibility, and time constraints with personnel, namely respiratory therapy (RT). Our hypothesis: Registered Dietitian (RD)-led IC tests can be conducted in a safe and feasible manner without undue risk of complication. In addition, IC will show a higher caloric need, particularly in patients who require at least seven days of ventilatory support.

Methods: A team of RDs screened surgical ICU patients at a single institution. Patients intubated for at least 3 days were considered eligible for testing. Exclusion criteria included PEEP > 10 cm H2O, fraction of inspired oxygen > 60%, Richmond Agitation-Sedation Scale ≥ 1, chest tube air leak, extracorporeal membrane oxygenation use, and a temperature change of > 1°C within 1 hour. Tests were completed using the Q-NRG+ Portable Metabolic Monitor (Baxter) based on RD and patient availability. Test results were compared to calculated needs based on the Penn State Equation (39 tests). Overfeeding/underfeeding was defined as > 15% deviation from the equation results. The mean difference in energy needs was analyzed using a standard paired, two-tailed t-test for ≤ 7 and > 7 total ventilated days.
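
A minimal sketch of the feeding-adequacy classification and the paired comparison described above, with hypothetical kcal/day values; scipy is assumed to be available.

    from scipy import stats

    def classify(measured_ree: float, calculated_ree: float) -> str:
        """Measured REE within 85-115% of the calculated (Penn State) value is
        'appropriate'; below 85% suggests the equation overestimates (overfeeding
        risk), above 115% suggests underestimation (underfeeding risk)."""
        pct = 100 * measured_ree / calculated_ree
        if pct < 85:
            return "overfeeding"
        if pct > 115:
            return "underfeeding"
        return "appropriate"

    # Paired, two-tailed t-test on matched measured vs calculated values (hypothetical data).
    measured   = [1850, 2100, 1600, 1950, 1700]
    calculated = [2200, 1900, 2050, 2150, 1990]
    t_stat, p_value = stats.ttest_rel(measured, calculated)
    print(classify(1850, 2200), t_stat, p_value)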

Results: Thirty patients underwent IC testing; a total of 39 tests were completed. There were no complications in RD-led IC testing, and minimal RT involvement was required after the first 5 tests were completed. Overall, 56.4% of all IC tests, regardless of ventilator days, indicated overfeeding; 33.3% indicated appropriate feeding (85-115% of calculated REE), and 10.3% demonstrated underfeeding. When stratified by ventilator days (> 7 d vs ≤ 7 d), similar results were found: 66% of IC tests deviated > 15% from calculated caloric needs, with 54.4-60.0% of patients overfed and 12.5-6.7% underfed by equation estimates, respectively.

Conclusion: Equations estimating caloric needs provide inconsistent results. Compared with IC, nutritional equations under- and overestimate needs similarly regardless of ventilator days. Despite the lack of statistical significance, the effects of poor nutrition are well documented and clinically significant. With minimal training, IC can be performed safely by an RD and bedside RN. Utilizing the RD to coordinate and perform IC testing is a feasible process that maximizes personnel efficiency and allows for immediate adjustment of the nutrition plan. IC, as the gold standard for nutrition estimation, should be performed in surgical ICU patients to assist in developing nutritional treatment algorithms.

Dolores Rodríguez1; Mery Guerrero2; María Centeno2; Barbara Maldonado2; Sandra Herrera2; Sergio Santana3

1Ecuadorian Society for the Fight against Cancer, Guayaquil, Guayas; 2SOLCA, Guayaquil, Guayas; 3University of Havana, La Habana, Ciudad de la Habana

Financial Support: None Reported.

Background: In 2022, the International Agency for Research on Cancer-Globocan reported nearly 20 million new cases of cancer worldwide, with 30,888 cases in Ecuador. Breast, prostate, and stomach cancers were the most diagnosed types. Oncohematological diseases (OHD) significantly affect the nutritional status of patients. The ELAN Ecuador-2014 study, involving over 5,000 patients, found malnutrition in 37% of participants overall, rising to 65% among those with OHD. The Latin American Study of Malnutrition in Oncology (LASOMO), conducted by FELANPE between 2019 and 2020, revealed a 59.1% frequency of malnutrition among 1,842 patients across 52 health centers in 10 Latin American countries. This study aims to present the current state of malnutrition associated with OHD among patients treated in Ecuadorian hospitals.

Methods: The Ecuadorian segment of the LASOMO Study was conducted between 2019 and 2020 as part of the regional epidemiological initiative described above. The study was designed as a one-day, nationwide, multicenter survey involving health centers and specialized services for patients with oncohematological diseases (OHD) across five hospitals located in the provinces of Guayas (3), Manabí (1), and Azuay (1). Nutritional status was assessed using the B + C scores from Detsky et al.'s Subjective Global Assessment (SGA). The study included male and female patients aged 18 years and older admitted to clinical, surgical, intensive care, and bone marrow transplant (BMT) units during October and November 2019. Participation was voluntary, and patients provided written informed consent. Data were analyzed using location, dispersion, and aggregation statistics according to variable type. The nature and strength of relationships were assessed using chi-square tests for independence, with a significance level of < 5% to identify significant associations. Odds ratios for malnutrition were calculated along with their 95% confidence intervals.
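
As an illustration of the odds-ratio calculation named above, here is a minimal sketch using hypothetical 2 x 2 counts; the cell layout and numbers are assumptions for illustration.

    import math

    def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
        """Odds ratio and Wald 95% CI for a 2x2 table:
        a = exposed & malnourished, b = exposed & well-nourished,
        c = unexposed & malnourished, d = unexposed & well-nourished."""
        odds_ratio = (a * d) / (b * c)
        se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lower = math.exp(math.log(odds_ratio) - z * se_log)
        upper = math.exp(math.log(odds_ratio) + z * se_log)
        return odds_ratio, (lower, upper)

    print(odds_ratio_ci(40, 60, 25, 75))  # hypothetical counts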

Results: The study enrolled 390 patients, 63.6% women and 36.4% men, with a mean age of 55.3 ± 16.5 years; 47.2% were aged 60 years or older. The most common tumor locations were the kidneys, urinary tract, uterus, ovaries, prostate, and testicles, together accounting for 18.7% of all cases (Table 1). Chemotherapy was the predominant oncological treatment, administered to 42.8% of the patients surveyed. Malnutrition affected 49.7% of the patients surveyed, with 14.4% categorized as severely malnourished (Figure 1). The frequency of malnutrition was independent of age, educational level, tumor location, and current cytoreductive treatment (Table 2). Notably, the majority of the malnourished individuals were men.

Conclusion: Malnutrition is highly prevalent in patients treated for OHD in Ecuadorian hospitals.

Table 1. Most Frequent Location of Neoplastic Disease in Hospitalized Ecuadorian Patients and Type of Treatment Received. The number and {in brackets} the percentage of patients included in the corresponding category are presented.

Table 2. Distribution of Malnutrition Associated with Cancer According to Selected Demographic, Clinical, and Health Characteristics of Patients Surveyed During the Oncology Malnutrition Study in Ecuador. The number and {in brackets} the percentage of malnourished patients included in each characteristic category are presented. The frequency of malnutrition was estimated using the Subjective Global Assessment (Detsky et al., 1987).

Figure 1. State of Malnutrition Among Patients Treated for Cancer in Hospitals in Ecuador.

Ranna Modir, MS, RD, CNSC, CDE, CCTD1; Christina Salido, RD1; William Hiesinger, MD2

1Stanford Healthcare, Stanford, CA; 2Stanford Medicine, Stanford, CA

Financial Support: None Reported.

Background: Cardiovascular Intensive Care Unit (CVICU) patients are at high risk for significant nutrient deficits, especially during the early intensive care unit (ICU) phase, when postoperative complications and hemodynamic instability are prevalent. These deficits can exacerbate the catabolic state, leading to muscle wasting, impaired immune function, and delayed recovery. A calorie deficit > 10,000 kcal together with meeting < 80% of nutritional needs in the early ICU phase (first 14 days) has been linked to worse outcomes, including prolonged intubation, increased ICU length of stay (LOS), and higher risk of organ dysfunction and infections such as central line-associated bloodstream infections (CLABSI). When evaluating CLABSI risk factors, the role of nutrition adequacy and malnutrition is often underestimated and overlooked, with more emphasis placed on the type of nutrition support (NS) provided, whether enteral nutrition (EN) or parenteral nutrition (PN). Historically, there has been a practice of avoiding PN to reduce CLABSI risk rather than ensuring that nutritional needs are fully met. This practice is based on initial reports from decades ago linking PN with hospital-acquired infections. However, updated guidelines based on modern data now indicate no difference in infectious outcomes with EN vs PN. In fact, adding PN when EN alone is insufficient can help reduce nutritional deficiencies, supporting optimal immune response and resilience to infection. As part of an ongoing NS audit in our CVICU, we reviewed all CLABSI cases over a 23-month period to assess nutrition adequacy and the modality of NS (EN vs PN) provided.

Methods: Data were extracted from electronic medical records for all CLABSI cases from September 2020 to July 2022. Data collected included patient characteristics and clinical and nutrition outcomes (Table 1). Descriptive statistics (means, standard deviations, frequencies) were calculated. Chi-square or Fisher's exact tests assessed the association between type of NS and meeting > 80% of calorie/protein targets within 14 days and until CLABSI onset. A significance level of 0.05 was used.
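
A minimal sketch of the modality-versus-adequacy association test described above, using hypothetical counts; scipy is assumed to be available.

    from scipy import stats

    # Rows: exclusive EN vs EN + PN; columns: met >80% of calorie target vs did not.
    table = [[3, 12],   # hypothetical counts, exclusive EN
             [3, 10]]   # hypothetical counts, EN + PN
    odds_ratio, p_value = stats.fisher_exact(table)
    print(f"Fisher's exact p = {p_value:.3f}")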

Results: Over the 23-month period, 28/51 (54.9%) patients with a CLABSI required exclusive NS throughout their entire CVICU stay; 18 were male (64.3%), median age was 54.5 years, mean BMI was 27.4, median CVICU LOS was 49.5 days, and the mortality rate was 46.4%. Surgical intervention was indicated in 60.7% of patients, with 41.2% requiring preoperative extracorporeal membrane oxygenation (ECMO) and 52.9% postoperative ECMO (Table 1). The majority of patients received exclusive EN (53.6%), with 46.4% receiving EN + PN and 0% exclusive PN. In the first 14 ICU days, 21.4% met > 80% of calorie needs and 32.1% met > 80% of protein needs, while 32.1% had a calorie deficit > 10,000 kcal. There was no difference between type of NS and ability to meet > 80% of nutrient targets in the first 14 days (Table 1; p = 0.372, p = 0.689). The majority of PN (61.5%) was initiated after ICU day 7. From ICU day 1 until CLABSI onset, the EN + PN group was better able to meet > 80% of calorie targets than the exclusive EN group (p = 0.016). Fifty percent of patients were diagnosed with malnutrition; 82% required ECMO cannulas and 42.9% a dialysis triple-lumen catheter. Enterococcus faecalis was the most common organism in both the EN (43.7%) and EN + PN (35.7%) groups (Table 2).

Conclusion: This single-center analysis of CVICU CLABSI patients found that the majority of those requiring exclusive NS failed to meet > 80% of nutrition needs during the early ICU phase. Exclusive EN was the primary mode of NS compared with EN + PN or PN alone, challenging the assumption that PN inherently increases CLABSI risk; in fact, EN + PN improved the ability to meet calorie targets until CLABSI onset. These findings suggest that early nutrient deficits may increase CLABSI risk and that the risk is not dependent on the type of NS provided.

Table 1. Patient Characteristics, Clinical and Nutritional Outcomes.

Table 2. Type of Central Access Device and Microorganism in Relation to Modality of Nutrition Support Provided.

Oki Yonatan, MD1; Faya Nuralda Sitompul2

1ASPEN, Jakarta, Jakarta Raya; 2Osaka University, Minoh, Osaka

Financial Support: None Reported.

Background: Ginseng, widely used as a functional food or therapeutic supplement in Asia, contains bioactive compounds such as ginsenosides, which exert a range of biological effects, including hypoglycemic, anti-inflammatory, cardioprotective, and anti-tumor properties. However, studies have indicated that ginseng also has anticoagulant and anti-aggregation effects and may be associated with bleeding. This case report presents a potential case of ginseng-induced bleeding in an elderly patient with advanced pancreatic cancer. Case Description: A 76-year-old male with stage IV pancreatic cancer and metastases to the liver, lymph nodes, and peritoneum was in home care with BIPAP ventilation, NGT feed, ascites drainage, and foley catheter. He had a history of type A aortic dissection repair, anemia, and thrombocytopenia, with platelet counts consistently below 50,000/µL. Despite no history of anticoagulant use, the patient developed massive gastrointestinal bleeding and hematuria after consuming 100 grams of American Ginseng (AG) per day for five days. Disseminated intravascular coagulation (DIC) was initially suspected, but no signs of bleeding were observed until the third week of care, coinciding with ginseng consumption. Endoscopy was not performed due to the patient's unstable condition and the family's refusal. Discussion: The consumption of AG may have triggered bleeding due to the patient's already unstable condition and low platelet count. Ginsenosides, particularly Rg1, Rg2, and Rg3, have been shown to exert anticoagulant effects, prolonging clotting times and inhibiting platelet aggregation. Studies have demonstrated that AG extracts can significantly extend clotting times and reduce platelet activity, potentially contributing to the observed bleeding. Conclusion: This case highlights the possible role of AG in inducing severe bleeding in a patient with pancreatic cancer and thrombocytopenia. Given ginseng's known anticoagulant properties, caution should be exercised when administering it to patients with hematological abnormalities or bleeding risks, and further research is warranted to assess its safety in these populations.

Methods: None Reported.

Results: None Reported.

Conclusion: None Reported.

Kursat Gundogan, MD1; Mary Nellis, PhD2; Nurhayat Ozer, PhD3; Sahin Temel, MD3; Recep Yuksel, MD4; Murat Sungar, MD5; Dean Jones, PhD2; Thomas Ziegler, MD6

1Division of Clinical Nutrition, Erciyes University Health Sciences Institute, Kayseri; 2Emory University, Atlanta, GA; 3Erciyes University Health Sciences Institute, Kayseri; 4Department of Internal Medicine, Erciyes University School of Medicine, Kayseri; 5Department of Internal Medicine, Erciyes University School of Medicine, Kayseri; 6Emory Healthcare, Atlanta, GA

Financial Support: Erciyes University Scientific Research Committee (TSG- 2021–10078) and TUBITAK 2219 program (1059B-192000150), each to KG, and National Institutes of Health grant P30 ES019776, to DPJ and TRZ.

Background: Metabolomics represents a promising profiling technique for the investigation of significant metabolic alterations that arise in response to critical illness. The present study utilized plasma high-resolution metabolomics (HRM) analysis to define systemic metabolism associated with common critical illness severity scores in critically ill adults.

Methods: This cross-sectional study was performed at Erciyes University Hospital, Kayseri, Turkiye, and Emory University, Atlanta, GA, USA. Participants were critically ill adults with an expected length of intensive care unit (ICU) stay longer than 48 h. Plasma for metabolomics was obtained on the day of ICU admission. Data were analyzed by regressing two ICU admission illness severity scores (APACHE II and mNUTRIC) against all plasma metabolomic features in metabolome-wide association studies (MWAS). The APACHE II score was analyzed as a continuous variable, and the mNUTRIC score as a dichotomous variable [≤ 4 (low) vs > 4 (high)]. Pathway enrichment analysis was performed on significant metabolites (raw p < 0.05) related to each of the two illness severity scores independently.
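
Conceptually, the MWAS step fits one model per metabolomic feature against the severity score and retains features at raw p < 0.05. Below is a minimal sketch with simulated data; the dimensions and the simple per-feature linear regression are assumptions for illustration, and feature filtering, multiple-testing handling, and pathway enrichment are omitted.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_patients, n_features = 77, 1000                 # illustrative dimensions
    intensities = rng.normal(size=(n_features, n_patients))
    apache_ii = rng.integers(5, 40, size=n_patients)  # simulated severity scores

    # Regress each feature's intensity on APACHE II and keep raw p < 0.05.
    p_values = np.array(
        [stats.linregress(apache_ii, feature).pvalue for feature in intensities]
    )
    significant = np.flatnonzero(p_values < 0.05)
    print(f"{significant.size} features at raw p < 0.05")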

Results: A total of 77 patients were included. The mean age was 69 years (range 33-92 years); 65% were female. More than 15,000 metabolomic features were included in the MWAS of the APACHE II and mNUTRIC scores. Metabolic pathways significantly associated with the APACHE II score at ICU admission included C21-steroid hormone biosynthesis and urea cycle, vitamin E, selenoamino acid, aspartate/asparagine, and thiamine metabolism. Metabolic pathways associated with the mNUTRIC score at ICU admission were N-glycan degradation and metabolism of fructose/mannose, vitamin D3, pentose phosphate, sialic acid, and linoleic acid. Within the significant pathways, the gut microbiome-derived metabolites hippurate and N-acetylornithine were downregulated, and creatine and glutamate were upregulated, with increasing APACHE II scores. Metabolites involved in energy metabolism that were altered with a high (> 4) mNUTRIC score included N-acetylglucosamine (increased) and gluconate (decreased).

Conclusion: Plasma HRM identified significant associations between two commonly used illness severity scores and metabolic processes involving steroid biosynthesis, the gut microbiome, skeletal muscle, and amino acid, vitamin, and energy metabolism in adult critically ill patients.

Hilary Winthrop, MS, RD, LDN, CNSC1; Megan Beyer, MS, RD, LDN2; Jeroen Molinger, PhDc3; Suresh Agarwal, MD4; Paul Wischmeyer, MD, EDIC, FCCM, FASPEN5; Krista Haines, DO, MA4

1Duke Health, Durham, NC; 2Duke University School of Medicine- Department of Anesthesiology, Durham, NC; 3Duke University Medical Center - Department of Anesthesiology - Duke Heart Center, Raleigh, NC; 4Duke University School of Medicine, Durham, NC; 5Duke University Medical School, Durham, NC

Financial Support: None Reported.

Background: Little evidence currently exists on the effect of body mass index (BMI) on resting energy expenditure (REE). Current clinical practice guidelines are based primarily on expert opinion and provide a wide range of calorie recommendations without delineations specific to BMI categories, making it difficult for the clinician to accurately prescribe calorie needs for hospitalized patients. This abstract uses metabolic cart data from studies conducted at a large academic healthcare system to investigate trends in REE across BMI categories.

Methods: A pooled cohort of hospitalized patients was compiled from three clinical trials in which metabolic cart measurements were collected. In all three studies, indirect calorimetry was initially conducted in the intensive care setting, with follow-up measurements conducted as clinically able. Variables included in the analysis were measured resting energy expenditure (mREE) in total kcal and in kcal per kilogram of body weight on ICU admission, respiratory quotient, days post ICU admission, and demographic and clinical characteristics. ANOVA tests were used to analyze continuous data.

Results: A total of 165 patients were included in the final analysis, with 338 indirect calorimetry measurements. Disease groups included COVID-19 pneumonia, non-COVID respiratory failure, surgical ICU, cardiothoracic surgery, and trauma. The average age of patients was 58 years, with 96 males (58.2%) and 69 females (41.8%) and an average BMI of 29.0 kg/m2. Metabolic cart measurements were taken on average on day 8 post ICU admission (range, day 1 to day 61). See Table 1 for additional demographics and clinical characteristics. Indirect calorimetry measurements were grouped into three BMI categories: BMI ≤ 29.9 (normal), BMI 30-39.9 (obese), and BMI ≥ 40 (super obese). ANOVA analyses showed statistical significance among the three BMI groups in both total kcals (p < 0.001) and kcals per kg (p < 0.001). The normal BMI group had an average mREE of 1632 kcals (range, 767 to 4023), compared with 1868 kcals (range, 1107 to 3754) in the obese BMI group and 2004 kcals (range, 1219 to 3458) in the super obese BMI group. Similarly, when analyzing kcals per kg, the normal BMI group averaged 23.3 kcals/kg, the obese BMI group 19.8 kcals/kg, and the super obese BMI group 16.3 kcals/kg.

Conclusion: Without access to a metabolic cart to accurately measure REE, the majority of nutrition clinicians are left to rely on estimates. Current clinical guidelines and published data do not provide the guidance necessary to accurately feed many hospitalized patients. This analysis only scratches the surface of the metabolic demands of different patient populations based on their BMI status, especially given the wide ranges of energy expenditures observed. Robust studies are needed to further elucidate the relationships between BMI and REE in different disease states.

Table 1. Demographics and Clinical Characteristics.

Figure 1. Average Measured Resting Energy Expenditure (in Total Kcals), by BMI Group.

Figure 2. Average Measured Resting Energy Expenditure (in kcals/kg), by BMI Group.

Carlos Reyes Torres, PhD, MSc1; Daniela Delgado Salgado, Dr2; Sergio Diaz Paredes, Dr1; Sarish Del Real Ordoñez, Dr1; Eva Willars Inman, Dr1

1Hospital Oncológico de Coahuila (Oncological Hospital of Coahuila), Saltillo, Coahuila de Zaragoza; 2ISSSTE, Saltillo, Coahuila de Zaragoza

Financial Support: None Reported.

Background: Chemotherapy is one of the principal treatments for cancer, and some degree of toxicity is described in 98% of patients. Changes in body composition are frequent and are related to worse outcomes, and low muscle mass has been associated with chemotherapy toxicity in observational studies. Phase angle (PhA) is an indicator of cell integrity and correlates positively with adequate nutritional status and muscle mass. Few studies have evaluated the association of PhA with chemotherapy toxicity. The aim of this study was to evaluate the association of PhA and body composition with chemotherapy toxicity in cancer patients.

Methods: A prospective cohort study was conducted in adult patients with solid neoplastic disease receiving first-line systemic treatment. Subjects were evaluated at the first chemotherapy treatment using bioelectrical impedance analysis with an RJL device according to the standardized technique. The outcome was chemotherapy toxicity during the first 4 cycles of chemotherapy and its association with PhA and body composition. Toxicity was evaluated using the National Cancer Institute (NCI) Common Terminology Criteria for Adverse Events, version 5.0. A PhA < 4.7° was considered low, in line with other studies.
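
Phase angle is derived from the resistance (R) and reactance (Xc) reported by the impedance device. Below is a minimal sketch of the standard calculation and the < 4.7 degree cutoff used in this study; the example R and Xc values are hypothetical.

    import math

    def phase_angle_deg(resistance_ohm: float, reactance_ohm: float) -> float:
        """Phase angle (degrees) = arctan(Xc / R) * 180 / pi."""
        return math.degrees(math.atan(reactance_ohm / resistance_ohm))

    pha = phase_angle_deg(resistance_ohm=520.0, reactance_ohm=42.0)  # hypothetical values
    print(f"PhA = {pha:.2f} degrees -> {'low' if pha < 4.7 else 'normal'}")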

Results: A total of 54 patients were evaluated and included in the study. The most common cancer diagnoses were breast cancer (40%), gastrointestinal tumors (33%), and lung cancer (18%). Chemotherapy toxicity occurred in 46% of patients. The most common adverse effects were gastrointestinal (48%), blood disorders (32%), and metabolic disorders (40%). PhA differed significantly between patients with and without chemotherapy toxicity: 4.45° (3.08-4.97) vs 6.07° (5.7-6.2), respectively, p < 0.001. PhA was associated with the risk of chemotherapy toxicity, HR 8.7 (95% CI 6.1-10.7), log-rank test p = 0.02.

Conclusion: PhA was associated with the risk of chemotherapy toxicity in cancer patients.

Lizl Veldsman, RD, M Nutr, BSc Dietetics1; Guy Richards, MD, PhD2; Carl Lombard, PhD3; Renée Blaauw, PhD, RD1

1Division of Human Nutrition, Department of Global Health, Faculty of Medicine & Health Sciences, Stellenbosch University, Cape Town, Western Cape; 2Department of Surgery, Division of Critical Care, Faculty of Health Sciences, University of the Witwatersrand, Johannesburg, Gauteng; 3Division of Epidemiology and Biostatistics, Department of Global Health, Faculty of Medicine and Health Sciences, Stellenbosch University, Cape Town, Western Cape

Financial Support: Fresenius Kabi JumpStart Research Grant.

Background: Critically ill patients lose a significant amount of muscle mass over the first ICU week. We sought to determine the effect of bolus amino acid (AA) supplementation on the urea-to-creatinine ratio (UCR) trajectory over time, and whether UCR correlates with histology myofiber cross-sectional area (CSA) as a potential surrogate marker of muscle mass.

Methods: This was a secondary analysis of data from a registered clinical trial (ClinicalTrials.gov NCT04099108) undertaken in a predominantly trauma surgical ICU. Participants were randomly assigned into two groups, both of which received standard care nutrition (SCN) and mobilisation. Study participants randomised to the intervention group also received a bolus AA supplement, with a 45-minute in-bed cycling session, from average ICU day 3 for a mean of 6 days. The change in vastus lateralis myofiber CSA, measured from pre-intervention (average ICU day 2) to post-intervention (average ICU day 8), was assessed through biopsy and histological analysis. A linear mixed-effects regression model was used to compare the mean daily UCR profiles between study groups from ICU day 0 to day 10. As a sensitivity analysis, we adjusted for disease severity on admission (APACHE II and SOFA scores), daily fluid balance, and the presence of acute kidney injury (AKI). A Spearman correlation compared the UCR on ICU day 2 (pre-intervention) and ICU day 8 (post-intervention) with the corresponding myofiber CSA.
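
A minimal sketch of the longitudinal comparison described above, fitting a random-intercept mixed model to simulated long-format UCR data; the data-generating values and the exact model specification are assumptions for illustration, and statsmodels/pandas are assumed to be available.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    rows = []
    for pid in range(20):                     # 20 hypothetical patients
        group = "intervention" if pid < 10 else "control"
        slope = 4.0 if group == "intervention" else 2.5
        for day in range(11):                 # ICU day 0 to day 10
            rows.append({"patient": pid, "day": day, "group": group,
                         "ucr": 70 + slope * day + rng.normal(0, 5)})
    df = pd.DataFrame(rows)

    # Random intercept per patient; fixed effects for day, group, and their interaction.
    fit = smf.mixedlm("ucr ~ day * group", df, groups=df["patient"]).fit()
    print(fit.summary())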

Results: A total of 50 enrolled participants were randomised to the study intervention and control groups in a 1:1 ratio. The control and intervention groups received, on average, 87.62 ± 32.18 and 85.53 ± 29.29 grams of protein per day (1.26 ± 0.41 and 1.29 ± 0.40 g/kg/day, respectively) from SCN, and the intervention group an additional 30.43 ± 5.62 grams of AA (0.37 ± 0.06 g/kg protein equivalents) from the AA supplement. At baseline, the mean UCR for the control (75.6 ± 31.5) and intervention group (63.8 ± 27.1) were similar. Mean UCR increased daily from baseline in both arms, but at a faster rate in the intervention arm with a significant intervention effect (p = 0.0127). The UCR for the intervention arm at day 7 and 8 was significantly higher by 21 and 22 units compared to controls (p = 0.0214 and p = 0.0215, respectively). After day 7 the mean daily UCR plateaued in the intervention arm, but not in the controls. Adjusting for disease severity, daily fluid balance, and AKI did not alter the intervention effect. A significant negative association was found between UCR and myofiber CSA (r = -0.39, p = 0.011) at ICU day 2 (pre-intervention), but not at day 8 (post-intervention) (r = 0.23, p = 0.153).

Conclusion: Bolus amino acid supplementation significantly increases the UCR during the first ICU week, thereafter plateauing. UCR at baseline may be an indicator of muscle status.

Figure 1. Change in Urea-to-Creatinine Ratio (UCR) Over ICU Days in the Control and Intervention Group. Error bars Represent 95% Confidence Intervals (CIs).

Paola Renata Lamoyi Domínguez, MSc1; Iván Osuna Padilla, PhD2; Lilia Castillo Martínez, PhD3; Josué Daniel Cadeza-Aguilar, MD2; Martín Ríos-Ayala, MD2

1UNAM, National Autonomous University of Mexico, Mexico City, Distrito Federal; 2National Institute of Respiratory Diseases, Mexico City, Distrito Federal; 3National Institute of Medical Sciences and Nutrition Salvador Zubirán, Mexico City, Distrito Federal

Financial Support: None Reported.

Background: Non-defecation (ND) is highly prevalent in critically ill patients on mechanical ventilation (MV) and has been reported in up to 83% of cases. This disorder is associated with high morbidity and mortality rates. Most existing research has focused on the association between clinical factors and non-defecation; however, there is a lack of evidence regarding its association with dietary factors. We aimed to analyze the association of dietary fiber in enteral nutrition and the volume of fluids administered through enteral and parenteral routes with defecation during the first 6 days of MV in critically ill patients with pneumonia and other lung manifestations.

Methods: We conducted a longitudinal analysis of MV patients receiving enteral nutrition (EN) at a tertiary care hospital in Mexico City between May 2023 and April 2024. The inclusion criteria were age > 18 years, MV, admission to the respiratory intensive care unit (ICU) for pneumonia or other lung manifestations, and a nutritional assessment performed during the first 24 h after ICU admission. Patients who required parenteral nutrition or major surgery, or who had traumatic brain injury or neuromuscular disorders, were excluded. A nutritional assessment, including the NUTRIC score, SOFA and APACHE II assessments, and estimation of energy-protein requirements, was performed by trained dietitians within the first 24 hours after ICU admission. On each day of follow-up (days 0 to 6), we recorded the amount of fiber provided in EN, the volume of infusion fluids (enteral and parenteral routes), and medical prescription of opioids, sedatives, neuromuscular blockers, and vasopressors. ND was defined as > 6 days without defecation from ICU admission. Differences between the ND and defecation groups were also assessed. Associations of ND with dietary factors were examined using discrete-time survival analysis.
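
Discrete-time survival analysis of this kind is commonly fitted as a logistic regression on person-day records. Below is a minimal sketch with simulated data; the data-generating values and covariate names are assumptions for illustration, not the study's actual model.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    rows = []
    for pid in range(74):                      # one row per patient per day at risk
        for day in range(1, 7):
            event = rng.random() < 1 / (1 + np.exp(-(-3.0 + 0.2 * day)))
            rows.append({"day": day,
                         "fiber_g": rng.uniform(0, 20),
                         "fluid_ml_kg": rng.uniform(20, 35),
                         "defecated": int(event)})
            if event:
                break                          # first defecation ends the risk period
    person_days = pd.DataFrame(rows)

    # Daily hazard of defecation modeled on time in the risk set and dietary covariates.
    fit = smf.logit("defecated ~ day + fiber_g + fluid_ml_kg", person_days).fit(disp=0)
    print(fit.params)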

Results: Seventy-four patients were included; ND was observed in 40 patients (54%). The non-defecation group had a longer ICU length of stay, and 50% of this group did not have a first defecation until day 10. No differences in fiber provision or volume of infusion fluids were observed between the groups. In multivariate analysis, no associations were observed between ND and fiber (fiber intake 10 to 20 g per day: OR 1.17, 95% CI 0.41-3.38, p = 0.29) or total fluids (fluid intake 25 to 30 ml/kg/d: OR 1.85, 95% CI 0.44-7.87, p = 0.404).

Conclusion: Non-defecation affected 54% of the study population. Although fiber and fluids are considered a treatment for non-defecation, we did not find an association in critically ill patients.

Table 1. Demographic and Clinical Characteristics by Groups.

Table 2. Daily Comparison of Dietary Factors.

Andrea Morand, MS, RDN, LD1; Osman Mohamed Elfadil, MBBS1; Kiah Graber, RDN1; Yash Patel, MBBS1; Suhena Patel, MBBS1; Chloe Loersch, RDN1; Isabelle Wiggins, RDN1; Anna Santoro, MS, RDN1; Natalie Johnson, MS1; Kristin Eckert, MS, RDN1; Dana Twernbold, RDN1; Dacia Talmo, RDN1; Elizabeth Engel, RRT, LRT1; Avery Erickson, MS, RDN1; Alex Kirby, MS, RDN1; Mackenzie Vukelich, RDN1; Kate Sandbakken, RDN1; Victoria Vasquez, RDN1; Manpreet Mundi, MD1

1Mayo Clinic, Rochester, MN

Financial Support: None Reported.

Background: Current guidelines for nutrition support in critically ill patients recommend the utilization of indirect calorimetry (IC) to determine energy needs. However, IC testing is limited at many institutions due to accessibility, labor, and costs, which leads to reliance on predictive equations to determine caloric targets. A quality improvement (QI) initiative was implemented to assess the impact on nutrition care when IC is routinely completed.

Methods: A prospective QI project in selected medical and surgical intensive care units (ICU) included critically ill patients assessed by the dietitian within 24-48 hours of the consult order or by hospital day 4. Patients with contraindications to IC were excluded, including those requiring ECMO, CRRT, MARS, or FiO2 > 55%, as well as spontaneously breathing patients requiring significant supplemental oxygen. The initial caloric target was established using predictive equations, which is the standard of care at our institution. After day 4 of the ICU stay, IC measurements, predictive equations, and weight-based nomograms were collected. The predictive equations evaluated included Harris-Benedict (HB) basal, adjusted HB (75% of basal when body mass index (BMI) > 30), Penn State if ventilated, Mifflin-St Jeor (MSJ), revised HB, and revised HB adjusted (75% of basal when BMI > 30). Additional demographic, anthropometric, and clinical data were collected.
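
For reference, here is a minimal sketch of two of the equations named above in their generic published forms, together with the 75%-of-basal adjustment described in the Methods; the example patient values are hypothetical, and the exact equation variants used in this project are not restated here.

    def mifflin_st_jeor(weight_kg: float, height_cm: float, age_yr: float, male: bool) -> float:
        """Mifflin-St Jeor resting energy expenditure, kcal/day (generic published form)."""
        base = 10 * weight_kg + 6.25 * height_cm - 5 * age_yr
        return base + 5 if male else base - 161

    def harris_benedict(weight_kg: float, height_cm: float, age_yr: float, male: bool) -> float:
        """Original Harris-Benedict basal energy expenditure, kcal/day (generic published form)."""
        if male:
            return 66.47 + 13.75 * weight_kg + 5.003 * height_cm - 6.755 * age_yr
        return 655.1 + 9.563 * weight_kg + 1.850 * height_cm - 4.676 * age_yr

    def adjust_for_obesity(basal_kcal: float, bmi: float) -> float:
        """Apply 75% of basal when BMI > 30, per the adjusted-HB approach above."""
        return 0.75 * basal_kcal if bmi > 30 else basal_kcal

    # Example: 95 kg, 175 cm, 60-year-old man (BMI ~31).
    hb = harris_benedict(95, 175, 60, male=True)
    print(round(mifflin_st_jeor(95, 175, 60, male=True)), round(adjust_for_obesity(hb, bmi=31)))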

Results: Patients (n = 85) were majority male (n = 53, 62.4%), admitted to the surgical ICU (n = 57, 67.1%), and overweight (mean BMI 29.8 kg/m^2), with an average age of 61.3 years (SD 16.5). At the time of the IC test, the median ICU length of stay was 6 days; 77.6% (n = 66) were supported with mechanical ventilation, and median ventilator days were 4 (Table 1). Mean IC-measured REE was compared with the predictive equations; except for the weight-based nomogram high caloric estimate (p = 0.3615), all equations were significantly lower than IC (p < 0.0001). Median absolute differences from IC for the evaluated predictive equations are shown in Figure 1. After REE was measured, caloric goals increased significantly for patients on enteral (EN) and parenteral nutrition (PN) (p = 0.0016 and p = 0.05, respectively). In enterally fed patients, the mean calorie goal before REE was 1655.4 kcal (SD 588) and after REE 1917.6 kcal (SD 528.6), an average increase of 268.4 kcal/day; in parenterally fed patients, the mean calorie goal before REE was 1395.2 kcal (SD 313.6) and after REE 1614.1 kcal (SD 239.3), an average increase of 167.5 kcal (Table 2). The mean REE per BMI category per actual body weight was: BMI < 29.9 = 25.7 ± 7.9 kcal/kg, BMI 30-34.9 = 20.3 ± 3.8 kcal/kg, BMI 35-39.9 = 22.8 ± 4.6 kcal/kg, and BMI ≥ 40 = 16.3 ± 2.9 kcal/kg (25.4 ± 10.5 kcal/kg of ideal body weight). Figure 2 illustrates the average daily calorie need broken down by BMI for IC and the examined predictive equations.

Conclusion: There was a significant difference between IC measurements and various predictive equations except for weight-based high-estimated calorie needs. Nutrition goals changed significantly in response to IC measurements. It is recommended that we expand the use of IC in the critically ill population at our institution. In settings where IC is not possible, weight-based nomograms should be utilized.

Table 1. Baseline Demographics and Clinical Characteristics.

Table 2. Nutrition Support.

Figure 1. Difference in Daily Calorie Estimation Utilizing IC Compared to Predictive Equations.

Figure 2. RMR by IC and Other Predictive Equations by BMI.

GI, Obesity, Metabolic, and Other Nutrition Related Concepts

Suhena Patel, MBBS1; Osman Mohamed Elfadil, MBBS1; Yash Patel, MBBS1; Chanelle Hager, RN1; Manpreet Mundi, MD1; Ryan Hurt, MD, PhD1

1Mayo Clinic, Rochester, MN

Financial Support: None Reported.

Background: Chronic capillary leak syndrome is a rare but potentially fatal side effect of immunotherapy for some malignancies, mainly manifested with intractable generalized edema and often refractory hypotension. An idiopathic type of the syndrome is also known. It can be diagnosed by exclusion in patients with a single or recurrent episode of intravascular hypovolemia or generalized edema, primarily manifested by the diagnostic triad of hypotension, hemoconcentration, and hypoalbuminemia in the absence of an identifiable alternative cause. Supportive care with a role for steroids remains the standard treatment. In capillary leak syndrome secondary to cancer immune therapy, discontinuing the offending agent is typically considered. This relatively rare syndrome can be associated with significant clinical challenges. This clinical case report focuses on aspects of nutrition care.

Methods: A 45-year-old male with a past medical history of hypertension, pulmonary tuberculosis in childhood, and rectal cancer was admitted for evaluation of anasarca. He was first diagnosed with moderately differentiated invasive adenocarcinoma of the rectum, stage IIIb (cT3, cN1, cM0), in October 2022. As initial therapy, he was enrolled in a clinical trial and received 25 cycles of immunotherapy with the study drug Vudalimab (a PD-1/CTLA-4 bispecific antibody), achieving a complete clinical response without additional chemotherapy, radiation, or surgery. Unfortunately, he developed extensive capillary leak syndrome manifested by recurrent anasarca, chylous ascites, and pleural effusions beginning in November 2023. His treatment was also complicated by the development of thyroiditis and insulin-dependent diabetes. The patient most recently presented with abdominal fullness, ascites, and peripheral edema that did not improve despite diuretic therapy. A diagnostic and therapeutic paracentesis was performed, revealing chylous ascites. Two weeks later, the patient presented with re-accumulation of ascites and worsening anasarca with pleural and pericardial effusions. A PET-CT was negative for malignant lesions but revealed increased uptake along the peritoneal wall, suggestive of peritonitis. A lymphangiogram performed for further evaluation revealed no gross leak or obstruction; however, this study could not rule out a microleak from increased capillary permeability. He required bilateral pleural and peritoneal drains (output ranged from 0.5 to 1 L daily). A diagnosis of capillary leak syndrome was made. In addition to octreotide, immunosuppression therapy was initiated with IV methylprednisolone (40 mg BID) followed by a transition to oral steroids (60 mg PO); however, the patient's symptoms reappeared with the dose reduction and transition to oral steroids. His immunosuppression regimen was modified to include a trial of weekly IVIG and IV albumin twice daily. From a nutritional perspective, he was initially on a routine oral diet, but his drain output increased specifically after fatty food consumption, so he was switched to a low-fat (40 g/day), high-protein diet to prevent worsening chylous ascites. In the setting of worsening anasarca, moderate malnutrition based on ASPEN criteria, and clinically significant muscle loss, he was started on TPN. A no-fat diet was initiated to minimize lymphatic flow, with subsequent improvement in his chest tube output, followed by a transition to home parenteral nutrition with a mixed-oil lipid emulsion and an oral diet.

Results: None Reported.

Conclusion: Chronic capillary/lymphatic leak syndrome can be challenging to manage and may necessitate dietary modification. Along with dietary changes to significantly reduce oral fat intake, short- or long-term PN can be considered.

Kishore Iyer, MBBS1; Francisca Joly, MD, PhD2; Donald Kirby, MD, FACG, FASPEN3; Simon Lal, MD, PhD, FRCP4; Kelly Tappenden, PhD, RD, FASPEN2; Palle Jeppesen, MD, PhD5; Nader Youssef, MD, MBA6; Mena Boules, MD, MBA, FACG6; Chang Ming, MS, PhD6; Tomasz Masior, MD6; Susanna Huh, MD, MPH7; Tim Vanuytsel, MD, PhD8

1Icahn School of Medicine at Mount Sinai, New York, NY; 2Nutritional Support, Hôpital Beaujon, Paris, Ile-de-France; 3Department of Intestinal Failure and Liver Diseases, Cleveland, OH; 4Salford Royal NHS Foundation Trust, Salford, England; 5Department of Intestinal Failure and Liver Diseases, Rigshospitalet, Copenhagen, Denmark, Hovedstaden; 6Ironwood Pharmaceuticals, Basel, Basel-Stadt; 7Ironwood Pharmaceuticals, Boston, MA; 8University Hospitals Leuven, Leuven, Brabant Wallon

Encore Poster

Presentation: American College of Gastroenterology 2024, October 25-30, 2024, Philadelphia, Pennsylvania.

Financial Support: None Reported.

International Poster of Distinction

Francisca Joly, MD, PhD1; Tim Vanuytsel, MD, PhD2; Donald Kirby, MD, FACG, FASPEN3; Simon Lal, MD, PhD, FRCP4; Kelly Tappenden, PhD, RD, FASPEN1; Palle Jeppesen, MD, PhD5; Federico Bolognani, MD, PhD6; Nader Youssef, MD, MBA6; Carrie Li, PhD6; Reda Sheik, MPH6; Isabelle Statovci, BS, CH6; Susanna Huh, MD, MPH7; Kishore Iyer, MBBS8

1Nutritional Support, Hôpital Beaujon, Paris, Ile-de-France; 2University Hospitals Leuven, Leuven, Brabant Wallon; 3Department of Intestinal Failure and Liver Diseases, Cleveland, OH; 4Salford Royal NHS Foundation Trust, Salford, England; 5Department of Intestinal Failure and Liver Diseases, Rigshospitalet, Copenhagen, Denmark, Hovedstaden; 6Ironwood Pharmaceuticals, Basel, Basel-Stadt; 7Ironwood Pharmaceuticals, Boston, MA; 8Icahn School of Medicine at Mount Sinai, New York, NY

Encore Poster

Presentation: Digestive Disease Week 2024, May 18 - 21, 2024, Washington, US.

Financial Support: None Reported.

Tim Vanuytsel, MD, PhD1; Simon Lal, MD, PhD, FRCP2; Kelly Tappenden, PhD, RD, FASPEN3; Donald Kirby, MD, FACG, FASPEN4; Palle Jeppesen, MD, PhD5; Francisca Joly, MD, PhD3; Tomasz Masior, MD6; Patricia Valencia, PharmD7; Chang Ming, MS, PhD6; Mena Boules, MD, MBA, FACG6; Susanna Huh, MD, MPH7; Kishore Iyer, MBBS8

1University Hospitals Leuven, Leuven, Brabant Wallon; 2Salford Royal NHS Foundation Trust, Salford, England; 3Nutritional Support, Hôpital Beaujon, Paris, Ile-de-France; 4Department of Intestinal Failure and Liver Diseases, Cleveland, OH; 5Department of Intestinal Failure and Liver Diseases, Rigshospitalet, Copenhagen, Denmark, Hovedstaden; 6Ironwood Pharmaceuticals, Basel, Basel-Stadt; 7Ironwood Pharmaceuticals, Boston, MA; 8Icahn School of Medicine at Mount Sinai, New York, NY

Encore Poster

Presentation: American College of Gastroenterology 2024, October 25-30, 2024, Philadelphia, Pennsylvania.

Financial Support: None Reported.

Boram Lee, MD1; Ho-Seong Han, PhD1

1Seoul National University Bundang Hospital, Seoul, Seoul-t'ukpyolsi

Financial Support: None Reported.

Background: Pancreatic cancer is one of the most lethal malignancies, with a 5-year survival rate of less than 10%. Despite advancements in treatment, its incidence is increasing, driven by aging populations and rising obesity rates. Obesity is traditionally considered a negative prognostic factor for many cancers, including pancreatic cancer. However, the "obesity paradox" suggests that obesity might be associated with better outcomes in certain diseases. This study investigated the effect of obesity on the survival of long-term pancreatic cancer survivors after pancreatectomy.

Methods: A retrospective analysis was conducted on 404 patients with pancreatic ductal adenocarcinoma (PDAC) who underwent surgery between January 2004 and June 2022. Patients were classified into non-obese (BMI 18.5-24.9) (n = 313) and obese (BMI ≥ 25.0) (n = 91) groups. The data collected included demographic, clinical, perioperative, and postoperative information. Survival outcomes (overall survival [OS], recurrence-free survival [RFS], and cancer-specific survival [CSS]) were analyzed using Kaplan-Meier curves and Cox regression models. A subgroup analysis examined the impact of the visceral fat to subcutaneous fat ratio (VSR) on survival within the obese cohort.
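
As a hedged illustration of the survival analysis named above (Kaplan-Meier curves and Cox regression), the sketch below shows how such models could be fit with the Python lifelines package; the column names and toy follow-up data are hypothetical and are not drawn from the study dataset.

```python
# Sketch of the survival workflow (hypothetical data, not the study cohort).
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Hypothetical data: follow-up time (months), death indicator, obesity group, age.
df = pd.DataFrame({
    "time_months": [12, 34, 60, 9, 48, 27, 60, 15],
    "death":       [1,  1,  0,  1,  0,  1,  0,  1],
    "obese":       [0,  0,  0,  0,  1,  1,  1,  1],
    "age":         [68, 72, 59, 75, 63, 70, 58, 66],
})

# Kaplan-Meier estimate for each BMI group.
for label, grp in df.groupby("obese"):
    kmf = KaplanMeierFitter()
    kmf.fit(grp["time_months"], event_observed=grp["death"],
            label="obese" if label else "non-obese")
    print(kmf.median_survival_time_)

# Cox proportional hazards model using the remaining columns as covariates.
cph = CoxPHFitter()
cph.fit(df, duration_col="time_months", event_col="death")
cph.print_summary()
```

In practice, the grouping column would carry the non-obese/obese assignment and the Cox model would adjust for the clinical covariates collected in the study.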

Results: Obese patients (n = 91) had a significantly better 5-year OS (38.9% vs. 27.9%, p = 0.040) and CSS (41.4% vs. 33%, p = 0.047) than non-obese patients. RFS did not differ significantly between the groups. Within the obese cohort, a lower VSR was associated with improved survival (p = 0.012), indicating the importance of fat distribution in outcomes.

Conclusion: Obesity is associated with improved overall and cancer-specific survival in patients with pancreatic cancer undergoing surgery, highlighting the potential benefits of a nuanced approach to managing obese patients. The distribution of adipose tissue, specifically higher subcutaneous fat relative to visceral fat, further influences survival, suggesting that tailored treatment strategies could enhance the outcomes.

Nicole Nardella, MS1; Nathan Gilchrist, BS1; Adrianna Oraiqat, BS1; Sarah Goodchild, BS1; Dena Berhan, BS1; Laila Stancil, HS1; Jeanine Milano, BS1; Christina Santiago, BS1; Melissa Adams, PA-C1; Pamela Hodul, MD1

1Moffitt Cancer Center, Tampa, FL

Financial Support: None Reported.

Background: Pancreatic cancer (PC) is a devastating diagnosis, with 66,440 new cases and 51,750 deaths estimated in 2024. The prevalence of malnutrition in patients with cancer has been reported to range from 30-85%, depending on patient age, cancer type, and stage of disease. PC patients in particular frequently present with malnutrition, which can negatively affect quality of life and tumor therapy. We hypothesize that increased awareness of early nutritional intervention for PC patients has led to high utilization of dietitian consultations at our tertiary cancer center.

Methods: This IRB-exempt retrospective review included newly diagnosed, treatment-naïve PC patients presenting to our institution in 2021-2023 (n = 701). We define newly diagnosed as having pathologically confirmed adenocarcinoma within 30 days of clinic presentation. Patients were screened for weight loss and risk of malnutrition using the validated Malnutrition Screening Tool (MST) (89.5% positive predictive value) at the initial consultation and were referred to a dietitian based on risk or patient preference. Data were collected on demographics, disease stage (localized vs metastatic), tumor location (head/neck/uncinate, body/tail, multi-focal), presenting symptoms (percent weight loss, abdominal pain, bloating, nausea/vomiting, fatigue, change in bowel habits), experience of jaundice, pancreatitis, and gastric outlet obstruction, and dietitian consultation. Descriptive variables and Fisher's exact test were used to report outcomes.
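
A minimal sketch of the Fisher's exact test mentioned above, using scipy; the 2x2 counts are hypothetical and chosen only to illustrate the call, not the study data.

```python
# Fisher's exact test on a hypothetical 2x2 table:
# rows = tumor location (head/neck/uncinate vs body/tail),
# columns = presenting symptom (present vs absent).
from scipy.stats import fisher_exact

table = [[120, 80],   # head/neck/uncinate: symptom present / absent (hypothetical)
         [70, 60]]    # body/tail: symptom present / absent (hypothetical)

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```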

Results: The majority of patients were male (54%), with a median age of 70 years (range, 27-95). About half of patients had localized disease (54%), with the primary tumor located in the head/neck/uncinate region (57%). Patients with head/neck/uncinate tumors mostly had localized disease (66%), while patients with body/tail tumors tended to have metastatic disease (63%). See Table 1 for further demographics. Unintentional weight loss was experienced by 66% of patients (n = 466), 69% of localized patients (n = 261), and 64% of metastatic patients (n = 205). Patients with localized disease reported a 12% weight loss over a median of 3 months, while metastatic patients reported a 10% weight loss over a median of 5 months. Of the localized patients, the majority presented with symptoms of abdominal pain (66%), nausea/vomiting/fatigue (61%), and change in bowel habits (44%). Presenting symptoms of the metastatic patients were similar (see Table 2). Tumor location was not significantly associated with presenting symptoms. Dietitian consults occurred for 67% (n = 473) of the patient population, 77% for those with localized disease and 57% for those with metastatic disease. Of those with reported weight loss, 74% (n = 343) had a dietitian consultation.

Conclusion: Overall, a high proportion of newly diagnosed, treatment-naïve PC patients presented with malnutrition. Patients with localized disease and tumors located in the head/neck/uncinate region experienced the greatest burden of gastrointestinal symptoms, including nausea, vomiting, change in bowel habits, and fatigue. Early implementation of a proactive nutritional screening program resulted in increased awareness of malnutrition and referral for nutritional intervention for newly diagnosed PC patients.

Table 1. Demographics and Disease Characteristics.

Table 2. Presenting Symptoms.

Nicole Nardella, MS1; Nathan Gilchrist, BS1; Adrianna Oraiqat, BS1; Sarah Goodchild, BS1; Dena Berhan, BS1; Laila Stancil, HS1; Jeanine Milano, BS1; Christina Santiago, BS1; Melissa Adams, PA-C1; Pamela Hodul, MD1

1Moffitt Cancer Center, Tampa, FL

Financial Support: None Reported.

Background: Pancreatic cancer (PC) is an aggressive disease, with a 5-year survival rate of 13%. Symptoms occur late in the disease course, leading to approximately 50% of patients presenting with metastatic disease. New-onset diabetes is often one of the first signs of PC, with diabetes diagnosis occurring up to 3 years before the cancer diagnosis. We hypothesize that increased awareness of PC prevalence in diabetic patients, both new-onset and pre-existing, may lead to earlier PC diagnosis.

Methods: This IRB-exempt retrospective review included new PC patients presenting to our institution in 2021-2023 with a diabetes diagnosis (n = 458). We define new-onset diabetes as having been diagnosed within 3 years prior to pathologically confirmed adenocarcinoma. We define newly diagnosed PC as having pathologically confirmed adenocarcinoma within 30 days of clinic presentation. Data were collected on demographics, staging (localized vs metastatic), tumor location (head/neck/uncinate, body/tail, multi-focal), treatment initiation, diabetes onset (new-onset vs pre-existing), diabetes regimen, and weight loss. Descriptive variables were used to report outcomes.

Results: In the study period, 1,310 patients presented to our institution. Of those, 35% had a diabetes diagnosis (n = 458). The majority of patients were male (61%), with an age at PC diagnosis of 69 years (41-92). Patients mostly had localized disease (57%), with the primary tumor located in the head/neck/uncinate region (59%). New-onset diabetes was present in 31% of diabetics (11% of all new patients), with 63% having localized disease (79% head/neck/uncinate) and 37% metastatic disease (66% body/tail). Of those with pre-existing diabetes (69%), 54% had localized disease (69% head/neck/uncinate) and 46% had metastatic disease (53% body/tail). See Table 1 for further demographic/disease characteristics. Abrupt worsening of diabetes was seen in 10% (n = 31) of patients with pre-existing diabetes, and 12% had a change in their regimen prior to PC diagnosis. Hence, 13% (175/1,310) of all new patients presented with either new-onset or worsening diabetes. Weight loss was present in 75% (n = 108) of patients with new-onset diabetes, with a median of 14% weight loss (3%-38%) over 12 months (1-24). In comparison, weight loss was present in 66% (n = 206) of patients with pre-existing diabetes, with a median of 14% weight loss (4%-51%) over 6 months (0.5-18). Diabetes medication was as follows: 41% oral, 30% insulin, 20% both oral and insulin, 10% no medication. Of the patients with new-onset diabetes, 68% were diagnosed within 1 year of PC diagnosis and 32% were diagnosed within 1-3 years of PC diagnosis. Of those diagnosed within 1 year, 68% had localized disease, with 81% having head/neck/uncinate tumors; of those with metastatic disease (31%), 73% had body/tail tumors. For patients with a diabetes diagnosis within 1-3 years of PC diagnosis, 52% had localized disease (75% head/neck/uncinate) and 48% had metastatic disease (59% body/tail). See Table 2 for further characteristics.

Conclusion: Overall, approximately one-third of new patients presenting with PC at our institution had diabetes, and new-onset diabetes was present in one-third of those patients. The majority of diabetic patients presented with localized head/neck/uncinate tumors. When comparing new-onset vs pre-existing diabetes, patients with new-onset diabetes tended to experience greater weight loss over a longer time and had more localized disease than those with pre-existing diabetes. Patients with a diabetes diagnosis within 1 year of PC diagnosis had more localized disease (head/neck/uncinate). Hence, increased awareness of diabetes in relation to PC, particularly new-onset and worsening pre-existing diabetes, may lead to earlier diagnosis.

Table 1. Demographics and Disease Characteristics.

Table 2. New-Onset Diabetes Characteristics.

Marcelo Mendes, PhD1; Gabriela Oliveira, RD2; Ana Zanini, RD, MSc2; Hellin dos Santos, RD, MSc2

1Cicatripelli, Belém, Para; 2Prodiet Medical Nutrition, Curitiba, Parana

Encore Poster

Financial Support: None Reported.

Background: According to the NPUAP, a pressure injury (PI) is damage that occurs to the skin and/or underlying soft tissue, primarily over bony prominences, and may also be related to the use of medical devices. PIs can range from intact skin to deeper ulcers, affecting structures such as muscles and bones. The aim of this study was to report the experience of using a specialized supplement for wound healing in the treatment of a PI.

Methods: This is a case report based on the experience of the nurses from Cicatripelli®. Data were collected from May to July 2024 through a review of medical records and photographs showing the wound's progression. The patient was a 69-year-old woman with COPD, diabetes mellitus, and systemic arterial hypertension, who denied smoking and alcohol consumption. She developed a stage 4 PI in the sacral region after 15 days of mechanical ventilation due to bacterial pneumonia and was admitted to a private clinic for treatment on May 2, 2024. Initial wound assessment: Measurements: 16.5x13x4cm (WxLxD); abundant purulent exudate with a foul odor; intact peripheral skin; mild to moderate pain; 75% granulation tissue and 25% liquefactive necrosis (slough) (Figure 1). The wound was cleansed with 0.1% polyhexamethylene biguanide (PHMB) solution, followed by conservative debridement, primary coverage with hydrophiber with 1.2% silver, and secondary coverage with cotton gauze and transparent film, with dressing changes scheduled every 72 hours. Supplementation (Correctmax – Prodiet Medical Nutrition) started on 05/20/2024, with a dosage of 2 sachets per day, containing 10 g of collagen peptide, 3 g of L-arginine, 612 mg of vitamin A, 16 mg of vitamin E, 508 mg of vitamin C, 30 mcg of selenium, and 16 mg of zinc.

Results: On the 17th day of supplementation, the hydrophiber with silver dressing was replaced with PHMB-impregnated gauze, as the wound showed no signs of infection and demonstrated significant clinical improvement. Measurements: 8x6x2cm (WxLxD); moderate serohematic exudate; intact peripheral skin; 100% granulation tissue; significant improvement in pain and odor (Figure 2). On the 28th day, the dressing was switched to calcium and sodium alginate to optimize exudate control due to the appearance of mild dermatitis. Low-intensity laser therapy was applied, and a skin protective spray was used. Wound assessment: Measurements: 7x5.5x1.5 cm (WxLxD), with maintained characteristics (Figure 3). On the 56th day, the patient returned for dressing change and discharge instructions, as she could not continue the treatment due to a lack of resources. The approach remained the same with dressing changes every 3 days. Wound assessment: Measurements: 5x3.5x0.5 cm (WxLxD), with approximately 92% reduction in wound area, epithelialized margins, and maintained characteristics (Figure 4).

Conclusion: Nutritional intervention with specific nutrient supplementation can aid in the management of complex wounds, serving as a crucial tool in the healing process and contributing to reduced healing time.

Figure 1. Photo of the wound on the day of the initial assessment on 05/02/2024.

Figure 2. Photo of the wound after 17 days of supplementation on 06/06/2024.

Figure 3. Photo of the wound after 28 days of supplementation on 06/17/2024.

Figure 4. Photo of the wound after 56 days of supplementation on 07/15/2024.

Ludimila Ribeiro, RD, MSc1; Bárbara Gois, RD, PhD2; Ana Zanini, RD, MSc3; Hellin dos Santos, RD, MSc3; Ana Paula Celes, MBA3; Flávia Corgosinho, PhD2; Joao Mota, PhD4

1School of Nutrition, Federal University of Goiás, Goiania, Goias; 2School of Nutrition, Federal University of Goiás, Goiânia, Goias; 3Prodiet Medical Nutrition, Curitiba, Parana; 4Federal University of Goias, Goiania, Goias

Financial Support: None Reported.

Background: Postprandial blood glucose is considered an important risk factor for the development of macrovascular and microvascular diseases. Despite the use of hypoglycemic agents, patients with diabetes often experience postprandial hyperglycemia due to unbalanced meals. The aim of this study was to compare the effects on glycemic control of a low glycemic index formula used as a substitute for a standard breakfast in patients with type 2 diabetes.

Methods: This randomized, placebo-controlled, crossover study included 18 individuals with type 2 diabetes. Participants were instructed to consume, in random order, either a nutritional formula or a typical Brazilian breakfast with the same caloric content on three consecutive weekdays in different weeks. The nutritional formula (200 mL) provided 200 kcal, with 20 g of carbohydrates, 8.8 g of protein, 9.4 g of fat (MUFA: 5.6 g, PUFA: 2.0 g), and 3.0 g of fiber (DiamaxIG – Prodiet Medical Nutrition), serving as a breakfast substitute. Both the nutritional formula and the standard breakfast were provided to the participants. During the two weeks of intervention, participants used continuous glucose monitoring sensors (Libre 2). Weight and height were measured to calculate body mass index (BMI), and medication use was monitored.
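
As a hedged illustration of the postprandial outcome evaluated here, the snippet below computes an incremental area under the curve (iAUC) above the fasting baseline from CGM-style readings using the trapezoidal rule; the timestamps and glucose values are hypothetical, and the study's exact iAUC convention (for example, how dips below baseline are handled) is not stated in the abstract.

```python
# Incremental area under the curve (iAUC) above the pre-meal baseline,
# trapezoidal rule on hypothetical CGM readings after breakfast.
import numpy as np
from scipy.integrate import trapezoid

time_min = np.array([0, 15, 30, 45, 60, 90, 120])           # minutes after the meal
glucose = np.array([150, 175, 210, 220, 205, 180, 160.0])   # mg/dL (hypothetical)

baseline = glucose[0]                              # pre-meal value
increment = np.clip(glucose - baseline, 0, None)   # ignore dips below baseline

iauc = trapezoid(increment, time_min)              # units: mg/dL x min
print(f"iAUC = {iauc:.0f} mg/dL x min")
```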

Results: The sample consisted of 61% females, with a mean age of 50.28 ± 12.58 years. The average blood glucose level was 187.13 ± 77.98 mg/dL and BMI 29.67 ± 4.86 kg/m². All participants were taking metformin, and two were taking it concomitantly with insulin. There were no changes in medication doses or regimens during the study. The incremental area under the curve was significantly lower in the nutritional formula group compared to the standard breakfast group (2,794.02 ± 572.98 vs. 4,461.55 ± 2,815.73, p = 0.01).

Conclusion: The low glycemic index formula for glycemic control significantly reduced postprandial glycemic response compared to a standard Brazilian breakfast in patients with type 2 diabetes. These findings suggest that incorporating low glycemic index meals could be an effective strategy for better managing postprandial blood glucose levels in this population, which may help mitigate the risk of developing macro and microvascular complications.

Kirk Kerr, PhD1; Bjoern Schwander, PhD2; Dominique Williams, MD, MPH1

1Abbott Nutrition, Columbus, OH; 2AHEAD GmbH, Bietigheim-Bissingen, Baden-Wurttemberg

Financial Support: Abbott Nutrition.

Background: According to the World Health Organization, obesity is a leading risk factor for global noncommunicable diseases such as diabetes, heart disease, and cancer. Weight cycling, often defined as intentionally losing and unintentionally regaining weight, is observed in people with obesity and can have adverse health and economic consequences. This study develops the first model of the health economic consequences of weight cycling in individuals with obesity, defined by a body mass index (BMI) ≥ 30 kg/m².

Methods: A lifetime state-transition model (STM) with monthly cycles was developed to simulate a cohort of individuals with obesity, comparing “weight cyclers” versus “non-cyclers”. The simulated patient cohort was assumed to have an average BMI of 35.5, with 11% of patients having cardiovascular disease and 6% having type 2 diabetes. Key outcomes were the cost per obesity-associated event avoided, the cost per life-year (LY) gained, and the cost per quality-adjusted life year (QALY) gained, from a US societal perspective. Transition probabilities for obesity-associated diseases were informed by meta-analyses and were based on US disease-related base risks. Risks were adjusted by BMI-related, weight-cycling-related, and T2D-related relative risks (RR). BMI progression, health utilities, direct and indirect costs, and other population characteristics were informed by published US studies. Future costs and effects were discounted by 3% per year. Deterministic and probabilistic sensitivity analyses were performed to investigate the robustness of the results.
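
As a heavily hedged sketch of the general state-transition structure described above (not the published model), the code below runs a small monthly-cycle Markov cohort simulation with a 3% annual discount rate; the three health states, transition probabilities, costs, and utilities are hypothetical placeholders.

```python
# Minimal monthly-cycle Markov cohort sketch (hypothetical states and inputs).
import numpy as np

states = ["obesity_no_event", "obesity_post_event", "dead"]

# Hypothetical monthly transition probabilities (each row sums to 1).
P = np.array([
    [0.9960, 0.0030, 0.0010],   # no event -> {stay, event, death}
    [0.0000, 0.9950, 0.0050],   # post event -> {-, stay, death}
    [0.0000, 0.0000, 1.0000],   # dead is absorbing
])

monthly_cost = np.array([80.0, 450.0, 0.0])            # $ per state per month (hypothetical)
monthly_utility = np.array([0.80, 0.65, 0.0]) / 12.0   # QALYs accrued per month in state

annual_discount = 0.03
monthly_discount = (1 + annual_discount) ** (1 / 12) - 1

cohort = np.array([1.0, 0.0, 0.0])  # whole cohort starts event-free
total_cost = total_qaly = 0.0

for month in range(12 * 40):  # 40-year horizon as a stand-in for "lifetime"
    disc = 1 / (1 + monthly_discount) ** month
    total_cost += disc * cohort @ monthly_cost
    total_qaly += disc * cohort @ monthly_utility
    cohort = cohort @ P  # advance one monthly cycle

print(f"discounted cost per person:  ${total_cost:,.0f}")
print(f"discounted QALYs per person: {total_qaly:.3f}")
```

Running this structure twice, once with cycler-adjusted relative risks and once without, and differencing the discounted totals would reproduce the kind of incremental comparison reported below, under the stated assumptions.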

Results: Over a lifetime horizon, non-cyclers had 0.090 obesity-associated events avoided, 0.602 LYs gained, 0.518 QALYs gained, and approximately $4,592 lower total costs ($1,004 direct and $3,588 indirect) per person compared with weight cyclers. Sensitivity analysis showed the model was most responsive to changes in patient age and cardiovascular disease morbidity and mortality risks. Non-cycling as the cost-effective option was robust in sensitivity analyses.

Conclusion: The model shows that weight cycling has a major impact on the health of people with obesity, resulting in increased direct and indirect costs. The approach to chronic weight management programs should focus not only on weight reduction but also on weight maintenance to prevent the enhanced risks of weight cycling.

Avi Toiv, MD1; Arif Sarowar, MSc2; Hope O'Brien, BS2; Thomas Pietrowsky, MS, RD1; Nemie Beltran, RN1; Yakir Muszkat, MD1; Syed-Mohammad Jafri, MD1

1Henry Ford Hospital, Detroit, MI; 2Wayne State University School of Medicine, Detroit, MI

Financial Support: None Reported.

Background: Age is an important factor in transplant evaluation, as age at transplantation has historically been thought to influence outcomes in organ transplant recipients. There are limited data on the impact of age on intestinal (IT) and multivisceral (MVT) transplantation. This study investigates the impact of age on post-transplant outcomes in patients who received an intestinal or multivisceral (including intestine) transplant, comparing those under 40 years old to those aged 40 and above.

Methods: We conducted a retrospective chart review of all patients who underwent IT at an academic transplant center from 2010 to 2023. The primary outcomes were patient survival and graft failure, analyzed with Kaplan-Meier survival analysis.
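
A minimal sketch, assuming hypothetical follow-up data, of how survival in the two age groups could be compared with a log-rank test in Python's lifelines; none of these numbers come from the study cohort.

```python
# Log-rank comparison of survival between two age groups (hypothetical data).
from lifelines.statistics import logrank_test

# Follow-up time in months and death indicator for each group (hypothetical).
time_lt40 = [12, 36, 60, 84, 24, 48]
event_lt40 = [0, 1, 0, 0, 1, 0]
time_ge40 = [6, 30, 54, 72, 18, 90, 40]
event_ge40 = [1, 0, 1, 0, 1, 0, 0]

result = logrank_test(time_lt40, time_ge40,
                      event_observed_A=event_lt40,
                      event_observed_B=event_ge40)
print(f"log-rank p = {result.p_value:.3f}")
```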

Results: Among 50 IT recipients, there were 11 recipients < 40 years old and 39 recipients ≥40 years old (Table). The median age at transplant was 37 years (range, 17-39) in the <40 group and 54 years (range, 40-68) in the ≥40 group. In both groups, the majority of transplants were IT alone, although both groups also included MVT recipients. Kaplan-Meier survival analysis revealed no significant differences between the two age groups in graft survival or patient mortality. Reoperation within 1 month was significantly linked to decreased patient survival (p = 0.015) and decreased graft survival (p = 0.003), as was moderate to severe rejection within 1 month (p = 0.009), but neither was significantly different between the two age groups. Wilcoxon rank-sum testing showed no difference between groups with regard to reoperation or moderate to severe rejection at 1 or 3 months or the development of chronic kidney disease.

Conclusion: Age at the time of intestinal transplantation (< 40 vs. ≥40 years old) does not appear to significantly impact major transplant outcomes, such as patient mortality, graft survival, or rejection rates. While reoperation and moderate to severe rejection within 1 and 3 months negatively affected overall outcomes, these complications were not more frequent in older or younger patients.

Table 1. Demographic Characteristics of Intestinal Transplant Recipients.

BMI, body mass index; TPN, total parenteral nutrition.

International Poster of Distinction

Gabriela de Oliveira Lemos, MD1; Natasha Mendonça Machado, PhD2; Raquel Torrinhas, PhD3; Dan Linetzky Waitzberg, PhD3

1University of Sao Paulo School of Medicine, Brasília, Distrito Federal; 2University of Sao Paulo School of Medicine, São Paulo; 3Faculty of Medicine of the University of São Paulo, São Paulo

Encore Poster

Presentation: Ganepão 2023.

Publication: Braspen Journal. ISSN 2764-1546 | Online Version.

Financial Support: This study is linked to project no. 2011/09612-3 and was funded by the Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP).

Background: Sphingolipids (SLs) are key molecules in cell signaling and play a central role in lipotoxicity. Moreover, they are major constituents of eukaryotic cell membranes. Data on SL remodeling in the gastrointestinal tract after Roux-en-Y gastric bypass (RYGB) are lacking and may help to elucidate tissue turnover and metabolism. This protocol aims to evaluate the SL profile and remodeling within the plasma and different segments of the gastrointestinal tract (GIT) before and 3 months after RYGB in a population of women with obesity and type 2 diabetes mellitus (T2DM) and to correlate the changes across these tissues. This investigation is part of the SURMetaGIT study, registered at www.clinicalTrials.gov (NCT01251016).

Methods: Twenty-eight women with obesity and T2DM who underwent RYGB were enrolled in this protocol. Those on insulin therapy were excluded. We collected plasma (n = 28) and intestinal samples from the gastric pouch (n = 9), duodenum (n = 8), jejunum (n = 9), and ileum (n = 9) for untargeted metabolomics analysis at baseline and 3 months post-surgery. India ink (SPOT®) was used to mark the sites for the follow-up biopsies. SLs were identified using high-performance liquid chromatography coupled to mass spectrometry. Data were processed and analyzed using AnalysisBaseFileConverter and MS-DIAL, respectively. The magnitude of SL changes after RYGB was assessed by the fold change (log2 of the post-surgery mean/pre-surgery mean). The Spearman test was used for the correlation analysis. A p value < 0.05 was considered significant. Statistics were carried out in Jamovi software (2.2.5) and MetaboAnalyst 5.0.
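
As a small worked illustration of the two calculations named above, the snippet below computes a log2 fold change from pre- and post-surgery means and a Spearman correlation with scipy; all abundance values are hypothetical.

```python
# Fold change (log2 of post/pre means) and Spearman correlation on hypothetical values.
import numpy as np
from scipy.stats import spearmanr

pre_surgery  = np.array([4.1, 3.8, 5.0, 4.4, 3.9])   # one lipid's abundance at baseline
post_surgery = np.array([2.2, 2.5, 2.1, 2.8, 2.4])   # same lipid, 3 months after RYGB

fold_change = np.log2(post_surgery.mean() / pre_surgery.mean())
print(f"log2 fold change = {fold_change:.2f}")

# Correlation of a plasma lipid with a jejunal lipid across patients (hypothetical pairs).
plasma_lipid  = [1.2, 0.8, 1.5, 1.1, 0.9, 1.4]
jejunum_lipid = [0.4, 0.9, 0.2, 0.5, 0.8, 0.3]
rho, p = spearmanr(plasma_lipid, jejunum_lipid)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```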

Results: Thirty-four SLs were identified, including sphingomyelins (SM), ceramides (Cer), and glycosphingolipids (GlcSL). SM was the most common SL class found in the plasma (27 SM, 3 Cer, and 4 GlcSL) and in the GIT (16 SM, 13 Cer, and 5 GlcSL). Each GIT tissue presented distinct SL remodeling. The plasma and the jejunum best discriminated SL changes following surgery (Figure 1). The jejunum expressed the most robust changes, followed by the plasma and the duodenum (Figure 2). Figure 3 presents the heatmap of the plasma and GIT tissues. Correlation analysis showed that plasma SLs, particularly SM(d32:0), GlcCer(d42:1), and SM(d36:1), strongly correlated with jejunal SLs. These lipids showed a strong negative correlation with jejunal sphingomyelins but a strong positive correlation with jejunal ceramides (Table 1).

Conclusion: RYGB was associated with SL remodeling in the plasma and GIT. SM was the main SL class found in the plasma and the GIT. The most robust changes occurred in the jejunum and the plasma, and these two compartments showed the most relevant correlations. Considering our findings, the role of SM in metabolic changes after RYGB should be investigated.

Table 1. Correlation Analysis of Sphingolipids from the plasma with the Sphingolipids from the Gastrointestinal Tract.

*p < 0.05; **p < 0.01; ***p < 0.001.

The green circles represent samples at baseline and the red circles represent samples 3 months after RYGB.

Figure 1. Principal Component Analysis (PCA) from GIT Tissues and Plasma.

Fold change = log2(post-surgery mean / pre-surgery mean).

Figure 2. Fold Change of Sphingolipids from the Plasma and Gastrointestinal Tract.

The map under the top-right green box represents lipid abundance before surgery, and the map under the top-left red box represents lipid abundance after RYGB.

Figure 3. Heatmap of Sphingolipids from the Plasma and Gastrointestinal Tract.

Lucas Santander1; Gabriela de Oliveira Lemos, MD2; Daiane Mancuzo3; Natasha Mendonça Machado, PhD4; Raquel Torrinhas, PhD5; Dan Linetzky Waitzberg, PhD5

1Universidade Santo Amaro (Santo Amaro University), São Bernardo Do Campo, Sao Paulo; 2University of Sao Paulo School of Medicine, Brasília, Distrito Federal; 3Universidade São Caetano (São Caetano University), São Caetano do Sul, Sao Paulo; 4University of Sao Paulo School of Medicine, São Paulo; 5Faculty of Medicine of the University of São Paulo, São Paulo

Financial Support: Fundação de Amparo a Pesquisa do Estado de São Paulo.

Background: Microalbuminuria (MAL) is an early biomarker of kidney injury, linked to conditions like hypertension, type 2 diabetes (T2DM), and obesity. It is also associated with higher cardiovascular (CV) risk. This protocol examines the impact of Roux-en-Y gastric bypass (RYGB) on MAL and CV markers in patients with obesity, T2DM, and MAL.

Methods: Eight women with grade II-III obesity, T2DM, and MAL who underwent RYGB were included. Patients on insulin therapy were excluded. MAL was defined as a urinary albumin-to-creatinine ratio > 30 mg/g. MAL, glycemic, and lipid serum biomarkers were measured at baseline and three months post-surgery. Systolic (SBP) and diastolic (DBP) blood pressures and medical treatments for T2DM and hypertension were also assessed. T2DM remission was defined by ADA 2021 criteria. Categorical variables were reported as absolute and relative frequencies, while continuous variables were expressed as median and IQR based on normality tests. Intragroup and intergroup comparisons of numeric data were conducted using the Wilcoxon and Mann-Whitney tests, respectively. Fisher's exact test was performed when necessary to compare dichotomous variables. Data were analyzed in JASP software version 0.18.1.0.

Results: Overall, RYGB was associated with weight loss, improved body composition, and better CV markers (Table 1). Post-surgery, MAL decreased by at least 70% in all patients and resolved in half. All patients with MAL resolution had pre-surgery levels ≤ 100 mg/g. Those without resolution had severe pre-surgery MAL (33.8 vs. 667.5, p = 0.029) and higher SBP (193 vs. 149.5, p = 0.029) and DBP (138 vs. 98, p = 0.025). Blood pressure decreased after surgery but remained higher in patients without MAL resolution: SBP (156.0 vs. 129.2, p = 0.069) and DBP (109.5 vs. 76.5, p < 0.001). MAL resolution was not linked to T2DM remission at 3 months (75% vs. 50%, p = 1.0). One patient had worsened MAL (193.0 vs. 386.9 mg/g) after RYGB. Glomerular filtration rate (GFR) tended to increase post-surgery only in the group with MAL resolution (95.4 vs. 108.2 ml/min/1.73 m², p = 0.089), compared to the group without MAL resolution (79.2 vs. 73.7 ml/min/1.73 m², p = 0.6).

Conclusion: RYGB effectively reduced markers of renal dysfunction and cardiovascular risk in this cohort. Patients showed a decrease in MAL, with resolution in half of the patients. The small sample size and short follow-up period may have limited the observed impact of surgery on renal function. Future studies with larger cohorts and longer follow-up are needed to better understand the effects of bariatric surgery on MAL and its relation to other CV markers.

Table 1. Biochemical and Clinical Data Analysis Following RYGB.

eGFR: estimated glomerular filtration rate; HbA1c: glycated hemoglobin; HDL-c: high-density lipoprotein cholesterol; HOMA-BETA: beta-cell function by the homeostasis model; HOMA-IR: Homeostasis Model Assessment of Insulin Resistance; LDL-c: low-density lipoprotein cholesterol; Non-HDL-c: non-high-density lipoprotein cholesterol; VLDL-c: very-low-density lipoprotein cholesterol; DBP: diastolic blood pressure; SBP: systolic blood pressure; WC: waist circumference.

Michelle Nguyen, BSc, MSc1; Johane P Allard, MD, FRCPC2; Dane Christina Daoud, MD3; Maitreyi Raman, MD, MSc4; Jennifer Jin, MD, FRCPC5; Leah Gramlich, MD6; Jessica Weiss, MSc1; Johnny H. Chen, PhD7; Lidia Demchyshyn, PhD8

1Pentavere Research Group Inc., Toronto, ON; 2Division of Gastroenterology, Department of Medicine, Toronto General Hospital, Toronto, ON; 3Division of Gastroenterology, Centre Hospitalier de l'Université de Montréal (CHUM), Department of Medicine, University of Montreal, Montreal, QC; 4Division of Gastroenterology, University of Calgary, Calgary, AB; 5Department of Medicine, University of Alberta, Division of Gastroenterology, Royal Alexandra Hospital, Edmonton, AB; 6Division of Gastroenterology, Faculty of Medicine and Dentistry, University of Alberta, Edmonton, AB; 7Takeda Canada Inc., Vancouver, BC; 8Takeda Canada Inc., Toronto, ON

Encore Poster

Presentation: 46th European Society for Clinical Nutrition and Metabolism (ESPEN) Congress. 7-10 September 2024, Milan, Italy.

Financial Support: Funding of this study is from Takeda Canada Inc.

Background: Teduglutide is indicated for the treatment of patients with short bowel syndrome (SBS) who are dependent on parenteral support (PS). This study used real-world evidence to evaluate the longer-term effectiveness and safety of teduglutide in Canadian patients with SBS who are dependent on PS.

Methods: This was an observational, retrospective study using data from the national Canadian Takeda patient support program, and it included adults with SBS. Data were collected from 6 months before teduglutide initiation and from initiation to Dec-01-2023, death, or loss to follow-up. Descriptive statistics characterized the population and treatment-emergent adverse events (TEAEs). Changes in parenteral nutrition/intravenous fluid supply (PN/IV) were assessed based on decreases in PN/IV volume from baseline. Statistical significance was set at p < 0.05.
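
A minimal sketch of the volume-reduction calculation implied by this endpoint: percentage reduction in weekly PN/IV volume from baseline and a ≥20% responder flag; the example volumes are hypothetical.

```python
# Percentage reduction in weekly PN/IV volume from baseline (hypothetical volumes).
def pn_iv_reduction(baseline_ml_week: float, followup_ml_week: float) -> tuple[float, bool]:
    """Return (% reduction from baseline, whether the >=20% responder threshold is met)."""
    absolute = baseline_ml_week - followup_ml_week
    percent = 100.0 * absolute / baseline_ml_week
    return percent, percent >= 20.0

percent, responder = pn_iv_reduction(baseline_ml_week=14_000, followup_ml_week=10_500)
print(f"reduction = {percent:.1f}%, >=20% responder: {responder}")
```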

Results: Fifty-two patients (60% women) were included in this study. Median age (range) was 54 years (22–81), and 50% had Crohn's disease as the etiology of their SBS. At 6 months, the median (range) absolute and percentage reductions from baseline in PN/IV volume were 3,900 mL/week (-6,960–26,784; p < 0.001) and 28.1% (-82.9–100). At 24 months, the median (range) absolute reduction from baseline was 6,650 mL/week (-4,400–26,850; p = 0.003), and the proportion of patients who achieved ≥20% reduction in weekly PN/IV volume was 66.7%. Over the study, 27% achieved independence from PN/IV. TEAEs were reported in 51 (98%) patients (83% were serious TEAEs) during the study period; the 3 most common were weight changes, diarrhea, and fatigue.

Conclusion: Patients showed significant decreases in PN/IV volumes after initiating teduglutide, with no unexpected safety findings. This study demonstrates the real-world, longer-term effectiveness and safety of teduglutide in Canadian patients with SBS, complementing previous clinical trials and real-world studies.

Poster of Distinction

Sarah Carter, RD, LDN, CNSC1; Ruth Fisher, RDN, LD, CNSC2

1Coram CVS/Specialty Infusion Services, Tullahoma, TN; 2Coram CVS/Specialty Infusion Services, Saint Hilaire, MN

Financial Support: None Reported.

Background: Perceived benefit is one factor determining therapy continuation. Little information is published regarding positive outcomes for patients receiving the GLP-2 analog teduglutide apart from rates of successfully weaning HPN and hydration volumes by 20%. Patients who are new to therapy may question how long after initiation they should expect to see results. Patients may collaborate with prescribers to improve therapy tolerance if they have hope for improvements in their quality of life. This data analysis provides details regarding patients receiving teduglutide and their perceived benefits of therapy.

Methods: Dietitians interview patients receiving teduglutide as part of a service agreement, monitoring persistency and helping to eliminate barriers to therapy. Dietitians make weekly and monthly calls based on patients’ drug start dates and document interventions in flowsheets in patients’ electronic medical records. The interventions are published to data visualization software to monitor compliance with dietitian outreach as part of a quality improvement project. All patients were diagnosed with short bowel syndrome, but baseline characteristics were not exported to this dashboard. This study is a retrospective analysis of the existing dashboard using 3 years of assessments (May 1, 2021-April 30, 2024). Exclusion criteria included therapy dispensed from a pharmacy not using the company's newest computer platform and patients who did not respond to outreach. We analyzed the time on therapy before a positive outcome was reported by the patient to the dietitian and which positive outcomes were most frequently reported.

Results: The data set included 336 patients with 2,509 phone assessments. The most frequently reported first positive outcome was improved ostomy output/less diarrhea (72%, n = 243), occurring just after a month on therapy (mean 31 ± 26.4 days). The mean time to first positive outcome for all patients who reported one was 32 ± 28.5 days (n = 314). Of the 22 patients who reported no positive outcome, 13 did not answer the dietitians’ calls after initial contact. A summary is listed in Table 1. Overall, the positive outcomes reported were improved ostomy output/less diarrhea (87%, n = 292), weight gain (70%, n = 236), improved appetite/interest in food (33%, n = 112), feeling stronger/more energetic (27%, n = 92), improved quality of life (23%, n = 90), improved lab results (13%, n = 45), and fewer antidiarrheal medications (12%, n = 40). Of the 218 patients receiving parenteral support, 44 patients stopped hydration and HPN completely (20%), and another 92 patients reported less time or fewer days on hydration and HPN (42%), for a total of 136 patients experiencing a positive outcome of parenteral support weaning (62%). Patients reported improvements in other areas of their lives, including fewer hospitalizations (n = 39), being able to travel (n = 35), tolerating more enteral nutrition volume (n = 19), returning to work/school (n = 14), and improved sleep (n = 13). A summary is diagrammed in Figure 1.

Conclusion: This retrospective analysis indicates that teduglutide is associated with improved symptom control and improved quality-of-life measures, with most patients seeing a response to therapy within the first 2 months. A decrease in ostomy output and diarrhea was the most frequently recognized response to therapy. In addition to the goal of weaning parenteral support, clinicians should be cognizant of improvements in patients’ clinical status that can have a significant impact on quality of life.

Table 1. Timing of First Reported Positive Outcome by Patients Receiving Teduglutide.

Figure 1. Total Positive Outcomes Reported by Patients (n = 336).

Poster of Distinction

Jennifer Cholewka, RD, CNSC, CDCES, CDN1; Jeffrey Mechanick, MD1

1The Mount Sinai Hospital, New York, NY

Financial Support: None Reported.

Background: Bariatric surgery is a guideline-directed intervention for patients with severe obesity. Adherence to post-operative recommendations is variable and consequent undernutrition complicated by multiple micronutrient deficiencies is prevalent. A post-bariatric surgery syndrome (PBSS) is not well defined and therefore poorly understood, though early detection and intervention would likely decrease clinical and economic burdens. Our current experience with PBSS is presented here as a case series. We will define PBSS based on clinical evidence related to risk factors, interventions, and clinical/biochemical responses.

Methods: Twenty-four consecutive patients referred to our metabolic support service were identified between January 1, 2019 and December 31, 2023 who were admitted to The Mount Sinai Hospital in New York City with a history of RYGB (Roux-en-Y gastric bypass) or BPDDS (biliopancreatic diversion with duodenal switch) and found to have failure to thrive, undernutrition, clinical/biochemical features of at least one micronutrient deficiency, and indications for parenteral nutrition. Patients were excluded if they had surgical complications or were not treated with parenteral nutrition. Fifteen patients were included in this case series and de-identified prior to data collection using the electronic health record (EPIC) and coded data collection sheets. Descriptive statistical methods were used to analyze parametric variables as mean ± standard deviation and non-parametric variables as median (interquartile range).

Results: Results are provided in Table 1.

Conclusion: PBSS is defined by significant decompensation following a bariatric surgery procedure with a malabsorptive component, characterized by failure to thrive, hypoalbuminemia, multiple micronutrient deficiencies, and the need for parenteral nutrition. Major risk factors include inadequate protein and micronutrient intake due to either unawareness (e.g., not recommended) or poor adherence, significant alcohol consumption, or a complicating medical/surgical condition. Parenteral nutrition was safe in this population; formulation prioritizes adequate nitrogen, nonprotein calories, and micronutrition. Further analyses on risk factors, responses to therapy, and the role of a multidisciplinary team are in progress.

Table 1. Risks/Presentation.

Table 2. Responses to Parenteral Nutrition Intervention.

Holly Estes-Doetsch, MS, RDN, LD1; Aimee Gershberg, RD, CDN, CPT2; Megan Smetana, PharmD, BCPS, BCTXP3; Lindsay Sobotka, DO3; Kristen Roberts, PhD, RDN, LD, CNSC, FASPEN, FAND4

1The Ohio State University, Columbus, OH; 2NYC Health + Hospitals, New York City, NY; 3The Ohio State University Wexner Medical Center, Columbus, OH; 4The Ohio State University, Granville, OH

Financial Support: None Reported.

Background: Decompensated cirrhosis increases the risk of fat maldigestion via altered bile synthesis and excretion through the bile canaliculi. Maldigestion increases the risk of vitamin and mineral deficiencies, which, when untreated, contribute to consequential health issues such as metabolic bone disease, xerophthalmia, and hyperkeratosis. Comprehensive guidelines for the prevention and treatment of these deficiencies are lacking.

Methods: Medical and surgical history, anthropometrics, medications and nutritional supplements, laboratory data, and medical procedures were extracted from the electronic medical record and analyzed.

Results: A patient with congenital biliary atresia and decompensated cirrhosis was seen in a hepatology outpatient clinic. Biochemical assessment revealed severe vitamin A deficiency and suboptimal vitamin D and zinc status. Physical assessment indicated telogen effluvium and transient blurry vision. Despite a history of high-dose oral retinyl acetate ranging from 10,000-50,000 units daily, a 3-day course of 100,000 units via intramuscular injection, and co-treatment of zinc deficiency to ensure adequate circulating retinol binding protein, normalization of serum retinol was not achieved over the preceding 10 years. The patient's serum vitamin A level normalized following liver transplantation.

Conclusion: In decompensated cirrhosis, there is a lack of sufficient guidance for micronutrient dosing when traditional treatment strategies are unsuccessful. Furthermore, altered secretion of transport proteins due to underlying liver dysfunction may pose challenges in evaluating laboratory markers of micronutrient status. Collaboration with pharmacy and medicine supports a thorough assessment and the establishment of a safe treatment and monitoring plan. Clinical research is needed to identify acceptable and safe dosing strategies for patients with chronic, unresponsive fat-soluble vitamin deficiencies.

Gang Wang, PhD1

1Nimble Science, Calgary, AB

Financial Support: This work was supported by the National Research Council of Canada Industrial Research Assistance Program (NRC IRAP), an Alberta Innovate Accelerating Innovations into CarE (AICE) grant, and partially by a Clinical Problems Incubator Grant from the Snyder Institute for Chronic Disease at the University of Calgary.

Background: The small intestine (SI) microbiome plays a crucial role in nutrient absorption, and emerging evidence indicates that fecal content is insufficient to represent the SI microenvironment. Endoscopic sampling is possible but expensive and not scalable. Ingestible sampling capsule technologies are emerging; however, potential contamination remains a major limitation of these devices.

Methods: We previously reported the Small Intestinal MicroBiome Aspiration (SIMBA) capsule as an effective means of sampling, sealing, and preserving SI luminal contents for 16S rRNA gene sequencing analysis. A subset of the DNA samples, including SI samples collected by SIMBA capsules (CAP) and the matched saliva (SAL), fecal (FEC), duodenal endoscopic aspirate (ASP), and brush (BRU) samples from 16 participants recruited for an observational clinical validation study, were sent for shotgun metagenomic sequencing. The aims were 1) to compare the sampling performance of the capsule (CAP) with endoscopic aspirates (ASP) and with 850 small intestine, large intestine, and fecal samples from the Clinical Microbiomics data warehouse (PRJEB28097), and 2) to characterize samples from the four different sampling sites in terms of species composition and functional potential.

Results: 4/80 samples (1/16 SAL, 2/16 ASP, 0/16 BRU, 1/16 CAP and 0/16 FEC) failed library preparation, and 76 samples were shotgun sequenced (average of 38.5 M read pairs per sample) (Figure 1). Quality assessment demonstrated that, despite the low raw DNA yield of CAP samples, they retained a minimal level of host contamination in comparison to ASP and BRU samples (mean 5.27% vs. 93.09%-96.44% of average total reads per sample) (Figure 2). CAP samples shared the majority of species with ASP samples, as well as a large fraction of the species detected in terminal ileum samples. ASP and CAP sample composition was more similar to duodenum, jejunum, and saliva samples and very different from large intestine and stool samples. Functional genomics further revealed GI region-specific differences: in both ASP and CAP samples we detected a number of Gut Metabolic Modules (GMMs) for carbohydrate digestion and short-chain fatty acids. However, probiotic species, and species and genes involved in bile acid metabolism, were mainly prevalent in CAP and FEC samples and could not be detected in ASP samples.
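
As a hedged sketch of two of the metrics reported above, the snippet below computes a per-sample host-read fraction and converts species read counts to relative abundances; the read counts and sample names are hypothetical, and this is not the pipeline used in the study.

```python
# Host-read fraction and species relative abundance from hypothetical read counts.
import pandas as pd

qc = pd.DataFrame({
    "sample":      ["CAP_01", "ASP_01", "BRU_01", "FEC_01"],
    "total_reads": [38_500_000, 40_100_000, 36_200_000, 39_000_000],
    "host_reads":  [2_000_000, 37_500_000, 34_900_000, 150_000],
})
qc["host_fraction_pct"] = 100 * qc["host_reads"] / qc["total_reads"]
print(qc[["sample", "host_fraction_pct"]])

# Species counts (rows = species, columns = samples) -> relative abundance per sample.
counts = pd.DataFrame(
    {"CAP_01": [120, 80, 0], "FEC_01": [10, 300, 90]},
    index=["Streptococcus salivarius", "Bacteroides fragilis", "Escherichia coli"],
)
rel_abund = counts / counts.sum(axis=0)   # each column sums to 1
print(rel_abund)
```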

Conclusion: The CAP and ASP microbiomes are compositionally similar despite the high level of host contamination in ASP samples. CAP samples appear to be of better quality than ASP for revealing GI region-specific functional potential. This analysis demonstrates the potential of the SIMBA capsule for unveiling the SI microbiome and supports its prospective use in observational and interventional studies investigating the impact of short-term and long-term biotic food interventions on the gut microbiome (Figure 3). A series of studies utilizing the SIMBA capsule is under way, and the detectability of biotic food intervention effects will be reported in the near future (Table 1).

Table 1. List of Ongoing Observational and Interventional Clinical Studies using SIMBA Capsule.

Figure 1. Shotgun Metagenomic Sequencing: Taxonomy Overview (Relative Abundance of the 10 Most Abundant Species by Sampling Site).

Figure 2. Shotgun Metagenomic Sequencing: High-Quality Non-Host Reads by Sampling Site.

Figure 3. Short-term and Long-term Interventional Study Protocols Using SIMBA Capsules.

Darius Bazimya, MSc. Nutrition, RN1; Francine Mwitende, RN1; Theogene Uwizeyimana, Phn1

1University of Global Health Equity, Kigali

Financial Support: University of Global Health Equity.

Background: Rwanda is currently grappling with the double burden of malnutrition and rising obesity, particularly in urban populations. As the country experiences rapid urbanization and dietary shifts, traditional issues of undernutrition coexist with increasing rates of obesity and related metabolic disorders. This study aims to investigate the relationship between these nutrition-related issues and their impact on gastrointestinal (GI) health and metabolic outcomes in urban Rwandan populations.

Methods: A cross-sectional study was conducted in Kigali, Rwanda's capital, involving 1,200 adult participants aged 18 to 65. Data were collected on dietary intake, body mass index (BMI), GI symptoms, and metabolic markers such as fasting glucose, cholesterol levels, and liver enzymes. Participants were categorized into three groups: undernourished, normal weight, and overweight/obese based on their BMI. GI symptoms were assessed using a validated questionnaire, and metabolic markers were evaluated through blood tests. Statistical analyses were performed to assess correlations between dietary patterns, BMI categories, and GI/metabolic health outcomes.
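
A minimal sketch of the BMI grouping step described above, assuming standard WHO cut-offs (< 18.5, 18.5-24.9, ≥ 25.0 kg/m²), which the abstract does not state explicitly; the helper functions and the example participant are illustrative only.

```python
# BMI calculation and group assignment (WHO cut-offs assumed; not stated in the abstract).
def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2

def bmi_group(bmi_value: float) -> str:
    if bmi_value < 18.5:
        return "undernourished"
    if bmi_value < 25.0:
        return "normal weight"
    return "overweight/obese"

value = bmi(weight_kg=58.0, height_m=1.70)   # hypothetical participant
print(f"BMI = {value:.1f} kg/m^2 -> {bmi_group(value)}")
```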

Results: The study found that 25% of participants were classified as undernourished, while 22% were obese, reflecting the double burden of malnutrition and rising obesity in Rwanda's urban population. Among obese participants, 40% exhibited elevated fasting glucose levels (p < 0.01), and 30% reported significant GI disturbances, such as irritable bowel syndrome (IBS) and non-alcoholic fatty liver disease (NAFLD). In contrast, undernourished individuals reported fewer GI symptoms but showed a higher prevalence of micronutrient deficiencies, including anaemia (28%) and vitamin A deficiency (15%). Dietary patterns characterized by high-fat and low-fiber intake were significantly associated with increased GI disorders and metabolic dysfunction in both obese and normal-weight participants (p < 0.05).

Conclusion: This study highlights the growing public health challenge posed by the coexistence of undernutrition and obesity in Rwanda's urban centres. The dietary shifts associated with urbanization are contributing to both ends of the nutritional spectrum, adversely affecting GI and metabolic health. Addressing these issues requires comprehensive nutrition interventions that consider the dual challenges of undernutrition and obesity, promoting balanced diets and improving access to health services. These findings have important implications for nutrition therapy and metabolic support practices in Rwanda, emphasizing the need for tailored interventions that reflect the country's unique nutritional landscape.

Levi Teigen, PhD, RD1; Nataliia Kuchma, MD2; Hijab Zehra, BS1; Annie Lin, PhD, RD3; Sharon Lopez, BS2; Amanda Kabage, MS2; Monika Fischer, MD4; Alexander Khoruts, MD2

1University of Minnesota, St. Paul, MN; 2University of Minnesota, Minneapolis, MN; 3University of Minnesota, Austin, MN; 4Indiana University School of Medicine, Indianapolis, IN

Financial Support: Achieving Cures Together.

Background: Fecal microbiota transplantation (FMT) is a highly effective treatment for recurrent Clostridioides difficile infection (rCDI). The procedure results in repair of the gut microbiota following severe antibiotic injury. Recovery from rCDI is associated with a high incidence of post-infection irritable bowel syndrome (IBS). In addition, older age, medical comorbidities, and prolonged diarrheal illness contribute to frailty in this patient population. The effect of FMT on IBS symptoms and frailty in patients with rCDI is largely unknown. In this prospective cohort study, we collected IBS symptom and frailty data over the 3 months following FMT treatment for rCDI in patients at two large academic medical centers.

Methods: Consenting adults who underwent FMT treatment for rCDI were enrolled in this study (n = 113). We excluded patients who developed a recurrence of CDI within the 3-month follow-up period (n = 15) or who had incomplete IBS symptom severity scale (IBS-SSS) scores at any timepoint. The IBS-SSS is a 5-item survey measuring symptom intensity, with scores ranging from 0 to 500 and higher scores representing greater severity. IBS-SSS was collected at baseline, 1 week post-FMT, 1 month post-FMT, and 3 months post-FMT. Frailty was assessed at baseline and 3 months using the FRAIL scale (categorical variable: “Robust Health”, “Pre-Frail”, “Frail”). The Kruskal-Wallis test was used to compare IBS-SSS across timepoints. Post-hoc analysis was performed with pairwise Wilcoxon rank-sum tests using the false discovery rate adjustment method. The Friedman test was used to compare frailty distribution between the baseline and 3-month timepoints.
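
As a hedged sketch of the statistical workflow described above, the snippet below runs a Kruskal-Wallis test across timepoints, post-hoc pairwise rank-sum tests with a false discovery rate adjustment, and a Friedman test across paired timepoints; all scores are hypothetical and far fewer in number than the study sample.

```python
# Sketch of the testing workflow (hypothetical IBS-SSS scores, not study data).
from itertools import combinations
from scipy.stats import kruskal, mannwhitneyu, friedmanchisquare
from statsmodels.stats.multitest import multipletests

scores = {
    "baseline": [134, 220, 90, 310, 150, 200],
    "1 week":   [65, 180, 40, 150, 90, 120],
    "1 month":  [70, 160, 55, 140, 85, 110],
    "3 months": [60, 150, 50, 130, 80, 100],
}

# Omnibus comparison across all timepoints.
h_stat, p_omnibus = kruskal(*scores.values())
print(f"Kruskal-Wallis p = {p_omnibus:.4f}")

# Post-hoc pairwise rank-sum tests with FDR (Benjamini-Hochberg) adjustment.
pairs = list(combinations(scores, 2))
raw_p = [mannwhitneyu(scores[a], scores[b]).pvalue for a, b in pairs]
reject, adj_p, _, _ = multipletests(raw_p, method="fdr_bh")
for (a, b), p in zip(pairs, adj_p):
    print(f"{a} vs {b}: adjusted p = {p:.3f}")

# Friedman test for paired measurements across timepoints (illustrative here;
# the study applies it to the frailty category distribution).
stat, p_friedman = friedmanchisquare(*scores.values())
print(f"Friedman p = {p_friedman:.4f}")
```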

Results: Mean age of the cohort was 63.3 (SD 15.4) years, and 75% of patients were female (total n = 58 patients). The IBS-SSS scores across timepoints are presented in Table 1 and Figure 1. The median IBS-SSS score at baseline was 134 [IQR 121], which decreased to a median score of 65 [IQR 174] at 1 week post-FMT. The baseline timepoint was found to differ from the 1-week, 1-month, and 3-month timepoints (p < 0.05). No other differences between timepoints were observed. Frailty was assessed at baseline and 3 months (total n = 52). At baseline, 71% of patients (n = 37) were considered Pre-Frail or Frail, but this percentage decreased to 46% (n = 24) at 3 months (Table 2; p < 0.05).

Conclusion: Findings from this multicenter, prospective cohort study indicate an overall improvement in both IBS symptoms and frailty in the 3-months following FMT therapy for rCDI. Notably, IBS symptom scores were found to improve by 1-week post FMT. Further research is required to understand what predicts IBS symptom improvement following FMT and if nutrition therapy can help support further improvement. It will also be important to further understand how nutrition therapy can help support the improvement in frailty status observed following FMT for rCDI.

Table 1. Distribution of IBS-SSS Scores at Baseline and Following FMT.

Table 2. Frailty Distribution Assessed by FRAIL Scale at Baseline and 3-Months Post-FMT.

Box-plot distributions of IBS-SSS scores across timepoints. The median IBS-SSS score at baseline was 134 and decreased to a median score of 65 at 1 week post-FMT. The baseline timepoint was found to differ from the 1-week, 1-month, and 3-month timepoints (p < 0.05).

Figure 1. Distribution of IBS-SSS Scores by Timepoint.

Oshin Khan, BS1; Subanandhini Subramaniam Parameshwari, MD2; Kristen Heitman, PhD, RDN1; Kebire Gofar, MD, MPH2; Kristin Goheen, BS, RDN1; Gabrielle Vanhouwe, BS1; Lydia Forsthoefel, BS1; Mahima Vijaybhai Vyas2; Saranya Arumugam, MBBS2; Peter Madril, MS, RDN1; Praveen Goday, MBBS3; Thangam Venkatesan, MD2; Kristen Roberts, PhD, RDN, LD, CNSC, FASPEN, FAND4

1The Ohio State University, Columbus, OH; 2The Ohio State University Wexner Medical Center, Columbus, OH; 3Nationwide Children's Hospital, Columbus, OH; 4The Ohio State University, Granville, OH

Financial Support: None Reported.

Background: Cyclic vomiting syndrome (CVS) is a disorder of gut-brain interaction characterized by recurrent spells of nausea, vomiting, and abdominal pain lasting hours to days. Estimated prevalence is approximately 2% in adults and children. Due to recalcitrant symptoms, patients might develop disordered eating patterns, though data are lacking. Identifying patients at risk for disordered eating patterns and malnutrition can help optimize nutritional status and may improve overall patient outcomes. The purpose of this study is to define the nutrient intakes and dietary patterns of those with CVS seeking care at a tertiary care referral clinic, and to establish variation in dietary intakes based on disease severity.

Methods: In this ongoing, cross-sectional study of adults diagnosed with CVS based on Rome IV criteria, participants are asked to complete validated surveys including a food frequency questionnaire (FFQ). Baseline demographics and clinical characteristics, including disease severity (defined by the number of episodes per year), were ascertained. Healthy Eating Index (HEI) scores (scale of 0-100) were calculated to assess diet quality, with higher scores indicating better diet quality. Those with complete data were included in this interim analysis.

Results: Data from 33 participants with an average age of 40 ± 16 years and an average BMI of 28.6 ± 7.9 are presented. The cohort was predominantly female (67%), white (79%), and with moderate to severe disease (76%). The malnutrition screening tool indicated that 42% of participants were at risk of malnutrition, independent of BMI status (p = 0.358) and disease severity (p = 0.074). HEI scores were poor among those with CVS (55) and did not differ based on disease severity (58 vs 54; p = 0.452). Energy intakes varied widely, ranging from 416 to 3,974 kcal/day, with a median intake of 1,562 kcal/day.

Conclusion: In CVS, dietary intake is poor and there is a high risk of malnutrition regardless of disease severity and BMI. Providers and registered dietitian nutritionists must be aware of the high rates of malnutrition risk and poor dietary intakes in this patient population to improve delivery of dietary interventions. Insight into disordered eating and metabolic derangements may improve the understanding of dietary intakes in CVS.

Hannah Huey, MDN1; Holly Estes-Doetsch, MS, RDN, LD2; Christopher Taylor, PhD, RDN2; Kristen Roberts, PhD, RDN, LD, CNSC, FASPEN, FAND3

1Nationwide Children's Hospital, Columbus, OH; 2The Ohio State University, Columbus, OH; 3The Ohio State University, Granville, OH

Financial Support: None Reported.

Background: Micronutrient deficiencies are common in Crohn's disease (CD) due to poor dietary absorption. Pregnancy increases nutritional demand, and when coupled with a malabsorptive condition like CD, clinicians must closely monitor micronutrient status. However, there is a lack of evidence-based guidelines for managing these complex patients, leaving clinicians to rely on clinical judgement. We present a case of a pregnant female with CD who presented for delivery with an undetected fat-soluble vitamin deficiency. The impact of a vitamin K deficiency on the post-partum management of a patient with CD is presented along with potential assessment and treatment strategies. At 25 weeks gestation, the patient presented with a biochemical iron deficiency anemia, vitamin B12 deficiency, and zinc deficiency, for which she was treated with oral supplementation and/or intramuscular injections. No assessment of fat-soluble vitamins during the gestation period was conducted despite a pre-pregnancy history of multiple micronutrient deficiencies. At 33 weeks gestation, the mother was diagnosed with preeclampsia and delivered the fetus at 35 weeks. After birth, the infant presented to the NICU with a mediastinal mass, abnormal liver function tests, and initial coagulopathy. The mother experienced a uterine hemorrhage post-cesarean section. At this time her INR was 14.8 with severe prolongation of the PT and PTT and suboptimal levels of blood clotting factors II, VII, IX, and X. The patient was diagnosed with a vitamin K deficiency and was treated initially with vitamin K 10 mg by mouth daily for 3 days, resulting in an elevated serum vitamin K while the PT and INR trended towards normal limits. At discharge she was recommended to take 1 mg of vitamin K by mouth daily to prevent further deficiency. PT and INR were the biochemical assays reassessed every 3 months, since serum vitamin K is more reflective of recent intake. CD represents a complex disorder, and the impact of pregnancy on micronutrient status is unknown. During pregnancy, patients with CD may require additional micronutrient monitoring, particularly in the case of historical micronutrient deficiencies or other risk factors. This case highlights the need for further research into CD-specific micronutrient deficiencies and the creation of specific supplementation guidelines and treatment algorithms for detection of micronutrient deficiencies in at-risk patients.

Methods: None Reported.

Results: None Reported.

Conclusion: None Reported.

Gretchen Murray, BS, RDN1; Kristen Roberts, PhD, RDN, LD, CNSC, FASPEN, FAND2; Phil Hart, MD1; Mitchell Ramsey, MD1

1The Ohio State University Wexner Medical Center, Columbus, OH; 2The Ohio State University, Granville, OH

Financial Support: UL1TR002733.

Background: Enteric hyperoxaluria (EH) and resultant lithiasis are well documented in many malabsorptive conditions including inflammatory bowel disease, celiac disease, short bowel syndrome, and post gastric bypass. Chronic pancreatitis (CP) often leads to exocrine pancreatic insufficiency (EPI) and subsequent fat malabsorption, increasing the risk of EH because calcium binds to malabsorbed dietary fat, leaving oxalate available for colonic absorption. Modulating oxalate intake by reducing whole grains, greens, baked beans, berries, nuts, beer and chocolate while simultaneously improving hydration is the accepted medical nutrition therapy (MNT) for EH and calcium-oxalate stones. Although sources of dietary oxalate are well-known, there is limited literature regarding dietary oxalates in the CP diet, leaving a paucity of data to guide MNT for EH and calcium-oxalate lithiasis for these patients.

Methods: A cross-sectional, case-control study was performed comparing subjects with CP to healthy controls. Vioscreen™ food frequency questionnaire was used to assess and quantify total oxalic acid intake in the CP cohort and to describe dietary sources. Descriptive statistics were used to describe dietary intake of oxalic acid and contributing food sources.

Results: A total of 52 subjects with CP were included and had a mean age of 50 ± 15 years. Most subjects were male (n = 35; 67%). Mean BMI was 24 ± 6 kg/m2, and 8 subjects (15%) were classified as underweight by BMI. Median daily caloric intake was 1549 kcal with a median daily oxalic acid intake of 104 mg (range 11-1428 mg). The top three contributors to dietary oxalate intake were raw and cooked greens such as spinach or lettuce; mixed foods such as pizza, spaghetti, and tacos; and tea. Other significant contributors (>100 mg) to dietary oxalate intake included sports or meal replacement bars, cookies and cakes, potato products (mashed, baked, chips, fried), and refined grains (breads, tortillas, bagels).

Conclusion: In the CP population, highest contributors to oxalate intake include greens, mixed foods, tea, meal replacement bars, some desserts, potatoes, and refined grains. Many of the identified dietary oxalate sources are not considered for exclusion in a typical oxalate restricted diet. A personalized approach to dietary oxalate modulation is necessary to drive MNT for EH prevention in those with CP.

Qian Ren, PhD1; Peizhan Chen2

1Department of Clinical Nutrition, Shanghai Sixth People's Hospital Affiliated to Shanghai Jiao Tong University School of Medicine, Shanghai; 2Clinical Research Center, Ruijin Hospital, Shanghai Jiao Tong University School of Medicine, Shanghai

Financial Support: This study was supported by the Youth Cultivation Program of Shanghai Sixth People's Hospital (Grant No. ynqn202223), the Key Laboratory of Trace Element Nutrition, National Health Commission of the Peoples’ Republic of China (Grant No. wlkfz202308), and the Danone Institute China Diet Nutrition Research and Communication (Grant No. DIC2023-06).

Background: Low serum vitamin D status was reported to be associated with reduced muscle mass; however, it is inconclusive whether this relationship is causal. This study used data from the National Health and Nutrition Examination Survey (NHANES) and two-sample Mendelian randomization (MR) analyses to ascertain the causal relationship between serum 25-hydroxyvitamin D [25(OH)D] and appendicular muscle mass (AMM).

Methods: In the NHANES 2011–2018 dataset, 11,242 participants aged 18–59 years old were included, and multivariable linear regression was performed to assess the relationship between 25(OH)D and AMM measured by dual-energy X-ray absorptiometry (Figure 1). In the two-sample MR analysis, 167 single nucleotide polymorphisms significantly associated with serum 25(OH)D were applied as instrumental variables (IVs) to assess vitamin D effects on AMM in the UK Biobank (417,580 Europeans) using univariable and multivariable MR models (Figure 2).
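The fixed-effect inverse-variance-weighted (IVW) estimator underlying a univariable two-sample MR analysis can be written out in a few lines. The sketch below uses toy placeholder summary statistics (not study data) and assumed array names purely to illustrate how per-SNP effects on the exposure and outcome are combined.

```python
# Minimal sketch of the inverse-variance-weighted (IVW) two-sample MR estimator.
import numpy as np
from scipy.stats import norm

beta_exp = np.array([0.04, 0.03, 0.05])     # toy SNP effects on serum 25(OH)D
beta_out = np.array([0.002, 0.001, 0.003])  # toy SNP effects on appendicular muscle mass
se_out = np.array([0.001, 0.001, 0.002])    # toy standard errors of the outcome effects

w = 1.0 / se_out**2                                                  # inverse-variance weights
beta_ivw = np.sum(w * beta_exp * beta_out) / np.sum(w * beta_exp**2)  # IVW effect estimate
se_ivw = np.sqrt(1.0 / np.sum(w * beta_exp**2))
p = 2 * norm.sf(abs(beta_ivw / se_ivw))
print(f"IVW beta = {beta_ivw:.3f}, SE = {se_ivw:.3f}, p = {p:.3g}")
```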

Results: In the NHANES 2011–2018 dataset, serum 25(OH)D concentrations were positively associated with AMM (β = 0.013, SE = 0.001, p < 0.001) in all participants, after adjustment for age, race, season of blood collection, education, income, body mass index and physical activity. In stratification analysis by sex, males (β = 0.024, SE = 0.002, p < 0.001) showed more pronounced positive associations than females (β = 0.003, SE = 0.002, p = 0.024). In univariable MR based on IVW models, genetically higher serum 25(OH)D levels were positively associated with AMM in all participants (β = 0.049, SE = 0.024, p = 0.039) and males (β = 0.057, SE = 0.025, p = 0.021), but the association was only marginally significant in females (β = 0.043, SE = 0.025, p = 0.090). No significant pleiotropy effects were detected for the IVs in the two-sample MR investigations. In MVMR analysis, a positive causal effect of 25(OH)D on AMM was observed in the total population (β = 0.116, SE = 0.051, p = 0.022), males (β = 0.111, SE = 0.053, p = 0.036) and females (β = 0.124, SE = 0.054, p = 0.021).

Conclusion: Our results suggest a positive causal effect of serum 25(OH)D concentration on AMM; however, more research is needed to understand the underlying biological mechanisms.

Figure 1. Working Flowchart of Participants Selection in the Cross-Sectional Study.

Figure 2. The study assumptions of the two-sample Mendelian Randomization analysis between serum 25(OH)D and appendicular muscle mass. The assumptions include: (1) the genetic instrumental variables (IVs) should exhibit a significant association with serum 25(OH)D; (2) the genetic IVs should not be associated with any other potential confounding factors; and (3) the genetic IVs must influence appendicular muscle mass only through serum 25(OH)D and not through any other confounders. The dotted lines indicate violations of the assumptions.

Qian Ren, PhD1; Junxian Wu1

1Department of Clinical Nutrition, Shanghai Sixth People's Hospital Affiliated to Shanghai Jiao Tong University School of Medicine, Shanghai

Financial Support: None Reported.

Background: A healthy diet is essential for both preventing and treating type 2 diabetes mellitus (T2DM), which has a negative impact on public health. Whole grains, which are rich in dietary fibre, can serve as a good source of carbohydrates. However, the correlation and causality between whole grain intake and the risk of T2DM and glucose metabolism remain to be clarified.

Methods: First, the National Health and Nutrition Examination Survey database 2003-2018 was used to investigate the correlation between dietary whole grain/fibre intake and the risk of T2DM and glucose metabolism. Then, based on the largest publicly published genome-wide association analysis of whole grain intake in the Neale lab database, single nucleotide polymorphisms (SNPs) significantly associated with whole grain intake were selected as instrumental variables (p < 5×10-8, linkage disequilibrium r2 < 0.1). The inverse variance weighted (IVW) method, weighted median method and other methods were used to analyze the causal relationship between whole grain intake and T2DM. Heterogeneity tests, gene pleiotropy tests and sensitivity analyses were performed to evaluate the stability and reliability of the results.

Results: The results showed that dietary intakes of whole grains (OR = 0.999, 95%CI: 0.999 ~ 1.000, p = 0.004)/fibre (OR = 0.996, 95% CI: 0.993 ~ 0.999, p = 0.014) were negatively associated with the risk of T2DM. In the population with normal glucose metabolism, dietary fibre intake was negatively associated with FPG (β = -0.003, SE = 0.001, p < 0.001), FINS (β = -0.023, SE = 0.011, p = 0.044), and HOMA-IR (β = -0.007, SE = 0.003, p = 0.023). In the population with abnormal glucose metabolism, dietary intakes of whole grains (β = -0.001, SE = 0.001, p = 0.036) and fibre (β = -0.006, SE = 0.002, p = 0.005) were negatively associated with HbA1c. In the further MR analyses, the IVW method demonstrated that for every one standard deviation increase in whole grain intake, T2DM risk decreased by 1.9% (OR = 0.981, 95%CI: 0.970 ~ 0.993, β = -0.019, p = 0.002), with consistent findings for IVW-multiplicative random effects, IVW-fixed effects and IVW-radial. Meanwhile, MR-Egger regression analysis (intercept = -2.7 × 10-5, p = 0.954) showed that gene pleiotropy did not influence the results of the MR analysis. Leave-one-out analysis showed that no single SNP strongly influenced the results (p-heterogeneity = 0.445).
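The percent risk reduction quoted above follows directly from exponentiating the IVW beta; the worked conversion below simply restates the reported numbers.

```latex
\[
\mathrm{OR} = e^{\beta} = e^{-0.019} \approx 0.981,
\qquad 1 - \mathrm{OR} \approx 0.019 = 1.9\%\ \text{risk reduction per SD of whole grain intake}
\]
```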

Conclusion: Dietary intakes of whole grains may reduce the risk of T2DM and improve glucose metabolic homeostasis and insulin resistance. The causal relationship between whole grain intake and T2DM, as well as the optimal daily intake of whole grains, still needs to be explored further through large randomised controlled intervention studies and prospective cohort studies.

Hikono Sakata, Registered Dietitian1; Misa Funaki, Registered Dietitian2; Kanae Masuda, Registered Dietitian2; Rio Kurihara, Registered Dietitian2; Tomomi Komura, Registered Dietitian2; Masaru Yoshida, Doctor2

1University of Hyogo, Ashiya-shi, Hyogo; 2University of Hyogo, Himezi-shi, Hyogo

Financial Support: None Reported.

Background: In recent years, lifestyle-related diseases such as obesity, diabetes, and dyslipidemia have been recognized as problems, one cause of which is an excessively fatty diet. Obesity and other related diseases are known to be risk factors for severe infectious diseases such as sepsis and novel coronavirus infection, but the underlying pathomechanisms have not been clarified. We therefore hypothesized that a high-fat diet might induce functional changes not only in adipocytes but also in macrophages, weakening the immune response and leading to aggravation of infectious diseases. In this study, we performed proteome analysis and RNA sequence analysis to examine which genes and proteins are induced in macrophages by high-fat diet loading.

Methods: Four-week-old mice were divided into a normal diet (ND) group and a high-fat diet (HFD) group and kept for four weeks. One week before dissection, mice received an intraperitoneal injection of 2 mL of thioglycolate medium to promote macrophage proliferation; peritoneal macrophages were then collected and incubated at 37°C with 5% CO2 in Roswell Park Memorial Institute medium (RPMI). After culturing for 2 hours, floating cells were removed, and proteome analysis was performed using the recovered macrophages. In addition, RNA sequence analysis was performed on RNA extracted from the macrophages.

Results: Proteome analysis identified more than 4000 proteins in each group. In the HFD group, compared to the ND group, decreased expression of proteins involved in phagocytosis, such as immunoglobulin and eosinophil peroxidase, was observed. RNA sequencing likewise showed decreased expression of phagocytosis-related genes, consistent with the proteome findings.

Conclusion: These findings suggest that the phagocytic ability of macrophages is reduced by high-fat diet loading. This research is expected to clarify the molecular mechanisms by which high-fat dietary loading alters gene and protein expression and induces immunosuppressive effects.

Benjamin Davies, BS1; Chloe Amsterdam, BA1; Basya Pearlmutter, BS1; Jackiethia Butsch, C-CHW2; Aldenise Ewing, PhD, MPH, CPH3; Erin Holley, MS, RDN, LD2; Subhankar Chakraborty, MD, PHD4

1The Ohio State University College of Medicine, Columbus, OH; 2The Ohio State University Wexner Medical Center, Columbus, OH; 3The Ohio State University College of Public Health, Columbus, OH; 4The Ohio State University Wexner Medical Center, Dublin, OH

Financial Support: None Reported.

Background: Food insecurity (FI) refers to lack of consistent access to sufficient food for an active, healthy life. This issue, influenced by economic, social, and environmental factors, disproportionately affects disadvantaged populations. According to the United States Department of Agriculture, more than 10% of U.S. households experience FI, leading to physical and mental health consequences. FI has been linked to poorer dietary quality, increased consumption of processed, calorie-dense foods, and a higher prevalence of obesity, diabetes, and cardiovascular diseases. Chronic stress from FI can also trigger or exacerbate mental health disorders, including depression and anxiety, further complicating health outcomes. Affecting ~20% of the global population, dyspepsia significantly diminishes quality of life and increases healthcare costs. Its etiology is multifactorial, involving abnormal gastric motility, visceral hypersensitivity, inflammation, and psychosocial stressors. Although research on the link between FI and disorders of gut-brain interaction (DGBIs) like dyspepsia is limited, emerging evidence suggests a bidirectional relationship. Therefore, this study was designed to examine the association between FI, dyspepsia, and other health-related social needs (HRSN). Our hypotheses include 1) patients with FI have more severe dyspepsia symptoms, and 2) FI is associated with HRSN in other domains.

Methods: Patients presenting to a specialty motility clinic were prospectively enrolled into a registry created with the goal of holistically investigating the pathophysiology of DGBIs. Validated questionnaires for HRSN and dyspepsia were completed prior to their clinic visits. Data were managed with REDCap and statistical analyses were performed using SPSS.

Results: 53 patients completed the questionnaires. 88.7% of patients were White and 73.6% were female, with an average age of 45.6 years (21-72) and BMI of 28.7 kg/m2 (17.8-51.1). FI was present in 13 (24.5%) patients. The overall severity of dyspepsia symptoms was significantly less in the food secure patients (13.8 vs. 18.8, p = 0.042). Of the four subscales of dyspepsia (nausea-vomiting, postprandial fullness, bloating, and loss of appetite), only loss of appetite was significantly greater in those with FI (2.3 vs. 1.3, p = 0.017). Patients with FI were more likely to be at medium (61.5% vs. 5.0%) or high risk (30.8% vs. 2.5%, p < 0.001) of financial hardship, experience unmet transportation needs (38.5% vs. 5.0%, p = 0.019) and housing instability (30.8% vs. 5.0%, p = 0.023) compared to those who were food secure. They were also at higher risk of depression (54% vs. 12.5%, p = 0.005) and of reporting insufficient physical activity (92.3% vs. 55.0%, p = 0.05). After adjusting for age, gender, race, and BMI, FI was not a predictor of global dyspepsia severity. Greater BMI (O.R. 0.89, 95% C.I. 0.81-0.98) was associated with severity of early satiety. Female gender (O.R. 10.0, 95% C.I. 1.3-76.9, p = 0.03) was associated with the severity of nausea. Greater BMI (O.R. 1.10, 95% C.I. 1.001-1.22, p = 0.048) and female gender (O.R. 10.8, 95% C.I. 1.6-72.9, p = 0.015) both correlated with severity of postprandial fullness.

Conclusion: FI affected ~25% of patients seen in our clinic. It was, however, not an independent predictor of overall severity of dyspepsia symptoms. Patients who experienced FI had higher prevalence of other HRSN, and risk of depression and physical inactivity. Our results highlight the importance of considering FI and other HRSN in the management of dyspepsia. Understanding this interaction is essential for improving clinical outcomes and guiding public health interventions.

Ashlesha Bagwe, MD1; Chandrashekhara Manithody, PhD1; Kento Kurashima, MD, PhD1; Shaurya Mehta, BS1; Marzena Swiderska-Syn1; Arun Verma, MD1; Austin Sims1; Uthayashanker Ezekiel1; Ajay Jain, MD, DNB, MHA1

1Saint Louis University, St. Louis, MO

Financial Support: None Reported.

Background: Farnesoid X receptor (FXR), a gut nuclear receptor, regulates intestinally driven bile acid homeostasis in short bowel syndrome. Chenodeoxycholic acid (CDCA), a primary bile acid, acts as an FXR ligand. Stem cell-derived intestinal enteroids offer a valuable system to study intestinal FXR function. We hypothesized that transfection of porcine enteroids with small interfering RNA (siRNA) would modulate FXR to further define its mechanistic role.

Methods: We developed a porcine protocol for matrigel-based 3D culture systems to generate enteroids from the small bowel of neonatal Yorkshire pigs. After 7 days, cultures were passaged and expanded. RNA strands for Dicer-substrate siRNAs (DsiRNAs) were synthesized as single-strand RNAs (ssRNAs) by Integrated DNA Technologies and resuspended in an RNase-free buffer. The DsiRNAs targeted the Sus scrofa farnesoid X receptor (FXR) gene (KF597010.1). Three sets of DsiRNA were made for gene-specific silencing of the FXR gene. Porcine enteroids were cultured and transfected with FXR-specific siRNA and control siRNA using Lipofectamine RNAiMAX reagent. They were also treated with escalating CDCA concentrations (25 μM to 50 μM) for 24 and 48 hrs. FXR mRNA levels were quantified by real-time PCR, and functional assays were performed to assess changes in bile acid uptake and efflux following transfection.
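Fold changes from real-time PCR data of the kind described here are commonly expressed with the 2^-ΔΔCt method; the abstract does not state the exact quantification approach, so the sketch below is an assumption, and the Ct values are toy numbers rather than study data.

```python
# Minimal sketch of the 2^-ddCt relative-expression calculation for qPCR data.
def fold_change(ct_target_treated, ct_ref_treated, ct_target_control, ct_ref_control):
    """Relative expression of the target gene (e.g., FXR) versus a reference gene."""
    dd_ct = (ct_target_treated - ct_ref_treated) - (ct_target_control - ct_ref_control)
    return 2 ** (-dd_ct)

# Toy Ct values: siRNA-transfected vs scrambled-control enteroids (illustration only)
fc = fold_change(ct_target_treated=26.5, ct_ref_treated=18.0,
                 ct_target_control=24.8, ct_ref_control=18.0)
print(f"Fold change = {fc:.2f}; knockdown = {(1 - fc) * 100:.0f}%")
```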

Results: Data from 3 separate intestinal crypt experiments showed consistent enhancement of FXR expression with CDCA versus control (p < 0.01). CDCA treatment resulted in a dose-dependent increase in FXR mRNA expression, reaching peak levels after 48 hours of exposure (2.8X increase) with enhanced nuclear localization. Functionally, CDCA-treated enteroids displayed increased bile acid uptake and reduced efflux, validating FXR's role in mediating bile acid driven enterohepatic circulation. Several runs with siRNA were conducted. Using 30 pmol siRNA (sense: 5' GAUCAACAGAAUCUUCUUCAUUATA 3' and antisense: 5' UAUAAUGAAGAAGAUUCUGUUGAUCUG 3'), there was a 68% reduction in FXR expression compared with scrambled control. FXR silencing led to decreased bile acid uptake and increased efflux. No significant effects were observed in enteroids transfected with control siRNA. Paradoxically, CDCA treated cultures showed a higher proportion of immature enteroids to mature enteroids.

Conclusion: In porcine enteroids, CDCA treatment increased FXR gene expression and promoted its nuclear localization. This finding implies the existence of a positive feedback cycle in which CDCA, an FXR ligand, induces further synthesis and uptake. siRNA transfection was able to significantly decrease FXR activity. By employing this innovative methodology, one can effectively examine the function of FXR in ligand treated or control systems.

Pediatric, Neonatal, Pregnancy, and Lactation

Kento Kurashima, MD, PhD1; Si-Min Park, MD1; Arun Verma, MD1; Marzena Swiderska-Syn1; Shaurya Mehta, BS1; Austin Sims1; Mustafa Nazzal, MD1; John Long, DVM1; Chandrashekhara Manithody, PhD1; Shin Miyata, MD1; Ajay Jain, MD, DNB, MHA1

1Saint Louis University, St. Louis, MO

Encore Poster

Presentation: North American Society for Pediatric Gastroenterology, Hepatology and Nutrition (NASPGHAN) Florida.

Financial Support: None Reported.

Background: Biliary atresia (BA) is a major cause of obstructive neonatal cholestatic disease. Although hepato-portoenterostomy (HPE) is routinely performed for BA patients, more than half eventually require liver transplantation. The intricate mechanisms of bile ductular injury driving its pathogenesis remain elusive and are not recapitulated in current small animal and rodent models, which rely on bile duct ligation. Addressing prevailing lacunae, we hypothesized that extra- and intrahepatic bile duct destruction through an endothelial irritant would recapitulate the human condition. We have thus developed a novel neonatal piglet BA model called 'BATTED'. Piglets have liver and gastro-intestinal homology to human infants and share anatomic and physiological processes, providing a robust platform for BATTED and HPE.

Methods: Six 7-10-day-old piglets were randomized to BATTED (US provisional Patent US63/603,995) or sham surgery. BATTED included cholecystectomy, common bile duct and hepatic duct injection of 95% ethanol, and a retainer suture for continued ethanol-induced intrahepatic injury. Vascular access ports were placed and scheduled ultrasound-guided liver biopsies were performed. Six weeks post-BATTED, piglets underwent HPE. Eight weeks after the initial surgery, animals were euthanized. Serology, histology, gene expression and immunohistochemistry were performed.

Results: Serological evaluation revealed a surge in conjugated bilirubin 6 weeks after the BATTED procedure from baseline (mean Δ 0.39 mg/dL to 3.88 mg/dL). Gamma-glutamyl transferase (GGT) also exhibited a several-fold increase (mean Δ 16.3 IU to 89.5 IU). Sham did not display these elevations (conjugated bilirubin: Δ 0.39 mg/dL to 0.98 mg/dL, GGT: Δ 9.2 IU to 10.4 IU). Sirius red staining demonstrated significant periportal and diffuse liver fibrosis (16-fold increase), and the bile duct proliferation marker CK-7 increased 9-fold with BATTED. Piglets in the BATTED group demonstrated enhanced CD-3 (7-fold), alpha-SMA (8.85-fold), COL1A1 (11.7-fold) and CYP7A1 (7-fold) vs sham. Successful HPE was accomplished in piglets with improved nutritional status and a reduction in conjugated bilirubin (Δ 4.89 mg/dL to 2.11 mg/dL).

Conclusion: BATTED replicated BA features, including hyperbilirubinemia, GGT elevation, significant hepatic fibrosis, bile duct proliferation, and inflammatory infiltration with subsequent successful HPE. This model offers substantial opportunities to elucidate the mechanism underlying BA and adaptation post HPE, paving the path for the development of diagnostics and therapeutics.

Sirine Belaid, MBBS, MPH1; Vikram Raghu, MD, MS1

1UPMC, Pittsburgh, PA

Financial Support: None Reported.

Background: At our institution, pediatric residents are responsible for managing patients with intestinal failure (IF) in the inpatient units. However, they have reported feelings of inexperience and anxiety when dealing with these complex cases. This project aimed to identify knowledge gaps and evaluate the confidence levels of pediatric residents in managing IF patients.

Methods: We conducted an online needs assessment survey using Qualtrics, which included Likert-scale, multiple-choice, open-ended and rating questions to assess residents' confidence levels (from 1 to 10) in performing tasks related to IF patient care. This voluntary survey, approved by the IRB as exempt, was distributed to all pediatric residents at the University of Pittsburgh via QR codes and emails.

Results: Thirty-two percent of residents completed the survey, with nearly 50% of respondents having completed a rotation on the Intestinal Rehabilitation (IR) service. Residents reported the lowest confidence in calculating Total Parenteral Nutrition (TPN)-like Intravenous (IV) fluids (solutions administered centrally or peripherally, containing dextrose and major electrolytes designed to match the patients’ home TPN contents), identifying signs of D-lactic acidosis and small intestinal bacterial overgrowth (SIBO), and managing SIBO (average confidence rating of 2/10). They also expressed low confidence in ordering home TPN and TPN-like IV Fluids, understanding related anatomy, ensuring proper stoma care, managing central access loss, and addressing poor catheter blood flow (average confidence ratings of 3-4/10). Conversely, residents felt more confident managing feeding intolerance and central line-associated infections (average confidence ratings of 5-6/10). Additionally, they rated their ability to identify signs of septic and hypovolemic shock as 8/10, though they felt less confident in managing these conditions (7/10). Furthermore, 64% of respondents agreed that managing IF patients is educationally valuable and preferred laminated cards or simulated sessions as educational resources (average ratings of 8 and 6.8 out of 10, respectively).

Conclusion: The survey highlights several areas where pediatric residents need further education. Addressing these knowledge gaps through targeted curricular interventions can better equip residents to manage IF patients and potentially increase their interest in this specialty.

CLABSI = Central Line-Associated Bloodstream Infection, TPN = Total Parenteral Nutrition, EHR = Electronic Health Record, IV = Intravenous, SIBO = Small Intestinal Bacterial Overgrowth.

Figure 1. Tasks related to managing patients with Intestinal Failure (IF), stratified into three categories based on pediatric residents' average confidence rating scores (>= 7/10, 5-6/10, <= 4/10).

Figure 2. Distribution of pediatric residents’ opinions on the educational value of managing patients with intestinal failure.

Alyssa Ramuscak, MHSc, MSc1; Inez Martincevic, MSc1; Hebah Assiri, MD1; Estefania Carrion, MD2; Jessie Hulst, MD, PhD1

1The Hospital for Sick Children, Toronto, ON; 2Hospital Metropolitano de Quito, Quito, Pichincha

Financial Support: Nestle Health Science Canada, North York, Ontario, Canada.

Background: Enteral nutrition provides fluids and nutrients to individuals unable to meet needs orally. Recent interest in real food-based formulas highlights a shift towards providing nutrition with familiar fruit, vegetable and protein ingredients. This study aimed to evaluate the tolerability and nutritional adequacy of hypercaloric, plant-based, real food ingredient formula in pediatric tube-fed patients.

Methods: This prospective, single-arm, open-label study evaluated the tolerability and nutritional adequacy of a hypercaloric, plant-based formula, Compleat® Junior 1.5, in medically complex, stable, pediatric tube-fed patients aged 1-13 years. Participants were recruited from outpatient clinics at The Hospital for Sick Children, Toronto from May 2023 to June 2024. Demographic and anthropometric measurements (weight, height) were obtained at baseline. The daily dose of study product was isocaloric compared to routine feeds. Total fluid needs were met by adding water. Participants transitioned to the study formula over 3 days, followed by 14 days of exclusive use of the study product. Caregivers monitored volume of daily intake, feed tolerance, and bowel movements during the transition and study period using an electronic database (Medrio®). An end-of-study visit was conducted to collect weight measurements. Descriptive statistics summarize demographic and clinical characteristics (Table 1). The paired t-test compared weight-for-age and BMI-for-age z-scores between baseline and end of study. Symptoms of intolerance and bowel movements, rated using either the Bristol Stool Scale or the Brussels Infant and Toddler Stool Scale, were described as frequency of events and compared between baseline and the intervention period. The percent of calorie and protein goals met during the study period was calculated as calories received relative to calories prescribed, and protein received relative to the dietary reference intake for age and weight.
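The paired baseline versus end-of-study z-score comparison and the percent-of-goal calculations described above could be reproduced along the following lines. This is a hedged sketch: the file and column names are assumptions, not the study's actual dataset.

```python
# Minimal sketch: paired t-test on weight-for-age z-scores and percent of
# prescribed calories received, using assumed column names.
import pandas as pd
from scipy.stats import ttest_rel

df = pd.read_csv("compleat_junior_study.csv")  # hypothetical per-participant data

t, p = ttest_rel(df["wfa_z_baseline"], df["wfa_z_end"])
print(f"Weight-for-age z: {df['wfa_z_baseline'].mean():.2f} -> "
      f"{df['wfa_z_end'].mean():.2f} (paired t-test p = {p:.3f})")

pct_kcal_goal = 100 * df["kcal_received"] / df["kcal_prescribed"]
print(f"Mean % of prescribed calories received: {pct_kcal_goal.mean():.0f}%")
```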

Results: In total, 27 ambulatory pediatric participants with a median age of 5.5 years (IQR, 2.5-7) were recruited for the study, with 26 completing it (Table 1). Participant weight-for-age and BMI-for-age z-scores significantly improved between baseline and end of study, from -1.75 ± 1.93 to -1.67 ± 1.88 (p < 0.05), and from -0.47 ± 1.46 to 0.15 ± 0.23 (p < 0.05), respectively. There was no significant difference in the frequency of any GI symptom, including vomiting, gagging/retching, tube venting or perceived pain/discomfort with feeds, between baseline and end of study. There was no significant difference in frequency or type of stool between baseline and end of study. Study participants met 100% of their prescribed energy for the majority (13 ± 1.7 days) of the study period. All participants exceeded protein requirements during the study period. Twenty families (76.9%) indicated they wanted to continue using the study product after completing the study.

Conclusion: This prospective study demonstrated that a hypercaloric, plant-based, real food ingredient formula was well tolerated among stable yet medically complex children and was calorically adequate to maintain or facilitate weight gain over the 14-day study period. The majority of caregivers preferred to continue use of the study product.

Table 1. Demographic and Clinical Characteristics of Participants (n = 27).

Poster of Distinction

Gustave Falciglia, MD, MSCI, MSHQPS1; Daniel Robinson, MD, MSCI1; Karna Murthy, MD, MSCI1; Irem Sengul Orgut, PhD2; Karen Smilowitz, PhD, MS3; Julie Johnson, MSPH PhD4

1Northwestern University Feinberg School of Medicine, Chicago, IL; 2University of Alabama Culverhouse College of Business, Tuscaloosa, AL; 3Northwestern University Kellogg School of Business & McCormick School of Engineering, Evanston, IL; 4University of North Carolina School of Medicine, Chapel Hill, NC

Encore Poster

Presentation: Children's Hospital Neonatal Consortium (CHNC) Annual Conference, November 1, 2021, Houston, TX.

Financial Support: None Reported.

Lyssa Lamport, MS, RDN, CDN1; Abigail O'Rourke, MD2; Barry Weinberger, MD2; Vitalia Boyar, MD2

1Cohen Children's Medical Center of New York, Port Washington, NY; 2Cohen Children's Medical Center of NY, New Hyde Park, NY

Financial Support: None Reported.

Background: Premature infants in the neonatal intensive care unit (NICU) are at high risk for peripheral intravenous catheter infiltration (PIVI) because of the frequent cannulation of small and fragile vessels. The most common infusate in neonates is parenteral nutrition (PN), followed by antibiotics. Previous reports have suggested that the intrinsic properties of infusates, such as pH, osmolality, and calcium content, determine the severity of PIVIs. This has led to the common practice of restricting the intake of protein and/or calcium to less than that which is recommended for optimal growth and bone mineralization.

Methods: Our objective was to identify the characteristics of infants and intravenous (IV) infusates associated with the development of severe neonatal PIVIs. We conducted a retrospective analysis of PIVIs in our level IV NICU from 2018-2022 (n = 120). Each PIVI was evaluated by a wound-certified neonatologist and classified as mild, moderate, or severe using a scoring system based on the Infusion Nurses Society (INS) staging criteria. Comparisons between groups were done using ANOVA and chi-square analysis, or Mann-Whitney tests for non-parametric data.
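One plausible form of the categorical comparison described here (for example, IV medication exposure within 24 hours versus PIVI severity category) is a chi-square test of a contingency table; the counts below are placeholders for illustration, not study data.

```python
# Minimal sketch: chi-square test of association between IV medication exposure
# within 24 hours of PIVI and PIVI severity, using toy contingency counts.
import numpy as np
from scipy.stats import chi2_contingency

# rows: medication within 24 h (no, yes); columns: mild, moderate, severe
table = np.array([[40, 30, 5],
                  [15, 20, 10]])
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```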

Results: Infants with severe PIVIs had a lower mean birthweight than those with mild or moderate PIVIs (1413.1 g vs 2116.9 g and 2020.3 g respectively, p = .01) (Table 1). Most PIVIs occurred during infusions of PN and lipids, but the severity was not associated with the infusion rate, osmolality, or with the concentration of amino acids (median 3.8 g/dL in the mild group, 3.5 g/dL in the moderate group and 3.4 g/dL in the severe group) or calcium (median 6500 mg/L in all groups) (Table 2, Figure 1). Of note, the infusion of IV medications within 24 hours of PIVI was most common in the severe PIVI group (p = .03) (Table 2). Most PIVIs, including mild, were treated with hyaluronidase. Advanced wound care was required for 4% of moderate and 44% of severe PIVIs, and none required surgical intervention.

Conclusion: Severe PIVIs in the NICU are most likely to occur in infants with a low birthweight and within 24 hours of administration of IV medications. This is most likely because medications must be kept at acidic or basic pH for stability, and many have high osmolarity and/or intrinsic caustic properties. Thus, medications may induce chemical phlebitis and extravasation with inflammation. In contrast, PN components, including amino acids and calcium, are not related to the severity of extravasations. Our findings suggest that increased surveillance of IV sites for preterm infants following medication administration may decrease the risk of severe PIVIs. Conversely, reducing or withholding parenteral amino acids and/or calcium to mitigate PIVI risk may introduce nutritional deficiencies without decreasing the risk of clinically significant PIVIs.

Table 1. Characteristic Comparison of Mild, Moderate, and Severe PIVIs in the Neonatal ICU. PIVI Severity Was Designated Based on INS Criteria.

Table 2. Association of Medication Administration and Components of Infusates With the Incidence and Severity of PIVI in NICU.

Figure 1. Infusate Properties.

Stephanie Oliveira, MD, CNSC1; Josie Shiff2; Emily Romantic, RD3; Kathryn Hitchcock, RD4; Gillian Goddard, MD4; Paul Wales, MD5

1Cincinnati Children's Hospital Medical Center, Mason, OH; 2University of Cincinnati, Cincinnati, OH; 3Cincinnati Children's Hospital Medical Center, Cincinnati, OH; 4Cincinnati Children's Hospital, Cincinnati, OH; 5Cincinnati Children's Hospital Medical Center, Cincinnati, OH

Financial Support: None Reported.

Background: It is common for children with intestinal failure on parenteral nutrition to be fed an elemental enteral formula, as these are believed to be better tolerated due to the protein module being free amino acids, the absence of other allergens, and the presence of long chain fatty acids. In February 2022, a popular elemental formula on the market was recalled due to bacterial contamination, necessitating an immediate transition to an alternative enteral formula. This included initiating plant-based options for some of our patients. We have experienced growing interest and requests from families to switch to plant-based formulas due to religious practices, cost concerns, and personal preferences. While plant-based formulas lack major allergens and may contain beneficial soluble fiber, they are understudied in this patient population. This study aims to determine if growth was affected amongst children with intestinal failure on parenteral nutrition who switched from elemental to plant-based formulas.

Methods: We conducted a retrospective cohort study of IF patients on PN managed by our intestinal rehabilitation program who transitioned from elemental to plant-based formulas during the product recall. Data were collected on demographics, intestinal anatomy, formula intake, parenteral nutrition support, tolerance, stool consistency, and weight gain for 6 months before and after formula transition. Paired analyses were performed for change in growth and nutritional intake using the Wilcoxon signed-rank test. Chi-squared tests were performed to compare formula tolerance. An alpha of 0.05 was used to determine significance.

Results: Eleven patients were included in the study [8 males; median gestational age 33 (IQR = 29, 35.5) weeks; median age at assessment 20.4 (IQR = 18.7, 29.7) months]. All participants had short bowel syndrome (SBS) as their IF category. Residual small bowel length was 28 (IQR = 14.5, 47.5) cm. Overall, there was no statistically significant difference in growth observed after switching to plant-based formulas (p = 0.76) (Figure 1). Both median enteral formula volume and calorie intake were higher on plant-based formula, but not statistically significantly so (p = 0.83 and p = 0.41) (Figure 2). Seven of 11 patients (64%) reported decreased stool count (p = 0.078) and improved stool consistency (p = 0.103) after switching to plant-based formula. Throughout the study, the rates of PN calorie and volume weaning were not different after switching to plant-based formula (calories: p = 0.83; volume: p = 0.52) (Figure 3).

Conclusion: In this small study of children with IF, the switch from free amino acid formula to an intact plant-based formula was well tolerated. Growth was maintained between groups. After switching to plant-based formulas these children tolerated increased enteral volumes, but we were underpowered to demonstrate a statistical difference. There was no evidence of protein allergy among children who switched. Plant-based formulas may be an alternative option to elemental formulas for children with intestinal failure.

Figure 1. Change in Weight Gain (g/day) on Elemental and Plant-Based Formulas.

Figure 2. Change in Enteral Calories (kcal/kg/day) and Volume (mL/kg/day) on Elemental and Plant-Based Formulas.

Figure 3. Change in PN Calories (kcal/kg/day) and Volume (mL/kg/day) on Elemental and Plant-Based Formulas.

Carly McPeak, RD, LD1; Amanda Jacobson-Kelly, MD, MSc1

1Nationwide Children's Hospital, Columbus, OH

Financial Support: None Reported.

Background: In pediatrics, post-pyloric enteral feeding via jejunostomy, gastrojejunostomy or naso-jejunostomy tubes is increasingly utilized to overcome problems related to aspiration, severe gastroesophageal reflux, poor gastric motility, and gastric outlet obstruction. Jejunal enteral feeding bypasses the duodenum, a primary site of absorption for many nutrients. Copper is thought to be primarily absorbed in the stomach and proximal duodenum, thus patients receiving nutritional support in a way that bypasses this region can experience copper deficiency. The prevalence and complications of copper deficiency in the pediatric population are not well documented.

Methods: This was a retrospective case series of two patients treated at Nationwide Children's Hospital (NCH). Medical records were reviewed to collect laboratory data, medication/supplement data, and enteral feeding history. Both patients were receiving Pediasure Peptide® as their enteral formula.

Results: Case 1: A 14-year-old male had received exclusive post-pyloric enteral nutrition for two years and presented with pancytopenia and worsening anemia. Laboratory data drawn in 3/2017 demonstrated deficient levels of copper (< 10 ug/dL) and ceruloplasmin (20 mg/dL). Repletion was initiated with intravenous cupric chloride 38 mcg/kg/day for 3 days and then transitioned to 119 mcg/kg/day via j-tube for 4 months. Labs redrawn 2 months after the initial episode of deficiency indicated overall improvement of the pancytopenia (Table 1). After 4 months, cupric chloride was decreased to 57 mcg/kg/day as a maintenance dose. Laboratory data redrawn two and a half years after the initial episode of deficiency revealed deficient levels of copper (27 ug/dL) and ceruloplasmin (10 mg/dL) despite the lower-dose supplementation being administered. Case 2: An 8-year-old female had received exclusive post-pyloric enteral nutrition for 3 months. Laboratory data drawn in 3/2019 revealed deficient levels of copper (38 ug/dL) and ceruloplasmin (13 mg/dL). Supplementation with cupric chloride 50 mcg/kg/day was administered daily through the jejunal tube. Copper and ceruloplasmin labs redrawn at 11 months and 15 months after initiation of supplementation revealed continued deficiency, though hematologic values remained stable (Table 2).
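The weight-based dosing described in these cases reduces to simple arithmetic; the sketch below multiplies each reported mcg/kg/day rate by a purely hypothetical body weight to show the calculation, and is not a dosing recommendation.

```python
# Minimal sketch of weight-based cupric chloride dose arithmetic.
def daily_dose_mcg(weight_kg, dose_mcg_per_kg):
    """Total daily dose in micrograms for a weight-based regimen."""
    return weight_kg * dose_mcg_per_kg

weight = 40.0  # hypothetical body weight in kg, for illustration only
for label, dose in [("IV repletion", 38), ("enteral repletion", 119), ("maintenance", 57)]:
    print(f"{label} ({dose} mcg/kg/day): {daily_dose_mcg(weight, dose):,.0f} mcg/day")
```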

Conclusion: There are currently no guidelines for clinicians on the prevention, screening, treatment, and maintenance of copper status in pediatric patients receiving post-pyloric enteral feeding. Current dosing for copper repletion in profound copper deficiency is largely based on case series and expert opinion. At NCH, the current standard-of-care supplementation demonstrates inconsistent improvement in copper repletion, as evidenced by the case reports discussed above. Future research should determine appropriate supplementation regimens and evaluate their efficacy in patients with post-pyloric enteral feeding.

Table 1. Laboratory Evaluation of Case 1.

'-' indicates no data available, bolded indicates result below the lower limit of normal for age.

Table 2. Laboratory Evaluation of Case 2.

'-' indicates no data available, bolded indicates result below the lower limit of normal for age.

Meighan Marlo, PharmD1; Ethan Mezoff, MD1; Shawn Pierson, PhD, RPh1; Zachary Thompson, PharmD, MPH, BCPPS1

1Nationwide Children's Hospital, Columbus, OH

Financial Support: None Reported.

Background: Parenteral nutrition (PN) remains a high-risk therapy, typically containing more than 40 different ingredients. Incorporation of PN prescribing into the electronic health record (EHR) has been recommended to minimize the risk of transcription errors and allow for implementation of important safety alerts for prescribers. Ambulatory PN prescribing workflows typically require manual transcription of orders in the absence of robust EHR interoperability between systems used for transitions of care. Pediatric patients receiving ambulatory PN have an increased risk of medication events due to the need for weight-based customization and the low utilization of ambulatory PN, which leaves many pharmacies inexperienced with it. Development and implementation of a prescribing system incorporating inpatient and outpatient functionality within the EHR is necessary to improve the quality and safety of ambulatory PN in pediatric patients. The primary goal was to create a workflow that improved transitions of care by minimizing manual transcription and improving medication safety. We describe modification of standard EHR tools to achieve this aim.

Methods: A multidisciplinary team developed and incorporated ambulatory PN prescribing within the EHR at Nationwide Children's Hospital. Inpatient and outpatient variances, safety parameters to provide appropriate alerts to prescribers, and legal requirements were considered and evaluated for optimization within the new system.

Results: The final product successfully incorporated ambulatory PN prescribing while allowing seamless transfer of prescriptions between care settings. The prescriber orders the patient-specific ambulatory PN during an inpatient or outpatient encounter. The order is subsequently queued for pharmacist review/verification to assess the adjustments and determine extended stability considerations in the ambulatory setting. After pharmacist review, the prescription is printed, signed by the provider, and faxed to the pharmacy.

Conclusion: To our knowledge, this is the first institution to develop and incorporate pediatric PN prescribing into the EHR in a way that transfers between the inpatient and outpatient settings independent of manual transcription while still allowing for customization of PN.

Faith Bala, PhD1; Enas Alshaikh, PhD1; Sudarshan Jadcherla, MD1

1The Research Institute at Nationwide Children's Hospital, Columbus, OH

Financial Support: None Reported.

Background: Extrauterine growth remains a concern among preterm-born infants admitted to the neonatal ICU (NICU), as it rarely matches intrauterine fetal growth rates. Although the reasons are multifactorial, the role played by the duration of exclusive parenteral nutrition (EPN) and the transition to the exclusive enteral nutrition (EEN) phase remains unclear. Significant nutrient deficits can exist during the critical phase from birth to EEN, and thereafter, and these likely impact short- and long-term outcomes. Given this rationale, our aims were to examine the relationship of the duration from birth to EEN with growth and length of hospital stay (LOHS) among convalescing preterm-born infants with oral feeding difficulties.

Methods: This is a retrospective analysis of prospectively collected data from 77 preterm infants admitted to the all-referral level IV NICU at Nationwide Children's Hospital, Columbus, Ohio, who were later referred to our innovative neonatal and infant feeding disorders program for the evaluation and management of severe feeding/aero-digestive difficulties. Inclusion criteria: infants born < 32 weeks gestation, birthweight < 1500 g, absence of chromosomal/genetic disorders, discharged at term equivalent postmenstrual age (37-42 weeks, PMA) on full oral feeding. Growth variables were converted to age- and gender-specific Z-scores using the Fenton growth charts. Using the Academy of Nutrition and Dietetics criteria for neonates and preterm populations, extrauterine growth restriction (EUGR) was defined as a weight Z-score decline from birth to discharge > 0.8. Clinical characteristics stratified by EUGR status were compared using the Chi-Square test, Fisher exact test, Mann Whitney U test, and T-test as appropriate. Multivariate regression was used to assess the relationship between the duration from birth to EEN and the growth Z-scores at discharge simultaneously. Multiple linear regression was used to assess the relationship between the duration from birth to EEN and LOHS.
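The EUGR definition and the LOHS regression described above translate into a short analysis sketch. This is a hedged illustration: the dataset, column names, and use of statsmodels are assumptions, and z-scores are assumed to be pre-computed from the Fenton charts.

```python
# Minimal sketch: flag EUGR (weight z-score decline > 0.8 from birth to discharge)
# and regress length of stay on days from birth to exclusive enteral nutrition.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("preterm_cohort.csv")  # hypothetical analytic dataset

df["eugr"] = (df["wt_z_birth"] - df["wt_z_discharge"]) > 0.8

X = sm.add_constant(df["days_to_een"])
fit = sm.OLS(df["los_days"], X).fit()
print(df["eugr"].value_counts())
print(fit.params)
```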

Results: Forty-two infants (54.5%) had EUGR at discharge, and the proportions of infants with weight and length percentiles < 10% were significantly greater at discharge than at birth (Table 1). The growth-restricted infants at discharge had significantly lower birth gestational age, a higher proportion required mechanical ventilation at birth, had a higher incidence of sepsis, and took longer to attain EEN (Table 2). The duration from birth to EEN was significantly negatively associated with weight, length, and head circumference Z-scores at discharge. Likewise, the duration from birth to EEN was significantly positively associated with the LOHS (Figure 1).

Conclusion: The duration from birth to exclusive enteral nutrition (EEN) can influence growth outcomes. We speculate that significant gaps between recommended and actual nutrient intake exist during the period to EEN, particularly for those with chronic tube-feeding difficulties. Well-planned and personalized nutrition is relevant until EEN is established, and enteral nutrition advancement strategies using standardized feeding protocols offer prospects for generalizability while still providing opportunities for personalization.

Table 1. Participant Growth Characteristics.

Table 2. Participant Clinical Characteristics.

Figure 1. Relationship between the Duration from Birth to EEN Versus Growth Parameters and Length of Hospital Stay.

Alayne Gatto, MS, MBA, RD, CSP, LD, FAND1; Jennifer Fowler, MS, RDN, CSPCC, LDN2; Deborah Abel, PhD, RDN, LDN3; Christina Valentine, MD, MS, RDN, FAAP, FASPEN4

1Florida International University, Bloomingdale, GA; 2East Carolina Health, Washington, NC; 3Florida International University, Miami Beach, FL; 4Banner University Medical Center, The University of Arizona, Tucson, AZ

Financial Support: The Rickard Foundation.

Background: The neonatal registered dietitian nutritionist (NICU RDN) plays a crucial role in the care of premature and critically ill infants in the neonatal intensive care unit (NICU). Advanced pediatric competencies are frequently a gap in nutrition degree coursework or dietetic internships. Critical care success with such vulnerable patients requires expertise in patient-centered care, multidisciplinary collaboration, and adaptive clinical problem-solving. This research aimed to identify the needs, engagement levels, and expertise of NICU RDNs, while also providing insights into their job satisfaction and career longevity. Currently in the US, there are approximately 850 Level III and Level IV NICUs in which neonatal dietitians are vital care team members, making it imperative that hospitals provide appropriate compensation, benefits, and educational support.

Methods: This was a cross-sectional examination using a national, online, IRB-approved survey conducted during March 2024 and sent to established Neonatal and Pediatric Dietitian practice groups. A Qualtrics link was provided for current and former NICU RDNs to complete a 10-minute online survey that offered an optional gift card for completion. The link remained open until 200 gift cards were exhausted, approximately one week after the survey opened. Statistical analyses were performed using Stats IQ Qualtrics. In the descriptive statistics, frequencies of responses were represented as counts and percentages. For comparison of differences, the Chi-Squared test and Fisher's Exact test were used for categorical analysis.

Results: In total, 253 current (n = 206) and former (n = 47) NICU RDNs completed the online questionnaire. Of the 210 respondents, 84 (40%) reported having pediatric clinical experience, 94 (44%) had clinical pediatric dietetic intern experience, 21 (10%) had previously worked as a WIC nutritionist, 15 (7.1%) had specialized pediatric certification or fellowship, and 12 (5.7%) had no prior experience before starting in the NICU (Table 1). Of 163 respondents, 83 (50.9%) reported receiving financial support or reimbursement for additional NICU training. Respondents who felt valued as team members planned to stay in the NICU RD role for more than 5 years (p > 0.0046). Additionally, they reported having acknowledgement and appreciation (64.4%), motivation (54.1%), and opportunities for advancement (22.9%) (Table 2).

Conclusion: NICU RDNs do not have a clear competency roadmap nor a career development track. In addition, financial support or reimbursement for continuing education is not consistently an employee benefit, which may play a key role in job satisfaction and retention. These data provide valuable insight not only for managers of dietitians but also for professional societies to build programs and retention opportunities.

Table 1. Question: When You Started as a NICU RD, What Experience Did You Have? (n = 210).

N and Percentages will total more than 210 as respondents could check multiple answers.

Table 2. Comparison of Questions: Do You Feel You Have the Following in Your Role in the NICU and How Long Do You Plan to Stay in Your Role?

Sivan Kinberg, MD1; Christine Hoyer, RD2; Everardo Perez Montoya, RD2; June Chang, MA2; Elizabeth Berg, MD2; Jyneva Pickel, DNP2

1Columbia University Irving Medical Center, New York, NY; 2Columbia University Medical Center, New York, NY

Financial Support: None Reported.

Background: Patients with short bowel syndrome (SBS) can have significant fat malabsorption due to decreased intestinal surface area, bile acid deficiency, and rapid transit time, often leading to feeding intolerance and dependence on parenteral nutrition (PN). Patients with SBS can have deficiency of pancreatic enzymes and/or reduced effectiveness of available pancreatic enzymes, resulting in symptoms of exocrine pancreatic insufficiency (EPI), including weight loss, poor weight gain, abdominal pain, diarrhea, and fat-soluble vitamin deficiencies. Oral pancreatic enzyme replacement therapy (PERT) is often tried but is impractical for use with enteral nutrition (EN) due to inconsistent enzyme delivery and risk of clogged feeding tubes. When used with continuous EN, oral PERT provides inadequate enzyme delivery as its ability to hydrolyze fats decreases significantly after 30 minutes of ingestion. The significant need for therapies to improve enteral absorption in this population has led to interest in using an in-line digestive cartridge to treat EPI symptoms in SBS patients on tube feedings. Immobilized lipase cartridge is an FDA-approved in-line digestive cartridge designed to hydrolyze fat in EN before it reaches the gastrointestinal tract, allowing for the delivery of absorbable fats through continuous or bolus feeds. In patients with SBS, this modality may be more effective in improving enteral fat absorption compared to oral preparations of PERT. Preclinical studies have demonstrated that use of an in-line digestive cartridge in a porcine SBS model increased fat-soluble vitamin absorption, reduced PN dependence, and improved intestinal adaptation. Our study aims to evaluate changes in PN, EN, growth parameters, stool output, and fat-soluble vitamin levels in pediatric SBS patients using an in-line digestive cartridge at our center.

Methods: Single-center retrospective study in pediatric patients with SBS on EN who used an in-line immobilized lipase (RELiZORB) cartridge. Data collection included patient demographics, etiology of SBS, surgical history, PN characteristics (calories, volume, infusion hours/days), EN characteristics (tube type, bolus or continuous feeds, formula, calories, volume, hours), immobilized lipase cartridge use (#cartridges/day, duration), anthropometrics, stool output, gastrointestinal symptoms, medications (including previous PERT use), laboratory assessments (fat-soluble vitamin levels, fatty acid panels, pancreatic elastase), indication to start immobilized lipase cartridge, and any reported side effects. Patients with small intestinal transplant or cystic fibrosis were excluded.

Results: Eleven patients were included in the study (mean age 10.4 years, 55% female). The most common etiology of SBS was necrotizing enterocolitis (45%), and 7 patients (64%) were dependent on PN. Results of the interim analysis show a mean duration of immobilized lipase cartridge use of 3.9 months, a PN calorie decrease in 43% of patients, weight gain in 100% of patients, and improvement in stool output in 6/9 (67%) patients. Clogging of the cartridges was the most common reported technical difficulty (33%), which was overcome with better mixing of the formula. No adverse events or side effects were reported.

Conclusion: In this single-center study, use of an in-line immobilized lipase digestive cartridge in pediatric patients with SBS demonstrated promising outcomes, including weight gain, improved stool output and reduced dependence on PN. These findings suggest that in-line digestive cartridges may play a role in improving fat malabsorption and decreasing PN dependence in pediatric SBS patients. Larger multicenter studies are needed to further evaluate the efficacy, tolerability, and safety of in-line digestive cartridges in this population.

Vikram Raghu, MD, MS1; Feras Alissa, MD2; Simon Horslen, MB ChB3; Jeffrey Rudolph, MD2

1University of Pittsburgh School of Medicine, Gibsonia, PA; 2UPMC Children's Hospital of Pittsburgh, Pittsburgh, PA; 3University of Pittsburgh School of Medicine, Pittsburgh, PA

Financial Support: National Center for Advancing Translational Sciences (KL2TR001856).

Background: Administrative databases can provide unique perspectives in rare disease due to the ability to query multicenter data efficiently. Pediatric intestinal failure can be challenging to study due to the rarity of the condition at single centers and the previous lack of a single diagnosis code. In October 2023, a new diagnosis code for intestinal failure was added to the International Classification of Diseases, 10th revision (ICD-10). We aimed to describe the usage and limitations of this code to identify children with intestinal failure.

Methods: We performed a multicenter cross-sectional study using the Pediatric Health Information Systems database from October 1, 2023, to June 30, 2024. Children with intestinal failure were identified by ICD-10 code (K90.83). Descriptive statistics were used to characterize demographics, diagnoses, utilization, and outcomes.

Results: We identified 1804 inpatient encounters from 849 unique patients with a diagnosis code of intestinal failure. Figure 1 shows the trend of code use by month since its inception in October 2023. The 849 patients had a total of 7085 inpatient encounters over that timeframe, meaning only 25% of their encounters included the intestinal failure diagnosis. Among these 849 patients, 638 had at least one encounter over the timeframe in which they received parenteral nutrition; 400 corresponded to an admission in which they also had an intestinal failure diagnosis code. Examining only inpatient stays longer than 2 days, 592/701 (84%) patients with such a stay received parenteral nutrition. Central line-associated bloodstream infections accounted for 501 encounters. Patients spent a median of 37 days (IQR 13-96) in the hospital. Patients were predominantly non-Hispanic White (43.7%), non-Hispanic Black (14.8%), or Hispanic (27.9%). Most had government-based insurance (63.5%). Child Opportunity Index was evenly split among all five quintiles. The standardized cost of all encounters with an intestinal failure diagnosis totaled $157 million, and all encounters for these patients totaled $259 million. The median cost over those 9 months per patient was $104,890 (IQR $31,149 - $315,167). Death occurred in 28 patients (3.3%) over the study period.

Conclusion: The diagnosis code of intestinal failure has been used inconsistently since implementation in October 2023, perhaps due to varying definitions of intestinal failure. Children with intestinal failure experience high inpatient stay costs and rare but significant mortality. Future work must consider the limitations of using only the new code in identifying these patients.

Figure 1. Number of Encounters With an Intestinal Failure Diagnosis Code.

Poster of Distinction

Kera McNelis, MD, MS1; Allison Ta, MD2; Ting Ting Fu, MD2

1Emory University, Atlanta, GA; 2Cincinnati Children's Hospital Medical Center, Cincinnati, OH

Encore Poster

Presentation: 6th Annual Pediatric Early Career Research Conference, August 27, 2024, Health Sciences Research Building I, Emory University.

Financial Support: None Reported.

Background: The neonatal period is a time of rapid growth, and many infants who require intensive care need extra nutritional support. The Academy of Nutrition and Dietetics (AND) published an expert consensus statement to establish criteria for the identification of neonatal malnutrition. There is limited evidence regarding outcomes associated with diagnosis from these opinion-derived criteria. The objective of this study was to compare anthropometric-based malnutrition indicators with direct body composition measurements in infancy.

Methods: Air displacement plethysmography is considered the gold standard for non-invasive body composition measurement, and this was incorporated into routine clinical care at a referral Level IV neonatal intensive care unit. Late preterm and term infants (34-42 weeks gestational age) with body composition measurement available were included in this study. Infants were categorized as having malnutrition per AND criteria. Fat mass, fat-free mass, and body fat percentage z-scores were determined per Norris body composition growth curves. Logistic regression was conducted to ascertain the relationship of fat mass, fat-free mass, and body fat percentage with malnutrition diagnosis. Linear regression was performed to predict body mass index (BMI) at age 18-24 months from each body composition variable.

Results: Eighty-four infants were included, with 39% female and 96% singleton (Table 1). Fifteen percent were small for gestational age and 12% were large for gestational age at birth. Nearly half had a congenital intestinal anomaly, including gastroschisis and intestinal atresia. Sixty-three percent of the group met at least one malnutrition criterion. Fat-free mass z-score was negatively associated with a malnutrition diagnosis, with an odds ratio 0.77 (95% CI 0.59-0.99, p < 0.05). There was not a statistically significant association between malnutrition diagnosis and body fat percentage or fat mass. There was not a statistically significant relationship between any body composition variable and BMI at 18-24 months, even after removing outliers with a high Cook's distance.

Conclusion: Malnutrition diagnosis is associated with low fat-free mass in critically ill term infants. Body composition is not a predictor of later BMI in this small study.

Table 1. Characteristics of Late Preterm and Term Infants in the Neonatal Intensive Care Unit With a Body Composition Measurement Performed Via Air Displacement Plethysmography.

John Stutts, MD, MPH1; Yong Choe, MAS1

1Abbott, Columbus, OH

Financial Support: Abbott.

Background: The prevalence of obesity in children is rising. Despite the awareness and work toward weight reduction, less is known about malnutrition in children with obesity. The purpose of this study was to evaluate the prevalence of obesity in U.S. children and determine which combination of indicators best define malnutrition in this population.

Methods: The 2017-2018 National Health and Nutrition Examination Survey (NHANES) database was utilized to assess biomarkers (the most recent complete dataset due to the Covid-19 pandemic). Trends in prevalence were obtained from 2013-2018 survey data. Obesity was defined as ≥ 95th percentile of the CDC sex-specific BMI-for-age growth charts. Cohort age range was 12-18 years. Nutrient intake and serum levels were analyzed for vitamin D, vitamin E, vitamin C, potassium, calcium, vitamin B9, vitamin A, total protein, albumin, globulin, high sensitivity C-reactive protein (hs-CRP), iron, hemoglobin and mean corpuscular volume (MCV). Intake levels of fiber were also analyzed. Children consuming supplements were excluded. Categorical and continuous data analyses were performed using SAS® Proc SURVEYFREQ (with Wald chi-square test) and Proc SURVEYMEANS (with t-test), respectively, in SAS® Version 9.4 and SAS® Enterprise Guide Version 8.3. Hypothesis tests were performed using 2-sided, 0.05 level tests. Results were reported as mean ± standard error (SE) (n = survey sample size) or percent ± SE (n).

Results: The prevalence of obesity in the cohort was 21.3% ± 1.3 (993). Prevalence trended upward annually; 20.3% ± 2.1 (1232) in 2013-2014, 20.5% ± 2.0 (1129) in 2015-2016. When compared with children without obesity, the mean serum levels of those with obesity were significantly (p ≤ 0.05) lower for vitamin D (50.3 ± 2.4 vs. 61.5 ± 2.1, p < 0.001), iron (14.4 ± 0.5 vs. 16.4 ± 0.4, p < 0.001), albumin (41.7 ± 0.3 vs. 43.3 ± 0.3, p < 0.001), and MCV (83.8 ± 0.5 vs. 85.8 ± 0.3, p = 0.003). When compared with children without obesity, the mean serum levels of those with obesity were significantly (p ≤ 0.05) higher for total protein (73.2 ± 0.4 vs. 72.0 ± 0.3, p = 0.002), globulin (31.5 ± 0.4 vs. 28.7 ± 0.3, p < 0.001) and hs-CRP (3.5 ± 0.3 vs. 1.2 ± 0.2, p < 0.001). A higher prevalence of insufficiency was found for vitamin D (51.9% ± 5.6 vs. 26.8% ± 3.7, p = 0.001), hemoglobin (16.3% ± 3.1 vs. 7.5% ± 1.8, p = 0.034) and the combination of low hemoglobin + low MCV (11.2% ± 2.9 vs. 3.3% ± 1.0, p = 0.049). All other serum levels were not significantly (p > 0.05) different, with no significant difference in intake.

Conclusion: Results indicate a continued increase in the prevalence of obesity in children. Compared with the non-obese pediatric population, children with obesity showed differences in micro- and macronutrient serum levels, with no significant differences in dietary intake of these nutrients. The higher prevalence of low hemoglobin + low MCV supports iron deficiency and adds clinical relevance to the data surrounding low mean blood levels of iron. Children with obesity show higher mean globulin and hs-CRP levels consistent with an inflammatory state. The results underscore the existence of malnutrition in children with obesity and the need for nutrition awareness in this pediatric population.

Elisha London, BS, RD1; Derek Miketinas, PhD, RD2; Ariana Bailey, PhD, MS3; Thomas Houslay, PhD4; Fabiola Gutierrez-Orozco, PhD1; Tonya Bender, MS, PMP5; Ashley Patterson, PhD1

1Reckitt/Mead Johnson, Evansville, IN; 2Data Minded Consulting, LLC, Houston, TX; 3Reckitt/Mead Johnson Nutrition, Henderson, KY; 4Reckitt/Mead Johnson Nutrition, Manchester, England; 5Reckitt/Mead Johnson Nutrition, Newburgh, IN

Financial Support: None Reported.

Background: The objective was to examine whether nutrient intake varied across malnutrition classification among a nationally representative sample of children and adolescents.

Methods: This was a secondary analysis of children and adolescents 1-18 y who participated in the National Health and Nutrition Examination Survey 2001-March 2020. Participants were excluded if they were pregnant or did not provide at least one reliable dietary recall. The degree of malnutrition risk was assessed by weight-for-height and BMI-for-age Z-scores for 1-2 y and 3-18 y, respectively. As per the Academy of Nutrition and Dietetics/American Society for Parenteral and Enteral Nutrition, malnutrition was classified using Z-scores: none (Z > -1), mild (Z between -1 and -1.9), and moderate/severe (Z ≤ -2). Dietary intake was assessed from one to two 24-hr dietary recalls. Usual intakes of macronutrients from foods and beverages, and of micronutrients from foods, beverages, and supplements were assessed using the National Cancer Institute method. The cut-point approach was used to estimate the proportion of children with intake below the estimated average requirement for micronutrients and outside the acceptable macronutrient distribution range for macronutrients. All analyses were adjusted for sampling methodology and the appropriate sample weights were applied. Independent samples t-tests were conducted to compare estimates within age groups between nutrition status classifications, with no malnutrition as the reference group.
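For readers who want to reproduce the classification step, the following minimal sketch (in Python; the study's actual analyses applied NHANES survey weights and the NCI usual-intake method, which are not shown, and the function name is illustrative) maps a weight-for-height or BMI-for-age z-score to the malnutrition categories defined above.

```python
def classify_malnutrition(z_score: float) -> str:
    """Map an anthropometric z-score to the malnutrition category used in
    this analysis: none (z > -1), mild (z from -1 to -1.9),
    moderate/severe (z <= -2). Values between -1.9 and -2 are treated as
    mild here, since the published cut-points leave that interval unstated.
    """
    if z_score > -1:
        return "none"
    if z_score <= -2:
        return "moderate/severe"
    return "mild"

# Example: a BMI-for-age z-score of -1.4 falls in the mild category.
print(classify_malnutrition(-1.4))  # -> mild
```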

Results: A total of 32,188 participants were analyzed. Of those, 31,689 (98.4%) provided anthropometrics. The majority (91%) did not meet criteria for malnutrition, while 7.4% and 1.6% met criteria for mild and moderate/severe malnutrition, respectively. Dietary supplement use (mean[SE]) was reported among 33.3[0.7]% of all children/adolescents. Of those experiencing moderate/severe malnutrition, inadequate calcium intake was greatest in adolescents 14-18 y (70.6[3.3]%), and older children 9-13 y (69.7[3.4]%), compared to 3-8 y (29.5[3.9]%) and 1-2 y (5.0[1.2]%). In children 9-13 y, risk for inadequate calcium intake was greater among those experiencing mild (65.3[2.1]%) and moderate/severe malnutrition (69.7[2.1]%) compared to those not experiencing malnutrition (61.2[1.1]%). Similarly, in those experiencing moderate/severe malnutrition, inadequate zinc and phosphorus intake was greatest in adolescents 14-18 y (20.9[3.3]% and 30.6[3.4]%, respectively) and 9-13 y (13.4[2.6]% and 33.9[3.6]%, respectively) compared to younger children (range 0.2[0.1]-0.9[0.4]% and 0.2[0.1]-0.4[0.2]% in 1-8 y, respectively). Although the greatest risk for inadequate protein intake was observed among those experiencing moderate/severe malnutrition, percent energy intake from protein and carbohydrate was adequate for most children and adolescents. Total and saturated fat were consumed in excess by all age groups regardless of nutrition status. The greatest risk for excessive saturated fat intake was reported in those experiencing moderate/severe malnutrition (range across all age groups: 85.9-95.1%).

Conclusion: Older children and adolescents experiencing malnutrition were at greatest risk for inadequate intakes of certain micronutrients such as calcium, zinc, and phosphorus. These results may indicate poor diet quality among those at greatest risk for malnutrition, especially adolescents.

Anna Benson, DO1; Louis Martin, PhD2; Katie Huff, MD, MS2

1Indiana University School of Medicine, Carmel, IN; 2Indiana University School of Medicine, Indianapolis, IN

Financial Support: None Reported.

Background: Trace metals are essential for growth and are especially important in the neonatal period. Recommendations are available for intake of these trace metals in the neonate. However, these recommendations are based on limited data and there are few available descriptions regarding trace metal levels in neonates and their influence on outcomes. In addition, monitoring trace metal levels can be difficult as multiple factors, including inflammation, can affect accuracy. The goal of this project was to evaluate serum levels of zinc, selenium and copper and related outcomes, including growth, rate of cholestasis, sepsis, bronchopulmonary dysplasia (BPD), and death, in a parenterally dependent cohort admitted to the neonatal intensive care unit (NICU).

Methods: We completed a retrospective chart review of NICU patients who received parenteral nutrition (PN) and had trace metal panels drawn between January 2016 and February 2023. Charts were reviewed for baseline labs, time on PN, trace metal panel levels, dose of trace metals in PN, enteral feeds and supplements, and outcomes including morbidities and mortality. Sepsis was diagnosed based on a positive blood culture and cholestasis as a direct bilirubin >2 mg/dL. Fisher's exact test or chi-square test was used to assess associations between categorical variables. Spearman correlation was used to assess the correlation between two continuous variables. A p-value of 0.05 was used for significance.

Results: We included 98 patients in the study with demographic data shown in Table 1. The numbers of patients with trace metal elevation and deficiency are noted in Tables 1 and 2, respectively. The correlation between growth and trace metal levels is shown in Figure 1. Patient outcomes related to the diagnosis of trace metal deficiency are noted in Table 2. Copper deficiency was found to be significantly associated with sepsis (p = 0.010) and BPD (p = 0.033). Selenium deficiency was associated with cholestasis (p = 0.001) and BPD (p = 0.003). To further assess the relation of selenium and cholestasis, Spearman correlation noted a significant negative correlation between selenium levels and direct bilirubin levels (p = 0.002; Figure 2).

Conclusion: Trace metal deficiency was common in our population. In addition, selenium and copper deficiency were associated with neonatal morbidities including sepsis, cholestasis, and BPD. When assessing selenium deficiency and cholestasis, the measured selenium level was found to correlate with direct bilirubin level. While there was correlation between trace metal levels and growth, the reason for the negative association noted is unclear and highlights the need for further assessment to determine the influence of other patient factors and the technique used for growth measurement. Overall, this project highlights the important relation between trace metals and neonatal morbidities. Further research is needed to understand how trace metal supplementation might be used to optimize neonatal outcomes in the future.

Table 1. Patient Demographic and Outcome Information for Entire Population Having Trace Metal Levels Obtained. (Total population n = 98 unless noted).


Table 2. Rate of Trace Metal Deficiency and Association With Patient Outcomes (Total n = 98).


Scatter plots of average trace metal level and change in growth over time (growth represented as change in parameter over days). The Spearman correlation coefficient is listed for each graph, and significance is noted by symbol: *p-value < 0.05, †p-value < 0.01, ‡p-value < 0.001.

Figure 1. Correlation of Trace Metal Level and Growth.

Scatter plot of individual direct bilirubin levels plotted against selenium levels. A negative Spearman correlation was noted (p = 0.002).

Figure 2. Correlation of Selenium Level With Direct Bilirubin Level.

Kaitlin Berris, RD, PhD (student)1; Qian Zhang, MPH2; Jennifer Ying, BA3; Tanvir Jassal, BSc3; Rajavel Elango, PhD4

1BC Children's Hospital, North Vancouver, BC; 2BCCHR, Vancouver, BC; 3University of British Columbia, Vancouver, BC; 4UBC/BCCHR, Vancouver, BC

Financial Support: None Reported.

Background: Pediatric critical illness causes increased demand for several nutrients. Children admitted who require nutrition support receive enteral nutrition (EN) formula as liquid nutrition via a naso-gastric tube. These formulas are developed using dietary reference intakes (DRI) for healthy populations and do not account for altered nutrient needs in critical illness. Imbalanced nutrient delivery during this vulnerable time could lead to malnutrition and negatively impact hospital outcomes. Published guidelines by the American Society for Parenteral and Enteral Nutrition (ASPEN) in 2017 aim to alleviate poor nutrition: use concentrated formula in fluid restriction, meet 2/3 of estimated calories (resting energy expenditure, REE) by the end of the first week, provide a minimum of 1.5 g/kg/day of dietary protein, and provide the DRI for each micronutrient. The objective of this retrospective cohort study was to evaluate nutrition delivery (prescribed vs. delivered) compared to the 2017 guidelines, and to correlate delivery with adequacy of intake in children admitted to a Canadian PICU.

Methods: Three years of charts were included across two retrospective cohorts: September 2018 to December 2020 and February 2022 to March 2023. The first cohort, paper chart based, included children 1-18 y with tube feeding started within 3 d after admission. The second cohort, after transition to electronic medical records, included children 1-6 y on exclusive tube feeding during the first week of admission. Patient characteristics, daily formula type, rate prescribed (physician order), amount delivered (nursing notes), and interruption hours and reasons were collected. Statistical analysis included descriptive analysis of characteristics and logistic regression for odds of achieving adequacy of intake with two exposures: age categories and formula type. Pearson correlation was used to relate interruption hours to the percentage of calorie goals met.

Results: The included patients (n = 86) spanned 458 nutrition support days (NSD). Admissions were predominantly for respiratory disease (73%), requiring ventilation support (81.4%). The calorie prescription (WHO REE equation) was met in 20.3% of NSD, and 2/3 of the calorie recommendation was met in 43.9% (Table 1). Concentrated calories were provided in 34% of patients. Hours of interruptions vs. percentage of goal calories met was negatively correlated (r = -0.52, p = 0.002) among those ordered EN without prior EN history (i.e., home tube feeding). More than 4 h of interruptions made it more likely that the 2/3 calorie goal was not met. Calorie goals were met when standard pediatric formula was used in children 1-8 y and concentrated adult formula in 9-18 y. Odds of meeting the calorie goal increased by 85% per 1-day increase (OR 1.85 [1.52, 2.26], p < .0001), with a median of 4 d after admission to meet 2/3 of calories. Minimum protein intake (1.5 g/kg/d) was met in only 24.9% of all NSD. Micronutrients examined, except for vitamin D, met the DRI based on prescribed amounts. Delivered amounts provided suboptimal micronutrient intake, especially for vitamin D (Figure 1).

Conclusion: Current ASPEN recommendations for EN were not being achieved in critically ill children. Concentrated formula was often not chosen, which decreased the ability to meet calorie goals in younger patients. Prescribing a shorter continuous EN duration (20/24 h) may improve the odds of meeting calorie targets. Evaluation of NSD showed an improving trend in calorie intake over the first week toward meeting the 2/3 goal recommendation. However, results highlight inadequacy of protein even as calorie needs are increasingly met. Attention to micronutrient delivery, with supplementation of vitamin D, is required.

Table 1. Enteral Nutrition Characteristics Per Nutrition Support Days. (N = 458)

Estimated Vitamin D Intake and 95% Confidence Intervals by Age and Formula Groups.

Figure 1. Estimated Vitamin D Intake by Age and Formula Groups.

Dana Steien, MD1; Megan Thorvilson, MD1; Erin Alexander, MD1; Molissa Hager, NP1; Andrea Armellino, RDN1

1Mayo Clinic, Rochester, MN

Financial Support: None Reported.

Background: Home parenteral nutrition (HPN) is a life-sustaining therapy for children with long-term digestive dysfunction. Historically, HPN has been considered a bridge to enteral autonomy or intestinal transplantation. However, thanks to medical and management improvements, HPN is now used for a variety of diagnoses, including intractable feeding intolerance (IFI) in children with severe neurological impairment (SNI). IFI occurs most often near the end of life (EOL) in patients with SNI. Thus, outpatient planning and preparation for HPN in this population differ vastly from historical HPN use.

Methods: Case series of four pediatric patients with SNI who developed IFI and utilized HPN during their EOL care. Data was collected by retrospective chart review. The hospital pediatric palliative care service was heavily involved in the patients' care when HPN was discussed and planned. The pediatric intestinal rehabilitation (PIR) and palliative care teams worked closely together during discharge planning and throughout the outpatient courses.

Results: The children with SNI in this case series developed IFI between ages 1 and 12 years. Duration of HPN use varied from 5 weeks to 2 years. All patients were enrolled in hospice, but at various stages. Routine outpatient HPN management plans and expectations were modified based on each family's goals of EOL care. The use and timing of laboratory studies, fever plans, central line issues, growth, and follow-up appointments required detailed discussion and planning.

Conclusion: EOL care for children differs from most EOL care in adults. Providing HPN to children with SNI and IFI can offer time, opportunities, and peace for families during their child's EOL journey, if it aligns with their EOL goals. PIR teams can provide valuable HPN expertise for palliative care services and families during these challenging times.

Jessica Lowe, DCN, MPH, RDN1; Carolyn Ricciardi, MS, RD2; Melissa Blandford, MS, RD3

1Nutricia North America, Roseville, CA; 2Nutricia North America, Rockville, MD; 3Nutricia North America, Greenville, NC

Financial Support: This study was conducted by Nutricia North America.

Background: Extensively hydrolyzed formulas (eHFs) are indicated for the management of cow milk allergy (CMA) and related symptoms. This category of formulas is often associated with an unpleasant smell and bitter taste; however, whey-based eHFs are considered more palatable than casein-based eHFs.1-4 The inclusion of lactose, the primary carbohydrate in human milk, in hypoallergenic formula can also promote palatability. Historically, concerns for residual protein traces in lactose have resulted in complete avoidance of lactose in CMA. However, “adverse reactions to lactose in CMA are not supported in the literature, and complete avoidance of lactose in CMA is not warranted.”5 Clinicians in the United Kingdom previously reported that taste and acceptance of an eHF is an important consideration when prescribing, as they believe a palatable eHF may result in decreased formula refusal and lead to more content families.1 The objective of this study was to understand caregiver sensory perspectives on an infant whey-based eHF containing lactose.

Methods: Fifteen clinical sites were recruited from across the United States. Clinicians enrolled 132 infants, whose families received the whey-based eHF for 2 weeks based on clinician recommendation. Caregivers completed two surveys: an enrollment survey and a 2-week post-survey characterizing eHF intake, CMA-related symptoms, stooling patterns, sensory perspectives, and satisfaction with the eHF. Data was analyzed using SPSS 27 and descriptive statistics.

Results: One hundred and twenty-two infants completed the study. At enrollment, infants were 22 (± 14.7) weeks old. Prior to study initiation, 12.3% of infants were breastfed, 40.2% were on a casein-based eHF, and 18.9% were on a standard formula with intact proteins. Most patients (97.5%) were fed orally and 2.5% were tube fed. Among all parents who responded, 92.5% (n = 86/93) reported better taste and 88.9% (n = 96/108) reported better smell for the whey-based formula containing lactose compared to the previous formula. For caregivers whose child was on a casein-based eHF at enrollment and who responded, 97.6% (n = 41/42) reported better taste and 95.7% (n = 44/46) reported better smell than the previous formula. Additional caregiver-reported perceptions of taste and smell are reported in Figure 1 and Figure 2, respectively. Finally, 89.3% of caregivers said it was easy to start their child on the whey-based eHF containing lactose and 91.8% would recommend it to other caregivers whose child requires a hypoallergenic formula.

Conclusion: The majority of caregivers had a positive sensory experience with the whey-based eHF containing lactose compared to the baseline formulas. Additionally, they found the trial formula easy to transition to and would recommend it to other families. These data support the findings of Maslin et al. and support the clinicians' expectation that good palatability would result in better acceptance and more content infants and families.1 Further research is needed to better understand how improved palatability can contribute to decreased waste and health care costs.

Figure 1. Caregiver Ranking: Taste of Whey-Based, Lactose-Containing eHF.

Figure 2. Caregiver Ranking: Smell of Whey-Based, Lactose-Containing eHF.

Michele DiCarlo, PharmD1; Emily Barlow, PharmD, BCPPS1; Laura Dinnes, PharmD, BCIDP1

1Mayo Clinic, Rochester, MN

Financial Support: None Reported.

Background: There is limited information on hyperkalemia in adult patients who received trimethoprim-sulfamethoxazole (TMP-SMX). The mechanism for hyperkalemia is related to the TMP component, which is structurally related to the potassium-sparing diuretic amiloride. In children, there is no information on the clinical impact or the monitoring required. We noted that a pediatric patient on total parenteral nutrition (TPN) had a drop in the TPN potassium dosing once TMP-SMX was started. This reduction remained for two weeks following the last dose of the antibiotic.

Case presentation: A 7-month-old was in a cardiac intensive care unit following complex surgical procedures and extracorporeal membrane oxygenation requirement. TPN was started due to concerns related to poor perfusion and possible necrotizing enterocolitis. TPN continued for a total of one hundred and ten days. Per protocol while on TPN, electrolytes and renal function (urine output, serum creatinine) were monitored daily. Diuretic therapy, including loop diuretics, chlorothiazide and spironolactone, was prescribed prior to TPN and continued for the entire duration. Renal function remained stable for the duration of TPN therapy. Dosing of potassium in the TPN was initiated per ASPEN guidelines and adjusted for serum potassium levels. Due to respiratory requirement and positive cultures, TMP-SMX was added to the medication regimen on two separate occasions. TMP-SMX 15 mg/kg/day was ordered twelve days after the start of the TPN and continued for three days. TMP-SMX 15 mg/kg/day was again started on day forty-three of TPN and continued for a five-day duration. Serum potassium was closely monitored for adjustments once TMP-SMX was started. When the TPN was successfully weaned off, we reviewed this information again. There was an obvious drop in TPN potassium dosing by day two of both TMP-SMX regimens, and dosing did not return to the prior stable level until approximately two weeks after the last dose of the antibiotic. This reduction lasted far beyond the projected half-life of TMP-SMX (Table 1).

Discussion: TMP-SMX is known for potential hyperkalemia in adult patients with multiple confounding factors. Factors include high dosage, renal dysfunction, congestive heart failure, and concomitant medications known to cause hyperkalemia. Little literature exists to note this side effect in pediatrics. The onset of our patient's increased serum potassium levels, and the concurrent decrease in TPN dosing, could be expected, as TMP-SMX's time to peak effect is 1-4 hours. The half-life of TMP in children < 2 years old is 5.9 hours. Given this information, one would expect TMP-SMX to be cleared approximately thirty hours from the last dose administered. Our patient's potassium dosing took approximately two weeks from the end of TMP-SMX administration to return to the pre-TMP-SMX potassium dosing for both treatment regimens. Potential causes for the extended time to stabilize include concurrent high-dose TMP-SMX and continuation of the potassium-sparing diuretic. Prolonged potassium monitoring for pediatric patients started on high-dose TMP-SMX while on TPN should be considered and further evaluation explored.
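The roughly thirty-hour clearance estimate is consistent with the common pharmacokinetic rule of thumb that a drug is essentially eliminated after about five half-lives; the quick check below is our own illustration (Python), not a calculation stated in the abstract.

```python
half_life_hours = 5.9        # reported TMP half-life in children < 2 years old
half_lives_to_clear = 5      # rule of thumb: ~97% of drug eliminated after 5 half-lives
print(half_life_hours * half_lives_to_clear)  # 29.5, i.e., approximately 30 hours
```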

Methods: None Reported.

Results: None Reported.

Conclusion: None Reported.

Graph representing TPN potassium dose (in mEq/kg/day) and the addition of a TMP-SMX regimen on two separate occasions. Note the drop in TPN potassium dose and the delayed return after each TMP-SMX regimen.

Figure 1. TPN Potassium Dose and TMP-SMX Addition.

Jennifer Smith, MS, RD, CSP, LD, LMT1; Praveen Goday, MBBS2; Lauren Storch, MS, RD, CSP, LD2; Kirsten Jones, RD, CSP, LD2; Hannah Huey, MDN2; Hilary Michel, MD2

1Nationwide Children's Hospital, Dresden, OH; 2Nationwide Children's Hospital, Columbus, OH

Financial Support: North American Society for Pediatric Gastroenterology, Hepatology, and Nutrition (NASPGHAN) Foundation.

Background: The prevalence of Avoidant/Restrictive Food Intake Disorder (ARFID) is approximately 3% in the general population and 10% in adults with inflammatory bowel disease (IBD). Up to 10% of adolescents with IBD have reported disordered eating behaviors; however, there have been no prospective studies on the prevalence of ARFID eating behaviors in this population.

Methods: This is a one-time, cross-sectional, non-consecutive study of English-speaking patients with a confirmed diagnosis of IBD, aged 12-18 years. Participants completed the validated Nine Item ARFID Screen (NIAS), the SCOFF eating disorder screen (this screen utilizes an acronym [Sick, Control, One, Fat, and Food] in relation to the five questions on the screen) and answered one question about perceived food intolerances. The NIAS is organized into the three specific ARFID domains: eating restriction due to picky eating, poor appetite/limited interest in eating, and fear of negative consequences from eating, each of which is addressed by three questions. Questions are based on a 6-point Likert scale. Participants scoring ≥23 on the total scale or ≥12 on an individual subscale were considered to meet criteria for ARFID eating behaviors. Since some individuals with a positive NIAS screen may have anorexia nervosa or bulimia, we also used the SCOFF questionnaire to assess the possible presence of an eating disorder. A score of two or more positive answers has a sensitivity of 100% and specificity of 90% for anorexia nervosa or bulimia. Apart from descriptive statistics, Chi-square testing was used to study the prevalence of malnutrition, positive SCOFF screening, or food intolerances in patients with and without positive NIAS screens.
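To make the screening thresholds concrete, the sketch below (Python; the item scores are hypothetical, and it assumes the usual NIAS convention of scoring each of the nine items 0-5, so each three-item subscale ranges 0-15 and the total ranges 0-45) applies the positivity rule used in this study: total score ≥ 23 or any subscale score ≥ 12.

```python
# Hypothetical NIAS responses: three items per ARFID domain, each scored 0-5.
responses = {
    "picky_eating": [4, 5, 3],
    "poor_appetite": [2, 1, 2],
    "fear_of_consequences": [5, 4, 4],
}

subscale_scores = {domain: sum(items) for domain, items in responses.items()}
total_score = sum(subscale_scores.values())

# Positivity rule used in this study: total >= 23 or any subscale >= 12.
nias_positive = total_score >= 23 or any(s >= 12 for s in subscale_scores.values())

print(subscale_scores)  # {'picky_eating': 12, 'poor_appetite': 5, 'fear_of_consequences': 13}
print(total_score)      # 30
print(nias_positive)    # True
```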

Results: We enrolled 82 patients whose demographics are shown in Table 1. Twenty percent (16/82) scored positive on the NIAS questionnaire, 16% (13/82) scored positive on the SCOFF questionnaire, and 48% (39/82) noted food intolerances. Of the 16 participants who scored positive on the NIAS, 50% (8/16) were male, 56% (9/16) had a diagnosis of Crohn's Disease, and 69% (11/16) had inactive disease. Twenty-five percent of those with a positive NIAS (4/16) met criteria for malnutrition (1 mild, 2 moderate, and 1 severe). Sixty-nine percent of those who scored positive on the NIAS (11/16) noted food intolerances and 30% (5/16) had a positive SCOFF screener. The prevalence of malnutrition (p = 0.4), positive SCOFF screening (p = 0.3), and reported food intolerances (p = 0.6) was similar between participants who scored positive on the NIAS and those who did not.

Conclusion: Using the NIAS, 20% of adolescents with IBD met criteria for ARFID. Participants were no more likely to have malnutrition, a positive score on the SCOFF eating disorder screen, or reported food intolerances whether or not they met criteria for ARFID. Routine screening of adolescents with IBD for ARFID or other eating disorders may identify patients who would benefit from further evaluation.

Table 1. Demographics.

Qian Wen Sng, RN1; Jacqueline Soo May Ong2; Sin Wee Loh, MB BCh BAO (Ireland), MMed (Paed) (Spore), MRCPCH (RCPCH, UK)1; Joel Kian Boon Lim, MBBS, MRCPCH, MMed (Paeds)1; Li Jia Fan, MBBS, MMed (Paeds), MRCPCH (UK)3; Rehena Sultana4; Chengsi Ong, BS (Dietician), MS (Nutrition and Public Health), PhD1; Charlotte Lin3; Judith Ju Ming Wong, MB BCh BAO, LRCP & SI (Ireland), MRCPCH (Paeds) (RCPCH, UK)1; Ryan Richard Taylor3; Elaine Hor2; Pei Fen Poh, MSc (Nursing), BSN1; Priscilla Cheng2; Jan Hau Lee, MCI, MRCPCH (UK), USMLE, MBBS1

1KK Hospital, Singapore; 2National University Hospital, Singapore; 3National University Hospital Singapore, Singapore; 4Duke-NUS Graduate Medical School, Singapore

Financial Support: This work is supported by the National Medical Research Council, Ministry of Health, Singapore.

Background: Protein-energy malnutrition is pervasive in pediatric intensive care unit (PICU) patients. There remains clinical equipoise on the impact of protein supplementation in critically ill children. Our primary aim was to determine the feasibility of conducting a randomized controlled trial (RCT) of protein supplementation versus standard enteral nutrition (EN) in the PICU.

Methods: An open-labelled pilot RCT was conducted from January 2021 to June 2024 in 2 tertiary pediatric centers in Singapore. Children with body mass index (BMI) z-score < 0, who were expected to require invasive or non-invasive mechanical ventilation for at least 48 hours and required EN support for feeding were included. Patients were randomized (1:1 allocation) to protein supplementation of ≥1.5 g/kg/day in addition to standard EN or standard EN alone for 7 days after enrolment or until discharge to the high dependency unit, whichever was earlier. Feasibility was based on 4 outcomes: effective screening (>80% of eligible patients approached for consent), satisfactory enrolment (>1 patient/center/month), timely protocol implementation (>80% of participants receiving protein supplementation within the first 72 hours) and protocol adherence (receiving >80% of protein supplementation as per protocol).

Results: A total of 20 patients were recruited - 10 (50.0%) in the protein supplementation group and 10 (50.0%) in the standard EN group. Median age was 13.0 [interquartile range (IQR) 4.2, 49.1] months. Respiratory distress was the most common reason for PICU admission [11 (55.0%)]. Median PICU and hospital lengths of stay were 8.0 (IQR 4.5, 16.5) and 19.0 (IQR 11.5, 36.5) days, respectively. There were 3 (15%) deaths, none of which were related to the trial intervention. The screening rate was 50/74 (67.6%). Mean enrollment was 0.45 patients/center/month. Timely protocol implementation was achieved in 15/20 (75%) participants. Protocol adherence was achieved on 11/15 (73.3%) of protein supplementation days.

Conclusion: Satisfactory feasibility outcomes were not met in this pilot RCT. Based on the inclusion criteria of this pilot study and the setup of the centers, a larger study in Singapore alone will not be feasible. With incorporation of revised logistic arrangements, a larger multi-center feasibility study involving regional countries should be piloted.

Veronica Urbik, MD1; Kera McNelis, MD1

1Emory University, Atlanta, GA

Financial Support: None Reported.

Background: Many clinical management and physiologic knowledge gaps are present in the care of the tiny baby population (neonates born at ≤23 weeks gestation, also labelled periviable gestational age). There is therefore significant center-specific variation in practice, as well as increased morbidity and mortality observed in infants born at 22-23 weeks compared to those born at later gestational ages1. The importance of nutrition applies to tiny babies, and appropriate nutrition is perhaps even more critical in this population than in others. Numerous studies have demonstrated the benefits of full early enteral feeding, including decreased rates of central line-associated bloodstream infection and cholestasis2,3. The risk of developing necrotizing enterocolitis (NEC) is a balancing measure for advancement of enteral feeds4. Adequate nutrition is critical for growth, reducing morbidity and mortality, and improving overall outcomes. Current proposed protocols for this population target full enteral feeding volumes to be reached by 10-14 days of life5.

Methods: From baseline data collected at two Level III neonatal intensive care units (NICUs) attended by a single group of academic neonatology faculty from January 2020 to January 2024, the average time from birth to achievement of full enteral feeds was 31 days. Using quality improvement (QI) methodology, we identified the barriers to advancement to full enteral feeds (defined as 120 cc/kg/day) in babies born at 22- and 23-weeks gestational age admitted to the pediatric resident-staffed Level III NICU.

Results: The Pareto chart displays the primary barriers, including undefined critical illness, vasopressor use, evaluation for NEC or spontaneous intestinal perforation, patent ductus arteriosus treatment, and electrolyte derangements (Figure 1). In many cases, no specific reason for not advancing toward full enteral feeds could be identified in chart review.

Conclusion: In this ongoing QI project, our SMART aim is to reduce the number of days to reach full feeds by 10% over the period of January 2024 to June 2025, informed by root cause analysis and the key driver diagram developed from the data thus far (Figure 2). The first plan-do-study-act cycle started on January 16, 2024. Data are analyzed using statistical process control methods.

Figure 1. Pareto Chart.

Figure 2. Key Driver Diagram.

Bridget Hron, MD, MMSc1; Katelyn Ariagno, RD, LDN, CNSC, CSPCC1; Matthew Mixdorf1; Tara McCarthy, MS, RD, LDN1; Lori Hartigan, ND, RN, CPN1; Jennifer Lawlor, RN, BSN, CPN1; Coleen Liscano, MS, RD, CSP, LDN, CNSC, CLE, FAND1; Michelle Raymond, RD, LDN, CDCES1; Tyra Bradbury, MPH, RD, CSP, LDN1; Erin Keenan, MS, RD, LDN1; Christopher Duggan, MD, MPH1; Melissa McDonnell, RD, LDN, CSP1; Rachel Rosen, MD, MPH1; Elizabeth Hait, MD, MPH1

1Boston Children's Hospital, Boston, MA

Financial Support: Some investigators received support from agencies including National Institutes of Health and NASPGHAN which did not directly fund this project.

Background: The widespread shortage of amino acid-based formula in February 2022 highlighted the need for urgent and coordinated hospital response to disseminate accurate information to front-line staff in both inpatient and ambulatory settings.

Methods: An interdisciplinary working group consisting of pediatric gastroenterologists, dietitians, nurses and a quality improvement analyst was established in September 2022. The group met at regular intervals to conduct a needs assessment for all disciplines. The group developed and refined a novel clinical process algorithm to respond to reports of formula shortages and/or recalls. The Clinical Process Map is presented in Figure 1. Plan-do-study-act cycles were implemented to improve the quality of the communication output based on staff feedback. The key performance indicator is time from notification of possible shortage to dissemination of communication to stakeholders, with a goal of < 24 hours.

Results: From September 2022 to August 2024, the group met 18 times for unplanned responses to formula recall/shortage events. Email communication was disseminated within 24 hours for 8/18 (44%) events, within 48 hours for 9/18 (50%), and after 48 hours for 1/18 (6%). Iterative changes included the initiation of an urgent huddle for key stakeholders to identify impact and substitution options; development of a preliminary investigation pathway to ensure the validity of reports; development of a structured email format that was further refined to a table format including images of products (Figure 2); and creation of an email distribution list to disseminate shortage reports. The keys to this project's success were the creation of a multidisciplinary team dedicated to meeting urgently for all events, and the real-time drafting and approval of communication within the meeting. Of note, the one communication which was substantially delayed (94.7 hours) was addressed over email only, underscoring the importance of the multidisciplinary working meeting.

Conclusion: Establishing clear lines of communication and assembling key stakeholders resulted in timely, accurate and coordinated communication regarding nutrition recalls/shortage events at our institution.

Figure 1. Formula Recall Communication Algorithm.

Figure 2.

Nicole Misner, MS, RDN1; Michelle Yavelow, MS, RDN, LDN, CNSC, CSP1; Athanasios Tsalatsanis, PhD1; Racha Khalaf, MD, MSCS1

1University of South Florida, Tampa, FL

Financial Support: None Reported.

Background: Early introduction of peanuts and eggs decreases the incidence of peanut and egg allergies in infants who are at high risk of developing food allergies. Prevention guidelines and clinical practice have shifted with recent studies. However, little is known about introducing common food allergens in infants fed via enteral feeding tubes. Early introduction of allergens could be of importance in infants working towards a tube feeding wean and in those who may benefit from blended tube feeding in the future. We aimed to compare the characteristics of patients with enteral tubes who received education during their gastroenterology visit versus those who did not.

Methods: We performed a single-center retrospective chart review involving all patients 4 to 24 months of age with an enteral feeding tube seen at the University of South Florida Pediatric Gastroenterology clinic from August 2020 to July 2024. Corrected age was used for infants born < 37 weeks' gestational age. All types of enteral feeding tubes were included (i.e., nasogastric, gastrostomy, and gastrostomy-jejunostomy tubes). Data on demographics, clinical characteristics, and parent-reported food allergen exposure were collected. An exception waiver was received from the University of South Florida Institutional Review Board for this retrospective chart review. Differences between patients who received education and those who did not were evaluated using Student's t-test for continuous variables and the chi-square test for categorical variables. All analysis was performed using R Statistical Software (v4.4.2). A p value ≤ 0.05 was considered statistically significant.

Results: A total of 77 patients met inclusion criteria, with 349 total visits. Patient demographics at each visit are shown in Table 1. There was a documented food allergy in 12% (43) of total visits. Education on early introduction of common food allergens was provided in 12% (42) of total visits. Patients who received education at their visit were significantly younger than those who did not and were also more likely to have eczema. Table 2 compares nutrition characteristics at visits where education was discussed vs. those where it was not. Infants with any percentage of oral intake were more likely to have received education than those who were nil per os (p = 0.013). There was a significant association between starting solids and receiving education (p < 0.001). Reported allergen exposure across all visits was low. For visits with the patient < 8 months of age (n = 103), only 6% (6) reported peanut and 8% (8) egg exposure. Expanded to < 12 months of age at the time of visit (n = 198), there was minimal increase in reported allergen exposure: 7% (14) reported peanut and 8% (15) egg exposure. Oral feeds were the most commonly reported form of allergen exposure. Only one patient received a commercial early allergen introduction product. Cow's milk was the most reported allergen exposure, with 61% (63) under 8 months and 54% (106) under 12 months of age, the majority from infant formula.

Conclusion: Age and any proportion of oral intake were associated with receiving education on common food allergen introduction at the visit. However, there were missed opportunities for education in infants with enteral feeding tubes. Few visits included patients with reported peanut or egg exposure. Further research and national guidelines are needed on optimal methods of allergen introduction in this population.

Table 1. Demographics.

Table 2. Nutrition Characteristics.

Samantha Goedde-Papamihail, MS, RD, LD1; Ada Lin, MD2; Stephanie Peters, MS, CPNP-PC/AC2

1Nationwide Children's Hospital, Grove City, OH; 2Nationwide Children's Hospital, Columbus, OH

Financial Support: None Reported.

Background: Critically ill children with severe acute kidney injury (AKI) requiring continuous renal replacement therapy (CRRT) are at high risk for micronutrient deficiencies, including vitamin C (VC). VC acts as an antioxidant and enzyme cofactor. It cannot be made endogenously and must be derived from the diet. Critically ill patients often enter the hospital with some degree of nutritional debt and may have low VC concentrations upon admission. This is exacerbated in the case of sepsis, multi-organ dysfunction, burns, etc., when VC needs are higher to address increased oxidative and inflammatory stress. When these patients develop a severe AKI requiring CRRT, their kidneys do not reabsorb VC as healthy kidneys would, further increasing losses. Moreover, VC is a small molecule and is filtered out of the blood via CRRT. The combination of increased VC losses and elevated VC requirements in critically ill patients on CRRT results in a high risk of VC deficiency. The true prevalence of VC deficiency in this population is not well known, whereas the prevalence of deficiency is, on average, 5.9% in the general population and 18.3% in critically ill children. The aim of this study is to ascertain the prevalence of VC deficiency in critically ill pediatric patients on CRRT by monitoring serum VC levels throughout their CRRT course and correcting noted deficiencies.

Methods: An observational study was conducted from May 2023 through August 2024 in a 54-bed high-acuity PICU offering ECMO, CRRT, Level 1 trauma, burn care, solid organ transplant and bone marrow transplantation as tertiary care services. Fifteen patients were identified upon initiation of CRRT and followed until transfer from the PICU. Serial serum VC levels were checked after 5-10 days on CRRT and rechecked weekly thereafter as appropriate. Deficiency was defined as VC concentrations < 11 umol/L. Inadequacy was defined as concentrations between 11-23 umol/L. Supplementation was initiated for levels < 23 umol/L; the dose varied from 250 to 500 mg/d depending on age and clinical situation. Most deficient patients received 500 mg/d supplementation. Those with inadequacy received 250 mg/d.
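A minimal sketch of the cut-points described above (Python; the helper names are illustrative, and the dose mapping reflects only the typical pattern reported here, since actual dosing was individualized by age and clinical situation):

```python
def classify_vitamin_c(level_umol_per_l: float) -> str:
    """Classify a serum vitamin C level using the study's cut-points:
    deficient < 11 umol/L, inadequate 11-23 umol/L, otherwise normal."""
    if level_umol_per_l < 11:
        return "deficient"
    if level_umol_per_l <= 23:
        return "inadequate"
    return "normal"

def typical_supplement_mg_per_day(status: str) -> int:
    """Typical daily dose reported in this cohort (most deficient patients
    received 500 mg/d, those with inadequacy 250 mg/d); actual dosing was
    individualized."""
    return {"deficient": 500, "inadequate": 250}.get(status, 0)

level = 9.0  # hypothetical serum vitamin C level in umol/L
status = classify_vitamin_c(level)
print(status, typical_supplement_mg_per_day(status))  # deficient 500
```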

Results: Of 15 patients, 9 had VC deficiency and 4 had VC inadequacy (Figure 1). Of those with deficiency, 5 of 9 patients were admitted for septic shock (Figure 2). VC level was rechecked in 8 patients; the level returned to normal in 5 patients, and 4 of those 5 received 500 mg/d supplementation. Levels remained low in 3 patients; all received 250 mg/d supplementation (Figure 3). Supplementation dose changes are noted in Figure 4.

Conclusion: VC deficiency was present in 60% of CRRT patients, suggesting deficiency is more common in this population and that critically ill patients on CRRT are at higher risk of developing deficiency than those who are not receiving CRRT. Septic shock and degree of VC deficiency appeared to be correlated; 56% of deficient patients were admitted with septic shock. Together, this suggests a need to start supplementation earlier, perhaps upon CRRT initiation or upon PICU admission in a septic patient, and to utilize higher supplementation doses, as our patients with low VC levels at their follow-up check were all receiving 250 mg/d. Study limitations include (1) the potential for VC deficiency prior to CRRT initiation, with critical illness and/or poor intake as confounding factors, and (2) the small sample size. Our results suggest that critically ill children with AKI requiring CRRT are at increased risk of VC deficiency while on CRRT. Future research should focus on identifying at-risk micronutrients for this patient population and creating supplementation regimens to prevent the development of deficiencies. Our institution is currently crafting a quality improvement project with these aims.

Figure 1. Initial Serum Vitamin C Levels of Our Patients on CRRT, Obtained 5-10 Days After CRRT Initiation (N = 15).

Figure 2. Underlying Disease Process of Patients on CRRT (N = 15).

Figure 3. Follow Up Vitamin C Levels After Supplementation (N = 8), Including Supplementation Regimen Prior to Follow-Up Lab (dose/route).

Figure 4. Alterations in Supplementation Regimen (dose) Based on Follow-up Lab Data (N = 6).

Tanner Sergesketter, RN, BSN1; Kanika Puri, MD2; Emily Israel, PharmD, BCPS, BCPPS1; Ryan Pitman, MD, MSc3; Elaina Szeszycki, BS, PharmD, CNSC2; Ahmad Furqan Kazi, PharmD, MS1; Ephrem Abebe, PhD1

1Purdue University College of Pharmacy, West Lafayette, IN; 2Riley Hospital for Children at Indiana University Health, Indianapolis, IN; 3Indiana University, Indianapolis, IN

Financial Support: The Gerber Foundation.

Background: During the hospital-to-home transition period, family members or caregivers of medically complex children are expected to assume the responsibility of managing medication and feeding regimens for the child under their care. However, this transition period represents a time of high vulnerability, including the risk of communication breakdowns and lack of tools tailored to caregivers’ context. This vulnerability is further heightened when a language barrier is present as it introduces additional opportunities for misunderstandings leading to adverse events and errors in the home setting. Hence, it is critical that caregivers are educated to develop skills that aid in successful implementation of post-discharge care plans. Addressing unmet needs for caregivers who use languages other than English (LOE) requires an in-depth understanding of the current challenges associated with educating and preparing caregivers for the post-discharge period.

Methods: In this prospective qualitative study, healthcare workers (HCWs) were recruited from a tertiary care children's hospital in central Indiana and were eligible to participate if they were involved directly or indirectly in preparing and assisting families of children under three years of age with medication and feeding needs around the hospital discharge period and/or outpatient care following discharge. Each HCW completed a brief demographic survey followed by observation on the job for 2-3 hours, and participated in a follow-up semi-structured interview using video conferencing technology to expand on observed behavior. Handwritten field notes were taken during the observations, which were immediately typed and expanded upon post-observation. Audio recordings from the interviews were transcribed and de-identified. Both the typed observation notes and interview transcripts were subjected to thematic content analysis, which was completed using the Dedoose software.

Results: Data collection is ongoing with anticipated completion in October 2024. Fourteen HCW interviews have been completed to date, with a target sample of 20-25 participants. The preliminary analysis presented here is from transcripts of seven interviews. Participants included six females and one male with a mean age of 35.4 years (range, 24 - 59). HCWs were from diverse inpatient and outpatient clinical backgrounds including registered dieticians, physicians, pharmacists, and nurses. Four overarching themes describe the challenges that HCWs experience when communicating with caregivers who use LOE during the hospital-to-home transition. These themes include lack of equipment and materials in diverse languages, challenges with people and technologies that assist with translating information, instructions getting lost in translation/uncertainty of translation, and difficulty getting materials translated in a timely manner. Main themes, subthemes, and examples are presented in Figure 1, and themes, subthemes, and quotes are presented in Table 1.

Conclusion: The study is ongoing; however, based on the preliminary analysis, it is evident that the systems and processes in place to aid communication between HCWs and caregivers who use LOE can be improved. This can ultimately lead to improved quality of care provided to caregivers who use LOE during the hospital-to-home transition and resultant safer care in the home setting for medically complex children.

Table 1. Themes, Subthemes, and Quotes.

Figure 1. Main Themes, Subthemes, and Examples.

Ruthfirst Ayande, PhD, MSc, RD1; Shruti Gupta, MD, NABBLM-C1; Sarah Taylor, MD, MSCR1

1Yale University, New Haven, CT

Financial Support: None Reported.

Background: Growth faltering among preterm neonates admitted to NICUs may be managed with hypercaloric and/or hypervolemic feeding. However, in intractable cases of growth faltering or when fluid restriction is indicated, fortification with extensively hydrolyzed liquid protein may offer a solution to meeting the high protein demands for infant growth while limiting overfeeding. Yet, there is limited clinical data regarding supplementation with liquid protein, leaving clinicians to make decisions about dosing and duration on a case-by-case basis.

Methods: We present a case of neonatal growth faltering managed with a liquid protein modular in a level IV NICU in North America.

Results: Case Summary: A male infant was born extremely preterm (GA: 24 1/7) and admitted to the NICU for respiratory distress, requiring intubation. The NICU course was complicated by patent ductus arteriosus (PDA), requiring surgery on day of life (DOL) 31, and severe bronchopulmonary dysplasia. Birth anthropometrics: weight: 0.78 kg; height: 31.5 cm. TPN was initiated at birth, with trophic feeds of donor human milk per gavage (PG) for a total provision of 117 ml/kg, 75 kcal/kg, and 3.5 gm/kg of protein. The regimen was advanced per unit protocol; however, on DOL 5, total volume was decreased in the setting of metabolic acidosis and significant PDA. PG feeds of maternal milk were fortified to 24 kcal/oz on DOL 23, and the infant reached full feeds on DOL 26. Feed provision by DOL 28 was ~144 ml/kg/day and 4 g/kg of protein based on estimated dry weight. TPN was first discontinued on DOL 25 but restarted on DOL 32 due to frequent NPO status and clinical instability. Of note, the infant required diuretics during the hospital stay. TPN was again discontinued on DOL 43. At DOL 116, the infant was receiving and tolerating PG feeds fortified to 24 kcal/oz at 151 ml/kg, 121 kcal/kg, and 2.1 gm/kg protein. The infant's weight gain rate was 56 g/day; however, linear growth was impaired, with a gain of 0.5 cm over 21 days (rate of ~0.2 cm/week). Liquid protein was commenced at DOL 124 to supply an additional 0.5 gm/kg of protein. A week after adding liquid protein, the infant's weight gain rate was 39 g/day, and height increased by 2.5 cm/week. Feed fortification was reduced to 22 kcal/oz on DOL 156 due to rapid weight gain, and liquid protein dosage was increased to 0.6 gm/kg for a total protein intake of 2.2 g/kg. At DOL 170, calorie fortification of maternal milk was discontinued, and liquid protein dosage was increased to 1 g/kg in the setting of a relapse of poor linear growth, for a total protein intake of 3.1 g/kg. Liquid protein was provided for two months until discontinuation (d/c) at DOL 183 per parent request. At the time of d/c of liquid protein, the infant's weight and length gain rates for the protein supplementation period (59 days) were 42 gm/day and 1.78 cm/week, respectively.

Conclusion: While we observed objective increases in linear growth for the presented case following the addition of a liquid protein modular, it is crucial to note that these findings are not generalizable, and there is limited evidence and guidelines on the use of hydrolyzed liquid protein. Larger, well-controlled studies examining mechanisms of action, appropriate dosage and duration, short- and long-term efficacy, and safety are required to guide best practices for using these modulars.

Sarah Peterson, PhD, RD1; Nicole Salerno, BS1; Hannah Buckley, RDN, LDN1; Gretchen Coonrad, RDN, LDN1

1Rush University Medical Center, Chicago, IL

Financial Support: None Reported.

Background: Accurate assessment of growth and nutritional status is critical for preterm infants. Various growth charts have been developed to track the growth of preterm infants, but differences in reference standards may influence the diagnosis and management of malnutrition. The goal of this study was to compare the rate of malnutrition, defined by a decline in weight-for-age z-score, using the Fenton growth chart, the Olsen growth chart, and the INTERGROWTH-21st Preterm Postnatal Growth Standard.

Methods: All preterm infants born between 24 and 37 weeks of gestational age who were admitted to the neonatal intensive care unit (NICU) in 2022 and had weight at birth and on day 28 recorded were included. Preterm infants were excluded if they were admitted to the NICU ≥seven days after birth. Sex and gestational age were recorded for each infant. Weight and weight-for-age z-score at birth and on day 28 were recorded. Weight-for-age z-score was determined using three growth charts: (1) the Fenton growth chart, which was developed using growth data from uncomplicated preterm infants who were at least 22 weeks of gestational age from the United States (US), Australia, Canada, Germany, Italy, and Scotland; (2) the Olsen growth chart, which was developed using growth data from uncomplicated preterm infants who were at least 23 weeks of gestational age from the US; and (3) the INTERGROWTH-21st Preterm Postnatal Growth Standard, which was developed using growth data from uncomplicated preterm infants who were at least 24 weeks of gestational age from the US, United Kingdom, Brazil, China, India, Italy, Kenya, and Oman. Change in weight-for-age z-score from birth to day 28 was calculated using z-scores from each growth chart. Malnutrition was defined as a decline in weight-for-age z-score of ≥0.8; any infant meeting this cut-point had their malnutrition status further classified into three categories: (1) mild malnutrition was defined as a weight-for-age z-score decline between 0.8-1.1, (2) moderate malnutrition was defined as a weight-for-age z-score decline between 1.2-1.9, and (3) severe malnutrition was defined as a weight-for-age z-score decline of ≥2.0.
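
As a hedged illustration of the classification scheme described above (not the study's analysis code), the snippet below maps a birth-to-day-28 decline in weight-for-age z-score onto the stated severity bands; the bands are treated as contiguous for simplicity.

```python
# Minimal sketch: classify malnutrition severity from the decline in
# weight-for-age z-score between birth and day 28, using the cut-points stated
# above (treated as contiguous: <0.8 none, 0.8 to <1.2 mild, 1.2 to <2.0
# moderate, >=2.0 severe).

def classify_malnutrition(z_birth, z_day28):
    """Return malnutrition category from the z-score decline (birth minus day 28)."""
    decline = z_birth - z_day28
    if decline < 0.8:
        return "no malnutrition"
    if decline < 1.2:
        return "mild"
    if decline < 2.0:
        return "moderate"
    return "severe"

# Example using the cohort's average Fenton z-scores (-0.36 at birth, -1.00 at day 28).
print(classify_malnutrition(-0.36, -1.00))  # -> "no malnutrition" (decline of 0.64)
```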

Results: The sample included 102 preterm infants, 58% male, with a mean gestational age of 29.3 weeks. At birth, the average weight was 1,192 grams, and the average weight-for-age z-score was -0.50, -0.36, and -1.14 using the Olsen, Fenton, and INTERGROWTH-21st growth charts, respectively. At 28 days, the average weight was 1,690 grams, and the average weight-for-age z-score was -0.96, -1.00, and -1.43 using the Olsen, Fenton, and INTERGROWTH-21st growth charts, respectively. Using the Olsen growth chart, 29 infants met the criteria for malnutrition; 15 had mild malnutrition and 14 had moderate malnutrition. Using the Fenton growth chart, 33 infants met the criteria for malnutrition; 21 had mild malnutrition and 12 had moderate malnutrition. Using the INTERGROWTH-21st growth chart, 32 infants met the criteria for malnutrition; 14 had mild malnutrition, 14 had moderate malnutrition, and 4 had severe malnutrition. In total, 24 infants met the criteria for malnutrition using all three growth charts, while 19 infants were only categorized as malnourished by one of the three growth charts.

Conclusion: The findings of this study reveal important discrepancies in average z-scores and the classification of malnutrition among preterm infants when using the Olsen, Fenton, and INTERGROWTH-21st growth charts. These differences suggest that the choice of growth chart has important implications for identifying rates of malnutrition. Therefore, standardized guidelines for growth monitoring in preterm infants are necessary to ensure consistent and accurate diagnosis of malnutrition.

Emaan Abbasi, BSc1; Debby Martins, RD2; Hannah Piper, MD2

1University of Galway, Vancouver, BC; 2BC Children's Hospital, Vancouver, BC

Financial Support: None Reported.

Background: Infants with gastroschisis have variable intestinal function, with some achieving enteral independence within a few weeks and others remaining dependent on parenteral nutrition (PN) for prolonged periods. Establishing enteral feeds can be challenging for many of these neonates due to poor intestinal motility, frequent emesis, and/or abdominal distension. Therefore, many care teams use standardized post-natal nutrition protocols in an attempt to minimize PN exposure and maximize oral feeding. However, it remains unclear whether initiating continuous feeds is advantageous or whether bolus feeding is preferred. Potential benefits of bolus feeding include that it is more physiologic and that feeds can be given orally, but there remain concerns about feed tolerance and prolonged periods of withholding feeds with this approach. The objective of this study was to compare initial feeding strategies in infants with gastroschisis to determine whether bolus feeding is a feasible approach.

Methods: After obtaining REB approval (H24-01052), a retrospective chart review was performed of neonates born with gastroschisis and cared for by a neonatal intestinal rehabilitation team between 2018 and 2023. A continuous feeding protocol was used between 2018-2020 (human milk given continuously at 1 ml/h, advanced by 10 ml/kg/d until 50 ml/kg/d, then trialing bolus feeding), and a bolus protocol was used between 2021-2023 (10-15 ml/kg divided into 8 feeds/d, advanced by 15-20 ml/kg/d). Clinical data were collected and compared between groups, including gestational age, gastroschisis prognosis score (GPS), need for intestinal resection, age when feeds were initiated, time to full feeds, route of feeding at full feeds, and hepatic cholestasis. Welch's t-test and chi-square test were used to compare variables, with p-values < 0.05 considered significant.
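
For readers unfamiliar with feed-advancement arithmetic, the sketch below shows, under stated assumptions, how advancement rates like those above translate into days needed to reach a full-feed target. It simplifies both protocols (a single target volume, a uniform daily step) and is not the unit's actual algorithm; the 150 ml/kg/d target and the starting volumes are illustrative assumptions.

```python
# Minimal sketch (illustrative assumptions, not the unit protocols themselves):
# count days of stepwise advancement needed to reach a full-feed target volume.
# 1 ml/h of continuous feeds is roughly 12 ml/kg/d for a 2 kg infant; the bolus
# protocol starts at 10-15 ml/kg/d.

def days_to_target(start_ml_kg_d, advance_ml_kg_d, target_ml_kg_d=150.0):
    """Count days of advancement until the target volume is reached."""
    volume, days = start_ml_kg_d, 0
    while volume < target_ml_kg_d:
        volume += advance_ml_kg_d
        days += 1
    return days

print(days_to_target(12, 10))   # continuous-style advancement (10 ml/kg/d)
print(days_to_target(12, 15))   # bolus-style advancement (15-20 ml/kg/d)
```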

Results: Forty-one infants with gastroschisis were reviewed (23 who were managed with continuous feed initiation and 18 with bolus feed initiation). Continuous feed and bolus feed groups had comparable mean gestational age at birth, GPS score, need for intestinal surgery, age at feed initiation, and the incidence of cholestasis (Table 1). Time to achieve enteral independence was similar between both groups with nearly half of infants reaching full feeds by 6 weeks (48% continuous feeds vs. 44% bolus feeds) and most by 9 weeks of life (74% continuous vs. 72% bolus). Significantly more infants in the bolus feeding group were feeding exclusively orally compared to the continuous feeding group at the time of reaching full enteral feeds (50% vs. 17%, p = 0.017).

Conclusion: Initiating bolus enteral feeding for infants with gastroschisis, including those requiring intestinal resection, is safe and does not result in prolonged time to full enteral feeding. Avoiding continuous feeds may improve oral feeding in this population.

Table 1. Clinical Characteristics and Initial Feeding Strategy.

International Poster of Distinction

Matheus Albuquerque1; Diogo Ferreira1; João Victor Maldonado2; Mateus Margato2; Luiz Eduardo Nunes1; Emanuel Sarinho1; Lúcia Cordeiro1; Amanda Fifi3

1Federal University of Pernambuco, Recife, Pernambuco; 2University of Brasilia, Brasília, Distrito Federal; 3University of Miami, Miami, FL

Financial Support: None Reported.

Background: Intestinal failure secondary to short bowel syndrome is a malabsorptive condition caused by intestinal resection. Patients with intestinal failure require parenteral support to maintain hydration and nutrition, but long-term parenteral nutrition leads to complications. Teduglutide, an analog of GLP-2, may improve intestinal adaptation, thereby minimizing reliance on parenteral nutrition. This meta-analysis evaluates the efficacy of teduglutide in reducing parenteral nutrition dependency in pediatric patients with intestinal failure.

Methods: We included randomized controlled trials (RCTs) that assessed the efficacy of teduglutide in reducing parenteral nutrition support and improving anthropometrics in pediatric patients with intestinal failure secondary to short bowel syndrome. The RoB-2 tool (Cochrane) was used to evaluate the risk of bias, and statistical analyses were conducted using RevMan 5.4.1 software. Results are expressed as mean differences with 95% CIs and p-values.
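
As a rough illustration of the pooling step that software such as RevMan performs (not the authors' analysis), the snippet below computes an inverse-variance fixed-effect mean difference from per-trial estimates; the numbers in it are placeholders, not data from the included trials.

```python
# Minimal sketch: inverse-variance fixed-effect pooling of per-trial mean
# differences, the basic calculation behind a forest plot. The (md, se) pairs
# below are placeholders, not data from the included trials.
import math

def pooled_mean_difference(effects):
    """effects: iterable of (mean_difference, standard_error) per trial."""
    weights = [1 / se**2 for _, se in effects]          # inverse-variance weights
    pooled = sum(w * md for (md, _), w in zip(effects, weights)) / sum(weights)
    se_pooled = math.sqrt(1 / sum(weights))
    ci = (pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled)
    return pooled, ci

md, (lo, hi) = pooled_mean_difference([(-15.0, 4.0), (-20.0, 5.0), (-18.0, 6.0)])
print(f"pooled MD = {md:.2f} mL (95% CI {lo:.2f} to {hi:.2f})")
```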

Results: Data were extracted from three clinical trials involving a total of 172 participants. Teduglutide use was associated with a reduction in parenteral nutrition volume (-17.92 mL, 95% CI -24.65 to -11.20, p < 0.00001), with most patients reducing parenteral support by >20% (11.79, 95% CI 2.04 to 68.24, p = 0.006) (Figure 1). Treatment with teduglutide also improved height (0.27 Z-score, 95% CI 0.08 to 0.46, p = 0.005) but did not significantly increase weight compared to the control group (-0.13 Z-score, 95% CI -0.41 to 0.16, p = 0.38) (Figure 2).

Conclusion: This meta-analysis suggests that therapy with teduglutide reduces parenteral nutrition volume in patients with short bowel syndrome and intestinal failure. Reduced parenteral nutrition dependency can minimize complications and improve quality of life in these patients.

Figure 1. Parenteral Nutrition Support Volume Change.

Figure 2. Anthropometric Data (Weight and Height) Change from Baseline.

Korinne Carr1; Liyun Zhang, MS1; Amy Pan, PhD1; Theresa Mikhailov, MD, PhD2

1Medical College of Wisconsin, Milwaukee, WI; 2Children's Hospital of Wisconsin, Milwaukee, WI

Financial Support: Medical College of Wisconsin, Department of Pediatrics.

Background: Malnutrition is a significant concern in pediatric patients, particularly those critically ill. In children with diabetes mellitus (DM), the presence of malnutrition can exacerbate complications such as unstable blood sugar levels and delayed wound healing, potentially leading to worse clinical outcomes. Despite the known impact of malnutrition on adult patients and critically ill children, established criteria for identifying malnutrition in critically ill children are lacking. This study was designed to determine the relationship between malnutrition, mortality, and length of stay (LOS) in critically ill pediatric patients with diabetes mellitus.

Methods: We conducted a retrospective cohort study using the VPS (Virtual Pediatric Systems, LLC) database. We categorized critically ill pediatric patients with DM as malnourished or at risk of malnutrition based on admission nutrition screens. We compared mortality rates between malnourished and non-malnourished patients using Fisher's exact test, and we used logistic regression to compare mortality while controlling for PRISM3 (a severity-of-illness measure) and demographic and clinical factors. We compared the LOS in the Pediatric Intensive Care Unit (PICU) between malnourished and non-malnourished patients using the Mann-Whitney-Wilcoxon test, and we used a general linear model with an appropriate transformation to adjust for severity of illness and demographic and clinical factors. We considered p < 0.05 statistically significant.
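
A minimal sketch of the core comparisons named above, using SciPy on placeholder data, is shown below; it is illustrative only and does not reproduce the VPS analysis.

```python
# Minimal sketch (illustrative, not the study's analysis): mortality compared
# with Fisher's exact test, PICU LOS with the Mann-Whitney U test, and LOS
# summarized as a geometric mean via a log transform. All values are placeholders.
import numpy as np
from scipy import stats

# Placeholder 2x2 table: rows = malnourished / not malnourished, cols = died / survived.
mortality_table = [[3, 838], [3, 1809]]
_, p_mortality = stats.fisher_exact(mortality_table)

# Placeholder LOS samples in days.
los_malnourished = np.array([0.8, 1.2, 2.5, 0.9, 3.1])
los_not_malnourished = np.array([0.7, 0.9, 1.1, 0.6, 1.4])
_, p_los = stats.mannwhitneyu(los_malnourished, los_not_malnourished)

geo_mean = np.exp(np.log(los_malnourished).mean())  # geometric mean of LOS
print(p_mortality, p_los, geo_mean)
```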

Results: We analyzed data for 4,014 patients, of whom 2,653 were screened for malnutrition. Of these 2,653 patients, 88.5% had type 1 DM, 9.3% had type 2 DM, and the remainder had unspecified DM; 841 (31.7%) were malnourished based on their nutrition screen at admission to the PICU. Mortality in patients who screened as malnourished did not differ from mortality in those who did not (0.4% vs. 0.2%, p = 0.15). Malnourished patients had longer PICU LOS, with a geometric mean (95% CI) of 1.03 (0.94–1.13) days, compared to 0.91 (0.86–0.96) days for non-malnourished patients. Similarly, malnourished patients had longer hospital LOS, with a geometric mean (95% CI) of 5.31 (4.84–5.83) days, compared to 2.67 (2.53–2.82) days for those who were not malnourished. Both differences were significant at p < 0.0001 after adjusting for age, race/ethnicity, and PRISM3.

Conclusion: We found no difference in mortality rates, but critically ill children who were screened as malnourished had longer PICU and hospital LOS than those who were not malnourished. This was true even after adjusting for age, race/ethnicity, and PRISM3.

Emily Gutzwiller1; Katie Huff, MD, MS1

1Indiana University School of Medicine, Indianapolis, IN

Financial Support: None Reported.

Background: Neonates with intestinal failure require parenteral nutrition for survival. While life sustaining, it can lead to serious complications, including intestinal failure associated liver disease (IFALD). The etiology of IFALD is likely multifactorial, with intravenous lipid emulsions (ILE), particularly soybean oil-based lipid emulsions (SO-ILE), being a large contributor. Alternate ILEs, including those containing fish oil, can be used to prevent and treat IFALD. Fish oil-based ILE (FO-ILE) is only approved at a dose of 1 g/kg/d, limiting the calories prescribed from fat and shifting calorie delivery toward carbohydrate predominance. While FO-ILE has been shown to support growth comparable to SO-ILE, a comparison to soy, MCT, olive, fish oil-based ILE (SO,MCT,OO,FO-ILE) has not, to our knowledge, been conducted. The purpose of this study is to compare the growth and laboratory data associated with SO,MCT,OO,FO-ILE and FO-ILE therapy in the setting of IFALD in a cohort of neonates treated during two time periods.

Methods: We performed a retrospective chart review of patients with IFALD receiving SO,MCT,OO,FO-ILE (SMOFlipid) or FO-ILE (Omegaven) in a level IV neonatal intensive care unit from September 2016 – May 2024. IFALD was defined as direct bilirubin >2 mg/dL after receiving >2 weeks of parenteral nutrition. Patients with underlying genetic or hepatic diagnoses, as well as patients with elevated direct bilirubin prior to two weeks of life, were excluded. Patients were divided based on the period in which they were treated. Data was collected on demographic characteristics, parenteral and enteral nutrition, weekly labs, and growth changes while receiving SO,MCT,OO,FO-ILE or FO-ILE. Rates of change of weight, length, and head circumference and comparisons of z-scores over time were studied for each ILE group. Secondary outcomes included nutritional data in addition to hepatic labs. Nonparametric analysis using the Mann-Whitney U test was conducted to compare ILE groups, and a p-value of < 0.05 was used to define statistical significance.
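
The snippet below sketches, under stated assumptions, one way to derive a per-patient weight-gain rate and compare it between ILE groups with a Mann-Whitney U test; the weights are placeholders and this is not the study's analysis code.

```python
# Minimal sketch (assumption-laden): estimate each infant's weight-gain rate as
# the slope of a simple linear fit over the treatment period, then compare the
# two ILE groups with a Mann-Whitney U test. The weights below are placeholders.
import numpy as np
from scipy import stats

def gain_rate(days, weights_g):
    """Slope of weight (g) vs. time (days): a crude per-patient growth velocity."""
    slope, _ = np.polyfit(days, weights_g, 1)
    return slope

# Placeholder weekly weights (g) for two illustrative patients per group.
smof_rates = [gain_rate([0, 7, 14, 21], w) for w in ([1500, 1620, 1750, 1880],
                                                     [1400, 1500, 1610, 1720])]
fo_rates = [gain_rate([0, 7, 14, 21], w) for w in ([1450, 1600, 1760, 1930],
                                                   [1350, 1510, 1680, 1850])]
_, p = stats.mannwhitneyu(smof_rates, fo_rates)
print(smof_rates, fo_rates, p)
```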

Results: A total of 51 patients were enrolled, 25 receiving SO,MCT,OO,FO-ILE and 26 FO-ILE. Table 1 notes the demographic and baseline characteristics of the two ILE groups. There was no difference in the rate of OFC (p = 0.984) or length (p = 0.279) growth between the two treatment groups (Table 2). There was a difference, however, in the rate of weight gain between groups (p = 0.002; Table 2), with the FO-ILE group gaining more weight over time. When comparing nutritional outcomes (Table 2), SO,MCT,OO,FO-ILE patients received greater total calories than FO-ILE patients (p = 0.005), including a higher ILE dose (p < 0.001) and more enteral calories (p = 0.029). The FO-ILE group, however, received a higher carbohydrate dose (p = 0.003; Table 2). There was no difference in amino acid dose (p = 0.127) or parenteral nutrition calories (p = 0.821). Hepatic labs differed over time, with the FO-ILE group having a larger decrease in AST, direct bilirubin, and total bilirubin over time compared to the SO,MCT,OO,FO-ILE group (Table 2).

Conclusion: Our results show that the FO-ILE patients had a significant increase in weight gain compared to the SO,MCT,OO,FO-ILE patients, despite the SO,MCT,OO,FO-ILE patients receiving greater total and enteral calories; the FO-ILE group received greater calories only in the form of glucose infusion. With this increased weight gain but similar length growth between groups, concerns arise regarding alterations in body composition and increased fat mass. Further research is needed to determine the influence of these various ILE products on neonatal body composition over time.

Table 1. Demographic and Baseline Lab Data by Lipid Treatment Group.

(All data presented as median and interquartile range, unless specified.)

Table 2. Nutritional, Hepatic Lab, and Growth Outcomes by Lipid Treatment Group.

(All data presented as median and interquartile range, unless specified.)

*z-score change compares z-score at end and beginning of study period

OFC = occipitofrontal circumference

Rachel Collins, BSN, RN1; Brooke Cherven, PhD, MPH, RN, CPON2; Ann-Marie Brown, PhD, APRN, CPNP-AC/PC, CCRN, CNE, FCCM, FAANP, FASPEN1; Christina Calamaro, PhD, PPCNP-BC, FNP-BC, FAANP, FAAN3

1Emory University Nell Hodgson Woodruff School of Nursing, Atlanta, GA; 2Emory University School of Medicine; Children's Healthcare of Atlanta, Atlanta, GA; 3Emory University Nell Hodgson Woodruff School of Nursing; Children's Healthcare of Atlanta, Atlanta, GA

Financial Support: None Reported.

Background: Many children receiving hematopoietic stem cell transplant (HSCT) are malnourished prior to the start of transplant or develop malnutrition post-infusion. Children are at risk for malnutrition due to the pre-transplant conditioning phase, chemotherapy treatments for their primary diagnosis, and acute graft versus host disease (GVHD). These factors can cause impaired oral feeding, vomiting, diarrhea, and mucositis. Enteral nutrition (EN) and parenteral nutrition (PN) are support options for this patient population when oral feeding is impaired. There is currently a paucity of research and evidence-based guidelines on nutrition in this population. The purpose of this integrative literature review was to determine the outcomes of EN and PN in pediatric HSCT and to discuss evidence-based implications for practice to positively affect quality of life for these patients.

Methods: A literature search was conducted using the following databases: PubMed, CINAHL, Cochrane, and Healthsource: Nursing/Academic Edition. The search strategy included randomized controlled trials, prospective and retrospective cohort studies, case-control studies, cross-sectional studies, systematic reviews, and meta-analyses. Papers were included if they discussed patients 0-21 years of age, allogeneic or autologous HSCT, and enteral and/or parenteral nutrition. Papers were excluded if there was no English translation, they did not discuss nutrition, or they had animal subjects.

Results: Initially 477 papers were identified; after the screening process, 15 papers were included in this integrative review. EN and PN affect clinical outcomes, complications, and hospital and survival outcomes. EN was associated with faster platelet engraftment, an improved gut microbiome, and decreased mucositis and GVHD. PN was used more often in severe mucositis because mucositis interferes with feeding tube placement, thereby decreasing the use of EN. Use of PN is also more common in severe (grade III-IV) gut GVHD. Initiation of EN later in treatment, such as after conditioning and in the presence of mucositis, can be associated with severe (grade III-IV) gut GVHD, because conditioning can damage the gut, leading to mucosal atrophy and intestinal permeability that alter the gut microbiota. PN can induce gut mucosal atrophy and dysbiosis allowing for bacterial translocation, while EN improves the gut epithelium and microbiota, reducing translocation. Additionally, the increased central venous line access required for PN can introduce bacterial infections to the bloodstream. Feeding tube placement complications include dislodgement, refusal of replacement, and increased risk of bleeding due to thrombocytopenia. Electrolyte imbalance can be attributed to loss of absorption through the gut. If a gastrostomy tube is present, there can be infection at the site. There is currently no consensus on the appropriate timeline for tube placement. There was no significant difference in neutrophil engraftment, and findings on morbidity/mortality and weight gain were variable; weight gain with PN can be attributed to edema. Length of stay was significantly shorter with EN alone than with PN (p < 0.0001).

Conclusion: This literature review indicates a need for a more comprehensive nutritional assessment to adequately evaluate nutritional status before and after HSCT. EN should be given as first-line therapy and considered prior to the conditioning phase, including placement of a feeding tube before conditioning. PN may be considered if EN cannot be tolerated. More research is needed on sensitive nutritional evaluation, earlier administration of EN, and standardized pathways for EN and PN in pediatric HSCT.
