Bedtime snacking with delayed insulin delivery increases risk of overnight hyperglycemia in hospitalized diabetes patients: Study

A new study published in Diabetes Technology & Therapeutics showed that delayed insulin delivery after meals and bedtime snacking without insulin administration are among the main causes of postprandial and nocturnal hyperglycemia in hospitalized patients with diabetes.

For people with diabetes, controlling blood glucose levels is essential, and the timing of mealtime insulin boluses and eating before bed are important factors in this process. Continuous glucose monitoring (CGM) offers real-time insight into glucose changes, enabling better decisions about insulin dosing and overnight glucose stability.

Optimizing glycemic regulation and lowering the risks of hyperglycemia (high blood sugar) and hypoglycemia (low blood sugar) depend on understanding how these variables interact. Delivering prandial insulin on time with meals can be difficult in the hospital setting. Moreover, no prior research has examined the glycemic effects of post-dinner snacking, sometimes known as “bedtime snacking,” in the absence of prandial insulin treatment. This study, by Sara Alexanian and colleagues, examined the effects of bedtime eating and delayed insulin delivery on inpatient glycemic control.

The researchers performed a post hoc analysis of the In-Fi study, which compared Fiasp with insulin aspart (Novolog) in inpatients with type 2 diabetes. Glucose outcomes were measured with the Dexcom G6 PRO CGM device. CGM and insulin delivery data were examined for the 122 randomized participants who completed the primary trial protocol (which involved wearing a CGM for at least 4 meals). The analysis assessed the effects of bedtime eating and delayed insulin administration on glucose regulation.

Insulin boluses given before meals (n = 149) had a 4-hour postprandial time in range (TIR70–180) of 48%, while boluses given more than 5 minutes after a meal had a TIR70–180 of 24%. When controlling for bedtime sensor glucose, eating between 9 p.m. and midnight was linked to a considerably lower overnight (9 p.m. to 6 a.m.) TIR70–180 and a significantly higher fasting glucose the following morning.

After controlling for initial bedtime sensor glucose, bedtime eating was linked to a greater overnight glucose coefficient of variation (%CV) and a higher overnight glucose standard deviation. Overall, delayed administration of mealtime insulin and consumption of insulin-free snacks before bed are among the main causes of post-meal and nocturnal hyperglycemia in hospitalized patients.
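For readers unfamiliar with these CGM-derived metrics, the quantities reported above (time in range 70–180 mg/dL, standard deviation, and percentage coefficient of variation) are simple summaries of the sensor trace. The following is a minimal sketch, assuming evenly spaced glucose readings in mg/dL; it does not reproduce the study's exact windowing or its adjustment for bedtime glucose.

```python
# Minimal sketch: computing CGM-derived metrics (TIR70-180, SD, %CV)
# from a list of sensor glucose values in mg/dL. Assumes evenly spaced
# readings; not the study's exact windowing or covariate adjustment.

def cgm_metrics(glucose_mg_dl):
    n = len(glucose_mg_dl)
    in_range = sum(1 for g in glucose_mg_dl if 70 <= g <= 180)
    tir = 100.0 * in_range / n                      # % of time in range 70-180
    mean = sum(glucose_mg_dl) / n
    var = sum((g - mean) ** 2 for g in glucose_mg_dl) / (n - 1)
    sd = var ** 0.5                                  # standard deviation (mg/dL)
    cv = 100.0 * sd / mean                           # coefficient of variation (%)
    return {"TIR70-180": tir, "SD": sd, "%CV": cv}

# Example: hypothetical overnight readings taken every 5 minutes
overnight = [165, 172, 188, 201, 195, 183, 176, 168, 159, 150, 144, 139]
print(cgm_metrics(overnight))
```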

Reference:

Alexanian, S. M., Cheney, M. C., Bello Ramos, J. C., Spartano, N. L., Wolpert, H. A., & Steenkamp, D. W. (2025). Impact of meal insulin bolus timing and bedtime snacking on continuous glucose monitoring-derived glycemic metrics in hospitalized inpatients. Diabetes Technology & Therapeutics. https://doi.org/10.1089/dia.2025.0027

Women With High Genetic and Female-Specific Risks Face Greater Cardiometabolic Disease Risk: Study Finds

China: A recent cohort study highlights the significant role of female-specific health conditions in shaping the risk of cardiometabolic disease (CMD) and their interaction with genetic predisposition. The UK Biobank study involving 150,413 women identified a strong association between female-specific factors, including premature menopause and adverse pregnancy outcomes, and an increased risk of CMD.

“Each one-unit increase in the female-specific risk score (FSRS) was linked to a 24% rise in CMD risk, with the highest risk (243%) observed in individuals with both high genetic susceptibility and female-specific risk factors. These findings highlight the importance of incorporating FSRS into risk assessments for more accurate disease prediction and prevention strategies,” the researchers reported in the BMJ journal Heart.
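As a point of arithmetic interpretation (an assumption about the reporting scale rather than a quotation from the paper), these percentages translate into risk ratios of roughly

$$
\mathrm{RR}_{\text{per FSRS point}} \approx 1 + 0.24 = 1.24,
\qquad
\mathrm{RR}_{\text{high genetic and high FSRS}} \approx 1 + 2.43 = 3.43,
$$

that is, women in the highest combined-risk group face about 3.4 times, not 2.4 times, the risk of the low-risk reference group.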

The influence of female-specific health factors on the development and progression of cardiometabolic disease remains an area of ongoing research, particularly in the context of genetic susceptibility. While traditional risk models primarily consider lifestyle and metabolic factors, the impact of conditions such as premature menopause, adverse pregnancy outcomes, and polycystic ovary syndrome (PCOS) is not fully understood. Recognizing this gap, Jiayu Yin, Department of Cardiology, Second Affiliated Hospital of Soochow University, Suzhou, China, and colleagues aimed to comprehensively evaluate how these female-specific factors contribute to CMD risk and interact with genetic predisposition, providing valuable insights for enhancing risk assessment and developing more effective preventive strategies for women.

For this purpose, the researchers conducted a prospective cohort study involving 150,413 women from the UK Biobank. They examined various female-specific factors, including premature menopause, adverse pregnancy outcomes, early or late menarche, multiparity, infertility, use of oral contraceptives or hormone therapy, and autoimmune diseases. A weighted female-specific risk score (FSRS) ranging from 0 to 6 was developed to quantify these risks.

The researchers analyzed the association between these female-specific factors and the occurrence and progression of cardiometabolic disease across different levels of genetic susceptibility.

The study led to the following findings:

  • Over a median follow-up of 13.7 years, 16,636 cardiometabolic disease events were recorded.
  • Each one-point increase in the female-specific risk score (FSRS) was linked to a 24% higher risk of developing CMD.
  • FSRS remained consistently associated with progression to the first CMD event, cardiometabolic multimorbidity, and mortality.
  • Female-specific factors and genetic susceptibility had a synergistic effect on CMD risk.
  • Women with both high female-specific and genetic risk had a 243% greater likelihood of developing CMD compared to those with low risk in both categories.
  • FSRS demonstrated a strong predictive value for CMD, particularly in individuals with higher genetic susceptibility, and modestly enhanced the performance of two established cardiovascular risk algorithms.
  • Phenotypic aging, inflammation, metabolic factors, renal function, and estradiol collectively accounted for 21.6% of the association between FSRS and CMD.

The findings highlight the significant role of female-specific health factors in influencing cardiometabolic disease risk, particularly in combination with genetic susceptibility.

“Integrating these factors into risk assessment models could improve predictive accuracy, allowing for more personalized and effective prevention strategies, especially for women with a high genetic predisposition to CMD,” the authors concluded.

Reference:

Yin J, Li T, Yu Z, et al. Synergistic effects of female-specific conditions and genetic risk on cardiometabolic disease: a cohort study. Heart. Published Online First: 26 March 2025. doi: 10.1136/heartjnl-2024-325355

ABC-AF Risk Scores Offer Superior Stroke Prediction in AF Patients: Study

A new study published in the Journal of the American College of Cardiology showed that a biomarker-based risk score incorporating NT-proBNP and high-sensitivity troponin outperforms clinical risk scores in predicting stroke in patients with atrial fibrillation (AF) on oral anticoagulation.

The risk of ischemic stroke is a major factor in stroke prevention therapy recommendations for people with AF. Even with current direct oral anticoagulant treatment, patients still face an annual stroke risk of roughly 0.3% to 0.9%.

The longer-term vision is that patients who remain at high risk of stroke despite direct oral anticoagulant treatment might benefit from additional interventions, such as left atrial appendage occlusion (LAAO) devices, or from a more liberal indication for ablation to eliminate AF or lessen its burden as a stroke risk factor.

Because stroke risk still varies during oral anticoagulant treatment, this study assessed the biomarker-based Age, Biomarkers, Clinical history (ABC)-AF-stroke risk score. Lars Wallentin and colleagues also created a modified ABC-AF-istroke risk score, with the two scores intended to predict total stroke and ischemic stroke, respectively, in patients with AF.

The ABC-AF-stroke score and the modified ABC-AF-istroke score were calculated using data on age, clinical history of stroke, and levels of N-terminal pro-B-type natriuretic peptide and troponin in 26,452 patients with AF who had been randomized to direct oral anticoagulants (DOACs) or warfarin.

There were 756 stroke or systemic embolism (SEE) events during follow-up, including 534 cases of ischemic stroke/SEE. By C-index, the ABC-AF-stroke score demonstrated greater discrimination of total stroke/SEE than the ATRIA (Anticoagulation and Risk Factors in Atrial Fibrillation) score (0.632) and the CHA2DS2-VASc score (0.614). Results for ischemic stroke/SEE were similar, with a C-index of 0.677 for the ABC-AF-istroke score compared with 0.642 for ATRIA and 0.624 for CHA2DS2-VASc (P < 0.001 for both comparisons).

For both total and ischemic stroke, the ABC-AF-stroke scores demonstrated satisfactory calibration. In the pertinent subgroups, the results were consistent. Analysis of decision curves revealed a net advantage with regard to decision thresholds for stroke prevention. Overall, when it came to predicting total and ischemic stroke, the biomarker-based ABC-AF risk scores were well-calibrated, demonstrated superior discrimination over clinical risk scores, and offered significant decision assistance for stroke-prevention therapies in AF patients.
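The C-index (concordance index) values cited above quantify discrimination: the probability that, of two comparable patients, the one who has an event earlier was assigned the higher risk score (0.5 is no better than chance, 1.0 is perfect ranking). Below is a minimal pure-Python sketch of Harrell's C-index for right-censored follow-up data, using hypothetical toy inputs rather than anything from the trial datasets.

```python
# Minimal sketch: Harrell's concordance index (C-index) for a risk score
# against right-censored time-to-event data. Pure-Python illustration,
# not the study's actual statistical code.

def c_index(risk, time, event):
    """risk: higher = higher predicted risk; time: follow-up time;
    event: 1 if stroke/SEE observed, 0 if censored."""
    concordant, comparable = 0.0, 0
    n = len(risk)
    for i in range(n):
        for j in range(n):
            # A pair is comparable if subject i had an event before subject j's time
            if event[i] == 1 and time[i] < time[j]:
                comparable += 1
                if risk[i] > risk[j]:
                    concordant += 1        # correctly ranked
                elif risk[i] == risk[j]:
                    concordant += 0.5      # tie in predicted risk
    return concordant / comparable

# Hypothetical toy data for 5 patients
risk  = [2.1, 0.8, 0.5, 0.3, 1.9]   # predicted risk scores
time  = [1.0, 4.0, 2.5, 5.0, 1.5]   # years of follow-up
event = [1,   0,   1,   0,   1  ]   # 1 = stroke/SEE, 0 = censored
print(round(c_index(risk, time, event), 3))
```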

Reference:

Wallentin, L., Lindbäck, J., Hijazi, Z., Oldgren, J., Carnicelli, A. P., Alexander, J. H., Berg, D. D., Eikelboom, J. W., Goto, S., Lopes, R. D., Ruff, C. T., Siegbahn, A., Giugliano, R. P., Granger, C. B., & Morrow, D. A. (2025). Biomarker-based model for prediction of ischemic stroke in patients with Atrial Fibrillation. Journal of the American College of Cardiology, 85(11), 1173–1185. https://doi.org/10.1016/j.jacc.2024.11.052

Smartphone eye photos may help detect anemia in children, finds study

Anemia, a condition marked by low levels of hemoglobin in the blood, affects nearly 2 billion people worldwide. Among them, school-age children in low- and middle-income countries are particularly vulnerable.

Left untreated, anemia in children can interfere with growth, learning, and overall development. Detecting the condition early is essential, but standard diagnostic methods require blood samples and lab equipment—resources that are often unavailable in low-income areas.

A new study reported in Biophotonics Discovery offers a promising alternative: using simple grayscale photos of the eye’s conjunctiva—the inner surface of the eyelid and the white part of the eye—to predict anemia. Researchers from Purdue University, Rwanda Biomedical Center, and University of Rwanda used standard smartphones to take over 12,000 eye photos from 565 children aged 5 to 15. They then applied machine learning along with a technique called radiomics, which mathematically analyzes patterns and textures in medical images, to identify features linked to anemia.

First author Shaun Hong, a Purdue University PhD student, notes, “Unlike previous efforts that rely on color analysis or special imaging tools, this method doesn’t require color data. Instead, it uses black-and-white photos to examine tiny structural changes in the eye’s blood vessels. This approach avoids problems caused by different light conditions or camera models, making it more practical for use in a variety of settings.”
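The radiomics idea described here amounts to turning a monochrome image patch into a vector of texture statistics that a classifier can learn from. As a rough illustration only (not the authors' pipeline; the specific features and the placeholder image are assumptions), gray-level co-occurrence matrix (GLCM) texture features can be computed from a grayscale patch with scikit-image:

```python
# Minimal sketch: radiomics-style texture features from a grayscale
# (monochrome) image patch. Illustrative only; not the published pipeline.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def texture_features(gray_patch_uint8):
    """Compute a few gray-level co-occurrence matrix (GLCM) texture features."""
    glcm = graycomatrix(gray_patch_uint8,
                        distances=[1, 3],            # pixel offsets
                        angles=[0, np.pi / 2],       # horizontal, vertical
                        levels=256, symmetric=True, normed=True)
    return {prop: graycoprops(glcm, prop).mean()
            for prop in ("contrast", "homogeneity", "correlation", "energy")}

# Placeholder for a cropped conjunctiva region (would come from a smartphone photo)
patch = np.random.randint(0, 256, size=(128, 128), dtype=np.uint8)
print(texture_features(patch))
```

In a screening workflow of the kind described, feature vectors like these from many conjunctiva photographs would be paired with measured hemoglobin or anemia status to train and validate a predictive model.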

The results show a strong connection between specific spatial features and anemia status, pointing to the possibility of screening for anemia using just a smartphone and basic software. This could be especially useful in remote or under-resourced communities, offering a fast, noninvasive, and affordable way to identify children at risk.

Corresponding author Professor Young L. Kim of Purdue University remarks, “The technology isn’t meant to replace traditional testing but could help prioritize who needs further evaluation and treatment. With more development, the method could be integrated into mobile health tools to support early intervention in areas where healthcare access is limited.”

For details, see the original Gold Open Access article by S. G. Hong et al., “Radiomic identification of anemia features in monochromatic conjunctiva photographs in school-age children,” Biophotonics Discovery 2(2), 022303 (2025), doi: 10.1117/1.BIOS.2.2.022303.

Block Pain: Study Unleashes Power of Suprainguinal Fascia Iliaca Compartment Block for Proximal Femur Fracture Relief

Analgesic management for pain following proximal femur fractures is critical to enhancing patient recovery. In a recent controlled study involving 60 adult trauma patients scheduled for surgical fixation of proximal femur fractures, the effectiveness of three analgesic techniques was compared: the continuous fascia iliaca plane block using a suprainguinal approach (SFICB), the continuous fascia iliaca plane block using an infrainguinal approach (IFICB), and a femoral nerve block (FNB). Participants were randomized into one of the three groups, receiving ultrasound-guided blocks with 0.2% ropivacaine for postoperative analgesia, followed by a continuous infusion at 10 mL/h for the first 24 hours.

Study Objectives

The primary aim was to assess the number of rescue analgesic (RA) doses required within the first 24 hours post-surgery to maintain a visual analogue scale (VAS) pain score below 4. Secondary objectives included total morphine consumption, duration of analgesia, pain scores over time, quality of pain relief, and the assessment of any adverse effects.

Results Overview

Results showed that patients in the SFICB group had a significantly lower need for RA doses: only 15% required additional morphine, compared to 40% in the IFICB group and 50% in the FNB group. Furthermore, median morphine consumption was notably less in the SFICB group (3 mg) than the IFICB (6.5 mg) and FNB (9.0 mg) groups, indicating better analgesic efficacy. SFICB patients reported lower median VAS scores and higher quality of pain relief, with a significant proportion noting excellent pain relief compared to those in the other two groups.

Methodological Considerations

Methodologically, patients were carefully selected based on specific criteria to ensure reliability. Those with any prior analgesic therapies, infections, pregnancy, or other relevant comorbidities were excluded. Each block’s feasibility and performance time were recorded along with patient-reported side effects. Patients in all groups underwent monitoring for hemodynamic stability during the procedure.

Efficacy of SFICB

Results demonstrated the superior analgesic efficacy of the SFICB approach due to its more extensive local anesthetic spread, successfully covering the femoral, lateral femoral cutaneous, and obturator nerves. While both the FNB and IFICB techniques were effective, they lagged behind SFICB in terms of overall effectiveness and patient satisfaction.

Clinical Implications

The findings emphasize the clear advantage of using a continuous suprainguinal fascia iliaca compartment block as a preferred technique for postoperative pain management in patients with proximal femur fractures. The implications suggest that implementing SFICB can lead to reduced opioid consumption and better pain management, providing a more effective strategy for enhancing recovery in surgical patients. This study supports the notion that optimized regional anesthesia techniques can play a pivotal role in improving postoperative care outcomes and patient satisfaction. Further investigations could expand to assess long-term effects and explore the potential for additional adjunct therapies to augment analgesic strategies.

Key Points

– A controlled study with 60 adult trauma patients evaluated three analgesic techniques for postoperative pain management following proximal femur fractures: continuous fascia iliaca plane block using a suprainguinal approach (SFICB), continuous fascia iliaca plane block using an infrainguinal approach (IFICB), and femoral nerve block (FNB). Each participant received ultrasound-guided blocks with 0.2% ropivacaine followed by a continuous infusion for the first 24 hours post-surgery.

– The primary objective was to analyze the number of rescue analgesic doses needed within the first 24 hours to maintain a visual analogue scale (VAS) pain score below 4. Secondary objectives assessed total morphine consumption, duration of analgesia, pain scores over time, quality of pain relief, and side effects.

– Results indicated that only 15% of patients in the SFICB group necessitated additional morphine for pain relief, significantly lower than 40% in the IFICB group and 50% in the FNB group, suggesting superior effectiveness of the SFICB technique.

– Median morphine consumption was significantly reduced for the SFICB group (3 mg), compared to the IFICB (6.5 mg) and FNB (9.0 mg) groups, demonstrating the SFICB’s enhanced analgesic efficacy.

– The superior analgesic performance of SFICB is attributed to its broader local anesthetic spread, covering essential nerves including the femoral, lateral femoral cutaneous, and obturator nerves, resulting in lower median VAS scores and improved patient satisfaction in pain management.

– The findings suggest that SFICB is a preferred postoperative pain management technique for proximal femur fractures, potentially leading to reduced opioid consumption and improved patient recovery outcomes. The study indicates a need for further exploration of long-term effects and adjunct therapies to enhance analgesic strategies.

Reference:

Nidhi Bhatia et al. (2025). Continuous peripheral nerve block in patients with proximal femur fracture: A randomised comparison of three techniques. Indian Journal of Anaesthesia. https://doi.org/10.4103/ija.ija_1095_24

Second TB vaccination boosts immunity in bladder cancer patients and reduces cancer recurrence: Study

Two doses of a simple tuberculosis vaccine given after surgery help the immune system fight cancer cells and could greatly improve patient outcomes for the most common type of bladder cancer, according to a pilot study of 40 patients.

Initial results from the RUTIVAC-1 Trial are presented today [Sunday 23 March 2025] at the European Association of Urology (EAU) Congress in Madrid.

In the randomised controlled trial, administering the vaccine alongside standard treatment led to an elevated immune response, which is known to improve the body’s ability to suppress future tumours. Patients who received the vaccine had no discernible side effects and every patient was cancer-free after five years.

Bladder cancer is the ninth most common cancer in the world, with over 600,000 people diagnosed in 2022. Non-muscle invasive bladder cancer is an early-stage cancer that affects the lining of the bladder and has not progressed into the deeper muscle layer.

Following surgery to remove the tumour, bladder cancer patients are typically given a live Bacillus Calmette Guerin (BCG) inoculation directly into the bladder to help their immune system destroy any remaining cancer cells. While this reduces the chances of their cancer coming back, up to 50% of patients go on to experience disease recurrence or progression.

Principal Investigator, Dr Cecilia Cabrera, of IrsiCaixa and IGTP, Barcelona, and colleagues tested whether an additional injection of a non-live TB vaccine, called RUTI®, would further boost patients’ immune response in a small pilot study.

They found that the RUTI vaccine significantly enhanced the BCG-induced immune response compared with the control group. RUTI vaccination was also associated with significantly higher progression-free survival, with every patient in the RUTI® group tumour-free five years later compared with 13 of 18 patients in the control group.

Dr Cabrera says: “We expected that the RUTI® vaccine would improve the immune response for patients, but we didn’t know what effect this might have on cancer progression over five years. It was very surprising for us to see such a vast improvement in cancer progression even with such a small group of patients.”

RUTI® vaccination was also well tolerated by patients, with only a mild reaction at the injection site and no systemic adverse effects.

Dr Cabrera says: “This was a small pilot study, but we’ve been really encouraged by the reduction in disease recurrence and progression in patients treated with the RUTI® vaccine. We were particularly pleased to see that these results are further supported by an exploratory sub-analysis including only high-grade T1 bladder cancer patients. With such high rates of recurrence in bladder cancer, finding new ways to prevent this is very important.”

RUTI® is being developed as a therapeutic vaccine against tuberculosis by Archivel Farma SL. In parallel, RUTI® is being developed as an immunotherapeutic agent for bladder cancer as a collaboration between IrsiCaixa, IGTP and Archivel Farma.

Joost Boormans, Professor of Urology at the Erasmus University Medical Centre, The Netherlands, and a member of the EAU Scientific Congress Office, said: “This is a well-conducted pilot study and shows promising results. With just two injections over and above standard treatment, the burden on patients is very small and I look forward to seeing whether further studies in larger cohorts continue to demonstrate the benefits to patients and improve their outcomes.”

The researchers caution that a larger trial will be needed to confirm the results before the treatment can be considered for wider use.

Reference:

Second TB vaccination boosts immunity in bladder cancer patients and reduces cancer recurrence, European Association of Urology, Meeting: EAU25 European Association of Urology Congress.

Innate immune training aggravates inflammatory bone loss, finds study

Clinical research has long focused on ways to harness the actions of the immune system. From vaccines to immunotherapies, researchers have used their knowledge of the immune system to develop therapies to treat or prevent diseases from influenza to autoimmune disease and cancer.

Now, researchers from Penn’s School of Dental Medicine and international collaborators have investigated the effects of training the innate immune system in experimental models of two chronic inflammatory diseases, periodontitis and arthritis. They found that this “trained” immunity, or TRIM, led to increased bone loss in these models. This study is published in Developmental Cell.

Previous approaches have largely focused on the adaptive immune system, that branch of the immune system that “remembers” previous threats and launches specific attacks when it encounters them again. The body also has an innate immunity branch, which, for a long time, was just considered the first-line, general attack arm of the immune system with no ability to remember prior assaults or respond differently when rechallenged.

“If you go and look at an immunology textbook, even today, it will likely tell you that innate immunity has no memory; its response doesn’t get improved the second time,” says George Hajishengallis, the Thomas W. Evans Centennial Professor in the Department of Basic & Translational Sciences at Penn Dental Medicine.

This belief, Hajishengallis notes, has been challenged over the past decade. Studies have shown that the innate immune system can respond more strongly when challenged again with the same or a different stimulus; in other words, it can be “trained.”

And importantly, these studies have also shown that “training” the innate immune system has beneficial effects, such as anti-tumor activity and an increased response to fighting infections in certain experimental models.

But inflammation, the innate immune system’s natural response to harmful stimuli, can also exacerbate symptoms or even cause diseases, demonstrating the need to better understand the immune system when developing immune-based therapies. An increased response may not always be beneficial.

“Trained innate immunity (TRIM) has emerged as a major immunological principle that challenges the dogma that memory is restricted to adaptive immunity,” says Hajishengallis. “So, a better understanding of TRIM is imperative to appropriately harness it for therapeutic gain in human disease.”

The Hajishengallis team, along with a collaborative lab led by Triantafyllos Chavakis at the Dresden University of Technology, induced TRIM using β-glucan, a compound found in certain fungi, and measured the generation of osteoclasts, which resorb bone during growth and healing, in models of inflammatory periodontitis and arthritis.

“We found that this treatment primed osteoclast precursors to differentiate into osteoclasts more readily if presented with an inflammatory challenge like arthritis,” says Chavakis.

“So, although TRIM can have beneficial effects, protecting against infections and cancer, our results indicate that the memory of a previous infection may also contribute to inflammatory diseases and the comorbidity between inflammatory bone loss disorders,” adds Hajishengallis.

Their work, however, showed that β-glucan only increases the potential for bone loss; it does not itself cause bone loss. That occurs only if a second inflammatory stimulus, such as arthritis or periodontitis, is present.

“This requirement [for a secondary challenge] epitomizes the concept of trained immunity: the training stimulus causes a state of preparedness for future events,” says Hajishengallis.

Importantly, these results argue against the idea that the initial stimulus alone determines whether TRIM is beneficial or maladaptive (harmful), as β-glucan caused beneficial TRIM (for example, tumor growth inhibition) in previous studies by Hajishengallis and Chavakis.

“Our findings suggest that the context in which TRIM emerges dictates whether the functional outcome is protective or harmful,” says Chavakis.

“The double-edged sword nature of TRIM acquires special relevance when considering the preventive or therapeutic application of TRIM-inducing agents,” adds Hajishengallis.

Reference:

Nora Haacke, Hui Wang, Shu Yan, Marko Barovic, Xiaofei Li, Kosuke Nagai, Adelina Botezatu, Aikaterini Hatzioannou, Bettina Gercken, Giulia Trimaglio, Anisha U. Shah, Jun Wang, Ling Ye, Mangesh T. Jaykar, Martina Rauner, Ben Wielockx, Kyoung-Jin Chung, Mihai G. Netea, Lydia Kalafati, George Hajishengallis, Triantafyllos Chavakis, Innate immune training of osteoclastogenesis promotes inflammatory bone loss in mice, Developmental Cell, https://doi.org/10.1016/j.devcel.2025.02.001

Lumican Identified as Key Extracellular Matrix-Related Biomarker in Diabetic Nephropathy Through Integrated Bioinformatics Analysis

This study highlights lumican as a promising biomarker for predicting the onset and progression of diabetic nephropathy and reveals its close association with extracellular matrix remodeling, a hallmark of diabetic kidney disease. The findings offer new perspectives for the clinical diagnosis and targeted treatment of diabetic nephropathy, potentially improving patient outcomes.

Using computational bioinformatics techniques, potential biomarkers for diabetic nephropathy were identified, with a particular emphasis on the key gene lumican and its involvement in disease-related molecular mechanisms. Publicly accessible microarray datasets, GSE96804 and GSE30528, from the Gene Expression Omnibus database were analyzed to detect genes that were differentially expressed between diabetic nephropathy patients and healthy controls. Gene Ontology enrichment analysis and Gene Set Enrichment Analysis based on the Kyoto Encyclopedia of Genes and Genomes database revealed that these differentially expressed genes were significantly involved in biological processes such as the cellular response to hexose and organization of cell–cell junctions, as well as pathways related to amino acid metabolism and adipokine signaling.

Central regulatory genes were identified using Cytoscape software, and their clinical significance was further explored using the Nephroseq database. These genes were found to be upregulated in kidney tissues of patients with diabetic nephropathy, and their expression levels showed a negative correlation with estimated glomerular filtration rate, indicating deterioration in renal function.

Among the core genes, lumican demonstrated substantial overexpression in diabetic nephropathy tissues, which was validated through immunohistochemistry and immunofluorescence using patient samples. Functional enrichment analysis of lumican and its interacting protein network revealed their strong association with extracellular matrix organization. These results establish lumican as a potential diagnostic and therapeutic target in diabetic nephropathy and provide novel insights into the underlying mechanisms of kidney tissue remodeling in the context of diabetes.
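As background on the kind of screening step described above, the sketch below illustrates differential expression analysis between patient and control samples, using Welch's t-test with Benjamini–Hochberg false-discovery-rate correction on a toy matrix. The actual analysis of GSE96804 and GSE30528 may have used different methods, so the test choice and cutoffs here are assumptions.

```python
# Minimal sketch: flagging differentially expressed genes between diabetic
# nephropathy (DN) and control samples from a genes-by-samples expression
# matrix. Illustrative only; not the specific pipeline used for GSE96804
# and GSE30528 (thresholds and test choice here are assumptions).
import numpy as np
from scipy import stats
from statsmodels.stats.multitest import multipletests

def differential_expression(expr, is_dn, fdr=0.05, min_log2fc=1.0):
    """expr: array of shape (n_genes, n_samples), log2-scale expression.
    is_dn: boolean array of length n_samples (True = DN patient)."""
    dn, ctrl = expr[:, is_dn], expr[:, ~is_dn]
    log2fc = dn.mean(axis=1) - ctrl.mean(axis=1)          # log2 fold change
    res = stats.ttest_ind(dn, ctrl, axis=1, equal_var=False)
    reject, qvals, _, _ = multipletests(res.pvalue, alpha=fdr, method="fdr_bh")
    return reject & (np.abs(log2fc) >= min_log2fc), log2fc, qvals

# Toy example: 1,000 genes, 20 DN samples and 20 controls
rng = np.random.default_rng(0)
expr = rng.normal(8, 1, size=(1000, 40))
expr[:10, :20] += 2.0                                     # spike in 10 "DEGs"
is_dn = np.array([True] * 20 + [False] * 20)
deg_mask, fc, q = differential_expression(expr, is_dn)
print("differentially expressed genes flagged:", int(deg_mask.sum()))
```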

Keywords: Diabetic nephropathy, bioinformatics, biomarkers, hub gene, lumican, extracellular matrix, gene expression, renal function, glomerulosclerosis, transcriptomic analysis

References:

Tao, Y., Liu, Y., Wang, Z., Tang, L., Zhang, Y., Zheng, S., … Liu, S. (2025). Lumican as a potential biomarker for diabetic nephropathy. Renal Failure, 47(1). https://doi.org/10.1080/0886022X.2025.2480245

Parasitic infection and treatment linked to cancer-related gene activity in cervix: Study

New research has revealed that Schistosoma haematobium (S. haematobium), a parasitic infection affecting millions globally, can trigger cancer-related gene activity in the cervical lining, with changes becoming even more pronounced after treatment. Presented today at ESCMID Global 2025, this pivotal study sheds new light on how this often-overlooked parasitic disease may contribute to cervical cancer risk at the molecular level.

Schistosomiasis is a widespread parasitic disease, particularly prevalent in regions with poor access to clean water and sanitation. S. haematobium, one of the main species responsible for human schistosomiasis, infects over 110 million people worldwide by depositing eggs that infiltrate the urinary and reproductive tracts. While this parasite is recognised as a cause of bladder cancer, its potential role in cervical cancer has remained poorly understood.

In this study, researchers analysed cervical tissue samples from 39 Tanzanian women with (n=20) and without (n=19) S. haematobium infection. Infected women received praziquantel treatment, and samples were collected at baseline and 4-12 months post-treatment. Through RNA sequencing and gene expression analysis, cancer-related pathways linked to infection were identified. Nine genes were expressed differently between infected and uninfected women, 23 genes changed in women who cleared the infection after treatment, and 29 genes differed between women post-treatment and those never infected.

Among the nine most significantly altered genes between infected and uninfected women, four were linked to cancer:

  • BLK proto-oncogene: A tyrosine kinase that drives cell proliferation and can contribute to tumour formation when dysregulated
  • Long Intergenic Non-Protein Coding RNA 2084: A prognostic marker in head, neck, and colon cancers, influencing gene regulation tied to tumour progression
  • Trichohyalin: Involved in keratin complex formation and upregulated in certain cancers
  • TCL1 family AKT coactivator A: Promotes cell survival and proliferation, and is linked to T- and B-cell lymphomas

Post-treatment, certain cancer-related biological pathways became more active, particularly those involved in inflammation, tissue remodelling, and the breakdown of protective barriers in the cervix. These changes were linked to increased blood vessel formation, activation of tumour-related processes, and reduced programmed cell death (apoptosis)—a key mechanism for eliminating abnormal cells.

“The findings suggest that infection may trigger molecular changes that make women more vulnerable to cancer-related processes in the cervix, especially after treatment,” explains Dr. Anna Maria Mertelsmann, lead study author. “One particularly concerning observation was the downregulation of genes responsible for maintaining cervical tissue integrity, including claudins and tight junction proteins. This loss of protective function could facilitate HPV infection and persistence, a major risk factor for cervical cancer.”

“Our research shows that women who received praziquantel treatment exhibited more genetic changes linked to cancer than those with an active infection,” Dr. Mertelsmann added. “This raises critical questions about the long-term effects of treatment and highlights the need for careful post-treatment monitoring.”

This study serves as an important first step in understanding the role of S. haematobium in cervical cancer, and a larger study following 180 women over 12 months is currently underway to confirm these findings. Future research will also explore whether women who have had schistosomiasis are at greater risk of cervical cancer due to long-term HPV infections.

Dr. Mertelsmann and her team stress the need for greater awareness of Female Genital Schistosomiasis (FGS), as many women with S. haematobium are also affected by this difficult-to-diagnose condition. “Women diagnosed with S. haematobium should be closely monitored for early signs of cervical tissue abnormalities,” she emphasised. She also suggested that additional treatments, such as anti-inflammatory or immune-modulating therapies, could help counteract the harmful effects seen after treatment. Moreover, widespread HPV vaccination could play a crucial role in reducing cervical cancer risk for women affected by schistosomiasis.

Reference:

Parasitic infection and treatment linked to cancer-related gene activity in the cervix, Meeting: ESCMID Global 2025.

Study Links Chronic Endometritis to Abnormal Uterine Bleeding; Antimicrobial Therapy Shows Promise

Italy: A recent prospective observational study published in the International Journal of Gynecology & Obstetrics has revealed a high prevalence of chronic endometritis (CE) in women with nonstructural abnormal uterine bleeding (AUB), with 70.3% of participants diagnosed with the condition.

The study highlights the significant role of CE in abnormal bleeding patterns in the absence of structural uterine abnormalities. Targeted antimicrobial therapy led to notable improvements in bleeding outcomes, with women cured of CE experiencing fewer bleeding days, better pictorial blood assessment chart scores at 3 and 6 months, and higher serum hemoglobin and ferritin levels than those with persistent CE. These findings emphasize the potential benefits of antimicrobial treatment in managing nonstructural AUB.

The causal relationship between chronic endometritis and nonstructural abnormal uterine bleeding remains inadequately explored, necessitating further investigation. Therefore, Pierpaolo Nicolì from the University of Bari “Aldo Moro,” Policlinico of Bari, Bari, BA, Italy, and colleagues aimed to assess the prevalence of chronic endometritis in women with nonstructural abnormal uterine bleeding and evaluate the impact of CE treatment on menstrual blood loss patterns.

For this purpose, the researchers conducted a prospective study between 2022 and 2024 at the University of Bari, Italy, involving women aged 20–45 with nonstructural AUB undergoing hysteroscopy. Chronic endometritis was diagnosed based on hysteroscopic and histologic/immunohistochemical (HIS/IHC) criteria. Women with CE received culture-guided therapy, and cure was confirmed by triple negativity (hysteroscopy, HIS/IHC, and culture) in Group A. If CE persisted, up to three therapy courses were administered (Group B). Participants completed a bleeding characteristics questionnaire and had serum hemoglobin and ferritin levels assessed at enrollment and post-treatment.

The following were the key findings of the study:

  • Chronic endometritis was diagnosed in 70.3% of women with nonstructural abnormal uterine bleeding (AUB) enrolled in the study.
  • Among the 102 CE patients, 81 (79.4%) showed CE resolution after therapy (group A), while 21 (20.6%) had persistent CE (group B).
  • The duration of heavy bleeding before treatment (baseline) was similar in both groups.
  • At the end of treatment, group A showed significant reductions in days of heavy bleeding, spotting, and Pictorial Blood Assessment Chart (PBAC) scores compared to their baseline and group B.
  • Serum hemoglobin and ferritin levels were significantly higher in women with resolved CE (group A) than those with persistent CE (group B).
  • PBAC scores in group A remained significantly improved 3 and 6 months after treatment, indicating persistent benefits.
  • Both univariate and multivariate regressions revealed a significant association between the cure of CE and reduced bleeding in patients.

The study highlights a high prevalence of chronic endometritis among women with nonstructural abnormal uterine bleeding. Notably, targeted antimicrobial therapy proved effective, with CE cure leading to marked and sustained improvements in bleeding patterns, emphasizing the clinical value of identifying and treating CE in this population.

Reference:

Cicinelli, E., Nicolì, P., Vimercati, A., Cicinelli, R., Marinaccio, M., Matteo, M., & Vitagliano, A. High prevalence of chronic endometritis in women with nonstructural abnormal uterine bleeding and benefits of antimicrobial treatment on blood loss pattern: A prospective, observational study. International Journal of Gynecology & Obstetrics. https://doi.org/10.1002/ijgo.70115
