Higher TyG-BMI Increases Risk of Developing Multiple Cardio-Renal-Metabolic Conditions: Study Finds

China: A recent study published in Cardiovascular Diabetology highlights the role of the triglyceride glucose-body mass index (TyG-BMI) in the development and progression of cardio-renal-metabolic (CRM) multimorbidity. The large-scale study of 349,974 individuals revealed a significant association between TyG-BMI and both the onset and the progression of CRM disease.

“Multivariable analysis indicated that for every standard deviation increase in TyG-BMI, the risk of developing one, two, or three CRM conditions rose by 32%, 24%, and 23%, respectively, over 14 years. These findings suggest that TyG-BMI could serve as a valuable marker for the early detection of CRM diseases,” the authors reported.

Cardio-renal-metabolic multimorbidity, involving cardiovascular, renal, and metabolic disorders, poses major healthcare challenges. Identifying predictive markers is crucial for early intervention and better disease management. While TyG-BMI has been linked to cardiovascular outcomes, its role in the progression of CRM multimorbidity remains unclear. Previous studies were limited by small sample sizes and overlooked the competing effects of other CRM conditions.
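For reference, the TyG-BMI index is commonly computed in this literature as the TyG index, ln(fasting triglycerides [mg/dL] × fasting glucose [mg/dL] / 2), multiplied by BMI; the sketch below illustrates that common definition (the study summary itself does not spell out the formula, so treat this as the conventional calculation rather than the authors' exact one).

```python
import math

def tyg_bmi(triglycerides_mg_dl: float, glucose_mg_dl: float, bmi: float) -> float:
    """TyG-BMI as commonly defined: TyG index (ln of TG x fasting glucose / 2) times BMI."""
    tyg = math.log(triglycerides_mg_dl * glucose_mg_dl / 2)
    return tyg * bmi

# Example: TG 150 mg/dL, fasting glucose 100 mg/dL, BMI 27 kg/m^2
print(round(tyg_bmi(150, 100, 27), 1))  # ~240.9
```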

Xuerui Tan, Department of Cardiology, The First Affiliated Hospital of Shantou University Medical College, Shantou, Guangdong, China, and colleagues aimed to address these gaps by analyzing a large UK Biobank cohort of nearly 350,000 individuals using a multi-state model to assess the impact of TyG-BMI on CRM multimorbidity progression.

For this purpose, the researchers utilized data from the large-scale, prospective UK Biobank cohort. CRM multimorbidity was defined as the onset of ischemic heart disease, type 2 diabetes mellitus, or chronic kidney disease during follow-up.

Multivariable Cox regression was applied to assess the independent association between TyG-BMI and each stage of CRM multimorbidity (first, double, or triple CRM diseases). The C-statistic evaluated model performance, while a restricted cubic spline analyzed the dose–response relationship. A multi-state model further examined the progression of CRM multimorbidity, tracking transitions from no CRM disease to the first, double, and triple CRM conditions with disease-specific analyses.
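As an illustration of the per-SD hazard ratio estimation described here, the following is a minimal sketch using the Python lifelines library on synthetic data; the column names, covariates, and simulated values are hypothetical, not the study's. The full multi-state analysis would fit separate models for each transition; this sketch covers only a single transition (no CRM disease to first CRM disease).

```python
# Minimal sketch (not the study's code) of a per-SD hazard ratio from
# multivariable Cox regression, using lifelines on a synthetic cohort.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(42)
n = 5000
tyg_bmi = rng.normal(220, 35, n)                         # hypothetical TyG-BMI values
tyg_bmi_sd = (tyg_bmi - tyg_bmi.mean()) / tyg_bmi.std()  # standardize: HR is per SD
risk = np.exp(0.3 * tyg_bmi_sd)                          # simulate a true per-SD HR ~1.35
time = rng.exponential(14 / risk)                        # years to first CRM event
df = pd.DataFrame({
    "time": np.minimum(time, 14),                        # censor at 14 years' follow-up
    "event": (time < 14).astype(int),
    "tyg_bmi_sd": tyg_bmi_sd,
    "age": rng.normal(56, 8, n),                         # hypothetical covariate
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()                  # exp(coef) for tyg_bmi_sd is the per-SD hazard ratio
print(cph.concordance_index_)        # the C-statistic used to judge model performance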

The key findings of the study were as follows:

  • The study included 349,974 participants with a mean age of 56.05 years (SD: 8.08), of whom 55.93% were female. Over a median follow-up of 14 years, 56,659 (16.19%) participants without baseline CRM disease developed at least one CRM condition.
  • Among them, 8,451 (14.92%) progressed to double CRM disease and 789 (9.34%) further developed triple CRM disease.
  • Each SD increase in TyG-BMI was linked to a 47% higher risk of the first CRM disease, a 72% higher risk of double CRM disease, and a 95% higher risk of triple CRM disease, with C-statistics of 0.625, 0.694, and 0.764, respectively.
  • Multi-state model analysis indicated a 32% increased risk of new CRM disease, a 24% increased risk of progression to double CRM disease, and a 23% increased risk of further progression.
  • TyG-BMI showed a significant association with the onset of all individual first CRM diseases, except for stroke, and with the transition to double CRM disease.
  • Significant interactions were observed, but TyG-BMI remained independently associated with CRM multimorbidity across subgroups.
  • Sensitivity analyses, including variations in time intervals and an expanded CRM definition (incorporating atrial fibrillation, heart failure, peripheral vascular disease, obesity, and dyslipidaemia), reinforced these findings.

“TyG-BMI significantly influences the onset and progression of CRM multimorbidity, making it a valuable marker for the early identification of high-risk individuals. Its integration into prevention and management strategies could help reduce the burden of CRM multimorbidity and improve long-term health outcomes,” the authors wrote.

“Further research is needed to explore its applicability across diverse populations and its potential role in clinical practice and public health policies,” they concluded.

Reference:

Tang, H., Huang, J., Zhang, X. et al. Association between triglyceride glucose-body mass index and the trajectory of cardio-renal-metabolic multimorbidity: insights from multi-state modelling. Cardiovasc Diabetol 24, 133 (2025). https://doi.org/10.1186/s12933-025-02693-w


Facial Photograph-Based AI Tool Promising in Thyroid Eye Disease Assessment: Study Shows

South Korea: A recent study published in Ophthalmology Science found that a deep learning (DL)-driven method for measuring proptosis in thyroid eye disease (TED) patients from standard facial photographs shows promising accuracy, potentially transforming traditional exophthalmometry practice.

The AI model demonstrated a mean absolute error of just 1.27 mm, correlated strongly with conventional measurements (r = 0.82), and achieved high diagnostic accuracy (AUC: 0.91). It accurately identified significant proptosis changes (≥2 mm) in 74.6% of cases, highlighting its potential as a reliable, non-invasive tool for clinical use.

Proptosis, a common feature of thyroid eye disease, causes forward bulging of the eyeball due to tissue expansion within the orbit. It can impair vision, cause discomfort, and affect appearance—impacting patients’ quality of life. While tools like the Hertel exophthalmometer remain the gold standard for measuring proptosis, they require training and equipment. To address this, Jae Hoon Moon, Center for Artificial Intelligence in Healthcare, Seoul National University Bundang Hospital, Seongnam, Republic of Korea, and colleagues aimed to develop and evaluate a deep learning-based system to measure proptosis from facial photographs, offering a simpler, non-invasive alternative.

For this purpose, the researchers conducted a retrospective cohort study involving 1,108 TED patients from Severance Hospital and 171 from Seoul National University Bundang Hospital. They developed a deep learning-assisted system trained on 1,610 facial images paired with Hertel exophthalmometry values and validated it using 511 external images. The model, based on a dual-stream ResNet-18 neural network, used both RGB images and depth maps from the ZoeDepth algorithm. Its performance was evaluated using MAE, Pearson’s r, ICC, and AUC.
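As a rough sketch of what a dual-stream ResNet-18 regressor of this kind might look like, consider the PyTorch example below; the fusion strategy, input sizes, and regression head are plausible assumptions, not the paper's actual architecture, and the depth map is assumed to be precomputed (e.g., by a monocular depth estimator such as ZoeDepth).

```python
# Plausible sketch (assumptions, not the paper's code) of a dual-stream
# ResNet-18 regressor fusing an RGB facial image with a 1-channel depth map.
import torch
import torch.nn as nn
from torchvision.models import resnet18

class DualStreamExophthalmometer(nn.Module):
    def __init__(self):
        super().__init__()
        self.rgb_stream = resnet18(weights=None)
        self.rgb_stream.fc = nn.Identity()            # expose 512-d features
        self.depth_stream = resnet18(weights=None)
        # Accept a 1-channel depth map instead of 3-channel RGB input.
        self.depth_stream.conv1 = nn.Conv2d(1, 64, kernel_size=7, stride=2,
                                            padding=3, bias=False)
        self.depth_stream.fc = nn.Identity()
        self.head = nn.Linear(512 * 2, 1)             # regress proptosis in mm

    def forward(self, rgb, depth):
        feats = torch.cat([self.rgb_stream(rgb), self.depth_stream(depth)], dim=1)
        return self.head(feats)

model = DualStreamExophthalmometer()
out = model(torch.randn(2, 3, 224, 224), torch.randn(2, 1, 224, 224))  # shape (2, 1)
```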

Based on the study, the researchers reported the following findings:

  • The deep learning-assisted system achieved a mean absolute error (MAE) of 1.27 mm on the Severance Hospital (SH) dataset and 1.24 mm on the Seoul National University Bundang Hospital (SNUBH) dataset.
  • Pearson correlation coefficients were 0.82 for SH and 0.77 for SNUBH, indicating strong agreement with clinical measurements.
  • Intraclass Correlation Coefficients (ICCs) were 0.80 for SH and 0.73 for SNUBH, reflecting high reliability.
  • The area under the ROC curve (AUC) for detecting proptosis was 0.91 for SH and 0.88 for SNUBH.
  • The system detected significant proptosis changes (≥2 mm) with an accuracy of 74.6%.

The study introduced a deep learning-assisted system that accurately measures proptosis in patients with thyroid eye disease using simple facial photographs. This accessible and user-friendly tool presents a promising alternative to traditional exophthalmometry, expanding access to reliable measurements even in non-specialist or resource-limited settings.

“While further validation across diverse populations is needed, the system has the potential to enhance TED severity assessment, monitor disease progression, and evaluate treatment response, with future applications possibly extending beyond TED,” the authors concluded.

Reference:

Park, J., Yoon, J. S., Kim, N., Shin, K., Park, H. Y., Kim, J., Park, J., Moon, J. H., & Ko, J. (2025). Deep Learning-Driven Exophthalmometry Through Facial Photographs in Thyroid Eye Disease. Ophthalmology Science, 100791. https://doi.org/10.1016/j.xops.2025.100791


Breakthrough: Implantable microphone could lead to development of fully internal cochlear implants

Cochlear implants, tiny electronic devices that can provide a sense of sound to people who are deaf or hard of hearing, have helped improve hearing for more than a million people worldwide, according to the National Institutes of Health.

However, cochlear implants today are only partially implanted, and they rely on external hardware that typically sits on the side of the head. These components restrict users, who can’t, for instance, swim, exercise, or sleep while wearing the external unit, and they may cause others to forgo the implant altogether.

A multidisciplinary team from MIT, Massachusetts Eye and Ear, Harvard Medical School, and Columbia University has developed an implantable microphone that matches the performance of commercial external hearing aid microphones. This innovation addresses a major barrier to fully internal cochlear implants. The microphone, made from a biocompatible piezoelectric material, detects tiny movements on the ear drum's underside. To optimize functionality, the team also created a low-noise amplifier to enhance signal clarity while reducing electronic noise.

While many challenges must be overcome before such a microphone could be used with a cochlear implant, the collaborative team looks forward to further refining and testing this prototype, which builds off work begun at MIT and Mass Eye and Ear more than a decade ago.

“It starts with the ear doctors who are with this every day of the week, trying to improve people’s hearing, recognizing a need, and bringing that need to us. If it weren’t for this team collaboration, we wouldn’t be where we are today,” says Jeffrey Lang, the Vitesse Professor of Electrical Engineering, a member of the Research Laboratory of Electronics (RLE), and co-senior author of a paper on the microphone.

Lang’s coauthors include co-lead authors Emma Wawrzynek, an electrical engineering and computer science (EECS) graduate student, and Aaron Yeiser SM ’21; as well as mechanical engineering graduate student John Zhang; Lukas Graf and Christopher McHugh of Mass Eye and Ear; Ioannis Kymissis, the Kenneth Brayer Professor of Electrical Engineering at Columbia; Elizabeth S. Olson, a professor of biomedical engineering and auditory biophysics at Columbia; and co-senior author Hideko Heidi Nakajima, an associate professor of otolaryngology-head and neck surgery at Harvard Medical School and Mass Eye and Ear. The research is published today in the Journal of Micromechanics and Microengineering.

Overcoming an implant impasse

Cochlear implant microphones are usually placed on the side of the head, which means that users can’t take advantage of noise filtering and sound localization cues provided by the structure of the outer ear.

Fully implantable microphones offer many advantages. But most devices currently in development, which sense sound under the skin or motion of middle ear bones, can struggle to capture soft sounds and wide frequencies.

For the new microphone, the team targeted a part of the middle ear called the umbo. The umbo vibrates unidirectionally (inward and outward), making it easier to sense these simple movements.

Although the umbo has the largest range of movement of the middle-ear bones, it only moves by a few nanometers. Developing a device to measure such diminutive vibrations presents its own challenges.

On top of that, any implantable sensor must be biocompatible and able to withstand the body’s humid, dynamic environment without causing harm, which limits the materials that can be used.

“Our goal is that a surgeon implants this device at the same time as the cochlear implant and internalized processor, which means optimizing the surgery while working around the internal structures of the ear without disrupting any of the processes that go on in there,” Wawrzynek says.

With careful engineering, the team overcame these challenges.

They created the UmboMic, a triangular, 3-millimeter by 3-millimeter motion sensor composed of two layers of a biocompatible piezoelectric material called polyvinylidene difluoride (PVDF). These PVDF layers are sandwiched on either side of a flexible printed circuit board (PCB), forming a microphone that is about the size of a grain of rice and 200 micrometers thick. (An average human hair is about 100 micrometers thick.)

The narrow tip of the UmboMic would be placed against the umbo. When the umbo vibrates and pushes against the piezoelectric material, the PVDF layers bend and generate electric charges, which are measured by electrodes in the PCB layer.

Amplifying performance

The team used a “PVDF sandwich” design to reduce noise. When the sensor is bent, one layer of PVDF produces a positive charge and the other produces a negative charge. Electrical interference adds to both equally, so taking the difference between the charges cancels out the noise.
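A toy numeric illustration of this differential read-out is shown below, under the simplifying assumption that interference couples equally into both layers; the amplitudes and frequencies are arbitrary, not measured values from the device.

```python
# Toy illustration (simplified assumption: interference couples equally into
# both PVDF layers) of why the differential "sandwich" read-out cancels noise.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1e-3, 1000)
signal = 1e-3 * np.sin(2 * np.pi * 1e3 * t)        # umbo-driven charge (arbitrary units)
interference = 5e-3 * rng.standard_normal(t.size)  # common-mode electrical noise

top_layer = +signal + interference                 # bending charges the two layers
bottom_layer = -signal + interference              # with opposite polarity
differential = (top_layer - bottom_layer) / 2      # common-mode noise cancels

print(np.allclose(differential, signal))           # True: the signal survives
```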

Using PVDF provides many advantages, but the material made fabrication especially difficult. PVDF loses its piezoelectric properties when exposed to temperatures above around 80 degrees Celsius, yet very high temperatures are needed to vaporize and deposit titanium, another biocompatible material, onto the sensor. Wawrzynek worked around this problem by depositing the titanium gradually and employing a heat sink to cool the PVDF.

But developing the sensor was only half the battle — umbo vibrations are so tiny that the team needed to amplify the signal without introducing too much noise. When they couldn’t find a suitable low-noise amplifier that also used very little power, they built their own.

With both prototypes in place, the researchers tested the UmboMic in human ear bones from cadavers and found that it had robust performance within the intensity and frequency range of human speech. The microphone and amplifier together also have a low noise floor, which means they could distinguish very quiet sounds from the overall noise level.

“One thing we saw that was really interesting is that the frequency response of the sensor is influenced by the anatomy of the ear we are experimenting on, because the umbo moves slightly differently in different people’s ears,” Wawrzynek says.

The researchers are preparing to launch live animal studies to further explore this finding. These experiments will also help them determine how the UmboMic responds to being implanted.

In addition, they are studying ways to encapsulate the sensor so it can remain in the body safely for up to 10 years but still be flexible enough to capture vibrations. Implants are often packaged in titanium, which would be too rigid for the UmboMic. They also plan to explore methods for mounting the UmboMic that won’t introduce vibrations.

“The results in this paper show the necessary broad-band response and low noise needed to act as an acoustic sensor. This result is surprising, because the bandwidth and noise floor are so competitive with the commercial hearing aid microphone. This performance shows the promise of the approach, which should inspire others to adopt this concept. I would expect that smaller size sensing elements and lower power electronics would be needed for next generation devices to enhance ease of implantation and battery life issues,” says Karl Grosh, professor of mechanical engineering at the University of Michigan, who was not involved with this work.

Reference:

Aaron J Yeiser, Emma F Wawrzynek, John Z Zhang, Lukas Graf, Christopher I McHugh, Ioannis Kymissis, Elizabeth S Olson, Jeffrey H Lang and Hideko Heidi Nakajima, The UmboMic: a PVDF cantilever microphone, Journal of Micromechanics and Microengineering. https://doi.org/10.1088/1361-6439/ad5c6d


Secukinumab reduces bone erosion and prevents enthesiophyte progression in psoriatic arthritis: Study

A new study published in the journal Arthritis & Rheumatology used high-resolution peripheral quantitative CT (HR-pQCT) to show that individuals with psoriatic arthritis (PsA) who received secukinumab had a substantial reduction in bone erosion volume and partial healing of erosions after one year.

Psoriatic arthritis can cause severe bone destruction and loss of physical function if inflammation is not managed. Conventional radiography has limited sensitivity for detecting enthesiophytes and bone erosions, whereas high-resolution peripheral quantitative CT offers a detailed and highly reproducible method for evaluating bone microstructure, enthesiophytes, and erosions.

A prior single-arm study showed encouraging results in halting bone deterioration on HR-pQCT after 6 months of anti-IL-17 therapy. The present study used HR-pQCT to determine the impact of secukinumab on erosion and enthesiophyte development in psoriatic arthritis.

This trial randomized patients with active PsA and ≥1 erosion in the metacarpophalangeal joints (MCPJ) 2-4 in a 1:1 ratio to receive either subcutaneous secukinumab or placebo. HR-pQCT of MCPJ 2-4 was performed at baseline, week 24, and week 48. The primary outcome was the change in erosion volume on MCPJ 2-4 as determined by HR-pQCT at 24 and 48 weeks.

This study recruited 40 patients (mean age: 51.9±13.4 years; 20 [50%] male; disease duration: 4.7±6.7 years). The per-protocol analysis included the 34 participants who completed trial therapy. The secukinumab group had a substantial decrease in erosion volume from baseline through weeks 24 and 48, whereas the placebo group exhibited no change.

Enthesiophyte volume showed a similar trend, with a change of -0.1 in the secukinumab group versus placebo (p=0.067). In generalized estimating equation (GEE) analyses, the odds ratio (OR) in the secukinumab group was 0.264 for enthesiophyte progression and 2.882 for partial erosion repair.

Overall, in PsA, secukinumab shows promise in aiding partial erosion healing and inhibiting the growth of enthesiophytes. Its ability to slow new bone formation and lessen bone degradation highlights its potential value in managing the bone changes of PsA.

Source:

Jin, Y., Cheng, I. T., So, H., Lai, B. T., Ying, S. K., Kwok, K. Y., Griffith, J., Hung, V., Szeto, C.-C., Lee, J. J., Qin, L., & Tam, L.-S. (2025). Effects of secukinumab on enthesiophyte and erosion progression in psoriatic arthritis -a one-year double-blind, randomized, placebo-controlled trial utilizing high-resolution peripheral quantitative computed tomography. Arthritis & Rheumatology. https://doi.org/10.1002/art.43154


Ultrasound Outperforms Traditional Tests in Diagnosing Diaphragm Dysfunction: Study Finds

Netherlands: A recent prospective study published in Respiratory Medicine has spotlighted ultrasound as a reliable and effective diagnostic tool for assessing diaphragm dysfunction, showing strong agreement with clinical diagnoses and supporting its potential role in routine respiratory evaluations.

Diaphragm dysfunction, a frequently overlooked cause of dyspnea, is typically identified through clinical history, symptoms, and imaging. Traditionally, diagnostic methods such as fluoroscopy and phrenic nerve conduction studies have been used to assess diaphragmatic function, but their reliance on specialized equipment limits accessibility in many clinical settings.

Given ultrasound's advantages as a non-invasive, bedside, and widely available tool, Wytze S. de Boer, MD, Department of Pulmonology, Isala Hospital, Zwolle, The Netherlands, and colleagues aimed to evaluate its construct validity by comparing ultrasound-based assessments with conventional diagnostic methods in detecting diaphragm dysfunction.

For this purpose, the researchers conducted a prospective, operator-blinded study across two centers in the Netherlands involving 36 adults with suspected diaphragm dysfunction. Participants underwent fluoroscopy, pulmonary function tests, and ultrasound examinations. The primary objective was to assess the agreement between predefined ultrasound diagnostic criteria and traditional methods. Secondary outcomes included evaluating the concordance of each diagnostic approach with the treating physician’s final diagnosis, assessing the performance of individual test parameters, and determining inter-rater reliability.
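For illustration, the agreement statistics described above, percent concordance between two diagnostic criteria and inter-rater agreement, can be computed as in the following sketch; the binary ratings are made up, not the study's data (the study also used ICC for continuous measures, which is not shown here).

```python
# Minimal sketch (hypothetical data, not the study's) of percent concordance
# between two diagnostic criteria and Cohen's kappa for inter-rater agreement.
import numpy as np
from sklearn.metrics import cohen_kappa_score

ultrasound = np.array([1, 1, 0, 1, 0, 0, 1, 1, 0, 1])    # 1 = dysfunction present
traditional = np.array([1, 0, 0, 1, 0, 1, 1, 1, 0, 0])

concordance = (ultrasound == traditional).mean()
print(f"concordance: {concordance:.1%}")                  # 70.0%

rater_a = np.array([1, 1, 0, 1, 0, 0, 1, 1, 0, 1])        # two readers, same scans
rater_b = np.array([1, 1, 0, 0, 0, 0, 1, 1, 0, 1])
print(f"kappa: {cohen_kappa_score(rater_a, rater_b):.2f}")  # 0.80
```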

The study revealed the following findings:

  • Concordance between ultrasound and traditional test diagnostic criteria was observed in 55.6% of cases.
  • Ultrasound criteria aligned with the treating physician’s final diagnosis in 75.0% of cases, higher than traditional test criteria, which showed 63.9% concordance.
  • Visual ultrasound assessment, thickening fraction (TF), diaphragm excursion (DE), and fluoroscopy demonstrated high concordance with the final diagnosis at 91.4%, 90.3%, 88.3%, and 91.7%, respectively.
  • Inter-rater reliability was strong for fluoroscopy, visual ultrasound assessment, and diaphragm excursion, but was poor for thickening fraction.

The findings highlight ultrasound as a dependable and practical tool for diagnosing diaphragm dysfunction. It showed strong agreement with the treating physician’s final diagnosis and performed well across key parameters, such as visual assessment and diaphragm excursion. Moreover, the high inter-rater reliability of these measures reinforces the utility of ultrasound as a non-invasive, bedside alternative to traditional methods like fluoroscopy.

“Given its accessibility, ease of use, and diagnostic accuracy, ultrasound has the potential to become an integral part of routine clinical evaluation for suspected diaphragm dysfunction,” the authors concluded.

Reference:

de Boer, W. S., Parlevliet, K. L., Kooistra, L. A., Koster, D., Nieuwenhuis, J. A., Edens, M. A., van den Berg, J. W. K., Boomsma, M. F., Stigt, J. A., Slebos, D. J., & Duiverman, M. L. (2025). Ultrasound as Diagnostic Tool in Diaphragm Dysfunction: A Prospective Construct Validity Study. Respiratory Medicine, 108083. https://doi.org/10.1016/j.rmed.2025.108083


Study confirms accuracy of blood test for early Alzheimer’s detection in Asian populations

A study in Alzheimer’s & Dementia, a leading journal in dementia research, has demonstrated the high accuracy of plasma p-tau217 as a blood-based biomarker for detecting abnormal brain beta-amyloid (Aβ) pathology, a hallmark of Alzheimer’s disease (AD). More significantly, the study validates its effectiveness even in individuals with cerebrovascular disease (CeVD), which is highly prevalent in Asian populations. This finding can enhance early diagnosis, improve patient risk stratification, and facilitate better clinical management of AD in diverse populations.

The study was led by Dr Mitchell Lai, Senior Lecturer at the Department of Pharmacology, Yong Loo Lin School of Medicine, National University of Singapore (NUS Medicine), in collaboration with local and international experts from the National University Health System (NUHS), University of Gothenburg, Institute of Neurology at University College London, and Banner Sun Health Research Institute.

Bridging gaps in Alzheimer’s research for Asia

While blood biomarkers like p-tau217 have been extensively studied in Western populations, where CeVD is less common, this study uniquely focuses on a Singapore-based cohort, reflective of broader Asian demographics with a high CeVD burden. The results confirm that higher plasma p-tau217 levels correlate with faster cognitive decline, reinforcing its role not just as a diagnostic tool but also as a potential predictor of disease progression.

Transforming Alzheimer’s diagnosis: A potential game-changer for clinical practice

Potential clinical applications include:

  • Earlier and more precise detection: Plasma p-tau217 provides a highly sensitive and specific method for identifying Alzheimer’s pathology before severe cognitive decline occurs, potentially enabling earlier intervention and monitoring.
  • A simpler, minimally invasive diagnostic tool: Unlike costly and invasive positron emission tomography (PET) scans and cerebrospinal fluid tests, a blood-based biomarker could be easily integrated into routine clinical practice, making Alzheimer’s screening more accessible and scalable.
  • Patient risk stratification for optimised, personalised care: Adding plasma p-tau217 to routine clinical assessments allows doctors to efficiently categorise individuals into low, intermediate, and high-risk groups for Aβ pathology, enabling tailored follow-up strategies and potential early therapeutic interventions for patients.

Professor Christopher Chen, Director of the Memory, Ageing and Cognition Centre at NUHS and co-author of the study, said “This study provides strong evidence that plasma p-tau217 could be a game-changer for early detection of AD brain changes in Asian populations with high CeVD burden. A blood-based biomarker like p-tau217 brings us closer to a more accessible approach to diagnosing and managing AD in Singapore and beyond”.

Dr Joyce Chong, a Research Fellow with the Department of Pharmacology, NUS Medicine, and first author of the study, added “Although blood biomarkers are not expected to replace the current gold standard in clinical measures such as amyloid PET, their greatest value may lie in providing a cost-effective, minimally-invasive screening and risk-stratification tool to help reduce the proportion of individuals requiring confirmatory PET scans.”

Looking forward, the team hopes to expand the study both in the length of follow-up, as well as the diversity of investigated biomarkers. Dr Lai said, “There is increasing awareness that dementia is a chronic condition arising from complex, interacting processes, especially in our population where CeVD is likely to be an important contributor to the cognitive impairments associated with AD. Our long-term goal is to be able to produce a panel of multi-modal, clinically useful biomarkers which can both suggest novel therapeutic targets as well as help in the diagnosis and prognosis of this debilitating condition.”

Reference:

Joyce R. Chong, Saima Hilal, Boon Yeow Tan, Narayanaswamy Venketasubramanian, Michael Schöll, Henrik Zetterberg, Kaj Blennow, Nicholas J. Ashton, Christopher P. Chen, Mitchell K. P. Lai, Clinical utility of plasma p-tau217 in identifying abnormal brain amyloid burden in an Asian cohort with high prevalence of concomitant cerebrovascular disease, Alzheimer's & Dementia. https://doi.org/10.1002/alz.14502


Less Can Be More: Study Finds Low-Dose Doxycycline Works Well for Scarring Alopecia, With Fewer AEs

USA: A recent study published in the Journal of the American Academy of Dermatology has highlighted the potential of low-dose doxycycline as an effective and better-tolerated treatment option for patients suffering from lymphocytic scarring alopecias. This group of conditions, which includes disorders like lichen planopilaris and frontal fibrosing alopecia, is characterized by irreversible hair loss due to inflammation and scarring of hair follicles.   

“Low-dose doxycycline demonstrated similar effectiveness to high-dose therapy in managing lymphocytic scarring alopecia while being associated with fewer adverse events and a lower rate of treatment discontinuation,” the researchers wrote.

To evaluate treatment outcomes with low-dose doxycycline in lymphocytic scarring alopecia, Carli Needle BA, The Ronald O. Perelman Department of Dermatology, NYU Grossman School of Medicine, New York, NY, and colleagues conducted a retrospective review of 241 patients diagnosed between 2009 and 2023. The cohort had a mean age of 58.4 years and was predominantly female (82.6%), with 49.8% identifying as White and 14.1% as Black. The most frequently diagnosed conditions included lichen planopilaris (50.2%), frontal fibrosing alopecia (45.6%), and central centrifugal cicatricial alopecia (15.4%). Of the total, 64 patients (27.4%) received low-dose doxycycline (20 mg twice daily, 40 mg daily, or 50 mg daily), while 175 (72.6%) were treated with high-dose regimens (50 mg twice daily, 100 mg daily, or 100 mg twice daily).

All patients also received adjunctive therapies, most commonly intralesional corticosteroids (56.4%), topical corticosteroids (46.1%), and 5% topical minoxidil (31.5%). Treatment outcomes were assessed based on inflammation severity, scalp symptoms, hair loss stability, and patients’ perceptions of their condition.

Based on the study, the researchers reported the following findings:

  • No significant differences were observed between low- and high-dose doxycycline groups in terms of improvement in inflammation severity, hair loss stability, or the number of scalp symptoms.
  • There were similar results in the subgroup of patients with lichen planopilaris and frontal fibrosing alopecia, with no significant differences across inflammation, hair loss stability, or symptom count.
  • After adjusting for adjunctive treatments, patient-reported outcomes did not significantly differ between dose groups.
  • High-dose doxycycline was associated with a higher rate of adverse events (23.4%) compared to low-dose treatment (12.1%).
  • The most common adverse events were gastrointestinal symptoms (11.4% in high-dose vs 6.0% in low-dose) and photosensitivity or rash (6.9% vs 4.6%).
  • Doxycycline dosage had to be reduced or discontinued due to adverse events in 20.4% of high-dose patients versus 9.1% of low-dose patients.

The authors concluded that low-dose doxycycline offers comparable efficacy to high-dose regimens in managing lymphocytic scarring alopecias while demonstrating improved tolerability and fewer adverse events. They emphasized the potential of low-dose therapy as a safer treatment option and highlighted the need for prospective studies to further validate its effectiveness and long-term benefits in this patient population.

Reference:

Needle, C., Brinks, A., Pulavarty, A., Kearney, C., Nohria, A., Desai, D., Shapiro, J., & Lo Sicco, K. (2025). Efficacy and Tolerability of Low-Dose Versus High-Dose Doxycycline in the Management of Lymphocytic Scarring Alopecias. Journal of the American Academy of Dermatology. https://doi.org/10.1016/j.jaad.2025.02.028


Compared to Nonsurgical Treatment, Surgery Exhibits Long-Term Durability in ASLS: JAMA Surgery

According to a new study published in JAMA Surgery, the long-term durability of surgical treatment for adult symptomatic lumbar scoliosis (ASLS) is promising; these findings may aid patient management and counseling, helping clinicians set realistic expectations and guide treatment decisions.

Long-term follow-up studies of operative and nonoperative treatment of adult symptomatic lumbar scoliosis (ASLS) are needed to assess benefits and durability. This study assessed the durability of treatment outcomes for operative vs nonoperative treatment of ASLS. The Adult Symptomatic Lumbar Scoliosis 1 (ASLS-1) study was a multicenter, prospective study with randomized and observational cohorts designed to assess operative vs nonoperative ASLS treatment. Operative and nonoperative patients were compared using as-treated analysis of the combined randomized and observational cohorts. Patients with ASLS aged 40 to 80 years were enrolled at 9 centers in North America. Data were analyzed from November 2023 to July 2024. Primary outcome measures were the Oswestry Disability Index (ODI) and Scoliosis Research Society 22 (SRS-22) at 2-, 5-, and 8-year follow-up.

The 286 enrolled patients (104 in the nonoperative group: median [IQR] age, 61.9 [54.4-68.8] years; 97 female [93%]; 182 in the operative group: median [IQR] age, 60.2 [53.5-66.6] years; 161 female [88%]) had follow-up rates at 2, 5, and 8 years of 90% (256 of 286), 70% (199 of 286), and 72% (205 of 286), respectively. At 2 years, compared with those in the nonoperative group, patients in the operative group had better ODI (mean difference = −12.98; 95% CI, −16.08 to −9.88; P < .001) and SRS-22 (mean difference = 0.57; 95% CI, 0.45-0.70; P < .001) scores, with mean differences exceeding the minimal detectable measurement difference (MDMD) for ODI (7) and SRS-22 (0.4). Mean differences at 5 years (ODI = −11.25; 95% CI, −15.20 to −7.31; P < .001; SRS-22 = 0.58; 95% CI, 0.44-0.72; P < .001) and 8 years (ODI = −14.29; 95% CI, −17.81 to −10.78; P < .001; SRS-22 = 0.74; 95% CI, 0.57-0.90; P < .001) remained as favorable as at 2 years without evidence of degradation. The treatment-related serious adverse event (SAE) incidence rates for operative patients at 2, 2 to 5, and 5 to 8 years were 22.24, 9.08, and 8.02 per 100 person-years, respectively. At 8 years, operative patients with 1 treatment-related SAE still had significant improvement, with mean treatment differences that exceeded the MDMD (ODI = −9.49; 95% CI, −14.23 to −4.74; P < .001; SRS-22 = 0.62; 95% CI, 0.41-0.84; P < .001).

Results of this nonrandomized clinical trial reveal that, on average, operative treatment for ASLS provided significantly greater clinical improvement than nonoperative treatment at 2-, 5-, and 8-year follow-up, with no evidence of deterioration. Operative patients with a treatment-related SAE still maintained greater improvement than nonoperative patients. These findings suggest long-term durability of surgical treatment for ASLS and may prove useful for patient management and counseling.

Reference:

Smith JS, Kelly MP, Yanik EL, et al. Operative vs Nonoperative Treatment for Adult Symptomatic Lumbar Scoliosis at 8-Year Follow-Up: A Nonrandomized Clinical Trial. JAMA Surg. Published online April 02, 2025. doi:10.1001/jamasurg.2025.0496



Antenatal Corticosteroids May Significantly Reduce Neurodevelopmental Concerns in Children Born at 28–33 Weeks' Gestation: Study

Antenatal corticosteroids (ACS) are widely used before preterm birth (before 37 weeks' gestation) to reduce neonatal mortality, respiratory distress syndrome and intraventricular haemorrhage. However, potential overuse of ACS is a concern, as up to half of all babies exposed to ACS are subsequently born at term (≥37 weeks' gestation), when neonatal benefits are minimal. Fetal overexposure to glucocorticoids may contribute to programming of disease later in life. Evidence on long-term neurodevelopmental outcomes associated with ACS exposure is conflicting. Better understanding of associations between ACS exposure and childhood neurodevelopment would inform clinical decision-making and elucidate how effects of ACS vary by gestational age at birth, which is strongly associated with child neurodevelopment. This longitudinal population-based study examined the associations of ACS exposure with early childhood neurodevelopment, and whether these varied with gestational age. Continuous and categorical neurodevelopmental assessments from population-wide child health reviews at 27–30 months of age were used.

This was a population-based cohort study carried out in Scotland, UK. A total of 285,637 singleton children born at 28–41 weeks' gestation between 1 January 2011 and 31 December 2017, who underwent health reviews at 27–30 months of age, were analysed. Logistic and linear regression analyses, stratified by gestation at birth (28–33, 34–36, 37–38 and 39–41 weeks' gestation), were used to evaluate the associations between ACS exposure and neurodevelopmental outcomes, adjusted for maternal age, body mass index, diabetes, antenatal smoking, parity, neighbourhood deprivation, birth year, child sex and age at review. Outcomes were practitioner-identified concerns about any neurodevelopmental domain, and the average of five domain scores on neurodevelopmental milestones from the parent-rated Ages and Stages Questionnaire (ASQ-3).
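A minimal sketch of this kind of stratified, covariate-adjusted logistic regression, with synthetic data and hypothetical variable names rather than the study's actual dataset and full covariate set, might look like this:

```python
# Minimal sketch (synthetic data, hypothetical variable names, not the study's
# code) of stratified, covariate-adjusted logistic regression with statsmodels.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 20000
df = pd.DataFrame({
    "acs_exposed": rng.integers(0, 2, n),
    "gestation_band": rng.choice(["28-33", "34-36", "37-38", "39-41"], n),
    "maternal_age": rng.normal(30, 5, n),
    "child_sex": rng.integers(0, 2, n),
})
# Simulate practitioner-identified concerns with a band-dependent ACS effect.
lin = -2 + np.where(df["gestation_band"] == "28-33", -0.24, 0.10) * df["acs_exposed"]
df["any_concern"] = (rng.random(n) < 1 / (1 + np.exp(-lin))).astype(int)

for band in ["28-33", "34-36", "37-38", "39-41"]:
    sub = df[df["gestation_band"] == band]
    fit = smf.logit("any_concern ~ acs_exposed + maternal_age + child_sex",
                    data=sub).fit(disp=False)
    lo, hi = np.exp(fit.conf_int().loc["acs_exposed"])   # adjusted OR with 95% CI
    print(f"{band} weeks: OR={np.exp(fit.params['acs_exposed']):.2f} "
          f"(95% CI {lo:.2f}-{hi:.2f})")
```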

After adjustment for covariates, ACS exposure was associated with reduced neurodevelopmental concerns in children born at 28–33 weeks' gestation (OR=0.79, 95% CI=0.62–0.999) and with increased neurodevelopmental concerns in children born at 34–36 weeks' gestation (OR=1.11, 95% CI=1.01–1.21). No independent associations emerged in children born at later gestations. ACS exposure was not associated with ASQ-3 scores in any gestational age group.

This large population-based cohort study of associations between ACS exposure and neurodevelopmental outcomes at 27–30 months of age found that in children born at 28–33 weeks' gestation, ACS exposure was associated with statistically significantly reduced odds of practitioner concerns about their neurodevelopment. Children born at 34–36 weeks' gestation who were ACS-exposed had statistically significantly increased odds of practitioner concerns about neurodevelopment compared with non-ACS-exposed children born at this gestation. In children born at 39–41 weeks' gestation, associations that were observed between ACS exposure and increased odds of practitioner neurodevelopmental concerns were attenuated to non-significance after adjusting for maternal and perinatal confounders. ACS exposure was not associated with practitioner-identified neurodevelopmental concerns in children born at 37–38 weeks' gestation, nor with parent-assessed continuous neurodevelopment in any gestational age group.

In summary, this population-based cohort study of ACS exposure and neurodevelopment in early childhood showed that, after adjusting for child age, sex, and maternal and perinatal covariates, statistically significant associations were observed between ACS exposure and reduced odds of practitioner-identified neurodevelopmental concerns in children born at 28–33 weeks' gestation, and increased odds of practitioner-identified neurodevelopmental concerns in children born at 34–36 weeks' gestation. Associations between ACS exposure and increased odds of practitioner-identified neurodevelopmental concerns in children born at 39–41 weeks' gestation were partly attenuated after adjustment for maternal and perinatal covariates. However, the effect sizes of all the aforementioned associations were small. No consistent associations between ACS exposure and continuously assessed neurodevelopment were observed. Future studies should include school performance and educational achievement outcomes to assess potential associations of ACS with neurodevelopment beyond early childhood.

Source: Emily M. Frier, Marius Lahti-Pulkkinen, Chun Lin; BJOG: An International Journal of Obstetrics & Gynaecology, 2025; 0:1–14. https://doi.org/10.1111/1471-0528.1810


L-arginine promising for pre-eclampsia prevention and treatment, suggests research

Pre-eclampsia is the leading cause of iatrogenic preterm birth. Despite this, few drugs are recommended for pre-eclampsia prevention: low-dose aspirin (when initiated before 20 weeks' gestation) and calcium supplementation (for women with low dietary intake). For treatment, only antihypertensives for blood pressure (BP) control and magnesium sulphate to prevent or treat seizures have been recommended. Research into promising new drugs for pre-eclampsia prevention and treatment is urgently needed.

In 2022, the Accelerating Innovations for Mothers (AIM) project identified L-arginine as one of five high-potential medicines under investigation for pre-eclampsia prevention. L-arginine is a precursor for the endogenous synthesis of nitric oxide, a potent vasodilator that mediates vascular smooth muscle relaxation and inhibits platelet aggregation. Abnormal placentation, dysregulation of angiogenesis, oxidative stress, and endothelial dysfunction are important aspects of the pathogenesis of pre-eclampsia. Increased L-arginine availability may therefore oppose the vasoconstriction that occurs in pre-eclampsia.

L-arginine is a semi-essential amino acid obtained through dietary intake (fish, meats, soy, nuts and seeds), protein turnover and endogenous synthesis from L-citrulline. In pregnancy, bioavailable L-arginine levels may diminish due to increased metabolic demand resulting from fetal growth. Also, when dietary L-arginine and/or L-citrulline intake is low, the de novo endogenous synthesis of L-arginine from L-citrulline cannot increase to compensate. Maternal infections such as malaria may further deplete endogenous L-arginine, making pregnant women in malaria-endemic regions particularly vulnerable to arginine deficiency. Therefore, L-arginine may be necessary to provide the substrate for nitric oxide synthesis during pregnancy. L-arginine or L-citrulline could be affordable and scalable interventions to improve birth outcomes, particularly in resource-limited settings.

Recent reviews on L-arginine in pregnancy have not explored the effects of L-arginine for prevention separately from treatment of pre-eclampsia. This has significant clinical implications for the optimal timing of supplementation initiation and for understanding which women would benefit from L-arginine. Furthermore, no review has considered the effects of L-citrulline during pregnancy on pre-eclampsia. This review examines the effects of L-arginine and L-citrulline on prevention separately from the treatment of pre-eclampsia and related maternal, fetal and neonatal outcomes.

The review evaluated the effects of L-arginine and L-citrulline (a precursor of L-arginine) on the prevention and treatment of pre-eclampsia. MEDLINE, Embase, CINAHL, Global Index Medicus and the Cochrane Library were searched through 7 February 2024. Trials administering L-arginine or L-citrulline to pregnant women, with the comparison group receiving placebo or standard care, were included. Meta-analyses were conducted separately for prevention and treatment trials, using random-effects models.
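As an illustration of the random-effects pooling named here, the following sketch applies the commonly used DerSimonian-Laird estimator to made-up per-trial relative risks; the inputs are hypothetical, not the review's data, and the review's actual software and estimator may differ.

```python
# Compact sketch (made-up inputs) of random-effects pooling of log relative
# risks via the DerSimonian-Laird estimator.
import numpy as np

log_rr = np.log(np.array([0.45, 0.60, 0.50, 0.70]))   # hypothetical per-trial RRs
se = np.array([0.30, 0.25, 0.35, 0.28])               # hypothetical standard errors

w_fixed = 1 / se**2                                   # inverse-variance weights
q = np.sum(w_fixed * (log_rr - np.average(log_rr, weights=w_fixed))**2)
c = w_fixed.sum() - (w_fixed**2).sum() / w_fixed.sum()
tau2 = max(0.0, (q - (len(log_rr) - 1)) / c)          # between-study variance

w = 1 / (se**2 + tau2)                                # random-effects weights
pooled = np.average(log_rr, weights=w)
se_pooled = np.sqrt(1 / w.sum())
ci = np.exp(pooled + np.array([-1.96, 1.96]) * se_pooled)
print(f"pooled RR {np.exp(pooled):.2f} (95% CI {ci[0]:.2f}, {ci[1]:.2f})")
```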

Twenty randomised controlled trials (RCTs) (2028 women) and three non-randomised trials (189 women) were included. The risk of bias was 'high' in eight RCTs and showed 'some concerns' in 12. In prevention trials, L-arginine was associated with a reduced risk of pre-eclampsia (relative risk [RR] 0.52; 95% confidence interval [CI], 0.35, 0.78; low-certainty evidence, four trials) and severe pre-eclampsia (RR 0.23; 95% CI, 0.09, 0.55; low-certainty evidence, three trials). In treatment trials, L-arginine may reduce mean systolic blood pressure (mean difference [MD] −5.64 mmHg; 95% CI, −10.66, −0.62; very low-certainty evidence, three trials) and fetal growth restriction (FGR) (RR 0.46; 95% CI, 0.26, 0.81; low-certainty evidence, two trials). Only one study (36 women) examined L-citrulline and reported no effect on pre-eclampsia or blood pressure.

This is the first systematic review to evaluate the evidence on L-arginine and L-citrulline separately for the prevention and the treatment of pre-eclampsia. This distinction is critical for informing clinical practice and policy, as prevention and treatment trials target different stages of the disease's aetiology. The authors found low-certainty evidence that L-arginine in pregnancy decreases the risk of pre-eclampsia and severe pre-eclampsia among women at risk of pre-eclampsia. The evidence is very uncertain about the effect of L-arginine on mean systolic BP, and L-arginine may decrease the risk of FGR in treatment trials. L-arginine may also decrease the risk of preterm birth and small-for-gestational-age (SGA) birth and increase serum nitric oxide levels in prevention trials. The authors are uncertain of the effects of L-arginine on other secondary outcomes.

Consistent with previous reviews, they found that L-arginine may be promising for preventing pre-eclampsia. However, these data should be interpreted with caution, as the certainty of the evidence is low, requiring further research. This review defined trials including women without a diagnosis of pre-eclampsia as prevention trials; however, only four trials specifically examined the efficacy of L-arginine to prevent pre-eclampsia in women at increased risk. These trials included women who were nulliparous, had a previous history of pre-eclampsia, had pre-eclampsia in a first-degree relative, had chronic hypertension, had gestational hypertension without proteinuria or had a body mass index ≥30.

L-arginine is promising for pre-eclampsia prevention, and the authors are uncertain of its effect as a treatment for women with established pre-eclampsia. L-arginine could constitute an important addition to the management protocol of women at risk of pre-eclampsia, but further research is needed to establish the population that would benefit, the optimal dose, and the time of initiation and duration of supplementation. An individual participant data meta-analysis may provide definitive answers to these questions. Future L-arginine trials should use standardised pre-eclampsia screening tools, define the type of pre-eclampsia experienced and investigate co-administration with aspirin or calcium.

Source: Maureen Makama, Annie R. A. McDougall, Jenny Cao; BJOG: An International Journal of Obstetrics & Gynaecology, 2025; 0:1–11. https://doi.org/10.1111/1471-0528.18070
