Early surgery for acute active infective endocarditis feasible soon after stroke, claims study

Researchers reported in a new study that early valve surgery in acute infective endocarditis with preoperative stroke is feasible and non-inferior with respect to postoperative stroke risk. This conclusion, drawn from a comprehensive review of an institutional database, suggests that timely surgical intervention can safely be considered in this high-risk population. The study was published by Kareem Wasef and colleagues in The Annals of Thoracic Surgery.

Acute infective endocarditis is often associated with serious complications such as acute stroke. The appropriate timing of valve repair or replacement in these patients is controversial because an increased incidence of hemorrhagic conversion has been reported in this setting, and the timing of surgery strongly influences patient outcomes. The present study evaluated the outcome of early valve surgery compared with delayed surgery among patients who had a preoperative stroke caused by acute endocarditis.

This was a retrospective review of an institutional Society of Thoracic Surgeons database including all patients who underwent valve surgery for active infective endocarditis from 2016 to 2024. Electronic medical records provided stroke details and longitudinal follow-up. Descriptive statistics and Kaplan-Meier survival curves were used to assess outcomes and survival.
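For readers curious how such an analysis is typically set up, the sketch below fits Kaplan-Meier curves stratified by preoperative stroke status and runs an unadjusted log-rank comparison. The file and column names (followup_years, died, preop_stroke) are hypothetical placeholders, not the study's actual variables, and this is only an illustration of the general approach.

```python
# Minimal sketch of a Kaplan-Meier comparison on a hypothetical patient-level table.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("endocarditis_cohort.csv")  # hypothetical export of the registry

kmf = KaplanMeierFitter()
for label, grp in df.groupby("preop_stroke"):  # 0 = no preoperative stroke, 1 = preoperative stroke
    kmf.fit(grp["followup_years"], event_observed=grp["died"], label=f"preop_stroke={label}")
    print(kmf.survival_function_.tail(1))      # survival estimate at the end of follow-up

# Unadjusted comparison of the two survival curves
a = df[df["preop_stroke"] == 1]
b = df[df["preop_stroke"] == 0]
result = logrank_test(a["followup_years"], b["followup_years"],
                      event_observed_A=a["died"], event_observed_B=b["died"])
print(result.p_value)
```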

Results

The study involved 656 patients who underwent surgery for acute active infective endocarditis. Of these, 98 patients (14.9%) had a preoperative stroke: 86 (87.8%) had embolic strokes, 16 (18.6%) had micro-hemorrhages, and 12 (12.2%) had hemorrhagic strokes. The mean time from diagnosis of the preoperative stroke to surgery was 5.5 days.

• The overall postoperative stroke incidence was 2.1%: 14 of 656 patients.

• No statistically significant difference was observed in postoperative stroke rates between patients with and those without a preoperative stroke (4.1% vs 1.8%, P = 0.148); a quick check of this comparison is sketched after the list.

• Postoperative hemorrhagic stroke was more frequent in the preoperative stroke group (3.1% vs 0.5%).

• Early surgery (within 72 hours) in patients with preoperative stroke did not increase the incidence of postoperative stroke (2.6% vs 5.0%; P = 0.564).
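As referenced above, the comparison of postoperative stroke rates can be checked roughly from the counts implied by the reported percentages (4 of 98 patients with, and 10 of 558 patients without, a preoperative stroke, consistent with 14 of 656 overall). The authors' exact test is not stated, so the sketch below is only an illustration of that kind of contingency-table comparison.

```python
# Counts implied by the reported rates: 4/98 postoperative strokes with a
# preoperative stroke vs 10/558 without (together, 14/656 overall).
from scipy.stats import chi2_contingency, fisher_exact

table = [[4, 98 - 4],      # preoperative stroke: postoperative stroke yes / no
         [10, 558 - 10]]   # no preoperative stroke

odds_ratio, p_fisher = fisher_exact(table)
chi2, p_chi2, dof, expected = chi2_contingency(table, correction=False)

print(f"Fisher exact p = {p_fisher:.3f}")
print(f"Chi-square p   = {p_chi2:.3f}")  # the uncorrected chi-square lands near the reported P = 0.148
```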

The study therefore indicates that patients with acute infective endocarditis who have had a preoperative stroke have outcomes similar to those of patients without preoperative stroke. Although postoperative hemorrhagic strokes were more frequent in the preoperative stroke group, the overall risk of postoperative stroke did not differ significantly, supporting the feasibility of early surgical intervention.

These results are especially relevant for clinicians treating patients with endocarditis complicated by stroke, providing evidence that early surgery does not exacerbate postoperative stroke risk. The somewhat higher risk of hemorrhagic conversion warrants careful patient selection and perioperative management.

This study demonstrated that early valve surgery for acute endocarditis in patients with a preoperative stroke is feasible and safe. In particular, postoperative stroke risk was non-inferior in this high-risk population, supporting early surgical intervention that may improve patient outcomes. Further studies are warranted to refine patient selection and optimize perioperative care with the aim of reducing the risk of hemorrhagic conversion.

Reference:

Wasef, K., D’etcheverry, T., Hayanga, J. W. A., Wei, L., Lagazzi, L. F., Badhwar, V., & Mehaffey, J. H. (2024). Early valve surgery for endocarditis after acute embolic stroke. The Annals of Thoracic Surgery. https://doi.org/10.1016/j.athoracsur.2024.07.017


Radiographic Pincer Morphology Not Associated With Hip Osteoarthritis, Reveals Study

Radiographic pincer morphology is not associated with hip osteoarthritis, reveals a study published in Arthritis Care & Research.

The objective of this study was to assess the relationship between pincer morphology and radiographic hip osteoarthritis (RHOA) over 2, 5, 8, and 10 years' follow-up and to study the interaction between pincer morphology and pain. Individuals were drawn from the prospective Cohort Hip and Cohort Knee study. Anteroposterior pelvic and false profile radiographs were obtained. Hips free of definite RHOA (Kellgren and Lawrence [KL] grade 0 or 1) at baseline were included. Pincer morphology was defined as a lateral and/or anterior center edge angle ≥40° at baseline. Incident RHOA was defined as KL grade ≥2 or total hip replacement at follow-up. Multivariable logistic regression with generalized estimating equations was used to estimate the associations at each follow-up, expressed as unadjusted and adjusted odds ratios (ORs) with 95% confidence intervals (CIs).
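A minimal sketch of the kind of logistic regression with generalized estimating equations described above is shown below; the file name, column names, and covariate set are illustrative assumptions, not the authors' exact specification.

```python
# Sketch of a logistic model fitted with generalized estimating equations,
# clustering the two hips of each participant; file and column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

hips = pd.read_csv("check_hips.csv")

model = smf.gee(
    "incident_rhoa ~ pincer + age + sex + bmi",  # illustrative covariates, not the study's exact set
    groups="participant_id",                     # both hips of one participant form a cluster
    data=hips,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
fit = model.fit()
print(np.exp(fit.params))      # odds ratios
print(np.exp(fit.conf_int()))  # 95% confidence intervals

# The pain interaction described next would be added as "pincer * hip_pain" in the formula.
```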

An interaction term was added to investigate whether pincer morphology had a different effect on symptomatic hips. Incident RHOA developed in 69 hips (5%) at 2 years' follow-up, 178 hips (14%) at 5 years, 279 hips (24%) at 8 years, and 495 hips (42%) at 10 years. No significant associations were found between pincer morphology and incident RHOA (adjusted ORs ranging from 0.35 [95% CI 0.06–2.15] to 1.50 [95% CI 0.94–2.38]). Significant interactions between pain and anterior pincer morphology in predicting incident RHOA were found at 5, 8, and 10 years' follow-up (ORs from 1.97 [95% CI 1.03–3.78] to 3.41 [95% CI 1.35–8.61]). Thus, radiographic pincer morphology alone was not associated with incident RHOA at any follow-up moment; anteriorly located pincer morphology combined with hip pain, however, was significantly associated with incident RHOA. This highlights the importance of studying symptoms and hip morphology simultaneously.

Reference:

Riedstra, N.S., Boel, F., van Buuren, M., Eygendaal, D., Bierma-Zeinstra, S., Runhaar, J. and Agricola, R. (2024), Pincer Morphology Is Not Associated With Hip Osteoarthritis Unless Hip Pain Is Present: Follow-Up Data From a Prospective Cohort Study. Arthritis Care Res. https://doi.org/10.1002/acr.25285


Cranberry Juice Reduces UTIs More Effectively Than Cranberry Tablets and Liquids, Shows research

Australia: European Urology Focus published a review indicating that evidence with moderate to low certainty supports the use of cranberry juice for preventing urinary tract infections (UTIs). 

Increased fluid intake can lower the rate of UTIs compared to no treatment, but cranberry juice offers even more effective clinical outcomes by further reducing UTI rates and antibiotic use, making it a valuable option for UTI management, the researchers wrote. 

With more than 50% of women experiencing at least one episode of urinary tract infection annually and the growing issue of antimicrobial resistance, it is crucial to identify evidence supporting potential non-drug interventions. 

This study, by Christian Moro (Faculty of Health Sciences and Medicine, Bond University, Gold Coast) and colleagues, aimed to compare the effectiveness of cranberry juice, cranberry tablets, and increased fluid intake in managing UTIs.

In this study, PubMed, Embase, and Cochrane CENTRAL were searched for randomized controlled trials. The primary outcome was the number of UTIs, while secondary outcomes included UTI symptoms and antimicrobial use. The risk of bias was assessed using the Cochrane risk of bias tool, and the certainty of evidence was evaluated using the Grading of Recommendations Assessment, Development, and Evaluation (GRADE) system.
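The percent reductions reported below come from pooled rate ratios. As a simplified illustration of the pairwise pooling step that a network meta-analysis builds on, the sketch below combines entirely made-up study-level log rate ratios by inverse-variance weighting and converts the pooled estimate into a "% lower rate"; none of these numbers are the review's data.

```python
# Toy fixed-effect, inverse-variance pooling of made-up study log rate ratios
# (cranberry juice vs no treatment), then conversion to a percent reduction.
import numpy as np

log_rr = np.array([-0.60, -0.90, -0.75])  # hypothetical per-study log rate ratios
se = np.array([0.30, 0.40, 0.35])         # hypothetical standard errors

w = 1.0 / se**2                           # inverse-variance weights
pooled = np.sum(w * log_rr) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))

rr = np.exp(pooled)
ci_low, ci_high = np.exp(pooled - 1.96 * pooled_se), np.exp(pooled + 1.96 * pooled_se)
print(f"Pooled rate ratio {rr:.2f} (95% CI {ci_low:.2f} to {ci_high:.2f})")
print(f"Percent reduction: {(1 - rr) * 100:.0f}%")  # e.g. a rate ratio of 0.46 is a 54% lower rate
```

A full network meta-analysis additionally combines direct and indirect comparisons (for example, juice versus tablets via a shared comparator), which is beyond this sketch.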

The key points of research were as follows:

  • 18 studies showed that cranberry juice consumption resulted in a 54% lower rate of UTIs compared to no treatment.
  • Cranberry juice consumption led to a 27% lower rate of UTIs compared to placebo liquid.
  • Cranberry juice use resulted in a 49% lower rate of antibiotic use compared to placebo liquid.
  • Cranberry juice led to a 59% lower rate of antibiotic use compared to no treatment, based on a network meta-analysis of six studies.
  • The use of cranberry compounds also reduced the prevalence of symptoms associated with UTIs.

Researchers concluded that moderate- to low-certainty evidence supports the use of cranberry juice for UTI prevention. While increasing fluid intake lowers UTI rates compared to no treatment, cranberry juice offers even better clinical outcomes by reducing both UTIs and antibiotic use, making it a recommended option for managing UTIs.

References

Moro C, Phelps C, Veer V, Jones M, Glasziou P, Clark J, Tikkinen KAO, Scott AM. Cranberry Juice, Cranberry Tablets, or Liquid Therapies for Urinary Tract Infection: A Systematic Review and Network Meta-analysis. Eur Urol Focus. 2024 Jul 18:S2405-4569(24)00122-6. doi: 10.1016/j.euf.2024.07.002. Epub ahead of print. PMID: 39030132. 


Six distinct types of depression identified in Stanford Medicine-led study

In the not-too-distant future, a screening assessment for depression could include a quick brain scan to identify the best treatment.

Brain imaging combined with machine learning can reveal subtypes of depression and anxiety, according to a new study led by researchers at Stanford Medicine. The study, to be published June 17 in the journal Nature Medicine, sorts depression into six biological subtypes, or “biotypes,” and identifies treatments that are more likely or less likely to work for three of these subtypes.

Better methods for matching patients with treatments are desperately needed, said the study’s senior author, Leanne Williams, PhD, the Vincent V.C. Woo Professor, a professor of psychiatry and behavioral sciences, and the director of Stanford Medicine’s Center for Precision Mental Health and Wellness. Williams, who lost her partner to depression in 2015, has focused her work on pioneering the field of precision psychiatry.

Around 30% of people with depression have what’s known as treatment-resistant depression, meaning multiple kinds of medication or therapy have failed to improve their symptoms. And for up to two-thirds of people with depression, treatment fails to fully reverse their symptoms to healthy levels.

That’s in part because there’s no good way to know which antidepressant or type of therapy could help a given patient. Medications are prescribed through a trial-and-error method, so it can take months or years to land on a drug that works — if it ever happens. And spending so long trying treatment after treatment, only to experience no relief, can worsen depression symptoms.

“The goal of our work is figuring out how we can get it right the first time,” Williams said. “It’s very frustrating to be in the field of depression and not have a better alternative to this one-size-fits-all approach.”

Biotypes predict treatment response

To better understand the biology underlying depression and anxiety, Williams and her colleagues assessed 801 study participants who were previously diagnosed with depression or anxiety using the imaging technology known as functional MRI, or fMRI, to measure brain activity. They scanned the volunteers’ brains at rest and when they were engaged in different tasks designed to test their cognitive and emotional functioning. The scientists narrowed in on regions of the brain, and the connections between them, that were already known to play a role in depression.

Using a machine learning approach known as cluster analysis to group the patients’ brain images, they identified six distinct patterns of activity in the brain regions they studied.
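A minimal sketch of that kind of clustering step is shown below, using k-means on a matrix of brain-circuit activity scores. The feature count, scaling choices, and fixed six clusters are illustrative assumptions, not the study's actual pipeline.

```python
# Illustrative clustering of brain-circuit activity scores into six groups.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.normal(size=(801, 12))  # stand-in for 801 participants x 12 circuit scores

X_scaled = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=6, n_init=20, random_state=0).fit_predict(X_scaled)

# Each label would be a candidate "biotype"; in practice the number of clusters
# and their stability are chosen with validation metrics, not fixed in advance.
print(np.bincount(labels))
```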

The scientists also randomly assigned 250 of the study participants to receive one of three commonly used antidepressants or behavioral talk therapy. Patients with one subtype, which is characterized by overactivity in cognitive regions of the brain, experienced the best response to the antidepressant venlafaxine (commonly known as Effexor) compared with those who have other biotypes. Those with another subtype, whose brains at rest had higher levels of activity among three regions associated with depression and problem-solving, had better alleviation of symptoms with behavioral talk therapy. And those with a third subtype, who had lower levels of activity at rest in the brain circuit that controls attention, were less likely to see improvement of their symptoms with talk therapy than those with other biotypes.

The biotypes and their response to behavioral therapy make sense based on what they know about these regions of the brain, said Jun Ma, MD, PhD, the Beth and George Vitoux Professor of Medicine at the University of Illinois Chicago and one of the authors of the study. The type of therapy used in their trial teaches patients skills to better address daily problems, so the high levels of activity in these brain regions may allow patients with that biotype to more readily adopt new skills. As for those with lower activity in the region associated with attention and engagement, Ma said it’s possible that pharmaceutical treatment to first address that lower activity could help those patients gain more from talk therapy.

“To our knowledge, this is the first time we’ve been able to demonstrate that depression can be explained by different disruptions to the functioning of the brain,” Williams said. “In essence, it’s a demonstration of a personalized medicine approach for mental health based on objective measures of brain function.”

In another recently published study, Williams and her team showed that using fMRI brain imaging improves their ability to identify individuals likely to respond to antidepressant treatment. In that study, the scientists focused on a subtype they call the cognitive biotype of depression, which affects more than a quarter of those with depression and is less likely to respond to standard antidepressants. By identifying those with the cognitive biotype using fMRI, the researchers accurately predicted the likelihood of remission in 63% of patients, compared with 36% accuracy without using brain imaging. That improved accuracy means that providers may be more likely to get the treatment right the first time. The scientists are now studying novel treatments for this biotype with the hope of finding more options for those who don’t respond to standard antidepressants.

Further explorations of depression

The different biotypes also correlate with differences in symptoms and task performance among the trial participants. Those with overactive cognitive regions of the brain, for example, had higher levels of anhedonia (inability to feel pleasure) than those with other biotypes; they also performed worse on executive function tasks. Those with the subtype that responded best to talk therapy also made errors on executive function tasks but performed well on cognitive tasks.

One of the six biotypes uncovered in the study showed no noticeable brain activity differences in the imaged regions from the activity of people without depression. Williams believes they likely haven’t explored the full range of brain biology underlying this disorder — their study focused on regions known to be involved in depression and anxiety, but there could be other types of dysfunction in this biotype that their imaging didn’t capture.

Williams and her team are expanding the imaging study to include more participants. She also wants to test more kinds of treatments in all six biotypes, including medicines that haven’t traditionally been used for depression.

Her colleague Laura Hack, MD, PhD, an assistant professor of psychiatry and behavioral sciences, has begun using the imaging technique in her clinical practice at Stanford Medicine through an experimental protocol. The team also wants to establish easy-to-follow standards for the method so that other practicing psychiatrists can begin implementing it.

“To really move the field toward precision psychiatry, we need to identify treatments most likely to be effective for patients and get them on that treatment as soon as possible,” Ma said. “Having information on their brain function, in particular the validated signatures we evaluated in this study, would help inform more precise treatment and prescriptions for individuals.”


ALS diagnosis and survival linked to metals in blood, urine, suggests research

People with higher levels of metals found in their blood and urine may be more likely to be diagnosed with, and die from, amyotrophic lateral sclerosis, or ALS, a University of Michigan-led study suggests.

Researchers have known that ALS, a rare but fatal neurodegenerative condition, is influenced by genetic and environmental factors, including exposure to pesticides and metals.

This latest study examined the levels of metals in the blood and urine of people with and without ALS, finding that exposure to individual and mixtures of metals is associated with a greater risk for ALS and shorter survival.

The results are published in the Journal of Neurology, Neurosurgery, and Psychiatry.

“Strengthening our understanding of the importance of exposure to metals as a risk factor for ALS is essential for future targeted prevention of the disease and improved therapeutic strategies,” said senior author Stephen Goutman, M.D., M.S., director of the Pranger ALS Clinic and associate director of the ALS Center of Excellence at University of Michigan.

“Several epidemiologic studies have linked metal exposure to ALS risk. Nonetheless, it remains critical for us to understand how these metal mixtures associate with ALS risk and survival and to identify who is at greatest risk of exposure or who is most susceptible to the exposure.”

Goutman’s team measured metal levels in plasma and urine samples from over 450 people with ALS and nearly 300 people without the condition.

They found that elevated levels of individual metals, including copper, selenium, and zinc, were significantly associated with higher ALS risk and earlier death.

They then used these results to create environmental ALS risk scores, similar to the polygenic risk scores previously developed at U-M. The environmental risk scores indicated that mixtures of metals in plasma and urine are linked to around a three-times greater risk for the disease.
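One common way such an environmental risk score is built, by analogy with a polygenic risk score, is as a weighted sum of standardized exposure levels, with weights taken from single-exposure association models. The sketch below shows only that general idea with made-up measurements and weights; it is not the study's actual scoring algorithm.

```python
# Hypothetical environmental risk score: a weighted sum of standardized metal levels.
import pandas as pd

metals = pd.DataFrame({                 # made-up plasma measurements for three people
    "copper":   [14.2, 18.9, 16.1],
    "selenium": [120.0, 160.0, 140.0],
    "zinc":     [0.85, 1.20, 1.00],
})
weights = pd.Series({"copper": 0.4, "selenium": 0.3, "zinc": 0.5})  # hypothetical per-metal betas

z = (metals - metals.mean()) / metals.std()  # standardize each exposure
ers = z.mul(weights, axis=1).sum(axis=1)     # weighted sum = environmental risk score
print(ers)
```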

In this study, the inclusion of an ALS polygenic risk score to assess a potential moderating effect of underlying genetic factors did not alter the association between metal exposure and disease risk or survival.

“While several studies suggest that environmental factors like metals interact with genetic variants to influence the onset, progression and severity of ALS, our study found that accounting for ALS polygenic risk scores did not influence the relationship between metal exposure and ALS,” said co-author Kelly Bakulski, Ph.D., associate professor of epidemiology at the University of Michigan School of Public Health.

“The relationships between genes and the environment on disease risk are complex, and future insights into other genetic factors or pathways that may be involved in ALS risk and metabolism of metals could enhance our understanding.”

Investigators also discovered that participants working in occupations with a higher likelihood of metal exposure had increased levels of metal mixtures in their blood and urine.

This echoes a previous study from the research team which found that people with ALS reported higher occupational exposure to metals prior to diagnosis.

“These findings emphasize the necessity of accounting for occupational and environmental factors when evaluating a person’s overall exposure risk,” said first author Dae Gyu Jang, Ph.D., postdoctoral fellow in the U-M Health Department of Neurology.

By avoiding high risk activities associated with metal exposures, Goutman says, individuals might lower their overall exposure and potentially mitigate risk.

“Our future research will further focus on what exposures have the strongest associations and their implications on the disease,” he said.

Reference:

Jang D, Dou JF, Koubek EJ, et al. Multiple metal exposures associate with higher amyotrophic lateral sclerosis risk and mortality independent of genetic risk and correlate to self-reported exposures: a case-control study. Journal of Neurology, Neurosurgery & Psychiatry. Published Online First: 06 August 2024. doi: 10.1136/jnnp-2024-333978.


CT Enterography May Effectively Predict Stricture Severity Among Patients With Crohn's Disease: Study

Researchers have found that maximal associated small bowel dilation, stricture length, and maximal stricture wall thickness at CT enterography are sufficient to describe the severity of active Crohn disease strictures. In a recent paper in Radiology, Florian Rieder and colleagues reported that these parameters corresponded to the severity of active Crohn disease strictures and can therefore be important in guiding clinical decisions and in evaluating eligibility and efficacy in antistricture therapy trials.

Standardized methods of measuring and describing Crohn disease strictures by CT enterography are important for clinical decisions and therapeutic studies. Prior methods were not uniform and, therefore, have resulted in difficulty with treatment decisions or assessments for response. The objective of this study was to determine the reliability of CT enterography features that describe Crohn disease strictures and their relationship to stricture severity.

A retrospective study was conducted among 43 adults with symptomatic terminal ileal Crohn disease strictures who underwent standard-of-care CT enterography at the Cleveland Clinic from January 2008 to August 2016. Four abdominal radiologists, trained on standardized definitions and blinded to all patient information, assessed imaging features of the most distal ileal stricture in two separate sessions (separated by ≥2 weeks) in random order. Features with an interrater intraclass correlation coefficient (ICC) of at least 0.41 were considered reliable. Univariable and multivariable linear regression analyses determined which of these reliable features were associated with a visual analog scale (VAS) rating of overall stricture severity.
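A brief sketch of that reliability step is shown below: an interrater ICC is computed for one continuous measurement and the feature is kept only if the ICC is at least 0.41. The long-format column names, the pingouin-based calculation, and the choice of the ICC2 form are assumptions for illustration, not the authors' exact procedure.

```python
# Interrater reliability sketch for one continuous CT enterography measurement.
import pandas as pd
import pingouin as pg

# Hypothetical long-format table: one row per stricture per reader, with columns
# stricture_id, radiologist, length_mm.
ratings = pd.read_csv("stricture_ratings_long.csv")

icc_table = pg.intraclass_corr(data=ratings, targets="stricture_id",
                               raters="radiologist", ratings="length_mm")
icc2 = icc_table.loc[icc_table["Type"] == "ICC2", "ICC"].item()  # two-way random, single rater

keep_feature = icc2 >= 0.41  # reliability threshold used in the study
print(f"ICC = {icc2:.2f}, retained: {keep_feature}")
```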

Results

  • Examinations of 43 patients (mean age, 52 years ± 16; 23 female) were evaluated.

  • Five continuous measurements and six observations demonstrated at least moderate interrater reliability (interrater ICC range, 0.42 [95% CI: 0.25, 0.57] to 0.80 [95% CI: 0.67, 0.88]).

  • Ten features were univariably associated with stricture severity, and three continuous measurements—stricture length (interrater ICC, 0.64 [95% CI: 0.42, 0.81]), maximal associated small bowel dilation (interrater ICC, 0.80 [95% CI: 0.67, 0.88]), and maximal stricture wall thickness (interrater ICC, 0.50 [95% CI: 0.34, 0.62])—were independently associated (P <.001 to .003) with stricture severity in a multivariable model.

  • These measurements were used to derive a well-calibrated (optimism-adjusted calibration slope = 1.00) quantitative model of stricture severity.

This study firmly establishes that standardized CT enterography measurements describe terminal ileal Crohn disease strictures in a reliable way. It shows that some of those measures—stricture length, maximal associated small bowel dilation, and maximal wall thickness—are reliable indicators of stricture severity and may help better assess and manage Crohn’s disease and antistricture therapy trials.

Researchers described that CT enterography measurements of stricture length, maximal associated small bowel dilation, and maximal stricture wall thickness are useful markers of stricture severity in Crohn disease. Such findings are pivotal for clinical decision-making and the evaluation of antistricture therapy efficacy.

Reference:

Rieder, F., Ma, C., Hanzel, J., Fletcher, J. G., Baker, M. E., Wang, Z., Guizzetti, L., Shackelton, L. M., Rémillard, J., Patel, M., Niu, J., Ottichilo, R., Santillan, C. S., Capozzi, N., Taylor, S. A., Bruining, D. H., Zou, G., Feagan, B. G., Jairath, V., … Atzen, S. (2024). Reliability of CT enterography for describing fibrostenosing Crohn disease. Radiology, 312(2). https://doi.org/10.1148/radiol.233038


Nicotine E-Cigarettes and Varenicline Equally Effective for Smoking Cessation, Study Finds

Researchers have found that both nicotine-containing electronic cigarettes (ECs) and varenicline are effective smoking cessation aids, according to a randomized clinical trial conducted in northern Finland. The study, published in JAMA Internal Medicine by Anna Tuisku and colleagues, compared the effectiveness of these two methods in helping individuals quit smoking conventional cigarettes for up to six months.

Smoking cessation is a significant public health goal, and various aids have been developed to support individuals in quitting smoking. Nicotine-containing electronic cigarettes (ECs) and varenicline are two such aids. However, little is known about their relative effectiveness. This study aimed to fill that knowledge gap by comparing the efficacy of ECs and varenicline in a controlled trial setting.

This randomized, placebo-controlled, single-center trial recruited 561 participants aged 25 to 75 years who smoked daily and wished to quit smoking. The study took place from August 1, 2018, to February 20, 2020, with 52 weeks of follow-up. Participants were randomized into three groups:

  • EC group: 18 mg/mL nicotine-containing ECs and placebo tablets.

  • Varenicline group: Standard dosing of varenicline and nicotine-free ECs.

  • Placebo group: Placebo tablets and nicotine-free ECs.

All groups also received a motivational interview, and the intervention phase lasted for 12 weeks. The primary outcome was self-reported 7-day conventional cigarette smoking abstinence, confirmed by exhaled carbon monoxide levels at week 26. The analysis followed the intent-to-treat principle.

Of the 561 recruited participants, 458 were eligible and randomized (257 women [56%]; 201 men [44%]; mean age, 51 years). The primary outcome of smoking abstinence at week 26 was achieved by:

  • 61 of 152 participants (40.4%) in the EC group.

  • 67 of 153 participants (43.8%) in the varenicline group.

  • 30 of 153 participants (19.7%) in the placebo group (P < .001).

Pairwise comparisons showed that both ECs and varenicline significantly outperformed the placebo (a sketch reproducing the EC-vs-placebo risk difference follows these results):

  • ECs vs. placebo: Risk difference (RD) of 20.7% (95% CI, 10.4-30.4; P < .001).

  • Varenicline vs. placebo: RD of 24.1% (95% CI, 13.7-33.7; P < .001).

However, there was no significant difference between ECs and varenicline:

  • RD of 3.4% (95% CI, −7.6 to 14.3; P = .56).

  • No serious adverse events were reported.
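As referenced above, the risk differences can be reproduced approximately from the reported counts. The sketch below computes the EC-versus-placebo difference with a normal-approximation confidence interval and a two-proportion z-test; the authors' exact interval method is not stated, so the figures may differ slightly from the published ones.

```python
# Risk-difference check from the reported counts: EC 61/152 vs placebo 30/153.
import numpy as np
from statsmodels.stats.proportion import proportions_ztest

successes = np.array([61, 30])
n = np.array([152, 153])

p1, p2 = successes / n
rd = p1 - p2
se = np.sqrt(p1 * (1 - p1) / n[0] + p2 * (1 - p2) / n[1])
low, high = rd - 1.96 * se, rd + 1.96 * se
print(f"Risk difference {rd:.1%} (approx. 95% CI {low:.1%} to {high:.1%})")

stat, pval = proportions_ztest(successes, n)  # two-proportion z-test
print(f"p = {pval:.4f}")
# Close to the reported RD of 20.7% (95% CI, 10.4-30.4), P < .001.
```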

The findings indicate that both nicotine-containing ECs and varenicline are effective smoking cessation aids. The significant differences between these methods and the placebo highlight their potential for helping individuals quit smoking. The study also underscores that no serious adverse events were associated with either method, supporting their safety profiles.

This study demonstrates that both varenicline and nicotine-containing ECs are effective in helping individuals quit smoking for up to six months. The lack of significant differences between the two methods suggests that both can be considered viable options for smoking cessation. Further research may continue to explore the long-term effects and efficacy of these smoking cessation aids.

Reference:

Tuisku, A., Rahkola, M., Nieminen, P., & Toljamo, T. (2024). Electronic cigarettes vs varenicline for smoking cessation in adults: A randomized clinical trial. JAMA Internal Medicine. https://doi.org/10.1001/jamainternmed.2024.1822


Study finds lower serum and follicular fluid prolidase activity in women with PCOS undergoing assisted conception

Polycystic ovary syndrome (PCOS), the most common endocrine disorder in reproductive-aged women, affects around 3–15% of women in this age group. It is frequently linked to long-term conditions like cardiovascular disease, type 2 diabetes mellitus, obesity, and infertility, with about 70% of affected women suffering from infertility.

A recent study investigated prolidase activity in the serum and follicular fluid (FF) of women with PCOS undergoing in vitro fertilization/intracytoplasmic sperm injection (IVF/ICSI) treatment, comparing them with women with normal ovarian function. The study aimed to examine prolidase levels and their potential association with PCOS and assisted conception outcomes. A total of 50 participants were initially enrolled, with 44 included in the final analysis. Serum and FF prolidase levels were measured using spectrophotometric analysis, and correlations with various clinical parameters were explored.

Research Findings

The results revealed that serum and FF prolidase levels were significantly lower in patients with PCOS compared to those with normal ovarian function. Additionally, a direct correlation was observed between serum and FF prolidase levels. The study also found a negative correlation between serum prolidase levels and total antral follicle count, while serum and FF prolidase levels were positively correlated with blastocyst quality scoring (BQS). Despite lower BQS in PCOS patients, no statistically significant difference was observed in the clinical pregnancy rate between the groups. The findings suggest that patients with PCOS showed abnormal degradation of ovarian and follicular collagen, potentially leading to anovulation, and point to a potential role of prolidase in the pathophysiology of PCOS and to possible therapeutic applications. The authors recommended future studies with larger participant numbers to confirm the results and suggested measuring manganese levels to strengthen the findings. Limitations of the study included the small sample size and the absence of PCOS subgroup analyses.
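A small sketch of the kind of correlation analysis described above is shown below, using Spearman correlations on a hypothetical per-patient table; the file name, column names, and the choice of Spearman rather than Pearson correlation are assumptions for illustration.

```python
# Correlation sketch between prolidase levels and clinical parameters;
# the CSV file and column names are hypothetical.
import pandas as pd
from scipy.stats import spearmanr

data = pd.read_csv("pcos_prolidase.csv")

pairs = [
    ("serum_prolidase", "ff_prolidase"),           # reported as directly correlated
    ("serum_prolidase", "antral_follicle_count"),  # reported as negatively correlated
    ("ff_prolidase", "blastocyst_quality_score"),  # reported as positively correlated
]
for x, y in pairs:
    rho, p = spearmanr(data[x], data[y], nan_policy="omit")
    print(f"{x} vs {y}: rho = {rho:.2f}, p = {p:.3f}")
```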

Conclusion

Overall, the study provided valuable insights into the relationship between prolidase activity and PCOS, shedding light on the abnormal degradation of ovarian and follicular collagen in PCOS patients undergoing assisted conception. The research contributes to the understanding of PCOS pathophysiology and suggests potential implications for therapeutic interventions.

Key Points

– The study aimed to investigate prolidase activity in the serum and follicular fluid of women with polycystic ovarian syndrome (PCOS) undergoing in vitro fertilization/intracytoplasmic sperm injection (IVF/ICSI) treatment and compare them to those with normal ovarian function.

– A total of 50 participants were initially enrolled, with 44 included in the final analysis. Serum and follicular fluid prolidase levels were measured using spectrophotometric analysis, and correlations with various clinical parameters were explored.

– The results revealed that serum and follicular fluid prolidase levels were significantly lower in patients with PCOS compared to those with normal ovarian function. There was also a direct correlation between serum and follicular fluid prolidase levels.

– The study found a negative correlation between serum prolidase levels and total antral follicle count, while serum and follicular fluid prolidase levels were positively correlated with blastocyst quality scoring (BQS). Despite lower BQS in PCOS patients, no statistical difference was observed in the clinical pregnancy rate between the groups.

– The findings suggest that patients with PCOS showed abnormal degradation of ovarian and follicular collagen, potentially leading to anovulation. The study’s implications indicate the potential role of prolidase in the pathophysiology of PCOS and its therapeutic applications.

– The authors recommended future studies with larger participant numbers to confirm the results and suggested measuring manganese levels to strengthen the findings. Limitations of the study included the small sample size and the absence of PCOS subgroup analyses.

Reference:

Kadriye Erdoğan, Emine Utlu Özen, İnci Kahyaoğlu, Salim Neselioglu, Özcan Erel, Serra Akar, Özhan Özdemir, Cihangir Mutlu Ercan & Yaprak Engin Üstün (2024) Prolidase activity in women with polycystic ovarian syndrome undergoing assisted conception, Journal of Obstetrics and Gynaecology, 44:1, 2346228, DOI: 10.1080/01443615.2024.2346228


Family psychiatric history: Effects on siblings of children with autism

Children who have an older sibling with autism spectrum disorder (autism) are at greater risk of developmental vulnerabilities if they also have other relatives with neurodevelopmental or psychiatric conditions, according to a new study from the Yale Child Study Center.

Researchers found that the siblings of children with autism had an increase in the severity of social and communication difficulties, which are common in autism, if they had relatives with conditions such as schizophrenia or anxiety. Family histories of anxiety and intellectual disability were also associated with lower verbal and nonverbal skills and with less developed adaptive skills in siblings participating in the study, according to the research published in the journal Autism Research.

These findings can be useful to pediatricians in identifying infant siblings of children with autism who may be at higher risk for later developmental concerns.

“We are always on the lookout for information to help us monitor and support development of infants with known risk factors for developmental disorders. Information about family history is available at birth and may guide parents and practitioners in their developmental monitoring efforts,” said Katarzyna Chawarska, the Emily Fraser Beede Professor of Child Psychiatry at Yale School of Medicine and senior author of the study. “Considering family history of these disorders may improve efforts to predict long-term outcomes in younger siblings of children with autism and inform about factors contributing to variable phenotypic outcomes in this cohort.”

The study team, led by Chawarska, collected family history information from parents of 229 younger siblings of children with autism between March 2006 and May 2022. The siblings participated in comprehensive evaluation of social, cognitive, language, and adaptive skills.

The researchers investigated whether family history of neurodevelopmental and psychiatric conditions related to developmental outcomes of younger siblings of children with autism, controlling for variables such as the child’s birth year, age, sex assigned at birth, and family demographics.

Autism is a neurodevelopmental condition characterized by social and communication impairments as well as sensory sensitivities, repetitive behaviors, and stereotyped interests. Past research has shown that younger siblings of children with autism exhibit a wide range of developmental concerns across social, cognitive, language, and adaptive functioning.

Symptoms in some siblings are severe, span across multiple domains, and result in a diagnosis of autism, while they may be milder or present only in some developmental areas for others. Many siblings progress to develop typically.

“It is not clear what drives such heterogeneity of outcomes in younger siblings of children with autism,” said Chawarska. “Identifying factors linked with variable outcomes is essential for improving understanding of their underlying biology and for early identification of the most vulnerable siblings.”

As in prior studies, the researchers found an elevated prevalence of neurodevelopmental and psychiatric disorders in the first-, second-, and third-degree relatives of children with autism. According to Chawarska, the conditions most commonly present in relatives included anxiety disorders, schizophrenia, bipolar disorder, depression, attention-deficit/hyperactivity disorder, speech delays, and intellectual disability.

“Future studies will be necessary to disambiguate the mechanistic underpinnings of the observed associations between family history and developmental outcomes,” noted Chawarska.

However, despite a lack of clarity related to the underlying mechanisms of the observed effects, this research does suggest that family history of selected psychiatric and developmental disorders signals increased developmental vulnerabilities in younger siblings.

Reference:

Giselle Bellia, Joseph Chang, Zeyan Liew, Angelina Vernetti, Suzanne Macari, Kelly Powell, Katarzyna Chawarska, Family history of psychiatric conditions and development of siblings of children with autism, Autism Research, https://doi.org/10.1002/aur.3175.


Study Reveals Positive Impact of Primary PCI on Elderly Chinese Patients with Heart Attacks

In a recent study conducted in China, researchers explored the prevalence and outcomes of primary percutaneous coronary intervention (PCI) in individuals aged 75 years and older who experienced ST-segment elevation myocardial infarction (STEMI). The research, based on data collected from a multicenter registry between 2013 and 2014, provides insights into the underutilization of primary PCI in Chinese clinical practice and emphasizes its significant potential to improve outcomes in elderly STEMI patients. The study found that patients who received primary PCI exhibited a noteworthy reduction in the risk of two-year all-cause mortality, major adverse cardiac and cerebrovascular events (MACCE), and cardiac death compared to those who did not undergo reperfusion.

The observational study was published in the International Journal of Cardiology Cardiovascular Risk and Prevention.

Older patients aged ≥75 with STEMI face high mortality due to age-related comorbidities. Researchers therefore used the China Acute Myocardial Infarction Registry to assess primary PCI prevalence and outcomes in these older patients. Despite guidelines supporting invasive management, the real-world scenario shows declining primary PCI rates with age, raising concerns about disparities between developed and developing countries such as China. The research addresses this gap by examining whether primary PCI improves outcomes in older Chinese STEMI patients, including those aged ≥85.

The primary outcome measured was all-cause mortality, and the secondary outcome was major adverse cardiac and cerebrovascular events (MACCE), a composite of all-cause mortality, cardiac death, recurrent myocardial infarction (MI), stroke, revascularization, and major bleeding.

Findings:

  • Examining a cohort of 999 STEMI patients aged 75 and above, the study found that around 32.9% of these individuals underwent primary PCI.
  • Patients who received primary PCI showed a significant reduction in the risk of two-year all-cause mortality, MACCE, and cardiac death compared to those who did not undergo reperfusion.
  • Notably, the two-year all-cause mortality rate dropped from 36.4% without reperfusion to 18.0% with primary PCI.
  • Similar trends were observed in MACCE (43.5% without reperfusion vs. 28.7% with primary PCI) and cardiac death (23.6% without reperfusion vs. 10.0% with primary PCI).
  • Importantly, the study also delved into age-specific nuances, revealing that the positive outcomes associated with primary PCI extended to STEMI patients aged 85 years and older. This underscores the potential benefits of the intervention even in the very elderly population.
  • Subgroup analysis further supported the efficacy of primary PCI, demonstrating its superiority in patients with high-risk profiles, such as those experiencing cardiogenic shock or facing delayed hospital admission.

Notably, no significant differences were observed in recurrent myocardial infarction, stroke, revascularization, or major bleeding between the primary PCI and no-reperfusion groups.
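As a crude, unadjusted illustration of what the mortality figures above imply, the absolute risk reduction and number needed to treat can be computed directly from the reported two-year rates; the published comparisons are adjusted, so this arithmetic is only a rough gauge.

```python
# Crude, unadjusted arithmetic from the reported two-year all-cause mortality rates.
no_reperfusion = 0.364  # 36.4% without reperfusion
primary_pci = 0.180     # 18.0% with primary PCI

arr = no_reperfusion - primary_pci
nnt = 1 / arr
print(f"Absolute risk reduction: {arr:.1%}")  # about 18.4 percentage points
print(f"Number needed to treat:  {nnt:.1f}")  # roughly 5-6 patients per death averted over two years
```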

However, despite these promising results, the study highlighted a concerning trend of underutilization of primary PCI in Chinese clinical practice. Even with its proven benefits in reducing mortality and improving cardiac outcomes, a significant portion of eligible patients did not receive this potentially life-saving intervention.

In conclusion, this comprehensive study underscores the pivotal role of primary PCI in enhancing outcomes for elderly Chinese patients grappling with STEMI. The findings advocate for a paradigm shift in clinical practices, urging healthcare professionals to embrace and implement primary PCI more widely to ensure that elderly individuals receive the optimal care necessary to mitigate the devastating impact of heart attacks.

Further reading: The prevalence and outcomes in STEMI patients aged ≥75 undergoing primary percutaneous coronary intervention in China. DOI: https://doi.org/10.1016/j.ijcrp.2024.200251
