Perioperative lidocaine infusion safe in liver surgery, suggests study

A recently published prospective monocentric study investigated the safety and efficacy of perioperative lidocaine infusion in patients undergoing liver surgery. The study was conducted from 2020 to 2021 at Caen University Hospital, France. The protocol included a bolus dose of 1.5 mg/kg, followed by a continuous infusion of 2 mg/kg/h until the beginning of hepatic transection. Plasma concentrations of lidocaine were measured four times during and after the infusion.

The study included 20 subjects who underwent liver resection; 35% had preexisting liver disease before tumor diagnosis and 75% underwent major hepatectomy. Plasma lidocaine levels remained within the therapeutic range, and no blood sample showed a concentration above the toxicity threshold throughout the infusion. Comparative analyses by presence or absence of preexisting liver disease, and by use of intraoperative vascular clamping, showed no significant differences in lidocaine blood levels.

The findings suggested that perioperative lidocaine infusion is safe in liver surgery, and the authors concluded that additional prospective studies are needed to assess its clinical usefulness in terms of analgesia and antitumoral effects. The study also highlighted the potential implications of intravenous lidocaine in liver surgery, noting its analgesic effects in the context of enhanced recovery after surgery, and discussed the hepatic metabolism of lidocaine, the recommended infusion protocol, and the therapeutic range of lidocaine levels.

The study found that the infusion protocol maintained effective lidocaine concentrations until surgical closure, and no adverse events related to the infusion were reported. The study also explored the pharmacokinetics of lidocaine during liver surgery, emphasizing the safety and tolerability of the dosage used. The limitations, such as the small sample size and the early termination of the lidocaine infusion, were noted, and further studies were suggested to assess the safety and efficacy of intravenous lidocaine in liver surgery.

In conclusion, the pilot study demonstrated the safety of intravenous lidocaine in the context of liver surgery and suggested further studies to evaluate its efficacy on postoperative pain. The study emphasized the need for additional research to understand the potential benefits of lidocaine infusion in liver surgery, including analgesia and antitumoral effects.

Key Points

– This monocentric study, conducted at Caen University Hospital, France, from 2020 to 2021, investigated the safety and efficacy of perioperative lidocaine infusion in patients undergoing liver surgery. The protocol included a bolus dose of 1.5 mg/kg followed by a continuous infusion of 2 mg/kg/h until the beginning of hepatic transection, and plasma lidocaine concentrations were measured four times during and after the infusion.

– The study included 20 subjects who underwent liver resection, 35% of whom had preexisting liver disease before tumor diagnosis and 75% of whom underwent major hepatectomy. Plasma lidocaine levels remained within the therapeutic range, no blood sample exceeded the toxicity threshold, and levels did not differ significantly by presence of preexisting liver disease or use of intraoperative vascular clamping. The infusion maintained effective concentrations until surgical closure, and no adverse events related to lidocaine were reported.

– The findings suggest that perioperative lidocaine infusion is safe in liver surgery and highlight its potential analgesic role in the context of enhanced recovery after surgery; additional research is needed on its benefits, including analgesia and antitumoral effects. The paper also reviewed the hepatic metabolism of lidocaine, the recommended infusion protocol, and the therapeutic range of lidocaine levels. Limitations include the small sample size and the early termination of the lidocaine infusion, and further studies are needed to assess the safety and efficacy of intravenous lidocaine in liver surgery.

Reference:

Grassin P, Descamps R, Bourgine J, Lubrano J, Fiant A-L, Lelong-Boulouard V, Hanouz J-L. Safety of perioperative intravenous lidocaine in liver surgery – A pilot study. Journal of Anaesthesiology Clinical Pharmacology. 2024;40(2):242-247. DOI: 10.4103/joacp.joacp_391_22

Use of higher concentration of NaOCl during endodontic treatment can increase postoperative pain significantly, reveals a study

The use of a higher concentration of NaOCl during endodontic treatment can significantly increase postoperative pain, reveals a study published in The Journal of the American Dental Association.

This study aimed to evaluate whether using 8.25% sodium hypochlorite (NaOCl), compared with 2.5% NaOCl, leads to higher postoperative pain after endodontic treatment. A total of 154 patients were randomly assigned to two groups: 8.25% and 2.5% NaOCl. Single-visit endodontic treatment was performed using a standard protocol, varying only the NaOCl concentration. Postoperative pain was assessed with the numeric rating scale at multiple time points over 30 days. Overall pain scores over time were analyzed with multilevel mixed-effects negative binomial regression, and the need for pain medication was recorded and compared between groups with the Mann-Whitney U test.

Using 8.25% NaOCl increased postoperative pain scores over time 3.48-fold compared with 2.5% NaOCl (incidence rate ratio [IRR], 3.48; 95% CI, 1.57 to 7.67). The 8.25% NaOCl group also showed a higher pain incidence than the 2.5% NaOCl group from 12 hours through 3 days, with IRRs at these time points ranging from 2.21 (95% CI, 1.35 to 3.62) to 10.74 (95% CI, 3.74 to 30.87). No difference was detected in the number of analgesic capsules taken between groups. The authors concluded that the use of 8.25% NaOCl during endodontic treatment significantly increases postoperative pain compared with 2.5% NaOCl.
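For readers curious how a figure like the IRR of 3.48 is obtained, the sketch below fits a simplified (non-multilevel) negative binomial regression of pain scores on treatment group, exponentiates the group coefficient to obtain an IRR with its 95% CI, and then compares analgesic counts with a Mann-Whitney U test. All data, group sizes and variable names are hypothetical; this is not the authors' analysis code.

```python
# Hedged sketch only: a simplified (non-multilevel) negative binomial regression
# for an incidence rate ratio (IRR), plus a Mann-Whitney U test for analgesic use.
# All data, group sizes and variable names below are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
n = 77  # hypothetical per-group size (154 patients / 2 groups)

df = pd.DataFrame({
    "high_naocl": [0] * n + [1] * n,  # 0 = 2.5% NaOCl, 1 = 8.25% NaOCl
    # overdispersed count-like pain scores (numeric rating scale)
    "pain": np.concatenate([rng.negative_binomial(2, 2 / 3, n),   # mean ~1
                            rng.negative_binomial(2, 0.4, n)]),   # mean ~3
    "capsules": np.concatenate([rng.poisson(2.0, n),              # analgesic capsules
                                rng.poisson(2.2, n)]),
})

# Negative binomial model: log E[pain] = b0 + b1 * high_naocl
nb = smf.negativebinomial("pain ~ high_naocl", data=df).fit(disp=0)
irr = np.exp(nb.params["high_naocl"])                      # exponentiated coefficient = IRR
irr_lo, irr_hi = np.exp(nb.conf_int().loc["high_naocl"])   # 95% CI on the IRR scale
print(f"IRR = {irr:.2f} (95% CI {irr_lo:.2f} to {irr_hi:.2f})")

# Mann-Whitney U test comparing analgesic capsule counts between groups
u, p = mannwhitneyu(df.loc[df.high_naocl == 1, "capsules"],
                    df.loc[df.high_naocl == 0, "capsules"])
print(f"Mann-Whitney U = {u:.1f}, p = {p:.3f}")
```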

Reference:

Vitali FC, Santos PS, Garcia LDFR, Teixeira CDS. Postoperative pain after endodontic treatment using 8.25% vs 2.5% sodium hypochlorite in necrotic mandibular molars with apical periodontitis: A randomized double-blind clinical trial. J Am Dent Assoc. 2024 May 30:S0002-8177(24)00242-3. doi: 10.1016/j.adaj.2024.04.011. Epub ahead of print. PMID: 38819357.

Keywords:

Endodontics, postoperative pain, randomized controlled trial, sodium hypochlorite, Vitali FC, Santos PS, Garcia LDFR, Teixeira CDS, The Journal of the American Dental Association

Use of Contraceptive Pill in nulliparous women associated with three-fold increase in Multiple Sclerosis Risk: Study

Several analyses have examined whether oral contraceptive (OC) pills influence the risk of developing multiple sclerosis (MS), and earlier studies reached conflicting conclusions about whether OCs affect MS development. A new study, using data from the UK Biobank, attempted to clarify this relationship, and in particular how parity (having had children) and genetic susceptibility might modify the risk. The study was published in the journal Fertility and Sterility by Andrea Nova and colleagues.

This population-based cohort study included 181,058 white women from the United Kingdom born between 1937 and 1970 who were participants in the UK Biobank; 1,131 of these women were diagnosed with MS. Oral contraceptive use was assessed as ever/never use, current use, duration of use, and age at initiation. The primary outcome was incident MS diagnosis (ICD-10 code G35).

Key Findings

  • Overall, there was no statistically significant increase in MS risk among women who had ever used OCs compared with never users, or among current users compared with never users.

  • The hazard ratio (HR) for ever vs. never users was 1.30 (95% CI: 0.93-1.82, p=0.12), and for current vs. never users it was 1.35 (95% CI: 0.81-2.25, p=0.25); a sketch of how such hazard ratios are typically estimated follows this list.

  • In nulliparous women (women who have never given birth), ever use and current use of OCs were associated with a markedly increased MS risk.

  • The HR for ever use in nulliparous women was 2.08 (95% CI: 1.04-4.17, p=0.04), and for current use, it was even higher at 3.15 (95% CI: 1.43-6.92, p=0.004).

  • These findings suggest that parity may be an important modifier of the effect of OCs on MS risk.

  • Duration of use and age at first use also appeared to matter: longer duration of current use and earlier age at first use were associated with increased MS risk.

  • The authors also pointed out that the association between OC use and MS risk was stronger among women with a lower genetic predisposition to MS, as measured by a polygenic risk score.
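Hazard ratios of the kind reported above are most commonly estimated with a Cox proportional hazards model. The sketch below illustrates that workflow with the lifelines package on an entirely synthetic data frame; the column names, covariates and subgroup handling are illustrative assumptions, not the authors' actual UK Biobank analysis.

```python
# Illustrative sketch: hazard ratios like those above are commonly estimated with a
# Cox proportional hazards model. The data, column names and covariates here are
# synthetic assumptions, not the authors' UK Biobank analysis.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 5000
df = pd.DataFrame({
    "follow_up_years": rng.exponential(20, n).clip(0.5, 40),  # time at risk
    "ms_diagnosis":    rng.binomial(1, 0.01, n),              # event indicator (ICD-10 G35)
    "ever_oc_use":     rng.binomial(1, 0.8, n),               # exposure of interest
    "age_at_baseline": rng.normal(55, 8, n),
    "nulliparous":     rng.binomial(1, 0.2, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="follow_up_years", event_col="ms_diagnosis")
cph.print_summary()  # the exp(coef) column for ever_oc_use is the HR with its 95% CI

# A subgroup result such as the one in nulliparous women would come from refitting
# the model on the corresponding subset of participants:
cph_nullip = CoxPHFitter().fit(
    df[df.nulliparous == 1].drop(columns="nulliparous"),
    duration_col="follow_up_years",
    event_col="ms_diagnosis",
)
cph_nullip.print_summary()
```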

The study’s findings highlight the complexity of the relationship between OC use and MS. While OCs did not significantly increase MS risk in the general population, specific subgroups, particularly nulliparous women, showed a much higher risk. This underscores the importance of considering individual factors such as parity and genetic susceptibility when evaluating the potential risks of OC use.

This research suggests that oral contraceptive use, particularly in nulliparous women, may significantly increase the risk of developing Multiple Sclerosis. While the findings provide valuable insights, the authors caution against drawing definitive causal conclusions without further research. Future studies are needed to validate these results across different populations and types of OCs.

Reference:

Nova, A., Di Caprio, G., Baldrighi, G. N., Galdiolo, D., Bernardinelli, L., & Fazia, T. (2024). Investigating the Influence of Oral Contraceptive Pill use on Multiple Sclerosis Risk using UK Biobank data. Fertility and Sterility. https://doi.org/10.1016/j.fertnstert.2024.07.999

Women with PCOS at higher risk of eating disorders, suggests study

Women with the common reproductive and metabolic condition polycystic ovary syndrome (PCOS) face a greater risk of developing bulimia, binge eating disorder and disordered eating, according to new research published in The Journal of Clinical Endocrinology & Metabolism. PCOS affects roughly one in eight women. Women who have the condition face an increased risk of developing metabolic problems such as diabetes, reproductive issues such as infertility, and psychological issues including anxiety and depression. Women are diagnosed when they have at least two of the three key features of PCOS:

• Increased numbers of ovarian follicles containing immature eggs (called polycystic ovaries) seen on ultrasound;

• Slightly higher levels of testosterone or clinical symptoms of higher testosterone such as excess body hair; and

• Irregular or no menstrual periods.

The systematic review and meta-analysis examined results from 20 cross-sectional studies across nine countries. The studies included data from 28,922 women with PCOS and 258,619 women who did not have the condition.

“This analysis is the first time we’ve been able to confirm an increased risk of specific eating disorders, including bulimia nervosa, commonly called bulimia, and binge eating disorder,” said the study’s first author, Laura Cooney, M.D., M.S.C.E., associate professor at the University of Wisconsin in Madison, Wisc. “Many women with PCOS experience weight stigma, and that can be detrimental to mental health generally and contribute to disordered eating.”

When the researchers analyzed the women by body-mass index (BMI), both those who were normal weight and those of higher weight had higher disordered eating scores compared to women without PCOS.

This suggests that the association is not dependent on BMI, Cooney said.

“Our findings emphasize the importance of screening women with PCOS for eating disorders before clinicians share any lifestyle advice,” Cooney said. “The lifestyle modifications we often recommend for women with PCOS, including physical activity, healthy diet and behavior modifications, could hinder the recovery process for eating disorders. Health care providers need to be vigilant about screening for eating disorders in this population.”

The meta-analysis did not find an association between PCOS and the eating disorder anorexia. However, the authors caution that studies on anorexia and PCOS are more limited and there should always be a high suspicion for any disordered eating pathology in someone who is being evaluated for PCOS.

Reference:

Laura G Cooney, Kaley Gyorfi, Awa Sanneh, Leeann M Bui, Aya Mousa, Chau Thien Tay, Helena Teede, Elisabet Stener-Victorin, Leah Brennan, Increased Prevalence of Binge Eating Disorder and Bulimia Nervosa in Women With Polycystic Ovary Syndrome: A Systematic Review and Meta-Analysis, The Journal of Clinical Endocrinology & Metabolism, 2024, dgae462, https://doi.org/10.1210/clinem/dgae462.

New form of immunotherapy could prevent resistance to hormone therapy among prostate cancer patients: BMJ

A new form of immunotherapy using innovative nanoparticles can delay resistance to hormone therapy and help men with prostate cancer live longer.

Researchers from the University of Sheffield have today published findings from a Prostate Cancer UK-funded study, which shows a new form of immunotherapy could give men much more time before their cancer becomes resistant to hormone therapy.

For thousands of men diagnosed with prostate cancer, androgen deprivation therapy (ADT) is a powerful first-line treatment. Although initially effective in limiting the growth and spread of cancer, in some men their tumours develop resistance to this treatment, so their cancer spreads further throughout the body, becoming incurable.

Immunotherapy has had huge success in other cancers – offering long-term cures for previously untreatable cancers. However, this success has not translated to prostate cancer, and a huge focus of research has been to understand why.

The team used cutting-edge techniques to study how immune cells function within prostate tumours, especially after ADT, leading to the development of an entirely new way to deliver immunotherapy. Published in the Journal for Immunotherapy of Cancer, their study is the first to show that carefully designed nanoparticles can be used to stimulate immune cells called T cells to attack cancer cells. They found that this markedly delays the onset of resistance to ADT.

Professor Claire Lewis, from the University of Sheffield’s School of Medicine and Population Health, who led the study, said: “The onset of resistance to hormone therapy is a major clinical problem when it comes to treating men with prostate cancer as their tumours then start to regrow and spread. Once this happens, their disease is difficult to treat and harsher treatments like chemotherapy have to be used.

“Until now, immunotherapies for prostate cancer have been disappointing, with few men responding well to treatment. Carefully analysing the way that immune cells in prostate tumours are inhibited by hormone treatment helped us to develop a way to overcome this, and prevent resistance to hormone therapy.

“We’re excited by the potential of this new form of immunotherapy to enhance the response of prostate tumours to hormone treatment. We are now working with our clinical colleagues to explore ways to take this forward into clinical trials as soon as possible.”

Through their analysis, the research team discovered that a type of white blood cell called a macrophage accumulates in large numbers around blood vessels in prostate tumours during ADT. They then developed a way of using novel nanoparticles to selectively deliver a drug to these cells that makes the macrophages express a potent immunostimulant called interferon-beta. When this is released inside tumours, it stimulates other immune cells called T cells to kill cancer cells and this delays treatment resistance.

Dr Hayley Luxton, Research Impact Manager at Prostate Cancer UK, which funded the study, said: “Over 12,000 men die from prostate cancer each year in the UK, and we desperately need new and more effective treatments.

“Immunotherapy has completely changed the way other cancers are treated, but we haven’t yet seen anything even close to that success for men with prostate cancer.

“We’re thrilled to have funded this research which shows a new form of immunotherapy can give men much more time before their cancer becomes resistant to hormone therapy.

“It will be really exciting to see how it performs in future clinical trials, and we hope it will play a pivotal role in finally unlocking the potential of immunotherapy for men with prostate cancer.”

The study was funded as part of Prostate Cancer UK’s Research Innovation Awards programme, which has seen £20m invested in exciting new research over the last 10 years.

This research supports the University of Sheffield’s cancer research strategy. Through the strategy, the University aims to prevent cancer-related deaths by undertaking high quality research, leading to more effective treatments, as well as methods to better prevent and detect cancer and improve quality of life.

Reference:

Al-janabi H, Moyes K, Allen R, et al. Targeting a STING agonist to perivascular macrophages in prostate tumors delays resistance to androgen deprivation therapy. Journal for ImmunoTherapy of Cancer 2024;12:e009368. doi: 10.1136/jitc-2024-009368

Recurrent culture-proven sepsis linked to retinopathy of prematurity in neonates: JAMA

A new study published in JAMA Network Open suggests that repeated culture-proven sepsis should be recognized as a modifiable risk factor for retinopathy of prematurity in extremely preterm newborns.

Retinopathy of prematurity (ROP) is a significant morbidity of preterm birth that can result in visual impairment, including blindness. Prevention and prompt treatment are essential to combat this condition, and there is mounting evidence that exposure to neonatal sepsis facilitates the development of ROP. To better understand the relationship between neonatal sepsis and ROP in two sizable cohorts of preterm children born at less than 29 weeks of gestation, Kirsten Glaser and colleagues undertook this study.

This retrospective cohort analysis used data from the German Neonatal Network (GNN) and the Norwegian Neonatal Network (NNN), covering 68 level III neonatal critical care units in the GNN and 21 in the NNN. Newborns of 22 weeks 0 days to 28 weeks 6 days of gestation were enrolled in the GNN between January 1, 2009 and December 31, 2022, and in the NNN between January 1, 2009 and December 31, 2018. The data were analyzed from February through September 2023.

The 12,794 newborns included from the GNN had a mean (SD) birth weight of 848 (229) g, and the 1,844 infants included from the NNN had a mean (SD) birth weight of 807 (215) g. Any ROP occurred in 6,370 newborns (49.8%) in the GNN and 620 infants (33.6%) in the NNN, while treatment-warranted ROP occurred in 840 infants (6.6%) in the GNN and 140 infants (7.6%) in the NNN.

In both cohorts, the incidence of treatment-warranted ROP increased with each additional sepsis episode. In the GNN dataset, after adjustment for multiple confounders, the number of sepsis episodes was associated with both any ROP and treatment-warranted ROP compared with no episodes, and propensity score matching confirmed these associations for any ROP. In the NNN dataset, surgical necrotizing enterocolitis (NEC) was also associated with treatment-warranted ROP.

This large-scale cohort analysis found that, in preterm children born at less than 29 weeks, culture-proven neonatal sepsis and recurrent sepsis episodes were particularly associated with ROP and treatment-warranted ROP. Further research is needed to clarify the underlying mechanisms of inflammation-driven retinal morbidity and to determine whether this association is causal.

Reference:

Glaser, K., Härtel, C., Klingenberg, C., Herting, E., Fortmann, M. I., Speer, C. P., Stensvold, H. J., Huncikova, Z., Rønnestad, A. E., Nentwich, M. M., Stahl, A., Dammann, O., Göpel, W., Faust, K., Müller, D., Thome, U., Guthmann, F., von der Wense, A., … Wieg, C. (2024). Neonatal Sepsis Episodes and Retinopathy of Prematurity in Very Preterm Infants. JAMA Network Open, 7(7), e2423933. https://doi.org/10.1001/jamanetworkopen.2024.23933

Gestational diabetes: Newly identified subgroups improve personalized therapy

Patients with gestational diabetes show different disease progressions and therefore require personalised treatment measures. An international research team led by MedUni Vienna has now identified three subgroups of the disease with different treatment needs. The results of the study, recently published in the prestigious journal Diabetologia, could improve our understanding of gestational diabetes and significantly advance the development of personalised treatment concepts.

As part of the study, scientists from MedUni Vienna, in collaboration with colleagues from Charité – Universitätsmedizin Berlin and the Consiglio Nazionale delle Ricerche in Padua, analysed data from 2,682 women with gestational diabetes (GDM) that had been routinely collected at MedUni Vienna and Charité between 2015 and 2022. Using cluster analysis, a machine learning method, the patients were divided into groups based on routine parameters such as age, body mass index (BMI) before pregnancy and blood glucose values from an oral glucose tolerance test (OGTT). “This allowed us to clearly identify three clusters with different treatment requirements,” reports study leader Christian Göbl (Department of Obstetrics and Gynecology, MedUni Vienna). “We also saw that different pregnancy complications occur at different rates in the individual subgroups.”

Routine data for individualised treatment decisions

The first subtype includes women with the highest blood glucose levels, a high prevalence of obesity and the greatest need for blood glucose-lowering medication. The second subgroup consists of women with a medium BMI and elevated fasting blood glucose levels. The third subtype comprises women with normal BMI but elevated blood glucose levels after the OGTT. “The patients in the subgroups we identified showed remarkable differences in terms of the need for glucose-lowering medication and treatment modalities such as rapid-acting versus intermediate- or long-acting insulin,” says Christian Göbl, pointing to the clinical relevance of the study results, which lay the foundation for further research to develop optimal treatment strategies for each subgroup.

The newly created model for this is based on machine learning, an area of artificial intelligence that can derive predictions and decisions from data. “In this case, only a few parameters are required that are always available in the clinical routine for gestational diabetes anyway. This means that patients can receive even more specific and individualised advice and treatment and the risk of complications for mother and child can be further reduced,” says Göbl.
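As an illustration of the kind of data-driven clustering described above, the sketch below standardizes a few routine parameters (age, pre-pregnancy BMI, OGTT glucose values) and partitions synthetic patients into three groups with k-means. The algorithm, features and preprocessing used by the study authors may differ; everything here is an assumption for demonstration purposes.

```python
# Minimal sketch of a data-driven cluster analysis on routine GDM parameters.
# The data are synthetic, and the authors' exact algorithm, features and
# preprocessing may differ; this only illustrates the general workflow.
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
n = 2682  # number of women in the study (the values below are simulated)
df = pd.DataFrame({
    "age_years":        rng.normal(32, 5, n),
    "prepregnancy_bmi": rng.normal(27, 6, n),
    "ogtt_fasting":     rng.normal(5.2, 0.7, n),   # mmol/L
    "ogtt_1h":          rng.normal(10.0, 1.6, n),  # mmol/L
    "ogtt_2h":          rng.normal(8.5, 1.4, n),   # mmol/L
})

X = StandardScaler().fit_transform(df)                       # put features on a common scale
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)  # three data-driven subgroups
df["subgroup"] = km.labels_

# Characterize each cluster by its mean routine parameters, analogous to
# describing subgroups with different treatment needs
print(df.groupby("subgroup").mean().round(2))
```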

Reference:

Salvatori, B., Wegener, S., Kotzaeridi, G. et al. Identification and validation of gestational diabetes subgroups by data-driven cluster analysis. Diabetologia (2024). https://doi.org/10.1007/s00125-024-06184-7.

Time-restricted eating not better than usual eating pattern in reducing weight: Study

A recent clinical study published in the Annals of Internal Medicine challenges the effectiveness of time-restricted eating (TRE) as a superior method for weight loss compared with traditional dieting practices. The study was conducted over 12 weeks at a clinical research unit and sought to determine whether TRE could lead to weight loss independently of calorie reduction, as suggested by previous rodent studies.

The study involved 41 adults who were obese or had prediabetes or well-managed diabetes. Participants were randomly divided into two groups: one group followed a TRE regimen, consuming 80% of their daily calories before 1 p.m. within a 10-hour eating window, while the other followed a usual eating pattern (UEP), consuming at least 50% of their calories after 5 p.m. within a 16-hour window. Both groups consumed the same number of calories and nutrients, carefully measured to match each individual's baseline caloric intake.

The primary outcome was change in body weight; secondary outcomes included changes in fasting glucose, insulin resistance (measured by HOMA-IR) and glucose tolerance. After the 12-week period, the results showed minimal differences in weight loss between the two groups. Participants in the TRE group lost an average of 2.3 kg, while participants in the UEP group lost a slightly higher average of 2.6 kg; the 0.3 kg difference between groups was not statistically significant. There were also no notable differences in glycemic control between the groups, indicating that the timing of eating did not influence glucose metabolism under these conditions.
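HOMA-IR, one of the secondary outcomes mentioned above, is derived from fasting measurements using the standard formula HOMA-IR = (fasting glucose [mmol/L] × fasting insulin [µU/mL]) / 22.5. The small helper below illustrates the calculation with hypothetical values; it is not taken from the study.

```python
# Standard HOMA-IR calculation from fasting values; the example numbers are hypothetical.
def homa_ir(fasting_glucose_mmol_l: float, fasting_insulin_uu_ml: float) -> float:
    """HOMA-IR = (fasting glucose [mmol/L] * fasting insulin [uU/mL]) / 22.5"""
    return (fasting_glucose_mmol_l * fasting_insulin_uu_ml) / 22.5

# Example: fasting glucose 5.5 mmol/L (~99 mg/dL) and fasting insulin 12 uU/mL
print(round(homa_ir(5.5, 12.0), 2))  # 2.93, above the commonly cited ~2.5 cut-off
```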

These findings are significant because they suggest that any benefits of TRE may stem not from the restricted eating window itself but from a reduction in overall calorie intake, which was controlled in this study. Overall, the study adds a crucial perspective to the ongoing debate about the effectiveness of various dieting strategies by underlining that, when calorie intake is held constant, the timing of meals may not play a significant role in weight management.

Reference:

Maruthur, N. M., Pilla, S. J., White, K., Wu, B., Maw, M. T. T., Duan, D., Turkson-Ocran, R.-A., Zhao, D., Charleston, J., Peterson, C. M., Dougherty, R. J., Schrack, J. A., Appel, L. J., Guallar, E., & Clark, J. M. (2024). Effect of Isocaloric, Time-Restricted Eating on Body Weight in Adults With Obesity. Annals of Internal Medicine. https://doi.org/10.7326/m23-3132

Study Highlights Age, Gender, BMI, and Lifestyle as Key Factors in Managing Lower Back Pain

India: An observational study published in the Journal of the Association of Physicians of India revealed that lower back pain is a complex condition shaped by factors such as age, gender, BMI, and lifestyle. To enhance patients’ quality of life, management and prevention strategies must take these risk factors into account.

A holistic approach is crucial to effectively address the multifaceted causes of lower back pain, the researchers wrote. 

Lower back pain is defined as pain between the lower edges of the ribs and the buttocks, and it can occur at any age. It is divided into acute (<6 weeks), subacute (6 weeks to <3 months), and chronic (>3 months) pain based on duration. Risk factors associated with lower back pain include physical factors, sociodemographic characteristics, lifestyle habits, and psychological factors, with sociodemographic factors having a particularly strong impact. Considering this, Banshi Lal Kumawat, Senior Professor & Unit Head, Department of Neurology, Sawai Man Singh Medical College and Hospital, Jaipur, Rajasthan, India, et al., conducted a study to analyze the risk factors and etiological profile of lower back pain among patients.

For this purpose, the research team conducted a cross-sectional observational study involving 170 patients from March 2023 to August 2023. Sociodemographic and lifestyle data were collected, and diagnostic investigations, including X-ray and MRI, were performed.

The study assessed several outcomes, and patients were categorized into acute LBP (≤6 weeks) and chronic LBP (>3 months) groups for further analysis. Magnetic resonance imaging of the spine was performed to assess the etiology of LBP, and Pfirrmann grading was used for prolapsed intervertebral disks. The collected data were then analyzed using SPSS (2021 version).

The findings revealed that:

  • Patients under 35 years old had a higher prevalence of acute LBP, whereas those over 55 years old had a higher prevalence of chronic LBP.
  • Compared to men, women had a greater frequency of LBP, with chronic LBP being more frequent in women.
  • Regarding triggering events, coughing or sneezing was frequently associated with acute LBP, whereas heavy weightlifting was a key trigger for chronic LBP.
  • Patients with physically demanding employment had a higher risk of acute LBP, while homemakers and unemployed people were more likely to experience chronic LBP.
  • Overweight and obesity were linked to chronic LBP. Compared to patients with acute LBP, those with chronic LBP were more likely to be using medication.
  • Prolapsed intervertebral disk (PIVD), more common in patients with chronic LBP, was the most common cause identified on MRI. Other etiologies included tumors, tuberculosis, vertebral fractures, and other spinal disorders.

“Age, gender, BMI and lifestyle factors influenced lower back pain. Proper management and preventive strategies help to improve the condition of the patients,” the researchers concluded.

Reference:

Kumawat, B. L., Kaur, I., & Parashar, V. S. (2024). An Observational Study of Various Risk Factors and Etiological Profile in Patients with Lower Back Pain at Tertiary Care Center. The Journal of the Association of Physicians of India, 72(7), 48–54. https://doi.org/10.59556/japi.72.0557

Donor kidneys with toxoplasma do not increase risks for transplant patients, suggests research

A new study from UC Davis Health could help to increase the supply of donor kidneys.

Researchers have found that transplant patients who receive kidneys infected with the parasite toxoplasma have virtually the same outcomes as those who receive toxoplasma-negative organs.

Despite longstanding concerns, those who received kidneys from toxoplasma antibody positive donors (TPDs) had almost identical mortality and rejection rates. The research was published in Transplant International.

“Organs from donors who were positive for toxoplasma did just as well as organs from those who were negative, both for survival of the patients and survival of the kidneys,” said Lavjay Butani, chief of pediatric nephrology. He coauthored the paper with Daniel Tancredi, professor of pediatrics. “This is quite encouraging.”

Inconsistency in approach

Toxoplasma is a ubiquitous parasite that infects many people but generally causes no harm. However, people who are immunosuppressed, such as kidney recipients, could be at higher risk. Toxoplasmosis can be transmitted through the transplanted kidney and reactivate a latent infection in the kidney recipient.

Still, there has been tremendous inconsistency in how transplant centers treat TPD kidneys, with some accepting them and others rejecting them.

“We conducted this study because, about a year ago, there was a positive donor and the team did not want to use that kidney for one of our pediatric patients, so we didn’t accept it,” Butani said. “But we realized, we just didn’t have the data to know if that was the correct decision.”

What the study showed

The study analyzed 51,000 transplants from the Organ Procurement and Transplantation Network database, of which 4,300 were from TPDs. Rejection and graft failure rates were 5% for both TPD and non-TPD kidneys, and other measures were similarly aligned. In other words, TPD kidneys posed no additional risk.

The authors believe it is safe to transplant TPD kidneys, but they do recommend additional monitoring. However, transplant patients routinely receive Bactrim, a two-antibiotic combination that is effective against toxoplasma, and this may already be mitigating their risk. Most patients receive Bactrim for a year, but that course could be extended for TPD cases.

The authors hope this work will help transplant centers unify their TPD policies. UC Davis Health is currently writing new protocols for pediatric transplants. Ultimately, this new understanding could help patients get the organs they desperately need.

“In transplants, kidneys are the greatest need,” Butani said. “Because of increased diabetes, high blood pressure and other conditions, the wait list just gets progressively longer. We hope these findings will help increase the supply of donor kidneys.”

Reference:

Lavjay Butani, Daniel Tancredi, Outcomes of Kidney Transplants From Toxoplasma-Positive Donors: An Organ Procurement and Transplant Network Database Analysis, Transplant International, https://doi.org/10.3389/ti.2024.13203.
