Low magnesium levels increase risk of DNA damage and chronic degenerative disorders: Study

A new Australian study has identified why a diet rich in magnesium is so important for our health, reducing the risk of DNA damage and chronic degenerative disorders.

Scientists from the University of South Australia measured blood samples from 172 middle-aged adults, finding a strong link between low magnesium levels and high amounts of a genotoxic amino acid called homocysteine.

This toxic combination damages the body’s genes, making people more susceptible to Alzheimer’s and Parkinson’s disease, gastrointestinal diseases, a range of cancers, and diabetes.

Wholegrains, dark green leafy vegetables, nuts, beans and dark chocolate are all magnesium-rich foods, which help the body produce energy, build teeth and bones, regulate blood sugar and blood pressure, and ensure that the heart, muscles and kidneys all work properly.

UniSA molecular biologist Dr Permal Deo says a low intake of magnesium (less than 300mg per day) can increase the risk of many diseases, but its role in preventing DNA damage has not been fully studied in humans until now.

“Our study showed a direct correlation between low magnesium levels in blood (less than 18mg/L) and increased DNA damage, even after adjusting for gender and age,” Dr Deo says.

“Blood levels of magnesium, homocysteine (Hcy), folate and vitamin B12 were measured, showing an inverse correlation between magnesium and Hcy and a positive correlation between magnesium, folate and vitamin B12. This indicates that sufficiently high magnesium levels in the blood are essential to protect our genes from toxicity caused by homocysteine, which is increased when folate and vitamin B12 are deficient.”

Co-author Professor Michael Fenech says chronic magnesium deficiency is likely to disrupt the body’s ability to produce energy and power cells, causing accelerated tissue ageing and making people more susceptible to early onset of many diseases.

Magnesium is the fourth most abundant mineral present in the human body. More than 600 enzymes require it as a co-factor and almost 200 require it to activate critical processes in the body.

“The next step is to determine the optimal dietary intake of magnesium, either through food or supplements, and how this could impact the onset or progression of cancer and other chronic diseases,” Prof Fenech says.

Reference:

Dhillon, V.S., Deo, P. & Fenech, M. Low magnesium in conjunction with high homocysteine increases DNA damage in healthy middle aged Australians. Eur J Nutr (2024). https://doi.org/10.1007/s00394-024-03449-0.

Powered by WPeMatico

Groundbreaking study uncovers link between serum apelin, visfatin levels, and body composition in PCOS patients

China: Polycystic Ovary Syndrome (PCOS) affects millions of women worldwide, causing hormonal imbalances and a myriad of symptoms, including irregular periods, acne, and infertility. Among the many challenges in managing PCOS is understanding its complex interplay with metabolic factors. In a breakthrough study, researchers have uncovered a significant relationship between serum apelin, visfatin levels, and body composition in women with PCOS.

The study, published in the European Journal of Obstetrics & Gynecology and Reproductive Biology, found that PCOS patients, compared with healthy women, had increased fat content in various parts of the body, reduced skeletal muscle content, and more frequent metabolic abnormalities.

“Serum apelin and visfatin correlated not only with fat mass, obesity, and fat distribution but also with muscle mass and distribution,” the researchers wrote. Managing and monitoring body composition in PCOS patients may therefore help reduce the long-term risk of metabolic disease and serve as a marker of therapeutic effect.

The study by Dan Kuai, Tianjin Medical University General Hospital, Tianjin, China, and colleagues investigated the relationship between body composition and serum visfatin and apelin levels in patients with polycystic ovary syndrome.

For this purpose, the researchers conducted a prospective observational study to compare the differences in body composition, levels of gonadal hormone concentrations, glucose metabolism, apelin, and visfatin between PCOS patients and the control group.

PCOS patients were further categorized into different subgroups according to different obesity criteria, and the differences between serum apelin and visfatin levels in different subgroups were compared. Finally, the researchers also explored the correlation of serum apelin levels and visfatin levels with body composition, and metabolism-related indicators in PCOS patients.

Based on the study, the researchers reported the following findings:

· 178 PCOS patients and 172 healthy women (control group) were enrolled between July 2020 and November 2021.

· In PCOS patients, weight, Waist-Hip Ratio (WHR), Body Mass Index (BMI), Percent Body Fat (PBF), Fat-Free Mass Index (FFMI), Fat Mass Index (FMI), PBF of Arm, PBF of Leg, PBF of the Trunk, Visceral Fat Level (VFL), fasting insulin (FINS), Luteinizing Hormone (LH), and Homeostatic Model Assessment for Insulin Resistance (HOMA-IR) were significantly higher than in the control group, while Percent Skeletal Muscle (PSM), PSM of Leg, and PSM of the Trunk were significantly lower.

· The PCOS patients had significantly higher serum visfatin and apelin levels than the control group.

· In PCOS patients with PBF > 35%, apelin and visfatin levels were significantly higher than in those with PBF ≤ 35%.

· In PCOS patients with WHR ≥ 0.85 and BMI ≥ 24 kg/m², visfatin levels were significantly higher than in those with WHR < 0.85 and BMI < 24 kg/m².

· Serum apelin and visfatin correlated positively with BMI, WHR, FFMI, FMI, PBF, PBF of arms, PBF of the trunk, PBF of legs, fasting blood glucose (FBG), VFL, and the HOMA-IR index, and negatively with PSM, PSM of legs, and PSM of the trunk.
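
One of the indices reported above, HOMA-IR, is derived directly from fasting insulin and glucose. As a minimal illustration of the standard formula (the example values below are hypothetical, not taken from the study):

```python
def homa_ir(fasting_insulin: float, fasting_glucose: float) -> float:
    """Homeostatic Model Assessment for Insulin Resistance (HOMA-IR).

    Standard formula: fasting insulin (uU/mL) x fasting glucose (mmol/L) / 22.5.
    Higher values are often taken to suggest greater insulin resistance.
    """
    return fasting_insulin * fasting_glucose / 22.5

# Hypothetical patient: insulin 12 uU/mL, glucose 5 mmol/L
print(round(homa_ir(12.0, 5.0), 2))  # prints 2.67
```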

The study provides a theoretical basis for monitoring and managing body composition in PCOS patients, both to track treatment effect and to lower the long-term risk of metabolic disease.

Reference:

Kuai D, Tang Q, Wang X, Yan Q, Tian W, Zhang H. Relationship between serum apelin, visfatin levels, and body composition in Polycystic Ovary Syndrome patients. Eur J Obstet Gynecol Reprod Biol. 2024 Jun;297:24-29. doi: 10.1016/j.ejogrb.2024.03.034. Epub 2024 Mar 27. PMID: 38555852.


Use of spironolactone for dermatological conditions may not increase risk of breast or uterine tumors: Study

Use of spironolactone for dermatological conditions may not increase the risk of breast or uterine tumours, suggests a study published in the Journal of the American Academy of Dermatology.

Spironolactone is a blood pressure and congestive heart failure medication used off-label to treat dermatological conditions in women. A Food and Drug Administration warning included in the package insert notes that rat studies showed tumorigenicity at high oral spironolactone doses [1]. Because of its antiandrogen properties and structural similarity to estrogen, it has been hypothesized that spironolactone may increase breast/gynecologic cancer risk [2]. Therefore, we examined whether spironolactone exposure for acne, hair loss, and/or hirsutism indications is associated with increased breast and/or gynecologic tumor risk.

Demographics and history of breast/gynecologic benign and malignant tumor diagnoses were collected for women with spironolactone exposure for acne, hair loss, and hirsutism indications, and for an unexposed group without spironolactone exposure prior to 4/30/2018, who presented for gynecology visits at our institution. Charts were queried through 8/1/2023 to allow for 5-year follow-up after first spironolactone exposure, allowing adequate time to detect tumor development. Women with a breast/gynecologic cancer diagnosis prior to spironolactone exposure were excluded.

Statistical significance (P < .05) was assessed in R version 4.2.2 using Pearson χ2, Wilcoxon/Kruskal-Wallis rank sum, and Fisher’s exact tests. Logistic regression determined the effect of age, race, and spironolactone exposure on tumor occurrence.

A total of 420 and 3272 women with and without spironolactone exposure were included (Table I). Median dose was 100 mg (range: 25-225 mg). Adjusting for age and race, risk of tumor development was similar between exposed and unexposed cohorts (P > .05). Among patients with a benign tumor, malignant tumor, or breast or uterine cancers, spironolactone exposure was not a risk factor (all P > .05) (Table II). Daily dose of spironolactone did not impact tumor development risk (P > .05). Interpretation: patients in the exposed cohort were not more likely to develop a malignant tumor than a benign tumor when compared to the unexposed cohort (P > .05).
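
As a rough sketch of the kind of 2x2 comparison underlying these tests, the code below computes an odds ratio for tumor occurrence by exposure status; the counts are hypothetical and are not the study's actual data:

```python
def odds_ratio(a: int, b: int, c: int, d: int) -> float:
    """Odds ratio for a 2x2 contingency table:
    a = exposed with tumor,   b = exposed without tumor,
    c = unexposed with tumor, d = unexposed without tumor.
    """
    return (a * d) / (b * c)

# Hypothetical counts loosely scaled to the cohort sizes (420 exposed, 3272 unexposed)
print(round(odds_ratio(10, 410, 80, 3192), 2))  # prints 0.97
```

An odds ratio near 1 corresponds to the study's finding of similar risk between cohorts; the actual analysis additionally adjusted for age and race via logistic regression.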


We found no increased risk of benign or malignant breast or uterine tumors with spironolactone treatment for dermatologic indications, consistent with a systematic review and meta-analysis that reported no increased risk of breast (n = 3 studies; very low certainty of evidence; risk ratio = 1.04, 95% CI 0.86-1.22) or ovarian (n = 2 studies; very low certainty of evidence; risk ratio = 1.52, 95% CI 0.84-2.20) cancers with spironolactone use [3]. In addition, spironolactone use was not associated with increased risk for other cancer types, including prostate, bladder, kidney, gastric, or esophageal [3]. We also found no increased risk of tumor development with changes in spironolactone dose, similar to a retrospective study of 28,032 women >55 years old with exposure to spironolactone for any indication and 55,961 controls matched by socioeconomic score and age, which showed no evidence of a dose-response risk among exposed patients [4].

Limitations include retrospective design, lack of detailed information about spironolactone courses, lack of control for family malignancy history, and lack of quantitation of follow-up.

In sum, we found no association between spironolactone exposure for dermatological conditions and risk of breast or uterine tumors compared to unexposed women. Therefore, women taking spironolactone for acne, hair loss, or hirsutism who are at low risk of breast or gynecologic cancers may be counseled to have regular gynecology follow-up, but no more frequently than the general population. Future studies are needed to assess risk over longer time periods.

Reference:

Hill RC, Wang Y, Shaikh B, Lipner SR. No increased risk of breast or gynecologic malignancies in women exposed to spironolactone for dermatologic conditions: A retrospective cohort study. J Am Acad Dermatol. 2024 Jun;90(6):1302-1304. doi: 10.1016/j.jaad.2024.02.030. Epub 2024 Feb 27. PMID: 38423468; PMCID: PMC11095997.



Blood pressure and lipid profiles favorable in children born after ART with frozen embryo transfer: Study

More than 10 million children have been born after the use
of assisted reproductive technology (ART). The use of frozen and later thawed
embryos, in particular, has increased steadily during the last decade.
The health of the children born after ART is of utmost interest to the parents
and to society. Studies have shown that children born after the use of
frozen/thawed embryos are born with a higher birthweight compared with children
conceived naturally or after the use of fresh embryos. However, the potential
long-term implications of this elevated birthweight remain insufficiently
explored.

This study by Asserhøj et al. was part of the cohort study
‘Health in Childhood following Assisted Reproductive Technology’ (HiCART),
which included 606 singletons (292 boys) born between December 2009 and
December 2013: 200 children were conceived after frozen embryo transfer (FET);
203 after fresh embryo transfer (fresh-ET); and 203 were naturally conceived
(NC) and matched for birth year and sex. The study period lasted from January 2019 to September
2021. The included children were 7–10 years of age at examination and underwent
a clinical examination with anthropometric measurements, pubertal staging, and
BP measurement. Additionally, a fasting blood sample was collected and analysed
for cholesterol, low-density lipoproteins (LDL), high-density lipoproteins
(HDL), and triglycerides. Systolic and diastolic BP were converted to standard
deviation scores (SDS) using an appropriate reference and accounting for height
(SDS) of the child. The three study groups were compared pairwise using a
univariate linear regression model. Mean differences were adjusted for
confounders using multiple linear regression analyses.
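
The conversion to standard deviation scores described above is, at its core, a z-score against an appropriate reference population; a minimal sketch follows (the reference mean and SD below are hypothetical, not the reference actually used):

```python
def to_sds(value: float, ref_mean: float, ref_sd: float) -> float:
    """Standard deviation score (z-score) of a measurement
    relative to a reference population mean and SD."""
    return (value - ref_mean) / ref_sd

# Hypothetical reference for systolic BP in a given age/height stratum
print(to_sds(108.0, 100.0, 8.0))  # prints 1.0
```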

Girls and boys conceived after FET had significantly higher
birthweight (SDS) compared with naturally conceived peers (mean difference:
girls: 0.35, 95% CI (0.06–0.64), boys: 0.35, 95% CI (0.03–0.68)).

Girls conceived after FET had significantly higher systolic
BP (SDS) and heart rate compared with girls conceived after fresh-ET (adjusted
mean difference: systolic BP (SDS): 0.25 SDS, 95% CI (0.03–0.47), heart rate:
4.53, 95% CI (0.94–8.13)).

Regarding lipid profile, no significant differences were
found between the three groups of girls. For the boys, no significant
differences were found for BP and heart rate.

Lipid profiles were more favourable in boys born after FET
compared with both boys conceived after fresh-ET and NC. All outcomes were
adjusted for parity, maternal BMI at early pregnancy, smoking during pregnancy,
educational level, birthweight, breastfeeding, child age at examination, and
onset of puberty.

In an analysis of BP, heart rate, and lipid profile in 606
children born after FET, fresh-ET, and NC, stratified by sex, the authors found
significantly higher systolic BP (SBP), SBP (SDS), and heart rate in girls born
after FET compared with girls born after fresh-ET. Boys born after FET had a more
favourable lipid profile compared with boys born after fresh-ET and NC.

In this study, in which a large cohort of children born after
FET, fresh-ET, and NC was investigated, the authors observed elevated SBP, SBP
(SDS), and heart rate among girls conceived after FET when compared with girls
conceived after fresh-ET. These differences persisted in sensitivity analyses
restricted to pre-pubertal girls. Notably, these disparities were not found in
boys. Further, boys born after FET exhibited more favourable lipid profiles compared
with boys born after fresh-ET and NC. These findings suggest that girls and
boys may display different susceptibility to intrauterine perturbances
affecting the cardiovascular system in childhood and this emphasizes the need
for further long-term follow-up studies on children born after ART stratified
on sex.

Source: Asserhøj et al.; Human Reproduction Open, 2024,
2024(2), hoae016 https://doi.org/10.1093/hropen/hoae016


Risk of Post-Acute Sequelae of SARS-CoV-2 Remains High Even with Vaccination During Omicron Era: Study

United States: A recent study published in the New England Journal of Medicine concluded that the cumulative incidence of post-acute sequelae of SARS-CoV-2 infection (PASC) during the first year after infection decreased over the course of the pandemic, but that in the omicron era the risk of PASC remained substantial among vaccinated persons who had SARS-CoV-2 infection.

Post-acute sequelae of SARS-CoV-2 infection (PASC), commonly referred to as long COVID, is a range of symptoms that persist for weeks or months after the acute phase of COVID-19 has resolved. Fatigue, shortness of breath, cough, headache, and joint pain are among the common symptoms. Considering this, Yan Xie, from the Division of Pharmacoepidemiology, Washington University, St. Louis, and colleagues conducted a study to estimate the cumulative incidence of PASC during the first year after SARS-CoV-2 infection.

For this purpose, the research team used records of the Department of Veterans Affairs from March 1, 2020, to January 31, 2022, to build a study population of 441,583 veterans infected with SARS-CoV-2 and 4,748,504 noninfected participants who served as a control group. The cumulative incidence of PASC at 1 year after SARS-CoV-2 infection was estimated for the pre-delta, delta, and omicron eras of COVID-19.

The study assessed 206,011 persons with no vaccination and SARS-CoV-2 infection during the pre-delta era, 54,002 persons with no vaccination and SARS-CoV-2 infection during the delta era, and 84,943 persons with vaccination and SARS-CoV-2 infection during the omicron era.

The findings revealed that:

• Among unvaccinated persons infected with SARS-CoV-2, the cumulative incidence of PASC during the first year after infection was 10.42 events per 100 persons in the pre-delta era, 9.51 events per 100 persons in the delta era, and 7.76 events per 100 persons in the omicron era.

• Among vaccinated persons, the cumulative incidence of PASC at 1 year was 5.34 events per 100 persons during the delta era and 3.50 events per 100 persons during the omicron era. Vaccinated persons had a lower cumulative incidence of PASC at 1 year than unvaccinated persons.

• Decomposition analyses showed 5.23 fewer PASC events per 100 persons at 1 year during the omicron era than during the pre-delta and delta eras combined; 28.11% of the decrease was attributable to era-related effects (changes in the virus and other temporal effects), and 71.89% was attributable to vaccines.
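
The decomposition above can be checked arithmetically; a minimal sketch splits the reported 5.23-event reduction into its era-related and vaccine-attributable parts using the percentages from the study:

```python
total_reduction = 5.23  # fewer PASC events per 100 persons at 1 year (omicron vs earlier eras)
era_share = 0.2811      # fraction attributed to era-related effects
vaccine_share = 0.7189  # fraction attributed to vaccines

era_events = total_reduction * era_share
vaccine_events = total_reduction * vaccine_share

# The two parts recover the total reduction
print(round(era_events, 2), round(vaccine_events, 2))  # prints 1.47 3.76
```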

The researchers concluded that the cumulative incidence of PASC during the first year after SARS-CoV-2 infection decreased over the course of the pandemic, but that during the omicron era the risk of PASC persisted even among vaccinated persons infected with SARS-CoV-2.

Reference

Xie, Y., Choi, T., & Al-Aly, Z. (2024). Postacute Sequelae of SARS-CoV-2 Infection in the Pre-Delta, Delta, and Omicron Eras. The New England journal of medicine, 10.1056/NEJMoa2403211. Advance online publication. https://doi.org/10.1056/NEJMoa2403211


Low cortisol, hair-trigger stress response in the brain may underlie Long COVID: Study

Proteins left behind by COVID-19 long after initial infection can cause cortisol levels in the brain to plummet, inflame the nervous system and prime its immune cells to hyper-react when another stressor arises, according to new animal research by University of Colorado Boulder scientists.

The study, published in the journal Brain, Behavior, and Immunity, sheds new light on what might underlie the neurological symptoms of Long COVID, an intractable syndrome that impacts as many as 35% of those infected with the virus.

The findings come as COVID makes a striking summer comeback, with cases rising in 84 countries and numerous high-profile athletes at the Paris Olympics testing positive.

“Our study suggests that low cortisol could be playing a key role in driving many of these physiological changes that people are experiencing with Long COVID,” said lead author Matthew Frank, PhD, a senior research associate with the Department of Psychology and Neuroscience at CU Boulder.

Previous research has shown that SARS-CoV-2 antigens, immune-stimulating proteins shed by the virus that causes COVID-19, linger in the blood stream of Long COVID patients as much as a year after infection. They’ve also been detected in the brains of COVID patients who have died.

To explore just how such antigens impact the brain and nervous system, the research team injected an antigen called S1 (a subunit of the “spike” protein) into the spinal fluid of rats and compared them to a control group.

After 7 days, in rats exposed to S1, levels of the cortisol-like hormone corticosterone plummeted by 31% in the hippocampus, the region of the brain associated with memory, decision making and learning. After 9 days, levels were down 37%.

“Nine days is a long time in the life span of a rat,” said Frank, noting that rats live on average for two to three years.

He notes that cortisol is a critical anti-inflammatory, helps convert fuel into energy and is important for regulating blood pressure and the sleep-wake cycle and keeping the immune response to infection in check. One recent study showed that people with Long COVID tend to have low cortisol levels. So do people with chronic fatigue syndrome, research shows.

“Cortisol has so many beneficial properties that if it is reduced it can have a host of negative consequences,” said Frank.

In another experiment, the researchers exposed different groups of rats to an immune stressor (a weakened bacteria) and observed their heart rate, temperature and behavior as well as the activity of immune cells in the brain called glial cells.

They found that the group of rats that had previously been exposed to the COVID protein S1 responded far more strongly to the stressor, with more pronounced changes in eating, drinking, behavior, core body temperature and heart rate, more neuroinflammation and stronger activation of glial cells.

“We show for the first time that exposure to antigens left behind by this virus can actually change the immune response in the brain so that it overreacts to subsequent stressors or infection,” said Frank.

He stresses that the study was in animals and that more research is necessary to determine whether and how low cortisol might lead to Long COVID symptoms in people.

But he theorizes that the process might go something like this: COVID antigens lower cortisol, which serves to keep inflammatory responses to stressors in check in the brain. Once a stressor arises – whether it be a bad day at work, a mild infection or a hard workout – the brain’s inflammatory response is unleashed without those limits and serious symptoms come screaming back.

Those might include fatigue, depression, brain fog, insomnia and memory problems.

Frank said he is doubtful that cortisol treatments alone could be an effective treatment for Long COVID, as they would not get at the root cause and come with a host of side effects.

Instead, the findings suggest that identifying and minimizing different stressors might help manage symptoms.

Rooting out the source of antigens, including tissue reservoirs where bits of the virus continue to hide out, might also be an approach worth exploring, he suggests.

The study was funded by the nonprofit PolyBio Research Foundation. More research is underway.

“There are many individuals out there suffering from this debilitating syndrome. This research gets us closer to understanding what, neurobiologically, is going on and how cortisol may be playing a role,” said Frank.

Reference:

Matthew G. Frank, Jayson B. Ball, Shelby Hopkins, Tel Kelley, Angelina J. Kuzma, Robert S. Thompson, Monika Fleshner, Steven F. Maier, SARS-CoV-2 S1 subunit produces a protracted priming of the neuroinflammatory, physiological, and behavioral responses to a remote immune challenge: A role for corticosteroids, Brain, Behavior, and Immunity, https://doi.org/10.1016/j.bbi.2024.07.034.


Can meditation and stretching relieve cramping caused by cirrhosis?

People suffering from cirrhosis may find some symptom relief from two accessible activities: stretching and meditation.

A study from the University of Michigan compared the two therapies as a means to relieve nocturnal muscle cramps and found both effective.

The resulting paper, “The RELAX randomized controlled trial: Stretching versus meditation for nocturnal muscle cramps,” appeared in Liver International.

Two out of every three people with cirrhosis experience muscle cramps at night that wake them from sleep.

Since these cramps interrupt rest, they exacerbate other symptoms.

In previous research, Michigan doctors determined that muscle cramps have the highest impact on quality of life, relative to other cirrhosis-related symptoms, making their treatment a priority.

“We wanted to test two different treatments for cramps: One was coping with meditation and the other was physically stretching to prevent the occurrence of the cramp,” said Elliot Tapper M.D., director of the University of Michigan’s Cirrhosis Program, and lead author on the paper.

“What we ended up finding was that both interventions significantly reduced cramps severity and improved quality of life, which was somewhat unexpected.”

The unexpected result was the effect of meditation.

The researchers had selected meditation as an active placebo after previous research on mindfulness techniques for caregivers of people with cirrhosis.

In this study, the participants in the meditation group and the participants in the stretching group both reported reduced cramp severity and better sleep.

“The hope was that if we could see some positive effects for patients, then we could use meditation in other studies of generalized chronic pain,” Tapper said.

“I just didn’t expect it to have anything to do with cramp severity. I thought it could improve quality of life, but not reduce cramps.”

There is limited research on therapies to ameliorate these cramps, despite their prevalence in patients with cirrhosis and chronic liver disease.

A previous study led by Tapper showed that drinking pickle juice could help stop cramps, though it did not improve overall quality of life.

While a higher percentage of patients who stretched (79.5%) said they would recommend their intervention than those patients who meditated (55.3%), the results suggested meditating was more likely to improve overall health-related quality of life.

Tapper highlights this potential to improve quality of life, along with its accessibility, as making meditation an exciting option meriting further study.

“These results really show that, if practiced, these mind-body methods have the ability to train people to overcome truly distressing physical symptoms,” Tapper said.

Reference:

Elliot B. Tapper, Hirsh Trivedi, Douglas A. Simonetto, Vilas Patwardhan, Erin Ospina, Beanna Martinez, Xi Chen, Susan Walker, Samantha Nikirk, The RELAX randomized controlled trial: Stretching versus meditation for nocturnal muscle cramps, Liver International.


Increased tablet use in children may increase risk of emotional dysregulation, states JAMA study

A new study published in JAMA Pediatrics found that children who started using tablets at the age of 3.5 years were more likely to express frustration and anger. Children in preschool are increasingly using tablets, and the usage of mobile devices has been connected to emotional dysregulation in children. Few studies have been able to demonstrate a direct correlation between tablet use and the growth of self-regulation abilities in children, and there is a dearth of research modelling within-person correlations across time. Thus, this study by Caroline Fitzpatrick and colleagues aimed to determine the extent to which children’s tablet use influences within-person expressions of anger and frustration between the ages of 3.5 and 5.5 years.

During the COVID-19 pandemic, a total of 315 parents of preschool-aged children from Nova Scotia, Canada, who were part of a prospective, community-based convenience sample, were followed up when their children were aged 3.5 (2020), 4.5 (2021), and 5.5 years (2022). All analyses were carried out between October 5, 2023, and December 15, 2023. The study used parent-reported tablet use at ages 3.5, 4.5, and 5.5 years as the exposure. Parents described instances of anger/frustration expressed by their children at ages 3.5, 4.5, and 5.5 years using the Children’s Behavior Questionnaire.

The sample was split evenly between boys and girls, and most parents reported being married (258 [82.0%]) and Canadian (287 [91.0%]). According to a random-intercept cross-lagged panel model, a 1-SD increase in tablet use at 3.5 years (1.15 hours per day) was linked to a 22% of a standard deviation rise in anger/frustration at 4.5 years. In turn, a 1-SD rise in anger/frustration at 4.5 years was linked to a 22% of a standard deviation (0.28 hours per day) increase in tablet use at 5.5 years. Overall, this study found that children who started using tablets at age 3.5 showed increased signs of frustration and anger by the time they were 4.5 years old, and a child’s propensity for anger and frustration at age 4.5 was linked to increased tablet use by age 5.5. These findings imply that early tablet use in children may be part of a cycle that is harmful to emotional regulation.

Reference:

Fitzpatrick, C., Pan, P. M., Lemieux, A., Harvey, E., Rocha, F. de A., & Garon-Carrier, G. (2024). Early-Childhood Tablet Use and Outbursts of Anger. JAMA Pediatrics. https://doi.org/10.1001/jamapediatrics.2024.2511


For some older adults with kidney failure, dialysis may not be the best option, claims study

Whether dialysis is the best option for kidney failure and, if so, when to start, may deserve more careful consideration, according to a new study.

For older adults who were not healthy enough for a kidney transplant, starting dialysis when their kidney function fell below a certain threshold, rather than waiting, afforded them roughly one more week of life, Stanford Medicine researchers and their colleagues found.

More critically, perhaps, they spent an average of two more weeks in hospitals or care facilities, in addition to the time spent undergoing dialysis.

“Is that really what a 75- or 80-year-old patient wants to be doing?” asked Maria Montez Rath, PhD, a senior research engineer.

Montez Rath is the lead author on a study about dialysis, life expectancy and time at home to be published in Annals of Internal Medicine. Manjula Tamura, MD, a professor of nephrology, is the senior author.

“For all patients, but particularly for older adults, understanding the trade-offs is really essential,” Tamura said. “They and their physicians should carefully consider whether and when to proceed with dialysis.”

Patients with kidney failure who are healthy enough for transplantation may receive a donated kidney, which will rid their blood of toxins and excess fluid. But that option is unavailable to many older adults who have additional health conditions such as heart or lung disease or cancer.

For those patients, physicians often recommend dialysis, a treatment that cleans the blood as healthy kidneys would, when patients progress to kidney failure. Patients are considered to have kidney failure when their estimated glomerular filtration rate (eGFR), a measure of renal function, falls below 15 mL/min/1.73 m².

Patients and their family members sometimes assume that dialysis is their only option, or that it will prolong life significantly, Montez Rath said. “They often say yes to dialysis, without really understanding what that means.”

But patients can take medications in lieu of dialysis to manage symptoms of kidney failure such as fluid retention, itchiness and nausea, Tamura said. She added that dialysis has side effects, such as cramping and fatigue, and typically requires a three- to four-hour visit to a clinic three times a week.

“It’s a pretty intensive therapy that entails a major lifestyle change,” she said.

Lifespan and time at home

The researchers conducted the study to quantify what dialysis entails for older adults who are ineligible for a transplant: whether and how much it prolongs life, along with the relative number of days spent in an inpatient facility such as a hospital, nursing home or rehabilitation center.

The team evaluated the health records, from 2010 to 2018, of 20,440 patients (98% of them men) from the U.S. Department of Veterans Affairs. The patients were 65 and older, had chronic kidney failure, were not undergoing evaluation for transplant and had an eGFR below 12.

Simulating a randomized clinical trial with electronic health records, they divided patients into groups: those who started dialysis immediately, and those who waited at least a month. Over three years, about half of the patients in the group who waited never started dialysis.

Patients who started dialysis immediately lived on average nine days longer than those who waited, but they spent 13 more days in an inpatient facility. Age made a difference: Patients 65 to 79 who started dialysis immediately on average lived 17 fewer days while spending 14 more days in an inpatient facility; patients 80 and older who started dialysis immediately on average lived 60 more days but spent 13 more days in an inpatient facility.

Patients who never underwent dialysis on average died 77 days earlier than those who started dialysis immediately, but they spent 14 more days at home.

“The study shows us that if you start dialysis right away, you might survive longer, but you’re going to be spending a lot of time on dialysis, and you’re more likely to need hospitalization,” Montez Rath said.

Tamura noted that physicians sometimes recommend dialysis because they want to offer patients hope or because the downsides of the treatment haven’t always been clear. But the study indicates physicians and patients may want to wait until the eGFR drops further, Tamura said, and should consider symptoms along with personal preferences before starting dialysis.

“Different patients will have different goals,” she said. “For some it’s a blessing to have this option of dialysis, and for others it might be a burden.”

It may be helpful, she added, if clinicians portray dialysis for frail, older adults as a palliative treatment, one primarily intended to alleviate symptoms.

“Currently, dialysis is often framed to patients as a choice between life and death,” she said. “When it’s presented in this way, patients don’t have room to consider whether the treatment aligns with their goals, and they tend to overestimate the benefits and well-being they might experience. But when treatment is framed as symptom-alleviating, patients can more readily understand that there are trade-offs.”

Reference:

Maria E. Montez-Rath, I-Chun Thomas, Vivek Charu, Michelle C. Odden, Carolyn D. Seib, Shipra Arya, Enrica Fung, Ann M. O’Hare, Susan P.Y. Wong, Effect of Starting Dialysis Versus Continuing Medical Management on Survival and Home Time in Older Adults With Kidney Failure: A Target Trial Emulation Study, Annals of Internal Medicine, https://doi.org/10.7326/M23-3028.


Manuka honey has potential as a nutraceutical for breast cancer, preliminary research reveals

A new study led by investigators at the UCLA Health Jonsson Comprehensive Cancer Center found that Manuka honey could potentially be an alternative, natural option for breast cancer prevention and treatment, particularly for estrogen receptor (ER)-positive breast cancer, the most common subtype, which accounts for about 70–80% of all breast cancer cases.

In preclinical experiments, researchers found:

  • Manuka honey significantly reduced tumor growth in mice with ER-positive breast cancer cells by 84% without affecting normal breast cells or causing major side effects.
  • Higher concentrations of Manuka honey led to a greater reduction in cancer cell growth.
  • Manuka honey reduced the activity of signaling pathways that are upregulated in cancer, such as AMPK/AKT/mTOR and STAT3, which are involved in tumor cell growth and survival.
  • Manuka honey reduced the proliferation of cancer cells but did not affect the growth of normal human mammary epithelial cells, indicating it might target cancer cells specifically.
  • Manuka honey induced apoptosis or cell death of breast cancer cells.
  • Manuka honey enhanced the effectiveness of existing treatments such as tamoxifen, a commonly used antiestrogen drug in ER-positive breast cancer therapy, when the two were used together.

There is an urgent need for alternative treatments to help prevent the development of endocrine resistance and improve long-term breast cancer survival. Endocrine resistance is a major factor contributing to breast cancer being the leading cause of cancer-related deaths among women worldwide. New research has shown that Manuka honey, long known for its antimicrobial and antioxidant properties, is also rich in compounds like flavonoids, phytochemicals, complex carbohydrates, vitamins, amino acids, and minerals. These compounds have demonstrated anticancer potential at a molecular level by inhibiting pathways activated in cancer that induce tumor cell proliferation, growth, and metastasis. Researchers theorize that one of the mechanisms of action of Manuka honey is to block estrogen receptors, making it potentially effective as a nutraceutical against hormone-sensitive breast cancer.

To understand the potential of Manuka honey as a natural treatment for breast cancer, the research team conducted a series of experiments in mice and in ER-positive MCF-7 and triple-negative MDA-MB-231 breast cancer cell lines, which represent two of the most common types of breast cancer. In these models, oral administration of Manuka honey resulted in a significant reduction in tumor growth compared to control groups. This inhibition of tumor progression underscores the honey's potential for cancer prevention or treatment.

The findings suggest that Manuka honey could potentially be developed into a natural supplement or even a standalone treatment for ER-positive breast cancer, particularly for patients who experience resistance to traditional therapies.

“The findings provide hope for development of a natural, less toxic alternative to traditional chemotherapy,” said Dr. Diana Marquez-Garban, associate professor of medicine at the David Geffen School of Medicine at UCLA, and the study’s first author. “Although more research is necessary to fully understand the benefits of natural compounds in cancer therapy, this study establishes a strong foundation for further exploration in this area.”

Reference:

Márquez-Garbán, D.C.; Yanes, C.D.; Llarena, G.; Elashoff, D.; Hamilton, N.; Hardy, M.; Wadehra, M.; McCloskey, S.A.; Pietras, R.J. Manuka Honey Inhibits Human Breast Cancer Progression in Preclinical Models. Nutrients 2024, 16, 2369. https://doi.org/10.3390/nu16142369.
