New from the @EmoryCSHH News Team:
Merom Arthur recommends Dr. Jessi Gold's new memoir, "How Do You Feel?: One Doctor's Search for Humanity," to those who serve others, for its pointed reflection on the need for self-compassion alongside compassion for others.
Yes, Adults Can Develop Seasonal Allergies
By Mohana Ravindranath, New York Times
The return of pestering pollen accompanies the buzzing bees, shining sun, and blossoming buds of spring. A common misconception about seasonal allergies, also known as hay fever, is that they burden only children. After all, allergies are most prevalent among children and typically develop after age two.
Seasonal allergies are immunological reactions to environmental elements, like pollen or mold spores, which tend to be most prominent in the spring. Exposure to such allergens can trigger sneezing, itchy nose and throat, runny or congested nose, and watery red eyes. Moreover, fatigue, headaches, or sinus pressure can also occur, often leading to confusion with cold symptoms.
Allergies can, however, also develop in patients’ 20s, 30s, and 40s, though the symptoms might not be overt. Science does not yet have an answer as to why individuals can acquire allergies after living decades without them, but one theory involves climate change. Shifting environmental conditions are causing allergy season to begin earlier and last longer than usual, particularly in certain locations. Infections and hormonal fluctuations may also play a role by weakening the immune system’s ability to respond.
Those experiencing seasonal allergies can use over-the-counter antihistamines and nasal steroid sprays to alleviate symptoms. For more severe cases, immunotherapy, such as allergy shots, may be more promising. Most importantly, those with allergies should wash their hands and face, shower before bed, and sleep with the windows closed to keep allergens at bay.
— by Merom Arthur
LA firefighters put out massive blazes. Now they worry that cancer might be smoldering inside them
By John Bonifield, CNN Health
January’s Palisades Fire in Los Angeles has been named the second-most destructive wildfire in Southern California’s history, burning more than 23,000 acres and destroying more than 5,000 structures.
In the aftermath, researchers from the University of Arizona, in collaboration with the Wildfire Conservancy, are conducting a study to monitor LA firefighters’ health outcomes related to their exposure to carcinogens from the recent fires. They procured blood and urine samples from firefighters with the California Department of Forestry and Fire Protection and the LA Fire Department. They also collected contaminant-absorbing wristbands that firefighters wore to measure the extent of their exposure. The samples are currently being tested for carcinogens associated with wildfires that burn in urban areas.
Preliminary data from this ongoing investigation found that 42 responders in the LA fires had significantly higher concentrations of certain chemicals called PFAS in their blood. Additionally, an analysis of heavy metal exposures showed elevated levels of key metals, such as chromium, arsenic, and cobalt. A separate study conducted by the Centers for Disease Control and Prevention found a brief 110-fold increase in airborne lead levels during the LA fires. Researchers, however, are not sure if, or how, these exposures will manifest as negative health outcomes.
The Wildfire Conservancy plans to use these studies to create recommendations and interventions to help firefighters better understand and predict their cancer risk throughout their lives.
— by Alexa Morales
Dementia May Not Always Be the Threat It Is Now. Here’s Why.
By Paula Span, The New York Times
Dementia, a syndrome associated with Alzheimer’s disease and other neurodegenerative disorders, worries many Americans, especially those who may be genetically predisposed to it. A recent study, which projected steep increases in dementia cases over the next three decades, has added to the public’s fear of developing cognitive decline. Dr. Josef Coresh, director of the Optimal Aging Institute at NYU Langone Health, led the study, which included 15,000 Americans over the age of 55. Using data collected from 1987 to 2020, the team projected a lifetime dementia risk of 42 percent, with most cases emerging after age 85.
This risk is much higher than other studies have found, but it may not be as alarming as it appears. Critics believe the study’s sample, skewed toward an older population, may inflate the estimate of lifetime risk. The study did, however, estimate that the number of people who develop dementia would double by 2060.
Eric Stallard, an actuary and co-director of the Biodemography of Aging Research Unit at Duke University, also points out that the age-specific prevalence of dementia in the US has declined over the last 40 years. Notably, investigations like the National Health and Aging Trends Study have found that rates of dementia have gone down within age-specific cohorts. Perhaps any rise in dementia cases can be credited simply to people living longer.
— by Justine Borgia
Items summarized by: Merom Arthur, Alexa Morales, Justine Borgia