The Impact of Stress on Cognitive Reserve

Recent research suggests that the cognitive advantages gained from enriching and fulfilling life experiences can be diminished by both physical and psychological stress. Chronic or intense stress is linked to difficulties in participating in recreational and physical activities, decreased social engagement, and a heightened risk of developing dementia.

Stress could be working against your attempts to strengthen and maintain cognitive reserve. Research has indicated that individuals with higher cognitive reserve index (CRI) scores tend to experience slower cognitive decline, even in cases of Alzheimer’s disease. These scores reflect factors such as engagement in mentally stimulating activities, higher levels of education, complex careers, regular physical activity, and strong social connections.

The research investigated the relationship between CRI scores, cognitive function, and Alzheimer’s disease biomarkers in 113 participants from a memory clinic. The study also examined the impact of perceived stress alongside a physiological marker of psychological stress, specifically salivary cortisol levels.

The researchers found that while greater cognitive reserve enhanced cognitive function, physiological stress appeared to weaken this effect. Higher CRI scores were linked to improved working memory in individuals with normal cortisol levels, but this benefit was absent in those with elevated cortisol levels, indicating high psychological stress.

It is essential to recognize the effect of stress on cognitive function. To help preserve healthy cognition, stress-management techniques such as relaxation exercises, meditation, physical exercise, or therapy can lower the risk of cognitive decline, while engaging in intellectually challenging activities strengthens neural connections and supports memory, problem-solving, and communication.

To view the original scientific study click below:
Cognitive reserve, cortisol, and Alzheimer’s disease biomarkers: A memory clinic study

Obesity and Its Effects on Fat-Burning Nerves

A recent study on mice has identified a particular type of neuron that releases a protein called neuropeptide Y (NPY), which aids in weight loss. However, the study also found that obesity can damage these neurons, rendering them ineffective.

Mice with low levels of NPY continued to gain weight and become obese even when fed a healthy diet. This could help explain why some individuals gain weight without overeating and could inform strategies for preventing obesity. The research indicates that, for certain individuals, energy expenditure may be more important than appetite in regulating body weight.

These particular neurons release NPY, a chemical signal present in both the brain and the peripheral nervous system that promotes fat burning. Approximately 40% of sympathetic neurons in the peripheral nervous system are NPY-positive. Researchers have found that NPY increases appetite while also activating fat-burning cells in adipose tissue.

NPY helps prevent obesity by promoting the growth of fat cells that burn energy to produce heat, thereby helping regulate body temperature. Mice deficient in NPY had fat cells with reduced fat-burning capacity, and when given a high-fat diet they became obese more quickly and at an earlier stage, even without consuming more food.

Few studies have explored the connection between NPY and weight loss, largely because earlier research concentrated on its role in enhancing appetite. However, it has been established that the body secretes NPY in response to a range of stressors, both internal and external. This new research could pave the way for innovative weight management strategies.

To view the original scientific study click below:
Sympathetic neuropeptide Y protects from obesity by sustaining thermogenic fat

Does Being More Flexible Help You Live Longer?

Staying limber could do more than just loosen your muscles; it could also prolong your life. Recent research suggests that reduced flexibility is linked to a higher risk of death among middle-aged adults. The research sought to determine if qualities like flexibility could impact lifespan in ways similar to other health factors.

Movement is crucial for maintaining good health, boosting cardiovascular function, decreasing the risk of type 2 diabetes, and enhancing mental health. Although studies have confirmed that cardiovascular and strength-training exercise can extend lifespan, the impact of flexibility has not been thoroughly explored, and that gap motivated the current study.

In this research, the term “flexibility” describes the ease with which joints move through their full range of motion. Maintaining good flexibility is important for preventing or minimizing pain throughout the body, and flexibility typically mirrors a person’s general physical health and fitness level. While lack of flexibility isn’t a direct cause of death, it may point to larger health issues such as chronic disease, an inactive lifestyle, inflammation, pain, and psychological stress.

The researchers analyzed data from about 3,000 individuals to explore how flexibility affects lifespan. They focused on participants aged 28 and older, monitoring them for an average of 13 years. Individuals with higher flexibility had better survival from natural causes of death, and women scored 35% higher in flexibility than men. Overall, higher flexibility scores were linked to better mortality outcomes in both sexes.
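
For readers curious how this kind of association is typically quantified, below is a minimal, hypothetical sketch of a proportional-hazards survival analysis relating a flexibility score to mortality. It is not the authors' code: the file name, column names, and covariates are assumptions for illustration, and it uses the open-source lifelines library.

```python
# Hypothetical sketch of a survival analysis relating a flexibility score to
# mortality, in the spirit of the study described above. The CSV file and
# column names are assumptions; this is not the study's actual analysis.
import pandas as pd
from lifelines import CoxPHFitter

# Expected columns: years_followed (follow-up time), died (1 = death from
# natural causes, 0 = censored), flex_score (body flexibility score),
# age, sex (0 = male, 1 = female).
df = pd.read_csv("flexibility_cohort.csv")

cph = CoxPHFitter()
cph.fit(
    df[["years_followed", "died", "flex_score", "age", "sex"]],
    duration_col="years_followed",
    event_col="died",
)

# A hazard ratio below 1 for flex_score would indicate that higher
# flexibility is associated with lower mortality, as the study reports.
cph.print_summary()
```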

Increased flexibility can aid in injury prevention and facilitate movements crucial for daily health. Regular activities such as climbing stairs or lifting objects from the ground enhance joint flexibility and lower the risk of developing degenerative joint diseases.

The study emphasizes that middle-aged individuals should adopt a daily stretching routine to maintain their flexibility. This practice can make daily activities more manageable, help prevent injuries, support joint health, and alleviate stiffness.

To view the original scientific study click below:
Reduced Body Flexibility is Associated with Poor Survival in Middle-aged Men and Women: A Prospective Cohort Study

Sugar Substitute Linked to Blood Clots and Heart Issues

Recent research indicates that erythritol, a widely used zero-calorie sweetener in various low-carb and no-sugar products, may elevate the risk of blood clots, strokes and heart attacks. Common in the ketogenic diet, erythritol is utilized not only as a sweetener but also as a thickener in numerous products, including snack bars, baked goods, candy, and ice cream.

Erythritol occurs naturally in fruits such as pears, watermelons, and grapes, as well as in some vegetables, but the commercial form is manufactured, typically by fermenting corn or wheat starch, and is used extensively as a food additive to enhance sweetness and flavor. It is approximately 70% as sweet as sugar.

In the study, 20 participants were divided randomly: 10 drank water containing 30 grams of erythritol, and the other 10 consumed water with 30 grams of glucose. The researchers selected 30 grams because it represents a typical amount found in foods containing erythritol. Thirty minutes after drinking the erythritol-sweetened water, participants showed increased levels of proteins that promote platelet clumping in their blood. Across the subjects tested, the erythritol-sweetened beverage increased clotting risk and altered platelet function. This reaction was not seen in participants who drank the glucose-sweetened water.

Amidst the rising obesity epidemic, artificial sweeteners like erythritol are more frequently included in soft drinks, diet foods, and various processed items. Health and weight-loss experts commonly suggest it as a sugar substitute for people with high cardiovascular risk factors, including obesity, diabetes, or metabolic syndrome.

Regrettably, erythritol is not efficiently metabolized by the body, leading to its accumulation. Previous studies have indicated that individuals at higher risk for heart disease who had elevated erythritol levels were twice as likely to suffer a major cardiac event within three years compared to those with lower levels.

Recently, the World Health Organization recommended against the use of artificial sweeteners, citing a lack of evidence that they aid in sustaining weight loss and pointing to potential health risks. These guidelines are intended for the general population, excluding individuals with preexisting diabetes. Erythritol is often suggested for diabetic patients because it does not elevate blood sugar and helps prevent cavities and other oral health issues.

This study highlights concerns that a typical portion of food or drink sweetened with erythritol could immediately trigger clot formation. Erythritol and other related sugar substitutes, frequently used as alternatives to sugar, warrant further investigation for potential long-term health impacts, particularly as these effects are not observed with sugar.

To view the original scientific study click below:
Ingestion of the Non-Nutritive Sweetener Erythritol, but Not Glucose, Enhances Platelet Reactivity and Thrombosis Potential in Healthy Volunteers

Gut Bacteria Linked to Food Addiction and Obesity

A team of researchers has examined the role of gut bacteria in the development of food addiction and excessive eating habits. They found that certain gut microbiota can contribute to the onset of food addiction, a condition that may lead to obesity, in both mice and humans.

Food addiction is characterized by a loss of control over food intake and is associated with obesity and other eating disorders. It also involves alterations in the composition of gut bacteria. Previously, the mechanisms behind this behavioral disorder were largely unclear.

The Yale Food Addiction Scale, which includes 35 questions for humans, was employed to diagnose food addiction in both mice and humans. For mice, the scale’s criteria are broken down into three categories: relentless pursuit of food, intense drive to secure food, and obsessive eating behavior.

Researchers then compared the gut bacteria of mice categorized as food-addicted with those of non-addicted mice. They noted an elevation in bacteria from the Proteobacteria phylum and a reduction in bacteria from the Actinobacteria phylum in the food-addicted mice. Additionally, these mice showed a reduction in Blautia bacteria from the Bacillota phylum.

The research on both mice and humans indicated that certain microbiota may act as protective agents against food addiction. The study showed for the first time a direct link between gut composition and gene activity in the brain, exposing the intricate and diverse origins of this behavioral disorder associated with obesity. It highlighted that Proteobacteria might contribute negatively to the development of food addiction in both species, while Actinobacteria could have a protective role.

These findings could lead to the identification of new biological indicators for food addiction and facilitate the exploration of these beneficial bacteria as potential therapeutic agents. Currently, there are no effective treatments available. Future strategies might involve employing beneficial bacteria alongside dietary supplements.

To view the original scientific study click below:
Gut microbiota signatures of vulnerability to food addiction in mice and humans

Sugar Intake Might Speed Up Cellular Aging

New research indicates that high levels of added sugar may do more to harm metabolic health and hasten early disease onset than any other dietary component. The research also revealed that cells appeared more youthful in middle-aged women whose diets included more minerals, vitamins, and antioxidants, compared to women who ate less nutrient-dense diets.

Cellular youthfulness was assessed by examining chemical tags called methyl groups attached to the DNA. These tags adjust the activity of certain genes through a process called epigenetic modification, which does not change the underlying DNA sequence. As we age, the arrangement of these methyl groups shifts, a phenomenon thought to play a role in hastening cellular aging.

The study involved analyzing the dietary records of 342 black and white women, averaging 39 years old, over three non-consecutive days. The researchers rated each woman’s diet according to its adherence to various recognized dietary guidelines. Additionally, they evaluated the amount of added sugar consumed, which varied from 0.1 to 11 oz. per day. To calculate the participants’ epigenetic ages, the team examined DNA methylation in cells from saliva samples.
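
As background on how an epigenetic age can be read off DNA methylation data, here is a simplified, illustrative sketch of a linear methylation clock: predicted age is an intercept plus a weighted sum of methylation levels at selected CpG sites. The site names, weights, and sample values are invented; published clocks, including the second-generation clocks mentioned below, rely on hundreds of sites with trained coefficients.

```python
# Illustrative sketch of a linear "epigenetic clock": predicted age is an
# intercept plus a weighted sum of methylation beta values (0..1) at specific
# CpG sites. The CpG names and coefficients below are made up for the example;
# real clocks use hundreds of sites with published, trained weights.

INTERCEPT = 25.0
WEIGHTS = {          # hypothetical CpG site -> coefficient (years per unit beta)
    "cg0000001": 12.3,
    "cg0000002": -8.9,
    "cg0000003": 21.4,
}

def epigenetic_age(beta_values: dict[str, float]) -> float:
    """Return predicted epigenetic age from CpG methylation beta values."""
    return INTERCEPT + sum(
        coef * beta_values[site] for site, coef in WEIGHTS.items()
    )

# Example: a saliva sample with these (hypothetical) methylation levels.
sample = {"cg0000001": 0.62, "cg0000002": 0.41, "cg0000003": 0.55}
predicted = epigenetic_age(sample)
chronological = 39.0
print(f"Epigenetic age: {predicted:.1f}, "
      f"age acceleration: {predicted - chronological:+.1f} years")
```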

The results indicate that added sugar is associated with accelerated epigenetic aging and suggest that excessive sugar intake may be one of several ways diet undermines healthy longevity. The study supports the notion that eating nutritious foods low in added sugar can extend the years a person spends in good health, rather than merely extending survival.

These findings are among the first to link added sugar intake with epigenetic aging, utilizing second-generation epigenetic clocks. This study is also one of the first to examine such associations in a diverse group of black and white middle-aged women.

Nonetheless, further research is necessary to evaluate the long-term effects of these diets on epigenetic aging.

To view the original scientific study click below:
Essential Nutrients, Added Sugar Intake, and Epigenetic Age in Midlife Black and White Women

The Link Between Higher Abdominal Fat and Reduced Cognitive Function

Aging frequently comes with a variety of comorbid conditions, among which dementia stands out as particularly daunting, given the difficulties in developing effective treatments. In light of these challenges, focusing on modifiable risk factors that could diminish the risk of this ailment presents a more practical strategy. Obesity has been pinpointed as one such factor.

It appears that obesity in midlife poses a risk for dementia, yet a higher BMI in older age is linked to a lower incidence of the disease. This paradox suggests that BMI might not be the most accurate measure of nutritional health in the elderly, who typically undergo shifts in body composition, including an increase in fat mass alongside a decrease in lean mass, without significant alterations in BMI.

In response, newer studies have shifted focus towards alternative health metrics like waist circumference and waist-to-hip ratio, pointing to the detrimental effects of abdominal fat on cognitive health. Nevertheless, the debate continues regarding the precise impact of different body measurements and fat types on cognitive decline.

This research monitored a cohort of 873 older adults from community settings in Japan. These individuals, all aged 60 years and above and free from cognitive impairments at the outset, were observed biennially over an average span of 9.67 years. The study measured each participant’s waist circumference, visceral fat area and subcutaneous fat area. Cognitive function was evaluated using the Mini-Mental State Examination (MMSE). For analysis, these measurements were categorized into three tiers: lowest, middle, and highest.
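
As a rough illustration of the tertile-based analysis described above (not the study's actual code), the sketch below splits a hypothetical visceral fat measurement into lowest, middle, and highest thirds and compares the average change in MMSE score across the groups; the file and column names are assumptions.

```python
# Rough illustration of the tertile-based comparison described above.
# The data file and column names are hypothetical; this is not the study's code.
import pandas as pd

# Expected columns: visceral_fat_area, mmse_baseline, mmse_followup.
df = pd.read_csv("cohort.csv")

# Split visceral fat area into lowest / middle / highest thirds.
df["vfa_tertile"] = pd.qcut(
    df["visceral_fat_area"], q=3, labels=["lowest", "middle", "highest"]
)

# Change in cognitive score over follow-up (negative = decline).
df["mmse_change"] = df["mmse_followup"] - df["mmse_baseline"]

# Mean MMSE change per tertile; the study reports the largest declines in the
# highest-adiposity groups, with sex-specific differences for visceral fat.
print(df.groupby("vfa_tertile", observed=True)["mmse_change"].mean())
```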

The analysis revealed that in men, the most significant reductions in MMSE scores were observed in those with the highest measurements of waist circumference, subcutaneous fat area, and visceral fat area, relative to their counterparts in the lowest measurement groups. For women, a similar pattern emerged for waist circumference and subcutaneous fat area, where those in the highest measurement groups experienced the greatest declines in MMSE scores compared to women in the lowest groups. This trend, however, did not hold for the groups categorized by visceral fat area in women.

The team sought to understand their findings by considering existing scientific insights. They suggested that the link between cognitive decline and both visceral and subcutaneous fat might be attributed to the function of adipose tissue as an endocrine organ that produces adipokines. The hypothesis is that greater abdominal adiposity could heighten the release of inflammatory cytokines, potentially leading to cognitive deterioration. This theory is supported by earlier studies, such as one indicating a correlation between elevated levels of the inflammatory cytokine IL-6, an adipokine, and a heightened risk of dementia.

Increased waist circumference (WC), subcutaneous fat area (SFA), and visceral fat area (VFA) in men, along with elevated WC and SFA in women, were linked to more significant cognitive deterioration over the following decade. Yet, no link was found between the accumulation of visceral fat and cognitive decline in women. These results indicate that abdominal fat buildup poses a risk for cognitive decline in the elderly. Additionally, the types of abdominal fat contributing to cognitive decline appear to vary between men and women, highlighting the need for further research to elucidate the sex-specific mechanisms underlying this difference.

To view the original scientific study click below:
Association between abdominal adiposity and cognitive decline in older adults: a 10-year community-based study

Pesticides Tied to Multiple Cancer Risks

According to new research, pesticides, which are chemicals intended to eradicate or control pests like bugs, weeds, and wildlife, could lead to hundreds of thousands of additional cancer cases, particularly in major corn-producing states. The potential link between pesticide use and cancer has long been a concern for scientists and public health experts.

Regarding specific cancers, pesticide use appeared most strongly linked to blood cancers such as leukemia and non-Hodgkin lymphoma; pesticide-related cases of non-Hodgkin lymphoma were estimated to be one and a half times more numerous than those linked to smoking. The researchers used the well-established cancer risks of cigarette smoking as a benchmark to gauge the relative danger of pesticide exposure, and on that basis they argue that, for some cancers, pesticide exposure may be comparable to smoking as a hazard.

The researchers acknowledge that pesticides play a crucial role in modern agriculture, leading to substantial crop yields that are vital for global food security. The development and use of pesticides stand as one of the most significant innovations in agriculture. However, they also highlight the risks associated with relying on these chemicals.

For the first time, this study explored the link between pesticides and cancer with a primary focus on health impacts across broad community groups; previous research mainly examined specific groups such as farmers and their spouses. The findings revealed that regular use of glyphosate (Roundup) was linked to an increased risk of all cancers, as well as specific increases in colon and pancreatic cancers.

Increased exposure to pesticide chemicals heightens cancer risk because many are potential carcinogens. The level and duration of exposure, along with combinations of different chemicals, can increase the likelihood of DNA damage or of disruption to cellular signaling pathways, for instance through the oxidative stress that pesticides can induce. Cancer develops through a multi-step process, and certain individuals may be more susceptible than others.

The researchers aim to raise awareness and inspire healthcare initiatives and educational programs focused on the risks and management of pesticide exposure. Non-urban areas, central to agricultural production, frequently have limited access to these vital resources. Over the long term, there is a need for ongoing research and the development of safer chemicals or application methods.

To view the original scientific study click below:
Comprehensive assessment of pesticide use patterns and increased cancer risk

New Study Shows Eating Vegan Can Slow Aging

A recent study provides strong evidence suggesting that individuals who eat a diet abundant in whole plant foods may experience slower aging compared to those consuming animal products and highly processed foods. This pioneering research compared the impact of a healthful vegan diet against a balanced diet with meat on the aging process, as gauged by DNA methylation analysis. The study’s methodology, which involved twin pairs, automatically controlled for genetics, age variations, and gender differences.

The study included 22 pairs of identical twins who were assigned to different diets to examine the effects over an eight-week period. One twin from each pair was randomized to follow a healthy vegan diet, while their counterpart adhered to a healthy omnivorous diet, allowing researchers to directly compare how these dietary choices influenced their biological age.

For their analysis, the researchers employed four established methylation clocks and calculated the individual ages of 11 organs and systems, as well as their collective age, termed “Systems Age.” At the conclusion of the study, three of these clocks indicated a noticeable reduction in age acceleration among the vegan group, a change not observed in the omnivorous group. Additionally, significant reductions in biological age were observed exclusively within the vegan cohort in 5 of the 11 systems, as well as in the overall Systems Age.

The study also found that a vegan diet improved epigenetic markers linked to aging. These markers act like hidden clocks within your genes, meaning that while you may be forty years old chronologically, your biological age can vary due to lifestyle influences such as diet and exercise. Rather than relying on a single epigenetic measure of biological age, the researchers used several of these clocks. The vegan group showed a significant decrease in measured biological age, while the omnivorous group showed no change.

This study was limited by its small sample size and brief duration. Further research over an extended period would be valuable to better understand the long-term effects of a vegan diet on epigenetic markers. Despite these limitations, the findings align with increasing evidence that a well-planned vegan diet can offer substantial health benefits and bolster support for plant-based eating.

To view the original scientific study click below:
Unveiling the epigenetic impact of vegan vs. omnivorous diets on aging: insights from the Twins Nutrition Study (TwiNS)

Heavy Metal Presence Found in Some Dark Chocolate

Many people are now considering chocolate almost like a supplement, believing that their daily intake provides various health benefits. However, your preferred dark chocolate bar might carry a hidden health hazard. Recent research indicates that certain popular dark chocolate items could have alarming amounts of heavy metals, especially lead and cadmium.

Cacao trees, the source of chocolate, are notably proficient at absorbing heavy metals from the soil, and they are often cultivated in regions where these metals are prevalent. With these issues in mind, the researchers set out to investigate the extent of this contamination.

Over an eight-year study of dark chocolate, two toxic heavy metals were repeatedly detected, and organic chocolates tended to have higher levels of these metals than conventional varieties. The researchers set out to test chocolate for contaminants as thoroughly as dietary supplements are tested, prompted by the question of whether the health benefits consumers seek are outweighed by the potential risks of heavy metal exposure.

Between 2014 and 2022, researchers purchased 72 well-known cocoa products from U.S. retailers, mainly consisting of pure dark chocolate bars, along with some cocoa powder. All of these products were sourced from manufacturers in either the U.S. or Europe. Subsequently, the chocolates underwent an analysis at two independent labs in the U.S.

The standards from California Proposition 65 were applied in the tests because the U.S. Food and Drug Administration does not establish heavy metal limits for most foods. This state-level regulation is notably stricter. The findings revealed that 43% of the analyzed products surpassed the permissible exposure levels for lead, and 35% exceeded those for cadmium.
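
To make these exceedance figures concrete, here is a small, illustrative check of per-serving heavy-metal doses against Proposition 65 maximum allowable dose levels. The product measurements are invented, and the limit values shown (0.5 µg/day for lead and 4.1 µg/day for cadmium) are commonly cited but should be verified against the current Proposition 65 listings.

```python
# Illustrative check of per-serving heavy-metal doses against California
# Proposition 65 maximum allowable dose levels (MADLs). The product values are
# invented, and the limits below should be verified against current listings.
PROP65_LIMITS_UG_PER_DAY = {"lead": 0.5, "cadmium": 4.1}

# Hypothetical measured dose per serving (micrograms) for a few products.
products = {
    "dark bar A": {"lead": 0.8, "cadmium": 2.0},
    "dark bar B": {"lead": 0.3, "cadmium": 5.6},
    "cocoa powder C": {"lead": 0.2, "cadmium": 1.1},
}

for name, doses in products.items():
    exceeded = [m for m, dose in doses.items() if dose > PROP65_LIMITS_UG_PER_DAY[m]]
    label = "exceeds " + ", ".join(exceeded) if exceeded else "within limits"
    print(f"{name}: {label}")

# Share of products exceeding at least one limit, analogous to the
# 43% (lead) and 35% (cadmium) figures reported in the study.
over = sum(
    any(dose > PROP65_LIMITS_UG_PER_DAY[m] for m, dose in doses.items())
    for doses in products.values()
)
print(f"{over}/{len(products)} products exceed at least one limit")
```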

Products labeled as organic displayed notably higher levels of cadmium and lead, being 280% more likely to surpass California Proposition 65’s limit for cadmium and 14% more likely to exceed the limit for lead. The reasons why organic chocolate contains more of these heavy metals are unclear; one speculation is that the less intensive processing used in organic production removes fewer heavy metals than conventional processing does.

For those concerned that their fondness for dark chocolate might be compromised, it’s important to note that the risk largely depends on consumption levels. Eating contaminated chocolates infrequently and in small quantities is unlikely to pose a significant public health issue. However, if a person regularly consumes a variety of these products, the cumulative exposure could become a cause for concern.

To view the original scientific study click below:
A multi-year heavy metal analysis of 72 dark chocolate and cocoa products in the USA