Archives

  • 2018-07
  • 2018-10
  • 2018-11
  • 2019-04
  • 2019-05
  • 2019-06
  • 2019-07
  • 2019-08
  • 2019-09
  • 2019-10
  • 2019-11
  • 2019-12
  • 2020-01
  • 2020-02
  • 2020-03
  • 2020-04
  • 2020-05
  • 2020-06
  • 2020-07
  • 2020-08
  • 2020-09
  • 2020-10
  • 2020-11
  • 2020-12
  • 2021-01
  • 2021-02
  • 2021-03
  • 2021-04
  • 2021-05
  • 2021-06
  • 2021-07
  • 2021-08
  • 2021-09
  • 2021-10
  • 2021-11
  • 2021-12
  • 2022-01
  • 2022-02
  • 2022-03
  • 2022-04
  • 2022-05
  • 2022-06
  • 2022-07
  • 2022-08
  • 2022-09
  • 2022-10
  • 2022-11
  • 2022-12
  • 2023-01
  • 2023-02
  • 2023-03
  • 2023-04
  • 2023-05
  • 2023-06
  • 2023-07
  • 2023-08
  • 2023-09
  • 2023-10
  • 2023-11
  • 2023-12
  • 2024-01
  • 2024-02
  • 2024-03
  • 2024-04
  • 2024-05
  • It has previously been hypothesised that iron supplementation…

    2018-11-05

    It has previously been hypothesised that iron supplementation increases malaria susceptibility by increasing the availability of young red blood cells (RBCs), which the parasite preferentially invades. The authors observed significant shifts in RBC populations following iron supplementation: younger RBCs were most prevalent at Day 49 post supplementation, the same period at which parasite growth increased the most. This finding supports the hypothesis that increased erythropoietic drive and the resulting young-RBC population dynamics influence overall malaria susceptibility. The results concur with WHO recommendations cautioning against iron supplementation in malaria-endemic areas lacking extensive health monitoring and antimalarial preventative services, and they point to a need for short-term malaria prophylaxis during iron supplementation campaigns. The transient window of increased susceptibility observed here may also explain why other researchers have reported conflicting results regarding the malaria-related risks of iron supplementation. Finally, the authors observed a direct correlation between parasite growth rates and the haemoglobin levels of the RBC donors, giving healthcare workers a simple, practical measurement for evaluating malaria susceptibility during iron supplementation, in place of the more laborious, less straightforward, and often unavailable assays of other iron-related biomarkers.
    Introduction

    The award of the 2016 Nobel Prize in Physiology or Medicine to Professor Yoshinori Ohsumi for his work elucidating the mechanisms of autophagy recognizes the importance of autophagy for human disease. Autophagy allows cells to maintain intracellular homeostasis and respond to stress by degrading proteins, organelles, and other cellular components via the lysosome. This process is evolutionarily conserved and acts as a critical cellular response to nutrient and oxygen deprivation, yielding recycled amino acids, nutrients, and lipids. Alterations in autophagy, and inherited mutations in the autophagy-related (ATG) genes that control it, have been linked to human disease, including neurological diseases, autoimmune disease, metabolic disorders, infectious disease, and cancer. These connections imply that therapeutic interventions to boost or inhibit autophagy may be useful to treat or prevent disease (Rubinsztein et al., 2012). Here we review the roles of autophagy in two such diseases, neurodegenerative disease and cancer, where direct therapeutic targeting intended to stimulate or inhibit autophagy is moving forward in the clinic.
    Opportunities to Target Autophagy for Therapy

    Three types of autophagy have been characterized: microautophagy, chaperone-mediated autophagy (CMA), and macroautophagy. Microautophagy involves the direct engulfment and lysosomal degradation of cytosolic material (Li et al., 2012), while CMA is facilitated by chaperones that target proteins containing a specific amino acid motif to the lysosome. Although CMA is implicated in disease (Cuervo and Wong, 2014), this review focuses on macroautophagy. Macroautophagy (hereafter termed autophagy) is characterized by the formation of double-membrane vesicles called autophagosomes that engulf cytoplasmic material and then fuse with the lysosome to degrade their contents. The process involves several steps: initiation, nucleation, elongation, and closure of the membranes that form the autophagosome; fusion with the lysosome; and recycling of macromolecular precursors. Each step is regulated by particular autophagy-related proteins (ATGs) (Mizushima et al., 2011). Autophagy is controlled transcriptionally by the MITF and FOXO families of transcription factors (Füllgrabe et al., 2014), as well as by CREB and ATF (Amaravadi, 2015), and is subject to post-translational regulation, allowing its pharmacological manipulation both positively and negatively (Fig. 1). For example, the mammalian target of rapamycin (mTOR) complex mTORC1 inhibits autophagy, so mTOR inhibitors are often used to stimulate autophagy. mTORC2 has also been linked to autophagy, although this link may be specific to CMA (Arias et al., 2015). Autophagy can also be induced independently of mTOR: for example, the naturally occurring disaccharide trehalose, which acts independently of mTOR (Sarkar et al., 2007a), induces autophagy and protects against liver disease by affecting glucose transporters (DeBosch et al., 2016).