On a large scale, humanity is constantly struggling against infectious diseases as well as non-communicable diseases (NCDs). Today our focus is on primary prevention (intervening before a disease develops) or secondary prevention (preventing progression of a disease when you are already sick). In the near future, we will be solving for “primordial prevention,” preventing the risk factors in the first place, and we will treat age as a disease that not only can be “cured” but can be prevented. Here, we’ll track the research that is transforming our understanding of the human body and, ultimately, saving lives.

The disease hadn’t been seen in people before. It had likely jumped into human populations from an animal and was causing severe respiratory illness. Health care professionals around the world teamed up to fight the virus. Temperature scanners and masks became the norm in public places. Some governments enacted quarantines. However, in contrast to the world’s current COVID-19 crisis, the world’s 2003 experience with Severe Acute Respiratory Syndrome (SARS) came to an anti-climactic end in July of that year, when the WHO announced an end to the viral threat just nine months after the beginning of the outbreak.

There are many reasons the world had a very different experience with SARS than it is currently having with COVID-19. A patient infected with SARS would start exhibiting symptoms within 2-3 days; though a study from the Johns Hopkins Bloomberg School of Public Health shows that COVID-19 symptoms appear within a median of five days, some patients might not exhibit symptoms for a full 14 days after exposure. This makes COVID-19 patients much more difficult to isolate before they infect others. Additionally, unlike COVID-19, SARS was difficult to contract, and patients couldn’t spread the disease while they were pre-symptomatic. The differences between the two respiratory diseases have proven dire; while COVID-19 has infected over 43 million people worldwide, SARS disappeared from human populations after infecting a little over 8,000.

Preparing for the next pandemic is only one challenge the world will have to face in the future of healthcare. The dreaded, almost year-long nightmare much of the world has already experienced in its quest to contain COVID-19 has given us a taste of what difficulties might lie at the forefront of a quest to contain new disease. The challenges of developing vaccines, as well as of distributing them within lower-income countries, pose some threat to what was once dubbed “the most successful form of disease prevention available today.” Experiences with the aggressive Candida auris fungus in 2019 indicate the potential dangers that could lurk in a future plagued by antimicrobial resistance.

However, the future will also feature some of the best advances in healthcare the world has ever seen. Advances in digital technologies will help promote human well-being in new and exciting ways. Genetic modification will be one of many potential solutions currently being tested to end micronutrient deficiency in the developing world. And innovative research in epigenetics might lead to the end of human aging. Though the future of healthcare is sure to have its challenges, the world has never been more prepared to offer creative solutions.

New Disease

The experience the world had with SARS compared to the catastrophe that has unfolded from COVID-19 speaks volumes to the dangers of new disease. The healthcare community still doesn’t know for sure why SARS disappeared from human populations, though the onset of summer as well as an easily isolated infected population might have played a role. With COVID-19, while experts now understand which factors are prolonging the pandemic, it would have been impossible to know what those factors might be in late 2019 when the disease first emerged. The inherent ambiguity surrounding zoonotic illness means that the havoc it can wreak in human populations varies enormously. This variance underscores the need for global governance to prepare extensively for future challenges in healthcare systems across the globe.

States can prepare for the threat of new disease in several key ways. Governments can use mathematical models to simulate how contagious diseases will be when they first emerge, allowing them to allocate funding most appropriately for things like vaccine research and protective equipment. Additionally, governments can fund research into which kinds of diseases are circulating in animals and are most likely to jump into humans. Lastly, pandemic preparedness is certainly an area where experience pays off. Countries like Taiwan and Singapore mounted strong initial responses to COVID-19 in large part because of their substantial experience dealing with SARS. Conversely, a lack of relevant preparedness might be contributing to the weak pandemic response seen in the United States. Though the country was prepared for bio-terrorism threats, with vaccines stockpiled to combat anthrax and smallpox, it was relatively unprepared for dealing with respiratory illness. In the future, states will need to invest a larger share of resources towards pandemic preparedness.
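To give a sense of the kind of mathematical modeling described above, a minimal SIR (susceptible-infected-recovered) simulation might look like the following sketch. The parameter values are hypothetical, chosen only for illustration; real epidemic models used by public health agencies are far more elaborate.

```python
# Minimal discrete-time SIR epidemic model: a hypothetical sketch of how
# modelers estimate the spread of a newly emerged contagious disease.

def simulate_sir(population, beta, gamma, initial_infected=1, days=160):
    """beta: daily transmission rate; gamma: daily recovery rate.
    Returns a list of (susceptible, infected, recovered) per day."""
    s = population - initial_infected
    i = float(initial_infected)
    r = 0.0
    history = []
    for _ in range(days):
        new_infections = beta * s * i / population
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return history

# Hypothetical parameters giving a basic reproduction number R0 = beta/gamma = 2.5.
history = simulate_sir(population=1_000_000, beta=0.5, gamma=0.2)
peak_infected = max(i for _, i, _ in history)
print(f"Peak simultaneous infections: {peak_infected:,.0f}")
```

Even a toy model like this shows why early estimates of transmissibility matter: small changes in `beta` move the epidemic peak dramatically, which in turn changes how much protective equipment and hospital capacity a government should budget for.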

Vaccines

Vaccines are both a key element in fighting emerging diseases and a significant strategy for keeping known pathogens at bay. However, even as researchers have grown significantly more skillful in developing vaccines, certain diseases bring specific challenges. SARS offers an example of a vaccine crisis that thankfully never came to pass because a vaccine was never needed. Though a vaccine study was started for SARS, and inoculation resulted in protective immunity, the vaccine produced an immune disease in the animals from the trial. In other cases, funding issues can halt vaccine development. The Ebola vaccine took over two decades to develop—not because there were challenges in development, but because there were challenges in funding a vaccine that didn’t seem urgent until the 2014 West African outbreak. And the Johnson & Johnson COVID-19 vaccine trial was halted in late-stage development when a patient in the trial developed an “unexplained illness” that may or may not have been related to the shot itself.

The pace of development that has accompanied the COVID-19 vaccines currently in trial has been astonishing. Typically, it takes years to develop a vaccine, and several COVID-19 vaccines are already approved or in final-phase trials. Additionally, the WHO already has a plan for financing COVID-19 vaccine distribution in countries whose governments may not be able to afford supplies on their own. The resulting program, known as COVAX, brings together more than two-thirds of countries worldwide. This solution is much needed, not just in a world that is grappling with ubiquitous respiratory illness, but also in a world that has continually failed to protect developing countries from easily preventable diseases such as measles. A 2017 WHO report showed that though close to a million more children in 68 developing countries achieved access to the basic diphtheria-tetanus-pertussis vaccine from 2015 to 2016, vaccination access remained at 80%. Further, millions of children in war-torn countries such as Syria or South Sudan remained under-vaccinated.

Doctors, researchers, and NGOs are working to make sure vaccine access is more equitable in the future. The WHO’s COVAX program to distribute COVID-19 vaccines across the world is a great example of the kind of global coordination that will be needed to fight future pandemics effectively. Further, on the state level, India’s experience distributing the pneumococcal conjugate vaccine (PCV) as part of its childhood immunization program has demonstrated what future governance in healthcare might look like. As part of its plan to distribute vaccines to children across the country, India developed the Electronic Vaccine Intelligence Network, a program that digitally monitors vaccine temperatures and vaccine stocks along the supply chain. Additionally, India uses vaccine distribution as a gateway into the rest of its healthcare system; doctors make sure children are up-to-date on other vaccinations and check them for malnutrition when they come in for the PCV vaccine.
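The cold-chain monitoring described above can be sketched in a few lines. This is not the Electronic Vaccine Intelligence Network’s actual logic, just a hypothetical illustration of the core check such a system performs: flagging temperature readings that fall outside the standard 2–8°C storage range for most vaccines.

```python
# Hypothetical sketch of cold-chain monitoring: flag any logged temperature
# reading outside the 2-8 degrees C range recommended for most vaccines.

COLD_CHAIN_RANGE_C = (2.0, 8.0)

def find_excursions(readings):
    """readings: list of (timestamp, temperature_c) tuples.
    Returns the readings that fall outside the safe storage range."""
    low, high = COLD_CHAIN_RANGE_C
    return [(t, temp) for t, temp in readings if not (low <= temp <= high)]

# Illustrative sensor log from one refrigerator along the supply chain.
readings = [("08:00", 4.5), ("09:00", 5.1), ("10:00", 9.3), ("11:00", 6.0)]
alerts = find_excursions(readings)
print(alerts)  # the 10:00 reading exceeds the safe range
```

In a real deployment, an excursion like the 10:00 reading would trigger an alert so staff can act before the affected stock is administered.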

Antimicrobial Resistance

If antimicrobials, like vaccines, offered another solution to global health problems, antimicrobial resistance has been the backlash from their overuse. Before the 1928 discovery of penicillin, commonplace bacterial infections like strep throat, pneumonia, and whooping cough could all be fatal. Antibiotics have made it so that these basic bacterial diseases no longer spell a death sentence.

However, today, thanks to frequent antibiotic use, some strains of bacteria can no longer be treated with antibiotics. Bacterial infections like gonorrhea and tuberculosis are becoming harder to treat as a result. In 2019, a vicious fungal infection known as Candida auris tore across the world, showing up in Venezuela, Spain, South Africa, the UK, and the U.S. Of particular note is the fungus’s resistance to common antifungal treatments. If not taken seriously, antifungal and antibiotic resistance could cause serious problems for healthcare systems in the future. One UN report predicts that deaths related to drug-resistant disease could total 10 million a year by 2050.

Nations can mount defenses today against the future threat of antibiotic- and antifungal-resistant diseases. In the past few years, world leaders have demanded that hospitals and farms halt the “gluttonous” overuse of antimicrobial drugs. A 2018 WHO report found that of the 154 countries studied, only 64 had policies in place to limit the use of antibiotics for livestock growth promotion, and only 78 had policies designed to prevent environmental contamination from antibiotics. The CDC estimates that the most important action that can be taken to slow the spread of antimicrobial resistance is changing the way antimicrobials are used, reporting that over half of antimicrobial use in humans is unnecessary. Fortunately, the 2018 WHO report shows some signs of states implementing policies in this direction. According to the report, 105 countries had surveillance programs in place for reporting antimicrobial infections in humans; additionally, 123 countries regulate the sale of antimicrobials.

New Innovations on the Horizon

While the future of healthcare will feature threats like growing antimicrobial resistance and new diseases which pose challenges to vaccination, it will also include benefits to healthcare never seen before. The increased focus on wellness over the last fifty years will grow to encompass new digital tools. Micronutrient deficiency, a chronic problem in developing countries, can be approached with new technologies. And aging, often the ultimate threat to human health, is being challenged by methods unseen in centuries past.

Like many sectors, the wellness sector currently benefits from the sophisticated digital technology of the 21st century. In the future, this digital technology is expected to take the industry even further. In 2018, the Gottlieb Duttweiler Institute collaborated with the Global Wellness Institute to predict five trends pertinent to the future of the wellness industry. The report found that technology and human wellness would likely converge; this convergence can be illustrated with the concept of a “data selfie,” a digital profile combining heart rate, galvanic skin response, calorie use, and other healthcare data used to help healthcare providers better improve individual well-being. Additionally, the report found that smartphones can play an instrumental role in regulating emotions, using data to understand behavior patterns and make suggestions to improve users’ wellness in real time.
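One possible shape for such a “data selfie” is sketched below. The field names and the stress heuristic are entirely illustrative assumptions, not part of the GDI/GWI report; the point is simply that the profile is a structured bundle of physiological signals a provider or app could act on.

```python
# A hypothetical "data selfie": an illustrative data structure for a digital
# wellness profile. Field names and thresholds are invented for this sketch.
from dataclasses import dataclass, field

@dataclass
class DataSelfie:
    resting_heart_rate_bpm: float
    galvanic_skin_response_us: float  # skin conductance in microsiemens
    calories_burned_today: float
    notes: list = field(default_factory=list)

    def flag_elevated_stress(self, gsr_threshold=10.0):
        """Crude illustrative heuristic: unusually high skin conductance
        can accompany stress, so log a suggestion for the user."""
        if self.galvanic_skin_response_us > gsr_threshold:
            self.notes.append("Elevated skin response: consider a break")
        return self.notes

profile = DataSelfie(62.0, 12.5, 1800.0)
print(profile.flag_elevated_stress())
```

A real system would combine many such signals over time rather than acting on a single reading, but the aggregation-then-suggestion loop is the pattern the report describes.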

While technology is creating new innovations in the wellness sphere, in the future it will also be applied to age-old problems in healthcare, such as micronutrient deficiency. Though they are only needed in small amounts, micronutrients (vitamins and minerals) are a critical component of human health. Since the body cannot produce most micronutrients on its own, people must absorb them from their diets.

When people are unable to absorb the small amounts of certain micronutrients they need, the consequences can be large. In 1985, global health authorities agreed to a worldwide campaign for salt iodization to help the billions of people at risk of iodine deficiency, the leading cause of preventable brain damage. Though large salt producers quickly iodized their salt, smaller producers were harder to reach, and as a result, UNICEF estimates that “30% of households in the developing world are not consuming (iodized) salt.”

Early solutions to micronutrient deficiency included the efforts of the Micronutrient Initiative (now Nutrition International), a Canadian organization founded in 1992 that helped small salt producers iodize their salt. The organization helped develop valuable technologies, such as an iodization machine on wheels used to iodize salt in Africa. However, salt iodization can only go so far if the population is also deficient in other valuable micronutrients, such as Vitamin A. For this reason, some organizations have developed micronutrient sprinkles, which can be distributed in sachets and add a variety of nutrients to foods like baby food. Additionally, scientists can biofortify seeds for staple crops to add valuable nutrients to the foods that are cheapest to grow in developing countries.

Lastly, aging is another area of healthcare that future technology will revolutionize. Amid predictions that imagine the 65-plus set as a larger share of the population than ever before, some scientists are trying to take the elderly back in time. In one 2016 study conducted at the Salk Institute for Biological Studies, researchers manipulated genes in mice using an approach that improved pancreatic function and rejuvenated damaged muscles. Other researchers saw this study as additional evidence that aging is driven by epigenetic changes—that is, changes that affect how a trait presents rather than altering the underlying genetic code. Whereas genetic changes alter a gene’s DNA sequence, epigenetic changes alter the way a DNA sequence is read. If aging is driven by epigenetic changes, it might indeed be reversible, as some researchers have suspected.

Researchers at Stanford University made more progress toward this theory last spring when they discovered a method for reversing errors in the epigenome, the set of chemical compounds that regulate the activity of an organism’s genes. Since an accumulation of errors within the epigenome is thought to be a main driver of aging, reversing those errors amounts to a proposed method for reversing aging itself.

However, the real-life prospects for reversing aging lie far in the future. In one early experiment where researchers erased the marks on the epigenome in mouse cells, growth in the cells accelerated after they lost their identity, priming them for cancer. And in the 2016 Salk Institute study, mice had to be genetically engineered so that the experiment wouldn’t result in a loss of cell identity. Solutions for reversing aging through epigenetics are thus still several experiments away.

The COVID-19 pandemic has proven that the future of health might contain challenges for which the world hasn’t yet prepared. However, the future of health will also contain technologies the world hasn’t ever seen. These technological innovations will help people face not only the unknowns, but also the health-related problems that have been plaguing humanity for centuries. The next several years will tell how far the world gets in conquering the healthcare challenges to come.

About Allyson Berri:
Allyson Berri is a Diplomatic Courier Correspondent whose writing focuses on global affairs and economics.
The views presented in this article are the author’s own and do not necessarily represent the views of any other organization.

a global affairs media network

www.diplomaticourier.com

Preparing for the Next Pandemic

December 24, 2020

On a large scale, humanity is constantly struggling against bacteria and disease as well as non-communicable diseases. Today our focus is on primary prevention. In the near future, we will treat age as a disease that not only can be cured but can be prevented.
