
Research News and Media

Study clears important hurdle towards developing an HIV vaccine

By cjb250 from University of Cambridge - vaccine. Published on Sep 13, 2017.

Results from a clinical trial carried out in Thailand, published in 2009, showed that an experimental vaccine against HIV lowered the rate of infection by 31%. This gave cautious optimism that a vaccine against the virus might be a feasible prospect. A vaccine has obvious advantages over treatment with anti-retroviral drugs in that prevention could lead to eradication.

However, one of the major problems that prevented the vaccine from generating long-lasting protection was that the key immune response it needed to generate was very short-lived. The reason has now become clear and researchers have found a potential solution.

When a virus enters the body, its aim is to get into our cells and replicate itself again and again, spreading throughout the body. HIV is especially notorious because a protein on its outer coat specifically targets CD4 T-helper cells, the master regulators of the immune system.  These cells produce important signals for other types of immune cell: B-cells, which make antibodies; and T-killer cells, which kill virus-infected cells.

By specifically targeting CD4 T-helper cells, HIV cripples the command and control centre of the immune system and prevents immune defences from working effectively. HIV does not even need to enter and kill the CD4 T-cells – it can cause a functional paralysis of these cells simply by binding its outer coat protein, gp140, to the CD4 receptor, an important molecule on the surface of T-helper cells.

HIV’s envelope proteins are a key component of vaccines to protect against HIV infection. The body’s immune system targets this protein and generates antibodies directed at HIV’s outer coat to prevent the virus from entering the cells. If the effects of the vaccine last long enough, then with the assistance of robust helper T-cells, the human body should be able to develop antibodies that neutralise a large variety of HIV strains and protect people from infection.

Previous studies showed that vaccinating with a form of the outer coat protein called gp140 triggers B-cells to produce antibodies against the virus, but only for a brief period – too short for the B-cells to generate enough protective antibodies to guard against HIV infection over the long term.

Working with scientists in the UK, France, the USA, and the Netherlands, Professor Jonathan Heeney from the Laboratory of Viral Zoonotics at the University of Cambridge recognised that the binding of gp140 to the CD4 receptor on T-helper cells was probably causing this block, and that by preventing gp140 attaching to the CD4 receptor, the short-term block in antibody producing B-cells could be overcome.

In two back-to-back studies published in the print edition of the Journal of Virology, the research team has demonstrated for the first time that this approach works, producing the desired immune responses, which lasted for more than a year.

“For a vaccine to work, its effects need to be long lasting,” says Professor Heeney. “It isn’t practical to require people to come back every 6-12 months to be vaccinated. We wanted to develop a vaccine to overcome this block and generate these long-lived antibody producing cells. We have now found a way to do this.”

The study showed that the addition of a tiny specific protein patch to the gp140 protein dramatically improved B-cell responses by blocking binding to the CD4 receptor and hence preventing the paralysis of T-helper cells early in the key stages of the immune response – like preventing a key from getting stuck in a lock. This small patch was one of several strategies to improve gp140 for an HIV vaccine by a team led by Susan Barnett (now at the Bill and Melinda Gates Foundation).

This modified vaccine approach now better stimulates long-lasting B-cell responses, boosting the ability of B-cells to recognise different contours of the virus coat and to make better antibodies against it. This new finding will allow HIV vaccines to be developed that give the immune system enough time to develop the essential B-cell responses to make protective antibodies.

“B-cells need time to make highly effective neutralising antibodies, but in previous studies B-cell responses were so short-lived that they disappeared before they had the time to make all the changes necessary to create the ‘silver bullets’ to stop HIV,” adds Professor Heeney.

“What we have found is a way to greatly improve B-cell responses to an HIV vaccine. We hope our discovery will unlock the paralysis in the field of HIV vaccine research and enable us to move forward.”

The team now hopes to secure funding to test their vaccine candidate in humans in the near future.

The studies were funded by the National Institutes of Health, USA, and the Isaac Newton Trust Cambridge.

Reference

Bogers, WMJM, et al. Increased, Durable B-Cell and ADCC Responses Associated with T-Helper Cell Responses to HIV-1 Envelope in Macaques Vaccinated with gp140 Occluded at the CD4 Receptor Binding Site. Journal of Virology; DOI: 10.1128/JVI.00811-17.

Shen, X et al. Cross-Linking of a CD4-Mimetic Miniprotein with HIV-1 Env gp140 Alters Kinetics and Specificities of Antibody Responses against HIV-1 Env in Macaques. Journal of Virology; DOI: 10.1128/JVI.00401-17. 

An international team of researchers has demonstrated a way of overcoming one of the major stumbling blocks that has prevented the development of a vaccine against HIV: the ability to generate immune cells that stay in circulation long enough to respond to and stop virus infection.

For a vaccine to work, its effects need to be long lasting. It isn’t practical to require people to come back every 6-12 months to be vaccinated
Jonathan Heeney
3D print of HIV (edited)

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


The self-defence force awakens

By cjb250 from University of Cambridge - immunology. Published on Jul 04, 2017.

An army of cells constantly patrols within us, attacking anything it recognises as foreign, keeping us safe from invading pathogens. But sometimes things go wrong: the soldiers mistake benign cells for invaders, turning their friendly fire on us and declaring war.

The consequences are diseases like multiple sclerosis (MS), asthma, inflammatory bowel disease, type 1 diabetes and rheumatoid arthritis – diseases that are increasing at an alarming rate in both the developed and developing worlds.

Cambridge will be ramping up the fight against immune-mediated and inflammatory diseases with the opening next year of the Cambridge Institute of Therapeutic Immunology and Infectious Disease, headed by Professor Ken Smith. The Institute will work at the interface between immunity, infection and the microbiome (the microorganisms that live naturally within us). “We’re interested in discovering fundamental mechanisms that can turn the immune system on or off in different contexts, to modify, treat or prevent both inflammatory and infectious diseases,” says Smith.

But while diseases such as Crohn’s and asthma have long been understood to be a consequence of friendly fire, scientists are starting to see this phenomenon give rise to more surprising conditions, particularly in mental health.

In 2009, Professor Belinda Lennox, then at Cambridge and now at Oxford, led a study that showed that 7% of patients with psychoses tested positive for antibodies that attacked a particular receptor in the brain, the NMDA receptor. These antibodies blocked the receptor for a key neurotransmitter, disrupting communication between nerve cells and causing the symptoms.

Professor Alasdair Coles from Cambridge’s Department of Clinical Neurosciences is working with Lennox on a trial to identify patients with this particular antibody and reverse its effects. One of their treatments involves harnessing the immune system – weaponising it, one might say – to attack rogue warriors using rituximab, a monoclonal antibody therapy that kills off B-cells, the cells that generate antibodies.

“You can make monoclonal antibodies for experimental purposes against anything you like within a few days,” explains Coles. “In contrast, to come up with a small molecule – the alternative sort of drug – takes a long, long time.”

The first monoclonal antibody to be made into a drug, created here in Cambridge, is called alemtuzumab. It targets both B- and T-cells and has been used in a variety of autoimmune diseases and cancers. Its biggest use is in MS, where it eliminates the rogue T- and B-cells that attack the protective insulation (myelin sheath) around nerve fibres. Licensed in Europe in 2013 and approved by NICE in 2014, it has now been used in tens of thousands of MS patients.

As well as treating diseases caused by the immune system, antibody therapies are now widely used to treat cancer. And, as Professor Gillian Griffiths, Director of the Cambridge Institute for Medical Research, explains, antibody-producing cells are not the only immune cells that can be weaponised.

“T-cells are also showing great promise,” she says. “They are the body’s serial killers, patrolling, identifying and destroying infected and cancer cells with remarkable precision and efficiency.”

But cancer cells are able to trick T-cells by sending out a ‘don’t kill’ signal. Antibodies that block these signals, which have become known as ‘checkpoint inhibitors’, are proving remarkably successful in cancer therapies. “My lab focuses on what tells a T-cell to kill, and how you make it a really good killer, using imaging and genetic approaches to understand how these cells can be fine-tuned,” Griffiths explains. “This has revealed some novel mechanisms that play key roles in regulating killing.”

A second, more experimental, approach uses souped-up cells known as chimeric antigen receptor (CAR) T-cells programmed to recognise and attack a patient’s tumour.

Neither approach is perfect: antibody therapies can dampen down the entire immune system, causing secondary problems, while CAR T-cell therapies are prohibitively expensive as each CAR T-cell needs to be programmed to suit an individual. But, says Griffiths, “the results to date from both approaches are really rather remarkable”.

One of the problems that’s dogged immunotherapy trials is that T-cells only have a short lifespan. Most of the T-cells transplanted during immunotherapy are gone within three days, nowhere near long enough to defeat the tumour.

This is where Professor Randall Johnson comes in. He’s been working with a molecule (2-hydroxyglutarate), which he says has “become trendy of late”. It’s an ‘oncometabolite’, believed to be responsible for making cells cancerous, which is why pharmaceutical companies are trying to inhibit its action. Johnson has taken the opposite approach.

He’s shown that a slightly different form of the molecule plays a critical role in T-cell function: it can turn them into renewable cells that hang around for a long time and can reactivate to combat cancer. Increasing the levels of this molecule in T-cells makes them stay around longer and be much better at destroying tumours. “Rather than creating killer T-cells that are active from the start, but burn out very quickly, we’re creating an army of cells that can stay quiet for a long time, but will go into action when necessary.”

This counterintuitive approach caught the attention of Apollo Therapeutics, who recognised the enormous promise and has invested in Johnson’s work, which he carried out in mice, to see if it can be applied to humans.

But T-cells face other problems, particularly in pancreatic cancer, explains Professor Duncan Jodrell from the Cancer Research UK Cambridge Institute, which is why immunotherapy against these tumours has so far failed. The problem with pancreatic cancer is that ‘islands’ of tumour cells sit in a ‘sea’ of other material, known as stroma. As Jodrell and colleagues have shown, it’s possible for T-cells to get into the stroma, but they go no further. “You can rev up your T-cells, but they just can’t get at the tumour cells.” They are running a study that tries to overcome this immune privilege and allow the T-cells to get to the tumour cells and attack them.

Tim Eisen, Professor of Medical Oncology at Cambridge and Head of the Oncology Translational Medicine Unit at AstraZeneca, believes we can expect great advances in cancer treatment from optimising and, in some cases, combining existing checkpoint inhibitor approaches.

Eisen is working with the Medical Research Council to trial checkpoint inhibitor antibody therapies as a complement – ‘adjuvant’ – to surgery for kidney cancer. Once the kidney is removed, the drug is used to destroy stray tumour cells that have remained behind. But even antibody therapies, which are now widely used within the NHS, are not universally effective and can cause serious complications. “One of the most important things for us to focus on now is which immunotherapeutic drug or particular combination of drugs might be effective in destroying tumour cells and be well tolerated by the patient.”

T-cell therapies – and, in particular, CAR T-cell therapies – are “very exciting, futuristic and experimental,” he says, “but they’re going to take some years to come in as standard therapy.”

The problem is how to make them cost-effective. “It’s never going to be easier to engineer an individual person’s T-cells than it is to take a drug off the shelf and give it to them,” he says. “The key is going to be whether you can industrialise production. But I’m very optimistic about our ability to re-engineer processes and make it available for people in general.”

We may soon see an era, then, when our immune systems become an unstoppable force for good.

Our immune systems are meant to keep us healthy, but sometimes they turn their fire on us, with devastating results. Immunotherapies can help defend against this ‘friendly fire’ – and even weaponise it in our defence.

T-cells are the body’s serial killers, patrolling, identifying and destroying infected and cancer cells with remarkable precision and efficiency.
Gillian Griffiths
The moment when a T-cell kills

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Infections during pregnancy may interfere with key genes associated with autism and prenatal brain development

By cjb250 from University of Cambridge - immunology. Published on Mar 21, 2017.

In a study published today in the journal Molecular Psychiatry, researchers at the University of Cyprus, University of Cambridge, University of California, San Diego, and Stanford University used rats and mice to help map the complex biological cascade caused by the mother’s immune response, which may lead to important consequences.

Maternal infections during pregnancy are a known risk factor for abnormal fetal development. Most strikingly, this has been seen during the recent emergence of Zika virus, which led to babies being born with an abnormally small head and brain (known as ‘microcephaly’). In the case of Zika, the virus has its impact by directly attacking fetal brain tissue. However, for most other infections, such as influenza, the infectious agent typically has a more indirect impact on fetal development.

Large population-based studies have previously shown that a variety of maternal infections during pregnancy are associated with small increases in the risk for psychiatric disorders, including autism spectrum disorders and schizophrenia. Other studies have shown that this effect is due not to the infectious agents themselves, but simply to the triggering of a strong immune response in the pregnant mother – a phenomenon known as ‘maternal immune activation’.

“It’s important to underscore that the increase in risk is really small – too small to be meaningfully applied to specific individuals, and is only seen in very large studies when examining many thousands of people,” says Dr Michael Lombardo, lead author of the work from the University of Cyprus and the University of Cambridge. “Nevertheless, the biological cascade triggered by this effect is not well understood, particularly in how it may be similar to known biology behind conditions like autism. This was the motivation behind why we did the study.”
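
To put ‘really small’ into absolute terms, here is a purely illustrative back-of-the-envelope calculation – the numbers are hypothetical and are not taken from the study: a modest relative risk applied to a low baseline risk shifts the absolute risk only slightly.

\[
\underbrace{1\%}_{\text{baseline risk (illustrative)}} \times \underbrace{1.3}_{\text{relative risk (illustrative)}} = 1.3\%
\]

That is an absolute increase of roughly 0.3 percentage points – detectable across many thousands of people, but of little predictive value for any one pregnancy.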

To understand how activating a mother’s immune system may affect her child’s brain development, Dr Lombardo and colleagues examined the activity of genes in the fetal brain after injecting pregnant rats and mice with a substance called lipopolysaccharide. This substance contains no infectious agent and thus does not make the mothers sick, but it elicits a strong immune response in the mother, characterised by an increase in levels of cytokines. These are small immune signalling molecules that can have important effects on brain cells and on the connections between these cells (known as ‘synapses’) in the fetus’s brain.

The scientists found that maternal immune activation alters the activity of multiple genes and pathways in the fetus’s brain. Importantly, many of these genes are known to be important in the development of autism and to key brain developmental processes that occur before birth. They believe that these effects may help to explain why maternal immune activation carries a small increased risk for later atypical neurodevelopment.

“The more we understand about how brain development is disrupted by these effects, the higher the chance of finding amenable targets for potential therapeutic intervention or for informing how to prevent such risk from occurring in the first place,” says Dr Tiziano Pramparo, senior author on the work from the University of California, San Diego.

While the effects caused by maternal immune activation are transient, the researchers argue that they may be very potent during fetal development and may lead to different characteristics in the individual depending on when during pregnancy the activation occurs. The work underscores the importance of the idea that genes and the environment interact, and that their interaction may play an important role in understanding how risk for neurodevelopmental disorders manifests.

The research was funded in part by the University of California San Diego Altman Clinical and Translational Research Institute, the National Institutes of Health, the Simons Foundation Autism Research Initiative and the Child Health Research Institute at Stanford University.

Reference
Lombardo, MV et al. Maternal immune activation dysregulation of the fetal brain transcriptome and relevance to the pathophysiology of autism spectrum disorder. Molecular Psychiatry; 21 March 2017; DOI: 10.1038/mp.2017.15

If a mother picks up an infection during pregnancy, her immune system will kick into action to clear the infection – but this self-defence mechanism may also have a small influence on how her child’s brain develops in the womb, in ways that are similar to how the brain develops in autism spectrum disorders. Now, an international team of researchers has shown why this may be the case, in a study using rodents to model infection during pregnancy.

It’s important to underscore that the increase in risk is really small… Nevertheless, the biological cascade triggered by this effect is not well understood, particularly in how it may be similar to known biology behind conditions like autism
Michael Lombardo
Grobidon

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Crohn’s disease risk and prognosis determined by different genes, study finds

By cjb250 from University of Cambridge - immunology. Published on Jan 09, 2017.

Crohn’s disease is one of a number of chronic ‘complex’ diseases for which there is no single gene that causes the disease. In fact, to date around 170 common genetic variants have been identified that each increase the risk of an individual developing the disease. The conventional wisdom has been that there exists a ‘tipping point’: if someone carries enough of these variants, they become very likely to develop the disease – and the more of the variants they carry, the more severe the disease will be.

However, in a study published today in Nature Genetics, a team of researchers led by the University of Cambridge has shown that this is not the case: genetic variants that affect the progression, or prognosis, of a disease operate independently of those that increase the likelihood of developing the disease in the first place.

“Genetic studies have been very successful at identifying genetic risk factors for Crohn’s disease, but have told us virtually nothing about why one person will get only mild disease while someone else might need surgery to treat their condition,” says Dr James Lee from the Department of Medicine at Cambridge. “We do know, though, that family members who have the disease often tend to see it progress in a similar way. This suggested to us that genetics was likely to be involved in prognosis.”

The researchers looked at the genomes – the entire genetic makeup – of more than 2,700 individuals, who were selected because they had experienced either particularly mild or particularly aggressive Crohn’s disease. By comparing these patients’ DNA, the researchers found four genetic variants that influenced the severity of a patient’s condition. Strikingly, none of these variants has been shown to affect the risk of developing the disease.

The team then looked at all the known genetic risk variants for Crohn’s and found that none of these influenced the severity of disease.

“This shows us that the genetic architecture of disease outcome is very different to that of disease risk,” adds Professor Ken Smith, Head of the Department of Medicine. “In other words, the biological pathways driving disease progression may be very different to those that initiate the disease itself. This was quite unexpected. Past work has focussed on discovering genes underlying disease initiation, and our work suggests these may no longer be relevant by the time a patient sees the doctor. We may have to consider directing new therapies to quite different pathways in order to treat established disease.”

One of the genetic variants discovered by the team was in a gene called FOXO3. This gene is involved in modulating the release of the cytokine TNFα – cytokines are proteins released into the blood by immune cells in response to infection or, in the case of conditions such as Crohn’s, to the body erroneously attacking itself. This FOXO3-TNFα pathway is also known to affect the severity of rheumatoid arthritis, another auto-inflammatory disease.

Another of the variants was close to the gene IGFBP1, which is known to play a role in the immune system. This genetic region, too, has previously been linked to rheumatoid arthritis, in a study looking at the presence of a particular antibody in patients – presence of this antibody is associated with more severe disease.

The third genetic variant was in the MHC region, which is responsible for determining how our immune cells respond to invading organisms. This region has been implicated in a number of auto-immune diseases, including Crohn’s, but the genetic variant that alters Crohn’s disease risk is different to the one that affects prognosis. The variant the team identified, which was associated with a milder course of Crohn’s disease, was shown to affect multiple genes in this region, and result in a state that is known to cause weaker immune responses.

The final variant occurred in the gene XACT, about which very little is known; however, in adults this gene appears to be mainly active in cells in the intestine – the organ affected by Crohn’s disease.

“This discovery has shown us a new way of looking at disease and opens up potential new treatment options, which could substantially ease the burden of Crohn’s disease,” says Dr Lee. “What’s more, we have evidence that some of these prognosis genes will be shared with other diseases, and as such this approach could be used to improve treatment in a number of conditions.”

The study has been welcomed by Crohn's and Colitis UK, who helped fund the study. "This is an exciting breakthrough which offers new hope for people who suffer every day from Crohn's and Colitis,” says Dr Wendy Edwards, Research Manager at Crohn’s and Colitis UK. “The research sheds new light on why some people with inflammatory bowel disease experience more severe symptoms than others, which has been little understood until now." 

As well as its implications for Crohn’s and other diseases, the approach taken by the researchers has suggested that there is value in re-examining previous genetic studies. Around a third of the genomes of Crohn’s disease patients analysed in this study had been collected for a previous study in 2007. By dividing the patients into groups categorised by disease severity, the researchers were able to ask new questions – and gain new insights – from the old data.

The research was mainly funded by Wellcome, NIHR Cambridge Biomedical Research Centre, Crohn’s and Colitis UK and the Evelyn Trust.

Reference
Lee, JC, Biasci, D, et al. Genome-wide association study identifies distinct genetic contributions to prognosis and susceptibility in Crohn's disease. Nature Genetics; 9 Jan 2017; DOI: 10.1038/ng.3755

Researchers have identified a series of genetic variants that affect the severity of Crohn’s disease, an inflammatory bowel disease – but surprisingly, none of these variants appear to be related to an individual’s risk of developing the condition in the first place.

Genetic studies have been very successful at identifying genetic risk factors for Crohn’s disease, but have told us virtually nothing about why one person will get only mild disease while someone else might need surgery to treat their condition
James Lee
DNA

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Weight loss condition provides insight into failure of cancer immunotherapies

By cjb250 from University of Cambridge - immunology. Published on Nov 08, 2016.

Cancer immunotherapies involve activating a patient’s immune cells to recognise and destroy cancer cells. They have shown great promise in some cancers, but so far have only been effective in a minority of patients with cancer. The reasons behind these limitations are not clear.

Now, researchers at the Cancer Research UK Cambridge Institute at the University of Cambridge have found evidence that the mechanism behind a weight loss condition that affects patients with cancer could also be making immunotherapies ineffective. The condition, known as cancer cachexia, causes loss of appetite, weight loss and wasting in most patients with cancer towards the end of their lives. However, cachexia often starts to affect patients with certain cancers, such as pancreatic cancer, much earlier in the course of their disease.

In research published today in the journal Cell Metabolism, the scientists have shown in mice that even at the early stages of cancer development, before cachexia is apparent, a protein released by the cancer changes the way the body, in particular the liver, processes its own nutrient stores.

“The consequences of this alteration are revealed at times of reduced food intake, where this messaging protein renders the liver incapable of generating sources of energy that the rest of the body can use,” explains Thomas Flint, an MB/PhD student from the University of Cambridge School of Clinical Medicine and co-first author of the study. “This inability to generate energy sources triggers a second messaging process in the body – a hormonal response – that suppresses the immune cell reaction to cancers, and causes failure of anti-cancer immunotherapies.”

“Cancer immunotherapy might completely transform how we treat cancer in the future – if we can make it work for more patients,” says Dr Tobias Janowitz, Medical Oncologist and Academic Lecturer at the Department of Oncology at the University of Cambridge and co-first author. “Our work suggests that a combination therapy that either involves correction of the metabolic abnormalities, or that targets the resulting hormonal response, may protect the patient’s immune system and help make effective immunotherapy a reality for more patients.”

The next step for the team is to see how this discovery might be translated for the benefit of patients with cancer.

“If the phenomenon that we’ve described helps us to divide patients into likely responders and non-responders to immunotherapy, then we can use those findings in early stage clinical trials to get better information on the use of new immunotherapies,” says Professor Duncan Jodrell, director of the Early Phase Trials Team at the Cambridge Cancer Centre and co-author of the study.

“We need to do much more work in order to transform these results into safe, effective therapies for patients, however,” adds Professor Douglas Fearon, Emeritus Sheila Joan Smith Professor of Immunology at the University of Cambridge and the senior author, who is now also working at Cold Spring Harbor Laboratory and Weill Cornell Medical College. “Even so, the results raise the distinct possibility of future cancer therapies that are designed to target how the patient’s own body responds to cancer, with simultaneous benefit for reducing weight loss and boosting immunotherapy.”

The research was largely funded by Cancer Research UK, the Lustgarten Foundation, the Wellcome Trust and the Rosetrees Trust.

Nell Barrie, senior science information manager at Cancer Research UK, said: "Understanding the complicated biological processes at the heart of cancer is crucial for tackling the disease - and this study sheds light on why many cancer patients suffer from both loss of weight and appetite, and how their immune systems are affected by this process. Although this research is in its early stages, it has the potential to help make a difference on both fronts - helping treat weight loss and also improving treatments that boost the power of the immune system to destroy cancer cells."

Reference
Flint, TR et al. Tumor-Induced IL-6 Reprograms Host Metabolism to Suppress Anti-tumor Immunity. Cell Metabolism; 8 Nov 2016; DOI: 10.1016/j.cmet.2016.10.010

A weight loss condition that affects patients with cancer has provided clues as to why cancer immunotherapy – a new approach to treating cancer by boosting a patient’s immune system – may fail in a substantial number of patients. 

Cancer immunotherapy might completely transform how we treat cancer in the future – if we can make it work for more patients
Tobias Janowitz

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Self-renewable killer cells could be key to making cancer immunotherapy work

By cjb250 from University of Cambridge - immunology. Published on Oct 26, 2016.

In order to protect us from invading viruses and bacteria, and from internal threats such as malignant tumour cells, our immune system employs an army of specialist immune cells. Just as a conventional army will be made up of different types of soldiers, each with a particular role, so each of these immune cells has a particular function.

Among these cells are cytotoxic T-cells – ‘killer T-cells’, whose primary function is to patrol our bodies, programmed to identify and destroy infected or cancerous cells. Scientists are now trying to harness these cells as a way to fight cancer, by growing T-cells programmed to recognise cancer cells in the laboratory in large numbers and then reintroducing them into the body to destroy the tumour – an approach known as adoptive T-cell immunotherapy.

However, this approach has been hindered by the fact that killer T-cells are short-lived – most killer T cells are gone within three days of transfer – so the army may have died out before it has managed to rid the body of the tumour.

Now, an international team led by researchers at the University of Cambridge has identified a way of increasing the life-span of these T-cells, a discovery that could help scientists overcome one of the key hurdles preventing progress in immunotherapy.

In a paper published today in the journal Nature, the researchers have identified a new role for a molecule known as 2-hydroxyglutarate, or 2-HG, which is known to trigger abnormal growth in tumour cells. In fact, the team has shown that a slightly different form of the molecule also plays a normal, but critical, role in T-cell function: it can influence T-cells to reside in a 'memory state’.  This is a state where the cells can renew themselves, persist for a very long period of time, and re-activate to combat infection or cancer.

The researchers found that by increasing the levels of 2-HG in T-cells, they could generate cells that were much more effective at destroying tumours: rather than expiring shortly after reintroduction, these memory-state T-cells persisted for much longer and continued to destroy tumour cells.

“In a sense, this means that rather than creating killer T-cells that are active from the start, but burn out very quickly, we are creating an army of ‘renewable cells’ that can stay quiet for a long time, but will go into action when necessary and fight tumour cells,” says Professor Randall Johnson, Wellcome Trust Principal Research Fellow at the Department of Physiology, Development & Neuroscience, University of Cambridge.

“So, with a fairly trivial treatment of T-cells, we’re able to change a moderate response to tumour growth to a much stronger response, potentially giving people a more permanent immunity to the tumours they are carrying. This could make immunotherapy for cancer much more effective.”

The research was largely funded by the Wellcome Trust.

Reference
Tyrakis, PA et al. The immunometabolite S-2-hydroxyglutarate regulates CD8+ T-lymphocyte fate. Nature; 26 Oct 2016; DOI: 10.1038/nature2016

A small molecule that can turn short-lived ‘killer T-cells’ into long-lived, renewable cells that can last in the body for a longer period of time, activating when necessary to destroy tumour cells, could help make cell-based immunotherapy a realistic prospect to treat cancer.

Rather than creating killer T-cells that are active from the start, but burn out very quickly, we are creating an army of ‘renewable cells’ that can stay quiet for a long time, but will go into action when necessary and fight tumour cells
Randall Johnson
T lymphocyte

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Anti-inflammatory drugs could help treat symptoms of depression, study suggests

By cjb250 from University of Cambridge - immunology. Published on Oct 18, 2016.

Researchers from the Department of Psychiatry at Cambridge led a team that analysed data from 20 clinical trials of anti-cytokine drugs used to treat a range of autoimmune inflammatory diseases. By looking at depressive symptoms recorded alongside the trials’ primary outcomes, the researchers showed in a meta-analysis of seven randomised controlled trials that the drugs had a significant antidepressant effect compared with placebo. Meta-analyses of the other types of clinical trials showed similar results.
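
For readers unfamiliar with the method, a meta-analysis pools the effect estimates from the individual trials, weighting each trial by the precision of its estimate. The sketch below shows the standard inverse-variance (fixed-effect) approach; it is a generic illustration, and the study’s own statistical model may differ (for example, by using random effects).

\[
\hat{\theta}_{\text{pooled}} = \frac{\sum_{i=1}^{k} w_i \, \hat{\theta}_i}{\sum_{i=1}^{k} w_i}, \qquad w_i = \frac{1}{\widehat{\mathrm{Var}}(\hat{\theta}_i)}
\]

Here \(\hat{\theta}_i\) is the estimated antidepressant effect in trial \(i\) (for example, a standardised difference in depressive-symptom scores between drug and placebo), \(k\) is the number of trials pooled (seven, in the case of the randomised controlled trials), and the pooled estimate is then tested against zero to judge significance.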

When we are exposed to an infection, for example influenza or a stomach bug, our immune system fights back to control and remove the infection. During this process, immune cells flood the blood stream with proteins known as cytokines. This process is known as systemic inflammation.

Even when we are healthy, our bodies carry trace levels of these proteins – known as ‘inflammatory markers’ – which rise sharply in response to infection. Previous work from the team found that children with high everyday levels of one of these markers are at greater risk of developing depression and psychosis in adulthood, suggesting a role for the immune system, particularly chronic low-grade systemic inflammation, in mental illness.

Inflammation can also occur as a result of the immune system mistaking healthy cells for infected cells and attacking the body, leading to autoimmune inflammatory diseases such as rheumatoid arthritis, psoriasis and Crohn’s disease. New types of anti-inflammatory drugs called anti-cytokine monoclonal antibodies and cytokine inhibitors have been developed recently, some of which are now routinely used for patients who respond poorly to conventional treatments. Many more are currently undergoing clinical trials to test their efficacy and safety.

The team of researchers carried out a meta-analysis of these clinical trials and found that the drugs led to a reduction in the severity of depressive symptoms, independently of improvements in physical illness. In other words, regardless of whether a drug successfully treated rheumatoid arthritis, for example, it would still help improve a patient’s depressive symptoms. Their results are published today in the journal Molecular Psychiatry.

Dr Golam Khandaker, who led the study, says: “It’s becoming increasingly clear to us that inflammation plays a role in depression, at least for some individuals, and now our review suggests that it may be possible to treat these individuals using some anti-inflammatory drugs. These are not your everyday anti-inflammatory drugs such as ibuprofen, however, but a particular new class of drugs.”

“It’s too early to say whether these anti-cytokine drugs can be used in clinical practice for depression, however,” adds Professor Peter Jones, co-author of the study. “We will need clinical trials to test how effective they are in patients who do not have the chronic conditions for which the drugs have been developed, such as rheumatoid arthritis or Crohn’s disease. On top of this, some existing drugs can have potentially serious side effects, which would need to be addressed.”

Dr Khandaker and colleagues believe that anti-inflammatory drugs may offer hope for patients for whom current antidepressants are ineffective. Although the trials reviewed by the team involve physical illnesses that trigger inflammation – and hence potentially contribute to depression – their previous work found a connection between depression and baseline levels of inflammation in healthy people (when someone does not have an acute infection), which can be caused by a number of factors such as genes and psychological stress.

“About a third of patients who are resistant to antidepressants show evidence of inflammation,” adds Dr Khandaker. “So, anti-inflammatory treatments could be relevant for a large number of people who suffer from depression.

“The current approach of a ‘one-size-fits-all’ medicine to treat depression is problematic. All currently available antidepressants target a particular type of neurotransmitter, but a third of patients do not respond to these drugs. We are now entering the era of ‘personalised medicine’ where we can tailor treatments to individual patients. This approach is starting to show success in treating cancers, and it’s possible that in future we would use anti-inflammatory drugs in psychiatry for certain patients with depression.”

The research was mainly funded by the Wellcome Trust, with further support from the National Institute for Health Research (NIHR) Cambridge Biomedical Research Centre.

Reference
Kappelmann, N et al. Antidepressant activity of anti-cytokine treatment: a systematic review and meta-analysis of clinical trials of chronic inflammatory conditions. Molecular Psychiatry; 18 Oct 2016; DOI: 10.1038/mp.2016.167

Anti-inflammatory drugs similar to those used to treat conditions such as rheumatoid arthritis and psoriasis could in future be used to treat some cases of depression, concludes a review led by the University of Cambridge, which further implicates our immune system in mental health disorders.

It’s becoming increasingly clear to us that inflammation plays a role in depression, at least for some individuals, and now our review suggests that it may be possible to treat these individuals using some anti-inflammatory drugs
Golam Khandaker
Depressing fog

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


New approach to treating type 1 diabetes aims to limit damage caused by our own immune system

By cjb250 from University of Cambridge - immunology. Published on Oct 11, 2016.

Type 1 diabetes is one of the most common chronic diseases in children, and the number affected is rising rapidly each year. About 400,000 people in the UK are affected, 29,000 of them children. In type 1 diabetes, the body’s own immune system mistakes the insulin-producing cells of the pancreas for a threat, attacks them and destroys them. The result is a lack of insulin, which is essential for transporting glucose from the blood into cells. Without insulin, glucose levels in the blood rise, causing both short-term and long-term damage; hence patients have to inject themselves with insulin several times a day to compensate.

In a study published today in the open access journal PLOS Medicine, a team led by researchers from the JDRF/Wellcome Trust Diabetes Inflammation Laboratory at the Cambridge Institute of Medical Research used a drug to regulate the immune system with the aim of preventing a patient’s immune cells attacking their insulin-producing cells in the pancreas.

The drug, aldesleukin (recombinant interleukin-2, or IL-2), is currently used at high doses to treat certain types of kidney tumours and skin cancers. At much lower doses, aldesleukin enhances the ability of immune cells called regulatory T cells (Tregs) to stop the immune system losing control once stimulated, and to prevent it from damaging the body’s own organs (autoimmunity).

Critical to this approach was first to determine the effects of single doses of aldesleukin on Tregs in patients with type 1 diabetes. To achieve this, the team employed a state-of-the-art trial design combined with extensive immune monitoring in 40 participants with type 1 diabetes, and found that single doses increased Tregs by 10–20%. These doses are potentially enough to prevent immune cells from attacking the body, but not so high that they would suppress the body’s natural defences, which are essential for protecting us from infection by invading bacteria or viruses.

The researchers also found that the lack of response in some participants in previous trials may be explained by the daily dosing regimen of aldesleukin used. The current trial results suggest that daily dosing makes Tregs less sensitive to the drug, and the study therefore recommends that, for optimal immune outcomes, the drug should not be administered daily.

“Type 1 diabetes is fatal if left untreated, but the current treatment – multiple daily injections of insulin – is at best inconvenient, at worst painful, particularly for children,” says Dr Frank Waldron-Lynch, who led the trial. “Our goal is to develop a treatment that could see the end to the need for these life-long, daily injections by curtailing the early damage caused by the patient’s own immune system.

“Our work is at an early stage, but it uses a drug that occurs naturally within the body to restore the immune system to health in these patients. Whereas previous approaches have focused on suppressing the immune system, we are looking to fine-tune it. Our next step is to find the optimal, ‘Goldilocks’ treatment regimen – too little and it won’t stop the damage, too much and it could impair our natural defences, but just right and it would enhance the body’s own response.”

The researchers say that any treatment would initially focus on people who are newly-diagnosed with type 1 diabetes, many of whom are still able to produce sufficient insulin to prevent complications from the disease. The treatment could then help prevent further damage and help them to continue to produce a small amount of insulin for a longer period of time.

The research was largely funded by the type 1 diabetes charity JDRF, the Wellcome Trust and the Sir Jules Thorn Charitable Trust, with support from the National Institute for Health Research (NIHR) Cambridge Biomedical Research Centre.

Angela Wipperman, Senior Research Communications Manager at JDRF, said: “Immunotherapy research offers the potential to change the lives of those affected by type 1 diabetes. We eagerly await the next steps from this talented research team.”

Reference
Todd JA, Evangelou M, Cutler AJ, Pekalski ML, Walker NM, Stevens HE, et al. PLOS Medicine; 11 Oct 2016; DOI: 10.1371/journal.pmed.1002139

Researchers at the University of Cambridge have taken the first step towards developing a new form of treatment for type 1 diabetes which, if successful, could mean an end to the regular insulin injections endured by people affected by the disease, many of whom are children.

Our goal is to develop a treatment that could see the end to the need for these life-long, daily injections by curtailing the early damage caused by the patient’s own immune system
Frank Waldron-Lynch
Diabetes (rotated)

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Yoshinori Ohsumi – a deserving winner of the Nobel Prize for physiology or medicine

By cjb250 from University of Cambridge - immunology. Published on Oct 03, 2016.

I am delighted that Yoshinori Ohsumi won this year’s Nobel Prize in physiology or medicine. His pioneering work in yeast led to the discovery of genes and biological processes that are needed for autophagy.

Autophagy (from the Greek for “self-eating”) is the mechanism by which cells break down and recycle cellular content. Without this vital housekeeping role we’d be more prone to cancer, Parkinson’s and other age-related disorders.

Although scientists have been aware of autophagy since the 1960s, it wasn’t until Ohsumi’s experiments with yeast in the 1990s that we began to understand the important role of this biological process.

The autophagy process is remarkably similar across lifeforms. One function that is the same, from yeast to humans, is to protect cells against starvation and related stresses. In these conditions, autophagy allows cells to degrade large molecules into basic building blocks, which are used as energy sources. Ohsumi’s discovery of key yeast autophagy genes was particularly powerful because it helped scientists quickly identify the mammalian genes with similar functions. This, in turn, has provided vital tools for laboratories around the world to study the roles of autophagy in human health and disease.

With the knowledge that various mammalian genes are needed for autophagy, researchers could then remove these genes from cells or animals, including mice, and examine their functions. These types of studies have highlighted the importance of autophagy in processes including infection and immunity, neurodegenerative diseases and cancer.

The importance of Ohsumi’s findings

My laboratory, for example, found that autophagy can break down the proteins responsible for various neurological diseases, including forms of dementia (caused by tau), Parkinson’s disease (alpha-synuclein) and Huntington’s disease (mutant huntingtin). We are pursuing the idea that by increasing the autophagy process we could potentially treat some of these conditions.

A tau protein fragment. molekuul_be/Shutterstock.com

Another important consequence of Ohsumi’s discoveries is that they allowed subsequent studies that aimed to understand the mechanisms by which autophagy proteins actually control this process. Indeed, Ohsumi’s group have also made seminal contributions in this domain.

This Nobel prize highlights some other key characteristics of Ohsumi and his work. One is that his laboratory works on yeast. At the time he made his discoveries in the 1990s, no one would have guessed that they would have such far-reaching implications for human health. Essentially, he was studying autophagy in yeast because he was curious. This basic research yielded the foundation for an entire field, which has grown rapidly in recent years, especially as its relevance for health has become more apparent. This should serve as a reminder to those influencing science strategy that groundbreaking discoveries are often unexpected and that one should not only support science where the endpoint appears to be obviously relevant to health.

Ohsumi has also nurtured outstanding scientists like Noboru Mizushima and Tamotsu Yoshimori, who have been major contributors to the understanding of autophagy in mammals. Perhaps most importantly, he continues to do interesting and fundamental work. This Nobel prize is very well deserved for the man who opened the door to an important field.


David Rubinsztein, Professor of molecular neurogenetics, University of Cambridge

This article was originally published on The Conversation. Read the original article.

Yoshinori Ohsumi is a deserving winner of this year's Nobel Prize in physiology or medicine, whose work shows the value of basic research, writes Professor David Rubinsztein, Deputy Director of the Cambridge Institute for Medical Research on The Conversation website.

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


FAMIN or feast? Newly-discovered mechanism influences how immune cells ‘eat’ invading bacteria

By cjb250 from University of Cambridge - immunology. Published on Aug 01, 2016.

To date, researchers have identified hundreds of genetic variants that increase or decrease the risk of developing diseases from cancer and diabetes to tuberculosis and mental health disorders. However, for the majority of such genes, scientists do not yet know how the variants contribute to disease – indeed, scientists do not even understand how many of the genes function.

One such gene is C13orf31, found on chromosome 13. Scientists have previously shown that variants of the gene that differ by a single nucleotide – one of the A, C, G and T ‘letters’ of DNA – are associated with risk of the infectious disease leprosy, and of the chronic inflammatory conditions Crohn’s disease and a form of childhood arthritis known as systemic juvenile idiopathic arthritis.

In a study published today in the journal Nature Immunology and led by the University of Cambridge, researchers studied how this gene works and have identified a new mechanism that drives energy metabolism in our immune cells. Immune cells help fight infection, but in some cases attack our own bodies, causing inflammatory disease.

Using mice in which the mouse equivalent of the C13orf31 gene had been altered, the team showed that the gene produces a protein that acts as a central regulator of the core metabolic functions in a specialist immune cell known as a macrophage (Greek for ‘big eater’). These cells are so named for their ability to ‘eat’ invading organisms, breaking them down and preventing the infection from spreading. The protein, which the researchers named FAMIN (Fatty Acid Metabolic Immune Nexus), determines how much energy is available to the macrophages.

The researchers used the gene-editing tool CRISPR/Cas9, which acts like biological ‘cut and paste’, to edit a single nucleotide of the risk gene within the mouse genome. Even this tiny change to the genetic makeup had a profound effect, making the mice more susceptible to sepsis (blood poisoning). It showed that FAMIN influences the cell’s ability to perform its normal function, controlling its capacity to kill bacteria and to release molecules known as ‘mediators’ that trigger an inflammatory response – a key part of fighting infection and repairing damage in the body.

Professor Arthur Kaser from the Department of Medicine at the University of Cambridge, who led the research, says: “By taking a disease risk gene whose role was completely unknown and studying its function down to the level of a single nucleotide, we’ve discovered an entirely new and important mechanism that affects our immune system’s ability to carry out its role as the body’s defence mechanism.”

Dr Zaeem Cader, the study’s first author, adds: “Although it’s too early to say how this discovery might influence new treatments, genetics can provide invaluable insights that might help in identifying potential drug targets for so-called precision medicines, tailored to an individual’s genetic make-up.”

The research was largely funded by the European Research Council and the Wellcome Trust, with support from National Institute for Health Research (NIHR) Cambridge Biomedical Research Centre.

Reference
Cader, MZ et al. C13orf31 (FAMIN) is a central regulator of immunometabolic function. Nature Immunology; 1 Aug 2016; DOI: 10.1038/ni.3532

A new mechanism that affects how our immune cells perform – and hence their ability to prevent disease – has been discovered by an international team of researchers led by Cambridge scientists.

By taking a disease risk gene whose role was completely unknown and studying its function down to the level of a single nucleotide, we’ve discovered an entirely new and important mechanism that affects our immune system’s ability to carry out its role as the body’s defence mechanism
Arthur Kaser
Pacman

Creative Commons License
The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Call to arms: how lessons from history could reduce the ‘immunisation gap’

By lw355 from University of Cambridge - vaccine. Published on Apr 25, 2016.

An outbreak of measles in Disneyland sounds like a fairytale gone bad. Yet, in January 2015, states across the USA began reporting measles among individuals who had visited the Disneyland Resort in California the month before. All because a visitor to the resort had unwittingly carried the virus into the ‘Happiest Place On Earth’.

The virus is so contagious that around 90% of those close to ‘patient zero’ who were not already immune would be expected to become infected. Epidemiologists later concluded that “substandard vaccination compliance” was likely to blame for the outbreak. Six months later, the state of California made vaccination mandatory: from July 2016, all children enrolling in school must be fully vaccinated.

Measles and other vaccine-preventable diseases have been on the rise globally in recent years. France, for instance, seemed close to eliminating measles in 2007, but in the following four years, reported a dramatic outbreak of more than 20,000 cases, with 80% of reported cases occurring in unvaccinated people.

These recent events have highlighted the ‘immunisation gap’ – the trend for parents not to have their child vaccinated because of anxiety about unforeseen health consequences. But without a certain threshold of vaccination in a community – so-called herd immunity – the unvaccinated become especially vulnerable.
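
That threshold can be estimated with a standard epidemiological rule of thumb – a textbook approximation rather than a figure from this article, and the R0 range below is a commonly cited estimate for measles, not a number from the Disneyland outbreak.

\[
p_c = 1 - \frac{1}{R_0}, \qquad R_0 \approx 12\text{–}18 \;\Rightarrow\; p_c \approx 92\text{–}94\%
\]

Here \(R_0\) is the average number of people one infectious case would infect in a fully susceptible population, and \(p_c\) is the share of the population that must be immune to block sustained transmission. With a threshold this high, even a modest dip in vaccination coverage leaves room for outbreaks.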

Yet, vaccinations are considered to be one of the greatest public health achievements in history. Perhaps that’s part of the problem, says historian Dr Stephen Mawdsley: “We have largely forgotten what it’s like to face an epidemic sweeping through a population.” Vaccinations, it seems, have become a victim of their own success.

But this isn’t the first time that ‘vaccine hesitancy’ has threatened public health. “During the first half of the 20th century, America faced a terrifying disease – polio,” he adds. “As many as 57,000 new cases were being reported every year in the early 1950s. Not only was this a painful illness, it had grave economic consequences. Thousands of survivors required expensive acute and convalescent care, and many suffered from lasting paralysis.”

Although the polio virus could strike anyone, young children were particularly affected, inspiring the term ‘infantile paralysis’. Despite a vaccine being available, few teenagers and adults sought its protection because they believed they were not sufficiently at risk to warrant paying for the course of three inoculations.

Mawdsley’s research, just published in the Journal of Cultural and Social History, has uncovered how young people themselves became the answer to the problem, in what might be the first, largest and most successful case of teen health activism of its time. This fight against vaccine noncompliance in 1950s America, he suggests, could provide important lessons for the world today.

It was while hunting through the archives of the March of Dimes (MOD) – a fundraising campaign set up by polio survivor President Franklin D. Roosevelt and his law partner Basil O’Connor – that he made the discovery. “Who’d have thought that, after suffering terrible epidemics and fear, Americans would have a very mixed reaction towards polio vaccination? Or that those in the ‘vaccination gap’ would help to fill it.”

A range of social, economic and political factors complicated the delivery of a comprehensive vaccination programme. Teens, in particular, were a demographic group that was difficult to reach. Two years after the vaccine was licensed in 1955, as many as 30% still had no inoculations, and a third of all new cases were in teens. The public health message wasn’t getting through, and new strategies were needed.

Celebrities helped the cause. ‘Presley Receives a City Polio Shot’ proclaimed the New York Times in 1956, as the King of Rock ‘n’ Roll offered his arm for vaccination before appearing on the Ed Sullivan Show. But the real drivers of the message were a group of teenagers gathered together by the MOD-financed National Foundation for Infantile Paralysis (NFIP).

“Growing consumerism and rising purchasing power and recreational time spurred the emergence of an assertive teen culture by the late 1950s,” explains Mawdsley. “Many national organisations began to recognise teens as important consumers with cultural influence. By tapping into this segment of society, the NFIP hoped to inspire a new wave of vaccination driven by peer approval.”

The relationship was reciprocal. For the hundreds of young people brought together by the NFIP from all over the USA for a conference, this was a chance to challenge negative stereotypes about juvenile delinquency, and gain recognition and appreciation through grassroots activism.

Officials and teenagers debated strategies to improve vaccination, as well as how to break down race, ethnicity and gender stereotypes. The underlying ethos was that the vaccination message could penetrate teen culture only if it came from within its ranks. After the conference, the teenagers established county chapters across the country under the motto ‘Teens Against Polio’ (TAP), each chapter recruiting yet more teens to promote vaccination.

Some canvassed door to door or gave talks at schools; others organised car washes and peanut sales, or visited polio wards and rehabilitation centres. “No shots, no dates” was a recurring phrase, and teens were often asked at school dances to prove they were immunised before gaining entry. “By using exclusive dances as a tactic, young volunteers were able to exploit the fear of missing out as a means to increase vaccine uptake among teens,” he says.

“I interviewed some former TAP volunteers, and they said that looking back it was surprising that some of these tactics were so acceptable – it showed the power of teens understanding and connecting with their own demographic.”

The creativity and audacity of teens were acknowledged by adults as cornerstones of the marketing strategy, as one NFIP chapter chairman recalled: “The youngsters did have enterprise and nerve. They went in offices, stores, restaurants, hotels – any place there was a person. They barged in on bank presidents, dentists, janitors, even the jail.”

Although teen health activists could not solve all the challenges facing vaccination, their strategies had a remarkable effect. As teen vaccination increased, fewer cases of polio emerged. By 1960, the annual incidence of polio had decreased by nearly 90% compared with 1950.

Mawdsley believes that public health communication campaigns today could learn from the history of the fight against polio. “Yes, their approaches and language were very much the product of 1950s America, but the lesson here is that a hard-to-influence group can be reached. This could be by tapping into new forms of communication such as social media, or by finding clever ways of promoting vaccination to people who are opposed to it.

“TAP reinvigorated a failing public health campaign by addressing the fears, access restrictions and misinformation about polio. The teen polio crusaders were a Trojan horse in the battle for public support and donations for polio.”

This research was funded by Clare Hall, Cambridge, and Cambridge Infectious Diseases.

Inset images: March of Dimes.

A rise in the number of outbreaks of vaccine-preventable diseases has highlighted the growing trend for parents not to have their child vaccinated. Could the activities of a group of teenagers in 1950s America inspire a fresh look at the effectiveness of pro-vaccine public health information campaigns?

Who’d have thought that, after suffering terrible epidemics and fear, Americans would have a very mixed reaction towards polio vaccination? Or that those in the ‘vaccination gap’ would help to fill it.
Stephen Mawdsley
Elvis Presley receives a polio vaccination

The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


‘Clogged-up’ immune cells help explain smoking risk for TB

By cjb250 from University of Cambridge - immunology. Published on Mar 24, 2016.

TB is an infectious disease caused by Mycobacterium tuberculosis that primarily infects the lungs, but can also infect other organs. It is transmitted from person to person through the air. The disease can cause breathlessness, wasting, and eventual death. While treatments do exist, the drug regimen is one of the longest for any curable disease: a patient will typically need to take medication for six months.

For people exposed to TB, the biggest risk factor for infection is exposure to smoke, including active and passive cigarette smoking and smoke from burning fuels. This risk is even greater than co-infection with HIV. However, until now it has not been clear why smoke should increase this risk.

When TB enters the body, the first line of defence it encounters is a specialist immune cell known as a macrophage (Greek for ‘big eater’). This cell engulfs the bacterium and tries to break it down. In many cases, the macrophage is successful and kills the bacterium, preventing TB infection, but in some cases TB manages not just to avoid destruction, but to use macrophages as ‘taxi cabs’ and get deep into the host, spreading the infection. TB’s next step is to cause infected macrophages to form tightly-organised clusters known as tubercles, or granulomas. Once again here, the macrophages and bacteria fight a battle – if the macrophages lose, the bacteria use their advantage to spread from cell to cell within this structure.

An international team of researchers, led by the University of Cambridge and the University of Washington, Seattle, studied genetic variants that increase susceptibility to TB in zebrafish – a ‘see-through’ animal model for studying the disease – and identified a variant linked to ‘lysosomal deficiency disorders’. The lysosome is a key component of macrophages responsible for destroying bacteria. This particular variant caused a deficiency in an enzyme known as cathepsin, which acts within the lysosome like scissors to ‘chop up’ bacteria; however, this alone would not explain why the macrophages could not destroy the bacteria, as many other enzymes could take cathepsin’s place.

The key, the researchers found, lay in a second property of the macrophage: housekeeping. As well as destroying bacteria, the macrophage also recycles unwanted material from within the body for reuse, and these lysosomal deficiency disorders were preventing this essential operation.

Professor Lalita Ramakrishnan from the Department of Medicine at the University of Cambridge, who led the research, explains: “Macrophages act a bit like vacuum cleaners, hoovering up debris and unwanted material within the body, including the billions of cells that die each day as part of natural turnover. But the defective macrophages are unable to recycle this debris and get clogged up, growing bigger and fatter and less able to move around and clear up other material.

“This can become a problem in TB because once the TB granuloma forms, the host’s best bet is to send in more macrophages at a slow steady pace to help the already infected macrophages.”

Image: Left - normal macrophages (green); Right - dysfunctional macrophages whose lysosomes (red) are clogged with cell debris. Credit: Steven Levitte

“When these distended macrophages can’t move into the TB granuloma,” adds co-author Steven Levitte from the University of Washington, “the infected macrophages that are already in there burst, leaving a ‘soup’ in which the bacteria can grow and spread further, making the infection worse.”

The researchers looked at whether the effect seen in the lysosomal deficiency disorders, where the clogged-up macrophage could no longer perform its work, would also be observed if the lysosome became clogged up with non-biological material. By ‘infecting’ the zebrafish with microscopic plastic beads, they were able to replicate this effect.

“We saw that accumulation of material inside of macrophages by many different means, both genetic and acquired, led to the same result: macrophages that could not respond to infection,” explains co-author Russell Berg.

This discovery then led the team to see whether the same phenomenon occurred in humans. Working with Professor Joe Keane and his colleagues from Trinity College Dublin, the researchers were able to show that the macrophages of smokers were similarly clogged up with smoke particles, helping explain why people exposed to smoke were at a greater risk of TB infection.

“Macrophages are our best shot at getting rid of TB, so if they are slowed down by smoke particles, their ability to fight infection is going to be greatly reduced,” says Professor Keane. “We know that exposure to cigarette smoke or smoke from burning wood and coal, for example, are major risk factors for developing TB, and our finding helps explain why this is the case. The good news is that stopping smoking reduces the risk – it allows the impaired macrophages to die away and be replaced by new, agile cells.”

Image: Smoke-clogged macrophages of cigarette smokers are unable to move to engulf infecting TB bacteria, which may explain why cigarette smokers are more susceptible to tuberculosis. Credit: Kevin Takaki and drawn by Paul Margiotta

The research was supported by the National Institutes of Health, the Wellcome Trust, the National Institute for Health Research (NIHR) Cambridge Biomedical Research Centre (BRC), the Health Research Board of Ireland and The Royal City of Dublin Hospital Trust.

Also contributing to this research were Professor David Tobin from Duke University, Dr Cecilia Moens from the Fred Hutchinson Cancer Research Institute, Drs C.J. Cambier and  J. Cameron from University of Washington, Dr Kevin Takaki from University of Cambridge and Drs Seonadh O’Leary and Mary O’Sullivan from Trinity College Dublin.

Reference
Berg, RD, Levitte, S et al. Lysosomal Disorders Drive Susceptibility to Tuberculosis by Compromising Macrophage Migration. Cell; 24 Mar 2016; DOI: 10.1016/j.cell.2016.02.034

Smoking increases an individual’s risk of developing tuberculosis (TB) – and makes the infection worse – because it causes vital immune cells to become clogged up, slowing their movement and impeding their ability to fight infection, according to new research published in the journal Cell.

Macrophages act a bit like vacuum cleaners, hoovering up debris and unwanted material within the body, including the billions of cells that die each day as part of natural turnover
Lalita Ramakrishnan
Macrophage engulfing Tuberculosis bacteria

The text in this work is licensed under a Creative Commons Attribution 4.0 International License. For image use please see separate credits above.


Minimising ‘false positives’ key to vaccinating against bovine TB

By cjb250 from University of Cambridge - vaccine. Published on Feb 19, 2015.

Cows in a field

Using mathematical modelling, researchers at the University of Cambridge and Animal & Plant Health Agency, Surrey, show that it is the specificity of the test – the proportion of uninfected animals that test negative – rather than the efficacy of a vaccine, that is the dominant factor in determining whether vaccination can provide a protective economic benefit when used to supplement existing controls.

Bovine TB is a major economic disease of livestock worldwide. Despite an intensive, and costly, control programme in the United Kingdom, the disease continues to persist. Vaccination using the human vaccine Mycobacterium bovis bacillus Calmette-Guérin (BCG) offers some protection in cattle, but is currently illegal within the European Union (EU) due to its interference with the tuberculin skin test. This test is the cornerstone of surveillance and eradication strategies and is used to demonstrate progress towards national eradication and as the basis of international trade in cattle.

The current tuberculin skin test has a very high estimated specificity of over 99.97%, meaning that fewer than three in every 10,000 uninfected animals will falsely test positive. The test as carried out in Great Britain is thought to have at best 80% sensitivity – a measure of how many infected animals correctly test positive – missing around one in five bovine TB-infected cattle. It is used to determine whether animals, herds and countries are officially free of bovine TB.
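To make those figures concrete, here is a minimal arithmetic sketch (the numbers of animals below are hypothetical illustrations, not data from the study) converting specificity and sensitivity into expected false positives and missed infections:

```python
# Illustrative arithmetic only: the numbers of animals below are hypothetical,
# not figures from the study.

def expected_false_positives(n_uninfected, specificity):
    """Uninfected animals expected to (falsely) test positive."""
    return n_uninfected * (1.0 - specificity)

def expected_missed_infections(n_infected, sensitivity):
    """Infected animals expected to (falsely) test negative."""
    return n_infected * (1.0 - sensitivity)

# Tuberculin skin test figures quoted in the article.
specificity = 0.9997   # over 99.97% of uninfected animals test negative
sensitivity = 0.80     # at best ~80% of infected animals test positive

print(expected_false_positives(10_000, specificity))  # ~3 false positives per 10,000 uninfected animals
print(expected_missed_infections(100, sensitivity))   # ~20 missed cases per 100 infected animals
```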

Vaccinated animals that test positive have to be treated as infected animals. Under European law, if an animal tests positive, it must be slaughtered. The remaining herd is put under movement restrictions and tested repeatedly using both the skin test and post-mortem examinations until it can be shown to be officially clear of infection. The duration of movement restrictions is important because of the considerable economic burden they place on farms. The cost to the UK government alone – which depends on the number of veterinary visits to farms, the tests carried out and compensation for the slaughter of infected animals – is estimated at up to £0.5 billion over the last ten years.

For vaccination to be feasible economically and useful within the context of European legislation, the benefits of vaccination must be great enough to outweigh any increase in testing. A new generation of diagnostic tests, known as ‘Differentiate Vaccinated from Infected Animals’ (DIVA) tests, opens up the opportunity for the use of BCG within current control programmes.

The EU has recently outlined the requirements for changes in legislation to allow cattle vaccination, and a recent report from the European Food Safety Authority emphasised the importance of demonstrating, in large-scale field trials, that BCG is efficacious and that DIVA tests have a sensitivity comparable to tuberculin testing. However, a key factor overlooked in this report was that the currently viable DIVA tests have a lower specificity than tuberculin testing; this could leave vaccinated herds unable to escape restrictions once a single test-positive animal has been detected, because the more often a herd is tested, the more likely the test is to record a false positive.

In the study published today, the researchers from Cambridge and the Animal & Plant Health Agency used herd-level mathematical models to show that the burden of infection can be reduced in vaccinated herds even when DIVA sensitivity is lower than that of tuberculin skin testing – provided that the individual-level protection is great enough. However, in order to see this benefit of vaccination, the DIVA test will need to achieve a specificity of greater than 99.85% to avoid increasing the duration of breakdowns and the number of animals condemned during them. A data set of BCG-vaccinated and BCG-vaccinated, experimentally M. bovis-infected cattle suggests that this specificity could be achievable alongside a relative DIVA test sensitivity of 73.3%.
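The dominance of specificity becomes clearer when a herd is tested repeatedly during a breakdown, because the chance of at least one false positive compounds with every round of testing. The sketch below is a simple binomial illustration with an assumed herd size and number of test rounds (these are not parameters from the study’s models), comparing the tuberculin test with DIVA tests on either side of the roughly 99.85% threshold:

```python
# Simple illustration of why specificity dominates under repeated whole-herd
# testing. Herd size and number of rounds are assumed for illustration only;
# they are not taken from the study's models, and test results are treated
# as independent.

def prob_at_least_one_false_positive(herd_size, specificity, rounds):
    """Probability that a fully uninfected herd records at least one false
    positive across `rounds` whole-herd tests."""
    p_clear_one_round = specificity ** herd_size
    return 1.0 - p_clear_one_round ** rounds

herd_size, rounds = 150, 4  # hypothetical herd under movement restrictions

for name, spec in [("tuberculin, 99.97%", 0.9997),
                   ("DIVA at 99.85%", 0.9985),
                   ("DIVA at 99.50%", 0.9950)]:
    p = prob_at_least_one_false_positive(herd_size, spec, rounds)
    print(f"{name}: P(at least one false positive) = {p:.2f}")
```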

However, validating a test to such a high specificity will likely prove a challenge. Currently, there is no gold-standard test to diagnose TB in cattle. Cattle that test positive are slaughtered immediately and therefore have rarely developed any physical signs – in fact, only around half of the animals examined post-mortem show physical signs of infection even when they are indeed infected.

Dr Andrew Conlan from the Department of Veterinary Medicine at the University of Cambridge says: “In order for vaccination to be viable, we will need a DIVA test that has extremely high specificity. If the specificity is not good enough, the test will find false positives, leading to restrictions being put in place and a significant financial burden for the farmer.

“But validating a test that has a very high specificity will in itself be an enormous challenge. We would potentially need to vaccinate, test and kill a large number of animals in order to be confident the test is accurate. This would be very expensive.”

The need for a better DIVA test was acknowledged by the Government at the end of last year. In a written statement to the House of Commons noting data from the University of Cambridge and Animal Health and Veterinary Laboratories Agency, the Rt Hon Elizabeth Truss, Secretary of State for Environment, Food and Rural Affairs, said: “An independent report on the design of field trials of cattle vaccine and a test to detect infected cattle among vaccinated cattle (DIVA) shows that before cattle vaccination field trials can be contemplated, we need to develop a better DIVA test.”

The study was funded by the Department for Environment, Food and Rural Affairs (Defra) in the UK.

Reference
Conlan, AJK et al. Potential benefits of cattle vaccination as a supplementary control for bovine tuberculosis. PLOS Comp Biol; 19 Feb 2015

New diagnostic tests are needed to make vaccination against bovine tuberculosis (bovine TB) viable and the number of false positives from these tests must be below 15 out of every 10,000 cattle tested, according to research published today in the journal PLOS Computational Biology.

Validating a test... will in itself be an enormous challenge. We would potentially need to vaccinate, test and kill a large number of animals in order to be confident the test is accurate
Andrew Conlan
Cows

The text in this work is licensed under a Creative Commons Licence. If you use this content on your site please link back to this page. For image rights, please see the credits associated with each individual image.


Staying ahead of the game: Pre-empting flu evolution may make for better vaccines

By cjb250 from University of Cambridge - vaccine. Published on Nov 20, 2014.

Flu vaccine

In a study published today in the journal Science, the researchers in the UK, Vietnam, The Netherlands and Australia, led by the University of Cambridge, describe how an immunological phenomenon they refer to as a ‘back boost’ suggests that it may be better to pre-emptively vaccinate against likely future strains than to use a strain already circulating in the human population.

Influenza is a notoriously difficult virus against which to vaccinate. There are many different strains circulating – both in human and animal populations – and these strains themselves evolve rapidly. Yet manufacturers, who need to produce around 350 million doses ahead of the annual ‘flu season’, must know which strain to put in the vaccine months in advance – during which time the circulating viruses can evolve again.

Scientists at the World Health Organisation (WHO) meet each February to select which strain to use in vaccine development. Due to the complexity of human immune responses, this is decided largely through analysis of immune responses in ferrets to infer which strain best matches those currently circulating. However, vaccination campaigns for the following winter flu season usually start in October, by which time the virus may have evolved such that the effectiveness of the vaccine match is reduced.

“It’s a real challenge: the WHO selects a strain of flu using the best information available but is faced with the possibility that the virus will evolve before the flu season,” explains Dr Judy Fonville, one of the primary authors on the paper and a member of WHO Collaborating Centre for Modelling, Evolution and Control of Emerging Infectious Diseases at the University of Cambridge. “Even if it does, though, it’s worth remembering that the flu vaccine still offers much greater protection than having no jab. We’re looking for ways to make an important vaccine even more effective.”

According to the WHO, seasonal influenza causes between 3 and 5 million cases of severe illness each year worldwide, up to 500,000 deaths, as well as significant economic impact. Vaccination policies vary by country, but vaccination is typically recommended for those at risk of serious complications, such as pregnant women and the elderly. The seasonal flu vaccine has been described as one of the most cost-effective measures of disease prevention, and vaccination therefore has a large health economic benefit. Currently, 350 million people take part in annual vaccination programmes. Yet there is room for improvement.

After gathering an extensive amount of immunological data, the team modelled the antibody response to vaccination and infection using a newly developed computer-based method to create an individual’s ‘antibody landscape’. This represents an individual’s distinct immune profile as a three-dimensional landscape, with mountains in areas of immune memory and valleys in unprotected areas. The technique enables a much greater understanding of how our immune system responds to pathogens such as flu that evolve and re-infect us.
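As a rough sketch of the idea – not the team’s actual method, and using invented coordinates and titres – an antibody landscape can be pictured as antibody levels plotted as heights over a two-dimensional ‘antigenic map’ of strains:

```python
# Toy illustration of the 'antibody landscape' idea: antibody titres treated
# as heights over 2-D antigenic coordinates. All coordinates and titres are
# invented for illustration and do not come from the paper.
import numpy as np

# (x, y) antigenic coordinates of strains an individual has encountered,
# with the log2 antibody titre measured against each one.
strains = {
    "A/1995": ((1.0, 2.0), 9.0),
    "A/2003": ((4.0, 2.5), 7.0),
    "A/2009": ((7.0, 3.5), 4.0),
}

def landscape_height(x, y, strains, bandwidth=2.0):
    """Smooth the measured titres across the map with a Gaussian kernel,
    giving 'mountains' of immunity around previously encountered strains."""
    weights, values = [], []
    for (sx, sy), titre in strains.values():
        weights.append(np.exp(-((x - sx) ** 2 + (y - sy) ** 2) / (2 * bandwidth ** 2)))
        values.append(titre)
    return float(np.dot(weights, values) / (np.sum(weights) + 1e-9))

# Height of the landscape near a newer strain the individual has not yet met:
print(round(landscape_height(9.0, 4.5, strains), 2))  # a 'valley' of weaker immunity
```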



A key finding from the work is that upon infection, a response is seen not just to the infecting influenza strain, but to all the strains that an individual has encountered in the past. It is this broad recall of immunity, which they term the ‘back-boost’, that is the basis for the proposed vaccine improvement.

Dr Sam Wilks, one of the primary authors, explains: “Crucially, when the vaccine strain is updated pre-emptively, we see that it still stimulates better protection against future viruses yet this comes at no cost to the protection generated against currently circulating ones.

“Faced with uncertainty about how and when the flu virus might evolve, it’s better to gamble than to be conservative: if you update early, you still stimulate protection against current strains – much worse is if you update too late. Rather than trying to play ‘catch-up’, it’s better to anticipate and prepare for the likely next step of influenza evolution – and there is no penalty for doing it too soon.”

Professor Derek Smith, also from Cambridge, adds why this may lead to improved vaccines in a relatively short timeframe: “The beauty of this approach is that it would not require any change to the current manufacturing process. From the point that the new strain has been selected through to an individual receiving their shot, the steps will be exactly the same. The only difference would be greater protection for the recipient.”

The team is now combining this research with their other work on predicting the way in which the virus will evolve, and plan to combine these two major pieces of work in prospective clinical trials.

The international collaboration included researchers from: the Erasmus Medical Center, the Netherlands; the Oxford University Clinical Research Unit & Wellcome Trust Major Overseas Programme and the National Institute of Hygiene and Epidemiology, Vietnam; and the WHO Collaborating Centre for Reference and Research on Influenza at the Victorian Infectious Diseases Reference Laboratory in Melbourne. Its principal funders were the Wellcome Trust and the US National Institutes of Health Centers of Excellence for Influenza Research and Surveillance (CEIRS).

Reference
Fonville, JM et al. Antibody landscapes after influenza virus infection or vaccination. Science; 20 Nov 2014

An international team of researchers has shown that it may be possible to improve the effectiveness of the seasonal flu vaccine by ‘pre-empting’ the evolution of the influenza virus.

Faced with uncertainty about how and when the flu virus might evolve, it’s better to gamble than to be conservative
Sam Wilks
Flu Vaccination Grippe (cropped)

The text in this work is licensed under a Creative Commons Licence. If you use this content on your site please link back to this page. For image rights, please see the credits associated with each individual image.


Why live vaccines may be most effective for preventing Salmonella infections

By cjb250 from University of Cambridge - vaccine. Published on Sep 18, 2014.

Salmonella bacteria

The BBSRC-funded researchers used a new technique that they have developed where several populations of bacteria, each of which has been individually tagged with a unique DNA sequence, are administered to the same host (in this case, a mouse). This allows the researchers to track how each bacterial population replicates and spreads between organs or is killed by the immune system. Combined with mathematical modelling, this provides a powerful tool to study infections within the host. The findings are published today in the journal PLOS Pathogens.
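A minimal sketch of the tagging idea follows (the tag names, organs and colony counts are invented for illustration, and the study’s actual analysis combines such counts with mathematical models of infection dynamics): because each inoculated sub-population carries its own DNA tag, counting which tags are recovered from each organ shows how many founding bacteria survived killing and how much the survivors replicated.

```python
# Conceptual sketch of 'barcoded' bacterial populations: each sub-population
# carries a unique DNA tag, and the tags recovered from an organ indicate how
# many founders survived killing there and how much they then replicated.
# All counts below are invented for illustration.
from collections import Counter

# Tag -> number of colonies recovered, per organ (hypothetical data).
recovered = {
    "spleen": Counter({"TAG01": 420, "TAG02": 380, "TAG05": 15}),
    "liver":  Counter({"TAG02": 900, "TAG07": 3}),
}

inoculum_tags = {f"TAG{i:02d}" for i in range(1, 9)}  # eight tagged populations given to the mouse

for organ, counts in recovered.items():
    surviving = set(counts)
    lost = sorted(inoculum_tags - surviving)
    total = sum(counts.values())
    print(f"{organ}: {len(surviving)}/{len(inoculum_tags)} tags recovered, "
          f"{total} colonies; tags eliminated by the immune system: {lost}")
    # Few surviving tags but many colonies suggest a tight bottleneck (most
    # founders killed) followed by extensive replication of the survivors.
```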

“We effectively ‘barcode’ the bacteria so that we can see where in the body they go and how they fare against the immune system,” explains Dr Pietro Mastroeni from the Department of Veterinary Medicine at the University of Cambridge, who led the study. “This has provided us with some important insights into why some vaccines are more effective than others.”

The multidisciplinary research team led by Dr Mastroeni used the new technique to look at the effectiveness of vaccines against infection by the bacterium Salmonella enterica, which causes diseases including typhoid fever, non-typhoidal septicaemia and gastroenteritis in humans and animals world-wide. Current measures to control S. enterica infections are limited and the emergence of multi-drug resistant strains has reduced the usefulness of many antibiotics. Vaccination remains the most feasible means to counteract S. enterica infections.

There are two main classes of vaccine: live attenuated vaccines and non-living vaccines. Live attenuated vaccines use a weakened form of the bacteria or virus to stimulate an immune response – however, there are some concerns that the weakened pathogen may become more virulent when used in patients with compromised immune systems, for example people infected with HIV, malaria or TB. Non-living vaccines, on the other hand, are safer as they usually use inactive bacteria or viruses, or their fragments – but these vaccines are often less effective. Both vaccines work by stimulating the immune system to recognise a particular bacterium or virus and initiate the fight back in the event of future infection.

Using their new technique, Dr Mastroeni and colleagues showed that live Salmonella vaccines enhance the ability of the immune system to prevent the bacteria from replicating and spreading to other organs. They can also prevent the spread of the bacteria into the bloodstream, which causes a condition known as bacteraemia, a major killer of children in Africa.

They also found that the antibody response induced by live vaccines enhances the ability of immune cells known as phagocytes to kill bacteria in the very early stages of infection, but that a further type of immune cell, the T-cell – again stimulated by the live vaccine – is subsequently necessary to control and clear the bacteria from the blood and tissues. The killed vaccine, whilst able to boost the phagocyte response via the production of antibodies, did not stimulate a protective form of T-cell immunity: it was unable to prevent subsequent bacterial growth in infected organs, the development of bacteraemia, or the spread of the bacteria through the body.

Dr Chris Coward, first author on the study, says: "We have used a collaboration between experimental science and mathematical modelling to examine how vaccines help the immune system control infection. We found that, for Salmonella infections, the immune response induced by a killed vaccine initially kills a proportion of the invading bacteria but the surviving bacteria then replicate resulting in disease. The live vaccine appears superior because it induces a response that both kills the bacteria and restrains their growth, leading to elimination of the infection."
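As a toy illustration of that point – a deliberately simplified exponential-growth sketch with made-up rates, not the study’s mathematical model – killing part of the inoculum without restraining growth only delays the rebound, whereas killing combined with restrained growth drives the bacteria towards elimination:

```python
# Deliberately simple toy model of the point above: initial killing alone only
# delays bacterial regrowth, whereas killing combined with restrained growth
# leads towards elimination. All rates are invented for illustration and are
# not taken from the study's models.

def bacteria_over_time(n0, kill_fraction, net_growth_per_day, days=10):
    n = n0 * (1.0 - kill_fraction)        # early killing by antibody-boosted phagocytes
    counts = [n]
    for _ in range(days):
        n *= (1.0 + net_growth_per_day)   # net effect of replication minus ongoing killing
        counts.append(n)
    return counts

killed_vaccine = bacteria_over_time(1000, kill_fraction=0.9, net_growth_per_day=0.8)
live_vaccine   = bacteria_over_time(1000, kill_fraction=0.9, net_growth_per_day=-0.4)

print(f"Day 10, killed-vaccine scenario: {killed_vaccine[-1]:,.0f} bacteria (rebound)")
print(f"Day 10, live-vaccine scenario:   {live_vaccine[-1]:.1f} bacteria (heading to clearance)")
```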

Dr Mastroeni adds: “There is a big push towards the use of non-living vaccines, which are safer, particularly in people with compromised immune systems – and many of the infections such as Salmonella are more prevalent and dangerous in countries blighted by diseases such as HIV, malaria and TB. But our research shows that non-living vaccines against Salmonella may be of limited use only and are not as effective as live vaccines. Therefore more efforts are needed to improve the formulation and delivery of non-living vaccines if these are to be broadly and effectively used to combat systemic bacterial infections. We have used Salmonella infections as a model, but our research approaches can be extended to many pathogens of humans and domestic animals.”

The research was carried out by Dr Mastroeni, Dr Coward and colleagues Dr Andrew Grant, Dr Oliver Restif, Dr Richard Dybowski and Professor Duncan Maskell. It was funded by the Biotechnology and Biological Sciences Research Council, which has recently awarded Dr Mastroeni funding to extend this research to the study of how antibiotics work. The new research aims to optimise treatments and reduce the emergence of antibiotic resistance.

Professor Melanie Welham, BBSRC’s Science Director, said: "To protect our health and the health of animals we rely on, such as livestock, effective vaccines are needed against disease. This new technique provides unique insights that will help us compare vaccines produced in different ways to ensure the best disease prevention strategies."

Reference
Coward, C et al. The Effects of Vaccination and Immunity on Bacterial Infection Dynamics In Vivo. PLOS Pathogens; 18 Sept 2014

Vaccines against Salmonella that use a live, but weakened, form of the bacteria are more effective than those that use only dead fragments because of the particular way in which they stimulate the immune system, according to research from the University of Cambridge published today.

We effectively ‘barcode’ the bacteria so that we can see where in the body they go and how they fare against the immune system
Pietro Mastroeni
Salmonella bacteria invade an immune cell

The text in this work is licensed under a Creative Commons Licence. If you use this content on your site please link back to this page. For image rights, please see the credits associated with each individual image.


Ebola vaccine success highlights dilemma of testing on captive chimps to save wild apes

By fpjl2 from University of Cambridge - vaccine. Published on May 26, 2014.

The first conservation-specific vaccine trial on captive chimpanzees has proved that a vaccine against Ebola virus is both safe and capable of producing a robust immune response in these animals.

This unprecedented study, published in the journal PNAS, shows that ‘orphan’ vaccines - which never complete the expensive licensing process for human use - can be co-opted for use on wildlife and might be a godsend for highly endangered species such as gorillas and chimpanzees, say researchers.  

They suggest that, by ending captive research in an effort to pay back an “ethical debt” to captive chimpanzees, the US Government is poised to “renege on an even larger debt to wild chimpanzees” at risk from viruses transmitted by tourists and researchers – as safety testing on captive chimpanzees is required before vaccines can be used in the wild.

“The ape conservation community has long been non-interventionist, taking a ‘Garden of Eden’ approach to modern medicine for wild animals, but we ended Eden by destroying habitats and spreading disease,” said Dr Peter Walsh, the senior author on the study from the Division of Biological Anthropology, University of Cambridge, who conducted the trial at the New Iberia Research Centre in the US with researchers from the Centre, as well as the US Army, the University of Louisiana and the conservation charity Apes Incorporated (ApesInc.org).

“Half of deaths among chimps and gorillas that live in proximity to humans are from our respiratory viruses. For us it’s a sore throat - for them it’s death.”

“We need to be pragmatic about saving these animals now before they are wiped out forever, and vaccination could be a turning point. But park managers are adamant - and rightly so at this stage - that all vaccines are tested on captive apes before deployment in the wild. This means access to captive chimpanzees for vaccine trials.”

Infectious diseases pose extinction-level threats to African ape species on a par with poaching and habitat loss, say researchers, with populations continuing to be devastated by malaria, anthrax and “spillover” respiratory viruses - as well as massive Ebola outbreaks which had killed roughly one third of the world gorilla population by 2007.

They believe ‘orphan’ vaccines could be critical weapons in the fight for wild ape survival. But the ability to test new vaccines relies on research access to captive chimpanzees, and the study’s authors argue that it is vital to retain captive chimpanzees for vaccine trials, not for human use, but to help save key species of wild apes from extinction.

The US Fish and Wildlife Service is now considering regulations that would end all biomedical testing on captive chimpanzees over the next few years - the US being the only developed country to allow such research. The study’s authors believe that the US should establish a “humanely housed” captive chimpanzee population dedicated solely to conservation research.

The researchers gave captive chimpanzees a new ‘virus-like particle’ (VLP) vaccine being developed by the biotech company Integrated Biotherapeutics for use in humans. While they did not challenge the vaccinated animals directly with Ebola, the researchers tested whether antibodies harvested from the chimpanzees could protect mice against the deadly virus. They also monitored the chimpanzees in case the vaccine produced health complications.

Results showed that the vaccine is safe in chimpanzees. The vaccinated chimpanzees developed ‘robust immune responses’, with virus-specific antibodies detected as early as 2 to 4 weeks after the first vaccination in some animals and within 2 weeks of the second vaccination in all animals.

The authors note that these VLP vaccines currently require multiple administrations to reach “full potency”, but could prove the difference between survival and extinction for species that are highly endangered or immunologically fragile but also easy to vaccinate.

“There is a large pool of experimental vaccines that show excellent safety and immunity profiles in primate trials but are never licensed for human use,” said Walsh.

“We’ve demonstrated that it’s feasible for very modestly funded ape conservationists to adapt these orphan vaccines into conservation tools, but the ability to trial vaccines on captive chimps is vital. Ours is the first conservation-related vaccine trial on captive chimpanzees – and it may be the last.

“Although Congress specifically instructed the National Institutes of Health (NIH) to consider the conservation value of captive chimpanzee research, no findings on its possible impact were presented. If the biomedical laboratories that have the facilities and inclination to conduct controlled vaccine trials ‘liquidate’ their chimpanzee populations, there will be nowhere left to do conservation-related trials.”

Study illustrates “high conservation potential” of vaccines for endangered wild primates devastated by viral disease, but highlights need for access to captive chimpanzees so vaccines can be trialled before being administered in the wild.

If biomedical laboratories ‘liquidate’ their chimpanzee populations, there will be nowhere left to do conservation-related trials
Peter Walsh
Common chimpanzee

The text in this work is licensed under a Creative Commons Licence. If you use this content on your site please link back to this page. For image rights, please see the credits associated with each individual image.


Research reveals details of how flu evolves to escape immunity

By sj387 from University of Cambridge - vaccine. Published on Nov 21, 2013.

Scientists have identified a potential way to improve future flu vaccines after discovering that seasonal flu typically escapes immunity from vaccines with as little as a single amino acid substitution. Additionally, they found these single amino acid changes occur at only seven places on its surface – not the 130 places previously believed. The research was published today, 21 November, in the journal Science.

“This work is a major step forward in our understanding of the evolution of flu viruses, and could possibly enable us to predict that evolution. If we can do that, then we can make flu vaccines that would be even more effective than the current vaccine,” said Professor Derek Smith from the University of Cambridge, one of the two leaders of the research, together with Professor Ron Fouchier from Erasmus Medical Center in The Netherlands.

The flu vaccine works by exposing the body to parts of inactivated viruses from the three major types of flu that infect humans, prompting the immune system to develop antibodies against these viruses. When exposed to the actual flu, these antibodies can eliminate the flu virus.

However, every two or three years the outer coat of seasonal flu (made up of amino acids) evolves, preventing antibodies that would fight the older strains of flu from recognising the new strain. As a result, the new strain of virus escapes the immunity that has been acquired as a result of earlier infections or vaccinations. Because the flu virus is constantly evolving in this way, the World Health Organisation meets twice a year to determine whether the strains of flu included in the vaccine should be changed.

For this study, the researchers created viruses which had a variety of amino acid substitutions as well as different combinations of amino acid substitutions. They then tested these viruses to see which substitutions and combinations of substitutions caused new strains to develop.

They found that seasonal flu escapes immunity and develops into new strains typically by just a single amino acid substitution. Until now, it was widely believed that in order for seasonal flu to escape the immunity individuals acquire from previous infections or vaccinations, it would take at least four amino acid substitutions.

They also found that such single amino acid changes occurred at only seven places on its surface – all located near the receptor binding site (the area where the flu virus binds to and infects host cells). The location is significant: the virus would not change so close to this site unless it had to, because the area is one it needs to conserve.
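For illustration only – the sequences, position numbers and ‘key positions’ below are invented, not those identified in the study – this is the kind of comparison the finding enables: count the amino acid differences between two strains and ask whether they fall within a small set of positions near the receptor binding site.

```python
# Illustrative only: toy sequences and position numbers, not those reported in
# the study. Compares two haemagglutinin-like sequences and reports whether
# the amino acid differences fall at a small set of 'key' positions near the
# receptor binding site.

old_strain = "MKTIVLLLAISNAYA"
new_strain = "MKTIVLLLTISNAYA"           # a single substitution at position 9
key_positions = {9, 11, 14}              # hypothetical cluster near the binding site

differences = [
    (i + 1, a, b)                        # 1-based position, old residue, new residue
    for i, (a, b) in enumerate(zip(old_strain, new_strain))
    if a != b
]

print(f"{len(differences)} substitution(s): {differences}")
print("All at key positions?", all(pos in key_positions for pos, _, _ in differences))
```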

“The virus needs to conserve this, its binding site, as it uses this site to recognize the cells that it infects in our throats,” said Bjorn Koel, from Erasmus Medical Center in The Netherlands and lead author of the paper.

Seasonal flu is responsible for half a million deaths and many more hospitalizations and severe illnesses worldwide every year.

Study shows that seasonal flu escapes immunity with single amino acid substitutions.

This work is a major step forward in our understanding of the evolution of flu viruses, and could possibly enable us to predict that evolution
Professor Derek Smith

This work is licensed under a Creative Commons Licence. If you use this content on your site please link back to this page.


Clearing the BAR to oral vaccines

By lw355 from University of Cambridge - vaccine. Published on Jun 10, 2013.

From the mouth to the small intestine, the digestive system presents a series of challenges designed to protect us by killing ingested bacteria. If a microbe survives the digestive enzymes in saliva and the corrosive acid of the stomach, the toxic fat-emulsifying bile acids in the small intestine will probably kill it. As a first line of defence against disease and infection, the digestive system is an extremely efficient bactericide.

However, not all bacteria are pathogenic invaders intent on wreaking havoc. For ‘friendly’ bacteria – such as those used in oral vaccines or as probiotics – keeping them alive long enough to exert their benefits poses a significant challenge to biotechnologists.

Now, a new technology that can safely deliver friendly bacteria to the gut is under development by an academic–industry collaboration as an oral vaccine, and Phase I clinical trials are planned. Developed by Alexander Edwards, Krishnaa Mahbubani and Professor Nigel Slater in the University of Cambridge Department of Chemical Engineering and Biotechnology, the technology has been licensed by biotechnology company Prokarium through Cambridge Enterprise Ltd, the University’s commercialisation arm.

The oral vaccine is based on a live, attenuated strain of Salmonella enterica serovar Typhi – the pathogen responsible for typhoid fever – which has been engineered to carry proteins from the bacterium that causes traveller’s diarrhoea. When the body makes a strong protective immune response to Salmonella, it does so also to its hitchhiker, making it a powerful vaccine delivery platform for this and potentially any other disease-causing pathogen.

Salmonella is better able to survive the digestive system compared with other microbes and stimulates a strong immune response. This approach also reduces the cost and time of vaccine production, compared with the traditional methods of purifying vaccine proteins from cultured cells.

Mahbubani and Slater particularly wanted to create a vaccine that did not require injection. “Oral vaccines are part of a new generation of needle-free vaccination strategies,” explained Mahbubani. “These strategies are especially suited for use in developing countries, where needle-based vaccination can pose logistical challenges due to the lack of a cold supply chain, hindering the roll out of vaccination programmes.”

Formulating the vaccine for ease of distribution and administration required the production of dried bacteria. However, simply administering dried microbes isn’t the answer. “Protection from saliva can be achieved by swallowing the dried bacteria in the form of a pill or capsule, and the digestive effects of the stomach can be protected against by using an enteric coating that dissolves once the capsule has moved out of the stomach into the more-alkaline small intestine,” said Mahbubani. “In the assault course of the digestive system, the finish line for oral vaccines is the small intestine, where they must survive the detrimental effects of bile. After drying, bacteria lose their natural tolerance to bile. We needed to find a way of stabilising the bacterium in a dried form so that it could be brought back to life before the bile destroys it.”

Once rehydrated, and after the bacterium has reached the lining of the small intestine, it is intercepted by the immune system, eliciting a strong response to the multiplying pathogen. The next time the immune system encounters the same material, usually in the form of the disease-causing pathogen itself, it can react quickly to clear the invader.

The answer to overcoming the encounter with bile came when Edwards made a surprising discovery, as Slater explained: “Drying did not affect the bacteria permanently. On rehydration, they regain their natural protection to bile.

“When we started the project, this wasn’t known. But the finding opened a door to how we could create an oral vaccine that could survive in the digestive system and didn’t require cold storage. We realised that we needed a technology that would allow the bacteria to rehydrate before the bile reaches it.” The solution lay in a novel adaptation of a material called bile-acid adsorbing resins (BARs). Developed in the 1960s to lower cholesterol levels, BARs such as cholestyramine have a long track record of safe oral administration to patients.

The scientists reasoned that if the capsule contained dried bacteria mixed with BAR then, when the enteric coating dissolves and water and bile enter freely, the movement of bile would be held back by the resin long enough for water to rehydrate the bacteria before the capsule finally breaks open. When she tested the theory, Mahbubani found that this adsorption concept works, even at progressively smaller capsule sizes.

With funding from the Technology Strategy Board (TSB) and the Biotechnology and Biological Sciences Research Council (BBSRC), the Cambridge scientists have been working with BioPharma Technology Ltd, Microbial Developments Ltd, Cobra Biologics Ltd, and now Prokarium Ltd, as well as Professor Simon Cutting at Royal Holloway College.

Now, as plans are put together for a Phase I clinical trial, work is ongoing to define the precise formulation of bile-adsorbing materials and dried bacterial vaccine, as well as to design the capsule that goes into the trial.

“It’s been very important during the development process that we’ve had the support of the TSB and BBSRC to progress the invention to the stage we’ve now reached,” explained Dr Rocky Cranenburgh, Prokarium’s Chief Scientific Officer. “The combination of BAR technology with the Salmonella vector will allow us to develop an advanced oral delivery platform that gives us the potential to revolutionise vaccinations.

“We are focusing on the development of a dual oral vaccine against typhoid and enterotoxigenic Escherichia coli (ETEC) – a major cause of diarrhoea – for travellers and developing country markets. There are 22 million cases of typhoid every year resulting in 200,000 deaths, so an effective oral vaccine could have a significant impact. Currently there is no dedicated vaccine against ETEC, considered responsible for 300,000–500,000 deaths per year, mostly of young children.”

“This is a great example of the University working with industry, interpreting needs to create a viable product using real science,” added Slater. “We think this formulation has the potential to be distributed to the four corners of the earth irrespective of supply chain considerations.”

A new technology under development by an academic–industry partnership protects oral vaccines from destruction by the digestive system.

This is a great example of the University working with industry
Nigel Slater
As intestinal fluid hydrates the capsule, both bile (brown) and water (blue) enter; bile-acid adsorbing resins (BARs, white) hold back the progress of the bile long enough for the water to rehydrate the bacteria (green)

This work is licensed under a Creative Commons Licence. If you use this content on your site please link back to this page.


Harnessing the power of research to benefit developing countries

By gm349 from University of Cambridge - vaccine. Published on Apr 25, 2013.

Ghana

On Thursday 2 May, the CEO of the GAVI Alliance, Dr Seth Berkley, will discuss how to harness the power of research to expedite the development of vaccines appropriate for developing countries and improve access to them.

Dr Berkley’s talk will set out how the GAVI Alliance’s public-private partnership model brings together donors, developing countries, industry, civil society and academia to solve the challenges of reaching every child with vaccines no matter where they are born.

GAVI leverages expertise across a variety of sectors, including innovative financing for development, supply chain management, the development of mobile phone platforms for the collection of epidemiological data, mathematical modelling of infectious disease and health economics and policy.

Prior to joining GAVI in 2011, Dr Berkley was the founder, president and CEO of the International AIDS Vaccine Initiative (IAVI) for over a decade. His talk, ‘Harnessing the power of science research and the public and private sector: a 21st century model for international development’, is the Wellcome Trust-Cambridge Centre for Global Health Research’s inaugural lecture.

Dr Berkley’s talk will be followed with a presentation by the world-leading flu expert, Professor Derek Smith, Director of the WHO Collaborating Centre for Modelling, Evolution and Control of Emerging Infectious Diseases at the University of Cambridge. There will be an opportunity for questions and answers after the talks. 

The evening begins at 5.30pm at the Howard Lecture Theatre, Downing College, Cambridge. If you would like to attend, please RSVP: http://wt-cghr-cambridge-gavi-lecture.eventbrite.com/

Professor David Dunne, Director of the Wellcome Trust-Cambridge Centre for Global Health Research and host of the lecture, said: “By partnering with globally important organisations such as the GAVI Alliance, Cambridge’s multi-disciplinary research and technology communities can have a more profound effect on international development, public health, and the lives of people in the developing world.”

“As an innovative public-private partnership, the GAVI Alliance works to harness the expertise and experience from a range of sectors to help us to improve access to lifesaving vaccines for children in developing countries,” said Dr Seth Berkley, CEO of the GAVI Alliance. “Our partners range from WHO and UNICEF to donors – including the UK government – implementing countries, vaccine manufacturers, civil society organisations, and academia. 

“We have made great progress in the past decade, but the stark reality is that 22 million children born every year around the world don’t receive the immunisation they need against potentially fatal childhood illnesses.  Supply chain management, improving the quality of vaccine coverage data and developing vaccines that remain highly effective outside of cold storage systems are just some of the challenges which, if they can be overcome, would have a huge positive impact on GAVI’s ability to reach more children.

“Cambridge University has an outstanding reputation for academic research, coupled with its commitment to Africa, which makes it an ideal forum to set out the challenges and opportunities in improving access to immunisation in developing countries.”   

The GAVI Alliance is a public-private partnership which aims to immunise a quarter of a billion additional children in the developing world with life-saving vaccines by 2015. With GAVI support, countries are now introducing new vaccines against the primary causes of two of the biggest childhood killers in the world: pneumonia and severe diarrhoea. Together these diseases account for 30% of child deaths in low-income countries. It was established in 2000 by the Bill and Melinda Gates Foundation, the UK government and others to improve access to immunisation.

The Wellcome Trust-Cambridge Centre for Global Health Research status was awarded to the University of Cambridge in February of this year. The Centre plans to capture and capitalise on the extensive basic biomedical and health-related research capacity across many departments and research institutes in Cambridge. It will make this fully available for research capacity building and knowledge exchange partnerships with African universities and institutes, as a means of improving the health and welfare of those in low- and middle-income countries.

CEO of GAVI Alliance to give Wellcome Trust-Cambridge Centre for Global Health Research inaugural lecture

We have made great progress in the past decade, but the stark reality is that 22 million children born every year around the world don’t receive the immunisation they need.
Dr Seth Berkley, CEO of the GAVI Alliance
Vaccinations in Ghana

This work is licensed under a Creative Commons Licence. If you use this content on your site please link back to this page.


New study shows how Salmonella colonises the gut

By ns480 from University of Cambridge - vaccine. Published on Apr 19, 2013.

Salmonella is a major cause of human diarrhoeal infections and is frequently acquired from chickens, pigs and cattle, or their products. Around 94 million such infections occur in people worldwide each year, with approximately 50,000 cases in the UK per annum.

In a BBSRC-funded collaboration between the University of Cambridge’s Department of Veterinary Medicine, the University of Edinburgh’s Roslin Institute and the Wellcome Trust Sanger Institute, scientists have studied how Salmonella colonises the intestines of food-producing animals. This is relevant both to the welfare of the animal hosts and to contamination of the food chain and farm environment.

To unravel how Salmonella persists in farm animals, the scientists studied the role of thousands of its genes. Using a novel technique based on high-throughput DNA sequencing, the team screened 10,000 mutants of Salmonella for their ability to colonise the guts of chickens, pigs and cattle, with up to 475 mutants assessed in a single animal. In the process, they assigned roles in infection to over 2,700 Salmonella genes in each of the farm animal hosts. This has yielded roles for over half the genetic instructions of the bacterium and is by far the most comprehensive survey of any pathogen in its natural hosts to date.
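A minimal sketch of how such a pooled screen can be read out is shown below (the mutant names, read counts and scoring threshold are invented for illustration, and the study’s actual analysis pipeline will differ): each mutant’s abundance in the inoculum is compared with its abundance recovered from the animal, and mutants that drop out point to genes needed for colonisation.

```python
# Minimal sketch of reading out a pooled mutant screen: compare each tagged
# mutant's sequencing read count in the input pool with its count recovered
# from the host. Mutant names, counts and the threshold are invented for
# illustration; the study's actual analysis will differ.
import math

input_pool  = {"mutA": 5200, "mutB": 4800, "mutC": 5100, "mutD": 4900}
output_pool = {"mutA": 6100, "mutB": 35,   "mutC": 5600, "mutD": 12}  # recovered after gut colonisation

def log2_fitness(mutant):
    """log2 ratio of output to input frequency; strongly negative scores
    suggest the disrupted gene is needed for colonisation."""
    in_freq = input_pool[mutant] / sum(input_pool.values())
    out_freq = (output_pool[mutant] + 1) / (sum(output_pool.values()) + len(output_pool))
    return math.log2(out_freq / in_freq)

for mutant in input_pool:
    score = log2_fitness(mutant)
    verdict = "candidate colonisation gene" if score < -2 else "tolerated"
    print(f"{mutant}: log2 fitness = {score:+.2f}  ({verdict})")
```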

Professor Duncan Maskell at the University of Cambridge said, “We found that hundreds of genes are important for colonisation; this provides vital new data for the design of strategies to control Salmonella in animals and reduce transmission to humans. Our data indicate that Salmonella contains a core set of genes that is important when it infects all three hosts, but that there are smaller sets of genes that are required for infection of each individual host species.”

Professor Mark Stevens at The Roslin Institute added, “We are always trying to develop new ways of reducing the number of animals used in experiments. The methods we applied allowed us to survey the fate of hundreds of bacterial mutants simultaneously in one animal, rather than us having to test them one-by-one. This represents a significant advance in the study of microbial diseases, and can be applied to other pathogens and host animals.”

The team now plans to use the data it has collected to design vaccines or treatments to reduce the burden of salmonellosis in animals and humans.

Researchers plan to use data collected to develop vaccines to control Salmonella in animals and humans

Needle

This work is licensed under a Creative Commons Licence. If you use this content on your site please link back to this page.
