Cambridge Medical Communication becomes part of Prime Global
Prime Global, the international medical communications and market access group, today announced the acquisition of Cambridge Medical Communication, an independent medical communications agency based near Cambridge, UK. Cambridge Medical Communication will join Prime Global’s existing group of six agencies and two consultancies to deliver brilliant medical communications, focusing on scientific excellence coupled with innovative execution.
“We are very pleased to welcome Cambridge Medical Communication to Prime Global; I am excited to be working with Jenny and her team!” says Graeme Peterson PhD, Chairman and CEO of Prime Global. “Not only is this the next step in our ongoing and successful growth plan, but acquiring Cambridge Medical Communication also strengthens our scientific excellence, allowing us to establish a new office near the city of Cambridge, one of the largest centres for scientific research in Europe, and to expand our European client base.”
Founded in 1998 by its Managing Director, Jenny Muiry PhD, Cambridge Medical Communication has built up a formidable reputation for providing clear and imaginative medical communication materials. Jenny says, “Becoming part of Prime Global will provide our staff and our clients with additional resources, including access to specialised in-house teams, such as Creative, Digital, and Strategy. With our shared commitment to deliver excellence and maintain long-lasting relationships with our clients, I look forward to working alongside Graeme and the rest of the Prime Global team.”
Prime Global has experienced exceptional organic growth in recent years, including the recent launch of Prime Access, a Market Access consultancy. The company is proud to rank on several business growth lists such as the Sunday Times Fast Track 100 in 2018 and 2019, and of being selected as consultancy growth winners of the Alantra Fast Pharma 50 in 2019 and 2020. The addition of Cambridge Medical Communication continues this expansion trend and is a strong cultural fit; both companies share the values of passion, excellence, and partnership.
Cambridge Medical Communication will be known as Cambridge, a Prime Global agency. Take a look at the Prime Global website to see more.
Resistance – a matter of course
The seeming inevitability of antibiotic-resistant bacteria
Antibiotic resistance is a topic rarely out of the newspapers. Historically, patients and physicians alike were taught that longer antibiotic courses are needed to prevent the development of antibiotic resistance. However, recent research by Professor Martin Llewelyn and colleagues argues that, in some cases, it is not even necessary to complete a full course of antibiotics.
This seemingly conflicting advice is confusing for a public who have been told time and time again to always complete the course. So, why would Professor Llewelyn make this argument, and which advice should we follow? This is a complex issue, not least because many researchers now believe that antibiotic resistance can arise whether or not we complete the course.
Failure to complete the course can lead to resistance
Bacterial infections occur when pathogenic bacteria invade a person’s body, where they reproduce and cause harm. Antibiotics work by stopping the bacteria from behaving as normal and, ultimately, killing them. However, if the antibiotic is taken for too short a duration (or the dose is too low), there is a chance that some of the bacteria will adapt before they are all killed.
Bacteria can reproduce very quickly. If they are lucky, random genetic mutations – which naturally occur during reproduction – may produce offspring that are unaffected by a particular antibiotic. What’s more, bacteria that become resistant can package their resistance genes into a ‘plasmid’ and pass them to other bacteria, even to others of a different species.
The resistant bacteria thrive as their non-resistant competitors are killed off, culminating in a population of antibiotic-resistant bacteria – seemingly, all because the patient did not complete their course of antibiotics.
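The selection pressure described above can be sketched as a toy model – the numbers are entirely hypothetical (a 90% daily kill rate for susceptible bacteria and 50% daily growth for all bacteria), and real infection dynamics are far more complex:

```python
def treat(susceptible, resistant, kill_per_day, growth_per_day, days):
    """Toy model of antibiotic selection pressure: each day the drug kills
    a fraction of the susceptible bacteria, while both populations grow;
    resistant bacteria are unaffected by the drug."""
    for _ in range(days):
        susceptible = int(susceptible * (1 - kill_per_day) * (1 + growth_per_day))
        resistant = int(resistant * (1 + growth_per_day))
    return susceptible, resistant

# Hypothetical infection: one million susceptible cells plus ten resistant mutants.
print(treat(1_000_000, 10, 0.9, 0.5, 3))   # short course: susceptible bacteria survive
print(treat(1_000_000, 10, 0.9, 0.5, 10))  # longer course clears the susceptible
                                           # population, but the resistant strain thrives
```

Either way, the resistant subpopulation only grows – which is the crux of the dilemma discussed below.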
Completing the course can lead to ‘collateral’ resistance
Our bodies are surrounded by bacteria, which live on us and inside us (the ‘microbiome’) – notably in the gut, where they play a critical role in digestion. When we take antibiotics, the microbiome is affected as well as the target pathogenic bacteria. Many of these harmless (or even beneficial) microbiome bacteria are killed, and some may evolve to become resistant, as described above.
Not only can certain previously harmless bacteria take advantage of this situation and become ‘opportunistic pathogens’, but also resistance in harmless species can be passed to pathogenic species, resulting in harmful strains with resistance. And the longer that a patient takes antibiotics, the greater the risk that ‘collateral’ antibiotic resistance will develop.
The debate on whether or not to complete the course of antibiotics is likely to continue for some time, and to require further experiments. Every patient is different, and what works for one person could be harmful to another. A recent analysis has shown that even physicians don’t always adhere to recommended prescribing guidelines. The best advice on whether to complete the course, then – anticlimactic though it may be – is to follow the advice of your doctor.
Pouwels KB, Hopkins S, Llewelyn MJ, et al. Duration of antibiotic treatment for common infections in English primary care: cross sectional analysis and comparison with guidelines. BMJ 2019; 364: l440.
Llewelyn MJ, Fitzpatrick JM, Darwin E, et al. The antibiotic course has had its day. BMJ 2017; 358: j3418.
Richardson LA. Understanding and overcoming antibiotic resistance. PLoS Biol 2017; 15 (8): e2003775.
Adherence and persistence – how often, and how long?
To properly assess the efficacy of a new drug, it is important to know if the patient took it in the intended manner. If they did not, the drug may not have reached the desired concentration in the patient’s bloodstream, and hence might not have had the expected effect.
Investigators generally use two variables to assess patient conformity to the prescribed medication: adherence and persistence. However, these two terms are regularly confused, and are often used without adequate explanation. Cramer et al. proposed the following definitions: adherence indicates the degree to which a patient conforms to the prescribed regimen for their medication, whilst persistence refers to the length of time over which the patient keeps taking their medication, regardless of whether they consistently follow the exact regimen.1
In line with these definitions, adherence is usually reported as a percentage – showing the extent to which a patient takes the correct dose, at the correct time, over a fixed period – whereas persistence, the length of time that passes before a patient discontinues their medication, is usually reported as a function of time.1
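As a minimal sketch of these two definitions (the patient data are hypothetical, and real studies use validated measures such as the medication possession ratio):

```python
from datetime import date

def adherence(doses_taken, doses_prescribed):
    """Adherence: the percentage of prescribed doses actually taken."""
    return 100.0 * doses_taken / doses_prescribed

def persistence_days(first_dose, discontinuation):
    """Persistence: the time from first dose to discontinuation, in days."""
    return (discontinuation - first_dose).days

# Hypothetical patient: prescribed 90 daily doses, took 72 of them,
# and stopped taking the medication after two months.
print(adherence(72, 90))                                     # 80.0 (%)
print(persistence_days(date(2024, 1, 1), date(2024, 3, 1)))  # 60 (days)
```

Note how the two numbers capture different things: a patient could score 100% adherence over a short persistence window, or persist for the full course with patchy adherence.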
The importance of distinguishing between adherence and persistence is made clear by the impact of each variable on patient prognosis. Reduced adherence (skipping a dose, or taking a dose late) will result in fluctuating levels of medication in the bloodstream, which could lead to reduced benefits and/or increased risks. On the other hand, the impact of reduced persistence (i.e., a patient taking their medication for a shorter period of time than intended) is perhaps more difficult to evaluate.
When a patient discontinues their medication (i.e., does not persist), the level of drug in their bloodstream will drop to nothing – if the patient was persistent for only a short period of the intended duration of their prescribed treatment course, their health may not improve. However, if a patient has been persistent for almost their entire prescription, and only misses the last few doses, the shortened treatment duration may have little negative impact on their recovery.
Consider a chronic condition, such as human immunodeficiency virus (HIV) infection, which requires a patient to persist with their treatment regimen over the long term – for life, in fact – to maintain virological suppression.2 Reduced adherence may mean that the level of medication in the bloodstream decreases enough to allow the virus to replicate and mutate – increasing the chance of a drug-resistant strain evolving. The speed at which a drug is cleared from the bloodstream is an important factor here; a drug with a long half-life may remain at levels sufficient to prevent viral replication for long enough that taking one dose late would carry a low risk of drug resistance developing. However, a more extensive reduction in adherence, or reduced persistence, will carry a high risk of drug resistance – leading to virological failure and increased likelihood of HIV transmission.2
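The buffering effect of a long half-life can be illustrated with the standard first-order elimination formula (the concentrations and half-lives below are hypothetical, chosen only to show the contrast):

```python
def concentration(c0, half_life_h, hours):
    """Remaining drug concentration after `hours`, assuming first-order
    (exponential) elimination with the given half-life."""
    return c0 * 0.5 ** (hours / half_life_h)

# Hypothetical drug taken 12 hours late, starting from 100 units.
# With a 40-hour half-life, most of the drug is still circulating...
print(round(concentration(100, 40, 12), 1))  # 81.2
# ...whereas a 6-hour half-life would leave only a quarter of it.
print(round(concentration(100, 6, 12), 1))   # 25.0
```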
Adherence and persistence are important factors to consider when assessing a clinical study since, aside from a drug’s biochemical activity, its effectiveness in the real world depends not only on how well a patient complies with the treatment regimen, but also on how long they continue to comply.
1. Cramer JA, Roy A, Burrell A, et al. Medication compliance and persistence: terminology and definitions. Value Health 2008; 11 (1): 44–47.
2. Bae JW, Guyer W, Grimm K, Altice FL. Medication persistence in the treatment of HIV infection: a review of the literature and implications for future clinical care and research. AIDS 2011; 25 (3): 279–290.
Sequencing a genome, base by base, by base, by base
Genetics has a proud history of great accomplishments – from the bold logic of Mendel, to the immense scale of the human genome project. But the enterprising spirit of mankind is unquenchable; as we strive to reach out into the cosmos, we also seek to delve deeper into the unfathomed depths of our own biology.
Sequencing techniques have often been seen as a bottleneck constraining the progress of modern genetics. Sanger sequencing, developed in the 1970s, is a ‘chain-termination’ method.1 Many copies of the target DNA are produced, but of varying length.1 Four different reactions are run in parallel, using chain-terminating versions of each of the four nucleobases, A, T, C, or G, and a radioactive label.1 Within, for example, the adenine reaction, this produces DNA strands of different lengths, each terminating at the position of an adenine.1 Likewise, each of the other reactions produces DNA strands of different lengths, terminating at its designated nucleobase chain terminator.1 The four reactions are ‘run out’ in individual lanes of a sequencing gel to separate the strands of DNA based on their length.1 The nucleobase at each position along the DNA strand will correspond to a band on the gel (from one of the four reactions) at the position corresponding to the DNA sequence.1 A later variation used fluorescent dye colours to label the nucleobases.2

Whilst still useful today, the methods that were developed from Sanger’s original sequencing technique provide good-quality results for less than 1,000 base pairs,2 which can be quite limiting, especially if the task in hand is to sequence a human genome, which extends over billions of base pairs!3
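The logic of chain termination can be sketched in a few lines of code – a toy model only, ignoring the chemistry and the gel entirely. Each per-base ‘reaction’ yields fragments ending at every occurrence of that base, and sorting all fragments by length (as a gel would) reads out the sequence:

```python
def termination_fragments(template, base):
    """Lengths of the chain-terminated strands produced in the reaction
    for one nucleobase: synthesis stops at each occurrence of that base."""
    return [i + 1 for i, b in enumerate(template) if b == base]

def read_gel(template):
    """Reconstruct the sequence by collecting every fragment from the four
    reactions and ordering them by length, as a sequencing gel would."""
    bands = {}
    for base in "ATCG":
        for length in termination_fragments(template, base):
            bands[length] = base
    return "".join(bands[length] for length in sorted(bands))

seq = "GATTACA"
print(termination_fragments(seq, "A"))  # [2, 5, 7]
print(read_gel(seq))                    # GATTACA
```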
Science has therefore increasingly sought to develop ‘high-throughput’ methods of DNA sequencing, and there are now several possibilities in this field.2 In the first decade of this millennium, a prize fund of $10 million was announced for the first company able to sequence 100 whole human genomes to a standard never before achieved, and at a cost of less than $1,000 per genome – the so-called ‘Archon Genomics XPrize’.4 The race was on! However, because the next-generation sequencing market was growing so much faster than anticipated, the Archon Genomics XPrize was subsequently cancelled.4 Among the contenders for that prize would have been an innovative Cambridge-based company – Base4 Innovation.
Base4 Innovation are developing a microdroplet-based sequencing system.5 In brief, this is a high-throughput technology whereby a single strand of DNA is captured in a channel, and the individual nucleobases are broken off, one-by-one, from the end of the DNA strand.5 Each individual nucleobase is encased in a microdroplet, inside which a cascade reaction emits an optical signal; the colour of the fluorescence corresponds to the identity of the nucleobase, A, T, C, or G.5 As long as the sequence of the microdroplets is preserved, the code of the DNA can be read out as the microdroplets pass through the system.5 In principle, this technology should have a low error rate, and will be able to rapidly read long sequences from a single strand of DNA. One of a plethora of potential next-generation sequencing technologies, the innovation of Base4 exemplifies the dynamic environment for science in which we find ourselves, here in Cambridge.
1. Sanger F, Nicklen S, Coulson AR. DNA sequencing with chain-terminating inhibitors. PNAS 1977; 74 (12): 5463–5467.
2. Heather JM, Chain B. The sequence of sequencers: the history of sequencing DNA. Genomics 2016; 107 (1): 1–8.
3. Venter JC, Adams MD, Myers EW, et al. The sequence of the human genome. Science 2001; 291 (5507): 1304–1351.
4. Cancellation of the Archon Genomics XPrize: a public debate. https://genomics.xprize.org/news/blog/cancellation-of-archon-genomics-xprize-public-debate. Accessed May 2018.
5. Base4 website. http://www.base4.co.uk/. Accessed May 2018.
Cambridge researchers lead potential Parkinson’s disease breakthrough
The triumph of modern medicine increasingly sees people living into their ninth and tenth decades. However, this success makes the challenges of neurodegenerative diseases – such as Parkinson’s disease (PD) and Alzheimer’s disease (AD) – all the more acute. Poor understanding of the underlying disease process (or processes) may be one of the main obstacles hampering the development of drugs for diseases such as PD and AD. Consequently, new research into the pathology of neurodegenerative conditions is vitally important.
The exact cause of the damage to the brain in PD is unknown, but the protein alpha-synuclein appears to play a critical role.1 Whilst alpha-synuclein is a necessary component of the synapse, in PD it is thought to misfold and oligomerise – forming chains of linked alpha-synuclein particles.1
In a fascinating paper published at the end of 2017 (Fusco et al.2), researchers from the UK, Spain and Italy have shed some light on what may be a crucial element in the pathology of PD. This new research explored the structure of alpha-synuclein, revealing that the oligomerised form of the protein could – unlike the non-oligomerised form – become inserted within the lipid bilayer of the cell membrane.2 These observations suggest that, in PD, alpha-synuclein oligomers perturb the membrane of neurons in the brain, interfering with their normal functioning and potentially leading to cell destruction.2
Such pioneering research opens up new avenues of drug development for the conditions of older age that are so burdensome to modern society.
1. Lee VM, Trojanowski JQ. Mechanisms of Parkinson’s disease linked to pathological alpha-synuclein: new targets for drug discovery. Neuron 2006; 52 (1): 33–38.
2. Fusco G, Chen SW, Williamson PTF, et al. Structural basis of membrane disruption and cellular toxicity by alpha-synuclein oligomers. Science 2017; 358 (6369): 1440–1443.
Cambridge scientists uncover key protein in Zika virus pathology
The recent Zika virus outbreak has had a devastating global impact on human health, with at least 70 countries reporting infections since 2015.1 Zika virus is spread to humans by mosquito bites, although it can also be passed from person to person via sexual contact or bodily fluids, and from mother to foetus via the placenta. In children and adults, the virus causes only a mild illness. However, in pregnant women it can have devastating consequences for foetal neurodevelopment.
In the Americas, over the past two years, more than 3,500 babies have been confirmed to have congenital malformations associated with Zika virus infection.2 Perhaps the most tragic consequence of Zika virus on the unborn baby is microcephaly: a serious birth defect in which the brain fails to develop properly, characterised by an abnormally small head.
Until recently, it was not known how the Zika virus caused neonatal microcephaly. However, scientists at the University of Cambridge have provided an insight into the mechanism by which the Zika virus may ‘hijack’ neural stem cells to cause this devastating brain condition.3 Their research showed that, in order to replicate, the Zika virus interacts with a neural stem cell protein called Musashi-1. In developing embryos, Musashi-1 is an RNA-binding protein that binds to target genes to regulate neural stem cell function. The interaction between the Zika virus and Musashi-1 disrupts this process and damages the foetal neural stem cells. Ultimately, neural stem cells die and the brain fails to develop to normal size.
Although transmission of Zika virus may finally be slowing down, in the Americas at least,4 we are starting to understand how Zika virus causes microcephaly, giving scientists hope that, someday, we may be able to treat this devastating disease.
1. European Centre for Disease Prevention and Control. Rapid risk assessment: Zika virus epidemic, tenth update. 4 April 2017.
2. Pan American Health Organization/World Health Organization. Zika cases and congenital syndrome associated with Zika virus reported by countries and territories in the Americas, 2015–2017. 21 September 2017. www.paho.org/hq/index.php?option=com_content&view=article&id=12390&Itemid=42090. Accessed September 2017.
3. Chavali PL, Stojic L, Meredith LW, et al. Neurodevelopmental protein Musashi-1 interacts with the Zika genome and promotes viral replication. Science 2017; 357 (6346): 83–88.
4. Pan American Health Organization/World Health Organization. Regional Zika Epidemiological Update (Americas), August 25, 2017. www.paho.org/hq/index.php?option=com_content&view=article&id=11599%3Aregional-zika-epidemiological-update-americas. Accessed September 2017.
Much ado about nothing?
Assessing the value of placebo treatment in clinical trials
Clinical trials often involve placebo treatment. Use of a placebo tests the influence of study procedures themselves on treatment outcomes, thus helping to isolate the ‘true’ effect of the investigational product.
Placebo-controlled clinical trials are commonly used to evaluate the efficacy and safety of medicinal products. However, in situations where an approved treatment already exists, the use of a placebo essentially denies patients access to a proven effective treatment. Withholding treatment can result in long-lasting harm; in schizophrenia, for example, progressive relapses may lead to irreversible brain damage.1 Is it ethically acceptable to switch a patient from an effective intervention to placebo treatment, and observe the resulting rate of decline?
The use of an active comparator is an alternative method, whereby a new treatment is compared with one already deemed effective, rather than with a placebo. This method illustrates relative efficacy versus a current approach, and does not involve withholding treatment. However, the very act of being treated can result in remarkably high response rates – the so-called ‘placebo effect’ – and so, in an active-comparator study, it is impossible to tell how much of the treatment response is due to the activity of the drug, and how much of a response would have been observed if patients had only received a placebo.
How, therefore, can we weigh up whether or not placebo treatment is justified? Millum and Grady emphasise that methodological reasoning alone is not enough to warrant using a placebo; rather, any additional risks associated with placebo treatment must be balanced by the additional societal value of the study, relative to using other study designs.2
Unfortunately, the concept of ‘societal value’ is by no means clear-cut. Compared with placebo-controlled studies, demonstration of efficacy in active-comparator studies requires a greater number of patients, thus exposing more patients to any risks associated with the new treatment. Is a smaller risk to a greater number of patients better societal value than a greater risk to a smaller number of patients? Furthermore, enrolment of fewer patients may translate to lower study costs,2 and saved costs could help fund future research, although it is certainly debatable whether a patient would consider pharmaceutical company finances an acceptable reason for risking their health.
In another avenue of debate, how should we assess the value of a study if the patients involved may not personally benefit in the long run? Or, put another way, how do we balance societal value and personal value?
Some argue that patients should consider participation in clinical studies ‘for the greater good’; others suggest that clinical research should only be conducted if its intent is to benefit the population from which the study participants are drawn.2
Ultimately, the benefits of placebo-controlled studies must clearly outweigh the risks, but exactly how to evaluate benefit and risk in the unpredictable world of clinical research remains a difficult question.
1. Andreasen NC, Liu D, Ziebell S, et al. Relapse duration, treatment intensity, and brain tissue loss in schizophrenia: a prospective longitudinal MRI study. Am J Psychiatry 2013; 170 (6): 609–615.
2. Millum J, Grady C. The ethics of placebo-controlled trials: methodological justifications. Contemp Clin Trials 2013; 36 (2): 510–514.
Treating Alzheimer’s disease in an era of intelligent drug design
Given the major advances in the treatment of cancer, it is frustrating that the same progress has not been achieved in the field of Alzheimer’s disease and dementia.
In the early 1970s, a diagnosis of cancer meant a 50:50 chance of surviving for one year; for patients diagnosed in 2010, that 50:50 chance of survival had extended to ten years.1 This increase in the duration of survival is impressive considering that cancer is such an aggressive disease. However, there is still more work to be done in order to improve the treatment of cancer, and certain types in particular (for example, lung and pancreatic cancer) still show very low survival rates.1 The improved chance of survival observed over the past four decades is due to therapeutic advances as a result of researchers gaining a more in-depth understanding of the disease process. Today, many of the new drugs in development target specific pathways, such as those involved when cancer cells evade the immune system, or when cancer cells generate their own blood supply. These targeted therapies would not be possible without a detailed appreciation of the disease pathology.
The situation for Alzheimer’s disease is quite different. Despite decades of work on the underlying disease pathology, the formulation and testing of the ‘amyloid hypothesis’, and the interrogation of the function of tau protein, little is known about the underlying cause of Alzheimer’s disease. This paucity of knowledge has led to a horrific attrition rate for drugs in development for Alzheimer’s disease – between 2002 and 2012, of 244 compounds assessed, only one was approved for marketing (99.6% attrition).2 Many of the diseases of middle and older age are being alleviated or cured, and so there is a growing population of older individuals who are susceptible to dementia. Consequently, the need to find an effective treatment for Alzheimer’s disease should be met with the rising urgency it deserves.
Quite why Alzheimer’s disease remains so resistant to our understanding is a mystery but, if it were a simple problem, it would have been solved already. If a solution is ever to be found, society must not lose hope. We cannot be as fatalistic as Lyall Watson, who once said: “If the brain were so simple we could understand it, we would be so simple we couldn’t.”
1. Quaresma M, Coleman MP, Rachet B. 40-year trends in an index of survival for all cancers combined and survival adjusted for age and sex for each cancer in England and Wales, 1971–2011: a population-based study. Lancet 2015; 385 (9974): 1206–1218.
2. Cummings JL, Morstorf T, Zhong K. Alzheimer’s disease drug-development pipeline: few candidates, frequent failures. Alzheimers Res Ther 2014; 6 (4): 37.
Should prevention play a greater role in modern healthcare?
Over the past few decades, huge strides have been made in the treatment of HIV infections. There was a time when a diagnosis of HIV was a death sentence – now, highly active antiretroviral therapy (HAART) allows HIV-positive individuals to live full and satisfying lives. However, the disease is still spreading; in Europe, 2014 saw the highest number of newly diagnosed infections in a one-year period since monitoring began in the 1980s.1 Despite the efficacy of treatment, there is clearly an unmet need for drugs that can help to prevent the spread of the HIV pandemic. The old adage goes that ‘prevention is better than cure’ – why, then, is PrEP (pre-exposure prophylaxis) not more widely available in healthcare systems across Europe? PrEP is a method whereby HIV-negative individuals who are at a high risk of contracting HIV are given antiretroviral therapy (ART), in order to reduce the risk of infection (when used in conjunction with other preventative measures).2
Of course, there are always other ways of looking at the same problem. I can remember James Watson (a man never shy of controversy) giving a talk at my institute arguing that science should be focussed on curing late-stage cancer, nothing else. If we can cure cancer in its latest stages – he reasoned – then we need not worry about early detection, screening programmes, and maintaining a healthy lifestyle. The same argument can be applied to HIV – if science focusses on treating and curing AIDS, then prophylaxis is unnecessary. However plausible this theory may appear, it shows complete disregard for an individual’s quality of life, which should be of paramount importance for treating physicians and for clinical researchers. Quality of life is not served by allowing patients to progress to the late stages of debilitating conditions, nor is it served by allowing HIV to spread when preventative measures like PrEP have been developed.
As we move into an age where disease management aims to provide holistic benefits that extend beyond the traditional approach of symptomatic treatment, we may see a greater acceptance of preventive medicine into healthcare systems.
1. World Health Organization. Highest number of new HIV cases in Europe ever. 26 November 2015. http://www.euro.who.int/en/media-centre/sections/press-releases/2015/11/highest-number-of-new-hiv-cases-in-europe-ever. Accessed 22 June 2016.
2. Terrence Higgins Trust website. http://www.tht.org.uk. Accessed 22 June 2016.
Can a simple blood test change the way we treat depression?
Despite the many treatment options that are available for depression, patients often struggle to find a drug that works for them. Patients may try many different antidepressants before settling on a particular medication, and some patients never find a satisfactory treatment. Such trial and error approaches to therapy are inefficient and expensive, as well as exposing the patient to unnecessary side effects. Fortunately, a recent scientific breakthrough indicates that it may be possible to predict whether or not a patient will respond to antidepressants before they have taken their first dose.
The test for potential non-responders is based upon levels of ‘inflammatory biomarkers’ in the blood. Inflammation is the term used to describe the body’s response to harmful stimuli, such as infection or injury. During inflammation, the body releases chemicals into the blood that help to fight the cause of the harm, as well as to repair or remove damaged tissue. Measuring the levels of these chemicals, or ‘biomarkers’, in the blood, enables doctors to gauge the degree of inflammation in a patient.
While depression itself cannot be diagnosed with a blood test,a studies have shown that depressed patients who are resistant to standard antidepressants have higher blood concentrations of certain inflammatory biomarkers than those patients who are not resistant to antidepressants. It is thought that a high degree of inflammation can interfere with particular biological processes that are crucial for antidepressants to exert their therapeutic action.
The latest breakthrough expands this research by identifying biomarker concentration cut-offs that accurately predict non-response to antidepressants. Patients with biomarker levels below the cut-offs are likely to benefit from standard antidepressants, whereas patients with levels above the cut-offs may require additional treatment with other antidepressants, or perhaps with anti-inflammatories. Thus, in the future, a doctor may be able to perform a simple blood test before prescribing the appropriate antidepressant medication – thereby eliminating months of suffering and uncertainty for the patient.
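A predictive cut-off of this kind amounts to a simple threshold rule. The sketch below is illustrative only – the biomarker names echo those studied by Cattaneo et al., but the cut-off values and units are invented:

```python
# Hypothetical cut-offs for two inflammatory biomarkers (arbitrary units).
CUTOFFS = {"MIF": 1.5, "IL1B": 2.0}

def likely_nonresponder(biomarkers):
    """Flag a patient as a likely non-responder to standard antidepressants
    if any measured biomarker exceeds its cut-off."""
    return any(biomarkers[name] > cutoff for name, cutoff in CUTOFFS.items())

print(likely_nonresponder({"MIF": 1.1, "IL1B": 1.7}))  # False: below both cut-offs
print(likely_nonresponder({"MIF": 2.3, "IL1B": 1.7}))  # True: MIF above its cut-off
```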
Cattaneo A, Ferrari C, Uher R, et al.; MRC ImmunoPsychiatry Consortium, Pariante CM. Absolute measurements of macrophage migration inhibitory factor and interleukin-1-beta mRNA levels accurately predict treatment response in depressed patients. Int J Neuropsychopharmacol 2016 [Epub ahead of print].
a. Depression is diagnosed based upon symptoms such as low mood and loss of interest in daily activities.
The beginning of the end for HIV?
During my time as an undergraduate student, I wrote an essay arguing that HIV could never be cured by drugs because it readily mutates within the body, and that it could never be eradicated by vaccination for essentially the same reason. Fortunately, my reasoning failed to anticipate the revolutions in technology that would be of benefit to the medical field.
Viral infections, such as HIV, are notoriously difficult to treat. Recently, there has been a drive to apply one of the great nascent medical technologies – CRISPR – to the treatment of HIV infections.1 CRISPR (clustered regularly-interspaced short palindromic repeats) is a gene-editing technology used to cleave DNA at specific locations (see an earlier blog entitled ‘Medicine in an era of genetic control’).
The technique has been used by one research group to target HIV within T-cells but, despite an initial reduction in HIV replication, certain strains of the virus managed to escape the attack (those with mutations around the CRISPR target site).2,3 The team have described this finding as only a ‘minor setback’, and they continue with their efforts to implement CRISPR in the ongoing fight against HIV.1
It is easy to see that, with some fine tuning, CRISPR (and subsequent technologies) could become a powerful weapon in the battle against viruses. Maybe, as an undergraduate, I should have hedged my bets against the unpredictability of the future.
1. Pharmafile. HIV successfully overcomes CRISPR gene-editing technology. Available at: http://www.pharmafile.com/news/503834/hiv-successfully-overcomes-crispr-gene-editing-technology. Accessed 6 May 16.
2. Wang Z, Pan Q, Gendron P, et al. CRISPR/Cas9-derived mutations both inhibit HIV-1 replication and accelerate viral escape. Cell Rep 2016; 15 (3): 481–489.
3. Wang G, Zhao N, Berkhout B, Das AT. CRISPR-Cas9 can inhibit HIV-1 replication but NHEJ repair facilitates virus escape. Mol Ther 2016; 24 (3): 522–526.
Combination therapy shows promise for prostate cancer treatment
A novel approach to the treatment of prostate cancer combines gene therapy and radiotherapy; the result is a synergistic effect on the anti-tumour immune response.1 An adenoviral vector containing a herpes simplex virus thymidine kinase gene (ADV/HSV-tk) is used to render tumour cells vulnerable to the antiviral prodrug valacyclovir.1 The viral thymidine kinase phosphorylates the prodrug, producing a cytotoxic nucleotide analogue that kills the tumour cells.1 Consequently, tumour-associated antigens are released, and an immune response is triggered.1 Administration of intensity-modulated radiotherapy (IMRT) enhances this immune response.1 This combination of therapies has been investigated as a treatment for prostate cancer in a Phase II trial.1
The trial enrolled 66 men with prostate cancer.1 They were treated according to the severity of their disease – Group A comprised patients with less severe disease; Group B comprised those with more severe disease.1
Gene therapy sessions of four ADV/HSV-tk intraprostatic injections, 14 days of oral valacyclovir, and IMRT were given twice to patients in Group A and three times to those in Group B.1 Patients in Group B also received hormone therapy after the first injection.1 The primary endpoint was freedom from failure (FFF), where failure was defined as a rise in prostate-specific antigen (PSA) of ≥2 ng/ml above the nadir.1,2 Five-year FFF rates of 94% and 91% were observed in Groups A and B, respectively.1 Overall survival, a secondary endpoint, was 97% for Group A and 94% for Group B.1
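For readers unfamiliar with this endpoint, the failure definition used here (the Phoenix criterion2) amounts to a simple running check on post-treatment PSA readings: failure is declared the first time any reading rises by 2 ng/ml or more above the lowest value seen so far. The function below is a minimal illustrative sketch of that rule (the function name is ours, and this is not the trial’s analysis code):

```python
def phoenix_failure(psa_values):
    """Return True if any PSA reading (ng/ml) rises >= 2.0 above the
    nadir (the lowest value observed so far) - the Phoenix definition
    of biochemical failure after radiotherapy."""
    nadir = float("inf")
    for psa in psa_values:
        if psa >= nadir + 2.0:
            return True
        nadir = min(nadir, psa)
    return False
```

For example, the series 10.0 → 4.0 → 1.0 → 3.5 counts as failure (3.5 is 2.5 ng/ml above the 1.0 nadir), whereas 10.0 → 4.0 → 1.0 → 2.5 does not.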
Further evaluation of this strategy is warranted and, indeed, a randomised trial is ongoing.1 However, the development of this and other immunotherapies represents a paradigm shift in cancer management.
1. Teh BS, Ishiyama H, Mai W-Y, et al. Long-term outcome of a Phase II trial using immunomodulatory in situ gene therapy in combination with intensity-modulated radiotherapy with or without hormonal therapy in the treatment of prostate cancer. J Radiat Oncol 2015; 4 (4): 377–386.
2. Roach M 3rd, Hanks G, Thames H Jr, et al. Defining biochemical failure following radiotherapy with or without hormonal therapy in men with clinically localized prostate cancer: recommendations of the RTOG-ASTRO Phoenix Consensus Conference. Int J Radiat Oncol Biol Phys 2006; 65 (4): 965–974.
The latest trends in e-learning
At Cambridge Medical Communication we create e-learning courses on a wide range of medical and scientific topics. We start by thinking about the desired aims and outcomes of the e-learning, before developing a detailed specification of the structure and content. Medical writing, editing, and all the usual review loops and checks for accuracy and scientific integrity are undertaken. The final e-learning package can then be built using authoring software such as Articulate Storyline.
We recognise the importance of keeping up to date with current ideas and technologies in the creation of e-learning materials. Consequently, this month, members of our team participated in the FutureLearn online course, ‘Designing e-learning for health’.
The course, run by the University of Nottingham’s Health E-Learning and Media (HELM) team, introduced the ASPIRE process as a framework for the creation of high-quality ‘reusable learning objects’. Throughout the course, we were reassured to find that the core principles by which we work at Cambridge Medical Communication are reflected in other people’s experiences. For example, we are not the only ones who shy away from the use of jargon, and we also recognise the importance of interactivity – a key feature that we embrace in our e-learning courses, to engage the audience and to facilitate understanding of the programme content.
Ultimately, an effective e-learning programme is one that is designed with clear objectives, with learners at the forefront, and with a relentless insistence on accuracy. These drivers are at the heart of the e-learning programmes that we deliver at Cambridge Medical Communication.
Contact us to discuss your next e-learning project: firstname.lastname@example.org
Embracing the devil in the detail
Being meticulous about grammar could be considered indicative of an individual’s general attention to detail. Indeed, someone with a tendency to “scatter commas into a sentence with all the discrimination of a shotgun” may disregard the value of detail in other aspects of their work, whether or not it is related to writing.1 Here at Cambridge Medical Communication, we keep our shotguns at bay.
Over the last few weeks, I have been working on scientific posters that will be presented at the European Crohn’s and Colitis Organisation (ECCO) congress later this month. My part of the project involved reference checking the draft text, editing the content for accuracy and clarity, and working with designers to prepare an eye-catching layout.
As with all aspects of our work, these tasks require impeccable attention to detail. Have the facts and data been checked and cross-checked? Do the posters effectively and unambiguously communicate the study findings, and the value of the work? Is the writing style consistent, and is the text grammatically correct? Do the layout, colour usage, and line weights adhere to the specified branding guidelines? I could go on…
We take great care to address these questions. Ultimately, we are communicating details that may inform the treatment of patients. It is therefore vital that the content is accurate, and that the messages are clear.
At Cambridge Medical Communication, we believe that our drive for accuracy provides the foundation for scientific integrity, which is fundamental to the clear and imaginative communication materials that we produce. This commitment to delivering excellence reflects the true passion that we hold for our work.
1. Wiens K. I won’t hire people who use poor grammar. Here’s why. 20 July 2012. http://tinyurl.com/ooqbrun. Accessed 2 March 2016.
Should I aim for 7 hours of sleep per night?
Barely a week seems to go by without a story in the news about how much sleep we should be getting. While these articles tend to agree that too little sleep, or too much sleep, is bad for your health, they do not always agree on what amount of sleep is optimal. What’s more, the precise relationship between sleep and health tends to be skimmed over in the popular press. Does poor sleep cause poor health, does poor health cause poor sleep, or is there a hidden third factor that affects both our amount of sleep and our health?
A meta-analysis published this week has attempted to gather all studies correlating night-time sleep duration with the risk of all-cause mortality. Over one and a half million adult participants were included, from sleep studies around the world.
A U-shaped relationship was found between sleep duration and risk of mortality: risk was lowest at around 7 hours of sleep, and increased the further sleep duration deviated from this in either direction. At the extremes, 4 hours of sleep was associated with a 7% increase in risk, and 11 hours with a 55% increase.
The authors of the meta-analysis were careful to state that amount of sleep may be just a marker of health status, rather than being directly linked to mortality. For example, a person with depression may sleep too much, and may also be at greater risk of premature death. However, when the analysis was restricted to participants without chronic diseases, a similar U-shaped relationship was found. Thus, it appears that both long and short night-time sleep durations are independent predictors of all-cause mortality.
These results are potentially affected by publication bias, since studies with null results tend not to be published, and therefore were not included in the analysis. However, as the best evidence available, it appears that we really should be aiming for 7 hours of sleep per night in order to minimise the risk of premature death.
Shen X, Wu Y, Zhang D. Nighttime sleep duration, 24-hour sleep duration and risk of all-cause mortality among adults: a meta-analysis of prospective cohort studies. Sci Rep 2016; 6: 21480.