And 1 that isn’t… but probably should be
1. A greater emphasis on the clinical interpretation of sequencing data
It’s now all about ‘big data’: the decreasing cost of sequencing is no longer a major market restraint, although in the long term it is a restraint to watch. The bottleneck is now shifting towards being able to actually interpret the data, and towards those providing clinical interpretation of sequencing data and NGS diagnostic tests. This means housing and interpreting human biological and clinical data, such as sequencing, phenotypic, metabolomic, medical, and family history information.
These databases are critical for driving better clinical utility of the genome and achieving the dream of precision/personalized medicine. I’ve compiled a list here of the bioinformatics software companies that will be looking to cash in on this trend. A rare breed, bioinformaticians will be in high demand.
2. Larger and more numerous databases pairing sequencing data with patient histories are driving better outcomes
Until recently, the high cost of sequencing restrained project sizes and the ability to develop these large databases. However, we are beginning to reach a point where the reduced cost of the technology will allow for the rapid and affordable sequencing of enough human genomes to develop useful databases. A great example I was particularly excited to read was the fantastic news of Georgia and Jessica’s story from the 100,000 Genomes Project.
However, there will be a divergence. While larger laboratories will be producing this ‘big data’ through the likes of whole-genome sequencing on the HiSeq X10, the NGS instruments adopted in clinical diagnostics labs and hospitals will more likely be highly simplified, affordable (less than $50,000), and provide walk-away answers. Therefore, while clinical sequencing will contribute heavily to market growth, the high value will be in clinical tests and in clinical interpretation on the bioinformatics side.
3. A continuing trend towards simple, affordable, smaller ‘plug and play’ point-of-care (POC) systems
You must have heard about the new MiniSeq, surely? If not, you may want to check your emails for the 100 or so from Illumina telling you about their MiniSeq (and subsequently the Mini Cooper…). Let’s not forget the simple and scalable Ion S5, one of The Scientist’s Top Ten Innovations of 2015. Although if you really want ‘simple’, you can’t forget the Biocartis Idylla, perhaps the best plug-and-play RT-PCR system to come to market in 2015. Labs should nonetheless be cautious in adopting these ‘black box’ platforms for several reasons; open-source flexibility, coupled with a simple and efficient user interface, can help improve NGS data analysis. Also keep in mind that it takes only 4 to 5 years for a sequencer’s specifications to become obsolete.
Did I mention the MiniSeq was available too?
4. Multigene panels are soon to be used clinically
And there will be a decreasing reliance on single confirmatory testing with older technologies…
Genection’s AML panel is the first of several NGS-based Dx tests it plans to bring through FDA clearance, alongside the co-development of a multi-gene NGS-based companion diagnostic test for the drug Vectibix (a therapeutic for metastatic colorectal cancer).
Furthermore, while Sanger sequencing is being used to support mutation validation, for example in the Illumina MiSeqDx 510(k) clearance, it is not possible to use Sanger data to provide a definitive call when mutations are in the range of 1–15% allele frequency. There are, however, other increasingly popular technologies being integrated into NGS clinical workflows, such as pyrosequencing or Sequenom-based assays, that can detect mutant allele frequencies down to 5–10%. It’s only a matter of time until reflex testing decreases as confidence in the precision and accuracy of NGS increases.
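To see why NGS can make calls at allele frequencies that defeat Sanger, it helps to think of read depth as repeated sampling. Here is a minimal sketch using a simple binomial model; the depths and the five-read detection threshold are illustrative assumptions for the sake of the example, not validated clinical cut-offs:

```python
from math import comb

def p_detect(depth: int, vaf: float, min_reads: int = 5) -> float:
    """Probability of seeing at least `min_reads` variant-supporting reads
    at a given depth, assuming each read independently samples the variant
    allele at frequency `vaf` (a simple binomial model)."""
    return sum(comb(depth, k) * vaf**k * (1 - vaf)**(depth - k)
               for k in range(min_reads, depth + 1))

# A single Sanger trace gives one consensus signal, but a few hundred
# independent NGS reads make a 5% allele readily detectable:
for depth in (50, 100, 250, 500):
    print(f"depth {depth}: P(detect 5% variant) = {p_detect(depth, 0.05):.3f}")
```

The exact probabilities depend on error rates and caller logic, but the trend is the point: detection power for low-frequency variants grows rapidly with depth, which Sanger cannot provide.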
5. An emphasis on liquid biopsies and cell free DNA
Call it what you want: cell-free DNA, circulating tumor DNA or circulating cell-free DNA. It’s perhaps the second biggest trend to hit molecular diagnostics in 2016. The applications are numerous to say the least, including post-cancer serial monitoring and non-invasive diagnostics. The concept of a liquid biopsy allows for a more global genomic picture of metastatic disease, since blood serves as a reservoir for all metastatic sites. In addition, cell-free DNA can be measured quantitatively, presenting the possibility of using it as a biomarker to measure disease burden and response to therapies.
Predictions are that it will be used as a first-line and/or follow-up test, run at least once per year per patient on average, by 2020. Assuming that patients will live for 10–20 years with their disease, this works out at 1.75 million to 3.5 million tests per year.
Worldwide there are over 14 million new cancers per year; if the logic above translates, then the number of tests climbs fast – maybe 150 million liquid biopsies per year. Just make sure you don’t get a tattoo if you’re about to have blood drawn…
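The back-of-the-envelope arithmetic behind that worldwide figure can be written out explicitly. A minimal sketch using the assumptions above (a ten-year survivorship window and one test per patient per year; these are illustrative planning numbers, not market data):

```python
# Back-of-the-envelope liquid-biopsy volume, using the post's assumptions.
new_cancers_per_year = 14_000_000   # worldwide new diagnoses per year
years_living_with_disease = 10      # assumed survivorship window
tests_per_patient_per_year = 1      # first-line / annual follow-up testing

# Steady-state prevalent patient pool, then annual test volume:
prevalent_patients = new_cancers_per_year * years_living_with_disease
tests_per_year = prevalent_patients * tests_per_patient_per_year
print(f"{tests_per_year:,} liquid biopsies per year")  # 140,000,000
```

With a 10-year window the estimate lands at 140 million tests per year; stretching the survivorship or testing-frequency assumptions upward is what pushes it towards the ~150 million figure.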
What’s next? How about cell-free RNA…!
6. Further definition of clinical standards and guidelines
The industry is at a tipping point in determining what should be done in molecular diagnostics; recently the FDA has entered the fray and is out to regulate, as I have previously written here. There are huge downsides to a non-standardized industry, including:
- Inefficient use of time and resources, with high-profile instances costing millions of dollars;
- Damage to the reputations of laboratories and institutions;
- An overall less favorable public opinion of molecular diagnostics.
Needless to say, it can be a significant problem, particularly for assays, because everyone is doing something different and it is difficult to compare results across studies. Laboratory Developed Tests (LDTs) are a prime example, each being a unique test specific to the laboratory using it: how can the performance specifications be the same as the next lab’s? How do you know the result is accurate every time? Questions, questions.
Regardless of the outcome, there is sure to be more regulation, and more guidelines and standards, in the future to improve the industry.
7. A change in the fundamental chemistry of sequencing
Illumina’s sequencing technology, sequencing by synthesis (SBS), is currently the most successful and widely adopted next-generation sequencing (NGS) technology worldwide. However, there has recently been a dramatic rise in the use of other technologies, such as nanopores, electronics, microfluidics, and real-time approaches. (Read this to learn more about the evolution of sequencing.)
What will the chemistry change to? Well, semiconductor sequencing may be a good bet. If this new technology can deliver a robust point-of-care or field-deployable sequencer, there will be a wealth of benefits, including faster sequencing, portability, simplicity and decreased cost. Semiconductor sequencers will also be more robust outside a traditional sequencing lab, as the optics are gone.
How do I know this top-secret information? Well, let’s just say a little bird may have hinted that it’s not as far away as you may think… quite obviously… *cough cough* Project Firefly.
8. Whole genome sequencing is becoming more popular
If you can sequence the whole genome, why not? Currently there are several reasons it remains at a disadvantage.
Whole-genome sequencing (WGS) has advantages over traditional capture-based approaches, especially in sufficiently covering coding exons in GC-rich regions. Genome-wide read coverage furthermore allows reliable detection of copy number variations (CNVs), which can contribute substantially to disease burden.
The advantage of WGS therefore lies not only in the identification of non-coding pathogenic variation; its more complete exomic coverage simply makes it a better whole-exome sequencing (WES). This PCR-free technique is considered the most comprehensive second-tier genomic test, and with sequencing costs declining further and appropriate virtual panels in use, WGS even has the potential to entirely replace WES as the go-to genetic test.
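A ‘virtual panel’ simply means sequencing once, genome-wide, and then restricting the analysis in silico to a clinically relevant gene list. A minimal sketch of the idea; the variant records and panel genes below are made-up examples, not a real assay:

```python
# Sequence the whole genome once; analyze only the genes the clinical
# question calls for. (Illustrative toy data, not real variant calls.)
variants = [
    {"gene": "BRCA1", "pos": 43071077,  "alt": "T"},
    {"gene": "TTN",   "pos": 178528000, "alt": "A"},
    {"gene": "TP53",  "pos": 7675088,   "alt": "C"},
]
hereditary_cancer_panel = {"BRCA1", "BRCA2", "TP53", "MLH1", "MSH2"}

# The 'panel' is just a filter applied after sequencing:
panel_hits = [v for v in variants if v["gene"] in hereditary_cancer_panel]
print([v["gene"] for v in panel_hits])  # ['BRCA1', 'TP53']
```

The appeal is that broadening the analysis later requires no new wet-lab work: widening the gene set re-interrogates data that already exists.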
The PHG Foundation blog seems to think there’s a little way to go yet, with standardized frameworks needed to take this incredible technology to the mass market. Would you agree?
9. A decrease in the price of sequencing a genome to… $50?
In a previous post, I covered how the cost of sequencing a human genome has dropped below the $1,000 mark, courtesy of the Illumina HiSeq X10. However, there are still a few companies out there aiming for something even more aspirational: a $50 genome. Who are these companies?
One such little-known company is Genapsys, aiming for a $50 genome with 99.7% accuracy through electronic DNA sequencing based on its proprietary label-free GENIUS (gene electronic nano integrated ultra-sensitive) technology. Is this feasible in 2016? Probably not. But I’d put a good chunk of money on it happening within the next 10 years. Perhaps paradoxically, though, the advances here must be met with a bit of caution: any decrease in the cost of genomic sequencing could lead to an overall increase in the cost of medical care (you can read a post I wrote on this subject here).
10. The one that should be happening (but isn’t): The ‘golden sequencer’
Finally – if you’ve read this far, well done to you! Let me know if you agree or disagree with any of the top trends we’ll be seeing over the next year or so. My final trend ‘that should be happening but isn’t’ has to go to the ultimate dream of:
The ‘golden sequencer’: a sequencer that eliminates the trade-offs between performance (accuracy, throughput, speed, capabilities, and technology type) and cost.
Perhaps then we’ll truly have a genomic revolution.
Main image credit: omicsmap