Jun 13 2017

The CRISPR Controversy

I suspect that CRISPR is rapidly following the path of DNA in that many people know the abbreviation and what it refers to, but not what the letters stand for. Clustered regularly interspaced short palindromic repeats (CRISPR) is a recently developed technology for making precise gene edits. Such technology carries a great deal of promise for treating or even curing disease, for accelerating research, and for genetic modification more broadly.

However, a recent study has thrown some cold water on the enthusiasm for CRISPR and sparked a mini-controversy. The authors sequenced the entire genomes of two mice that had been treated for blindness with CRISPR-Cas9. They found over 1,500 unintended mutations. That would be bad news for the technology, which is revolutionary partly because it is supposed to be so precise.

For a little background, CRISPR was discovered in bacteria and archaea. It is essentially part of their immune system – locating inserted bits of DNA from viruses and clipping them out. Researchers realized they could use this system to target specific sections of a genome to insert or remove genetic information. The technique is fast, cheap, and convenient.

What this has meant is that genetic modification can now be cheaply available to even small labs. Further, since the technique can be used in living organisms, it could theoretically be used to cure genetic diseases.

Researchers already knew that CRISPR can also create mutations in parts of the genome that have some similarity to the target region, so-called “off-target” mutations. So they routinely monitor at-risk parts of the genome to detect such off-target mutations. This new study, however, sequenced the entire genome, looking for off-target mutations even where they weren’t expected. That is why, the researchers believe, they found so many more off-target mutations than prior studies did. They warn that we must monitor more than just the high-risk parts of the genome for such unintended mutations.

If the results of this study hold up, that would certainly put a damper on the enthusiasm for CRISPR, especially as a therapeutic tool (not as much for research).

However, there has already been a backlash from the research community, many of whom are criticizing the study for methodological flaws and overcalling the results. For example, geneticist Eric Topol told Gizmodo:

“I think the Nature Methods paper was a false alarm on CRISPR induced mutations. Ironically, the methods used were flawed. While we remain aware of such concerns —unintended genomic effects that might occur with editing—that report was off-base.”

Essentially, critics are saying that the researchers counted mutations incorrectly. They did not do a before-and-after comparison, but rather compared the CRISPR-treated mice to other mice in their litter. Those mutations, critics argue, could simply be due to natural variation. The researchers did not prove they were due to CRISPR.

What this study has done is raise a concern, but not really prove that there is a problem with CRISPR. What needs to happen from here is that this preliminary study be followed up with a more rigorous study that will address all the criticisms. It seems like the needed study should have more CRISPR treated animals and do before and after genome sequencing. That would likely settle the debate.

Once we know what the real risk of unintended mutations from CRISPR is, researchers can then address the issue. Perhaps the technique needs to be modified (if possible). In any case, I don’t think concerns over off-target mutations are going to kill CRISPR. It is still clearly a revolutionary research tool. It may be more complicated to use as a therapeutic tool than previously assumed, but I doubt that will be an insurmountable problem.

It is, as I often point out, difficult to predict technological advance, partly because of unexpected hurdles like this. In the 1990s using retroviruses to deliver gene therapy was a promising technology, but this was stalled because of unintended effects from the viruses, specifically resulting in leukemia in some patients. The technology is still useful for research, but as a therapeutic tool these safety concerns have essentially derailed it.

It remains to be seen if CRISPR will go the same route or will overcome these apparent safety hurdles.

20 thoughts on “The CRISPR Controversy”

  1. Sarah says:

I’m holding my judgment until we understand the full story. The lack of a before/after comparison seems pretty damning – the method they chose seems almost guaranteed to make it look like there were far more unintended sequence alterations than there actually were. We’ll see how well it holds up after more thorough replication.

    Still, it’s a good reminder that we can’t get too excited (and too wedded) to any technology, no matter how amazing and promising it seems, until it’s been fully vetted.

  2. BBBlue says:

Doesn’t the introgression backcrossing used to “clean up” the genome of engineered plants greatly reduce the likelihood of off-target effects?

  3. Average Joe says:

In the past few months, I started using CRISPR-Cas9 at work. It’s a useful tool, but the issues described are not especially surprising. Here are my thoughts:

The Cas9 protein seeks out a 20-nucleotide sequence followed by a PAM sequence (e.g., GGG). Ideally, Cas9 should cut only at that sequence and no other. But we now know that is not the case.
To put this in context, the bacteria that Cas9 originates from typically have genomes of about 3-4 million base pairs (give or take a few million).
Human cells have about 3 billion base pairs, and two copies of each chromosome (effectively 6 billion nucleotides). First-order approximation: off-target events are roughly 2,000 times more likely in mammalian cells than in bacteria when using any DNA-targeting technology.
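That first-order approximation can be sketched as rough arithmetic. This is a back-of-envelope illustration only, assuming the genome sizes quoted above and a uniform random base composition (which real genomes don’t have):

```python
# Back-of-envelope numbers from the comment above (not a rigorous model).
bacterial_genome = 3.5e6   # ~3-4 million bp, typical Cas9-source bacterium
human_diploid = 6e9        # ~3 billion bp x 2 chromosome copies

# Ratio of raw sequence "search space": more DNA means more chances
# for near-match off-target sites, all else being equal.
ratio = human_diploid / bacterial_genome
print(round(ratio))        # ~1700, i.e. on the order of 2000x

# Expected number of *exact* chance matches to a 20-nt target plus the
# GG of the PAM in the human genome, assuming uniform base composition:
p_match = (0.25 ** 20) * (0.25 ** 2)
expected_exact = human_diploid * p_match
print(expected_exact)      # ~3e-4: exact duplicates are vanishingly rare,
                           # so off-target cutting comes from near matches.
```

The point of the second calculation is that perfect-match collisions are essentially impossible; the off-target problem arises because Cas9 tolerates mismatches, which the uniform-composition model above deliberately ignores.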

The sequencing technology will also introduce errors, caused by random PCR mistakes or by DNA damaged by oxidation during extraction and purification from the cells. This is a big issue if not corrected for during data analysis!
    www dot neb dot com/tools-and-resources/feature-articles/dna-damage-the-major-cause-of-missing-pieces-from-the-dna-puzzle
    www dot biorxiv dot org/content/early/2016/08/19/070334
    science dot sciencemag dot org/content/355/6326/752

A big piece of the puzzle that is often overlooked in CRISPR-Cas9 news stories is the ability of the cell to repair the DNA damage caused by Cas9. Remember, all Cas9 does is cut DNA; it does not repair DNA. This is important if you want to replace a mutant gene with a corrected one. Viruses often bring along the enzymes needed for integrating their genetic material into the host. CRISPR-Cas9 alone is not enough. If human cells had inherently robust machinery for incorporating DNA into the genome, it would be easy. But that is not the case, and as a result, it is relatively hard to engineer embryos efficiently.
    www dot nature dot com/news/chinese-scientists-genetically-modify-human-embryos-1.17378

    A number of DNA repair mechanisms are in place, one being NHEJ (non-homologous end joining). Using a NHEJ inhibitor may reduce the frequency of viable cells containing mutations arising from NHEJ events.

CRISPR-Cas9 is found in many types of bacteria (not all, though). A number have been screened for robustness. I would suspect that there is a spread of fidelities among the Cas9 machinery of the different species. Testing a bunch would be one straightforward way to find better enzymes.

  4. Average Joe says:

    BBBlue, what you say sounds reasonable. The same thing could be applied to mammalian cells but it would require several generations to “clean-up” the background. Ain’t gonna happen with human embryos, let alone a human adult.

  5. It has been very interesting reading around this… I first heard about it in the recent “Science or Fiction” part of the SGU Podcast. It really does seem like the methods employed have led to severely overcalled results by both the researchers (surprisingly) and the media (unsurprisingly).

    If there ever was a case or cause for “science can test this!”, this is it! Reproduction of these results will certainly lead to clarity.

  6. Scott Young says:

Unless I’m missing something, the off-target mutations should have inserted the restriction site NdeI at all the mutated sites (inserted while making the change in the originally corrected rd1 mouse). However, I don’t see them mention, or show in their supplemental materials, any sign of this occurring. Please correct me if I’m wrong in some way.

  7. Sarah says:

    Why are you replacing the dots in URLs with “dot”, Joe?

  8. BillyJoe7 says:

    …yeah, you wouldn’t think an average joe would think of doing that (let alone know much about Crispr!)

  9. edamame says:

    Sarah — inoculation against moderation oblivion is my guess.

  10. Average Joe says:

    Another thought…

Cas9 has been mutated so that it still binds DNA but its catalytic ability to break DNA is inactivated. By doing this, Cas9 can be used to ‘silence’ a gene by physically sitting on the DNA and preventing RNA transcription. In my opinion, this use may be adopted a lot faster for changing a phenotype. Say, for example, one allele is messed up and causes a dominant phenotype; using an inactive Cas9 to silence the mutant allele while leaving the other copy of the gene alone (one copy on each chromosome) is fairly straightforward. The Cas9 gene and sgRNA would have to be integrated into the genome.

  11. Pete A says:

    Regarding the usage of ” dot ” or “[dot]” to replace the literal “.” in URLs:

    1) It is extraordinarily difficult to discover, let alone to remember, the commenting rules of each website: not least because many websites often change the number of URLs per comment, which when exceeded, automatically transfers our comment to the lengthy queue for moderation. It seems to me that the number of URLs automatically allowed per comment on this website is limited to 3. However, this seems to depend on whether or not the URL contains flagged keywords.

    2) There are times when I think it essential to provide thinly-obfuscated links to my sources. I use a ” dot ” or “[dot]” obfuscation because I strongly object to providing clickbait to either the readers or to the robotic Web crawlers / Web spiders / Internet bots. E.g., I refuse to provide direct URLs to websites which promote fraudulent or ignorant medical practices — especially the websites of quacks — unless the author of the article on which I’m commenting has already provided a direct link to the website(s).

    3) Some websites wrap the URLs provided in their comment sections with HTML tags that instruct Web crawlers to ignore them. I cannot remember, neither do I wish to be required to remember, which websites do this and which websites do not.

  12. BillyJoe7 says:

    …but did anyone go to average joe’s links or was it just too difficult?

  13. Pete A says:

    BillyJoe7,

    The commentator “Average Joe” provided the following two links:

    1) www dot biorxiv dot org/content/early/2016/08/19/070334 (bioRxiv preprint first posted online Aug. 19, 2016), which clearly states: “This article is a preprint and has not been peer-reviewed [what does this mean?]”.

    2) science dot sciencemag dot org/content/355/6326/752 (Science 17 Feb 2017, Vol. 355, Issue 6326, pp. 752-756), which appears to be referring to the above publication, but it excludes the above clearly-stated caveat.

    I shall leave it to the readers to decide. Hopefully, they won’t find it too difficult!

  14. Rogue Medic says:

    One area where large amounts of data can be very helpful is in finding potential problems to examine by higher quality means.

Dr. Eric Topol seems to think that quantity of data will cure the problems of quality of data, but without quality standards (such as the randomized controlled trials he doesn’t like), what is supposedly being measured may not have anything to do with the result.

    Homeopathy and acupuncture are two scams that try to overwhelm us with bad data and claim that there is truth in the cherry picking of large numbers. The truth is that if we need to cherry pick, there is a problem with the data. However, Dr. Topol does not recognize the problems with research that really is flawed.

    He claims that acupuncture, hypnosis and guided imagery lead to better outcomes, and that this is supported by good studies.

    The Future of Medicine by Harriet Hall at Science-Based Medicine.

    https://sciencebasedmedicine.org/the-future-of-medicine/

    Dismissing results we do not like, without examining the results carefully, is not good science or good medicine.

    If a real problem has been identified by this research, no matter how flawed the methodology may be, the only way to find out is to try to replicate the research.

    The use of logical fallacies, by Dr. Topol and others, to dismiss the research is inappropriate and misleading. Misleading people is the primary purpose of those who intentionally use logical fallacies. As a cardiologist, geneticist, and digital medicine researcher (he almost makes Dr. Novella seem like a slacker), Dr. Topol should be smart enough to recognize the logical fallacies he employs.

  15. Average Joe says:

    The articles I came across after searching for reference quickly were written by a biotech company New England Biolabs. I just quickly grabbed them and put them in. I can only guess why they published in a public outlet before the Science publication which is behind a pay wall. I think it’s a ‘white paper’. https://en.m.wikipedia.org/wiki/White_paper

    I buy from NEB and I really like their products. They’re a small company that has a great reputation.

  16. djjosh21 says:

I can’t believe they chose to compare sequences from different mice rather than do the simple before-and-after on the same mice. Seems so obvious. It’s not like doing it their way saved them anything.

  17. BBBlue says:

If you can’t even get your on-target sites correct, how do you think people can trust your data? Some genes are even assigned to the wrong chromosomes! https://pubpeer.com/publications/ADB932A24CD25CD0939F3E05B2EC83

Perhaps someone who knows more about the subject than I do can say whether the above criticism is valid.

  18. Pete A says:

    “[Average Joe] The articles I came across after searching for reference quickly …”

    Precisely!

  19. kevinfolta says:

    The sad part is that this single report gets so much discussion and visibility, yet the massive amount of work already done in plants is ignored. Many use this tech and find that off-target effects are rare, but depend on the species, guide RNA and other factors.

This paper (https://bmcbiotechnol.biomedcentral.com/articles/10.1186/s12896-016-0271-z) shows a comparison between fast-neutron mutagenesis, transgenic additions, and conventional breeding. Even control plants show single-nucleotide changes (10 or 41, depending on genotype). These are spontaneous – mutations happen.

    Figure 1 is pretty clear.

The big issue is that we shouldn’t put some extra barrier in front of gene editing techniques before application. They pose no more risk than traditional breeding, and the fact that alterations are directed and can be back-crossed to “clean up” a genome means that discrete changes are completely possible.

  20. Average Joe says:

‘Tis frustrating, Prof. I do sympathize. I think the media focus is in part because it’s easier for journalists and the public to make the leap from engineering an animal to human gene engineering.
I hear what you are saying about backcross breeding, but it’s not clear in my mind how that can work with a human embryo, or as gene therapy on an infant or adult. What’s your take on that?
