Before testing new drugs in human beings, drug developers must first perform a series of safety tests in animals. Unfortunately, these preclinical toxicology studies are typically protected as trade secrets. In fact, many countries have laws that specifically bar drug regulators from releasing preclinical toxicology data submitted by drug developers.
Unless you take the extreme view that animal experimentation raises no ethical concerns, this represents a terrible waste: animals are sacrificed without researchers ensuring that their sacrifice enriches the bank of human knowledge. Almost as an afterthought, it is worth mentioning that nondisclosure also carries opportunity costs for human beings: it frustrates researchers' efforts to improve their understanding of drug safety, and it results in duplicative expenditure of human and material resources.
It needn’t be this way, and the field of gene transfer shows one modest way toxicology data can be published and pooled. Since their establishment, the National Gene Vector Laboratories at Indiana University have invited gene transfer researchers to submit summary data on toxicology studies to their database (the laboratories were recently replaced by the National Gene Vector Biorepository, NGVB for short). As described by NGVB director Ken Cornetta and project coordinator Lorraine Matheson in Molecular Therapy (April 2009), the database is intended as a resource researchers can cross-reference in their FDA filings, allowing them to avoid duplicative toxicology studies. The authors also envision the database as a resource for grant reviewers.
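To make the idea concrete, here is a purely hypothetical sketch of the kind of summary record such a database might pool; the field names are my own invention, not the NGVB's actual schema.

# Hypothetical summary record for a pooled preclinical toxicology
# database; fields are illustrative, not the NGVB's actual schema.
from dataclasses import dataclass

@dataclass
class ToxicologySummary:
    vector_type: str            # e.g., "AAV2", "lentivirus"
    species: str                # animal model used
    route: str                  # route of administration
    dose_levels: list[str]      # doses tested
    key_findings: str           # brief narrative of observed toxicity
    submitting_institution: str
    regulatory_reference: str   # e.g., an FDA filing that cites the study

record = ToxicologySummary(
    vector_type="AAV2",
    species="mouse",
    route="intravenous",
    dose_levels=["1e11 vg/kg", "1e12 vg/kg"],
    key_findings="No dose-limiting toxicity observed at either dose.",
    submitting_institution="Example University (hypothetical)",
    regulatory_reference="cross-referenced in sponsor IND filings",
)

Even this much structure would let a sponsor or a grant reviewer search by vector and species before commissioning a new animal study.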
The database contains 27 toxicology studies in all. This number seems small when you consider the volume of gene transfer studies pursued since the database was established. The fact that every institution that has contributed to the database is a nonprofit suggests that the private sector has not taken an interest in this worthy resource. One question I have is how many private companies have used data contained in this databank in their FDA filings (this should be easy to determine).
These questions aside, other fields should create similar resources to pool data and open opportunities for data linkage. I would go so far as to say that ethics policies should require, at a minimum, that such summary data be published in a public database. The failure to do so seems a toxic waste for animals, scientists, funders, and patients alike. (photo credit: drp, Waste Not, 2004)
@Manual{stream2009-104,
title = {Toxic Waste?},
journal = {STREAM research},
author = {Jonathan Kimmelman},
address = {Montreal, Canada},
date = {2009-04-27},
url = {https://www.translationalethics.com/2009/04/27/toxic-waste/}
}
MLA
Kimmelman, Jonathan. "Toxic Waste?" Web log post. STREAM research. 27 Apr. 2009. Web. 14 Oct. 2024. <https://www.translationalethics.com/2009/04/27/toxic-waste/>
APA
Kimmelman, J. (2009, April 27). Toxic waste? [Web log post]. Retrieved from https://www.translationalethics.com/2009/04/27/toxic-waste/
Chicago in plastic and balsa. If only animal models were as convincing as the one pictured above from the Museum of Science and Industry.
The August 7 issue of Nature ran a fascinating feature on how many scientists are reassessing the value of animal models used in neurodegenerative preclinical research (“Standard Model,” by Jim Schnabel).
The story centers on the striking failure to translate promising preclinical findings into treatments for various neurodegenerative diseases. In one instance, a highly promising drug, minocycline, actually worsened symptoms in patients with ALS. In other instances, impressive results in mice have not been reproducible. According to the article, a cluster of patient advocacy organizations, including Prize4Life, together with the nonprofit biotechnology company ALS TDI, is spearheading a critical look at standard preclinical models and methodologies.
Much of the report is about the limitations of mouse models. Scientists from the Jackson Laboratory (perhaps the world's largest supplier of research mice) warn that many mouse strains are genetically heterogeneous; others develop new mutations on breeding. Other problems described in the article include infections that spread through mouse colonies, failures to match sex or litter membership between experimental and control groups, and small sample sizes. The result is Metallica-like levels of noise in preclinical studies. Combine this with nonpublication of negative studies, and the result is many false positives.
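The arithmetic behind that last claim is easy to see in a toy simulation (my own illustrative sketch, not anything from the Nature feature): run many underpowered two-arm mouse studies of a drug with no real effect, write up only the "significant" ones, and the published record fills with false positives.

# Illustrative sketch only: underpowered null studies plus selective
# publication yield a literature made entirely of false positives.
import random
import statistics

random.seed(1)

def run_study(n_per_arm, true_effect=0.0):
    """Simulate one two-arm mouse study; return a crude t-like statistic."""
    control = [random.gauss(0.0, 1.0) for _ in range(n_per_arm)]
    treated = [random.gauss(true_effect, 1.0) for _ in range(n_per_arm)]
    pooled_sd = statistics.stdev(control + treated)
    se = pooled_sd * (2.0 / n_per_arm) ** 0.5
    return (statistics.mean(treated) - statistics.mean(control)) / se

# 1,000 small studies of a drug with zero true effect...
studies = [run_study(n_per_arm=6) for _ in range(1000)]
# ...but only "significant" positive results get written up.
published = [t for t in studies if t > 2.0]
print(f"{len(published)} of {len(studies)} null studies read as treatment effects")

Every study that survives the publication filter here is a false positive, even though nothing about the drug changed between studies.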
The article bristles with interesting tidbits. One that struck me is the organizational challenge of changing the culture of model-system use. According to the article, many academic researchers and grant referees have yet to warm to criticisms of the models, and some scientists and advocates are calling for leadership from the NIH. Another striking point, alluded to in the article's closing, is a fragmentation of animal models that mirrors personalized medicine.
“Drugs into bodies.” That’s the mantra of translational research. It is an understandable sentiment, but also a pernicious one if it means more poorly conceived experiments on dying patients. What is needed is a way to make animal models, and the guidelines pertaining to them, as alluring as supermodels. (photo credit: Celikens 2008)
@Manual{stream2008-131,
title = {The Problem with Models},
journal = {STREAM research},
author = {Jonathan Kimmelman},
address = {Montreal, Canada},
date = {2008-10-10},
url = {https://www.translationalethics.com/2008/10/10/the-problem-with-models/}
}
MLA
Kimmelman, Jonathan. "The Problem with Models." Web log post. STREAM research. 10 Oct. 2008. Web. 14 Oct. 2024. <https://www.translationalethics.com/2008/10/10/the-problem-with-models/>
APA
Kimmelman, J. (2008, October 10). The problem with models [Web log post]. Retrieved from https://www.translationalethics.com/2008/10/10/the-problem-with-models/