The value of vaccination
(CNN) -- The U.S. Food and Drug Administration recently approved the use of new vaccines in humans without requiring them to go through the usual human testing.
The change was prompted by concerns that it would be unethical to test new vaccines against biological weapons -- such as anthrax or smallpox -- in humans, since study participants would need to be exposed to the agents.
It would be much too risky to intentionally expose subjects to such diseases because of the chance of contracting them.
Now, the FDA will allow such vaccines to go directly from animal testing to approval for widespread use.
What's at stake and what does it mean for how we conduct human research?
Running risks in research
Vaccines are usually tested in clinical trials, much like the testing of other new medicines.
However, vaccine trials are unique because of the need to expose subjects to diseases the vaccine is designed to combat.
Participants in vaccine trials are exposed to harm so that others -- and sometimes the subjects themselves -- can benefit from the information gained. But the risk must be outweighed by the hoped-for benefits.
And as the risk increases, at least some of the potential benefits should come back to the subjects themselves.
Otherwise, we could justify very risky research that exposed a few people to threat of serious harm by arguing that it would yield very important information that would save many lives in the future.
The problem is that such a calculation effectively trades the well-being of current subjects for the well-being of future patients. While the future patients may find such a trade-off perfectly acceptable, those currently in harm's way may feel differently.
Risk and vaccine trials
In much vaccine research, participants would otherwise have a very high likelihood of being exposed to the disease for which the vaccine is being developed.
For example, subjects for HIV vaccine trials are recruited from sexually promiscuous and intravenous drug-using populations, both of which are at high risk of exposure to HIV.
The research would not intentionally expose subjects to HIV, but recruited subjects would likely be exposed in the course of their normal behavior.
Such an approach wouldn't work in anthrax or smallpox vaccine research, since these and other potential bioterror agents aren't part of anyone's daily existence.
So if research subjects in a vaccine trial can't be "challenged" with a disease like smallpox, how can we know that a vaccine will work?
From animals into humans?
As with other medical testing, animal models can be used to approximate results in humans.
The closer the animals are to humans physiologically, the better the research is at predicting effects in humans. That said, animals are not the same as humans, so the information will never fully predict how humans will respond to the same drug or vaccine.
But from the FDA's perspective the loss in information is outweighed by what is gained by not exposing human research subjects to potentially fatal illnesses.
Once vaccines are approved after animal testing, the human population may eventually be challenged -- when diseases make their way back into our lives either naturally or through biowarfare.
To be prepared for that challenge, we must strike the right balance between research that may put human health at risk and vaccines that may be less safe and effective because they were tested only in animals.
Time will tell whether we're drawing the line in the right place.
"Ethics Matters" Archive
where you'll find other columns from Jeffrey Kahn
on a wide range of bioethics topics.