How a Drug Becomes a Drug
22 August
By Adem Lewis


Drug research and development aims to prevent or treat disease. While the end result, a small pill or capsule, seems so simple, the process for developing a safe and effective new drug is anything but. Development can take as long as 20 years and cost more than one billion dollars. Like a relay race, drug development has several stages and requires a team effort, often involving the government, universities, and pharmaceutical companies, to reach the finish line.

The process begins with basic research. U.S. government agencies, such as the National Institutes of Health, conduct and fund research at laboratories around the world to uncover fundamental knowledge about diseases. This research helps identify potential drug “targets,” usually genes or proteins whose functions are critical to the survival or spread of a disease-causing organism.

Scientists then investigate how they can interfere with these targets to either control or eliminate disease. They may test tens of thousands of chemical or biological compounds to see if they either inhibit or stimulate a given target. Usually only a very small percentage of these compounds will have an effect on a target. These “hits” are then re-screened multiple times to confirm the results and further trim the list of potential drug candidates.
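To give a sense of what this screening step looks like, here is a minimal sketch of threshold-based hit selection. The compound names, activity values, and the 50% inhibition cutoff are illustrative assumptions, not data from any real screen.

```python
# Hypothetical sketch of how screening "hits" might be flagged from
# high-throughput assay data. Compound names, activity values, and the
# 50% inhibition cutoff are illustrative assumptions, not real data.

# Percent inhibition of the target measured for each screened compound.
assay_results = {
    "compound_0001": 3.2,
    "compound_0002": 87.5,   # strong inhibition -> likely hit
    "compound_0003": 12.0,
    "compound_0004": 65.4,   # moderate inhibition -> likely hit
    "compound_0005": 0.8,
}

HIT_THRESHOLD = 50.0  # percent inhibition required to count as a "hit"

def find_hits(results, threshold=HIT_THRESHOLD):
    """Return the compounds whose measured inhibition meets the threshold."""
    return {name: value for name, value in results.items() if value >= threshold}

hits = find_hits(assay_results)
print(f"{len(hits)} of {len(assay_results)} compounds flagged as hits: {sorted(hits)}")
# In practice these hits would be re-screened several times to confirm
# the activity is real before they advance as drug candidates.
```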
Researchers look for compounds that interact only with the desired target. If a compound interacts with unrelated targets, there is a greater chance of adverse side effects. To further minimize this risk, researchers conduct experiments to optimize a compound’s absorption, distribution, and metabolism inside the body. These studies help determine which compounds are safe and effective enough for further testing.
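The article does not name a specific criterion for this optimization work, but one widely cited heuristic for whether a compound’s properties are compatible with oral absorption is Lipinski’s “rule of five.” The sketch below applies that rule to made-up property values, purely as an illustration of the kind of filter researchers might use.

```python
# Illustrative check of Lipinski's "rule of five," a common heuristic for
# whether a compound's properties are compatible with oral absorption.
# The property values below are invented examples, not measured data.

def passes_rule_of_five(mol_weight, log_p, h_bond_donors, h_bond_acceptors):
    """Return True if the compound violates at most one rule-of-five criterion."""
    violations = sum([
        mol_weight > 500,        # molecular weight over 500 daltons
        log_p > 5,               # octanol-water partition coefficient over 5
        h_bond_donors > 5,       # more than 5 hydrogen-bond donors
        h_bond_acceptors > 10,   # more than 10 hydrogen-bond acceptors
    ])
    return violations <= 1

# Hypothetical candidate compound
print(passes_rule_of_five(mol_weight=342.4, log_p=2.1,
                          h_bond_donors=2, h_bond_acceptors=6))  # True
```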
With the results from these “preclinical” studies, drug developers seek permission from the U.S. Food and Drug Administration to begin testing the compounds in people. If granted, scientists embark on a three-phase clinical testing process that will determine whether a drug will be approved for public use. Because clinical testing requires a substantial investment, it is common for pharmaceutical companies to play a larger role during this stage.

Phase I clinical trials test an experimental drug in about 20 to 80 healthy adults to evaluate its safety, determine a safe dosage range, and identify side effects. In Phase II, the drug is given to approximately 100 to 300 people, including those with the target disease, to get an early indication of how the drug is working and to further evaluate its safety. Phase III trials evaluate the drug in a group of 1,000 to 3,000 people who have the disease. These trials aim to confirm the drug’s effectiveness and monitor side effects. They also may compare the drug to commonly used treatments or to no treatment at all. The larger size of Phase III trials allows for results that are more statistically significant, or less likely to have occurred by chance.
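To see why trial size matters, consider a hypothetical comparison: the same 10-percentage-point improvement in response rate (60% on the drug versus 50% on control, both invented numbers) evaluated with a standard two-proportion z-test at a Phase II-sized and a Phase III-sized trial. Only the larger trial yields a result that is unlikely to have arisen by chance.

```python
# Hypothetical illustration of why larger trials give more statistically
# reliable results. The response rates (60% treated vs. 50% control) and
# group sizes are made-up numbers, not data from any real trial.
from math import sqrt, erfc

def two_proportion_p_value(successes_a, n_a, successes_b, n_b):
    """Two-sided p-value for a two-proportion z-test."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    return erfc(abs(z) / sqrt(2))  # probability of a difference this large by chance

# Same 10-point improvement observed at two trial sizes:
print(two_proportion_p_value(30, 50, 25, 50))        # ~0.31 -> could easily be chance
print(two_proportion_p_value(900, 1500, 750, 1500))  # ~4e-8 -> very unlikely to be chance
```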
If a drug successfully completes Phase III testing, a company will seek permission from the FDA to market the compound. Its application contains data from all the preclinical studies and clinical trials, along with information that the FDA will need to make its decision on the drug’s safety, efficacy, and quality. If approved, the drug can be manufactured and sold to prevent or treat the disease in question, but the process doesn’t stop there. The FDA will continue to monitor the drug’s safety and effectiveness for as long as it remains on the market.

