Abstract

The pharmaceutical industry’s main goals are to research, develop and bring to market new medicines that improve the health and general well-being of people worldwide.1 To develop new medicines, pharmaceutical and biopharmaceutical companies must invest on the order of US$1.2 billion (2007 figures) for preclinical and clinical testing alone,2 with an estimated average development time of 10–15 years.3 These costs exclude the development of processes and of the commercial-scale manufacturing facilities that produce the final product. All of this comes about because the industry is (justifiably) highly regulated, and this high level of regulation means tighter control of development and quality practices. Under traditional approaches to drug development and manufacturing, this translates to high costs. Unfortunately, this cost has to be passed on to the consumer (although some governments around the world see it as their responsibility to look after the health care of their citizens and therefore have programmes in place to subsidise the cost of medicines4). Simple calculations show that each year taken off the average development time would save a company close to US$100 million. Even though drug development and manufacture are costly, the overall benefit that the industry has provided to the general public has been quantified in Europe, where it is estimated that modern medicine has added 30 years to the average human lifespan.1

The dilemma faced by the industry has always been how to reduce costs while maintaining a high level of quality. One report estimated that the cost of quality is 25% of the total annual operating budget at a production site, excluding raw materials,5 but the cost of non-compliance can be even higher, owing to product recalls and loss of company reputation. Unlike in other industrial sectors, the uptake of technology to improve efficiency and quality in the pharmaceutical sector has been slow or non-existent, with the sector relying too heavily on “traditional” methods of analysis. To encourage the use of scientific, risk-based approaches to quality, the US FDA published its document entitled Pharmaceutical cGMPs for the 21st Century—A Risk-Based Approach.6 Soon after, the US FDA supported this guidance with the process analytical technology (PAT) framework guidance.7

The question now is, where does near infrared (NIR) spectroscopy fit into all of this? NIR spectroscopy was in use in the pharmaceutical industry well before the PAT initiative was introduced in the early 2000s, but its scope was limited to a few quantitative analyses and the identification of raw materials.8 In the late 1990s and early 2000s, NIR instruments were being adopted, primarily by the larger multinational companies, for raw-material identification to reduce the cost of laboratory testing and to provide assurance that all materials delivered were correctly labelled. NIR technology also offered the means for 100% inspection. Instruments were being moved from the laboratory into warehouses and dispensaries, and even placed beside tablet presses to assess product quality non-destructively at the point of manufacture. In the early 2000s, the available instrumentation was primarily research-grade and not well adapted to process applications (even taking into account the clean and well-conditioned environments of a pharmaceutical manufacturing plant).
NIR practitioners were, however, able to use the available instrumentation in an offline or at-line manner to gain better insights into raw-material quality and to characterise a material for its processability,9 blend uniformities10 …

B. Swarbrick, J. Near Infrared Spectrosc. 22, 153–156 (2014)
