Impact and mitigation of near-infrared absorption in quantitative Transmission Raman Spectroscopy.
Ryckaert A., Corujo MP., Andrews D., De Beer T., Griffen J., Matousek P.
Transmission Raman Spectroscopy is an analytical technique commonly used for content uniformity analysis of manufactured pharmaceutical products, quantifying the active pharmaceutical ingredients and excipients in the final formulation. Such samples are subjected to a variety of physicochemical stressors during manufacture, such as compaction force or thickness variations. These effects can alter the effective optical paths that Raman photons traverse inside the sample and hence impose differing attenuation on emerging Raman photons through near-infrared absorption, resulting in distortions of Raman spectral profiles. These distortions can propagate into quantitative models and manifest themselves as systematic errors in predictions. In this work, we studied the impact of thickness, porosity and compaction force variations on the predictive capability of a quantitative model and propose a basic spectral standardization technique to correct for these effects. We observed an improvement in the statistical metrics used to evaluate the performance of the model built on the whole calibration set (RMSE reduced from 2.5 % to 2.0 %) and near-complete elimination of the bias between the most extreme compaction values (from 8.40 % to ∼0 %), accompanied by a corresponding reduction in residuals (RMSE from 8.63 % to 2.06 %).
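The abstract does not detail the standardization procedure itself. Purely as an illustration, the minimal Python sketch below shows one common form such a correction could take, ratioing each measured spectrum against a reference spectrum acquired under a nominal condition and renormalizing, together with the RMSE metric quoted above. The function names (standardize_spectrum, rmse) and the ratio-based correction are assumptions for illustration, not the authors' published method.

```python
import numpy as np

def standardize_spectrum(raw, reference, eps=1e-12):
    """Hypothetical ratio-based standardization: divide the measured
    spectrum by a reference spectrum (e.g. one acquired at nominal
    thickness/compaction) to compensate for wavelength-dependent NIR
    attenuation, then renormalize to unit total intensity."""
    raw = np.asarray(raw, dtype=float)
    reference = np.asarray(reference, dtype=float)
    corrected = raw / (reference + eps)  # eps guards against division by zero
    return corrected / corrected.sum()

def rmse(predicted, actual):
    """Root-mean-square error between model predictions and reference
    values, in the same units as the concentrations (here % w/w)."""
    predicted = np.asarray(predicted, dtype=float)
    actual = np.asarray(actual, dtype=float)
    return np.sqrt(np.mean((predicted - actual) ** 2))

# Example: RMSE over a small set of predicted vs. nominal API contents (% w/w)
print(rmse([19.1, 20.8, 21.5], [20.0, 20.0, 20.0]))
```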