Draft EDQM General Chapter 5.38: Quality of Data

The European Directorate for the Quality of Medicines (EDQM) has released a draft version of General Chapter 5.38 titled “Quality of Data,” which is now open for public review. This chapter focuses on the importance of maintaining high-quality data throughout its lifecycle in pharmaceutical production, especially in light of recent advancements in technology and data analytics.

Impact of Technological Advancements

Recent advancements in technology, particularly in data collection, storage, and analysis, are profoundly impacting industries like pharmaceuticals. The ability to collect vast amounts of data from numerous sources, often referred to as “big data,” combined with innovations in data analysis and high-performance computational tools, is transforming how quality assessments in pharmaceutical manufacturing are carried out.

This transformation is especially evident in the use of machine learning (ML) and artificial intelligence (AI) models, which rely on data to make inferences and guide algorithmic decision-making (ADM) systems. 

Properly designed models have the potential to complement or even partially replace traditional quality assessment methods. However, the accuracy of these models depends entirely on the quality of the data they are trained on. Poor-quality data can lead to false conclusions and incorrect decisions about the quality of medicines. Therefore, ensuring data is trustworthy, accessible, and usable is essential for maintaining high standards in pharmaceutical production.

Scope 

This chapter focuses on the quality of data throughout its entire lifecycle. It highlights the growing reliance on data in quality control processes, including Process Analytical Technology (PAT), real-time release testing (RTRT), and continuous manufacturing systems. These processes increasingly depend on advanced data analysis tools, such as ML and AI, to monitor and predict critical quality attributes (CQAs) of pharmaceutical products.

The chapter aligns with other European Pharmacopoeia (Ph. Eur.) chapters that address the role of data in pharmaceutical processes, such as:

  • Chapter 5.21: Chemometric methods applied to analytical data
  • Chapter 5.28: Multivariate statistical process control
  • Chapter 5.33: Design of Experiments

By focusing on data quality, this draft chapter outlines the need for common standards in handling and managing data in pharmaceutical processes, providing guidance on achieving high-quality data to support accurate and reliable analyses.

Quality of Data 

The draft defines data as a collection of data elements whose specific meaning is derived from context, including the source and the processes by which the data were generated. Data can be classified into the following types (a brief illustrative sketch follows the list):

  • Primitive types (e.g., Boolean, integers)
  • Composite/structured types (e.g., arrays, tables)
  • Unstructured types (e.g., text documents, natural language)
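
To make this classification concrete, the short sketch below maps the three classes onto familiar programming constructs. It is an illustration only, not taken from the draft chapter, and the variable and field names are hypothetical.

```python
from dataclasses import dataclass

# Primitive types: single values with an atomic meaning.
passed_inspection: bool = True      # Boolean
tablet_count: int = 10_000          # integer

# Composite/structured types: collections with a defined layout,
# e.g. a record describing one batch (hypothetical fields).
@dataclass
class BatchRecord:
    batch_id: str
    assay_percent: float
    dissolution_minutes: float

batch = BatchRecord(batch_id="B-001", assay_percent=99.2, dissolution_minutes=28.5)

# Unstructured types: free text whose meaning requires interpretation,
# e.g. an operator's comment in a batch record.
operator_note: str = "Slight discolouration observed on press 3; line stopped for inspection."
```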

Key aspects of data quality include the following (a minimal programmatic check of a few of these dimensions is sketched after the list):

  • Accuracy: Data must accurately reflect the real-world conditions it represents. In the context of machine learning, accuracy is critical because it affects the reliability of predictions and decisions made by the model.
  • Data Bias: Bias can occur when data is collected or processed in ways that favor certain outcomes. Reducing bias is essential for making objective, trustworthy decisions.
  • Completeness: All necessary data must be available without missing values to ensure reliable analysis.
  • Consistency: Data must be consistent across different sources and datasets, with clearly defined formats and standards. Consistency ensures data can be integrated from different systems without causing errors.
  • Timeliness: Data should be current and reflect real-time conditions when required, especially in scenarios involving streaming data for real-time quality monitoring.
  • Reproducibility: The ability to independently verify data is a core principle of scientific integrity. Proper documentation of the data collection process ensures results can be repeated and verified by others.
  • Veracity: Veracity refers to the truthfulness of data. Any inaccuracies in data could propagate through the system and lead to erroneous conclusions, making veracity crucial for algorithmic decision-making.
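
As a rough illustration of how some of these dimensions might be checked programmatically, the sketch below flags missing values (completeness), duplicate batch records (consistency), and stale data (timeliness). The dataset, column names, and thresholds are hypothetical and are not prescribed by the draft chapter.

```python
import pandas as pd

# Hypothetical batch dataset; column names are illustrative only.
records = pd.DataFrame({
    "batch_id": ["B-001", "B-002", "B-003", "B-003"],
    "assay_percent": [99.2, None, 101.3, 101.3],
    "timestamp": pd.to_datetime(["2024-05-01", "2024-05-02", "2024-05-03", "2024-05-03"]),
})

# Completeness: missing values undermine later analysis and model training.
missing_values = records["assay_percent"].isna().sum()

# Consistency: duplicated batch identifiers suggest conflicting or redundant sources.
duplicate_rows = records.duplicated(subset="batch_id").sum()

# Timeliness: how old is the newest record relative to "now"?
age_of_newest = pd.Timestamp.now() - records["timestamp"].max()

print(f"Missing assay values: {missing_values}")
print(f"Duplicate batch rows: {duplicate_rows}")
print(f"Age of newest record: {age_of_newest}")
```

In practice, such checks would be defined together with subject matter experts and documented as part of the data quality indicators discussed below.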

Data Management

The Quality of Data chapter emphasizes the need for effective data management practices, such as the Extract-Transform-Load (ETL) process, through which data is collected, cleaned, and stored in a form suitable for further analysis.
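
For readers unfamiliar with ETL, the minimal sketch below shows the general pattern: extract raw records, transform them into a clean and consistent form, and load them into a store for analysis. The file name, column names, and cleaning rules are hypothetical, not requirements of the chapter.

```python
import sqlite3
import pandas as pd

def extract(path: str) -> pd.DataFrame:
    """Extract: read raw process data, e.g. an export from a sensor system or LIMS."""
    return pd.read_csv(path)

def transform(raw: pd.DataFrame) -> pd.DataFrame:
    """Transform: clean and standardise so downstream analyses see consistent data."""
    df = raw.copy()
    df.columns = [c.strip().lower() for c in df.columns]          # consistent column naming
    df = df.drop_duplicates()                                     # remove duplicate rows
    df["timestamp"] = pd.to_datetime(df["timestamp"], utc=True)   # consistent time format
    return df.dropna(subset=["batch_id"])                         # drop rows missing the key field

def load(df: pd.DataFrame, db_path: str) -> None:
    """Load: store the cleaned data for later analysis."""
    with sqlite3.connect(db_path) as conn:
        df.to_sql("process_data", conn, if_exists="append", index=False)

# Hypothetical usage:
# load(transform(extract("raw_sensor_export.csv")), "quality_data.db")
```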

Additionally, data governance practices are highlighted as essential to ensuring compliance with regulatory standards like Good Manufacturing Practices (GMP).
Role of Subject Matter Experts (SMEs)

The involvement of Subject Matter Experts (SMEs) is also discussed, emphasizing their role in defining data quality indicators, validating models, and addressing ethical considerations such as data biases. SMEs are key to ensuring that data-driven processes are both effective and compliant with regulatory expectations.

Call for Industry Feedback

EDQM invites industry stakeholders to review the draft chapter and provide feedback. The input received will be used to finalize this important guidance, which aims to help pharmaceutical manufacturers leverage digital transformation while maintaining high-quality standards in data management.

Have thoughts on EDQM’s new approach to data quality? Share your feedback and be part of shaping the future of data integrity in pharmaceuticals!