Compliance Blog


  1. Warning Letter: Data could be deleted, audit trails were not enabled, and audit trails were not reviewed for laboratory data computer systems. (ucm619450)


     Your firm failed to exercise appropriate controls over computer or related systems to assure that only authorized personnel institute changes in master production and control records, or other records (21 CFR 211.68(b)).

     The electronic data systems that you use to generate drug product results lack appropriate controls. There is no assurance that your systems have the appropriate controls to prevent deletion of data and record all modifications to data. For example, electronic data files generated from your [redact] system, used for identity testing of drug products, could be deleted. In addition, the [redact] that controls your high-performance liquid chromatography and gas chromatography systems for testing the actives in OTC drug products did not have all appropriate audit trails enabled to record significant changes.

    View the original warning letter



  2. FDA Data Integrity and Compliance With CGMP Guidance for Industry

    What is the purpose of the draft guidance?

    In April 2016, the FDA released draft guidance clarifying the role of data integrity in current good manufacturing practice (CGMP), in response to the growing number of data integrity violations cited during FDA inspections. The guidance defines data integrity as the “completeness, consistency, and accuracy of data” and states that maintaining data integrity requires data to be attributable, legible, contemporaneous, original, and accurate (ALCOA). The guidance is arranged in a question-and-answer format and helps to clarify terms and requirements found in other FDA guidances.

    Defined terminology:

    First, this guidance defines commonly used terminology. Metadata, most simply defined as “data about data”, is what gives data the context it needs in order to reconstruct the entire CGMP activity. It must be maintained throughout the entirety of the retention period along with associated data and must be secure and traceable. Metadata includes system audit trails, defined for the sake of this guidance as “a secure, computer-generated time-stamped electronic record that allows for reconstruction of the course of events relating to the creation, modification, or deletion of an electronic record”. Audit trails can either track changes to data (such as testing results) or track actions taken at the record or system level (such as accessing the system). “Backup” is used to refer to data copies maintained securely throughout the record retention period with all associated metadata in a form identical with or compatible with the original format. It does not refer to temporarily created backup copies.
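    To make the audit trail definition concrete, the sketch below shows one way an append-only audit trail might be structured; the class and field names are our own illustration, not terms from the guidance.

from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Optional

@dataclass(frozen=True)
class AuditEntry:
    """One immutable audit trail entry: who did what, when, and why."""
    user_id: str             # attributable: unique account of the person acting
    action: str              # e.g. "create", "modify", "delete"
    record_id: str           # which electronic record was affected
    old_value: Optional[str]
    new_value: Optional[str]
    reason: str              # reason for change, entered by the user
    timestamp: str = field(  # contemporaneous, computer-generated time stamp
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

class AuditTrail:
    """Append-only log; entries are never edited or removed by users."""
    def __init__(self) -> None:
        self._entries: List[AuditEntry] = []

    def record(self, entry: AuditEntry) -> None:
        self._entries.append(entry)

    def history(self, record_id: str) -> List[AuditEntry]:
        """Reconstruct the course of events relating to one record."""
        return [e for e in self._entries if e.record_id == record_id]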

    What controls are required for computer systems?

    Computer systems must have controls in place to attribute data to a specific user. If it is not possible for each operator to have their own login account, then other controls and documentation must be implemented to ensure traceability to an individual. Electronic signatures satisfying Part 11 requirements must have the appropriate controls to link the signature with the individual and the record. System administrators should be independent from record generators and should maintain a list of all authorized system personnel and their security privileges. If this is not possible, there should be a second reviewer for settings and content. Controls should be in place to restrict who can change a record or enter data and who has the ability to edit specifications, testing methods, and parameters. All workflows for a CGMP system must be validated for their intended use. Audit trails should be routinely reviewed based on the risk of the system, whenever the rest of the record is reviewed, and before final approval. All data should be reviewed if it is used in the CGMP decision-making process unless there is a justified, valid reason for excluding it.
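    The access controls described above can be pictured as a simple role-to-permission mapping; this is only an illustrative sketch, and the roles and actions are assumptions rather than requirements spelled out in the guidance.

# Illustrative role-based check: analysts may enter results, only the
# quality unit may change specifications, and the system administrator
# is independent of record generation.
ROLE_PERMISSIONS = {
    "analyst":       {"enter_result"},
    "quality_unit":  {"enter_result", "edit_specification", "approve_record"},
    "administrator": {"manage_accounts"},   # no rights over CGMP data itself
}

def authorize(role: str, action: str) -> None:
    """Raise if the user's role does not grant the requested action."""
    allowed = ROLE_PERMISSIONS.get(role, set())
    if action not in allowed:
        raise PermissionError(f"role '{role}' may not perform '{action}'")

authorize("analyst", "enter_result")           # permitted
try:
    authorize("analyst", "edit_specification")  # rejected
except PermissionError as err:
    print(err)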

    Static records are defined as fixed-data documents, while dynamic records are defined as allowing user interaction with the record. It is acceptable to retain static records or printouts of original electronic records if they are complete copies and they are not from a dynamic record. CGMP records must be saved at the time of performance. Upon completion of the record they must be sent to storage or controlled in such a way that they cannot be modified. Do not store electronic data in a temporary manner that allows for data manipulation before the permanent record is created. Instead, consider a Laboratory Information Management System (LIMS) or Electronic Batch Record (EBR) system to automatically save data after each individual entry. When storing results, all parts of the results must be saved including original data before it is reprocessed, all raw data, graphs, charts, and spectra.
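    As a rough illustration of saving data at the time of performance rather than holding it in temporary memory, the sketch below appends each result to a permanent file the moment it is generated; the file format and function names are assumptions for illustration only.

import json
from datetime import datetime, timezone

def record_result(path: str, sample_id: str, value: float) -> None:
    """Append one result to a permanent file as soon as it is generated,
    so nothing sits in temporary memory where it could be edited first."""
    entry = {
        "sample_id": sample_id,
        "value": value,
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")
        f.flush()  # push the entry out immediately rather than buffering it

record_result("assay_results.jsonl", "LOT-001-S1", 99.2)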

    Preventing data integrity issues:

    Testing into compliance, usually defined as testing different samples until a desired passing outcome is obtained, is prohibited. This includes scenarios such as using actual samples during a “test” run to make sure they will pass in the “actual” run. Instead, a system suitability test should be performed to determine system precision; if an actual sample is used, it should be selected from a different batch than the samples being tested. The chosen test sample should be justified and tested according to written procedures. To prevent data integrity issues, personnel must be trained to detect them as part of routine CGMP training programs. If personnel suspect data integrity issues, they can report them to the FDA at DrugInfo@fda.hhs.gov with “CGMP data integrity” in the subject line. Any internal investigation of a potential quality issue must determine the root cause and the effects on patient safety, product quality, and data reliability, and must ensure that appropriate corrective actions are taken.

    When the FDA performs an investigation, all records, including electronic records, must be made available for inspection, review, and copying by FDA investigators. If the FDA identifies any data integrity problems, it is recommended that companies hire a third-party auditor, determine the scope of the identified problem, determine and implement a corrective action plan, and remove individuals responsible for the problems from CGMP positions. The FDA may also conduct an inspection to determine whether or not the CGMP data integrity violations have been remedied.

    Written based on the FDA’s Data Integrity and Compliance With CGMP Guidance for Industry



  3. PIC/S Good Practices for Data Management and Integrity in Regulated GMP/GDP Environments

    What is this draft guidance and why is it important?

    The Pharmaceutical Inspection Co-operation Scheme (PIC/S) is an instrument for improving cooperation between health authorities in different countries; its members include the FDA, MHRA, HPRA, and many other regulatory bodies. In August 2016, PIC/S released a draft guidance intended to give inspectors insight into conducting quality inspections and interpreting current GMP/GDP requirements. The draft is the result of collaboration between different health regulatory agencies and combines many of the thoughts and requirements advocated by those agencies, with an emphasis on methods for ensuring data integrity, the importance of an open quality culture, and the different employee roles involved in maintaining good data integrity principles. The document is particularly useful for medical companies because it outlines the areas that inspectors have been trained to check. While the guidance addresses data integrity for both manual and computerized systems, this post focuses only on computerized systems.

    What is a data governance system?

    Throughout the entirety of the data lifecycle, defined as the period from when data is generated until it is discarded at the end of the retention period, data must be maintained using a documented data governance program. Data governance is defined in this guidance as “The sum total of arrangements to ensure that data, irrespective of the format in which it is generated, is recorded, processed, retained and used to ensure a complete, consistent and accurate record throughout the data lifecycle”. A data governance program consists of organizational controls (such as procedures for retaining completed paper records, training employees, self-auditing, and scheduled data verifications) and technical controls (such as computerized system controls and automation). The purpose of a data governance plan is to provide an acceptable level of data control, with documented rationale, where the level of control is determined by the associated data integrity risk.

    When performing a risk assessment, this guidance recommends taking into account the process complexity and consistency, how data is stored and generated, the level of automation and level of human interaction, and how open to interpretation the results of the process are. When determining risk as it pertains to data integrity, take into account the impact a data integrity issue will have on product quality and decision making and how easy or difficult it is to change data and detect changes to data.
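    One way to picture this risk assessment is a simple score built from the factors listed above; the weights and factor names below are illustrative only and are not values prescribed by the guidance.

def data_integrity_risk(complexity: int, automation: int,
                        human_interaction: int, ease_of_change: int,
                        detectability: int, impact: int) -> int:
    """Toy risk score on a 1-5 scale per factor: higher means more data
    integrity risk and therefore a higher required level of control.
    Low automation, heavy human interaction, easy undetected changes,
    and high impact on product quality all push the score up."""
    return ((6 - automation) + human_interaction + complexity
            + ease_of_change + (6 - detectability) + 2 * impact)

# A manual, high-impact process scores higher than an automated,
# easily audited one.
print(data_integrity_risk(complexity=4, automation=1, human_interaction=5,
                          ease_of_change=4, detectability=2, impact=5))
print(data_integrity_risk(complexity=2, automation=5, human_interaction=1,
                          ease_of_change=1, detectability=5, impact=2))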

    How does a company encourage its employees to maintain data integrity?

    Several main user roles and their importance in maintaining data integrity are described, with a focus on Administrators and Managers. Administrator access must be granted in a controlled manner, and only to employees who are not invested in the outcome of the generated data. Management needs to understand that it is legally and morally obligated to find data integrity issues, and it is instructed to foster a climate that encourages maintaining data integrity by staying involved, setting achievable expectations along with the resources necessary to meet them, ensuring accountability, and implementing consequences and rewards that are fair. By clearly communicating expectations and creating the right organizational atmosphere, the incentive to falsify or alter data is reduced.

    The quality culture of a company can have a huge influence on data integrity. Companies should strive to be transparent and “open”, where the hierarchy can be challenged by subordinates and any issues discovered by employees will be reported without fear of retribution. A “closed” company culture where employees cannot communicate undesirable information without fear of consequences often leads to an increase in tendencies to amend, delete, or alter data to meet expectations. A code of ethics should be established and communicated to all employees that includes data integrity policies and allows for information flow between personnel at all levels. There should also be a system in place so employees can inform management of any breaches to this code.

    What documentation and procedures should be enacted to maintain data integrity?

    Companies need a Validation Summary Report for each computerized system that includes:

    • the change control history, including system configurations,
    • a list of all users and their individual privileges, including the identity of the system Administrator,
    • the schedule for reviewing the audit trail and system log,
    • how a system is modified for a user,
    • how often the system is backed up and how the system is recovered from this backup,
    • how and where data is archived, and
    • a statement that all data and relevant metadata are stored and that users are unable to alter audit trails.

    Computer systems should have documented risk assessments and validation documentation with tests that challenge areas where data integrity is at risk, such as data transfer interfaces between systems. To ensure that systems maintain their validated status, they should be periodically evaluated, and any changes, deviations, upgrades, and maintenance should be documented. There should also be a list of all computerized systems including the name, location, purpose, and primary function of each system. Assessments of each system and its data should also be performed to determine risks, such as whether or not the system has a direct impact on GMP/GDP.
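    The required list of computerized systems lends itself to a simple data structure; the sketch below shows one possible inventory entry (the field names are our own, not mandated by the guidance).

from dataclasses import dataclass

@dataclass
class SystemInventoryEntry:
    """One row in the list of computerized systems described above."""
    name: str
    location: str
    purpose: str
    primary_function: str
    gmp_gdp_impact: bool      # flag from the documented risk assessment
    validation_report: str    # reference to the Validation Summary Report

hplc = SystemInventoryEntry(
    name="HPLC-01 data system",
    location="QC laboratory, building 2",
    purpose="Assay and impurity testing of finished product",
    primary_function="Chromatographic data acquisition and processing",
    gmp_gdp_impact=True,
    validation_report="VSR-2016-014",
)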

    There should be established procedures for restoring archived data. Backups should be stored in a physically separate location in case of disaster, and they must remain readable for the entirety of the data retention period. All records of data generated by the system must be accessible for the retention period and able to be printed in a legible form. If a new system is installed or an existing system is updated, the old data must still be readable; if this is not possible with the new system, retain the old software on a different computer and keep a copy of the old system (for example, on an external drive or disk).

    “The review of data-related audit trails should be part of the routine data review within the approval process”. This means that companies should review audit trails when records are being reviewed for approval. Audit trails should be reviewed more frequently for higher-risk data, and evidence of each review should be documented. Audit trails need to include at least the name of the user making the change, what the change is and the reason for it, the time and date, and the name of the individual who approved the change. Once the system audit trail is validated for production use, it should be locked at all times. Remember: “Failure to adequately review audit trails may allow manipulated or erroneous data to be inadvertently accepted by the Quality Unit and/or Authorized Person” – reviewing audit trails is an effective way to catch data integrity issues. Companies should aim to acquire software that includes electronic audit trails capturing system events and the information listed above. Hybrid systems (paper and electronic), while permitted, must provide the equivalent of an electronic audit trail.
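    As an illustration of the kind of routine audit trail review described here, the sketch below scans entries for the required fields and flags changes approved by the same person who made them; the entry layout and field names are hypothetical.

REQUIRED_FIELDS = {"user", "change", "reason", "timestamp", "approved_by"}

def review_audit_trail(entries):
    """Return findings a reviewer would document: entries missing required
    fields, and changes approved by the same person who made them."""
    findings = []
    for i, entry in enumerate(entries):
        missing = REQUIRED_FIELDS - entry.keys()
        if missing:
            findings.append(f"entry {i}: missing fields {sorted(missing)}")
        elif entry["user"] == entry["approved_by"]:
            findings.append(f"entry {i}: change made and approved by {entry['user']}")
    return findings

entries = [
    {"user": "jdoe", "change": "result 98.7 -> 99.1", "reason": "transcription error",
     "timestamp": "2016-09-01T10:14:00Z", "approved_by": "qa_smith"},
    {"user": "jdoe", "change": "result 97.0 -> 99.5", "reason": "reintegration",
     "timestamp": "2016-09-01T11:02:00Z", "approved_by": "jdoe"},
]
print(review_audit_trail(entries))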

    Written based on the PIC/S Good Practices for Data Management and Integrity in Regulated GMP/GDP Environments



  4. MHRA GxP Data Integrity Definitions and Guidance for Industry

    What is Data Integrity?

    In July 2016, the Medicines and Healthcare products Regulatory Agency (MHRA) released its draft guidance on data integrity. The guidance covers the importance of and tips for designing systems that encourage good data integrity practices. It defines popular data integrity terms and presents their applications in common data integrity topics.

    The MHRA defines data as facts and statistics collected together for reference or analysis. Data should be attributable to the person generating the data, legible, permanent, contemporaneous, original, and accurate (ALCOA). Data can be found in different formats throughout its lifecycle. Original or raw data is stored in the file or format that was originally generated. Raw data must permit the full reconstruction of activities resulting in the generation of the data. A true copy of data is a copy of the original information that has been verified as an exact copy with all of the same attributes and information. If the record is electronic, a true copy can be retained in a different file format than the original record if it maintains the same static or dynamic nature of the original record.

    How are audit trails implemented?

    All data must have associated data that provide context and meaning, or metadata. Metadata is typically captured by an audit trail. An audit trail must permit the reconstruction of activities and associate all changes to data with the individual who made them. There must be a time stamp and a reason for change. Only a system administrator should have the ability to switch an audit trail off, and even then the audit trail should never be turned off. The system administrator role should be limited to as few people as possible. Each user should have access rights appropriate to their role and training. Every individual should have a unique login and password that is never shared. In some cases, computer systems have limitations where the system design does not support individual user access.

    It is proposed that GxP facilities should upgrade to an audit trailed system if their paper based or hybrid systems cannot demonstrate equivalence to a fully automated audit trail. It is also suggested that GxP facilities upgrade to computer systems that allow for each user to have an individual login with audit trails. It is recommended that both of these changes be made by the end of 2017.

    How is a data governance plan used to maintain data integrity?

    The MHRA defines data integrity as the extent to which all data are complete, consistent, and accurate throughout the data lifecycle. Each GxP facility must have a data governance plan to ensure that all data is recorded, processed, retained, and used to ensure a complete, consistent, and accurate record throughout the data lifecycle. A data governance plan should address ownership of data throughout its lifecycle, especially when any third party is involved. Staff should be trained on the importance of data integrity principles and on how to create a work environment with a blame-free culture. The work environment should facilitate visibility of errors, omissions, and any abnormal results.

    As part of the data governance plan, the roles of senior management in maintaining data integrity must be defined. Senior management is responsible for implementing systems and procedures that minimize the risk to data integrity. Good techniques and principles for risk management can be found in ICH Q9 (Quality Risk Management). In addition to securing data, management should also ensure that data are readily available on request. Data accessibility is critical especially where regulatory bodies request access to data for auditing purposes.

    How is a system designed to properly store data?

    Data migration and data transfer processes must be designed and validated to ensure data integrity. Data recording techniques should ensure that the accuracy, completeness, content, and meaning of data are captured and retained for their intended use. Any data excluded from results must still be retained in a format that can be reviewed easily, and the reasoning behind the exclusion must be documented.

    As part of the data governance plan there must be defined procedures for data review and approval. Both data and associated metadata/audit trails must be reviewed. Review must be of the original data or a true copy of the data. All data review should be documented. Where electronic signatures are used there must be associated metadata. An inserted image of a signature or a footnote stating that a document has been reviewed is insufficient.

    Archiving data means protecting it for long-term storage. Archived data should be protected so that it cannot be altered or deleted without detection, and it should remain recoverable and readable, along with its associated metadata, throughout the retention period. The data retention period is defined as part of the risk assessment. Data backup is used for disaster recovery; unlike archived data, backed-up data remains dynamic. Both archive and backup processes should be validated and periodically tested. Computerized systems should also be validated for their intended purpose and should comply with the applicable regulatory requirements and associated guidance.
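    The periodic testing of archive and backup processes can be pictured as restoring a copy and confirming it still matches the original; the sketch below illustrates the idea with a simple checksum comparison (paths and function names are assumptions).

import hashlib
import shutil
from pathlib import Path

def sha256(path: Path) -> str:
    """Checksum used to confirm the restored copy matches the original."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def verify_restore(original: Path, archive_dir: Path, restore_dir: Path) -> bool:
    """Archive a record, restore it, and confirm it is still readable and
    byte-for-byte identical, as a periodic archive/backup test might do."""
    archive_dir.mkdir(parents=True, exist_ok=True)
    restore_dir.mkdir(parents=True, exist_ok=True)
    archived = archive_dir / original.name
    restored = restore_dir / original.name
    shutil.copy2(original, archived)   # stand-in for the archive step
    shutil.copy2(archived, restored)   # stand-in for the restore step
    return sha256(original) == sha256(restored)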

    Written based on the MHRA GxP Data Integrity Definitions and Guidance for Industry



  5. Review of Good Data Integrity Principles


    What is Data Integrity?

    Data integrity is the idea of maintaining and ensuring the accuracy and consistency of data over its
    lifecycle. This includes good data management practices, such as preventing data from being altered
    each time it is copied or moved. Data integrity applies to both paper records and electronic records.
    Processes and procedures are put in place for companies to maintain data integrity during normal
    operation [1].

    Data Integrity Importance

    Data in its final state is the driving force behind industry decision making. Raw data must be changed
    and processed to reach a more usable format. Data integrity ensures that this data is attributable,
    legible, contemporaneous, original, and accurate (ALCOA). Data can easily become compromised if
    proper measures are not taken to ensure data safety. Errors with data integrity commonly arise from
    human error, noncompliant operating procedures, data transfers, software defects, compromised
    hardware, and physical compromise to devices. Maintaining data integrity is a necessary part of the
    industry’s responsibility to ensure the safety, effectiveness, and quality of their products [1].

    To view the complete article click here.



  6. ALCOA Data Integrity


    Growing Need for Good Data and Record Management

    Organizations have been utilizing validated computerized systems for years. However, in recent years, regulators have found that these organizations are falling short when it comes to maintaining adequate data integrity within their computerized systems. In response to the increasing number of observations related to data integrity made during inspections, a recommendation for the development of new guidance for good data management was put forth at an informal consultation held by the WHO (World Health Organization) in April 2014. Shortly after, the WHO Expert Committee on Specifications for Pharmaceutical Preparations received documents from PQT-Inspections proposing the outline of a new guidance. The goal of this was to consolidate and improve upon the existing principles ensuring data integrity from current good practices and guidance documents.

    Data integrity & what it means to have ALCOA data

    Attributable data must be recorded so that it can be linked to the unique individual who produced it. Every piece of data entered into the record must be capable of being traced back to the time it was taken and to the individual who entered it.

    Legible data must be traceable, permanent, readable, and understandable by anyone reviewing the record. This is expanded to include any metadata pertaining to the record.

    Contemporaneous data are data that are entered into the record at the time they are generated.

    Original data, or the source data, is the record medium in which the data was first recorded. An original data record should include the first data entered and all successive data entries required to fully detail the scope of the project.

    Accurate data are correct, truthful, complete, valid, and reliable. Controls put in place to assure the accuracy of data should be implemented on a risk-based structure.

    Meeting ALCOA expectations on electronic records

    Attributable: The main controls needed to maintain an attributable electronic record are the use of secure and unique user logons and electronic signatures. Using generic login IDs or sharing credentials should always be avoided. Unique user logons allow individuals to be linked to the creation, modification, or deletion of data within the record. For a signature to be legally binding, there should be a secure link between the person signing and the resulting signature. The use of digital images of handwritten signatures is generally not considered a secure method for electronically signing documents; these images lose their credibility when not stored in a secure location or when they appear on documents that can easily be copied by others.
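    The “secure link between the person signing and the resulting signature” can be illustrated by binding the signer's identity to the content of the record, so that any change to the record invalidates the signature. The sketch below is a simplified illustration of that idea, not a complete Part 11 e-signature implementation.

import hashlib
import hmac
from datetime import datetime, timezone

SIGNING_KEY = b"secret held by the system, never shared between users"

def sign_record(user_id: str, record_text: str) -> dict:
    """Bind the signature to both the signer and the record content."""
    digest = hmac.new(SIGNING_KEY,
                      f"{user_id}|{record_text}".encode("utf-8"),
                      hashlib.sha256).hexdigest()
    return {"signed_by": user_id,
            "signed_at": datetime.now(timezone.utc).isoformat(),
            "signature": digest}

def signature_is_valid(signature: dict, record_text: str) -> bool:
    """Re-derive the digest; any change to the record breaks the link."""
    expected = hmac.new(SIGNING_KEY,
                        f"{signature['signed_by']}|{record_text}".encode("utf-8"),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature["signature"])

sig = sign_record("jdoe", "Batch 1234 released")
print(signature_is_valid(sig, "Batch 1234 released"))   # True
print(signature_is_valid(sig, "Batch 1234 rejected"))   # False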

    Legible: In order for an electronic record to be considered legible, traceable, and permanent, it must rely on controls such as written SOPs and a system design that saves electronic data concurrently with the execution of the activity. This is best done by prohibiting the creation or manipulation of data in temporary memory and by immediately committing data to permanent memory before moving on. Secure, time-stamped audit trails should be used to record operator actions. The system configuration should limit the enhanced security rights of users, such as turning off the audit trail or overwriting data. These administrative rights should be reserved (whenever possible) for individuals who are independent of those responsible for the content of the electronic records. Improperly overwriting data or manipulating the audit trail impairs the legibility of the data by obscuring the original value of the record. Preserving the original value, as an audit trail does, is equivalent to the use of single-line cross-outs in paper records to denote changes to the data: the data are changed, but the original values must still be legible beneath the cross-out mark.

    Contemporaneous:
    Contemporaneous recording of data also relies on the controls of written SOPs and settings that immediately commit data to permanent memory. For the data to be considered contemporaneous, the record must also have a secure time stamp system that cannot be altered by users. Time and date stamps should be synchronized across all systems involved in the GxP activity. These controls should apply to both the workstation operating system and any relevant software application used. Data is not considered contemporaneous when recorded on an unofficial document and then later entered into the official electronic record.

    Original: Original electronic records (or certified true copies) should undergo review and approval procedures. These reviews should describe the review method itself as well as any changes made to the information in the original records. These include changes documented in audit trails or any other relevant metadata. Written procedures should define the frequency, roles and responsibilities, and approach to the review of metadata. A risk-based approach to the scope of these procedures is recommended. Once reviewed, electronic data sets should be electronically signed to document their approval.

    Controls should also be put in place to guarantee the retention of original electronic documents as well as possible. The original record should be routinely backed up and stored separately in a safe location in case the original is lost. Secure storage areas should have a designated electronic archivist who is independent of the GxP operation. Tests should be carried out periodically to verify that the copy can be retrieved from secure storage and used.

    Accurate:
    Data accuracy should be maintained through a quality management system that is risk-based and appropriate to the scope of the operation. Routine calibration and equipment maintenance should be performed. Computer systems that generate, maintain, distribute or archive electronic records should be validated. Entry of critical data such as high priority formulas for spreadsheets should be verified by two authorized individuals. Once verified, critical data fields should be locked to prevent modification by any unauthorized individuals.
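    A simple way to picture the two-person verification and locking of critical data described above is sketched below; the workflow and names are illustrative only.

class CriticalField:
    """A critical value (e.g. a spreadsheet formula) that must be entered,
    independently verified by a second authorized person, and then locked
    against further modification."""
    def __init__(self, value: str, entered_by: str) -> None:
        self.value = value
        self.entered_by = entered_by
        self.verified_by = None
        self.locked = False

    def verify(self, reviewer: str) -> None:
        if reviewer == self.entered_by:
            raise ValueError("verifier must be a second, independent person")
        self.verified_by = reviewer
        self.locked = True            # lock once verified

    def update(self, new_value: str, user: str) -> None:
        if self.locked:
            raise PermissionError("field is locked after verification")
        self.value = new_value

formula = CriticalField("=AVERAGE(B2:B10)", entered_by="analyst_1")
formula.verify("analyst_2")
# formula.update("=AVERAGE(B2:B11)", "analyst_1")  # would raise PermissionError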

    Written based on the WHO Annex 5 Draft Guidance on Data Integrity



  7. Improved Reporting Efficiency with FastVal


    Document it. Write it down. Record it or it didn’t happen. The mantra of regulated industry. In pharmaceuticals, every action has potential health outcomes for real patients. Despite regulatory promotion of modern technologies, such as the Government Paperwork Elimination Act of 1998 and 21 CFR Part 11, many companies continue to suffer from outdated procedures. Quality documentation, however, need not hinder efficiency. Many pitfalls of traditional documentation practices can be avoided using simple, efficient tools.

    Pitfalls in Standard Reporting Practices

    Reporting departments, particularly, stand to benefit from abandoning stagnant practices. Generally, documents are built using approved templates or, less ideally, modeled on previous reports. Reporting staff assemble and format data, summarize procedures and deviations, and adjust layout to make everything fit. Each report is created anew for each project phase, with the only links between documents being the project name or a network folder. This approach, albeit common, presents a number of pitfalls. Manual data entry, on-the-fly formatting, and copying and pasting results into poorly maintained templates lead to:

    • Repeat work
    • Errors and omissions
    • Avoidable revisions
    • Non-standard, confusing language
    • Template and style drift
    • Review errors

    These reporting risks translate into delays. Striking a balance between process improvement and production is tricky. The tendency is to forget about process improvement and focus on the mounting report backlog. Without addressing the central problems, producing reports becomes a matter of reinventing the wheel time and again.

    An Updated Approach

    Ofni Systems FastVal software is a compliant way to realize higher efficiency in documentation and to maintain control over the process. FastVal users are able to produce reports in up to 70% less time than conventional methods. FastVal mitigates risk in validation and reporting through a number of features: use of controlled templates, clear linkages between documents, single-point entry of replaceable terms, audit trails, change control, and electronic security. FastVal allows for enforced step-by-step review sign-off, promoting complete, thorough QC and QA checks for each project. And it allows for routine reports to be created accurately and efficiently without error propagation from outdated practices. The common elements of quality documentation are housed in one framework, bringing control, flexibility, and risk management to the reporting process.
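    The single-point entry of replaceable terms mentioned above can be pictured as placeholder substitution applied consistently across every document in a project. The sketch below illustrates the general idea only; it is not FastVal's actual implementation or API.

# Project-wide terms are defined once; every document template pulls from
# them, so a change in one place propagates to all reports consistently.
project_terms = {
    "SYSTEM_NAME": "ExcelSafe",
    "PROTOCOL_ID": "IOQ-2017-003",
    "CLIENT": "Example Pharma, Inc.",
}

def render(template: str, terms: dict) -> str:
    """Replace {PLACEHOLDER} markers with controlled, single-point values."""
    out = template
    for key, value in terms.items():
        out = out.replace("{" + key + "}", value)
    return out

summary = "Validation of {SYSTEM_NAME} was executed under {PROTOCOL_ID} for {CLIENT}."
print(render(summary, project_terms))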
     
    Good documentation doesn’t just reinforce the scientific validity of results; it provides waypoints that allow for informed process improvement, deviation handling, and identifying training needs. It is the medium of Quality Systems, providing the framework for combining good business, advanced technology, and good science. And its evolution is a necessary component of providing the world with safe, effective, and valuable treatments and therapies.
     
    Let us help you streamline production and remove the reporting bottleneck. Contact Ofni Systems for a FastVal software demonstration or with any questions you may have.



  8. FDA Mobile Medical Applications Guidance


    “Mobile Medical Applications Guidance for Industry and Food and Drug Administration Staff”, issued by the FDA in September 2013, provides insight into the FDA strategy for overseeing mobile medical applications. If a mobile application is classified as a medical device, it will receive the same FDA scrutiny as currently regulated medical devices, requiring appropriate validation.

    Key Points in the Guidance

    There are several regulatory requirements that mobile app developers need to consider in their design, development, and marketing strategies:

    How you market your app defines whether it constitutes a device.

    “When the intended use of a mobile app is for the diagnosis of disease or other conditions, or the cure, mitigation, treatment, or prevention of disease, or is intended to affect the structure or any function of the body of man, the mobile app is a device.”

    Mobile medical apps are those having the intended use of:

    1. connecting to one or more medical devices to control or otherwise extend device functionality,
    2. transforming the mobile platform into a regulated medical device via connected peripherals or included medical device functionalities, or
    3. converting the mobile platform into a regulated medical device by providing patient-specific analysis, diagnosis, or treatment recommendations.

    FDA focus is on device functionality and risk, not on the platform. Devices subject to regulatory oversight are ones that pose a risk to patients should they fail to operate properly. Manufacturers are expected to comply with regulations for the appropriate device class, including appropriate Quality System controls, cGMP, and 21 CFR 11 requirements.

    Qualifying Devices

    The FDA intends to focus its enforcement on medical applications with immediate potential health effects. This includes apps that measure or record patient-specific information and exhibit a high potential for risk if they don’t function properly. Example mobile medical apps that merit FDA oversight include those that:

    • use a sensor or lead connected to a mobile platform to measure and display, amplify, record, or analyze physiological parameters for diagnostic purposes. Examples: electrocardiograph (ECG), electroencephalograph (EEG), electronic stethoscope, CPR feedback monitor, nystagmograph, tremor transducer, limb accelerometer, blood oximeter, or glucometer
    • produce controlled tones or signals using tools within the mobile platform (e.g., a speaker) intended for use in diagnostic evaluations of possible otologic disorders (e.g., an audiometer)
    • present donor history questions to a potential blood donor and record/transmit responses for a collection facility to determine donor eligibility prior to collection of blood or components
    • alter the function or settings of an infusion pump, blood-pressure cuff, implantable neuromuscular stimulator, cochlear implant, computed tomography (CT), or X-ray machine
    • connect to bedside (or cardiac) monitors and transfer data to a central viewing station for display and active patient monitoring

    Enforcement Discretion

    The FDA has listed a number of mobile medical apps for which it intends to exercise “enforcement discretion” due to the low risk they pose to patients. Although these may constitute devices and/or be marketed for diagnosis, cure, mitigation, treatment, or prevention of disease or other conditions, the FDA does not intend to enforce requirements under the Federal Food, Drug, and Cosmetic Act. Examples include apps that:

    • provide educational information, reminders, planning, or motivational guidance in support of therapy, medication, or healthy behaviors
    • use GPS information to advise of environmental risks
    • track, trend, or store diet, health events, behavior triggers, symptoms, or health metrics data
    • record conversations with medical professionals for later access
    • use patient characteristics or symptoms to help recommend or locate a health provider
    • initiate emergency calls or alert first responders

    In spite of the provisions for enforcement discretion, FDA still strongly recommends adherence to Quality System regulation for all medical device software, even that which poses low patient risk. The guidance cites a nine-year FDA study indicating that over ninety percent of software-related device failures were generally due to failure to validate software prior to production.

    Some apps are not regulated by the FDA. These would include apps that:

    • are not intended for medical or health-related use
    • present commonly available information for general education
    • present reference or review material for trained professionals
    • perform or automate common office tasks
    • function as generic aids or general purpose products

    Final Thought

    With mobile medical applications, software validation remains an important means of guaranteeing product consistency and accuracy of results. It simplifies the risk equation by actively exploring the design integrity of the application and reinforces market-readiness.

    Let us help you ensure safe, reliable products. Contact us to find out more about validating your mobile medical applications or with any questions you have about regulatory compliance.



  9. Disambiguation of Clinical Database Validation versus Clinical Data Validation


    Here’s an interesting test: go to your favorite search engine, search once for “clinical database validation” and a second time for “clinical data validation” (leave the quotes off). Many of the same links will be returned and in a similar order. The problem is that these two terms are for very different processes, so the results should also be very different … but aren’t. The search engines are not the only contributor to the confusion – people in clinical data management have been known to use the two terms interchangeably also.

    Disambiguation

    Validation of a clinical database is a series of tests of the software.

    Validation of clinical data is a series of tests of the data in the software.

    So the difference is between testing the container and testing the contents.

    The Core of the Problem

    The core of the problem seems to be the word “validation”. It is possible to validate a process, software, hardware, or data. The goals of the different types of validation are similar: to ensure the quality and consistency of the product. However, while the procedures, documentation, and timing for process, software, and hardware validation are similar to one another, those for data validation are notably different. For this reason, they are different skills: people who are good at one type of validation are not necessarily good at the other, although in practice there is some overlap. In clinical data management, software validation and data validation are the two most common forms of validation, so those are the two types discussed here.

    Timing

    The biggest difference between the two types of validation is in the timing:

    Validation of clinical databases should be completely finished before any real data are entered.

    Validation of clinical data occurs only once the data have started being entered and usually cannot be completed until all data have been entered.

    Documentation

    Another difference between the two types of validation is in the documentation.

    Software validation has a specific set of terms and expected documents that are generally consistent across GxP systems, which include clinical databases. References to those terms and documents can be found in FDA guides, in industry standards references, and on the Ofni Systems website.

    Clinical data validation does have some common terms, but since few of those terms are defined by regulation, the way they are used varies much more. Also, since validation of clinical data is generally a software-specific function, documentation practices vary substantially.

    Another significant difference in documentation is that validation of clinical databases is expected to include a full list and description of all test results, whether they passed or failed, but the product for a clinical data validation test that passes is the absence of results, i.e. zero problems. In other words, clinical data validation tests that pass require very little documentation; database validation tests that pass require quite a bit.
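    To make the contrast concrete, the sketch below pairs a hypothetical clinical data validation check, whose passing output is simply an empty list of exceptions, with a hypothetical database validation test, whose result is documented whether it passes or fails. The check names, ranges, and functions are invented for illustration.

# A data validation check: only discrepancies are reported.
def check_age_range(subjects):
    """Flag enrolled subjects whose recorded age falls outside protocol limits."""
    return [s["subject_id"] for s in subjects if not (18 <= s["age"] <= 65)]

# A database (software) validation test: the result is documented either way.
def test_age_field_rejects_text(enter_age):
    """Expected result: the database refuses a non-numeric age."""
    try:
        enter_age("abc")
        actual = "value accepted"
    except ValueError:
        actual = "value rejected"
    return {"test": "age field rejects text",
            "expected": "value rejected",
            "actual": actual,
            "status": "PASS" if actual == "value rejected" else "FAIL"}

def enter_age_stub(value):
    """Stand-in for the database field handler under test."""
    int(value)  # raises ValueError for non-numeric input

subjects = [{"subject_id": "001", "age": 34}, {"subject_id": "002", "age": 17}]
print(check_age_range(subjects))                 # ['002'] - only the exceptions
print(test_age_field_rejects_text(enter_age_stub))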

    Procedures

    Since the expected documentation and timing of clinical software validation and clinical data validation are different, the procedures to perform the tests and generate the results are also different. The differences in timing and documentation are the primary sources of differences in procedures.

    Summary

    The phrases “clinical data validation” and “clinical database validation” are commonly confused even though they refer to different processes. Clinical database validation is a documented series of tests on the database while clinical data validation is a documented series of tests on the contents of the database. Despite the common goal of both processes, that is ensuring the quality and consistency of the product, the two processes differ substantially in timing, in documentation, and in the procedures used.



  10. Validation Case Study: Electronic Protocol Execution vs. Paper-Based Execution


    At Ofni Systems, we use FastVal to execute protocols electronically. Testers read the procedure and compare the actual result to the expected result. Users document results with screen shots that are automatically embedded into the executed protocol. If actual results do not match expected results, a deviation is automatically triggered. FastVal is a validated 21 CFR Part 11 compliant system. Testers are required to enter a valid User ID and password before accessing the protocol. All data entry is recorded in compliance with 21 CFR 11, and the audit trail is provided to clients along with other products of validation. The audit trail and security features provide additional information and increased data integrity over paper-based validation documents.
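    As a simplified picture of what electronic execution of a test step involves, the sketch below compares the actual result to the expected result, attaches a system-generated timestamp, and raises a deviation on a mismatch; this is a generic illustration, not FastVal's internal design.

from datetime import datetime, timezone

def execute_step(step_id: str, expected: str, actual: str, tester: str) -> dict:
    """Record one executed test step: the comparison, the tester, and a
    system-generated timestamp; a mismatch automatically raises a deviation."""
    result = {
        "step": step_id,
        "expected": expected,
        "actual": actual,
        "tester": tester,
        "executed_at": datetime.now(timezone.utc).isoformat(),
        "status": "PASS" if actual == expected else "FAIL",
    }
    if result["status"] == "FAIL":
        result["deviation"] = f"Deviation at step {step_id}: expected '{expected}', got '{actual}'"
    return result

print(execute_step("TC-04.2", expected="Login rejected", actual="Login rejected",
                   tester="jdoe"))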

    Recently, Ofni Systems was hired to perform an onsite validation of our ExcelSafe product. The ExcelSafe validation package is a 1200-step protocol that usually takes us four person-days to execute electronically with FastVal. This particular client requested that we perform the validation on paper as a case study to demonstrate the increased value of electronic protocol execution. There were numerous places where paper-based execution meant that the validation took longer or recorded less information than our standard electronic protocol execution method:

    • Handwritten Dates have less information than Automatic Date/Time Collection: FastVal automatically attaches the system date/time to the test step after execution. This reduces the error rate in recording dates to zero, and provides more information by including the system time.
    • Corrections do not clearly demonstrate when the changes were performed: The standard practice for making a correction on a paper protocol is to line through the original entry, make the correction, and initial and date the change. While changes are marked sequentially on a page, there is no record of the order of changes made on the same date. With FastVal, all of these changes are automatically documented in the audit trail, making it transparent exactly when each change occurred.
    • Screenshots had to be printed, gathered from the printer, then manually sorted and signed/dated: This was a particularly difficult issue because our printer was several rooms away from the testers. FastVal embeds screenshots in the IOQ and automatically attaches the username, system date/time, and computer name to the image. With FastVal, the tester only needs to take the screenshot before proceeding with protocol execution. When paper execution required a screenshot, the user took the screenshot, walked down the hall to collect the printout, signed, dated, and noted the correct test case step on the printout, then ensured that the piece of paper was physically placed in the correct section of the executed paper protocol. Because recording screenshots became so time-intensive, paper execution tended to capture fewer images, and less information was recorded in the protocol.
    • Deviations had to be completed manually: As always, there were a few minor testing nonconformances (one planned deviation, plus a few tester glitches) during the execution. With paper execution, deviations had to be filled in by hand, with additional effort later to ensure that each deviation is mapped to the correct test case and step. In FastVal, once deviations are classified, the default text is populated from a template, which reduces the time required to complete the documentation of the deviation. This also provides a standard approach to completing all deviations, which increases control over the deviation resolution process. Deviations are also linked to their test case and step.
    • Typing is faster than writing legibly: By entering information into the keyboard, there was no question about protocol legibility. The same cannot be said for handwritten documents at the end of a long day of completing protocols.
    • No automatic generation of validation metrics, reports or summary reports: FastVal automatically generates metrics recording the amount of time required to complete validation testing, including providing accurate reports on estimated time of completion and who executed what parts of the protocol. FastVal also automatically creates the Validation Summary Report; we had to spend a day writing the Summary Report from scratch.

    The paper-based execution took a total of 10 person-days to execute the protocol and generate the validation summary report, two and a half times longer than the 4 person-days required with FastVal, and it produced less accurate, complete, and legible information. The final result was a thick binder of paper that had to be scanned back into a document control system, as opposed to a collection of electronic files and PDF outputs already prepared for the client’s document control process. The paper-based execution required more time to complete the project and added significant expense for the client.



