ISPE: Data Integrity Risk Assessment Guide (Data Integrity Maturity Level Characterization)
- 2018-08-01 16:19:00
- gmpfan
- Repost 6108
How should one assess what level a company's data integrity management has reached? ISPE has provided an assessment model for this, its Data Integrity Maturity Level Characterization, presented below for reference:
DATA INTEGRITY MATURITY LEVEL CHARACTERIZATION

Each area below is characterized across five maturity levels, from Level 1 (least mature) to Level 5 (most mature).

Culture

• DI Understanding and awareness: awareness of the importance of data integrity, and understanding of data integrity principles.
  - Level 1: Low awareness, limited to SMEs and specialists.
  - Level 2: General awareness of the topic, but not fully reflected in working practices.
  - Level 3: Principles reflected in working practices, but not consistently applied.
  - Level 4: Data integrity principles fully incorporated and applied in established processes and practices.
  - Level 5: Formal ongoing awareness programme, proactively keeping abreast of industry developments.

• Corporate culture and working environment: a culture of willing and open reporting of errors, omissions and abnormal results, and of willing collaboration to achieve data integrity objectives.
  - Level 1: Unwillingness or no motivation to report errors and abnormal results.
  - Level 2: DI problems may be reported, but mitigation is either inadequate or ignored.
  - Level 3: Policies and procedures encourage openness, but are not implemented in all cases. Mitigation is generally limited to the specific instance.
  - Level 4: Full openness and collaboration, with such behaviour motivated by management behaviour. Mitigation considers the wider implications.
  - Level 5: Anticipating potential future DI weaknesses and applying appropriate controls.

• Quality Culture: an environment in which employees habitually follow quality standards, take quality-focused actions, and consistently see others doing so.
  - Level 1: Low awareness and application of quality principles and standards. A culture of not reporting what management would rather not hear.
  - Level 2: Ad-hoc quality: activities performed, but relying on individual efforts.
  - Level 3: General application of some quality principles, but not fully ingrained or consistent.
  - Level 4: Quality considerations incorporated in normal working practice.
  - Level 5: Quality and continuous improvement incorporated in normal working practice.
Governance and Organization

• Leadership: objectives defined and communicated by executive management.
  - Level 1: Leadership silent or inconsistent on the need for data integrity. Other business priorities typically override.
  - Level 2: Leadership states the need for DI, but does not lead by example.
  - Level 3: Objectives defined in policies and high-level statements, but not always fully reflected in management priorities.
  - Level 4: Management actions and priorities fully reflect stated objectives.
  - Level 5: DI aspects routinely addressed and improved as part of management review.

• Sponsorship: executive management providing appropriate resources and support.
  - Level 1: Appropriate resources only made available in emergencies (e.g. a critical citation).
  - Level 2: Appropriate resources available in principle, but often not available in practice due to other pressures.
  - Level 3: Appropriate resources available, but may be diverted or diluted due to other pressures.
  - Level 4: Required and planned resources are available and safeguarded through an ongoing commitment to data integrity.
  - Level 5: Management looks ahead to identify future resource needs, based on experience.

• Structure: appropriate roles and reporting structures.
  - Level 1: No consideration of specific data governance in roles and responsibilities.
  - Level 2: Data governance roles only recently established, or in flux.
  - Level 3: Data governance roles established, but not always effective.
  - Level 4: Data governance roles well integrated into the management structures and systems.
  - Level 5: Management reviews and adapts organizational structures based on experience.

• Stakeholder Engagement: engagement of business Process Owners, Quality Assurance, and key supporting technical groups (e.g. IT).
  - Level 1: Data integrity and governance seen as either an IT issue or a quality issue. No real Process Owner involvement.
  - Level 2: Ad-hoc involvement of Process Owners and Quality Assurance. High person dependence.
  - Level 3: Process Owners and Quality Assurance typically involved, but not consistently.
  - Level 4: Process Owners, Quality Assurance, and IT work together through the data and system life cycles.
  - Level 5: All stakeholders consistently work together to identify further co-operation opportunities, based on experience.

• Data Ownership: clear ownership of data and data-related responsibilities.
  - Level 1: Process, system, and data owners not defined.
  - Level 2: Process, system, and data owners identified in a few areas.
  - Level 3: Process, system, and data owners defined in many, but not all, cases; responsibilities not always clear.
  - Level 4: Process, system, and data owners well defined and documented.
  - Level 5: Process, system, and data owner responsibilities considered and clarified during management review.

• Policies and Standards: defined policies and standards on data integrity.
  - Level 1: No established policies and standards for data integrity.
  - Level 2: Ad-hoc policies and standards for data integrity in some cases.
  - Level 3: Policies and standards exist, but are not fully integrated into the QMS and business processes.
  - Level 4: Policies and standards fully integrated into the QMS and fully reflected in business processes and practices.
  - Level 5: Policies and standards regularly reviewed and improved based on experience.

• Procedures: established procedures defining key activities and processes.
  - Level 1: No established procedures for key data integrity related activities.
  - Level 2: Ad-hoc procedures for data integrity in some cases.
  - Level 3: Some procedures and standards exist, but do not cover all data integrity related activities.
  - Level 4: Procedures for all key areas fully integrated into the QMS, reflecting established policies and standards.
  - Level 5: Procedures regularly reviewed and improved based on experience.

• Awareness and Training: awareness and training on regulatory requirements and organizational policies and standards.
  - Level 1: No real awareness of regulatory requirements and company policy in this area.
  - Level 2: Some awareness of regulatory requirements and company policy, in pockets.
  - Level 3: General awareness of well-known regulations and of the existence of company policies.
  - Level 4: A comprehensive training program ensures an appropriate level of knowledge of specific regulatory and company requirements.
  - Level 5: Formal training needs analysis, taking into account regulatory developments. Training effectiveness assessed for ongoing improvement.
Quality Management System: an established and effective Quality Management System, focused on patient safety, product quality and data integrity.
  - Level 1: Few procedures in place focused on patient safety, product quality and data integrity.
  - Level 2: Some procedures and quality control processes, but not consistently achieving quality goals.
  - Level 3: Established Quality Management System, but compliance and data integrity activities are not fully effective.
  - Level 4: Established and effective Quality Management System, consistently achieving data integrity goals in support of patient safety and product quality.
  - Level 5: QMS subject to regular management review and continuous improvement.

Business process definition: clear and accurate definitions of regulated business processes, covering all key GxP areas.
  - Level 1: Few business processes formally defined and documented.
  - Level 2: Some business processes formally defined and documented on an ad-hoc basis, either by project or operational groups.
  - Level 3: Most business processes defined, but not consistently following conventions or standards, and not always complete and up to date.
  - Level 4: Business processes defined following established conventions and standards.
  - Level 5: Business processes defined and supported by appropriate tools, and consistently maintained.

Supplier and service provider management: assessment of suppliers and service providers against agreed standards, and setting up and monitoring of contracts and agreements to deliver those standards.
  - Level 1: Many suppliers and providers with a potential impact on data integrity not assessed or managed.
  - Level 2: Some suppliers and providers with a potential impact on data integrity informally assessed.
  - Level 3: Established process for supplier management, but not applied consistently. Data integrity implications not always fully covered by assessments or agreements.
  - Level 4: Established process for supplier management, consistently applied, and including a data integrity risk review.
  - Level 5: Effectiveness of supplier management subject to regular management review based on metrics.
Strategic Planning and Data Integrity Program

• Planning: executive-level strategic planning and programs for improving and/or maintaining data governance and data integrity.
  - Level 1: No planning for data integrity or data governance at executive level.
  - Level 2: Limited planning for data integrity or data governance, typically driven by emergencies.
  - Level 3: A specific Data Integrity program, or equivalent, underway.
  - Level 4: Successful Data Integrity programs achieving stated objectives.
  - Level 5: Data integrity integral to ongoing organizational strategic planning.

• Communication: communication and change management processes, supported by a suitable repository of information and resources.
  - Level 1: No communication and change management process for DI.
  - Level 2: Some informal and person-dependent communication and change management.
  - Level 3: Formal communication and change management for DI in place, but on a per-project or per-site basis, with ad-hoc repositories.
  - Level 4: Communication and change management for DI integral to the QMS, supported by tools and a central repository.
  - Level 5: Communication and change management for DI subject to review and improvement, supported by defined metrics.
Regulatory

• Awareness: awareness of applicable regulatory requirements.
  - Level 1: No awareness of key regulatory requirements.
  - Level 2: Some awareness of detailed regulatory requirements, based on individual experience and effort.
  - Level 3: Formal regulatory awareness-raising underway, including training on regulations and guidance.
  - Level 4: All staff aware of the regulatory requirements affecting their work.
  - Level 5: Formal training needs analysis and action, taking into account regulatory and industry developments.

• Traceability: traceability to applicable regulatory requirements from, e.g., the Quality Manual, policies or procedures.
  - Level 1: No traceability to regulations.
  - Level 2: Little traceability of policies and procedures to specific regulations.
  - Level 3: Traceability in place, but limited to key regulatory requirements.
  - Level 4: Full traceability, e.g. from the Quality Manual or policies, to specific regulatory requirements.
  - Level 5: Traceability effectively maintained and updated, taking into account regulatory developments.

• Inspection readiness: preparation for inspection, including responsibilities and inspection readiness documentation.
  - Level 1: No inspection readiness preparation.
  - Level 2: Limited inspection readiness preparation; ad-hoc and dependent on individual Process and System Owners.
  - Level 3: Inspection readiness activities in place, but inconsistent in level, content, and approach.
  - Level 4: Established process for inspection readiness covering all systems maintaining regulated data and records.
  - Level 5: Inspection readiness processes regularly reviewed and refined based on regulatory and industry developments.

• Regulatory Relationship and communications: effectiveness of communication with regulatory authorities, and effectiveness in dealing with concerns and citations.
  - Level 1: No communication except during inspections, when specific citations are addressed.
  - Level 2: Ad-hoc, informal communication as and when required, not following a defined procedure.
  - Level 3: Communication as and when required, following a defined procedure.
  - Level 4: Effective, consistent communication with regulatory bodies, following a defined procedure.
  - Level 5: Clear communication lines to key regulatory bodies, with internal specialists following an established process. Concerns and citations are proactively managed.
Data Life Cycle

• Data life cycle definition: data life cycle(s) defined in standards and/or procedures.
  - Level 1: Data life cycles not defined.
  - Level 2: Some data life cycles defined on an ad-hoc basis.
  - Level 3: Data life cycles generally defined following procedures, but not consistently applied.
  - Level 4: Data life cycle defined in procedures, and applied consistently to all key regulated data and records.
  - Level 5: Data life cycles defined and maintained, supported by effective automated tools.

• Quality Risk Management: application of risk management (including justified and documented risk assessments) through the data life cycle.
  - Level 1: No documented and justified assessment of risks to data integrity.
  - Level 2: Limited data integrity risk assessments performed on an ad-hoc basis.
  - Level 3: Data integrity considered in risk assessment procedures, but not performed to a consistent level.
  - Level 4: Data integrity risk management established as an integral part of the data life cycle and system life cycle.
  - Level 5: Quality Risk Management activities subject to continuous improvement.

• Data Management processes and tools: established data management processes, supported by appropriate tools.
  - Level 1: No data management processes.
  - Level 2: Some data management processes defined by individual Process Owners.
  - Level 3: Data management procedures defined, but not always effectively implemented.
  - Level 4: Well-established and effective data management processes.
  - Level 5: Well-established common data management processes, maintained, updated, and supported by appropriate automated tools.

• Master and reference data management: established processes to ensure the accuracy, consistency, and control of master and reference data.
  - Level 1: No master/reference data management processes.
  - Level 2: Some master/reference data management processes defined by individual Process Owners.
  - Level 3: Master/reference data management procedures defined, but not always effectively implemented.
  - Level 4: Well-established and effective master/reference data management processes.
  - Level 5: Well-established common master/reference data management processes, maintained, updated, and supported by appropriate automated tools.

• Data Incident and Problem Management: established processes to deal with data incidents and problems, linked with change management and deviation management as appropriate.
  - Level 1: No formal data incident and data problem management process.
  - Level 2: Some data incident and data problem management processes defined by individual Process/System Owners.
  - Level 3: Data incidents and problems typically dealt with effectively as part of normal system or operational incident management, but with limited consideration of wider DI implications.
  - Level 4: Established data incident and problem management process, linked to CAPA and deviation management where necessary.
  - Level 5: Established data incident and problem management process, supported by tools and appropriate metrics, leading to process improvement.

• Access and Security management: establishing technical and procedural controls for access management and to ensure the security of regulated data and records.
  - Level 1: Lack of basic access control and security measures, allowing unauthorized changes.
  - Level 2: Some controls, but group logins and shared accounts widespread. Password policies weak or not enforced.
  - Level 3: Established standards and procedures for security and access control, but not consistently applied.
  - Level 4: Established system for consistent access control and security management, including regular review of security breaches and incidents.
  - Level 5: Established integrated system for consistent access control and security management, supported by appropriate tools and metrics for continuous improvement.

• Archival and retention: establishing processes for ensuring the accessibility, readability and integrity of regulated data, in compliance with regulatory requirements including retention periods.
  - Level 1: No consideration of long-term archival and retention periods.
  - Level 2: No effective process for identifying and meeting regulatory retention requirements. Few archival arrangements in place.
  - Level 3: Retention policy and schedule defined covering some, but not all, regulated records. Some systems have no formal archival process.
  - Level 4: Retention schedule includes all regulated records, with those policies supported by appropriate archival processes and tools.
  - Level 5: Archival and data retention policies and processes regularly reviewed against regulatory and technical developments.

• Electronic Signatures: effective application of electronic signatures to electronic records, where approval, verification, or other signing is required by applicable regulations.
  - Level 1: No control of electronic signatures.
  - Level 2: Lack of a clear policy on signature application, and lack of consistent technical support for e-signatures.
  - Level 3: Policies in place. Compliant e-signatures in place for some, but not all, relevant systems.
  - Level 4: Compliant e-signatures in place for all relevant systems, supported by consistent technology where possible.
  - Level 5: Electronic signature policies and processes regularly reviewed against current best practice and technical developments.

• Audit trails: usable and secure audit trails recording the creation, modification, or deletion of GxP data and records, allowing effective review either as part of the normal business process or during investigations (see the sketch after this section).
  - Level 1: Lack of effective and compliant audit trails.
  - Level 2: Some limited use of audit trails, often incomplete or not fit for purpose (e.g. in content and reviewability). Not typically reviewed as part of the normal business process.
  - Level 3: Audit trails in place for most regulated systems, but with undefined and inconsistent use within business processes in some cases.
  - Level 4: Effective audit trails in place for all regulated systems, with use and review of the audit trail included in established business processes.
  - Level 5: Audit trail policies and use regularly reviewed against regulatory and technical developments.
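As an illustration of the "Audit trails" row above (not part of the ISPE model), here is a minimal sketch of what a usable, reviewable audit trail entry might capture. The AuditEntry structure and record_change helper are hypothetical names invented for this example:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Any, List

@dataclass(frozen=True)  # frozen: an entry cannot be edited after creation
class AuditEntry:
    """One immutable audit trail entry for a change to a GxP record."""
    timestamp: str   # when the change happened (UTC, ISO 8601)
    user_id: str     # who made the change (individual account, not shared)
    record_id: str   # which record was affected
    action: str      # "create", "modify", or "delete"
    field_name: str  # which field changed
    old_value: Any   # value before the change (None on create)
    new_value: Any   # value after the change (None on delete)
    reason: str      # why the change was made

class AuditTrail:
    """Append-only log: entries can be added and read, never edited or removed."""

    def __init__(self) -> None:
        self._entries: List[AuditEntry] = []

    def record_change(self, user_id: str, record_id: str, action: str,
                      field_name: str, old_value: Any, new_value: Any,
                      reason: str) -> None:
        self._entries.append(AuditEntry(
            timestamp=datetime.now(timezone.utc).isoformat(),
            user_id=user_id, record_id=record_id, action=action,
            field_name=field_name, old_value=old_value,
            new_value=new_value, reason=reason))

    def entries_for(self, record_id: str) -> List[AuditEntry]:
        """Support effective review: retrieve all changes for one record."""
        return [e for e in self._entries if e.record_id == record_id]

# Example: a result is corrected, and the change stays attributable and reviewable.
trail = AuditTrail()
trail.record_change("j.smith", "BATCH-042/assay", "modify",
                    "result", 98.2, 99.1, "transcription error corrected")
for entry in trail.entries_for("BATCH-042/assay"):
    print(entry)
```

The point of the sketch is the shape of the data, not the storage: whatever the system, each entry ties who, what, when, before/after values and a reason together, and the log itself only ever grows.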
Data Life Cycle Supporting Processes

• Auditing: auditing against defined data quality standards, including appropriate techniques to identify data integrity failures.
  - Level 1: No data quality or integrity audits performed.
  - Level 2: Some audits performed on an ad-hoc and reactive basis, but no established process for data quality and integrity auditing.
  - Level 3: Data quality and integrity audit process defined, but audits not always effective and the level of follow-up inconsistent.
  - Level 4: Effective data auditing fully integrated into the wider audit process and schedule.
  - Level 5: Auditing process and schedule subject to review and improvement, based on audit results and trends.

• Metrics: measuring the effectiveness of data governance and data integrity activities.
  - Level 1: No data-related metrics captured.
  - Level 2: Limited metrics captured, on an ad-hoc basis.
  - Level 3: Metrics captured for most key systems and datasets, but their level, purpose, and use are inconsistent.
  - Level 4: Metrics captured consistently, according to an established process.
  - Level 5: Metrics captured consistently, and fed into a continuous improvement process for data governance and integrity.

• Classification and assessment: data and system classification and compliance assessment activities.
  - Level 1: No data classification.
  - Level 2: Limited data classification, on an ad-hoc basis. No formal process.
  - Level 3: Data classification performed (e.g. as part of system compliance assessment), but limited in detail and scope.
  - Level 4: Established process for data classification, based on business process definitions and regulatory requirements.
  - Level 5: Classification process subject to review and improvement, based on outcomes and trends.

• CS Validation and compliance: established framework for achieving and maintaining validated and compliant computerized systems.
  - Level 1: Systems supporting or maintaining regulated records and data are not validated.
  - Level 2: No formal process for CS validation. The extent of validation and evidence depends on local individuals.
  - Level 3: Most systems supporting or maintaining regulated records and data are validated according to a defined process, but the approach is not always consistent between systems and does not fully cover data integrity risks.
  - Level 4: Established process in place for ensuring that all systems supporting and maintaining regulated records and data are validated according to industry good practice, and fully compliant with regulations, including effective and documented management of data integrity risks.
  - Level 5: CS validation policies and processes regularly reviewed against regulatory and industry developments.

• Control strategy: proactive design and selection of controls aimed at avoiding failures and incidents, rather than depending on procedural controls aimed at detecting failure.
  - Level 1: No consideration of potential causes of data integrity failures and relevant controls.
  - Level 2: Some application of controls, typically procedural approaches aimed at detecting failures.
  - Level 3: Technical and procedural controls applied, but dependent on the individual project or system.
  - Level 4: Technical and procedural controls applied in most cases, based on an established risk-based decision process.
  - Level 5: Integrity fully designed into processes before purchase of systems and technology, including appropriate controls.

IT Architecture: appropriate IT architecture to support regulated business processes and data integrity.
  - Level 1: No consideration of IT architecture strategy.
  - Level 2: IT architecture strategy and decisions not documented, and dependent on local SMEs.
  - Level 3: IT architecture considered, and generally supports data integrity and compliance, but is typically defined on a system-by-system basis.
  - Level 4: Established IT architecture policy and strategy, with full consideration of how this supports data integrity.
  - Level 5: IT architecture strategy regularly reviewed against industry and technical developments.

IT Infrastructure: qualified and controlled IT infrastructure to support regulated computerized systems.
  - Level 1: No infrastructure qualification performed.
  - Level 2: No established process for infrastructure qualification. Some performed, dependent on local SMEs.
  - Level 3: Infrastructure generally qualified according to an established process, but often via a document-driven approach, sometimes applied inconsistently.
  - Level 4: Established risk-based infrastructure qualification process, ensuring that current good IT practice is applied, supported by tools and technology.
  - Level 5: Infrastructure approach regularly reviewed against industry and technical developments.
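To turn the characterization into a working assessment, one simple approach is to record one level per area and look at the resulting profile rather than a single average, since the weakest areas drive the risk. ISPE does not prescribe any scoring or aggregation method; the example scores and the profile helper below are this post's own illustration:

```python
# A minimal sketch of using the characterization as an assessment tool.
# Assumption: the assessor assigns one level (1-5) per area; the scores
# below are made up purely for illustration.

ASSESSMENT = {
    "Culture": 2,
    "Governance and Organization": 3,
    "Quality Management System": 3,
    "Business process definition": 2,
    "Supplier and service provider management": 1,
    "Strategic Planning and Data Integrity Program": 2,
    "Regulatory": 3,
    "Data Life Cycle": 2,
    "Data Life Cycle Supporting Processes": 2,
    "IT Architecture": 3,
    "IT Infrastructure": 3,
}

def profile(scores: dict) -> None:
    """Print a simple maturity profile and flag the weakest areas."""
    for area, level in sorted(scores.items(), key=lambda kv: kv[1]):
        bar = "#" * level + "." * (5 - level)  # e.g. "##..." for Level 2
        print(f"  [{bar}] Level {level}  {area}")
    weakest = min(scores.values())
    print(f"\nOverall maturity is bounded by the weakest areas (Level {weakest}):")
    for area, level in scores.items():
        if level == weakest:
            print(f"  - {area}")

profile(ASSESSMENT)
```

Reporting the full profile keeps remediation focused: in the sample data above, supplier management at Level 1 is the gap to close first, regardless of how strong the other areas look.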