fme compliance-center solves validation challenges

At last! An end to many onerous software validation activities promises to boost innovation in Life Sciences.

It’s been a long time coming, but a new risk-based approach to computer software assurance looks set to spur digital advancement in Life Sciences. After a quarter of a century of the rigidly comprehensive Computer Software Validation (CSV) model, in which every tweak to an IT system triggered a whole raft of testing and documentation, the FDA has published new draft guidance with an emphasis that is more fit for purpose.

Rather than enforce a checklist of no-exceptions tests, the new Computer Software Assurance (CSA) model will be geared to the impact of any changes to a system. It will require new testing and associated documentation only if there is likely to be a direct effect on the product or on patient safety. Indiscriminate testing of logins and similar low-risk processes will no longer be required – or incur the risk of a failed inspection if not completed.

Removing the validation-related barriers to digital transformation

The update to the FDA requirements comes just in time. Over-zealous requirements have caused a reluctance to upgrade systems in a Life Sciences Regulatory context, because the cost of validation is often two to three times the original cost of the software. While a large company with a sizeable budget and ample internal resources might weather this comfortably, the same cannot always be said for small businesses along the supply chain. Neither wants the added cost.

This is one of the reasons Life Sciences has lagged behind other industries in digital innovation. Years ago, analyst firms ranked the sector 17th for innovation, and that ranking has barely changed since. If a mundane transactional system had to be put through its paces each time an adjustment was made, consider how much greater a barrier introducing an AI-based capability would have been under the outgoing regime.

Although the new approach is still only at a draft guidance stage, the excitement around the change is palpable. Suddenly, even the regulators are pushing for digital innovation in how Life Sciences companies manage their processes, and this impetus for change is now driving the FDA’s new approach to software assurance.

Reducing complexity & cost of validation

The new emphasis on risk-based and ad-hoc software assurance checks and documentation could reduce the time and cost burden by as much as 80%. Instead of resigning themselves to weeks of writing scripts and capturing screens, IT teams will be able to explore the potential for concepts like:

  • AI-based safety signal detection;
  • advanced use of Regulatory intelligence to drive pipeline development and submission strategies;
  • adopting more innovative interfaces (e.g., voice assistants) to drive complex RIM queries.

At last, companies will be able to reduce their reliance on Excel spreadsheets without worrying about adding to an already overwhelming validation worklist.

Cultivating greater cloud use

Next-level cloud adoption is likely to increase because there won’t be the same requirement to track down and physically visit hosting data centers to check on fire alarms and the like. Instead, third-party assessments will usually be acceptable, opening new opportunities to embrace the latest remote capabilities.

It might be the best part of another year before the new provisions apply, but knowing that the FDA is altering its approach to software assurance paves the way for bolder tech-based ambitions today – whether that’s around AI capabilities or enhanced inter-system data integration.

And fortunately, we at fme are poised to help companies chart and respond to these opportunities. Our new compliance-center solution encapsulates the FDA’s new CSA approach in a digital platform, delivering a fit-for-purpose QMS capability for companies in a paperless cloud environment.

Read more about the advantages of CSA over traditional CSV here and keep an eye out for the compliance-center webinar in May. We’d be delighted to share more about it – so why not get in touch?

Read more and watch the fme compliance-center™ webinar here

Learn more about CSV and CSA

Learn more about the challenges and new solutions for software validation in this recording from fme’s David Gwyn. 

About the author

David Gwyn is a strategic, creative, and data-driven Business Development and Technology Specialist with extensive expertise in building key partnerships, implementing business strategies, and deploying solutions across the life science and emerging technology industries. In October 2021 he joined fme US as the Business Unit Director for Business Consulting. For the past 30 years, he has led teams in the delivery of content management, clinical, and quality solutions with a recent focus on end-to-end Regulatory Information Management (RIM). His practice has evolved in parallel to the Life Sciences industry, moving from custom-developed software solutions to packaged-based implementations, and the development of methodologies and best practices to guide practitioners in realizing the greatest return on investment. Mr. Gwyn is passionate about helping organizations evolve from traditional to digital businesses and increase their ability to act with speed and agility.

fme’s MetadataAssist™ reconnects lost documents

Document metadata is a critical component of any life sciences Content Management System (CMS). Each element within the structure is vital to the document history, and ensures the document is properly stored and managed throughout its lifecycle.

These are just a few of the key types of metadata required on each document stored by life sciences firms:

    1. Title: indicates the content and purpose of the document.
    2. Author: establishes the origin and credibility of the scientific data within the document.
    3. Date: establishes the age and continued relevance of that data.
    4. Keywords: capture the content and context of the document for search.
    5. Regulatory Information: records the document’s compliance with relevant regulations and guidelines.
    6. Scientific Data: identifies the results, methodology, and analysis contained within the document.

Unfortunately, this valuable information about the content, structure and context of documents is often disconnected or lost in storage, translation, or migration to and from different systems. Over time, it becomes increasingly difficult to locate, share, and utilize critical information.

Fixing metadata is a larger problem

Many organizations don’t recognize the extent of the problem until they plan a consolidation or migration project. When preparing their content for an updated or new platform, they realize how much information is missing or incomplete – and how much effort is required to review, identify, and update it across their content library. Shifting key business resources to spend hundreds of hours on tedious content review and metadata updates doesn’t serve anyone’s business goals, but disorganized content will make any ‘content management system’ feel more like ‘content maybe somewhere.’ This quickly becomes a high-risk situation when the company has committed resources to a larger platform project.

Facing project delays, many companies decide to take the ‘just make it work, and we’ll fix it later’ approach, contributing additional risk to a backlog of work for future admin teams. Eventually, this will fail. There is a better way.

fme MetadataAssist™ is the solution

Staffed with life sciences content experts, the fme MetadataAssist™ service is a value-adding solution that supplements your team to rescue disconnected content. With a proprietary process and customized workflows, we save your team hundreds of hours of tedious content review and metadata updates, and deliver an accessible library of essential information. Your content can then be properly filed in your CMS for accurate search results or migrated to a consolidated solution.

If you are consolidating volumes of legacy content, fme MetadataAssist™ is a unique solution that can address large amounts of data while reducing or eliminating the historically high level of business involvement. For companies that are migrating to a new platform, fme MetadataAssist™ can streamline your migration, minimizing costs and reducing timelines for your entire team.

Conclusion

Accurate metadata is a critical component of any document management system within a life science company. By reviewing and updating the metadata in your documents, you can improve searchability, enhance collaboration, facilitate regulatory compliance, preserve and archive critical scientific data, and extract insights from scientific data to drive innovation and improve research outcomes.

To learn more about fme MetadataAssist™, contact us to schedule a time to discuss your current document and metadata challenges. We’ll share a few of our recent projects, and show how fme can help you reclaim the value lost in your document library.

Advantages of CSA – Computer Software Assurance – Over Traditional CSV

For over 20 years, traditional Computer System Validation (CSV) has created mountains of paperwork to validate a new or updated system. It has also created an overwhelming burden that prevented many companies from upgrading their complex systems. CSA – Computer Software Assurance – was introduced to relieve that burden, allowing companies to optimize validation activities by focusing on the processes that impact patient health and safety.

There are many advantages of CSA over the traditional CSV approach. It is a more streamlined and efficient risk-based methodology that saves time, frustration, and money by: 

  • Providing clarity on the FDA’s guidance and methodology
  • Driving critical thinking to identify, evaluate, and control potential impact to patient safety, product quality, and data integrity
  • Focusing on the ability to leverage vendor qualification activities
  • Providing streamlined testing instead of a one-size-fits-all approach
  • Saving as much as 80% of validation costs

These are much-needed improvements, welcomed by forward-thinking companies striving to improve their systems and stay competitive in today’s volatile market.

Old Lessons Applied to Current Challenges  

In 1913, Henry Ford shifted his thinking about how cars were manufactured. His industry-changing assembly line focused on specific sub-components of his vehicles, creating a plethora of efficiencies and quality improvements that allowed him to achieve unprecedented production goals.

Life science companies can now apply similar lessons to validation. Much like the traditional car factory, the traditional CSV methodology demands extensive structure for every aspect of the system. CSA opens up new tools, templates, techniques, revised SOPs, and training that shift the focus to thinking critically rather than dogmatically. It is a dramatic shift that can improve a company’s competitive edge by increasing its ability to test and adopt new business processes and systems, and by accelerating validation activities.

fme Delivers the Advantages of CSA 

The fme team are life science experts with the deep regulatory, clinical, and quality experience required to integrate complex business and regulatory compliance requirements. By leveraging proven practices from previous decades, our extensive process expertise, and today’s best-in-class toolsets, we fast-track your evolution from CSV to CSA, eliminating manual, labor-intensive validation efforts and establishing a proven risk-based methodology.

Learn More about the Advantages of CSA 

No registration is currently required to download fme’s Optimizing Validation through a Risk-Based Approach: Leveraging Computer Software Assurance (CSA), where you can learn more about CSA and about our CSA training options: online, instructor-led, or a hybrid of eLearning and remote instructor coaching. We are happy to provide detailed information on our validation service offerings and can tailor an approach that meets your needs and exceeds your expectations.

 

Highlights of Generis and fme’s “Data-centricity in RIM” Webinar

In October, fme’s Director of Business Consulting David Gwyn was a featured contributor in an informative webinar with the Generis CARA Life Sciences team. He was able to share his rich experience and perspective on the value of a data-centric approach to document and information management, and outline some of the benefits that can be realized across an organization.

Generis also provided a comprehensive demo of their CARA Life Sciences Platform, and how it can improve quality, efficiency, consistency, and scalability across any organization.

Below is a summary of David’s introduction, an outline of the webinar, and a highlight video of the presentation. View the full webinar on the Generis site, and contact us with any questions you have about data-centricity or the CARA Life Sciences Platform.

 

Summary of Data-Centricity Introduction

David Gwyn: I’d like to speak for a few minutes on the essential concept of data-centricity. What I mean by that is how we can reshape our thinking about documents and the other entities we’re managing beyond traditional paper. Right now we all have an opportunity, and I will argue a necessity, to change the way we’re thinking about the information we’re managing and move forward into a more data-centric and data-focused approach.

I’m sure you remember the days when we would produce reams and reams of paper, stack them on a conference room table, and ask the CEO to come in and sign the 356H. Eventually we said, “Let’s digitize this,” so we took the Word documents we used to print and turned them into PDFs. While this was a step forward, we really just took pictures of all those documents and obfuscated all the value. All the data was buried in a digital version of the paper process.

There are much better solutions now that eliminate traditional challenges, and provide extensive improvement to quality, efficiency, consistency, and scalability across your entire organization. Let’s look at what’s possible.

Data-Centricity Webinar Outline

  • Overview of a document-centric process
  • Impacts of document focus
  • Brief History of Medicinal Product Submissions
  • What is triggering the transition to digitalization of the process?
    • Regulations, data standards, compliance
    • Improve quality, efficiency, consistency
    • Enable scalability, promote quality, endure changing landscapes
  • Characteristics of a data-driven approach
  • Benefits of data-centric process
  • Questions to ask to prepare for a transition to a data-centric approach
  • Detailed demo of Generis CARA Life Sciences Platform

Watch the Webinar Highlights

 

For more information, please complete this contact form and we’ll put you in touch with our experts.

 

“Data Discernment – A simplified approach to strategic study and portfolio planning.”

Now, with the rapid innovation in eClinical system technology, a Sponsor can more readily find a CTMS solution that matches their exact business profile – enhancing strategic insights without unnecessarily encumbering the clinical operations process.

Most small-to-midsize clients focus on two key areas when using CTMS to enhance data accessibility:

1. Enrollment Metrics
Key Areas of Concern – patient recruitment and screen failure rate, open to enrollment to first patient enrolled, patients enrolled and not completing all visits (breakdown analytics for interim visits, safety events, and/or protocol deviations), target enrollment, accrual timeline.

2. Site Performance
Key Areas of Concern – study start-up timelines/spend, protocol compliance and deviations, safety reporting, data entry timeliness and accuracy, clinical document quality.

These two areas provide the greatest and earliest indication of protocol success or ongoing risk, as well as return on study investment. And so the question becomes: should the Sponsor bring the necessary data points in-house, using CTMS as a nexus for cross-functional collaboration? Great question.
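
As a rough illustration of the enrollment metrics above, a minimal sketch (with hypothetical site numbers and field names, not real study data) might compute screen-failure, dropout, and accrual like this:

```python
# Hypothetical per-site enrollment counts.
sites = {
    "Site 101": {"screened": 40, "enrolled": 28, "completed": 22},
    "Site 102": {"screened": 25, "enrolled": 12, "completed": 11},
}
target_enrollment = 50  # illustrative accrual target

def screen_failure_rate(s: dict) -> float:
    """Fraction of screened patients who failed screening."""
    return 1 - s["enrolled"] / s["screened"]

def dropout_rate(s: dict) -> float:
    """Fraction of enrolled patients who did not complete all visits."""
    return (s["enrolled"] - s["completed"]) / s["enrolled"]

total_enrolled = sum(s["enrolled"] for s in sites.values())
accrual = total_enrolled / target_enrollment

for name, s in sites.items():
    print(f"{name}: screen-failure {screen_failure_rate(s):.0%}, "
          f"dropout {dropout_rate(s):.0%}")
print(f"Overall accrual: {accrual:.0%} of target")
```

A high screen-failure rate at one site, for example, surfaces early as a protocol or recruitment risk – exactly the early-warning signal these KPIs are meant to provide.
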

Use skills, experience, and resources of your CRO to your greatest advantage!

Essentially, CROs have a deeper data lake and access to more robust, well-rounded data points. It’s a volume game, pure and simple. Unless your organization has very specific goals in mind, it likely isn’t worth the cost and resources to duplicate the data collection efforts, particularly if the CRO has been contracted to perform the majority of the study activities.

Another important consideration is application maintenance. When the CTMS application is cloud-based and subject to periodic release updates – like Veeva Vault Clinical – any integration must be tested and maintained to ensure the integrity of both the connection and the data it communicates. This can be a significant resourcing effort, considering Veeva Vault’s three-releases-per-year schedule.

Get specific and targeted with meaningful KPIs

When Sponsor goals dictate that it is time to bring data in-house (worth the implementation and maintenance efforts), be highly targeted. Choose specific, meaningful Sponsor-priority KPIs to capture in the CTMS environment, then leverage Vault application features to boost efficiency in ongoing management activities. Resist the urge to capture data simply because there is a visible field available or an out-of-the-box report associated with it; if you don’t need it, hide it.

Recap

In this blog series, we discussed the importance of a simplified eClinical system environment, then juxtaposed compliance “have-to-dos” with strategic “want-to-dos” using a simple framework, and voilà – a hybrid governance/maturity map. Using this map, you’re ready to drive innovation both internally and across the industry. And if you need some extra help, just ask – fme is here to support you!

Managing CRO Sponsor Compliance in Veeva Vault Clinical

We also highlighted the importance of having a full picture of your outsourcing strategy as a small-to-midsize clinical research Sponsor. The activities to be performed and the systems of record to be utilized by the Contract Research Organization (CRO) drive our approach to oversight and governance – the essential compliance aspects of eClinical systems in the clinical operations space.

For example:

  • Will the CRO use their in-house eTMF system, work exclusively in the Sponsor eTMF, or use a combination of both?
  • Will the CRO systems/reports integrate with the Sponsor CTMS to capture enrollment metrics or other pertinent, strategic clinical data points?
  • What is the expected cadence of document and information sharing (e.g., quarterly, annually, end of study)?

With this healthy foundation – simplicity and awareness – we are primed to tackle two key components of long-term eClinical system success: oversight and governance.

Oversight doesn’t have to be a complex process

First, let’s box in exactly what a Sponsor is obligated to do under ICH E6, Section 5.2 (adopted by the FDA as guidance):

5.2 Contract Research Organization (CRO)
5.2.1 A sponsor may transfer any or all of the sponsor’s trial-related duties and functions to a CRO, but the ultimate responsibility for the quality and integrity of the trial data always resides with the sponsor. The CRO should implement quality assurance and quality control.
5.2.2 Any trial-related duty and function that is transferred to and assumed by a CRO should be specified in writing.

ADDENDUM
The sponsor should ensure oversight of any trial-related duties and functions carried out on its behalf, including trial-related duties and functions that are subcontracted to another party by the sponsor’s contracted CRO(s).

Put simply, the Sponsor must ensure that there is appropriate oversight, and evidence of that oversight must be retained. If the CRO didn’t do something, or didn’t execute in the agreed-upon manner or timeframe (“specified in writing” as required in 5.2.2), the Sponsor continues to bear the risk for the breach or underperformance.

As we can see in 5.2.1 of ICH E6, the CRO does have an obligation to implement (1) quality assurance and (2) quality control measures; as the Sponsor, these are our items of focus for oversight.

Is the CRO executing in alignment with the following:

  • Quality Assurance – the overarching quality management concept that defines how a process is to be performed, using a proactive approach to prevent quality issues from arising
    (e.g., work instructions, process checklists, standards to be met in order to approve a document)
  • Quality Control – a reactive measure used to detect defects/deviations from defined processes, along with the steps taken to capture (documentary evidence of the issue identification process and remediation steps) and correct any issues
    (e.g., mock inspections, QC review workflows, logging Quality Issues on documents and routing them for remediation).

Now that we know what is required, let’s talk game plan. Oversight doesn’t have to be a complex process – just set the cadence, determine the sample size, and, like your 3rd grade math teacher used to say, no points awarded unless you show your work!

Start with what is feasible given the applicable outsourcing model and regulatory history of the organization – annual, staggered with the three yearly Veeva Vault releases, etc. Determine the subject matter to be reviewed and create a simple, user-friendly format to capture CRO oversight activities. This should include the remedial measures taken to correct any issues; following through on those measures is key, and should be incorporated into the next round of CRO oversight (if not completed sooner).
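
The cadence-and-sample-size idea can be sketched in a few lines – hypothetical document IDs and a sampling percentage chosen purely for illustration, not a prescribed methodology:

```python
import random

# Hypothetical inventory of CRO-filed eTMF documents awaiting oversight review.
documents = [f"TMF-DOC-{i:04d}" for i in range(1, 201)]

def draw_oversight_sample(docs: list[str], sample_pct: float = 0.10,
                          seed: int = 2024) -> list[str]:
    """Pick a reproducible random sample for one oversight review cycle.

    A fixed seed makes the selection repeatable, so the sample itself
    can be documented as part of 'showing your work'."""
    rng = random.Random(seed)
    k = max(1, round(len(docs) * sample_pct))
    return sorted(rng.sample(docs, k))

sample = draw_oversight_sample(documents)
print(f"Reviewing {len(sample)} of {len(documents)} documents this cycle")
```

Each cycle’s sample list, findings, and remediation steps can then be filed as the retained evidence of oversight that 5.2 calls for.
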

 

 

How to create a simple, pain-free governance framework

Governance can be an intimidating concept! Given Veeva’s three annual releases and the associated impact activities for each one, where do you even start? The most common challenge that small-to-midsize organizations face is optimizing limited resources – how do you find the time to develop a strategic system management plan, create structure and process to support that vision, and then actually execute on it?

The good news? The answer lies within that very exasperation.

Let’s break it down:

1. Develop a strategic system management plan.

a. What we do now

i. CRO Oversight
ii. Veeva Vault Releases
iii. Program-Level Support

b. What we want to do 1 year from now

i. CRO Oversight
ii. Veeva Vault Releases
iii. Program-Level Support

c. What we want to do 3 years from now

i. CRO Oversight
ii. Veeva Vault Releases
iii. Program-Level Support

2. Create structure and process to support that vision.

a. What are we doing now that works?
b. What are we doing now that doesn’t work?
c. What needs to change to get us to the Year 1 goal?
d. What needs to change to get us to the Year 3 goal?

3. Execute on it.

a. Start with what is within your control. Lay the groundwork to persuade others toward your future vision, then embrace the evolution of the strategic vision.

By answering the above questions, you’ll create a simple, pain-free governance framework: a good foundation from which to build and mature, tailored to your organization’s specific goals and needs. And if step 2, “Create structure and process to support that vision,” felt like a tricky one, not to worry. We’ll dig deeper into that specific topic in our upcoming blog post, “Data Discernment – A simplified approach to strategic planning” – stay tuned!