fme compliance-center: Validation at the speed of business

The rigidly comprehensive CSV (Computer Software Validation) model and its tedious testing and documentation have finally given way to a risk-based CSA (Computer Software Assurance) approach. It’s an improvement, but this doesn’t mean validation is easy to execute or document.  

fme compliance-center is here to solve that challenge. 

Remove validation-related barriers

The validation costs required when upgrading systems in a Life Sciences firm have put digital transformation initiatives out of reach for many. CSA is intended to lower this barrier, but a CSA-based approach still requires businesses to assess risk and apply the appropriate testing steps when validating their systems. fme compliance-center delivers a complete end-to-end solution to address these needs.

fme compliance-center streamlines validation

Based on the CSA guidelines and our 25 years of working with global pharmaceutical and medical firms, fme compliance-center is specifically built to streamline validation workflows. With an intuitive interface and clear instructions, the solution walks you through each validation process step, providing detailed guidance on when you can take advantage of CSA, when traditional CSV is required, and when a hybrid approach is possible.   

In the end, fme compliance-center saves users time and money, and helps them implement solutions more quickly while delivering a higher-quality product.

Learn more about fme compliance-center

Join fme’s validation experts for an in-depth introduction to our newest solution, which delivers a fit-for-purpose QMS capability in a paperless cloud platform. We’ll demonstrate how fme compliance-center:

  • Drives the risk assessment process to determine the appropriate testing levels

  • Provides the tools and templates to conduct the testing

  • Automates the document management and reporting activities

We’ll also show how our CSA-based guidelines help you organize, execute, and document your critical validation tasks in an easy-to-understand dashboard that connects your people, processes, and content across your business.

Reserve your spot today to ensure you receive the full recording to share with your team!

Webinar Details

fme compliance-center: Validation at the speed of business

Date: Wednesday, May 31, 2023
Time: 10am EDT

fme compliance-center solves validation challenges

At last! An end to many of the onerous software validation activities will boost innovation in Life Sciences.

It’s been a long time coming, but a new risk-based approach to computer software assurance looks set to spur new digital advancement in Life Sciences. After a quarter of a century of the rigidly comprehensive Computer Software Validation (CSV) model, in which each and every tweak to an IT system led to a whole raft of testing and documentation, the FDA has published new draft guidance with an emphasis which is more fit for purpose.

Rather than enforce a checklist of no-exceptions tests, the new Computer Software Assurance (CSA) model will be geared to the impact of any changes to a system. It will require new testing and associated documentation only if there is likely to be a direct effect on the product or on patient safety. Indiscriminate testing of logins and similar low-risk processes will no longer be required – or incur the risk of a failed inspection if not completed.
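
To picture the shift in practical terms, the sketch below (Python, entirely hypothetical) shows the kind of simple, risk-based decision rule the CSA model encourages. It illustrates the principle only; it is not the FDA’s decision framework and not any fme tooling.

    # Illustrative only: a deliberately simplified, hypothetical decision rule
    # in the spirit of the risk-based CSA approach described above. It is not
    # the FDA's decision framework, nor how any product implements it.
    def suggested_assurance(affects_product_or_patient_safety: bool,
                            affects_a_high_risk_process: bool) -> str:
        """Suggest a level of testing rigor for a proposed system change."""
        if affects_product_or_patient_safety:
            return "scripted testing with full documented evidence (CSV-style)"
        if affects_a_high_risk_process:
            return "limited scripted plus unscripted testing"
        return "unscripted / ad-hoc testing with minimal documentation"

    # A login-screen tweak, for example, lands in the lowest-rigor bucket.
    print(suggested_assurance(False, False))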

Removing the validation-related barriers to digital transformation

The update to the FDA requirements comes just in time. Over-zealous requirements have made firms reluctant to upgrade systems in a Life Sciences Regulatory context, because the cost of validation is often two to three times the original cost of the software. While a large company with a sizeable budget and ample internal resources might be able to weather this comfortably, the same cannot always be said for small businesses along the supply chain. Companies of either size would rather avoid the added cost.

This is one of the reasons Life Sciences has lagged behind other industries in digital innovation. Years ago, analyst firms put the sector at 17th for innovation, a ranking that has barely changed since. If a mundane transactional system had to be put through its paces each time an adjustment was made, consider how much more of a barrier introducing an AI-based capability would have been under the outgoing regime.

Although the new approach is still only at a draft guidance stage, the excitement around the change is palpable. Suddenly, even the regulators are pushing for digital innovation in how Life Sciences companies manage their processes, and this impetus for change is now driving the FDA’s new approach to software assurance.

Reducing complexity & cost of validation

The new emphasis on risk-based and ad-hoc software assurance checks and documentation could reduce the time and cost burden by as much as 80%. Instead of resigning themselves to weeks of writing scripts and capturing screens, IT teams will be able to explore the potential for concepts like:

  • AI-based safety signal detection;
  • advanced use of Regulatory intelligence to drive pipeline development and submission strategies;
  • and adopting more innovative interfaces (e.g., voice assistants) to drive complex RIM queries.

At last, companies will be able to reduce their reliance on Excel spreadsheets without worrying about adding to an already overwhelming validation worklist.

Cultivating greater cloud use

Next-level cloud adoption is likely to increase because there won’t be the same requirement to track down and physically visit hosting data centers to check on fire alarms and the like. Instead, third-party assessments will usually be acceptable, opening new opportunities to embrace the latest remote capabilities.

It might be the best part of another year before the new provisions apply, but knowing that the FDA is altering its approach to software assurance paves the way for bolder tech-based ambitions today – whether that’s around AI capabilities or enhanced inter-system data integration.

And fortunately, fme is poised to help companies chart and respond to these opportunities. Our new compliance-center solution encapsulates the FDA’s new CSA approach in a digital platform, delivering a fit-for-purpose QMS capability in a paperless cloud platform.

Read more about the advantages of CSA over traditional CSV here and keep an eye out for the compliance-center webinar in May. We’d be delighted to share more about it – so why not get in touch?

Read more and watch the fme compliance-center™ webinar here

About the author

David Gwyn is a strategic, creative, and data-driven Business Development and Technology Specialist with extensive expertise in building key partnerships, implementing business strategies, and deploying solutions across the life science and emerging technology industries. In October 2021 he joined fme US as the Business Unit Director for Business Consulting. For the past 30 years, he has led teams in the delivery of content management, clinical, and quality solutions with a recent focus on end-to-end Regulatory Information Management (RIM). His practice has evolved in parallel to the Life Sciences industry, moving from custom-developed software solutions to packaged-based implementations, and the development of methodologies and best practices to guide practitioners in realizing the greatest return on investment. Mr. Gwyn is passionate about helping organizations evolve from traditional to digital businesses and increase their ability to act with speed and agility.

fme’s DataAssist™ reconnects lost documents

Document metadata is a critical component of any life sciences Content Management System (CMS). Each element within the structure is vital to the document history, and ensures the document is properly stored and managed throughout its lifecycle.

These are just a few of the key types of metadata required on each document stored by life sciences firms:

    1. Title: The title of a document provides important information about the content and purpose of the document.
    2. Author: The author of a document provides valuable information about the origin and credibility of the scientific data contained within the document.
    3. Date: The date of a document provides valuable information about the age and relevance of the scientific data contained within the document.
    4. Keywords: Keywords provide valuable information about the content and context of the scientific data contained within the document.
    5. Regulatory Information: Regulatory information provides valuable information about the compliance of the document with relevant regulations and guidelines.
    6. Scientific Data: Scientific data provides valuable information about the results, methodology, and analysis contained within the document.

Unfortunately, this valuable information about the content, structure and context of documents is often disconnected or lost in storage, translation, or migration to and from different systems. Over time, it becomes increasingly difficult to locate, share, and utilize critical information.
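
As a simple illustration of how such gaps can be surfaced, the sketch below (Python, with hypothetical field names and records) checks a handful of document records against the metadata fields listed above and reports what is missing. It is illustrative only and is not part of the DataAssist™ service.

    # Illustrative only: hypothetical metadata fields and records, not the
    # DataAssist(TM) implementation or any specific CMS schema.
    REQUIRED_FIELDS = ["title", "author", "date", "keywords",
                       "regulatory_info", "scientific_data"]

    documents = [
        {"id": "DOC-001", "title": "Stability Study Report", "author": "J. Smith",
         "date": "2021-04-12", "keywords": ["stability", "API"],
         "regulatory_info": "ICH Q1A", "scientific_data": "Tables 1-4"},
        {"id": "DOC-002", "title": "Batch Record", "author": None,
         "date": None, "keywords": [], "regulatory_info": None,
         "scientific_data": "Appendix B"},
    ]

    def missing_metadata(doc):
        """Return the required fields that are empty or absent for one document."""
        return [field for field in REQUIRED_FIELDS if not doc.get(field)]

    # Report which documents would need review before a migration.
    for doc in documents:
        gaps = missing_metadata(doc)
        if gaps:
            print(f"{doc['id']}: missing {', '.join(gaps)}")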

Fixing metadata is a larger problem

Many organizations don’t recognize the extent of the problem until they plan a consolidation or migration project. When preparing their content for an updated or new platform, they realize how much information is either missing or incomplete. They also discover the volume of effort required to review, identify, and update this information across their content library. Shifting key business resources to spend hundreds of hours on tedious content review and metadata updates doesn’t advance anyone’s business goals, but disorganized content will make any ‘content management system’ feel more like ‘content maybe somewhere.’ This quickly becomes a high-risk situation when the company has committed resources to a larger platform project.

Facing project delays, many companies decide to take the ‘just make it work, and we’ll fix it later’ approach, adding risk to a growing backlog of work for future admin teams. Eventually, this will fail. There is a better way.

fme’s DataAssist™ is the solution

Staffed with life sciences content experts, fme’s DataAssist™ service is a value-adding solution that supplements your team to rescue disconnected content. With a proprietary process and customized workflows, we save your team hundreds of hours of tedious content review and metadata updates, and deliver an accessible library of essential information. Your content can then be properly filed in your CMS for accurate search results or migrated to a consolidated solution.

If you are consolidating volumes of legacy content, DataAssist™ is a unique solution that can address large amounts of data while reducing or eliminating the historically high level of business involvement. For companies that are migrating to a new platform, DataAssist™ can streamline your migration, minimizing costs and reducing timelines for your entire team.

Conclusion

Accurate metadata is a critical component of any document management system within a life science company. By reviewing and updating the metadata in your documents, you can improve searchability, enhance collaboration, facilitate regulatory compliance, preserve and archive critical scientific data, and extract insights from scientific data to drive innovation and improve research outcomes.

To learn more about DataAssist™, contact us to schedule a time to discuss your current document and metadata challenges. We’ll share a few of our recent projects, and show how fme can help you reclaim the value lost in your document library.

 

About the author

David Gwyn is a strategic, creative, and data-driven Business Development and Technology Specialist with extensive expertise in building key partnerships, implementing business strategies, and deploying solutions across the life science and emerging technology industries. In October 2021 he joined fme US as the Business Unit Director for Business Consulting. For the past 30 years, he has led teams in the delivery of content management, clinical, and quality solutions with a recent focus on end-to-end Regulatory Information Management (RIM). His practice has evolved in parallel to the Life Sciences industry, moving from custom-developed software solutions to packaged-based implementations, and the development of methodologies and best practices to guide practitioners in realizing the greatest return on investment. Mr. Gwyn is passionate about helping organizations evolve from traditional to digital businesses and increase their ability to act with speed and agility.

Advantages of CSA – Computer Software Assurance – Over Traditional CSV

For over 20 years, the traditional CSV (Computer System Validation) approach has created mountains of paperwork to validate a new or updated system. It has also created an overwhelming burden that has prevented many companies from upgrading their complex systems. CSA – Computer Software Assurance – was introduced to relieve that burden, allowing companies to optimize validation activities by focusing on the processes that impact patient health and safety.

There are many advantages of CSA over the traditional CSV approach. It is a more streamlined and efficient risk-based methodology that saves time, frustration, and money by: 

  • Providing clarity on the FDA’s guidance and methodology
  • Driving critical thinking to identify, evaluate, and control potential impact to patient safety, product quality, and data integrity  
  • Focusing on the ability to leverage vendor qualification activities  
  • Providing streamlined testing instead of one-size-fits-all
  • Saving as much as 80% of validation costs  

These are much-needed improvements that are being welcomed by forward-thinking companies striving to improve their systems to stay competitive in today’s volatile market.

Old Lessons Applied to Current Challenges  

In the early 1900s, Henry Ford shifted his thinking about how cars were manufactured. His industry-changing assembly line focused on specific sub-components of his vehicles, creating a plethora of efficiencies and quality improvements that allowed him to achieve unprecedented production goals.

Life science companies are now able to apply similar lessons in the context of validation. Much like the traditional car factory, the traditional CSV methodology demands extensive structure for every aspect of the system. CSA opens up new tools, templates, and techniques, revised SOPs, and training to shift the focus to thinking critically rather than dogmatically. It is a dramatic shift in focus that can improve a company’s competitive edge by increasing its ability to test and adopt new business processes and systems and by accelerating validation activities.

fme Delivers the Advantages of CSA 

The fme team are life science experts who have the deep regulatory, clinical, and quality experience required to integrate complex business and regulatory compliance requirements. By leveraging the proven practices of previous decades, our extensive process expertise, and today’s best-in-class toolsets, we fast-track your evolution from CSV to CSA, eliminating manual, labor-intensive validation efforts and establishing a proven risk-based methodology.

Learn More about the Advantages of CSA 

No registration is currently required to download fme’s Optimizing Validation through a Risk-Based Approach: Leveraging Computer Software Assurance (CSA), where you can learn more about CSA and about our CSA training options, offered online, instructor-led, or as a hybrid of eLearning and remote instructor coaching. We are happy to provide you with detailed information on our validation service offerings and can even tailor an approach that meets your needs and exceeds your expectations.

 

Highlights of Generis and fme’s “Data-centricity in RIM” Webinar

In October, fme’s Director of Business Consulting David Gwyn was a featured contributor in an informative webinar with the Generis CARA Life Sciences team. He was able to share his rich experience and perspective on the value of a data-centric approach to document and information management, and outline some of the benefits that can be realized across an organization.

Generis also provided a comprehensive demo of their CARA Life Sciences Platform, showing how it can improve quality, efficiency, consistency, and scalability across any organization.

Below is a summary of David’s introduction, an outline of the webinar, and a highlight video of the presentation. View the full webinar on the Generis site, and contact us with any questions you have about data-centricity or the CARA Life Sciences Platform.

 

Summary of Data-Centricity Introduction

David Gwyn: I’d like to speak for a few minutes on the essential concept of data-centricity. What I mean by that is how we can reshape our thinking about documents and the other entities we’re managing beyond traditional paper. Right now we all have an opportunity, and I will argue a necessity, to change the way we’re thinking about the information we’re managing and move forward into a more data-centric and data-focused approach.

I’m sure you remember the days when we would produce reams and reams of paper that we’d stack on a conference room table and ask the CEO to come in and sign the 356H. Eventually we said, “Let’s digitize this,” so we took our Word documents that we printed out before and turned them into PDFs. While this was a step forward, we really just took pictures of all those documents and obfuscated all the value. All the data that was there was buried in a digital version of the paper process.

There are much better solutions now that eliminate traditional challenges, and provide extensive improvement to quality, efficiency, consistency, and scalability across your entire organization. Let’s look at what’s possible.

Data-Centricity Webinar Outline

  • Overview of a document-centric process
  • Impacts of document focus
  • Brief History of Medicinal Product Submissions
  • What is triggering the transition to digitalization of the process?
    • Regulations, data standards, compliance
    • Improve quality, efficiency, consistency
    • Enable scalability, promote quality, endure changing landscapes
  • Characteristics of a data-driven approach
  • Benefits of data-centric process
  • Questions to ask to prepare for a transition to a data-centric approach
  • Detailed demo of Generis CARA Life Sciences Platform

Watch the Webinar Highlights

 

For more information, please complete this contact form and we’ll put you in touch with our experts.

 

“Data Discernment – A simplified approach to strategic study and portfolio planning.”

Now, with the rapid innovation in eClinical system technology, a Sponsor can more readily find a CTMS solution that supports exactly their business profile – enhancing strategic insights without unnecessarily encumbering the clinical operations process.

Most small-to-midsize clients focus on two key areas when using CTMS to enhance data accessibility:

1. Enrollment Metrics
Key Areas of Concern – patient recruitment and screen failure rate, time from open-to-enrollment to first patient enrolled, patients enrolled but not completing all visits (with breakdown analytics for interim visits, safety events, and/or protocol deviations), target enrollment, and accrual timeline.

2. Site Performance
Key Areas of Concern – study start-up timelines/spend, protocol compliance and deviations, safety reporting, data entry timeliness and accuracy, clinical document quality.

These two areas provide the greatest and earliest indication of protocol success or ongoing risk, as well as return on study investment. And so the question becomes: should the Sponsor bring the necessary data points in-house, using CTMS as a nexus for cross-functional collaboration? Great question.
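
To make these metrics concrete, the sketch below (Python, with entirely hypothetical study numbers) computes two of the enrollment measures named above: the screen failure rate and the days from open-to-enrollment to first patient enrolled. It is an illustration only and is not tied to any particular CTMS or to Veeva Vault Clinical.

    # Illustrative only: made-up study figures, not data from any CTMS.
    from datetime import date

    screened = 120
    screen_failures = 18
    open_to_enrollment = date(2023, 1, 9)      # site opened to enrollment
    first_patient_enrolled = date(2023, 2, 3)  # first patient enrolled

    screen_failure_rate = screen_failures / screened
    days_to_first_enrollment = (first_patient_enrolled - open_to_enrollment).days

    print(f"Screen failure rate: {screen_failure_rate:.1%}")                        # 15.0%
    print(f"Open-to-enrollment to first patient: {days_to_first_enrollment} days")  # 25 days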

Use skills, experience, and resources of your CRO to your greatest advantage!

Essentially, CROs have a deeper data lake and access to more robust, well-rounded data points. It’s a volume game, pure and simple. Unless your organization has very specific goals in mind, it likely isn’t worth the cost and resources to duplicate the data collection efforts, particularly if the CRO has been contracted to perform a majority of the study activities.
Another important consideration is application maintenance. When the CTMS application is cloud-based and subject to periodic release updates – like Veeva Vault Clinical – any integration must be tested and maintained to ensure the integrity of both the connection and the data it carries. This can be a big resourcing effort, considering Veeva Vault’s three-releases-per-year schedule.

Get specific and targeted with meaningful KPIs

When Sponsor goals dictate that it is time to bring data in-house (worth the implementation and maintenance efforts), be highly targeted. Choose specific, meaningful Sponsor-priority KPIs to capture in the CTMS environment, then leverage Vault application features to boost efficiency in ongoing management activities. Resist the urge to capture data simply because there is a visible field available or an out-of-the-box report associated with it; if you don’t need it, hide it.

Recap

In this blog series, we discussed the importance of a simplified eClinical system environment, then juxtaposed compliance “have-to-dos” with strategic “want-to-dos” using a simple framework, and voilà – a hybrid governance:maturity map. Using this map, you’re ready to drive innovation both internally and within the industry. And if you need some extra help, just ask – fme is here to support you!