fme’s MetadataAssist™ reconnects lost documents

Document metadata is a critical component of any life sciences Content Management System (CMS). Each element within the metadata structure is vital to the document’s history and ensures the document is properly stored and managed throughout its lifecycle.

These are just a few of the key types of metadata required on each document stored by life sciences firms:

    1. Title: identifies the content and purpose of the document.
    2. Author: establishes the origin and credibility of the scientific data within the document.
    3. Date: indicates the age and continued relevance of that data.
    4. Keywords: describe the content and context of the document so it can be found and grouped.
    5. Regulatory Information: records the document’s compliance with relevant regulations and guidelines.
    6. Scientific Data: captures the results, methodology, and analysis contained within the document.

Unfortunately, this valuable information about the content, structure and context of documents is often disconnected or lost in storage, translation, or migration to and from different systems. Over time, it becomes increasingly difficult to locate, share, and utilize critical information.

Fixing metadata is a larger problem

Many organizations don’t recognize the extent of the problem until they plan a consolidation or migration project. When preparing their content for an updated or new platform, they realize the amount of information that is either missing or incomplete. They also discover the volume of effort required to review, identify and update this information across their content library. Shifting key business resources to spend hundreds of hours on tedious content review and metadata updates isn’t going to be beneficial to anyone’s business goals, but disorganized content will cause any ‘content management system’ to feel more like ‘content maybe somewhere.’ This quickly becomes a high-risk situation when the company has committed resources to a larger platform project.

Facing project delays, many companies decide to take the ‘just make it work, and we’ll fix it later’ approach, contributing additional risk to a backlog of work for future admin teams. Eventually, this will fail. There is a better way.

fme MetadataAssist™ is the solution

Staffed with life sciences content experts, the fme MetadataAssist™ service supplements your team to rescue disconnected content. With a proprietary process and customized workflows, we save your team hundreds of hours of tedious content review and metadata updates, and deliver an accessible library of essential information. Your content can then be properly filed in your CMS for accurate search results or migrated to a consolidated solution.

If you are consolidating volumes of legacy content, fme MetadataAssist™ is a unique solution that can address large amounts of data while reducing or eliminating the historically high level of business involvement. For companies migrating to a new platform, fme MetadataAssist™ can streamline the migration, minimizing costs and reducing timelines for your entire team.

Conclusion

Accurate metadata is a critical component of any document management system within a life science company. By reviewing and updating the metadata in your documents, you can improve searchability, enhance collaboration, facilitate regulatory compliance, preserve and archive critical scientific data, and extract insights from scientific data to drive innovation and improve research outcomes.

To learn more about fme MetadataAssist™, contact us to schedule a time to discuss your current document and metadata challenges. We’ll share a few of our recent projects, and show how fme can help you reclaim the value lost in your document library.

About the author

David Gwyn is a strategic, creative, and data-driven Business Development and Technology Specialist with extensive expertise in building key partnerships, implementing business strategies, and deploying solutions across the life sciences and emerging technology industries. In October 2021 he joined fme US as the Business Unit Director for Business Consulting. For the past 30 years, he has led teams in the delivery of content management, clinical, and quality solutions, with a recent focus on end-to-end Regulatory Information Management (RIM). His practice has evolved in parallel with the Life Sciences industry, moving from custom-developed software solutions to package-based implementations and the development of methodologies and best practices that guide practitioners in realizing the greatest return on investment. Mr. Gwyn is passionate about helping organizations evolve from traditional to digital businesses and increase their ability to act with speed and agility.

Advantages of CSA – Computer Software Assurance – Over Traditional CSV

For over 20 years, traditional Computer System Validation (CSV) has created mountains of paperwork to validate a new or updated system. It has also created an overwhelming burden that prevented many companies from upgrading their complex systems. Computer Software Assurance (CSA) was introduced to relieve that burden, allowing companies to optimize validation activities by focusing on the processes that impact patient health and safety.

There are many advantages of CSA over the traditional CSV approach. It is a more streamlined and efficient risk-based methodology that saves time, frustration, and money by: 

  • Providing clarity for FDA’s guidance and methodology  
  • Driving critical thinking to identify, evaluate, and control potential impact to patient safety, product quality, and data integrity  
  • Focusing on the ability to leverage vendor qualification activities  
  • Providing streamlined testing instead of one-size-fits-all
  • Saving as much as 80% of validation costs  

These are much-needed improvements that are being welcomed by forward-thinking companies striving to improve their systems and stay competitive in today’s volatile market.
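
To make the risk-based idea concrete, here is a minimal, illustrative Python sketch of risk-based test scoping. The feature names and decision rules are hypothetical assumptions for illustration only – they are not fme’s methodology or the FDA’s guidance text – but they show the core pattern: features that directly impact patient safety or product quality get scripted testing with documented evidence, while lower-risk features get lighter assurance.

```python
from dataclasses import dataclass

# Hypothetical feature inventory for a system under assessment. Risk is judged
# by intended use: does the feature directly impact patient safety, product
# quality, or data integrity?
@dataclass
class Feature:
    name: str
    impacts_patient_safety: bool
    impacts_product_quality: bool
    impacts_data_integrity: bool

def assurance_activity(f: Feature) -> str:
    """Map a feature's risk profile to a simplified, CSA-style assurance level."""
    if f.impacts_patient_safety or f.impacts_product_quality:
        return "scripted testing with documented evidence"
    if f.impacts_data_integrity:
        return "unscripted/exploratory testing with a summary record"
    return "leverage vendor qualification; record rationale only"

features = [
    Feature("Adverse event capture", True, False, True),
    Feature("Batch release e-signature", False, True, True),
    Feature("Audit trail export", False, False, True),
    Feature("UI color theme", False, False, False),
]

for feature in features:
    print(f"{feature.name}: {assurance_activity(feature)}")
```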

Old Lessons Applied to Current Challenges  

In 1913, Henry Ford shifted his thinking about how cars were manufactured. His industry-changing assembly line focused on specific sub-components of his vehicles, creating a wealth of efficiencies and quality improvements that allowed him to achieve unprecedented production goals.

Life sciences companies can now apply similar lessons in the context of validation. Much like the traditional car factory, the traditional CSV methodology demands extensive structure for every aspect of the system. CSA opens up new tools, templates, and techniques, revised SOPs, and training that shift the focus to thinking critically rather than dogmatically. This dramatic shift can improve a company’s competitive edge by increasing its ability to test and adopt new business processes and systems and by accelerating validation activities.

fme Delivers the Advantages of CSA 

The fme team is made up of life sciences experts with the deep regulatory, clinical, and quality experience required to integrate complex business and regulatory compliance requirements. By leveraging proven practices from previous decades, our extensive process expertise, and today’s best-in-class toolsets, we fast-track your evolution from CSV to CSA, eliminating manual, labor-intensive validation efforts and establishing a proven risk-based methodology.

Learn More about the Advantages of CSA 

No registration is currently required to download fme’s Optimizing Validation through a Risk-Based Approach: Leveraging Computer Software Assurance (CSA), where you can learn more about CSA and about our CSA training options: online, instructor-led, or a hybrid of eLearning and remote instructor coaching. We are happy to provide detailed information on our validation service offerings and can even tailor an approach that meets your needs and exceeds your expectations.

 

Highlights of Generis and fme’s “Data-centricity in RIM” Webinar

In October, fme’s Director of Business Consulting David Gwyn was a featured contributor in an informative webinar with the Generis CARA Life Sciences team. He was able to share his rich experience and perspective on the value of a data-centric approach to document and information management, and outline some of the benefits that can be realized across an organization.

Generis also provided a comprehensive demo of their CARA Life Sciences Platform, and how it can improve quality, efficiency, consistency, and scalability across any organization.

Below is a summary of David’s introduction, an outline of the webinar, and a highlight video of the presentation. View the full webinar on the Generis site, and contact us with any questions you have about data-centricity or the CARA Life Sciences Platform.

 

Summary of Data-Centricity Introduction

David Gwyn: I’d like to speak for a few minutes on the essential concept of data-centricity. What I mean by that is how we can reshape our thinking about documents and the other entities we’re managing beyond traditional paper. Right now we all have an opportunity, and I will argue a necessity, to change the way we’re thinking about the information we’re managing and move forward into a more data-centric and data-focused approach.

I’m sure you remember the days when we would produce reams and reams of paper that we’d stack on a conference room table and ask the CEO to come in and sign the 356H. Eventually we said, “Let’s digitize this,” so we took the Word documents we used to print out and turned them into PDFs. While this was a step forward, we really just took pictures of all those documents and obfuscated all the value. All the data that was there was buried in a digital version of the paper process.

There are much better solutions now that eliminate traditional challenges, and provide extensive improvement to quality, efficiency, consistency, and scalability across your entire organization. Let’s look at what’s possible.

Data-Centricity Webinar Outline

  • Overview of a document-centric process
  • Impacts of document focus
  • Brief History of Medicinal Product Submissions
  • What is triggering the transition to digitalization of the process?
    • Regulations, data standards, compliance
    • Improve quality, efficiency, consistency
    • Enable scalability, promote quality, endure changing landscapes
  • Characteristics of a data-driven approach
  • Benefits of data-centric process
  • Questions to ask to prepare for a transition to a data-centric approach
  • Detailed demo of Generis CARA Life Sciences Platform

Watch the Webinar Highlights

 

For more information, please complete this contact form and we’ll put you in touch with our experts.

 

“Data Discernment – A simplified approach to strategic study and portfolio planning.”

Now, with the rapid innovation in eClinical system technology, a Sponsor can more readily find a CTMS solution that supports exactly their business profile – enhancing strategic insights without unnecessarily encumbering the clinical operations process.

Most small-to-midsize clients focus on two key areas when using CTMS to enhance data accessibility:

1. Enrollment Metrics
Key Areas of Concern – patient recruitment and screen failure rate, time from open-to-enrollment to first patient enrolled, patients enrolled but not completing all visits (with breakdown analytics for interim visits, safety events, and/or protocol deviations), target enrollment, and accrual timeline.

2. Site Performance
Key Areas of Concern – study start-up timelines/spend, protocol compliance and deviations, safety reporting, data entry timeliness and accuracy, clinical document quality.

These two areas provide the greatest and earliest indication of protocol success or ongoing risk, as well as return on study investment. So the question becomes: should the Sponsor bring the necessary data points in-house, using CTMS as a nexus for cross-functional collaboration? Great question.
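
As a rough illustration of the enrollment metrics listed above, the short Python sketch below computes a few of them (screen failure rate, accrual against target, and time from open-to-enrollment to first patient enrolled) from a hypothetical study record such as one parsed from a weekly CRO report. The field names and figures are assumptions, not a CTMS or Veeva Vault data model.

```python
from datetime import date

# Hypothetical study-level enrollment data, e.g. parsed from a weekly CRO report.
study = {
    "screened": 140,
    "screen_failures": 38,
    "enrolled": 102,
    "target_enrollment": 150,
    "open_to_enrollment": date(2024, 1, 15),
    "first_patient_enrolled": date(2024, 2, 20),
}

screen_failure_rate = study["screen_failures"] / study["screened"]
accrual_vs_target = study["enrolled"] / study["target_enrollment"]
days_to_first_patient = (study["first_patient_enrolled"]
                         - study["open_to_enrollment"]).days

print(f"Screen failure rate:        {screen_failure_rate:.1%}")
print(f"Accrual vs. target:         {accrual_vs_target:.1%}")
print(f"Open-to-first-patient days: {days_to_first_patient}")
```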

Use the skills, experience, and resources of your CRO to your greatest advantage!

Essentially, CROs have a deeper data lake and access to more robust, well-rounded data points. It’s a volume game, pure and simple. Unless your organization has very specific goals in mind, it likely isn’t worth the cost and resources to duplicate the data collection efforts, particularly if the CRO has been contracted to perform a majority of the study activities.
Another important consideration is application maintenance. When the CTMS application is cloud-based and subject to periodic release updates – like Veeva Vault Clinical – any integration must be tested and maintained to ensure the integrity of both the connection and the data it carries. This can be a significant resourcing effort, considering the 3x annual Veeva Vault release schedule.

Get specific and targeted with meaningful KPIs

When Sponsor goals dictate that it is time to bring data in-house (worth the implementation and maintenance effort), be highly targeted. Choose specific, meaningful Sponsor-priority KPIs to capture in the CTMS environment, then leverage Vault application features to boost efficiency in ongoing management activities. Resist the urge to capture data simply because there is a visible field available or an out-of-the-box report associated with it; if you don’t need it, hide it (see the sketch below).
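
The sketch below illustrates the whitelist idea in plain Python. The field names are hypothetical, and in practice the hiding itself happens through Vault configuration rather than code; the point is simply to start from an explicit list of Sponsor-priority KPIs and treat every other field as off by default.

```python
# Hypothetical catalog of fields available in the CTMS (out of the box there
# are many more than a small, outsourced Sponsor will ever act on).
available_fields = {
    "enrollment_actual", "enrollment_target", "screen_failure_rate",
    "site_activation_date", "monitoring_visit_count",
    "drug_shipment_temperature_log", "vendor_invoice_status",
}

# Sponsor-priority KPIs agreed with the business; everything else stays hidden.
sponsor_priority_kpis = {
    "enrollment_actual", "enrollment_target",
    "screen_failure_rate", "site_activation_date",
}

fields_to_hide = sorted(available_fields - sponsor_priority_kpis)
print("Keep visible:", sorted(sponsor_priority_kpis))
print("Hide or disable:", fields_to_hide)
```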

Recap

In this blog series, we discussed the importance of a simplified eClinical system environment, then juxtaposed compliance “have-to-dos” with strategic “want-to-dos” using a simple framework, and voilà – a hybrid governance:maturity map. Using this map, you’re ready to drive innovation both internally and within the industry. And if you need some extra help, just ask – fme is here to support you!

Managing CRO Sponsor Compliance in Veeva Vault Clinical

We also highlighted the importance of having a full picture of your outsourcing strategy as a small-to-midsize clinical research Sponsor. The activities to be performed and the systems of record to be utilized by the Contract Research Organization (CRO) drive our approach to oversight and governance – the essential compliance aspects of eClinical systems in the clinical operations space.

For example:

  • Will the CRO use their in-house eTMF system, work exclusively in the Sponsor eTMF, or use a combination of both?
  • Will the CRO systems/reports integrate with the Sponsor CTMS to capture enrollment metrics or other pertinent, strategic clinical data points?
  • What is the expected cadence of document and information sharing (e.g., quarterly, annually, end of study)?

With this healthy foundation – simplicity and awareness – we are primed to tackle two key components of long-term eClinical system success: oversight and governance.

Oversight doesn’t have to be a complex process

First, let’s box in exactly what a Sponsor is obligated to do under ICH E6, Section 5.2:

5.2 Contract Research Organization (CRO)
5.2.1 A sponsor may transfer any or all of the sponsor’s trial-related duties and functions to a CRO, but the ultimate responsibility for the quality and integrity of the trial data always resides with the sponsor. The CRO should implement quality assurance and quality control.
5.2.2 Any trial-related duty and function that is transferred to and assumed by a CRO should be specified in writing.

ADDENDUM
The sponsor should ensure oversight of any trial-related duties and functions carried out on its behalf, including trial-related duties and functions that are subcontracted to another party by the sponsor’s contracted CRO(s).

Put simply, the Sponsor must ensure that there is appropriate oversight, and the evidence of that oversight needs to be retained. If the CRO didn’t do something, or didn’t execute in the agreed-upon manner or timeframe (“specified in writing,” as required in 5.2.2), the Sponsor continues to bear the risk for the breach or underperformance.

As we can see in 5.2.1 of ICH E6, the CRO does have an obligation to implement (1) quality assurance and (2) quality control measures; as the Sponsor, these are our items of focus for oversight.

Is the CRO executing in alignment with the following?

  • Quality Assurance – the overarching quality management concept that defines how a process is to be performed, using a proactive approach to prevent quality issues from arising
    (e.g., work instructions, process checklists, standards to be met in order to approve a document)
  • Quality Control – a reactive measure used to detect defects or deviations from defined processes, together with the steps taken to capture (as documentary evidence of issue identification and remediation) and correct any issues
    (e.g., mock inspections, QC review workflows, logging quality issues on documents and routing them for remediation).

Now that we know what is required, let’s talk game plan. Oversight doesn’t have to be a complex process – just set the cadence, determine the sample size, and, like your 3rd grade math teacher used to say, no points awarded unless you show your work!
Start with what is feasible, given the applicable outsourcing model and regulatory history of the organization (e.g., annually, or staggered with the 3x yearly Veeva Vault releases). Determine the subject matter to be reviewed and create a simple, user-friendly format to capture CRO oversight activities. This should include the remedial measures taken to correct any issues; following through on those measures is key and should be incorporated into the next round of CRO oversight (if not completed sooner).
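
As a minimal sketch of “set the cadence, determine the sample size, show your work,” the following Python snippet draws a reproducible sample of CRO-filed documents and records the review as structured evidence. The document IDs, sample size, and record fields are hypothetical; in practice the output would land in your CRO Oversight Plan template or tracker.

```python
import random
from datetime import date

# Hypothetical population: eTMF document IDs filed by the CRO this review period.
cro_filed_docs = [f"TMF-{i:04d}" for i in range(1, 251)]

SAMPLE_SIZE = 25        # cadence and sample size agreed in the CRO Oversight Plan
random.seed(20240401)   # seed recorded so the sample can be reproduced later

sample = random.sample(cro_filed_docs, SAMPLE_SIZE)

# "Show your work": record what was reviewed, when, and what was found.
oversight_record = {
    "review_date": date.today().isoformat(),
    "population_size": len(cro_filed_docs),
    "sample_size": SAMPLE_SIZE,
    "documents_reviewed": sample,
    "findings": [],  # e.g. {"doc": "TMF-0042", "issue": "...", "remediation": "..."}
}

print(oversight_record["review_date"], "- reviewed",
      oversight_record["sample_size"], "of", oversight_record["population_size"],
      "CRO-filed documents")
```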

 

 

How to create a simple, pain-free governance framework

Governance can be an intimidating concept! Given Veeva’s 3x annual release schedule and the associated impact activities for each release, where do you even start? The most common challenge that small-to-midsize organizations face is optimizing limited resources – how do you find the time to develop a strategic system management plan, create structure and process to support that vision, and then actually execute on it?

The good news? The answer lies in that very exasperation.

Let’s break it down:

1. Develop a strategic system management plan.

a. What we do now

i. CRO Oversight
ii. Veeva Vault Releases
iii. Program-Level Support

b. What we want to do 1 year from now

i. CRO Oversight
ii. Veeva Vault Releases
iii. Program-Level Support

c. What we want to do 3 years from now

i. CRO Oversight
ii. Veeva Vault Releases
iii. Program-Level Support

2. Create structure and process to support that vision.

a. What are we doing now that works?
b. What are we doing now that doesn’t work?
c. What needs to change to get us to the Year 1 goal?
d. What needs to change to get us to the Year 3 goal?

3. Execute on it.

a. Start with what is within your control. Lay the groundwork to persuade others toward your future vision, then embrace the evolution of the strategic vision.
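
One lightweight way to capture this plan is as a simple now / 1-year / 3-year map per capability. The sketch below is purely illustrative – the maturity descriptions are hypothetical placeholders, not recommendations – but it shows how little structure is actually needed to get started.

```python
# Hypothetical maturity map: for each system-management capability, where we
# are now and where we want to be in one and three years.
governance_plan = {
    "CRO Oversight": {
        "now":    "annual sample review, manual tracker",
        "year_1": "reviews staggered with Veeva Vault releases",
        "year_3": "metrics-driven oversight with trended findings",
    },
    "Veeva Vault Releases": {
        "now":    "read release notes, ad hoc regression checks",
        "year_1": "documented impact assessment per release",
        "year_3": "standing release playbook with assigned owners",
    },
    "Program-Level Support": {
        "now":    "reactive tickets",
        "year_1": "defined intake and prioritization process",
        "year_3": "roadmap-driven enhancements",
    },
}

for capability, stages in governance_plan.items():
    print(f"{capability}: now -> {stages['now']}; year 3 -> {stages['year_3']}")
```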

By answering the above questions, you’ll create a simple, pain-free governance framework: a good foundation from which to build and mature, tailored to your organization’s specific goals and needs. And if task 2, “Create structure and process to support that vision,” was a tricky one, not to worry. We’ll dig deeper into that specific topic in our upcoming blog post, “Data Discernment – A simplified approach to strategic planning” – stay tuned!

Reducing the “noise” – How to tailor Veeva Vault Clinical to your individual needs

In this blog article I take a closer look at how CRO users and activities are managed in Veeva Vault Clinical, and at the decisions involved: who from the CRO can and should have access to the eTMF/CTMS in Vault, which activities are performed in the Sponsor versus the CRO eTMF, and how you can tailor the system to your needs to avoid functional overload. In my next blog I will talk about Inspection Readiness – “Managing CRO and Sponsor Compliance in Vault Clinical” – and part three will cover Integration and Reporting – “Data Discernment – A simplified approach to strategic planning.”

Moving to a cloud-based eTMF and/or CTMS is a big step, particularly for a small to mid-market life sciences organization. The budget to secure and implement eClinical technology typically coincides with a strategic shift by company leadership, often leaving clinical teams feeling pressured to support visibility and engage deeply in a new system. A key challenge is balancing the existing outsourcing model (spend, system integration, and study activities) with the capabilities of the new system(s).

How to find the right balance of streamlined configuration and intentional user access

Here are a few important questions to ask as you determine the right fit – a balance of streamlined configuration and intentional user access – for your organization:

First, do you know your outsourcing strategy? Learn it. Read the contract, request the scope of work or a synopsis thereof:

      • what is the CRO scoped to do?
      • is it on a study by study basis OR 80/20 consistent activities OR other?
      • is work done in CRO or Sponsor environment per the scope of work/contract?

Second, align the supporting documents:

      • what does the CRO Oversight Plan say about system work?
      • if there is no CRO Oversight Plan, make one.

Third, observe what is actually happening with your CRO partners:

      • is the CRO performing in accordance with the scope of work as it relates to system activities?
      • does this CRO user need the level of access that they have requested?
      • if not, what steps are being taken to communicate, mitigate, correct, and document?
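
The third set of questions can be operationalized with a very simple check: compare each CRO access request against what the scope of work permits. The sketch below is illustrative only – the activities, roles, and user names are hypothetical, not Veeva security profiles or permission sets.

```python
# Hypothetical mapping from contracted scope-of-work activities to the system
# access each activity justifies (these are not Veeva permission sets).
scope_to_permitted_access = {
    "eTMF document filing":  {"TMF Contributor"},
    "CTMS data entry":       {"CTMS Editor"},
    "Read-only oversight":   {"Viewer"},
}

# Hypothetical CRO access requests to review against the scope of work.
access_requests = [
    {"user": "cro.user1", "scoped_activity": "eTMF document filing",
     "requested_access": "TMF Contributor"},
    {"user": "cro.user2", "scoped_activity": "Read-only oversight",
     "requested_access": "System Administrator"},
]

for request in access_requests:
    permitted = scope_to_permitted_access.get(request["scoped_activity"], set())
    status = ("OK" if request["requested_access"] in permitted
              else "ESCALATE: exceeds contracted scope")
    print(f"{request['user']}: requested {request['requested_access']} -> {status}")
```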

Now that we know how the CRO is contracted, we can address what the Sponsor system should support. It’s challenging for many small and mid-size Veeva customers to effectively tailor their Veeva Vault experience to the actual work being performed in the Vault environment. Keep in mind that Veeva Vault Clinical is intentionally designed to support a broad customer profile – biotechnology, medical device, and pharmaceutical companies in ALL phases of research and commercialization, across ALL indications. The out-of-the-box offering is powerful, dynamic, and comprehensive, but it can also be overwhelming!

 

 

Reducing the “noise” – Tailoring the Veeva Vault Clinical environment to your individual needs

Of particular importance for small to mid-size customers is tailoring the Vault environment – not by adding new configurations, but by strategically turning off features that, while beneficial, act more as noise and a barrier to end-user adoption.

For example, if you utilize a full-service CRO (one that performs most study activities), a Sponsor CTMS runs the risk of capturing duplicative, out-of-date data, not to mention the resourcing burden on the Sponsor of ingesting superfluous data.

The key question here is: what data elements do I need as the Sponsor to support (1) compliance and (2) the strategic portfolio?

Many out-of-the-box fields in Vault CTMS are not required for an outsourced Sponsor and may create more “noise” than potential business benefit. If the CRO provides a weekly report or output from their CTMS, it may be more beneficial to integrate or ingest their data rather than spend scarce resources on duplicate data entry, particularly if these data points are not currently actionable or used.

Features to consider:

    • Expected Documents / Expected Document Lists
      • Are we collecting the Expected Documents or is the CRO collecting, then updating the Sponsor eTMF (or sharing a weekly/monthly report)?
      • Can we eliminate or reduce the template EDL to better support our oversight activities, rather than duplicating work already done by the CRO?
    • Milestones
      • Are the template Milestones relevant to our internal processes?
      • If not, let’s turn them off or simplify the Milestone Template to align with our process and create a more useful, purposeful user environment.
    • Workflows
      • What document processes are actually taking place in our eTMF or CTMS?
      • Do we need all of the possible out-of-the-box workflows available to users, or are they causing confusion?
      • If we don’t need all available workflows, let’s turn them off so that users have only the relevant business processes available when actioning a document in the system.
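
A lightweight way to work through the list above is a tailoring worksheet: for each feature, note whether it maps to a process the Sponsor actually performs, and record the keep-or-turn-off decision. The sketch below is illustrative – the answers and notes are hypothetical, and the actual enablement changes are made in Vault configuration, not code.

```python
# Hypothetical tailoring worksheet: keep a feature only if it maps to a process
# the Sponsor actually performs in the Vault environment.
feature_review = [
    # (feature, in active Sponsor use?, note)
    ("Expected Document Lists", False, "CRO collects docs; Sponsor reviews monthly report"),
    ("Milestone templates",     True,  "trim to Sponsor oversight milestones only"),
    ("Document workflows",      True,  "keep review/approve; disable unused routes"),
    ("CTMS enrollment fields",  False, "ingested from weekly CRO output instead"),
]

for feature, in_use, note in feature_review:
    decision = "keep (simplified)" if in_use else "turn off / hide"
    print(f"{feature:24s} -> {decision:16s} ({note})")
```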

The technology landscape puts capabilities at our fingertips, but just because we can do something doesn’t necessarily mean it’s the right thing, right this minute, for our team(s). This simplified foundation provides a system environment primed for agile adaptation rather than one that starts from a place of overwhelm: start simple, then mature the system and supporting processes to meet your company’s unique needs.

And lastly, how do you determine when to kick things up a notch? Being intentional is of great importance, particularly in system management. We are the nexus point joining the limitless potential of modern technology with the business user experience. It is our role to facilitate clinical enablement – to make the work of bringing drugs, devices, and biotechnology to market that much better: faster, safer, and more cohesive!

In our next post, we’ll speak to the unique needs of small to mid-size life sciences organizations with respect to strategic maturity and how that fits into a hybrid governance model, specifically designed to align with Veeva’s 3x annual release schedule. Stay tuned!