Advantages of CSA – Computer Software Assurance – Over Traditional CSV

For over 20 years, traditional computer system validation (CSV) has created mountains of paperwork to validate new or updated systems. It has also created an overwhelming burden that has prevented many companies from upgrading their complex systems. Computer Software Assurance (CSA) was introduced to relieve that burden, allowing companies to optimize validation activities by focusing on the processes that impact patient health and safety.

There are many advantages of CSA over the traditional CSV approach. It is a more streamlined and efficient risk-based methodology that saves time, frustration, and money by: 

  • Providing clarity on FDA guidance and methodology  
  • Driving critical thinking to identify, evaluate, and control potential impact to patient safety, product quality, and data integrity  
  • Focusing on the ability to leverage vendor qualification activities  
  • Providing streamlined testing instead of a one-size-fits-all approach  
  • Saving as much as 80% of validation costs  
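The risk-based heart of CSA can be pictured as a simple triage: the greater a feature's potential impact on patient safety or product quality, the more rigorous the assurance activity applied to it. The sketch below follows the spirit of FDA's draft CSA guidance, but the function, categories, and example features are illustrative assumptions, not a prescribed implementation:

```python
def assurance_activity(impacts_patient_safety: bool, impacts_product_quality: bool) -> str:
    """Map a feature's risk profile to a suggested assurance activity.

    High-risk features (direct patient-safety impact) warrant scripted
    testing with documented evidence; medium-risk features can use
    unscripted, exploratory testing; low-risk features may need only
    leveraged vendor assurance and a record of acceptance.
    """
    if impacts_patient_safety:
        return "scripted testing"
    if impacts_product_quality:
        return "unscripted testing"
    return "leverage vendor assurance"

# Triage of hypothetical system features: (safety impact, quality impact)
features = {
    "dose calculation module": (True, True),
    "batch record annotations": (False, True),
    "report cover-page styling": (False, False),
}
for name, (safety, quality) in features.items():
    print(f"{name}: {assurance_activity(safety, quality)}")
```

In practice the triage feeds directly into test planning: scripted, documented protocols are reserved for the genuinely high-risk functions, freeing teams from one-size-fits-all scripting.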

These are much-needed improvements that forward-thinking companies are welcoming as they strive to improve their systems and stay competitive in today's volatile market.

Old Lessons Applied to Current Challenges  

In 1913, Henry Ford shifted his thinking about how cars were manufactured. His industry-changing moving assembly line focused on specific sub-components of his vehicles, creating a wealth of efficiencies and quality improvements that allowed him to achieve unprecedented production goals.

Life science companies can now apply similar lessons in the context of validation. Much like the traditional car factory, the traditional CSV methodology demands extensive structure for every aspect of the system. CSA opens up new tools, templates, and techniques, revised SOPs, and training that shift the focus to thinking critically rather than dogmatically. It is a dramatic shift that can improve a company's competitive edge by increasing its ability to test and adopt new business processes and systems and by accelerating validation activities.

fme Delivers the Advantages of CSA 

The fme team are life sciences experts with the deep regulatory, clinical, and quality experience required to integrate complex business and regulatory compliance requirements. By leveraging proven practices from previous decades, our extensive process expertise, and today's best-in-class toolsets, we fast-track your evolution from CSV to CSA, eliminating manual, labor-intensive validation efforts and establishing a proven risk-based methodology.

Learn More about the Advantages of CSA 

No registration is currently required to download fme's Optimizing Validation through a Risk-Based Approach: Leveraging Computer Software Assurance (CSA), where you can learn more about CSA and about our CSA training options: online, instructor-led, or a hybrid of eLearning and remote instructor coaching. We are happy to provide detailed information on our validation service offerings and can tailor an approach that meets your needs and exceeds your expectations.

 

Just because a current content/data-based process works doesn’t mean it’s efficient

New or consolidated systems should lead to better outcomes, so content migration pre-assessments are important to maximize the ROI.

Whether the goal is digital transformation, system consolidation or moving to a new content management system – if you’re going to spend a lot of money on a new IT project it should be with a view to delivering something tangibly better.

Too often, however, departmental teams have become so adept at process workarounds to assemble or manage content that they lose sight of what's possible. As a result, when asked for an overview of their current systems and ways of working, they tend to be overly optimistic about the caliber and integrity of the content that will need to be transferred to the new system.

This creates risk, as content migration projects are scoped, planned and costed on the back of these insights.

It’s quite odd, when you think about it, that such pivotal projects – which may involve critical Regulatory, Clinical or Quality systems – should be left to chance in this way. No airline or pilot would embark on a transatlantic flight without first checking the expected weather, happy to simply react and make adjustments once hurricane conditions present themselves. And yet companies fix budgets and set deadlines for projects that have been scoped with only partial knowledge of the conditions that will be encountered. They prepare for a smooth ride, yet in nine cases out of ten experience something altogether more turbulent.

Apples & oranges

In the aftermath of a merger or acquisition, it's expected that blending systems will throw up some issues if the technology platforms differ, the object models don't match, or the receiving/lead company does not have direct insight into the scale and integrity of the incoming systems of record.

But even within one company, there are likely to be corrupt, inaccurate, incomplete or out-of-date files, or differences in data model, which will continue to cause issues if migrated without remediation to a new platform or system.

And it is far better to understand the scope and scale of such issues before a content migration project takes form. The danger, otherwise, is that an already sizable undertaking will multiply as change order after change order is pushed through, with the result that ‘best case’ deadlines and budgets are far exceeded.

Warning signs

So how can you tell if you are likely to encounter such issues?

Clues to a sub-optimal starting point might include:

  • Over-reliance on highly manual or protracted processes, often involving multiple people, to prepare and submit a document;
  • Dependence on file shares or non-managed systems to locate information;
  • The need to regularly plug gaps in content by chasing down additional detail; and/or
  • Uncertainty about the actual number of documents that will be required in the new system.
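One quick antidote to the last point is an automated inventory of the relevant shares before scoping begins. This is a minimal sketch (the share path and document extensions are assumptions for illustration) that counts candidate documents by type:

```python
from collections import Counter
from pathlib import Path

def inventory(root: str, extensions: set) -> Counter:
    """Count files under `root` whose suffix (case-insensitive) is in `extensions`."""
    counts = Counter()
    for path in Path(root).rglob("*"):
        if path.is_file() and path.suffix.lower() in extensions:
            counts[path.suffix.lower()] += 1
    return counts

# Hypothetical file share and document types of interest:
# counts = inventory(r"\\fileserver\regulatory", {".pdf", ".docx", ".xml"})
# print(counts.most_common())
```

Even a rough count by file type gives the migration team a defensible baseline for scoping, rather than a guess.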

Don’t rely on guesswork

The only reliable way to scope content migration work is to engage the specialists ahead of time. Giving them an opportunity to look over the data themselves, ask the right questions, establish the correct number of documents in scope, and conduct a gap analysis between the data models of the old and new systems will ensure that the formal migration project is scoped and designed optimally.

From all of this knowledge, and with a clearer idea of how content is typically organized, those later tasked with performing the migration service will be able to architect the best approach – both tactically and strategically.

Considerations include:

  • How much data/content is earmarked to be migrated (and which data/content is beyond the scope of this project)?
  • Where is the data/content coming from, and where is it going to?
  • Which data models are involved in the old and new state?
  • How many data/content attributes exist in the old and new system?
  • What are the risks associated with a poor or badly scoped migration?
  • Where are the gaps/differences between the old and new models, and what will be needed to address them?
  • Given all of the known parameters, will a phased, or ‘big bang’ approach work best?
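Several of the data-model questions above lend themselves to automation once attribute lists can be exported from both systems. As a minimal sketch (the attribute names here are entirely hypothetical), a gap analysis between old and new models might look like:

```python
def model_gap(old_attrs: set, new_attrs: set) -> dict:
    """Compare old and new data models by attribute name.

    Returns attributes that map directly, attributes that will be
    dropped (present only in the old model), and attributes that must
    be sourced or enriched (required only by the new model).
    """
    return {
        "direct": old_attrs & new_attrs,
        "dropped": old_attrs - new_attrs,
        "to_enrich": new_attrs - old_attrs,
    }

# Hypothetical attribute exports from a legacy and a target system
old_model = {"doc_id", "title", "author", "legacy_status"}
new_model = {"doc_id", "title", "author", "lifecycle_state", "product_code"}
gaps = model_gap(old_model, new_model)
print(sorted(gaps["to_enrich"]))  # attributes needing enrichment before migration
```

The "to_enrich" bucket is where unplanned cost usually hides: every attribute in it represents data that must be sourced, derived, or manually curated before go-live.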

Forewarned is forearmed

Strategic pre-assessments, which can also be thought of as Phase 0 in the fuller context of a system migration, are an investment in a tight, focused, and hopefully expedited main project.

As a rule of thumb, we recommend allowing 6-8 weeks ahead of the core undertaking. During this time a project manager, migration lead, and business analyst will conduct a thorough analysis of all of the variables and propose a migration approach that will deliver maximum value.

This pre-assessment can be conducted entirely remotely.

Involving the execution team ahead of time also starts to build a strong relationship and understanding of the context of the migration, setting expectations on both sides. All of which should contribute to and build confidence in a smooth project delivery.

To discuss your own data migration journey, please fill out this contact form and we’ll put you in touch with our experts.

3 Life Sciences data trends & associated risks to watch out for in 2023

With the focus of regulated information processes shifting toward structured data exchange, pharma companies are investing heavily in systems to help them capture, collate, analyze, and manage that data in smart and efficient ways.

But in their keenness to digitally advance their operations, companies need to be careful they don’t unwittingly create new complexity and costs for themselves.

Here are 3 trends we expect to grow in prominence in 2023, and how to best navigate them so that they don’t create more problems than they solve:

1. The pursuit of end-to-end regulatory information management/the rise of best-of-breed

It makes perfect sense that companies want to streamline end-to-end RIM activity, so they can view and manage data consistently and harness it efficiently.

But if you become locked into a single vendor’s software in the process, this could create new risk. If any single brand effectively has control over your company’s data, you’ll need to maintain that relationship no matter what, and absorb any changes in that vendor’s direction or cost structure over time.

Emerging best practice is to maintain strong master data, and then take a platform approach to mixing and matching the best applications or ‘modules’ for each set of tasks (e.g. Clinical, Regulatory, Quality, Safety & Pharmacovigilance).

By emphasizing the data as an asset in its own right, and optimizing modern plug-and-play application integration to allow this to flow seamlessly to where it’s needed, companies can benefit from the best of both worlds – optimal functionality, without lock-in.

And modern, cloud-based deployment makes this easier than ever. 

2. Progress beyond IDMP: the extended roadmap

By now companies generally know what they’re doing with ISO IDMP to comply with EU/EMA expectations.

But if you really want to innovate and drive new efficiencies internally, you’ll need to look beyond the immediate requirements around regulated product data. You might want to establish a clear line of sight across your ERP system too, for instance, to create a seamless data trail right the way through to manufacturing.

Otherwise, you’ll end up with new silos which, as ever, are a source of cost and risk.

In 2023, it’s time to ask ‘What’s next?’ and form a roadmap that extends beyond EU IDMP for its own sake.

3. A recalibration of data projects, learning from outsourcing mistakes

In any large system project, it can be tempting to downplay the ‘data’ detail and make it a target for offshoring to contain costs. But this is at odds with the importance of data and its quality in ensuring that every other element of a project and its goals can be delivered.

All too frequently we see statements of work being repeatedly tweaked in an attempt to control costs. And, more often than not, it’s the critical data tasks that are the focus of those ‘efficiency savings’.

Procurement KPIs seem geared to insisting on these cuts, which is immensely short-sighted. Data is the most complex and skilled part of any system project and, if that data work is skimped on to save money, the chances are that the system will fail and ALL of the associated investment will have been in vain.

Just as you would never compromise on a Class I Project Manager, you should never scale back on the required data expertise – especially in complex areas where a certain talent density and experience is needed (e.g. across IDMP, Clinical, Quality and other specialist disciplines).

In 2023, as data becomes intrinsic to almost every strategic systems initiative, business owners really need to push back against pressure to ‘best shore’ data work, to ensure that the project as a whole delivers. This is no time to be squeezing specialist service providers in favor of low-cost, low-touch commodity offerings whose output can’t be guaranteed.

For more information, please complete this contact form and we’ll put you in touch with our experts.

Highlights of Generis and fme’s “Data-centricity in RIM” Webinar

In October, fme’s Director of Business Consulting David Gwyn was a featured contributor in an informative webinar with the Generis CARA Life Sciences team. He was able to share his rich experience and perspective on the value of a data-centric approach to document and information management, and outline some of the benefits that can be realized across an organization.

Generis also provided a comprehensive demo of their CARA Life Sciences Platform, and how it can improve quality, efficiency, consistency, and scalability across any organization.

Below is a summary of David’s introduction, an outline of the webinar, and a highlight video of the presentation. View the full webinar on the Generis site, and contact us with any questions you have about data-centricity or the CARA Life Sciences Platform.

 

Summary of Data-Centricity Introduction

David Gwyn: I’d like to speak for a few minutes on the essential concept of data-centricity. What I mean by that is how we can reshape our thinking about documents and the other entities we’re managing beyond traditional paper. Right now we all have an opportunity, and I will argue a necessity, to change the way we’re thinking about the information we’re managing and move forward into a more data-centric and data-focused approach.

I’m sure you remember the days when we would produce reams and reams of paper that we’d stack on a conference room table and ask the CEO to come in and sign the 356h. Eventually we said, “Let’s digitize this,” so we took the Word documents we used to print out and turned them into PDFs. While this was a step forward, we really just took pictures of all those documents and obfuscated all the value. All the data that was there was buried in a digital version of the paper process.

There are much better solutions now that eliminate traditional challenges, and provide extensive improvement to quality, efficiency, consistency, and scalability across your entire organization. Let’s look at what’s possible.

Data-Centricity Webinar Outline

  • Overview of a document-centric process
  • Impacts of document focus
  • Brief History of Medicinal Product Submissions
  • What is triggering the transition to digitalization of the process?
    • Regulations, data standards, compliance
    • Improve quality, efficiency, consistency
    • Enable scalability, promote quality, endure changing landscapes
  • Characteristics of a data-driven approach
  • Benefits of data-centric process
  • Questions to ask to prepare for a transition to a data-centric approach
  • Detailed demo of Generis CARA Life Sciences Platform

Watch the Webinar Highlights

 

For more information, please complete this contact form and we’ll put you in touch with our experts.

 

Why wait for a big bang consolidation project to connect teams to the data and content they need?

When companies merge, it can take several years to consolidate IT systems and phase out duplicate, outdated or incompatible applications. And, in the meantime, business must continue as usual, which means that teams (existing, blended, or new) will need timely access to widely dispersed information and documents that pre-date the acquisition, alongside any new data and content.

One relatively painless and inexpensive way to enable this is through a cloud-hosted, subscription-based integration service, which makes it possible to form reliable interconnections and data/content exchange pathways between old and current systems.

Instead of having to log into multiple different systems to gain a clear, up-to-date picture of a current scenario, business users can simply go into their primary application, which gives them a view of the connected data and documents they need.

Out-of-the-box connectivity

The key to such a facility is a series of web services/system connectors, and low code as needed, to link to and exchange data with other platforms. At fme, we have ready-to-go connectors to all of the popular content repositories, from OpenText Documentum and SharePoint generically, to CARA and Veeva at a more specialist level (e.g. for Life Sciences Regulatory information or Quality management). This shortens the time to information access considerably.

This easy integration proposition is a great option for any situation where time is ticking on the transitional service agreement. It allows the relevant datasets to be ‘lifted and shifted’ when there just isn’t the bandwidth to engage in large-scale system consolidation or data migration initiatives before the cord is cut with the old set-up.

As long as there is an application programming interface (API), we can build a connector into any retiring system, allowing data and content to be pushed and pulled between this and the new target system(s), whether on a timed schedule or on demand – even without the SaaS option.
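In outline, such a connector pairs a pull from the retiring system's API with a push to the target system, run on a schedule or on demand. The sketch below uses in-memory stand-ins for the two systems; the interface and class names are hypothetical illustrations, not fme's actual connector API:

```python
from typing import Dict, Optional, Protocol

Records = Dict[str, dict]  # record ID -> metadata/content payload

class Repository(Protocol):
    """Minimal surface any system exposes through its API."""
    def pull(self) -> Records: ...
    def push(self, items: Records) -> None: ...

class InMemoryRepository:
    """Stand-in for a real system; a production connector would wrap
    the system's actual API behind the same pull/push interface."""
    def __init__(self, items: Optional[Records] = None):
        self.items: Records = dict(items or {})

    def pull(self) -> Records:
        return dict(self.items)

    def push(self, items: Records) -> None:
        self.items.update(items)

def sync(source: Repository, target: Repository) -> int:
    """Copy records from source to target; returns the number moved.
    A scheduler (or an on-demand trigger) would call this repeatedly."""
    items = source.pull()
    target.push(items)
    return len(items)

legacy = InMemoryRepository({"SOP-001": {"title": "Cleaning SOP"}})
current = InMemoryRepository()
moved = sync(legacy, current)
```

A production connector would implement the same pull/push surface over the retiring system's real API, which is why an available API is the one hard prerequisite.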

Buying your company vital time

One of the main benefits of taking an integration approach as an interim solution to system consolidation/replacement is that it buys a company time before it has to untangle a multitude of systems or put in something new, such as an all-singing, all-dancing RIM system. While it evaluates its best options, it can continue operating at full capacity, knowing that users have access to live and accurate information to support their current tasks.

fme’s integration-center can be deployed in a client’s own cloud environment, or provided as a hosted, subscription-based microservice.

This deployment model, coupled with our deep and extensive expertise in enterprise content management (ECM) system implementations and our pre-packaged, technology-aware accelerators (connectors), means we can often get users up and running with a fully connected environment within just a couple of months. (An environment that can be quickly switched off again once a bigger, more permanent project has been deployed.)

It’s easy to become overwhelmed by all of the various activities, with the result that companies do not know where to start. fme’s Integration Consultants can provide invaluable advice here, helping teams determine which data they most need to access in the short term and prioritize the content that is really important.

 

For more information, please complete this contact form and we’ll put you in touch with our experts.

 

Lessons learnt from life sciences content migrations

3 common pain points and how to avoid them.

As I explored recently, digital transformation can surface a series of perplexing data challenges for most organizations, but particularly those operating in life sciences. Such has been the inertia in the sector around system modernization that, when a project is finally approved and chartered, stakeholders often rush toward the benefits without looking back. This can result in disappointment and frustration when the ‘as is’ data migrated to the new set-up turns out to be in a poor state and largely unusable.

In this blog, I want to focus on 3 particular takeaways from such experiences which companies can extrapolate from to avoid making the same mistakes.

1.     Technology is just one piece of the puzzle in a successful system migration

The successful execution of any project requires equal focus on people, process, and technology, so that everything is aligned to deliver the expected outcomes. Certainly, it's important to line up sufficient resources to plan and manage the transition, to build engagement and momentum among all stakeholders and users, and to provide any new skills and resources they might need.

But another element that's often neglected is the data the new system will draw on to deliver the expected process streamlining and improved visibility. However fast and feature-rich the new technology platform, if it is dependent on old data of inadequate quality it won't be able to fulfill its promise. If the new system can't fulfill future-state governance or meet new standards for regulatory compliance, then the ‘retro-fitting’ burden could be immense as teams try to massage and improve the data after the fact. Unplanned data enrichment is hugely time-consuming.

The lesson here is about scoping system projects thoroughly and planning them early enough so that every dimension is well catered for in plenty of time.

If delivery is scheduled close to a deadline for adhering to new health authority submission standards, companies will want to be sure that the data coming across is in good shape and fit for purpose, so that the deadline can be confidently met. Filling CMC Module 3 in eCTD submissions is already a hot spot, highlighting where existing system data typically falls short. Companies can learn from this by doing their due diligence and performing a detailed data assessment up front, to help mitigate these kinds of risks.

2.     Time & resources are critical success factors

Unless a project has a sufficient runway leading up to it, pressures will mount, and good intentions are likely to give way to compromise and the cutting of corners. Even if they plan to bring in external help, the key project stakeholders will need to set aside their own people's time to do the vital groundwork: understanding old and new data governance parameters, so that any data preparation and the actual data migration are in line with current and emerging requirements. Without those parameters, and a clear picture of the ‘as is’ status, project teams risk investing their time and budgets in the wrong places and setting themselves up for a big post-migration clean-up operation.

So, even before performing a data quality assessment, it’s a good idea to seek a bit of preliminary strategy advice from a trusted expert – almost as a Phase 0 – to understand the bigger picture and how everything needs to align to deliver against it.

This isn’t about engaging specialists prematurely, but rather about making sure that any investment that follows (and any external help brought in) is well targeted, and so delivers maximum value.

3.     It’s important to allow and plan for failure

Despite the best intentions, projects can go awry due to the many moving parts that make up the whole. So, it’s important to factor ‘the unexpected’ into all planning.

This includes allowing for a certain number of iterations based on the findings of data quality assessments, to get the data to fit the required data governance standards going forward. If the data coming across is in a disastrous state, a planned migration phase could quickly turn into a material schedule delay. Underestimating the work involved is very common; I have seen this in many client projects. For example, where the ‘happy path’, in which everything goes to plan, was expected to take 10-12 months, the real-life route took 18 months. So, to de-risk the project, allow for contingency: if in doubt, take the forecast number of days and double it.

All of the preparatory work recommended above should help contain delays and protect against timescale or budget shocks, but it’s better to plan properly so that the journey goes as smoothly as it can. Although, ultimately, the client company is responsible for the quality and integrity of its own data, technology vendors and service providers will need to plan their involvement too and ensure they have the right skilled resources available at the right time.

Our experts can provide guidance on all of the above. Ultimately, this is about setting the right expectations and a realistic schedule, and resourcing projects so that they optimize the time to value. A little foresight can go a long way.

To discuss your own data migration journey, please fill out this contact form and we’ll put you in touch with our experts.