Digital transformation with Veeva? Choose a Certified Partner that can do it all

Veeva certification: Our annual commitment

Veeva has three releases each year, and all Veeva partners are required to re-certify annually to ensure 360-degree familiarity with the most recent updates. This is a big commitment for any company to make to support the platform, but it ensures that Veeva implementation and service partners can provide the highest level of knowledge and support for Veeva customers.

To keep up to date with the features and opportunities of the evolving product suite, fme’s more than 30 Certified Professionals maintain their skills and certifications through the following courses, and verify their knowledge with final exams that can take an additional three hours to complete.

Vault Platform – Associate White Belt Exam (live-led courses)

  • Vault Platform Fundamentals Pre-work (4 hours)
  • Vault Platform Fundamentals (4 hours)
  • Vault Business Administrator (5 hours)
  • Vault System Administrator (11 hours)
  • Vault Platform Workflow & Security Fundamentals (5 hours)
  • Vault Document Workflow & Security Fundamentals (6 hours)
  • Vault Platform Data Load & Configuration Migration (5 hours)

To extend the validity of the certification, it’s necessary to pass the latest Veeva Release Training and Exam, which covers both topics linked to the recent releases and the Veeva Vault Platform as a whole.

Other Life Science-specific Veeva certifications

For specific solutions (QualityDocs, QMS, Submissions, etc.), there is a dedicated Associate White Belt exam that also needs to be renewed each year.

Vault QualityDocs Associate White Belt

As with the wider Veeva Vault Platform, specific live-led courses need to be completed, including:

  • QualityDocs Fundamentals & Business Administrator (2 + 4 hours)
  • QualityDocs System Administrator (4 hours)

Vault QMS Associate White Belt

  • QMS Fundamentals & Business Administrator (2 + 4 hours)
  • QMS System Administrator (4 hours)

Vault RIM Submissions Associate White Belt

  • Submissions Fundamentals (2 hours)
  • RIM Suite Domain Overview Pre-work (3.5 hours)
  • Submissions Fundamentals Pre-work (3.5 hours)
  • Submissions Business Administrator (8 hours)
  • Submissions Business Administrator Pre-work (5 hours)
  • Submissions System Administrator (8 hours)

Vault RIM Submissions Archive Associate White Belt

  • Submissions Archive Fundamentals (2 hours)
  • RIM Suite Domain Overview Pre-work (3.5 hours)
  • Submissions Archive Business Administrator (4 hours)
  • Submissions Archive System Administrator (4 hours)

fme is your Veeva Certified Partner

Whether you’re implementing Veeva for the first time or bringing more content across to the platform, fme – a longstanding Veeva Certified Partner for both technology and services – is the ideal choice to manage all your integration and data migration needs. That’s thanks to:

  • Our deep experience. fme was founded in 1995 with a core team of digital transformation experts and has grown to a global presence with offices in Germany, Romania, and the United States, as well as operations in India. We integrated the Veeva product catalog into our offerings in 2020 and have expanded our partnership each year since.
  • The scale of our resources. We have over 30 Veeva Certified experts working across our client projects at any one time, with a team of experts in OpenText, Generis, and all major platform solutions.
  • Our deep Life Sciences knowledge. We have deep experience with Veeva solutions across all kinds of Life Sciences use cases. We understand your functional and business needs, and have solved many of the most complex challenges in the industry.
  • Our extensive legacy system knowledge. We have in-depth knowledge of the environments that Veeva adopters are moving from, so we have unique insights into how to ensure a migration project achieves its goals the first time, on time, minimizing downtime and workflow interruptions.

Our tested and proven holistic approach, as well as our commitment to an efficient, cost-effective deployment and migration, will decrease your TCO while increasing your technology ROI. We know both the pitfalls and potential of effectively integrated solutions, and can help you plan, implement, migrate, and maintain your technology environment. We can also take care of all aspects of business analysis, data validation, data clean-up, and data enrichment – all critical tasks in the digital transformation and business acceleration process.

fme’s proprietary migration-center

As a services partner, we understand the power and potential of Veeva’s solutions, but the fme team takes it a step further: we are also a Veeva Technology Partner. To provide the most efficient and error-free transition to the Veeva platform, we’ve built Veeva support natively into our migration-center tool. Continually refined over the last 16 years of global migration projects, migration-center is our proprietary tool that allows clients to migrate, consolidate, archive, decommission, and flexibly coordinate all of their stored data and documents, saving 60% in costs and 80% in project duration, all without impacting daily business operations. It simplifies migration from any platform to Veeva solutions and is not offered by any other provider. Learn more at migration-center.com.

Work with the Veeva experts

As a leading Veeva Certified Partner with extensive life sciences experience, fme can help you plan, implement, and migrate to this powerful platform with confidence, speed, and ease. Contact us to get started.

For more information, please complete this contact form and we’ll schedule a time to discuss your Veeva-related challenges and solution options.

Always something new – A document management system in changing times (1)

Overview of the blog post series

  • Modernization: OpenText Webtop to D2
  • Development of a custom administration client
  • Phasing out the OpenText Webtop
  • Interface to other systems
  • Change Management / Communication

Background facts

  • More than 100,000 users
  • More than 70 million documents
  • Used in numerous countries and companies

Part 1: Modernization – OpenText Webtop to D2

OpenText Webtop is a technology that has been outdated for several years. Neither the user interface nor the functionality has been further developed by the vendor, so users must adapt to the rigid web application. At the same time, the application is reaching its limits and can no longer handle large amounts of data. The result: dissatisfaction, frustration, and acceptance problems among users. Our client’s goal was therefore clear: user acceptance had to be increased again while ensuring the system is future-proof.

What doesn’t fit is made to fit – the development

Existing technology

  • OpenText D2 21.4
  • OpenText Content Server 21.4
  • Oracle 19c
  • Red Hat Linux 7.9
  • Apache Tomcat 5.5

With OpenText D2, a standard application was available that was built for modern technology, including the latest browsers. This was a good starting point, but of course the standard did not take all of the client’s needs into account. So we developed and customized together with the client, brought power users on board in an early pilot, and then successively expanded the pilot user base.

The result

  • A modern, sleek interface – composed of a wide variety of widgets
  • All necessary functions for normal and power users are available
  • Fast document display for numerous formats
  • Customizable interface
    • Users can switch between different interfaces (workspaces), depending on work style and focus
    • Users can tailor their individual interface to their wishes and needs by selecting relevant widgets

Please find further information here: Datasheet Application Modernization

From development to user acceptance – with change management

The functioning technology is the basis, but it is essential to bring the users along with the change. Even if the new system obviously offers many advantages, most people do not like change. For this reason, we supported the entire rollout with change management to explain the new options in an appealing way and to ease the transition. This included a custom design, information sessions, help documents, and short videos. We will provide a more detailed insight into our change management approach in the last part of our blog series.

Last but not least, the most important question: Did we succeed and achieve our goal?

As a reminder, the goal was to future-proof the document management system and to improve user acceptance. Against this background: yes, the goal was definitely achieved. The system is ready for the future – new technology, compatible with the latest browsers, and all relevant functions are available. And user acceptance has also increased significantly. The new, customizable options are being actively used.

The attentive reader may have noticed that so far we have only talked about the users. But what would a document management system be without comprehensive administration options? Since the standard version of OpenText D2 did not offer a way to implement the customer’s requirements, we developed a customized administration client, through which the assignment of permissions, default settings, regular permission reviews, and many other administrator activities take place. But more about this in the next part of our blog series…

To discuss your own document management system modernization journey, please fill out this contact form and we’ll put you in touch with our experts.

DIA RSIDM ’23 in Bethesda, Maryland

Will you be attending DIA RSIDM next month? We’re looking forward to being (safely) back in person to discuss the challenges and opportunities within regulatory information and document management.

Visit us at Booth #202 to discuss recently updated strategies and solutions and the advantages of CSA validation, and to learn more about the meetup on the EDM Structured Submission Reference Model organized by fme’s own David Gwyn. You may have seen him presenting recently on building a more effective and efficient data-centric approach to document management in RIM.

Essential Sessions

If you haven’t reviewed and selected your sessions yet, here are just a few we feel are well worth the time to attend. Hope to see you there!

Driving Performance from RIM

Enhance your processes to drive ultimate performance from your RIM technology. While describing the road to high performance, the speakers in this session will provide clear examples of regulatory metrics that demonstrate a high-performing RIM technology, and show how to make the most of country affiliates.

Is It Time to Streamline Your Processes with Structured Content Management?

The current standard processes for document management are no longer sufficient for streamlining submissions, nor able to support the level of content management, collaboration, and re-use that is possible with the transition from documents to data. This session focuses on the practicalities and benefits of structured content management (SCM), including the advantages of transitioning from documents to data-centric processes, and practical use cases and key success factors for SCM.

IDMP Ontology: Semantic Interoperability Throughout the Entire Medicinal Product Lifecycle

This session outlines the implementation challenges of ISO IDMP standards at pharma companies and showcases how an ontology can help achieve semantic interoperability of product data.

How Innovative Collaboration Supports Compliance Across the Regulatory Ecosystem

This session will explore ways sponsors can achieve both greater control and compliance as well as improving their way of working together with health authorities to expand access to medicines.

Visit us at Booth #202

We are excited to participate in these valuable conversations with regulatory thought leaders, and re-connect with industry colleagues! Contact us to schedule a time to talk, or stop by Booth #202 to talk with our solution experts, and share your perspective and questions with the fme team. We look forward to speaking with you!

Advantages of CSA – Computer Software Assurance – Over Traditional CSV

For over 20 years, traditional computer system validation (CSV) has created mountains of paperwork to validate a new or updated system. It has also created an overwhelming burden that prevented many companies from upgrading their complex systems. CSA – Computer Software Assurance – was introduced to relieve that burden, allowing companies to optimize validation activities by focusing on the processes that impact patient health and safety.

There are many advantages of CSA over the traditional CSV approach. It is a more streamlined and efficient risk-based methodology that saves time, frustration, and money by: 

  • Providing clarity on the FDA’s guidance and methodology
  • Driving critical thinking to identify, evaluate, and control potential impact to patient safety, product quality, and data integrity
  • Focusing on the ability to leverage vendor qualification activities
  • Providing streamlined testing instead of one-size-fits-all
  • Saving as much as 80% of validation costs

These are much-needed improvements that are being welcomed by forward-thinking companies striving to improve their systems to stay competitive in today’s volatile market.

Old Lessons Applied to Current Challenges  

In the early 20th century, Henry Ford shifted his thinking about how cars were manufactured. His industry-changing assembly line focused on specific sub-components of his vehicles, creating a plethora of efficiencies and quality improvements that allowed him to achieve unprecedented production goals.

Life sciences companies are now able to apply similar lessons in the context of validation. Much like the traditional car factory, the traditional CSV methodology demands extensive structure for every aspect of the system. CSA opens up new tools, templates, and techniques, revised SOPs, and training to shift the focus to thinking critically rather than dogmatically. It is a dramatic shift in focus that can improve a company’s competitive edge by increasing its ability to test and adopt new business processes and systems, and by accelerating validation activities.

fme Delivers the Advantages of CSA 

The fme team are life sciences experts with the deep regulatory, clinical, and quality experience required to integrate complex business and regulatory compliance requirements. By leveraging proven practices from previous decades, our extensive process expertise, and today’s best-in-class toolsets, we fast-track your evolution from CSV to CSA, eliminating manual, labor-intensive validation efforts and establishing a proven risk-based methodology.

Learn More about the Advantages of CSA 

No registration is currently required to download fme’s Optimizing Validation through a Risk-Based Approach: Leveraging Computer Software Assurance (CSA), where you can learn more about CSA and about our CSA training options: online, instructor-led, or a hybrid of eLearning and remote instructor coaching. We are happy to provide you with detailed information on our validation service offerings and can even tailor an approach that meets your needs and exceeds your expectations.

 

Just because a current content/data-based process works doesn’t mean it’s efficient

New or consolidated systems should lead to better outcomes, so content migration pre-assessments are important to maximize the ROI.

Whether the goal is digital transformation, system consolidation, or moving to a new content management system – if you’re going to spend a lot of money on a new IT project, it should be with a view to delivering something tangibly better.

Too often, however, departmental teams have become so adept at process workarounds to assemble or manage content that they lose sight of what’s possible. As a result, when asked to give an overview of their current systems and ways of working, they tend to be overly optimistic about the caliber and integrity of the content that will need to be transferred to the new system.

This creates risk, as content migration projects are scoped, planned and costed on the back of these insights.

It’s quite odd, when you think about it, that such pivotal projects – which may involve critical Regulatory, Clinical or Quality systems – should be left to chance in this way. No airline or pilot would embark on a transatlantic flight without first checking for expected weather events, happy to simply react and make adjustments once hurricane conditions present themselves. And yet companies fix budgets and set deadlines for projects that have been scoped with only partial knowledge of the conditions that will be encountered. They prepare for a smooth ride, yet in nine cases out of ten experience something altogether more turbulent.

Apples & oranges

In the aftermath of a merger or acquisition, it’s expected that blending systems will throw up some issues if the technology platforms differ, the object models don’t match, or if the receiving/lead company does not have direct insights into the scale and integrity of the incoming systems of record.

But even within one company, there are likely to be corrupt, inaccurate, incomplete or out-of-date files, or differences in data model, which will continue to cause issues if migrated without remediation to a new platform or system.

And it is far better to understand the scope and scale of such issues before a content migration project takes form. The danger, otherwise, is that an already sizable undertaking will multiply as change order after change order is pushed through, with the result that ‘best case’ deadlines and budgets are far exceeded.

Warning signs

So how can you tell if you are likely to encounter such issues?

Clues to a sub-optimal starting point might include:

  • Over-reliance on highly manual or protracted processes, often involving multiple people, to prepare and submit a document;
  • Dependence on file shares or non-managed systems to locate information;
  • The need to regularly plug gaps in content by chasing down additional detail; and/or
  • Uncertainty about the actual number of documents that will be required in the new system.

Don’t rely on guesswork

The only reliable way to scope content migration work is to engage the specialists ahead of time. Giving them an opportunity to look over the data themselves, ask the right questions, establish the true number of documents in scope, and conduct a gap analysis between the data models of the old and new systems will ensure that the formal migration project is scoped and designed optimally.

From all of this knowledge, and with a clearer idea of how content is typically organized, those who will later be tasked with performing the migration will be able to architect the best approach – both tactically and strategically.

Considerations include:

  • How much data/content is earmarked to be migrated (and which data/content is beyond the scope of this project)?
  • Where is the data/content coming from, and where is it going to?
  • Which data models are involved in the old and new state?
  • How many data/content attributes exist in the old and new system?
  • What are the risks associated with a poor or badly scoped migration?
  • Where are the gaps/differences between the old and new models, and what will be needed to address them?
  • Given all of the known parameters, will a phased or a ‘big bang’ approach work best?
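
To make the data-model gap analysis mentioned above concrete, here is a minimal sketch in Python. It is purely illustrative: the two toy models and every attribute name in them are invented, and a real pre-assessment would export the models from the actual source and target systems and compare value formats and vocabularies, not just attribute names.

    # Hypothetical sketch: compare attribute names between an old and a new
    # data model. Both toy models are invented for illustration; in practice
    # they would be exported from the source and target systems.
    OLD_MODEL = {
        "doc_title": "string",
        "doc_type": "string",
        "author": "string",
        "effective_date": "date",
        "department": "string",
    }

    NEW_MODEL = {
        "title": "string",
        "type": "picklist",
        "owner": "user",
        "effective_date": "date",
    }

    def gap_analysis(old: dict, new: dict) -> None:
        """List attributes that exist on only one side; each one needs a
        mapping, enrichment, or retirement decision before migration."""
        print("Shared (verify types and value formats):", sorted(set(old) & set(new)))
        print("Old-only (map, archive, or drop):", sorted(set(old) - set(new)))
        print("New-only (derive or enrich):", sorted(set(new) - set(old)))

    gap_analysis(OLD_MODEL, NEW_MODEL)

Even this simple comparison makes the scoping question visible: every attribute that exists on only one side represents a mapping, enrichment, or retirement decision that is far cheaper to make before the project is budgeted than after.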

Forewarned is forearmed

Strategic pre-assessments, which can also be thought of as Phase 0 in the fuller context of a system migration, are an investment in a tight, focused, and hopefully expedited main project.

As a rule of thumb, we recommend allowing 6-8 weeks ahead of the core undertaking. During this time a project manager, migration lead, and business analyst will conduct a thorough analysis of all of the variables and propose a migration approach that will deliver maximum value.

This pre-assessment can be conducted entirely remotely.

Involving the execution team ahead of time also starts to build a strong relationship and understanding of the context of the migration, setting expectations on both sides. All of which should contribute to and build confidence in a smooth project delivery.

To discuss your own data migration journey, please fill out this contact form and we’ll put you in touch with our experts.

3 Life Sciences data trends & associated risks to watch out for in 2023

With the focus of regulated information processes shifting toward structured data exchange, pharma companies are investing heavily in systems to help them capture, collate, analyze, and manage that data in smart and efficient ways.

But in their keenness to digitally advance their operations, companies need to be careful they don’t unwittingly create new complexity and costs for themselves.

Here are 3 trends we expect to grow in prominence in 2023, and how to best navigate them so that they don’t create more problems than they solve:

1. The pursuit of end-to-end regulatory information management/the rise of best-of-breed

It makes perfect sense that companies want to streamline end-to-end RIM activity, so they can view and manage data consistently and harness it efficiently.

But if you become locked into a single vendor’s software in the process, this could create new risk. If any single brand effectively has control over your company’s data, you’ll need to maintain that relationship no matter what, and absorb any changes in that vendor’s direction or cost structure over time.

Emerging best practice is to maintain strong master data, and then take a platform approach to mixing and matching the best applications or ‘modules’ for each set of tasks (e.g. Clinical, Regulatory, Quality, Safety & Pharmacovigilance).

By emphasizing the data as an asset in its own right, and optimizing modern plug-and-play application integration to allow this to flow seamlessly to where it’s needed, companies can benefit from the best of both worlds – optimal functionality, without lock-in.

And modern, cloud-based deployment makes this easier than ever. 
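
As a loose illustration of that decoupling – not any vendor’s actual API – the sketch below keeps master data in a vendor-neutral record and pushes it to each best-of-breed module through a thin adapter behind a common interface. All names here (MasterProduct, the adapter classes, the endpoint paths) are hypothetical.

    # Illustrative sketch only: master data kept vendor-neutral, with thin
    # adapters translating it for each best-of-breed module. All class names
    # and endpoint paths here are hypothetical.
    from dataclasses import dataclass
    from typing import Protocol

    @dataclass
    class MasterProduct:
        product_id: str
        name: str
        dosage_form: str

    class ModuleAdapter(Protocol):
        def push(self, product: MasterProduct) -> None: ...

    class RegulatoryAdapter:
        def push(self, product: MasterProduct) -> None:
            # Translate the neutral record into this module's payload shape.
            payload = {"id": product.product_id, "name": product.name}
            print("POST /regulatory/products", payload)

    class QualityAdapter:
        def push(self, product: MasterProduct) -> None:
            payload = {"sku": product.product_id, "label": product.name}
            print("POST /quality/items", payload)

    def sync(product: MasterProduct, modules: list[ModuleAdapter]) -> None:
        # Every module sits behind the same interface, so any one adapter can
        # be swapped for another vendor's without touching the master data.
        for module in modules:
            module.push(product)

    sync(MasterProduct("P-001", "Examplex", "tablet"),
         [RegulatoryAdapter(), QualityAdapter()])

Because the master record never depends on any one module’s schema, swapping one vendor’s module for another means rewriting a single adapter rather than re-platforming the data.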

2. Progress beyond IDMP: the extended roadmap

By now companies generally know what they’re doing with ISO IDMP to comply with EU/EMA expectations.

But if you really want to innovate and drive new efficiencies internally, you’ll need to look beyond the immediate requirements around regulated product data. You might want to establish a clear line of sight across your ERP system too, for instance, to create a seamless data trail right the way through to manufacturing.

Otherwise, you’ll end up with new silos which, as ever, are a source of cost and risk.

In 2023, it’s time to ask ‘What’s next?’ and form a roadmap that extends beyond EU IDMP for its own sake.

3. A recalibration of data projects, learning from outsourcing mistakes

In any large system project, it can be tempting to downplay the ‘data’ detail and make it a target for offshoring to contain costs. But this is at odds with the importance of data and its quality in ensuring that every other element of a project and its goals can be delivered.

All too frequently we see statements of work being repeatedly tweaked in an attempt to control costs. And, more often than not, it’s the critical data tasks that are the focus of those ‘efficiency savings’.

Procurement KPIs seem geared to insisting on these cuts, which is immensely short-sighted. Data is the most complex and skilled part of any system project and, if that data work is skimped on to save money, the chances are that the system will fail and ALL of the associated investment will have been in vain.

Just as you would never compromise on a Class I Project Manager, you should never scale back on the required data expertise – especially in complex areas where a certain talent density and experience are needed (e.g. across IDMP, Clinical, Quality, and other specialist disciplines).

In 2023, as data becomes intrinsic to almost every strategic systems initiative, business owners really need to push back against pressure to ‘best shore’ data work, to ensure that the project as a whole delivers. This is no time to be squeezing specialist service providers in favor of low-cost, low-touch commodity offerings whose output can’t be guaranteed.

For more information, please complete this contact form and we’ll put you in touch with our experts.