Preparing for Migration: Critical steps to know

In a previous article, “Migration strategy and the path to operational resilience”, we examined the relationship between data migration and regulatory compliance, and why enterprises must create a clear plan before initiating a data migration. Today, we will look at how migration is never one-size-fits-all: each company’s migration journey will be different, requiring a strategic approach matched to the complexity of its unique data. There are several key steps to ensure migration initiatives are as streamlined as possible.

  • Understand the project scope: Assessing the quality, requirements, and complexity of your data will help you set the right migration strategy.
  • Set realistic timelines: The assessment phase will help create an achievable migration plan with clearly measurable progress.
  • Define the migration rules with the experts: Let the migration experts propose the migration order and rules for you to confirm.
  • Validate the migration: Set the testing strategy during the assessment phase. Conduct preliminary testing during the building phase and confirm with the business that all use cases were considered when preparing the test scripts.
  • Finalize the migration strategy: The migration strategy dictates how the migration will be performed, in one pass or incrementally, and how to shut down and dispose of legacy systems.
  • Run sanity checks in production: Once the migration is complete, a sanity check on an agreed sample of the migrated data should confirm that the migration was performed in accordance with the validation phase.

Define the scope of the project

Scoping a migration project involves defining the parameters, requirements, and goals of the project, and developing a clear plan and timeline for the migration. It’s also important to define the limits of the project and identify what won’t be included in the current phase.

For example, in the pharma industry, implementing the IDMP (Identification of Medicinal Products) standards requires updates, changes, and possibly even the roll-out of a new information system, but this type of agile modus operandi doesn’t always align with the pharma industry’s historically siloed way of working. Nearly every team in the drug development lifecycle – safety, clinical, regulatory, and research and development – may use different, disconnected systems that might not even integrate (e.g. submission management, RIM, master data management, pharmacovigilance, document management systems). At the same time, IDMP implementation encourages them to adopt a single, unified platform that removes silos and connects data and people.

Once the decision to implement a new solution and migrate is made, it is important to set a clear scope and goal for the whole project. In the context of implementing regulatory solutions in the pharma industry, this is a major undertaking for the company, and a comprehensive assessment of the project is of paramount importance.

Getting started: Plan a data assessment

Before commencing, it is critical to have a clear overview of the project scope, data, and information to ensure efficient planning and execution. Data migration projects in the life science space are often complex, time-consuming, and in most cases involve multiple systems and different technologies. A clear and comprehensive assessment phase is key to avoiding budget overruns, implementation delays, or disrupted business processes.

The assessment phase is primarily used to review and assess the data in the existing systems and identify any potential issues and risks that might occur during the project. The purpose of this phase is not to carry out any migration activities, but to benchmark the scope, set recommendations and strategy, and ensure visibility for the client. At this juncture, the client should already know the migration requirements and expectations. As a result of the assessment, a decision can be made to proceed with the main project once the plan is clear, or to first perform a POC (Proof of Concept) to assess some of the more complex data in the migration (e.g. migration of the drug registration history).

Choosing the right migration approach

Once a thorough data assessment has been carried out, the next step is to decide on the right migration approach. Every company’s data is unique, and the appropriate migration strategy will depend entirely on the quality, value, and complexity of the data. For example, sensitive regulatory data should be handled with particular care. Moving ahead without proper planning will ultimately cause more work in a later phase, and possibly undermine a project’s success. We often recommend that businesses migrate the registrations of important products separately and the rest after go-live, using the same migration rules. This decision makes sense if a high volume of data needs to be enriched and the business cannot provide it on time.

Another important consideration is whether to migrate everything at once – a big bang migration – which requires considerable time and resources to complete. Alternatively, businesses can carry out the migration incrementally and transfer data in phases – a rolling migration. A big bang can be more straightforward, but no additions or changes can happen during the cutover, as all data processes are paused while the migration runs. An incremental approach doesn’t require as much downtime but can bring more complexity, as the source and target systems run in parallel.
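
To make the distinction concrete, the rolling approach can be sketched as a batch loop that parks failing records for remediation rather than halting the run, so the source system can stay live in the windows between batches. This is a minimal illustration, not a real migration tool: the `transform` and `load` callables stand in for whatever connectors and rules a project actually defines.

```python
# Minimal sketch of a rolling (incremental) migration: records move in
# fixed-size batches, and failures are parked for remediation rather
# than halting the run. All interfaces here are hypothetical.

def migrate_batch(batch, transform, load, migrated, failed):
    for record in batch:
        try:
            load(transform(record))
            migrated.append(record["id"])
        except (KeyError, ValueError) as exc:
            # Park the record for later remediation instead of stopping.
            failed.append((record.get("id"), repr(exc)))

def rolling_migration(source_records, transform, load, batch_size=100):
    """Migrate records batch by batch; the source system can remain
    live in the windows between batches."""
    migrated, failed, batch = [], [], []
    for record in source_records:
        batch.append(record)
        if len(batch) == batch_size:
            migrate_batch(batch, transform, load, migrated, failed)
            batch = []
    if batch:  # final partial batch
        migrate_batch(batch, transform, load, migrated, failed)
    return migrated, failed
```

A big bang variant would simply run one pass over a frozen source while all other data processes are paused; the trade-off is downtime versus the complexity of keeping two systems in step.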

Aligning project & migration timelines

Knowing when to initiate the migration is pivotal. Ideally, the migration should take place in parallel with the solution implementation process, and the development of the migration rules should follow the implementation cycles. For example, once the solution implementation of master data is locked, the definition and development of the master data migration can start, and the data can eventually be migrated. After each cycle, the business should confirm the migrated data in the target system and verify that all requirements were fulfilled. This approach means we need to understand the whole concept of the project, including the implementation of the solution, when building the plan. Otherwise, it is impossible to set all cycles in a logical order and assign the right priorities. It is essential that the data migration is carried out in tandem with domain experts who have the technological know-how and strategic acumen to deliver.

Consider a Proof of Concept data migration

Rigorous planning and assessment are critical to the success of any data migration project, and in some instances, a proof of concept (POC) is highly advisable. A POC data migration is essentially a trial run of a larger data migration project, geared towards testing the feasibility of migrating data from one system to another. The purpose of the POC is to demonstrate the viability of the data migration project and to identify any potential issues or challenges that may arise during the actual migration process. When a company is approaching a large-scale data migration, or a high-risk migration involving sensitive or mission-critical business data, a POC can provide significant value.

For a successful POC, it is important that the business provides rich sample data so that the migration team can test and verify the exact processes that will be used in the full migration. Before running the POC, businesses should:

  • Review sample data sets and consider all use cases
  • Confirm the result of the migration fulfills the requirements
  • Examine and define any gaps and how to fill them
  • Define what work is required if they need to enrich their data

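A first pass over the sample data can be automated along these lines: check each record against the fields the target system requires, and report the gaps that need enrichment before the full migration. The field names below are invented for illustration; a real target model would come out of the assessment phase.

```python
# Hedged sketch: checking a sample data set against the target system's
# required fields ahead of a POC run. Field names are illustrative only.

REQUIRED_TARGET_FIELDS = {"product_name", "registration_number", "country", "status"}

def find_gaps(sample_records):
    """Return, per record id, the required fields that are missing or empty."""
    gaps = {}
    for record in sample_records:
        missing = {f for f in REQUIRED_TARGET_FIELDS if not record.get(f)}
        if missing:
            gaps[record.get("id", "<no id>")] = sorted(missing)
    return gaps

sample = [
    {"id": "R-001", "product_name": "Drug A", "registration_number": "123",
     "country": "DE", "status": "active"},
    {"id": "R-002", "product_name": "Drug B", "country": "FR", "status": ""},
]
print(find_gaps(sample))  # R-002 lacks a registration number and status
```

Running a report like this over representative samples makes the enrichment workload visible before budgets and timelines are fixed.
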
By its very nature, migrating data can be a complex and protracted process, and if not thought out fully in advance, can lead to significant data loss and system downtime. In our next blog, we will highlight the key success factors and best practices for optimized migration.

Contact us to start your migration right

Whatever the reason for the data migration, the goal of all stakeholders is to provide a solution that improves business performance and ensures competitive advantage. To achieve this, they should give more attention to data migration and be smarter in assessing, planning, and migrating data with experts who have experience and know the life science business.

fme has been guiding global pharmaceutical and manufacturing firms through their complex migration journeys for over 20 years. We’ve even developed our own proprietary tool, migration-center, to enable seamless migrations with minimal downtime. Contact us to discuss your challenges and start your journey on the right path.

 

Migration strategy and the path to operational resilience

Over the past few years, an effective digital transformation has been underscored emphatically as a prerequisite for long term business success. Complacency around operational and digital resilience now represents a legitimate existential threat to enterprises in any industry, both from a competitiveness and regulatory compliance perspective.

At its core, digital operational resilience describes an enterprise’s ability to mitigate the risk of spontaneous server outages, rapidly recover from service interruptions and maintain a bird’s eye view of potential systems vulnerabilities. Upgrading to modern solutions and maintaining legacy data in those solutions with an effective data migration strategy is an important aspect of robust digital resilience strategies. This article introduces the importance of an effective migration strategy within a digital transformation process, the approaches being adopted by industry leaders, and best-in-class examples that can accelerate business value.

The link between data migration and regulatory compliance

While data migration has become an integral process in the digital transformation game, it is often overlooked as a core facet of regulatory compliance. Botched or poorly executed data migrations can be catastrophic from an operational point of view – leading to data loss, data corruption and unnecessary downtime – and can also incur heavy regulatory sanctions. If personal data is lost or compromised during a migration, it can result in legal and financial penalties, reputational damage, and loss of customer trust. For example, if a company can’t produce the required documentation for review and approval, it can impede drug certification and undercut revenue.

To add to the challenges, public discourse around data privacy has intensified over the past few years, with new frameworks being implemented to protect users’ Personally Identifiable Information (PII): General Data Protection Regulation (GDPR), the California Consumer Privacy Act (CCPA) and the Data Protection Shield are just some of the regulatory initiatives designed to safeguard user data in recent years. This new regulatory minefield has forced enterprises to rigorously assess their processes for handling and storing user data and has amplified the case for robust data migration strategies when integrating new systems and combining data in corporate merger processes. It is essential that data is held securely and that the processes for protecting that data fit snugly within regulatory parameters. Effectively planning a migration to a new system can also enhance data security, while also ensuring that the data is accessible for deletion in accordance with GDPR stipulations. Additionally, enhanced data control and security can greatly mitigate the risk of data breaches and associated regulatory penalties.

According to a 2022 survey by digital infrastructure company Equinix, complying with data regulations was a top priority within the technology strategies of 87% of US organizations, with 83% acknowledging IT infrastructure migration to the cloud as a top priority. The groundswell of momentum powering enterprises’ migration from on-prem physical servers to cloud-based servers shows no signs of slowing down. We believe that innovative data migration technology will be a key component of successful digital transformations to advanced technologies. Luckily for businesses today, the development of advanced data migration technology is bringing more transparency into the data migration process, giving enterprises a diverse set of tools to choose from to ensure they are well equipped to thrive in today’s frenetic digital environment.

Trust proven, consistent experts

The often-ignored truth is that data migration is a complex endeavor that touches multiple departments and roles that need to coordinate tasks, requirements, and timelines. By leveraging innovation partners with demonstrable expertise in the data migration arena, companies can now navigate the migration journey with a greater sense of confidence. To be successful, migration experts and implementation consultants of the target system should have a common understanding of the business objectives. In an ideal scenario, consultants should be available for the entire duration of the project, to ensure consistency of service delivery. Rotating personnel on the project might incur delays as they will need to be briefed thoroughly on the progress or could lead to impaired decision making that doesn’t fully factor in legacy efforts.

Understand your ‘why’

Organizations undertake data migration for a variety of reasons, such as strengthening system security, enhancing customer service capabilities, or driving operational efficiencies. Perhaps a streamlined platform is being implemented to support different business processes, establish a new data warehouse, or merge new data from other sources.

Before any prospective migration program is initiated, there must be consensus and a unified vision from leadership around the desired outcome and goals of the initiative. Irrespective of the initial rationale, the end goal is to have streams of up-to-the-minute, accurate data sets that enable businesses to personalize their services and boost customer retention, while developing a more nuanced understanding of key demographics. In either case, the relevant data must be in the system, whether master data or all relevant registration objects.

In future articles, we will discuss key steps for the preparation phase, and what core considerations should be made before project initiation.

Conclusion and next steps

Whatever the reason for the data migration, the goal of all stakeholders is to provide a solution that improves business performance and ensures competitive advantage. To achieve this, they should give more attention to data migration and be smarter in assessing, planning, and migrating data with experts who have experience and know the life science business.

fme has been guiding global pharmaceutical and manufacturing firms through their complex migration journeys for over 20 years. We’ve even developed our own proprietary tool, migration-center, to enable seamless migrations with minimal downtime. Contact us to discuss your challenges and start your journey on the right path.

Just because a current content/data-based process works doesn’t mean it’s efficient

New or consolidated systems should lead to better outcomes, so content migration pre-assessments are important to maximize the ROI.

Whether the goal is digital transformation, system consolidation, or moving to a new content management system – if you’re going to spend a lot of money on a new IT project, it should be with a view to delivering something tangibly better.

Too often, however, departmental teams have become so adept at process workarounds for assembling or managing content that they lose sight of what’s possible. As a result, when they are asked to give an overview of their current systems and ways of working, they tend to be overly optimistic about the caliber and integrity of the content that will need to be transferred to the new system.

This creates risk, as content migration projects are scoped, planned and costed on the back of these insights.

It’s quite odd, when you think about it, that such pivotal projects – which may involve critical Regulatory, Clinical or Quality systems – should be left to chance in this way. No airline or pilot would embark on a transatlantic flight without first checking for expected weather events, happy to simply react and make adjustments once hurricane conditions present themselves. And yet companies fix budgets and set deadlines for projects that have been scoped with only partial knowledge of the conditions that will be encountered. They prepare for a smooth ride, yet in nine cases out of 10 experience something altogether more turbulent.

Apples & oranges

In the aftermath of a merger or acquisition, it’s expected that blending systems will throw up some issues if the technology platforms differ, the object models don’t match, or the receiving/lead company does not have direct insight into the scale and integrity of the incoming systems of record.

But even within one company, there are likely to be corrupt, inaccurate, incomplete or out-of-date files, or differences in data model, which will continue to cause issues if migrated without remediation to a new platform or system.

And it is far better to understand the scope and scale of such issues before a content migration project takes form. The danger, otherwise, is that an already sizable undertaking will multiply as change order after change order is pushed through, with the result that ‘best case’ deadlines and budgets are far exceeded.

Warning signs

So how can you tell if you are likely to encounter such issues?

Clues to a sub-optimal starting point might include:

  • Over-reliance on highly manual or protracted processes, often involving multiple people, to prepare and submit a document;
  • Dependence on file shares or non-managed systems to locate information;
  • The need to regularly plug gaps in content by chasing down additional detail; and/or
  • Uncertainty about the actual number of documents required in the new system.

Don’t rely on guesswork

The only reliable way to scope content migration work is to engage the specialists ahead of time. Giving them an opportunity to look over the data themselves, ask the right questions, determine the correct number of documents in scope, and conduct a gap analysis between the data models of the old and new systems will ensure that the formal migration project is scoped and designed optimally.

Armed with this knowledge, and with a clearer idea of how content is typically organized, those who will later be tasked with performing the migration will be able to architect the best approach – both tactically and strategically.

Considerations include:

  • How much data/content is earmarked to be migrated (and which data/content is beyond the scope of this project)?
  • Where is the data/content coming from, and where is it going to?
  • Which data models are involved in the old and new state?
  • How many data/content attributes exist in the old and new system?
  • What are the risks associated with a poor or badly scoped migration?
  • Where are the gaps/differences between the old and new models, and what will be needed to address them?
  • Given all of the known parameters, will a phased, or ‘big bang’ approach work best?
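
Several of these questions, such as the attribute counts and the gaps between the old and new data models, reduce to simple set comparisons once both models are written down. The attribute names below are invented purely for illustration:

```python
# Illustrative gap analysis between old and new data models: which
# attributes map directly, which need mapping rules or retirement, and
# which the new model requires but the old one cannot supply.

old_model = {"doc_id", "title", "author", "dept_code", "created"}
new_model = {"doc_id", "title", "owner", "business_unit", "created", "lifecycle_state"}

direct_matches = old_model & new_model   # migrate as-is
unmapped_source = old_model - new_model  # needs mapping rules or retirement
to_be_enriched = new_model - old_model   # needs enrichment or defaults

print(sorted(direct_matches))   # ['created', 'doc_id', 'title']
print(sorted(unmapped_source))  # ['author', 'dept_code']
print(sorted(to_be_enriched))   # ['business_unit', 'lifecycle_state', 'owner']
```

Even this crude comparison, run per system during a pre-assessment, gives a first estimate of the mapping and enrichment effort before the project is costed.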

Forewarned is forearmed

Strategic pre-assessments, which can also be thought of as Phase 0 in the fuller context of a system migration, are an investment in a tight, focused, and hopefully expedited main project.

As a rule of thumb, we recommend allowing 6-8 weeks ahead of the core undertaking. During this time a project manager, migration lead, and business analyst will conduct a thorough analysis of all of the variables and propose a migration approach that will deliver maximum value.

This pre-assessment can be conducted entirely remotely.

Involving the execution team ahead of time also starts to build a strong relationship and understanding of the context of the migration, setting expectations on both sides. All of which should contribute to and build confidence in a smooth project delivery.

To discuss your own data migration journey, please fill out this contact form and we’ll put you in touch with our experts.

Lessons learnt from life sciences content migrations

3 common pain points and how to avoid them

As I explored recently, digital transformation can surface a series of perplexing data challenges for most organizations, but particularly those operating in life sciences. Such has been the inertia in the sector around system modernization that, when a project is finally approved and chartered, stakeholders often rush toward the benefits without looking back. This can result in disappointment and frustration when the ‘as is’ data migrated to the new set-up turns out to be in a poor state and largely unusable.

In this blog, I want to focus on 3 particular takeaways from such experiences which companies can extrapolate from to avoid making the same mistakes.

1. Technology is just one piece of the puzzle in a successful system migration

The successful execution of any project involves equal focus on people, process, and technology, so that everything is aligned to deliver the expected outcomes. Certainly, it’s important to line up sufficient resources to plan and manage the transition, to build engagement and momentum among all stakeholders and users, and to provide any new skills and resources they might need.

But another element that’s often neglected is the data the new system will draw on to deliver the expected process streamlining and improved visibility. However fast and feature-rich the new technology platform, if it depends on old data of inadequate quality it won’t be able to fulfill its promise. If the new system can’t fulfill future-state governance or meet new standards for regulatory compliance, the ‘retro-fitting’ burden could be immense as teams try to massage and improve the data after the fact. Unplanned data enrichment is hugely time-consuming.

The lesson here is about scoping system projects thoroughly and planning them early enough so that every dimension is well catered for in plenty of time.

If delivery is scheduled close to a deadline for adhering to new health authority submission standards, companies will want to be sure that the data coming across is in good shape and fit for purpose so that the deadline can be confidently met. Filling CMC Module 3 in eCTD submissions is already a hot spot, highlighting where existing system data typically falls short. Companies can learn from this by doing their due diligence and performing a detailed data assessment up front to help mitigate these kinds of risks.

2. Time & resources are critical success factors

Unless a project has a sufficient runway leading up to it, pressures will mount, and good intentions are likely to give way to compromise and corner-cutting. Even if they plan to bring in external help, the key project stakeholders will need to set aside their own people’s time to do the vital groundwork: understanding old and new data governance parameters, so that data preparation and the actual data migration are in line with current and emerging requirements. Without those parameters, and a clear picture of the ‘as is’ status, project teams risk investing their time and budgets in the wrong places and setting themselves up for a big post-migration clean-up operation.

So, even before performing a data quality assessment, it’s a good idea to seek a bit of preliminary strategy advice from a trusted expert – almost as a Phase 0 – to understand the bigger picture and how everything needs to align to deliver against it.

This isn’t about engaging specialists prematurely, but rather about making sure that any investment that follows (and any external help brought in) is well targeted, so delivering maximum value.

3. It’s important to allow and plan for failure

Despite the best intentions, projects can go awry due to the many moving parts that make up the whole. So, it’s important to factor ‘the unexpected’ into all planning.

This includes allowing for a certain number of iterations based on the findings of data quality assessments, to get the data to fit the required data governance standards going forward. If the data coming across is in a disastrous state, a planned migration phase could quickly turn into a material schedule delay. Underestimating the work involved is very common; I have seen this in many client projects. For example, where the ‘happy path’, in which everything goes to plan, was expected to take 10-12 months, the real-life route took 18 months. So, to de-risk the project, allow for contingency. If in doubt, take the forecast number of days and double it.

All of the preparatory work recommended above should help contain delays and protect against timescale or budget shocks, but it’s better to plan properly so that the journey goes as smoothly as it can. Although, ultimately, the client company is responsible for the quality and integrity of its own data, technology vendors and service providers will need to plan their involvement too and ensure they have the right skilled resources available at the right time.

Our experts can provide guidance on all of the above. Ultimately, this is about setting the right expectations and a realistic schedule, and resourcing projects so that they optimize the time to value. A little foresight can go a long way.

To discuss your own data migration journey, please fill out this contact form and we’ll put you in touch with our experts.

 

Digital transformation in life sciences: ensuring data is fit for future purpose

In life sciences, data-readiness considerations are often underestimated. Organizations are so ready to untether themselves from the complexity and constraints of legacy systems that they become distracted from other factors that need to be considered to get the most out of their new investment.

Priority goals may include transforming the way they manage and work with regulatory information, to drive greater efficiency, accuracy, and visibility, as well as compliance with the evolving demands of regulators. But the scope of even the most dynamic new platform or system will be dependent on, and limited by, the business data available. If that data has material gaps, contains significant duplication and/or errors, or is not aligned with the fields and formats required for target/future use cases under the data governance strategy, even the best-planned project will not deliver effectively.

Start by considering what’s possible

As life sciences organizations form their digital transformation strategies and reset their goals, it’s important that they understand the potential opportunity to streamline and improve associated processes – and the way that these new or reframed processes will harness data to deliver step changes in execution and output.

One opportunity, for example, could be to transform the way companies manage their product portfolios – via a more dynamic, finer-grained definition of that portfolio and an end-to-end view of its change management, registration/licensing and commercial status in every market globally.

Another is to harness regulatory information (RIM data) to streamline the way a whole host of other functions plan and operate. There’s a lot of interest now in flowing this core data more fluidly into processes beyond Regulatory Affairs – such as Clinical, Manufacturing, Quality, Safety, and Pharmacovigilance. Rather than each function deploying and managing its own applications and data set to serve a single purpose, as has been largely the case up to now, the growing trend is to take a cross-functional platform approach to data, change, and knowledge management. This means that each team can draw on the same definitive, live information set to fulfill its business need.

All of this is much more efficient, as well as less error prone – because similar or overlapping data is not being input many different times, in slightly different ways. This, in turn, will expose companies to much lower risk as regulators like EMA start to require simultaneous data-and-document based submissions for marketing authorizations and variations/updates, which inevitably will see them implement formal cross-checks to ensure information is properly synchronized and consistent.

There are no shortcuts to rich, reliable data

The process transformation opportunities linked to all the above are considerable, and they are exciting. However, they rely on the respective teams understanding and harnessing that potential through advanced, proactive planning. By agreeing, collectively, on the scope for greater efficiency, and on the strategic advantages that are made possible through access to more holistic intelligence and insights, teams can start to move together toward a plan that will benefit everyone.

Practically, this will require an investment of time and thought, considering the state and location of current data, and what will need to happen to it to ensure that it is of sufficient quality, completeness and multi-purpose reusability to support improved processes in the future state. Unquestionably, this will also require a considerable amount of targeted work to ensure existing data is aligned and of high quality; that it uses agreed vocabularies; and consistently adheres to standardized/regulated formatting, data governance, and naming conventions.

Source expert help as needed

All of this may sound like a lot of “heavy lifting”, but it is exactly the kind of activity our experts can advise on. We can start by helping life sciences companies put together a strategy based on how data will ideally be used in the future state, and what needs to happen to it to prepare it for migration.

Working alongside the various business subject-matter experts (e.g. the people closest to the product portfolios and the processes involved in managing them), we’ll help scope the work involved and the resources that will be required. We can also help to determine the historical, current, and future role of the respective data, so that only active data is prioritized for preparation for migration to the new system or platform (in terms of refactoring/clean-up/enrichment).

Forewarned is forearmed, as they say. Although preparing data so that it’s migration-ready may sound like an onerous undertaking, it is far better to know this and be able to do something about it ahead of time than to be caught out once a critical technology project is already well advanced – by which point fundamental data transformation work may come too late.

To sound out our experts about the data preparations needed for an upcoming new systems project, please get in touch. I’m happy to support you.

Get in touch with us

 

Don’t be sidetracked by EMA’s DADI curveball: data is still the goal

Yes, the initial plans have been altered to buy everyone a bit more time, but the broader plan is still on track – to make product data submissions at least as important as electronic documents (and ultimately the priority/default) in all communications with the regional Regulator.

In other words, whatever tweaks companies and software vendors make to their roadmap and immediate capabilities over the next few months, these should not detract from – nor dilute – the overarching plan to get regulated product data in complete, consistent and up-to-date order.

That’s so that when it is time to migrate to – and go live with – an optimized new regulatory information/submissions management platform or system, there is comprehensive, compliant, high-quality data ready to run on it.

And of course, the foundational work done now will stand companies in good stead for when other regions across the world implement their own take on IDMP – given that dynamic data exchange is where all of this will go, globally, in due course.

What’s changed, and how does it affect you?

To recap what’s changed in the interim, EMA’s Digital Application Dataset Integration (DADI) interface – a project that has been evolving alongside IDMP and SPOR (see https://esubmission.ema.europa.eu/ for more information) – will, in the short term (from April 2023), serve as the means for completing electronic application forms via interactive web forms.

This will enable companies to pull data from the PMS (migrated there from xEVMPD and other EMA databases) and to generate both the human-readable PDF format and the machine-readable XML format. Both formats will be submitted along with the eCTD; this part of the process will not change.

This latest development is an important step toward the IDMP goal of reusing data from PMS, and the first step toward the IDMP standardization of data. EMA will support this approach for variation forms only at this point, extending it to initial applications and renewals later.

Ultimately, EMA’s plan is for standardized medicinal product data currently held in the xEVMPD database to be enriched for the PMS, where fuller, more granular, IDMP-compliant medicinal product detail will be kept and updated over time.

There are some practical challenges still to be worked out, such as how IDMP detail that is currently missing will be added to the PMS, and how internal company RIM systems and EMA’s PMS database will reach the point of exchanging data seamlessly without manual data re-entry. But for now, this is a chance for companies to update and correct their data against EMA’s dictionary through the familiar xEVMPD process. (Ultimately, FHIR – the global industry standard for passing healthcare data between systems – will support more dynamic data exchange/syncing between companies’ RIM systems and EMA’s PMS.)
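To make the FHIR point a little more concrete, the sketch below shows, in Python, the shape of a minimal medicinal product payload as it might travel between a RIM system and the PMS. It uses the FHIR MedicinalProductDefinition resource type, but the identifier system URL, identifier value, and product name are purely illustrative assumptions, not real EMA data or a definitive PMS payload.

```python
import json

def build_product_resource(mpid: str, product_name: str) -> dict:
    """Build a minimal, hypothetical FHIR MedicinalProductDefinition payload.

    The resource type and field names follow the FHIR specification; the
    identifier system URL and the values passed in are illustrative only.
    """
    return {
        "resourceType": "MedicinalProductDefinition",
        "identifier": [
            {
                # Assumed SPOR/PMS identifier system URL (illustrative)
                "system": "https://spor.ema.europa.eu/pms",
                "value": mpid,
            }
        ],
        "name": [{"productName": product_name}],
    }

# Example: serialize the resource to the JSON that would be exchanged
resource = build_product_resource("EU-MPID-000001", "Examplomab 100 mg solution")
payload = json.dumps(resource, indent=2)
print(payload)
```

In a real exchange, a payload like this would be validated against the FHIR schema and enriched with the full IDMP granularity (pharmaceutical form, ingredients, authorisation details) before being transmitted; the sketch only illustrates the structured, machine-readable format that replaces manual re-entry.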

Rather than play for time, here are four benefits that the interim DADI move makes possible, along with five next steps that companies should take to stay on track with their data preparations:

4 benefits to exploit

  1. The use of the DADI interface for getting data from the EMA PMS allows life sciences companies and software providers to take a breath as they prepare for full-scale IDMP implementation and compliance. FHIR-based submissions via API have been pushed back for now (this will still happen, just not within the next year).
  2. Companies are now less dependent on immediate technology changes. There is no need for their RIM systems to support DADI, as at this point data won’t flow directly between RIM records and EMA’s PMS.
  3. The EMA’s roadmap allows implementation to happen in manageable chunks. EMA’s ‘DADI first’ approach enables re-use of Product (PMS) data, starting with variation forms, which account for the largest proportion of regulatory submissions.
  4. This is a chance to reset or adapt IDMP/regulatory data strategies, catch up, and prepare to deliver maximum benefits and efficiencies from the preparations (e.g., by doing sufficient groundwork to enable a confident system migration, when the time comes).

5 things to do next, for pharma companies

  1. Set or re-set your strategy and position around regulatory, structured data.
  2. Collect and assess product data and prepare this for compliance (scoping and getting stuck into any data enrichment now) – so that it addresses the granularity of IDMP requirements and maps to EMA’s dictionaries/vocabularies.
  3. Prepare to support xEVMPD e-submissions based on the new data model and all of the levels of detail that are expected, to be ready for the future and to enable a rapid transition to IDMP.
  4. Improve your ability to respond and adapt quickly to further changes to regulatory requirements. EMA’s switch to using DADI to submit data to the PMS highlights just how swiftly the roadmap can change, and why an Agile approach to project management is so important.
  5. Start to migrate your content into the new target system as soon as possible. If you have started by collecting data locally in Excel files, that data could become outdated if not maintained. Don’t leave thoughts of migration until the last minute; plan for this now, as part of your overall scoping work.

To maximize your IDMP system migration, or to discuss your best route to IDMP data preparation as you plan for this, please fill out the contact form below and we’ll put you in touch with our experts.

Get in touch with us