Keep Talking: Why Communication Is the Key to a Successful Project

When an investment in IT solutions doesn’t live up to expectations, the cause can often be traced to a lack of effective communication and the absence of a formalized communication plan. If a project is pushed through its phases before the teams discuss shared expectations, what each phase includes, and the issues and anomalies that could arise along the way, there is a high probability that it will go off the rails at some point.

To ensure successful project outcomes, project leaders must be good communicators. In fact, the Project Management Institute says the biggest part of any project manager’s job, consuming 90 percent of their time, is communicating.

Institutionalizing communications

  • Set clear objectives. A successful communication plan should begin with clear objectives, milestones, expectations, and even escalation paths should something go wrong. The rules of the road should be discussed and agreed to with the key stakeholders on both the client and vendor partner sides. While it may seem obvious, it’s important to set up a clear and regular line of communication so team members can report project progress.
  • Develop lines of communication. Project leaders can’t be responsible for every aspect of the project, so it’s important to delegate tasks and set up clear lines of communication so team members can quickly and easily report back on what has been accomplished, what’s outstanding, and where problems have arisen. And always share changes to any part of the project with all team members.
  • Establish a regular cadence for meetings. Regular meetings allow project risks to be identified and discussed. And since projects and teams are often dispersed across departments and even geographies, it’s important to find a variety of ways to communicate effectively so the project can move along fluidly.

A Closer Look: Client Learnings

Let’s take a closer look at what can happen when communications are not part of the project plan from the start.

In a recent client project, we were moving swiftly from phase to phase in implementing a new system. The client’s project team had an aggressive timetable, and while we were aligned on the tasks for each phase, stakeholders would inevitably ask questions at the end of each development cycle that indicated proper communication had not been established from the outset. It was clear that some key decisions or clarifications had been made by functional leads without taking into account other regions or users of the future system.

Were they made aware of how the schedule impacted them? Did they understand the changes in processes? It was clear they weren’t.

We were able to leverage our experience and lead the client through an exercise to understand the impact on other regions and sites and the broader implications for their processes. While solidifying needs and requirements at the midway point of the project would affect the timeline and overall budget, the client understood how critical it was to get regional leads involved in the decision-making process and fully aligned on the complexity of the project and its potential impact on their functions.

This one decision allowed us to get back on track and ensure all users’ needs were addressed while the broader goals and objectives of the organization were met as well. Clear communications and more inclusive processes helped the client avoid further unexpected starts and stops to the project, but more importantly, ensured greater success for user acceptance post implementation. The client appreciated fme’s expert guidance, which ultimately helped them avoid further costly changes and delays.

fme’s 8 communication tips for a successful project

  1. At the start of the project, develop a communication plan that maps to the overall project plan and incorporates triggers on when and how to communicate with stakeholders.
  2. Reach understanding, agreement or consensus about pivotal decisions that could impact processes, schedule or stakeholders’ day-to-day job.
  3. Manage stakeholder and project team expectations. A key part of communications is understanding the requirements and expectations of stakeholders so they’re “on board” with the changes that need to happen.
  4. Ensure there is a smooth delegation or handoff of tasks by documenting roles, responsibilities and accountability.
  5. Identify potential project risks and the communication channels and escalation required to minimize those risks.
  6. Develop effective methods to report project progress, looking for opportunities to communicate in person or via virtual meetings so there is regular and ongoing communication about the project.
  7. Track progress and let all stakeholders know where the project stands; this is particularly important as project deliverables or milestones shift.
  8. Create communication channels to quickly share changes with team members as they arise. For example, SharePoint is one of many tools that can store weekly status reports as well as meeting minutes and action items.

When good and ongoing communication is built into a project, problems can quickly be addressed and the stress of failure can be alleviated.

Veeva R&D Summit Puts Focus on Innovation and Best Practices

As a Veeva Migration Services partner, fme was a key sponsor of the summit, and we were excited to be part of this growing industry event. During both days of the summit, sessions were separated into tracks: clinical operations, clinical data management, quality and manufacturing, regulatory, the Vault platform, and innovation theatres. Approximately 400 life sciences companies attended, and 61 of them spoke at the summit.

One of the newer Veeva Vault initiatives is Vault RIM, and Veeva has been proactive in making product enhancements based on customer feedback. A key part of moving to RIM is data migration, which is often a complex and time-consuming process. Ultimately, the time spent on the migration is worth the effort, since a centralized RIM capability, such as Veeva RIM, is a game-changer for companies.

Moving to a shared data model

A common challenge for many companies is managing product changes, or variations, after receiving approvals, because information is typically not in one place. Veeva and fme client Vertex spoke at the summit about the benefits of having a shared data model within a unified RIM platform. Not only did this approach deliver a business benefit, it also aligned with their objective of meeting their commitments to patients. Pharma giant GlaxoSmithKline also spoke at the regulatory session about their journey toward unifying regulatory processes and how Vault RIM is helping them integrate teams in different countries. The ability to simplify deployment and streamline implementation will also be key to ensuring pharmaceutical companies meet their RIM objectives.

Clinical Data Management

Another key track was clinical data management, with a session focused on building and running complex trials using advances in monitoring, cleaning and reporting data. Integral to this is the migration and management of safety data. As companies begin their transition to a more integrated approach to clinical data, they will need to consider their safety data migration fully and carefully.

It was also great to recognize and celebrate the Veeva Heroes, which honors six industry pioneers who have gone above and beyond over the past year to help move the industry forward. These innovators have pushed boundaries and navigated change at their companies to improve processes and deliver outcomes. This year’s honorees included: Jennifer Trundle, Gilead Sciences; Joe Brenner, Johnson & Johnson; Lisa Little-Tranter, Lilly; Michelle Harrison, Vertex Pharmaceuticals; Sandra Freeman, Johnson & Johnson; and Shelly Plapp, Melinta Therapeutics, Inc. Congratulations to everyone.

The Veeva Vault Summit continues to grow and attract increasingly large numbers of life sciences companies that are eager to learn more about areas integral to their business. In fact, the summit has become so large that during this year’s keynote speech organizers had to use an overflow room to accommodate the audience. Next year’s session, which will be held in Boston, will undoubtedly take these large audiences into account.

In addition to joining the 2020 Veeva R&D Summits in both the US and Europe in our capacity as a partner, fme is also planning to sponsor the Medical Device & Diagnostics Summit to be held in Minneapolis in June 2020. Hope to see you there!

Updates from the OpenText Enterprise World Conference in Toronto

On top of that, there are additional challenges as a result of mergers and acquisitions. Once companies come together, they are forced to consolidate documents and data from different systems and sources. Further, there is high pressure to reduce the on-premises footprint of enterprise applications – and with that, maintenance and personnel costs.

So, it’s always good to keep on top of these trends and get the latest solutions available to respond to these changing dynamics. We were happy to join the OpenText Enterprise World Conference in Toronto earlier in July to hear the latest news.

We were out in force, with our product, services, and sales teams meeting clients at our booth and our technical staff joining as many of the sessions and labs as they could manage. Overall, the conference provided a good overview of the status and roadmaps of the components of the OpenText and Documentum product suites.

In case you missed the meeting or were looking for the latest updates, here are some of the highlights from this year’s conference:

  • For the Documentum platform, it is good to see that there are improvements and enhancements happening across the platform. Cloud is of course one of the big drivers, and all components are on the path to support containerization to allow for easier deployment and management in cloud environments.
  • Documentum Cloud Edition is the next generation from OpenText and will be available in April 2020. It can run anywhere – off or on cloud – and there will be no need to worry about upgrades again.
  • On the InfoArchive side, the latest version offers connectivity to Azure Blob storage for archive content (unstructured content and more) at the lowest cost of ownership, and the next versions will continue to support more cloud storage options such as Google Cloud Storage, NetApp Storage Grid, and AWS storage.
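
The InfoArchive connector itself is not detailed here, but for a flavor of what writing archive content to Azure Blob storage involves, here is a minimal Python sketch using the azure-storage-blob SDK. This is a generic illustration rather than InfoArchive’s actual implementation, and the connection string, container, and blob names are placeholders.

```python
# Generic illustration of archiving unstructured content to Azure Blob
# storage; not InfoArchive's actual implementation. Names are placeholders.
from azure.storage.blob import BlobServiceClient

def archive_blob(conn_str, container, blob_name, payload):
    """Upload one archived object to a (typically cool/archive-tier) container."""
    service = BlobServiceClient.from_connection_string(conn_str)
    container_client = service.get_container_client(container)
    # overwrite=False ensures an existing archived object is never replaced.
    container_client.upload_blob(name=blob_name, data=payload, overwrite=False)

if __name__ == "__main__":
    archive_blob(
        conn_str="DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...",
        container="archive-demo",
        blob_name="records/2019/report-001.pdf",
        payload=b"%PDF-1.4 ...",
    )
```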

Working primarily on front-end topics, I was especially interested in the following:

  • Good news – Webtop will not disappear just yet. Though it will only be maintained going forward, no end-of-life has been announced at this point. While it is not the client of the future, this buys clients a bit more time to decide which direction to go for their future user interface. The main clients for Documentum remain as before – D2 for content-centric applications and xCP for process-centric applications. An integration or merge is not planned at this point.
  • In addition, the D2 SmartUI has come into play. The goal is to provide a (close-to) unified user interface across the OpenText applications – Content Server and Documentum. It should simplify access to stored documents as well as provide easy access to integrated systems. For D2, it reuses a large part of the D2 configurations. It will also be the base for the mobile app to provide access from mobile devices.

Another big topic at the meeting was the extension of some of the advanced OpenText features to the Documentum platform. One of them is the SAP integration with the Extended ECM Adaptor for SAP. Also, this integration can be exposed for end users through the Smart UI.

Working heavily with customers in the life sciences industry, I was of course eager to find out what is planned there as well.

  • On the Documentum front-end side, the Documentum for Life Sciences suite will remain the major front end. The cloud-based Life Sciences Express client uses the same D2 configurations and is planned to progressively replace the Classic UI; additional features are still being added to turn it into a full-featured client. The goal is to provide an easy-to-use but feature-rich client that runs as a regular web application but also allows mobile device users to work efficiently with the application. So far, the application is focused on read-only use cases, but there is a plan to extend it with the required document management capabilities. A glimpse of that could be caught during the Innovation Lab – and it looked very promising. The classic Documentum for Life Sciences suite will still be extended and enhanced, but pure user interface and usability improvements are targeted at the LS Express client only moving forward. Further business-process-specific cloud applications are planned to enhance the client portfolio. The Quality CAPA application had been published before and has been enhanced further, and a new Regulatory Planning application is now in the queue, targeted to be developed in close interaction with interested customers.

Finally, on the innovation front, the other big topic was – of course – the use of artificial intelligence (AI). During the keynote sessions, it was noted that all companies should now consider themselves “information companies,” making it important to find ways to put information into business context. Whether to comply with regulations or to create efficiencies to accelerate time-to-market, AI technologies can be leveraged.

All in all, the conference provided several opportunities for clients to connect and exchange experiences. Thanks a lot for the interesting conference (and for the “Paws for the Break” – such cute puppies) and I’m looking forward to seeing everyone next year in Vegas!

A master plan: why master data management is key to successful migration projects

»Master data management is the method that an organization uses to define and manage its critical data in order to achieve a single source of truth across the enterprise.«

The importance of master data cannot be overstated. Master data represents the most critical data for operations within an organization or function. It is data that can be trusted, is unlikely to change, has been verified as correct and error free, meets compliance requirements, is complete and consistent, is common to all stakeholders, and is crucial to the business’ operations.

The term “master data” is often applied to databases of business-critical information, such as customer information files, product information files, and so on. These databases are usually the authoritative source of the master data values. That master data is typically used by other applications, such as content management systems, which must be kept in sync with changes in the authoritative source.

What can hinder the use of master data in a content migration project? Inadequate resources, lack of standards, failure to implement an internal governance process, poor planning, changing direction partway through a project, and not having a business owner or champion who understands the complexities of master data.

In one example, errors occurred early in the migration phase because the master data requirements and their interdependencies were unclear. That meant the team responsible for the migration had to address a large number of unexpected master data issues. Adding to the problems, the organization was in the process of developing and changing its naming convention standards for drug products, so the data in the system and the data used for mapping were different. Finally, further problems were encountered when the company decided on a last-minute major change to its approach to handling application master data, because the business users had failed to understand how the original approach would affect their user community. As a result, the project was delayed, and huge effort had to be expended to tackle the issues. These problems can be averted with the right approach.
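
To make the synchronization point concrete, here is a minimal sketch of the kind of pre-migration check that catches drift between document metadata and the authoritative master data source. It is illustrative only – the product identifiers, field names, and values are invented, and in a real project the authoritative values would come from an MDM system or product database rather than a hard-coded dictionary.

```python
# Illustrative only: all identifiers and values below are invented.
# In practice, the authoritative values come from the MDM system.
AUTHORITATIVE_PRODUCTS = {
    "PRD-001": "Examplazole 10 mg Tablet",
    "PRD-002": "Examplazole 20 mg Tablet",
}

def validate_document(metadata):
    """Return the master data issues found on one document's metadata."""
    issues = []
    product_id = metadata.get("product_id")
    if product_id not in AUTHORITATIVE_PRODUCTS:
        issues.append("unknown product_id: %r" % product_id)
    elif metadata.get("product_name") != AUTHORITATIVE_PRODUCTS[product_id]:
        # Drift between documents and the authoritative source is exactly the
        # inconsistency that surfaces late and derails migrations.
        issues.append(
            "product_name %r does not match authoritative value %r"
            % (metadata.get("product_name"), AUTHORITATIVE_PRODUCTS[product_id])
        )
    return issues

# A document whose name drifted from the authoritative source is flagged:
print(validate_document({"product_id": "PRD-001", "product_name": "Examplazol 10mg"}))
```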

Five lessons learned on how to mitigate migration issues

  1. Plan early. A successful content migration project needs to evaluate master data requirements at the planning stage to ensure a complete set of the right data is made ready for migration. If master data is missing, incorrect or not available at the right time, it can lead to delays and increased costs. It may even result in projects being cancelled or suspended.
  2. Develop a cross-functional team. Often, anything to do with data management is seen as an IT issue. However, it’s important that a cross-functional migration team, comprising IT and business stakeholders, works together to determine the master data required for migration. It is essential that one or more business representatives take ownership of the master data aspect of the project.
  3. Bring in the resources needed. Many organizations don’t understand how master data affects the application or migration process. That’s because many simply don’t have the internal resources or expertise to address master data management as an integral part of a migration project.
  4. Spend time on business analysis. Another issue is that few organizations have a business analyst function sitting between the business and IT. Spending the time upfront on analysis of the master data needed for the migration and comparing it to master data in the existing system can prevent project delays and disruptions.
  5. Always consider master data context in determining which master data to use in a content migration. If the documents are being used for a specific part of the business, it makes sense to only incorporate master data relevant to the project and end users. For example, if manufacturing documents are being migrated, the master data should be relevant to manufacturing users. In this case, internal drug product names are probably more appropriate than drug product trade names, as the sketch below illustrates.
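
A minimal Python sketch of that kind of context filter – the field names, domains, and product identifiers are invented for illustration:

```python
# Illustrative only: field names, domains, and product names are invented.
MASTER_DATA = [
    {"internal_name": "EXA-10", "trade_name": "Examplazole",
     "domains": {"manufacturing", "regulatory"}},
    {"internal_name": "EXA-99", "trade_name": "Examplazole Forte",
     "domains": {"regulatory"}},
]

def master_data_for(domain):
    """Return the internal product names applicable to one business domain."""
    return [row["internal_name"] for row in MASTER_DATA if domain in row["domains"]]

# Migrating manufacturing documents? Only carry over manufacturing-relevant
# master data, keyed by internal names rather than trade names.
print(master_data_for("manufacturing"))  # ['EXA-10']
```
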
Experience has shown that making master data management a key element of any migration project vastly improves its chances of success. Master data, developed and supported through a collaborative process, should be the bedrock of any migration project.

More on our Content Migration Services

Checklist for a Successful OpenText Documentum D2 Implementation

After all, the sheer volume of content can be overwhelming, and changing business processes that have developed over time can be a contentious point in any organization. This planning checklist can assist in making the transition as painless as possible and put your organization one step ahead in planning and implementation.

1. Know your workflow processes, but be adaptable
While Competent Authorities have specific requirements around the development, manufacture, and maintenance of drug products, drug substances, and medical devices, the specific business processes to meet those requirements are not explicitly defined by the Authorities. Companies generally develop processes that comply with the requirements and still provide an efficient flow to ensure the product or device can be supplied to the market. While your business processes may have been developed specifically for your company, certain efficiencies can be gained by broadening requirements and adopting the best-practice solution provided by OpenText. By adopting a future-state business workflow that aligns with the supplied best-practice models, you can take a smarter and faster approach to your implementation, allowing your implementation partner to focus on other, higher-value customizations that will benefit your business.

2. Familiarize yourself with the DIA reference model
The Quality & Manufacturing solution has been developed with the DIA reference model as its backbone. This industry-supported model outlines an organizational structure for documentation. The OpenText Documentum for Life Sciences Quality & Manufacturing module leverages this model to provide an out-of-the-box solution that can manage 95 percent of your documentation. You can derive maximum value from this model by identifying artifacts that closely align with your current documentation and by adopting the DIA terminology. If necessary, additional artifacts can be added to the Quality & Manufacturing solution, with consideration for validation and documentation effects.
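
The alignment exercise can be as simple as maintaining an explicit mapping from in-house document types to DIA artifact names and flagging anything unmapped. A minimal sketch follows; the mappings are invented examples, not an official extract of the DIA reference model.

```python
# Invented example mappings; not an official DIA reference model extract.
DIA_ARTIFACT_MAP = {
    "SOP": "Standard Operating Procedure",
    "Batch Record": "Master Batch Record",
    "Test Method": "Analytical Test Method",
}

def to_dia_artifact(internal_type):
    """Resolve an in-house document type to its DIA artifact name."""
    try:
        return DIA_ARTIFACT_MAP[internal_type]
    except KeyError:
        # Unmapped types are candidates for the 'additional artifacts' that
        # may need to be added, with validation and documentation impact.
        raise ValueError("no DIA artifact mapped for %r" % internal_type)

print(to_dia_artifact("SOP"))  # Standard Operating Procedure
```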

3. Define templates
Content creation is the function of your authors. Using templates for content ensures uniformity across the solution and makes your review team more efficient, allowing them to focus on continuity of content rather than on formatting. The OpenText Documentum for Life Sciences Quality & Manufacturing module provides the ability to associate templates with each artifact name, if desired. Templates imported into the system must be approved before use in content creation, allowing control over the formatting and structure of content authored in the system.

4. Know your users’ roles and functions
The OpenText Documentum for Life Sciences Quality & Manufacturing module provides a site-based, role-based security model to save time and effort in determining how to set up security. The combination of membership in a site and a role allows the business to limit access to content as needed, while permitting appropriate individuals to perform certain business functions. Roles provided by the solution include author, reviewer, approver, quality organization approver, coordinator, and administrator. By identifying the users who fit these business roles early in the implementation, you can accelerate system setup and provide focused communication and process training to the impacted roles.
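
As a rough illustration of how a combined site-and-role check behaves, consider the following sketch. The sites, role names, and approval rule are simplified stand-ins; the actual solution’s security model is configured in the product itself.

```python
# Simplified stand-in for a site- plus role-based access check.
from dataclasses import dataclass

@dataclass(frozen=True)
class Membership:
    site: str   # e.g. a manufacturing site
    role: str   # author, reviewer, approver, ...

APPROVER_ROLES = {"approver", "quality organization approver"}

def can_approve(memberships, doc_site):
    """A user may approve a document only via an approver role at its site."""
    return any(m.site == doc_site and m.role in APPROVER_ROLES
               for m in memberships)

alice = {Membership("Site A", "author"), Membership("Site A", "approver")}
print(can_approve(alice, "Site A"))  # True: approver role at the document's site
print(can_approve(alice, "Site B"))  # False: no role at that site
```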

5. Identify your system champions
Implementing OpenText Documentum for Life Sciences is a significant change, whether you are moving from a paper process, a previous Documentum system, another software system, or even a previous version of the solution. The best approach to building positive momentum is to identify a small group of users who will learn the system first and act as change-makers and champions of the new system. This will go a long way toward supporting overall user adoption. Of course, user training is paramount to system adoption and a positive impression of the system’s capabilities. Beyond that, a core group of champions for the software can initiate a wave of excitement for the new system. Change in any large organization can be challenging, but change combined with a learning curve presents two significant hurdles. By anticipating the challenges and preparing processes and people in advance, organizations can positively promote change and work through many of the challenges presented by those learning curves.

Lost in translation: Why complex EDMS projects often miss the mark

Typically, teams or committees are created across different business units and geographies, each with different goals and expectations.

The result can be a budget blow-out, failure to harmonize on common processes, and projects that drag on far longer than they should. And in almost every instance, the finger of blame is pointed at the business owner for not really knowing what they want. After all, the other teams – usually IT and Quality – are following well-established processes. They’ve carried out similar projects countless times. The suggestion is that it’s the business owners who hold up their own project.

That’s simply not true. The business owners know what they need from their EDMS and other systems. The problem is that most are engaging in these types of projects for the first time. It’s not that they don’t know what they want; it’s just that they aren’t as experienced in knowing where to start and what is critical at any given time. And so, their goals and objectives can sometimes get lost in translation.

To address or limit these challenges, business owners need to take a step back. Rather than thinking about EDMS as a technology project, they need to start by thinking of it as a business objective. The first step is to gather the knowledge needed to define specific business-focused expectations within the context of detailed milestones. That typically involves reaching out across departmental lines, collaborating with other teams, and documenting the expectations and outcomes.

However, given that business owners have little, if any, experience in such projects, the question is: where do they gather the necessary knowledge?

Alignment is key

In the past, this may have been easier. This expertise existed in-house and within the functional units. These complex systems were designed, managed, built, and modified by the people who used them – the scientists, regulatory staff, and Quality departments. Each functional unit had its own technology specialists managing those systems.

Since that model no longer exists, business owners need to find a translator to help them define their specific needs, objectives, and milestones and to provide the tips and techniques needed to ensure success. Today, that expertise is unlikely to exist in-house. Rather, the expertise required to translate requirements into practical, aligned, and harmonized solutions now resides with partner organizations. They’ve implemented and managed these types of projects countless times – internally and externally – and they know what is needed to successfully deploy a system that meets the business need. Further, really good partner organizations have the expertise to help the business implement best practices into its processes with the least disruption during the transition period. They also supply, and help enforce, time-tested processes and implementation tools to ensure the project goes according to plan.

It’s evident that the cost to the business of poorly managed EDMS implementation projects is high. So, finding an accomplished partner or “translator” – someone who can ensure that everyone is in sync and proceeding at the same rate and based on the same expectations – is crucial for both the business and the users who depend on EDMS and software tools.