Updates from the OpenText Enterprise World Conference in Toronto

On top of that, there are additional challenges as a result of mergers and acquisitions. Once they come together, companies are forced to consolidate documents and data from different systems and sources. Further, there is high pressure to reduce the on-premise footprints of enterprise applications – and with that, maintenance and personnel costs.

So, it’s always good to keep on top of these trends and get the latest solutions available to respond to these changing dynamics. We were happy to join the OpenText Enterprise World Conference in Toronto earlier in July to hear the latest news.

We were out in force with our product, services and sales teams meeting clients at our booth and with our technical staff joining the variety of sessions and labs as much as they could manage. Overall, the conference provided a good overview of the status and roadmaps of the components of the OpenText and Documentum product suite.

In case you missed the meeting or were looking for the latest updates, here are some of the highlights from this year’s conference:

  • For the Documentum platform, it is good to see that there are improvements and enhancements happening across the platform. Cloud is of course one of the big drivers, and all components are on the path to support containerization to allow for easier deployment and management in cloud environments.
  • Documentum Cloud Edition is the next generation from OpenText and will be available in April 2020. It can run anywhere – on or off the cloud – and there will be no need to worry about upgrades again.
  • On the InfoArchive side, the latest version offers connectivity to Azure Blob storage for archive storage (unstructured content and more) at the lowest cost of ownership, and the next versions will continue to add support for more cloud storage options such as Google Cloud Storage, NetApp StorageGRID, and AWS storage.


Working primarily on front-end topics, I was especially interested in the following:

  • Good news – Webtop will not disappear just yet. Though it will only be maintained, no end-of-life has been announced at this point. While it is not the client of the future, this will at least buy clients a bit more time to decide which direction to go for their future user interface. The main clients for Documentum remain as before – D2 for content-centric applications and xCP for process-centric applications. An integration or merge is not planned at this point.
  • In addition, the D2 SmartUI has come into play. The goal is to arrive at a (close-to) unified user interface across the OpenText applications – Content Server and Documentum. It should simplify access to the stored documents as well as provide easy access to integrated systems. For D2, it reuses a large part of the D2 configurations. It will also be the base for the mobile app to provide access from mobile devices.

Another big topic at the meeting was the extension of some of the advanced OpenText features to the Documentum platform. One of them is the SAP integration with the Extended ECM Adaptor for SAP. This integration can also be exposed to end users through the Smart UI.

Working heavily with customers in the life sciences industry, I was of course eager to find out what is planned there as well.

  • On the Documentum front-end side, the Documentum for Life Sciences suite will remain the major front end. The cloud-based Life Sciences Express client uses the same D2 configurations and is planned to replace the Classic UI more and more in the future; additional features are still being added to turn it into a full-featured client. The goal is to provide an easy-to-use but feature-rich client that runs as a regular web application but also allows mobile device users to work with the application efficiently. So far, the application is focused on read-only use cases, but there is a plan to extend it with the required document management capabilities. A glimpse of that could be caught during the Innovation Lab – and it looked very promising. The classic Documentum for Life Sciences suite will still be extended and enhanced, but pure user-interface and usability improvements are targeted at the LS Express client only moving forward. Further business-process-specific cloud applications are planned to enhance the client portfolio. The Quality CAPA application had been published before and was enhanced some more, and a new Regulatory Planning application is now in the queue, targeted to be developed in close interaction with interested customers.

Finally, on the innovation front, the other big topic was – of course – the use of artificial intelligence (AI). During the keynote sessions, it was noted that all companies should now consider themselves “information companies,” making it important to find ways to put information into business context. Whether to comply with regulations or to create efficiencies to accelerate time-to-market, AI technologies can be leveraged.


All in all, the conference provided several opportunities for clients to connect and exchange experiences. Thanks a lot for the interesting conference (and for the “Paws for the Break” – such cute puppies) and I’m looking forward to seeing everyone next year in Vegas!

Checklist for a Successful OpenText Documentum D2 Implementation

After all, the sheer volume of content can be overwhelming and the potential to change business processes that developed over time can be a contentious point in any organization. This planning checklist can assist in making the transition as painless as possible and put your organization one step ahead in planning and implementation.

1. Know your workflow processes, but be adaptable
While Competent Authorities have specific requirements around the development, manufacture, and maintenance of drug products, drug substances, and medical devices, the specific business processes to meet those requirements are not explicitly defined by the Authorities. Companies generally develop processes to comply with the requirements while still providing an efficient flow to ensure the product/device can be supplied to the market. While your business processes may have been developed specifically for your company, certain efficiencies can be gained by broadening requirements and adopting the best-practice solution provided by OpenText. By adopting a future-state business workflow that aligns with the supplied best-practice models, you can take a smarter and faster approach to your implementation, allowing your implementation partner to focus on other, higher-value customizations that will benefit your business.

2. Familiarize yourself with the DIA reference model
The Quality & Manufacturing solution has been developed with the DIA reference model as its backbone. This industry-supported model outlines an organizational structure for documentation. OpenText Documentum for Life Sciences Quality & Manufacturing module leverages this model to provide an out-of-the-box solution that can manage 95 percent of your documentation. You can derive maximum value from this model by identifying artifacts that closely align with current documentation and by adopting the DIA terminology. If necessary, additional artifacts can be added in the Quality & Manufacturing solution with consideration for validation and documentation effects.

3. Define templates
Content creation is the function of your authors. Using templates for the content ensures uniformity across the solution and makes your review team more efficient, allowing them to focus on continuity of content, rather than on formatting. OpenText Documentum for Life Sciences Quality & Manufacturing module provides the ability to associate templates with each artifact name, if desired. Templates that are imported into the system must be approved before use in content creation, allowing control over the formatting and structure of content authored in the system.

4. Know your users’ roles and functions
The OpenText Documentum for Life Sciences Quality & Manufacturing module provides a site-based, role-based security model to save time and effort in determining how to set up security. The combination of site membership and role allows the business to limit access to content as needed while permitting appropriate individuals to perform certain business functions. Roles provided by the solution include author, reviewer, approver, quality organization approver, coordinator, and administrator. By identifying the users who fit these business roles early in the implementation, you can accelerate the system setup and provide focused communication and process training to the impacted roles.
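The site-plus-role model described above can be sketched in a few lines. This is an illustrative toy model, not the product’s actual security implementation; the role names follow the article, while the function names, data layout, and permitted actions are made up for the example.

```python
# Toy sketch of a site- and role-based access check, loosely modeled on the
# security model described above. Role names follow the article; the action
# sets and function names are illustrative, not the product's API.

# Permitted document actions per role (illustrative subset).
ROLE_ACTIONS = {
    "author": {"create", "edit", "submit_for_review"},
    "reviewer": {"read", "comment"},
    "approver": {"read", "approve", "reject"},
    "quality_organization_approver": {"read", "approve", "reject"},
    "coordinator": {"read", "assign", "route"},
    "administrator": {"read", "configure", "manage_users"},
}

def can_perform(user_sites, user_roles, doc_site, action):
    """A user may act on a document only if they belong to the
    document's site AND hold a role that permits the action."""
    if doc_site not in user_sites:
        return False
    return any(action in ROLE_ACTIONS.get(role, set()) for role in user_roles)

# Example: an author assigned to the "Berlin" site
print(can_perform({"Berlin"}, {"author"}, "Berlin", "edit"))      # True
print(can_perform({"Berlin"}, {"author"}, "Boston", "edit"))      # False
print(can_perform({"Berlin"}, {"reviewer"}, "Berlin", "approve")) # False
```

The key point of the model: both conditions must hold – site membership gates access to the content, and the role gates the action.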

5. Identify your system champions
Implementing OpenText Documentum for Life Sciences is a significant change, whether moving from a paper process, a previous Documentum system, another software system, or even a previous version of the solution. The best approach to building positive momentum is to identify a small group of users who will learn the system first and act as change-makers and champions of the new system. This will go a long way toward supporting overall user adoption. Of course, user training is paramount to system adoption and a positive impression of the system’s capabilities. Beyond that, a core group of champions for the software can initiate a wave of excitement for the new system. Change in any large organization can be challenging, but change combined with a learning curve presents two significant hurdles. By anticipating the challenges and preparing your processes and people in advance, organizations can positively promote change and work through many of the challenges presented by those learning curves.

User Forum Brings Life Sciences Community Together

The meeting held in April brought together business and technical leaders from within regulatory affairs, quality, and IT to share their experiences with regulatory content management strategies, technology implementation challenges, and leveraging next-generation digital tools to automate and improve business processes.

To garner deeper perspectives and insights, attendees separated into three groups and were asked to discuss three key topics:

  • Priorities and considerations when deploying an end-to-end regulatory information management (RIM) solution, including technical priorities, platform limitations, and the limitations posed by document and data quality.
  • Upgrading and re-platforming to OpenText Documentum for Life Sciences, taking into account master data management, goals to streamline processes, and compliance or data quality issues that should be improved.
  • Quality management solutions, sharing current experiences and areas that need improvement, managing change control processes, and existing measures to oversee QMS processes.

The post-discussion roundtable resulted in a healthy and productive exchange of ideas and experiences. These perspectives will be further explored in future LSUF meetings.

    A Focus on Improving Processes

    The effort to improve and simplify processes is a high priority for life sciences companies. Artificial intelligence offers a powerful opportunity to improve document quality, for example by enriching metadata. During the meeting, fme talked about the intelligent document classification and metadata enrichment solution it is developing in partnership with Docxonomy, called fmeMATRIX. As development of the solution continues, the views and experiences of clients will be integral to its progressive approach. The LSUF meeting was a key opportunity to garner feedback on the needs and concerns of life sciences companies. Here are some additional takeaways:
    • In another LSUF presentation, one of our clients shared its experiences with collaborative authoring, using technology to enable simultaneous or parallel authoring and review in real time. The SharePoint Connector is integral to enabling the company to meet its regulatory requirements, and the client team is working with OpenText on implementing this capability.
    • During its presentation, the client’s IT account manager and IT regulatory architect posed several key discussion points and questions for their peers. These included how other companies tackle collaborative authoring of submissions and the point at which a document becomes an official record, based on good practice guidelines.
    • During another presentation, a client presented a proven, iterative approach to integrate a leading Regulatory Information Management platform into the OpenText Documentum for Life Sciences Research and Development module. The client outlined an approach that ensured quicker solution implementation timelines and more effective R&D alignment and buy-in.
    These were some of the highlights from the April LSUF meeting, which was the second forum meeting to bring together the life sciences client community to exchange ideas. The next forum will be held this October and we look forward to further productive and insightful discussions. Please reach out to me if you’re interested in joining this growing group!

    Why OpenText Documentum products and Cloud Computing are complementary!

    Compared with cloud computing technologies, which are very strong in providing elastic (scalable) services, OpenText Documentum products could be regarded as inflexible, monolithic/layered applications. Although they seem to be the exact opposite of the flexible microservice architecture approach used for cloud-native application design, there are ways to combine OpenText Documentum products with cloud computing technologies. Here are some examples to give you an idea of how to achieve more flexibility by breaking either the OpenText Documentum base installation or the business logic into smaller services.

    Infrastructure Services

    A classic OpenText Documentum Content Server installation could be split into four services / Linux containers:
    • The Docbroker
    • The repository services
    • The Java Method Server (serverapps.ear)
    • The Accelerated Content Services (acs.ear)
    It is obvious that the Docbroker does not have to deal with much traffic and does not need any load balancing. Thus one or two instances at most could be sufficient (complemented with appropriate health-check monitoring) to provide robust failover. For the repository services, the Java Method Server and the Accelerated Content Services, two instances each are sufficient to provide a quite robust service.

    However, at some point you might want to perform a data migration of many documents into your repository. In this situation, you might think of hiring our highly skilled content migration team. During the migration period, especially on systems that utilize the Java Method Server heavily, exactly this service would become a bottleneck, while all other server components would be able to handle the migration load. And here things become interesting: if you had used the service architecture described earlier and had been utilizing an orchestration tool, you would have been able to request two additional Java Method Server instances on demand within minutes. The orchestration tool automatically creates two more instances, which are automatically proxied via a service endpoint. All upcoming migration requests are then spread over all existing instances, providing a good migration experience. Once the migration is finished, you can scale down the number of instances.
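The on-demand scaling described above can be illustrated with a toy model: a service endpoint that round-robins requests over however many Java Method Server instances currently exist. All names here are illustrative; a real setup would use an orchestration tool (e.g. Kubernetes) and an actual reverse proxy rather than these stand-in classes.

```python
from collections import Counter
from itertools import cycle

# Toy model of on-demand scaling: a service endpoint round-robins
# requests over the currently registered instances. "jms-*" names are
# illustrative stand-ins for Java Method Server containers.

class ServiceEndpoint:
    def __init__(self, instances):
        self.instances = list(instances)
        self._rr = cycle(self.instances)  # simple round-robin proxy

    def scale(self, new_instances):
        """Orchestrator adds (or removes) instances; the proxy picks them up."""
        self.instances = list(new_instances)
        self._rr = cycle(self.instances)

    def route(self, request):
        return next(self._rr)

jms = ServiceEndpoint(["jms-1", "jms-2"])          # steady state: two instances
steady = Counter(jms.route(r) for r in range(1000))

jms.scale(["jms-1", "jms-2", "jms-3", "jms-4"])    # migration window: scale out
migration = Counter(jms.route(r) for r in range(1000))

print(steady)     # ~500 requests per instance
print(migration)  # ~250 requests per instance
```

After the migration window, calling `scale` with the original two instances brings the deployment back to its steady state.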

    Business Logic Services

    If you are using OpenText Documentum D2, for example, and have potentially “heavyweight” logical services like watermarking, you can extract them into real services (microservices) and connect them to service discovery tools with load-balancing/failover-aware clients (e.g. the Netflix OSS stack – Eureka, Ribbon, Hystrix). With this option, the watermarking service becomes scalable and flexible for any future needs and can be placed on, or moved to, dedicated computing resources as needed. If at a certain point this service is identified as a bottleneck – or if a one-time event such as a submission to an authority is coming up – you can instruct your orchestration tool to create additional instances of the same service, and you can scale the service down again after the peak has passed in order to use your hardware resources efficiently.
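A minimal sketch of the client-side discovery and failover idea, in the spirit of the Eureka/Ribbon pattern mentioned above. Everything here – the registry, the service name, the instance names – is an in-process stand-in for illustration, not the Netflix OSS API:

```python
import random

# In-process sketch of client-side service discovery with failover:
# a registry lists instances of a service; the client picks one and
# falls back to the next instance on connection errors.

class Registry:
    def __init__(self):
        self._services = {}

    def register(self, name, instance):
        self._services.setdefault(name, []).append(instance)

    def lookup(self, name):
        return list(self._services.get(name, []))

class DiscoveryClient:
    """Picks instances in random order; fails over on error."""
    def __init__(self, registry):
        self.registry = registry

    def call(self, service, request, invoke):
        instances = self.registry.lookup(service)
        random.shuffle(instances)        # naive load balancing
        for inst in instances:           # failover: try the next instance
            try:
                return invoke(inst, request)
            except ConnectionError:
                continue
        raise RuntimeError(f"no healthy instance of {service}")

registry = Registry()
registry.register("watermarking", "wm-1")
registry.register("watermarking", "wm-2")

def invoke(instance, request):
    if instance == "wm-1":               # simulate a failed instance
        raise ConnectionError
    return f"{request} watermarked by {instance}"

client = DiscoveryClient(registry)
print(client.call("watermarking", "doc-42", invoke))  # doc-42 watermarked by wm-2
```

Scaling out the watermarking service then amounts to registering more instances; the client code does not change.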


    Do not be afraid to use the best of both worlds! We will support you in combining both technologies to deliver the best results:
    • Analyze your existing OpenText Documentum infrastructure architecture
    • Analyze your existing OpenText Documentum software architecture
    • Create a roadmap with you on how to make your OpenText Documentum stack cloud computing-ready
    • Create best practices on how to create future components inside your existing OpenText Documentum stack to make them elastic and to comply with the concepts of cloud computing
    • Move application logic (where applicable) into elastic Microservices
    We are looking forward to sharing our expertise with you!

    Traditional or cloud-native? Why not something in between?

    Your on-premise cloud options at a glance

    Starting from running everything in their own data center (on-premise), companies have the following options:
    1. Cloud storage: adding storage capacity from the cloud to your application.
    2. Re-platforming
      1. Lift’n’Shift: i.e. virtualizing or containerizing the application and hosting the virtual machines or containers with a cloud provider. Provided that the containers are orchestrated accordingly, they can of course also be hosted in one’s own data center.
      2. Lift’n’Extend: i.e. containerizing the application and enhancing it functionally with other cloud services, or creating new cloud-native functions. It is also possible to develop individual elements of the application, e.g. the clients, in a new and cloud-native way and to link them to the backend. Such solutions are commonly referred to as hybrid cloud applications; not to be confused with hybrid cloud concepts.
    3. Re-Factoring: i.e. redeveloping the application on a PaaS stack as a cloud-native application. If the solution is based on a commercial base product, such as a DMS, that is not cloud-compatible, this method is not possible.
    4. Use a new solution from a SaaS provider.

    Containerization as an ideal middle way between traditional and cloud-native

    For example, many organizations are facing the decision to either continue operating DMS-based solutions at great expense or to choose option 4, i.e. to switch to a SaaS provider. However, the options listed under 2 offer an ideal middle way with containerization. The advantages of containerization have already been described in detail in the fme blog post “How can Linux containers save your day?”. Our “Distributed Environments With Containers” data sheet provides a more technical insight into containers. In particular, the use of containers in validated environments – so critical for the life sciences industry – is a prime example of the additional advantages offered by container technologies.

    Your container advantages at a glance

    • Proven software and applications remain in use
    • The user interface and user guidance remain the same
    • Faster, less error-prone, automated processes can result in faster application deployment and lower validation costs
    • The agility in projects increases
    • Upgrade paths are simplified: entire migration environments are containerized, even with different host operating systems, which previously required elaborately installed, dedicated virtual machines
    • The possible applications are manifold!

    Strong expertise from fme’s independent cloud, container and migration experts

    The containerization experts from fme disassemble the base product, pack it into containers and reassemble it in your data center or at a cloud provider, such as AWS, to form the new base application. For some products, fme has already built ready-to-use containers that can be rolled out quickly and easily. Subsequently, configurations and, if available, customizations are applied. The new system is filled with the fme Migration Services, and thus a 1:1 copy is created. What sounds so simple requires a high level of expertise and is associated with costs that are more than justified if one considers the advantages of containerization.

    A useful example for Lift’n’Extend (2b): Connection of Alexa to a containerized DMS application

    An example of extending a containerized DMS application with native cloud services can be seen impressively in the YouTube video “Showcase: Alexa, please open Documentum!” For this purpose, fme containerization experts rolled out OpenText Documentum on AWS and fme AWS specialists connected it to Alexa. We believe that Alexa skills can make your daily work easier. In environments where operating a system with mouse and keyboard is difficult, Alexa can use voice commands to find and read documents. A use case could be a laboratory where safety gloves have to be worn, but the user working there needs SOPs from a DMS system.

    The OpenText Documentum REST API? – A field report from the developer’s point of view

    What is the “Documentum Rest API”?

    In principle, the term Documentum REST API refers to a web interface, introduced with Documentum 7, that allows access to objects and functions of OpenText Documentum. It is based on Spring Boot, is delivered as a WAR file and must be installed on an application server – e.g. Apache Tomcat. This interface can be used to write customized clients, apps, or plug-ins for other systems.

    Before the Documentum REST API, the DFS and DFC APIs had to be used as the interface for business applications to interact with Documentum. DFS is based on SOAP and is therefore also a web interface, while DFC is a Java API.

    The REST API consists of several RESTful web services that interact with OpenText Documentum. It is “hypertext-driven, server-side stateless, and content negotiable, which provides you with high efficiency, simplicity, and makes all services easy to consume” [Source: https://community.emc.com/docs/DOC-32266, date of collection: 29.11.2016 – 11:51].

    Introduction to REST Services

    To understand how the Documentum REST API works, the first step is to explain the basic operation of a general REST service. REST stands for Representational State Transfer and is based on a stateless, client-server and cache-capable communication protocol. In almost all cases the HTTP protocol is used. REST is additionally hypertext-driven, i.e. REST clients must at any given moment have all the information they need to decide where to navigate next. Hypermedia connects resources and describes their capabilities in a machine-readable way. For this reason, a REST client only needs to know one thing to communicate with the REST server – how to understand hypermedia.

    REST itself is an architectural style for network services (web services) in which complex mechanisms are avoided as far as possible by using simple HTTP calls.
    This means that a REST web service and its functions can be called without much effort from a simple HTTP client. A call via the browser – which is also based on an HTTP client – is likewise possible if the web service has implemented a GET call for the specified service URL. It is thanks to the HTTP protocol that the functions and accessibility of REST services can be easily tested and used, for example, for the in-house development of a client whose code base you control yourself.
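To make this concrete, here is a self-contained sketch: a tiny JSON-over-HTTP service built with Python’s standard library, queried by a plain HTTP client in the same script. The service is a stand-in to illustrate the mechanics of a simple GET call, not the Documentum REST API:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Minimal stand-in REST service: every GET returns a JSON document
# describing the requested resource path.
class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps({"resource": self.path, "status": "ok"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)   # port 0: pick any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# The "client" is nothing more than a plain HTTP GET plus JSON parsing.
url = f"http://127.0.0.1:{server.server_port}/documents/123"
with urlopen(url) as resp:
    payload = json.load(resp)

server.shutdown()
print(payload)
```

The client side is just an HTTP GET and a JSON parse – which is exactly why REST services are so easy to consume from any language or tool.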

    In addition, the REST API can be accessed from any client, provided that the client is on the same network and can authenticate and authorize itself. No installation of local software is necessary – to use the functions, only an HTTP client (e.g. a browser or a self-developed client) is required.

    Compared to SOAP, REST also offers a small performance advantage.

    Functions of the services

    The Documentum REST API is delivered with some REST services that allow you to call basic functions of Documentum. Here is a brief list of what I think are the most interesting services:
    Object Service
    This service allows interaction with Documentum objects that inherit from dm_sysobject, such as cabinets, folders and documents. You can use the service to display objects, create and modify objects (for example, check-out/check-in, new renditions, …) or delete objects.
    Objects that do not inherit from dm_sysobject must be implemented through an extension of the REST API. Read more about this in the section Extensibility.

    User Service
    This service can be used to manage users, groups and ACLs. This means that user administration can also be accessed from an external source and easily adapted to your own processes and requirements.

    DQL Service
    The DQL web service allows you to execute DQL queries and report their results back to the client. By default, the DQL service only supports SELECT queries for security reasons. Also for security reasons, changes to objects via DQL should instead be implemented using your own, secured services.
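A sketch of how a client might build such a read-only DQL call. The base URL, repository name, and query parameters are placeholders for illustration – consult your version’s REST API documentation for the exact resource path:

```python
from urllib.parse import urlencode, urlsplit, parse_qs

# Illustrative base URL, matching the example host used later in this
# article; repository name and parameter names are placeholders.
BASE = "http://vm-documentum-app-test:8080/dctm-rest"

def dql_url(repository, dql):
    """Build a query URL; the DQL statement must be URL-encoded."""
    query = urlencode({"dql": dql, "items-per-page": 20})
    return f"{BASE}/repositories/{repository}?{query}"

url = dql_url("myrepo", "select r_object_id, object_name from dm_document")
print(url)
```

Sending this URL with any HTTP client (and valid credentials) would then return the query results as a JSON or XML feed.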

    Example Call of the Object Service
    Once the REST API has been successfully installed on an application server – and provided the firewall rules have been set up correctly – it can be called by any client on the local network under the path “/dctm-rest” (example: http://vm-documentum-app-test:8080/dctm-rest for a default Tomcat configuration). When calling this URL, e.g. via the browser, you should see the following:

    If the above link, adapted to local conditions, cannot be reached either via the network or locally from the application server, the log files must be inspected. Most likely something went wrong during the installation of the REST API.

    Clicking on the “Services” button would call “/dctm-rest/services” (example: http://vm-documentum-app-test:8080/dctm-rest/services).
    The result, when the page is called up, should look something like this:

    If you now call the first URL, for example, all Documentum repositories would be displayed:

    The URLs contained in the entries could now be used to connect to the repositories using valid logon data, for example to execute a DQL query, check out a document, or similar.
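Following those links programmatically might look like this. The JSON below is a simplified, made-up sample in the spirit of the repositories feed, not a verbatim API payload:

```python
import json

# Made-up, simplified sample of a repositories feed entry; the real
# response contains more fields and link relations.
repositories_feed = json.loads("""
{
  "entries": [
    {"title": "myrepo",
     "links": [{"rel": "edit",
                "href": "http://vm-documentum-app-test:8080/dctm-rest/repositories/myrepo"}]}
  ]
}
""")

def repository_link(feed, name):
    """Find the link for a repository by title - hypermedia style:
    the client follows hrefs instead of hard-coding resource paths."""
    for entry in feed["entries"]:
        if entry["title"] == name:
            return entry["links"][0]["href"]
    return None

print(repository_link(repositories_feed, "myrepo"))
```

The point of the hypermedia approach is that the client discovers this URL from the feed instead of assembling it by hand.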

    With appropriate knowledge of the architecture and function of REST services, getting started is relatively quick, since the Documentum REST API is a REST-compliant interface. In addition, OpenText provides many helpful tutorials and code examples that can be found on the Internet. OpenText’s code libraries, which are available on GitHub in almost all common programming languages (Java, C#, Python, Swift, Ruby, Objective-C, AngularJS, …), round off the easy introduction.
    Nevertheless, these GitHub libraries should be chosen with care, since you depend on their code and the libraries can be changed by the manufacturer at any time. Features could behave differently after a new patch and no longer work as before, which can lead to errors.
    The other option is to create a custom code library that accesses the API. This ensures independence from the code libraries provided by OpenText and gives you more control over what happens under the hood during development.

    As already indicated in the description of the individual Documentum REST services, not every function of Documentum is exposed by default through the web interface described here.
    However, if functions that are urgently required for your business logic are missing from the standard delivery of the REST API, they can be implemented as extensions of the REST services. It should be noted that getting started with this is not quite as simple as, for example, writing a first document management client against the API. However, thanks to the documentation of the REST API and good examples on the OpenText community pages, most initial problems can be handled easily. Extensions are implemented in Java using the build management tool Apache Maven.

    Docker Support
    Since version 7.3, the Documentum REST web services can officially be run as a prebuilt Docker image in a Docker container. The support for Docker is, in my opinion, the right step to position the REST API as the primary web interface. Docker makes it easy to scale and to migrate to other servers without much effort. For this reason, the REST API can be quickly integrated into the internal network and used – provided, of course, that the necessary Documentum infrastructure already exists.

    The Documentum REST web services offer a good and beginner-friendly way to interact with OpenText Documentum from self-implemented systems or clients. In my opinion, the short learning curve for the functionality and use of the API is especially positive. But you should be careful with special cases, because implementing an extension of the standard API is a time-consuming process and requires some training. The possibility of using the services from almost any programming language – and thus also from any operating system – is also very nice.