Open Source ECM is Dead

It finally happened. An acquisition in the ECM space so newsworthy I had to write about it. One so big that it is going to fundamentally change the market.

Hyland just announced that they are acquiring Nuxeo.

I never thought that an acquisition involving these two firms would be so newsworthy. However, this is the second acquisition of a major open source ECM vendor in the past year by Hyland. And that is the problem.

There were only two major open source ECM vendors in the market.

That’s right. A single vendor, one that was not in the open source market before it bought Alfresco, has acquired both major players. While this may not spell the end of open source in the ECM space, it does mean the end of true choice.

And with only one choice, you do not have a competitive ecosystem.

The Coexistence of Alfresco and Nuxeo

Let’s look at the practicalities of the acquisition, putting aside the open source nature of both Alfresco and Nuxeo. Alfresco was a good fit for Hyland. They had a larger footprint with “enterprise” customers and their content services architecture was more cloud ready. There was a little bit of overlap, but there were a lot of reasons not to worry.

Nuxeo overlaps with Alfresco quite a bit. It has a stronger digital asset management (DAM) offering and a more advanced technical architecture. It is lacking in records management features, though that can be compensated for by leveraging a tool with federated records management capabilities, like the one within Alfresco.

Alfresco was liked by enterprise buyers. Nuxeo was liked by the technical geeks. However, as Alan Pelz-Sharpe points out, there was no love lost between the vendors because they saw each other, rightly so in my opinion, as each other’s main competitor.

Future of Content Services

Right now, Hyland is a big unknown. Will they provide information governance capabilities for Nuxeo and use that as their cloud baseline? Will they take Nuxeo’s DAM and engineers but ditch the rest? Whatever the direction, it will take time to get everything structured at Hyland and moving in the right direction.

Meanwhile, Microsoft 365 and OpenText have to be a little concerned. If Hyland does things correctly, Hyland is going to be a strong competitor. Best case scenario, Microsoft and OpenText can leverage the uncertainty over the next year to retain customers thinking of leaving and to win a few more deals before Hyland comes out swinging.

The biggest winner, and likely the only one in both the short term and the long term, is Box. They have benefited from the on-premises ECM industry’s failure to successfully attack the cloud. They just got one more chance to “win” the industry, just when they might need it.

What Is Next?

It is hard to say. There is clearly an opportunity for some vendor to step up and become a significant player. Perhaps one of the headless CMS (content management system) players that are making a splash in the web content management (WCM) space.

To be honest, I half expected Amazon to buy Nuxeo and turn them into an AWS offering. If Amazon created an ECM offering, perhaps with Textract tied in, it could be formidable. Microsoft may also decide to move past checkbox content services and turn SharePoint into a real platform.

A lot could happen. For the next few months, everything should be status quo. If I were a cloud-native vendor, I’d be closing my gaps and getting ready to pounce on the clients being left behind. Right now, Box is likely the best positioned. Their largest weak spot, from a content services perspective, is their lethargic content modeling.

And that can be compensated for if necessary.

Content Services Made Possible With AWS

[Originally written for the TeraThink blog. Additional edits have been made to clarify context.]

We’ve shared a bit about how we’ve set up a working infrastructure for content services at USCIS. While it hasn’t always been easy, there have been a few key takeaways that have made TeraThink’s efforts successful.

  1. Define business-centric APIs. We currently use Mule as it makes the basics easy and allows for complexity.
  2. Understand, capture, and fully execute the non-functional requirements. User experience drives adoption. Non-functional requirements drive management support and avoid messy incidents.
  3. Architect for, and deploy in, the cloud.
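To make the first takeaway concrete, here is a minimal, purely hypothetical sketch of the difference between a business-centric API and generic repository calls. The class and method names below (`ContentRepository`, `CaseFileService`, `submit_evidence`) are illustrative inventions, not TeraThink’s actual Mule APIs or the repository’s real interface.

```python
# Sketch: a business-centric facade over a generic content repository.
# Callers speak the business's language (cases, evidence) rather than
# repository verbs (create document, set property).
from dataclasses import dataclass, field
from typing import Dict


@dataclass
class ContentRepository:
    """Stand-in for a generic content services back end."""
    documents: Dict[str, dict] = field(default_factory=dict)

    def create_document(self, doc_id: str, metadata: dict) -> None:
        self.documents[doc_id] = metadata


class CaseFileService:
    """Business-centric API: the facade owns the mapping from business
    terms to repository metadata, so callers never see repository details."""

    def __init__(self, repo: ContentRepository):
        self.repo = repo

    def submit_evidence(self, case_number: str, filename: str) -> str:
        doc_id = f"{case_number}/{filename}"
        self.repo.create_document(doc_id, {
            "type": "evidence",
            "case": case_number,
            "name": filename,
        })
        return doc_id


repo = ContentRepository()
service = CaseFileService(repo)
print(service.submit_evidence("A-123", "exhibit-1.pdf"))  # A-123/exhibit-1.pdf
```

The point of the shape, not the names: the basics stay easy for callers, while any complexity lives behind the facade.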

Designing for the cloud seems obvious in today’s IT world. However, I cannot stress enough how much time and effort has been saved by keeping this at the forefront of our efforts. I’ve been doing enterprise content management (ECM) for decades, and I can tell you that using the different cloud capabilities of Amazon Web Services (AWS) has made a huge, positive impact.


Join TeraThink At Alfresco’s 2019 Government Summit

[Originally published on the TeraThink blog]

Alfresco is bringing their Alfresco Days series of events back to D.C. again on May 23. The 2019 Alfresco Government Summit focuses on generating discussions around leveraging Alfresco as a platform in the cloud. Specifically, as an open source content services platform living in AWS.

TeraThink will be there again this year to talk about how we make content services work using Alfresco in AWS. We have been leveraging the content services platform (CSP) approach to deploying enterprise content management (ECM) for a few years. During that time, we’ve learned a lot of lessons. We will be bringing that expertise to the Application Platform Revolution Panel moderated by Alfresco founder John Newton.

Before the 23rd, I want to take a few minutes to give you a preview of some of the thoughts we will be sharing.


Digitally Transform Your Processes and Information Governance Policies

[Originally published on the TeraThink blog]

One of the great things about using content services in your digital transformation efforts is the automation of a lot of information governance processes. You can link business entities, automate the application of policies, and reduce duplicate content. All of which increases the reliability of information and reduces redundancy. The newly digitized processes streamline the work that you do daily, increasing your ability to innovate across your business.
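As a concrete, purely illustrative example of the kinds of automation involved: the sketch below derives a record’s disposition date from its record type and fingerprints content to detect duplicates. The retention schedule shown is made up for the example; real schedules come from your records policies.

```python
import hashlib
from datetime import date

# Illustrative retention schedule (years by record type); not a real policy.
RETENTION_YEARS = {"invoice": 7, "contract": 10}


def disposition_date(record_type: str, created: date) -> date:
    """Automatically derive when a record becomes eligible for disposition."""
    return created.replace(year=created.year + RETENTION_YEARS[record_type])


def content_hash(body: bytes) -> str:
    """Fingerprint content so duplicates can be detected and reduced."""
    return hashlib.sha256(body).hexdigest()


print(disposition_date("invoice", date(2021, 1, 1)))  # 2028-01-01
print(content_hash(b"same bytes") == content_hash(b"same bytes"))  # True
```

In a real platform these rules would fire automatically as content is captured, which is exactly what removes the manual policy work.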

Sounds great, right?

But what about those policies you are applying? Have you thought about what they are doing? Do they reflect the realities of your day-to-day? Now that you are no longer dealing with paper and information silos, you can revisit your records policies that were written years ago.


Digital Preservation Matters As Our Records, And History, Are Vanishing

[Originally published on the TeraThink blog]

I’ve been seeing an uptick in interest in digital preservation recently. We are a few decades into the digital age and, even without the push to digitally transform everything, people are realizing that they have a lot of digital information. I am surrounded by people who are using a digital records system I put in place over a decade ago. This puts that system into the realm of digital preservation. As AIIM put it in their 2017 Digital Preservation Market Research:

The capabilities to ensure the readability and usability of digital information that must be retained for longer than 10 years.

I used to think ten years was a long time. It isn’t. People are also realizing that while storing large volumes of electronic documents is easier than storing paper, you have to take greater care. I have books in my house that are more than 100 years old. The only accessible, viable digital content I have that is over 25 years old is some music compact discs.

As we create more and more digital information, we need to start thinking more about long-term preservation.


Data, Content, Information, and Records Management

[Image: the Information Coalition's initial view on the relationship between data, content, information, records, knowledge, and documents]

There are so many terms for the things that we manage every day. Most people’s understanding of them is a remnant of what they learned as they entered the industry, expanded by how we use the terms in our daily lives. The Information Coalition is working on their InfoBok, which seeks to finally define these disciplines.

Recently, I was part of a Twitter discussion with several people, primarily hailing from the web side of the content management world. It has been many years since I made the argument that the world of Enterprise Content Management (ECM) should include the Web Content Management (WCM) space. The worlds turned out to be connected but distinct. The uses of the word “content,” and how it relates to information, are evidence of that difference.

I thought I would take the time to share my thoughts in a space that allows more than 280 characters. Hopefully, this will stir some more discussion.


Beyond the Hype of Content Services

[Originally published on the TeraThink blog]

A few weeks back, I spoke on an Information Coalition webinar with Nick Inglis about getting Beyond the Hype of Content Services. We discussed content services and tried to separate the reality from the hype. If you’ve been following along, there is a lot of hype out there, and there has been since Gartner stopped tracking ECM (enterprise content management) and switched to content services. This has fed people’s instinct to equate content services with ECM. Many vendors and consultants are now taking their marketing messaging and simply substituting one term for the other. Even more distracting are the people who reflexively reject content services because they assume the person using the term is just doing a term swap.

The truth is that content services is not ECM. It is an approach to implementing solutions that support an ECM strategy and provide sound information governance. Content services doesn’t eliminate the need for an ECM strategy or information governance. In fact, if you don’t have a strategy or proper governance, you might end up addressing the wrong things.

You still need a plan. To determine how to implement it, you need to know what content services is and how it can make a difference.


Book Review: Designing Connected Content

Two book reviews in a row? Yep. As I said in my last review, I’m reading non-fiction a lot more now and I have a backlog of industry books to read. One of the authors of this book, Carrie Hane, is a good friend. I watched her work on Designing Connected Content for pretty much all of 2017. I was very excited to finally get my copy.

For years, Carrie and her co-author, Mike Atherton, have been talking about Designing Future Friendly Content. In the web world, this means using a structured content model so that the management of the content is not tightly coupled to the presentation layer. As design trends change, your content and underlying website structure don’t have to. Taken to its ultimate conclusion, you are looking at a headless Content Management System (CMS) supporting one or more presentation layers (web, mobile, Alexa…).
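To illustrate the idea with a model I made up for this post (not one from the book): the content lives as structured fields that carry meaning but no markup, and each presentation layer renders the same fields independently.

```python
from dataclasses import dataclass
from typing import List


# A made-up structured content type; fields carry meaning, not presentation.
@dataclass
class Recipe:
    title: str
    steps: List[str]


def render_html(recipe: Recipe) -> str:
    # One presentation layer: a web page.
    items = "".join(f"<li>{step}</li>" for step in recipe.steps)
    return f"<h1>{recipe.title}</h1><ol>{items}</ol>"


def render_voice(recipe: Recipe) -> str:
    # Another presentation layer: a voice assistant reading the same content.
    return f"{recipe.title}. Step one: {recipe.steps[0]}."


pancakes = Recipe("Pancakes", ["Mix the batter", "Cook on a hot griddle"])
print(render_html(pancakes))
print(render_voice(pancakes))
```

A design refresh rewrites `render_html`; the content itself never changes, which is the whole point of decoupling.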

They finally took the time to write a book on the topic. It was time well spent.


Talking Agile Content Services at NCC-AIIM

[Originally published on the TeraThink blog]

Recently, I was at the local NCC-AIIM Chapter meeting. Russ Stalters was visiting from Texas and shared the story of how he created a new, 200+ person data management team for the BP Gulf Coast Restoration Organization. A separate organizational entity from BP, the organization was stood up in 90 days, from vision to operation. It was an impressive tale involving massive amounts of information being absorbed and managed in a highly visible environment.

As Russ spoke, it became clear that two of the key lessons were around agile processes and content analytics. It generated some great discussion that took us well past the scheduled time. I wanted to take some time to share some of the highlights.


InfoGovCon 2017 Continues to Set the Bar High

[Image: Governor Raimondo speaking at InfoGov17]

This post has been a long time coming because I’ve been trying to process everything that happened this year. Once again, InfoGovCon was a great event, and the Information Coalition should be proud of the quality of speakers they assembled. After all, how many conferences score a governor and get them to talk about something relevant?

Conferences like InfoGovCon are critical for the industry. We are still building a template for consistent success. As Shannon Harmon, whom I had the pleasure of meeting this year, put it,

The best practices are still being developed. The body of knowledge is under construction. This makes information governance an exciting space within which to work. It can also be immensely frustrating for those who want a well-defined structure in place. Working in this space requires a certain comfort level with the unknown.

After decades of working in this space, I agree that there are still some unknowns. We have learned a lot about what NOT to do. What we are still putting together is the way to get things done consistently.
