
Rebooting Enterprise Content Management July 8, 2014

Posted by Marko Sillanpää in ECM.

For the last year I’ve been a lurker in the ECM landscape.  Mainly I haven’t been sure what to say.  I’ve written a few posts and responded to a few others, but I haven’t felt ready to take a position.  Recently I’ve had a few gentle pushes that have me thinking it’s time to say something.  To talk about what has changed in my approach.

My tipping point was a visit to the Crown Partners website.  People who remember Momentum’s past will recall their Documentum Viper and two Documentum Hummers.  Today Documentum is nowhere to be found in their partner list.  Crown has moved on, successfully, to become a web experience company.  It leaves me to ask: is the ECM problem … changing? (more…)

When Process Replaces Product (or Why I Hate My HoA) June 10, 2014

Posted by Lee Dallas in Content Management.

I dislike my home owners association.

I am not alone.  (more…)

EMC World 2014 May 5, 2014

Posted by Lee Dallas in Documentum, ECM, EMC, xCP2.

I began my week at EMC World 2014 and Momentum today.

Even before the sessions I had a great conversation over breakfast with a colleague from another division about new things our respective technologies could do together. That is my favorite part of this conference. Free flow of ideas with wicked smart people.

First thoughts on the week ahead.

xCP 2.1 : I have precious little time to attend sessions because of my day job, but I made time this morning to hear from xCP product management about the new features of 2.1 and a little peek into the future. I hope to write more on 2.1 later this week, but the key message is the maturation of xCP 2.1 into a terrific platform for creating beautiful, flexible, and powerful business applications that go far beyond run-of-the-mill BPM or content management. Admittedly there were many things in 2.0 that we all wish had been there. A few points I will make here:

  • Process debugging and preview solve my biggest complaints.
  • Seamless editing finally lets me build content management features on par with the rest of the portfolio.
  • Session variables and page and type fragments allow me to be as creative as I want in building a user experience that is easy to use and understand.

InfoArchive : For customers and partners at EMC World, I suggest you spend some time understanding InfoArchive. As I have written many times before, long-time Documentum and IIG practitioners need to make sure they do not carry too much conceptual baggage with them as they look at this opportunity.

We have been in the business of archiving data for many years, but this allows us to expand the value of our expertise to deliver a purpose-built structured and unstructured archiving platform. Even though InfoArchive has a suite of components, not all of which may be necessary for every deployment, the evolution of the business model to consumption-based pricing simplifies the transaction as well.

Just as you don’t want to confuse complex ECM use cases with InfoArchive, the other risk is to oversimplify and think this is simply an object storage play. Unfortunately, as an industry we overload terms and lose very important distinctions. InfoArchive sits a layer above the storage conversation and delivers a business-user application layer to search and manage archived records.

Tonight I start my job on the show floor, which is the main reason for being here: getting a chance to spend time with customers and partners, talking about their challenges and ideas for how we can help meet them. If you are here in Vegas, please stop by the booth and see all the latest and greatest from IIG this year at EMC World.

Standard disclaimer and friendly reminder : I am an employee of EMC working in IIG Partner Alliances. Everything written here is my own opinion.

Deliberate Disruption April 17, 2014

Posted by Lee Dallas in Content Management, ECM, Technology.

Last weekend Box CEO Aaron Levie tweeted:

“Disruption is the art of identifying which parts of the past are no longer relevant to the future, and exploiting that delta at all costs.”

To which I responded:

“deliberate disruption is extraordinarily rare. So rare that I think you can capitalize on it but never plan for it”

I’ve been asking myself this question since. Can disruption ever really be deliberate?

The question conjures up images of team meetings where Dilbert’s pointy-haired boss declares, “nobody leaves this room until we innovate!”

It occurs to me that in order to be deliberate, disruption must be an objective, not just an outcome. That is a mistake.

Truly disruptive technology sets out to solve a problem first. Disruption is a possible but not guaranteed outcome of innovation introduced into a landscape. That landscape is made up of evolving technology, existing competition, and fluid user expectations all of which can be exploited or encountered as obstacles depending on conditions.

The disruption is a function of the problems the competition faces in response to what you are doing. Competitors can easily fall into the trap of thinking that imitating the turmoil with their existing portfolios is the same thing as finding new ways to solve problems.  It isn’t.

Genuine disruption solves new problems in a landscape, solves old problems in new ways, and/or significantly alters the cost, value, and accessibility of those solutions. It is in the areas of cost and accessibility that we have the ability to interrogate the landscape and potentially predict the degree of disruption introduced. This is the part that can be deliberate.

Cloud and mobile in recent years have provided a method for disruption by making possible the migration of traditionally on-prem problem sets to off-prem, resetting cost models for both producers and consumers of services and forcing the redefinition of well-established roles and funding models.

When those services are not re-imagined in the cloud context and are simply ported from old delivery models, there is plenty of turmoil for vendors, but they are most likely victims – not instigators – of the disruption.

The challenge for buyers and vendors alike is to understand what combination of forces make up the disruptors and which companies or products are merely caught in the vortex. It is not always easy to tell. Even new businesses can get caught up into a pattern of generating turmoil and lose sight of the real objective.

Solve problems for real people.

That is what we must do at all cost – and occasionally it will be disruptive.

Tom Rouse on “ECM and Cold Pizza In the Fridge” February 25, 2014

Posted by Marko Sillanpää in Content Management, Documentum, ECM.

It’s crazy how busy we get sometimes, so it’s fun to catch up with people you haven’t spoken with for a while.  When I heard from Tom Rouse, I took the time to catch up.  I’ve known Tom for over 15 years; we were both consultants at Documentum at the time.  He’s one of those people who enjoys thinking about ECM and making it relatable.  When I heard his cold pizza analogy for documents, I thought it was something worth sharing, so I asked him to write it down.

Here’s what Tom said:

Cold pizza is generally a staple of every refrigerator (fridge). It gets tossed in and sometimes forgotten. But there it is… still tasty and waiting to fulfill its purpose as a snack.  Most Enterprise Content Management (ECM) systems today are serving as “refrigerators” for the documents and content in them.

Please stay with me on the metaphor. Most clients use the carefully constructed models and user interfaces when they need a “snack.”  The number one reason most clients use the ECM system is to seek out information when they are hungry for it… they just go looking in the fridge for a slice of content!


ECM Trends 2014 – We Don’t Need Big Content January 19, 2014

Posted by Lee Dallas in cloud, Content Management, ECM, Technology.

A friendly reminder that all of the opinions expressed here are completely my own and not those of my employer.

I am late this year putting together my thoughts around trends for 2014 in ECM. To be frank many of the trends in ECM seem obvious with much already having been written about them.

  • Everyone is moving to the cloud, and this is no longer trend-worthy news.
  • There will be a few acquisitions, especially among the mid-tier players to round out capture, workflow and mobile capabilities.
  • IPOs of a few key players will be frequently discussed but deferred until 2015.
  • DropBox will relaunch its business offering AGAIN. Look for them to make acquisitions of overlapping tools to gain this foothold.

I struggled to find something more substantive to cover until I found myself in a lively discussion on the topic of “Big Content.”

Commercialization of value extraction from the enormous amounts of unstructured data being generated today is the next major focus for advancement in the ECM industry. It goes beyond improving transactional throughput and accuracy and into understanding. It may be ironic, since I am one of the “Big Men On Content,” but I do not like the term “Big Content.” The term perpetuates outdated stereotypes and division at a time when technologies should be coming together.

To understand what is meant by this term I did what anyone else would do. I went to bigcontent.com. Surely the genius that had the foresight to grab the URL could tell me if it really is separate from big data. I was disappointed. A marketing firm jumped on the term and is using it in a completely different context. Good for them but it is perhaps a missed opportunity from an ECM perspective.

Big Content as it relates to ECM seems to be the content management industry’s attempt to ride the coattails of Big Data marketing. A never ending quest to be appreciated as much as the more popular sibling, structured data. 

So what do we mean by Big Content? Is it a subset, a superset, or something altogether different from what we are now calling Big Data? EMC’s Dave Dietrich wrote this piece on Big Data misconceptions and takes the position that unstructured data that rises to Big dimensions is a subset.

To be “big” in this context, Dietrich argues, the data in question must have great volume, yes, but must also have both variety and velocity. Certainly some unstructured data has these characteristics. He goes so far as to say most Big Data problems are grounded in the unstructured, citing last year’s IDC Digital Universe study.

Gartner’s Darin Stewart seems to agree that Big Content is a subset of Big Data. He goes on to posit that the slow uptake of interest in the unstructured aspect is due to a lack of “comfort” on the part of IT in dealing with documents as opposed to databases. He touches on what I feel is the crux of the issue, but I don’t think it has anything at all to do with comfort itself. I think it is the utter lack of an integrated tooling approach across the industry.

All silos begin as words.

If you make it a separate category you may one day have tools that let you do meaningful things, but without a common analytical approach it will perpetuate the integration burden the structured and unstructured worlds deal with today. Deriving value from structured data will always be easier, and as separate solutions, structured data applications will continue to hold the attention of buyers.

The very things long-term ECM proponents hope to achieve by trumpeting Big Content will continue the technological isolation and lack of innovation that the industry has struggled with for a decade. What is needed is a coordinated drive to raise the expectations of the emerging structured analytical tools to demand search, content analytics, sentiment analysis, and so on. Some offerings, particularly those tailored to social media analytics, have begun this work, but we must continue to push beyond 140 characters to more valuable and information-rich content.

We do not need a separate category for Big Content tooling, marketing, and expertise. We already have one. It is called Big Data. And it is very nice.

Investment in this is happening whether you call it Big Content or not. IBM’s billion-dollar investment in Watson is the best example. From a user’s perspective, Watson does not distinguish between structured and unstructured sources when a question is presented. Likewise, as we begin to think about other analytical frameworks from a user’s point of view, the difference in the structure of the data sources should matter less over time and eventually disappear altogether.

One might ask, isn’t this just the same content and semantic analytics that we have been talking about for years? The answer is “sort of.” You will be hard pressed to find any of the lofty promises of those initiatives fulfilled. It is my contention that this tooling needs to scale and be formally folded into the analytical tool set of Big Data. The correlated structured data provides context for the extracted unstructured data. At some level this is happening, but as an industry I think we derail this motion when we attempt to create differentiation in categories.

This convergence of structured and unstructured analysis will be hampered if we spend overt mental and marketing energy today perpetuating a separation of the disciplines simply to defend the value of our current expertise.

The trend has begun. We need to help it along or get out of the way.

ECM, “Reports of my death are premature.” November 21, 2013

Posted by Marko Sillanpää in Content Management, ECM.

For a while I’ve seen various posts about the death of ECM. What has happened is that “enterprise” has become an adjective rather than a noun. Often, as vendors, we think enterprise means a big deal size rather than a wide reach. Enterprise solutions have come to mean solutions to large problems often encountered by a small audience. ECM sales were often made as enterprise deals, but the solutions rarely left the large, complex problems they were envisioned to solve.

I believe that ECM is about to have a revolution, a revolution towards EwCM (Enterprise-wide Content Management).  That revolution is being led by the customer. (more…)

The Real Story Group release 2013 Content Technology Vendor (Subway) Maps July 15, 2013

Posted by Marko Sillanpää in ECM.

Gartner’s recently released Digital Media Transit Map reminded me that I’d seen something like this before for ECM.  So I looked and found that Real Story Group (formerly CMS Watch) had also recently released their 2013 Content Technology Vendor Map.  The Content Technology Vendor Map is an interesting graphical way to see all of the vendors within the content management space.  Each “transit route” represents a specific technology space, like enterprise search or web content & experience management.  Major hubs represent vendors where multiple technologies come together.

Understanding ECM Is About Dialect July 9, 2013

Posted by Marko Sillanpää in AIIM, Consulting, Content Management, ECM.

One thing I’ve learned over 15 years in ECM is that there are characteristics in individual systems and solutions that often rise to the level of idiosyncrasies.  The problem is that too often people on opposite sides of the table can argue about the same aspect of a solution for hours without realizing that both sides are talking about the same thing.  Too often we get so wrapped up in our own language of ECM that we don’t realize there are different vendor, industry, and customer dialects.


