Whether you are pulling the trigger on a multi-million dollar enterprise resource planning (ERP) project or a more modest software purchase and implementation for a middle-market business, what you don't know can hurt you.
Knowing what is behind the kimono--having visibility into what life will be like after contract signing and post go-live--is hard. It was hard before, when independent technology media still retained professional staffs to cover these companies and products. Even they could only go so deep, and objective coverage focused more on the speeds and feeds of technology products than on the functional benefits of software.
But the move to digital proved that the number of eyeballs reading enterprise tech content of any depth was small, undercutting ad revenue. And it is hard to find independent voices capable of understanding and explaining the ins and outs of software products and categories for an entry-level salary supported by online ads.
From the well-respected work of experts like Michael Krigsman to the occasional published tales of failed ERP implementations that make the news, software selection teams should have fair notice that things can go wrong once they sign the contract. A Gartner membership may provide some visibility into various products, as will an account with a deep comparison tool like Technology Evaluation Centers.
But both Gartner and Technology Evaluation Centers are funded, in large part at least, by the software vendors they provide information on. And this is a challenge. With Technology Evaluation Centers, vendors fill out spreadsheets with hundreds or even thousands of data points on what their software will or will not do. The results are made visible through breathtakingly detailed and configurable radial graphs and other visualizations.
But how are these software vendors kept honest?
Technology Evaluation Centers found they could not keep them honest--vendors, particularly larger ones, will say what they want about their products, often representing customizations as standard functionality. What Technology Evaluation Centers eventually did was offer an optional validation step: validated vendors have to demo to prove they can do the things they say they can in their submitted spreadsheet. So now the mega-vendor can lie for free (well, more cheaply anyway) while their competitors pay extra for validation.
What About Enterprise Software Peer Reviews?
Peer reviews like those on Gartner Peer Insights and G2 are interesting, but they are also limited. The reviewing business's use case may be different than yours, and they may not have had the best implementation, whether due to a lack of skill or planning on their part or on the part of their vendor or services partner. The product may have been oversold in their business but might work in yours--or vice versa.
Vendors will also aggressively send their best and most referenceable customers to these sites. With substantial effort directed at getting happy customers to post reviews, and few or no resources soliciting insights from the unhappy, this is an imperfect system.
What About Software Comparisons from Vendors?
One historically interesting set of data can come from vendors taking a product to market. In preparing to sell the software, they may create functional comparisons that highlight the features they have and compare them to their competitors'.
A smart marketer will want this information to be accurate because it will be used not just to sell the software, but oftentimes to drive decisions on research and development investment or go-to-market targeting. If a software product is stronger for quote-to-cash business models than a competing product, this method makes it easy to call that out. If you are looking for software that includes contract management, and a vendor's comparison shows they have it but their competitor does not, that is a viable data point.
But vendors can and do get hit by cease and desist letters for making such claims. Case in point: the founder and CEO of adtech vendor Littledata posted on LinkedIn that a competitor had served them with a cease and desist letter for claiming they had functionality their competitor did not.
Enterprise software guru Dave Kellogg commented on the thread with sage advice that illustrates how challenging it is to stay out of hot water:
"I'd stop posting and talk to a lawyer if I were you," he wrote. "Instead, you're doubling down, promoting the page in question and trying (unsuccessfully imho) to get sympathy for being attacked for a false advertising claim when I'd guess few people in the audience actually know whether it's false or not. Moreover, none of the features you're putting in the comparison table have definitions which is one way to keep yourself clean in such marketing.
"Example: they don't have schmumble, we do. And here's a link to our definition of schmumble. At least providing that definition increases the odds of your claims being correct and not solely relying on what the reader (and/or a reasonable person) would think schmumble means."
With the exception of the deep and multi-layered comparisons found on Technology Evaluation Centers, these side-by-side functional illustrations often lack context--not only the definitions that ensure we are comparing apples to apples, but whether the feature has been released to market yet, whether it has been successfully implemented and, oftentimes, whether it is white-labeled from another vendor.
Who Tells the Unvarnished Truth?
There are consultants who work only for software buyers and end-user organizations, providing insight to clients on an individualized basis. Some of these software selection and services firms have formal partnerships with one or more software vendors, enabling them to resell and provide services on the product; others are completely agnostic.
Even these consultants can be muzzled somewhat in selection cycles, which is problematic. I have heard tales of consultants associated with a major ERP vendor being retained to run a software selection process for a customer and telling vendors and selection consultants they would not be allowed to question the implementation risk of the product the consultants were in fact hoping to sell into the account.
I hear about objective selection consultants being threatened by vendors intimidated either by their lack of leverage or by the consultant's tendency to disclose pricing insights that may help a customer negotiate a better deal or put them off selecting a vendor due to hidden costs.
Working with an objective software selection consultant will still be a benefit for many companies though. These professionals have the benefit of having seen inside the kimono of various software vendors. They will know what functionality each vendor offers, what it does not offer and, sometimes even more importantly, what functionality it offers that is just not very good but enables it to check off a box on an RFI. They will know how each vendor prices and negotiates, and the level of effort associated with implementation.
Involving these third parties at the beginning of due diligence may make a lot more sense than retaining them as expert witnesses later--a service many such firms provide.
The Role of Market Power
Finding an independent, honest broker to work with can be very valuable. Working with a vendor more prone to be honest is also a viable approach. The old saying that power corrupts may not be accurate, but power can reveal. It can reveal the true value system of a management team or salesforce. A vendor threatening a consultant or journalist may feel they are too powerful to need third-party validation of what they are saying and have the leverage and power to get their way. They may feel they are too important to cross, and sometimes they are right.
A middle-market vendor will have less power over others, and this may force them to be more transparent. Having limited resources for attorneys and a desire to build a reputation rather than rest on one--these and other dynamics may make that insurgent vendor more attractive.
People tell the truth when the price for dissembling or withholding information is too steep. The consequences for being less than forthright are greater when the two parties to the deal are more equal in power.
Read Between the Lines
While reading what vendors or analysts say may get you only the part of the picture vendors want you to see, software buyers can become skillful at unpacking these communications to derive meaning.
This is an art cultivated over a lifetime, but examples may include:
Look at what is not said. Did a vendor acquire another vendor? This is interesting and may promise new capabilities, but what is the roadmap for integrating the acquired product into the existing one? Absent clear communication on this, it may be safe to assume it will be quite some time before there is a meaningful integration, if ever. Is a capability promoted in a case study part of standard software functionality, or did it require modification of the source code or a partner integration? Is a new feature backwards compatible with previous versions?
Be careful with words and definitions. Cloud, software-as-a-service, artificial intelligence, machine learning--all of these terms can be abused. For cloud computing models, confusion can be averted, per the excellent advice of Jonathan Gross, by relying on ITIL definitions.
What level of effort is required to sustain the software? Software in the cloud, and specifically software-as-a-service products, will consume fewer internal IT resources than software run on-premises. But there are still demands placed on an IT and operations staff to keep a solution operational. Getting to an idea of how many full-time equivalents (FTEs) are required, and what skill sets they must have, can be challenging. Seeking a detailed list of tasks that must be performed after go-live, and on what periodicity, can be very informative and less subject to interpretation. Allocating the required time conservatively may provide some insight into what staffing costs driven by the software will in fact be, as the sketch below illustrates.
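To make that concrete, here is a minimal back-of-the-envelope sketch in Python. Every task, duration and frequency in it is a hypothetical placeholder, not data from any vendor; the point is simply to show how a vendor-supplied task list and conservative time allocations translate into an FTE figure.

    # Back-of-the-envelope FTE estimate from a post-go-live task list.
    # All tasks, hours and frequencies are hypothetical placeholders;
    # substitute the vendor-supplied task list and your own allocations.
    HOURS_PER_FTE_YEAR = 1880  # assumed productive hours per full-time employee

    # (task, hours per occurrence, occurrences per year)
    tasks = [
        ("Apply patches and hotfixes",         4, 12),   # monthly
        ("User provisioning and role review",  2, 52),   # weekly
        ("Integration monitoring",             1, 250),  # each business day
        ("Quarterly release regression test", 40, 4),
        ("Annual disaster-recovery test",     24, 1),
    ]

    annual_hours = sum(hours * freq for _task, hours, freq in tasks)
    fte = annual_hours / HOURS_PER_FTE_YEAR
    print(f"Estimated annual effort: {annual_hours} hours, roughly {fte:.2f} FTE")

Multiplying the resulting FTE figure by a loaded labor rate yields a rough annual staffing cost to fold into the total cost of ownership comparison.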
Interesting too can be the number of people a vendor sends to demo the software. A larger team may suggest complexity that is too much for a smaller team or individual to master. A vendor sending one or two people to demo may also be quietly sending a message of intuitive usability that suggests it is easier to cross-train employees across multiple modules of the software.
When you ask a third-party expert about a software vendor's pricing strategy, technical debt or product roadmap, it might make sense to watch their body language as they respond. Do they:
Avoid eye contact
Fidget
Exhibit facial expressions that reveal discomfort
Purse their lips or cover their mouth, as if holding words back