Two years of experimentation may have given rise to several valuable use cases for gen AI, but during the same period, IT leaders have also learned that the new, fast-evolving technology isn’t something to jump into blindly. Some organizations, like imaging and laser printer company Lexmark, have found ways of fencing in the downside potential so they can benefit from the huge upside.
Before considering a project, Lexmark first makes sure the problem is worth tackling. If a project could solve a big enough problem to merit the risks, they then make sure they understand what type of data will be needed to build the solution. The next step is to make sure they have an objective way of testing the outcome and measuring success.
And finally, says Vishal Gupta, the company’s global CTO, CIO, and SVP of connected technology, the IT department won’t touch a project that doesn’t have commitment from business users. “You really need a stakeholder on the other end who’s willing to adopt it and is going to partner with you on the journey,” he says. “They need to be motivated to change their existing processes, and tolerant of the fact that AI isn’t going to be 100% accurate at this point.” Gupta applies the same common sense he applies to other technology projects, with a few adjustments to account for the novelty of gen AI.
As a general rule, once it’s clear a project is worth pursuing, IT leaders should decide whether to build or buy. If it’s a buy, they should do these three things when recruiting vendors.
Use a mix of established and promising small players
To mitigate risk, Gupta rarely uses small vendors on big projects. “If it’s one of our top 10 projects, I want to either build it myself or work with a more established vendor to minimize the risk,” he says. “But we don’t ignore the smaller players. We often use them on tier two problems that could become tier one next year.”
The smaller players are more likely to have the innovative solutions that allow an organization to get ahead of the competition, so Gupta allocates 10 to 20% of his IT budget for the more nimble suppliers. “We run some AI projects for our legal or HR teams, where they don’t have as much scale and the users are willing to experiment,” he says.
For smaller projects, Lexmark asks whether experimenting makes sense from a commercial perspective. If it does, they take a close look at the smaller vendors to make sure they don’t miss out on some of the innovation coming from the broader market. “If it doesn’t work, at least it won’t create a hole in our priorities for this year,” says Gupta. “And if it does work, it’s all upside.”
Large software vendors are used to solving the integration problems that enterprises deal with on a daily basis, says Lee McClendon, chief digital and technology officer at software testing company Tricentis. It’s more common for smaller vendors to deliver point solutions to specific problems that may not have the level of connectivity and robustness that enterprises require. This applies to all technologies, not just AI. “But it’s important to consider whether multiple point solutions in the AI space are worth the management overhead given the complexities of managing data privacy and security in this rapidly evolving field,” he says.
Concerning established vendors, a word of caution from Eric Helmer, CTO for Rimini Street, is that big companies may be bundling and up-charging for AI as part of their standard products. All the major software vendors are putting it into their products, he says. “If they haven’t done so already, it’ll be bundled in their next release,” he says. “Companies that have a lot of legacy applications may find themselves on an AI journey they didn’t ask for. You may go through the evolution of these very disruptive upgrades, only to find out the functionality you got will never be of use.”
Test every vendor’s knowledge of AI
The large enterprise application vendors are not AI companies, Helmer says. “They don’t have the same level of experience as organizations that have been working on AI for the last 10 years. They’re merely fans of AI, playing catch up.”
And when a vendor claims they’re an AI company, IT leaders should question them to make sure there’s a mutual understanding of exactly what that means, according to Nick Durkin, field CTO for software developer Harness. Make sure you know whether they use predictive or generative models. Find out if they rely on somebody else’s models, which they may have very little control over. “There’s a lot of nuance to understanding the AI underneath the covers,” he says.
Moreover, everybody is suddenly claiming to be an AI expert, but there hasn’t been enough time for that many people to truly know the new technology, says McClendon. “We try to figure out who has something legitimate and who just stuck some AI buzzwords in their marketing materials to make it sound like they’re up to date on the latest technology,” he says.
One of the ways McClendon does this is to ask vendors what they were doing several years ago. Their answer may reveal whether they’ve made a conscious investment in AI as opposed to just jumping on something because it’s new. “If they’ve been working on AI for a long time, even though it wasn’t generative AI, they probably have people who were keeping up to date with the technology as it evolved,” he says. “And if it’s new to them, it doesn’t necessarily mean they don’t have something good. But it’s a data point to consider.”
Gupta points out that some vendors make a lot of claims, but then say they need Lexmark’s data. This usually signals the supplier hasn’t developed the model yet. “If they’re going to use all of our data to do a project, then I can do it myself,” he says. “I’m more interested in a vendor who can bring us something that’s proven, rather than one who’s just going to use my data to build a product they sell to somebody else. That takes away our competitive advantage.”
What IT leaders really need to find out is how well a vendor can explain what their product does, which specific problems it solves, and what its success rates are. “If they come back with a success rate of 99%, I don’t believe them,” says McClendon. “AI is not that good yet.”
Find out if they have ways of measuring success and if they follow a methodology that allows them to see that while AI is great technology, it’s not perfect. “You can tell pretty quickly if you start asking the right questions of vendors about their knowledge of AI and what they’re actually doing,” McClendon adds. “It quickly becomes clear if they’re just making stuff up.”
IT leaders should look closely at how suppliers treat data privacy and security in general. McClendon asks companies in what ways their products replace things that humans normally do. “The AI can end up producing results that aren’t in line with your company culture,” he says. “Make sure a vendor considers all of these different areas about your data, your security, and how their AI needs to interact within your company to fit in. They may not have a great solution yet, but you need to see if they’re at least thinking about these things.”
Establish success criteria during the selection process
It’s always a good idea to start with a proof-of-concept (PoC) to demonstrate tangible results before rolling out new technology to a large user base. During the selection process, McClendon makes sure he can get vendor support to do this. He works with them to establish success criteria before starting.
Similarly, Lexmark defines a set of test cases for their projects ahead of time and uses them to help validate vendor claims during the selection process. Gupta asks suppliers how well they can perform against the data points. For example, if a project is for customer care, they might ask a vendor if they can increase the success rate of calls by 20%, and reduce the call duration by 30%. “If you don’t make your success criteria clear from the beginning, you can waste six months to a year for an evaluation, only to find nothing coming out of it,” he says.
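The up-front targets Gupta describes can be made mechanical. The sketch below shows one way to check a vendor’s PoC results against relative-change targets agreed before the evaluation starts; all metric names and numbers are hypothetical illustrations based on the customer-care example above, not Lexmark’s actual criteria.

```python
# Minimal sketch: validating vendor PoC results against success
# criteria defined before the evaluation. All names and numbers
# below are illustrative, not any vendor's or Lexmark's real data.

def meets_criteria(baseline, poc, criteria):
    """Return per-metric pass/fail for relative-change targets.

    criteria maps metric name -> required relative change:
      +0.20 means "improve by at least 20%",
      -0.30 means "reduce by at least 30%".
    """
    results = {}
    for metric, target in criteria.items():
        change = (poc[metric] - baseline[metric]) / baseline[metric]
        if target >= 0:
            results[metric] = change >= target   # must rise enough
        else:
            results[metric] = change <= target   # must fall enough
    return results

# Hypothetical customer-care numbers.
baseline = {"call_success_rate": 0.60, "avg_call_minutes": 10.0}
poc      = {"call_success_rate": 0.75, "avg_call_minutes": 6.5}
criteria = {"call_success_rate": 0.20, "avg_call_minutes": -0.30}

print(meets_criteria(baseline, poc, criteria))
# -> {'call_success_rate': True, 'avg_call_minutes': True}
```

The point is less the code than the discipline: the thresholds are fixed in writing before the PoC begins, so neither side can redefine success after the fact.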
Durkin at Harness says most established vendors are ready to demonstrate the measurable impact of their tools through cost savings or productivity gains. Small vendors should be expected to do the same.
And IT leaders should make user acceptance a key success criterion. “Beyond productivity outcomes, it’s crucial that users like the tools,” he says. “If users find a solution clunky, difficult to use, or less effective than expected, it’s simply not worth moving forward.”
Read More from This Article: 3 musts when recruiting vendors for AI