When you look at new technologies, are you like a kid in a candy store, excited to try every latest innovation? Perhaps a leader in your organization is a technology gambler, ready to pick vendors without sufficient analysis and due diligence? Or maybe the procurement manager, the project management office, or business stakeholders put tech picks through such exhaustive evaluation that your organization is left in innovation's wake and stuck in the mud with legacy platforms?
These technology buying personas are found in many organizations, and they can undermine the ability of tech leaders to make smart and timely technology selections. Haphazard tech selection leads to wasted effort and technical debt, while overly methodical approaches slow the pace of innovation and thwart experimentation, smart risk-taking, and agile cultures.
These personas can derail your technology decision process in all sorts of ways, from bogging down your organization's technology evaluation process to impairing the decision-making around when to invest in technologies and which products or services to consider. Here are 12 anti-patterns to watch out for. If you want to make smart technology decisions, then don't do the following:
Accept executive input as a final decision
When the CEO or another influential executive asks the technology team to buy and implement a specific tech solution, it's important to take a few steps backward to understand the rationale. What problem is this leader trying to solve, and how well does the solution meet expectations? All too often, I see tech leaders accept the executive's voice as an edict and fail to take steps to rationalize the process or present alternatives.
One solution is to create the discipline of drafting and presenting one-page vision statements that focus on a problem, opportunity, or value proposition. Well-crafted vision statements define goals but are not prescriptive about solutions or implementations. Even when the tech team fills this out on behalf of the executive, it often leads to a discussion and debate on multiple options.
Fail to solicit or consider customer input
As technologists, we sometimes make the same mistakes that executives make when jumping into implementations. We see the problem, we know a solution, and a sense of urgency drives us to implement the fix. Unfortunately, by not including the customer's voice in the decision-making process, or not understanding the benefits (or lack thereof) to the customer, we can easily deliver capabilities that miss the mark. Sometimes organizations even fail to formally define who the customer is for certain technology initiatives.
Defining a customer is easier when you are developing end-user applications, because you can define roles and personas. But finding a customer role can be more challenging when considering back-end capabilities, including infrastructure, security capabilities, middleware, libraries, or web services. But technologists are part of the business too. Architects, business analysts, or technology leads can serve as proxies for the customer role when implementing back-end technologies. Ask them to provide requirements, identify acceptance criteria, make decisions on trade-offs, and rate their satisfaction with the implemented solution.
Ignore existing standards and technologies
Historically, tech departments have struggled with creating and maintaining documentation and with communicating and managing standards. So, when an urgent request or top requirement surfaces, we're more likely to seek new solutions rather than investigate and reuse existing capabilities.
This approach often leads to redundant capabilities, half-developed solutions, and mushrooming technical debt. Adding a "research internal solutions" step before or as part of investigating new options is a simple discipline that can improve reuse. When people propose new technologies, create a process for estimating upgrades to legacy platforms or consolidating technologies with similar capabilities.
Foster a one-vendor, one-approach tech culture
Ever hear someone state emphatically, "We're an x shop," as a way of curbing any research, analysis, and consideration of alternative vendors or technologies? It's one thing to have standards and preferred vendors. It's another to be unaware of third-party capabilities and to stymie discussion of alternatives.
Allowing the voices of a few strong platform advocates to drown out any exploration and experimentation can lead to costly mistakes. Technology leaders should openly address this cultural anti-pattern, especially if it's suppressing people from asking questions or challenging status-quo thinking.
Presume build or buy is the only choice
There's a wide gray zone between building solutions with custom code and buying SaaS or other technologies that provide out-of-the-box capabilities. In between are highly configurable low-code and no-code platforms, commercial partnerships, and opportunities to leverage open source technologies.
So build versus buy is an oversimplification. A better set of questions is whether the required capabilities help differentiate the business and what types of solutions deliver more innovation and flexibility over the long run.
Assume APIs meet integration needs
Most modern SaaS and even many enterprise systems offer APIs and other integration options. But cataloging integration hooks should be only the start of the investigation of whether they meet business needs. What data does the API expose? Are the desired views and transactions supported? Can you easily connect data visualization and machine learning tools? Does the API perform sufficiently, and are there underlying usage costs that need consideration?
Approaches to accelerating reviews of integration capabilities include these three ways to validate APIs and leveraging low-code integration platforms.
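One lightweight way to start this API investigation is an agile spike that checks a captured response for the fields your desired views and transactions require. Here is a minimal sketch; the response shape, field names, and `orders` collection are hypothetical examples, and a real evaluation would run the same check against responses from the vendor's sandbox API.

```python
# Hypothetical API response captured during an evaluation spike.
sample_response = {
    "orders": [
        {"id": 101, "status": "shipped", "total": 42.50},
        {"id": 102, "status": "pending", "total": 17.25},
    ],
    "page": 1,
    "has_more": False,
}

# Fields the business views and transactions depend on (assumed names).
REQUIRED_FIELDS = {"id", "status", "total"}

def missing_fields(payload, collection, required):
    """Return the set of required fields absent from any record in the collection."""
    missing = set()
    for record in payload.get(collection, []):
        missing |= required - record.keys()
    return missing

gaps = missing_fields(sample_response, "orders", REQUIRED_FIELDS)
print("missing fields:", gaps or "none")
```

Extending the same spike to time the round trip and count billable calls helps answer the performance and usage-cost questions as well.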
Fail to perform social due diligence
When we’re confronted with a long list of possible solutions, trusted information sources can help us narrow the playing field. Reading blogs, white papers, reviews, and research reports, and watching webinars, keynotes, and online tutorials are all key learning steps. But one tool often left out is leveraging social networks to consult with experts. Two places to start include IDGTechTalk and #CIOChat, where many experts will provide advice and share alternative solutions.
Skip the proof of concept
The art, craft, and science of selecting technologies involves designing and executing proof-of-concept solutions (PoCs) that validate assumptions and test for key strategic requirements. PoCs are particularly important when validating emerging technologies or evaluating SaaS platforms, but even using agile spikes to review third-party technology components helps accelerate decision-making and avoid expensive mistakes.
The biggest mistake may be skipping the PoC, either because you believe what you’ve read, you trust the vendor, or you face too much time pressure. Even when a PoC green-lights a technology, what you learn from the PoC can help you steer priorities to feasible implementations.
Develop elaborate decision matrices
When many people are involved in reviewing and evaluating new tools and technologies, one common approach to help drive a data-driven decision is to create a decision matrix spreadsheet. Features and capabilities are weighted by importance, then rated by a review committee. The spreadsheet calculates the aggregate scores.
Unfortunately, these tools can get out of hand quickly when too many people are involved, too many features are chosen, or arbitrary weightings are assigned. The spreadsheet ends up prioritizing its author’s preferences, and people lose sight of what needs to be evaluated strategically by reviewing all of the bells and whistles.
Before embarking on a decision matrix, take a step back. Consider distilling the characteristics of the solutions down to the essence of the business problem, rather than requiring long lists of features to be evaluated by too many reviewers.
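To see how easily arbitrary weightings can dominate the outcome, consider a minimal decision-matrix calculation. The vendors, features, weights, and ratings below are all hypothetical; note how one reviewer nudging a single weight flips the winner.

```python
# Weights assigned by importance (1-5) -- illustrative numbers only.
weights = {"ease_of_use": 5, "integration": 3, "cost": 2}

# Committee ratings (1-5) per vendor per feature -- hypothetical vendors.
ratings = {
    "Vendor A": {"ease_of_use": 4, "integration": 3, "cost": 5},
    "Vendor B": {"ease_of_use": 5, "integration": 4, "cost": 2},
}

def weighted_score(vendor_ratings, weights):
    """Aggregate score: sum of weight * rating over all features."""
    return sum(weights[f] * vendor_ratings[f] for f in weights)

scores = {v: weighted_score(r, weights) for v, r in ratings.items()}
# Vendor B wins: {'Vendor A': 39, 'Vendor B': 41}

# Bump the cost weight from 2 to 4 and the result reverses.
alt_weights = {"ease_of_use": 5, "integration": 3, "cost": 4}
alt_scores = {v: weighted_score(r, alt_weights) for v, r in ratings.items()}
# Now Vendor A wins: {'Vendor A': 49, 'Vendor B': 45}
```

That sensitivity to the weights, which are rarely chosen with any rigor, is exactly why a sprawling matrix tends to reflect its author's preferences rather than a strategic evaluation.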
Ignore long-term architecture, lifecycle, and support considerations
I’m a big proponent of evaluating technologies based on ease-of-use and time to value, but that doesn’t mean longer-term architecture, maintenance, and support considerations aren’t important or don’t require evaluation.
The key is to decide when to evaluate them, what are the key considerations, who will be involved in the review, and how long to invest in the assessment. A good way to do this is to separate the gating concerns that tech teams should consider at the start of an evaluation from the longer-term factors that should be inputs to the decision-making process.
Omit SLA, data protection, and security reviews
Time pressure or (blind) faith in your chosen technology is a poor excuse for skimping on reviews of service level agreements (SLAs) and evaluations of vendor security and data protection practices. The key to doing these reviews well is having the necessary expertise, negotiation skills, and tools—and an efficient evaluation process, so that technologists and business sponsors don't perceive the reviews as bottlenecks.
Larger organizations that perform SLA, data protection, and security reviews in-house must be time-efficient and focus their efforts on aligning the evaluation with the top risks. Smaller companies with insufficient expertise should seek outsiders with expertise in the solution domain.
Delay financial and legal reviews
Last on my list, but certainly not least, are financial and legal reviews. The anti-pattern here is waiting too long to bring in the experts needed to conduct them.
Consider that many SaaS offerings, API services, and cloud-native technologies have consumption-based pricing models, and the operating costs may not meet budget or financial constraints. Legal reviews are particularly important for companies in regulated industries or companies that operate globally, and reviewing compliance factors in both cases can be especially time-consuming. For both financial and legal reviews, delays can be costly.
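With consumption-based pricing, even a back-of-the-envelope projection can surface a budget problem before the contract stage. The rate card, call volume, and budget figures below are hypothetical placeholders for a real vendor's pricing page:

```python
# Hypothetical consumption-based rate card for a SaaS API offering.
price_per_1k_calls = 0.40    # USD per 1,000 API calls (assumed)
calls_per_month = 12_000_000  # projected monthly usage (assumed)
platform_fee = 1_500.00       # flat monthly platform fee (assumed)

monthly_cost = platform_fee + (calls_per_month / 1_000) * price_per_1k_calls
annual_cost = monthly_cost * 12

annual_budget = 60_000.00  # the budget line this must fit within (assumed)
within_budget = annual_cost <= annual_budget

print(f"annual cost: ${annual_cost:,.2f}, within budget: {within_budget}")
# With these numbers the projected annual cost is $75,600 -- over budget,
# which is exactly the kind of finding a delayed financial review misses.
```

Running a projection like this at the start of an evaluation, with finance supplying the budget constraints, keeps a promising technology from failing the financial review after the team is already committed.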
Don’t wait until the end of the technology review process to bring in financial and legal expertise. My advice is to bring them in at the start and ask them to weigh in on what will need reviewing early on—before any technology selection decisions are made. Further, don’t overtax your financial and legal resources by having too many evaluations in progress at once.
Trying to juggle multiple technology evaluations is unrealistic for many companies, and leaders should prioritize their shopping efforts. If they do, I promise you that smart, comprehensive, and efficient technology reviews are possible.
Copyright © 2021 IDG Communications, Inc.