Category Archives: alsbridge

EU Network Strategy: Don’t Delay Mobile Consolidation


Margot Wall, Managing Consultant – 

Recent moves by the European Union (EU) have opened up the competitive landscape and created opportunities for global enterprises to consolidate network operations across EU countries. Mobile operations are especially promising, and we're seeing a number of client organizations move forward with plans to consolidate and rationalize services from multiple carriers.

At the same time, other enterprises appear to be content to wait until their existing mobile contracts expire before taking advantage of these emerging market opportunities. That would be a mistake. The time to start a consolidation initiative is now, even for organizations with a number of contracts still in force.

Consider: with multiple countries, multiple in-country teams, and at least one provider in each country, getting a consolidation deal to market – and then getting to contract – takes longer than you might expect. If you wait 6 to 18 months to go to market, by the time the project is done another 8 to 18 months have gone by – and now your rates are more than three years old.

By going to market now, however, by the time you've signed new contracts the old agreements you were waiting to expire are ready to migrate to the new deal, instead of sitting at the old rates. Bottom line: you save money sooner.

We’ll be discussing network and telecom issues, contracting strategies and the European marketplace at the 2015 Alsbridge European Vendor Summit, to be held May 19th in London.

WTF(S)?


Jeff Seabloom, Managing Director – 

I’ve recently been involved in a number of Unlimited License Agreement (ULA) negotiations where the contracts were full of arcane, intricate and complex terms and conditions – nothing new there, that’s the nature of the beast. However, I was struck by the fact that on three occasions, my team and I – all industry veterans – encountered no fewer than 20 items that we had never seen in an agreement. More surprising still, in two instances we discovered specific terms that directly negated or contradicted other specific terms – within the same agreement.

This led to several hours of consideration and head scratching, after which we deemed the items in question to fall into the category of “Why the Final Signature?” (WTFS). In other words, why bother? Why spend days and weeks poring over the minutiae of a complex agreement in the belief that this attention to detail is necessary to build a partnership that benefits both parties – only to learn that the agreement is so one-sided in the provider’s favor that very little actual “partnering” is contemplated?

Existential musings aside, the answer is that the Ts & Cs in today’s contracts have to be painstakingly analyzed, parsed and understood in all their complex glory.  Otherwise, clients are likely to have those minutiae used against them later in the contract term.  As hardware and software agreements – particularly ULAs – become increasingly impenetrable, clients need access to narrow, deep and specific expertise around individual vendor licensing strategies and sales techniques.

Lacking that expertise, clients are at risk of signing bad deals. One trap is that the myriad intricacies and multiple price points in the contract come back to haunt you. Customers will sign on to a ULA only to learn after the fact that the additional licenses they expected to acquire may be excluded from the umbrella agreement. Or they’ll realize that vague language on assignment and usage – such as, for example, how “North America” is defined – doesn’t mean what they initially thought.

This “fine print” strategy of the ULA works hand in hand with the “you’re getting a special deal” approach. Vendor account teams plead year-end management pressure to make their numbers, and convince clients they have “leverage” to drive a favorable agreement. The ULA that is offered upfront as a prized concession loses its luster downstream, when specific language and clauses turn out to be no advantage or premium after all.

Bottom line: Negotiating a software contract is a challenging proposition under the best of circumstances. Proceeding without technical and contractual expertise and specialized knowledge of vendor strategies makes it downright scary.

Feeding the Microsoft Money Machine


Louis Pellegrino, Director – 

If you’re a Microsoft Enterprise Agreement customer these days, you might be feeling like a walking ATM. It seems that the more products and licenses you purchase, the harder the sell becomes to buy even more.

The reality is that Microsoft Enterprise sales teams are assigned aggressive growth targets for increasing account spend year over year. And the bigger the account, the greater the pressure to grow revenue from that customer.

The problem is that the opportunity for organic growth simply isn’t there. Many large customers – those with 5,000 devices or licenses – are already well-stocked with core Microsoft products. Ironically, however, it’s the customers with a lot of products – who would seem to have the least need for more – who are most aggressively targeted to drive additional revenue.

One common tactic employed by Microsoft sales teams is pushing new online services that customers either don’t need, aren’t ready for or already have from other providers.

Another ploy is to use a dizzying mix of licensing metrics, shifting pricing models and executive relationships to fragment and confuse the buyer and convince them they’re getting a great value proposition – but only if they Act Now.

Customers who push back can expect to have the compliance card played against them, in the form of threatened software audits or significant price increases.

In addition to finding themselves under constant siege, many customers ultimately come to realize that they don’t have the time, resources or know-how to leverage any of what was sold to them as a great value proposition.

Bottom line: If you’re a large Microsoft customer, be aware that your wallet is at risk.

I’ll be hosting a webinar on Thursday, April 30th at 11 a.m. to discuss how customers can effectively respond to Microsoft compliance audits and sales strategies, specifically focusing on volume licensing agreements.

What Makes Healthcare Different? (Part Two of Two)


Bill Huber, Managing Director – 


The dramatic changes transforming the healthcare industry are having an equally dramatic impact on outsourcing service providers, as payers and providers increasingly focus on new delivery models and the integration of disruptive technologies. I recently spoke with HCL’s Healthcare head, Gurmeet Chahal, about the concept of “Patient Centricity” in today’s environment. Our conversation continues below, as Gurmeet discusses what makes healthcare different.

BH: For you as a service provider, what is different about healthcare from other areas?

GC: We have a very strong domain-led technology capability which is consistent across all of our verticals. In healthcare, we believe that we are unique in the degree to which we work across the entire healthcare ecosystem. This gives us the capability to be front and center. An example is how we are leveraging our strong medical device expertise to create next generation solutions that benefit the patient, payer, provider and device manufacturer.

BH: How are regulatory changes driving increased use of service providers?

GC: The healthcare industry is among the most regulated. New regulations do have an impact on IT services consumption. As an example, ICD-10 has driven growth in IT services, and is expected to have an ongoing impact in areas like RCM going forward, given the complexity of the codes. All of the quality, compliance and regulatory mandates require payers and providers to upgrade their existing IT infrastructure and, in some cases, to build entirely new capabilities.

BH: As applicable to your services, what are common priorities for both payers and providers?

GC: We believe that new business models are emerging that encourage payers and providers to improve collaboration. The first is based on the need to drive distinctive customer experience management; this is what will differentiate both in the long run and drive patient retention levels.
Secondly, claims information alone is insufficient to run effective care management. Payers need to integrate clinical data, lab data and so on, which means they need a flexible, agile and externally focused operating model.
Lastly, both payers and providers have a strong need to reduce cost while improving care quality, and to accomplish this while investing in new capabilities such as analytics, social, mobility and so on.

BH: What are the things that HCL is doing to address these priorities?

GC: HCL’s approach is twofold. First, we leverage our strong technology and process capabilities; secondly, we are investing in frameworks and accelerators where we leverage domain experts. For example, we have come up with a solution that we call Member Experience Management, which allows our customers to build a multichannel engagement and communication strategy. It provides a framework that gives a single view of the customer and drives the customer experience, including a view of workflow, CRM, infrastructure and next generation CTI. Similarly, we have a solution called Population Care Management, which allows providers to engage and drive the medical protocols they have designed for a population pool.

BH: You offer services across infrastructure, applications and business services. Is there a natural evolution among these services when you are engaged with a healthcare client?

GC: It’s very rare that we see a customer take a big bang approach of bundling the whole thing. A lot of times, we get engaged in a business solution kind of discussion. For example, a successful population health management program needs a specialized application, underlying infrastructure, and analytical and business services; in these cases, it’s an integrated solution with all three layers. If you look at the conventional towers of ITO, there was a lot of application development work happening, given the exchange-readiness rush. Currently there is a surge in developing front-end transformation and analytics capabilities. There is a recognition that a lot of cost can be saved by outsourcing basic infrastructure and back office functions like claims processing. While there is need and desire to move on all tracks, there may be a phased approach depending on the customer’s readiness.

BH: What are unique service levels for HCL associated with healthcare? Are any of these outcome-based?

GC: We have a number of outcome-based examples. One of our solutions is a combination of applications and BPO in fraud, waste and abuse, where the contract is linked to recoveries through the process. Another example is revenue cycle management, where we focus on improving customer satisfaction year over year.

BH: Final thoughts?

GC: There is so much change happening in healthcare, but I believe this is a great opportunity for the industry to transform itself and to gain from that change. It is very rare to see any industry witnessing so much change at one time. On a recent airplane ride, I sat next to a retired IT executive. When I explained what I was working on, he said, “I’m really jealous. Your industry is going through so much. Through technology, you can make such an impact on the lives of humans. I wish that I had that opportunity.” That has stuck with me. We should be grateful for this opportunity, and it’s time to make that impact.

What Keeps an Analytics Expert Up at Night (Part II of II)


In part one of this discussion, Alsbridge Managing Director Bill Huber spoke with Paul Burton, Senior VP and Head of Analytics and Research at Genpact, about how the growing use of analytics is redefining Business Process Management. Here’s a continuation of their conversation.

BH: What are examples of how analytics is changing BPM?

PB: BPO grew up 15 to 20 years ago for labor arbitrage reasons. The emphasis is now shifting to process as a service, which has nothing to do with captives or rebadging. It is more of a technology and analytics focus – making delivery smarter so the same group of people can deliver the same service with less. Customers are coming to ask for capabilities. The cultural issue is that clients still expect to save money, even if the provider is delivering new capabilities. In customers’ minds, taking people out of the process should free up resources to allow for the addition of capabilities. They expect more for less, not more for more.

The biggest thing is data. Clients have disparate data systems – CRM, back office banking, GL, finance, sales – and none of them are ever fully integrated. The notion in the ’90s was to build an enterprise data warehouse, but those never really worked: nothing was ever static, and changes to the system were difficult to implement. The idea now is to leave the source systems alone – they are what they are – and simply virtualize the data from multiple systems into a single view, which lets the analyst query the data for whatever purpose is needed, run a report or produce a visualization. The analyst pulls the query as often as needed. These days, compute power is cheap and network is cheap. Data virtualization technologies let you pull together data that used to be hard-wired into the warehouse, so analytics can be done in near real time.
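The “single view over untouched source systems” idea can be sketched in miniature. The example below uses SQLite’s ATTACH to stand in for two separate source systems – a CRM and a billing system, both names hypothetical – and layers a query-time view over them instead of copying anything into a warehouse. Real data virtualization platforms do this across heterogeneous systems, but the shape is the same.

```python
import sqlite3

# Two independent "source systems", left exactly as they are; we only
# layer a virtual, unified view on top instead of copying the data
# into a central warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("ATTACH DATABASE ':memory:' AS crm")
conn.execute("ATTACH DATABASE ':memory:' AS billing")

conn.execute("CREATE TABLE crm.customers (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("CREATE TABLE billing.invoices (customer_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO crm.customers VALUES (?, ?)",
                 [(1, "Acme"), (2, "Globex")])
conn.executemany("INSERT INTO billing.invoices VALUES (?, ?)",
                 [(1, 120.0), (1, 80.0), (2, 45.5)])

# The "single view": a query-time join across systems. Nothing is
# materialized, so every query reflects the sources as they are now.
conn.execute("""
    CREATE TEMP VIEW customer_spend AS
    SELECT c.name, SUM(i.amount) AS total
    FROM crm.customers AS c
    JOIN billing.invoices AS i ON i.customer_id = c.id
    GROUP BY c.name
""")

for name, total in conn.execute(
        "SELECT name, total FROM customer_spend ORDER BY name"):
    print(name, total)
```

Because `customer_spend` is just a view, each query reflects the current state of the sources – the “pull the query as often as needed” property described above.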

BH: Can you speak to specific impacts of analytics in vertical processes?

PB: In banking, risk is the big issue, with the need for stress testing and so forth to satisfy the regulators. Doing the model isn’t good enough; banks need to produce the model, let a third party look at it, and then refine it. In low-margin businesses such as CPG and retail, customer centricity is king: margins are low, so analytics enables you to build scale. In businesses such as high tech and manufacturing, asset optimization is critical, and analytical insights help to predict, mitigate and optimize repair and warranty costs. For technology companies, manufacturers, airlines and oil fields, asset optimization is huge: it enables these companies to reduce reserves for product liability issues and frees up cash on the balance sheet.

BH: How should clients think about the business case for analytics?

PB: Clients need to focus on how analytics will enable a culture change. It’s not sufficient to do some neat math tricks, and it can’t be based on a one-time result. Analytics needs to be embedded into business processes so results are continuous. This kind of culture change requires top-down support, with C-level executives driving the use of analytics. The evidence is out there to support the importance of analytics; the problem is that companies have been spending money and not seeing the expected returns. The only way to spend money smartly is to change the culture so that you realize the benefits you are investing in.

There are some similarities to when companies implemented ERPs.  When companies simply automated ineffective processes, they spent a lot of money with limited rewards. Once they began changing the process, the software became easier to implement and companies started getting returns.  The same thing applies to analytics.

BH: How can advisors such as Alsbridge help to enable more value to buyers of analytics?

PB: Advisors need to develop a view of the world that emphasizes the criticality of culture change. When that is integrated into the advisory services, advisors can play a huge role.

BH: What keeps you up at night?

PB: Not having the right skills early on.  The math and analytics are easy.  It’s the domain skills and business savvy to understand the industry and define the problems that are critical. The other thing that keeps me up is missing a big shift in the industry. Change is constant, and you need to always be aware of new innovations occurring.

BH: Thank you very much!

“Innovation Investments,” Discounts and Monopoly Money



Jeff Seabloom, Managing Director-

As enterprise technology vendors scramble to gain market share, we’re seeing a significant new trend emerging where customers are offered myriad “discounts” on core products. These are presented under the guise of a wide range of nebulous terms such as “special services,” “training,” “consulting,” and – a new term being used quite frequently – “innovation investment.”

I was recently involved in a negotiation where a major hardware vendor’s premium product – which is never discounted – was offered at a significant markdown. This didn’t seem right, so I did some exploring to decompose the actual deliverables, and found that the “free” training and special services added as a special deal incentive were in fact charged elsewhere in the contract. And upon further exploration into what this “training” and these “special services” would entail in terms of vendor resources, it was clear that the answer was: not very much at all.

In other words, the shuffling around of charges created the appearance of a discount, when in fact there was none.

This is all perfectly legal and very common. But the growing use of these “bundles” raises some important issues. The obvious concern is that buyers don’t have an understanding of the entire pricing picture and quote. Perhaps more importantly, this strategy creates an inaccurate representation of the playing field and makes it increasingly difficult for buyers to understand what they’re paying for products and services, and how their fees compare to what the market will and should bear.

Enterprise clients need to take a close look at the details of their agreement terms and demand to know specifically what they are paying for and what they are getting. This includes defining what exactly is meant by terms such as training, special services and innovation investments. That’s the only way to ensure that the incentives vendors are putting on the table aren’t made of monopoly money.

Managing Microsoft Licenses: the Cost of Convenience


Louis Pellegrino, Director ·

If you have an employer-supplied notebook computer, there’s probably an asset tag on the back that your organization uses to track and manage that device during its useful service life.

That asset tag likely places the cost of the device at less than $1,000.  Meanwhile, the various Microsoft programs running on the same notebook, such as Office, Visio and Project, as well as client access licenses such as Windows Server, Exchange Server, SharePoint Server, SQL Server and Systems Center Server, go largely un-inventoried.

The cost of just a few of these Microsoft licenses can well exceed the value of the notebook or server device itself. However, because they’re not physical “things,” software licenses often don’t get the same level of asset management scrutiny and discipline applied to hardware.

Most mainstream business software is licensed via volume agreements, and Microsoft uses several programs to address small, medium and enterprise class customers.  When installed on either PC or server hardware, the software is activated by an installation key.  Prior to the volume licensing approach, these installation keys were typically unique and acted similarly to the asset tag affixed to a notebook.

Software acquired via volume licensing, meanwhile, allows a common installation key to be reused repeatedly, making it difficult for that identifier to serve as an asset tag. The convenience of a common installation key leads many customers to over-deploy Microsoft products – often without any automation in place to track installations and the purchase obligations they trigger.

While this dynamic is positioned under the auspices of convenience and simplicity, the dark side is that Microsoft has, like many other software publishers, dramatically stepped up its auditing and compliance efforts. In fact, many sellers are assigned revenue targets specifically tied to extracting remediation dollars from high-volume customers.

The compliance dance begins with a seemingly benign notice from Microsoft to customers who haven’t kept pace with growth expectations. From there the pressure builds to submit to a Software Asset Management (SAM) engagement, which is a degree less onerous than a full-scale, official audit. Ultimately, Microsoft insists on a comprehensive network scan, using either a sanctioned tool licensed by the customer or one that Microsoft brings to the engagement. In most cases, this scan reveals installations of products far beyond what the customer has licensed and paid for, and starts a difficult negotiation cycle focused on collecting remediation revenue.

Once Microsoft learns that no formal license management solution is in place, the burden shifts to the customer to prove why those licenses are not in service or used for production purposes.  This process can take weeks or months, consuming valuable people cycles.  Ultimately, Microsoft collects significant remediation revenue, even if the negotiated amount might be less than what the original scan indicated was owed.

More importantly, the process gives Microsoft the knowledge that the customer has no license management solution. The result: as when a taxpayer is audited by the IRS, you are placed on an ominous “watch list.”

The customer can turn this entire scenario on its head by demonstrating complete oversight of its license entitlements and deployments.  Many commercial solutions available today are optimized for Microsoft Software License Management.  By providing a robust inventory of volume licensing agreements, deployment landscape and purchase history, along with automated workflows and better controls, these solutions ensure that few to no installations take place in a vacuum.  Moreover, improved oversight reveals countless instances where deployed products are no longer assigned to a device or user and can be returned to available inventory for reallocation to either another device or another user (repurpose vs. repurchase).
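As a rough illustration of the reconciliation such tools automate, here is a minimal sketch (product names, counts and the helper function are invented for illustration) that splits each product’s deployment gap into seats reclaimable from idle installs – the “repurpose vs. repurchase” decision – and a true purchase shortfall.

```python
from collections import Counter

# Hypothetical inventory data: what the volume agreements entitle the
# customer to, what a network scan found deployed, and installs no
# longer assigned to an active user or device.
entitlements = Counter({"Office Pro": 500, "Visio Std": 50, "Project Pro": 40})
deployed = Counter({"Office Pro": 530, "Visio Std": 35, "Project Pro": 40})
idle = Counter({"Office Pro": 25, "Visio Std": 5})

def reconcile(entitled, installed, unassigned):
    """For each product, split any over-deployment into seats reclaimable
    from idle installs (repurpose) and a true shortfall (purchase)."""
    report = {}
    for product in entitled | installed:  # union of all known products
        gap = installed[product] - entitled[product]
        if gap > 0:
            # Reclaim idle seats before buying new licenses.
            reclaim = min(gap, unassigned[product])
            report[product] = {"repurpose": reclaim, "purchase": gap - reclaim}
        else:
            report[product] = {"repurpose": 0, "purchase": 0}
    return report

for product, action in sorted(reconcile(entitlements, deployed, idle).items()):
    print(product, action)
```

Even this toy version shows the leverage point: the 25 idle Office seats cut a 30-license exposure down to a 5-license purchase, and the whole position is documented before Microsoft’s scan ever runs.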

An effective licensing management solution can show Microsoft that they have no leverage to arbitrarily squeeze revenue from a customer for mismanaged product deployments.  Once Microsoft becomes aware that a customer has the power of a licensing management platform in place, the leverage for future negotiations fundamentally shifts back to the customer.

Will RPA Turn Us Into Technology Babysitters?


Michael Fullwood, Director

Boosters of Robotic Process Automation (RPA) assure us that increasingly intelligent software tools will free people from days of drudgery and repetitive tasks and allow us to focus on creative and value-added activities.

While automating routine functions certainly has economic benefits, and while people undoubtedly prefer to do engaging rather than mind-numbing work, the growth of smart machines may also have some unintended consequences.

We are conditioned to think of “technology as an enabler” – people use technology tools to solve problems and do their jobs. But as RPA, cognitive computing and artificial intelligence capabilities continue to evolve, at some point the tools will be doing the bulk of (some) jobs, and it will be people who assume the role of “enabler.”

Consider: today smart tools in a service desk environment can take care of simple and straightforward incidents like password resets. But when a user sends an email or calls with a specific problem, the smart tool very quickly runs out of if/then scenarios or logical sequences. So, unable to solve the problem, the machine kicks it over to the human agent. Enhanced reasoning and problem-solving capabilities of technology will rapidly change that dynamic in the near future; human agents will shrink in number, and their role will be largely limited to handling increasingly rare exceptions, checking code and monitoring systems.
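The handoff dynamic described above can be sketched as a toy rules engine (the rules and messages are purely illustrative, not any vendor’s product): the tool resolves tickets that match a known pattern and escalates everything else to the human agent.

```python
# A toy rules engine standing in for the "smart tool": tickets that
# match a known pattern are resolved automatically; everything else is
# escalated to the human agent.
RULES = {
    "password reset": "Sent self-service password reset link.",
    "unlock account": "Account unlocked after identity verification.",
}

def triage(ticket_text):
    text = ticket_text.lower()
    for pattern, resolution in RULES.items():
        if pattern in text:
            return ("bot", resolution)
    # No if/then rule matched - the machine hands off to its human minder.
    return ("human", "Escalated to service desk agent.")

print(triage("Please help with a password reset"))
print(triage("Excel crashes when I open shared files"))
```

As the rule set grows and reasoning improves, the second branch fires less and less often – which is exactly the shrinking, exception-handling role for human agents described above.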

In other words, at least some of us will become baby-sitters for well-behaved machines.

Leaving aside the macro-implications of this trend in terms of employability, job satisfaction and economic impact, this trend presents an immediate and practical challenge to enterprises and service providers; namely, how to structure the service delivery chain to optimize the role of both the human agent and the smart machines that require increasingly minimal supervision.

Addressing that challenge will require process expertise and new ways of structuring human/technology interaction, as well as effective training and staffing to ensure the right skill sets are in place. (For example, you don’t want a software engineer with 15 years of experience watching a machine all day.)

RPA and intelligent machines are clearly reshaping the outsourcing world and changing the nature of the discussion around not just service delivery and business processes, but around the nature of work itself. Some of the changes underway may not be to our liking, but the reality is they are happening and we have to deal with them.

Oracle’s Data Center Offerings – Not That Simple


Jeff Seabloom, Managing Director

Oracle’s recent introduction of a line of aggressively priced engineered systems presents customers with compelling new options. Clearly, Oracle now becomes a viable player in a very hot space. Existing customers – and it’s becoming increasingly hard to find an enterprise that isn’t an Oracle customer in some fashion – now have a cheaper option. For enterprises not closely tied to Oracle at present, the new offerings represent an attractive alternative.

On the surface, a decision to go with Oracle’s new line might appear to be a no-brainer. But it’s not that simple. Despite the promise of faster, cheaper and simpler, CIOs would be well-advised to resist the temptation to say, “Let’s get some of those today.” Whatever the virtues of Oracle’s products, the implementation of server-based virtualization and database tools into an integrated solution can’t be done in a tactical manner. Indeed, while Oracle is providing lower-cost bundles, and while the components are becoming increasingly commodity-based, the solutions they comprise are anything but. To deliver optimal benefits, the overall solution requires a system-wide, technology-wide strategy, and must include communication with the business and with service providers, as well as extensive homework on options and alternatives.

Enterprises evaluating the new Oracle offerings should consider a wide range of immediate and downstream implications. Are technology processes and practices “enterprise ready”? What happens to the dated technology (servers, mainframes, etc.) that’s already been purchased and implemented? Is there a buy-back opportunity? How will the new solution impact resource units? Will you be able to recycle resource units and older technology already committed to? Specific questions such as these must be addressed in order to understand what the new tools are replacing (and at what cost), how they will work and what will be left over.

Complex stuff, in other words. And the fact that many CIOs don’t have a clear idea of what they have in place to begin with only magnifies the complexity. Bottom line: Oracle’s new product line clearly offers significant benefits and advantages, and the solutions are indeed proven. That said, executives considering the offerings should do so in the context of their integrated IT strategy – a strategy that is too important to trust to a “quick, easy and cheap” solution.

Service Integration: Own it or Outsource it? (A Perspective from London)


Chris Lawn, Director

Service Integration and Management (SIAM) refers to the framework or service wrap that converts a bunch of discrete technology-based “towers” from various suppliers into a set of seamless, business user-oriented, end-to-end services.

Getting SIAM right is imperative in today’s increasingly multi-sourced world. While bringing together a disparate team of specialized providers can certainly yield benefits, successfully orchestrating multiple services from multiple vendors presents a daunting challenge.

One key consideration is how SIAM services – that is, the specific function of managing the multi-vendor service delivery environment – are best delivered.

The two logical extremes for SIAM delivery can be characterized as “provide it in house” on the one hand, and, on the other, to “outsource it.”

Which approach is better? I moderated a debate on this question at a recent event at the Ritz in London, in conjunction with Alsbridge’s expansion in the European market. To summarize:

The case for in-house multi-supplier governance: Customers are best qualified to understand business requirements and direct this understanding to encourage appropriate competition and innovation from providers. Further, customers benefit from building and retaining critical skills and competencies in-house. And, since new multi-source governance models are a natural development of the roles of the existing procurement and contract management teams, it makes sense to retain that function. Ultimately, if the contracts for the various providers on the team include clauses mandating cooperation, end-to-end services and joint innovation, then the client only needs to add a management layer to ensure that the suppliers deliver on their obligations.

The case for third-party multi-supplier governance: Managing a multi-vendor environment is highly complex and specialized work that requires a team with skills and experience that many customers can’t attract and retain. Specifically, third-party suppliers will have already invested in the tools and the offshore back office capabilities necessary to implement cost-effective and reliable governance. Also, given the potentially contentious nature of inter-supplier relations, the governance team must be able to navigate the operational and commercial sensitivities surrounding IP sharing, knowledge transfer and joint ownership of end-to-end SLAs. Finally, an objective and independent third party can play a critical intermediary role to encourage collaboration and mediate when client/supplier conflicts occur.

On the face of things, then, both approaches appear to have merit. So which is better? The answer – as anyone who has ever worked with consultants can guess – is that It Depends…

Specifically, it depends on the nature of the multi-sourced model, and on the maturity level and existing organizational structure of the client organization.

For example, at the London event, executives from large organizations with significant outsourcing experience said they tend to focus on an in-house approach to managing SIAM, since their scale and expertise equip them to field the required numbers of staff with specialized skills. That said, outsourcing part of the SIAM function is frequently a consideration: over half of these executives said they were contemplating the use of a strategic partner to support their SIAM environment, but were unlikely to fully outsource this function.

In contrast, several attendees representing smaller companies from the retail and logistics sectors had either not yet considered SIAM opportunities, or had experienced serious problems during implementation. The consensus here was that specialist advice is needed to support SIAM implementation, and that a SIAM outsourcing partner should be considered by smaller or less mature companies taking their first foray into outsourcing.

In a nutshell, then, there’s no single answer to the question of how best to manage SIAM. But given the prevalence of multi-vendor operating models, it’s increasingly imperative that the question be asked.