IT outsourcing Frameworks – Part 1

October 16th, 2015
Reading Time: 7 minutes

The following is extracted from a report I put together for my Masters recently. I’d like to say it’s based on an initial draft; however, that would suggest that I managed to revise it before submission. I thought that there were some useful pieces of information in here, so rather than just hide it, I thought I’d upload it here for others.

For those that follow me on Twitter, you may have seen part of the conversation:

[Screenshots of the Twitter conversation]

The report was based on backsourcing of IT services and the reasons for it. The constraints were 3000 words and using the University’s set text as the basis for the frameworks – for those that know me, that never flies well; I prefer to stretch things as wide as I can to see what is out there. I wanted to bring in some of Simon’s wisdom and add a multi-dimensional view to the process, something I do in life, however, with the constraints and time restrictions… This was a single pass, written in 2 days and now chopped up into a three-part blog for your reading pleasure. All references will be in the final blog.

TL;DR – Information Technology Outsourcing or ITO can take a number of forms. These forms include Fee-for-service models, Strategic Partnerships and Buy-in-Contracts (VELTRI et al., 2008). Due to the complexity of some of these ITO agreements, there is the risk that something will go wrong and the services will need to be backsourced, either partially or in full, in order to regain control over those services (VELTRI et al., 2008) before working out what to do next.

The report recommends that a flexible, modular and clear contracting model be used, with a strong governance framework in place to manage and oversee the delivery of services. Some of you reading that last sentence will go “ah, duh!” but in reality I’ve seen so many convoluted contracts and models that I want to scream. They work for no one except the consultants who are hired to negotiate them and then later manage them.

 

Introduction

Information Systems (IS) and Information Technology (IT) are now the backbone of the modern business. Today all businesses rely on IS/IT to provide a baseline capability in the market, not only to achieve a competitive edge (CARR, 2004). Outsourcing IS/IT services to an external party requires giving up a degree of control, thus requiring an organisation to decide how much control it wishes to relinquish (BANNISTER, F and Remenyi, D, 2005; CIOINSIGHT, 2012).

The many reasons for outsourcing include cost reduction, improved quality of service, and access to technological expertise (BAHLIA and Rivard, 2005); however, IT outsourcing (ITO) inherently contains an element of risk, and it sometimes leads to undesirable consequences that are the opposite of the expected benefits (BAHLIA and Rivard, 2005).

This report looks at the reasons for backsourcing IT services; the catalysts for doing so; the risks associated with this; and the frameworks that can be used to ensure that the outsource is not only successful but flexible to support future change.

 

Report:

Part 1 – Drivers for Back-sourcing

There are a number of risk factors involved in outsourcing, and management of these risks is crucial to the success of ITO arrangements (ALEXANDROVA, 2015); if not managed appropriately, they lead to the misalignment of expectations and can be the catalyst for backsourcing (VELTRI et al., 2008; LACITY et al., 2009; BAHLIA and Rivard, 2005). These risks can be separated into client outsourcing risks and vendor outsourcing risks.

Client Outsourcing Risks

Client outsourcing risks are those risks associated with an outsourcing arrangement as perceived by the client. Willcocks et al. (1999) in their UK Government LISA case study provide a comprehensive list of client factors (WILLCOCKS et al., 1999, p.290), that if not addressed could cause the breakdown and backsource of IS/IT services.

 

Organisational Maturity, Contracts and Treatments

The following factors can be linked to a lack of maturity and experience in contracting for and managing ‘total’ outsourcing arrangements. Each of these issues is outlined below:

  • Treating IT as an undifferentiated commodity to be outsourced – This applies frameworks and contracting mechanisms uniformly across the IS/IT service. Treating IS/IT services in this manner will not allow the business to get the most out of them (CARR, 2004). It can also lead to unrealistic expectations, with multiple objectives for outsourcing.
  • Difficulties in constructing and adapting deals in the face of rapid business/technical change – This change may be market or organisationally driven and is linked to the risk above. It can create a situation where contracts are incomplete (WILLCOCKS et al., 1999; CARR, 2004).
  • Outsourcing for short-term financial restructuring or cash injection rather than to leverage IT assets for business advantage – This treats IT as a cost centre rather than a strategic enabler, locking in contracts and delivery models that are inflexible and likely to fail in the medium to long term.
  • Poor sourcing and contracting for development and new technologies – This restricts the business’ ability to take advantage of new and emerging capabilities in order to meet business needs as they change to address market demands (CHOU and Chou, 2009).

 

Governance

Relational governance, in addition to contractual governance, is essential for IT outsourcing (ITO) success (LACITY et al., 2009). These are broad practices associated with managing supplier relationships.

  • Lack of active management of the supplier on contract and relationship dimensions – A strong governance framework is critical not only in establishing the ITO, but is requisite for ongoing success. This can be arms-length, integrated or embedded (LACITY et al., 2009);
  • Failure to build and retain requisite in-house capabilities and skills – Outsourcing services requires the retention of some skills associated with the services being outsourced. Losing them creates the risk that the need for the service, and its changing business requirements, will be lost or go unmanaged (TAPPER et al., 2014; LACITY and Willcocks, 2000);
  • Power asymmetries developing in favour of the vendor – This can lead to abuse of the relationship, and to services and capabilities being delivered that are not in line with the expectations and needs of the business (LACITY and Willcocks, 2000).

 

Vendor Outsourcing Risks

Additionally, vendors providing outsourcing services face another set of risks (ALEXANDROVA, 2015, p.754), including:

  • Lack of contract compliance – Clients unable or unwilling to meet their contractual deliverables, such as providing documentation, can cause relational and financial strain (LACITY and Willcocks, 2000; MCKEEN, J D and Smith, H A, 2015);
  • Dependence on the client – The inherent need for direction from the client, their domain knowledge or the perceived importance of IS/IT systems within the business (KAISER and Buxmann, 2012) can precipitate the backsourcing of IS/IT;
  • Miscommunication – As described above under Governance, governance is critical to the successful delivery of ITO services. Communication styles may differ (CROMAR, 2014, pp.124-125, 167) or direction may be miscommunicated (BARKER, J R, 1993), leading to a breakdown in communication (ALEXANDROVA, 2015, p.748); and
  • Globalisation pressures and cost competitiveness – These drive vendors to continuously look for more cost-effective ways to deliver services. Clients who don’t feel they are getting value for money may backsource to their own offshore centres to bring costs down.

 

Grouping of drivers for backsourcing

These issues stem from a combination of the lack of maturity of the outsourcing client organisation (BAHLIA and Rivard, 2005) and inappropriate sourcing models for services (LACITY and Willcocks, 2000). They can be further identified as contract problems, opportunities from internal changes, and opportunities from external changes (VELTRI et al., 2008).

By now you can start to see that issues stem not only from conflict, but from a lack of situational awareness.

In Part 2, I look at considerations for outsourcing.

Service Management in an as-a-service world – Part 2

August 6th, 2015
Reading Time: 4 minutes

This is part 2 of a guest blog I was asked to create for the Service Management Conference. You can find the original here; it was also published in full in the July issue of the itSMF Bulletin.

Why business mapping is critical to effective Service Management and how to get started.

In Part 1 we looked at why the cloud can give the IT service management team more control – not less. Now let’s look at how to use business mapping to provide control and visibility in a world where applications are offered as subscription services, from a multitude of vendors.

Use Business Mapping To Ensure IT Truly Supports the Business

A map looks at the context of complex systems. We’re familiar with technology roadmaps that match short-term and long-term goals with specific technology solutions to help meet those goals, often presented in a diagram. They are designed to help customers (including internal customers) understand the technology, current and future, that is at work in their business. But the technology view is only one part of the puzzle.

In addition to addressing the business’ immediate and projected needs you need to have a larger view of the product/capability that your organisation provides and the market forces that may impact it. The external forces range from market segment growth, competitive situation and your distribution channels through to political, economic and environmental factors – and more. There are also internal forces including the company, customers, suppliers and other constituents. This view is known as a market audit.

A business map takes this to the next level. It starts with identifying the need that the organisation is addressing with its product or service, and the evolution of that product/service from an idea through to a marketable product and eventually a commodity.

Business maps arm the technologist, and the business professional, with information that can be used to understand the overall business’ direction and what factors influence the various capabilities that underpin the central need of the value chain. This holistic view of the business gives context for recommendations and decisions. Hint: get it right and there will be fewer instances of Shadow IT, as you will be able to understand the emerging needs of the business as they relate to its strategy.

Here are six questions to help you start the mapping process:

  1. Where are we now with the business capabilities, supporting processes and technologies?
  2. What is the visibility and value placed on each of these?
  3. Where do we want or need to go with these? Ultimately the drive is to head toward commodity; however, that isn’t always the right answer, as there are sometimes constraints.
  4. How do we get to where we want or need to be?
  5. As the organisation moves from new and novel to commodity, what are your options for sourcing and delivering?
  6. How will we know that we are on track?

If you’d like to know more about business mapping, read my blog or go see Simon Wardley’s blog.
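To make the six questions above a little more concrete, here is a minimal sketch of capturing a map’s answers as data. All the capability names, stage labels and the sourcing rule are my own hypothetical illustration of the idea, not part of any formal mapping tool: each capability gets a position on the evolution axis (where are we now?) and a target (where do we need to go?), and the sourcing suggestion falls out of the target stage.

```python
from dataclasses import dataclass

# Evolution axis, roughly following Wardley's stages:
# genesis -> custom -> product -> commodity.
STAGES = ["genesis", "custom", "product", "commodity"]

@dataclass
class Capability:
    name: str
    current_stage: str   # Q1: where are we now?
    target_stage: str    # Q3: where do we want or need to go?
    visibility: int      # Q2: visibility to the business (1 = low, 5 = high)

    def gap(self) -> int:
        """Q4: how many stages do we need to move along the axis?"""
        return STAGES.index(self.target_stage) - STAGES.index(self.current_stage)

# Hypothetical capabilities for a mapping exercise.
portfolio = [
    Capability("payroll", "product", "commodity", visibility=2),
    Capability("customer analytics", "custom", "product", visibility=5),
]

# Q5: capabilities heading to commodity are candidates for utility /
# as-a-service sourcing; newer, novel ones stay closer to the business.
for cap in portfolio:
    sourcing = "utility / as-a-service" if cap.target_stage == "commodity" else "build or partner"
    print(f"{cap.name}: move {cap.gap()} stage(s), suggested sourcing: {sourcing}")
```

The point of the structure is that sourcing decisions follow the evolution stage of each capability rather than being applied uniformly across all of IT.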

Transparency across multiple vendors

IDC predicts more than 65 percent of enterprise IT organisations globally will commit to hybrid cloud technologies before 2016. This hybrid environment encompasses everything from applications, to platforms to business services, providing the services the business needs dynamically.

So once you’ve mapped your organisation and selected your solutions how do you track and manage service delivery across multiple delivery modes and suppliers? How do you let the business know what is available to it? And how do you encourage the innovation through the adoption of new services?

Integrating the disparate IT and business systems and providing a clear, persona-based view of the services available to the business allows everyone to know what is on offer. Most importantly, this provides a way of tracking and measuring the services, both individually and holistically, as they underpin key business capabilities.
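As a toy illustration of that persona-based view (the services, suppliers and personas here are all invented for the example), each catalogue entry records its delivery mode, supplier and the personas it is offered to, so a single query shows a given user exactly what is available to them:

```python
# Hypothetical service catalogue: each entry lists delivery mode,
# supplier and the personas the service is offered to.
catalogue = [
    {"service": "CRM", "supplier": "VendorA", "mode": "SaaS", "personas": {"sales", "marketing"}},
    {"service": "HR portal", "supplier": "internal", "mode": "private cloud", "personas": {"all-staff"}},
    {"service": "Data warehouse", "supplier": "VendorB", "mode": "IaaS", "personas": {"analyst"}},
]

def services_for(persona: str):
    """Return the catalogue entries visible to a given persona."""
    return [e for e in catalogue if persona in e["personas"] or "all-staff" in e["personas"]]

for entry in services_for("analyst"):
    print(f'{entry["service"]} ({entry["mode"]}, via {entry["supplier"]})')
```

Because the supplier and delivery mode sit on each entry, the same structure supports tracking and measuring services across multiple vendors, not just publishing the catalogue.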

So there’s no need to fear the cloud. Recognise it for what it is – a different way of delivering services that can actually give you more control, not less, provided you take the effort to jump into the driver’s seat and use your map.

NOTE: Original post included corporate product links, I’ve removed them from here and made specific reference to Simon’s blog (which was found through my blog link in the original)

Service Management in an as-a-service world – Part 1

July 30th, 2015
Reading Time: 5 minutes

This is part one of a guest blog I was asked to create for the Service Management Conference. You can find the original here; it was also published in full in the July issue of the itSMF Bulletin.


Why moving to the cloud can give you more control, not less.

What are the opportunities and challenges for the IT service management team in a world where more applications are moving into the cloud, offered as subscription services, from a multitude of vendors? Can you keep control and visibility?

Recently I led a discussion at an itSMF Special Interest Group meeting about IT service management in an “as-a-Service” world – a world where the way IT is procured, delivered and consumed has fundamentally changed with the advent of cloud computing. Not that cloud computing is new by any means, particularly in smaller organisations, but it is now becoming more and more prevalent in large enterprises. Or it is expected to be…

While there has been a lot of hype around “the cloud”, what became apparent at the meeting is that most information is targeted at the executives in high level overviews, or at techies in great technical detail.

Meanwhile, the IT service management team has been left in the cold. There is little clear direction on “how to” or “where to start” and too much hype versus fact. Yet it is the service management team who often has the responsibility to “make it happen”.

In our discussion, which included IT service management professionals from government, financial services and IT vendors, the concerns/queries about service management in a cloud environment were startlingly consistent across industry sectors:

  • What is the best way to monitor and report service delivery?
  • How have other organisations done it?
  • What is hybrid cloud and how do you manage it?
  • How do you manage service integration across multiple vendors?

The Australian Government defines cloud computing as a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.

Interestingly, the itSMF group viewed cloud as a commercial model for delivering IT, rather than a technology. And the overriding concern is that these services are not in their control.

So how does cloud impact the policies, processes and procedures service management uses to plan, deliver, operate and control IT services offered to end-users?

For me it comes down to recognising that while traditional IT procurement has changed, you can still be in control; defining a clear – but flexible – business map for how the technology, processes and people will support the business; and ensuring transparency across multiple vendors.

New Ways of IT Procurement Don’t Have to Mean You Lose Control

Much of the fear of losing control comes from the feeling that IT departments are relinquishing control to third parties because they no longer own the IT and can’t see, touch or grab it. Yet in many ways they have more control than ever, as it is easier to increase or decrease capacity quickly in response to changes in your organisation or the market in which it operates. And, if you choose the right vendor, they should provide you with regularly updated, innovative solutions and contracted service levels, rather than you being locked into a technology that will start to age as soon as you implement it.

Of course it’s not a simple matter of moving everything into the cloud. Sometimes legislative requirements will dictate where data can be stored or who has access to it, which may force an application to be insourced. Or it may depend on the maturity of an organisation’s approach to IT – an immature organisation may refuse to outsource because it is simply fearful of doing so, whereas a mature approach is open to pushing risk outside the organisation.

And not all clouds are the same. A private cloud is used by a single organisation. A community cloud is for the exclusive use of a specific community of consumers with shared concerns (e.g. security requirements or mission). A public cloud is for open use by the general public. And a hybrid cloud comprises multiple distinct cloud infrastructures (private, community or public). Whilst the debate over public vs. private cloud services rages on, in the context of the above and the relative organisational needs and maturity, they all have a place.
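The paragraph above can be boiled down to a rough decision sketch. The rules here are my own simplification of the trade-offs described, not an official framework, but they show how a workload’s constraints point at one of the four deployment models:

```python
def suggest_deployment(data_sovereignty: bool, shared_community: bool, variable_demand: bool) -> str:
    """Toy decision rule for the four cloud deployment models described above."""
    if data_sovereignty and not shared_community:
        return "private cloud"      # single-organisation control over data location
    if data_sovereignty and shared_community:
        return "community cloud"    # shared concerns, e.g. government security requirements
    if variable_demand:
        return "public cloud"       # elastic capacity for open, bursty workloads
    return "hybrid cloud"           # a mix of the above, stitched together

# A workload bound by data-location legislation lands in a private cloud,
# regardless of how elastic its demand is.
print(suggest_deployment(data_sovereignty=True, shared_community=False, variable_demand=True))
```

A real sourcing decision weighs many more factors (cost, maturity, existing contracts), but the shape of the logic is the same: constraints first, elasticity second.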

This feeling of a loss of control can be exacerbated by departments choosing their own systems, easily bought and delivered over the Internet. However, this “shadow IT” should not be feared – instead it should be seen as an indicator that the IT department is not delivering what the business needs. This is why business mapping is so important.

 

Part 2 of this blog will cover why business mapping is critical to ensuring IT and Service Management truly support the business and how to get started.

Waves of innovation

July 25th, 2015
Reading Time: 2 minutes

Today I’ve been reading about McNurlin and Sprague’s “Waves of Innovation” model (2009) for the changing role of IT within an organisation. It is essentially made up of six waves that have been observed over time; it looks like McNurlin started with five and the sixth was added around 2009.

Waves of Innovation

Wave 1 – Reducing Costs – began in the ’60s with a focus on automation

Wave 2 – Leveraging Investments – began in the ’70s with a focus on reusing corporate assets with systems justified on ROI and cashflow

Wave 3 – Enhancing Products and Services – began in the ’80s with the focus on IT being a revenue source through creating a strategic advantage

Wave 4 – Enhancing Executive Decision Making – began in the late ’80s with the emergence of real-time business management systems

Wave 5 – Reaching the Consumer  – began in the ’90s with using IT to communicate directly with users, completely changing the rules of engagement

Wave 6 – Partnership Supply-Chain Management – looks at integration of partners into the supply chain.

The premise is that these are observed waves and that IT appears to be losing some of its traditional responsibilities. I think that this is because the view painted treats IT capability as a uniform blob and not as discrete functions and capabilities. It doesn’t take into account that you have a spectrum from bleeding-edge capabilities through to commoditised offerings at the far end, and that the value each capability or service delivers sits somewhere on the “value” spectrum too.

Delivering value with IT systems requires a clear understanding of the business, the services and capabilities that make it up, and how IT can then support those individual pieces. This one-dimensional view of IT is what holds business back from making smart decisions.

/rant

Leveraging IoT

March 31st, 2015
Reading Time: 7 minutes

Last week I was fortunate enough to attend the AIIA Government conference on “Navigating the Internet of Things”. This is the fourth year that they’ve run a Government-specific conference for sharing experiences and educating people on what is happening with technology in government, locally and globally.

The conference was opened by the Honourable Malcolm Turnbull MP (Minister for Communications), who gave a great summary of the state of affairs with regards to the adoption of Internet technologies and how industry, on the back of initial Government stimulation, is thriving, constantly reinventing itself and driving innovation.

The major theme of the conference was transformation: transformation of cities and how we do things more efficiently, be it resource use, transportation or healthcare. It also reiterated that the emerging IoT world is very much a digitally driven economic world.

 

Resource Use

One example used by Minister Turnbull was water. Water utilities lose 25-50% of water due to leaks and, because repairs are reactive, they are extremely costly. NICTA have created analytics for predicting which pipe is most likely to fail and when, allowing for proactive maintenance and reducing the cost of the service. David Gambrel of NICTA explained how this approach was already being used on the Harbour Bridge, reducing the cost of maintenance tenfold.
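To make the shift from reactive to proactive concrete, here is an entirely hypothetical risk-scoring sketch (not NICTA’s actual model, whose features and weights I don’t know): score each pipe from simple attributes and inspect the highest-risk ones first, instead of waiting for them to burst.

```python
# Hypothetical pipe records: age in years, leak history, and how
# critical the main is to the network (0..1).
pipes = [
    {"id": "P-101", "age": 62, "past_leaks": 3, "criticality": 0.9},
    {"id": "P-202", "age": 15, "past_leaks": 0, "criticality": 0.4},
    {"id": "P-303", "age": 48, "past_leaks": 1, "criticality": 0.7},
]

def risk_score(pipe: dict) -> float:
    """Toy linear score: older pipes with a leak history on critical mains rank highest."""
    return 0.02 * pipe["age"] + 0.5 * pipe["past_leaks"] + pipe["criticality"]

# Proactive maintenance: schedule inspections in descending risk order.
for pipe in sorted(pipes, key=risk_score, reverse=True):
    print(f'{pipe["id"]}: risk {risk_score(pipe):.2f}')
```

A production model would learn the weights from historical failure data rather than hand-picking them, but the operational payoff is the same: a ranked work queue instead of emergency callouts.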

Energy use and smart lighting, which make up approximately 25-50% of government energy budgets, were another area explored. Transforming lighting to smarter, LED-based technologies can significantly reduce the cost and use of energy. One idea posed was equipping smart lamp posts to double as charging stations, creating an opportunity for governments and councils to offer charging to electric cars and a new source of revenue.

 

Transportation

Another example of resource use is roads. In Australian cities, congestion on roads accounts for 4.26B wasted working hours, said Minister Turnbull. Connected vehicles for traffic management could solve some of this. One of the biggest hurdles to date has been getting real-life data rather than driver opinions. As the cost of sensors and integrated chips continues to drop, live monitoring of services becomes more feasible, especially when we include feeds from the likes of Google Traffic, helping us understand how roads, as a resource, are used.

Steve Leonard, Executive Deputy Chairman of the Infocomm Development Authority of Singapore (IDA), presented the challenges of the Singaporean government and their approach to transportation – how to use infrastructure more efficiently. In summary: support the adoption of smart cars to essentially allow them to be packed closer together, potentially doubling the capacity of existing roads. This was supported by Susan Harris, CEO of Intelligent Transport Systems Australia, whose research suggested that up to 40% more cars could be put on roads if we had automated cars with smart telemetry capability.

Lutz Heuser, CTO of the Urban Institute, presented his Institute’s reference architecture for future cities (see below). This was part of a wider view that, to be successful, governments need to create a new service infrastructure of data streams and analytics – a new utility provided by governments: cloud-based, open and real-time.


Smart Cities Reference Architecture – Lutz Heuser, Urban Institute, 2015

Finally, Uber’s Melbourne General Manager explained how IoT, and the marketplace they created using that technology, allowed them to extend and supplement the public transport system.

The real future of the connected or “smarter” city will have smart and autonomous vehicles, providing better use of existing transport systems and allowing for denser, more efficient use of vehicles on already crowded routes – all enabled by sensors that feed large inter-connected systems that make sense of the data.

 

Healthcare

Again, Steve Leonard (IDA) explained the problem they have in Singapore. An urban density of approximately 8,000 people per square kilometre means that they are not only far denser than Australia or the U.S., but that they need to reconcile the projected needs of the population with the real estate available. Singapore, like most countries, has an ageing population: people are living longer and the birthrate is slowing. This change in demographics has caused them to look at the statistics of hospital stays. If you are over 65 you are likely to stay 30% longer in hospital, which has a huge impact on hospitals and the projected number of future hospitals needed to support the population. Given geographic and economic constraints, Singapore cannot build as many hospitals as it needs, nor could it staff them. Additionally, their studies have shown that 20% of patients contribute 80% of re-admissions. So how can they offload chronic care and focus on triage and emergency care? They’ve looked to technology, leveraging their fibre network reach (1GB to each home) and eHealth technologies with in-home care to offload demand.

Dr James Freeman is CEO of GP2U.com.au, a telehealth business delivering services in Australia via video-conference so patients don’t have to physically see a doctor and can have scripts filled and ready for pickup from a local pharmacy. With the proliferation of sensors and cameras in consumer devices, they are able to deliver some consultation services remotely, never having to physically see a patient. Dr Freeman pointed out that adoption has been slow to date, a combination of no financial incentive to take up these services and legislation being slow to catch up to technology. The financial incentive model is absolutely necessary, as there is little chance people will use these services of their own accord. I’ve recently seen this with my father-in-law, who was issued a blood pressure monitor by his health insurer. Each measurement is logged and sent directly to the provider for them to track his health. He rarely does it, as there is as yet no incentive to do so: no lowering of his premium or rebate for his troubles.

 

Government as a Service – The new Utility

What all of these presentations and discussions showed is that the future for Government is providing data as a service. Today the DTO is working on improving the way government delivers services, with the end goal of speaking to customers as one public sector and delivering services on common platforms. Data.gov.au will continue to be developed and invested in.

This view was echoed by Ros Harvey, Chief Strategist and Advisor for KEI and Sirca: government as a platform is the future, getting the community to innovate on top of the services and data that government supplies. It was reiterated by Pia Waugh of the Department of Finance, who has been working for years towards the goal of “Government as an API” and creating the mashable government – making what government does more available regardless of agency or jurisdiction.

If the Australian Government can continue the work it has started, it will be well on its way to making Australia the world’s leading digital economy, an aspiration of Minister Turnbull.

 

How to make IoT successful.

The resounding theme was that integrity and security will be important as IoT proliferates. Security must be the foundation of any platform (Brian McCarson, Intel) and approached from an epidemiological standpoint (Turnbull), using high-level pattern analysis and large-scale data analytics to see trends and changes in the system.

 


Fundamental Tenets of IoT, Intel, 2015

 

Conclusion

IoT is breaking through the novelty and into the mainstream with the backing and support of Government. As more and more sensors find their way into roads, waterways, infrastructure components and government systems, this data, raw and refined, will become the new economy that governments will not only collect revenue from, but use to manage and shape the policies of the future. Using this knowledge and mapping the ILC cycle will help businesses (and government) understand how to leverage the innovation and properly commoditise the services needed.