Experience management is a hot IT service management (ITSM) trend, bringing with it experience level agreements (XLAs). However, if you’ve tried to find an agreed-upon definition of an XLA online, you’ll have found that there isn’t one. What’s more, as with service level agreements (SLAs) and service-level metrics or targets, the term XLA can have multiple meanings. To help, this blog sheds some light on what XLAs are and how best to approach them.
What are XLAs?
The term experience level agreement logically maps to service level agreements – meaning it’s a document of some form. However, as with the term “SLA”, XLA is often used to define the metric or target rather than the actual agreement that houses the metric and other experience-focused attributes.
It might sound trivial. However, it’s important to understand whether agreements or metrics are being talked about when XLAs are mentioned. Take this blog, for example – it could be about experience level agreements, experience level targets, or both. Most organisations focus on the targets, using them to supplement SLAs rather than replacing SLAs with experience level agreements.
The origins of experience level agreements and XLA metrics are usually attributed to a Dutch company, Giarte, which describes the difference using statements such as:
- “SLAs measure deliverables”
- “XLAs measure impact”.
There are other differing attributes, too. For example, SLAs tend to be vertically focused – say, measuring the performance of the IT service desk – whereas XLAs have a horizontal focus, appreciating that an end-user’s experience is formed from interactions with multiple IT or business touchpoints.
Another example is that SLA metrics are usually set for the life of the agreement, perhaps with the targets changing to reflect improvements over time. XLA targets, in contrast, should change regularly to reflect what’s most important to employees and business goals as internal and external factors change (including the impact of experience-focused improvements).
The focus of XLA metrics
As with the lack of an industry-agreed XLA definition, the measurement of experience can also differ across tool vendors and service providers. There are, however, two commonly used experience metrics (which might also be called “experience indicators”). The first is how the service consumer feels about the services they have experienced. This can be based on individual transactions, such as the incidents and service requests handled by the IT service desk. Or it can be periodic feedback that looks at broader IT services such as corporate IT equipment or applications. This metric is usually presented as a happiness score or similar.
The second common XLA metric relates to the perceived time lost due to individual IT issues and service requests. So, for instance, an employee’s perception of their lost productivity might be significantly different to what’s reported by traditional SLA metrics. An extreme example is where the “SLA clock” is repeatedly stopped during the incident resolution process to reflect the incident resolution being out of the IT service desk’s “control” at certain points. The ticket might meet the agreed-upon SLA, but from the end-user’s perspective, it has taken days for them to get the resolution they need to become fully productive again.
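To make the gap between the “SLA clock” and perceived lost time concrete, here’s a minimal Python sketch. All names, dates, and pause reasons are illustrative assumptions, not taken from any particular ITSM tool:

```python
from datetime import datetime, timedelta

def sla_clock_hours(opened, resolved, paused_periods):
    """Elapsed hours counted against the SLA, excluding any
    'clock stopped' periods (e.g. pending-customer or
    awaiting-third-party states)."""
    paused = sum((end - start for start, end in paused_periods), timedelta())
    return (resolved - opened - paused).total_seconds() / 3600

def perceived_lost_hours(opened, resolved):
    """What the end-user experiences: total wall-clock hours from
    raising the ticket to getting a working resolution."""
    return (resolved - opened).total_seconds() / 3600

# Illustrative incident: opened Monday morning, resolved Thursday
# afternoon, with the SLA clock stopped twice along the way.
opened = datetime(2023, 5, 1, 9, 0)
resolved = datetime(2023, 5, 4, 17, 0)
paused = [
    (datetime(2023, 5, 1, 12, 0), datetime(2023, 5, 2, 12, 0)),
    (datetime(2023, 5, 3, 9, 0), datetime(2023, 5, 4, 9, 0)),
]

print(sla_clock_hours(opened, resolved, paused))  # 32.0 hours on the SLA clock
print(perceived_lost_hours(opened, resolved))     # 80.0 hours for the end-user
```

The ticket might comfortably meet a 40-hour SLA target, yet the end-user has waited more than three days – exactly the disconnect the perceived-lost-time metric is designed to surface.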
An important point to appreciate about these and similar XLA metrics is that they address two key issues with SLA metrics:
- SLA metrics have traditionally focused on what IT thinks is most important
- SLA metrics measure performance at the point of service creation, not at the point of service consumption.
The result is that XLAs allow IT service providers to see beyond their operationally focused SLA metrics to understand the end-user perspective of how IT service delivery and support capabilities are helping or hindering them.
7 tips for introducing XLAs
- Gain a shared understanding of experience management and XLAs. This includes how experience management differs from traditional IT performance measurement and whether your organisation will start with experience level agreements or simply XLA metrics. So far, the industry has seen most organisations opting for the latter, if only because it’s the XLA targets and the experience data and insights that will really make a difference to the IT status quo.
- Work with end-users and other business stakeholders to understand “what matters most”. Move from making decisions based on the IT “gut feel” of what’s working and what’s not (perhaps when SLA metrics are showing that all is well) to finding out what end-users in particular need from corporate IT capabilities – for example, their key IT touchpoints and the experiences they require. This might highlight additional XLA metrics beyond the ones outlined above.
- Set clear objectives for the chosen XLA metrics. XLA metrics should aim to improve experiences, so ensure that the desired end state (or at least the direction) is understood before selecting and implementing XLA metrics.
- Define the XLA metrics and initial targets carefully. These XLA metrics or key experience indicators (KEIs) should focus on what matters most to end-user experiences. They also need to be able to provide insights that drive improvement activity. Understanding the difference between XLA and traditional SLA metrics is critical here. For example, while existing customer satisfaction (CSAT) questionnaires might be considered suitable for measuring end-user experiences, they usually aren’t – whether because of their operational focus, timings, low response rates, or other reasons.
- Ensure mechanisms are established to analyse and act on the gathered experience data. This is key to experience management and XLA success – showing that capturing experience data and making the prioritised improvements deliver positive differences to end-user experiences, IT and business operations, and business outcomes. Organisations often find this step harder than the initial capture of experience data (which fit-for-purpose experience measurement tools make relatively easy).
- Share the experience data and insights widely. While the early experience data might be viewed as “skeletons in IT’s closet”, i.e. issues that were previously not identified, reported, and addressed, widely communicating IT’s performance against XLA targets provides transparency that builds trust and encourages more feedback.
- Plan to review XLA metrics regularly. It’s good old-fashioned continual improvement to help ensure that XLA metrics (and experience level agreements if used) remain relevant and effectively improve end-user experiences.
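As a simple illustration of the “set clear objectives”, “define metrics and targets”, and “act on the data” tips above, an XLA metric (KEI) with a target and a direction of improvement might be sketched like this in Python. The metric names, targets, and period scores are all hypothetical:

```python
from dataclasses import dataclass

@dataclass
class KeyExperienceIndicator:
    """A hypothetical XLA metric: what's measured, the current target,
    and which direction counts as 'better' (the clear objective)."""
    name: str
    target: float
    higher_is_better: bool = True

    def meets_target(self, score: float) -> bool:
        # Compare a period's score against the target in the right direction.
        return score >= self.target if self.higher_is_better else score <= self.target

# Two illustrative KEIs based on the metrics described earlier in the blog.
happiness = KeyExperienceIndicator("Average happiness score (1-5)", target=4.0)
lost_time = KeyExperienceIndicator(
    "Avg perceived hours lost per incident", target=2.0, higher_is_better=False
)

# Acting on gathered experience data: flag KEIs that miss their target
# so improvement activity can be prioritised.
period_scores = {
    "Average happiness score (1-5)": 3.6,
    "Avg perceived hours lost per incident": 1.5,
}

for kei in (happiness, lost_time):
    status = "on target" if kei.meets_target(period_scores[kei.name]) else "needs improvement action"
    print(f"{kei.name}: {status}")
```

Because XLA targets should change regularly (as noted earlier), keeping the target as plain data like this also makes the “review XLA metrics regularly” tip straightforward to apply.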
Plus, don’t bin your service level agreements and SLA metrics when starting with XLAs. You might feel comfortable doing so sometime in the future. However, your business stakeholders will likely still want to see the SLA reports until they’re convinced of the importance and power of XLAs.
Finally, while the tail end of this blog has focused on XLA metrics, the tips can also be applied to creating experience level agreements for IT services (that contain XLA metrics).