Overview

Analysis overview and configuration

Configuration

Analysis Type: SLA Analysis
Company: Support Operations
Objective: Analyze support ticket resolution times, SLA compliance rates, and identify performance bottlenecks across priority levels, ticket types, and channels
Analysis Date: 2026-02-28
Processing ID: test_1772308485
Total Observations: 2,769

Module Parameters

Parameter | Value
sla_thresholds | 4, 8, 24, 48
time_unit | hours
resolution_bins | 0, 1, 4, 8, 24, 48, 72, 168
min_group_size | 5
significance_level | 0.05
alpha | 0.05
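The sla_thresholds parameter maps to per-priority SLA targets in hours (Critical: 4, High: 8, Medium: 24, Low: 48, per the priority breakdown later in the report). A minimal sketch of how a breach flag could be derived from these targets, assuming hypothetical column names `priority` and `resolution_hours`:

```python
import pandas as pd

# Per-priority SLA targets in hours, as stated in the report
# (Critical: 4, High: 8, Medium: 24, Low: 48).
SLA_HOURS = {"Critical": 4, "High": 8, "Medium": 24, "Low": 48}

def flag_breaches(df: pd.DataFrame) -> pd.DataFrame:
    """Attach the SLA target and a breach flag to each ticket.

    Column names ('priority', 'resolution_hours') are hypothetical.
    """
    out = df.copy()
    out["sla_target"] = out["priority"].map(SLA_HOURS)
    out["sla_breach"] = out["resolution_hours"] > out["sla_target"]
    return out

# Toy data: one ticket per priority tier.
tickets = pd.DataFrame({
    "priority": ["Critical", "High", "Medium", "Low"],
    "resolution_hours": [6.0, 7.5, 12.0, 50.0],
})
flagged = flag_breaches(tickets)
```

With this flag in place, the overall breach rate is simply the mean of the `sla_breach` column.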
SLA analysis for Support Operations

Interpretation

Purpose

This SLA analysis evaluates support ticket resolution performance across 2,769 resolved tickets, examining compliance rates, resolution speed, and operational bottlenecks. The analysis identifies which priority levels, ticket types, channels, and customer segments experience the highest SLA breaches to guide operational improvements.

Key Findings

  • Overall Breach Rate: 29% (802 of 2,769 tickets) - Nearly one-third of resolved tickets miss SLA targets
  • Critical Priority Crisis: 66.3% breach rate with 481 breaches across 726 tickets—the dominant performance driver
  • Resolution Speed: Median of 6.7 hours with P90 at 16.15 hours shows wide variability; 42% of tickets take 8+ hours
  • Satisfaction Disconnect: Average rating of 2.99/5.0 with virtually zero correlation to resolution time (-0.001), suggesting quality issues beyond speed
  • Channel Consistency: All four channels perform similarly (28.7–29.2% breach rates), indicating systemic rather than channel-specific problems

Interpretation

The analysis reveals a priority-driven performance crisis. Critical tickets breach at 2.3× the overall rate, while Medium and Low priorities show zero breaches, suggesting SLA targets may be misaligned with actual capacity. The weak satisfaction-resolution correlation (-0.001) further suggests that closing the speed gap alone will not improve customer experience.

Data Preparation

Data Pipeline

Data preprocessing and column mapping

Data Quality

Metric | Value
Initial Rows | 8,469
Final Rows | 2,769
Rows Removed | 5,700
Retention Rate | 32.7%
Processed 8,469 observations, retained 2,769 (32.7%) after cleaning
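The retention figures above correspond to a single filtering step: keeping only resolved tickets. A minimal sketch of that step, assuming a hypothetical `status` column with a "Resolved" label (toy counts, not the real data):

```python
import pandas as pd

def filter_resolved(df: pd.DataFrame, status_col: str = "status"):
    """Keep resolved tickets only and report the retention rate (%)."""
    resolved = df[df[status_col] == "Resolved"].copy()
    retention = len(resolved) / len(df) * 100
    return resolved, retention

# Toy frame with the same 32.7% retention ratio as the report.
raw = pd.DataFrame({"status": ["Resolved"] * 327 + ["Open"] * 673})
resolved, retention = filter_resolved(raw)
```

Logging both counts at this step makes the survivorship caveat discussed below auditable.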

Interpretation

Purpose

This section documents the data filtering applied to the ticket dataset before SLA analysis. The 67.3% row removal rate is substantial and directly impacts the scope of conclusions—the analysis reflects only resolved tickets, excluding unresolved cases that may exhibit different SLA patterns. Understanding this preprocessing is critical for assessing whether findings generalize to the full ticket population.

Key Findings

  • Retention Rate: 32.7% (2,769 of 8,469 rows retained) - Analysis is based exclusively on resolved tickets, excluding 5,700 unresolved cases
  • Rows Removed: 5,700 observations filtered out, likely representing open or in-progress tickets not yet closed
  • No Train/Test Split Documented: The analysis uses the entire retained dataset without explicit cross-validation partitioning, limiting statistical robustness assessment

Interpretation

The heavy filtering toward resolved tickets creates a survivorship bias in the SLA analysis. The 29% breach rate and 7.74-hour average resolution reflect only completed cases; unresolved tickets may have different characteristics (longer durations, higher complexity). This preprocessing choice aligns with the stated objective to analyze SLA performance on closed tickets, but excludes insights into tickets stuck in workflow or pending closure—potentially the most problematic cases.

Context

The absence of a documented train/test split suggests this is a descriptive analysis of the full retained dataset rather than a validated predictive exercise; findings should be read as observational summaries of resolved tickets, not as claims about the full ticket population.

Executive Summary

Key Metrics

total_tickets: 2,769
breach_rate: 29%
avg_resolution_hours: 7.74
median_resolution_hours: 6.7
p90_resolution_hours: 16.15

Key Findings

Category | Finding | Impact
Overall SLA | Overall SLA breach rate is 29.0% (802 of 2,769 resolved tickets) | Medium
Resolution Speed | Median resolution time is 6.7 hours (P90: 16.1 hours) | Info
Priority | Critical priority has the highest breach rate at 66.3% (726 tickets) | High
Ticket Type | Cancellation request tickets have the highest breach rate at 31.4% | High
Top Bottleneck | Priority: Critical has the highest impact score (481.3): 66.3% breach rate across 726 tickets | High
Ticket Subject | Hardware issue has the highest breach rate at 36.1% (183 tickets) | High
Demographics | Age group 36-45 has the highest breach rate at 29.9% (532 tickets) | Info
Satisfaction | Average satisfaction rating is 2.99 out of 5.0 | Info

Summary

Bottom Line: SLA compliance needs improvement with targeted interventions; 29.0% of 2,769 resolved tickets breached SLA targets.

Key Metrics:
- Breach Rate: 29.0% (802 tickets)
- Median Resolution: 6.7 hours
- P90 Resolution: 16.1 hours

Recommendations:
- Prioritize the top-ranked bottleneck categories for process improvement
- Review SLA targets for alignment with actual resolution capabilities
- Investigate root causes for high-breach categories
- Consider staffing adjustments for high-volume, high-breach categories

Interpretation

EXECUTIVE SUMMARY: SLA PERFORMANCE ANALYSIS

Purpose

This analysis evaluates ticket resolution performance against Service Level Agreement (SLA) targets across 2,769 resolved tickets. The objective is to identify compliance gaps and understand which operational factors drive SLA breaches, enabling targeted improvement efforts.

Key Findings

  • Overall Breach Rate: 29% (802 of 2,769 tickets) — Nearly one-third of resolved tickets miss SLA targets, indicating systemic compliance challenges
  • Critical Priority Bottleneck: 66.3% breach rate on 726 critical tickets with 4-hour SLA — The most severe performance gap, representing 481.3 impact score
  • Resolution Speed: Median 6.7 hours with P90 at 16.1 hours — Wide variance suggests inconsistent handling; 42% of tickets exceed 8 hours
  • Secondary Bottlenecks: Hardware issues (36.1% breach), cancellation requests (31.4% breach), and high-priority tickets collectively account for majority of breaches
  • Satisfaction Disconnect: Average rating of 2.99/5.0 shows minimal correlation with resolution speed (-0.001), suggesting other factors drive dissatisfaction

Interpretation

The 29% breach rate reflects a structural misalignment between SLA targets and operational capacity: breaches concentrate almost entirely in the Critical and High tiers, whose 4- and 8-hour targets sit below the 6.7-hour median resolution time, while the 24- and 48-hour tiers show no breaches at all.

Table 4

SLA Performance KPIs

Key SLA performance indicators including breach rate, resolution time statistics, and overall compliance metrics

Metric | Value
Total Tickets (Resolved) | 2,769
SLA Breaches | 802
Breach Rate | 29%
Avg Resolution (hrs) | 7.74
Median Resolution (hrs) | 6.7
P90 Resolution (hrs) | 16.15
Std Dev (hrs) | 5.61
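The KPIs in this table can all be derived from two per-ticket series: resolution hours and a breach flag. A hedged sketch in pandas (the function name and rounding choices are illustrative, and the toy data below is not the report's data):

```python
import pandas as pd

def sla_kpis(resolution_hours: pd.Series, breached: pd.Series) -> dict:
    """Compute headline SLA KPIs: counts, breach rate, and the
    mean / median / P90 / standard deviation of resolution time."""
    return {
        "total_tickets": int(len(resolution_hours)),
        "sla_breaches": int(breached.sum()),
        "breach_rate_pct": round(breached.mean() * 100, 1),
        "avg_hrs": round(resolution_hours.mean(), 2),
        "median_hrs": round(resolution_hours.median(), 2),
        "p90_hrs": round(resolution_hours.quantile(0.90), 2),
        "std_hrs": round(resolution_hours.std(), 2),
    }

hours = pd.Series([2.0, 4.0, 6.0, 8.0, 10.0])
breach = pd.Series([False, False, True, True, True])
kpis = sla_kpis(hours, breach)
```

Note that `Series.std` uses the sample (n-1) denominator and `quantile` interpolates linearly, so results can differ slightly from tools using other conventions.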

Interpretation

Purpose

This section establishes the baseline SLA performance across all resolved tickets, providing a high-level view of compliance and resolution efficiency. Understanding these aggregate metrics is essential for identifying whether systemic issues exist and for contextualizing performance variations across priority levels, ticket types, and channels examined in subsequent analyses.

Key Findings

  • Overall Breach Rate: 29% (802 of 2,769 tickets) — Nearly one-third of resolved tickets exceeded their SLA targets, indicating moderate compliance challenges
  • Median Resolution Time: 6.7 hours — Half of all tickets resolve within this timeframe, suggesting reasonable baseline performance
  • P90 Resolution Time: 16.15 hours — The slowest 10% of tickets take more than double the median, revealing a long tail of extended resolutions
  • Mean vs. Median Gap: 7.74 vs. 6.7 hours — The modest difference (1.04 hours, with the mean above the median) indicates mild right skew rather than extreme outliers

Interpretation

The 29% breach rate represents a significant operational gap, with roughly 1 in 3 tickets failing to meet SLA commitments. The tight clustering around the median (6.7 hours) suggests most tickets resolve predictably, but the P90 threshold reveals a meaningful subset experiencing substantially longer resolution cycles. This distribution pattern indicates the breach problem is not uniform: it is concentrated in the long-tail subset of slow resolutions.

Figure 5

Resolution Time Distribution

Distribution of ticket resolution times across time buckets to identify where most tickets cluster

Interpretation

Purpose

This section reveals how ticket resolution times cluster across the organization, identifying whether most tickets resolve quickly or if significant delays are common. Understanding this distribution is critical for assessing SLA compliance patterns and identifying whether the 29% breach rate stems from systemic delays or isolated problem cases.

Key Findings

  • Median Resolution Time: 6.7 hours – half of all tickets resolve within this window, indicating generally responsive performance
  • P90 Resolution Time: 16.15 hours – the slowest 10% of tickets extend well beyond the median, showing a right-skewed tail
  • Standard Deviation: 5.61 hours – moderate variability suggests some consistency, but with meaningful outliers
  • Distribution Pattern: 42% of tickets fall in the 8–24 hour bucket (the most populated bucket), while only 7.6% resolve in under 1 hour; the remaining 50.4% cluster in the 1–8 hour range

Interpretation

The distribution reveals two populations: a core group resolving efficiently within 8 hours, and a substantial tail (42%) extending into longer timeframes. This explains the 29% overall breach rate—while median performance is solid, the extended tail creates systematic SLA violations, particularly for priority levels with tight targets (Critical: 4-hour SLA). The moderate standard deviation understates the impact of this tail, because breaches depend on per-priority targets rather than overall spread.
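The bucketing behind this figure follows the module's resolution_bins parameter (0, 1, 4, 8, 24, 48, 72, 168 hours). A sketch of how the per-bucket shares could be computed with `pd.cut` (toy data, not the report's distribution):

```python
import pandas as pd

# Bin edges from the module's resolution_bins parameter.
BINS = [0, 1, 4, 8, 24, 48, 72, 168]

def bucket_shares(resolution_hours: pd.Series) -> pd.Series:
    """Percentage of tickets per resolution-time bucket, in bin order."""
    buckets = pd.cut(resolution_hours, bins=BINS, right=True)
    return (buckets.value_counts(normalize=True, sort=False) * 100).round(1)

hours = pd.Series([0.5, 2.0, 3.0, 6.0, 12.0, 20.0, 30.0, 90.0])
shares = bucket_shares(hours)
```

`right=True` makes each bucket closed on the right (e.g. (4, 8]), so a ticket resolved in exactly 8 hours counts toward the 4–8 bucket.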

Figure 6

Performance by Priority

SLA breach rate and resolution time performance broken down by ticket priority level

Interpretation

Purpose

This section evaluates SLA compliance across four priority tiers to identify which ticket categories are meeting contractual resolution targets. Since different priorities have distinct SLA windows (Critical: 4 hours, High: 8 hours, Medium: 24 hours, Low: 48 hours), this breakdown reveals whether the organization's response capacity aligns with urgency levels and where performance gaps exist.

Key Findings

  • Critical Priority Breach Rate: 66.3% (481 of 726 tickets) — the most severe performance gap, despite a 4-hour SLA target
  • High Priority Breach Rate: 45.5% (321 of 705 tickets) — significant non-compliance with 8-hour target
  • Medium & Low Priority: 0% breach rates — both tiers fully meet their 24-hour and 48-hour targets
  • Resolution Speed Consistency: Average resolution times (7.36–8.2 hours) remain relatively stable across priorities, suggesting systemic capacity constraints rather than priority-based differentiation
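The per-priority breach rates above reduce to a single grouped aggregation. A minimal sketch, assuming hypothetical `priority` and `sla_breach` columns (toy counts, not the report's data):

```python
import pandas as pd

def breach_by_priority(df: pd.DataFrame) -> pd.Series:
    """Breach rate (%) per priority tier."""
    return (df.groupby("priority")["sla_breach"].mean() * 100).round(1)

tickets = pd.DataFrame({
    "priority": ["Critical"] * 3 + ["High"] * 2 + ["Medium"] * 2,
    "sla_breach": [True, True, False, True, False, False, False],
})
rates = breach_by_priority(tickets)
```

The same pattern applied per channel, ticket type, or subject produces the other breakdowns in this report.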

Interpretation

The data reveals a critical misalignment: tickets marked Critical breach SLA at a two-in-three rate (66.3%), indicating the organization cannot consistently meet its most stringent commitments. Medium and Low priorities achieve perfect compliance, suggesting adequate capacity exists for longer windows but insufficient resources for rapid turnaround. The near-identical average resolution times across tiers (7.36–8.2 hours) indicate that urgent tickets are not actually being worked faster than routine ones.

Figure 7

Breach Rate by Ticket Type

Ticket volume and SLA breach rate analysis by ticket type to identify problematic categories

Interpretation

Purpose

This section identifies which ticket types are most problematic for SLA compliance. By comparing breach rates and resolution times across five ticket categories, it reveals whether certain issue types systematically exceed service level targets—indicating potential process gaps, complexity mismatches, or resource allocation problems that warrant investigation.

Key Findings

  • Worst Type Breach Rate: 31.4% (Cancellation requests) — 4.4 percentage points above the best performer (Billing inquiry at 27%)
  • Resolution Time Correlation: Cancellation and Refund requests show longest average resolution (8.07 and 8.06 hours) and highest breach rates, suggesting inherent complexity
  • Billing Inquiry Performance: Lowest breach rate (27%) with fastest average resolution (7.24 hours), indicating efficient handling
  • Breach Rate Range: Narrow spread (27–31.4%) across types indicates relatively consistent performance, with no catastrophic outliers

Interpretation

The 4.4-percentage-point variance between best and worst performers is modest relative to the overall 29% breach rate, suggesting ticket type is not the primary SLA driver. However, Cancellation and Refund requests consistently underperform, correlating with longer resolution times. This pattern indicates these types may require more complex decision-making, customer interaction, or backend processing rather than simple routing or staffing fixes.

Figure 8

Breach Rate by Issue Subject

SLA breach rate and satisfaction analysis by specific ticket subject/issue type

Interpretation

Purpose

This section identifies which specific ticket subjects (issue types) drive SLA breaches and resolution delays. By analyzing 16 distinct subjects, it reveals whether performance problems are concentrated in particular issue categories or distributed broadly—critical for understanding whether targeted process improvements can meaningfully reduce overall breach rates.

Key Findings

  • Worst Subject Breach Rate: Hardware issue at 36.1% (66 of 183 tickets)—significantly above the 28.95% subject average and the 29% overall breach rate
  • Resolution Time Correlation: Hardware issue also has the longest average resolution at 8.34 hours, suggesting complexity drives both delays and breaches
  • Performance Range: Breach rates span 23.7% (Battery life) to 36.1% (Hardware issue)—a 12.4 percentage point spread indicating substantial subject-level variation
  • Satisfaction Paradox: Hardware issue maintains neutral satisfaction (3.0) despite high breach rate; Installation support achieves highest satisfaction (3.18) with lowest breach rate (24.7%)

Interpretation

Hardware issues represent a concentrated performance problem: they account for 183 tickets with disproportionately high breach rates and resolution times. This suggests specific technical or diagnostic challenges inherent to hardware troubleshooting. The weak correlation between breach rate and satisfaction indicates customers may accept longer resolution times for complex issues.

Figure 9

Customer Age Demographics

Customer demographics analysis showing SLA breach rates and satisfaction by age group

Interpretation

Purpose

This section examines whether SLA performance and customer satisfaction vary meaningfully across age demographics. By segmenting the 2,769 resolved tickets into five age groups, it reveals whether certain customer cohorts experience systematically different resolution times or breach rates—critical for identifying whether service quality is equitable across the customer base.

Key Findings

  • Breach Rate Range: 28.2% (18-25) to 29.9% (36-45)—a 1.7 percentage point spread indicating minimal age-based variation in SLA compliance
  • Resolution Time Variation: 7.37 hours (18-25) to 8.09 hours (46-55)—younger customers resolve slightly faster, though all groups cluster near the 7.74-hour mean
  • Satisfaction Consistency: 2.96 to 3.05 across all groups—negligible differences despite resolution time variance
  • Volume Distribution: 56+ age group dominates with 790 tickets (28.5% of resolved), while 18-25 represents the smallest segment at 390 tickets

Interpretation

Age demographics show remarkably homogeneous SLA performance, with breach rates hovering within 1.7 percentage points and satisfaction ratings virtually identical (0.09-point range). The 46-55 cohort experiences slightly longer resolution times (8.09 hours), but this does not translate into materially worse breach rates or satisfaction, reinforcing that service quality is equitable across age groups.

Figure 10

Channel Performance

Performance comparison across support channels (email, phone, chat, social media)

Interpretation

Purpose

This section evaluates SLA performance consistency across four support channels (Chat, Email, Phone, Social Media). Understanding channel-level compliance reveals whether operational challenges are systemic or channel-specific, informing resource allocation and workflow optimization decisions.

Key Findings

  • Breach Rate Range: 28.7%–29.2% across all channels—a remarkably narrow 0.5 percentage point spread indicates uniform SLA performance regardless of channel
  • Phone Channel: Fastest median resolution (6.4 hours) and lowest average resolution time (7.43 hours), yet maintains 28.9% breach rate
  • Email Channel: Highest ticket volume (720) with lowest breach rate (28.7%), suggesting efficient handling despite workload
  • Chat Channel: Highest breach rate (29.2%) despite moderate resolution speed (7.7 hours average)

Interpretation

Channel performance is remarkably homogeneous, with all four channels clustering around the 29% organizational breach rate. This uniformity suggests that SLA breaches are driven by systemic factors (priority levels, ticket complexity, staffing constraints) rather than channel-specific inefficiencies. The slight variations in resolution speed do not translate to meaningful breach rate differences, indicating that speed alone does not determine compliance.

Context

These findings align with the overall 29% breach rate and suggest that channel optimization alone will not meaningfully reduce breaches; improvement efforts should instead target the systemic drivers identified elsewhere, chiefly priority handling and ticket complexity.

Figure 11

Satisfaction vs Resolution Time

Relationship between resolution speed and customer satisfaction ratings

Interpretation

Purpose

This section quantifies the relationship between how quickly tickets are resolved and customer satisfaction levels. Understanding this connection is critical for evaluating whether SLA performance improvements directly translate to better customer experience—a key business objective for operations optimization.

Key Findings

  • Average Satisfaction Rating: 2.99 / 5.0 - Consistently low across all resolution timeframes, indicating systemic satisfaction challenges independent of speed
  • Resolution-Satisfaction Correlation: -0.001 - Essentially zero correlation, meaning faster resolution times do not meaningfully improve satisfaction scores
  • Satisfaction Stability: Ranges only 2.93–3.01 across all resolution buckets, with longest-resolution tickets (8–24 hours) showing marginally higher satisfaction (3.01)

Interpretation

The near-zero correlation reveals a critical insight: resolution speed alone is not driving customer satisfaction. Despite 42% of tickets resolving within 8–24 hours, satisfaction remains flat at ~3.0 across all timeframes. This suggests underlying issues—such as solution quality, communication clarity, or unmet expectations—are more influential than speed. The slight uptick in satisfaction for slower resolutions contradicts typical expectations and warrants investigation into what differentiates those cases.
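The reported -0.001 figure is consistent with a simple Pearson correlation between resolution time and rating. A hedged sketch with hypothetical column names and toy data chosen so the correlation is exactly zero:

```python
import pandas as pd

def satisfaction_correlation(df: pd.DataFrame) -> float:
    """Pearson correlation between resolution hours and rating."""
    return round(df["resolution_hours"].corr(df["satisfaction"]), 3)

# Toy data: ratings do not move with resolution time.
tickets = pd.DataFrame({
    "resolution_hours": [1.0, 2.0, 3.0, 4.0],
    "satisfaction": [3.0, 4.0, 4.0, 3.0],
})
r = satisfaction_correlation(tickets)
```

Pearson's r only captures linear association, which is one reason a near-zero value should not be read as proof that speed never matters.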

Context

This analysis assumes satisfaction ratings are comparable across resolution buckets and channels. The weak correlation does not eliminate the possibility that satisfaction responds to unmeasured factors, such as resolution quality, first-contact effectiveness, or communication, which this dataset does not capture.

Figure 12

Product SLA Performance

SLA breach rate analysis by product or service category

Interpretation

Purpose

This section examines SLA compliance across 42 product categories to identify which products generate the most support burden and breach risk. Understanding product-level performance reveals whether certain products have inherent complexity, quality issues, or documentation gaps that drive higher ticket volumes and SLA failures—critical for prioritizing product improvements or support resource allocation.

Key Findings

  • Breach Rate Range: 30.9% to 42.2% across products—notably higher than the overall 29% breach rate, indicating significant product-level variation
  • Highest Risk Products: Bose SoundLink Speaker (42.2% breach rate, 8.7 hrs avg resolution) and MacBook Pro (40.7% breach rate) show the steepest SLA challenges
  • Resolution Time Correlation: Products with longer average resolution times (Nintendo Switch at 8.93 hrs, LG Washing Machine at 9.31 hrs) tend toward higher breach rates
  • Ticket Volume: iPhone leads with 82 tickets but maintains moderate breach rate (31.7%), suggesting volume alone doesn't determine performance

Interpretation

Product-level breach rates exceed the overall 29% baseline, revealing that specific products drive disproportionate SLA failures. The 11.3-percentage-point spread (30.9%–42.2%) suggests product complexity, support documentation quality, or underlying quality issues differentiate performance, making the worst performers natural candidates for targeted review.

Table 13

Statistical Tests

Statistical hypothesis tests for significance of performance differences across categories

Test | Statistic | p-value | Effect Size | Interpretation
ANOVA (Resolution by Priority) | 3.01 | 0.029 | 0.0033 | Significant
Kruskal-Wallis (Resolution by Priority) | 7.558 | 0.0561 | n/a | Not significant
Chi-Square (Breach by Type) | 2.546 | 0.6365 | n/a | Not significant
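These three tests are standard SciPy calls: one-way ANOVA and Kruskal-Wallis on resolution times grouped by priority, and a chi-square test on a breach-by-type contingency table. A sketch with toy samples (not the report's data, so the statistics will differ from the table above):

```python
from scipy import stats

# Toy resolution-hour samples for three priority tiers.
critical = [5.0, 6.5, 7.0, 9.0]
high = [6.0, 7.0, 8.0, 8.5]
medium = [6.5, 7.5, 7.0, 8.0]

# Parametric and rank-based tests of "resolution differs by priority".
f_stat, anova_p = stats.f_oneway(critical, high, medium)
h_stat, kw_p = stats.kruskal(critical, high, medium)

# Contingency table: rows = ticket types, cols = (breached, met SLA).
table = [[30, 70], [35, 65], [28, 72]]
chi2, chi_p, dof, expected = stats.chi2_contingency(table)
```

As the interpretation below notes, significance should always be read alongside effect size (eta-squared here), since large samples can make trivially small differences significant.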

Interpretation

Purpose

This section validates whether observed differences in SLA resolution times across priority levels are statistically significant or attributable to random variation. Statistical testing is critical for distinguishing genuine performance patterns from noise, ensuring that resource allocation decisions are based on real operational differences rather than sampling artifacts.

Key Findings

  • ANOVA p-value (0.029): Statistically significant at the 0.05 threshold, indicating resolution times genuinely differ across priority levels
  • Kruskal-Wallis p-value (0.056): Marginally non-significant, suggesting the parametric test is more sensitive to the detected differences
  • Effect Size (0.003): Extremely small eta-squared value indicates that priority level explains only 0.3% of variance in resolution times, despite statistical significance
  • Chi-Square (Breach by Type): Non-significant (p=0.64), meaning breach rates do not meaningfully differ across ticket types

Interpretation

The ANOVA confirms that Critical priority tickets resolve differently than other tiers—a real operational phenomenon. However, the negligible effect size reveals this difference is modest in practical magnitude. While Critical tickets show a 66.3% breach rate versus 0% for Medium/Low, the actual resolution time differences (7.56 vs 7.36 hours) are small. This suggests priority classification influences SLA outcomes primarily through target stringency (a 4-hour window versus 24 or 48 hours) rather than through materially faster handling of urgent tickets.

Figure 14

Top Bottlenecks by Impact

Top bottleneck categories ranked by impact score (breach rate x volume) to prioritize improvement efforts

Interpretation

Purpose

This section identifies which ticket categories contribute most significantly to overall SLA breaches by combining breach rate with ticket volume. The impact score reveals where operational improvements would yield the greatest reduction in the current 29% breach rate, enabling prioritized resource allocation across priority levels, channels, and ticket types.

Key Findings

  • Critical Priority Impact: 481.3 impact score with 66.3% breach rate across 726 tickets—the dominant bottleneck, accounting for roughly 60% of all breaches (481 of 802) despite being a single category
  • High Priority Secondary Bottleneck: 320.8 impact score with 45.5% breach rate across 705 tickets, compounding priority-driven SLA failures
  • Channel Consistency: All four channels (Email, Phone, Social Media, Chat) cluster tightly around 28.7–29.2% breach rates with minimal variation, indicating channel is not a primary differentiator
  • Ticket Type Homogeneity: Breach rates across ticket types range only 27–31.4%, suggesting type-level factors have limited impact compared to priority
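The impact score described above (breach rate × volume) reduces to the expected breach count per category, which is why Critical's 481.3 roughly matches its 481 breaches. A sketch assuming hypothetical `category` and `sla_breach` columns (toy counts):

```python
import pandas as pd

def impact_scores(df: pd.DataFrame) -> pd.Series:
    """Impact score per category: breach rate (fraction) x ticket volume,
    i.e. the expected number of breached tickets, ranked descending."""
    g = df.groupby("category")["sla_breach"].agg(["mean", "count"])
    return (g["mean"] * g["count"]).round(1).sort_values(ascending=False)

tickets = pd.DataFrame({
    "category": ["Critical"] * 4 + ["High"] * 4,
    "sla_breach": [True, True, True, False, True, False, False, False],
})
scores = impact_scores(tickets)
```

Ranking by this score surfaces categories where improvement removes the most breaches, rather than categories with high rates but negligible volume.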

Interpretation

The data reveals a priority-driven problem: Critical and High priority tickets account for the top two impact scores (802.1 combined), while lower priorities show zero breaches. This concentration indicates SLA targets for critical tickets are either misaligned with achievable resolution times or under-resourced; renegotiating those targets or dedicating rapid-response capacity to Critical and High tickets would address the bulk of breaches.
