- Measuring application latency instead of Internet speed offers a better understanding of today’s Internet user experience.
- Internet service providers can leverage artificial intelligence in customer premises equipment to observe and provide personalised, application-specific feedback on how to improve performance.
This provocative title is intended to force a rethinking of the network attributes that most affect Internet application quality for end users.
In much of the world, we are in what I like to call the “post-gigabit era”. Many users have access to gigabit connections, or at least hundreds of Mbps, and have moved from an era of bandwidth scarcity to one of bandwidth abundance. In this era, bandwidth no longer constrains end-user application quality.
As a result of this shift to bandwidth abundance, the measurement ecosystem is beginning to move beyond speed tests and focus more on user experience. Let’s explore.
Moving Away from Speed Tests
Internet speed tests have defined Internet performance assessment for the last 30 years.
Despite the increase in bandwidth, many people still experience poor application quality. This is due to several factors beyond bandwidth, chiefly WiFi and working latency (a.k.a. network responsiveness).
In the case of WiFi, issues include:
- Using a 2.4 GHz band rather than 5 GHz
- Being too far from the access point
- Having only one radio in the access point
- Using wireless mesh backhaul from extenders rather than Ethernet
- Having interference from nearby networks.
The issues are varied, but in many cases, user-administered WiFi networks are a significant performance constraint.
As for measuring the effects of latency: rather than looking at ping latency, which is measured under “idle” conditions, you need to generate cross-traffic while running the latency test. This better simulates real-world conditions, in which many users and devices use the network simultaneously.
When doing so, it is not uncommon to see a connection with an idle latency of 25 ms rocket to hundreds of milliseconds per round trip, and most application transactions require multiple round trips, so the delay compounds. Some users even experience one to two seconds of delay, highly variable over time.
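To make this concrete, here is a minimal sketch in Python of a working-latency (latency under load) test: it saturates the connection with download traffic in background threads while sampling round-trip time with the system ping tool (Linux/macOS output format assumed). The load URL and ping target below are placeholders, not endpoints from this article.

```python
import re
import subprocess
import threading
import urllib.request

LOAD_URL = "https://example.com/large-file.bin"  # placeholder bulk-download URL
PING_TARGET = "8.8.8.8"                          # example target; any stable host works

def generate_load(stop: threading.Event) -> None:
    """Repeatedly download data to keep the bottleneck queue full."""
    while not stop.is_set():
        try:
            with urllib.request.urlopen(LOAD_URL, timeout=10) as resp:
                while not stop.is_set() and resp.read(65536):
                    pass
        except OSError:
            pass  # ignore transient errors and keep the link busy

def sample_rtt_ms(count: int = 20) -> list[float]:
    """Sample round-trip times (in ms) with the system ping tool."""
    out = subprocess.run(
        ["ping", "-c", str(count), PING_TARGET],
        capture_output=True, text=True, check=False,
    ).stdout
    return [float(m) for m in re.findall(r"time=([\d.]+)", out)]

stop = threading.Event()
for _ in range(4):  # several parallel flows to saturate the link
    threading.Thread(target=generate_load, args=(stop,), daemon=True).start()
try:
    loaded = sample_rtt_ms()
finally:
    stop.set()
loaded.sort()
print(f"working latency: median={loaded[len(loaded) // 2]:.1f} ms, "
      f"max={loaded[-1]:.1f} ms")
```

Running the same RTT sampling without the load threads gives the idle baseline; the gap between the two numbers is the bufferbloat described above.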
A Different Take: “Quality of Outcome” (QoO)
An alternative approach is emerging, shifting away from artificial measurements that are indirect proxies for Internet quality and toward measurements that directly represent, or predict, the user’s experience.
These approaches can leverage an artificial intelligence agent running in customer premises equipment (CPE) in a home network to observe:
- the performance of the access network and the Internet beyond
- the user’s local wired and WiFi network, including down to a per-device and per-application level.
What this framework suggests is a future where your network can provide device- and application-specific feedback, such as:
- “You had excellent FaceTime quality last week on all devices, across 54 call sessions.”
- “Jane’s iPhone experienced poor YouTube quality today due to low signal strength. To improve quality, install a new 5 GHz WiFi network extender in the living room. This will benefit four devices in the network.”
- “The Living Room Smart TV experienced frequent buffering of Netflix, Peacock, and Max applications because it had low WiFi signal strength on 2.4 GHz channels. You should connect that device to the network with an Ethernet cable or install a new 5 GHz WiFi network extender in the living room.”
- “Last month, several downloads of Fortnite and Call of Duty occurred, which were limited by your available bandwidth; you may want to consider adding bandwidth.”
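To picture how such feedback could be produced, here is a simplified sketch of the scoring idea: compare a measured latency distribution against per-application network requirements and report which applications the network can currently satisfy. The application list and thresholds are illustrative assumptions for this sketch, not values from the QoO framework itself.

```python
# Illustrative per-application requirements (95th-percentile RTT ceilings, in ms).
# These thresholds are assumptions for the sketch, not published QoO values.
REQUIREMENTS_P95_RTT_MS = {
    "video calling": 150.0,
    "cloud gaming": 75.0,
    "video streaming": 300.0,
}

def p95(samples: list[float]) -> float:
    """Return the 95th-percentile value of a list of samples."""
    ordered = sorted(samples)
    return ordered[min(len(ordered) - 1, int(0.95 * len(ordered)))]

def assess(rtt_samples_ms: list[float]) -> dict[str, bool]:
    """Map each application to whether the measured latency meets its needs."""
    measured = p95(rtt_samples_ms)
    return {app: measured <= limit for app, limit in REQUIREMENTS_P95_RTT_MS.items()}

# Example: working-latency samples collected on a loaded link.
samples = [28.0, 35.0, 140.0, 90.0, 210.0, 45.0, 60.0, 180.0, 75.0, 95.0]
for app, ok in assess(samples).items():
    print(f"{app}: {'meets requirement' if ok else 'likely degraded'}")
```

A production agent would extend this with per-device attribution (which device, which WiFi band) and a history store, which is what turns raw scores into sentences like the examples above.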
Optimize for Network Responsiveness and Test Appropriately
Internet service providers can now take steps to optimize network responsiveness. This can involve deploying newer Active Queue Management (AQM) algorithms in the network and in CPE, and implementing the IETF’s L4S (Low Latency, Low Loss, and Scalable Throughput) and NQB (Non-Queue-Building) specifications, together known as dual-queue networking.
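For intuition about what an AQM does, here is a toy Python sketch in the spirit of CoDel: it tracks how long each packet sat in the queue (its sojourn time) and starts dropping once queueing delay has stayed above a small target for a sustained interval. The two constants match CoDel’s published defaults, but the control logic is deliberately simplified relative to real implementations such as Linux’s fq_codel.

```python
import time
from collections import deque

TARGET_MS = 5      # acceptable standing queue delay (CoDel's default target)
INTERVAL_MS = 100  # how long delay may exceed target before dropping begins

class ToyAQM:
    """Simplified CoDel-style queue: drop when sojourn time stays high."""

    def __init__(self) -> None:
        self.queue = deque()     # entries of (enqueue_time, packet)
        self.above_since = None  # when queueing delay first exceeded target

    def enqueue(self, packet) -> None:
        self.queue.append((time.monotonic(), packet))

    def dequeue(self):
        while self.queue:
            enq_time, packet = self.queue.popleft()
            sojourn_ms = (time.monotonic() - enq_time) * 1000
            if sojourn_ms <= TARGET_MS:
                self.above_since = None  # queue has drained; clear the timer
                return packet
            if self.above_since is None:
                self.above_since = time.monotonic()
            if (time.monotonic() - self.above_since) * 1000 < INTERVAL_MS:
                return packet  # tolerate short bursts without dropping
            # Standing queue: drop this packet and try the next one.
        return None
```

The effect is that short traffic bursts pass untouched, while a persistently full buffer, the cause of the latency spikes described earlier, is trimmed before delay compounds. L4S goes further, using ECN marking in a separate low-latency queue rather than drops.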
Test platforms should also shift from focusing on bandwidth to the attributes affecting application quality, particularly latency and jitter.
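For jitter, test platforms commonly use the smoothed interarrival-jitter estimator defined for RTP in RFC 3550 (section 6.4.1); a minimal sketch:

```python
def rfc3550_jitter_ms(transit_times_ms: list[float]) -> float:
    """Smoothed interarrival jitter (RFC 3550, section 6.4.1)."""
    jitter = 0.0
    for prev, cur in zip(transit_times_ms, transit_times_ms[1:]):
        jitter += (abs(cur - prev) - jitter) / 16  # exponential smoothing, gain 1/16
    return jitter

# Example usage: a latency spike among otherwise steady samples.
print(f"jitter: {rfc3550_jitter_ms([30.0, 32.0, 29.0, 60.0, 31.0, 33.0]):.2f} ms")
```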
Ultimately, the best goal is to directly track end-user application Quality of Experience (QoE), along with historical and predictive Quality of Outcome (QoO).
Jason Livingood leads Technology Policy, Product, and Standards, and is the single point of coordination for Comcast Cable’s technology and product division (TPX) on all key tech policy, standards, industry organization and research engagements.
The views expressed by the authors of this blog are their own and do not necessarily reflect the views of the Internet Society.