In user experience, delays of even a fraction of a second can determine success or failure. This is especially true in the Internet of Things as connected devices become more commonplace. Today, users expect near-zero lag between user input and onscreen output. Whether controlling a smart doorbell via its accompanying app, or customizing a sensor on a desktop, the latency must be minimal.
This requirement is more about humans than modern computers. Sure, the device and its connection determine how quickly user input is reflected on the screen, but we humans are highly sensitive to delays in response. And we notice when devices and their interfaces just aren’t quick enough.
Let’s explore why response times are crucial to UX and device success and consider how device makers can get up to speed with the competition and improve latency — or risk getting left behind.
What are the 3 main reasons for device latency?
- The distance. Light can only move so fast.
- The device. Microcontrollers and cloud middlemen can slow devices down.
- The network. Lots of devices using the same network have to compete for fast service.
Why Latency Matters
Achieving an optimal device UX hinges on a fundamental benchmark: a response time of 0.1 seconds. This brief window is the key to making users feel that their actions translate directly into onscreen outcomes.
Think about when you click an option in an app. If the response appears in less than 100 milliseconds, it imparts a sense of agency, as though you directly caused the app’s reaction. If the app takes longer than this, however, the response no longer feels instantaneous. Instead, it gives the impression that the computer is laboring to execute your command.
In essence, the challenge lies in hitting this 0.1-second milestone. To preserve the illusion of direct manipulation, developers must make the device and interface communicate swiftly and reflect the result to the user without delay. The holy grail of device UX, therefore, is swiftness, responsiveness and a level of immediacy that reinforces the user’s sense of control.
The problem is that IoT latency often falls short of this ideal. Far short. In fact, latency in many database-driven IoT solutions is often more than five seconds. Why is this the case?
What Slows Devices Down?
Three main culprits are responsible for high latency in connected devices.
1. The Distance
The first is distance. Sluggish response times often arise from the distance data must travel between data centers and its final destination. Why? Because light can only move so fast. A round trip between New York and Paris, for example, takes approximately 80 milliseconds simply due to the constraints of physics. This matters because cloud service providers strategically situate their data centers in remote, cost-effective locales. The sheer physical distance involved can extend data transit times, pushing them perilously close to the 100-millisecond mark and sometimes beyond.
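The physics is easy to check with a back-of-the-envelope sketch in Python. The ~5,800 km New York–Paris distance and the roughly two-thirds-of-c signal speed in optical fiber are assumed round figures, not measurements:

```python
# Back-of-the-envelope check on the ~80 ms New York-Paris figure.
SPEED_OF_LIGHT_KM_S = 299_792   # km/s, in vacuum
FIBER_FACTOR = 2 / 3            # assumed: signals in fiber travel at ~2/3 c
DISTANCE_KM = 5_837             # assumed: approximate NY-Paris great-circle distance

one_way_s = DISTANCE_KM / (SPEED_OF_LIGHT_KM_S * FIBER_FACTOR)
round_trip_ms = 2 * one_way_s * 1000
print(f"{round_trip_ms:.0f} ms")  # theoretical floor, before any routing overhead
```

The theoretical floor alone comes out near 60 ms; real routes are longer than the great circle, and switching and routing hops account for the rest of the observed ~80 ms.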
2. The Device
Latency can also stem from the device itself. Connected devices often rely on small, inexpensive microcontrollers, which handle tasks like authentication and encryption more slowly than the architecture found in personal computers and mobile phones. That architecture is characterized by powerful processors, abundant memory, and complex peripheral connections that enable high-speed data processing and multitasking.
In contrast, microcontrollers are compact, single-chip computers with slower processors, limited memory, and simpler peripheral connections. This makes them suitable for specialized tasks in resource-constrained environments like IoT applications but naturally slower than their PC counterparts.
Also, many products route traffic between the client and the device through a cloud service. This requires data to travel through a central server before reaching the end user, and vice versa when a command comes from the client. This cloud middleman adds latency. Additionally, although data is safeguarded with encryption in transit, it is commonly decrypted at the server. This means device data is temporarily held by a third party, which also raises important questions about privacy and security.
3. The Network
The third is the network. In enterprises, for example, an overloaded network can delay data packets. Whether competing with production traffic or even a corporate guest Wi-Fi network, extra activity can increase the time it takes a data packet to reach its destination.
How To Improve Device Speed and Interactivity
The good news is that device makers and vendors don’t have to just accept bad latency. There are tweaks for the above three issues that can improve user satisfaction without breaking the bank.
1. Edge Computing
For example, overcome distance by bringing data closer to home and processing it at the edge. Edge computing is a decentralized model that relocates processing and data storage nearer to data sources, leading to faster response times and bandwidth conservation. Commands are processed immediately rather than at a faraway server. Beyond latency, the edge offers other performance benefits: autonomous vehicles, for example, can gather and process real-time traffic data at the edge, independent of the cloud (and get you home faster).
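As a toy illustration of the pattern, here is a minimal Python sketch in which a device summarizes raw sensor samples locally and ships only the small result upstream. The readings, threshold, and payload shape are all hypothetical:

```python
import statistics

# Hypothetical temperature readings (deg C) collected on the device itself;
# the 35.0 sample represents an anomalous spike.
readings = [21.3, 21.4, 21.2, 35.0, 21.5, 21.3]

def process_at_edge(samples, threshold=30.0):
    """Process locally: flag anomalies and summarize, so only a tiny
    payload (not every raw sample) has to cross the network."""
    anomalies = [s for s in samples if s > threshold]
    return {
        "mean": round(statistics.mean(samples), 2),
        "anomalies": anomalies,
    }

payload = process_at_edge(readings)
print(payload)  # only this summary, not the raw stream, leaves the device
```

The design choice is the point: the round trip to a distant server happens once per summary rather than once per sample, so both latency and bandwidth drop.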
2. Peer-to-Peer Connections
Additionally, tailoring the device connection brings latency benefits. Instead of server-based connections, vendors should consider peer-to-peer. This connection type involves a server only to establish the link; after that, data flows directly between the two peers. By skipping the third-party detour, which hurts both latency and data security, peer-to-peer lets data travel the shortest possible path.
For example, this is useful in the surveillance sector since users want to securely transmit video without lag. Peer-to-peer is therefore a popular option since it lowers latency, avoids cloud providers (who demand extra costs when dealing with large files), and maintains data integrity by communicating directly between client and device.
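The signaling-then-direct pattern can be sketched in a few lines of Python. Everything here is a stand-in: an in-process dictionary plays the role of the signaling server, and two UDP sockets on localhost play the role of a camera and a viewer app on a real network:

```python
import socket

# Hypothetical in-process registry standing in for a real signaling
# server: it maps device IDs to addresses and never touches payloads.
registry = {}

def register(device_id, address):
    registry[device_id] = address

def lookup(device_id):
    return registry[device_id]

# Two UDP sockets on localhost stand in for a camera and a viewer app.
camera = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
camera.bind(("127.0.0.1", 0))
camera.settimeout(2)
viewer = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
viewer.bind(("127.0.0.1", 0))

# Step 1: the camera registers its address with the signaling server.
register("camera-01", camera.getsockname())

# Step 2: the viewer asks the server where the camera lives...
peer_addr = lookup("camera-01")

# Step 3: ...then talks to the camera directly. From here on, the
# "server" (registry) is no longer on the data path.
viewer.sendto(b"start-stream", peer_addr)
data, sender = camera.recvfrom(1024)
print(data.decode())  # the command arrived without a relay

camera.close()
viewer.close()
```

Real deployments add NAT traversal and encryption on top of this handshake, but the structure is the same: the server brokers addresses once, then steps out of the way.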
3. Standalone Networks
Finally, users should also segment their devices onto a standalone network. Simply put, this means adding your devices to a Wi-Fi network that’s separate from the one your laptops, desktops, and smartphones use. This removes network clutter and lets your devices communicate faster.
Don’t let distance, connection, and congestion get in the way of your UX. Edge computing, peer-to-peer connections and dedicated device networks can improve latency and satisfy impatient users. In today’s competitive connected device landscape, this is non-negotiable.