Latency
ELI5 — The Vibe Check
Latency is the delay before data starts moving: the time it takes for a request to go from your device to the server and back. It's measured in milliseconds (ms). Low latency = fast and snappy. High latency = laggy. Bandwidth is how much data moves per second; latency is how long you wait before it starts.
Real Talk
Latency is the time delay between initiating a network request and receiving the first byte of the response. It's affected by physical distance, number of hops, network congestion, and processing time. Often measured as Round Trip Time (RTT).
Show Me The Code
# Measure latency with ping (send 4 probes; use -n 4 on Windows)
ping -c 4 google.com
# Output example:
# 64 bytes from 142.250.80.46: icmp_seq=0 ttl=117 time=12.4 ms
#                                                 ^^^^^^^^^^^^
# latency (round trip) = 12.4 ms
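You can measure the same round-trip idea in code. A minimal sketch in Python that times how long a TCP handshake takes; the host and port in the usage comment are placeholders, swap in any reachable server:

```python
import socket
import time

def tcp_rtt_ms(host: str, port: int, timeout: float = 3.0) -> float:
    """Time a TCP connection setup (one SYN/SYN-ACK round trip), in ms."""
    start = time.perf_counter()
    # create_connection blocks until the handshake completes (or times out)
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.perf_counter() - start) * 1000

# Usage (placeholder host/port):
# print(f"RTT: {tcp_rtt_ms('example.com', 443):.1f} ms")
```

Note this measures the TCP handshake, not ICMP like `ping`, so the numbers are comparable but not identical; it's handy when ICMP is blocked by a firewall.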
When You'll Hear This
"The latency to Asia-Pacific is 200ms — we should add a CDN node there." / "High latency is killing the gaming experience."
Related Terms
Bandwidth
Bandwidth is how wide your internet pipe is: how much data can flow through per second, measured in megabits or gigabits per second (Mbps/Gbps). A narrow pipe means slow transfers, a wide pipe means fast ones.
CDN (Content Delivery Network)
A CDN is a network of servers spread around the world that store copies of your files close to users, so requests travel a shorter distance and latency drops.
Ping
Ping is the simplest network test — you shout at a server ('hello?') and measure how long it takes to shout back ('yo!').
TCP (Transmission Control Protocol)
TCP is like sending a package with delivery confirmation.