Photonics Dictionary

latency

Latency is the time interval between a stimulus or input to a system and its response or output. It is a measure of the time delay experienced in a system, network, or process. Latency can occur in various contexts:

Computer systems and networks: In computing, latency refers to the delay between initiating a request (such as sending data or a command) and receiving a response. It includes processing time, transmission time over a network, and sometimes queueing delays in routers or switches.
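As an illustrative sketch (not part of the original entry), the following Python snippet measures the round-trip latency of a single HTTP request using only the standard library; the URL is a placeholder assumption, and real measurements would average many samples.

import time
import urllib.request

def measure_request_latency(url: str) -> float:
    """Return the round-trip latency in seconds for one HTTP request."""
    start = time.perf_counter()           # high-resolution timestamp before sending the request
    with urllib.request.urlopen(url) as response:
        response.read()                   # include transfer and processing time in the measurement
    return time.perf_counter() - start    # elapsed time = observed request latency

if __name__ == "__main__":
    # Placeholder endpoint; substitute any reachable URL.
    latency = measure_request_latency("https://example.com")
    print(f"Round-trip latency: {latency * 1000:.1f} ms")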

Audio and video streaming: Latency in audio and video streaming refers to the delay between the transmission of data and its reception and playback. High latency can cause delays or interruptions in real-time applications like video conferencing or online gaming.

Storage systems: In storage systems, latency refers to the time it takes for a data request to be fulfilled, including disk access time and data retrieval time.
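As a hedged illustration (an assumption-laden sketch, not from the original entry), this Python snippet times a single block read from a local file; the path and block size are placeholders, and repeated reads may hit the operating system's page cache and appear faster than raw disk access.

import time

def measure_read_latency(path: str, block_size: int = 4096) -> float:
    """Return the time in seconds to open the file and read one block."""
    start = time.perf_counter()
    with open(path, "rb") as f:
        f.read(block_size)               # one block read: access plus retrieval time
    return time.perf_counter() - start

# Placeholder path; substitute a real file on the storage device under test.
print(f"Read latency: {measure_read_latency('/tmp/sample.bin') * 1000:.3f} ms")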

Human-computer interaction: Latency can also refer to the delay perceived by users in interactive systems, such as the delay between clicking a button and seeing a response on the screen.

Reducing latency is often crucial in optimizing performance and responsiveness in systems, particularly in real-time applications where immediate feedback or interaction is required.