Async Processing Glossary
Key terms and concepts in task queues, background jobs, and async processing.
B
Background Job
A background job is a task that runs outside the main request-response cycle, allowing applications to handle time-consuming operations without blocking users.
Backpressure
Backpressure is a flow control mechanism where a system signals upstream producers to slow down when it can't keep up with incoming work.
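A minimal sketch of backpressure using a bounded in-memory buffer (the `queue.Queue` buffer and the `produce` helper are illustrative, not from any particular framework): when the buffer fills, the producer is pushed back and must let the consumer drain before continuing.

```python
import queue

# A bounded queue is the simplest backpressure mechanism: when the buffer
# fills, the producer is told to slow down instead of piling up work.
buffer = queue.Queue(maxsize=3)

def produce(items, consume_one):
    """Enqueue items; when the full buffer pushes back, drain one
    item (here, by calling the consumer directly) before retrying."""
    slowdowns = 0
    for item in items:
        while True:
            try:
                buffer.put_nowait(item)
                break
            except queue.Full:
                slowdowns += 1   # backpressure signal received
                consume_one()    # in a real system: pause or throttle upstream
    return slowdowns
```

In a real pipeline the producer would sleep or signal upstream rather than draining the queue itself; the point is that the full buffer, not the producer, sets the pace.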
C
Callback in Async Processing
A callback is a function or URL that is invoked when an asynchronous operation completes, enabling non-blocking workflows and event-driven architectures.
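As a sketch of the function-callback flavor, using Python's standard `concurrent.futures` (the `on_done` name is illustrative): the callback is registered up front and invoked when the async operation finishes, so the caller never blocks on the result.

```python
from concurrent.futures import ThreadPoolExecutor

results = []

def on_done(future):
    # Invoked automatically when the asynchronous operation completes.
    results.append(future.result())

with ThreadPoolExecutor(max_workers=1) as pool:
    future = pool.submit(lambda: 21 * 2)  # the asynchronous operation
    future.add_done_callback(on_done)     # register the callback
# leaving the with-block waits for the task, so the callback has run by now
```

The URL flavor of a callback works the same way conceptually; see the webhook entry below.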
Circuit Breaker
A circuit breaker is a resilience pattern that stops calling a failing service after repeated failures, preventing cascading outages across your system.
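A minimal sketch of the pattern (the `CircuitBreaker` class and its parameter names are illustrative, not from a specific library): after `threshold` consecutive failures the breaker opens and fails fast, then allows a trial call once `reset_after` seconds have passed.

```python
import time

class CircuitBreaker:
    """Opens after `threshold` consecutive failures, then rejects
    calls until `reset_after` seconds have elapsed."""

    def __init__(self, threshold=3, reset_after=30.0):
        self.threshold = threshold
        self.reset_after = reset_after
        self.failures = 0
        self.opened_at = None

    def call(self, fn, *args):
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_after:
                raise RuntimeError("circuit open: failing fast")
            self.opened_at = None   # half-open: allow one trial call
            self.failures = 0
        try:
            result = fn(*args)
        except Exception:
            self.failures += 1
            if self.failures >= self.threshold:
                self.opened_at = time.monotonic()
            raise
        self.failures = 0           # a success closes the circuit again
        return result
```

Failing fast spares the struggling downstream service and gives callers an immediate error instead of a slow timeout.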
Cold Start
A cold start is the initialization delay when a serverless function runs for the first time or after being idle, adding latency before processing.
Connection Pool
A connection pool is a cache of reusable database or service connections that eliminates the overhead of establishing a new connection for each request.
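As a sketch (the `ConnectionPool` class is illustrative; `connect` stands in for an expensive factory such as a real database driver call): connections are created once up front, then checked out and returned for reuse.

```python
import queue

class ConnectionPool:
    """Fixed-size pool of reusable connections."""

    def __init__(self, connect, size=5):
        self._idle = queue.Queue()
        for _ in range(size):
            self._idle.put(connect())  # pay the connection cost once

    def acquire(self):
        return self._idle.get()        # blocks if every connection is in use

    def release(self, conn):
        self._idle.put(conn)           # hand the connection back for reuse
```

Because `acquire` blocks when the pool is exhausted, the pool size also acts as a concurrency limit on the downstream service.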
Cron Job in Task Scheduling
A cron job is a time-based scheduler that runs tasks automatically at specified intervals, commonly used for recurring background operations.
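A cron schedule is written as five fields (minute, hour, day-of-month, month, day-of-week), e.g. `*/15 * * * *` for "every 15 minutes". The sketch below matches one field against a value, covering only a toy subset of cron syntax (`*`, `*/n`, and comma lists) for illustration:

```python
def field_matches(expr, value):
    """Check one cron field against a value. Toy subset of cron syntax:
    '*' (any), '*/n' (every n), and comma-separated numbers."""
    if expr == "*":
        return True
    if expr.startswith("*/"):
        return value % int(expr[2:]) == 0
    return value in {int(part) for part in expr.split(",")}
```

Real schedulers (cron itself, or libraries like croniter) also handle ranges, names, and combinations of these forms.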
P
Payload in Task Queues
A payload is the data or parameters sent along with a task or API request, containing the information needed to execute the work.
Polling for Task Status
Polling is the practice of repeatedly checking a resource's status at regular intervals to detect changes or task completion.
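A minimal polling loop (the `poll` helper and its parameters are illustrative): check the status, sleep for a fixed interval, and repeat until the work is done or a retry budget is exhausted.

```python
import time

def poll(check_status, interval=0.01, max_attempts=50):
    """Repeatedly call `check_status` until it reports 'done' or we give up.
    Returns the number of attempts it took."""
    for attempt in range(1, max_attempts + 1):
        status = check_status()
        if status == "done":
            return attempt
        time.sleep(interval)  # wait before checking again
    raise TimeoutError("task did not finish within the polling budget")
```

Polling is simple but wasteful at scale; webhooks (below) invert the model by pushing the status change to you instead.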
R
Rate Limiting
Rate limiting controls the number of requests a client can make to an API within a time window, protecting services from abuse and overload.
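One common implementation is a token bucket, sketched below (the `TokenBucket` class is illustrative, not from a specific library): each request spends a token, tokens refill at a steady rate, and the bucket's capacity caps how large a burst is allowed.

```python
import time

class TokenBucket:
    """Allow bursts of up to `capacity` requests, refilled at
    `rate` tokens per second."""

    def __init__(self, capacity, rate, clock=time.monotonic):
        self.capacity = capacity
        self.rate = rate
        self.clock = clock
        self.tokens = float(capacity)
        self.last = clock()

    def allow(self):
        now = self.clock()
        # Refill tokens in proportion to the time elapsed, up to capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False   # over the limit: reject (an API would return HTTP 429)
```

The injectable `clock` parameter makes the limiter deterministic to test.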
Retry in Distributed Systems
A retry is an automatic attempt to re-execute a failed task or API call, essential for handling transient failures in distributed systems.
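A minimal retry helper with exponential backoff (the `retry` function and its parameters are illustrative): each failed attempt doubles the wait before the next, so a struggling service isn't hammered at a fixed rate.

```python
import time

def retry(fn, attempts=3, base_delay=0.01):
    """Call `fn`, retrying on any exception with exponential backoff.
    Appropriate only for transient failures (timeouts, 5xx responses)."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the failure
            time.sleep(base_delay * 2 ** attempt)  # 10ms, 20ms, 40ms, ...
```

Production retry logic usually also adds jitter (randomized delays) so many clients don't retry in lockstep, and retries only idempotent operations.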
T
Task Chaining
Task chaining is a pattern where asynchronous tasks are linked sequentially, with each task's output feeding as input to the next task in a workflow.
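At its core, chaining is just function composition over task results, sketched here (the `run_chain` helper is illustrative; real queue libraries express the same idea with signatures like Celery's `chain`):

```python
from functools import reduce

def run_chain(initial, *tasks):
    """Feed each task's output into the next, like a small pipeline.
    e.g. run_chain(image, resize, upload, record_url)"""
    return reduce(lambda value, task: task(value), tasks, initial)
```

In a real task queue each step would be enqueued and executed by a worker, with the framework wiring one task's result into the next task's arguments.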
Task Queue
A task queue distributes work across processes or machines, letting apps offload time-consuming operations to background workers.
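An in-process sketch of the idea using Python's standard `queue` and `threading` modules (a `None` sentinel signals shutdown; production systems use a broker like Redis or RabbitMQ instead of an in-memory queue):

```python
import queue
import threading

tasks = queue.Queue()
results = []

def worker():
    """Background worker: pull tasks off the queue and execute them."""
    while True:
        fn = tasks.get()
        if fn is None:              # sentinel: no more work
            break
        results.append(fn())

thread = threading.Thread(target=worker)
thread.start()
for n in (1, 2, 3):
    tasks.put(lambda n=n: n * n)    # enqueue work instead of running it inline
tasks.put(None)
thread.join()
```

The producer returns immediately after enqueuing; the worker does the slow part off the request path, which is the whole point of the pattern.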
Throughput
Throughput is the number of tasks or requests a system can process per unit of time, a key metric for measuring capacity and scalability.
Timeout in API Requests
A timeout is the maximum duration allowed for an operation to complete before it is automatically cancelled or marked as failed.
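A sketch using the standard library's future timeout (the `run_with_timeout` wrapper is illustrative): the caller stops waiting after the deadline even though, in Python, the worker thread itself cannot be forcibly killed.

```python
from concurrent.futures import ThreadPoolExecutor, TimeoutError
import time

def run_with_timeout(fn, timeout):
    """Run `fn` in a worker thread; stop waiting after `timeout` seconds."""
    with ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(fn)
        try:
            return future.result(timeout=timeout)  # raises TimeoutError if late
        except TimeoutError:
            future.cancel()  # best effort: an already-running task keeps going
            raise
```

Timeouts pair naturally with retries: mark the slow attempt as failed, then retry it with backoff.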
W
Webhook for Event Delivery
A webhook is an HTTP callback that delivers real-time data to other applications when a specific event occurs, enabling event-driven integrations.
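Because a webhook endpoint is reachable by anyone, deliveries are commonly signed so the receiver can verify their origin. A sketch with HMAC-SHA256 (the function names and the `X-Signature-256`-style header convention are illustrative; GitHub's webhooks, for example, use a similar scheme):

```python
import hashlib
import hmac
import json

def sign_webhook(payload, secret):
    """Serialize a webhook payload and compute an HMAC-SHA256 signature,
    typically sent alongside the POST body in a header."""
    body = json.dumps(payload, separators=(",", ":")).encode()
    signature = hmac.new(secret, body, hashlib.sha256).hexdigest()
    return body, signature

def verify_webhook(body, signature, secret):
    """Receiver side: recompute the signature over the raw body and
    compare in constant time to thwart timing attacks."""
    expected = hmac.new(secret, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)
```

The receiver must verify against the raw request bytes, not a re-serialized copy, since any change in serialization breaks the signature.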
Worker in Background Processing
A worker is a process that picks up tasks from a queue and executes them, running independently from the main application to handle background processing.