Building Asynchronous APIs with aiohttp and Queue

Mar 3, 2024 · 4 min read

In the world of web development, asynchronous programming is key to building fast, efficient APIs and web services. The Python ecosystem has some great async tools, among them the aiohttp library and asyncio's Queue class. In this article, we'll explore how to use them together to create a robust, production-ready API.

Why Asynchronous Programming?

Before we dive into the code, it's worth understanding why asynchronous programming matters for web APIs. The key advantage is that it allows us to handle multiple requests simultaneously without blocking execution.

Here's a typical synchronous server:

import time

def handle_request(request):
    time.sleep(5) # simulate blocking task
    return "Done"

while True:
    request = get_next_request()
    result = handle_request(request)

This server can only handle one request at a time. If multiple requests come in, they will be stuck waiting until the current one finishes.

Now compare it to an asynchronous style:

import asyncio

async def handle_request(request):
    await asyncio.sleep(5)  # simulate a non-blocking task
    return "Done"

async def main():
    while True:
        request = await get_next_request()
        result = await handle_request(request)
        await send_response(result)

With asyncio, waiting no longer blocks the event loop. Note that as written, this loop still awaits each handler before accepting the next request; to actually serve requests concurrently, each handler should be scheduled as its own task with asyncio.create_task(). Done that way, asyncio lets us support much higher throughput.

So asynchronous programming unlocks more scalable and performant systems. The aiohttp library builds on asyncio to provide an easy HTTP server and client.
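To make the concurrency concrete, here is a minimal, self-contained sketch (plain asyncio, no aiohttp; the handler and timing values are illustrative) showing that scheduling handlers with asyncio.create_task lets simulated requests overlap: three one-second handlers finish in roughly one second total, not three.

```python
import asyncio
import time

async def handle_request(request_id):
    await asyncio.sleep(1)  # simulated non-blocking I/O
    return f"Done: {request_id}"

async def run_demo():
    start = time.monotonic()
    # Schedule every handler as its own task so they run concurrently
    tasks = [asyncio.create_task(handle_request(i)) for i in range(3)]
    results = await asyncio.gather(*tasks)
    elapsed = time.monotonic() - start
    return results, elapsed

results, elapsed = asyncio.run(run_demo())
print(results, f"{elapsed:.1f}s")
```

If the handlers ran one after another, the elapsed time would be about three seconds; with tasks, the sleeps overlap on the same event loop.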

Building an aiohttp Server

Let's build a basic aiohttp server that returns a JSON response:

from aiohttp import web

async def handle(request):
    data = {
        'message': 'Hello World'
    }
    return web.json_response(data)

app = web.Application()
app.add_routes([web.get('/', handle)])
web.run_app(app)


This gives us a solid foundation. But how can we integrate a queue to facilitate more complex workloads?
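Before moving on, it helps to see the server answer a real request. One way to do that (a sketch, assuming aiohttp is installed and port 8080 is free) is to start the app in-process with aiohttp's AppRunner and query it with aiohttp's own client:

```python
import asyncio
from aiohttp import web, ClientSession

async def handle(request):
    return web.json_response({'message': 'Hello World'})

async def main():
    app = web.Application()
    app.add_routes([web.get('/', handle)])
    # Start the server inside this process instead of web.run_app,
    # so we can call it and shut it down programmatically
    runner = web.AppRunner(app)
    await runner.setup()
    site = web.TCPSite(runner, 'localhost', 8080)
    await site.start()
    async with ClientSession() as session:
        async with session.get('http://localhost:8080/') as resp:
            data = await resp.json()
    await runner.cleanup()
    return data

print(asyncio.run(main()))
```

In production you would keep web.run_app; AppRunner is handy for tests and for embedding the server in a larger asyncio program.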

Adding a Queue

Python's asyncio.Queue class provides FIFO queues designed for use within a single event loop (the standard library's separate queue module offers thread-safe equivalents for threaded code). We can use it to build a work queue that processes requests asynchronously.

Here is an overview of the design:

  • Clients make requests to the aiohttp server
  • The server adds jobs to a queue
  • Worker tasks pull jobs from the queue and process them
  • Results are stored in a results dictionary
  • Clients then retrieve results from future requests

And here is an implementation:

import asyncio
import uuid
from asyncio import Queue
from aiohttp import web

queue = Queue()
results = {}

async def worker():
    while True:
        job = await queue.get()
        await asyncio.sleep(1)  # simulate real processing work
        results[job['id']] = {'status': 'done'}
        queue.task_done()  # mark the job as finished

async def handle(request):
    job_id = str(uuid.uuid4())
    await queue.put({'id': job_id})
    return web.json_response({'id': job_id})

async def results_route(request):
    result_id = request.match_info['id']
    if result_id in results:
        return web.json_response(results[result_id])
    return web.Response(status=404)

async def start_worker(app):
    asyncio.create_task(worker())

app = web.Application()
app.router.add_get('/', handle)
app.router.add_get('/{id}', results_route)
app.on_startup.append(start_worker)
web.run_app(app)

We create a queue instance to share job data between routes. The worker pulls jobs off the queue and processes them without blocking the event loop. Results get cached in a dictionary that we check in results_route.

Let's discuss some key points:

  • Using await queue.get() allows the worker task to yield control if the queue is empty. This prevents blocking.
  • queue.task_done() signals the queue that the job is finished. This maintains an accurate queue size.
  • The results cache decouples the client from waiting for the actual processing. We can query for results later.
  • Clients can hit the API as often as needed to check for result availability.
  • To scale up, we can launch multiple worker tasks in parallel by calling asyncio.create_task(worker()) instead of directly invoking worker().
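Putting those points together, here is a stand-alone sketch of the multi-worker pattern (plain asyncio, no web server; the worker names and job counts are illustrative): three workers drain one queue, queue.join() blocks until every job has been marked done with task_done(), and the workers are then cancelled.

```python
import asyncio

async def worker(name, queue, results):
    while True:
        job_id = await queue.get()
        await asyncio.sleep(0.1)  # simulate processing
        results[job_id] = f'processed by {name}'
        queue.task_done()  # keep the queue's unfinished-task count accurate

async def main():
    queue = asyncio.Queue()
    results = {}
    # Launch three workers in parallel, each as an independent task
    workers = [asyncio.create_task(worker(f'w{i}', queue, results))
               for i in range(3)]
    for job_id in range(6):
        await queue.put(job_id)
    await queue.join()  # block until every queued job is finished
    for w in workers:
        w.cancel()
    await asyncio.gather(*workers, return_exceptions=True)
    return results

results = asyncio.run(main())
print(len(results), 'jobs processed')
```

With one worker the six simulated jobs would take about 0.6 seconds; with three they overlap and finish in roughly 0.2 seconds.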

Graceful Shutdown

We likely want our aiohttp application to shut down gracefully on exit. Here is one way to handle that:

import asyncio
import signal

async def on_shutdown(app):
    print('Shutting down...')
    # clean-up tasks go here

loop = asyncio.get_event_loop()
loop.add_signal_handler(
    signal.SIGINT,
    lambda: asyncio.create_task(on_shutdown(app)))

The key points:

  • Get the event loop instance
  • Call add_signal_handler to bind the shutdown callback
  • Use asyncio.create_task() to create clean up coroutines
  • This will execute the shutdown routine on CTRL+C interrupts.
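One concrete clean-up job is cancelling the worker tasks so they do not linger after the server stops. This sketch (pure asyncio, independent of the server code above) shows the cancel-then-await pattern you might run inside on_shutdown:

```python
import asyncio

async def worker(queue):
    while True:
        await queue.get()  # blocks until a job arrives

async def shutdown_demo():
    queue = asyncio.Queue()
    task = asyncio.create_task(worker(queue))
    await asyncio.sleep(0)  # give the worker a chance to start
    # Cancel the worker, then await it so the cancellation completes
    task.cancel()
    try:
        await task
    except asyncio.CancelledError:
        pass
    return task.cancelled()

print(asyncio.run(shutdown_demo()))
```

Awaiting the cancelled task matters: cancel() only requests cancellation, and the task is not actually finished until it has been awaited.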

Final Words

That wraps up our tour of building asynchronous web services with aiohttp and queues! The key takeaways:

  • aiohttp enables asynchronous HTTP handling
  • Python queues facilitate concurrent task distribution
  • Graceful shutdown cleanly exits the process
  • Asynchronous programming opens up performance gains and scalability potential

By leveraging tools like aiohttp and queuing systems, you can build production-ready APIs and microservices in Python.

There is a lot more we could cover, but this hits the core components.
