Combining AsyncIO and Multiprocessing in Python

Mar 17, 2024 · 2 min read

Python's asyncio library provides infrastructure for writing asynchronous code using the async/await syntax. Meanwhile, the multiprocessing module allows spawning processes to leverage multiple CPUs for parallel execution. Can these tools be combined?

The short answer is yes, with some care around passing data between async code and multiprocessing.

Why Combine AsyncIO and Multiprocessing?

There are a few potential benefits to using AsyncIO and multiprocessing together in Python:

  • Improved resource utilization - AsyncIO allows non-blocking IO in a single thread, freeing up resources while IO is in progress. Multiprocessing fully utilizes multiple CPUs for CPU-bound parallel work. Using both can maximize resource usage.
  • Simplified asynchronous code - AsyncIO provides a nice high-level interface for asynchronous logic in Python. But it runs in a single thread, so CPU-bound work will block the event loop. Offloading CPU work to other processes prevents this.
  • Avoid callback hell - Multiprocessing sidesteps the complications of threading and the GIL, while AsyncIO's async/await syntax keeps asynchronous logic readable without deeply nested callbacks.
Passing Data Between Processes

The main catch with mixing AsyncIO and multiprocessing is that in-process concurrency primitives, such as asyncio queues, are not process-safe and cannot be shared across the process boundary.

The safest approach is to use multiprocessing queues, pipes, or shared memory to pass data between the async event loop and worker processes. For example:

    import asyncio
    from multiprocessing import Queue

    def mp_worker(queue):
        # Runs in a child process; a blocking get() is fine here
        data = queue.get()
        return data

    async def async_worker(queue):
        loop = asyncio.get_running_loop()
        # queue.get() would block the event loop, so hand it
        # to the default executor and await the result
        data = await loop.run_in_executor(None, queue.get)
        return data
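A complete, runnable version of this pattern might look like the sketch below. The worker here simply upper-cases strings, and names like `mp_worker` and `main` are illustrative, not a fixed API:

```python
import asyncio
from multiprocessing import Process, Queue

def mp_worker(in_q, out_q):
    # Runs in a child process, so blocking queue calls are fine here
    while True:
        item = in_q.get()
        if item is None:  # sentinel: shut down
            break
        out_q.put(item.upper())

async def main():
    in_q, out_q = Queue(), Queue()
    worker = Process(target=mp_worker, args=(in_q, out_q))
    worker.start()
    loop = asyncio.get_running_loop()
    results = []
    for msg in ("hello", "world"):
        # put()/get() can block, so run them in the default
        # executor rather than directly on the event loop
        await loop.run_in_executor(None, in_q.put, msg)
        results.append(await loop.run_in_executor(None, out_q.get))
    await loop.run_in_executor(None, in_q.put, None)
    await loop.run_in_executor(None, worker.join)
    return results

if __name__ == "__main__":
    print(asyncio.run(main()))
```

Note the sentinel value (`None`) used to tell the worker to exit, and that even `worker.join()` is pushed into the executor so the event loop never blocks.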

So in summary - AsyncIO and multiprocessing absolutely can combine forces in Python for improved performance, resource utilization, and cleaner code. Just be careful in how data flows between the two worlds.
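As a closing illustration of the CPU-offloading idea mentioned earlier, `loop.run_in_executor` with a `ProcessPoolExecutor` is the usual bridge between the two worlds. Here `cpu_heavy` is an illustrative stand-in for real CPU-bound work:

```python
import asyncio
from concurrent.futures import ProcessPoolExecutor

def cpu_heavy(n):
    # Illustrative CPU-bound task: sum of squares below n
    return sum(i * i for i in range(n))

async def main():
    loop = asyncio.get_running_loop()
    with ProcessPoolExecutor() as pool:
        # Each call runs in its own process, so the event loop
        # stays responsive while the work happens in parallel
        futures = [loop.run_in_executor(pool, cpu_heavy, n) for n in (10, 100)]
        return await asyncio.gather(*futures)

if __name__ == "__main__":
    print(asyncio.run(main()))
```

Because the pool hands back regular futures that `await` understands, the rest of the async code never needs to know the work happened in another process.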
