Notes on async-await in Python
I’ve glanced through coroutines in Python a few times already. It’s inevitable I’ll forget the details, so this note is a refresher for future-me (and, perhaps, useful to a distinguished passerby).
We’re not taking apart the engine today. If the Python async world were a machine, we’re just pressing the buttons to see what lights up.
- 1) What is async programming?
- 2) Event loop: the mental model
- 3) Coroutines with asyncio
- 4) Futures (or: the pizza box)
- 5) async-await in practice
- 6) Task ordering (a tiny subtlety)
- 7) Quiz: sequential vs concurrent awaits
- 8) Summary
1) What is async programming?
Is it the same as multithreading? Nope.
It’s easy to confuse async with threads (I did). They both try to reduce wall-clock time, but they do it differently.
Think about a single job. It alternates between CPU time (compute) and I/O wait (disk/network). With one job, you usually can’t beat its turnaround time without changing the algorithm.
Turnaround time = time to finish the job.
T_turnaround = T_finish − T_start
Now imagine 10 independent jobs on one CPU core.
- Synchronous (naïve): call them one after another (10 blocking calls).
- Multithreading: start 10 threads; while some threads are waiting on I/O, others can run.
Threads example
import time
import threading

result = [0 for _ in range(10)]

def job(n):
    # CPU-bound simulation
    result[n] = sum(i * i for i in range(n * 1000))
    # I/O-bound simulation
    time.sleep(n)

threads = []
for i in range(10):
    t = threading.Thread(target=job, args=(i,))
    t.start()
    threads.append(t)

for t in threads:
    t.join()
Here, the OS scheduler decides which thread runs.
Footnote for Python folks: due to the GIL, only one thread executes Python bytecode at a time in CPython. Threads still help when you spend time waiting on I/O; they don’t speed up CPU-bound Python code. (For CPU-bound work, think multiprocessing or native extensions.)
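As a point of comparison, here’s a minimal sketch of the CPU-bound route via a process pool (the cpu_job name is just for illustration). Separate processes each get their own interpreter, so the GIL stops being the bottleneck:

from concurrent.futures import ProcessPoolExecutor

def cpu_job(n):
    # Pure computation: benefits from extra processes, not extra threads
    return sum(i * i for i in range(n * 1000))

if __name__ == "__main__":
    # Each worker process runs Python bytecode independently of the GIL
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(cpu_job, range(10)))
    print(results)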
If you’d like to see multithreading visualization in C and Python, head over to this blog.
Where threads bite (and why async exists):
- Shared-state complexity (locks, races, heisenbugs).
- Context-switch overhead when you scale out.
- In CPython, limited wins for CPU-bound work.
- Hard to express “do these 200 I/O things, then continue” without a small forest of threads.
Async tackles the same class of problems (lots of I/O) with a different control-flow style.
2) Event loop: the mental model
Picture an event loop: a loop that drives a queue of tasks. Tasks run until they await an awaitable, then politely yield. When the awaitable is ready, the loop resumes the task right after the await. This is cooperative scheduling (no preemption by the loop itself).
Threads are like trying to hold ten phone conversations on ten different handsets at once; async is putting each call on speaker, hitting “mute,” and letting the callers shout “I’m back!” when they need your attention.
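To make the cooperative hand-off concrete, here’s a tiny sketch (using asyncio, which the next section introduces; ping and pong are made-up names). Each coroutine yields at await asyncio.sleep(0), so the loop can slot the other one in between:

import asyncio

async def ping():
    for _ in range(3):
        print("ping")
        await asyncio.sleep(0)   # yield control back to the loop

async def pong():
    for _ in range(3):
        print("pong")
        await asyncio.sleep(0)   # yield control back to the loop

async def main():
    # The prints interleave: ping, pong, ping, pong, ...
    await asyncio.gather(ping(), pong())

asyncio.run(main())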
3) Coroutines with asyncio
Same 10-job scenario, async-style:
import asyncio

async def job(n: int) -> int:
    # CPU-bound simulation
    acc = sum(i * i for i in range(n * 1000))
    # I/O-bound simulation
    await asyncio.sleep(n)
    return acc

async def main():
    tasks = [asyncio.create_task(job(i)) for i in range(10)]
    results = await asyncio.gather(*tasks)
    print(results)

if __name__ == "__main__":
    asyncio.run(main())
Things to keep in mind:
- Futures
- asyncio and a few core APIs
- Language constructs: async-await
asyncio provides the event loop and the primitives to schedule / pause / resume work.
4) Futures (or: the pizza box)
# Synchronous programming
result = start_and_finish_job()
# Asynchronous programming
result_future = started_but_unfinished_job()
A future is a container for a result that will exist later.
Say you order a pizza at a restaurant and they hand you an empty box and ask you to take a seat at a table nearby - strange! It’s a high-tech restaurant, they tell you: no need to come back to the counter to collect your order; the pizza will materialize inside the box when it’s ready!
You marvel at how quickly technology is progressing and you get busy chatting with your dining companion, your friend or your laptop, depending on your social circle :)
After 10 minutes you remember the pizza and lift the lid to check: not ready yet. That’s all right - you get back to your interesting conversation. Five more minutes pass and now you’re hungry. This time you keep the lid open, because you’ve run out of topics to discuss and want to start eating the moment the pizza arrives; it appears a couple of minutes later. You’re happy, and once again you can’t help chuckling at how fast tech is moving.
That box is the future. It’s a container for a job that has started and is being worked on, but may not be finished yet.
Python smooths this so well that most of the time you don’t think about futures directly.
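If you do want to poke at one, an asyncio Task is the everyday stand-in for the pizza box: you can peek with done() and collect the result with await. A small sketch (bake_pizza is a made-up name):

import asyncio

async def bake_pizza():
    await asyncio.sleep(2)                    # the kitchen works on it
    return "margherita"

async def main():
    box = asyncio.create_task(bake_pizza())   # job started, box handed to you

    await asyncio.sleep(1)                    # chat with your companion
    print(box.done())                         # lift the lid: False, not ready yet

    pizza = await box                         # keep the lid open until it arrives
    print(pizza)                              # margherita

asyncio.run(main())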
5) async-await in practice
async def marks a function as a coroutine function. When you call it, nothing runs yet; you get a coroutine object (something that can be paused and resumed).
To actually execute it, either await the coroutine inside another coroutine, or wrap it in a Task with asyncio.create_task() and await the task later. The event loop drives it, resuming right after each await.
“Background” here means non-blocking for you: the coroutine yields control at await so other coroutines can run. It’s still usually the same thread, unless you explicitly offload blocking work to a thread/process pool via asyncio.to_thread() or loop.run_in_executor().
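For instance, a blocking call (here time.sleep standing in for any blocking library call; blocking_io is a made-up name) can be pushed onto a worker thread with asyncio.to_thread() (Python 3.9+) so the loop stays responsive:

import asyncio
import time

def blocking_io():
    time.sleep(1)              # stands in for any blocking call
    return "done"

async def main():
    # Runs blocking_io in a worker thread; the event loop keeps running meanwhile
    result = await asyncio.to_thread(blocking_io)
    print(result)

asyncio.run(main())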
import asyncio

async def main():
    print('hello')
    await asyncio.sleep(1)
    print('world')

asyncio.run(main())
asyncio.run creates the loop, runs main() to completion, then cleans up.
What does await do (really)?
It suspends the current coroutine until an awaitable completes, handing control back to the loop. Awaitables include:
- Future - an object representing an incomplete result
- Coroutine object - produced by calling an async def function
- Task - a Future subclass that wraps/schedules a coroutine on the loop
When the awaitable finishes, the loop resumes your coroutine right after the await. (Under the covers, this is very much like yield from with a fancy hat.)
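A quick sanity check of the “calling it doesn’t run it” rule (greet is a made-up name):

import asyncio

async def greet():
    return "hi"

async def main():
    coro = greet()        # nothing has run yet: this is just a coroutine object
    print(type(coro))     # <class 'coroutine'>
    result = await coro   # now the loop drives it to completion
    print(result)         # hi

asyncio.run(main())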
6) Task ordering (a tiny subtlety)
import asyncio

async def T1():
    print("T1 start")
    await asyncio.sleep(2)
    print("T1 end")

async def T2():
    print("T2 start")
    print("T2 end")

async def T3():
    print("T3 start")
    print("T3 end")

async def main():
    print("main start")
    t1 = asyncio.create_task(T1())
    t3 = asyncio.create_task(T3())
    t2 = asyncio.create_task(T2())
    print("main middle")
    await t1
    await t2
    await t3
    print("main end")

if __name__ == "__main__":
    asyncio.run(main())
create_task schedules coroutines to start soon; you often see the first slice of each run in the order you created them. After a task hits an await, though, the loop resumes whichever task becomes ready next (timers, I/O). Your await t1; await t2; await t3 only says when main waits on them, not how they interleave internally.
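For the snippet above, CPython’s loop typically runs the first slice of each task in creation order (t1, t3, t2), and T1’s sleep lets the others finish first, so you’d usually see:
main start
main middle
T1 start
T3 start
T3 end
T2 start
T2 end
T1 end
main end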
7) Quiz: sequential vs concurrent awaits
Sequential awaits
import asyncio
import time

async def say_after(delay, what):
    await asyncio.sleep(delay)
    print(what)

async def main():
    print(f"started at {time.strftime('%X')}")
    await say_after(1, 'hello')
    await say_after(2, 'world')
    print(f"finished at {time.strftime('%X')}")

asyncio.run(main())
Output:
started at 17:13:52
hello
world
Predict the next line...
finished at 17:13:55
Concurrent with tasks
async def main():
    task1 = asyncio.create_task(say_after(1, 'hello'))
    task2 = asyncio.create_task(say_after(2, 'world'))
    print(f"started at {time.strftime('%X')}")
    await task1
    await task2
    print(f"finished at {time.strftime('%X')}")
Output:
started at 17:14:32
hello
world
Predict the next line...
finished at 17:14:34
The second finishes in ~2s instead of ~3s because the sleeps overlap.
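A third, often tidier spelling of the concurrent version (reusing say_after from above) is asyncio.gather, which wraps both coroutines in tasks and waits for them together, again in ~2s:

async def main():
    print(f"started at {time.strftime('%X')}")
    # Both sleeps overlap, just like the create_task version
    await asyncio.gather(say_after(1, 'hello'), say_after(2, 'world'))
    print(f"finished at {time.strftime('%X')}")

asyncio.run(main())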
8) Summary
You don’t need to master every asyncio API right now. Just remember:
- async def creates a coroutine
- await pauses for something else to finish
- The event loop is the traffic cop keeping it all moving
The rest you’ll pick up the next time you forget. Or just do what I did - ask your favourite LLM :)