Asynchronous Python

It can’t be premature optimisation if it took 20 years to start


Not covered: the niceties of asynchrony, such as when threads run truly concurrently (it’s complicated, but you need ’em often even for non-concurrent stuff, so deal with it), or when evented poll systems are truly asynchronous (never, but it doesn’t matter).

🏗 cover uvloop, a faster drop-in replacement for the asyncio event loop, built on libuv.

tools for async-style coroutine concurrency

asyncio ecosystem

Modern python async-style stuff.

tl;dr Use the event loop from tornado or pyzmq. These non-thread IO things are comparatively easy and well-documented. And they work with the new python 3 async style. You can use them to farm off heavy computation to other threaded nightmare hell farms or whatever, and they already work right now.

UPDATE: actually, raw asyncio is getting civilized these days and might be worth using. But there is a complicated relationship between the various bits. And I no longer need to do this, so you are on your own. G’luck.
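For a taste of how civilized raw asyncio now is, here is a minimal sketch of running coroutines concurrently on one event loop with nothing but the standard library:

```python
import asyncio

async def fetch(name, delay):
    # Stand-in for real non-blocking I/O, e.g. an HTTP request.
    await asyncio.sleep(delay)
    return name

async def main():
    # gather runs the coroutines concurrently and preserves argument order.
    results = await asyncio.gather(
        fetch("a", 0.02), fetch("b", 0.01)
    )
    return results

print(asyncio.run(main()))  # -> ['a', 'b']
```

`asyncio.run` (Python 3.7+) creates the loop, runs the coroutine, and tears the loop down; no more manual `get_event_loop` boilerplate.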

Here are some ingredients that might make these work better:

  • backoff is a handy python library for a menial and common task: retrying failed operations with progressively longer delays. This is the only thing from this list I am currently using.

  • sanic is a hip, python3.5+-only, Flask-like web server. Supports websocket and GraphQL extensions. See also aiohttp, which is a separate asyncio HTTP client/server library, not sanic’s engine (sanic runs on uvloop and httptools).

  • Rx exists for python as RxPY and is tornado compatible.

  • terminado provides a terminal for tornado, for quick and dirty interaction.

    This feels over-engineered to me, but looks easy for some common cases.

  • Zerorpc is an RPC layer over 0mq.

  • 0mq itself is attractive because it already uses tornado loops, and can pass numpy arrays without copying.

  • aiomonitor injects a REPL into running async python processes for live inspection.
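The retry pattern that backoff wraps up in a decorator looks roughly like this if written by hand (the names here are illustrative, not backoff’s actual API):

```python
import asyncio
import random

async def retry(coro_fn, *, tries=5, base=0.1, cap=2.0):
    # Retry a coroutine factory with exponential backoff plus jitter.
    for attempt in range(tries):
        try:
            return await coro_fn()
        except Exception:
            if attempt == tries - 1:
                raise  # out of retries; let the caller see the failure
            delay = min(cap, base * 2 ** attempt)
            await asyncio.sleep(delay * random.random())

# Usage: a flaky operation that fails twice, then succeeds.
calls = {"n": 0}

async def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise IOError("transient")
    return "ok"

print(asyncio.run(retry(flaky, base=0.001)))  # -> ok
```

backoff gives you the same thing as a decorator, with pluggable wait strategies and giveup conditions, which is why I bother installing it instead of re-deriving this.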

Alternative async ecosystems

Yes, as always there is something newer and hipper and more artisanal.

curio:

Curio is a library for performing concurrent I/O and common system programming tasks such as launching subprocesses and farming work out to thread and process pools. It uses Python coroutines and the explicit async/await syntax introduced in Python 3.5. Its programming model is based on cooperative multitasking and existing programming abstractions such as threads, sockets, files, subprocesses, locks, and queues. You’ll find it to be small and fast.

The essay that explains why there is a different asynchronous ecosystem: Nathaniel J. Smith, Some thoughts on asynchronous API design in a post-async/await world

Curio doesn’t have much in the way of tooling yet. E.g. for HTTP requests you might use curequests or asks; for a server, you might import a raw HTTP/2 library and go bareback.

Idioms

Threaded asynchrony

Sometimes you need it.

But I don’t have much to say, and am no expert.
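One common shape is keeping blocking calls off the event loop by handing them to a thread pool. A minimal sketch with stdlib asyncio, assuming Python 3.9+ for `asyncio.to_thread`:

```python
import asyncio
import time

def blocking_work(x):
    # Stand-in for blocking-library or GIL-releasing work
    # that would otherwise stall the event loop.
    time.sleep(0.01)
    return x * 2

async def main():
    # to_thread runs the function in the default thread pool executor,
    # so other coroutines keep running meanwhile.
    results = await asyncio.gather(
        asyncio.to_thread(blocking_work, 1),
        asyncio.to_thread(blocking_work, 2),
    )
    return results

print(asyncio.run(main()))  # -> [2, 4]
```

On older pythons the equivalent is `loop.run_in_executor(None, blocking_work, x)`.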

For threaded and multi-process concurrency you sometimes need simple shared variables. Here is, e.g., a counters HOWTO.
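For the threaded case, the classic shared counter needs a lock; this is a generic sketch, not taken from the linked HOWTO:

```python
import threading

class Counter:
    # Increments from many threads are only safe under a lock, because
    # count += 1 is a read-modify-write sequence, not an atomic operation.
    def __init__(self):
        self.count = 0
        self._lock = threading.Lock()

    def increment(self):
        with self._lock:
            self.count += 1

counter = Counter()
threads = [
    threading.Thread(
        target=lambda: [counter.increment() for _ in range(1000)]
    )
    for _ in range(8)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter.count)  # -> 8000
```

Drop the lock and the final count will intermittently come up short, which is exactly the kind of heisenbug that makes threaded asynchrony a nightmare hell farm.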